CN117425080B - Photographing method and related equipment - Google Patents

Photographing method and related equipment

Info

Publication number
CN117425080B
CN117425080B
Authority
CN
China
Prior art keywords
target
parameters
camera
camera application
sub
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202311736947.4A
Other languages
Chinese (zh)
Other versions
CN117425080A (en)
Inventor
谢敏 (Xie Min)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202311736947.4A
Publication of CN117425080A
Application granted
Publication of CN117425080B

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application provides a photographing method and related equipment. In the method, an electronic device acquires target parameters for acquiring target images, the target parameters comprise X groups of exposure parameters, and, according to the target parameters, a camera application is controlled to continuously acquire the target images used for calibrating ISP parameters, wherein the target images comprise X groups of sub-target images respectively corresponding to the X groups of exposure parameters. That is, after the electronic device obtains the target parameters, the camera application can be controlled to continuously photograph based on each of the X groups of exposure parameters, which saves labor cost and improves the efficiency of calibrating the ISP parameters. In the present application, the target parameters may also include, but are not limited to: exposure parameters, a target format, a target shooting mode, information of a target camera device, a target zoom magnification, or a target chip model. In some embodiments, the electronic device can also automatically classify and sort the target images acquired by the camera application, which further improves the efficiency of calibrating the ISP parameters.

Description

Photographing method and related equipment
Technical Field
The application relates to the technical field of terminals, in particular to a photographing method and related equipment.
Background
The quality of the output image of an electronic device depends not only on the camera module and imaging pipeline themselves, but also on the image signal processing (ISP) technology of the processor. The ISP process involves a large number of ISP parameters, such as white balance, contrast, saturation, and noise reduction parameters. How well the ISP parameters are tuned directly affects the quality of the output image. Calibration of the ISP parameters is a precondition for tuning them, and is therefore very important for the output image quality of the electronic device.
Currently, to calibrate ISP parameters, calibration personnel need to use an electronic device to shoot a large number of images of a fixed test chart (such as a color chart, a gray-scale chart, or a resolution chart) in a calibration environment. This shooting process takes a long time and has a high labor cost, which greatly reduces calibration efficiency.
Disclosure of Invention
The application provides a photographing method and related equipment, which can improve the efficiency of calibrating ISP parameters.
In a first aspect, a photographing method is provided, applied to an electronic device, the method comprising: acquiring target parameters for acquiring a target image, wherein the target image is used for calibrating image signal processing parameters, the target parameters comprise X groups of exposure parameters, X is greater than 1, X is an integer, and no two of the X groups of exposure parameters are completely identical; and controlling, according to the target parameters, a camera application to continuously acquire target images, wherein the target images comprise X groups of sub-target images, the X groups of exposure parameters are in one-to-one correspondence with the X groups of sub-target images, and each of the X groups of sub-target images comprises one or more images.
In the embodiment of the application, after the electronic equipment acquires the X groups of exposure parameters, the camera application can be controlled to continuously photograph based on the X groups of exposure parameters respectively, so that the labor cost is saved, and the efficiency of calibrating ISP parameters can be improved.
It can be understood that, at present, if calibration personnel want to collect photos based on different sets of parameters (each set including one or more parameters), they need to manually set a set of parameters on the device where the camera application is located, and only after shooting is completed set the next set of parameters and continue shooting. If multiple pictures are needed for one set of parameters, the set must be configured manually and the shutter control tapped once per picture, which is tedious and labor-intensive. In this scheme, the electronic device can acquire X groups of exposure parameters at one time and control the camera application to continuously acquire the target images based on each group of exposure parameters. By way of example, "continuous acquisition" here may be understood to mean that the electronic device does not acquire new target parameters before the camera application finishes acquiring the target images corresponding to the current target parameters, and the calibration personnel do not need to perform any operation on the electronic device; in other words, the electronic device does not need to detect a user operation. The electronic device successively sets the multiple groups of exposure parameters on the device where the camera application is located and controls the camera application to take pictures, so the calibration personnel do not need to perform repeated operations on the electronic device; equivalently, the electronic device does not need to detect user operations after acquiring the target parameters.
In some embodiments, the electronic device detects an operation for instructing the start of shooting and controls the camera application to continuously acquire the target images. That is, after the calibration personnel click the control for starting shooting, processing can proceed directly once the photos are shot, without the tedious hands-on photographing process. In some embodiments, the number of sub-target images per group may be determined based on the image signal processing parameters.
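The continuous-capture flow described above can be sketched as a simple control loop. The helper below is purely illustrative: the function name `build_capture_plan` and the parameter fields `exposure_time_us` and `gain` are hypothetical placeholders, not part of the disclosed method.

```python
# Illustrative sketch: expand X groups of exposure parameters into one
# capture step per photo, in group order, with no user interaction
# between groups. Field names are assumptions for this example.

def build_capture_plan(exposure_groups, shots_per_group):
    """Expand X exposure groups into a flat list of capture steps."""
    plan = []
    for i, params in enumerate(exposure_groups):
        for shot in range(shots_per_group):
            plan.append({"group": i, "shot": shot, "params": params})
    return plan

groups = [
    {"exposure_time_us": 10000, "gain": 100},
    {"exposure_time_us": 20000, "gain": 400},
]
plan = build_capture_plan(groups, shots_per_group=3)
# 2 groups x 3 shots -> 6 capture steps, grouped by exposure parameters
```

A controller would then walk this plan, setting each group's exposure parameters once and triggering the camera application `shots_per_group` times before moving on.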
It will be appreciated that controlling the camera application to take a picture according to a given group of exposure parameters means setting the exposure parameters that control the image sensor in the camera device of the device where the camera application is located to that group, rather than using the exposure parameters determined by the automatic exposure module of that device.
It can be further understood that the exposure parameters are parameters with which the image sensor at the front end of the ISP collects image signals. Calibrating the ISP parameters from photos shot with explicitly set exposure parameters facilitates the mutual adaptation of the image sensor and the image signal processor, improving the accuracy of ISP parameter calibration. The exposure parameters include at least one of: shutter (exposure time), exposure gain, or sensitivity (ISO).
It can also be appreciated that RAW-format images acquired with different exposure parameters require different degrees of ISP processing. For example, noise reduction applied to RAW images acquired with different gains uses different noise reduction strengths: if the gain is large, the RAW image generally contains more noise and stronger noise reduction parameters are needed. Therefore, shooting a large number of photos across multiple groups of exposure parameters improves the accuracy of ISP parameter calibration.
It can be further understood that the embodiments of the application are suitable for calibrating and tuning the photographing effect of the device where the camera application is located, and mainly involve operations performed by calibration or debugging personnel on the electronic device.
It is further understood that the device that obtains the target parameters may be the same device as the device where the camera application is located, or a different one. If they are different, the calibration personnel interact only with the electronic device, not with the device where the camera application is located; the electronic device then controls the device where the camera application is located to take the pictures.
In one possible embodiment, the method further comprises: creating a target folder, wherein the target folder comprises one or more subfolders named according to the target parameters; acquiring the target image from the device where the camera application is located; and storing the target image in the one or more subfolders according to the correspondence between the target image and the target parameters.
According to the scheme, the collected target images are sorted based on the target parameters, so that the efficiency of ISP parameter calibration can be further improved.
In some embodiments, the electronic device detects an operation for instructing the start of shooting and controls the camera application to continuously acquire the target images. That is, after the calibration personnel click the control for starting shooting, the photos come out already classified into folders, so there is no need to sort them by hand, saving time and effort.
In one possible embodiment, the target folder includes X subfolders named according to the X groups of exposure parameters, and storing the target image in one or more subfolders according to the correspondence between the target image and the target parameters includes: storing the X groups of sub-target images acquired according to the X groups of exposure parameters in the X subfolders respectively, where X is greater than or equal to 1 and X is an integer.
According to the scheme, based on the X groups of exposure parameters, the collected X groups of sub-target images are respectively arranged in the X sub-folders, so that the efficiency of ISP parameter calibration can be further improved.
In some embodiments, each subfolder may include one or more secondary subfolders. For example, when the target images are captured with Y camera devices, the target folder includes Y subfolders, each named according to the identifier of a camera device; each subfolder includes X secondary subfolders, each named according to an exposure parameter group, where Y is greater than or equal to 1 and Y is an integer. For another example, when the target images are captured using Y shooting modes, the target folder includes Y subfolders, each named according to a shooting mode; each subfolder includes X secondary subfolders, each named according to an exposure parameter group. For another example, when the target format includes Z image formats, the target folder includes X subfolders, each named according to an exposure parameter group; each subfolder includes Z secondary subfolders, each named according to an image format, where Z is greater than or equal to 1 and Z is an integer.
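The parameter-based naming scheme above can be illustrated with a small path-building helper. The folder-name pattern `exp<time>_gain<gain>` is an assumed convention for illustration only; the patent does not fix a concrete naming format.

```python
# Illustrative sketch of folders named after the capture parameters:
# <root>/<camera_id>/exp<exposure_time_us>_gain<gain>
# The pattern and field names are assumptions, not the patent's format.

def subfolder_path(root, camera_id, exposure_params):
    """Build the (secondary) subfolder path for one exposure group."""
    leaf = "exp{exposure_time_us}_gain{gain}".format(**exposure_params)
    return "/".join([root, camera_id, leaf])

p = subfolder_path("target", "main_camera",
                   {"exposure_time_us": 10000, "gain": 100})
# -> "target/main_camera/exp10000_gain100"
```

Because each path is derived deterministically from the target parameters, each group of sub-target images can be filed into its subfolder as soon as it is captured, matching the per-group storage described in the embodiments.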
In one possible embodiment, controlling the camera application to continuously acquire the target image according to the target parameter includes: according to the ith group exposure parameters, controlling the camera application to continuously acquire an ith group sub-target image; obtaining a target image from a device in which a camera application is located, comprising: acquiring an ith group of sub-target images from a device where a camera application is located; storing the X groups of sub-target images acquired according to the X groups of exposure parameters in X sub-folders respectively, wherein the X groups of sub-target images comprise: and storing the ith group of sub-target images acquired according to the ith group of exposure parameters in an ith subfolder, wherein i is more than or equal to 1 and less than or equal to X, and i is an integer.
In this scheme, each time a group of sub-target images is shot, the most recently shot group is stored in its corresponding subfolder. Compared with shooting all target images first and then classifying and filing them, this reduces the difficulty of classifying the images and improves the accuracy of classification, thereby improving the accuracy of ISP parameter calibration.
In a possible embodiment, the target parameters further comprise a target format of the target image, the target format being determined from the image signal processing parameters, the target format comprising at least one of: the RAW format, the YUV format or the JPG format, and the format of the target image is the target format.
It can be understood that the accuracy of the calibration of the ISP parameters can be improved by shooting images in corresponding formats according to different ISP parameters.
In some embodiments, the method further comprises: detecting ISP parameters to be calibrated input by a user; and determining the target format according to a preset corresponding relation.
That is, the calibration personnel only need to manually input the ISP parameters to be calibrated, and the electronic device automatically determines the format of the images to be acquired and stored; compared with having the calibration personnel set the target format manually, this further improves the efficiency of ISP parameter calibration.
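As a sketch of such a preset correspondence, a simple lookup table could map each ISP parameter to be calibrated to the image format its calibration requires. The pairings below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical preset correspondence between ISP parameters to be
# calibrated and the target format needed to calibrate them.
# The specific pairings are illustrative assumptions.
FORMAT_FOR_ISP_PARAM = {
    "white_balance": "RAW",
    "noise_reduction": "RAW",
    "contrast": "JPG",
    "saturation": "JPG",
}

def target_formats(isp_params):
    """Collect the set of formats to capture for the given parameters."""
    return sorted({FORMAT_FOR_ISP_PARAM[p] for p in isp_params})

fmts = target_formats(["white_balance", "contrast"])
# -> ["JPG", "RAW"]
```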
In one possible embodiment, before controlling the camera application to continuously acquire the target image according to the target parameter, the method further comprises: controlling the camera application to continuously acquire a corresponding group of sub-target images in the X groups of sub-target images according to one group of exposure parameters in the X groups of exposure parameters; acquiring a set of sub-target images from a device in which the camera application is located; an evaluation is made as to whether the exposure parameters of the set of sub-target images match the set of exposure parameters.
In this scheme, a small batch of photos is taken first to determine whether the target parameters can be successfully set for the camera application. This reduces the risk of unknowingly taking a large number of photos that do not conform to the target parameters, which would affect the accuracy of ISP parameter calibration; it also reduces the chance of having to repeat the calibration because of such photos, improving both the accuracy and the efficiency of ISP parameter calibration.
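The small-batch check described above might, for example, compare the exposure values read back from a trial shot (e.g. from EXIF metadata) against the requested group. The tolerance, function name, and field names below are illustrative assumptions.

```python
# Illustrative sketch: verify that a trial shot actually used the
# requested exposure parameters before capturing the full batch.
# The 5% relative tolerance is an assumed, not disclosed, threshold.

def exposure_matches(requested, actual, rel_tol=0.05):
    """Return True if every requested value is within tolerance of
    the value read back from the captured image."""
    for key, want in requested.items():
        got = actual.get(key)
        if got is None or abs(got - want) > rel_tol * want:
            return False
    return True

ok = exposure_matches({"exposure_time_us": 10000, "gain": 100},
                      {"exposure_time_us": 10020, "gain": 100})
# -> True: readback is within the 5% tolerance
```

Only if this check passes would the controller proceed to capture all X groups of sub-target images.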
In a possible embodiment, the target parameters further include a target shooting mode of the target image, the target shooting mode including photographing and/or photographing during video recording. When the target shooting mode is photographing during video recording, controlling the camera application to continuously collect the target images according to the target parameters includes: acquiring a first position, in the video-recording interface, of a first control of the camera application used to trigger photographing during video recording; and, according to the target shooting mode, controlling the device where the camera application is located to tap the first position during recording so as to acquire the target images.
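Simulating a tap at the first position can be done with the standard `adb shell input tap` command; the sketch below builds such an invocation. The device serial and coordinates are placeholders, and in practice the coordinates would be read from the video-recording interface layout.

```python
# Illustrative sketch: build the adb invocation that taps the in-video
# shutter control. "adb shell input tap <x> <y>" is a standard Android
# command; serial and coordinates here are placeholders.

def tap_command(serial, x, y):
    """Build the argv list for simulating a tap at (x, y)."""
    return ["adb", "-s", serial, "shell", "input", "tap", str(x), str(y)]

cmd = tap_command("DEVICE123", 540, 1800)
# subprocess.run(cmd) would then tap the first position on the device,
# assuming (540, 1800) is the position read from the UI layout.
```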
It can be appreciated that, in general, a photo captured during video recording is smaller than a directly photographed one; calibrating with photos of different sizes obtained by the camera makes the calibration of the ISP parameters of the device where the camera application is located more comprehensive.
In some embodiments, the target capture mode further comprises one or more of: portrait mode, professional mode, super macro mode, double image video mode, or panoramic mode.
In some embodiments, acquiring target parameters for acquiring a target image includes: acquiring shooting mode indication information, wherein the shooting mode indication information is used for indicating shooting and/or shooting in video; and determining the target shooting mode as shooting and/or shooting in video according to the shooting mode indication information.
In some embodiments, when the target shooting mode includes at least two shooting modes (for example, photographing and photographing during video recording), the device where the camera application is located can be controlled, for each of the X groups of exposure parameters, to shoot the X groups of sub-target images in each shooting mode (for example, shooting X groups of sub-target images in the photographing mode and X groups in the during-video photographing mode), further saving the calibration personnel's time and effort and further improving the efficiency of ISP parameter calibration.
In one possible embodiment, the target shooting mode includes at least one of: a photographing mode, a recording photographing mode, a portrait mode, a professional mode, a super macro mode, a double-image video mode or a panoramic mode, and acquiring target parameters for acquiring target images, comprising: acquiring shooting mode indication information, wherein the shooting mode indication information is used for indicating a historical shooting mode, and the historical shooting mode is a shooting mode used before a camera application continuously acquires target images; and determining that the target shooting mode is a historical shooting mode according to the shooting mode indication information.
In some embodiments, the history shooting mode is, for example, a shooting mode used by the camera application after the last time an operation for instructing to start shooting was detected; for example, one or more shooting modes selected by the calibration person are acquired the last time an operation for instructing start of shooting is detected.
It can be understood that, if the electronic device controls the camera application to take photos multiple times (for example, once each time an operation for instructing the start of shooting is detected), and the detected shooting mode indication information indicates the historical shooting mode, the target shooting modes used in two adjacent sessions are the same. When a large number of photos are shot with the same target shooting mode, this saves the calibration personnel from repeatedly selecting the target shooting mode, further improving shooting efficiency.
In other embodiments, the historical shooting mode is, for example, the shooting mode used when the camera application shot the most recent photo, and this mode may be different from the selectable shooting modes provided by the electronic device.
It will be appreciated that, if the selectable shooting modes provided by the electronic device cannot meet the calibration personnel's needs, they may manually select the desired shooting mode, take at least one picture with it, and then instruct the electronic device to use the historical mode, so that the electronic device can shoot with a mode it does not itself offer. In this way the shooting modes can be extended through manual operation, without changing the electronic device's original UI interface.
In a possible embodiment, in a case where the apparatus in which the camera application is located includes a plurality of camera devices, the target parameter further includes information of a target camera device for acquiring a target image, and controlling the camera application to continuously acquire the target image according to the target parameter includes: and controlling the camera application to acquire the target image through the target camera device according to the target camera device information.
It can be understood that the calibration of the ISP parameters is performed on the photos taken by a plurality of camera devices (such as a main camera and a front camera) of the equipment where the camera is located, so that the calibration of the ISP parameters of the equipment where the camera is located can be more comprehensive.
In some embodiments, when the target camera device information indicates multiple camera devices, the device where the camera application is located can be controlled, for each of the X groups of exposure parameters, to shoot the X groups of sub-target images with each camera device in turn (for example, shooting X groups of sub-target images with the main camera and X groups with the front camera), further saving the calibration personnel's time and effort and further improving the efficiency of ISP parameter calibration.
It is appreciated that the image sensors (sensors) of different camera devices are different, and that different sensors generally require different configurations of ISP parameters to achieve better image quality. Therefore, the ISP parameters are required to be calibrated for different cameras respectively, so that the calibration of the ISP parameters of the equipment where the cameras are located is more comprehensive, and the image effect of the photo shot by each camera equipment is better.
In one possible embodiment, the target parameters further include a target zoom magnification used by the camera application to continuously capture target images, and controlling the camera application to continuously capture target images according to the target parameters includes: acquiring a second position of a second control used for switching to a target zoom magnification in a camera application in a camera interface; and according to the target zoom magnification, controlling the equipment where the camera application is located to click the second position so as to acquire the target image.
It can be understood that the calibration of the ISP parameters is performed on the photos acquired by using different zoom magnifications (such as ultra-wide angle and long focus) of the camera, so that the calibration of the ISP parameters of the equipment where the camera is located is more comprehensive.
In some embodiments, the target zoom magnification includes multiple sub-target zoom magnifications, so that, for each of the X groups of exposure parameters, the device where the camera application is located can be controlled to shoot X groups of sub-target images of different sizes at each zoom magnification (for example, shooting X groups of sub-target images at zoom magnification #1 and X groups at zoom magnification #2), further saving the calibration personnel's time and effort and further improving the efficiency of ISP parameter calibration.
In a possible embodiment, the target parameters further include target chip model information, the target chip model information indicating a model of a chip included in the device in which the camera application is located, and controlling the camera application to continuously collect the target image according to the target parameters includes: determining control information matched with the chip according to the model information of the target chip, wherein the control information is used for controlling the camera application to continuously acquire target images; and sending control information to the device where the camera application is located.
It can be appreciated that, if the target parameters do not include the target shooting mode, they may include shooting mode indication information; conversely, if the target parameters do not include shooting mode indication information, they may include the target shooting mode. The target shooting mode may be a shooting mode displayed on the UI interface, while the shooting mode indication information can complement or extend the modes on the UI interface. If the camera application was using, or was switched to, a certain shooting mode before shooting the target images, and the calibration personnel select the option corresponding to the shooting mode indication information on the UI interface, the camera application needs to be controlled to continuously shoot the target images in that shooting mode.
According to this scheme, by obtaining the model of the chip included in the device where the camera application is located, that device can be controlled to collect the target images in a manner matched to the chip model, improving the accuracy of the target images.
In one possible embodiment, the device where the camera application is located is a first device, the electronic device is a second device different from the first device, and the second device includes a control application. Acquiring the target parameters for acquiring the target image includes: the second device detecting an operation for instructing the start of shooting and reading first target information input by the user in the control application; and determining the X groups of exposure parameters according to the first target information. Controlling the camera application to continuously acquire the target images according to the target parameters includes: the second device sequentially sending X first Android Debug Bridge (ADB) commands to the first device according to the X groups of exposure parameters, wherein the i-th first ADB command is used for controlling the camera application to acquire the i-th group of sub-target images based on the i-th group of exposure parameters, 1 ≤ i ≤ X, and i is an integer.
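A sketch of the X first ADB commands is shown below. The system property name `vendor.camera.calib.exposure`, the value encoding, and the device serial are made-up placeholders; a real device would expose its own debug interface, and the patent does not specify the command contents.

```python
# Illustrative sketch: one first ADB command per exposure group, to be
# sent sequentially by the second device before each group is captured.
# The property name and value format are hypothetical placeholders.

def first_adb_commands(serial, exposure_groups):
    """Build one adb invocation per group of exposure parameters."""
    cmds = []
    for params in exposure_groups:
        value = "{exposure_time_us},{gain}".format(**params)
        cmds.append(["adb", "-s", serial, "shell", "setprop",
                     "vendor.camera.calib.exposure", value])
    return cmds

cmds = first_adb_commands("DEVICE123",
                          [{"exposure_time_us": 10000, "gain": 100},
                           {"exposure_time_us": 20000, "gain": 400}])
# -> two commands, one per exposure group, in order
```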
In one possible embodiment, creating the target folder includes: in response to an operation for instructing start of shooting, the second device reads second target information provided by the calibration person; determining a first storage path of the target folder and naming modes of X subfolders according to the second target information and X groups of exposure parameters, wherein the X subfolders are named according to the X groups of exposure parameters respectively; creating a target folder according to the first storage path and the naming mode; obtaining a target image from a device in which a camera application is located, comprising: the second device obtains a second storage path of the target image from the first device, sends a second ADB command to the first device, wherein the second ADB command comprises the second storage path and the first storage path, the second ADB command is used for obtaining the target image, and the second device obtains the target image from the first device; storing the target image in one or more subfolders according to the correspondence between the target image and the target parameter, including: and storing the X groups of sub-target images in X sub-folders respectively, wherein X is more than or equal to 1, and X is an integer.
In some embodiments, the second ADB command includes X ADB commands, the X ADB commands corresponding one-to-one to the X sets of exposure parameters. That is, the second device sends one ADB command at a time for indicating a set of exposure parameters.
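Fetching the target images from the first device can use the standard `adb pull` command, with the second storage path (where the camera application saved the images) as the source and the first storage path (the target folder on the second device) as the destination. The paths and serial below are placeholders.

```python
# Illustrative sketch of the second ADB command: copy captured images
# from the first device's storage into the target folder on the second
# device. "adb pull <remote> <local>" is a standard adb command; the
# concrete paths and serial here are placeholders.

def pull_command(serial, remote_path, local_dir):
    """Build the adb invocation that retrieves the target images."""
    return ["adb", "-s", serial, "pull", remote_path, local_dir]

cmd = pull_command("DEVICE123", "/sdcard/DCIM/Camera",
                   "target/exp10000_gain100")
```

Issuing one such command per exposure group (as in the per-group variant above) files each group of sub-target images directly into its named subfolder.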
In a second aspect, the present application provides a photographing system, the photographing system including a first device including a camera application and a second device controlling the first device through ADB debugging; the second device detects an operation for indicating to start shooting and reads target information input by a user in the control application program; and determining X groups of exposure parameters according to the target information; according to the X groups of exposure parameters, X first ADB commands are sequentially sent to first equipment, wherein the ith first ADB command is used for controlling a camera application to acquire an ith group of sub-target images based on the ith group of exposure parameters, i is more than or equal to 1 and less than or equal to X, and i is an integer; the first device captures an ith group of sub-target images based on the ith group of exposure parameters according to the ith first ADB command.
In one possible embodiment, in response to an operation for instructing start of shooting, the second device reads second target information input by the user at the control application; determining a first storage path of the target folder and naming modes of X subfolders according to the second target information and X groups of exposure parameters, wherein the X subfolders are named according to the X groups of exposure parameters respectively; creating a target folder according to the first storage path and the naming mode; the second device obtains a second storage path of the target image from the first device, sends a second ADB command to the first device, wherein the second ADB command comprises the second storage path and the first storage path, the second ADB command is used for obtaining the target image, and the second device obtains the target image from the first device; the first device receives a second ADB command, and the first device sends a target image to the second device; the second device stores the X groups of sub-target images in X sub-folders respectively, wherein X is more than or equal to 1, and X is an integer.
In a third aspect, the present application provides an electronic device comprising one or more processors and one or more memories; wherein the one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method as described in the first aspect and any possible implementation of the first aspect.
In a fourth aspect, embodiments of the present application provide a chip system for application to an electronic device, the chip system comprising one or more processors for invoking computer instructions to cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a fifth aspect, the application provides a computer readable storage medium comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
In a sixth aspect, the application provides a computer program product comprising instructions which, when run on an electronic device, cause the electronic device to perform a method as described in the first aspect and any possible implementation of the first aspect.
It will be appreciated that the electronic device provided in the third aspect, the chip system provided in the fourth aspect, the computer storage medium provided in the fifth aspect, and the computer program product provided in the sixth aspect are all configured to perform the method provided by the present application. Therefore, for the advantages they achieve, reference may be made to the advantages of the corresponding method, which are not repeated here.
Drawings
Fig. 1A is a schematic diagram of an exemplary software system of an electronic device 100 provided in an embodiment of the present application;
Fig. 1B is a schematic diagram of an exemplary software system of an electronic device 200 according to an embodiment of the present application;
Fig. 2 is a schematic flowchart of a photographing method 300 according to an embodiment of the present application;
Fig. 3 is a schematic flowchart of a photographing method 400 according to an embodiment of the present application;
Fig. 4 is a schematic flowchart of a photographing method 500 according to an embodiment of the present application;
Fig. 5A is a schematic diagram illustrating an example of a user interface (UI) of a control application according to an embodiment of the present application;
Fig. 5B is a diagram of another possible UI interface provided by an embodiment of the application;
Fig. 6A illustrates a hardware system suitable for use with the apparatus of the present application;
Fig. 6B shows another hardware system suitable for use with the apparatus of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
As described in the background, to calibrate ISP parameters, a calibration person needs to use an electronic device to capture a large number of images of a fixed graphics card (e.g., a color card, gray-scale card, resolution card, etc.) in a calibration environment. The calibration person places the electronic device and the graphics card in a light box so that the graphics card is within the field of view of the camera of the electronic device, sets the light box brightness, uses the electronic device to take a number of pictures of the graphics card, obtains the pictures stored on the mobile phone, processes the pictures, and adjusts the ISP parameters to optimize image quality according to the image processing results and the actual requirements, thereby completing the calibration of the ISP parameters. Because current electronic devices have many cameras, and each camera has many adjustable parameters, a large number of pictures must be taken for ISP calibration; the whole shooting process is time-consuming, the labor cost is high, and the calibration efficiency is greatly affected.
It can be understood that, at present, if a calibration person needs to collect photos based on different shooting parameters, a set of parameters must be manually configured on the device where the camera application runs, and only after shooting is completed can the next set of parameters be configured and shooting continued. Likewise, if multiple photos need to be taken with one set of parameters, after manually configuring that set of parameters the calibration person must manually tap the shooting control of the electronic device multiple times. This is very tedious, labor-intensive, and inefficient.
The present application provides a photographing method: an electronic device (hereinafter referred to as the electronic device 100 for convenience of description) acquires target parameters for capturing a target image, the target parameters including X groups of exposure parameters, and, according to the target parameters, controls a camera application to continuously capture the target image used for calibrating ISP parameters, where the target image includes X groups of sub-target images corresponding to the X groups of exposure parameters respectively.
That is, after the electronic device obtains the X groups of exposure parameters, it can control the camera application to photograph continuously based on the X groups of exposure parameters, which saves labor cost and can improve the efficiency of calibrating the ISP parameters.
It can be understood that, at present, if a calibration person wants to collect photos based on different sets of parameters (each set including one or more parameters), a set of parameters must be manually configured on the device where the camera application runs, and only after shooting is completed can the next set be configured and shooting continued; and if multiple photos are to be taken with one set of parameters, after manually configuring that set the calibration person must manually tap the shooting control multiple times, which is very tedious and labor-consuming. In this scheme, the electronic device can acquire the X groups of exposure parameters at one time and control the camera application to continuously capture the target images based on the different groups of exposure parameters respectively. In some embodiments, the electronic device detects an operation for instructing to start shooting and controls the camera application to continuously capture the target image. That is, after the calibration person clicks the control for starting shooting, the next processing step can be carried out directly once the photos have been taken, with no need for a photographing process that requires tedious in-person operation. In some embodiments, the number of sub-target images per group may be determined based on the image signal processing parameters.
It can be further understood that the exposure parameters are parameters used by the image sensor at the front end of the ISP to collect image signals. Calibrating the ISP parameters from photos taken with explicitly set exposure parameters facilitates the mutual adaptation of the image sensor and the image signal processor and improves the accuracy of ISP parameter calibration.
Illustratively, in the present application, the target parameters may include, but are not limited to: exposure parameters, target format, target shooting mode, information of target camera device, target zoom magnification, or target chip model. These parameters will be described in detail below.
In some embodiments, the electronic device can also automatically classify and sort the target images acquired by the camera application, so that the target images can be processed later, and the efficiency of calibrating ISP parameters is further improved.
In a first possible implementation of the present application, the electronic device 100 is itself the device on which the camera application runs (that is, the device whose ISP parameters are to be calibrated); in other words, the electronic device 100 controls its own camera application to capture the target image.
In a second possible implementation of the present application, the electronic device 100 is a different device from the electronic device 200; that is to say, the device whose ISP parameters are to be calibrated (the electronic device 100, i.e., the first device mentioned above) is controlled by an external device (the electronic device 200, i.e., the second device mentioned above) to capture the target image.
A schematic diagram of an exemplary software system is presented below in connection with fig. 1A and 1B, respectively, for the two implementations described above.
Fig. 1A is a schematic diagram of an exemplary software system of an electronic device 100 according to an embodiment of the present application. Fig. 1A corresponds to the first implementation described above.
Referring to Fig. 1A, the system architecture of the electronic device 100 may include an application layer 110, an application framework layer 120, a system layer 130, a hardware abstraction layer 140, a driver layer 150, and a hardware layer 160.
The application layer 110 may include the camera application and the control application, as well as applications such as gallery, calendar, phone, maps, navigation, wireless local area network (WLAN), Bluetooth, music, video, and messaging.
In the present application, the calibration person can set the target parameters through the control application, for example by manually entering them on the UI interface of the control application.
The application framework layer 120 provides an application programming interface (application programming interface, API) and programming framework for the application of the application layer; the application framework layer may include some predefined functions.
For example, the application framework layer 120 may include a camera API or the like.
It will be appreciated that one application may obtain or send data by calling the API interface of another application, thereby enabling data interaction between applications.
In the application, the control application program can control the camera application program to acquire the target image through the camera API.
The system layer 130 is responsible for scheduling and management of the Android system.
For example, the system layer 130 includes a storage module and the like. The main task of the storage module is to manage and operate files and directories, including creating, opening, closing, copying, moving, and deleting files and directories.
It will be appreciated that the control application does not create folders directly through the storage module of the system layer. The control application makes a request to the system layer, which will complete the creation of the folder with the aid of the storage module. Specifically, the system layer creates a new directory at the appropriate location and adds path information for the newly created folder to the file path of the system.
In the present application, the storage module may move or copy the images in memory in the hardware layer 160 into a pre-created folder.
The hardware abstraction layer 140 is used to abstract the hardware. For example, the hardware abstraction layer may include a camera hardware abstraction layer and other hardware device abstraction layers.
The driver layer 150 is used to provide drivers for different hardware devices. For example, the drive layer may include a camera drive.
The hardware layer 160 may include camera devices, memory, and other hardware devices.
For example, the camera device may include an image sensor.
Illustratively, in FIG. 1A, dashed arrows represent control flow and solid arrows represent data flow.
Illustratively, the control application opens the camera application through the camera API and controls the camera application to capture the target image based on the target parameters. The camera application then drives the camera device to capture an image through the camera API, the camera hardware abstraction layer, and the camera driver in turn. Specifically, when the target parameters include exposure parameters, the control application controls the camera application to set those exposure parameters on the image sensor, and the camera device shoots based on them. The camera device then stores the captured target image in the memory.
Illustratively, the control application creates and names folders through the storage module. The storage module is capable of moving or copying the target image in memory into a correspondingly named folder.
Fig. 1B is a schematic diagram of an exemplary software system of an electronic device 200 according to an embodiment of the present application. Fig. 1B corresponds to the second implementation described above.
Referring to Fig. 1B, the electronic device 200 controls the electronic device 100 through ADB debugging. It is understood that ADB is a command-line tool for communicating with Android devices. Take the electronic device 200 being a computer and the electronic device 100 being a mobile phone (either may equally be another kind of terminal device) as an example: an ADB driver (also called an ADB client) is installed on the computer, the developer options of the phone are opened, and the USB debugging option is found and enabled, which allows the computer to communicate with the phone over USB. A USB data cable is used to connect the phone to the computer, and it is ensured that the phone is properly connected and recognized by the computer. A command-line window is opened on the computer, and ADB commands are entered in it to communicate with the phone. The computer can send instructions to the phone using ADB commands; the ADB daemon on the phone performs the corresponding operations and returns the results to the ADB client on the computer.
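Illustratively, the computer-side check that the phone is connected and recognized can be sketched with the standard `adb devices` command. The helper names below are assumptions; only the `adb devices` command and its output format (a header line followed by one `serial<TAB>state` line per attached device) are standard ADB behavior.

```python
import subprocess

def parse_adb_devices(raw_output):
    """Parse `adb devices` output into a list of {serial, state} dicts."""
    devices = []
    for line in raw_output.splitlines()[1:]:  # skip "List of devices attached"
        parts = line.split()
        if len(parts) == 2:
            devices.append({"serial": parts[0], "state": parts[1]})
    return devices

def phone_is_ready(raw_output):
    # The phone is usable only in the "device" state; "unauthorized" means
    # the USB debugging prompt on the phone has not been accepted yet.
    return any(d["state"] == "device" for d in parse_adb_devices(raw_output))

def check_connection():
    # Requires the adb command-line tool to be installed on the computer.
    out = subprocess.run(["adb", "devices"], capture_output=True, text=True).stdout
    return phone_is_ready(out)
```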
The system architecture of the electronic device 100 may include an application layer 110, a system layer 130, a hardware abstraction layer 140, a driver layer 150, and a hardware layer 160; an application layer 210, a system layer 230, and a hardware layer 260 may be included in the system architecture of the electronic device 200.
For the application layer 110, the hardware abstraction layer 140, the driver layer 150, and the hardware layer 160, reference may be made to the corresponding descriptions of Fig. 1A, except that the application layer 110 in Fig. 1B does not include the control application, and the system layer 130 does not include the storage module.
The system layer 130 includes an ADB daemon, which is responsible for receiving and executing ADB commands from the electronic device 200. Illustratively, the ADB daemon communicates with other components of the phone's operating system to perform certain operations, such as installing applications, uninstalling applications, or copying files. Through ADB commands, the computer can interact with the phone and perform various operations on it.
Wherein the application layer 210 includes control applications. In the application, the calibration personnel can set the target parameters based on the control application program. For example, the target parameter is manually input at the UI interface of the control application.
The system layer 230 includes an auto-photo thread and a storage module. It will be appreciated that the electronic device 200 is provided with an ADB client corresponding to the ADB daemon in the electronic device 100, and the ADB client may use one or more threads, including the auto-photo thread. It will also be appreciated that the control application does not create folders directly through the storage module of the system layer; it makes a request to the system layer, which completes the creation of the folder with the aid of the storage module. Specifically, the system layer creates a new directory at the appropriate location and adds the path information of the newly created folder to the file path of the system.
Wherein the hardware layer 260 includes a memory.
For example, the control application may provide the target parameters entered on the UI interface to the auto-photo thread, and the auto-photo thread sends an ADB command to the ADB daemon indicating that the target image is to be captured. The ADB daemon controls the camera application to capture the target image according to the ADB command, and the target image is stored in the memory of the hardware layer 160.
Illustratively, the auto-photo thread creates and names the folders according to the target parameters with the aid of the storage module.
Illustratively, the auto-photo thread also sends an ADB command to the ADB daemon instructing it to fetch the target image stored in the memory of the hardware layer 160 and store it in the folder created in advance by the storage module. This ADB command includes, for example, the path at which the target image is stored on the electronic device 100 and the path at which it is to be stored on the electronic device 200. After the auto-photo thread acquires the target image, it stores the target image in the corresponding folder through the storage module.
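Illustratively, this ADB command corresponds to an `adb pull <remote> <local>` invocation. The sketch below shows how the auto-photo thread might assemble and run it; the function names and example paths are assumptions for illustration, not taken from the embodiment.

```python
import subprocess

def build_pull_command(path_on_phone, path_on_computer):
    """Assemble the command that copies the captured target images from
    the storage path on the electronic device 100 (phone) into the folder
    created in advance on the electronic device 200 (computer)."""
    return ["adb", "pull", path_on_phone, path_on_computer]

def pull_target_images(path_on_phone, path_on_computer):
    # Invoked by the auto-photo thread once shooting has completed.
    result = subprocess.run(build_pull_command(path_on_phone, path_on_computer))
    return result.returncode == 0
```

For example, `pull_target_images("/sdcard/DCIM/Camera", "calib_run1/gain1_shutter8")` (both paths hypothetical) would copy the phone-side images into the pre-created subfolder.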
Fig. 2 is a schematic flowchart of a photographing method 300 according to an embodiment of the present application. It is understood that the photographing method 300 is performed by the electronic device 100 (corresponding to the first implementation above) or the electronic device 200 (corresponding to the second implementation above).
Taking the photographing method 300 as an example, which is performed by the electronic apparatus 100, the method 300 includes S301a to S303a. The user interacting with the electronic device 100 through the UI interface may be a calibration person.
S301a, in response to an operation of the user clicking the control for indicating storage of intermediate images, the electronic device 100 determines that, when a photograph (e.g., in JPG format) is taken, the intermediate images (e.g., in RAW or YUV format) that are processed to obtain the photograph need to be stored.
Illustratively, the electronic device 100 detects the user clicking, on the UI interface of the control application, the control for indicating storage of intermediate images. Fig. 5B is a schematic diagram of a possible UI interface according to an embodiment of the present application. Illustratively, the UI interface includes a panel for storing intermediate images (other panels are not shown), and the panel includes a plurality of controls: photographing, photographing registers, photographing + preview, video registers, etc. (other controls are not shown). Upon detecting that the user clicks any of the controls in the store-intermediate-images panel, the electronic device 100 determines that the intermediate images processed to obtain the photograph need to be stored when taking the photograph.
Optionally, S302a, it is determined that the arrangement of the calibration environment has been completed.
Illustratively, the electronic device 100 detects an operation of the user on a control for indicating that the calibration environment has been arranged, and determines that the arrangement is complete, for example upon detecting the user clicking a "complete placement" control on the UI interface.
It can be appreciated that the user places the electronic device and the graphics card in the light box such that the graphics card is within the field of view of the camera of the electronic device, and sets the light box brightness. During shooting, the relative positions of the electronic device and the graphics card are fixed. The brightness of the light box is used to simulate scenes of different brightness (e.g., night, daytime, etc.).
S303a, an operation of clicking a control for instructing start of shooting by the user is detected.
Illustratively, the user clicks on a "start shooting" control on the UI interface of the control application and waits for shooting to be completed.
Taking the photographing method 300 as an example, the method 300 includes S301b to S303b, which are performed by the electronic apparatus 200. The user interacting with the electronic device 200 through the UI interface may be a calibration person.
S301b, in response to the user clicking the control for indicating storage of intermediate images, the electronic device 200 indicates to the electronic device 100 that, when a photograph (e.g., in JPG format) is taken, the intermediate images (e.g., in RAW or YUV format) processed to obtain the photograph need to be stored.
Illustratively, the electronic device 200 detects the user clicking, on the UI interface of the control application, the control for indicating storage of intermediate images. Upon detecting that the user clicks any one of the controls in the panel for storing intermediate images in Fig. 5B, the electronic device 200 determines that the intermediate images processed to obtain the photographs need to be stored when taking the photographs.
Illustratively, the electronic device 200 instructs the electronic device 100 to store the intermediate image via the ADB command.
Optionally, S302b, it is determined that the arrangement of the calibration environment has been completed.
Illustratively, the electronic device 200 detects an operation of the user on a control for indicating that the calibration environment has been arranged, and determines that the arrangement is complete, for example upon detecting the user clicking a "complete placement" control on the UI interface.
It can be appreciated that the user places the electronic device and the graphics card in the light box such that the graphics card is within the field of view of the camera of the electronic device, and sets the light box brightness. During shooting, the relative positions of the electronic device and the graphics card are fixed. The brightness of the light box is used to simulate scenes of different brightness (e.g., night, daytime, etc.).
S303b, an operation of clicking a control for instructing start of shooting by the user is detected.
Illustratively, the user clicks on a "start shooting" control on the UI interface of the control application and waits for shooting to be completed.
Fig. 3 is a schematic flowchart of a photographing method 400 according to an embodiment of the present application.
It is understood that method 400 is performed after method 300. For example, S303a or S303b is used to trigger S401 and S402 below, and S301 may be understood as a precondition for steps 2-a to 2-c and 3 below.
It is appreciated that the method 400 is performed by the electronic device 100 (corresponding to the first implementation above) or the electronic device 200 (corresponding to the second implementation above). The user interacting with the electronic device 100 or the electronic device 200 through the control application may be a calibration person.
S401, acquiring target parameters for acquiring a target image.
Wherein the target image is used for calibrating ISP parameters.
Illustratively, the target parameter is obtained from target information entered by the user on the UI interface of the control application before clicking on the control for instructing start of shooting.
Exemplary, target parameters may include, but are not limited to: exposure parameters, target format, target shooting mode, information of target camera device, target zoom magnification, or target chip model.
Example 1: the target parameters include X groups of exposure parameters, where X ≥ 1 and X is an integer. The X groups of exposure parameters are used to instruct the camera device of the device on which the camera application runs to expose according to the set exposure parameters, rather than according to that device's default exposure parameters. That is, in the present application, the image sensor does not expose using the exposure parameters issued by the automatic exposure module, but exposes according to the exposure parameters among the target parameters.
In the present application, the exposure parameters include at least one of: shutter (or exposure time), exposure gain (gain), sensitivity (ISO).
It can be understood that the exposure parameters are parameters used by the image sensor at the front end of the ISP to collect image signals. Calibrating the ISP parameters from photos taken with explicitly set exposure parameters facilitates the mutual adaptation of the image sensor and the image signal processor and improves the accuracy of ISP parameter calibration.
Where X > 1, different groups among the X groups of exposure parameters are not identical. For example, suppose the exposure parameters include gain and shutter and X = 3. Different values of gain and shutter may be selected while keeping the product of shutter and gain (and hence the brightness of the captured picture) unchanged, such as gain = 1 and shutter = 8A in the 1st group of exposure parameters; gain = 2 and shutter = 4A in the 2nd group; and gain = 4 and shutter = 2A in the 3rd group, where A > 0 and A is an integer.
It will also be appreciated that, when X > 1, the degree of ISP processing required differs between RAW images acquired with different exposure parameters. For example, noise reduction applied to RAW images acquired with different gains uses different noise-reduction strengths: with a large gain, the noise of the RAW image is generally larger, and larger noise-reduction parameters are needed. Therefore, taking a large number of photos across multiple groups of exposure parameters can improve the accuracy of ISP parameter calibration.
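Illustratively, the constant-brightness example above (gain = 1/shutter = 8A, gain = 2/shutter = 4A, gain = 4/shutter = 2A) can be generated programmatically. The sketch below is an assumption for illustration: it doubles the gain from one group to the next while halving the shutter, keeping the product gain × shutter, and hence the picture brightness, fixed.

```python
def constant_brightness_groups(x, a):
    """Return X groups of exposure parameters whose gain*shutter product is
    the constant 8*a, so brightness stays fixed while the noise level (and
    hence the required noise-reduction strength) varies from group to group."""
    groups = []
    for i in range(x):
        gain = 2 ** i            # 1, 2, 4, ...
        shutter = 8 * a // gain  # 8a, 4a, 2a, ...
        groups.append({"gain": gain, "shutter": shutter})
    return groups
```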
Example 2: the target parameters further include a target format of the target image. The target format is determined according to the ISP parameters and includes at least one of: the RAW format, the YUV format, or the JPG format.
Illustratively, the method 400 further includes: detecting the ISP parameter to be calibrated input by the user; and determining the target format according to a preset correspondence. The preset correspondence may include the following examples.
For example, when the ISP parameter to be calibrated is white balance, the target format includes the RAW format.
For example, when the ISP parameter to be calibrated is contrast, the target format includes at least one of the RAW format, the YUV format, or the JPG format.
For example, when the ISP parameter to be calibrated is saturation, the target format includes the JPG format.
For example, when the ISP parameter to be calibrated is sharpness, the target format includes the RAW format and the YUV format.
For example, when the ISP parameter to be calibrated is a noise-reduction parameter, the target format includes the RAW format.
For example, when the ISP parameter to be calibrated is a cutoff frequency, the target format includes the RAW format.
That is, the user only needs to manually input the ISP parameter to be calibrated, and the electronic device automatically determines the format of the image to be acquired and stored; compared with having the user set the target format manually, this further improves the efficiency of ISP parameter calibration.
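Illustratively, the preset correspondence listed in the examples above amounts to a lookup table. The sketch below simply restates that mapping; the dictionary keys, the function name, and the inclusion of all three formats for contrast (following the "at least one of" wording) are assumptions for illustration.

```python
# Preset correspondence between the ISP parameter to be calibrated and the
# target format(s) of the images to capture, as listed in the examples above.
PRESET_CORRESPONDENCE = {
    "white balance": ["RAW"],
    "contrast": ["RAW", "YUV", "JPG"],  # "at least one of" these
    "saturation": ["JPG"],
    "sharpness": ["RAW", "YUV"],
    "noise reduction": ["RAW"],
    "cutoff frequency": ["RAW"],
}

def determine_target_format(isp_parameter):
    """Map the ISP parameter entered by the user to the target format(s),
    so the user does not have to set the target format manually."""
    return PRESET_CORRESPONDENCE[isp_parameter.lower()]
```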
Example 3: the target parameters further include a target shooting mode of the target image. The target shooting mode includes photographing, or photographing during video recording.
For example, the UI interface offers separate options for photographing and for photographing during video recording, so that the user can directly select one of the two. Alternatively, the UI interface offers an option of whether to photograph during video recording: if the electronic device detects that the user selects photographing during video recording, the target shooting mode is determined to be photographing during video recording; if it does not detect that operation, the target shooting mode is determined to be direct photographing.
Illustratively, acquiring the target parameters for acquiring the target image may include: acquiring shooting mode indication information, wherein the shooting mode indication information is used for indicating shooting and/or shooting in video; and determining the target shooting mode as shooting and/or shooting in video according to the shooting mode indication information.
Optionally, when the target shooting mode is photographing during video recording, the target parameters further include the position, in the video-recording interface, of a first control in the camera application, where the first control is used to trigger photographing during video recording.
The position may be, for example, the coordinates of the first control on the video-recording interface. For example, the electronic device may open the developer mode of the device on which the camera application runs, obtain the coordinates of the first control on the video-recording interface, and then fill them in on the UI interface.
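Illustratively, once the coordinates of the first control are known, a tap on it can be injected with the standard `adb shell input tap x y` command. The sketch below builds and runs that command; the helper names and the example coordinates are assumptions for illustration.

```python
import subprocess

def build_tap_command(x, y):
    """Build the ADB command that injects a touch event at (x, y), i.e.
    taps the first control that triggers photographing during video."""
    return ["adb", "shell", "input", "tap", str(x), str(y)]

def tap_first_control(x, y):
    # (x, y) are the coordinates read via developer mode and filled in on
    # the UI interface; e.g. (540, 2180) is purely hypothetical.
    return subprocess.run(build_tap_command(x, y)).returncode == 0
```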
Optionally, the target shooting mode further includes one or more of the following: portrait mode, professional mode, super-macro mode, dual-view video mode, or panoramic mode. The UI interface also includes options for these shooting modes for the user to select.
Example 3-1: acquiring the target parameters for capturing the target image includes: acquiring shooting mode indication information, where the shooting mode indication information is used to indicate a historical shooting mode, the historical shooting mode being a shooting mode used before the camera application continuously captures the target images; and determining, according to the shooting mode indication information, that the target shooting mode is the historical shooting mode.
The historical shooting mode is, for example, the shooting mode used by the camera application after an operation for instructing to start shooting was last detected; for example, the one or more shooting modes selected by the user the last time an operation for instructing to start shooting was detected.
It can be understood that, if the electronic device is to control the camera application to shoot multiple times (for example, controlling the camera application to shoot once each time an operation for instructing to start shooting is detected) and the detected shooting mode indication information indicates the historical shooting mode, then the target shooting modes used in two adjacent shoots are the same. When a large number of photos are taken with the same shooting mode as the target shooting mode, the user is saved from repeatedly selecting the target shooting mode, further improving shooting efficiency.
By way of example, the historical shooting mode is, for example, the shooting mode that the camera application was detected using when the previous picture was taken; and the historical shooting mode differs from the selectable shooting modes provided by the electronic device.
It will be appreciated that if the selectable shooting modes provided by the electronic device do not meet the user's needs, the user may manually select the shooting mode he wants to use, and then instruct the electronic device to use the historical mode, so that the electronic device can take pictures using a shooting mode that is not among those provided. The shooting modes can thus be extended based on the user's manual operation, without changing the original UI interface of the electronic device.
Example 4 where the device in which the camera application is located includes a plurality of camera devices (such as a front camera, a rear camera, etc.), the target parameters also include information of the target camera device for acquiring the target image.
For example, the UI interface has separate front and rear options: if the electronic device detects the operation of the user selecting the front camera, the front camera is determined as the target camera device; if it detects the operation of the user selecting the rear camera, the rear camera is determined as the target camera device. Alternatively, the UI interface has an option for whether to use the front camera: if the electronic device detects that the user selects the front camera, it determines that the target camera device is the front camera; if it does not detect the operation of the user selecting the front camera, it determines that the target camera device is the rear camera.
Optionally, the target camera device information is used to indicate a plurality of camera devices, so that, according to each of the X sets of exposure parameters, the device where the camera application is located is controlled to use different camera devices to respectively shoot the X groups of target images (for example, using the main camera to shoot the X groups of sub-target images and using the front camera to shoot the X groups of sub-target images), thereby further saving the user's time and effort and further improving the efficiency of ISP parameter calibration.
It is appreciated that the image sensors (sensors) of different camera devices are different, and that different sensors generally require different configurations of ISP parameters to achieve better image quality. Therefore, the ISP parameters are required to be calibrated for different cameras respectively, so that the calibration of the ISP parameters of the equipment where the cameras are located is more comprehensive, and the image effect of the photo shot by each camera equipment is better.
Example 5, the target parameters further include a target zoom magnification used by the camera application to capture the target image. For example, a camera application may take a photograph with a variety of zoom magnifications including wide angle, tele, and zoom magnifications (e.g., 1x,5x,10x, etc.). The switching of zoom magnification may be achieved, for example, by clicking a corresponding control in the camera application.
For example, the UI interface may provide options for different zoom magnifications, and the electronic device determines one or more zoom magnifications as the target zoom magnification in response to the operation of the user selecting them. Optionally, in the case where the method is performed by the electronic device 200, the UI interface further includes an input box for the position of the control used to switch to the target zoom magnification; the electronic device 200, in response to the user inputting data in the input box, determines the position of that control on the electronic device 100 according to the data. For another example, the UI interface provides an option for whether to switch the zoom magnification and an input box for the position of the control used to switch to the target zoom magnification. If the electronic device does not detect the operation of the user selecting the option to switch the zoom magnification, the target parameters do not include the target zoom magnification, or the target zoom magnification is 1x; if the electronic device detects the operation of the user selecting the option to switch the zoom magnification, the coordinates of the corresponding control are filled in the input box.
Example 6, the target parameters further include target chip model information indicating a model of a chip included in a device in which the camera application is located.
It will be appreciated that, taking the second implementation mentioned above as an example, the electronic device 200 controls the electronic device 100 (i.e., the device where the camera application is located) through ADB debugging, and the ADB daemon of the electronic device 100 runs on its chip. The electronic device 100 may come from different vendors or be different models, and thus may be configured with different chips. When the electronic device 200 controls electronic devices 100 configured with different chips to acquire images based on the same target parameters (e.g., the same set of exposure parameters), the formats of the ADB commands sent will generally differ. That is, if the same ADB command is sent to electronic devices 100 configured with different chips, some devices will most likely fail to set the corresponding target parameters (e.g., a set of exposure parameters). For example, if a set of exposure parameters is not set successfully, the device where the camera application is located is likely to shoot according to the chip's default exposure parameters.
Therefore, the model of the chip included in the equipment where the camera application is located is obtained, so that the equipment where the camera application is located can be controlled to collect the target image in a corresponding mode according to the model of the chip, and the accuracy of the target image can be improved.
Note that examples 1 to 6 described above may be arbitrarily combined.
S402, controlling the camera application to continuously acquire target images according to the target parameters.
By way of example, continuous acquisition here may be understood as follows: before the camera application finishes acquiring the target images corresponding to the target parameters, the electronic device does not acquire new target parameters, and the calibration personnel do not need to perform operations on the electronic device (in other words, the electronic device does not need to detect user operations). The electronic device continuously sets multiple sets of exposure parameters for the device where the camera application is located and controls the camera application to take pictures; the calibration personnel do not need to perform multiple operations on the electronic device, and the electronic device does not need to detect user operations after acquiring the target parameters.
Corresponding to the first implementation, the electronic device 100 controls its own camera application to acquire the target image according to the target parameter.
Or, corresponding to the second implementation, the camera application of the electronic device 100 is controlled by the electronic device 200 to acquire the target image according to the target parameters via the ADB command.
By way of example, S402 has a number of possible implementations.
Example 1-a, corresponding to example 1: the target image comprises X groups of sub-target images, and the camera application is controlled to acquire the X groups of sub-target images according to the X sets of exposure parameters. For example, after acquiring the X sets of exposure parameters, the camera application is controlled to acquire the corresponding images group by group: the camera application is controlled to acquire the i-th group of sub-target images according to the i-th set of exposure parameters, where i is greater than or equal to 1 and less than or equal to X, and i is an integer. The X sets of exposure parameters correspond one-to-one to the X groups of sub-target images. Each of the X groups of sub-target images includes one or more images. Optionally, in the present application, the number of images in each group of sub-target images may be determined according to the shooting mode (e.g., video recording or photographing).
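The group-by-group acquisition in example 1-a can be sketched as a loop that emits, for each set of exposure parameters, the commands needed to configure and trigger one capture. This is an illustrative sketch only: the `setprop` property names are hypothetical placeholders (real devices use vendor-specific properties, as example 6 discusses), while `input keyevent KEYCODE_CAMERA` is a standard Android shell command.

```python
def build_capture_commands(exposure_params):
    """Build one ADB command sequence per (gain, shutter) group, in order."""
    commands = []
    for gain, shutter in exposure_params:
        commands.append([
            # Hypothetical property names; real devices need vendor-specific
            # properties chosen according to the chip model (example 6).
            f"adb shell setprop vendor.camera.fixed_gain {gain}",
            f"adb shell setprop vendor.camera.fixed_shutter {shutter}",
            "adb shell input keyevent KEYCODE_CAMERA",  # trigger one capture
        ])
    return commands

# Two of the X sets of exposure parameters, as (gain, shutter) pairs.
cmds = build_capture_commands([(1.0, 300000000), (2.0, 150000000)])
```

Each inner list corresponds to one group of sub-target images; the i-th group is captured only after the commands for group i-1 have completed.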
Example 2-a, corresponding to example 2, the target image includes an image in a target format.
It can be understood that the accuracy of the calibration of the ISP parameters can be improved by shooting images in corresponding formats according to different ISP parameters.
Alternatively, where example 2-a is combined with example 1, each group of sub-target images is an image in the target format. For example, in the case where the ISP parameter to be calibrated is sharpness, the i-th group of sub-target images includes images in RAW format and YUV format.
Example 3-a, corresponding to example 3: when the target shooting mode is in-video shooting, the position, in the video interface, of a first control in the camera application used to trigger in-video shooting is obtained; according to the target shooting mode, the device where the camera application is located is controlled to click the first position (i.e., trigger the first control) during video recording to acquire the target image.
In an exemplary case where the target shooting mode includes at least two shooting modes (for example, shooting and in-video shooting), it is possible to control the device where the camera application is located to respectively shoot the X-group sub-target image using different shooting modes (for example, shooting the X-group sub-target image using the shooting mode and shooting the X-group sub-target image using the in-video shooting mode) according to each of the X-group exposure parameters, so that time and effort of a user are further saved, and efficiency of ISP parameter calibration is further improved.
It can be appreciated that in general, the size of the recorded photo is smaller than that of the directly photographed photo, and the calibration is performed on the photos with different sizes obtained by the camera, so that the calibration on the ISP parameters of the equipment where the camera is located can be more comprehensive.
Example 4-a, corresponding to example 4, controls the camera application to capture a target image by the target camera device according to the target camera device information.
It can be understood that the calibration of the ISP parameters is performed on the photos taken by a plurality of camera devices (such as a main camera and a front camera) of the equipment where the camera is located, so that the calibration of the ISP parameters of the equipment where the camera is located can be more comprehensive.
Example 5-a, corresponding to example 5, obtaining a second position in the camera interface of a second control in the camera application for switching to the target zoom magnification; and according to the target zoom magnification, controlling the equipment where the camera application is located to click the second position so as to acquire the target image.
The camera interface can be interfaces such as video recording, portrait, photographing and the like.
It can be understood that the calibration of the ISP parameters is performed on the photos acquired by using different zoom magnifications (such as ultra-wide angle and long focus) of the camera, so that the calibration of the ISP parameters of the equipment where the camera is located is more comprehensive.
In some embodiments, the target zoom magnification includes a plurality of sub-target zoom magnifications, so that, according to each of the X sets of exposure parameters, the device where the camera application is located is controlled to use different zoom magnifications to respectively shoot X groups of target images of different sizes (for example, using zoom magnification #1 to shoot the X groups of sub-target images and using zoom magnification #2 to shoot the X groups of sub-target images), thereby further saving the user's time and effort and further improving the efficiency of ISP parameter calibration.
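Switching the zoom magnification by clicking the control at its configured position, as example 5-a describes, can be sketched with the standard `adb shell input tap` gesture-injection command. The coordinates below are hypothetical; in practice they come from the coordinate input boxes on the UI interface.

```python
def tap_command(x, y):
    # "input tap" is the standard Android shell touch-injection command.
    return f"adb shell input tap {x} {y}"

def switch_zoom(zoom_positions, target):
    """Return the tap command for the control of the target zoom magnification."""
    x, y = zoom_positions[target]
    return tap_command(x, y)

# Hypothetical control coordinates, one entry per sub-target zoom magnification.
positions = {"1x": (540, 1650), "5x": (700, 1650)}
cmd = switch_zoom(positions, "5x")
```

For multiple sub-target zoom magnifications, the same set of exposure parameters would be captured once per entry in `positions`.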
Example 6-a, determining control information adapted to the chip according to a target chip model, the control information being used to control the camera application to capture a target image; and sending control information to the device where the camera application is located.
For example, corresponding to the first implementation, the control information may be information communicated between applications adapted to the chip; or corresponding to the second implementation, the control information may be ADB debug information adapted to the chip.
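Selecting control information adapted to the chip model can be sketched as a lookup table mapping each model to its command format. The formats below are invented placeholders standing in for the vendor-specific ADB command forms the text describes; they are not real vendor interfaces.

```python
# Hypothetical per-chip command formats; real formats are vendor-specific.
CHIP_COMMAND_FORMATS = {
    "#1": "adb shell setprop persist.vendor.cam.gain {gain}",
    "#2": "adb shell setprop vendor.camera.manual_gain {gain}",
    "#3": "adb shell cmd camera set-gain {gain}",
}

def control_info(chip_model, gain):
    """Look up the command format adapted to the chip and fill in the gain."""
    fmt = CHIP_COMMAND_FORMATS.get(chip_model)
    if fmt is None:
        # An unknown model would otherwise silently fall back to chip defaults.
        raise ValueError(f"no command format known for chip {chip_model}")
    return fmt.format(gain=gain)
```

Failing loudly on an unknown model reflects the concern in example 6: sending an unadapted command would likely leave the device on its default exposure parameters without anyone noticing.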
According to the scheme, the model of the chip included in the equipment where the camera application is located is obtained, so that the equipment where the camera application is located can be controlled to collect the target image in a corresponding mode according to the model of the chip, and the accuracy of the target image can be improved.
It should be noted that examples 1-a to 6-a described above may be arbitrarily combined.
According to the embodiment of the application, the electronic equipment controls the camera application to shoot a large number of pictures according to the target parameters, so that the efficiency of calibrating ISP parameters can be improved.
Optionally, the method 400 further comprises: step 1, storing the target image into a pre-created folder. For example, step 1 may include S403 to S405.
S403, creating a target folder.
The target folder comprises one or more subfolders, and the subfolders are named according to the target parameters.
Or S403, selecting the folder that has been created as the target folder. Or S403, selecting a storage path for one or more subfolders.
Example 1-B, in combination with example 1, the target folder includes X subfolders, the X subfolders being named according to the X sets of exposure parameters, respectively.
For example, in the case where the exposure parameters include gain and shutter, the i-th subfolder is named with the value of gain and/or shutter of the i-th set of exposure parameters, or with the value of ISO (the value of ISO equals the value of gain multiplied by 100).
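The naming rule above, including the ISO equivalence (ISO = gain × 100), can be sketched as a small helper; the exact name format is an assumption for illustration.

```python
def subfolder_name(gain, shutter=None, use_iso=False):
    """Name the i-th subfolder by its gain/shutter values, or by the
    equivalent ISO value (ISO = gain * 100)."""
    if use_iso:
        return f"iso{int(gain * 100)}"
    if shutter is not None:
        return f"gain{gain}_shutter{shutter}"
    return f"gain{gain}"
```

So a group with gain=4.0 becomes either `gain4.0_shutter…` or `iso400`, matching whichever naming mode the user selected on the UI interface.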
Optionally, in the case of combining example 1-B with example 2, where the target format includes Z image formats, the target folder includes X subfolders, each subfolder is named according to the exposure parameter, each subfolder includes Z secondary subfolders, each secondary subfolder is named according to the image format, Z is greater than or equal to 1, and Z is an integer. For example, in the case where the ISP parameters to be calibrated are sharpness, the ith group of sub-target images includes images in RAW format and YUV format, and the ith sub-folder includes 2 secondary sub-folders, where one secondary sub-folder is named by RAW and the other secondary sub-folder is named by YUV.
Optionally, in the case of combining example 1-B with example 4, for example where the target image is captured using Y camera devices, the target folder includes Y subfolders, each subfolder is named according to the identifier of a camera device, each subfolder includes X secondary subfolders, and each secondary subfolder is named according to the exposure parameters, where Y is greater than or equal to 1 and Y is an integer.
Optionally, in the case of combining example 1-B with example 3, for example where the target image is captured using Y shooting modes, the target folder includes Y subfolders, each subfolder is named according to a shooting mode, each subfolder includes X secondary subfolders, and each secondary subfolder is named according to the exposure parameters, where Y is greater than or equal to 1 and Y is an integer.
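Creating a nested target-folder layout of this kind (one subfolder per group, one secondary subfolder per image format) takes only a few lines with `os.makedirs`. The group and format names below are examples.

```python
import os
import tempfile

def create_target_folder(root, group_names, format_names):
    """Create one subfolder per exposure group, each containing one
    secondary subfolder per image format."""
    for group in group_names:
        for fmt in format_names:
            os.makedirs(os.path.join(root, group, fmt), exist_ok=True)

# Demo: X=2 exposure groups, Z=2 image formats, under a temporary root.
root = tempfile.mkdtemp()
create_target_folder(root, ["gain1.0", "gain2.0"], ["RAW", "YUV"])
```

`exist_ok=True` makes the call idempotent, so S403 can safely run before, after, or concurrently with S402.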
It should be noted that the embodiment of the present application does not require S403 to be executed after S402. For example, S403 may be triggered simultaneously with S402 (e.g., by S304), or S403 may precede S401.
S404, acquiring a target image from the device where the camera application is located.
Specific implementations may be found in the associated descriptions in fig. 1A or fig. 1B.
S405, storing the target image in one or more subfolders according to the corresponding relation between the target image and the target parameters.
For example, corresponding to example 1-B, X sets of sub-target images acquired according to the X sets of exposure parameters are stored in the X sub-folders, respectively.
According to the scheme, the collected target images are sorted based on the target parameters, so that the efficiency of ISP parameter calibration can be further improved.
Example 1-C, in combination with example 1, example 1-a, and example 1-B, includes the steps of:
Step 2-a, as an implementation of S402: after acquiring the X sets of exposure parameters, the camera application is controlled to capture the i-th group of sub-target images according to the i-th set of exposure parameters each time.
Step 2-b, as an implementation of step S404, acquires the ith group of sub-target images from the device in which the camera application is located.
Step 2-c, as an implementation manner of step S405, stores the ith group of sub-target images acquired according to the ith group of exposure parameters in the ith sub-folder.
Alternatively, in the case where the i-th subfolder includes one or more secondary subfolders named by image format, the i-th group of sub-target images are stored in one or more secondary subfolders in the i-th subfolder, respectively, according to the format of the i-th group of sub-target images.
Subsequently, steps 2-a to 2-c are repeatedly performed for the (i+1)-th set of exposure parameters.
In the scheme, when a group of sub-target images are shot, the group of sub-target images shot recently are stored in the corresponding folders, and compared with the mode that all the target images are shot and classified and arranged into files, the difficulty in classifying the images can be reduced, the accuracy of classification and arrangement is improved, and therefore the accuracy of ISP parameter calibration is improved.
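Step 2-c, sorting the most recently captured group into per-format secondary subfolders before the next group is shot, can be sketched as follows. The file extensions and the extension-to-directory mapping are assumptions for illustration.

```python
import os
import shutil
import tempfile

def store_group(images, subfolder):
    """Move one pulled group of images into per-format secondary
    subfolders (e.g. .raw files under RAW/, .yuv files under YUV/)."""
    format_dirs = {".raw": "RAW", ".yuv": "YUV", ".jpg": "JPG"}
    for path in images:
        ext = os.path.splitext(path)[1].lower()
        dest = os.path.join(subfolder, format_dirs.get(ext, "OTHER"))
        os.makedirs(dest, exist_ok=True)
        shutil.move(path, dest)

# Demo with empty placeholder files standing in for photos pulled from the device.
work = tempfile.mkdtemp()
images = []
for name in ("img1.raw", "img1.yuv", "img1.jpg"):
    path = os.path.join(work, name)
    open(path, "w").close()
    images.append(path)
store_group(images, os.path.join(work, "gain1.0"))
```

Because only one freshly captured group is on hand at a time, the move cannot mix up images from different exposure groups, which is exactly the advantage the paragraph above claims over sorting everything at the end.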
Optionally, in combination with example 1 (and/or example 1-a), before S402, the method 400 further comprises: step 3, controlling the camera application to capture an image according to one set of exposure parameters, judging whether the capture is successful, and executing S402 if the capture is successful. Step 3 may specifically include the following steps:
and step 3-a, controlling the camera application to acquire a corresponding group of sub-target images in the X groups of sub-target images according to one group of exposure parameters in the X groups of exposure parameters.
Step 3-b, acquiring the set of sub-target images from the device in which the camera application is located.
Step 3-c, evaluating whether the exposure parameters of the group of sub-target images match the group of exposure parameters.
If they do not match, it indicates that the exposure parameters were not set successfully, and the user may be prompted to check whether the input target parameters are accurate. For example, the user is prompted to check whether the exposure parameters failed to be set because the chip model was not selected correctly. If they match, it indicates that the exposure parameters were set successfully, and the user may be prompted, whereupon S402 is executed.
According to this scheme, a small batch of photos is taken first to determine whether the target parameters can be successfully set for the camera application. This reduces the situation where a large number of photos that do not conform to the target parameters are taken without being noticed, which would affect the accuracy of the ISP parameter calibration; it also reduces the need to repeat the ISP parameter calibration because such photos were taken, improving the efficiency of ISP parameter calibration.
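The match check of step 3-c can be sketched as a tolerance comparison between the requested exposure parameters and the values read back from the trial image (how the actual values are obtained, e.g. from image metadata, is left out of this sketch; the 5% tolerance is an assumption).

```python
def exposure_matches(requested, actual, rel_tol=0.05):
    """Check whether the exposure values read back from a trial image
    match the requested (gain, shutter) within a relative tolerance."""
    req_gain, req_shutter = requested
    act_gain, act_shutter = actual
    return (abs(act_gain - req_gain) <= rel_tol * req_gain
            and abs(act_shutter - req_shutter) <= rel_tol * req_shutter)
```

A clear mismatch (e.g. the device fell back to its chip-default exposure) fails the check, prompting the user to re-verify the target parameters before the full batch is shot.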
Fig. 4 is a schematic flowchart of a photographing method 500 according to an embodiment of the present application. It is appreciated that the method 500 is a specific example of the method 400 and is illustrated as being performed by the electronic device 200 (corresponding to the second implementation above). More specifically, the method 500 is implemented by an auto-photo thread of the electronic device 200. The user interacting with the electronic device 200 through the control application may be a calibration person.
As a possible implementation manner of S401, S501, a setting value of the auto-photographing interface is acquired.
Illustratively, the electronic device 200 includes a control application, the auto-photographing interface is an interface of the control application, the electronic device 200 detects an operation for indicating to start photographing, the control application reads target information input by a user at the UI interface, and the auto-photographing thread determines target parameters according to the target information.
Fig. 5A is a schematic diagram illustrating an example of a UI interface for controlling an application according to an embodiment of the present application.
As shown in fig. 5A, the UI interface name is automatic photographing, the left side of the UI interface includes a photographing group number setting panel, the right side also includes a selection box, an input box or a related control of other target parameters, and the lower right corner includes a preview box of an automatic photographing thread. It is understood that the auto-shoot thread is one execution stream of the control application.
The shooting group number setting panel includes input boxes for setting the group number, gain and shutter. Alternatively, gain and shutter here may be replaced with one or more of gain, shutter and ISO. For example, if 8 is input in the input box of the group number and the confirm control on the right side is clicked, 8 input boxes are displayed below gain and shutter, respectively, and 8 groups of data can be filled in.
In one possible implementation, the brightness of the images corresponding to the default 8 sets of exposure parameters of the auto-photographing thread is consistent. After filling in one group of gain and shutter, the user only needs to fill in one parameter of each of the other 7 groups of exposure parameters and click the confirm control below the shooting group number setting panel, and the control application automatically fills in the unfilled values of the other 7 groups. Accordingly, the electronic device 200 detects that the user inputs one group of gain and shutter, one parameter of each of the other 7 groups, and clicks the confirm control; it then determines the unfilled values of the other 7 groups according to that group of gain and shutter and displays the corresponding values in the corresponding input boxes. For example, as shown in fig. 5A, it is detected that the user inputs gain=1.00 and shutter=300000000 in the 1st group of exposure parameters and inputs the values of gain in the 2nd to 8th groups; after the confirm control is clicked, the other 7 groups of exposure parameters are respectively: in the 2nd group, gain=2.00, shutter=150000000; in the 3rd group, gain=4.00, shutter=75000000; in the 4th group, gain=8.00, shutter=37500000; in the 5th group, gain=16.00, shutter=18750000; in the 6th group, gain=32.00, shutter=9375000; in the 7th group, gain=64.00, shutter=4687500; in the 8th group, gain=128.00, shutter=2343750.
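The automatic filling above amounts to holding gain × shutter constant, which keeps the image brightness of all groups consistent. A minimal sketch:

```python
def fill_exposure_groups(base_gain, base_shutter, gains):
    """Keep image brightness constant across groups by holding
    gain * shutter fixed: shutter_i = base_gain * base_shutter / gain_i."""
    product = base_gain * base_shutter
    return [(g, int(product / g)) for g in gains]

# The 8 groups from the example: group 1 fully specified, only gain elsewhere.
groups = fill_exposure_groups(1.0, 300_000_000,
                              [1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 128.0])
```

Doubling the gain halves the computed shutter time, so every group exposes the scene identically while sweeping the sensor gain.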
The options #1, #2 and #3 in the chip panel represent different chip models and are used to select the target chip model. For example, as shown in fig. 5A, the auto-photographing thread detects the operation of the user selecting #2 on the UI interface and determines that the chip model of the electronic device 100 is #2. In the subsequent steps, the ADB commands sent to the electronic device 100 need to be set in a command form adapted to #2 according to the chip model of the electronic device 100.
If the electronic device 200 detects that the user selects the control of the first group of fixed exposures, step 3-a is executed.
The kill-camera-process duration is used to set the duration taken to kill the camera process of the camera application between every two groups of photos (for example, the pause between the i-th execution of steps 2-a to 2-c and the (i+1)-th execution of steps 2-a to 2-c).
For example, each time the camera application completes taking a group of photos according to one set of exposure parameters, the auto-photographing thread kills the camera application's background process and then instructs the camera application to take photos according to the next set of exposure parameters. The kill-camera-process duration may be determined based on the capability of the handset's chip to kill processes.
For example, as shown in fig. 5A, when the electronic device 200 detects that the user inputs 8s in the input box of the duration of the camera killing process and determines that the duration of the camera killing process in the target parameters is 8s, in a subsequent step, the electronic device 100 is instructed to take a picture according to the next set of exposure parameters at intervals of 8s after the camera killing process.
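The kill-and-wait step between two groups can be sketched as follows; the camera package name is a placeholder, and the actual `subprocess` call is commented out because it needs a connected device. `am force-stop` is the standard Android command for killing an app's process.

```python
import time

def kill_camera_and_wait(package="com.example.camera", wait_s=8):
    """Force-stop the camera app between two exposure groups, then pause
    for the configured kill-process duration before the next group."""
    cmd = f"adb shell am force-stop {package}"
    # subprocess.run(cmd.split(), check=True)  # requires a connected device
    time.sleep(wait_s)
    return cmd

cmd = kill_camera_and_wait(wait_s=0)  # wait_s=0 only for this demo
```

In the flow of method 500, this runs after each group is pulled and stored, with `wait_s` set to the duration entered on the UI interface (8 s in fig. 5A).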
And each group of shooting numbers is used for setting the number of photos shot by each group of exposure parameters. It will be appreciated that different ISP parameters require different numbers of pictures to be taken. Optionally, on the basis of the UI interface described in fig. 5A, different numbers of photos may be set for different image formats, so as to meet the requirement of calibrating ISP parameters. For example, as shown in fig. 5A, the operation of the user to input 3 in the input box of the number of shots per group is detected, the number of shots per group in the target parameter is determined to be 3, and in the subsequent step, the control electronic device takes 3 pictures for each group of exposure parameters.
The "whether to switch the zoom magnification" option and the zoom coordinates (x, y) are used to set the target zoom magnification. For example, in response to the user checking "yes" for whether to switch the zoom magnification, the auto-photographing thread determines to adjust the zoom magnification. Optionally, the electronic device 200 detects the coordinates input by the user in the input box of the zoom coordinates (x, y) and determines, according to the coordinates, the position of the control for controlling the target zoom magnification. Thus, in a subsequent step, the electronic device 100 is controlled to click the corresponding position to adjust the zoom magnification.
If the operation of the user checking the option of whether to switch the zoom magnification is not detected, the auto-photographing thread determines not to adjust the zoom magnification, i.e., the default zoom magnification of 1x is used; the target zoom magnification can be understood to be 1x.
Alternatively, the UI interface may be provided with input boxes of coordinates of a plurality of zoom magnifications, so that the user inputs the coordinates of a plurality of controls of zoom magnifications. Thus, the electronic device 200 detects coordinates of controls of a plurality of zoom magnifications, thereby controlling the electronic device 100 to take photos based on the plurality of zoom magnifications, respectively, for a set of exposure parameters. Therefore, more images can be provided for ISP parameter calibration, so that ISP calibration is more comprehensive.
Generally, the zoom coordinates corresponding to the same zoom magnification in the photographing mode and the video recording mode of the electronic device 100 are different, so that the electronic device 200 can control the electronic device 100 to adjust the zoom magnification more accurately by specifically setting the coordinate position of the zoom magnification.
The "whether to record video" option and the coordinates (x, y) of the in-recording shot button are used to set the target shooting mode. For example, if the auto-photographing thread does not detect the operation of the user selecting video recording, the target shooting mode is determined to be photographing. If the electronic device 200 detects that the user selects video recording, it determines that the target shooting mode is video recording; optionally, the electronic device 200 detects the coordinates the user inputs in the input box for the in-recording shot button coordinates, and determines, according to those coordinates, the position of the in-recording shot button on the recording interface of the electronic device 100.
The input box of the Raw folder position is used for setting a storage path of the target folder in the automatic photographing thread. For example, as shown in fig. 5A, the electronic apparatus 200 detects a storage path input by the user in the input box of the Raw folder position, and determines that the storage path of the target folder is a folder named test1 of the D disk.
Wherein the content is stored for selecting a target image format. For example, as shown in fig. 5A, the electronic device 200 detects an operation of selecting a RAW format, a YUV format, and a JPG format by a user, and determines that the target image format is the RAW format, the YUV format, and the JPG format.
The file naming method is used for selecting the naming method of the subfolders in the target folder. For example, as shown in fig. 5A, the electronic device 200 detects that the user selects to name 8 subfolders with the value of gain, respectively.
If direct shooting without fixed exposure is selected, the gain and shutter input boxes in the left shooting group number setting panel are all disabled (the group number is optionally disabled as well). For example, the electronic device 200 detects the operation of the user selecting direct shooting without fixed exposure, and determines not to acquire the gain and shutter values from the shooting group number setting panel (optionally, the group number defaults to 1). During shooting, the electronic device 100 uses the default exposure parameters of the chip.
Alternatively, in the case where the group number is disabled, the auto-photographing thread controls the electronic device 100 to take 1 group of photos; if it detects the user selecting gain as the folder naming mode, it names a subfolder with the default gain, and the subfolder includes 8 groups of photos (in the case where the group number is not disabled).
If the auto-photographing thread detects the operation of the user selecting the option of following the mobile phone's last-used mode, the auto-photographing thread may execute the scheme of example 3-1.
If the electronic device 200 detects that the user clicks the control, the auto-photographing thread starts to execute the subsequent steps, and the user simply waits. Specifically, after detecting the operation of the user clicking the control, the control application acquires the target information on the UI interface and provides it to the auto-photographing thread on the computer, and the auto-photographing thread determines the target parameters according to the target information. The preview box in the lower right corner of the UI interface then starts displaying the photographing progress for the photos of the 8 sets of exposure parameters. Illustratively, the preview box in fig. 5A displays that photos have been taken and stored according to the 8 sets of exposure parameters. Entry #1 displays the process of photographing and storing the corresponding photos according to the 5th group of exposure parameters; entry #2 displays the process of photographing and storing the corresponding photos according to the 6th group of exposure parameters; entry #3 displays the process of photographing and storing the corresponding photos according to the 7th group of exposure parameters.
The control for terminating shooting is used to stop shooting at any time during the shooting process of the electronic device 100. For example, when the user finds that the shooting process has failed, the user may click the control for terminating shooting. During the shooting process of the electronic device 100, if the electronic device 200 detects that the user clicks this control, it sends an ADB command to the electronic device 100 to terminate shooting.
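The terminate path above can be sketched as follows. This is a minimal sketch under stated assumptions: the camera-app package name and the use of `am force-stop` are hypothetical, since the description only states that an ADB command is sent to terminate shooting.

```python
import subprocess

# Hypothetical camera-app package name; the description does not name one.
CAMERA_PACKAGE = "com.example.camera"

def terminate_command(serial):
    """Build an illustrative ADB command that force-stops the camera app
    on the device identified by `serial` (electronic device 100)."""
    return ["adb", "-s", serial, "shell", "am", "force-stop", CAMERA_PACKAGE]

def terminate_shooting(serial, run=subprocess.run):
    # In a real setup this dispatches the command to electronic device 100;
    # `run` is injectable so the logic can be exercised without a device.
    return run(terminate_command(serial))
```

The command is built separately from its dispatch so the control logic can be verified without an attached device.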
One possible implementation of S402 includes S502 to S505.
S502, the electronic device 200 indicates an exposure parameter to the electronic device 100.
For example, the electronic device 200 sequentially sends X first Android Debug Bridge (ADB) commands to the first device according to the X sets of exposure parameters, where the i-th first ADB command is used to control the camera application to collect the i-th group of sub-target images based on the i-th set of exposure parameters, where 1 ≤ i ≤ X and i is an integer.
Illustratively, S502 is performed by the auto photographing thread through the ADB command according to the i-th group exposure parameter among the exposure parameters set by the UI interface. It is understood that the auto-shoot thread is one of a plurality of threads included in the ADB client of the computer.
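The per-set command construction in S502 can be sketched as follows. The broadcast action and extra names are hypothetical illustrations: the description specifies only that the i-th first ADB command carries the i-th set of exposure parameters.

```python
def build_capture_commands(exposure_sets, serial="emulator-5554"):
    """Build one illustrative ADB broadcast command per exposure-parameter
    set; the i-th command (1-based) carries the i-th set's parameters."""
    cmds = []
    for i, params in enumerate(exposure_sets, start=1):
        cmds.append([
            "adb", "-s", serial, "shell", "am", "broadcast",
            "-a", "com.example.camera.CAPTURE",   # hypothetical action
            "--ei", "group", str(i),
            "--ef", "gain", str(params["gain"]),
            "--el", "exposure_us", str(params["exposure_us"]),
        ])
    return cmds

# 8 doubling gains as in FIG. 5A: 1.0, 2.0, ..., 128.0
gains = [2.0 ** k for k in range(8)]
exposure_sets = [{"gain": g, "exposure_us": 10000} for g in gains]
commands = build_capture_commands(exposure_sets)
```

The auto-photographing thread would then dispatch these commands one at a time, waiting for each group's photos before sending the next.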
Optionally, if the electronic device 200 detects that the user clicks on the "start shooting" control, the ADB command is sent to the electronic device 100 through the auto-shooting thread, so as to instruct the electronic device 100 to open the camera application.
S503, the electronic device 200 controls the electronic device 100 to turn on the target shooting mode and set the target zoom magnification according to the target parameter.
Illustratively, according to the UI interface shown in FIG. 5A, the electronic device 200 controls the electronic device 100 to select the rear camera without switching the zoom magnification (i.e., using the default zoom magnification).
S503 may further include: the electronic device 200 determines other target parameters (some or all of the target parameters mentioned in step S501 except the target shooting mode and the target zoom magnification) from other target information set on the UI interface.
S504, the electronic device 200 controls the electronic device 100 to take a picture according to the i-th set of exposure parameters.
S504 may further include: the electronic device 200 controls the electronic device 100 to take a photograph according to other target parameters (some or all of the target parameters mentioned in step S501 except the exposure parameters).
Illustratively, according to the UI interface shown in FIG. 5A, the electronic device 200 controls the electronic device 100 to take 3 photos per group.
It is understood that the electronic device 200 transmits an ADB command to the electronic device 100 to realize control of the electronic device 100. The subsequent taking of the picture by the electronic device 100 is also performed automatically without the need for parameter setting by the user.
S505, the image stored in the electronic device 100 is moved or copied to the designated named folder.
Illustratively, S505 includes steps 4-a through 4-c.
As one implementation of S403, step 4-a, in response to an operation for instructing start of shooting, the second device reads second target information provided by the user; determining a first storage path of the target folder and naming modes of X subfolders according to the second target information, wherein the X subfolders are named according to X groups of exposure parameters respectively; and creating a target folder according to the first storage path and the naming mode.
For example, the electronic device 200 obtains the second target information D:/test1 from the input box of the RAW folder on the UI interface, thereby determining that the first storage path of the target folder is the folder named test1 on the D drive. For example, assuming that the 8 subfolders are named with the gain values of the 8 sets of exposure parameters, the 8 subfolders are named 1.00, 2.00, 4.00, 8.00, 16.00, 32.00, 64.00, and 128.00, respectively.
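The naming scheme above can be sketched as follows; formatting the gains to two decimals reproduces the folder names in the example, and the path-joining helper is illustrative.

```python
def subfolder_names(gains):
    """Name one subfolder per exposure set after its gain value,
    formatted with two decimals as in the example (1.00 ... 128.00)."""
    return [f"{g:.2f}" for g in gains]

def subfolder_paths(first_storage_path, gains):
    # first_storage_path comes from the RAW-folder input box, e.g. "D:/test1".
    return [f"{first_storage_path}/{name}" for name in subfolder_names(gains)]

gains = [2.0 ** k for k in range(8)]   # doubling gains: 1.00 ... 128.00
names = subfolder_names(gains)
```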
As an implementation of S404, in step 4-b, the electronic device 200 obtains the target image from the electronic device 100 on a second storage path of the electronic device 100, and sends a second ADB command to the electronic device 100, where the second ADB command includes the second storage path and the first storage path, and the second ADB command is used to obtain the target image. Accordingly, the electronic device 100 transmits the target image to the electronic device 200.
Optionally, the second ADB command includes X ADB commands, and the X ADB commands are in one-to-one correspondence with the X sets of exposure parameters. That is, the electronic device 200 sends one ADB command at a time, each corresponding to one set of exposure parameters.
As one implementation manner of S405, in step 4-c, X groups of sub-target images are respectively stored in X sub-folders, wherein X is more than or equal to 1, and X is an integer.
Illustratively, each time the electronic device 100 finishes taking photos for a set of exposure parameters, the image list includes 9 images (taking 3 photos per set of exposure parameters as an example), i.e., 3 images each in the RAW, YUV, and JPG formats. The 9 images are moved or copied into the subfolder named after the gain value of the corresponding set of exposure parameters (taking naming by gain as an example), with the storage path D:/test1.
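The sorting step can be sketched as pure path bookkeeping; an actual implementation would pull the files over ADB and move or copy them on disk, which is omitted here.

```python
def sort_images_into_subfolder(image_list, gain):
    """Map the images produced for one exposure set to paths under the
    gain-named subfolder; returns (subfolder_name, destination_paths).

    Counts follow the example: 3 photos per set, each saved as RAW,
    YUV, and JPG, i.e. 9 files per set."""
    subfolder = f"{gain:.2f}"
    return subfolder, [f"{subfolder}/{name}" for name in image_list]

# Hypothetical file names: 3 photos, each in 3 formats.
images = [f"IMG_{n}.{ext}" for n in range(3) for ext in ("raw", "yuv", "jpg")]
sub, moved = sort_images_into_subfolder(images, 8.0)
```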
S506, judging whether the acquisition of the target image is completed.
Illustratively, the auto-photographing thread checks whether all photos corresponding to the 8 configured sets of exposure parameters have been taken. If the check result is yes, shooting is complete, the finished target folder is popped up, and the thread ends; if the check result is no, the next (i+1-th) group of images continues to be acquired, that is, execution returns to S502, and the next (i+1-th) set of exposure parameters is indicated to the electronic device 100. It can be appreciated that when the electronic device 200 indicates the next set of exposure parameters to the electronic device 100, the user does not need to set parameters on the UI interface.
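The S502–S506 loop can be sketched as follows; `capture_group` stands in for the per-group steps S502 to S505, which request one set of exposure parameters and collect its photos.

```python
def acquisition_loop(total_groups, capture_group):
    """Sketch of S502-S506: request each group in turn, then check
    completion; while not finished, return to S502 for the next group."""
    completed = []
    i = 1
    while i <= total_groups:              # S506: not finished -> back to S502
        completed.append(capture_group(i))  # S502-S505 for the i-th group
        i += 1
    return completed   # all groups done: pop up the folder, end the thread

done = acquisition_loop(8, lambda i: f"group-{i}")
```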
That is, after detecting the operation of the user clicking the control for starting shooting, the electronic device 200 controls the electronic device 100 to shoot according to the obtained target parameters, and does not read information from the UI interface again until the configured photos have been taken (for example, if the target parameters include 8 sets of exposure parameters, 3 image formats, and 3 photos per set, the electronic device 100 needs to take 24 photos and store 72 images).
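The photo and file counts in the example follow from simple multiplication; a sketch under the stated assumptions (8 sets, 3 formats, 3 photos per set):

```python
def totals(groups, formats, photos_per_group):
    """Arithmetic from the example: shutter actuations vs. files stored."""
    photos = groups * photos_per_group   # one actuation per photo
    files = photos * formats             # one stored file per format
    return photos, files

photos, files = totals(8, 3, 3)   # 24 photos, 72 images
```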
In addition, the present application further provides an automatic photographing system. The photographing system includes a first device and a second device, the first device includes a camera application, and the second device controls the first device through ADB debugging. The second device detects an operation for indicating the start of shooting and reads target information input by the user in the control application program; determines X sets of exposure parameters according to the target information; and sequentially sends X first ADB commands to the first device according to the X sets of exposure parameters, where the i-th first ADB command is used to control the camera application to collect the i-th group of sub-target images based on the i-th set of exposure parameters, 1 ≤ i ≤ X, and i is an integer. The first device captures the i-th group of sub-target images based on the i-th set of exposure parameters according to the i-th first ADB command.
In one possible embodiment, in response to an operation for instructing start of shooting, the second device reads second target information input by the user at the control application; determining a first storage path of the target folder and naming modes of X subfolders according to the second target information and X groups of exposure parameters, wherein the X subfolders are named according to the X groups of exposure parameters respectively; creating a target folder according to the first storage path and the naming mode; the second device obtains a second storage path of the target image from the first device, sends a second ADB command to the first device, wherein the second ADB command comprises the second storage path and the first storage path, the second ADB command is used for obtaining the target image, and the second device obtains the target image from the first device; the first device receives a second ADB command, and the first device sends a target image to the second device; the second device stores the X groups of sub-target images in X sub-folders respectively, wherein X is more than or equal to 1, and X is an integer.
In the photographing system, the first device and the second device may be the electronic device 100 and the electronic device 200, and the first device and the second device may further execute the steps executed by the electronic device 100 and the electronic device 200 in the above methods 300 to 500, corresponding to the above second implementation, which are not described in detail.
The application also provides a photographing method 600. In this method, the auto-photographing program of the electronic device 200 controls the electronic device 100 to photograph continuously, so as to test the hardware performance and power consumption of the camera. For the process in which the electronic device 200 controls the electronic device 100 to take a number of pictures according to the target parameters, method 600 may refer to method 500, except that the target parameters in method 600 do not involve exposure parameters. Similarly, the method 600 can save a lot of manpower and improve the efficiency of the test.
Fig. 6A shows a hardware system suitable for use in the apparatus of the present application. The apparatus is the electronic device 100 in fig. 1A.
The electronic device 100 may be a mobile phone, a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a projector, etc., and the specific type of the electronic device 100 is not limited in the embodiments of the present application.
The electronic device 100 may include at least one processor H110, a camera H120, a display screen H130, and a memory H140.
The configuration shown in fig. 6A does not constitute a specific limitation on the electronic apparatus 100. In other embodiments of the application, electronic device 100 may include more or fewer components than those shown in FIG. 6A, or electronic device 100 may include a combination of some of the components shown in FIG. 6A, or electronic device 100 may include sub-components of some of the components shown in FIG. 6A. The components shown in fig. 6A may be implemented in hardware, software, or a combination of software and hardware.
The processor H110 may include one or more processing units. For example, the processor H110 may include at least one of the following processing units: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and a neural-network processing unit (NPU). The different processing units may be separate devices or integrated devices.
Illustratively, in an embodiment of the present application, the method may be performed in the processor H110: acquiring target parameters for collecting a target image, where the target image is used for calibrating the image signal processing parameters; and controlling the camera application to collect the target image according to the target parameters.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
The connection relationship between the modules shown in fig. 6A is merely illustrative, and does not limit the connection relationship between the modules of the electronic device 100. Alternatively, the modules of the electronic device 100 may also use a combination of the various connection manners in the foregoing embodiments.
The electronic device 100 may implement display functions through the GPU, the display screen H130, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen H130 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor H110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen H130 may be used to display images or video.
The electronic device 100 may implement a photographing function through an image signal processor, a camera H120, a video codec, a GPU, a display screen H130, an application processor, and the like.
The image signal processor is used for processing the data fed back by the camera H120. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the image signal processor for processing and is converted into an image visible to naked eyes. The image signal processor can carry out algorithm optimization on noise, brightness and color of the image, and can also optimize parameters such as exposure, color temperature and the like of a shooting scene.
Camera H120 is used to capture still images or video. An object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the image signal processor for conversion into a digital image signal. The image signal processor outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard format such as red green blue (RGB) or YUV (also called YCrCb). In some embodiments, the electronic device 100 may include 1 or N cameras H120, N being a positive integer greater than 1.
Illustratively, the digital signal processor is configured to process digital signals, and may process other digital signals in addition to digital image signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, or the like.
Memory H140 is used for storing instructions and data. In some embodiments, the memory in processor H110 is a cache memory. The memory H140 may hold instructions or data that the processor H110 has just used or used cyclically. If the processor H110 needs to use the instruction or data again, it can call it directly from the memory H140. This avoids repeated accesses and reduces the waiting time of the processor H110, thereby improving the efficiency of the system.
Fig. 6B shows another hardware system suitable for use in the apparatus of the present application. The apparatus is the electronic device 200 in fig. 1B.
The electronic device 200 may be a smart screen, a tablet computer, a wearable electronic device, an in-vehicle electronic device, an augmented reality (augmented reality, AR) device, a Virtual Reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a Personal Digital Assistant (PDA), a projector, etc., and the specific type of the electronic device 200 is not limited in the embodiments of the present application.
The electronic device 200 comprises a processor H210, the processor H210 being coupled to a memory H220, the memory H220 being for storing computer programs or instructions and/or data, the processor H210 being for executing the computer programs or instructions stored by the memory H220 or for reading the data stored by the memory H220 for performing the methods in the method embodiments above.
Optionally, the processor H210 is one or more.
Optionally, the memory H220 is one or more.
Optionally, the memory H220 is integrated with the processor H210 or separately provided.
Optionally, as shown in fig. 6B, the apparatus H200 further includes a transceiver H230, and the transceiver H230 is used for receiving and/or transmitting signals. For example, the processor H210 is configured to control the transceiver H230 to receive and/or transmit signals.
The apparatus H200 is used to implement the operations performed by the electronic device 200 in the various method embodiments above.
For example, the processor H210 is configured to execute computer programs or instructions stored in the memory H220 to implement the relevant operations of the electronic device 200 in the above method embodiments, for example, the method performed by the electronic device 200 (e.g., a computer) in the embodiments shown in figs. 3 to 4.
Embodiments of the present application provide a chip system including one or more processors configured to invoke from a memory and execute instructions stored in the memory, so that the method of the embodiments of the present application described above is performed. The chip system may be formed of a chip or may include a chip and other discrete devices.
The chip system may include an input circuit or interface for transmitting information or data, and an output circuit or interface for receiving information or data, among other things.
The application also provides a computer program product which, when executed by a processor, implements the method of any of the method embodiments of the application.
The computer program product may be stored in a memory and eventually converted to an executable object file that can be executed by a processor through preprocessing, compiling, assembling, and linking.
The application also provides a computer readable storage medium having stored thereon a computer program which when executed by a computer implements the method according to any of the method embodiments of the application. The computer program may be a high-level language program or an executable object program.
The computer readable storage medium may be volatile memory or nonvolatile memory, or may include both volatile memory and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which acts as an external cache. By way of example, and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, specific working processes and technical effects of the apparatus and device described above may refer to corresponding processes and technical effects in the foregoing method embodiments, which are not described in detail herein.
In the several embodiments provided by the present application, the disclosed systems, devices, and methods may be implemented in other manners. For example, some features of the method embodiments described above may be omitted, or not performed. The above-described apparatus embodiments are merely illustrative, the division of units is merely a logical function division, and there may be additional divisions in actual implementation, and multiple units or components may be combined or integrated into another system. In addition, the coupling between the elements or the coupling between the elements may be direct or indirect, including electrical, mechanical, or other forms of connection.
It should be understood that, in various embodiments of the present application, the size of the sequence number of each process does not imply its execution order; the execution order of each process should be determined by its functions and internal logic, and the sequence numbers should not constitute any limitation on the implementation process of the embodiments of the present application.
It should be understood that references to "a plurality" in this disclosure refer to two or more. The term "and/or" herein is merely one association relationship describing the associated object, meaning that there may be three relationships, e.g., a and/or B, may represent: a exists alone, A and B exist together, and B exists alone. In addition, the character "/" herein generally indicates that the front and rear associated objects are an "or" relationship.
The terms (or numbers) of "first," "second," …, etc. in the embodiments of the present application are used for descriptive purposes only and are not to be construed as indicating or implying any relative importance or number of features indicated, for example, different "coordinates" or the like. Thus, features defining "first," "second," …, etc., may explicitly or implicitly include one or more features. In the description of embodiments of the application, "at least one (an item)" means one or more. The meaning of "plurality" is two or more. "at least one of (an) or the like" below means any combination of these items, including any combination of a single (an) or a plurality (an) of items.
In summary, the foregoing description is only a preferred embodiment of the present application, and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the protection scope of the present application.

Claims (13)

1. A photographing method applied to a second device, wherein a first device is a device where a camera application is located, the first device being different from the second device, the second device including a control application, the method comprising:
detecting an operation for indicating to start shooting, and reading target information input by a user in the control application program;
Determining target parameters for acquiring a target image according to the target information, wherein the target image is used for calibrating image signal processing ISP parameters, the target parameters comprise X groups of exposure parameters, X is more than 1, X is an integer, and different groups of exposure parameters in the X groups of exposure parameters are not completely identical;
Controlling the camera application to continuously acquire the target image according to the target parameters, including: the second device sequentially sends X first Android Debug Bridge (ADB) commands to the first device according to the X groups of exposure parameters, wherein the i-th first ADB command is used for controlling the camera application to acquire an i-th group of sub-target images based on the i-th group of exposure parameters, the target image comprises the X groups of sub-target images, the X groups of exposure parameters are in one-to-one correspondence with the X groups of sub-target images, and each group of sub-target images in the X groups of sub-target images comprises one or more images, wherein i is greater than or equal to 1 and less than or equal to X and i is an integer.
2. The method of claim 1, wherein the method further comprises:
creating a target folder, wherein the target folder comprises one or more subfolders, and the subfolders are named according to the target parameters;
acquiring the target image from equipment where the camera application is located;
And storing the target image in the one or more subfolders according to the corresponding relation between the target image and the target parameters.
3. The method of claim 2, wherein the target folder includes X subfolders named according to the X sets of exposure parameters, respectively, the storing the target image in the one or more subfolders according to a correspondence of the target image and the target parameters, comprising:
and storing the X groups of sub-target images in the X sub-folders respectively, wherein X is more than or equal to 1, and X is an integer.
4. The method of claim 3, wherein,
The obtaining the target image from the device where the camera application is located includes: acquiring the ith group of sub-target images from equipment where the camera application is located;
The storing the X groups of sub-target images in the X sub-folders respectively includes: and storing the ith group of sub-target images acquired according to the ith group of exposure parameters in an ith sub-folder, wherein i is more than or equal to 1 and less than or equal to X, and i is an integer.
5. The method of any of claims 1 to 4, wherein prior to said controlling a camera application to continuously acquire said target image in accordance with said target parameter, the method further comprises:
Controlling a camera application to continuously acquire a corresponding group of sub-target images in the X groups of sub-target images according to one group of exposure parameters in the X groups of exposure parameters;
acquiring the group of sub-target images from a device in which the camera application is located;
And evaluating whether the exposure parameters of the set of sub-target images match the set of exposure parameters.
6. The method of any of claims 1-4, wherein the target parameters further comprise a target format of the target image, the target format being determined from the ISP parameters, the target format comprising at least one of: a RAW format, a YUV format or a JPG format, wherein the format of the target image is the target format.
7. The method according to any one of claims 1 to 4, wherein the target parameters further include a target shooting mode of the target image, the target shooting mode including shooting or shooting in video, and the controlling the camera application to continuously capture the target image according to the target parameters in the case that the target shooting mode is shooting in video includes:
Acquiring a first position of a first control used for triggering photographing in video in the camera application in a video interface;
And according to the target shooting mode, controlling the device where the camera application is located to click the first position during video recording so as to acquire the target image.
8. The method of claim 7, wherein the target capture mode further comprises at least one of: a portrait mode, a professional mode, a super macro mode, a double image video mode or a panoramic mode, wherein the acquiring the target parameters for acquiring the target image comprises the following steps:
Acquiring shooting mode indication information, wherein the shooting mode indication information is used for indicating a historical shooting mode, and the historical shooting mode is a shooting mode used before the camera application continuously collects the target image;
And determining the target shooting mode as the historical shooting mode according to the shooting mode indication information.
9. The method according to any one of claims 1 to 4, wherein, in case the device in which the camera application is located comprises a plurality of camera devices, the target parameter further comprises information of a target camera device for acquiring the target image, and the controlling the camera application to continuously acquire the target image according to the target parameter comprises:
and controlling the camera application to acquire the target image through the target camera device according to the target camera device information.
10. The method of any of claims 1 to 4, wherein the target parameters further comprise a target zoom magnification used by the camera application to continuously capture the target image, the controlling the camera application to continuously capture the target image in accordance with the target parameters comprising:
acquiring a second position of a second control used for switching to the target zoom magnification in the camera application in a camera interface;
and according to the target zoom magnification, controlling the device where the camera application is located to click the second position to acquire the target image.
11. The method of any of claims 1-4, wherein the target parameters further comprise target chip model information indicating a model of a chip included by a device in which the camera application is located, the controlling the camera application to continuously capture the target image according to the target parameters comprising:
determining control information adapted to the chip according to the target chip model information, wherein the control information is used for controlling a camera application to continuously acquire the target image;
And sending the control information to equipment where the camera application is located.
12. An electronic device comprising a memory, and one or more processors, wherein the memory is configured to store a computer program; the processor is configured to invoke the computer program to cause the electronic device to perform the method of any of claims 1 to 11.
13. A computer storage medium, comprising: computer instructions; when executed on an electronic device, the computer instructions cause the electronic device to perform the method of any one of claims 1 to 11.
CN202311736947.4A 2023-12-18 2023-12-18 Photographing method and related equipment Active CN117425080B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311736947.4A CN117425080B (en) 2023-12-18 2023-12-18 Photographing method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311736947.4A CN117425080B (en) 2023-12-18 2023-12-18 Photographing method and related equipment

Publications (2)

Publication Number Publication Date
CN117425080A CN117425080A (en) 2024-01-19
CN117425080B true CN117425080B (en) 2024-05-28

Family

ID=89531120

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311736947.4A Active CN117425080B (en) 2023-12-18 2023-12-18 Photographing method and related equipment

Country Status (1)

Country Link
CN (1) CN117425080B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109889731A (en) * 2019-02-22 2019-06-14 生迪智慧科技有限公司 The determination method, apparatus and storage medium of acquisition parameters
CN112291459A (en) * 2020-10-23 2021-01-29 杭州思锐迪科技有限公司 Photographic arrangement and three-dimensional scanning equipment
CN117014720A (en) * 2022-04-26 2023-11-07 Oppo广东移动通信有限公司 Image shooting method, device, terminal, storage medium and product

Also Published As

Publication number Publication date
CN117425080A (en) 2024-01-19

Similar Documents

Publication Publication Date Title
CN108933899B (en) Panorama shooting method, device, terminal and computer readable storage medium
JP4236433B2 (en) System and method for simulating fill flash in photography
CN101213828B (en) Method and apparatus for incorporating iris color in red-eye correction
JP3889017B2 (en) System and method for correcting captured images
CN105940673B (en) Image processing apparatus and image processing method
CN104853097B (en) A kind of panning mode image pickup method and device
CN108881701B (en) Shooting method, camera, terminal device and computer readable storage medium
CN106454077A (en) Photographing method, photographing apparatus and terminal
CN113315956B (en) Image processing apparatus, image capturing apparatus, image processing method, and machine-readable medium
CN108900764A (en) Image pickup method and electronic device and filming control method and server
CN108282622B (en) Photo shooting method and device
CN112673311B (en) Method, software product, camera arrangement and system for determining artificial lighting settings and camera settings
CN104394329B (en) A kind of photographic method, device and mobile terminal
US20240273909A1 (en) Camera shooting parameter adjustment method and apparatus, and electronic device
CN106657798A (en) Photographing method for intelligent terminal
CN111492653A (en) Method and device for quickly adjusting white balance of camera and computer readable storage medium
CN110581950B (en) Camera, system and method for selecting camera settings
TWI620442B (en) Flash optimization for camera devices
CN117425080B (en) Photographing method and related equipment
CN109391770B (en) Image processing apparatus and image processing method
US12010433B2 (en) Image processing apparatus, image processing method, and storage medium
US20220141374A1 (en) Intelligent flash intensity control systems and methods
JP7378963B2 (en) Image processing device, image processing method, and computer program
JP7395258B2 (en) Image processing device, image processing method, computer program and storage medium
CN110809119A (en) Photographing method, photographing apparatus, and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant