WO2023000878A1 - Photographing method and apparatus, controller, device and computer-readable storage medium - Google Patents

Photographing method and apparatus, controller, device and computer-readable storage medium

Info

Publication number
WO2023000878A1
WO2023000878A1 (PCT/CN2022/099221)
Authority
WO
WIPO (PCT)
Prior art keywords
brightness
preview image
target sub
interval
feature information
Prior art date
Application number
PCT/CN2022/099221
Other languages
English (en)
Chinese (zh)
Inventor
郑亮
Original Assignee
中兴通讯股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司 filed Critical 中兴通讯股份有限公司
Publication of WO2023000878A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene

Definitions

  • The embodiments of the present application relate to, but are not limited to, the technical field of photographing, and in particular to a photographing method, a photographing apparatus, a controller, a photographing device, and a computer-readable storage medium.
  • HDR: High-Dynamic Range (high dynamic range imaging).
  • Embodiments of the present application provide a photographing method, a photographing apparatus, a controller, a photographing device, and a computer-readable storage medium.
  • In a first aspect, an embodiment of the present application provides a photographing method, including: acquiring a preview image of the current scene; performing feature extraction on the preview image to obtain brightness feature information of the preview image; inputting the brightness feature information into a strategy generation model to obtain an exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images; and shooting the current scene based on the exposure strategy.
  • In a second aspect, an embodiment of the present application also provides a photographing apparatus, including a processor, a memory, a photographing component, and a display screen, with the processor connected to the memory, the photographing component, and the display screen respectively.
  • The processor includes a feature extraction unit and a strategy generation unit, and the photographing component includes an optical camera. The photographing component is configured to obtain a preview image of the current scene through the optical camera; the display screen is configured to display the preview image corresponding to the current scene; and the feature extraction unit is configured to acquire the preview image displayed on the display screen and perform feature extraction on it to obtain the brightness feature information of the preview image.
  • The strategy generation unit is configured to input the brightness feature information into a pre-trained strategy generation model in the processor to obtain an exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images. The photographing component is further configured to acquire the exposure strategy and shoot the current scene through the optical camera based on it.
  • In a third aspect, an embodiment of the present application also provides a controller, including a memory, a processor, and a computer program stored in the memory and executable on the processor; when executing the computer program, the processor implements the photographing method described in the first aspect above.
  • In a fourth aspect, an embodiment of the present application further provides a photographing device, including the photographing apparatus described in the second aspect above or the controller described in the third aspect above.
  • In a fifth aspect, an embodiment of the present application further provides a computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to execute the photographing method described in the first aspect above.
  • FIG. 1 is a schematic diagram of a system architecture platform for executing the photographing method provided by an embodiment of the present application;
  • FIG. 2 is a flowchart of the photographing method provided by an embodiment of the present application;
  • FIG. 3 is a flow chart of extracting brightness feature information in the photographing method provided by an embodiment of the present application;
  • FIG. 4 is a flow chart of obtaining brightness position weights in the photographing method provided by an embodiment of the present application;
  • FIG. 5 is a flow chart of obtaining brightness proportion weights in the photographing method provided by an embodiment of the present application;
  • FIG. 6 is a flow chart of obtaining brightness feature information according to brightness position weight, brightness proportion weight and brightness interval ratio in the photographing method provided by an embodiment of the present application;
  • FIG. 7 is a schematic diagram of exposure strategy combinations provided by an embodiment of the present application;
  • FIG. 8 is a schematic structural diagram of a neural network model provided by an embodiment of the present application;
  • FIG. 9 is a schematic structural diagram of a photographing apparatus provided by an embodiment of the present application.
  • Based on this, embodiments of the present application provide a photographing method, a photographing apparatus, a controller, a photographing device, and a computer-readable storage medium.
  • The photographing method includes but is not limited to the following steps: acquiring a preview image of the current scene; performing feature extraction on the preview image to obtain brightness feature information of the preview image; inputting the brightness feature information into a strategy generation model to obtain an exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images; and shooting the current scene based on the exposure strategy.
  • In other words, the embodiment of the present application extracts the brightness feature information of the preview image of the current scene and inputs it into the trained strategy generation model. Because that model is trained by a neural network on the brightness feature information corresponding to sample images, it outputs the exposure strategy corresponding to the preview image of the current scene, and this exposure strategy can then be adopted when the shutter button is pressed. Since the accuracy of the exposure strategy output by the strategy generation model is higher, the embodiment of the present application can improve the shooting quality.
  • FIG. 1 is a schematic diagram of a system architecture platform for executing the photographing method provided by an embodiment of the present application.
  • The system architecture platform 100 is provided with a processor 110 and a memory 120, where the processor 110 and the memory 120 may be connected via a bus or in other ways; in FIG. 1, connection via a bus is taken as an example.
  • the memory 120 can be used to store non-transitory software programs and non-transitory computer-executable programs.
  • the memory 120 may include a high-speed random access memory, and may also include a non-transitory memory, such as at least one magnetic disk storage device, a flash memory device, or other non-transitory solid-state storage devices.
  • The memory 120 may optionally include memory located remotely relative to the processor 110, and such remote memory may be connected to the system architecture platform through a network. Examples of such networks include but are not limited to the Internet, intranets, local area networks, mobile communication networks, and combinations thereof.
  • The system architecture platform can be applied to 3G, LTE, and 5G communication network systems, as well as subsequently evolved mobile communication network systems, which is not specifically limited in this embodiment.
  • The structure shown in FIG. 1 does not constitute a limitation to the embodiments of the present application; more or fewer components than shown may be included, some components may be combined, or the components may be arranged differently.
  • The processor 110 can call the photographing program stored in the memory 120 to execute the photographing method.
  • FIG. 2 is a flowchart of the photographing method provided by an embodiment of the present application; the method includes but is not limited to steps S100, S200, S300 and S400.
  • Step S100: acquire a preview image of the current scene.
  • Step S200: perform feature extraction on the preview image to obtain brightness feature information of the preview image.
  • Step S300: input the brightness feature information into the strategy generation model to obtain an exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images.
  • Step S400: shoot the current scene based on the exposure strategy.
  • It can be seen that the brightness feature information of the preview image of the current scene is extracted and input into the trained strategy generation model; since that model is trained by a neural network on the brightness feature information corresponding to sample images, it outputs the exposure strategy corresponding to the preview image of the current scene, and this exposure strategy is then used for shooting when the shutter button is pressed. Because the accuracy of the exposure strategy output by the strategy generation model is high, the embodiment of the present application can improve the shooting quality.
  • The preview image refers to the image displayed on the screen of the photographing device before the shutter is pressed.
  • More specifically, it is the image shown on the screen while the user aims the photographing device at the scene without having pressed the shutter button.
  • It can be understood that the above-mentioned strategy generation model is a pre-trained model.
  • During training, the embodiment of the present application extracts the brightness feature information of each sample image and uses it as the input of the neural network, so that the neural network learns to output the corresponding exposure strategy.
  • Once the strategy generation model is trained, inputting the brightness feature information of the preview image of the current scene causes the model to output a more accurate exposure strategy in response.
  • The above-mentioned neural network is an algorithmic mathematical model that imitates the behavioral characteristics of animal neural networks and performs distributed parallel information processing. Such a network achieves the purpose of processing information by adjusting the interconnections among a large number of internal nodes, depending on the complexity of the system.
  • FIG. 3 is a flow chart of extracting brightness feature information in the photographing method provided by an embodiment of the present application.
  • Referring to FIG. 3, step S200 includes but is not limited to steps S410, S420, S430 and S440.
  • Step S410: divide the preview image to obtain multiple sub-regions, and determine multiple target sub-regions from the multiple sub-regions.
  • Step S420: for each target sub-region, calculate the cumulative brightness distribution value according to the brightness values of the target sub-region, and obtain the brightness interval ratio of the target sub-region according to the cumulative brightness distribution value and the preset brightness threshold.
  • Step S430: obtain the brightness position weight of each target sub-region.
  • Step S440: calculate the brightness feature information of the preview image according to the brightness position weights and brightness interval ratios of all target sub-regions.
  • Specifically, the preview image is divided into multiple sub-regions, and a certain number of target sub-regions are selected from them; the cumulative brightness distribution value of each target sub-region, i.e., the CDF (cumulative distribution function) value of its high and low brightness, is then calculated. Since the embodiment of the present application sets a preset brightness threshold representing a certain brightness range, the brightness interval ratio of each target sub-region, i.e., the proportion falling within each brightness interval, is obtained from the cumulative brightness distribution value and the preset brightness threshold.
  • Because different target sub-regions lie at different positions, their distances from the exposure center also differ, so the embodiment of the present application additionally obtains the brightness position weight of each target sub-region. Finally, the brightness feature information of the preview image is calculated from the brightness position weights and brightness interval ratios of all target sub-regions, completing the extraction of the brightness feature information of the preview image, as sketched below.
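  • As an illustration of steps S410 and S420 only: the patent publishes no source code, so the 8×8 grid, the threshold intervals, and the function name in this minimal Python sketch are assumptions rather than the patent's values.

```python
import numpy as np

def brightness_interval_ratios(lum, M=8, N=8, low=(16, 48), high=(192, 240)):
    """For each of the M*N sub-regions of an 8-bit luminance image `lum`,
    return the fraction of pixels whose brightness falls in the low and high
    intervals, derived from the per-region cumulative distribution (CDF)."""
    h, w = lum.shape
    ratios = np.zeros((M, N, 2))
    for m in range(M):
        for n in range(N):
            block = lum[m * h // M:(m + 1) * h // M, n * w // N:(n + 1) * w // N]
            hist = np.bincount(block.ravel(), minlength=256)
            cdf = np.cumsum(hist) / block.size             # cumulative brightness distribution
            ratios[m, n, 0] = cdf[low[1]] - cdf[low[0]]    # low-brightness interval ratio
            ratios[m, n, 1] = cdf[high[1]] - cdf[high[0]]  # high-brightness interval ratio
    return ratios
```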
  • In this way, the embodiment of the present application calculates the brightness distribution of different areas of the preview image corresponding to the current scene, extracts the brightness feature values as the input of the neural network, and uses the calibrated scene classification as the output of the neural network; after pre-training is completed, the network can effectively solve the selection of exposure parameters in scenes that require exposure fusion.
  • It should be noted that the preset brightness threshold mentioned above may include but is not limited to a preset high-brightness threshold and a preset low-brightness threshold.
  • Correspondingly, the brightness interval ratio obtained in the embodiment of the present application includes but is not limited to a high-brightness interval ratio and a low-brightness interval ratio, and the brightness feature information correspondingly includes feature information of the high-brightness interval and feature information of the low-brightness interval.
  • Accordingly, obtaining the brightness interval ratio of the target sub-region according to the cumulative brightness distribution value and the preset brightness threshold in step S420 includes: obtaining the high-brightness interval ratio of the target sub-region according to the cumulative brightness distribution value and the preset high-brightness threshold, and obtaining the low-brightness interval ratio of the target sub-region according to the cumulative brightness distribution value and the preset low-brightness threshold.
  • Likewise, calculating the brightness feature information of the preview image according to the brightness position weights and brightness interval ratios of all target sub-regions in step S440 includes: calculating the feature information of the high-brightness interval of the preview image from the brightness position weights and high-brightness interval ratios of all target sub-regions, and calculating the feature information of the low-brightness interval from the brightness position weights and low-brightness interval ratios of all target sub-regions. One plausible CDF reading of these ratios is sketched below.
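  • In CDF terms, writing cdf(x) for the fraction of pixels in a target sub-region whose brightness is at most x, and using the threshold intervals [lum1, lum2] and [lum3, lum4] introduced below, one plausible reading of these ratios (an assumption on our part; the patent's formula images are not reproduced in this text) is:

$$r_{low}=\mathrm{cdf}(lum_2)-\mathrm{cdf}(lum_1),\qquad r_{high}=\mathrm{cdf}(lum_4)-\mathrm{cdf}(lum_3)$$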
  • It should be noted that the number of target sub-regions is less than or equal to the number of sub-regions; specifically, the two numbers may be the same or different.
  • The above-mentioned brightness position weight may refer to the global brightness position weight of the preview image, or to a local brightness position weight of the preview image.
  • In the local case, for example, the target sub-regions may be located at the center of the preview image or at its periphery; alternatively, a position of interest can be selected according to user requirements.
  • FIG. 4 is a flow chart of acquiring brightness position weights in the photographing method provided by an embodiment of the present application.
  • Referring to FIG. 4, the above step S430 includes but is not limited to steps S510 and S520.
  • Step S510: acquire distance information between the target sub-region and a preset position in the preview image.
  • Step S520: calculate the brightness position weight of each target sub-region according to the distance information.
  • That is, the embodiment of the present application also obtains the distance information between each target sub-region and a preset position in the preview image, and calculates the brightness position weight of each target sub-region from that distance information; a minimal sketch follows.
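  • A minimal sketch of such a distance-based weight, assuming a Gaussian falloff from the preset position (the text fixes only that σ1 is an adjustable parameter, so the functional form and the defaults here are assumptions):

```python
import numpy as np

def position_weight(M=8, N=8, center=None, sigma1=2.0):
    """Brightness position weight W_d for an M*N grid of target sub-regions:
    largest at the preset position (default: grid center), decaying with distance."""
    if center is None:
        center = ((M - 1) / 2.0, (N - 1) / 2.0)
    i = np.arange(M)[:, None]
    j = np.arange(N)[None, :]
    d2 = (i - center[0]) ** 2 + (j - center[1]) ** 2   # squared distance to preset position
    return np.exp(-d2 / (2.0 * sigma1 ** 2))
```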
  • FIG. 5 is a flow chart of obtaining brightness proportion weights in the photographing method provided by an embodiment of the present application.
  • Referring to FIG. 5, the photographing method in the embodiment of the present application also includes but is not limited to step S600.
  • Step S600: calculate the brightness proportion weight according to the brightness interval ratio of the target sub-region.
  • That is, besides the brightness position weight, the embodiment of the present application also adjusts a weight value according to the brightness proportion value, i.e., the brightness interval ratio.
  • The brightness proportion weight is used to fine-tune the exposure strategy, whereas the above-mentioned brightness position weight provides coarse adjustment of the exposure strategy.
  • FIG. 6 is a flow chart of obtaining brightness feature information according to brightness position weight, brightness proportion weight and brightness interval ratio in the photographing method provided by an embodiment of the present application.
  • Referring to FIG. 6, step S440 includes but is not limited to step S700.
  • Step S700: calculate the brightness feature information of the preview image according to the brightness position weights, brightness proportion weights and brightness interval ratios of all target sub-regions.
  • That is, the brightness proportion weight can also enter the computation of the input to the strategy generation model; since the brightness proportion weight fine-tunes the exposure strategy, the brightness feature information obtained in step S700, and hence the exposure strategy output by the strategy generation model, is more accurate.
  • In a specific example, the number of target sub-regions is i*j, where i ≤ M and j ≤ N; the cumulative brightness distribution value is cdf_{i,j}, i ∈ [1, M], j ∈ [1, N]; the preset high-brightness threshold is [lum3, lum4] and the preset low-brightness threshold is [lum1, lum2], where lum4 > lum3 ≥ lum2 > lum1.
  • The brightness position weight can be obtained by a formula of the following kind, in which W_d(i, j) is the brightness position weight and σ1 is an adjustable parameter.
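  • The formula appears only as an image in the original publication; consistent with the later description that this weight "adjusts according to the distance from the center", one plausible Gaussian form (an assumption, not the patent's verbatim formula) is:

$$W_d(i,j)=\exp\!\left(-\frac{(i-i_c)^2+(j-j_c)^2}{2\sigma_1^2}\right)$$

where (i_c, j_c) indexes the sub-region at the exposure center.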
  • The brightness proportion weight can be obtained by a formula of the following kind, in which W_a(lum) is the brightness proportion weight and σ2 and u are adjustable parameters.
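  • Again, the original formula is an image; a plausible form consistent with "adjust the weight value according to the brightness proportion value" (an assumption, not the patent's verbatim formula) is a Gaussian in the brightness proportion value lum, centered at u:

$$W_a(lum)=\exp\!\left(-\frac{(lum-u)^2}{2\sigma_2^2}\right)$$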
  • The feature information of the high-brightness interval, denoted vl_k, and the feature information of the low-brightness interval, denoted vd_k, can each be obtained by formulas of the following kind.
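  • Matching Step 4 below, which obtains the brightness feature distribution values "through weighted average", plausible forms (assumptions, not the patent's verbatim formulas) are:

$$vl_k=\frac{\sum_{m,n}W_d(m,n)\,W_a\!\left(r^{high}_{m,n}\right)r^{high}_{m,n}}{\sum_{m,n}W_d(m,n)\,W_a\!\left(r^{high}_{m,n}\right)},\qquad vd_k=\frac{\sum_{m,n}W_d(m,n)\,W_a\!\left(r^{low}_{m,n}\right)r^{low}_{m,n}}{\sum_{m,n}W_d(m,n)\,W_a\!\left(r^{low}_{m,n}\right)}$$

where r^{high}_{m,n} and r^{low}_{m,n} are the high- and low-brightness interval ratios of sub-region (m, n), and k indexes the weight configuration (global, local, etc.) described later.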
  • the embodiment of the present application provides an overall implementation scheme, specifically as follows:
  • Step 1: Obtain the brightness values of the image of size (w, h), where Lum_{i,j} denotes the brightness at position (i, j): Lum_{i,j} = Max(R_{i,j}, G_{i,j}, B_{i,j}), i ≤ h, j ≤ w.
  • Step 2: Divide the obtained luminance image into M*N sub-regions, and calculate the cumulative luminance distribution value of each sub-region: cdf_{m,n}, m ∈ [1, M], n ∈ [1, N].
  • Step 3: Obtain the weights of the regions to be counted.
  • The global brightness weight W_d, i.e., the global brightness position weight, adjusts the weight according to the distance from the center. Example weight forms include but are not limited to the one sketched earlier, where σ1 is an adjustable parameter, m ≤ M, n ≤ N.
  • The proportion weight W_a, i.e., the brightness proportion weight, adjusts the weight value according to the brightness proportion value.
  • Example weight forms include but are not limited to the one sketched earlier, where W_a(lum) is the brightness proportion weight and σ2 and u are adjustable parameters, m ≤ M, n ≤ N.
  • The weights can also be stored in a table, with the calculation realized via a LUT (Look-Up Table) to simplify computation and increase efficiency.
  • For example, the central-area weight W_d(i, j) can be distributed in the form of the following Table 1.
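  • Table 1 itself does not survive in this text; an illustrative center-weighted distribution of the kind described, over a 5×5 grid (values assumed for illustration, not the patent's), would be:

```
Table 1 (illustrative): W_d over a 5x5 grid, peaking at the center
0.2  0.4  0.5  0.4  0.2
0.4  0.7  0.9  0.7  0.4
0.5  0.9  1.0  0.9  0.5
0.4  0.7  0.9  0.7  0.4
0.2  0.4  0.5  0.4  0.2
```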
  • Step 4: Obtain the brightness feature distribution values of the image, i.e., the brightness feature information, through weighted averaging.
  • The 2k feature values of the entire image (k > 1) are obtained through local or global weights, where the two weights W_d and W_a may differ between the bright area and the dark area; the bright-area and dark-area weights can be computed as follows.
  • The bright-area feature value is the feature information of the high-brightness interval, and the dark-area feature value is the feature information of the low-brightness interval; both follow the weighted-average form sketched above.
  • During calculation, the weights can also be realized via LUT, yielding a 2k-dimensional vector composed of the set of brightness feature information: vec = (vl_1, vd_1, ..., vl_k, vd_k). A consolidated sketch of Steps 1 to 4 follows.
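  • Pulling Steps 1 to 4 together, reusing the two helpers sketched earlier and the same assumptions (Gaussian weights, illustrative thresholds, and k = 1 for brevity):

```python
import numpy as np

def brightness_feature_vector(rgb, M=8, N=8, sigma1=2.0):
    """Steps 1-4: luminance, per-region CDF interval ratios, distance weights,
    and the weighted-average feature pair (vl_1, vd_1) for an 8-bit RGB image."""
    lum = rgb.max(axis=2)                            # Step 1: Lum = max(R, G, B)
    ratios = brightness_interval_ratios(lum, M, N)   # Step 2: per-region CDF ratios
    w_d = position_weight(M, N, sigma1=sigma1)       # Step 3: position weight W_d
    vd = (w_d * ratios[..., 0]).sum() / w_d.sum()    # Step 4: dark-area feature vd_1
    vl = (w_d * ratios[..., 1]).sum() / w_d.sum()    # Step 4: bright-area feature vl_1
    return np.array([vl, vd])                        # 2k-dim vector with k = 1
```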
  • Step 5: Input the calculated feature vector into the trained neural network to obtain an accurate exposure strategy, which determines whether to start the HDR algorithm and synthesize multiple images.
  • FIG. 7 is a schematic diagram of an exposure strategy combination provided by an embodiment of the present application.
  • During training, the exposure strategy required for each corresponding sample photo can be selected as the output of the network, and the network is then trained to obtain the optimal structure.
  • HDR fusion is not limited to three exposure levels (normal, low exposure, high exposure); for example, an additional exposure setting may be added, as illustrated by the sketch below.
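  • One way to picture such exposure-strategy combinations as network outputs (the class indices and EV offsets below are invented for illustration; FIG. 7 itself is not reproduced in this text):

```python
# Hypothetical mapping from network output class to an HDR exposure bracket (in EV).
EXPOSURE_STRATEGIES = {
    0: [0.0],                    # no HDR needed: single normal exposure
    1: [-1.0, 0.0],              # mild backlight: add one under-exposed frame
    2: [-2.0, 0.0, 1.0],         # normal, low, and high exposure levels
    3: [-2.0, -1.0, 0.0, 1.0],   # an additional exposure setting, as described above
}
```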
  • The structure of the neural network model used above can take the form shown in FIG. 8: one or more hidden layers between the input layer and the output layer to improve the classification effect, with a softmax layer added after the output layer to normalize the network output, as sketched below. In some other implementations, any other suitable neural network model can also be used.
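  • A minimal sketch of such a classifier in PyTorch (the layer sizes and k = 4 are assumptions; the text fixes only the input/hidden/output/softmax shape):

```python
import torch.nn as nn

# 2k-dimensional brightness feature vector in, one exposure-strategy class out.
policy_model = nn.Sequential(
    nn.Linear(2 * 4, 32),   # input layer: 2k features (k = 4 assumed)
    nn.ReLU(),
    nn.Linear(32, 32),      # hidden layer, as in FIG. 8
    nn.ReLU(),
    nn.Linear(32, 4),       # output layer: one score per exposure strategy
    nn.Softmax(dim=-1),     # softmax layer appended after the output layer
)
```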
  • When deployed, the trained network runs in the terminal through hardware support or software solidification.
  • During preview, the brightness feature values of the current scene are calculated and input into the network to compute the exposure parameters required to start HDR; once these take effect, the photo is taken and HDR photo fusion is performed.
  • In summary, the luminance distribution of the whole picture is counted by sub-region, and the CDF value of each sub-region is calculated to obtain the high-brightness and low-brightness ratio values; several weight coefficients are then designed to obtain the luminance distribution characteristics of the current scene. A distance weight based on the distance from the center yields the global high- and low-brightness distribution characteristics; adjusting the weights of different brightnesses according to differences in brightness ratio yields further high- and low-brightness distribution characteristics; setting the weight distribution to 1 in the middle and 0 around it yields local high- and low-brightness feature values; and setting it to 0 in the middle and 1 around it yields complementary local feature values.
  • For training, the backlighting degree of the scene or the HDR exposure combination is used as the output value, and the network is trained to obtain the network structure.
  • At inference time, the global or local brightness feature values obtained through these weights are input into the trained network to obtain the degree of backlight or the exposure parameters required to trigger HDR.
  • In some embodiments, the training process of the strategy generation model is as follows: obtain multiple sample images, divide each sample image into multiple sub-regions, and determine multiple target sub-regions from them; for each target sub-region, calculate the cumulative brightness distribution value from the brightness values of the target sub-region in the sample image, obtain the high-brightness interval ratio from the cumulative brightness distribution value and the preset high-brightness threshold, and obtain the low-brightness interval ratio from the cumulative brightness distribution value and the preset low-brightness threshold; acquire the distance information between each target sub-region and a preset position in the sample image, and calculate the brightness position weight of each target sub-region from that distance information; calculate the brightness proportion weight from the brightness interval ratio of each target sub-region; calculate the brightness feature information of the sample image from the brightness position weights, brightness proportion weights, and brightness interval ratios of all target sub-regions; and finally, use the calculated brightness feature information to train the neural network, yielding the strategy generation model. A sketch of such a training loop follows.
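  • A compact sketch of that training procedure (the loss, optimizer, learning rate, and epoch count are assumptions; `policy_model` is the sketch above, and the features come from the sample images):

```python
import torch
import torch.nn as nn

def train_policy_model(model, features, labels, epochs=200):
    """features: float tensor (num_samples, 2k) of brightness feature vectors;
    labels: long tensor of calibrated exposure-strategy classes per sample image."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.NLLLoss()                           # model already ends in softmax
    for _ in range(epochs):
        opt.zero_grad()
        log_probs = torch.log(model(features) + 1e-9)
        loss = loss_fn(log_probs, labels)
        loss.backward()
        opt.step()
    return model
```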
  • FIG. 9 is a schematic structural diagram of a photographing device provided by an embodiment of the present application.
  • The photographing apparatus 200 includes but is not limited to a processor 110, a memory 120, a photographing component 130, and a display screen 140, where the processor 110 is connected to the memory 120, the photographing component 130, and the display screen 140 respectively. The processor 110 includes but is not limited to a feature extraction unit 111 and a strategy generation unit 112, and can call the photographing program stored in the memory 120 to execute the photographing method. The photographing component 130 includes but is not limited to an optical camera 131.
  • During shooting, the photographing component 130 aims at the current scene through the optical camera 131 and displays a preview image corresponding to the current scene on the display screen 140.
  • The feature extraction unit 111 acquires the displayed preview image and performs feature extraction on it to obtain the brightness feature information of the preview image.
  • The strategy generation unit 112 in the processor 110 then inputs the brightness feature information into the pre-trained strategy generation model in the processor 110 to obtain the exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images; finally, the photographing component 130 acquires the exposure strategy and shoots the current scene through the optical camera 131 based on it.
  • In addition, an embodiment of the present application provides a controller, including a memory, a processor, and a computer program stored in the memory and executable on the processor.
  • the processor and memory can be connected by a bus or other means.
  • The controller in this embodiment may correspond to the memory and the processor in the embodiment shown in FIG. 1 and can constitute part of the system architecture platform of that embodiment; both belong to the same inventive concept, so they share the same implementation principle and beneficial effects, which will not be described in detail here.
  • The non-transitory software programs and instructions required to implement the photographing method of the above embodiment are stored in the memory; when executed by the processor, they perform the photographing method of the above embodiment, for example, method steps S100 to S400 in FIG. 2, method steps S410 to S440 in FIG. 3, method steps S510 to S520 in FIG. 4, method step S600 in FIG. 5, and method step S700 in FIG. 6.
  • The device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • In addition, an embodiment of the present application provides a photographing device, including the photographing apparatus of the foregoing embodiment or the controller of the foregoing embodiment.
  • The photographing device in this embodiment of the present application may be a terminal device, such as a mobile phone, a tablet computer, a wearable device, and the like.
  • In some cases, the photographing device itself may carry a camera for capturing images.
  • Of course, the photographing device in the above embodiments may instead be a non-terminal device capable of realizing the photographing function, such as a desktop computer or a server.
  • an embodiment of the present application also provides a computer-readable storage medium, and the computer-readable storage medium stores computer-executable instructions.
  • The computer-executable instructions are used to execute the above-mentioned photographing method, for example, method steps S100 to S400 in FIG. 2, method steps S410 to S440 in FIG. 3, method steps S510 to S520 in FIG. 4, method step S600 in FIG. 5, and method step S700 in FIG. 6.
  • To sum up, the embodiment of the present application first acquires the preview image of the current scene, then performs feature extraction on the preview image to obtain its brightness feature information, then inputs the brightness feature information into a strategy generation model to obtain an exposure strategy, where the strategy generation model is obtained by training a neural network on the brightness feature information corresponding to sample images, and finally photographs the current scene based on the above exposure strategy.
  • In other words, the brightness feature information of the preview image of the current scene is extracted and input into the trained strategy generation model, which, being trained by a neural network on the brightness feature information corresponding to sample images, outputs the exposure strategy corresponding to the preview image of the current scene.
  • This exposure strategy can then be adopted when the shutter button is pressed.
  • Since the accuracy of the exposure strategy output by the strategy generation model is higher, the embodiment of the present application can improve the shooting quality.
  • Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • Communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

A photographing method and apparatus, a controller, a device, and a computer-readable storage medium are provided. The photographing method comprises: acquiring a preview image of a current scene (S100); performing feature extraction on the preview image to obtain brightness feature information of the preview image (S200); inputting the brightness feature information into a strategy generation model to obtain an exposure strategy, the strategy generation model being obtained by training a neural network on brightness feature information corresponding to a sample image (S300); and photographing the current scene on the basis of the exposure strategy (S400).
PCT/CN2022/099221 2021-07-22 2022-06-16 Photographing method and apparatus, controller, device and computer-readable storage medium WO2023000878A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110828028.4A CN115696027A (zh) 2021-07-22 2021-07-22 拍摄方法、装置、控制器、设备和计算机可读存储介质
CN202110828028.4 2021-07-22

Publications (1)

Publication Number Publication Date
WO2023000878A1 (fr)

Family

ID=84979817

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/099221 WO2023000878A1 (fr) 2022-06-16 Photographing method and apparatus, controller, device and computer-readable storage medium

Country Status (2)

Country Link
CN (1) CN115696027A (fr)
WO (1) WO2023000878A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116193264B (zh) * 2023-04-21 2023-06-23 中国传媒大学 Camera adjustment method and system based on exposure parameters

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5331422A (en) * 1991-03-15 1994-07-19 Sharp Kabushiki Kaisha Video camera having an adaptive automatic iris control circuit
JPH0534762A (ja) * 1991-07-25 1993-02-12 Sanyo Electric Co Ltd オートアイリス機能付きカメラ
JPH05260370A (ja) * 1992-01-14 1993-10-08 Sharp Corp オートアイリス回路
JPH06326919A (ja) * 1993-05-13 1994-11-25 Sanyo Electric Co Ltd 自動露出調節装置
JPH09281544A (ja) * 1996-04-16 1997-10-31 Nikon Corp カメラの測光装置
JP2008060989A (ja) * 2006-08-31 2008-03-13 Noritsu Koki Co Ltd 撮影画像補正方法及び撮影画像補正モジュール
US20170061237A1 (en) * 2015-08-24 2017-03-02 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and storage medium
CN111447405A (zh) * 2019-01-17 2020-07-24 杭州海康威视数字技术股份有限公司 一种针对视频监控的曝光方法及装置
CN111654594A (zh) * 2020-06-16 2020-09-11 Oppo广东移动通信有限公司 图像拍摄方法、图像拍摄装置、移动终端及存储介质

Also Published As

Publication number Publication date
CN115696027A (zh) 2023-02-03

Similar Documents

Publication Publication Date Title
CN111418201B (zh) 一种拍摄方法及设备
CN108335279B (zh) 图像融合和hdr成像
CN108174118B (zh) 图像处理方法、装置和电子设备
JP6395810B2 (ja) 動きゴーストフィルタリングのための基準画像選択
JP4240023B2 (ja) 撮像装置、撮像方法および撮像プログラム、ならびに、画像処理装置、画像処理方法および画像処理プログラム
CN110445989B (zh) 图像处理方法、装置、存储介质及电子设备
JP6218389B2 (ja) 画像処理装置及び画像処理方法
EP3306913B1 (fr) Procédé et appareil de photographie
US8953013B2 (en) Image pickup device and image synthesis method thereof
WO2019019904A1 (fr) Procédé et appareil de traitement d'équilibrage de blancs, et terminal
CN112565636A (zh) 图像处理方法、装置、设备和存储介质
TW202022799A (zh) 測光補償方法及其相關監控攝影設備
CN116416122B (zh) 图像处理方法及其相关设备
WO2022206353A1 (fr) Procédé de traitement d'image, appareil photographique, appareil de traitement d'image, et support de stockage lisible
CN111698493A (zh) 白平衡处理方法和装置
WO2023000878A1 (fr) Procédé et appareil de prise de photographies, dispositif de commande, dispositif et support de stockage lisible par ordinateur
CN112653845B (zh) 曝光控制方法、装置、电子设备及可读存储介质
US9473716B2 (en) Image processing method and image processing device
WO2015192545A1 (fr) Procédé et appareil de photographie et support de mémoire informatique
WO2016202073A1 (fr) Procédé et appareil de traitement d'image
WO2021223113A1 (fr) Procédé de mesure, caméra, dispositif électronique et support de stockage lisible par ordinateur
JP2017068513A (ja) 画像処理装置及びその方法、プログラム、記憶媒体
JP6554009B2 (ja) 画像処理装置、その制御方法、プログラム及び記録媒体
CN117135293B (zh) 图像处理方法和电子设备
CN112949392B (zh) 图像处理方法及装置、存储介质、终端

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE