WO2019105305A1 - Image brightness processing method, computer-readable storage medium, and electronic device - Google Patents


Info

Publication number
WO2019105305A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, processed, brightness parameter, environment information, environment
Prior art date
Application number
PCT/CN2018/117299
Other languages
English (en)
French (fr)
Inventor
孙剑波 (Sun Jianbo)
Original Assignee
Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 (Guangdong OPPO Mobile Telecommunications Corp., Ltd.)
Publication of WO2019105305A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/76: Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H04N23/80: Camera processing pipelines; Components thereof

Definitions

  • the present application relates to the field of image brightness processing technologies, and in particular, to an image brightness processing method, a non-volatile computer readable storage medium, and an electronic device.
  • The conventional image brightness processing method either shoots with a fixed brightness parameter, or calculates a corresponding brightness parameter from the environmental information available in the image to be generated and shoots according to the calculated parameter.
  • The object is photographed with the goal that the presented image is consistent with the brightness of the object as perceived by the human eye.
  • However, the environmental information available from the captured image is limited, so the conventional method is not accurate enough in reproducing the brightness of the subject.
  • Various embodiments provided in accordance with the present application provide an image brightness processing method, a non-transitory computer readable storage medium, and an electronic device.
  • An image brightness processing method includes:
  • An electronic device includes a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, cause the processor to:
  • One or more non-transitory computer readable storage media contain computer executable instructions that, when executed by one or more processors, cause the one or more processors to:
  • The image brightness processing method, storage medium, and electronic device provided by the embodiments of the present application acquire, when capturing an image, environment information of the image to be processed obtained by moving the camera in advance, and determine a corresponding target brightness parameter according to that environment information. Because environment information beyond the captured image itself is introduced, more reference information is available for calculating the target brightness parameter, which improves the accuracy of the calculated parameter; the image to be processed is then processed according to that parameter, which in turn improves the accuracy of the image brightness processing.
  • FIG. 1 is an application environment diagram of an image brightness processing method in an embodiment.
  • FIG. 2 is a schematic diagram showing the internal structure of an electronic device in an embodiment.
  • FIG. 3 is a flow chart of an image brightness processing method in one embodiment.
  • FIG. 4A is a schematic diagram of an image to be processed in one embodiment.
  • FIG. 4B is a schematic diagram of environmental information of an image to be processed in an embodiment.
  • FIG. 5 is a flow chart of calculating an ambient illuminance of an environment in which an image to be processed is located according to environmental information in one embodiment.
  • Figure 6 is a flow diagram of determining a corresponding target brightness parameter based on ambient illuminance in one embodiment.
  • FIG. 7 is a flow chart of a method of processing image brightness in another embodiment.
  • Figure 8 is a block diagram showing the structure of an image brightness processing apparatus in an embodiment.
  • Figure 9 is a schematic illustration of a capture circuit in one embodiment.
  • FIG. 1 is an application environment diagram of an image brightness processing method in an embodiment.
  • The electronic device 110 can call the camera to perform shooting, for example scanning the object 120 in the environment in real time to obtain frame images, and generating a captured image from the frame images.
  • the camera may be a dual camera, including a main camera and a sub camera, and the image is generated according to the main camera and the sub camera.
  • the electronic device may use the frame image or the generated image as an image to be processed, and acquire environment information of the image to be processed obtained by moving the camera in advance; determine a corresponding target brightness parameter according to the environment information; and process the image according to the target brightness parameter Perform brightness processing.
  • the electronic device includes a processor, a memory, a display, and a camera connected by a system bus.
  • the processor is used to provide computing and control capabilities to support the operation of the entire electronic device.
  • the memory is used to store data, programs, etc., and at least one computer program is stored on the memory, and the computer program can be executed by the processor to implement an image brightness processing method suitable for an electronic device provided in the embodiments of the present application.
  • The memory may include a non-volatile storage medium such as a magnetic disk, an optical disk, or a read-only memory (ROM), or a random access memory (RAM).
  • the memory includes a non-volatile storage medium and an internal memory.
  • Non-volatile storage media stores operating systems, databases, and computer programs.
  • the database stores data related to an image brightness processing method provided by the following embodiments, such as data that can be processed, image information, and the like.
  • the computer program can be executed by a processor for implementing an image brightness processing method provided by the various embodiments below.
  • the internal memory provides a cached operating environment for operating systems and computer programs in non-volatile storage media.
  • the display screen can be a touch screen, such as a capacitive screen or an electronic screen, for displaying visual information such as an image to be processed, and can also be used to detect a touch operation acting on the display screen to generate a corresponding instruction.
  • FIG. 2 is only a block diagram of a part of the structure related to the solution of the present application, and does not constitute a limitation on the electronic device to which the solution of the present application is applied.
  • A specific electronic device may include more or fewer components than shown in the figures, combine certain components, or have a different arrangement of components.
  • the electronic device may further include a network interface connected through a system bus, and communicate with other devices through the network interface, for example, data such as an image or a brightness algorithm on other devices may be acquired through the network interface.
  • an image brightness processing method is provided.
  • This embodiment is mainly described by applying the method to the electronic device shown in FIG. 1, and the method includes:
  • Operation 302 obtaining an image to be processed.
  • the image to be processed refers to an image that needs to be processed by brightness, and may be an image that has been generated by shooting, or may be a frame image obtained by real-time scanning by a camera in a shooting mode.
  • When the image to be processed is a frame image and the electronic device receives an instruction to turn on the camera, the camera may be called to enter a shooting state.
  • the camera includes a main camera and a sub camera.
  • the electronic device can scan an object in the shooting environment through the main camera and/or the sub camera to form the frame image.
  • the electronic device may receive a shooting instruction, and generate a captured image according to the real-time frame image obtained by scanning, and the generated image is the image to be processed.
  • the shooting instruction may be a shooting instruction triggered by the detected related touch operation, the pressing operation of the physical button, or the voice control operation.
  • the touch operation can be a touch click operation, a touch long press operation, a touch slide operation, a multi-touch operation, and the like.
  • the electronic device can provide a shooting button for triggering shooting, and when a click operation on the button is detected, a shooting instruction is triggered.
  • the electronic device may also preset shooting voice information for triggering the shooting instruction. By calling the voice receiving device, the corresponding voice information is received, and by analyzing the voice information, when the voice information is detected to match the captured voice information, the shooting instruction can be triggered.
  • Operation 304 Acquire environment information of a to-be-processed image obtained by moving the camera in advance.
  • The electronic device may acquire, by moving the camera, environment information related to the image to be processed before the image to be processed is generated or while it is being generated.
  • the environment information is information about an environment in which the image to be processed is located, and includes information about a scene in the image to be processed and a periphery of the scene.
  • The environment information may be embodied in the form of an image or a frame image. Each pixel in the environment information corresponds to a location within the environment, and the color represented by the pixel is the color of the corresponding location in the environment as scanned by the camera.
  • the environmental information is presented in the form of an image or a frame image
  • the environment information presents an environment in which the image to be processed is located.
  • FIG. 4A is a schematic diagram of an image to be processed
  • FIG. 4B is a schematic diagram of environmental information of the image to be processed.
  • the image to be processed is mainly a cartoon portrait
  • the environmental information includes the cartoon portrait, and also includes information on the plants on both sides of the portrait body and the white background on both sides of the cartoon portrait head. It can be understood that the user can move the camera in the shooting state before taking the image to be processed.
  • the electronic device can record and organize the environmental information of the image to be processed by the movement of the camera.
  • Operation 306 determining a corresponding target brightness parameter based on the environmental information.
  • the brightness parameter indicates the parameter that needs to be used when processing the brightness of the image.
  • the brightness parameter may include, but is not limited to, one or more of sensitivity, exposure amount, exposure duration, and the like.
  • the target brightness parameter indicates the brightness parameter used when adjusting the brightness rendering effect of the image to be processed.
  • the electronic device may perform a brightness parameter calculation according to the environmental information to calculate a target brightness parameter suitable for the image to be processed.
  • The environment information includes the picture content presented in the image to be processed; therefore, the target brightness parameter can be determined from the environment information alone.
  • The electronic device may preset a calculation model for the target brightness parameter, use the environment information as the input to the calculation model, and run the model to output the corresponding target brightness parameter.
  • Operation 308 performing brightness processing on the image to be processed according to the target brightness parameter.
  • the image to be processed is composed of a plurality of pixels, each of which may be composed of a plurality of color channels, each color channel representing a color component.
  • For example, an image may be composed of three RGB channels (red, green, and blue), three HSV channels (hue, saturation, and value), or three CMY channels (cyan, magenta, and yellow), or stored in the YUV data format (also known as YCrCb, a color encoding method used by European television systems). Each of these formats can be converted to the others; for example, each pixel can be converted from the YUV format to the RGB format.
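The conversion just mentioned can be sketched with the widely used BT.601 full-range formulas; the exact coefficients in a given camera pipeline may differ, so this is only an illustrative example, not the conversion mandated by the application.

```python
def yuv_to_rgb(y, u, v):
    """Convert one YUV pixel (U and V centred at 128) to an RGB triple,
    using BT.601 full-range coefficients."""
    r = y + 1.402 * (v - 128)
    g = y - 0.344136 * (u - 128) - 0.714136 * (v - 128)
    b = y + 1.772 * (u - 128)
    clamp = lambda c: max(0, min(255, round(c)))
    return clamp(r), clamp(g), clamp(b)

# A neutral pixel (U = V = 128) maps to a pure grey of the same Y value,
# which is why the Y channel alone can serve as the grayscale brightness.
grey = yuv_to_rgb(100, 128, 128)
```

Because a chroma-neutral pixel reduces to its Y value in all three RGB channels, reading only the Y data (as the embodiments below do) is a cheap way to obtain per-pixel brightness.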
  • The electronic device can extract the corresponding Y data from the YUV data of the frame image, where the Y data represents brightness (luminance or luma), that is, a grayscale value, and the corresponding target brightness parameter is determined according to the Y data.
  • Each pixel in the frame image corresponds to a grayscale value, and the electronic device can read the grayscale value corresponding to each pixel, or can read some grayscale values therein.
  • The electronic device can correct the value of each color channel according to the corresponding target brightness parameter, thereby realizing the brightness processing of the image to be processed, so that the brightness of the processed image is more accurate and closer to the brightness perceived by the human eye.
  • With the image brightness processing method described above, when capturing an image, the environment information of the image to be processed obtained by moving the camera in advance is acquired, and a corresponding target brightness parameter is determined according to that environment information. Because environment information beyond the captured image itself is introduced, more reference information is available for calculating the target brightness parameter, which improves the accuracy of the calculated parameter; brightness processing is then performed on the image to be processed according to that parameter, correspondingly improving the accuracy of the image brightness processing.
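The four operations (302 to 308) can be summarized as a minimal, self-contained sketch. All function names here are illustrative, frames are modeled as nested lists of 0-255 luma values, and the "target brightness parameter" is reduced to a single gain factor; a real pipeline would stitch frames and tune several parameters.

```python
def mean_luma(frame):
    """Average brightness (Y value) over every pixel of one frame."""
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def build_environment_info(frames):
    """Operation 304: collect frames scanned while moving the camera.
    A real implementation would merge only non-overlapping regions."""
    return frames

def determine_target_parameter(env_frames, target_luma=128.0):
    """Operation 306: derive a gain mapping ambient luma to a target level."""
    ambient = sum(mean_luma(f) for f in env_frames) / len(env_frames)
    return target_luma / ambient

def apply_brightness(image, gain):
    """Operation 308: scale each pixel by the target brightness parameter."""
    return [[min(255, round(p * gain)) for p in row] for row in image]

# Usage: two scanned environment frames plus the image to be processed.
frames = [[[64, 64], [64, 64]], [[192, 192], [192, 192]]]
image = [[100, 100], [100, 100]]        # operation 302
gain = determine_target_parameter(build_environment_info(frames))
processed = apply_brightness(image, gain)
```

With the dark and bright environment frames averaging to the target level, the gain comes out neutral and the image is left unchanged, which illustrates the point of the method: the environment, not just the framed subject, decides the correction.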
  • The foregoing operation 302 may be performed before operation 304, or after operation 306; that is, before the image to be processed is acquired, the target brightness parameter may first be calculated from the environment information, which can further improve the efficiency of the brightness processing of the image to be processed.
  • the electronic device can set the camera to the shooting state, and in the process of moving the camera, before generating the image to be processed, the electronic device can obtain real-time environmental information according to the mobile scan, and calculate the target in real time according to the environmental information. Brightness parameter.
  • When receiving the shooting instruction, the electronic device generates the image to be processed and performs brightness processing on it using the most recently calculated target brightness parameter, thereby improving the efficiency of the brightness processing.
  • operation 304 includes: acquiring a real-time frame image obtained by moving the camera in advance; and obtaining environment information of the image to be processed according to the frame image generated at different times.
  • the electronic device can generate a frame image in real time according to a corresponding frame rate.
  • the frame rate may be a fixed set frame rate, and may also be a frame rate that is adaptively determined according to information such as brightness of the current environment.
  • an electronic device can generate a frame image in real time at a frame rate of 30 frames per second.
  • the camera can be moved such that frame images generated at different times are not necessarily the same.
  • the electronic device can incorporate the information in each frame image into the environment information, and obtain complete environmental information from the frame images generated at different times.
  • The electronic device may extract, from the frame images generated at different times, only the image regions that differ from the image information in the previously generated frame images, and incorporate those regions into the environment information, so that the environment information covers the space in which the image to be processed is located without repeating any of the space information.
  • obtaining the environment information of the image to be processed according to the frame image generated at different times includes: comparing the frame images generated at different times to obtain environment information of the image to be processed.
  • The electronic device may extract a color channel of the pixels in each frame image, perform image comparison between the current frame and a preset number of frame images preceding the current frame, and identify the non-overlapping area of the current frame relative to those preceding frame images.
  • the electronic device may further analyze the positional relationship of each of the non-repetitive regions in the entire space, and form environmental information of the image to be processed according to the non-repetitive region and the positional relationship.
  • the environment information may be embodied in a data format of the frame image, that is, the environment information may be a panoramic image synthesized by the above-mentioned detected frame images, the non-overlapping region, and the corresponding spatial position of the region in the entire environment.
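The merging of frames into environment information by detecting overlap can be illustrated with a deliberately simplified toy model: each "frame" is a one-dimensional strip of pixel values, and the overlap is found by exact suffix/prefix matching. Real frames are two-dimensional and the matching must tolerate noise, so this sketch only shows the bookkeeping, not a production stitcher.

```python
def merge_strip(environment, frame):
    """Append only the non-overlapping tail of `frame` to `environment`,
    by finding the largest suffix of the environment that equals a
    prefix of the new frame."""
    max_overlap = min(len(environment), len(frame))
    for k in range(max_overlap, 0, -1):
        if environment[-k:] == frame[:k]:
            return environment + frame[k:]   # keep only the new region
    return environment + frame               # no overlap found

# Usage: three strips scanned while panning the camera to the right.
env = [1, 2, 3, 4]
env = merge_strip(env, [3, 4, 5, 6])   # overlap [3, 4] is discarded
env = merge_strip(env, [6, 7])         # overlap [6] is discarded
```

The accumulated `env` plays the role of the environment information: it spans everything the camera has seen, with each region recorded once.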
  • The real-time frame images are obtained by scanning with the moving camera, and the environment information of the image to be processed is obtained from the frame images generated at different times, so that more environment information is acquired.
  • The electronic device may compare the difference area of the picture between adjacent frame images according to a spatial scene modeling algorithm, and determine the spatial coordinates of the camera according to the area of the difference region in the corresponding frame image.
  • the spatial coordinates include linear coordinates and angular coordinates.
  • The electronic device can determine the coordinate position between each region in the shooting scene and the camera, thereby obtaining the environment information from the coordinate positions and the different pictures in each frame image, as well as the area of the shooting scene that is captured in the image to be processed.
  • The spatial scene modeling algorithm may be a simultaneous localization and mapping (SLAM) algorithm; the electronic device generates corresponding frame images in real time during the movement of the camera and obtains the environment information according to those frame images.
  • Obtaining the environment information of the image to be processed according to the frame images generated at different times includes: calling a motion detecting component to detect the movement data of the camera when each frame image is generated, and obtaining the environment information of the image to be processed from the frame images according to the movement data.
  • The motion detecting component is an element suitable for detecting the motion state of the device, and may include, but is not limited to, components such as a gyroscope, a gravity sensor, or an accelerometer.
  • the electronic device can call the built-in motion detection component to calculate the movement data of the camera during the movement.
  • the moving data may include a combination of one or more of a moving speed, a moving distance, and a moving angle.
  • the electronic device calculates the relative movement data of the camera relative to the shooting reference frame when the current frame image is captured according to the frame rate of the shooting scan.
  • the relative movement data is the movement data of the camera at the current time relative to the time when the reference frame is captured.
  • The reference frame may be the frame image captured when the environment information is first recorded, or any one of the frame images that participates in recording the environment information.
  • the electronic device can calculate the positional relationship in the space between the picture information of the currently captured frame image and the picture information between the reference frames. It will be appreciated that there may be duplicate portions between the picture information. According to the positional relationship, all the environmental information scanned by the camera can be obtained from the frame images generated at different times.
  • In this way, the accuracy of the detected movement data can be improved, thereby improving the accuracy of the detected environment information.
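One simple way to turn accelerometer samples into the per-frame movement data described above is double integration over the sampling interval. This is a hedged sketch only: it assumes ideal, bias-free samples along one axis, and it omits the drift correction and gyroscope fusion that any real motion pipeline needs.

```python
def displacement(accel_samples, dt):
    """Integrate acceleration (m/s^2) sampled every `dt` seconds twice,
    yielding the camera's displacement along one axis in metres."""
    velocity, position = 0.0, 0.0
    for a in accel_samples:
        velocity += a * dt          # first integration: speed
        position += velocity * dt   # second integration: distance
    return position

# Constant 1 m/s^2 for 1 s in ten 0.1 s steps; the discrete sum slightly
# overshoots the continuous result s = a * t^2 / 2 = 0.5 m.
d = displacement([1.0] * 10, 0.1)
```

Relative movement data for the current frame versus the reference frame would then be the difference of such integrated positions (and, with a gyroscope, of integrated angles) at the two capture times.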
  • operation 306 includes calculating an ambient illuminance of an environment in which the image to be processed is based on the environmental information; determining a corresponding target brightness parameter based on the ambient illuminance.
  • Illuminance refers to the luminous flux of visible light received per unit area, and is used to indicate the intensity of illumination, that is, the degree to which the surface of an object is illuminated.
  • the unit can be expressed in lux (Lux or Lx).
  • The ambient illuminance of the environment in which the image to be processed is located indicates the degree to which the surfaces of the objects in the image to be processed are illuminated.
  • the electronic device can acquire the Y data at each pixel in the environmental information to read the brightness information of each pixel, and calculate the ambient illuminance according to the brightness information. Alternatively, the Y data at each pixel point may be superimposed and averaged, and the calculated average value is taken as the ambient illuminance.
  • the target brightness parameter can be determined based on the ambient illumination.
  • the electronic device can establish a correspondence between the ambient illuminance and the target brightness parameter, and the correspondence can be embodied by a comparison table between the environment parameter and the target brightness parameter.
  • the electronic device can query the brightness parameter corresponding to the calculated ambient illuminance from the comparison table, and determine the target brightness parameter according to the brightness parameter.
  • the brightness parameter of the query may be directly used as the target brightness parameter, or the target brightness parameter may be multiplied by a corresponding coefficient, and the obtained product is used as the target brightness parameter.
  • the coefficient may be a fixed coefficient and may also be a coefficient determined according to an area occupied by the image to be processed in the environmental information.
  • the target brightness parameter is calculated according to the ambient illuminance, and the efficiency of the target brightness parameter calculation can be improved.
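The comparison table between ambient illuminance and brightness parameter could take the following shape. The lux break-points, the ISO-style values, and the coefficient are all made up for illustration; the application only specifies that such a table and an optional coefficient exist.

```python
import bisect

ILLUMINANCE_STEPS = [10, 100, 1000]        # lux thresholds, ascending
REFERENCE_ISO     = [1600, 800, 400, 100]  # one entry per illuminance bucket

def reference_brightness(lux, coefficient=1.0):
    """Look up the bucketed reference brightness parameter for an ambient
    illuminance, then scale it by an optional coefficient."""
    bucket = bisect.bisect_right(ILLUMINANCE_STEPS, lux)
    return REFERENCE_ISO[bucket] * coefficient

# 50 lux falls in the 10-100 lux bucket, a dim indoor scene.
iso = reference_brightness(50)
```

The coefficient parameter corresponds to the fixed or area-dependent coefficient mentioned above: passing, say, the fraction of the environment occupied by the image to be processed would shade the looked-up value toward the framed subject.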
  • the ambient illuminance of the environment in which the image to be processed is located is calculated according to the environment information, including:
  • Operation 502 generating a panoramic image according to the environmental information.
  • the electronic device can synthesize a corresponding image according to the pixel points included in the environment information and the positional relationship between the pixel points. Since the environment information is obtained by scanning by moving the camera, the synthesized image is similar to the panoramic image.
  • Operation 504 identifies the brightness of each pixel in the panoramic image.
  • the electronic device can read the Y data at each pixel, and determine the brightness of the corresponding pixel by the Y data.
  • the Y data can be directly used as the brightness of the corresponding pixel.
  • The electronic device may acquire brightness information corresponding to each pixel and determine the brightness of the corresponding pixel from that information, or convert each pixel into the three YUV channels according to the conversion relationship and determine the brightness of the corresponding pixel according to the Y data therein.
  • Operation 506 calculates the ambient illuminance of the environment in which the image to be processed is located according to the brightness.
  • the electronic device may average the brightness of each pixel, and use the obtained average value as the ambient illuminance of the corresponding environment.
  • the electronic device may further combine the position of each pixel in the environmental information, determine a coefficient corresponding to the pixel according to the location, and calculate a corresponding ambient illuminance according to the coefficient and the brightness. For example, the brightness can be multiplied by the corresponding coefficient, and the product obtained for each pixel is summed, and the product sum is taken as the ambient illuminance.
  • the ambient brightness is calculated according to the brightness of each pixel in the panoramic image, and the accuracy of the ambient brightness calculation is further improved.
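The position-weighted variant described above can be sketched as follows. The specific weighting scheme (weights decaying with distance from the panorama centre) is an assumption for illustration; the application only states that a per-position coefficient multiplies each pixel's brightness.

```python
def weighted_illuminance(panorama):
    """Average the luma of a panorama (nested lists of Y values) with
    pixels near the image centre weighted more heavily."""
    h, w = len(panorama), len(panorama[0])
    total, weight_sum = 0.0, 0.0
    for i, row in enumerate(panorama):
        for j, y in enumerate(row):
            # Assumed scheme: weight decays with Manhattan distance
            # from the centre of the stitched panorama.
            wgt = 1.0 / (1.0 + abs(i - (h - 1) / 2) + abs(j - (w - 1) / 2))
            total += wgt * y
            weight_sum += wgt
    return total / weight_sum

# A single bright pixel at the centre of an otherwise dark 3x3 panorama.
lux = weighted_illuminance([[0, 0, 0], [0, 255, 0], [0, 0, 0]])
```

Setting every weight to 1 recovers the plain average described in operation 506; the weighted form simply lets the region around the framed subject dominate the estimate.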
  • determining corresponding target brightness parameters according to ambient illuminance including:
  • Operation 602 obtaining a reference brightness parameter corresponding to the ambient illuminance.
  • The electronic device may preset a correspondence between different ambient illuminances and reference brightness parameters, where the correspondence may be represented by a comparison table between ambient illuminance and reference brightness parameter. From the comparison table, the electronic device can query the reference brightness parameter that matches the ambient illuminance.
  • the comparison table stores brightness parameters that are suitable for use under different environmental illuminances, and the brightness parameters are reference brightness parameters.
  • the brightness parameter may be one or more of sensitivity or exposure duration.
  • Operation 604 Acquire a current brightness parameter of the image to be processed.
  • the current brightness parameter indicates a currently used brightness parameter
  • the brightness parameter may be a brightness parameter adopted by default on the electronic device, or a brightness parameter set according to a user's shooting habit.
  • the electronic device may display the image to be processed according to the current brightness parameter.
  • Operation 606 calculating a target brightness parameter according to the current brightness parameter and the reference brightness parameter.
  • The electronic device can further set a calculation relationship among the target brightness parameter, the current brightness parameter, and the reference brightness parameter, and the corresponding target brightness parameter can be calculated according to that relationship.
  • The target brightness parameter lies between the current brightness parameter and the reference brightness parameter, so that the determined target brightness parameter balances the two.
  • Weights corresponding to the current brightness parameter and the reference brightness parameter may be set respectively; the reference brightness parameter and the current brightness parameter are multiplied by their respective weights and summed, and the resulting weighted sum is taken as the corresponding target brightness parameter.
  • the calculated target brightness parameter is more adapted to the user's usage habits.
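The weighted product-sum of operation 606 is a one-line blend. The equal weights below are an assumption for illustration; in practice the weights would be tuned per device or per user habit, and need only sum to 1 for the result to stay between the two inputs.

```python
def target_brightness(current, reference, w_current=0.5, w_reference=0.5):
    """Weighted product-sum of the current and reference brightness
    parameters, as in operation 606."""
    return current * w_current + reference * w_reference

# Usage with ISO-style values: a habitual ISO 400 user in a scene whose
# reference parameter is ISO 800 lands halfway between the two.
target = target_brightness(current=400, reference=800)
```

Because the result is a convex combination whenever the weights are non-negative and sum to 1, the target parameter is guaranteed to lie between the current and reference parameters, which is exactly the balancing property the embodiment describes.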
  • FIG. 7 another image brightness processing method is provided, the method comprising:
  • Operation 702 acquiring an image to be processed.
  • the image to be processed may be an image generated in real time in a shooting mode, or may be a frame image displayed on the display screen in real time according to a preset frame rate.
  • Operation 704 acquiring a real-time frame image obtained by moving the camera in advance.
  • the electronic device displays a prompt message of the moving camera on the display screen to prompt the user to move the camera.
  • the display manner of the prompt information and the data format of the prompt information may include a plurality of types.
  • the electronic device can display similar text prompt information such as “Please move the camera left and right”, or display a mark for indicating left and right moving graphics or symbols, for example, an arrow indicating left and right movement can be displayed.
  • the electronic device can cache the frame image obtained by the real-time scanning before acquiring the image to be processed in the shooting mode.
  • The camera can be moved in any direction, such as left and right, up and down, or back and forth; for example, the camera can be rotated left and right at a fixed position.
  • The larger the range of camera movement, the more environment information can be collected, making the subsequent brightness processing more accurate.
  • the user can hold the electronic device and perform an environment scan on the scene to be photographed before the image is to be processed.
  • the electronic device can be hand-held for 360° rotation to obtain environmental information of the entire space.
  • the frame image can be generated in real time according to a preset frame rate.
  • Operation 706 comparing frame images generated at different times to obtain environment information of the image to be processed.
  • The frame images that participate in extracting the environment information may be frame images acquired within a preset duration before the generation time of the image to be processed, or frame images obtained during the capture of the image to be processed without leaving the shooting mode.
  • The electronic device may compare the picture content of every two adjacent frame images to identify the non-overlapping region of the current frame relative to a preset number of frame images preceding it.
  • the electronic device analyzes the positional relationship of each of the non-repetitive regions in the entire space, and forms environmental information of the image to be processed according to the non-repetitive region and the positional relationship.
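A toy sketch of this per-frame comparison (the helper names and the cell model are invented for illustration; a real implementation would register actual image regions rather than abstract cells):

```python
def non_overlapping(current_frame, previous_frames):
    """Cells of the current frame not covered by any previous frame.

    Each frame is modeled as a set of (x, y) scene-cell coordinates,
    a stand-in for the regions a real image comparison would find.
    """
    seen = set()
    for frame in previous_frames:
        seen |= set(frame)
    return set(current_frame) - seen


def accumulate_environment(frames):
    """Build up environment information as the union of every frame's
    non-overlapping contribution, in scan order."""
    environment = set()
    for frame in frames:
        environment |= non_overlapping(frame, [environment])
    return environment
```

Scanning several overlapping frames accumulates the union of their coverage; only the cells a frame adds beyond what was already seen contribute new environment information.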
  • Following a preset SLAM (simultaneous localization and mapping) algorithm, the electronic device may construct the whole spatial information of the shooting scene of the image to be processed from the frames captured while the camera moves, and based on that spatial information record as much environment information as possible for use in brightness processing.
  • Operation 708: generate a panoramic image according to the environment information; identify the brightness of each pixel in the panoramic image; and compute, from those brightness values, the ambient illuminance of the environment in which the image to be processed is located.
  • The non-overlapping regions are themselves composed of pixels, and the positional relationships between the regions determine those between the pixels, so the panoramic image can be synthesized from the pixels and their positions. Understandably, because the camera is not necessarily moved in a regular pattern, the synthesized panorama is not necessarily a complete rectangle, and pixels may be missing in some areas.
  • The electronic device can read the Y data of the YUV channels at each pixel of the panoramic image, compute the average of the Y data, and use the average as the ambient illuminance of the environment in which the image to be processed is located.
  • Compared with the image to be processed alone, the environment information carries more reference information, so the ambient illuminance computed from it is more accurate.
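As a minimal sketch of this computation (function and parameter names are invented, not from the patent), averaging the Y channel over the pixels actually covered by the panorama might look like:

```python
import numpy as np


def ambient_illuminance(panorama_y, valid_mask=None):
    """Mean luma (Y) of a panorama, used here as the ambient illuminance.

    panorama_y : 2-D array of per-pixel Y values.
    valid_mask : optional boolean array marking pixels the scan actually
                 covered (a stitched panorama need not be a full
                 rectangle, so some pixels may be missing).
    """
    y = np.asarray(panorama_y, dtype=np.float64)
    if valid_mask is not None:
        y = y[np.asarray(valid_mask, dtype=bool)]
    return float(y.mean())
```

For example, a 2×2 panorama with Y values 100, 120, 140, 160 gives an illuminance of 130.0; masking out a missing pixel restricts the average to the covered region.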
  • Operation 710: obtain the reference brightness parameter corresponding to the ambient illuminance; obtain the current brightness parameter of the image to be processed; and compute the target brightness parameter from the current and reference brightness parameters.
  • The electronic device holds a preset lookup table relating ambient illuminance to reference brightness parameters.
  • The electronic device may query this table for the reference brightness parameter corresponding to the computed ambient illuminance.
  • The electronic device also obtains the current brightness parameter of the image to be processed, that is, the brightness parameter used when the image was generated.
  • The current brightness parameter can be the system default, or a shooting parameter computed from the user's shooting habits.
  • The electronic device may set corresponding weights for the current and reference brightness parameters, multiply each parameter by its weight, sum the products, and use the weighted sum as the target brightness parameter.
  • Operation 712: perform brightness processing on the image to be processed according to the target brightness parameter.
  • As an example, take exposure as the brightness parameter, with preset weights of 0.6 and 0.4 for the current and reference parameters and a current exposure of 200 for the image to be processed. If averaging the Y channel over all pixels of the panorama yields an ambient illuminance of 2000 Lux, and the lookup table maps 2000 Lux to a reference value of 230, the target parameter is 200 × 0.6 + 230 × 0.4 = 212, and the image is brightness-processed accordingly.
  • Weighting in the current parameter keeps the brightness adjustment of the image to be processed from being excessive, balancing the magnitude of the adjustment against the image's true brightness.
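The table lookup and weighted sum described above can be sketched as follows (the lookup-table values are hypothetical except for the 2000 Lux → 230 pair taken from the worked example in the description; a real device would store a calibrated table):

```python
def reference_brightness(illuminance, table):
    """Reference brightness parameter for a measured illuminance,
    using the nearest entry of a preset illuminance -> parameter table."""
    nearest = min(table, key=lambda lux: abs(lux - illuminance))
    return table[nearest]


def target_brightness(current, reference, w_current=0.6, w_reference=0.4):
    """Weighted sum of the current and reference brightness parameters."""
    return current * w_current + reference * w_reference


# Hypothetical lookup table (Lux -> reference parameter); only the
# 2000 -> 230 entry comes from the worked example in the description.
BRIGHTNESS_TABLE = {500: 400, 1000: 320, 2000: 230, 4000: 160}
```

With a current exposure of 200 and a measured illuminance of 2000 Lux, this reproduces the example: target = 200 × 0.6 + 230 × 0.4 = 212.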
  • The operations in the method flowcharts of the embodiments of the present application are displayed in the order indicated by the arrows, but they are not necessarily executed in that order. Unless explicitly stated herein, their execution order is not strictly limited and they may be executed in other orders. Moreover, at least some of the operations may comprise multiple sub-operations or stages, which need not be completed at the same moment but may be executed at different times, and need not run sequentially but may be executed in turn or alternately with other operations or with at least part of their sub-operations or stages.
  • an image brightness processing apparatus comprising:
  • the image obtaining module 802 is configured to acquire an image to be processed.
  • the environment information generating module 804 is configured to acquire environment information of the to-be-processed image obtained by moving the camera in advance.
  • the brightness processing module 806 is configured to determine a corresponding target brightness parameter according to the environment information; perform brightness processing on the image to be processed according to the target brightness parameter.
  • the environment information generating module 804 is further configured to acquire a real-time frame image obtained by moving the camera in advance; and obtain environment information of the image to be processed according to the frame image generated at different times.
  • the environment information generating module 804 is further configured to compare frame images generated at different times to obtain environment information of the image to be processed.
  • the environment information generating module 804 is further configured to invoke a motion detecting component to detect motion data of the camera when each frame image is generated; and to obtain environment information of the image to be processed from the frame image according to the motion data.
  • the brightness processing module 806 is further configured to calculate an ambient illuminance of an environment in which the image to be processed is located according to the environment information; and determine a corresponding target brightness parameter according to the ambient illuminance.
  • the brightness processing module 806 is further configured to generate a panoramic image according to the environment information; identify the brightness of each pixel in the panoramic image; and calculate an ambient illuminance of the environment in which the image to be processed is located according to the brightness.
  • the brightness processing module 806 is further configured to acquire a reference brightness parameter corresponding to the ambient illuminance; acquire a current brightness parameter of the image to be processed; and calculate a target brightness parameter according to the current brightness parameter and the reference brightness parameter.
  • The division of the image brightness processing device into the modules above is for illustration only; in other embodiments, the device may be divided into different modules as needed to complete all or part of its functions.
  • Each module of the image brightness processing device may be implemented wholly or partly in software, hardware, or a combination of the two.
  • The modules may be embedded in hardware form in, or independent of, a processor in the server, or stored in software form in the memory of the server, so that the processor can invoke and execute the operations corresponding to each module.
  • the terms "component”, “module” and “system” and the like are intended to mean a computer-related entity, which may be hardware, a combination of hardware and software, software, or software in execution.
  • a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
  • By way of illustration, both an application running on a server and the server can be components.
  • One or more components can reside within a process and/or thread of execution, and a component can be located on one computer and/or distributed between two or more computers.
  • a non-transitory computer readable storage medium having stored thereon a computer program that, when executed by a processor, implements the operations of the image brightness processing methods provided by the various embodiments described above.
  • an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the operations of the image brightness processing method provided by the above embodiments are implemented.
  • the embodiment of the present application also provides a computer program product.
  • a computer program product comprising instructions which, when run on a computer, cause the computer to perform the operations of the image brightness processing method provided by the various embodiments described above.
  • An embodiment of the present application further provides an electronic device.
  • the above electronic device includes a photographing circuit, and the photographing circuit can be implemented by using hardware and/or software components, and can include various processing units defining an ISP (Image Signal Processing) pipeline.
  • FIG. 9 is a schematic diagram of a photographing circuit in one embodiment. As shown in FIG. 9, for ease of explanation, only the aspects of the photographing technique relevant to the embodiments of the present application are shown.
  • the photographing circuit includes an ISP processor 940 and a control logic 950.
  • the image data captured by imaging device 910 is first processed by ISP processor 940, which analyzes the image data to capture image statistics that can be used to determine and/or control one or more control parameters of imaging device 910.
  • Imaging device 910 can include a camera having one or more lenses 912 and image sensor 914.
  • Image sensor 914 may include a color filter array (such as a Bayer filter); it may acquire the light intensity and wavelength information captured by each imaging pixel of image sensor 914 and provide a set of raw image data that can be processed by ISP processor 940.
  • a sensor 920 such as a gyroscope, can provide acquired photographic parameters (such as anti-shake parameters) to the ISP processor 940 based on the sensor 920 interface type.
  • the sensor 920 interface may utilize a SMIA (Standard Mobile Imaging Architecture) interface, other serial or parallel camera interfaces, or a combination of the above.
  • image sensor 914 can also transmit raw image data to sensor 920, which can provide raw image data to ISP processor 940 based on sensor 920 interface type, or sensor 920 can store raw image data into image memory 930.
  • the ISP processor 940 processes the raw image data pixel by pixel in a variety of formats.
  • For example, each image pixel can have a bit depth of 9, 10, 12, or 14 bits; the ISP processor 940 can perform one or more shooting operations on the raw image data and collect statistics about it. The shooting operations can be performed at the same or different bit-depth precision.
  • the ISP processor 940 can also receive image data from the image memory 930.
  • For example, the sensor 920 interface transmits raw image data to image memory 930, and the raw image data in image memory 930 is then provided to ISP processor 940 for processing.
  • Image memory 930 may be part of a memory device, a storage device, or a separate dedicated memory within an electronic device, and may include DMA (Direct Memory Access) features.
  • the ISP processor 940 can perform one or more capture operations, such as time domain filtering.
  • the processed image data can be sent to image memory 930 for additional processing prior to being displayed.
  • The ISP processor 940 can also receive data from the image memory 930 and process it in the raw domain as well as in the RGB and YCbCr color spaces.
  • the processed image data can be output to display 980 for viewing by a user and/or further processed by a graphics engine or a GPU (Graphics Processing Unit). Additionally, the output of ISP processor 940 can also be sent to image memory 930, and display 980 can read image data from image memory 930.
  • image memory 930 can be configured to implement one or more frame buffers. Additionally, the output of ISP processor 940 can be sent to encoder/decoder 970 to encode/decode image data. The encoded image data can be saved and decompressed before being displayed on the display 980 device.
  • the ISP processor 940 processes the image data by performing VFE (Video Front End) processing and CPP (Camera Post Processing) processing on the image data.
  • VFE processing of the image data may include correcting the contrast or brightness of the image data, modifying digitally recorded lighting state data, performing compensation processing on the image data (such as white balance, automatic gain control, and gamma correction), and filtering the image data.
  • CPP processing of image data may include scaling the image, providing a preview frame and a recording frame to each path. Among them, CPP can use different codecs to process preview frames and record frames.
  • the image data processed by the ISP processor 940 can be sent to the beauty module 960 for aesthetic processing of the image before being displayed.
  • The beauty treatment of the image data by the beauty module 960 may include whitening, freckle removal, skin smoothing, face slimming, blemish removal, eye enlargement, and the like.
  • the beauty module 960 can be a CPU (Central Processing Unit), a GPU, or a coprocessor in an electronic device.
  • the processed data of the beauty module 960 can be sent to the encoder/decoder 970 to encode/decode the image data.
  • the encoded image data can be saved and decompressed before being displayed on the display 980 device.
  • The beauty module 960 can also be located between the encoder/decoder 970 and the display 980; that is, the beauty module performs beautification on the already-imaged picture.
  • the encoder/decoder 970 described above may be a CPU, GPU, coprocessor, or the like in an electronic device.
  • the statistics determined by the ISP processor 940 can be sent to the control logic 950 unit.
  • the statistics may include image sensor 914 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black level compensation, lens 912 shading correction, and the like.
  • Control logic 950 can include a processor and/or microcontroller that executes one or more routines (such as firmware); based on the received statistics, the routines can determine control parameters of imaging device 910 and of ISP processor 940.
  • Control parameters of imaging device 910 may include sensor 920 control parameters (e.g., gain, integration time for exposure control), camera flash control parameters, lens 912 control parameters (e.g., focus or zoom focal length), or a combination of these parameters.
  • The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (e.g., during RGB processing), and lens 912 shading correction parameters.
  • The image brightness processing method above can be implemented with the photographing technique of FIG. 9.
  • Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM), which acts as an external cache.
  • RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An image brightness processing method, comprising: acquiring an image to be processed; acquiring environment information of the image to be processed, obtained in advance by moving a camera; determining a corresponding target brightness parameter according to the environment information; and performing brightness processing on the image to be processed according to the target brightness parameter.

Description

Image brightness processing method, computer-readable storage medium, and electronic device
Cross-reference to related applications
This application claims priority to the Chinese patent application filed with the Chinese Patent Office on November 28, 2017, with application No. 2017112130315 and entitled "Image brightness processing method and apparatus, storage medium, and electronic device," the entire contents of which are incorporated herein by reference.
Technical field
The present application relates to the field of image brightness processing technologies, and in particular to an image brightness processing method, a non-volatile computer-readable storage medium, and an electronic device.
Background
When shooting under different lighting environments, a camera device needs to use corresponding brightness parameters in order to capture a good-looking image.
Conventional image brightness processing methods either shoot with fixed brightness parameters, or compute corresponding brightness parameters from the environment information already present in the image to be generated and shoot the subject with the computed parameters, so that the rendered image matches the brightness of the objects as perceived by the human eye. However, the environment information obtainable from the captured image is limited, so conventional methods do not restore the subject's brightness accurately enough.
Summary
Various embodiments of the present application provide an image brightness processing method, a non-volatile computer-readable storage medium, and an electronic device.
An image brightness processing method includes:
acquiring an image to be processed;
acquiring environment information of the image to be processed, obtained in advance by moving a camera;
determining a corresponding target brightness parameter according to the environment information; and
performing brightness processing on the image to be processed according to the target brightness parameter.
An electronic device includes a memory and a processor, the memory storing computer-readable instructions which, when executed by the processor, cause the processor to perform the following operations:
acquiring an image to be processed;
acquiring environment information of the image to be processed, obtained in advance by moving a camera;
determining a corresponding target brightness parameter according to the environment information; and
performing brightness processing on the image to be processed according to the target brightness parameter.
One or more non-volatile computer-readable storage media containing computer-executable instructions which, when executed by one or more processors, cause the processors to perform the following operations:
acquiring an image to be processed;
acquiring environment information of the image to be processed, obtained in advance by moving a camera;
determining a corresponding target brightness parameter according to the environment information; and
performing brightness processing on the image to be processed according to the target brightness parameter.
With the image brightness processing method, storage medium, and electronic device provided by the embodiments of the present application, when an image is captured, environment information of the image to be processed, obtained in advance by moving the camera, is acquired, and a corresponding target brightness parameter is determined from it. Introducing the environment information of the captured image provides more reference information for computing the target brightness parameter and improves its accuracy; performing brightness processing on the image according to that parameter accordingly improves the accuracy of the brightness processing.
Details of one or more embodiments of the present application are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the present application will become apparent from the description, the drawings, and the claims.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a diagram of the application environment of an image brightness processing method in one embodiment.
FIG. 2 is a schematic diagram of the internal structure of an electronic device in one embodiment.
FIG. 3 is a flowchart of an image brightness processing method in one embodiment.
FIG. 4A is a schematic diagram of an image to be processed in one embodiment.
FIG. 4B is a schematic diagram of the environment information of an image to be processed in one embodiment.
FIG. 5 is a flowchart of computing, according to the environment information, the ambient illuminance of the environment in which the image to be processed is located, in one embodiment.
FIG. 6 is a flowchart of determining a corresponding target brightness parameter according to the ambient illuminance, in one embodiment.
FIG. 7 is a flowchart of an image brightness processing method in another embodiment.
FIG. 8 is a structural block diagram of an image brightness processing apparatus in one embodiment.
FIG. 9 is a schematic diagram of a photographing circuit in one embodiment.
Detailed description
To make the objectives, technical solutions, and advantages of the present application clearer, the present application is further described in detail below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here are intended only to explain the present application, not to limit it.
FIG. 1 shows the application environment of an image brightness processing method in one embodiment. Referring to FIG. 1, an electronic device 110 can invoke its camera to shoot, for example scanning an object 120 in the environment in real time to obtain frame images and generating the captured image from them. Optionally, the camera may be a dual camera comprising a main camera and a secondary camera, which together perform the shooting and generate the image. The electronic device can take the frame image or the generated image as the image to be processed, acquire environment information of the image to be processed obtained in advance by moving the camera, determine a corresponding target brightness parameter according to the environment information, and perform brightness processing on the image to be processed according to the target brightness parameter.
FIG. 2 shows the internal structure of an electronic device in one embodiment. As shown in FIG. 2, the electronic device includes a processor, a memory, a display screen, and a camera connected via a system bus. The processor provides computing and control capabilities to support the operation of the whole device. The memory stores data, programs, and so on; it stores at least one computer program executable by the processor to implement the image brightness processing method for electronic devices provided in the embodiments of the present application. The memory may include a non-volatile storage medium such as a magnetic disk, an optical disc, or read-only memory (ROM), or random-access memory (RAM). For example, in one embodiment the memory includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a database, and a computer program. The database stores data related to the image brightness processing methods provided by the following embodiments, such as images to be processed and environment information. The computer program can be executed by the processor to implement an image brightness processing method provided by the following embodiments. The internal memory provides a cached runtime environment for the operating system and computer program in the non-volatile storage medium. The display screen may be a touch screen, such as a capacitive or electronic screen, used to display visual information such as the image to be processed; it can also be used to detect touch operations on the screen and generate corresponding instructions.
Those skilled in the art will understand that the structure shown in FIG. 2 is only a block diagram of the parts relevant to the solution of the present application and does not limit the electronic device to which the solution is applied; a specific electronic device may include more or fewer components than shown, combine certain components, or use a different arrangement of components. For example, the electronic device may further include a network interface connected via the system bus, through which it communicates with other devices, for example to obtain images or brightness algorithms from them.
In one embodiment, as shown in FIG. 3, an image brightness processing method is provided.
This embodiment is described mainly by applying the method to the electronic device of FIG. 1. The method includes:
Operation 302: acquire an image to be processed.
The image to be processed is an image that needs brightness processing; it may be an image that has already been captured, or a frame image obtained by real-time scanning with the camera in shooting mode.
When the image to be processed is a frame image, the electronic device, on receiving an instruction to turn on the camera, can bring the camera into the shooting state. The camera includes a main camera and a secondary camera. The electronic device can scan objects in the shooting environment through the main camera and/or the secondary camera to form the frame image.
When the image to be processed is an already captured image, the electronic device can receive a shooting instruction and generate the captured image from the real-time frame images obtained by scanning; the generated image is the image to be processed. The shooting instruction may be triggered by a detected touch operation, a press of a physical button, a voice control operation, or the like. The touch operation may be a tap, a long press, a swipe, a multi-touch operation, and so on. The electronic device may provide a shooting button for triggering shooting and trigger the shooting instruction when a tap on the button is detected. The electronic device may also preset shooting voice information for triggering the shooting instruction: it invokes a voice receiving apparatus to receive voice information, parses it, and triggers the shooting instruction when the received voice matches the preset shooting voice.
Operation 304: acquire environment information of the image to be processed, obtained in advance by moving the camera.
Before or while generating the image to be processed, the electronic device can obtain environment information related to the image by moving the camera. The environment information is information about the environment in which the image to be processed is located, covering the scene in the image and that scene's surroundings. The environment information may take the data form of an image or frame images; each pixel of the environment information corresponds to a position in the environment, and the color at that pixel is the color of the corresponding position as scanned by the camera. When presented as an image or frame images, the environment information shows the environment in which the image to be processed is located.
As shown in FIGS. 4A and 4B, FIG. 4A is a schematic diagram of the image to be processed and FIG. 4B of its environment information. The image to be processed shows mainly a cartoon portrait, while the environment information contains, in addition to the portrait, the plants on both sides of the figure's body and the white background on both sides of its head. Understandably, before capturing the image to be processed, the user can move the camera while it is in the shooting state; through this movement, the electronic device can record and assemble the environment information of the image to be processed.
Operation 306: determine a corresponding target brightness parameter according to the environment information.
A brightness parameter is a parameter needed when processing the brightness of an image; it may include, but is not limited to, one or more of sensitivity, exposure, exposure duration, and so on. The target brightness parameter is the brightness parameter used when adjusting how the brightness of the image to be processed is rendered. The electronic device can compute, from the environment information, a target brightness parameter suitable for the image to be processed. Usually the environment information already contains the picture content of the image to be processed, so the target brightness parameter can be determined from the environment data alone.
In one embodiment, the electronic device may preset a computation model for the target brightness parameter, feed the environment information into the model as input, and run the model to output the corresponding target brightness parameter.
Operation 308: perform brightness processing on the image to be processed according to the target brightness parameter.
The image to be processed consists of a number of pixels, and each pixel may consist of multiple color channels, each representing one color component. For example, an image may be composed of the three RGB channels (red, green, blue), the three HSV channels (hue, saturation, value), the three CMY channels (cyan, magenta, yellow), or the YUV data format (also called YCrCb, a color encoding method adopted by European television systems). The formats are interconvertible; for example, each pixel can be converted from YUV to RGB.
Taking the YUV format as an example, the electronic device can extract the Y data from the frame image's YUV data; the Y data represents luminance (luma), that is, the gray-scale value, and the corresponding target brightness parameter is determined from it. Each pixel of the frame image corresponds to a gray-scale value; the electronic device can read the gray-scale value of every pixel, or of only some of them.
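The paragraphs above mention converting between pixel formats; as one common illustration (BT.601 luma weights, given here as an assumption rather than the patent's own formula), the Y value can be derived from RGB as follows:

```python
def rgb_to_luma(r, g, b):
    """BT.601 luma: Y = 0.299 R + 0.587 G + 0.114 B (components in 0-255).

    This is one standard RGB -> Y conversion; the patent does not fix a
    particular set of coefficients.
    """
    return 0.299 * r + 0.587 * g + 0.114 * b
```

A neutral gray maps to a luma equal to its component value, since the three coefficients sum to 1.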
For each color channel of each pixel, the electronic device can correct the channel according to the corresponding target brightness parameter, thereby performing brightness processing on the image to be processed so that the processed image renders brightness more accurately, closer to what the human eye perceives.
With the image brightness processing method above, when an image is captured, environment information of the image to be processed, obtained in advance by moving the camera, is acquired, and a corresponding target brightness parameter is determined from it. Introducing the environment information of the captured image provides more reference information for computing the target brightness parameter and improves its accuracy; performing brightness processing on the image according to that parameter accordingly improves the accuracy of the brightness processing.
In one embodiment, operation 302 may be executed before operation 304 or after operation 306; that is, the target brightness parameter, computed from the environment information, may be determined before the image to be processed is acquired, which further improves the efficiency of the brightness processing. For example, the electronic device can put the camera into the shooting state and, while the camera moves and before the image to be processed is generated, obtain real-time environment information from the moving scan and compute the target brightness parameter from it in real time. When a shooting instruction is received, the electronic device generates the image to be processed and applies the most recently computed target brightness parameter to it, improving the efficiency of the brightness processing.
In one embodiment, operation 304 includes: acquiring real-time frame images obtained in advance by moving the camera; and obtaining the environment information of the image to be processed from the frame images generated at different times.
The electronic device can generate frame images in real time at a corresponding frame rate, which may be fixed or adaptively determined from information such as the brightness of the current environment; for example, frames may be generated at 30 frames per second. While frames are generated, the camera can be moved, so frames generated at different times are not necessarily identical. The electronic device can incorporate the information of every frame into the environment information, obtaining complete environment information from the frames generated at different times. Alternatively, it may extract from each frame only the image regions that differ from previously generated frames and incorporate those regions, so that the environment information contains the spatial information of the space in which the image to be processed is located, without repetition.
In one embodiment, obtaining the environment information of the image to be processed from frames generated at different times includes: comparing the frames generated at different times to obtain the environment information of the image to be processed.
The electronic device can extract the color channels of the pixels of each frame and compare the picture of the current frame against a preset number of preceding frames to identify the regions of the current frame that do not repeat them. It can further analyze the position of each non-repeating region within the whole space and form the environment information of the image to be processed from the non-repeating regions and their positional relationships. The environment information may take the data form of frame images, that is, a panoramic image synthesized from the detected non-repeating regions of the frames and those regions' spatial positions in the whole environment. Obtaining real-time frames by moving and scanning with the camera, and deriving the environment information from frames generated at different times, yields environment information with a larger amount of information.
Optionally, following a spatial scene modeling algorithm, the electronic device can, as the picture moves, compare the differing regions between adjacent frames and use where those regions fall in the corresponding frames to determine the camera's spatial coordinates within the whole shooting scene, including linear and angular coordinates. From these coordinates, the device knows the position of every region of the scene relative to the camera and derives the environment information from those positions and the differing pictures of each frame; the region captured in the image to be processed lies within this scene. In one embodiment, the spatial scene modeling algorithm may be a simultaneous localization and mapping (SLAM) algorithm: by generating frames in real time while the camera moves and applying the preset SLAM algorithm to them, the electronic device can construct the spatial information of the space the camera occupies under the current picture and, based on it, record as much environment information as possible for use in brightness processing.
In one embodiment, obtaining the environment information of the image to be processed from frames generated at different times includes: invoking a motion detection element to detect the movement data of the camera when each frame is generated; and obtaining the environment information of the image to be processed from the frames according to the movement data.
A motion detection element is an element suited to detecting the motion state of the device and may include, but is not limited to, a gyroscope, a gravity sensor, an acceleration sensor, and the like. The electronic device can invoke its built-in motion detection element to compute the camera's movement data during the move, which may include one or more of movement speed, movement distance, and movement angle. At the scanning frame rate, the device computes the camera's relative movement data for the current frame with respect to a reference frame, that is, the camera's movement between the moment the reference frame was shot and the current moment. The reference frame may be the first frame used to record environment information, or any frame among those that participated in recording it. From the relative movement data, the electronic device can compute the spatial positional relationship between the picture information of the currently captured frame and that of the reference frame; understandably, the pictures may partially overlap. From these positional relationships, all the environment information scanned by the camera can be obtained from the frames generated at different times.
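As a hedged sketch of using such movement data (assuming a pinhole camera undergoing pure yaw rotation, a geometry the patent does not spell out; the function names are invented): the horizontal offset of a frame relative to the reference frame is roughly f·tan(Δθ), where f is the focal length in pixels:

```python
import math


def frame_offset_px(delta_yaw_rad, focal_px):
    """Approximate horizontal pixel offset of a frame shot after the
    camera yawed by delta_yaw_rad radians, relative to the reference
    frame, under a pinhole model with focal length focal_px pixels."""
    return focal_px * math.tan(delta_yaw_rad)


def place_frames(yaw_readings_rad, focal_px):
    """Map each frame's gyroscope yaw reading to a horizontal placement
    in the stitched environment map, taking the first frame as reference."""
    reference = yaw_readings_rad[0]
    return [frame_offset_px(y - reference, focal_px) for y in yaw_readings_rad]
```

Frames shot at increasing yaw angles land at increasing horizontal offsets, which is what lets the non-overlapping picture regions be positioned within the whole space.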
Using the electronic device's built-in motion detection element to detect movement data improves the accuracy of the detected movement data and hence the accuracy of the environment information.
In one embodiment, operation 306 includes: computing, according to the environment information, the ambient illuminance of the environment in which the image to be processed is located; and determining the corresponding target brightness parameter according to the ambient illuminance.
Illuminance is the luminous flux of visible light received per unit area; it indicates the intensity of illumination and the degree to which an object's surface is illuminated, and its unit is the lux (Lux or Lx). The ambient illuminance of the environment of the image to be processed quantifies how strongly the surfaces of the objects in the image are illuminated.
The electronic device can obtain the Y data of each pixel of the environment information to read each pixel's brightness and compute the ambient illuminance from it; optionally, the Y data of all pixels can be summed and averaged, and the average taken as the ambient illuminance. The target brightness parameter can then be determined from the ambient illuminance. The device can establish a correspondence between ambient illuminance and target brightness parameter, for example as a lookup table between them, query the table for the brightness parameter corresponding to the computed illuminance, and determine the target brightness parameter from it. For instance, the queried parameter may be used directly as the target brightness parameter, or it may be multiplied by a coefficient and the product used instead; the coefficient may be fixed or determined by the area the image to be processed occupies within the environment information.
In this embodiment, computing the ambient illuminance and deriving the target brightness parameter from it improves the efficiency of computing the target brightness parameter.
In one embodiment, as shown in FIG. 5, computing, according to the environment information, the ambient illuminance of the environment in which the image to be processed is located includes:
Operation 502: generate a panoramic image according to the environment information.
The electronic device can synthesize an image from the pixels contained in the environment information and the positional relationships between them; because the environment information was obtained by scanning with a moving camera, the synthesized image resembles a panorama.
Operation 504: identify the brightness of each pixel in the panoramic image.
Taking YUV three-channel pixels as an example, the electronic device can read the Y data of each pixel and determine the pixel's brightness from it, for instance using the Y data directly as the brightness. For RGB or other channel layouts, the device can obtain the brightness information of each pixel and determine the brightness from it, or convert to the YUV three channels using the conversion relationship and determine the brightness from the Y data.
Operation 506: compute, from the brightness values, the ambient illuminance of the environment in which the image to be processed is located.
Optionally, the electronic device can average the brightness of all pixels and use the average as the ambient illuminance of the environment. Alternatively, it may further take into account each pixel's position within the environment information, determine a coefficient for the pixel from that position, and compute the illuminance from the coefficients and brightness values, for example multiplying each pixel's brightness by its coefficient and summing the products, the sum being the ambient illuminance.
In this embodiment, synthesizing the panoramic image and computing the ambient brightness from the brightness of each of its pixels further improves the accuracy of the ambient brightness computation.
In one embodiment, as shown in FIG. 6, determining the corresponding target brightness parameter according to the ambient illuminance includes:
Operation 602: obtain the reference brightness parameter corresponding to the ambient illuminance.
Optionally, the electronic device can preset the correspondence between different ambient illuminances and reference brightness parameters, for example as a lookup table between the two, and query the table for the reference brightness parameter matching the measured illuminance. The table stores the brightness parameters suitable for different ambient illuminances; these are the reference brightness parameters and may be one or more of sensitivity, exposure duration, and so on.
Operation 604: obtain the current brightness parameter of the image to be processed.
The current brightness parameter is the brightness parameter currently in use; it may be the default adopted on the electronic device, or one set according to the user's shooting habits. When first displaying the image to be processed, the electronic device can display it with the current brightness parameter.
Operation 606: compute the target brightness parameter from the current brightness parameter and the reference brightness parameter.
The electronic device can further define a computational relationship between the target brightness parameter and the current and reference brightness parameters, and compute the target parameter from it. Optionally, the target brightness parameter lies between the current and reference parameters, so that it takes both into account.
For example, corresponding weights can be set for the current and reference brightness parameters; the reference and current parameters are each multiplied by their respective weights and summed, and the weighted sum is taken as the target brightness parameter.
Further incorporating the current brightness parameter and computing the target brightness parameter from both it and the reference parameter makes the computed target parameter better suited to the user's habits.
In one embodiment, as shown in FIG. 7, another image brightness processing method is provided, including:
Operation 702: acquire an image to be processed.
Optionally, the image to be processed may be an image generated in real time in shooting mode, or a frame image presented on the display in real time at a preset frame rate.
Operation 704: acquire real-time frame images obtained in advance by moving the camera.
In shooting mode, the electronic device displays a prompt on the display screen asking the user to move the camera. Understandably, the display manner and data format of the prompt can take many forms: the device may display text such as "Please move the camera left and right," or a mark such as a graphic or symbol indicating left-right movement, for example arrows. In shooting mode, the device can cache the frames obtained by real-time scanning before the image to be processed is acquired.
Optionally, the camera can be moved in any direction (left-right, up-down, forward-backward), for example rotated left and right about a fixed position. The wider the range of camera movement, the richer the environment information collected and the more accurate the subsequent brightness processing. For example, before capturing the image to be processed, the user can hold the electronic device and scan the scene to be photographed, say by rotating the hand-held device through 360° to obtain the environment information of the entire space. While the device moves, frames can be generated in real time at the preset frame rate.
Operation 706: compare the frames generated at different times to obtain the environment information of the image to be processed.
Optionally, the frames used to extract the environment information may be frames acquired within a preset duration before the generation time of the image to be processed, or frames generated while the image to be processed is being captured, provided the shooting mode has not been terminated.
The electronic device can compare the pictures of each pair of adjacent frames to identify the regions of the current frame that do not repeat a preset number of preceding frames, analyze the position of each non-repeating region in the whole space, and form the environment information of the image to be processed from the non-repeating regions and their positional relationships.
In one embodiment, following the preset SLAM algorithm, the electronic device can construct, from the frames captured during the moving shot, the whole spatial information of the scene in which the image to be processed is shot, and based on that spatial information record as much environment information as possible for use in brightness processing.
Operation 708: generate a panoramic image according to the environment information; identify the brightness of each pixel of the panorama; and compute, from those brightness values, the ambient illuminance of the environment in which the image to be processed is located.
The non-repeating regions are themselves composed of pixels, and the positional relationships between the regions determine those between the pixels, from which the panorama can be synthesized. Understandably, because the camera is not necessarily moved in a regular pattern, the synthesized panorama is not necessarily a complete rectangle, and pixels may be missing in some areas.
The electronic device can read the Y data of the YUV channels at each pixel of the panorama, compute the average of the Y data, and use the average as the ambient illuminance of the environment of the image to be processed. Compared with the pixels of the image to be processed alone, the environment information carries more reference information, making the illuminance computed from it more accurate.
Operation 710: obtain the reference brightness parameter corresponding to the ambient illuminance; obtain the current brightness parameter of the image to be processed; and compute the target brightness parameter from the current and reference brightness parameters.
Optionally, the electronic device holds a preset lookup table between ambient illuminance and reference brightness parameter. Having computed the ambient illuminance, it queries the table for the corresponding reference brightness parameter. It also obtains the current brightness parameter of the image to be processed, that is, the brightness parameter used when generating the image, which may be the system default or a shooting parameter computed by analyzing the user's shooting habits. The device can set corresponding weights for the current and reference brightness parameters, multiply each by its weight, sum the products, and use the weighted sum as the target brightness parameter.
Operation 712: perform brightness processing on the image to be processed according to the target brightness parameter.
As an example, take exposure as the brightness parameter, with preset weights of 0.6 and 0.4 for the current and reference parameters and a current exposure of 200 for the image to be processed. The electronic device averages the Y channel over all pixels of the panorama and obtains an ambient illuminance of 2000 Lux; the lookup table gives a sensitivity of 230 for 2000 Lux. The target brightness parameter is then: target sensitivity = 200 × 0.6 + 230 × 0.4 = 212, and the device performs brightness processing on the image to be processed with a sensitivity of 212.
Further taking the current brightness parameter into account keeps the brightness adjustment of the image to be processed from being excessive and balances the magnitude of the adjustment against the image's true brightness.
The operations in the method flowcharts of the embodiments of the present application are displayed in the order indicated by the arrows, but they are not necessarily executed in that order. Unless explicitly stated herein, their execution order is not strictly limited and they may be executed in other orders. Moreover, at least some of the operations may comprise multiple sub-operations or stages, which need not be completed at the same moment but may be executed at different times, and need not run sequentially but may be executed in turn or alternately with other operations or with at least part of their sub-operations or stages.
In one embodiment, as shown in FIG. 8, an image brightness processing apparatus is provided, including:
an image acquiring module 802, configured to acquire an image to be processed;
an environment information generating module 804, configured to acquire environment information of the image to be processed, obtained in advance by moving the camera; and
a brightness processing module 806, configured to determine a corresponding target brightness parameter according to the environment information, and to perform brightness processing on the image to be processed according to the target brightness parameter.
In one embodiment, the environment information generating module 804 is further configured to acquire real-time frame images obtained in advance by moving the camera, and to obtain the environment information of the image to be processed from frames generated at different times.
In one embodiment, the environment information generating module 804 is further configured to compare frames generated at different times to obtain the environment information of the image to be processed.
In one embodiment, the environment information generating module 804 is further configured to invoke a motion detection element to detect the movement data of the camera when each frame is generated, and to obtain the environment information of the image to be processed from the frames according to the movement data.
In one embodiment, the brightness processing module 806 is further configured to compute, according to the environment information, the ambient illuminance of the environment in which the image to be processed is located, and to determine the corresponding target brightness parameter according to the ambient illuminance.
In one embodiment, the brightness processing module 806 is further configured to generate a panoramic image according to the environment information, identify the brightness of each pixel of the panorama, and compute from those brightness values the ambient illuminance of the environment of the image to be processed.
In one embodiment, the brightness processing module 806 is further configured to obtain the reference brightness parameter corresponding to the ambient illuminance, obtain the current brightness parameter of the image to be processed, and compute the target brightness parameter from the current and reference brightness parameters.
The division of the image brightness processing apparatus into the modules above is for illustration only; in other embodiments, the apparatus may be divided into different modules as needed to complete all or part of its functions.
Each module of the image brightness processing apparatus may be implemented wholly or partly in software, hardware, or a combination of the two. The modules may be embedded in hardware form in, or independent of, a processor in the server, or stored in software form in the memory of the server, so that the processor can invoke and execute the operations corresponding to each module. As used in this application, the terms "component," "module," and "system" are intended to refer to a computer-related entity, which may be hardware, a combination of hardware and software, software, or software in execution. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, executable code, a thread of execution, a program, and/or a computer. By way of illustration, both an application running on a server and the server can be components. One or more components may reside within a process and/or thread of execution, and a component may be located on one computer and/or distributed between two or more computers.
In one embodiment, a non-volatile computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the operations of the image brightness processing methods provided by the embodiments above.
In one embodiment, an electronic device is provided, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the operations of the image brightness processing methods provided by the embodiments above are implemented.
The embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform the operations of the image brightness processing methods provided by the embodiments above.
The embodiments of the present application further provide an electronic device. The electronic device includes a photographing circuit, which may be implemented with hardware and/or software components and may include various processing units defining an ISP (Image Signal Processing) pipeline. FIG. 9 is a schematic diagram of a photographing circuit in one embodiment. As shown in FIG. 9, for ease of explanation, only the aspects of the photographing technique relevant to the embodiments of the present application are shown.
As shown in FIG. 9, the photographing circuit includes an ISP processor 940 and control logic 950. Image data captured by an imaging device 910 is first processed by the ISP processor 940, which analyzes it to capture image statistics usable to determine and/or control one or more control parameters of the imaging device 910. The imaging device 910 may include a camera having one or more lenses 912 and an image sensor 914. The image sensor 914 may include a color filter array (such as a Bayer filter); it can obtain the light intensity and wavelength information captured by each of its imaging pixels and provide a set of raw image data processable by the ISP processor 940. A sensor 920 (such as a gyroscope) can provide acquired shooting parameters (such as anti-shake parameters) to the ISP processor 940 based on the sensor 920 interface type, which may be an SMIA (Standard Mobile Imaging Architecture) interface, another serial or parallel camera interface, or a combination of these.
In addition, the image sensor 914 may also send raw image data to the sensor 920, which can provide it to the ISP processor 940 based on the sensor 920 interface type, or store it in the image memory 930.
The ISP processor 940 processes the raw image data pixel by pixel in multiple formats. For example, each image pixel may have a bit depth of 9, 10, 12, or 14 bits; the ISP processor 940 can perform one or more shooting operations on the raw data and collect statistics about it, and the shooting operations may be performed at the same or different bit-depth precision.
The ISP processor 940 can also receive image data from the image memory 930. For example, the sensor 920 interface sends raw image data to the image memory 930, and the raw image data in the image memory 930 is then provided to the ISP processor 940 for processing. The image memory 930 may be part of a memory apparatus, a storage device, or a separate dedicated memory within the electronic device, and may include DMA (Direct Memory Access) features.
On receiving raw image data from the image sensor 914 interface, the sensor 920 interface, or the image memory 930, the ISP processor 940 can perform one or more shooting operations, such as time-domain filtering. The processed image data can be sent to the image memory 930 for further processing before display. The ISP processor 940 can also receive data from the image memory 930 and process it in the raw domain as well as in the RGB and YCbCr color spaces. The processed image data can be output to a display 980 for viewing by the user and/or further processed by a graphics engine or GPU (Graphics Processing Unit). The output of the ISP processor 940 can also be sent to the image memory 930, and the display 980 can read image data from it. In one embodiment, the image memory 930 can be configured to implement one or more frame buffers. In addition, the output of the ISP processor 940 can be sent to an encoder/decoder 970 to encode/decode the image data; the encoded data can be saved and decompressed before being displayed on the display 980 device.
The ISP processor 940's processing of image data includes VFE (Video Front End) processing and CPP (Camera Post Processing) processing. VFE processing of the image data may include correcting its contrast or brightness, modifying digitally recorded lighting state data, performing compensation processing (such as white balance, automatic gain control, and gamma correction), filtering, and so on. CPP processing of the image data may include scaling the image and providing a preview frame and a recording frame to each path; the CPP may use different codecs to process the preview and recording frames. The image data processed by the ISP processor 940 can be sent to a beauty module 960 for beautification before display, which may include whitening, freckle removal, skin smoothing, face slimming, blemish removal, eye enlargement, and the like. The beauty module 960 may be a CPU (Central Processing Unit), GPU, or coprocessor of the electronic device. The data processed by the beauty module 960 can be sent to the encoder/decoder 970 to encode/decode the image data; the encoded data can be saved and decompressed before display on the display 980 device. The beauty module 960 may also be located between the encoder/decoder 970 and the display 980, that is, beautify the already-imaged picture. The encoder/decoder 970 may be a CPU, GPU, coprocessor, or the like of the electronic device.
The statistics determined by the ISP processor 940 can be sent to the control logic 950 unit. For example, the statistics may include image sensor 914 statistics such as auto exposure, auto white balance, auto focus, flicker detection, black-level compensation, and lens 912 shading correction. The control logic 950 may include a processor and/or microcontroller executing one or more routines (such as firmware), which can determine, from the received statistics, control parameters of the imaging device 910 and of the ISP processor 940. For example, control parameters of the imaging device 910 may include sensor 920 control parameters (such as gain and integration time for exposure control), camera flash control parameters, lens 912 control parameters (such as focus or zoom focal length), or combinations of these. The ISP control parameters may include gain levels and color correction matrices for automatic white balance and color adjustment (for example during RGB processing), and lens 912 shading correction parameters.
The image brightness processing method above can be implemented with the photographing technique of FIG. 9.
Any reference in this application to memory, storage, a database, or another medium may include non-volatile and/or volatile memory. Suitable non-volatile memory may include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory may include random-access memory (RAM), which serves as external cache. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The embodiments above express only several implementations of the present application, and their descriptions are specific and detailed, but they are not to be construed as limiting the scope of the patent. It should be noted that those of ordinary skill in the art can make several variations and improvements without departing from the concept of the present application, all of which fall within its scope of protection. The scope of protection of this patent shall therefore be subject to the appended claims.

Claims (21)

  1. An image brightness processing method, comprising:
    acquiring an image to be processed;
    acquiring environment information of the image to be processed, obtained in advance by moving a camera;
    determining a corresponding target brightness parameter according to the environment information; and
    performing brightness processing on the image to be processed according to the target brightness parameter.
  2. The method according to claim 1, wherein the acquiring environment information of the image to be processed, obtained in advance by moving a camera, comprises:
    acquiring real-time frame images obtained in advance by moving the camera; and
    obtaining the environment information of the image to be processed from frame images generated at different times.
  3. The method according to claim 2, wherein the obtaining the environment information of the image to be processed from frame images generated at different times comprises:
    comparing the frame images generated at different times to obtain the environment information of the image to be processed.
  4. The method according to claim 2, wherein the obtaining the environment information of the image to be processed from frame images generated at different times comprises:
    invoking a motion detection element to detect movement data of the camera when each frame image is generated; and
    obtaining the environment information of the image to be processed from the frame images according to the movement data.
  5. The method according to claim 1, wherein the determining a corresponding target brightness parameter according to the environment information comprises:
    computing, according to the environment information, an ambient illuminance of an environment in which the image to be processed is located; and
    determining the corresponding target brightness parameter according to the ambient illuminance.
  6. The method according to claim 5, wherein the computing, according to the environment information, an ambient illuminance of the environment in which the image to be processed is located comprises:
    generating a panoramic image according to the environment information;
    identifying a brightness of each pixel in the panoramic image; and
    computing, according to the brightness, the ambient illuminance of the environment in which the image to be processed is located.
  7. The method according to claim 5, wherein the determining the corresponding target brightness parameter according to the ambient illuminance comprises:
    obtaining a reference brightness parameter corresponding to the ambient illuminance;
    obtaining a current brightness parameter of the image to be processed; and
    computing the target brightness parameter according to the current brightness parameter and the reference brightness parameter.
  8. 一个或多个包含计算机可执行指令的非易失性计算机可读存储介质,当所述计算机可执行指令被一个或多个处理器执行时,使得所述处理器执行以下操作:
    获取待处理图像;
    获取预先通过移动摄像头而得到的所述待处理图像的环境信息;
    根据所述环境信息确定相应的目标亮度参数;及
    根据所述目标亮度参数对所述待处理图像进行亮度处理。
  9. 根据权利要求8所述的非易失性计算机可读存储介质,其特征在于,所述处理器执行所述获取预先通过移动摄像头而得到的所述待处理图像的环境信息时,还执行以下操作:
    获取预先通过移动摄像头而得到的实时的帧图像;及
    根据不同时刻生成的帧图像得到所述待处理图像的环境信息。
  10. 根据权利要求9所述的非易失性计算机可读存储介质,其特征在于,所述处理器执行所述根据不同时刻生成的帧图像得到所述待处理图像的环境信息时,还执行以下操 作:
    将不同时刻生成的帧图像进行对比,得到所述待处理图像的环境信息。
  11. The non-volatile computer-readable storage medium according to claim 9, wherein when performing the obtaining the environment information of the image to be processed according to frame images generated at different moments, the processor further performs the following operations:
    invoking a motion detection element to detect movement data of the camera when each frame image is generated; and
    obtaining the environment information of the image to be processed from the frame images according to the movement data.
  12. The non-volatile computer-readable storage medium according to claim 8, wherein when performing the determining a corresponding target brightness parameter according to the environment information, the processor further performs the following operations:
    calculating, according to the environment information, an ambient illuminance of the environment in which the image to be processed is located; and
    determining a corresponding target brightness parameter according to the ambient illuminance.
  13. The non-volatile computer-readable storage medium according to claim 12, wherein when performing the calculating, according to the environment information, an ambient illuminance of the environment in which the image to be processed is located, the processor further performs the following operations:
    generating a panoramic image according to the environment information;
    identifying a brightness of each pixel in the panoramic image; and
    calculating the ambient illuminance of the environment in which the image to be processed is located according to the brightness.
  14. The non-volatile computer-readable storage medium according to claim 12, wherein when performing the determining a corresponding target brightness parameter according to the ambient illuminance, the processor further performs the following operations:
    obtaining a reference brightness parameter corresponding to the ambient illuminance;
    obtaining a current brightness parameter of the image to be processed; and
    calculating a target brightness parameter according to the current brightness parameter and the reference brightness parameter.
  15. An electronic device comprising a memory and a processor, the memory storing computer-readable instructions that, when executed by the processor, cause the processor to perform the following operations:
    obtaining an image to be processed;
    obtaining environment information of the image to be processed, the environment information being obtained in advance by moving a camera;
    determining a corresponding target brightness parameter according to the environment information; and
    performing brightness processing on the image to be processed according to the target brightness parameter.
  16. The electronic device according to claim 15, wherein when performing the obtaining environment information of the image to be processed obtained in advance by moving a camera, the processor further performs the following operations:
    obtaining real-time frame images generated in advance by moving the camera; and
    obtaining the environment information of the image to be processed according to frame images generated at different moments.
  17. The electronic device according to claim 16, wherein when performing the obtaining the environment information of the image to be processed according to frame images generated at different moments, the processor further performs the following operations:
    comparing the frame images generated at different moments to obtain the environment information of the image to be processed.
  18. The electronic device according to claim 16, wherein when performing the obtaining the environment information of the image to be processed according to frame images generated at different moments, the processor further performs the following operations:
    invoking a motion detection element to detect movement data of the camera when each frame image is generated; and
    obtaining the environment information of the image to be processed from the frame images according to the movement data.
  19. The electronic device according to claim 15, wherein when performing the determining a corresponding target brightness parameter according to the environment information, the processor further performs the following operations:
    calculating, according to the environment information, an ambient illuminance of the environment in which the image to be processed is located; and
    determining a corresponding target brightness parameter according to the ambient illuminance.
  20. The electronic device according to claim 19, wherein when performing the calculating, according to the environment information, an ambient illuminance of the environment in which the image to be processed is located, the processor further performs the following operations:
    generating a panoramic image according to the environment information;
    identifying a brightness of each pixel in the panoramic image; and
    calculating the ambient illuminance of the environment in which the image to be processed is located according to the brightness.
  21. The electronic device according to claim 19, wherein when performing the determining a corresponding target brightness parameter according to the ambient illuminance, the processor further performs the following operations:
    obtaining a reference brightness parameter corresponding to the ambient illuminance;
    obtaining a current brightness parameter of the image to be processed; and
    calculating a target brightness parameter according to the current brightness parameter and the reference brightness parameter.
PCT/CN2018/117299 2017-11-28 2018-11-23 Image brightness processing method, computer-readable storage medium, and electronic device WO2019105305A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201711213031.5 2017-11-28
CN201711213031.5A CN108012078B (zh) 2017-11-28 2017-11-28 Image brightness processing method and apparatus, storage medium, and electronic device

Publications (1)

Publication Number Publication Date
WO2019105305A1 true WO2019105305A1 (zh) 2019-06-06

Family

ID=62054185

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/117299 WO2019105305A1 (zh) 2017-11-28 2018-11-23 Image brightness processing method, computer-readable storage medium, and electronic device

Country Status (2)

Country Link
CN (1) CN108012078B (zh)
WO (1) WO2019105305A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111369482A (zh) * 2020-03-03 2020-07-03 北京市商汤科技开发有限公司 Image processing method and apparatus, electronic device, and storage medium
CN111915529A (zh) * 2020-08-05 2020-11-10 广州市百果园信息技术有限公司 Video low-light enhancement method and apparatus, mobile terminal, and storage medium
CN114820404A (zh) * 2021-01-29 2022-07-29 北京字节跳动网络技术有限公司 Image processing method and apparatus, electronic device, and medium
CN115835459A (zh) * 2023-02-15 2023-03-21 河南金品建筑工程有限公司 Intelligent control method and system for power lighting

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN108012078B (zh) * 2017-11-28 2020-03-27 Oppo广东移动通信有限公司 Image brightness processing method and apparatus, storage medium, and electronic device
CN109060120A (zh) * 2018-06-19 2018-12-21 米亚索乐装备集成(福建)有限公司 Photovoltaic simulator, light intensity adjustment method, electronic device, and storage medium
CN110346116B (zh) * 2019-06-14 2021-06-15 东南大学 Scene illuminance calculation method based on image acquisition
CN110708801B (zh) * 2019-11-01 2021-06-11 广州云蝶科技有限公司 Lighting control method and system
CN112767268A (zh) * 2021-01-14 2021-05-07 北京迈格威科技有限公司 Person image processing method and apparatus, electronic device, and storage medium
CN117011153A (zh) * 2022-04-28 2023-11-07 华为技术有限公司 Image processing method and apparatus
CN117177076A (zh) * 2022-05-24 2023-12-05 格兰菲智能科技有限公司 Channel value calculation method, surround view generation method, apparatus, device, and medium

Citations (6)

Publication number Priority date Publication date Assignee Title
CN105592254A (zh) * 2014-10-21 2016-05-18 宏碁股份有限公司 Image display method and electronic apparatus
KR20160120648A (ko) * 2015-04-08 2016-10-18 주식회사 에스카 Camera device driving a filter and an IR light according to the environment
CN106851119A (zh) * 2017-04-05 2017-06-13 奇酷互联网络科技(深圳)有限公司 Picture generation method and device, and mobile terminal
CN107197146A (zh) * 2017-05-31 2017-09-22 广东欧珀移动通信有限公司 Image processing method and related product
WO2017185265A1 (zh) * 2016-04-27 2017-11-02 华为技术有限公司 Method for determining image photographing parameters and photographing apparatus
CN108012078A (zh) * 2017-11-28 2018-05-08 广东欧珀移动通信有限公司 Image brightness processing method and apparatus, storage medium, and electronic device

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP5017989B2 (ja) * 2006-09-27 2012-09-05 ソニー株式会社 Imaging apparatus and imaging method
JP2010087977A (ja) * 2008-10-01 2010-04-15 Sony Corp Image processing apparatus, image processing method, and program
CN102905079B (zh) * 2012-10-16 2015-08-19 小米科技有限责任公司 Method, apparatus, and mobile terminal for panoramic photographing
CN103942782A (zh) * 2014-03-31 2014-07-23 Tcl集团股份有限公司 Image stitching method and apparatus
CN105100640B (zh) * 2015-01-23 2018-12-18 武汉智源泉信息科技有限公司 Locally registered parallel video stitching method and system
CN104917960B (zh) * 2015-05-19 2017-10-17 广东欧珀移动通信有限公司 Method and terminal for controlling camera rotation
CN105227945B (zh) * 2015-10-21 2017-05-17 维沃移动通信有限公司 Automatic white balance control method and mobile terminal
CN105721773B (zh) * 2016-01-29 2019-04-30 深圳市美好幸福生活安全系统有限公司 Video acquisition system and method
CN106254791A (zh) * 2016-08-11 2016-12-21 广东欧珀移动通信有限公司 Camera starting method and mobile terminal
CN106709868A (zh) * 2016-12-14 2017-05-24 云南电网有限责任公司电力科学研究院 Image stitching method and apparatus
CN106657947A (zh) * 2017-01-13 2017-05-10 奇酷互联网络科技(深圳)有限公司 Image generation method and camera apparatus

Cited By (6)

Publication number Priority date Publication date Assignee Title
CN111369482A (zh) * 2020-03-03 2020-07-03 北京市商汤科技开发有限公司 Image processing method and apparatus, electronic device, and storage medium
CN111369482B (zh) * 2020-03-03 2023-06-23 北京市商汤科技开发有限公司 Image processing method and apparatus, electronic device, and storage medium
CN111915529A (zh) * 2020-08-05 2020-11-10 广州市百果园信息技术有限公司 Video low-light enhancement method and apparatus, mobile terminal, and storage medium
CN114820404A (zh) * 2021-01-29 2022-07-29 北京字节跳动网络技术有限公司 Image processing method and apparatus, electronic device, and medium
CN115835459A (zh) * 2023-02-15 2023-03-21 河南金品建筑工程有限公司 Intelligent control method and system for power lighting
CN115835459B (zh) * 2023-02-15 2023-05-05 河南金品建筑工程有限公司 Intelligent control method and system for power lighting

Also Published As

Publication number Publication date
CN108012078A (zh) 2018-05-08
CN108012078B (zh) 2020-03-27


Legal Events

Date Code Title Description
121   EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18883339; Country of ref document: EP; Kind code of ref document: A1)
NENP  Non-entry into the national phase (Ref country code: DE)
122   EP: PCT application non-entry into the European phase (Ref document number: 18883339; Country of ref document: EP; Kind code of ref document: A1)