WO2023207137A1 - Image processing method and device

Image processing method and device

Info

Publication number
WO2023207137A1
Authority
WO
WIPO (PCT)
Prior art keywords
image processing
image
information
original image
original
Prior art date
Application number
PCT/CN2022/139253
Other languages
English (en)
Chinese (zh)
Inventor
罗达新
马莎
Original Assignee
Huawei Technologies Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2023207137A1


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 - Image enhancement or restoration
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20004 - Adaptive image processing
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof

Definitions

  • the present application relates to the field of image processing, and in particular to image processing methods and devices.
  • the image processing device can process an acquired image so that the processed image meets certain requirements. For example, the image processing device can process an image with low brightness so that the brightness of the processed image is greater than or equal to a threshold 1, or process an overexposed image with high brightness so that the brightness of the processed image is less than or equal to a threshold 2, so that an image recognition device can recognize the image.
  • image processing devices can use the adaptive gamma correction with weighting distribution (AGCWD) algorithm to process original images to obtain processed images that meet certain requirements.
  • Specifically, the image processing device can obtain the original image, generate a grayscale histogram based on the original image, redistribute the grayscale histogram to obtain gamma parameters, and enhance the original image through the gamma parameters to obtain the processed image.
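  • For orientation, the following is a minimal NumPy sketch of this histogram-based baseline in the spirit of AGCWD; the weighting exponent, function name and variable names are illustrative assumptions, not taken from this application.

```python
import numpy as np

def agcwd_like(img: np.ndarray, alpha: float = 0.5) -> np.ndarray:
    """Histogram-based adaptive gamma correction for an 8-bit grayscale image."""
    # 1. Generate the grayscale histogram (the computationally heavy step
    #    this application seeks to avoid).
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    pdf = hist / hist.sum()
    # 2. Redistribute (weight) the histogram and build a weighted CDF.
    pdf_w = pdf.max() * (pdf / pdf.max()) ** alpha
    cdf_w = np.cumsum(pdf_w) / pdf_w.sum()
    # 3. Derive per-intensity gamma values from the weighted CDF and
    #    enhance the image through a lookup table.
    levels = np.arange(256) / 255.0
    lut = np.clip(255.0 * levels ** (1.0 - cdf_w), 0, 255).astype(np.uint8)
    return lut[img]
```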
  • Embodiments of the present application provide image processing methods and devices, which can reduce the amount of calculation in the image processing process.
  • an image processing method is provided.
  • the device for executing the method may be an image processing device; it may also be a module used in the image processing device, such as a chip or a chip system.
  • The following description takes an image processing device as the execution subject as an example.
  • the method includes: acquiring an image processing model, a first original image and first environment information, where the first environment information is information about the environment when the first original image is acquired; using the first environment information as an input parameter of the image processing model to obtain first image processing parameters; and processing the first original image with the first image processing parameters to obtain a first processed image.
  • Based on the above method, the image processing device can obtain the first image processing parameters based on the environment information at the time the first original image is acquired and the image processing model, and process the first original image according to the first image processing parameters to obtain the processed image.
  • The above method does not involve generating and redistributing grayscale histograms or other computationally intensive processes. Instead, an image processing model trained in advance and related to environment information is used to obtain the first image processing parameters, and the first original image is then processed with those parameters.
  • The calculation process is relatively simple and the amount of computation is small, so the method can be applied to devices with low computing power, enabling such devices to also perform image processing, or to scenarios that require real-time processing.
  • In addition, the above method obtains the first image processing parameters based on the environment information at the time the first original image is acquired and the image processing model. That is to say, the first image processing parameters are related to the environment information and independent of the original image itself, so dark objects in the original image will not cause the processed image to be too bright or overexposed.
  • Since the first image processing parameters are related to environment information, if the environment information corresponding to two original images is the same (or differs little), the first image processing parameters corresponding to the two original images are also the same (or differ little). Therefore, if this method is used to process video streams, video flickering will not occur.
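  • As a sketch of this flow, the snippet below treats the trained image processing model as a simple function mapping environment information to a gamma parameter; the feature layout, function names and coefficient values are illustrative assumptions, not taken from this application. Note that no histogram is generated, so the per-frame cost is one model evaluation plus one pointwise transform.

```python
import numpy as np

def predict_gamma(coeffs: np.ndarray, env: np.ndarray) -> float:
    # The model may be a simple function of the environment features;
    # here, a linear function with coefficients learned offline.
    return float(coeffs @ env)

def process_frame(frame: np.ndarray, gamma: float) -> np.ndarray:
    # Pointwise gamma correction on a float image in [0.0, 1.0].
    return np.clip(frame, 0.0, 1.0) ** gamma

env = np.array([1.0, 13.0, 0.8])       # bias, hour of day, illuminance (illustrative)
coeffs = np.array([0.9, 0.01, -0.2])   # trained model coefficients (illustrative)
frame = np.random.rand(1080, 1920, 3)  # stands in for the first original image
first_processed = process_frame(frame, predict_gamma(coeffs, env))
```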
  • the first environment information includes at least one of the following: time information when the first original image is obtained, illumination intensity information when the first original image is obtained, illumination direction information when the first original image is obtained, brightness information when the first original image is obtained, or position information when the first original image is obtained.
  • the above information is related to the gamma correction function. Therefore, the first image processing parameters obtained according to the first environment information can be used for gamma correction.
  • the above information also changes smoothly and gradually; that is, the first environment information corresponding to two adjacent frames of a video differs little, so the first image processing parameters obtained for the two adjacent frames based on the first environment information also differ little, and video flickering will not occur.
  • Obtaining an image processing model includes: obtaining at least one piece of second environment information and at least one second original image, the at least one second original image including a second original image corresponding to each piece of second environment information in the at least one piece of second environment information; obtaining at least one target image, the at least one target image including a target image corresponding to each piece of second environment information in the at least one piece of second environment information, where the second environment information is information about the environment when the second original image and the third original image corresponding to that second environment information were acquired, the third original image is the original image from which the target image corresponding to the second environment information is obtained, and the second environment information includes the same types of information as the first environment information; and obtaining the image processing model according to the at least one piece of second environment information, the at least one second original image and the at least one target image.
  • the image processing device can obtain the image processing model.
  • Obtaining the image processing model according to the at least one piece of second environment information, the at least one second original image and the at least one target image includes: processing the at least one second original image respectively with at least one second image processing parameter to obtain at least one second processed image, where the absolute value of the difference between the correlation coefficient of the at least one second processed image and the correlation coefficient of the at least one target image is less than or equal to a first threshold; and obtaining the image processing model according to the at least one piece of second environment information and the at least one second image processing parameter.
  • Based on the above method, the image processing device can train on the at least one second image processing parameter and the at least one piece of second environment information to obtain the image processing model.
  • The second image processing parameters make the at least one second processed image highly similar to the at least one target image. Therefore, when environment information (such as the first environment information) is input into the image processing model obtained by the above method, image processing parameters (such as the first image processing parameters) can be obtained, and the original image corresponding to the environment information (such as the first original image) can be processed with those parameters, so that the processed image (such as the first processed image) is also highly similar to the at least one target image. In this way, the processed image can likewise meet the requirements met by the target image.
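  • One way to picture the final step, obtaining the model from the second environment information and the selected second image processing parameters, is a least-squares fit of a simple function; the detailed description below names quadratic, cubic and exponential functions as possible models, and the data values here are illustrative assumptions.

```python
import numpy as np

env_features = np.array([5.0, 9.0, 13.0, 17.0, 19.0])  # e.g. hour of day (illustrative)
selected_gammas = np.array([0.7, 0.9, 1.3, 0.9, 0.7])  # second image processing parameters

# Fit gamma = a*t^2 + b*t + c by least squares.
a, b, c = np.polyfit(env_features, selected_gammas, deg=2)

def image_processing_model(t: float) -> float:
    # The trained model: environment information in, gamma parameter out.
    return a * t * t + b * t + c
```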
  • Before the at least one second original image is processed respectively with at least one second image processing parameter to obtain at least one second processed image, the method further includes: obtaining a candidate image processing parameter set, where the candidate image processing parameter set includes a plurality of candidate image processing parameters, and the plurality of candidate image processing parameters include the at least one second image processing parameter.
  • Based on the above method, the image processing device can first obtain multiple candidate parameters and then use the candidate parameters to process the second original image to obtain a processed image. If the absolute value of the difference between the correlation coefficient of the processed image and the correlation coefficient of the target image is less than or equal to the first threshold, the image processing device determines the candidate parameter corresponding to the processed image as a second image processing parameter.
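  • A minimal sketch of this selection loop follows. The application does not state what each correlation coefficient is measured against, so the sketch adopts one possible reading (each image's correlation with the original image); the threshold value and candidate form (a scalar gamma) are also assumptions.

```python
import numpy as np

def correlation(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

def select_second_params(original: np.ndarray, target: np.ndarray,
                         candidates: list[float],
                         first_threshold: float = 0.02) -> list[float]:
    selected = []
    for gamma in candidates:
        processed = np.clip(original, 0.0, 1.0) ** gamma  # second processed image
        # Keep the candidate if the processed image's correlation coefficient
        # is within the first threshold of the target image's coefficient.
        if abs(correlation(processed, original) - correlation(target, original)) <= first_threshold:
            selected.append(gamma)
    return selected
```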
  • the method further includes: obtaining initial image processing parameters; performing a first operation on the initial image processing parameters to obtain the candidate image processing parameter set.
  • the image processing apparatus can obtain the candidate image processing parameter set.
  • the first operation includes at least one of the following: determining candidate image processing parameters from the initial image processing parameters according to a rule, determining candidate image processing parameters randomly based on the initial image processing parameters, or determining candidate image processing parameters based on historically determined candidate image processing parameters; the determined candidate image processing parameters form the candidate image processing parameter set.
  • the image processing device can obtain the candidate image processing parameter set through multiple methods, which improves the flexibility and diversity of obtaining the candidate image processing parameter set.
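  • For a scalar gamma parameter, the three strategies could be sketched as follows; the step sizes, spreads and counts are illustrative assumptions.

```python
import random

def candidates_by_rule(initial: float, step: float = 0.05, n: int = 10) -> list[float]:
    # Rule-based: a fixed grid around the initial parameter.
    return [initial + step * k for k in range(-n, n + 1)]

def candidates_by_random(initial: float, spread: float = 0.3, n: int = 20) -> list[float]:
    # Random: perturbations of the initial parameter.
    return [initial + random.uniform(-spread, spread) for _ in range(n)]

def candidates_by_history(history: list[float]) -> list[float]:
    # History-based: reuse previously determined candidate parameters.
    return sorted(set(history))

candidate_set = (candidates_by_rule(1.0)
                 + candidates_by_random(1.0)
                 + candidates_by_history([0.8, 1.2, 0.8]))
```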
  • the image processing model includes functions or algorithms.
  • the calculation amount of using a function or algorithm to calculate the parameters in the first environment information is much less than the calculation amount of generating and redistributing the grayscale histogram. Therefore, the image processing speed of the image processing device can be greatly improved.
  • A second aspect provides an image processing device for implementing the above method.
  • the image processing device may be the image processing device in the above-described first aspect, or a device including the above-described image processing device.
  • the image processing device includes corresponding modules, units, or means for implementing the above method.
  • the modules, units, or means can be implemented by hardware, software, or by hardware executing corresponding software.
  • the hardware or software includes one or more modules or units corresponding to the above functions.
  • the image processing device may include a processing module.
  • This processing module can be used to implement the processing functions in any of the above aspects and any possible implementation manner thereof.
  • the processing module may be, for example, a processor.
  • A third aspect provides an image processing device, including a processor; the processor is configured to be coupled to a memory, and after reading instructions in the memory, to execute the method described in the first aspect according to the instructions.
  • the image processing device may be the image processing device in the above-described first aspect, or a device including the above-described image processing device.
  • the image processing device further includes a memory, and the memory is used to store necessary program instructions and data.
  • the image processing device is a chip or a chip system.
  • When the image processing device is a chip system, it may be composed of a chip, or may include a chip and other discrete devices.
  • A fourth aspect provides an image processing device, including a processor and an interface circuit; the interface circuit is configured to receive a computer program or instructions and transmit them to the processor; the processor is configured to execute the computer program or instructions, so that the image processing device executes the method described in the first aspect.
  • the image processing device is a chip or a chip system.
  • When the image processing device is a chip system, it may be composed of a chip, or may include a chip and other discrete devices.
  • A fifth aspect provides a computer-readable storage medium in which instructions are stored; when the instructions are run on a computer, the computer can execute the method described in the first aspect.
  • A sixth aspect provides a computer program product containing instructions that, when run on a computer, enable the computer to execute the method described in the first aspect.
  • A seventh aspect provides an intelligent driving vehicle, which includes an image processing device for executing the method described in the first aspect.
  • Figure 1 is a schematic diagram of an image processing process;
  • Figure 2 is a schematic diagram of images before and after processing;
  • Figure 3A is a schematic diagram of the image processing system architecture provided by an embodiment of the present application;
  • Figure 3B is a schematic diagram of an image processing device provided by an embodiment of the present application;
  • Figure 4 is a schematic diagram of the hardware structure of an image processing device provided by an embodiment of the present application;
  • Figure 5 is the first schematic flowchart of the image processing method provided by an embodiment of the present application;
  • Figure 6 is the second schematic flowchart of the image processing method provided by an embodiment of the present application;
  • Figure 7 is a schematic flowchart of the feature extraction method provided by an embodiment of the present application;
  • Figure 8 is a schematic diagram of an image processing model provided by an embodiment of the present application;
  • Figure 9 is a schematic structural diagram of an image processing device provided by an embodiment of the present application.
  • the image sensing device (such as a camera) can project the ambient light signal onto the vision sensor chip, and the vision sensor chip can convert the light signal into an electrical signal and save it as a raw data (RAW) image.
  • the image processing device can convert the RAW image into a red green blue (RGB) image. That is, the image data type may include RAW images or RGB images.
  • a RAW image may represent a native, unprocessed image.
  • RAW images include images sensed from image sensing devices such as digital cameras, mobile phones, tablets, or scanners.
  • RAW images correspond to RAW files.
  • RAW files can contain data information needed to create a viewable image, such as file header and pixel area information.
  • the structure of RAW files can follow a common pattern.
  • For example, the structure of a RAW file may follow the RAW image format of International Organization for Standardization (ISO) standard ISO 12234-2 (TIFF/EP).
  • a RAW image can refer to an image in Bayer format, whose number of pixels is the same as the number of pixels in the image perceived by the image sensing device. The file corresponding to a RAW image includes a width*height*1 matrix, where width and height indicate the width and height of the image respectively, and 1 indicates that the RAW image has one channel.
  • Each pixel represents one of three colors: red (red, R), green (green, G), or blue (blue, B).
  • a floating point number from 0.0 to 1.0 is usually used to represent the signal strength of each color.
  • RGB images can represent various eye-friendly images generated after RAW images are processed by an image signal processor (ISP), for example, common jpeg format images, BMP format images or PNG format images, etc.
  • files corresponding to RGB images do not include file headers, but include pixel area information.
  • the file corresponding to an RGB image includes a width*height*3 matrix, where 3 means that the RGB image has 3 channels, each channel represents one of the three colors R/G/B, and an integer from 0 to 255 is used to represent each pixel value of the image.
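  • The two layouts described above can be illustrated with array shapes (a small sketch, assuming NumPy arrays):

```python
import numpy as np

height, width = 4, 6

# Bayer-pattern RAW: one channel, one color sample per pixel,
# signal strength as a float in [0.0, 1.0].
raw = np.random.rand(height, width, 1).astype(np.float32)

# RGB after ISP processing: three channels, integers in [0, 255].
rgb = np.random.randint(0, 256, size=(height, width, 3), dtype=np.uint8)

print(raw.shape, rgb.shape)  # (4, 6, 1) (4, 6, 3)
```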
  • the image processing device may include an ISP for processing images.
  • the ISPs used by different image processing devices may be the same or different.
  • the ISP can have at least one of the following functions: demosaicing, black level correction, lens correction, bad pixel correction, gamma correction, denoising, white balance or color mapping.
  • the parameters used by the ISPs of different image processing devices may be the same or different. For example, for image 1, the ISP of image processing device 1 processes image 1 through gamma parameter 1 to obtain image 2, and the ISP of image processing device 2 processes image 1 through gamma parameter 2 to obtain image 3. Or, for image 1, both the ISP of image processing device 1 and the ISP of image processing device 2 process image 1 through gamma parameter 1 to obtain image 2.
  • the image processing method provided by the embodiments of the present application can be applied to one or more of the above functions; that is, the parameters of one or more of the above functions can be obtained according to environment information, and the image can be processed according to the obtained parameters to obtain the processed image.
  • the following embodiments of the present application are described by taking the image processing method applied to gamma correction (for example, obtaining gamma parameters according to environment information and processing the image according to the gamma parameters to obtain a processed image) as an example.
  • Applying the image processing method provided by the embodiments of the present application to other functions is similar to applying it to gamma correction; refer to the following embodiments of this application, and the details are not described again.
  • Figure 2 shows images taken under different lighting conditions from sunrise to sunset.
  • the original images taken at 5:00 and 19:00 have low brightness and are too dark, while the original image taken at 13:00 has high brightness and is too bright and overexposed, which makes it difficult for an image recognition device to identify the content in the images, such as objects or people.
  • The brightness of the original images in Figure 2 can be adjusted through the method shown in Figure 1 to obtain the processed images.
  • the brightness of the processed image is moderate, which can improve the recognition rate of the image recognition device and the accuracy of identifying objects.
  • However, the method shown in Figure 1 is complex and computationally intensive: when a digital signal processing (DSP) chip runs this algorithm, it usually takes tens of milliseconds, which is not suitable for scenarios with high real-time requirements, such as intelligent driving. Moreover, in the method shown in Figure 1, if there are dark objects in the original image, the processed image will be brightened and may become too bright and overexposed. In addition, in the method shown in Figure 1, the gamma parameters obtained from different original images are different. Therefore, if the method shown in Figure 1 is used to process a video stream, the gamma parameters of two adjacent frames will differ. Since the gamma parameter is related to image brightness, video flickering may occur.
  • In view of this, in the image processing method provided by the embodiments of the present application, the first image processing parameters are obtained based on the environment information at the time the first original image is acquired and an image processing model, and the parameters are used to process the first original image to obtain the processed image.
  • This method does not involve generating and redistributing grayscale histograms or other computationally intensive processes. Instead, an image processing model trained in advance and related to environment information is used to obtain the first image processing parameters, and the first original image is processed with those parameters; the calculation process is relatively simple and the amount of computation is small.
  • In addition, the above method obtains the first image processing parameters based on the environment information at the time the first original image is acquired and the image processing model; that is, the first image processing parameters are related to the environment information and independent of the original image itself, so dark objects in the original image will not cause the processed image to be too bright or overexposed.
  • Since the first image processing parameters are related to environment information, if the environment information corresponding to two original images is the same (or differs little), the first image processing parameters corresponding to the two original images are also the same (or differ little). Therefore, if this method is used to process video streams, video flickering will not occur. The specific process of this method is introduced in the embodiment shown in Figure 5 below and is not described here.
  • the method provided by the embodiments of the present application can be used in various image processing systems to process the original image and obtain a processed image, so that the processed image meets certain requirements.
  • the following uses the image processing system 30 shown in FIG. 3A as an example to describe the method provided by the embodiment of the present application.
  • FIG. 3A it is a schematic architectural diagram of an image processing system 30 provided by an embodiment of the present application.
  • the image processing system 30 may include one or more image processing devices 301 (only one is shown).
  • the image processing system 30 also includes an image sensing device 302 and/or an image recognition device 303 that can communicate with the image processing device 301 .
  • FIG. 3A is only a schematic diagram and does not constitute a limitation on the applicable scenarios of the technical solution provided by this application.
  • the image processing device in the embodiment of the present application can be any device with computing functions.
  • the image processing device also has image perception capabilities and/or image recognition capabilities, for example, it can perceive the original image (or shoot the original image) and/or be able to identify the content in the image.
  • the image processing device 301 can obtain the image processing model, the first original image and the first environment information, use the first environment information as an input parameter of the image processing model, obtain the first image processing parameters, and use the first image processing The parameters are used to process the first original image to obtain the first processed image.
  • the first environment information is information about the environment when the first original image is acquired.
  • the image processing device includes but is not limited to: handheld device, vehicle-mounted device, computing device or intelligent driving vehicle.
  • the image processing device may be a mobile phone, a tablet, a computer, various devices with computing capabilities in a car, or an intelligent driving vehicle with an automatic driving function or an assisted driving function.
  • Various devices with computing capabilities in the car can include: a gateway, a vehicle telematics box (T-Box), a body control module (BCM), a smart cockpit domain controller (CDC), a multi-domain controller (MDC), a vehicle control unit (VCU), an electronic control unit (ECU), a vehicle domain controller (VDC), a vehicle integrated/integration unit (VIU), etc.
  • Automatic driving means that the automatic driving device in the vehicle can operate the vehicle to drive safely without the participation of the driver during the driving process.
  • Assisted driving refers to the auxiliary driving device in the vehicle that assists the driver in safe driving while the vehicle is driving.
  • Autonomous driving or assisted driving can also be called intelligent driving.
  • the image processing device can also be a virtual reality (VR) device, an augmented reality (AR) device, a wearable device, a wireless terminal in industrial control, a wireless terminal in driverless driving, a wireless terminal in remote medical treatment, a wireless terminal in a smart grid, a wireless terminal in a smart city, or a wireless terminal in a smart home, etc.
  • the image processing device can also be a terminal in the Internet of Things (IoT) system.
  • the image processing device of the present application may be a vehicle-mounted module, vehicle-mounted component, vehicle-mounted chip, or vehicle-mounted unit built into the vehicle as one or more components or units.
  • The vehicle can implement the method of the present application through the built-in vehicle-mounted module, vehicle-mounted component, vehicle-mounted chip, or vehicle-mounted unit.
  • the embodiments of the present application can be applied to the Internet of Vehicles, such as vehicle to everything (V2X), long term evolution vehicle (LTE-V) communication, vehicle to vehicle (V2V) communication, etc.
  • the image sensing device in the embodiment of the present application can be any device with image sensing capabilities.
  • the image sensing device may include one or more of a monocular camera, a binocular camera, a trinocular camera, a depth camera, or a scanner.
  • the image recognition device in the embodiment of the present application can be any device with image recognition capabilities and can recognize the content in the image.
  • the image processing system 30 shown in FIG. 3A is only used as an example and is not used to limit the technical solution of the present application. Those skilled in the art should understand that during specific implementation, the image processing system 30 may also include other equipment, and the number of image processing devices, image sensing devices or image recognition devices may also be determined according to specific needs without limitation.
  • the functions of the image processing apparatus in the embodiments of the present application can be implemented by one device or module, or by multiple devices or modules.
  • the functions of the image processing device are implemented by multiple devices or modules, the image processing device may be as shown in Figure 3B.
  • the image processing device 301 may include multiple modules, namely a model acquisition module 3011, an information acquisition module 3012 and an image processing module 3013.
  • the image processing device 301 also includes an image acquisition module 3014 and/or an image recognition module 3015.
  • the model acquisition module 3011 can be used to acquire an image processing model and send the image processing model to the image processing module 3013.
  • the information acquisition module 3012 may be used to acquire the first environment information and send the first environment information to the image processing module 3013.
  • the image processing module 3013 can be used to receive the image processing model from the model acquisition module 3011, receive the first environment information from the information acquisition module 3012, acquire the first original image, and use the first environment information as an input parameter of the image processing model to obtain first image processing parameters, and use the first image processing parameters to process the first original image to obtain a first processed image.
  • the image processing module 3013 can also send the first processed image to the image recognition module 3015, so that the image recognition module 3015 can recognize the content in the first processed image, etc.
  • the model acquisition module 3011 may be used to acquire an image processing model and send the image processing model to the image processing module 3013.
  • the information acquisition module 3012 may be used to acquire the first environment information and send the first environment information to the image processing module 3013.
  • the image acquisition module 3014 may be used to acquire the first original image and send the first original image to the image processing module 3013.
  • the image processing module 3013 may be configured to receive the image processing model from the model acquisition module 3011, receive the first environment information from the information acquisition module 3012, receive the first original image from the image acquisition module 3014, use the first environment information as an input parameter of the image processing model to obtain the first image processing parameters, and process the first original image using the first image processing parameters to obtain the first processed image.
  • the image processing module 3013 can also send the first processed image to the image recognition module 3015, so that the image recognition module 3015 can recognize the content in the first processed image, etc.
  • In the embodiments of the present application, "module" can be replaced by "device"; for example, "image processing module" can be replaced by "image processing device".
  • the image processing device 301 shown in FIG. 3B is only used as an example and is not intended to limit the technical solution of the present application. Those skilled in the art should understand that, during specific implementation, the image processing device 301 may also include other modules or equipment, and the numbers of model acquisition modules, information acquisition modules, image processing modules, image acquisition modules, or image recognition modules may be determined according to specific needs, without limitation.
  • each device or module (such as an image processing device, a model acquisition module, an information acquisition module, or an image processing module) in Figure 3A or Figure 3B in the embodiments of this application can be a general-purpose device or a special-purpose device, which is not specifically limited in the embodiments of the present application.
  • each device or module can be implemented by one device, jointly implemented by multiple devices, or implemented by one or more functional modules in one device, which is not specifically limited in the embodiments of the present application.
  • It can be understood that the above functions can be network elements in hardware devices, software functions running on dedicated hardware, a combination of hardware and software, or virtualized functions instantiated on a platform (for example, a cloud platform).
  • each device or module (such as image processing device, model acquisition module, information acquisition module, or image processing module, etc.) in Figure 3A or Figure 3B in the embodiment of the present application can adopt the composition structure shown in Figure 4 , or include the components shown in Figure 4.
  • FIG. 4 shows a schematic diagram of the hardware structure of an image processing device applicable to embodiments of the present application.
  • the image processing device 40 includes at least one processor 401 and at least one communication interface 404, which are used to implement the method provided by the embodiment of the present application.
  • the image processing device 40 may also include a communication line 402 and a memory 403.
  • the processor 401 can be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits used to control the execution of the programs of the present application.
  • Communication line 402 may include a path, such as a bus, that carries information between the above-mentioned components.
  • Communication interface 404 is used to communicate with other devices or communication networks.
  • the communication interface 404 can be any transceiver-like device for communicating with other devices or communication networks, such as an Ethernet interface, a radio access network (RAN) interface, a wireless local area network (WLAN) interface, a transceiver, pins, a bus, or a transceiver circuit, etc.
  • Memory 403 may be a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a random access memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, optical discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited thereto.
  • the memory may exist independently and be coupled to the processor 401 through the communication line 402, or the memory 403 may be integrated with the processor 401.
  • the memory provided by the embodiment of the present application may generally be non-volatile.
  • the memory 403 is used to store computer execution instructions involved in executing the solutions provided by the embodiments of this application, and the processor 401 controls the execution.
  • the processor 401 is used to execute computer execution instructions stored in the memory 403, thereby implementing the method provided by the embodiment of the present application.
  • the processor 401 may also perform processing-related functions in the methods provided in the following embodiments of the present application, and the communication interface 404 is responsible for communicating with other devices or communication networks; this is not specifically limited in the embodiments of the present application.
  • the computer-executed instructions in the embodiments of the present application may also be called application codes, which are not specifically limited in the embodiments of the present application.
  • the coupling in the embodiment of this application is an indirect coupling or communication connection between devices, units or modules, which may be in electrical, mechanical or other forms, and is used for information interaction between devices, units or modules.
  • the processor 401 may include one or more CPUs, such as CPU0 and CPU1 in FIG. 4 .
  • the image processing device 40 may include multiple processors, such as the processor 401 and the processor 407 in FIG. 4 . Each of these processors may be a single-CPU processor or a multi-CPU processor.
  • a processor here may refer to one or more devices, circuits, and/or processing cores for processing data (eg, computer program instructions).
  • the image processing apparatus 40 may also include an output device 405 and/or an input device 406.
  • Output device 405 is coupled to processor 401 and can display information in a variety of ways.
  • the output device 405 may be a liquid crystal display (LCD), a light emitting diode (LED) display device, a cathode ray tube (CRT) display device, or a projector (projector), etc.
  • Input device 406 is coupled to processor 401 and can receive user input in a variety of ways.
  • the input device 406 may be a mouse, a keyboard, a touch screen device, a sensing device, or the like.
  • the image processing device 40 also includes an image perception module and/or an image recognition module (not shown in Figure 4).
  • the image sensing module can have image sensing capabilities.
  • the image processing device 40 is configured with one or more of a monocular camera, a binocular camera, a trinocular camera, a depth camera or a scanning module.
  • the image recognition module can have the ability to recognize the content in the image.
  • It can be understood that the composition shown in Figure 4 does not constitute a limitation on the image processing device. In practice, the image processing device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
  • "image" in the embodiments of this application can be replaced by "picture", "photograph" or other names similar to "image", without limitation.
  • A/B may indicate A or B; "and/or" describes an association relationship between associated objects and indicates that three relationships may exist.
  • For example, "A and/or B" can represent three situations: A exists alone, A and B exist simultaneously, or B exists alone.
  • A and B can each be singular or plural.
  • expressions similar to "at least one of A, B and C" or "at least one of A, B or C" are often used to mean any of the following: A exists alone; B exists alone; C exists alone; A and B exist simultaneously; A and C exist simultaneously; B and C exist simultaneously; or A, B and C exist simultaneously.
  • The above takes three elements A, B and C as an example to illustrate the optional items of this expression; when the expression contains more elements, its meaning can be obtained according to the same rule.
  • words such as "first" and "second" may be used to distinguish technical features with the same or similar functions; these words do not limit quantity or execution order.
  • words such as "exemplary" or "for example" are used to present examples or illustrations, and any embodiment or design solution described as "exemplary" or "for example" shall not be interpreted as being more preferred or advantageous than other embodiments or design solutions.
  • the use of words such as “exemplary” or “such as” is intended to present related concepts in a concrete manner that is easier to understand.
  • Reference to an embodiment means that a particular feature, structure, or characteristic associated with the embodiment is included in at least one embodiment of the present application; thus, the embodiments mentioned throughout this specification do not necessarily refer to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments. It can be understood that, in the various embodiments of the present application, the size of the sequence numbers of the processes does not imply an execution order; the execution order of the processes should be determined by their functions and internal logic, and does not constitute any limitation on the implementation process of the embodiments of the present application.
  • "At the same time" in this application can be understood as at the same point in time, within a period of time, or within the same cycle.
  • the image processing device can perform some or all of the steps in the embodiments of the present application. These steps are only examples. The embodiments of the present application can also perform other steps or variations of various steps. In addition, various steps may be performed in a different order than those presented in the embodiments of the present application, and it may not be necessary to perform all the steps in the embodiments of the present application.
  • As shown in Figure 5, the image processing method may include the following steps:
  • S501: The image processing device obtains the image processing model, the first original image and the first environment information.
  • the image processing device in S501 may be the image processing device 301 in FIG. 3A or FIG. 3B.
  • the image processing device may acquire the image processing model, the first original image, and the first environment information at the same time, or may not acquire the image processing model, the first original image, and the first environment information at the same time.
  • the image processing device may first acquire the image processing model and the first environment information, and then acquire the first original image, or the image processing device may first acquire the image processing model, then acquire the first environment information, and finally acquire the first original image.
  • the image processing model may include an input terminal and an output terminal. After the input parameters are input into the image processing model through the input terminal, the output terminal can output the output parameters.
  • an image processing model includes a function or algorithm.
  • the image processing model can include any function or algorithm, such as a quadratic function, a cubic function, an exponential function, a custom function or a custom algorithm, etc. It can be understood that the calculation amount of using a function or algorithm to calculate the parameters in the first environment information is much less than the calculation amount of generating and redistributing the grayscale histogram. Therefore, the image processing speed of the image processing device can be greatly improved.
  • the input parameters may include environmental information, such as environmental information when acquiring the original image.
  • the input parameters include first environment information.
  • the original image (such as the first original image, and/or the second original image, and/or the third original image, etc.) may be an image of any type or format.
  • the original image is a RAW image or an RGB image.
  • the original image is an image in bayer format, an image in jpeg format, an image in BMP format, or an image in PNG format, etc.
  • the output parameters may include image processing parameters (such as first image processing parameters).
  • the image processing parameters may include at least one of the following: parameters in the demosaicing function, parameters in the black level correction function, parameters in the lens correction function, parameters in the gamma correction function (such as gamma parameters), parameters in the white balance function, or parameters in the color mapping function.
  • the first environment information is information about the environment when the first original image is acquired. It can be understood that the first environment information is related to the function applied by the image processing method. Specifically, if the function is a gamma correction function, the first environment information is related to the gamma correction function. In this way, the first image processing parameters obtained according to the first environment information can be used for gamma correction.
  • the first environment information includes at least one of the following: time information when the first original image is obtained, illumination intensity information when the first original image is obtained, illumination direction information when the first original image is obtained, brightness information when the first original image is obtained, or position information when the first original image is obtained.
  • the time information for obtaining the first original image may include at least one of the following: a moment when the first original image is obtained, a month when the first original image is obtained, or a season when the first original image is obtained.
  • the lighting direction can include east, south, west, or north; or the lighting direction can include east, south, west, north, northeast, southeast, northwest, or southwest; or the lighting direction can include angles, such as the angle between the lighting direction and each axis of a coordinate system.
  • the coordinate system can be any two-dimensional or three-dimensional coordinate system in which the first original image is located. Location information can include longitude and latitude. It can be understood that the above information is related to the gamma correction function. Therefore, the first image processing parameters obtained according to the first environment information can be used for gamma correction.
  • the above information also changes smoothly and gradually; that is, the first environment information corresponding to two adjacent frames of a video differs little, so the first image processing parameters obtained in S502 for the two adjacent frames based on the first environment information also differ little, and video flickering will not occur.
  • the first environment information includes multiple bits, and different bits correspond to different information.
  • For example, the first environment information may include 6 bits, where the first two bits correspond to the season in which the first original image was acquired and the last 4 bits correspond to the illumination intensity when the first original image was acquired. If the value of the first two bits is "00", the season in which the first original image was acquired is spring; if it is "01", the season is summer; if it is "10", the season is autumn; and if it is "11", the season is winter. The value of the last 4 bits represents the illumination intensity when the first original image was acquired.
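  • A small sketch of this 6-bit layout, assuming the season occupies the two high-order bits and the illumination intensity is quantized to 4 bits (the quantization itself is an assumption):

```python
SEASONS = {"spring": 0b00, "summer": 0b01, "autumn": 0b10, "winter": 0b11}

def encode_env(season: str, intensity: float) -> int:
    # Quantize intensity in [0.0, 1.0] to 4 bits (0..15).
    level = max(0, min(15, round(intensity * 15)))
    return (SEASONS[season] << 4) | level

def decode_env(code: int) -> tuple[str, int]:
    season = {v: k for k, v in SEASONS.items()}[(code >> 4) & 0b11]
    return season, code & 0b1111

print(bin(encode_env("summer", 0.6)))  # 0b11001: season bits 01 (summer), level 9
```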
  • the image processing device can obtain the image processing model, and/or the first original image, and/or the first environment information in various ways.
  • the image processing device can obtain the above information locally, or the image processing device can obtain the above information through other devices or equipment, or the image processing device can obtain the above information by itself.
  • Detailed introduction is given below.
  • the image processing device locally stores the image processing model, and/or the first original image, and/or the first environment information. Therefore, the image processing device can obtain the image processing model, and/or the first original image, and/or the first environment information locally.
  • the model acquisition device can acquire the image processing model and send it to the image processing device, and accordingly, the image processing device receives the image processing model.
  • the image sensing device can acquire the first original image and send it to the image processing device, and accordingly, the image processing device receives the first original image.
  • the image sensing device may be image sensing device 302 in Figure 3A.
  • the information acquisition device can acquire the first environment information and send it to the image processing device.
  • the image processing device receives the first environment information. It can be understood that the model acquisition device, the image sensing device and the information acquisition device are different devices from the image processing device.
  • the model acquisition device, image sensing device and information acquisition device may be the same device or different devices without limitation.
  • the image processing device may acquire the image processing model by itself in the following manner: the image processing device acquires at least one piece of second environment information, at least one second original image and at least one target image, and obtains the image processing model according to the at least one piece of second environment information, the at least one second original image and the at least one target image.
  • the image processing device may have image sensing capabilities.
  • the image processing device is configured with a camera device, and the first original image may be acquired through the camera device.
  • the camera device may include one or more of the following: a monocular camera, a binocular camera, a trinocular camera, a depth camera or a scanner.
  • the image processing device can obtain the first environment information through one or more of the network, a sensor in the image processing device, or software installed in the image processing device.
  • the image processing device determines the time information for acquiring the first original image through the network or time software installed thereon.
  • For example, the image processing device determines, through sensors configured on it, the illumination intensity information, and/or the illumination direction information, and/or the brightness information when the first original image is acquired. For another example, the image processing device determines the position information when the first original image is acquired through map software installed on it.
  • S502: The image processing device uses the first environment information as an input parameter of the image processing model to obtain the first image processing parameters.
  • the image processing device inputs the first environment information into the image processing model, that is, the first image processing parameters can be obtained. That is to say, the first image processing parameters can be obtained by calculating the first environment information using the function or algorithm in the image processing model.
  • the first image processing parameters may include at least one of the following: parameters in the demosaic function, parameters in the black level correction function, parameters in the lens correction function, and parameters in the gamma correction function (such as gamma parameters), parameters in the white balance function, or parameters in the color mapping function.
  • S503: The image processing device uses the first image processing parameters to process the first original image to obtain the first processed image.
  • the image processing device can use the gamma parameter to perform gamma correction on the first original image to obtain the first processed image.
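  • For an 8-bit image, S503 can be sketched with a lookup table, which avoids a per-pixel power computation and suits the low-compute, real-time scenarios discussed above; the gamma value here is an illustrative assumption.

```python
import numpy as np

def gamma_correct_u8(img: np.ndarray, gamma: float) -> np.ndarray:
    # Precompute the 256-entry table once, then index it per pixel.
    lut = (255.0 * (np.arange(256) / 255.0) ** gamma).round().astype(np.uint8)
    return lut[img]

first_original = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
first_processed = gamma_correct_u8(first_original, gamma=0.8)  # gamma < 1 brightens
```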
  • the image processing device may identify the content in the first processed image.
  • the image processing device may send the first processed image to the image recognition device, so that the image recognition device recognizes the content in the first image.
  • the image recognition device may be the image recognition device 303 in FIG. 3A.
  • the actions of the image processing device in the above S501-S503 can be executed by the processor 401 in the image processing device 40 shown in Figure 4 calling the application code stored in the memory 403, which is not limited in this embodiment of the present application.
  • Based on the above, the image processing device can obtain the first image processing parameters based on the environment information at the time the first original image is acquired and the image processing model, and process the first original image according to the first image processing parameters to obtain the processed image.
  • This process does not involve generating and redistributing grayscale histograms or other computationally intensive steps. Instead, an image processing model trained in advance and related to environment information is used to obtain the first image processing parameters, and the first original image is processed according to the first image processing parameters. The calculation process is relatively simple and the amount of computation is small.
  • the smart driving vehicle can identify targets (such as vehicles, pedestrians, lane lines or obstacles, etc.) as quickly as possible based on the first processed image.
  • In addition, the method shown in Figure 5 obtains the first image processing parameters based on the environment information at the time the first original image is acquired and the image processing model. That is to say, the first image processing parameters are related to the environment information and independent of the original image itself, so dark objects in the original image will not cause the processed image to be too bright or overexposed.
  • Since the first image processing parameters are related to environment information, if the environment information corresponding to two original images is the same (or differs little), the first image processing parameters corresponding to the two original images are also the same (or differ little). Therefore, if this method is used to process video streams, video flickering will not occur.
  • the image processing device can obtain the image processing model by itself. Specifically, as shown in Figure 6, the image processing device can obtain the image processing model through the following steps:
  • the image processing device obtains at least one second environment information and at least one second original image.
  • At least one second original image includes a second original image corresponding to each of the at least one second environment information.
  • one piece of second environment information may correspond to at least one second original image.
  • For example, suppose the image processing device acquires 2 pieces of second environment information and 50 second original images, where the first piece of second environment information corresponds to the first 20 second original images and the second piece of second environment information corresponds to the last 30 second original images.
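  • As a data-layout illustration of this correspondence, the following sketch pairs each piece of second environment information with its second original images; the variable names and placeholder values are assumptions for illustration only.

```python
# Illustrative pairing of second environment information with second original images.
second_env_infos = ["env_info_1", "env_info_2"]        # e.g. time/illumination records
second_originals = [f"image_{k}" for k in range(50)]   # placeholders for 50 images

env_to_images = {
    second_env_infos[0]: second_originals[:20],  # first 20 second original images
    second_env_infos[1]: second_originals[20:],  # last 30 second original images
}
```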
  • the image processing device can obtain at least one second environment information and/or at least one second original image in various ways.
  • the image processing device can obtain the above information locally, or the image processing device can obtain the above information through other devices or equipment, or the image processing device can obtain the above information by itself.
  • the image processing device acquires at least one target image.
  • At least one target image includes a target image corresponding to each second environment information in at least one second environment information.
  • one piece of second environment information can correspond to at least one target image.
  • the target image may be an image obtained by processing the third original image, and the target image meets certain requirements.
  • the image processing device can obtain at least one target image in various ways.
  • the image processing device may acquire at least one target image locally, that is, at least one target image is stored locally in advance.
  • the image processing device can obtain at least one target image through other devices or equipment.
  • other devices or equipment use the method shown in Figure 1 to process at least one third original image to obtain at least one target image corresponding to the at least one third original image, and send the at least one target image to the image processing device.
  • the image processing device may acquire at least one target image by itself.
  • the image processing device uses the method shown in FIG. 1 to process at least one third original image to obtain at least one target image corresponding to at least one third original image. It should be understood that in addition to the method shown in Figure 1, other methods can also be used to obtain the target image without limitation.
  • the second environment information is information about the environment when acquiring the second original image and the third original image corresponding to the second environment information.
  • the third original image is the original image of the target image corresponding to the second environment information.
  • the second environment information includes at least one of the following: time information of acquiring the second original image and the third original image corresponding to the second environment information, or information on the illumination intensity, the illumination direction, or the brightness when acquiring the second original image and the third original image corresponding to the second environment information.
  • For example, if the second environment information includes the time at which the second original image and the third original image corresponding to the second environment information are acquired, the time is 12:00, the second original images corresponding to the second environment information are image 1 and image 2, and the third original image corresponding to the second environment information is image 3, then the time at which image 1, image 2 and image 3 are acquired is all 12:00.
  • the type of information included in the second environment information is the same as the type of information included in the first environment information. That is to say, the types of parameters included in the first environment information are the same as the types of parameters included in the second environment information, but the values of the parameters included in the second environment information and the values of the parameters included in the first environment information may be the same or different. For example, if the first environment information includes information about the illumination intensity when acquiring the first original image, then the second environment information includes information about the illumination intensity when acquiring the second original image and the third original image corresponding to the second environment information, and the illumination intensity information included in the first environment information and that included in the second environment information may be the same or different. Similarly, if the second environment information includes the illumination direction information and the brightness information of the second original image and the third original image corresponding to the second environment information, then the first environment information includes illumination direction information and brightness information, and the illumination direction information (or brightness information) included in the first environment information and that included in the second environment information may be the same or different.
  • the image processing device acquires an image processing model based on at least one second environment information, at least one second original image, and at least one target image.
  • the image processing device can perform training based on the at least one second environment information, the at least one second original image and the at least one target image to obtain the image processing model.
  • Generally, the more training data is used, the more accurate the image processing model is, and the image processing parameters (such as the first image processing parameters) obtained through the image processing model are also more accurate.
  • the image processing device uses at least one second image processing parameter to respectively process the at least one second original image to obtain at least one second processed image, and obtains the image processing model based on the at least one second environment information and the at least one second image processing parameter.
  • At least one second image processing parameter corresponds to at least one second original image one-to-one, and the second image processing parameters corresponding to different second original images may be the same or different.
  • the absolute value of the difference between the correlation coefficient of the at least one second processed image and the correlation coefficient of the at least one target image is less than or equal to the first threshold.
  • the correlation coefficient can also be called entropy.
  • the image processing device can extract features of the at least one second processed image, and obtain the correlation coefficient of the at least one second processed image based on the extracted features. Specifically, as shown in FIG. 7, the image processing device can obtain the grayscale image of each second processed image in the at least one second processed image, obtain at least one grayscale histogram based on the at least one grayscale image, and average the at least one grayscale histogram to obtain the first average grayscale histogram. Subsequently, the image processing device may determine the correlation coefficient of the at least one second processed image based on the first average grayscale histogram.
  • the correlation coefficient of the at least one second processed image can satisfy the formula $H_1 = -\sum_{i=0}^{N} p(x_i)\log p(x_i)$, where $H_1$ is the correlation coefficient of the at least one second processed image, $x_i$ is a pixel value (specifically, an integer greater than or equal to 0 and less than or equal to 255), $p(x_i)$ is the probability corresponding to the point with pixel value $x_i$ in the first average grayscale histogram, which can represent the probability that a point with pixel value $x_i$ appears in the at least one grayscale histogram corresponding to the at least one second processed image, and $N$ is 255.
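  • As an illustration of how $H_1$ could be computed from the first average grayscale histogram, the following Python sketch averages the normalized histograms of a set of grayscale images and evaluates the entropy formula above; the helper names are assumptions, and base-2 logarithms are used for illustration.

```python
# Minimal sketch: average grayscale histogram and its entropy (correlation coefficient).
import numpy as np

def average_histogram(gray_images: list) -> np.ndarray:
    """Average the normalized 256-bin grayscale histograms of 8-bit images."""
    hists = [np.bincount(img.ravel(), minlength=256) / img.size for img in gray_images]
    return np.mean(hists, axis=0)  # p(x_i) for i = 0..255

def entropy(p: np.ndarray) -> float:
    """H = -sum_i p(x_i) * log2 p(x_i), skipping empty bins."""
    nz = p[p > 0]
    return float(-(nz * np.log2(nz)).sum())

# H1 for the second processed images; H2 is obtained the same way from target images.
# h1 = entropy(average_histogram(second_processed_grays))
```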
  • the method by which the image processing apparatus obtains the correlation coefficient of at least one target image is similar to the method of obtaining the correlation coefficient of at least one second processed image.
  • the image processing device may extract features of at least one target image, and obtain a correlation coefficient of at least one target image based on the extracted features.
  • the image processing device can acquire the grayscale image of each target image in the at least one target image, obtain at least one grayscale histogram based on the at least one grayscale image, and average the at least one grayscale histogram to obtain the second average grayscale histogram. Subsequently, the image processing device may determine the correlation coefficient of the at least one target image according to the second average grayscale histogram.
  • the correlation coefficient of the at least one target image can satisfy the formula $H_2 = -\sum_{i=0}^{N} q(x_i)\log q(x_i)$, where $H_2$ is the correlation coefficient of the at least one target image, $x_i$ is a pixel value (specifically, an integer greater than or equal to 0 and less than or equal to 255), $q(x_i)$ is the probability corresponding to the point with pixel value $x_i$ in the second average grayscale histogram, which can represent the probability that a point with pixel value $x_i$ appears in the at least one grayscale histogram corresponding to the at least one target image, and $N$ is 255.
  • the absolute value of the difference between the correlation coefficient of the at least one second processed image and the correlation coefficient of the at least one target image can be expressed as $|H_1 - H_2|$.
  • This absolute value may represent the similarity between at least one second processed image and at least one target image. Specifically, the larger the absolute value is, the less similar the at least one second processed image is to the at least one target image. The smaller the absolute value is, the more similar the at least one second processed image is to the at least one target image. Therefore, if the absolute value is less than or equal to the first threshold, the similarity between the at least one second processed image and the at least one target image can be high.
  • the first threshold can be set as needed and is not limited.
  • the difference between the correlation coefficient of the at least one second processed image and the correlation coefficient of the at least one target image, or in other words, the similarity between the at least one second processed image and the at least one target image, can also be characterized by the KL divergence (Kullback–Leibler divergence), also called relative entropy.
  • the KL divergence of the at least one second processed image and the at least one target image can satisfy the formula $D_{KL}(p\|q) = \sum_{i=0}^{N} p(x_i)\log\frac{p(x_i)}{q(x_i)}$, where $D_{KL}(p\|q)$ is the KL divergence of the at least one second processed image and the at least one target image, $x_i$ is a pixel value (specifically, an integer greater than or equal to 0 and less than or equal to 255), $p(x_i)$ is the probability corresponding to the point with pixel value $x_i$ in the first average grayscale histogram, $q(x_i)$ is the probability corresponding to the point with pixel value $x_i$ in the second average grayscale histogram, and $N$ is 255.
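  • As an illustration of this alternative similarity measure, the following Python sketch evaluates the KL divergence between the two average grayscale histograms; the epsilon guard for empty bins is an implementation assumption, not part of the text above.

```python
# Minimal sketch: KL divergence between average grayscale histograms p and q.
import numpy as np

def kl_divergence(p: np.ndarray, q: np.ndarray, eps: float = 1e-12) -> float:
    """D_KL(p || q) = sum_i p(x_i) * log(p(x_i) / q(x_i))."""
    p = np.clip(p, eps, None)  # guard empty bins so the log is defined
    q = np.clip(q, eps, None)
    return float((p * np.log(p / q)).sum())
```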
  • A larger value of the KL divergence indicates that the at least one second processed image and the at least one target image are less similar, and a smaller value of the KL divergence indicates that they are more similar. Therefore, if the value of the KL divergence is less than or equal to the first threshold, the similarity between the at least one second processed image and the at least one target image can be made high.
  • the first threshold can be set as needed and is not limited.
  • the image processing device can obtain an image processing model based on at least one second environment information and at least one second image processing parameter.
  • In this way, after environment information is input into the image processing model, the image processing parameters can be obtained, and using the image processing parameters to process the original image corresponding to the environment information can make the similarity between the processed image and the at least one target image high. Therefore, the processed image can also meet the requirements met by the target image.
  • the image processing device obtains the image processing model based on at least one second environment information and at least one second image processing parameter.
  • the image processing device can obtain at least one second image processing parameter in various ways.
  • At least one second image processing parameter can be preconfigured in the image processing device based on experience, and the image processing device obtains at least one second image processing parameter locally.
  • the image processing device obtains a set of candidate image processing parameters.
  • the set of candidate image processing parameters includes a plurality of candidate image processing parameters, and the plurality of candidate image processing parameters include the at least one second image processing parameter. That is to say, the image processing device can first obtain multiple candidate parameters, and then use the candidate parameters to process the second original image to obtain a processed image. If the absolute value of the difference between the correlation coefficient of the processed image and the correlation coefficient of the target image is less than or equal to the first threshold, the image processing device determines the candidate parameter corresponding to the processed image as the second image processing parameter.
  • the image processing device obtains initial image processing parameters and performs a first operation on the initial image processing parameters to obtain a set of candidate image processing parameters.
  • the initial image processing parameters can be set as needed.
  • the first operation includes at least one of the following: determining candidate image processing parameters according to rules based on the initial image processing parameters, randomly determining candidate image processing parameters based on the initial image processing parameters, or determining the set of candidate image processing parameters based on historically determined candidate image processing parameters.
  • the image processing device can determine 20 values in the interval from 1.90 to 2.10 with an interval of 0.01 as multiple candidate image processing parameters.
  • the image processing device can randomly add to or subtract from 2.0 a number within 0.3 to obtain multiple candidate image processing parameters.
  • the image processing apparatus may determine all or part of the candidate image processing parameters determined historically as a plurality of candidate image processing parameters.
  • the image processing device can determine 20 values in the interval from 1.90 to 2.10 at intervals of 0.01, and then determine, among the 20 values, those that have not been historically determined as candidate image processing parameters, as the multiple candidate image processing parameters.
  • the image processing device can randomly add to or subtract from 2.0 a number within 0.3 to obtain 30 values, and then determine, among the 30 values, those that have not been historically determined as candidate image processing parameters, as the multiple candidate image processing parameters.
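  • The following Python sketch illustrates the three generation strategies named above, using the initial parameter 2.0, the 0.01 step and the 0.3 radius from the examples; the function names and exact ranges are illustrative assumptions.

```python
# Minimal sketches of the three candidate-generation strategies (illustrative).
import numpy as np

def candidates_by_rule(initial=2.0, step=0.01, count=20):
    # Evenly spaced values around the initial parameter, e.g. 1.90, 1.91, ...
    return [round(initial - step * count / 2 + step * k, 2) for k in range(count)]

def candidates_by_random(initial=2.0, radius=0.3, count=30):
    # Randomly add to or subtract from the initial value a number within `radius`.
    rng = np.random.default_rng()
    return list(initial + rng.uniform(-radius, radius, size=count))

def candidates_excluding_history(candidates, history):
    # Keep only values not historically determined as candidate parameters.
    seen = set(history)
    return [c for c in candidates if c not in seen]
```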
  • the image processing device first obtains a set of candidate image processing parameters, and then processes the second original image according to the parameters in the set to obtain at least one second image processing parameter.
  • the image processing device can also obtain one candidate image processing parameter at a time and process the second original image according to that candidate image processing parameter. If the absolute value of the difference between the correlation coefficient of the processed image and the correlation coefficient of the target image is less than or equal to the first threshold, then the candidate image processing parameter is the second image processing parameter. Otherwise, the image processing device obtains another candidate image processing parameter and processes the second original image according to it, repeating the process until a candidate image processing parameter meets the above condition; that candidate image processing parameter is the second image processing parameter.
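  • As a sketch of this one-at-a-time search, the following Python loop reuses the gamma-correction and entropy helpers sketched earlier; all names, and the use of gamma as the image processing parameter, are illustrative assumptions.

```python
# Minimal sketch: select a second image processing parameter by trial.
def find_second_parameter(second_original_grays, h2_target, threshold, candidates):
    for gamma in candidates:
        processed = [gamma_correct(img, gamma) for img in second_original_grays]
        h1 = entropy(average_histogram(processed))
        if abs(h1 - h2_target) <= threshold:  # |H1 - H2| <= first threshold
            return gamma  # this candidate is a second image processing parameter
    return None  # no candidate met the requirement
```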
  • the image processing device can train at least one second image processing parameter and at least one second environment information to obtain an image processing model.
  • Since the at least one second image processing parameter can make the at least one second processed image have a high similarity with the at least one target image, when environment information (such as the first environment information) is input into the image processing model obtained according to the above method, image processing parameters (such as the first image processing parameters) can be obtained, and using these image processing parameters to process the original image (such as the first original image) corresponding to the environment information can make the processed image (such as the first processed image) have a high similarity with the at least one target image.
  • Therefore, the processed image can also meet the requirements met by the target image.
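  • As an illustration of this training step, the following Python sketch fits a simple model mapping a scalar environment feature (e.g. illumination intensity) to a gamma parameter. The patent does not fix a model family; the least-squares polynomial fit here is an assumption for illustration.

```python
# Minimal sketch: fit an image processing model from (environment feature, gamma) pairs.
import numpy as np

def fit_image_processing_model(env_features, gammas, degree=2):
    """Return a callable mapping an environment feature to a gamma parameter."""
    coeffs = np.polyfit(np.asarray(env_features), np.asarray(gammas), deg=degree)
    return np.poly1d(coeffs)

# Usage (illustrative): model = fit_image_processing_model(lux_values, selected_gammas)
#                       first_image_processing_parameter = model(first_env_lux)
```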
  • the actions of the image processing device in the above-mentioned S5011-S5013 can be executed by the processor 401 in the image processing device 40 shown in Figure 4 calling the application code stored in the memory 403.
  • This embodiment of the present application does not limit this.
  • the methods and/or steps implemented by the image processing device can also be implemented by components (such as chips or circuits) that can be used in the image processing device.
  • embodiments of the present application also provide an image processing device, which may be the image processing device in the above method embodiment, or a device including the above image processing device, or a component that can be used in the image processing device.
  • the above-mentioned image processing device includes hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a function is performed by hardware or computer software driving the hardware depends on the specific application and design constraints of the technical solution. Skilled artisans may implement the described functionality using different methods for each specific application, but such implementations should not be considered beyond the scope of this application.
  • Embodiments of the present application can divide the image processing device into functional modules according to the above method examples.
  • each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or software function modules. It can be understood that the division of modules in the embodiment of the present application is schematic and is only a logical function division. In actual implementation, there may be other division methods.
  • FIG. 9 shows a schematic structural diagram of an image processing device 90 .
  • the image processing device 90 includes a processing module 901 .
  • the processing module 901 which may also be called a processing unit, is used to perform operations other than sending and receiving operations, and may be, for example, a processing circuit or a processor.
  • the image processing device 90 may also include a storage module (not shown in FIG. 9) for storing program instructions and data.
  • the image processing device 90 is used to implement the functions of the above image processing device.
  • the image processing device 90 is, for example, the image processing device described in the embodiment shown in FIG. 5 or the embodiment shown in FIG. 6 .
  • the processing module 901 is used to obtain the image processing model, the first original image and the first environment information.
  • the first environment information is information about the environment when the first original image is acquired.
  • the processing module 901 may be used to perform S501.
  • the processing module 901 is also used to use the first environment information as an input parameter of the image processing model to obtain the first image processing parameters.
  • the processing module 901 can also be used to perform S502.
  • the processing module 901 is also used to process the first original image using the first image processing parameters to obtain the first processed image.
  • the processing module 901 can also be used to perform S503.
  • the first environment information includes at least one of the following: time information for obtaining the first original image, information on the illumination intensity when obtaining the first original image, information on the illumination direction when obtaining the first original image, or information on the brightness when obtaining the first original image.
  • the processing module 901 is specifically configured to obtain at least one second environment information and at least one second original image, where the at least one second original image includes a second original image corresponding to each second environment information in the at least one second environment information.
  • the processing module 901 is also specifically configured to obtain at least one target image, where the at least one target image includes a target image corresponding to each second environment information in the at least one second environment information, the second environment information is information about the environment when acquiring the second original image and the third original image corresponding to the second environment information, the third original image is the original image of the target image corresponding to the second environment information, and the type of information included in the second environment information is the same as the type of information included in the first environment information.
  • the processing module 901 is also specifically configured to use at least one second image processing parameter to respectively process the at least one second original image to obtain at least one second processed image, where the absolute value of the difference between the correlation coefficient of the at least one second processed image and the correlation coefficient of the at least one target image is less than or equal to the first threshold.
  • the processing module 901 is also specifically configured to obtain the image processing model based on the at least one second environment information and the at least one second image processing parameter.
  • the processing module 901 is also used to obtain a candidate image processing parameter set, where the candidate image processing parameter set includes a plurality of candidate image processing parameters, and the plurality of candidate image processing parameters include the at least one second image processing parameter.
  • the processing module 901 is also used to obtain initial image processing parameters; the processing module 901 is also used to perform a first operation on the initial image processing parameters to obtain a set of candidate image processing parameters.
  • the first operation includes at least one of the following: determining candidate image processing parameters according to rules based on the initial image processing parameters, randomly determining candidate image processing parameters based on the initial image processing parameters, or determining the set of candidate image processing parameters based on historically determined candidate image processing parameters.
  • the image processing model includes functions or algorithms.
  • the image processing device 90 may take the form shown in FIG. 4 .
  • the processor 401 in Figure 4 can cause the image processing device 90 to execute the method described in the above method embodiment by calling the computer execution instructions stored in the memory 403.
  • the function/implementation process of the processing module 901 in Figure 9 can be implemented by the processor 401 in Figure 4 calling the computer execution instructions stored in the memory 403.
  • the above modules or units can be implemented in software, hardware, or a combination of both.
  • the software exists in the form of computer program instructions and is stored in the memory.
  • the processor can be used to execute the program instructions and implement the above method flow.
  • the processor can be built into an SoC (System on a Chip) or ASIC, or it can be an independent semiconductor chip.
  • the processor can further include necessary hardware accelerators, such as a field programmable gate array (FPGA), a programmable logic device (PLD), or a logic circuit that implements dedicated logic operations.
  • the hardware can be a CPU, a microprocessor, a digital signal processing (DSP) chip, a microcontroller unit (MCU), an artificial intelligence processor, an ASIC, an SoC, an FPGA, a PLD, a dedicated digital circuit, a hardware accelerator, or a non-integrated discrete device, any one or any combination of which can run the necessary software or perform the above method flow without relying on software.
  • Embodiments of the present application also provide a chip system, including at least one processor and an interface, where the at least one processor is coupled to a memory through the interface. When the at least one processor executes the computer program or instructions in the memory, the method in the above method embodiments is performed. Optionally, the chip system further includes the memory.
  • the chip system may be composed of chips, or may include chips and other discrete devices, which is not specifically limited in the embodiments of the present application.
  • Embodiments of the present application also provide a computer-readable storage medium. All or part of the processes in the above method embodiments can be completed by a computer program instructing relevant hardware. The program can be stored in the above computer-readable storage medium, and when executed, the program can include the processes of the above method embodiments.
  • the computer-readable storage medium may be an internal storage unit of the image processing device of any of the aforementioned embodiments, such as a hard disk or memory of the image processing device.
  • the computer-readable storage medium may also be an external storage device of the image processing device, such as a plug-in hard drive, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the image processing device.
  • the above computer-readable storage medium may also include both the internal storage unit of the above image processing apparatus and an external storage device.
  • the above-mentioned computer-readable storage medium is used to store the above-mentioned computer program and other programs and data required by the above-mentioned image processing apparatus.
  • the above-mentioned computer-readable storage media can also be used to temporarily store data that has been output or is to be output.
  • Embodiments of the present application also provide a computer program product. All or part of the processes in the above method embodiments can be completed by a computer program instructing relevant hardware. The program can be stored in the above computer program product, and when executed, the program can include the processes of the above method embodiments.
  • Embodiments of the present application also provide computer instructions. All or part of the processes in the above method embodiments can be completed by computer instructions instructing related hardware (such as a computer, a processor, an access network device, a mobility management network element or a session management network element, etc.).
  • the program may be stored in the above-mentioned computer-readable storage medium or in the above-mentioned computer program product.
  • embodiments of the present application also provide an intelligent driving vehicle, including the image processing device in the above embodiment.
  • the disclosed devices and methods can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of modules or units is only a logical function division.
  • In actual implementation, there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the coupling or direct coupling or communication connection between each other shown or discussed may be through some interfaces, and the indirect coupling or communication connection of the devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated.
  • the components shown as units may be one physical unit or multiple physical units, that is, they may be located in one place, or they may be distributed to multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of this embodiment.
  • each functional unit in each embodiment of the present application can be integrated into one processing unit, each unit can exist physically alone, or two or more units can be integrated into one unit.
  • the above integrated units can be implemented in the form of hardware or software functional units.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The present application relates to the field of image processing, and discloses an image processing method and device, which can reduce the amount of calculation in an image processing process. The method comprises: acquiring an image processing model, a first original image and first environment information; using the first environment information as an input parameter of the image processing model to obtain a first image processing parameter; and processing the first original image by means of the first image processing parameter to obtain a first processed image. The first environment information is information about the environment when the first original image is acquired.
PCT/CN2022/139253 2022-04-28 2022-12-15 Procédé et dispositif de traitement d'images WO2023207137A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210470192.7 2022-04-28
CN202210470192.7A CN117011153A (zh) 2022-04-28 2022-04-28 图像处理方法及装置

Publications (1)

Publication Number Publication Date
WO2023207137A1 true WO2023207137A1 (fr) 2023-11-02

Family

ID=88517233

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/139253 WO2023207137A1 (fr) 2022-04-28 2022-12-15 Procédé et dispositif de traitement d'images

Country Status (2)

Country Link
CN (1) CN117011153A (fr)
WO (1) WO2023207137A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102194215A (zh) * 2010-03-19 2011-09-21 索尼公司 图像处理设备、方法以及程序
CN107948511A (zh) * 2017-11-28 2018-04-20 广东欧珀移动通信有限公司 图像亮度处理方法、装置、存储介质和电子设备
CN108012078A (zh) * 2017-11-28 2018-05-08 广东欧珀移动通信有限公司 图像亮度处理方法、装置、存储介质和电子设备
WO2021109620A1 (fr) * 2019-12-06 2021-06-10 华为技术有限公司 Procédé et appareil d'ajustement de paramètre d'exposition
WO2021114184A1 (fr) * 2019-12-12 2021-06-17 华为技术有限公司 Procédé d'apprentissage de modèle de réseau neuronal et procédé de traitement d'image, et appareils associés
CN113989394A (zh) * 2021-10-22 2022-01-28 浙江天行健智能科技有限公司 用于自动驾驶模拟环境色温的图像处理方法及系统

Also Published As

Publication number Publication date
CN117011153A (zh) 2023-11-07

Similar Documents

Publication Publication Date Title
US11183067B2 (en) Image generating apparatus, image generating method, and recording medium
JP6789402B2 (ja) 画像内の物体の姿の確定方法、装置、設備及び記憶媒体
CN113034358B (zh) 一种超分辨率图像处理方法以及相关装置
WO2023164845A1 (fr) Procédé de reconstruction tridimensionnelle, dispositif, système, ainsi que support d'enregistrement
WO2020083307A1 (fr) Procédé, appareil et support de stockage pour obtenir une image de profondeur
CN109413399B (zh) 使用深度图合成对象的装置及其方法
US10929961B2 (en) Electronic device and method for correcting images using external electronic device
TWI785162B (zh) 提供影像的方法及用於支持所述方法的電子裝置
WO2022165722A1 (fr) Procédé, appareil et dispositif d'estimation de profondeur monoculaire
WO2023207379A1 (fr) Procédé et appareil de traitement d'images, dispositif et support de stockage
CN113378605A (zh) 多源信息融合方法及装置、电子设备和存储介质
CN114076970A (zh) 一种定位方法、装置及系统
CN117274107B (zh) 低照度场景下端到端色彩及细节增强方法、装置及设备
CN111563517A (zh) 图像处理方法、装置、电子设备及存储介质
WO2024119997A1 (fr) Procédé et appareil d'estimation d'éclairage
CN113721631A (zh) 传感器数据处理方法、系统及可读存储介质
CN113393510B (zh) 一种图像处理方法、智能终端及存储介质
WO2023207137A1 (fr) Procédé et dispositif de traitement d'images
CN115965961B (zh) 局部到全局的多模态融合方法、系统、设备及存储介质
US11425300B2 (en) Electronic device and method for processing image by electronic device
WO2022193132A1 (fr) Procédé et appareil de détection d'image, et dispositif électronique
CN115601275A (zh) 点云增广方法及装置、计算机可读存储介质、终端设备
US12081879B2 (en) Method for generating image and electronic device therefor
CN117455974A (zh) 一种显示方法、装置和电子设备
CN113824894A (zh) 曝光控制方法、装置、设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22939947

Country of ref document: EP

Kind code of ref document: A1