WO2019153264A1 - Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus - Google Patents


Info

Publication number
WO2019153264A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
frame
timing signal
output
deserialized
Prior art date
Application number
PCT/CN2018/076041
Other languages
English (en)
French (fr)
Inventor
袁扬智
刘俊秀
胡江鸣
韦毅
石岭
Original Assignee
深圳开阳电子股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳开阳电子股份有限公司 filed Critical 深圳开阳电子股份有限公司
Priority to CN201880077740.1A priority Critical patent/CN111492650B/zh
Priority to PCT/CN2018/076041 priority patent/WO2019153264A1/zh
Publication of WO2019153264A1 publication Critical patent/WO2019153264A1/zh

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/01Conversion of standards, e.g. involving analogue television standards or digital television standards processed at pixel level

Definitions

  • the present invention relates to the field of image processing technologies, and in particular, to an image preprocessing method and apparatus, an image sensor interface, an image processing method and apparatus.
  • ISPs (image signal processors)
  • With the development of imaging technology, current image sensors, or additionally attached devices, can already acquire more data for assisting in enhancing image sharpness, such as LVDS multi-frame data or infrared images. Therefore, ISPs also need corresponding improvements to meet the need to improve image clarity.
  • Embodiments of the present invention provide an image preprocessing method and apparatus, an image sensor interface, an image processing method, and an apparatus.
  • An image pre-processing method of an embodiment of the present invention is for an image sensor interface, the image sensor interface including an input interface and an on-chip buffer.
  • the image processing method includes the following steps:
  • the deserialized data includes single frame deserialized data
  • the second timing signal includes a single frame timing signal
  • the step of transmitting the deserialized data to the on-chip buffer according to the second timing signal comprises the following steps:
  • the image preprocessing method includes:
  • the step of reading and outputting the single frame deserialized data from the on-chip buffer according to the single frame timing signal includes the following steps:
  • the single frame deserialized data is converted by the bus interface into single frame output data conforming to the bus protocol standard and stored in the main memory through the bus.
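The input-selection and buffering steps above can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the data layout (`raw["bits"]`, `raw["timing"]`), the 8-bit word width, and the timing-signal structure are all assumptions.

```python
def preprocess(raw, fmt, buffer):
    """Route raw sensor data into the on-chip buffer (illustrative sketch).

    fmt is "parallel" or "serial", chosen according to the user input.
    """
    if fmt == "parallel":
        # Parallel raw data already carries its timing signal (the first
        # timing signal), so it can be buffered directly.
        timing = raw["timing"]
        buffer.append((timing, raw["pixels"]))
    elif fmt == "serial":
        # Serial raw data is first deserialized into parallel words, and a
        # second timing signal is derived by synchronization processing.
        words = [raw["bits"][i:i + 8] for i in range(0, len(raw["bits"]), 8)]
        pixels = [int("".join(map(str, w)), 2) for w in words]
        timing = {"frame": 0, "line_len": len(pixels)}  # assumed structure
        buffer.append((timing, pixels))
    else:
        raise ValueError("unknown input format")
    return buffer
```

In both branches the buffer ends up holding parallel-format data paired with a timing signal, which is what lets the ISP consume either input path uniformly.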
  • An image pre-processing apparatus includes a first processing module, a second processing module, a third processing module, and a fourth processing module.
  • the first processing module is configured to select the input interface to receive raw data in a parallel format or a serial format according to a user input.
  • the second processing module is configured to: when the received raw data is in a parallel format, acquire a first timing signal from the raw data, and send the raw data to the on-chip buffer according to the first timing signal.
  • the third processing module is configured to, when the received raw data is in a serial format, convert the raw data into deserialized data in a parallel format and perform synchronization processing on the deserialized data to obtain a second timing signal.
  • the fourth processing module is configured to send the deserialized data to the on-chip buffer for storage according to the second timing signal.
  • the fourth processing module includes a first transmitting submodule.
  • the first sending submodule is configured to send the single frame deserialized data to the on-chip cache according to the single frame timing signal.
  • the image pre-processing device includes a second output module.
  • the second output module is configured to read and output the single frame deserialized data from the on-chip buffer according to the single frame timing signal.
  • the second output module includes a second receiving submodule, a fourth output submodule, a fifth output submodule, and a sixth output submodule.
  • the second receiving submodule is configured to receive a through signal.
  • the fourth output submodule is configured to output the single frame deserialized data to the image signal processor according to the single frame timing signal when the through signal is enabled.
  • the fifth output submodule is configured to output the single frame deserialized data to the bus interface according to the single frame timing signal when the through signal is not enabled.
  • the sixth output submodule is configured to convert the single frame deserialized data into single frame output data conforming to a bus protocol standard through the bus interface and store it in the main memory through the bus.
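The two output modes selected by the through signal can be sketched as below. This is a hedged model of the routing decision only; the function names, the callable `bus_interface`, and the list-based stand-ins for the ISP and main memory are illustrative assumptions.

```python
def output_single_frame(frame, through_enabled, isp, bus_interface, main_memory):
    """Route a single deserialized frame according to the through signal."""
    if through_enabled:
        # Through mode: output directly to the image signal processor,
        # avoiding a round trip over the bus (no frame buffering needed).
        isp.append(frame)
    else:
        # Non-through mode: convert to bus-protocol output data via the
        # bus interface and store it in main memory over the bus.
        packet = bus_interface(frame)
        main_memory.append(packet)
```

The key point the patent makes is visible here: a single frame never needs both destinations, so enabling the through signal removes the bus-bandwidth cost entirely.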
  • the step of processing the luminance path data to obtain luminance data includes the following steps:
  • the step of processing the chrominance path data to obtain chrominance data comprises the steps of:
  • the image processing method includes the following steps:
  • the chrominance data is output.
  • An image processing apparatus includes an acquisition unit, a first processing unit, a branching unit, a second processing unit, and a third processing unit.
  • the acquiring unit is configured to acquire data processed by the image preprocessing method of any of the above embodiments to obtain data to be processed.
  • the first processing unit is configured to process the to-be-processed data to obtain data to be shunted.
  • the branching unit is configured to divide the to-be-divided data into luminance path data and chrominance path data.
  • the second processing unit is configured to process the brightness path data to obtain brightness data.
  • the third processing unit is configured to process the chrominance path data to obtain chrominance data.
  • the second processing unit includes a luma interpolation subunit.
  • the luma interpolation subunit is configured to perform interpolation processing on the luma path data.
  • the third processing unit includes a chroma interpolation subunit.
  • the chrominance interpolation subunit is configured to perform interpolation processing on the chrominance path data.
  • the image processing device includes a brightness output unit and a chrominance output unit.
  • the brightness output unit is configured to output the brightness data after the interpolation process.
  • the chrominance output unit is configured to output the chrominance data after the interpolation process.
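The independent luminance/chrominance path mechanism described above can be illustrated with a short sketch. The nearest-neighbor "interpolation" and all names here are placeholder assumptions; the patent does not specify the interpolation algorithms.

```python
def interpolate(samples, factor=2):
    # Placeholder interpolation: repeat each sample `factor` times
    # (stands in for the luma/chroma interpolation subunits).
    return [s for s in samples for _ in range(factor)]

def process(to_divide):
    """Split data to be divided into independent luma and chroma paths."""
    luma_path = [y for y, _ in to_divide]    # luminance path data
    chroma_path = [c for _, c in to_divide]  # chrominance path data
    luma = interpolate(luma_path)            # second processing unit
    chroma = interpolate(chroma_path)        # third processing unit
    return luma, chroma                      # output both results
```

Because the two paths never share state after the split, each can be tuned or reconfigured independently, which is the flexibility the patent claims for this mechanism.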
  • the image preprocessing method, the image preprocessing apparatus, the image processing method, and the image processing apparatus may select to receive raw data in a serial format or a parallel format; when the raw data is in a parallel format that the image signal processor can directly process, it is cached directly, and when the raw data is in a serial format, it is converted into parallel-format deserialized data and then cached. In this way, the image signal processor can process various kinds of data to enhance image sharpness.
  • the image preprocessing method, the image preprocessing apparatus, the image sensor interface, the image processing method, and the image processing apparatus of the embodiments of the present invention have the following beneficial effects: first, the image signal processor can support the input data formats of multiple image sensors, such as the raw data format of parallel input, the raw data format of serial multi-frame input, and the input format of an infrared sensor; second, the image signal processor can support complex algorithm processing (for example, wide dynamic range processing of multi-frame input, image defogging, etc.) as well as better image denoising and enhanced adaptive processing; third, more flexibility is provided in chrominance and luminance processing through a data processing path mechanism in which the chrominance path and the luminance path are independent; fourth, the system bandwidth, power consumption, and storage resources can be flexibly configured according to different input modes and configuration modes, so that the chip works in an optimal state.
  • FIG. 1 is a schematic flow chart of an image preprocessing method according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of an image pre-processing apparatus according to an embodiment of the present invention.
  • FIG. 3 is a block diagram of an image sensor interface according to an embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of an image processing method according to an embodiment of the present invention.
  • FIG. 5 is a block diagram showing an image processing apparatus according to an embodiment of the present invention.
  • FIG. 6 is a block diagram showing a first processing unit of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 7 is a block diagram showing a second processing unit of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 8 is a block diagram showing a third processing unit of the image processing device according to the embodiment of the present invention.
  • Image preprocessing apparatus 10, first processing module 12, second processing module 14, third processing module 16, fourth processing module 18, image sensor interface 20, input interface 22, on-chip buffer 24, bus interface 26, infrared processing unit 28, serial data processor 21, cache controller 23, image processing device 30, acquisition unit 32, first processing unit 34, first fusion subunit 342, second fusion subunit 344, 3A statistical subunit 346, white balance correction subunit 348, dead-pixel correction subunit 341, lens shading compensation subunit 343, transparency calculation subunit 345, demultiplexing unit 36, second processing unit 38, luminance interpolation subunit 382, luminance tone mapping subunit 384, luminance color matrix correction subunit 386, defogging subunit 388, luminance color space conversion subunit 381, luminance 2D noise reduction subunit 383, sharpening subunit 385, third processing unit 39, chroma interpolation subunit 392, chroma color matrix correction subunit 394.
  • an image preprocessing method is used for an image preprocessing apparatus 10 and an image sensor interface 20 .
  • the image sensor interface 20 includes an input interface 22 and an on-chip buffer 24 .
  • the image preprocessing method of the embodiment of the present invention includes the following steps:
  • the input interface 22 is selected according to the user input to receive the original data in the parallel format or the serial format;
  • S18 Send the deserialized data to the on-chip buffer 24 according to the second timing signal.
  • the deserialized data includes single frame deserialized data
  • the second timing signal includes a single frame timing signal
  • step S18 includes the following steps:
  • S182 Send the single frame deserialized data to the on-chip buffer 24 according to the single frame timing signal.
  • S13 Read and output single frame deserialized data from the on-chip buffer 24 according to the single frame timing signal.
  • Step S13 includes the following steps:
  • the single frame deserialized data is output to the image signal processor according to the single frame timing signal;
  • the single frame deserialized data is output to the bus interface 26 according to the single frame timing signal.
  • the single frame deserialized data is converted to a single frame output data conforming to the bus protocol standard via the bus interface 26 and stored in the main memory via the bus 80a.
  • the image pre-processing apparatus 10 of the embodiment of the present invention includes a first processing module 12, a second processing module 14, a third processing module 16, and a fourth processing module 18.
  • the first processing module 12 is configured to select the input interface 22 according to a user input to receive the raw data in the parallel format or the serial format.
  • the second processing module 14 is configured to: when the received raw data is in a parallel format, acquire the first timing signal from the original data and send the original data to the on-chip buffer 24 according to the first timing signal.
  • the third processing module 16 is configured to, when the received raw data is in a serial format, convert the raw data into the deserialized data in the parallel format and perform synchronization processing on the deserialized data to obtain the second timing signal.
  • the fourth processing module 18 is configured to send the deserialized data to the on-chip buffer 24 according to the second timing signal.
  • the fourth processing module 18 includes a first transmitting sub-module 182.
  • the first transmitting sub-module 182 is configured to send the single-frame deserialized data to the on-chip buffer 24 according to the single-frame timing signal.
  • the image pre-processing apparatus 10 of the embodiment of the present invention includes a second output module 13.
  • the second output module 13 is configured to read and output the single frame deserialized data from the on-chip buffer 24 according to the single frame timing signal.
  • the second output module 13 includes a second receiving submodule, a fourth output submodule, a fifth output submodule, and a sixth output submodule.
  • the second receiving submodule is configured to receive a through signal.
  • the fourth output submodule is configured to output the single frame deserialized data to the image signal processor according to the single frame timing signal when the through signal is enabled.
  • the fifth output sub-module is configured to output the single-frame deserialized data to the bus interface 26 according to the single-frame timing signal when the through signal is not enabled.
  • the sixth output sub-module is for converting the single-frame deserialized data to a single-frame output data conforming to the bus protocol standard through the bus interface 26 and storing it in the main memory via the bus 80a.
  • single frame deserialized data is output from image sensor interface 20 for further processing by the image signal processor.
  • Two output modes are provided for single-frame deserialization data, which makes the output of single-frame deserialized data more flexible.
  • Single frame deserialized data has only one frame, and frame buffering is not necessary. Therefore, in the through mode, the single frame deserialized data does not have to be sent to the main memory through the bus for frame buffering, but is directly sent to the image signal processor, thereby alleviating the bus bandwidth requirement.
  • the image pre-processing apparatus 10 of the embodiment of the present invention can be applied to a terminal.
  • the terminal includes a memory and a processor.
  • the memory stores computer readable instructions.
  • the processor is caused to perform the image preprocessing method described above.
  • the terminal further includes a bus, an image signal processor, a display controller, and an input device.
  • the user can determine the shooting mode through the input device to select the input interface 22.
  • the bus connects the image sensor interface 20, the memory, the processor, the image signal processor, the display controller, and the input device. In this way, information can be transferred between the various components of the terminal via the bus.
  • the terminal further includes an image sensor (not shown), and the image sensor inputs the collected image raw data to the image sensor interface 20.
  • the image sensor can also send the collected raw image data to the main memory (the main memory may be part of the memory or the memory itself), and the image sensor interface 20 can also read the raw image data from the main memory. No specific limitation is made here.
  • the image preprocessing method, the image preprocessing apparatus 10, and the image sensor interface 20 of the embodiments of the present invention may select to receive raw data in a serial format or a parallel format; when the raw data is in a parallel format that the image signal processor can directly process, it is cached directly, and when the raw data is in a serial format, it is converted into parallel-format deserialized data and then cached. In this way, an image signal processor (ISP) can process various kinds of data to enhance image sharpness.
  • the image pre-processing method of an embodiment of the present invention includes the steps of reading and outputting raw data from the on-chip buffer 24 in accordance with a first timing signal.
  • the image pre-processing apparatus 10 of an embodiment of the present invention includes a first output module.
  • the first output module is configured to read and output the original data from the on-chip buffer 24 according to the first timing signal.
  • raw data is output from image sensor interface 20 for further processing by the image signal processor.
  • the first output module can output the raw data directly to the image signal processor.
  • the first output module can output raw data to the main memory, and the image signal processor reads the raw data from the main memory and performs subsequent processing.
  • the specific situation is as follows.
  • the image sensor interface 20 includes a bus interface 26, and the step of reading and outputting raw data from the on-chip buffer 24 according to the first timing signal includes the steps of: receiving a through signal; when the through signal is enabled, outputting the raw data to the image signal processor according to the first timing signal; when the through signal is not enabled, outputting the raw data to the bus interface 26 according to the first timing signal; and converting the raw data via the bus interface 26 into parallel output data conforming to the bus protocol standard and storing it in the main memory via the bus.
  • the first output module includes a first receiving sub-module, a first output sub-module, a second output sub-module, and a third output sub-module. The first receiving submodule is configured to receive a through signal.
  • the first output sub-module is configured to output the raw data to the image signal processor according to the first timing signal when the through signal is enabled.
  • the second output sub-module is configured to output the raw data to the bus interface 26 according to the first timing signal when the through signal is not enabled.
  • the third output sub-module is for converting raw data to parallel output data conforming to the bus protocol standard via the bus interface 26 and storing the parallel output data to the main memory via the bus. In this way, two output modes are provided for the original data, making the output of the original data more flexible.
  • raw data in parallel format has only one frame and no frame buffering is necessary.
  • the raw data of the parallel format does not have to be sent to the main memory through the bus for frame buffering, but is sent directly to the image signal processor, thereby reducing the need for bus bandwidth.
  • the single frame deserialized data includes normal single frame deserialized data
  • the single frame timing signal includes a normal single frame timing signal.
  • the image preprocessing method of the embodiment of the present invention can be used for normal single frame deserialized data.
  • the single frame deserialized data includes infrared single frame deserialized data
  • the single frame timing signal includes an infrared single frame timing signal
  • Image sensor interface 20 includes an infrared processing unit 28.
  • the image preprocessing method includes the steps of: reading infrared correction data from a main memory according to an infrared single frame timing signal and sending the infrared correction data to the on-chip buffer 24; reading the infrared correction data from the on-chip buffer 24 according to the infrared single frame timing signal and transmitting the infrared correction data to the infrared processing unit 28; processing the infrared correction data by the infrared processing unit 28 to obtain processed infrared correction data; and outputting the processed infrared correction data.
  • image pre-processing device 10 includes an infrared acquisition module, an infrared transmission module, an infrared processing module, and an infrared output module.
  • the infrared acquisition module is configured to read infrared correction data from the main memory according to the infrared single frame timing signal and send the infrared correction data to the on-chip buffer 24 for storage.
  • the infrared transmitting module is configured to read the infrared correction data from the on-chip buffer 24 according to the infrared single frame timing signal and transmit the infrared correction data to the infrared processing unit 28.
  • the infrared processing module is configured to process the infrared correction data by the infrared processing unit 28 to obtain the processed infrared correction data.
  • the infrared output module is used to output the processed infrared correction data.
  • the infrared sensor requires correction data to correct the error of the sensor itself.
  • the infrared correction data is read from the main memory according to the infrared single frame timing signal and stored in the on-chip buffer 24, and the infrared correction data in the on-chip buffer 24 is read out according to the infrared single frame timing signal and sent to the infrared processing unit 28, so that the infrared processing unit 28 can process the infrared correction data.
  • the processed infrared correction data is output to the infrared sensor. In this way, the infrared sensor can correct the error of the infrared sensor itself by using the processed infrared correction data, thereby improving the image quality.
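The infrared correction flow above (main memory → on-chip buffer → infrared processing unit → corrected output) can be modeled with a short sketch. The per-pixel offset-subtraction model of "correction" is purely an assumption for illustration; the patent does not define the processing performed by the infrared processing unit.

```python
def correct_infrared(main_memory, ir_raw):
    """Stage correction data through the buffer and apply it (sketch)."""
    # Read infrared correction data from main memory into the on-chip buffer.
    on_chip_buffer = list(main_memory["ir_correction"])
    # Infrared processing unit: assumed here to pass the data through
    # unchanged as per-pixel offsets.
    processed = list(on_chip_buffer)
    # Use the processed correction data to cancel the sensor's own error.
    return [p - c for p, c in zip(ir_raw, processed)]
```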
  • the deserialized data includes two frames of deserialized data, and the two frames of deserialized data include first long frame deserialized data and first middle frame deserialized data.
  • the second timing signal includes a first long frame timing signal and a first middle frame timing signal.
  • Step S18 includes the steps of: transmitting the first long frame deserialized data and the first middle frame deserialized data to the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • the image preprocessing method of the embodiment of the present invention includes the steps of: reading and outputting the first long frame deserialized data and the first middle frame deserialized data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • the deserialized data includes two frames of deserialized data, and the two frames of deserialized data include first long frame deserialized data and first middle frame deserialized data.
  • the second timing signal includes a first long frame timing signal and a first middle frame timing signal.
  • the fourth processing module 18 includes a second transmitting sub-module. The second transmitting submodule is configured to send the first long frame deserialized data and the first middle frame deserialized data to the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • the image pre-processing apparatus 10 of the embodiment of the present invention includes a third output module.
  • the third output module is configured to read and output the first long frame deserialization data and the first middle frame deserialization data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • the image preprocessing method can be applied to two frames of deserialized data. Since the deserialized data has two frames, the first long frame deserialized data and the first middle frame deserialized data are written to and output from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • image sensor interface 20 includes a bus interface 26.
  • the step of reading and outputting the first long frame deserialized data and the first middle frame deserialized data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal respectively includes the following steps: outputting the first long frame deserialized data and the first middle frame deserialized data to the bus interface 26 according to the first long frame timing signal and the first middle frame timing signal, respectively; and converting the first long frame deserialized data and the first middle frame deserialized data through the bus interface 26 into first long frame output data and first middle frame output data conforming to the bus protocol standard and storing them in the main memory through the bus.
  • image sensor interface 20 includes a bus interface 26.
  • the third output module includes a seventh output sub-module and an eighth output sub-module.
  • the seventh output sub-module is configured to output the first long frame deserialized data and the first middle frame deserialized data to the bus interface 26 according to the first long frame timing signal and the first middle frame timing signal, respectively.
  • the eighth output submodule is configured to convert the first long frame deserialized data and the first middle frame deserialized data through the bus interface 26 into first long frame output data and first middle frame output data conforming to the bus protocol standard and store them in the main memory through the bus.
  • Since the first long frame deserialized data and the first middle frame deserialized data are not synchronized, they need to be converted through the bus interface 26 into first long frame output data and first middle frame output data conforming to the bus protocol standard and stored in the main memory through the bus for frame buffering, so that the image signal processor can read the first long frame output data and the first middle frame output data from the main memory and perform subsequent processing on them.
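The reasoning above — unsynchronized frames must be frame-buffered in main memory under their own timing signals before the ISP can consume them — can be sketched as follows. The event/dict data model and all names are assumptions for illustration.

```python
def buffer_two_frames(events, main_memory):
    """Frame-buffer unsynchronized long/middle frames in main memory.

    events: (timing_id, payload) pairs that may arrive in either order,
    reflecting that the two frames are not synchronized.
    """
    for timing_id, payload in events:
        # Bus interface: convert each frame into bus-protocol output data
        # and store it in main memory under its own timing signal.
        main_memory[timing_id] = {"proto": "bus", "data": payload}
    # The ISP later reads back both complete frames from main memory.
    return main_memory.get("long"), main_memory.get("middle")
```

Arrival order does not matter here, which is exactly why the detour through main memory is required instead of the through mode used for single frames.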
  • the deserialized data includes three frames of deserialized data, and the three frames of deserialized data includes second long frame deserialized data, second middle frame deserialized data, and short frame deserialized data.
  • the second timing signal includes a second long frame timing signal, a second middle frame timing signal, and a short frame timing signal.
  • Step S18 includes the steps of: transmitting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the on-chip buffer 24 for storage according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
  • the image preprocessing method includes the steps of: reading and outputting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data from the on-chip buffer 24 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
  • the deserialized data includes three frames of deserialized data, and the three frames of deserialized data include second long frame deserialized data, second middle frame deserialized data, and short frame deserialized data.
  • the second timing signal includes a second long frame timing signal, a second middle frame timing signal, and a short frame timing signal.
  • the fourth processing module 18 includes a third transmitting sub-module.
  • the third sending submodule is configured to send the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the on-chip buffer 24 for storage according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
  • the image pre-processing apparatus 10 of the embodiment of the present invention includes a fourth output module.
  • the fourth output module is configured to read and output the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data from the on-chip buffer 24 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
  • the image preprocessing method according to the embodiment of the present invention is applicable to three frames of deserialized data.
  • the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal are used, respectively, to write the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the on-chip buffer 24 and to output them from the on-chip buffer 24.
  • image sensor interface 20 includes a bus interface 26. The step of reading and outputting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data from the on-chip buffer 24 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal respectively includes the following steps: outputting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the bus interface 26 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively; and converting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data through the bus interface 26 into second long frame output data, second middle frame output data, and short frame output data conforming to the bus protocol standard and storing them in the main memory via the bus.
  • image sensor interface 20 includes a bus interface 26.
  • the fourth output module includes a ninth output sub-module and a tenth output sub-module.
  • the ninth output sub-module is configured to output the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the bus interface 26 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
  • the tenth output sub-module is configured to convert, through the bus interface 26, the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data into second long frame output data, second middle frame output data, and short frame output data conforming to the bus protocol standard, and to store them in the main memory via the bus.
  • since the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data are not synchronized, they need to be converted through the bus interface 26 into second long frame output data, second middle frame output data, and short frame output data conforming to the bus protocol standard and stored in the main memory via the bus for frame buffering, so that the image signal processor can read the second long frame output data, the second middle frame output data, and the short frame output data from the main memory and perform subsequent processing on them.
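The buffering flow just described can be sketched in miniature. The following Python model is illustrative only: the `bus_write` helper, the frame names, and the `main_memory` dict stand in for the bus interface 26, the three deserialized frames, and the main memory, none of which are specified at this level of detail in the text.

```python
# Minimal model of buffering three unsynchronized deserialized frames:
# each frame arrives under its own timing signal, is converted to the bus
# format, and is stored in main memory so the ISP can read it back later.

def bus_write(main_memory, frame_name, deserialized_data):
    """Convert deserialized data to 'bus format' and store it (sketch)."""
    # A real bus interface would reformat to the bus protocol standard;
    # here we just tag the payload to stand in for that conversion.
    main_memory[frame_name] = {"bus_format": True, "payload": deserialized_data}

main_memory = {}
# The three frames are not synchronized, so each is written independently
# when its own timing signal indicates the data is valid.
for frame_name, data in [("long", [9, 9]), ("middle", [5, 5]), ("short", [1, 1])]:
    bus_write(main_memory, frame_name, data)

# Later, the image signal processor reads all three back for fusion.
frames_for_isp = {name: main_memory[name]["payload"]
                  for name in ("long", "middle", "short")}
```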
  • the division of the units in the image pre-processing apparatus 10 described above is for illustrative purposes only. In other embodiments, the image pre-processing apparatus 10 may be divided into different units as needed to perform all or part of the functions of the image pre-processing apparatus 10 described above.
  • the image sensor interface 20 of an embodiment of the present invention includes an input interface 22, a serial data processor 21, an on-chip buffer 24, and a cache controller 23.
  • the input interface 22 includes a parallel input interface (not shown) for receiving raw data in a parallel format and a serial input interface (not shown) for receiving raw data in a serial format; the raw data in the parallel format includes the first timing signal.
  • the serial data processor 21 is configured to convert the raw data in the serial format into the deserialized data in the parallel format and de-synchronize the deserialized data to obtain a second timing signal.
  • the on-chip buffer 24 is configured to store the raw data in the parallel format, which is sent to it according to the first timing signal.
  • the serial data processor 21 is configured to send the deserialized data to the on-chip buffer 24 according to the second timing signal.
  • the cache controller 23 is used to control data input, read and write, and output of the on-chip buffer 24.
  • in addition, in pass-through mode, the timing signal is output from the serial data processor 21 to the image signal processor, and the image data is read from the on-chip buffer 24 and output by the cache controller 23.
  • image sensor interface 20 includes a bus interface 26.
  • the bus interface 26 is used to convert the output data into a format conforming to the bus protocol standard, which is then sent to the main memory via the bus.
  • image sensor interface 20 includes a bus interface 26 and an infrared processing unit 28.
  • the bus interface 26 is used to read, from the main memory through the bus, the infrared correction data for correcting the error of the infrared sensor itself, and to send it to the infrared processing unit 28.
  • Infrared processing unit 28 is operative to receive, process, and transmit infrared correction data.
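The text does not specify what the infrared correction data contains or how it is applied; a common scheme for correcting an infrared sensor's own error is two-point (per-pixel gain and offset) non-uniformity correction. The sketch below assumes that scheme, and the `gains` and `offsets` tables are hypothetical stand-ins for the correction data read from main memory.

```python
# Hypothetical two-point non-uniformity correction for an infrared sensor.
# The per-pixel gain/offset tables stand in for the "infrared correction
# data"; the actual format is not specified in the text.

def correct_ir(pixels, gains, offsets):
    """Apply per-pixel gain and offset correction to raw IR pixel values."""
    return [g * p + o for p, g, o in zip(pixels, gains, offsets)]

raw = [100, 102, 98]
gains = [1.0, 0.5, 1.0]       # assumed correction data
offsets = [0.0, 50.0, 2.0]
corrected = correct_ir(raw, gains, offsets)  # -> [100.0, 101.0, 100.0]
```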
  • an image processing method according to an embodiment of the present invention is applied to an image processing apparatus 30.
  • the image processing method of the embodiment of the present invention includes the following steps:
  • S32: acquire data processed by the image preprocessing method to obtain data to be processed;
  • S34: process the data to be processed to obtain data to be branched;
  • S36: divide the data to be branched into luminance path data and chrominance path data;
  • S38: process the luminance path data to obtain luminance data; and
  • S39: process the chrominance path data to obtain chrominance data.
  • step S38 includes the following step:
  • S382: perform interpolation processing on the luminance path data to obtain the luminance data.
  • step S39 includes the following step:
  • S392: perform interpolation processing on the chrominance path data to obtain the chrominance data.
  • the image processing method further includes the following steps:
  • S33: output the luminance data; and
  • S35: output the chrominance data.
  • an image processing apparatus 30 includes an acquisition unit 32, a first processing unit 34, a branching unit 36, a second processing unit 38, and a third processing unit 39.
  • the obtaining unit 32 is configured to acquire data processed by the image preprocessing method to obtain data to be processed.
  • the first processing unit 34 is configured to process the data to be processed to obtain data to be shunted.
  • the branching unit 36 is configured to divide the data to be shunt into luminance path data and chrominance path data.
  • the second processing unit 38 is operative to process the luminance path data to obtain luminance data.
  • the third processing unit 39 is configured to process the chrominance path data to obtain chrominance data.
  • the second processing unit 38 includes a luminance interpolation sub-unit 382.
  • the luminance interpolation sub-unit 382 is for performing interpolation processing on the luminance path data.
  • the image processing apparatus 30 of the embodiment of the present invention includes a luminance output unit.
  • the luminance output unit is configured to output the luminance data after the interpolation processing.
  • the luminance path data is interpolated, that is, the luminance path data is demosaiced, so that the luminance path data of the original domain is converted into the luminance data of the RGB domain, thereby being displayed.
  • the chroma interpolation sub-unit 392 is used to perform interpolation processing on the chroma path data.
  • the image processing apparatus 30 of the embodiment of the present invention includes a chromaticity output unit 35.
  • the chrominance output unit 35 is for outputting the chrominance data after the interpolation processing.
  • the chrominance path data is interpolated, that is, the chrominance path data is demosaiced, so that the chrominance path data of the original domain is converted into the chrominance data of the RGB domain, thereby being displayed.
  • the image processing apparatus 30 of the embodiment of the present invention can be applied to a terminal that includes a memory and a processor.
  • the memory stores computer-readable instructions.
  • when the instructions are executed by the processor, the processor is caused to perform the image processing method of any of the embodiments.
  • the terminal further includes a bus, an image signal processor, a display controller, and an input device.
  • the bus connects the image sensor interface, memory, processor, image signal processor, display controller, and input device. In this way, information can be transferred between the various functional components of the computer via the bus.
  • the image signal processor acquires image data that is output directly from the image sensor interface 20b to the image signal processor to obtain data to be processed.
  • the image signal processor reads, from the main memory via the bus (the main memory may be the memory or a part of the memory), the image data processed by the image preprocessing method and stored in the main memory, to obtain the data to be processed.
  • the image processing method, the image processing apparatus 30, the computer readable storage medium, and the terminal of the embodiment of the present invention respectively process the luminance path data and the chrominance path data through the two paths of the luminance path and the chrominance path, so that the image processing is more flexible.
  • the luminance and chrominance paths can pass through different algorithm processing modules as needed, so as to increase the rate of data processing while reducing latency and storage resources.
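The independent luminance/chrominance path mechanism can be sketched as two pipelines fed from the same branched data. The stage functions below are placeholders for the real sub-units (interpolation, tone mapping, and so on); only the structure is taken from the text.

```python
# Sketch of the luminance/chrominance split: the same branched data feeds
# two independent pipelines, each of which may apply different stages.

def run_pipeline(data, stages):
    for stage in stages:
        data = stage(data)
    return data

to_branch = [10, 20, 30]
# Placeholder stages; real pipelines would be interpolation, tone mapping, etc.
luma_stages = [lambda d: [x + 1 for x in d]]      # "luminance path"
chroma_stages = [lambda d: [x * 2 for x in d]]    # "chrominance path"

luminance_data = run_pipeline(list(to_branch), luma_stages)      # [11, 21, 31]
chrominance_data = run_pipeline(list(to_branch), chroma_stages)  # [20, 40, 60]
```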
  • when the data to be processed includes the first long frame output data and the first middle frame output data, step S34 includes the following step: performing wide dynamic fusion on the first long frame output data and the first middle frame output data to obtain the data to be branched.
  • the first processing unit 34 includes a first fusion sub-unit 342 when the data to be processed includes the first long frame output data and the first medium frame output data.
  • the first fusion sub-unit 342 is configured to perform wide dynamic fusion of the first long frame output data and the first medium frame output data to obtain data to be branched.
  • the first long frame output data and the first medium frame output data are widely and dynamically fused, so that the two frames of data are fused into one frame, so that the image has a wider dynamic range, thereby satisfying the display requirements of images of different scenes.
  • when the data to be processed includes the second long frame output data, the second middle frame output data, and the short frame output data, step S34 includes the following step: performing wide dynamic fusion on the second long frame output data, the second middle frame output data, and the short frame output data to obtain the data to be branched.
  • the first processing unit 34 includes a second fusion sub-unit 344 when the data to be processed includes second long frame output data, second medium frame output data, and short frame output data.
  • the second fusion sub-unit 344 is configured to perform wide dynamic fusion on the second long frame output data, the second intermediate frame output data, and the short frame output data to obtain the to-be-divided data.
  • the second long frame output data, the second middle frame output data, and the short frame output data are fused with wide dynamics, so that the three frames of data are fused into one frame, giving the image a wider dynamic range and thereby satisfying the display requirements of images of different scenes.
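The text does not give the fusion arithmetic; one common way to realize wide dynamic fusion is a per-pixel weighted blend in which each exposure contributes according to how well-exposed its sample is. A minimal sketch under that assumption:

```python
# Hypothetical wide-dynamic-range fusion of long/middle/short exposures:
# each output pixel is a weighted average, weighting well-exposed values
# (inside the usable range) more heavily than clipped or dark ones.

def well_exposedness(v, lo=16, hi=240):
    """Weight is 1 inside the usable range, small near clipping."""
    return 0.05 if v <= lo or v >= hi else 1.0

def fuse(long_f, middle_f, short_f):
    fused = []
    for l, m, s in zip(long_f, middle_f, short_f):
        ws = [well_exposedness(v) for v in (l, m, s)]
        fused.append(sum(w * v for w, v in zip(ws, (l, m, s))) / sum(ws))
    return fused

# The long frame clips in the highlight; the short frame preserves it.
result = fuse([255, 120], [200, 118], [90, 30])
```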
  • step S34 further includes the following steps: performing 3A statistics on the data to be processed.
  • the first processing unit 34 includes a 3A statistics sub-unit 346.
  • the 3A statistics sub-unit 346 is configured to perform 3A statistics on the data to be processed. Statistics are performed on the data to be processed to obtain statistical data related to auto exposure, auto white balance, and auto focus. Based on the statistical data, software automatically controls the exposure and focus of the image sensor and the gain of the white balance module.
  • step S34 further includes the step of performing white balance correction.
  • the first processing unit 34 includes a white balance correction sub-unit 348.
  • the white balance correction sub-unit 348 is used to perform white balance correction. Performing white balance correction enables subsequent units to correctly restore the color of the image.
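As an illustration of what a white balance correction step can do, the sketch below uses the gray-world assumption (scale R and B so that their means match the green mean). This particular algorithm is an assumption; the text only states that white balance correction is performed.

```python
# Gray-world white balance: compute per-channel gains from the channel
# means, then scale every pixel so the channel means become equal.

def gray_world_gains(pixels):
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    return g / r, 1.0, g / b

def apply_wb(pixels, gains):
    gr, gg, gb = gains
    return [(p[0] * gr, p[1] * gg, p[2] * gb) for p in pixels]

pixels = [(50, 100, 200), (150, 100, 200)]
balanced = apply_wb(pixels, gray_world_gains(pixels))
# After correction the R, G, and B channel means are equal.
```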
  • step S34 further includes the step of performing a dead pixel correction.
  • the first processing unit 34 includes a dead point correction sub-unit 341.
  • the dead pixel correction sub-unit 341 is used to perform dead pixel correction. Since the image sensor has many components and is prone to defective pixels, performing dead pixel correction can eliminate the dead pixels and reduce the influence of the dead pixels on subsequent processing.
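A minimal sketch of dead pixel correction on a single line of pixels, assuming a simple detect-and-replace rule (a pixel far from the median of its neighbours is replaced by that median); the threshold and window size are illustrative choices, not taken from the text.

```python
import statistics

# Detect-and-replace dead pixel correction on a 1-D line of pixels: a
# pixel that deviates strongly from its neighbours is treated as a
# defective (stuck) pixel and replaced by the neighbours' median.

def correct_dead_pixels(line, threshold=100):
    out = list(line)
    for i in range(1, len(line) - 1):
        neighbours = [line[i - 1], line[i + 1]]
        ref = statistics.median(neighbours)
        if abs(line[i] - ref) > threshold:
            out[i] = ref
    return out

fixed = correct_dead_pixels([100, 102, 255, 101, 99])  # 255 is a stuck pixel
```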
  • step S34 further includes the step of performing lens vignetting compensation.
  • the first processing unit 34 includes a lens vignetting compensation sub-unit 343.
  • the lens vignetting compensation sub-unit 343 is used to perform lens vignetting compensation.
  • when the imaging distance is long, the light beam that can pass through the camera lens gradually decreases as the field of view increases, making the center of the obtained image brighter and the edges darker, so that the image brightness is uneven.
  • performing lens vignetting compensation can eliminate this adverse effect and improve the accuracy of subsequent processing.
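Lens vignetting compensation is typically realized as a radial gain that increases toward the image corners. The quadratic falloff model and the coefficient below are illustrative assumptions:

```python
# Radial-gain vignetting compensation: gain grows with squared distance
# from the optical centre, brightening the darker edges of the image.

def vignetting_gain(x, y, cx, cy, k=0.5):
    """Gain = 1 + k * (normalised squared radial distance)."""
    r2 = (x - cx) ** 2 + (y - cy) ** 2
    r2_max = cx ** 2 + cy ** 2          # squared distance to the corner
    return 1.0 + k * (r2 / r2_max)

def compensate(pixel, x, y, cx, cy):
    return pixel * vignetting_gain(x, y, cx, cy)

# The centre pixel is untouched; a corner pixel is boosted.
center = compensate(100, 8, 6, 8, 6)   # 100.0
corner = compensate(100, 0, 0, 8, 6)   # 150.0
```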
  • step S34 further includes the step of performing a transparency calculation to obtain transparency and transmitting the transparency to the main memory storage.
  • the first processing unit 34 includes a transparency calculation sub-unit 345.
  • Transparency calculation sub-unit 345 is used to perform transparency calculations and send transparency to main memory storage. The transparency stored in the main memory can be read and utilized by subsequent defogging subunits.
  • step S34 further includes the steps of performing 3D noise reduction and outputting noise reduction results and motion errors.
  • the first processing unit 34 includes a 3D noise reduction sub-unit 347.
  • the 3D noise reduction sub-unit 347 is used to perform 3D noise reduction and output noise reduction results and motion errors.
  • the 3D noise reduction sub-unit 347 receives the reference motion error and the input of the reference image frame, and performs 3D noise reduction based on the reference motion error and the reference frame image.
  • the noise reduction results and motion errors can be sent to the main memory for storage or directly to subsequent subunits that need to be used.
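A minimal sketch of motion-aware temporal (3D) noise reduction: the current frame is blended with the reference frame, trusting the reference less where the motion error is large. The alpha mapping below is an illustrative choice, not the text's actual filter.

```python
# Temporal (3D) noise reduction: blend the current frame with a reference
# frame, with a blend factor driven by the per-pixel motion error.

def temporal_nr(current, reference, motion_error, max_error=255):
    out = []
    for c, r, e in zip(current, reference, motion_error):
        # Static pixel (e == 0): average strongly with the reference.
        # Moving pixel (large e): keep mostly the current frame.
        alpha = 1.0 - 0.5 * (1.0 - min(e, max_error) / max_error)
        out.append(alpha * c + (1.0 - alpha) * r)
    return out

denoised = temporal_nr([110, 200], [100, 100], [0, 255])  # [105.0, 200.0]
```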
  • step S34 further includes the step of performing on-screen menu adjustment.
  • the first processing unit 34 includes a screen menu type adjustment sub-unit 349.
  • the on-screen menu type adjustment sub-unit 349 is used to perform on-screen menu adjustment.
  • the screen menu type adjustment sub-unit 349 receives the noise reduction result and the motion error output by the 3D noise reduction sub-unit 347.
  • when the on-screen menu adjustment display is enabled, the motion error is mapped to a certain color and superimposed on the image.
  • when the on-screen menu adjustment is not enabled, the input is simply delayed and then output. In this way, the motion in the image can be known conveniently and intuitively, so as to perform adaptive noise reduction and enhancement for images of various scenes.
  • step S38 includes the step of applying tone mapping to the luminance path data.
  • the second processing unit 38 includes a luminance tone mapping sub-unit 384.
  • Luma tone mapping sub-unit 384 is used to apply tone mapping to the luminance path data. Applying tone mapping to the luminance path data can present a suitable dynamic range on the display device, which can better show the contrast and detail of the image to some extent.
  • tone mapping is applied only to the luminance path data, and no tone mapping is applied to the chrominance path data to solve the problem that tone mapping may cause color cast and distortion.
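The luminance-only arrangement can be sketched as follows, using a simple Reinhard-style compression curve as a stand-in for whatever tone mapping operator is actually used; the chrominance values pass through unchanged, which is the point of restricting the mapping to the luminance path.

```python
# Tone mapping applied only to the luminance path: a Reinhard-style
# y / (1 + y) curve compresses luminance, while chrominance is passed
# through untouched so no colour cast is introduced.

def tone_map_luma(y_values, white=255.0):
    # Normalise, compress with t / (1 + t), rescale so white maps to white.
    return [white * (y / white) / (1.0 + y / white) * 2.0 for y in y_values]

luma_out = tone_map_luma([255.0, 64.0])
chroma_out = [128.0, 130.0]   # chrominance path: no tone mapping applied
```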
  • step S38 includes the step of performing a color matrix correction.
  • the second processing unit 38 includes a luminance color matrix correction sub-unit 386.
  • the luminance color matrix correction sub-unit 386 is used to perform color matrix correction.
  • step S38 includes the step of performing a dehazing process based on the transparency.
  • the second processing unit 38 includes a defogging sub-unit 388. The defogging sub-unit 388 is used to perform dehazing processing according to the transparency.
  • step S38 includes the step of performing a color space conversion.
  • the second processing unit 38 includes a luminance color space conversion sub-unit 381.
  • the luminance color space conversion sub-unit 381 is used to perform color space conversion.
  • the luminance interpolation sub-unit 382 converts the luminance path data from the data of the original domain into the data of the RGB domain, and the luminance color space conversion sub-unit 381 converts the data of the RGB domain into the data of the YCbCr domain.
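The RGB-to-YCbCr step can be illustrated with the standard BT.601 full-range conversion (a common choice; the text does not name the exact matrix):

```python
# BT.601 full-range RGB -> YCbCr conversion, the kind of colour space
# conversion performed after demosaicing.

def rgb_to_ycbcr(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

# A neutral grey keeps Cb = Cr = 128 (no chroma component).
y, cb, cr = rgb_to_ycbcr(128, 128, 128)
```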
  • step S38 includes the step of performing 2D noise reduction.
  • the second processing unit 38 includes a luminance 2D noise reduction sub-unit 383.
  • the brightness 2D noise reduction sub-unit 383 is used for 2D noise reduction to further remove noise on the image brightness and improve the quality of the image.
  • step S38 includes the step of performing a sharpening process.
  • the second processing unit 38 includes a sharpening subunit 385.
  • the sharpening subunit 385 is used to perform a sharpening process.
  • the sharpening sub-unit 385 also receives a motion error input; this motion error is shared with that of the 3D noise reduction sub-unit, so no additional computing resources are required. Sharpening can thus be adapted to the motion in the image: for example, a statically stable image can use a larger sharpening gain and a moving image a smaller one, which better meets the visual needs of the human eye.
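A minimal sketch of motion-adaptive sharpening: an unsharp mask whose gain is scaled down where the shared motion error is large, so static regions receive stronger sharpening. The gain mapping is an illustrative assumption.

```python
# Motion-adaptive unsharp masking on a 1-D line of pixels: the detail
# (pixel minus local blur) is amplified by a gain that falls to zero as
# the per-pixel motion error approaches its maximum.

def sharpen(pixels, motion_error, base_gain=1.0, max_error=255.0):
    out = []
    for i in range(1, len(pixels) - 1):
        blur = (pixels[i - 1] + pixels[i] + pixels[i + 1]) / 3.0
        detail = pixels[i] - blur
        gain = base_gain * (1.0 - min(motion_error[i], max_error) / max_error)
        out.append(pixels[i] + gain * detail)
    return out

static = sharpen([100, 130, 100], [0, 0, 0])        # full gain on static edge
moving = sharpen([100, 130, 100], [255, 255, 255])  # gain suppressed to zero
```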
  • step S39 includes the following steps: performing color matrix correction.
  • the third processing unit 39 includes a chrominance color matrix correction sub-unit 394.
  • the chrominance color matrix correction sub-unit 394 is used to perform color matrix correction. Performing color matrix correction on the image allows its color to be adjusted more finely.
  • step S39 includes the step of performing a chroma color space conversion.
  • the third processing unit 39 includes a chrominance color space conversion sub-unit 396.
  • Chroma color space conversion sub-unit 396 is used to perform chroma color space conversion.
  • the chroma interpolation sub-unit 392 converts the chroma path data from the data of the original domain into the data of the RGB domain, and the chroma color space conversion sub-unit 396 converts the data of the RGB domain into the data of the YCbCr domain.
  • step S39 includes the step of performing a chromaticity correction.
  • the third processing unit 39 includes a chrominance correction sub-unit 398. Chroma correction sub-unit 398 is used to correct the chrominance.
  • step S39 includes the step of performing 2D noise reduction.
  • the third processing unit 39 includes a chrominance 2D noise reduction sub-unit 391.
  • the chrominance 2D noise reduction sub-unit 391 is used to perform 2D noise reduction, and further removes noise on the image chrominance.
  • the image preprocessing method, the image preprocessing apparatus 10, the image sensor interface 20, the image processing method, and the image processing apparatus 30 of the embodiments of the present invention have the following beneficial effects:
  • first, the image signal processor can support the input data formats of various image sensors (such as the raw data format of parallel input, the raw data format of serial multi-frame input, and the input format of an infrared sensor);
  • second, the image signal processor can support complex algorithm processing (such as wide dynamic processing of multi-frame input and image defogging) and better adaptive processing for image noise reduction and enhancement;
  • third, in the chrominance and luminance processing domains, a more flexible data processing path mechanism is provided in which the chrominance path and the luminance path are independent; and
  • fourth, the system bandwidth, power consumption, and storage resources can be flexibly configured according to different input modes and configuration modes, so that the chip (System-on-a-Chip, SoC) works in an optimal state.
  • each unit in the image processing apparatus 30 described above is for illustrative purposes only. In other embodiments, the image processing apparatus 30 may be divided into different units as needed to perform all or part of the functions of the image processing apparatus 30 described above.
  • the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
  • features defined by "first" or "second" may explicitly or implicitly include at least one of those features.
  • the meaning of "a plurality” is at least two, such as two, three, etc., unless specifically defined otherwise.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system.
  • for example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques well known in the art may be used: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGAs), field-programmable gate arrays (FPGAs), and the like.


Abstract

An image preprocessing method, an image preprocessing apparatus, an image sensor interface, an image processing method, and an image processing apparatus. The image preprocessing method includes: selecting, according to user input, the input interface to receive raw data in a parallel format or a serial format (S12); when the received raw data is in the parallel format, acquiring a first timing signal from the raw data and sending the raw data to the on-chip buffer for storage according to the first timing signal (S14); when the received raw data is in the serial format, converting the raw data into deserialized data in a parallel format and de-synchronizing the deserialized data to obtain a second timing signal (S16); and sending the deserialized data to the on-chip buffer for storage according to the second timing signal (S18).

Description

Image preprocessing method and apparatus, image sensor interface, image processing method and apparatus
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image preprocessing method and apparatus, an image sensor interface, and an image processing method and apparatus.
Background Art
A related-art image signal processor (ISP) can generally only receive and process raw data in a serial or parallel format. However, with the development of imaging technology, current image sensors or additionally added devices can already acquire more data for assisting in improving image sharpness, such as LVDS multi-frame data or infrared images. Therefore, the ISP also needs to be improved correspondingly to meet the need to improve image sharpness.
Summary of the Invention
Embodiments of the present invention provide an image preprocessing method and apparatus, an image sensor interface, and an image processing method and apparatus.
The image preprocessing method of the embodiments of the present invention is used for an image sensor interface, the image sensor interface including an input interface and an on-chip buffer. The image preprocessing method includes the following steps:
selecting, according to user input, the input interface to receive raw data in a parallel format or a serial format;
when the received raw data is in the parallel format, acquiring a first timing signal from the raw data and sending the raw data to the on-chip buffer for storage according to the first timing signal;
when the received raw data is in the serial format, converting the raw data into deserialized data in a parallel format and de-synchronizing the deserialized data to obtain a second timing signal; and
sending the deserialized data to the on-chip buffer for storage according to the second timing signal.
The deserialized data includes single-frame deserialized data, and the second timing signal includes a single-frame timing signal. The step of sending the deserialized data to the on-chip buffer for storage according to the second timing signal includes the following step:
sending the single-frame deserialized data to the on-chip buffer for storage according to the single-frame timing signal.
The image preprocessing method includes:
reading and outputting the single-frame deserialized data from the on-chip buffer according to the single-frame timing signal.
The step of reading and outputting the single-frame deserialized data from the on-chip buffer according to the single-frame timing signal includes the following steps:
receiving a pass-through signal;
when the pass-through signal is enabled, outputting the single-frame deserialized data to the image signal processor according to the single-frame timing signal;
when the pass-through signal is not enabled, outputting the single-frame deserialized data to the bus interface according to the single-frame timing signal; and
converting, through the bus interface, the single-frame deserialized data into single-frame output data conforming to the bus protocol standard and storing it in the main memory through the bus.
The image preprocessing apparatus of the embodiments of the present invention includes a first processing module, a second processing module, a third processing module, and a fourth processing module. The first processing module is configured to select, according to user input, the input interface to receive raw data in a parallel format or a serial format. The second processing module is configured to, when the received raw data is in the parallel format, acquire a first timing signal from the raw data and send the raw data to the on-chip buffer for storage according to the first timing signal. The third processing module is configured to, when the received raw data is in the serial format, convert the raw data into deserialized data in a parallel format and de-synchronize the deserialized data to obtain a second timing signal. The fourth processing module is configured to send the deserialized data to the on-chip buffer for storage according to the second timing signal. The fourth processing module includes a first sending sub-module. The first sending sub-module is configured to send the single-frame deserialized data to the on-chip buffer for storage according to the single-frame timing signal. The image preprocessing apparatus includes a second output module. The second output module is configured to read and output the single-frame deserialized data from the on-chip buffer according to the single-frame timing signal. The second output module includes a second receiving sub-module, a fourth output sub-module, a fifth output sub-module, and a sixth output sub-module. The second receiving sub-module is configured to receive a pass-through signal. The fourth output sub-module is configured to, when the pass-through signal is enabled, output the single-frame deserialized data to the image signal processor according to the single-frame timing signal. The fifth output sub-module is configured to, when the pass-through signal is not enabled, output the single-frame deserialized data to the bus interface according to the single-frame timing signal. The sixth output sub-module is configured to convert, through the bus interface, the single-frame deserialized data into single-frame output data conforming to the bus protocol standard and store it in the main memory through the bus.
The image processing method of the embodiments of the present invention includes the following steps:
acquiring data processed by the image preprocessing method to obtain data to be processed;
processing the data to be processed to obtain data to be branched;
dividing the data to be branched into luminance path data and chrominance path data;
processing the luminance path data to obtain luminance data; and
processing the chrominance path data to obtain chrominance data.
The step of processing the luminance path data to obtain luminance data includes the following step:
performing interpolation processing on the luminance path data to obtain the luminance data.
The step of processing the chrominance path data to obtain chrominance data includes the following step:
performing interpolation processing on the chrominance path data to obtain the chrominance data.
The image processing method includes the following steps:
outputting the luminance data; and
outputting the chrominance data.
The image processing apparatus of the embodiments of the present invention includes an acquisition unit, a first processing unit, a branching unit, a second processing unit, and a third processing unit. The acquisition unit is configured to acquire data processed by the image preprocessing method of any of the above embodiments to obtain data to be processed. The first processing unit is configured to process the data to be processed to obtain data to be branched. The branching unit is configured to divide the data to be branched into luminance path data and chrominance path data. The second processing unit is configured to process the luminance path data to obtain luminance data. The third processing unit is configured to process the chrominance path data to obtain chrominance data. The second processing unit includes a luminance interpolation sub-unit. The luminance interpolation sub-unit is configured to perform interpolation processing on the luminance path data. The third processing unit includes a chrominance interpolation sub-unit. The chrominance interpolation sub-unit is configured to perform interpolation processing on the chrominance path data. The image processing apparatus includes a luminance output unit and a chrominance output unit. The luminance output unit is configured to output the luminance data after the interpolation processing. The chrominance output unit is configured to output the chrominance data after the interpolation processing.
With the image preprocessing method, image preprocessing apparatus, image processing method, and image processing apparatus of the embodiments of the present invention, raw data in a serial format or a parallel format can be selected for reception; when the raw data is in a parallel format that the image signal processor can process directly, it is buffered directly, and when the raw data is in a serial format, it is converted into deserialized data in a parallel format and then buffered. In this way, the image signal processor can process various kinds of data, thereby improving image sharpness.
In summary, the image preprocessing method, image preprocessing apparatus, image sensor interface, image processing method, and image processing apparatus of the embodiments of the present invention have the following beneficial effects. First, the image signal processor can support the input data formats of various image sensors (such as the raw data format of parallel input, the raw data format of serial multi-frame input, and the input format of an infrared sensor). Second, the image signal processor can support complex algorithm processing (such as wide dynamic processing of multi-frame input and image defogging) and better adaptive processing for image noise reduction and enhancement. Third, in the chrominance and luminance processing domains, a more flexible data processing path mechanism is provided in which the chrominance path and the luminance path are independent. Fourth, the system bandwidth, power consumption, and storage resources (including the on-chip static random access memory and the main memory) can be flexibly configured according to different input modes and configuration modes, so that the chip works in an optimal state.
Additional aspects and advantages of the present invention will be given in part in the following description, will become apparent in part from the following description, or will be learned through practice of the present invention.
Brief Description of the Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and easily understood from the description of the embodiments in conjunction with the following drawings, in which:
FIG. 1 is a schematic flowchart of an image preprocessing method according to an embodiment of the present invention;
FIG. 2 is a schematic block diagram of an image preprocessing apparatus according to an embodiment of the present invention;
FIG. 3 is a schematic block diagram of an image sensor interface according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 5 is a schematic block diagram of an image processing apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a first processing unit of the image processing apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic block diagram of a second processing unit of the image processing apparatus according to an embodiment of the present invention;
FIG. 8 is a schematic block diagram of a third processing unit of the image processing apparatus according to an embodiment of the present invention.
Description of main element symbols:
image preprocessing apparatus 10, first processing module 12, second processing module 14, third processing module 16, fourth processing module 18, image sensor interface 20, input interface 22, on-chip buffer 24, bus interface 26, infrared processing unit 28, serial data processor 21, cache controller 23, image processing apparatus 30, acquisition unit 32, first processing unit 34, first fusion sub-unit 342, second fusion sub-unit 344, 3A statistics sub-unit 346, white balance correction sub-unit 348, dead pixel correction sub-unit 341, lens vignetting compensation sub-unit 343, transparency calculation sub-unit 345, branching unit 36, second processing unit 38, luminance interpolation sub-unit 382, luminance tone mapping sub-unit 384, luminance color matrix correction sub-unit 386, defogging sub-unit 388, luminance color space conversion sub-unit 381, luminance 2D noise reduction sub-unit 383, sharpening sub-unit 385, third processing unit 39, chrominance interpolation sub-unit 392, chrominance color matrix correction sub-unit 394.
Detailed Description of the Embodiments
The embodiments of the present invention are described in detail below. Examples of the embodiments are shown in the accompanying drawings, in which the same or similar reference numerals throughout denote the same or similar elements or elements having the same or similar functions. The embodiments described below with reference to the drawings are exemplary and are only used to explain the present invention, and should not be construed as limiting the present invention.
Referring to FIG. 1, FIG. 2, and FIG. 3 together, the image preprocessing method of the embodiments of the present invention is used for the image preprocessing apparatus 10 and the image sensor interface 20; the image sensor interface 20 includes an input interface 22 and an on-chip buffer 24. The image preprocessing method of the embodiments of the present invention includes the following steps:
S12: selecting, according to user input, the input interface 22 to receive raw data in a parallel format or a serial format;
S14: when the received raw data is in the parallel format, acquiring a first timing signal from the raw data and sending the raw data to the on-chip buffer 24 for storage according to the first timing signal;
S16: when the received raw data is in the serial format, converting the raw data into deserialized data in a parallel format and de-synchronizing the deserialized data to obtain a second timing signal; and
S18: sending the deserialized data to the on-chip buffer 24 for storage according to the second timing signal.
The deserialized data includes single-frame deserialized data, and the second timing signal includes a single-frame timing signal. Step S18 includes the following step:
S182: sending the single-frame deserialized data to the on-chip buffer 24 for storage according to the single-frame timing signal.
The image preprocessing method of the embodiments of the present invention includes the step:
S13: reading and outputting the single-frame deserialized data from the on-chip buffer 24 according to the single-frame timing signal.
Step S13 includes the following steps:
receiving a pass-through signal;
when the pass-through signal is enabled, outputting the single-frame deserialized data to the image signal processor according to the single-frame timing signal;
when the pass-through signal is not enabled, outputting the single-frame deserialized data to the bus interface 26 according to the single-frame timing signal; and
converting, through the bus interface 26, the single-frame deserialized data into single-frame output data conforming to the bus protocol standard and storing it in the main memory through the bus 80a.
The image preprocessing apparatus 10 of the embodiments of the present invention includes a first processing module 12, a second processing module 14, a third processing module 16, and a fourth processing module 18. The first processing module 12 is configured to select, according to user input, the input interface 22 to receive raw data in a parallel format or a serial format. The second processing module 14 is configured to, when the received raw data is in the parallel format, acquire a first timing signal from the raw data and send the raw data to the on-chip buffer 24 for storage according to the first timing signal. The third processing module 16 is configured to, when the received raw data is in the serial format, convert the raw data into deserialized data in a parallel format and de-synchronize the deserialized data to obtain a second timing signal. The fourth processing module 18 is configured to send the deserialized data to the on-chip buffer 24 for storage according to the second timing signal. The fourth processing module 18 includes a first sending sub-module 182. The first sending sub-module 182 is configured to send the single-frame deserialized data to the on-chip buffer 24 for storage according to the single-frame timing signal. The image preprocessing apparatus 10 of the embodiments of the present invention includes a second output module 13. The second output module 13 is configured to read and output the single-frame deserialized data from the on-chip buffer 24 according to the single-frame timing signal. The second output module 13 includes a second receiving sub-module, a fourth output sub-module, a fifth output sub-module, and a sixth output sub-module. The second receiving sub-module is configured to receive a pass-through signal. The fourth output sub-module is configured to, when the pass-through signal is enabled, output the single-frame deserialized data to the image signal processor according to the single-frame timing signal. The fifth output sub-module is configured to, when the pass-through signal is not enabled, output the single-frame deserialized data to the bus interface 26 according to the single-frame timing signal. The sixth output sub-module is configured to convert, through the bus interface 26, the single-frame deserialized data into single-frame output data conforming to the bus protocol standard and store it in the main memory through the bus 80a.
In this way, the single-frame deserialized data can be output from the image sensor interface 20 and further processed by the image signal processor. Providing two output modes for the single-frame deserialized data makes its output more flexible. The single-frame deserialized data has only one frame, so frame buffering is unnecessary. Therefore, in pass-through mode, the single-frame deserialized data does not need to be sent over the bus to the main memory for frame buffering, but is sent directly to the image signal processor, thereby reducing the demand on bus bandwidth.
The image preprocessing apparatus 10 of the embodiments of the present invention can be applied to a terminal. The terminal includes a memory and a processor. The memory stores computer-readable instructions. When the instructions are executed by the processor, the processor is caused to perform the above image preprocessing method.
In some embodiments, the terminal further includes a bus, an image signal processor, a display controller, and an input device. The user can determine the shooting mode through the input device, thereby selecting the input interface 22. The bus connects the image sensor interface 20, the memory, the processor, the image signal processor, the display controller, and the input device. In this way, information can be transferred between the various functional components of the computer via the bus.
It can be understood that the terminal further includes an image sensor (not shown), and the image sensor inputs the acquired raw image data to the image sensor interface 20. Of course, the image sensor may also send the acquired raw image data to the main memory (the main memory may be the memory or a part of the memory) for storage, and the image sensor interface 20 may also read the raw image data from the main memory. No specific limitation is made here.
With the image preprocessing method, image preprocessing apparatus 10, and image sensor interface 20 of the embodiments of the present invention, raw data in a serial format or a parallel format can be selected for reception; when the raw data is in a parallel format that the image signal processor can process directly, it is buffered directly, and when the raw data is in a serial format, it is converted into deserialized data in a parallel format and then buffered. In this way, the image signal processor (ISP) can process various kinds of data, thereby improving image sharpness.
In some embodiments, the image preprocessing method of the embodiments of the present invention includes the following step: reading and outputting the raw data from the on-chip buffer 24 according to the first timing signal. In some embodiments, the image preprocessing apparatus 10 of the embodiments of the present invention includes a first output module. The first output module is configured to read and output the raw data from the on-chip buffer 24 according to the first timing signal. In this way, the raw data can be output from the image sensor interface 20 and further processed by the image signal processor. In some embodiments, the first output module may output the raw data directly to the image signal processor. In some embodiments, the first output module may output the raw data to the main memory, and the image signal processor reads the raw data from the main memory and performs subsequent processing on it. The specific cases are described below.
In some embodiments, the image sensor interface 20 includes a bus interface 26, and the step of reading and outputting the raw data from the on-chip buffer 24 according to the first timing signal includes the following steps: receiving a pass-through signal; when the pass-through signal is enabled, outputting the raw data to the image signal processor according to the first timing signal; when the pass-through signal is not enabled, outputting the raw data to the bus interface 26 according to the first timing signal; and converting, through the bus interface 26, the raw data into parallel output data conforming to the bus protocol standard and storing the parallel output data in the main memory through the bus. In some embodiments, the first output module includes a first receiving sub-module, a first output sub-module, a second output sub-module, and a third output sub-module. The first receiving sub-module is configured to receive a pass-through signal. The first output sub-module is configured to, when the pass-through signal is enabled, output the raw data to the image signal processor according to the first timing signal. The second output sub-module is configured to, when the pass-through signal is not enabled, output the raw data to the bus interface 26 according to the first timing signal. The third output sub-module is configured to convert, through the bus interface 26, the raw data into parallel output data conforming to the bus protocol standard and store the parallel output data in the main memory through the bus. In this way, two output modes are provided for the raw data, making its output more flexible. Generally, raw data in a parallel format has only one frame, so frame buffering is unnecessary. Therefore, in pass-through mode, the raw data in the parallel format does not need to be sent over the bus to the main memory for frame buffering, but is sent directly to the image signal processor, thereby reducing the demand on bus bandwidth.
In some embodiments, the single-frame deserialized data includes ordinary single-frame deserialized data, and the single-frame timing signal includes an ordinary single-frame timing signal. The image preprocessing method of the embodiments of the present invention can be used for ordinary single-frame deserialized data.
In some embodiments, the single-frame deserialized data includes infrared single-frame deserialized data, and the single-frame timing signal includes an infrared single-frame timing signal. The image sensor interface 20 includes an infrared processing unit 28. The image preprocessing method of the embodiments of the present invention includes the following steps: reading infrared correction data from the main memory according to the infrared single-frame timing signal and sending the infrared correction data to the on-chip buffer 24 for storage; reading the infrared correction data from the on-chip buffer 24 according to the infrared single-frame timing signal and sending the infrared correction data to the infrared processing unit 28; processing the infrared correction data through the infrared processing unit 28 to obtain processed infrared correction data; and outputting the processed infrared correction data. In some embodiments, the image preprocessing apparatus 10 includes an infrared acquisition module, an infrared sending module, an infrared processing module, and an infrared output module. The infrared acquisition module is configured to read infrared correction data from the main memory according to the infrared single-frame timing signal and send the infrared correction data to the on-chip buffer 24 for storage. The infrared sending module is configured to read the infrared correction data from the on-chip buffer 24 according to the infrared single-frame timing signal and send the infrared correction data to the infrared processing unit 28. The infrared processing module is configured to process the infrared correction data through the infrared processing unit 28 to obtain processed infrared correction data. The infrared output module is configured to output the processed infrared correction data. The infrared sensor requires correction data to correct the error of the sensor itself. In infrared mode, the infrared correction data is read from the main memory according to the infrared single-frame timing signal and stored in the on-chip buffer 24; the infrared correction data in the on-chip buffer 24 is then read out according to the infrared single-frame timing signal and sent to the infrared processing unit 28, so that the infrared processing unit 28 can process the infrared correction data. The processed infrared correction data is output to the infrared sensor. In this way, the infrared sensor can use the processed infrared correction data to correct its own error, thereby improving image quality.
In some embodiments, the deserialized data includes two-frame deserialized data, and the two-frame deserialized data includes first long frame deserialized data and first middle frame deserialized data. The second timing signal includes a first long frame timing signal and a first middle frame timing signal. Step S18 includes the following step: sending the first long frame deserialized data and the first middle frame deserialized data to the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively. The image preprocessing method of the embodiments of the present invention includes the step: reading and outputting the first long frame deserialized data and the first middle frame deserialized data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively. In some embodiments, the fourth processing module 18 includes a second sending sub-module. The second sending sub-module is configured to send the first long frame deserialized data and the first middle frame deserialized data to the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively. The image preprocessing apparatus 10 of the embodiments of the present invention includes a third output module. The third output module is configured to read and output the first long frame deserialized data and the first middle frame deserialized data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively. The image preprocessing method is applicable to two-frame deserialized data. Since the deserialized data has two frames, the first long frame deserialized data and the first middle frame deserialized data must be written into and output from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively.
In some embodiments, the image sensor interface 20 includes a bus interface 26. The step of reading and outputting the first long frame deserialized data and the first middle frame deserialized data from the on-chip buffer 24 according to the first long frame timing signal and the first middle frame timing signal, respectively, includes the following steps: outputting the first long frame deserialized data and the first middle frame deserialized data to the bus interface 26 according to the first long frame timing signal and the first middle frame timing signal, respectively; and converting, through the bus interface 26, the first long frame deserialized data and the first middle frame deserialized data into first long frame output data and first middle frame output data conforming to the bus protocol standard and storing them in the main memory through the bus. In some embodiments, the third output module includes a seventh output sub-module and an eighth output sub-module. The seventh output sub-module is configured to output the first long frame deserialized data and the first middle frame deserialized data to the bus interface 26 according to the first long frame timing signal and the first middle frame timing signal, respectively. The eighth output sub-module is configured to convert, through the bus interface 26, the first long frame deserialized data and the first middle frame deserialized data into first long frame output data and first middle frame output data conforming to the bus protocol standard and store them in the main memory through the bus. Since the first long frame deserialized data and the first middle frame deserialized data are not synchronized, they need to be converted through the bus interface 26 into first long frame output data and first middle frame output data conforming to the bus protocol standard and stored in the main memory through the bus for frame buffering, so that the image signal processor can read the first long frame output data and the first middle frame output data from the main memory and perform subsequent processing on them.
In some embodiments, the deserialized data includes three-frame deserialized data, and the three-frame deserialized data includes second long frame deserialized data, second middle frame deserialized data, and short frame deserialized data. The second timing signal includes a second long frame timing signal, a second middle frame timing signal, and a short frame timing signal. Step S18 includes the following step: sending the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data to the on-chip buffer 24 for storage according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively. The image preprocessing method of the embodiments of the present invention includes the step: reading and outputting the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data from the on-chip buffer 24 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively. In some embodiments, the fourth processing module 18 includes a third sending sub-module configured to perform the above sending step, and the image preprocessing apparatus 10 of the embodiments of the present invention includes a fourth output module configured to perform the above reading and outputting step. The image preprocessing method of the embodiments of the present invention is applicable to three-frame deserialized data. Since the deserialized data has three frames, the second long frame deserialized data, the second middle frame deserialized data, and the short frame deserialized data must be written into the on-chip buffer 24 and output from the on-chip buffer 24 according to the second long frame timing signal, the second middle frame timing signal, and the short frame timing signal, respectively.
在某些实施方式中,图像传感器接口20包括总线接口26。分别根据第二长帧时序信号、第二中帧时序信号和短帧时序信号从片内缓存24读取并输出第二长帧解串数据、第二中帧解串数据和短帧解串数据的步骤包括以下步骤:分别根据第二长帧时序信号、第二中帧时序信号和短帧时序信号发送第二长帧解串数据、第二中帧解串数据和短帧解串数据到总线接口26;和通过总线接口26将第二长帧解串数据、第二中帧解串数据和短帧解串数据转换成符合总线协议标准的第二长帧输出数据、第二中帧输出数据和短帧输出数据并通过总线存储到主存储器中。在某些实施方式中,图像传感器接口20包括总线接口26。第四输出模块包括第九输出子模块和第十输出子模块。第九输出子模块用于分别根据第二长帧时序信号、第二中帧时序信号和短帧时序信号发送第二长帧解串数据、第二中帧解串数据和短帧解串数据到总线接口26。第十输出子模块用于通过总线接口26将第二长帧解串数据、第二中帧解串数据和短帧解串数据转换成符合总线协议标准的第二长帧输出数据、第二中帧输出数据和短帧输出数据并通过总线存储到主存储器中。由于第二长帧解串数据、第二中帧解串数据和短帧解串数据不同步,所以需要将第二长帧解串数据、第二中帧解串数据和短帧解串数据通过总线接口26转换成符合总线协议标准的第二长帧输出数据、第二中帧输出数据和短帧输出数据并通过总线存储到主存储器中进行帧缓存,使得图像信号处理器得以从主存储器中读取第二长帧输出数据、第二中帧输出数据和短帧输出数据并对第二长帧输出数据、第二中帧输出数据和短帧输出数据进行后续的处理。
上述图像预处理装置10中各个单元的划分仅用于举例说明,在其他实施例中,可将图像预处理装置10按照需要划分为不同的单元,以完成上述图像预处理装置10的全部或部分功能。
本发明实施方式的图像传感器接口20包括输入接口22、串行数据处理器21、片内缓存24和缓存控制器23。输入接口22用于接收并行格式的原始数据的并行输入接口(图未示)和用于接收串行格式的原始数据的串行输入接口(图未示),并行格式的原始数据包括第一时序信号。串行数据处理器21用于将串行格式的原始数据转换成并行格式的解串数据并对解串数据进行解同步处理以得到第二时序信号。片内缓存24用于根据第一时序信号将并行格式的原始数据发送至片内缓存24保存,串行数据处理器用于根据第二时序信号将解串数据发送至片内缓存24保存。缓存控制器23用于控制片内缓存24的数据输入、读写和输出。
此外,直通模式时,时序信号从串行数据处理器21输出至图像信号处理器,图像数据由缓存控制器23从片内缓存24读取并输出。
在某些实施方式中,图像传感器接口20包括总线接口26。总线接口26用于将通过总 线接口26将输出的数据转换成符合总线协议标准的格式并通过总线发送到主存储器中。
在某些实施方式中,图像传感器接口20包括总线接口26和红外处理单元28。总线接口26用于通过总线从主存储器中读取用以校正红外传感器本身误差的红外校正数据并发送至红外处理单元28。红外处理单元28用于接收、处理和发送红外校正数据。
请参阅图4,本发明实施方式的图像处理方法用于图像处理装置30。本发明实施方式的图像处理方法包括以下步骤:
S32:获取经图像预处理方法处理的数据以得到待处理数据;
S34:处理待处理数据以得到待分路数据;
S36:将待分路数据分成亮度通路数据和色度通路数据;
S38:处理亮度通路数据以得到亮度数据;和
S39:处理色度通路数据以得到色度数据;
步骤S38包括以下步骤:
S382:对所述亮度通路数据进行插值处理以得到所述亮度数据;
步骤S39包括以下步骤:
S392:对所述色度通路数据进行插值处理以得到所述色度数据;
图像处理方法包括以下步骤:
S33:将所述亮度数据输出;
S35:将所述色度数据输出。
请参阅图5,本发明实施方式的图像处理装置30包括获取单元32、第一处理单元34、分路单元36、第二处理单元38和第三处理单元39。获取单元32用于获取经图像预处理方法处理的数据以得到待处理数据。第一处理单元34用于处理待处理数据以得到待分路数据。分路单元36用于将待分路数据分成亮度通路数据和色度通路数据。第二处理单元38用于处理亮度通路数据以得到亮度数据。第三处理单元39用于处理色度通路数据以得到色度数据。第二处理单元38包括亮度插值子单元382。亮度插值子单元382用于对亮度通路数据进行插值处理。本发明实施方式的图像处理装置10包括亮度输出单元。亮度输出单元用于将插值处理后的亮度数据输出。对亮度通路数据进行插值处理,即对亮度通路数据去马赛克,使得原始域的亮度通路数据转换成RGB域的亮度数据,从而得以显示。色度插值子单元392用于对色度通路数据进行插值处理。本发明实施方式的图像处理装置10包括色度输出单元35。色度输出单元35用于将插值处理后的色度数据输出。对色度通路数据进行插值处理,即对色度通路数据去马赛克,使得原始域的色度通路数据转换成RGB域的色度数据,从而得以显示。
本发明实施方式的图像处理装置30可以应用于终端包括存储器及处理器。存储器储存 有计算机可读指令。指令被处理器执行时,使得处理器执行任一实施方式的图像处理方法。
在某些实施方式中,终端还包括总线、图像信号处理器、显示控制器和输入装置。总线将图像传感器接口、存储器、处理器、图像信号处理器、显示控制器和输入装置连接起来。如此,计算机各种功能部件之间可以通过总线传送信息。
在某些实施方式中,图像信号处理器获取从图像传感器接口20b直接输出至图像信号处理器的图像数据以得到待处理数据。在某些实施方式中,图像信号处理器通过总线从主存储器(主存储器可为存储器或存储器的一部分)读取经图像预处理方法处理并存储至主存储器的图像数据以得到待处理数据。
本发明实施方式的图像处理方法、图像处理装置30、计算机可读存储介质和终端通过亮度通路和色度通路两个通路对亮度通路数据和色度通路数据分别进行处理,使得图像处理更加灵活。亮度通路和色度通路可以根据需要经过不同的算法处理模块以在提高数据处理的速率的同时减少延迟的存储资源。
在某些实施方式中,当待处理数据包括第一长帧输出数据和第一中帧输出数据时,步骤S34包括以下步骤:将第一长帧输出数据和第一中帧输出数据进行宽动态融合以得到待分路数据。在某些实施方式中,当待处理数据包括第一长帧输出数据和第一中帧输出数据时,第一处理单元34包括第一融合子单元342。第一融合子单元342用于将第一长帧输出数据和第一中帧输出数据进行宽动态融合以得到待分路数据。将第一长帧输出数据和第一中帧输出数据进行宽动态融合,使得二帧数据融合成一帧,从而使图像具有更宽的动态范围,进而满足不同场景的图像的显示需求。
在某些实施方式中,当待处理数据包括第二长帧输出数据、第二中帧输出数据和短帧输出数据时,步骤S34包括以下步骤:将第二长帧输出数据、第二中帧输出数据和短帧输出数据进行宽动态融合以得到待分路数据。在某些实施方式中,当待处理数据包括第二长帧输出数据、第二中帧输出数据和短帧输出数据时,第一处理单元34包括第二融合子单元344。第二融合子单元344用于将第二长帧输出数据、第二中帧输出数据和短帧输出数据进行宽动态融合以得到待分路数据。将第二长帧输出数据、第二中帧输出数据和短帧输出数据进行宽动态融合,使得三帧数据融合成一帧,从而使图像具有更宽的动态范围,进而满足不同场景的图像的显示需求。
请参阅图6,在某些实施方式中,步骤S34还包括以下步骤:对待处理数据进行3A统计。在某些实施方式中,第一处理单元34包括3A统计子单元346。3A统计子单元346用于对待处理数据进行3A统计。对待处理数据进行统计,从而得到与自动曝光、自动白平衡以及自动对焦相关的统计数据。软件根据统计数据自动控制图像传感器的曝光、对焦和控制白平衡模块的增益。
在某些实施方式中,步骤S34还包括以下步骤:进行白平衡校正。在某些实施方式中,第一处理单元34包括白平衡校正子单元348。白平衡校正子单元348用于进行白平衡校正。进行白平衡校正能够使得后续单元正确地还原图像的颜色。
在某些实施方式中,步骤S34还包括以下步骤:进行坏点校正。在某些实施方式中,第一处理单元34包括坏点校正子单元341。坏点校正子单元341用于进行坏点校正。由于图像传感器的元件众多,易出现缺陷像素,进行坏点校正可以消除坏点,减少坏点对后续处理的影响。
在某些实施方式中,步骤S34还包括以下步骤:进行镜头暗角补偿。在某些实施方式中,第一处理单元34包括镜头暗角补偿子单元343。镜头暗角补偿子单元343用于进行镜头暗角补偿。由于相机在成像距离较远时,随着视场角慢慢增大,能够通过照相机镜头的光束将慢慢减少,使得获得的图像中间比较亮,边缘比较暗,从而使得图像亮度不均。进行镜头暗角补偿可以消除这种不良影响,提高后续处理的准确性。
在某些实施方式中,步骤S34还包括以下步骤:进行透明度计算以得到透明度并将透明度发送到主存储器存储。在某些实施方式中,第一处理单元34包括透明度计算子单元345。透明度计算子单元345用于进行透明度计算并将透明度发送到主存储器存储。存储在主存储器中的透明度可被后续的去雾子单元读取和利用。
In some embodiments, step S34 further includes the following step: performing 3D noise reduction and outputting the noise reduction result and the motion error. In some embodiments, the first processing unit 34 includes a 3D noise reduction subunit 347 configured to perform 3D noise reduction and output the noise reduction result and the motion error. The 3D noise reduction subunit 347 receives a reference motion error and a reference image frame as inputs and performs 3D noise reduction based on them. The noise reduction result and the motion error can be sent to the main memory for storage or output directly to subsequent subunits that need them.
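The core of 3D (temporal) noise reduction is blending the current pixel with the reference frame, trusting history less where the motion error is large. The weighting function below is an illustrative sketch, not the patent's filter.

```python
def temporal_denoise(cur, ref, motion_err, max_err=64.0):
    """3D noise reduction sketch: blend the current pixel with the reference
    (previous denoised) pixel; the larger the motion error, the smaller the
    history weight. Returns the denoised pixel and the motion error, which
    downstream subunits (e.g. sharpening, OSD) can reuse."""
    err = min(abs(motion_err), max_err) / max_err
    alpha = 0.5 * (1.0 - err)  # history weight: 0.5 when still, 0 when moving
    return (1.0 - alpha) * cur + alpha * ref, motion_err
```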
In some embodiments, step S34 further includes the following step: performing on-screen display (OSD) adjustment. In some embodiments, the first processing unit 34 includes an OSD adjustment subunit 349 configured to perform OSD adjustment. The OSD adjustment subunit 349 receives the noise reduction result and the motion error output by the 3D noise reduction subunit 347. When OSD display is enabled, the motion error is mapped to a color and overlaid on the image; when OSD is disabled, the input is simply delayed and then output. In this way, the motion in the image can be observed conveniently and intuitively, facilitating adaptive noise reduction and enhancement for images of various scenes.
Referring to FIG. 7, in some embodiments, step S38 includes the following step: applying tone mapping to the luma-path data. In some embodiments, the second processing unit 38 includes a luma tone mapping subunit 384 configured to apply tone mapping to the luma-path data. Applying tone mapping to the luma-path data produces a suitable dynamic range on the display device and, to some extent, better presents the contrast and details of the image. Moreover, applying tone mapping only to the luma-path data, and not to the chroma-path data, avoids the color casts and distortion that tone mapping may otherwise cause.
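A simple global tone-mapping curve of the kind that could run on the luma path is the Reinhard operator, sketched below. This is an illustrative choice; the patent does not specify which tone-mapping curve is used.

```python
def tone_map(luma, white=255.0):
    """Reinhard-style global tone mapping sketch, applied to luma only so
    chroma is left untouched and color casts are avoided: compresses bright
    values while leaving dark values nearly linear."""
    l = luma / white
    return white * l / (1.0 + l)
```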
In some embodiments, step S38 includes the following step: performing color matrix correction. In some embodiments, the second processing unit 38 includes a luma color matrix correction subunit 386 configured to perform color matrix correction.
In some embodiments, step S38 includes the following step: performing dehaze processing based on the transmittance. In some embodiments, the second processing unit 38 includes a dehaze subunit 388 configured to perform dehaze processing based on the transmittance.
In some embodiments, step S38 includes the following step: performing color space conversion. In some embodiments, the second processing unit 38 includes a luma color space conversion subunit 381 configured to perform color space conversion. The luma interpolation subunit 382 converts the luma-path data from the raw domain to the RGB domain, and the luma color space conversion subunit 381 converts the RGB-domain data to the YCbCr domain.
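The RGB-to-YCbCr conversion performed by the color space conversion subunits is commonly the BT.601 matrix, sketched here; the patent does not say which coefficient set it uses, so BT.601 full-range is an assumption.

```python
def rgb_to_ycbcr(r, g, b):
    """BT.601 full-range RGB -> YCbCr conversion: Y carries luma, Cb and Cr
    carry chroma offset around 128 for 8-bit data."""
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 128.0
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 128.0
    return y, cb, cr
```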
In some embodiments, step S38 includes the following step: performing 2D noise reduction. In some embodiments, the second processing unit 38 includes a luma 2D noise reduction subunit 383 configured to perform 2D noise reduction, further removing noise from the luma of the image and improving image quality.
In some embodiments, step S38 includes the following step: performing sharpening. In some embodiments, the second processing unit 38 includes a sharpening subunit 385 configured to perform sharpening. The sharpening subunit 385 also receives a motion error input; this motion error is shared with that of the 3D noise reduction subunit, so no extra computing resources are needed, and sharpening can be adapted to the motion in the image. For example, a larger gain can be chosen for static, stable images and a smaller one otherwise, better satisfying the visual needs of the human eye.
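The motion-adaptive sharpening described above can be sketched as an unsharp mask whose gain falls off with the shared motion error. The gain curve and `max_err` are illustrative assumptions.

```python
def sharpen(center, blur, motion_err, max_err=64.0, base_gain=1.0):
    """Unsharp-mask sketch with motion-adaptive gain: boost the
    high-frequency detail (center - blur) fully on static content and
    progressively less as the motion error (shared with the 3D noise
    reduction subunit) grows."""
    gain = base_gain * (1.0 - min(abs(motion_err), max_err) / max_err)
    return center + gain * (center - blur)
```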
Referring to FIG. 8, in some embodiments, step S39 includes the following step: performing color matrix correction. In some embodiments, the third processing unit 39 includes a chroma color matrix correction subunit 394 configured to perform color matrix correction. Color matrix correction allows the colors of the image to be adjusted more finely.
In some embodiments, step S39 includes the following step: performing chroma color space conversion. In some embodiments, the third processing unit 39 includes a chroma color space conversion subunit 396 configured to perform chroma color space conversion. The chroma interpolation subunit 392 converts the chroma-path data from the raw domain to the RGB domain, and the chroma color space conversion subunit 396 converts the RGB-domain data to the YCbCr domain.
In some embodiments, step S39 includes the following step: performing chroma correction. In some embodiments, the third processing unit 39 includes a chroma correction subunit 398 configured to correct the chroma.
In some embodiments, step S39 includes the following step: performing 2D noise reduction. In some embodiments, the third processing unit 39 includes a chroma 2D noise reduction subunit 391 configured to perform 2D noise reduction, further removing noise from the chroma of the image.
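As a minimal sketch of the 2D (spatial) noise reduction used on both the luma and chroma paths, a plain 3x3 box filter is shown below; a real implementation would typically be edge-preserving (e.g. bilateral), which the patent does not specify.

```python
def denoise_2d(win):
    """2D noise reduction sketch: average a 3x3 window (a list of three
    3-element rows) into one output sample. Purely illustrative; real 2D NR
    is usually edge-aware."""
    return sum(sum(row) for row in win) / 9.0
```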
In summary, the image preprocessing method, image preprocessing apparatus 10, image sensor interface 20, image processing method, and image processing apparatus 30 of the embodiments of the present invention provide the following beneficial effects. First, the image signal processor can support multiple image sensor input data formats (for example, a parallel raw data format, a serial multi-frame raw data format, and the input format of an infrared sensor). Second, the image signal processor can support complex algorithmic processing (for example, wide-dynamic-range processing of multi-frame input and image dehazing) as well as more effective adaptive noise reduction and enhancement. Third, in the chroma and luma processing domains, a more flexible data path mechanism with independent chroma and luma paths is provided. Fourth, system bandwidth, power consumption, and storage resources (including on-chip static random-access memory and main memory) can be flexibly configured according to different input modes and configuration modes, so that the chip (System-on-a-Chip, SoC) operates in an optimal state.
The division of the image processing apparatus 30 into the above units is merely illustrative; in other embodiments, the image processing apparatus 30 may be divided into different units as needed to implement all or part of its functions.
In the description of this specification, reference to the terms "one embodiment", "some embodiments", "an example", "a specific example", or "some examples" means that a specific feature, structure, material, or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic expressions of the above terms do not necessarily refer to the same embodiment or example. Moreover, the specific features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. In addition, those skilled in the art may combine different embodiments or examples described in this specification, and features of different embodiments or examples, provided they do not contradict each other.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of the indicated technical features. Thus, a feature qualified by "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality of" means at least two, for example two or three, unless otherwise specifically defined.
Any process or method description in a flowchart, or otherwise described herein, may be understood as representing a module, segment, or portion of code that includes one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present invention includes additional implementations in which functions may be executed out of the order shown or discussed, including in a substantially simultaneous manner or in the reverse order depending on the functions involved, as should be understood by those skilled in the art to which the embodiments of the present invention belong.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, multiple steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one of the following technologies known in the art, or a combination thereof, may be used: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with suitable combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
Although embodiments of the present invention have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art may make changes, modifications, substitutions, and variations to the above embodiments within the scope of the present invention.

Claims (20)

  1. An image preprocessing method for an image sensor interface, the image sensor interface comprising an input interface and an on-chip buffer, characterized in that the image preprocessing method comprises the following steps:
    selecting, according to user input, the input interface to receive raw data in a parallel format or a serial format;
    when the received raw data is in the parallel format, obtaining a first timing signal from the raw data and sending the raw data to the on-chip buffer for storage according to the first timing signal;
    when the received raw data is in the serial format, converting the raw data into deserialized data in a parallel format and performing de-synchronization processing on the deserialized data to obtain a second timing signal; and
    sending the deserialized data to the on-chip buffer for storage according to the second timing signal;
    wherein the deserialized data comprises single-frame deserialized data and the second timing signal comprises a single-frame timing signal, and the step of sending the deserialized data to the on-chip buffer for storage according to the second timing signal comprises the following step:
    sending the single-frame deserialized data to the on-chip buffer for storage according to the single-frame timing signal;
    the image preprocessing method comprises:
    reading the single-frame deserialized data from the on-chip buffer and outputting it according to the single-frame timing signal;
    the step of reading the single-frame deserialized data from the on-chip buffer and outputting it according to the single-frame timing signal comprises the following steps:
    receiving a pass-through signal;
    when the pass-through signal is enabled, outputting the single-frame deserialized data to an image signal processor according to the single-frame timing signal;
    when the pass-through signal is disabled, outputting the single-frame deserialized data to the bus interface according to the single-frame timing signal; and
    converting, through the bus interface, the single-frame deserialized data into single-frame output data conforming to a bus protocol standard and storing it in a main memory over a bus.
  2. The image preprocessing method according to claim 1, characterized in that the single-frame deserialized data comprises infrared single-frame deserialized data, the single-frame timing signal comprises an infrared single-frame timing signal, and the image sensor interface comprises an infrared processing unit; the image preprocessing method comprises the following steps:
    reading infrared correction data from the main memory according to the infrared single-frame timing signal and sending the infrared correction data to the on-chip buffer for storage;
    reading the infrared correction data from the on-chip buffer according to the infrared single-frame timing signal and sending the infrared correction data to the infrared processing unit;
    processing the infrared correction data by the infrared processing unit to obtain processed infrared correction data; and
    outputting the processed infrared correction data.
  3. The image preprocessing method according to claim 1, characterized in that the deserialized data comprises two-frame deserialized data, the two-frame deserialized data comprises first long-frame deserialized data and first medium-frame deserialized data, and the second timing signal comprises a first long-frame timing signal and a first medium-frame timing signal; the step of sending the deserialized data to the on-chip buffer for storage according to the second timing signal comprises the following step:
    sending the first long-frame deserialized data and the first medium-frame deserialized data to the on-chip buffer according to the first long-frame timing signal and the first medium-frame timing signal, respectively;
    the image preprocessing method comprises the step of:
    reading the first long-frame deserialized data and the first medium-frame deserialized data from the on-chip buffer and outputting them according to the first long-frame timing signal and the first medium-frame timing signal, respectively;
    the step of reading the first long-frame deserialized data and the first medium-frame deserialized data from the on-chip buffer and outputting them according to the first long-frame timing signal and the first medium-frame timing signal, respectively, comprises the following steps:
    outputting the first long-frame deserialized data and the first medium-frame deserialized data to the bus interface according to the first long-frame timing signal and the first medium-frame timing signal, respectively; and
    converting, through the bus interface, the first long-frame deserialized data and the first medium-frame deserialized data into first long-frame output data and first medium-frame output data conforming to the bus protocol standard and storing them in the main memory over the bus.
  4. The image preprocessing method according to claim 1, characterized in that the deserialized data comprises three-frame deserialized data, the three-frame deserialized data comprises second long-frame deserialized data, second medium-frame deserialized data, and short-frame deserialized data, and the second timing signal comprises a second long-frame timing signal, a second medium-frame timing signal, and a short-frame timing signal; the step of sending the deserialized data to the on-chip buffer for storage according to the second timing signal comprises the following step:
    sending the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data to the on-chip buffer for storage according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively;
    the image preprocessing method comprises the step of:
    reading the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data from the on-chip buffer and outputting them according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively;
    the step of reading the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data from the on-chip buffer and outputting them according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively, comprises the following steps:
    sending the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data to the bus interface according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively; and
    converting, through the bus interface, the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data into second long-frame output data, second medium-frame output data, and short-frame output data conforming to the bus protocol standard and storing them in the main memory over the bus.
  5. An image preprocessing apparatus, characterized by comprising:
    a first processing module configured to select, according to user input, the input interface to receive raw data in a parallel format or a serial format;
    a second processing module configured to, when the received raw data is in the parallel format, obtain a first timing signal from the raw data and send the raw data to the on-chip buffer for storage according to the first timing signal;
    a third processing module configured to, when the received raw data is in the serial format, convert the raw data into deserialized data in a parallel format and perform de-synchronization processing on the deserialized data to obtain a second timing signal; and
    a fourth processing module configured to send the deserialized data to the on-chip buffer for storage according to the second timing signal;
    wherein the deserialized data comprises single-frame deserialized data and the second timing signal comprises a single-frame timing signal, and the fourth processing module comprises:
    a first sending submodule configured to send the single-frame deserialized data to the on-chip buffer for storage according to the single-frame timing signal;
    the image preprocessing apparatus comprises:
    a second output module configured to read the single-frame deserialized data from the on-chip buffer and output it according to the single-frame timing signal;
    the second output module comprises:
    a second receiving submodule configured to receive a pass-through signal;
    a fourth output submodule configured to, when the pass-through signal is enabled, output the single-frame deserialized data to an image signal processor according to the single-frame timing signal;
    a fifth output submodule configured to, when the pass-through signal is disabled, output the single-frame deserialized data to the bus interface according to the single-frame timing signal; and
    a sixth output submodule configured to convert, through the bus interface, the single-frame deserialized data into single-frame output data conforming to a bus protocol standard and store it in a main memory over a bus.
  6. The image preprocessing apparatus according to claim 5, characterized in that the single-frame deserialized data comprises infrared single-frame deserialized data, the single-frame timing signal comprises an infrared single-frame timing signal, and the image sensor interface comprises an infrared processing unit; the image preprocessing apparatus comprises:
    an infrared acquisition module configured to read infrared correction data from the main memory according to the infrared single-frame timing signal and send the infrared correction data to the on-chip buffer for storage;
    an infrared sending module configured to read the infrared correction data from the on-chip buffer according to the infrared single-frame timing signal and send the infrared correction data to the infrared processing unit;
    an infrared processing module configured to process the infrared correction data by the infrared processing unit to obtain processed infrared correction data; and
    an infrared output module configured to output the processed infrared correction data.
  7. The image preprocessing apparatus according to claim 5, characterized in that the deserialized data comprises two-frame deserialized data, the two-frame deserialized data comprises first long-frame deserialized data and first medium-frame deserialized data, and the second timing signal comprises a first long-frame timing signal and a first medium-frame timing signal; the fourth processing module comprises:
    a second sending submodule configured to send the first long-frame deserialized data and the first medium-frame deserialized data to the on-chip buffer according to the first long-frame timing signal and the first medium-frame timing signal, respectively;
    the image preprocessing apparatus comprises:
    a third output module configured to read the first long-frame deserialized data and the first medium-frame deserialized data from the on-chip buffer and output them according to the first long-frame timing signal and the first medium-frame timing signal, respectively;
    the third output module comprises:
    a seventh output submodule configured to output the first long-frame deserialized data and the first medium-frame deserialized data to the bus interface according to the first long-frame timing signal and the first medium-frame timing signal, respectively; and
    an eighth output submodule configured to convert, through the bus interface, the first long-frame deserialized data and the first medium-frame deserialized data into first long-frame output data and first medium-frame output data conforming to the bus protocol standard and store them in the main memory over the bus.
  8. The image preprocessing apparatus according to claim 5, characterized in that the deserialized data comprises three-frame deserialized data, the three-frame deserialized data comprises second long-frame deserialized data, second medium-frame deserialized data, and short-frame deserialized data, and the second timing signal comprises a second long-frame timing signal, a second medium-frame timing signal, and a short-frame timing signal; the fourth processing module comprises:
    a third sending submodule configured to send the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data to the on-chip buffer for storage according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively;
    the image preprocessing apparatus comprises:
    a fourth output module configured to read the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data from the on-chip buffer and output them according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively;
    the fourth output module comprises:
    a ninth output submodule configured to send the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data to the bus interface according to the second long-frame timing signal, the second medium-frame timing signal, and the short-frame timing signal, respectively; and
    a tenth output submodule configured to convert, through the bus interface, the second long-frame deserialized data, the second medium-frame deserialized data, and the short-frame deserialized data into second long-frame output data, second medium-frame output data, and short-frame output data conforming to the bus protocol standard and store them in the main memory over the bus.
  9. An image sensor interface, characterized by comprising:
    an input interface comprising a parallel input interface for receiving raw data in a parallel format and a serial input interface for receiving raw data in a serial format, the raw data in the parallel format including a first timing signal;
    a serial data processor configured to convert the raw data in the serial format into deserialized data in a parallel format and perform de-synchronization processing on the deserialized data to obtain a second timing signal;
    an on-chip buffer, the parallel input interface being configured to send the raw data in the parallel format to the on-chip buffer for storage according to the first timing signal, and the serial data processor being configured to send the deserialized data to the on-chip buffer for storage according to the second timing signal;
    a buffer controller configured to control data input, reading, writing, and output of the on-chip buffer; and
    a bus interface configured to convert the output data into a format conforming to a bus protocol standard and send it to a main memory over a bus.
  10. The image sensor interface according to claim 9, characterized in that the image sensor interface comprises an infrared processing unit; the bus interface is configured to read, over the bus from the main memory, infrared correction data for correcting errors of the infrared sensor itself and send it to the infrared processing unit; and the infrared processing unit is configured to receive, process, and send the infrared correction data.
  11. An image processing method, characterized by comprising the following steps:
    acquiring data processed by the image preprocessing method according to any one of claims 1-4 to obtain data to be processed;
    processing the data to be processed to obtain data to be split;
    splitting the data to be split into luma-path data and chroma-path data;
    processing the luma-path data to obtain luma data;
    processing the chroma-path data to obtain chroma data;
    wherein the step of processing the luma-path data to obtain luma data comprises the following step:
    interpolating the luma-path data to obtain the luma data;
    the step of processing the chroma-path data to obtain chroma data comprises the following step:
    interpolating the chroma-path data to obtain the chroma data;
    the image processing method comprises the following steps:
    outputting the luma data; and
    outputting the chroma data.
  12. The image processing method according to claim 11, characterized in that when the data to be processed includes first long-frame output data and first medium-frame output data, the step of processing the data to be processed to obtain data to be split comprises the following step:
    fusing the first long-frame output data and the first medium-frame output data with wide-dynamic-range fusion to obtain the data to be split;
    when the data to be processed includes second long-frame output data, second medium-frame output data, and short-frame output data, the step of processing the data to be processed to obtain data to be split comprises the following step:
    fusing the second long-frame output data, the second medium-frame output data, and the short-frame output data with wide-dynamic-range fusion to obtain the data to be split.
  13. The image processing method according to claim 11, characterized in that, before interpolating the luma-path data to obtain the luma data, the step of processing the luma-path data to obtain luma data comprises the following step:
    applying tone mapping to the luma-path data.
  14. The image processing method according to claim 11, characterized in that the step of processing the luma-path data to obtain luma data comprises the following step:
    performing dehaze processing.
  15. The image processing method according to claim 11, characterized in that the step of processing the chroma-path data to obtain chroma data comprises the following step:
    performing 2D noise reduction.
  16. An image processing apparatus, characterized by comprising:
    an acquisition unit configured to acquire data processed by the image preprocessing method according to any one of claims 1-4 to obtain data to be processed;
    a first processing unit configured to process the data to be processed to obtain data to be split;
    a splitting unit configured to split the data to be split into luma-path data and chroma-path data;
    a second processing unit configured to process the luma-path data to obtain luma data;
    a third processing unit configured to process the chroma-path data to obtain chroma data;
    the second processing unit comprising:
    a luma interpolation subunit configured to interpolate the luma-path data;
    the third processing unit comprising:
    a chroma interpolation subunit configured to interpolate the chroma-path data;
    the image processing apparatus comprising:
    a luma output unit configured to output the interpolated luma data; and
    a chroma output unit configured to output the interpolated chroma data.
  17. The image processing apparatus according to claim 16, characterized in that when the data to be processed includes first long-frame output data and first medium-frame output data, the first processing unit comprises: a first fusion subunit configured to fuse the first long-frame output data and the first medium-frame output data with wide-dynamic-range fusion to obtain the data to be split;
    when the data to be processed includes second long-frame output data, second medium-frame output data, and short-frame output data, the first processing unit comprises: a second fusion subunit configured to fuse the second long-frame output data, the second medium-frame output data, and the short-frame output data with wide-dynamic-range fusion to obtain the data to be split.
  18. The image processing apparatus according to claim 16, characterized in that, upstream of the luma interpolation subunit, the second processing unit comprises a luma tone mapping subunit configured to apply tone mapping to the luma-path data.
  19. The image processing apparatus according to claim 16, characterized in that the second processing unit comprises a dehaze subunit configured to perform dehaze processing.
  20. The image processing apparatus according to claim 16, characterized in that the third processing unit comprises a 2D noise reduction subunit configured to perform 2D noise reduction.
PCT/CN2018/076041 2018-02-09 2018-02-09 Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus WO2019153264A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201880077740.1A CN111492650B (zh) 2018-02-09 2018-02-09 Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus
PCT/CN2018/076041 WO2019153264A1 (zh) 2018-02-09 2018-02-09 Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/076041 WO2019153264A1 (zh) 2018-02-09 2018-02-09 Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus

Publications (1)

Publication Number Publication Date
WO2019153264A1 true WO2019153264A1 (zh) 2019-08-15

Family

ID=67548728

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/076041 WO2019153264A1 (zh) 2018-02-09 2018-02-09 Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus

Country Status (2)

Country Link
CN (1) CN111492650B (zh)
WO (1) WO2019153264A1 (zh)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101472039A (zh) * 2007-12-26 2009-07-01 Shenyang Institute of Automation, Chinese Academy of Sciences Digital image receiving card
CN102098441A (zh) * 2010-12-16 2011-06-15 Shenzhen Jingwei Technology Co., Ltd. Image data transmission method based on an SPI interface and photographic device
JP2012088996A (ja) * 2010-10-21 2012-05-10 Konica Minolta Business Technologies Inc Memory control method, memory control device, and image forming apparatus
CN106791550A (zh) * 2016-12-05 2017-05-31 Luoyang Institute of Electro-Optical Equipment, AVIC Apparatus and method for converting low-frame-rate LVDS video to high-frame-rate DVI video

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020118296A1 (en) * 1999-05-06 2002-08-29 Schwab Barry H. Integrated multi-format audio/video production system
CN101221439B (zh) * 2008-01-14 2010-06-23 Tsinghua University Embedded system for high-speed parallel multi-channel digital image acquisition and processing
CN102006420B (zh) * 2010-12-17 2012-02-08 Sichuan Chuanda Zhisheng Software Co., Ltd. Design method for a camera with multiple data output formats supporting external synchronization
CN105721818B (zh) * 2016-03-18 2018-10-09 Wuhan Jingce Electronic Group Co., Ltd. Signal conversion method and apparatus
CN107249101B (zh) * 2017-07-13 2020-01-10 Zhejiang University of Technology High-resolution image acquisition and processing apparatus


Also Published As

Publication number Publication date
CN111492650B (zh) 2021-04-30
CN111492650A (zh) 2020-08-04

Similar Documents

Publication Publication Date Title
US10616511B2 (en) Method and system of camera control and image processing with a multi-frame-based window for image data statistics
WO2020029732A1 (zh) 全景拍摄方法、装置和成像设备
US9514525B2 (en) Temporal filtering for image data using spatial filtering and noise history
WO2020057199A1 (zh) 成像方法、装置和电子设备
US8923652B2 (en) Methods and apparatus for registering and warping image stacks
US8929683B2 (en) Techniques for registering and warping image stacks
WO2020034701A1 (zh) 成像控制方法、装置、电子设备以及可读存储介质
US10469749B1 (en) Temporal filter with criteria setting maximum amount of temporal blend
US11468539B2 (en) Image processing device and imaging device
WO2020029679A1 (zh) 控制方法、装置、成像设备、电子设备及可读存储介质
EP3891974A1 (en) High dynamic range anti-ghosting and fusion
US10679320B1 (en) High dynamic range sensor system with row increment operation
JP5325655B2 (ja) 撮像装置
US9554070B2 (en) Imaging device for reducing pressure on data bus bandwidth
US9007479B2 (en) Imaging apparatus and evaluation value generation apparatus
US20140133781A1 (en) Image processing device and image processing method
US20190051270A1 (en) Display processing device and imaging device
US10346323B2 (en) Data transfer device and data transfer method for smoothing data to a common bus
WO2022089083A1 (zh) Display method for an LED television wall, television, and computer-readable storage medium
US20190174181A1 (en) Video signal processing apparatus, video signal processing method, and program
US9374526B2 (en) Providing frame delay using a temporal filter
WO2019153264A1 (zh) Image preprocessing method and apparatus, image sensor interface, and image processing method and apparatus
US11843871B1 (en) Smart high dynamic range image clamping
US9288397B2 (en) Imaging device, method for processing image, and program product for processing image
JP2005326528A (ja) 画像表示装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18905796

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.12.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18905796

Country of ref document: EP

Kind code of ref document: A1