WO2021238506A1 - Multimedia Processing Chip, Electronic Device, and Dynamic Image Processing Method - Google Patents

Multimedia Processing Chip, Electronic Device, and Dynamic Image Processing Method

Info

Publication number
WO2021238506A1
WO2021238506A1 (PCT/CN2021/088513)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
processing chip
processing
multimedia
data
Prior art date
Application number
PCT/CN2021/088513
Other languages
English (en)
French (fr)
Inventor
王文东
Original Assignee
Guangdong OPPO Mobile Telecommunications Corp., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong OPPO Mobile Telecommunications Corp., Ltd.
Priority to EP21812348.7A (published as EP4148656A4)
Publication of WO2021238506A1
Priority to US18/059,254 (published as US20230086519A1)

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00General purpose image data processing
    • G06T1/20Processor architectures; Processor configuration, e.g. pipelining
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • H04N23/672Focus control based on electronic image sensor signals based on the phase difference signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/64Circuits for processing colour signals
    • H04N9/73Colour balance circuits, e.g. white balance circuits or colour temperature control

Definitions

  • This application relates to the field of image processing technology, and in particular to a multimedia processing chip, electronic equipment, and a dynamic image processing method.
  • Various multimedia devices such as digital cameras, smart phones, tablet computers, etc.
  • image sensors for acquiring images
  • multimedia processing chips that can perform image processing
  • application processors (AP, Application Processor)
  • the image sensor can be connected to the multimedia processing chip through the MIPI (Mobile Industry Processor Interface) line
  • the multimedia processing chip can be connected to the AP through the MIPI line.
  • the image sensor may include a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor, a Charge Coupled Device (CCD) image sensor, and the like.
  • the multimedia processing chip generally uses an image signal processor (Image Signal Processor, ISP) to process the image acquired by the image sensor, and the multimedia processing chip obtains the processing result after the image processing is completed, and transmits the processing result to the AP.
  • the multimedia processing chips in the related art have limited image processing capabilities.
  • the embodiments of the present application provide a multimedia processing chip, an electronic device, and a dynamic image processing method, which can improve the image processing capability of the multimedia processing chip.
  • the embodiment of the application discloses a multimedia processing chip, which includes:
  • an image signal processor configured to collect statistics on status information of image data; and
  • a neural network processor configured to perform neural network algorithm processing on the image data;
  • the multimedia processing chip is used for preprocessing the image data at least through the neural network processor, and sending the state information and the preprocessed image data to the application processing chip.
  • An embodiment of the application discloses an electronic device, which includes:
  • the multimedia processing chip is the multimedia processing chip as described above.
  • the application processing chip is configured to obtain the result of the preprocessing and the statistical status information from the multimedia processing chip, and the application processing chip performs post-processing on the result of the preprocessing based on the status information.
  • the embodiment of the application discloses a dynamic image processing method, which includes:
  • collecting, by a multimedia processing chip, statistics on status information of the dynamic image data, and preprocessing the dynamic image data; and
  • the pre-processed dynamic image data is post-processed by the application processing chip based on the state information.
  • FIG. 1 is a schematic diagram of the first structure of an image processing apparatus provided by an embodiment of the application.
  • FIG. 2 is a schematic diagram of a second structure of an image processing device provided by an embodiment of the application.
  • Fig. 3 is a schematic diagram of a first application scenario of the image processing device shown in Fig. 1.
  • Fig. 4 is a schematic diagram of a second application scenario of the image processing device shown in Fig. 1.
  • FIG. 5 is a schematic flowchart of a method for processing a video image by an image processing device provided by an embodiment of the application.
  • FIG. 6 is a schematic diagram of the first structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of a second structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of a third structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of a fourth structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 10 is a schematic diagram of the first type of data flow in which the multimedia processing chip provided in an embodiment of the application processes image data.
  • FIG. 11 is a schematic diagram of a first method for processing image data by a multimedia processing chip provided by an embodiment of the application.
  • FIG. 12 is a schematic diagram of the second type of data flow in which the multimedia processing chip provided by an embodiment of the application processes image data.
  • FIG. 13 is a schematic diagram of a second method for processing image data by a multimedia processing chip provided by an embodiment of the application.
  • FIG. 14 is a schematic diagram of a third data flow direction in which the multimedia processing chip provided by an embodiment of the application processes image data.
  • FIG. 15 is a schematic diagram of a third method for processing image data by the multimedia processing chip provided in an embodiment of the application.
  • FIG. 16 is a schematic diagram of a fourth data flow direction in which the multimedia processing chip provided by an embodiment of the application processes image data.
  • FIG. 17 is a schematic diagram of a fourth method for processing image data by a multimedia processing chip provided by an embodiment of the application.
  • FIG. 18 is a schematic diagram of a fifth structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 19 is a schematic flowchart of an offline static image processing method provided by an embodiment of the application.
  • FIG. 20 is a schematic flowchart of a method for editing and processing RAW images using a multimedia processing chip according to an embodiment of the application.
  • FIG. 21 is a schematic flowchart of an offline dynamic image processing method provided by an embodiment of the application.
  • FIG. 22 is a schematic flowchart of a method for processing image data played by a video by using a multimedia processing chip according to an embodiment of the application.
  • FIG. 23 is a schematic diagram of a sixth structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 24 is a schematic diagram of a seventh structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 25 is a schematic diagram of an eighth structure of a multimedia processing chip provided by an embodiment of the application.
  • FIG. 26 is a schematic diagram of the first structure of an electronic device provided by an embodiment of this application.
  • FIG. 27 is a schematic diagram of a second structure of an electronic device provided by an embodiment of this application.
  • FIG. 28 is a schematic diagram of a third structure of an electronic device provided by an embodiment of this application.
  • FIG. 29 is a schematic diagram of a process of processing image data by an image signal processor in an application processing chip provided by an embodiment of the application.
  • FIG. 30 is a comparison diagram between an embodiment of this application and related technologies.
  • FIG. 31 is a comparison diagram between an embodiment of this application and related technologies.
  • FIG. 32 is a schematic diagram of the first structure of a circuit board provided by an embodiment of the application.
  • FIG. 33 is a schematic diagram of a second structure of a circuit board provided by an embodiment of the application.
  • the embodiments of the present application provide a multimedia processing chip, an electronic device, and a dynamic image processing method.
  • the multimedia processing chip can be integrated in a circuit board such as a motherboard to be applied to an electronic device to realize image processing to improve image quality.
  • the multimedia processing chip includes:
  • an image signal processor configured to collect statistics on status information of image data; and
  • a neural network processor configured to perform neural network algorithm processing on the image data;
  • the multimedia processing chip is used for preprocessing the image data at least through the neural network processor, and sending the state information and the preprocessed image data to the application processing chip.
  • state information can be calculated from the image data such as dynamic image data.
  • the application processing chip processes image data such as dynamic image data
  • the image data is processed based on the status information to improve the processing capability of the image data.
  • the multimedia processing chip first processes the image data, and then the application processing chip further processes the image, which can save the power consumption of the application processing chip.
  • the image signal processor is further configured to perform first preprocessing on the image data, and the neural network processor is further configured to perform second preprocessing on the image data after the first preprocessing.
  • the image signal processor is further configured to perform third preprocessing on the image data after the second preprocessing, where the third preprocessing includes bit width adjustment so that the bit width of the adjusted image data is the same as the bit width of the image data processed by the application processing chip.
  • the first preprocessing performed by the image signal processor on the image data includes at least one of dead pixel compensation, linearization processing, and black level correction.
  • the first preprocessing of the image data by the image signal processor further includes image cropping processing and/or image reduction processing.
  • the image signal processor is further configured to adjust the bit width of the image data processed by the neural network algorithm, so that the bit width of the adjusted image data is the same as the bit width of the image data processed by the application processing chip.
  • the image data includes dynamic image data
  • the multimedia processing chip is used for processing dynamic image data
  • the neural network processor is configured to run a neural network algorithm that processes the dynamic image data, the algorithm including at least one of a night scene algorithm, an HDR algorithm, a blur algorithm, a noise reduction algorithm, a super-resolution algorithm, and a semantic segmentation algorithm.
  • the multimedia processing chip is used to process the dynamic image data in real time, and transmit the processed dynamic image data to the application processing chip in real time.
  • the image data includes static image data
  • the multimedia processing chip is used for processing static image data
  • the neural network processor is configured to run a neural network algorithm that processes the static image data, the algorithm including at least one of a night scene algorithm, an HDR algorithm, a blur algorithm, a noise reduction algorithm, a super-resolution algorithm, and a semantic segmentation algorithm.
  • the multimedia processing chip is also used for offline processing of static image data and/or dynamic image data.
  • the image data is RAW image data
  • the multimedia processing chip is used to process the RAW image data
  • the status information includes at least one of auto exposure status information, auto white balance status information, and auto focus status information.
  • the state information further includes lens shading correction state information.
  • An embodiment of the present application also provides an electronic device, which includes:
  • the multimedia processing chip is the multimedia processing chip according to any one of claims 1-13;
  • the application processing chip is configured to obtain the pre-processing result and statistical status information from the multimedia processing chip, and the application processing chip performs post-processing on the pre-processing result based on the status information.
  • the status information includes at least one of autofocus status information, auto white balance status information, and auto exposure status information
  • the application processing chip is configured to:
  • An exposure parameter is calculated based on the automatic exposure state information, and the exposure parameter is configured to the camera of the electronic device, or the exposure parameter is compensated and configured to the camera of the electronic device.
  • the auto focus state information includes phase focus state information and contrast focus state information
  • the image signal processor of the multimedia processing chip is used for:
  • the application processing chip is also used for:
  • the state information further includes lens shading correction state information
  • the application processing chip is further configured to:
  • a lens shading correction parameter is calculated based on the lens shading correction state information, and a lens shading correction is performed on the result of the preprocessing based on the lens shading correction parameter.
  • An embodiment of the present application also provides a dynamic image processing method, wherein the method includes:
  • the state information of the dynamic image data is counted by a multimedia processing chip, and the dynamic image data is preprocessed;
  • the pre-processed dynamic image data is post-processed by the application processing chip based on the state information.
  • the preprocessing of dynamic image data by a multimedia processing chip includes:
  • performing optimization processing on the dynamic image data, and performing neural network algorithm processing on the optimized dynamic image data.
  • the status information includes at least one of autofocus status information, auto white balance status information, and auto exposure status information
  • the post-processing, by the application processing chip based on the status information, of the pre-processed dynamic image data includes:
  • An exposure parameter is calculated based on the automatic exposure state information, and the exposure parameter is configured to the camera, or the exposure parameter is compensated and configured to the camera.
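  • As an illustrative sketch only (the function names, the black-level offset of 64, and the exposure target of 128 are assumptions, not this application's design), the division of labor described above — chip-side statistics and preprocessing, then AP-side post-processing driven by those statistics — might look like:

```python
import numpy as np

def chip_preprocess(raw):
    # Multimedia-chip side: collect status (statistics) information,
    # then preprocess the RAW data (here: a simple black-level offset).
    stats = {
        "ae_mean": float(raw.mean()),  # auto-exposure statistic
        "awb_channel_means": raw.reshape(-1, raw.shape[-1]).mean(axis=0).tolist(),
    }
    pre = np.clip(raw.astype(np.int32) - 64, 0, None)
    return stats, pre

def ap_postprocess(stats, pre):
    # Application-chip side: post-process the preprocessed data
    # based on the status information received from the chip.
    gain = 128.0 / max(stats["ae_mean"], 1e-6)
    return np.clip(pre * gain, 0, 1023).astype(np.uint16)

raw = np.full((4, 4, 3), 200, dtype=np.uint16)
stats, pre = chip_preprocess(raw)
out = ap_postprocess(stats, pre)
```

The point of the split is that the statistics survive preprocessing: the AP's exposure decision uses the chip's measurements, not the already-modified pixels.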
  • the image processing device 110 may process the acquired data, such as RAW data, so that other image processors can further process the image data to improve image quality.
  • the image processing device 110 may process still image data, such as still image data acquired by the user in the photographing mode.
  • the image processing device 110 may also process moving image data, such as moving image data obtained by the user in the preview mode or the recorded video mode.
  • both static image data and dynamic image data can be processed by a processor on the platform side, that is, a system-on-a-chip (SoC).
  • the platform side can be understood as an application processing chip, and the platform-side processor can be understood as an image signal processor (ISP) and an application processor (AP).
  • the platform side often has limited processing capability for image data. As users demand ever higher image quality, processing image data only on the platform side often cannot meet their needs.
  • Some embodiments of the present application may provide an image pre-processor (pre-ISP) such as a Neural-network Processing Unit (NPU) to pre-process the image first, and transmit the pre-processed result to the platform.
  • the platform side takes the pre-ISP's processing result as input data and performs post-processing, thereby improving image quality.
  • the pre-ISP first preprocesses the static image data, and this pre-processing operation generally does not destroy the state information of the static image data.
  • after the pre-ISP preprocesses the static image data, the result can be transmitted directly to the platform side, and the platform side can directly post-process the static image data processed by the pre-ISP.
  • the status information can be understood as the information required by the platform to post-process the image data, that is, the platform can post-process the pre-processing result of the image data based on the status information.
  • the status information may include automatic white balance (AWB) status information, automatic exposure (AE) status information, and automatic focus (AF) status information, which may be referred to as 3A status information.
  • Status information can also be understood as status data. It should be noted that the status information is not limited to this.
  • the status information also includes Lens Shade Correction (LSC) status information.
  • Auto white balance status information can be understood as status information required for white balance processing
  • auto exposure status information can be understood as status information required for exposure
  • auto focus status information can be understood as status information required for focusing
  • lens shading correction status information can be understood as the status information required for lens shading correction.
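  • To make the 3A notion concrete, here is a minimal, hypothetical sketch of deriving auto-white-balance statistics from image data; the gray-world heuristic used is an illustration, not necessarily this application's method:

```python
import numpy as np

def awb_statistics(rgb):
    # Per-channel means are the statistic; a gray-world heuristic then
    # derives white-balance gains that equalize R and B against G.
    means = rgb.reshape(-1, 3).mean(axis=0)
    gains = means[1] / means
    return means, gains

img = np.zeros((2, 2, 3))
img[..., 0] = 100.0  # R
img[..., 1] = 200.0  # G
img[..., 2] = 50.0   # B
means, gains = awb_statistics(img)
# gains == [2.0, 1.0, 4.0]: R and B are boosted to match G
```

Auto-exposure and auto-focus statistics are analogous: compact summaries (mean luminance, contrast or phase measures) that the post-processor turns into parameters.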
  • some embodiments of the present application may use an image processing device such as the statistics module 112 in the image processing device 110 shown in FIG. 1 to perform statistics on the dynamic image data, so as to obtain status information from the dynamic image data.
  • the platform side can perform post-processing on the result preprocessed by the pre-ISP such as a neural network processor, based on the status information obtained by the statistics module 112 of the image processing device 110, to improve the quality of the dynamic image.
  • some embodiments of the present application first optimize the dynamic image during processing to reduce or eliminate playback lag.
  • some embodiments of the present application may use a component such as the optimization module 114 in the image processing device 110 shown in FIG. 1 to optimize the dynamic image data and solve problems such as dead pixels. The optimized data is then transferred to a pre-ISP such as a neural network processor; this speeds up the convergence of the neural network processor and reduces the time it takes to process one frame of image, thereby ensuring that the neural network processor can process one frame of dynamic image data within a preset time period.
  • the preset time period is, for example, 33 ms (milliseconds).
  • the optimization processing performed by the optimization module 114 on the dynamic image data may include at least one of bad pixel correction (BPC), linearization processing, and black level correction (BLC).
  • the algorithm for the optimization module 114 to perform optimization processing on the dynamic image data may include at least one of a black level correction algorithm, a dead pixel compensation algorithm, and a linearization processing algorithm.
  • the optimization module 114 executes the black level correction algorithm to realize the black level correction of the dynamic image data, the optimization module 114 executes the dead pixel compensation algorithm to realize the dead pixel compensation of the dynamic image data, and the optimization module 114 executes the linearization processing algorithm to achieve Linearization processing of dynamic image data.
  • the optimization processing performed by the optimization module 114 on the dynamic image data is not limited to this.
  • the optimization processing performed by the optimization module 114 on the dynamic image data may also include at least one of image cropping (Crop) processing and image reduction (Bayerscaler) processing.
  • the algorithm for the optimization module 114 to perform optimization processing on the dynamic image data may include at least one of an image cropping algorithm and an image reduction algorithm.
  • the optimization module 114 executes the image cropping algorithm to implement the cropping of the dynamic image, and the optimization module 114 executes the image reduction algorithm to implement the reduction of the dynamic image.
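  • As a rough, assumption-laden sketch of two of the optimization steps named above (the black-level offset of 64 and deviation threshold of 200 are illustrative values, not taken from this application):

```python
import numpy as np

def black_level_correct(raw, black_level=64):
    # Subtract the sensor's black-level floor, clamping at zero.
    return np.clip(raw.astype(np.int32) - black_level, 0, None).astype(raw.dtype)

def dead_pixel_compensate(raw, threshold=200):
    # Replace pixels that deviate strongly from their 3x3 median neighborhood.
    out = raw.copy()
    h, w = raw.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            med = int(np.median(raw[y - 1:y + 2, x - 1:x + 2]))
            if abs(int(raw[y, x]) - med) > threshold:
                out[y, x] = med
    return out

raw = np.full((5, 5), 100, dtype=np.uint16)
raw[2, 2] = 1023  # a stuck-high "dead" pixel
fixed = dead_pixel_compensate(raw)
corrected = black_level_correct(fixed)
```

Cleaning such outliers before the neural network sees the data is what lets the pre-ISP converge faster on each frame.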
  • multiple optimization modules may be used to execute different algorithms to achieve different optimization results, or the optimization module may be divided into several optimization sub-modules that execute different algorithms to achieve different optimization results.
  • the optimization module 114 of the image processing device 110 may include multiple optimization sub-modules, and the optimization sub-modules may be defined as an optimization unit.
  • the optimization module 114 includes a first optimization unit 1142 and a second optimization unit 1144.
  • the first optimization unit 1142 can perform dead pixel compensation on the dynamic image data
  • the second optimization unit 1144 can perform linearization processing on the dynamic image data. This ensures that the data optimized by the optimization module 114 accelerates the convergence of the pre-ISP such as a neural network processor, so that the pre-ISP can complete the processing of a frame of image within a preset time period, solving the problem of playback stuttering.
  • the optimization unit of the optimization module 114 is not limited to the first optimization unit 1142 and the second optimization unit 1144.
  • the optimization module 114 may also include a third optimization unit, which can perform black level correction on the dynamic image data.
  • the optimization module 114 may further include a fourth optimization unit that can perform image cropping processing on the dynamic image data, and the optimization module 114 can also include a fifth optimization unit that can perform image reduction processing on the dynamic image data.
  • the number and functions of the optimization units of the optimization module 114 are not limited to this, and the above are only some examples of the optimization units of the optimization module in some embodiments of the present application.
  • the functional sub-modules that can accelerate the convergence speed of the pre-ISP such as the neural network processor in processing the dynamic image data are all within the protection scope of the present application.
  • the optimization processing performed by the optimization module 114 on the dynamic image data need not be aimed at speeding up the convergence of the pre-ISP such as a neural network processor.
  • the optimization processing of the dynamic image data of the optimization module 114 can be designed according to actual needs.
  • the image processing device 110 provided by the embodiment of the present application may also perform statistics on the static image data to obtain the status information.
  • the image processing device 110 provided in the embodiment of the present application may also perform optimization processing on still image data to improve the quality of the still image.
  • the image processing device 110 may be connected with one or more cameras 120 to obtain image data collected by the camera 120, such as dynamic image data, from the camera 120. It can also be understood that the image processing device 110 is connected to the camera 120 and can receive dynamic image data sent by the camera 120 to the image processing device 110.
  • the dynamic image data can be divided into two channels, one channel can be transmitted to the statistics module 112, and the other channel can be transmitted to the optimization module 114.
  • the electrical connection between two devices defined in the embodiments of the present application can be understood as the two devices being connected through a signal line such as a wire that realizes signal transmission, or as the two devices being joined directly, for example welded together at solder joints.
  • the statistics module 112 can count some of the information based on the dynamic image data, which can be defined as status information, such as 3A status information.
  • the status information can be directly sent to the first image processor 130.
  • the first image processor 130 can be understood as a processor on the platform side such as an ISP and an AP.
  • the optimization module 114 may perform one or more optimization processes on the dynamic image data after receiving the dynamic image data, such as dead pixel compensation, linearization processing, and the like. After the optimization processing of the dynamic image data by the optimization module 114 is completed, the optimized dynamic image data may be transmitted to the second image processor 140.
  • the second image processor 140 can be understood as a pre-ISP, such as a neural network processor.
  • the first image processor 130, the second image processor 140, and the image processing device 110 need to be fabricated separately, which increases cost during the production and processing stage.
  • some signals will be additionally transmitted from one device to another, which will increase time and power consumption.
  • some other embodiments of the present application may integrate the second image processor 140, the statistics module 112, and the optimization module 114 on one device.
  • Please refer to FIG. 4.
  • the second image processor 140, the statistics module 112, and the optimization module 114 are integrated on one device, such as the image processing device 110. Structurally, integrating the statistics module 112, the optimization module 114, and the second image processor 140 can save cost, speed up mutual data transmission, and save time and power consumption.
  • the image processing device 110 can not only collect status information but also optimize the dynamic image data. This speeds up the convergence of the pre-ISP such as a neural network processor, ensuring that one frame of image is processed within the preset time period, and also ensures that the first image processor 130 can post-process the pre-ISP's preprocessing result based on the status information collected by the statistics module 112.
  • the first image processor 130 may perform format conversion on the dynamic image data in the RAW format after processing the RAW format image data, such as converting the RAW format image data into YUV format image data.
  • the first image processor 130 may also perform processing such as RGBToYUV conversion to obtain YUV format image data.
  • the first image processor 130 may display the processed image data on the display screen and store it in the memory.
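  • For reference, the final color-space step of a RAW-to-YUV pipeline is typically a fixed matrix. A minimal sketch of RGB-to-YUV conversion using the standard BT.601 coefficients (a well-known standard, not taken from this application):

```python
import numpy as np

# Standard BT.601 RGB -> YUV coefficients (full-range, illustrative only)
BT601 = np.array([[ 0.299,  0.587,  0.114],
                  [-0.169, -0.331,  0.500],
                  [ 0.500, -0.419, -0.081]])

def rgb_to_yuv(rgb):
    yuv = rgb @ BT601.T
    yuv[..., 1:] += 128.0  # offset chroma into the displayable range
    return yuv

pixel = np.array([[255.0, 255.0, 255.0]])  # a pure white pixel
yuv = rgb_to_yuv(pixel)
# For white: Y is about 255, and U and V sit at the 128 midpoint
```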
  • the dynamic image processing method includes:
  • the image processing device 110 acquires dynamic image data.
  • the image processing device 110 may obtain dynamic image data from the camera 120.
  • the moving image data may be RAW data.
  • the statistics module 112 of the image processing device 110 calculates status information from the dynamic image data.
  • the status information may include 3A status information.
  • the image processing device 110 sends the status information counted by the statistics module to the first image processor 130.
  • the first image processor 130 can be understood as the AP and ISP on the platform side, and can perform image processing based on the status information.
  • the status information includes 3A status information, and the first image processor 130 may perform 3A processing based on the 3A status information.
  • 3A processing can be understood as processing based on 3A status information.
  • the optimization module 114 of the image processing device 110 performs optimization processing on the dynamic image data.
  • the optimization processing may include at least one of dead pixel compensation, linearization processing, and black level correction of the dynamic image data.
  • the image processing device 110 sends the optimized dynamic image data to the second image processor 140.
  • the second image processor 140 can be understood as a neural network processor.
  • the optimized dynamic image data is sent to the second image processor 140, which speeds up convergence so that the second image processor 140 can process one frame of dynamic image data within the preset time period. In this way, the second image processor 140 can transmit the processed dynamic image data to the first image processor 130 in real time, solving the problem of playback stuttering.
• the neural network processor can perform neural network algorithm processing on the optimized dynamic image data and transmit the processed image data to the first image processor 130, and the first image processor 130 can, based on the state information, perform post-processing such as 3A processing on the dynamic image data processed by the neural network algorithm.
• when the second image processor, such as a neural network processor, runs certain algorithms on the dynamic image data, the bit width of the resulting data may be larger than the bit width of the image data processed by the first image processor.
• the optimization module 114 may perform bit width adjustment processing on the processing result of the second image processor, so that the adjusted data conforms to the bit width of the data processed by the first image processor. The dynamic image data after the bit width adjustment is then sent to the first image processor, so that the first image processor can further process the bit width adjusted data based on the reference data.
  • a multimedia processing chip such as the multimedia processing chip 200, can process the acquired image data, such as RAW data, to improve image quality. It should be noted that the multimedia processing chip 200 can transmit its processing results to the application processing chip, so that the application processing chip can post-process the image data for display or storage.
  • the image data can also be understood as image information.
• compared with other image data such as YUV data, RAW data retains more detail.
  • the multimedia processing chip 200 may include a neural network processor (Neural-network Processing Unit, NPU) 220.
• the neural network processor 220 may perform enhancement processing on the image data acquired by the multimedia processing chip 200; for example, the neural network processor 220 may run the image processing algorithms of an artificial-intelligence-trained network to enhance the image data.
  • the neural network processor 220 has high efficiency in processing image data, which significantly improves image quality.
• the neural network processor 220 may be a dedicated processor for processing images, referred to as a dedicated processor for short. It can be hardened during hardware configuration, such as circuit layout and programming, to ensure the stability of the neural network processor 220 when processing image data and to reduce the power consumption and time the neural network processor 220 needs to process image data. It can be understood that when the neural network processor 220 is a dedicated processor, its function is to process image data, and it cannot process other data such as text information. It should be noted that in some other embodiments, the neural network processor 220 may also process other information such as text.
• the neural network processor 220 may process the image data by reading data blocks row by row and processing the data blocks row by row. For example, the neural network processor 220 reads data blocks in a multi-row manner and processes them in a multi-row manner. It can be understood that a frame of image may have multiple rows of data blocks; that is, the neural network processor 220 may process part of a frame at a time, such as 1/n of a frame, where n is a positive integer such as 2, 4, or 5. When the neural network processor 220 has not completely processed one frame of image, it may use a built-in cache to store the data of the multiple rows of data blocks being processed.
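• The multi-row scheme above can be sketched as a worker that consumes a frame in strips of n rows, holding only the current strip in a small line buffer that stands in for the built-in cache. The strip size and the per-strip operation are assumptions for illustration.

```python
# Hypothetical sketch: process a frame 1/n at a time, in strips of rows.
def process_frame_in_strips(frame_rows, rows_per_strip, strip_op):
    out = []
    cache = []  # line buffer standing in for the built-in cache: one strip at most
    for r, row in enumerate(frame_rows):
        cache.append(row)
        if len(cache) == rows_per_strip or r == len(frame_rows) - 1:
            out.extend(strip_op(cache))   # process this part of the frame
            cache = []                    # buffer is freed for the next strip
    return out
```

Processing per strip rather than per frame keeps the on-chip buffer small, which is the point of the row-based reading described above.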
  • the neural network processor 220 can write the processed data to a memory, such as the memory 230 of the multimedia processing chip 200.
  • a memory such as the memory 230 of the multimedia processing chip 200.
  • the memory 230 may be built in the multimedia processing chip 200 or may be external.
  • the storage controller can be used to realize data transmission.
• the neural network processor 220 can process the RAW data. It is understandable that the information in RAW data is relatively complete; compared with processing YUV data, the neural network processor 220 can improve image quality in more detail.
  • the neural network processor 220 can complete the processing according to a preset time in the data stream.
  • the preset time for the neural network processor 220 to process one frame of image is 33 ms, which can ensure that the neural network processor 220 can realize real-time data transmission based on the rapid processing of image data.
• the neural network processor 220 defined in some embodiments of the application is a dedicated neural network processor, which can speed up the processing of image data and ensure that the processing of one frame of image is completed within the preset time.
  • the neural network processor 220 may process dynamic image data, such as the dynamic image data obtained by the user in the video recording mode.
  • the neural network processor 220 may include algorithms for processing dynamic image data, such as night scene algorithms, HDR algorithms, blurring algorithms, noise reduction algorithms, super-resolution algorithms, and so on.
  • the dynamic image data may include the image data of the recorded video, the image data of the video playback, and the data of the preview image.
  • dynamic image data can be understood as video image data.
  • the neural network processor 220 may also process still image data, such as still image data obtained by the user in the camera mode.
  • the neural network processor 220 may include algorithms for processing static image data, such as HDR algorithms, night scene algorithms, blurring algorithms, noise reduction algorithms, super-resolution algorithms, semantic segmentation algorithms, and the like. It should be noted that the static image data may also include the image displayed when the photo album application is opened.
  • the neural network processor 220 defined in the embodiment of the present application can process both dynamic image data and static image data, so that the multimedia processing chip 200 can be applied to different scenes, such as photographing scenes and video recording scenes. It should be noted that the neural network processor 220 defined in the embodiment of the present application may also only process dynamic image data, but not static image data. The following takes the neural network processor 220 processing dynamic image data as an example for description.
• after the multimedia processing chip 200 obtains image data such as dynamic image data, the neural network processor 220 performs preset algorithm processing on the dynamic image data according to requirements to obtain the processing result.
• the dynamic image data is often distorted after the neural network processor 220 processes it through the preset algorithm. If the multimedia processing chip 200 sends the distorted data processed by the neural network processor 220 to the application processing chip, the application processing chip's status information for the dynamic image data, such as the status information required for autofocus, will be wrong, which can cause focusing to fail and leave the camera unable to focus.
  • the dynamic image data can be understood as data received by the multimedia processing chip 200, but data that has not been processed.
  • the data sent by the image sensor to the multimedia processing chip 200 is defined as initial dynamic image data.
• a statistics module may be integrated in the multimedia processing chip 200, and the statistics module is used to calculate the data required by the application processing chip for image data processing, or the statistics module is used to count the state information required by the application processing chip for image data processing. After the statistics module completes the statistics required by the application processing chip, it can send the statistical data to the application processing chip to ensure that the application processing chip can smoothly complete image data processing such as 3A processing.
  • the statistics module may be integrated into an image signal processor (Image Signal Processing, ISP) 210, or the multimedia processing chip 200 further includes an image signal processor 210.
• the image signal processor 210 includes a statistics module 212. After the multimedia processing chip 200 obtains the initial dynamic image data, the data can first be transmitted to the image signal processor 210, and the statistics module 212 of the image signal processor 210 performs statistics on the initial dynamic image data to calculate the state information required by the application processing chip, such as 3A state information. This ensures that the application processing chip post-processes the processing result sent by the multimedia processing chip 200 based on the status information counted by the statistics module 212.
• the status information counted by the statistics module 212 of the image signal processor 210 is not limited to 3A status information; for example, the statistics module 212 of the image signal processor 210 may also count status information such as lens shading correction status information.
• if the neural network processor 220 directly processes the dynamic image data, the neural network processor 220 performs preset algorithm processing on the dynamic image data according to requirements to obtain the processing result.
• however, directly processing the dynamic image data through the preset algorithm will slow down the convergence speed of the neural network processor 220, thereby increasing the time the neural network processor 220 needs to process one frame of image, making it difficult to process image data quickly and effectively improve image quality.
• an optimization module may be integrated in the image signal processor 210, and the optimization module may perform first preprocessing, such as dead pixel compensation, on the dynamic image data to obtain a first processing result. The neural network processor 220 then performs second preprocessing on the result of the first preprocessing, which not only solves problems such as image dead pixels, but also improves the neural network algorithm convergence speed of the neural network processor 220, ensuring that the neural network processor 220 can complete the processing of one frame of image within the preset time and thereby achieving the purpose of processing images quickly and in real time.
  • the image signal processor 210 further includes an optimization module 214 that can perform dead pixel compensation on the dynamic image data, and the optimization module 214 can execute a dead pixel compensation algorithm to implement dead pixel compensation for the dynamic image data.
  • the optimization module 214 may perform linearization processing on the dynamic image data, and the optimization module 214 may perform linearization processing algorithms to implement linearization processing on the dynamic image data.
  • the optimization module 214 can perform black level correction on the dynamic image data, and the optimization module 214 can execute a black level correction algorithm to implement black level correction on the dynamic image data.
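• Two of the optimization steps above, black level correction and linearization, can be sketched as follows. The pedestal value and the linear rescaling range are made-up calibration numbers used only for illustration; a real sensor's calibration data would come from the module itself.

```python
# Hypothetical sketch: black level correction followed by a simple linearization.
def black_level_correct(samples, black_level=64):
    # subtract the sensor's black pedestal, clamping at zero
    return [max(0, s - black_level) for s in samples]

def linearize(samples, max_in=960, max_out=1023):
    # map the corrected range [0, max_in] linearly onto [0, max_out]
    return [round(s * max_out / max_in) for s in samples]
```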
  • the first preprocessing of the dynamic image by the optimization module 214 of the image signal processor 210 is not limited to this.
• the optimization module 214 performs image cropping processing on the initial image data, and the optimization module 214 can execute an image cropping algorithm to realize the cropping of the dynamic image data.
  • the optimization module 214 performs image reduction processing on the dynamic image data, and the optimization module 214 may execute an image reduction algorithm to realize the reduction of the dynamic image data.
  • the multimedia processing chip 200 can directly send the image data processed by the neural network processor 220 to the application processing chip.
  • the bit width of the data processed by the neural network processor 220 is often different from the bit width of the data processed by the application processing chip.
• for example, the neural network processor 220 uses a video HDR (High-Dynamic Range) algorithm to process the dynamic image data with a bit width of 20 bits, while the bit width of the data to be processed by the application processing chip is 14 bits. The bit width of the image data processed by the neural network processor 220 therefore exceeds the bit width of the data to be processed by the application processing chip, so a bit width adjustment operation must be performed on the data processed by the neural network processor 220, making the bit width of the data transmitted from the multimedia processing chip 200 to the application processing chip the same as the bit width of the data to be processed by the application processing chip.
• the optimization module 214 of the image signal processor 210 may first perform tone mapping, so that the bit width of the data adjusted by the optimization module 214 is the same as the bit width of the data to be processed by the application processing chip. This ensures that after the data processed by the multimedia processing chip 200 is transmitted to the application processing chip, the application processing chip can post-process the data to improve image quality.
• several different optimization modules may be used to execute different algorithms and achieve different optimization results, or the optimization module 214 may be divided into several optimization sub-modules that execute different algorithms to achieve different optimization results. For example, one sub-module of the optimization module 214 can perform dead pixel compensation on the dynamic image data, one sub-module can perform linearization processing, one sub-module can perform black level correction, one sub-module can perform image cropping processing, one sub-module can perform image reduction processing, and one sub-module can perform bit width adjustment processing on the image data.
• the optimization module 214 can have one or more of the above sub-modules and perform one or more of the above operations, so as to ensure that the data transmitted by the multimedia processing chip 200 to the application processing chip can be further processed by the application processing chip.
  • the neural network processor 220 can accelerate the convergence, so as to achieve the purpose of improving the image quality.
  • the optimization module 214 may also have other sub-modules, which will not be illustrated one by one here.
  • the multimedia processing chip 200 may include a first interface 201 and a second interface 202. Both the first interface 201 and the second interface 202 may be Mobile Industry Processor Interface (MIPI).
  • the first interface 201 can receive image data such as RAW data, for example, the first interface 201 can receive RAW data acquired from a camera.
• the image data received by the first interface 201 may be original image data, that is, image data that has not been processed. Specifically, the original image data can be understood as image data that has not been processed by an image processor.
  • the image data can be transmitted to the image signal processor 210.
  • the second interface 202 can receive the result of processing the image data by the image signal processor 210, and the second interface 202 can also receive the result of processing the image data by the neural network processor 220.
  • the second interface 202 may be connected to an application processing chip to transmit image data, such as dynamic image data, received by the second interface 202 to the application processing chip.
• the first interface 201 and the second interface 202 can be connected through the image signal processor 210, and the data received by the first interface 201 can be divided into at least two channels for transmission: for example, one channel of data is transmitted to the statistics module 212 of the image signal processor 210, and another channel of data is stored in the memory 230 or processed by the optimization module 214.
  • the second interface 202 can transmit the data counted by the statistics module 212, and the second interface 202 can also transmit the data processed by the optimization module 214.
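• The two-channel split described above can be sketched as a fan-out: one copy of the received raw data feeds a statistics routine while the other is written into a memory buffer for later optimization or NPU processing. The function names are placeholders, not identifiers from the source.

```python
# Hypothetical sketch: fan the first interface's raw data out into two channels.
def fan_out(raw_data, stats_fn, memory):
    stats = stats_fn(raw_data)   # channel 1: to the statistics module
    memory.append(raw_data)      # channel 2: stored in memory (e.g. DDR)
    return stats                 # status info, ready for the second interface
```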
  • the memory 230 stores various data and instructions of the multimedia processing chip 200.
  • the memory 230 can store the original image data
  • the memory 230 can store the data processed by the optimization module 214
  • the memory 230 can store the data processed by the neural network processor 220
  • the memory 230 can also store the multimedia processing chip 200. operating system.
  • the number of memories 230 may be one, two, three, or even more.
  • the type of the memory 230 may be a static memory or a dynamic memory, such as DDR (Double Data Rate SDRAM).
  • the memory 230 may be built-in or external. For example, in the packaging process, the image signal processor 210, the neural network processor 220 and other devices are packaged first, and then packaged with the memory 230.
  • the data transmission of the multimedia processing chip 200 may be implemented by one or more storage access controllers.
  • the multimedia processing chip 200 may also include a memory access controller 250.
• the memory access controller 250 may be a direct memory access controller (Direct Memory Access, DMA), which has high data transfer efficiency and can transfer large amounts of data.
• the direct memory access controller 250 can move data from one address space to another address space. For example, the direct memory access controller 250 can move the data stored in the memory 230 to the neural network processor 220.
• the direct memory access controller 250 may include an AHB (Advanced High performance Bus) direct memory access controller, and may also include an AXI (Advanced eXtensible Interface) direct memory access controller.
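• What a DMA move does can be illustrated in miniature: copying a block from one address range to another without per-word CPU involvement, here simulated with a flat bytearray standing in for system memory. This is a conceptual sketch, not the chip's transfer mechanism.

```python
# Hypothetical sketch: a DMA-style block move within one flat address space.
def dma_move(memory, src, dst, length):
    # the slice assignment models the burst transfer handled by the DMA engine
    memory[dst:dst + length] = memory[src:src + length]
```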
  • the various components of the multimedia processing chip 200 may be connected by the system bus 240.
  • the image signal processor 210 is connected to the system bus 240
  • the neural network processor 220 is connected to the system bus 240
  • the memory 230 is connected to the system bus 240
  • the storage access controller 250 is connected to the system bus 240.
• the operation of the system of the multimedia processing chip 200 may be realized by a control processor.
• the multimedia processing chip 200 may also include a central processing unit (CPU) 260, which is used to control the operation of the system of the multimedia processing chip 200, such as peripheral parameter configuration and interrupt response.
  • the method for processing data by the multimedia processing chip 200 includes:
  • the first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
• the raw data is transmitted through one channel to the statistics module 212 of the image signal processor 210, and the statistics module 212 performs statistical processing on the received raw data to calculate status information.
  • the original data may also be stored in the memory 230 first, and then the statistical module 212 performs statistical processing on the original data stored in the memory 230 to calculate the status information.
  • the data collected by the statistics module 212 is transmitted through the second interface 202, such as to an application processing chip. It should be noted that the data collected by the statistics module 212, such as status information, may also be stored in the memory 230 first, and then transmitted through the second interface 202.
  • the original data is stored in the memory 230 through another channel.
  • the raw data stored in the memory 230 is sent to the neural network processor 220 and processed by the neural network processor 220.
  • the neural network processor 220 obtains the original data from the memory 230, and processes the original data, such as neural network algorithm processing.
  • the processed data of the neural network processor 220 is stored in the memory 230.
  • the result of processing the data by the neural network processor 220 may be defined as a preprocessing result.
  • the data processed by the neural network processor 220 is transmitted through the second interface 202, for example, to an application processing chip.
  • the above is the first method for the multimedia processing chip 200 of the embodiment of the application to perform data processing.
• the application processing chip can further process the processing result of the neural network processor 220 based on the state information to improve image quality, such as the quality of video playback.
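• The first data-processing method above can be condensed into a sketch: receive the raw frame, compute and send status information on one channel, then store, NPU-process, and forward the frame on the other. The stage functions are injected placeholders; on the real chip these are hardware blocks, not Python callables.

```python
# Hypothetical sketch of the first method's data flow.
def method_one(raw, stats_fn, npu_fn, send):
    memory = {"raw": raw}                    # raw data stored in memory 230
    send("status", stats_fn(raw))            # statistics path, via the second interface
    memory["pre"] = npu_fn(memory["raw"])    # NPU reads the raw data from memory
    send("preprocessed", memory["pre"])      # preprocessing result to the AP
    return memory
```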
  • the method for processing data by the multimedia processing chip 200 includes:
  • the first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
  • the original data is transmitted to the statistical module 212 of the image signal processor 210 through a single channel, and the statistical module 212 performs statistical processing on the received raw data to calculate status information. It should be noted that the original data may also be stored in the memory 230 first, and then the statistical module 212 performs statistical processing on the original data stored in the memory 230 to calculate the status information.
  • the data collected by the statistics module 212 is transmitted through the second interface 202, such as to an application processing chip. It should be noted that the status information counted by the statistics module 212 may also be stored in the memory 230 first, and then transmitted through the second interface 202.
  • the original data is transmitted to the optimization module 214 through another channel, and the optimization module 214 performs optimization processing, such as dead pixel compensation and linearization processing.
  • the data processed by the optimization module 214 is sent to the neural network processor 220 and processed by the neural network processor 220. It should be noted that the data processed by the optimization module 214 can be sent to the memory 230 first, and then the data stored in the memory 230 and processed by the optimization module 214 is transmitted to the neural network processor 220 and processed by the neural network. The processor 220 processes the data processed by the optimization module 214.
• the processed data of the neural network processor 220 is stored in the memory 230.
  • the result of processing the data by the neural network processor 220 may be defined as a preprocessing result.
  • the above is the second method for the multimedia processing chip 200 of the embodiment of the application to perform data processing.
  • the multimedia processing chip 200 can transmit the original data to the statistics module 212 in different channels for data statistics, and the optimization module 214 for optimization processing.
• the optimized data can be processed by the neural network processor 220, and the data and state information processed by the neural network processor 220 are transmitted to the application processing chip. This not only ensures that the application processing chip further processes the processing result of the neural network processor 220 based on the state information to improve image quality, such as the quality of video playback.
  • the convergence speed of the neural network processor 220 can also be accelerated to improve the smoothness of video playback.
• since the multimedia processing chip 200 performs optimization processing on the data, such as dead pixel compensation and linearization processing, the application processing chip does not need to perform the corresponding processing on the received image data.
  • the optimization module 214 performs dead pixel compensation, linearization processing and black level correction on the image data
  • the application processing chip does not need to perform dead pixel compensation, linearization processing and black level correction on the received image data, so that it can Reduce the power consumption of the application processing chip.
  • the method for processing data by the multimedia processing chip 200 includes:
  • the first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
  • the raw data is transmitted to the statistics module 212 of the image signal processor 210 through one channel, and the statistics module 212 performs statistical processing on the received raw data to obtain statistics such as status information.
  • the original data may also be stored in the memory 230 first, and then the statistical module 212 performs statistical processing on the original data stored in the memory 230 to calculate the status information.
• the data collected by the statistics module 212 is transmitted through the second interface 202, such as to an application processing chip. It should be noted that the status information counted by the statistics module 212 may also be stored in the memory 230 first, and then transmitted through the second interface 202.
• the original data stored in the memory 230 is sent to the neural network processor 220, and the neural network processor 220 performs processing. Or, the neural network processor 220 obtains the original data from the memory 230 and processes it, such as with neural network algorithm processing.
• the optimization module 214 performs bit width adjustment processing on the data processed by the neural network processor 220, so that the adjusted bit width is the same as the bit width of the data to be processed by the application processing chip.
• the result of data processing by the optimization module 214 may be defined as a preprocessing result. It should be noted that the data processed by the neural network processor 220 can be sent to the memory 230 first, and then the data stored in the memory 230 and processed by the neural network processor 220 can be transmitted to the optimization module 214, and the optimization module 214 performs bit width adjustment processing on the data processed by the neural network processor 220.
• the above is the third method for the multimedia processing chip 200 of the embodiment of the application to perform data processing, which can ensure that the application processing chip further processes the data after the bit width adjustment based on the state information, so as to improve image quality, such as the quality of video playback.
  • the data processing method of the multimedia processing chip 200 includes:
  • the first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
  • the original data is transmitted to the statistics module 212 of the image signal processor 210 through one channel, and the statistics module 212 performs statistical processing on the received raw data to obtain statistics such as status information. It should be noted that the original data may also be stored in the memory 230 first, and then the statistical module 212 performs statistical processing on the original data stored in the memory 230 to calculate the status information.
• the data collected by the statistics module 212 is transmitted through the second interface 202, such as to an application processing chip. It should be noted that the status information counted by the statistics module 212 may also be stored in the memory 230 first, and then transmitted through the second interface 202.
  • the original data is transmitted to the optimization module 214 through another channel, and the optimization module 214 performs the first optimization processing, such as dead pixel compensation, linearization processing, and black level correction.
  • the data after the first optimization processing by the optimization module 214 is sent to the neural network processor 220 and processed by the neural network processor 220.
• it should be noted that the data processed by the optimization module 214 can be sent to the memory 230 first, and then the data stored in the memory 230 after the first optimization processing is transmitted to the neural network processor 220.
  • the neural network processor 220 processes the data that has been optimized for the first time by the optimization module 214.
  • the second optimization processing may include adjusting the bit width of the data, so that the adjusted bit width is the same as the bit width of the data to be processed by the application processing chip.
• the above is the fourth method for the multimedia processing chip 200 of the embodiment of the application to perform data processing, which can ensure that the application processing chip further processes the data after the bit width adjustment based on the state information, so as to improve image quality, such as the quality of video playback.
  • the convergence speed of the neural network processor 220 can also be accelerated to improve the smoothness of video playback.
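• The fourth method above chains two optimization passes around the NPU: first preprocessing (e.g. dead pixel compensation) before the neural network algorithm, then a second preprocessing (bit width adjustment) after it, with status information computed in parallel from the untouched raw data. The stage implementations below are toy stand-ins, not the chip's algorithms.

```python
# Hypothetical sketch of the fourth method's data flow.
def method_four(raw, stats_fn, opt1, npu_fn, opt2):
    status = stats_fn(raw)       # channel 1: statistics on the raw data
    x = opt1(raw)                # first preprocessing (dead pixels, black level, ...)
    x = npu_fn(x)                # neural network algorithm processing
    x = opt2(x)                  # second preprocessing: bit width adjustment
    return status, x             # both go to the application processing chip
```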
• the main control processor 260 can determine whether the image data has problems such as dead pixels. If so, the optimization module 214 can first be started to perform optimization processing such as dead pixel compensation on the image data; if not, the neural network processor 220 can process the data directly. After the neural network processor 220 has processed the data, the main control processor 260 can determine whether the bit width of the data processed by the neural network processor 220 is the same as the preset bit width; if it is, the data processed by the neural network processor 220 is transmitted to the application processing chip.
• if it is not, the optimization module 214 can perform the bit width adjustment optimization processing, so that the bit width of the data after adjustment by the optimization module 214 is the same as the preset bit width. It is understandable that the preset bit width can be understood as the bit width required by the application processing chip to process data.
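• The main control processor's dispatch logic above reduces to two conditionals: optimize first only when defects are present, and append a bit width adjustment only when the NPU output width differs from the preset width. The predicate and width arguments are illustrative assumptions.

```python
# Hypothetical sketch: the main control processor's dispatch decisions.
def dispatch(has_dead_pixels, npu_bits, preset_bits):
    steps = []
    if has_dead_pixels:
        steps.append("optimize")       # dead pixel compensation first
    steps.append("npu")                # neural network processing
    if npu_bits != preset_bits:
        steps.append("bit_width")      # adjust to the preset bit width
    steps.append("send_to_ap")         # result goes to the application processing chip
    return steps
```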
  • connection manner of the multimedia processing chip 200 in the embodiment of the present application with other devices such as an application processing chip is not limited to this.
  • the multimedia processing chip 200 may further include a third interface connected to the application processing chip.
  • the multimedia processing chip 200 may further include a third interface 203.
  • the third interface 203 may be referred to as an interconnection bus interface.
  • the third interface 203 may be a high-speed interconnection bus interface (Peripheral Component Interconnect Express, PCIE). It can also be called a high-speed peripheral component interconnection interface or a peripheral device interconnection bus interface, and is a high-speed serial computer expansion bus standard interface.
  • the third interface 203 is connected to the system bus 240, and the third interface 203 can realize data transmission with other devices through the system bus 240.
  • the third interface 203 can receive the result of processing the image data by the image signal processor 210, and the third interface 203 can also receive the result of processing the image data by the neural network processor 220.
  • the third interface 203 can also be connected with an application processing chip to transmit data processed by the multimedia processing chip 200 to the application processing chip.
  • the third interface 203 can transmit image data offline.
  • the third interface 203 can transmit data of static images offline, and the third interface 203 can also transmit data of dynamic images offline.
  • the multimedia processing chip 200 of the embodiment of the present application can not only process the image data collected by the camera, but also process the static image data and/or the dynamic image data offline to realize the enhancement of picture quality and the quality of video playback.
  • the offline static image processing methods include:
  • the photo album viewing instruction may be received by an application processor of an electronic device to which the multimedia chip 200 is applied, such as a smart phone.
  • the album interface displays two virtual controls, "enhanced mode" and "normal mode".
  • the application processor determines that the enhanced mode is entered, and step 3013 is executed.
  • the application processor determines that the enhanced mode is not entered, and executes 3016. It should be noted that the method of determining whether to enter the picture enhancement mode is not limited to this, and it is only an example.
  • the picture enhancement mode can be understood as a mode for improving picture quality, that is, a mode for processing picture data by the multimedia processing chip 200 defined by the embodiment of the present application.
  • the normal mode can be understood as a mode in which picture data is not processed by the multimedia processing chip 200 as defined in this application.
  • An instruction may be issued by an application processor of an electronic device to which the multimedia chip 200 is applied, such as a smart phone, to send the picture data to be displayed to the multimedia processing chip 200.
  • the display screen of an electronic device to which the multimedia chip 200 is applied such as a smart phone, displays pictures enhanced by the multimedia processing chip 200.
  • the picture can be displayed directly on the display screen of an electronic device to which the multimedia chip 200 is applied, such as a smart phone, without the multimedia processing chip 200 for enhancement processing.
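The enhanced/normal mode dispatch described in the album flow above can be sketched as a small routing function. The mode strings and the `enhance` callable are illustrative stand-ins for the user's touch input and the multimedia processing chip:

```python
def route_album_photo(mode, picture, enhance):
    """Return the picture to display for the touched virtual control.
    `enhance` stands in for the multimedia processing chip's picture path."""
    if mode == "enhanced":
        return enhance(picture)   # enhanced mode: chip processes the data first
    return picture                # normal mode: display directly, bypassing the chip

shown = route_album_photo("enhanced", "photo", lambda p: p + "+enhanced")
plain = route_album_photo("normal", "photo", lambda p: p + "+enhanced")
```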
  • the application processing chip can transmit the image data of the album photo to the multimedia processing chip 200 through the third interface 203, and the multimedia processing chip 200 can perform RAW image editing processing on the image data.
  • After the multimedia processing chip 200 finishes processing the RAW image data, the processed data is transmitted through the third interface 203 for display on the display screen of the electronic device. Therefore, the multimedia processing chip 200 of some embodiments of the present application can realize the processing of the RAW images of the gallery.
  • the image can be stored as RAW data, so that the multimedia processing chip 200 can perform RAW image editing processing on the data of the photos in the album or gallery.
  • the methods of using multimedia processing chips to edit and process RAW images include:
  • the application processor of the application processing chip receives the first instruction to open the album
  • the application processor of the application processing chip transmits the RAW image data of the photos to be processed in the album to the multimedia processing chip 200 through the third interface 203 according to the first instruction.
  • the multimedia processing chip 200 performs RAW image editing processing on the RAW image data.
  • the multimedia processing chip 200 transmits the data after the RAW image editing process on the RAW image data to the external memory through the third interface 203.
  • After the multimedia processing chip 200 performs RAW image editing processing on the RAW image data, the result can be transmitted to an external memory through the third interface 203, such as a memory used by the electronic device to store album photos. Then, the photo obtained by the multimedia processing chip 200 processing the RAW image data can be displayed on the display screen of the electronic device.
  • the external memory can be understood as a memory outside the multimedia processing chip.
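The offline RAW editing steps above can be sketched end to end. The `transfer` callable stands in for the third interface 203, and `edit` for the chip's RAW editing; all names are illustrative:

```python
def edit_raw_offline(raw_data, edit, external_memory, transfer=lambda d: d):
    """Sketch of the offline flow: the application processor moves RAW data
    over the interconnect (`transfer`), the multimedia chip edits it (`edit`),
    and the result is written back to external memory."""
    on_chip = transfer(raw_data)              # application chip -> multimedia chip
    edited = edit(on_chip)                    # RAW image editing on the chip
    external_memory.append(transfer(edited))  # multimedia chip -> external memory
    return edited

store = []
result = edit_raw_offline([10, 20], lambda d: [x * 2 for x in d], store)
```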
  • Offline dynamic image processing methods include:
  • the application processor of an electronic device may determine whether to enter the video enhancement mode. For example, after the user enters the video interface, the video interface displays two virtual controls, "enhanced mode" and "normal mode". When the user touches the "enhanced mode” virtual control, the application processor determines that the enhanced mode is entered, and step 4013 is executed. When the user touches the "normal mode” virtual control, the application processor determines that the enhanced mode is not entered, and 4016 is executed. It should be noted that the method of determining whether to enter the video enhancement mode is not limited to this, and it is only an example.
  • the video enhancement mode can be understood as a mode for improving the quality of video playback, that is, the mode in which the multimedia processing chip 200 processes video playback data as defined by the embodiment of the present application.
  • the normal mode can be understood as a mode in which video playback data is not processed by the multimedia processing chip 200 as defined in this application.
  • An instruction may be issued by an application processor of an electronic device to which the multimedia chip 200 is applied, such as a smart phone, to send the video data to be played to the multimedia processing chip 200.
  • 4014: Perform enhancement processing on the video data to be played through the multimedia processing chip 200, so as to improve the quality of video playback.
  • the video data enhanced by the multimedia processing chip 200 can be played by the display screen of an electronic device to which the multimedia chip 200 is applied, such as a smart phone.
  • 4016: Play the video.
  • the video can be played directly on the display screen of an electronic device to which the multimedia chip 200 is applied, such as a smart phone, without the multimedia processing chip 200 for enhancement processing.
  • the application processing chip can transmit the image data of the played video to the multimedia processing chip 200 through the third interface 203.
  • the multimedia processing chip 200 may process the image data, for example through the neural network processor 220, which can improve the resolution of video playback and solve the problem of graininess appearing during video playback.
  • After the multimedia processing chip 200 finishes processing the image data, it transmits the processed data through the third interface 203 to be played on the display screen of the electronic device. Therefore, the multimedia processing chip 200 of some embodiments of the present application can implement video playback processing.
  • the method for processing image data played by a video by using a multimedia processing chip includes:
  • the application processor of the application processing chip receives a second instruction for video playback
  • the application processor of the application processing chip transmits the image data in the video playback process to the multimedia processing chip 200 through the third interface 203 according to the second instruction.
  • the multimedia processing chip 200 enhances the image data during video playback through the neural network processor 220, for example by SR (Super Resolution) processing, so as to improve the resolution of video playback and solve the problem of graininess occurring during playback.
  • the multimedia processing chip 200 transmits the processed image data during the video playback process to the external memory through the third interface 203.
  • After the multimedia processing chip 200 performs image processing on the image data of the played video, the result can be transmitted to an external memory through the third interface 203, such as a memory used by the electronic device to store videos. Then, the video processed by the multimedia processing chip 200 can be displayed on the display screen of the electronic device. It can be understood that the external memory is a memory outside the multimedia processing chip.
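To make the super-resolution step concrete, here is a deliberately naive nearest-neighbour 2x upscale. The NPU's actual SR model would be a learned network; this stand-in only shows the resolution change:

```python
def upscale_2x(frame):
    """Nearest-neighbour 2x upscale of a 2-D pixel list; a naive stand-in
    for the neural network processor's super-resolution processing."""
    out = []
    for row in frame:
        wide = [p for p in row for _ in (0, 1)]  # duplicate each pixel
        out.append(wide)
        out.append(list(wide))                   # duplicate each row
    return out

sr = upscale_2x([[1, 2], [3, 4]])  # 2x2 frame becomes 4x4
```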
  • the processing of image data by the multimedia processing chip 200 can be understood as two types: one is that the image signal processor 210 performs statistics on the image data to obtain status information. The other is that all or part of the image processors in the multimedia processing chip 200, such as the image signal processor 210 and the neural network processor 220, preprocess the image data.
  • This preprocessing can be understood as follows: the image signal processor 210 performs a first preprocessing on the image data, such as optimization processing; the neural network processor then performs a second preprocessing on the image data after the first preprocessing, such as neural network algorithm processing; and the image signal processor 210 then performs a third preprocessing on the image data after the second preprocessing, such as bit width adjustment processing.
  • the image data preprocessing performed by all or part of the image processors in the multimedia processing chip 200 includes at least the neural network processor 220 performing neural network algorithm processing on the image data.
  • the image signal processor 210 may first perform optimization processing on the image data. After processing by the neural network processor 220, the image signal processor 210 may also perform bit width adjustment processing on the image data.
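The three-stage preprocessing order (ISP optimization, then NPU processing, then ISP bit width adjustment) can be sketched as a pipeline. All three callables below are placeholders, not the chip's real operations:

```python
def preprocess(image, isp_optimize, npu_process, isp_bit_width):
    """First preprocessing (ISP optimization), second (NPU algorithm),
    third (ISP bit width adjustment), applied in that fixed order."""
    first = isp_optimize(image)
    second = npu_process(first)
    return isp_bit_width(second)

out = preprocess(
    [5, 6],
    lambda d: [x + 1 for x in d],     # e.g. dead pixel compensation
    lambda d: [x * 10 for x in d],    # e.g. neural network enhancement
    lambda d: [x & 0xFF for x in d],  # e.g. clamp to the preset bit width
)
```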
  • the third interface 203 may perform data transmission, which does not occupy the second interface 202.
  • the second interface 202 can perform real-time data transmission.
  • the module for processing image data in the multimedia processing chip 200 in the embodiment of the present application is not limited to this.
  • the multimedia processing chip 200 may also include other processing modules to process image data.
  • the multimedia processing chip 200 may also include a digital signal processor.
  • the multimedia processing chip 200 may further include a digital signal processor (Digital Signal Processor, DSP) 270, and the digital signal processor 270 may be used to assist the image signal processor 210 and the neural network processor 220.
  • the digital signal processor 270 can also process image data with a small amount of calculation.
  • the digital signal processor 270 uses some general algorithms to process image data.
  • the digital signal processor 270 can use an image quality detection algorithm to select one frame of image from multiple frames of images.
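Selecting one frame from a burst with an image quality detection algorithm might look like the following. The sharpness metric here (sum of absolute differences between adjacent pixels) is an assumed example; the DSP's actual detection algorithm is not specified in the text:

```python
def sharpness(frame):
    """Crude quality score: total absolute difference between
    horizontally adjacent pixels (an assumed detection metric)."""
    return sum(abs(a - b) for row in frame for a, b in zip(row, row[1:]))

def pick_best_frame(frames):
    """Select the single best frame from multiple frames, as the DSP might."""
    return max(frames, key=sharpness)

blurry = [[5, 5, 5], [5, 5, 5]]
sharp = [[0, 9, 0], [9, 0, 9]]
best = pick_best_frame([blurry, sharp])
```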
  • the neural network processor 220 may not support some algorithms. For example, an ultra-wide-angle camera may require distortion correction processing that the neural network processor 220 cannot achieve, in which case the digital signal processor 270 can be used to handle it.
  • the digital signal processor 270 in the embodiment of the present application is mainly used to process some image data with a relatively small amount of data
  • the neural network processor 220 is mainly used to process some image data with a relatively large amount of data.
  • the digital signal processor 270 may be used to process static images
  • the neural network processor 220 may be used to process dynamic images such as video images.
  • the digital signal processor 270 is used to process image data in the photographing mode
  • the neural network processor 220 is used to process image data in the video recording mode, video playback mode, and preview image mode. Therefore, the embodiment of the present application combines the digital signal processor 270 and the neural network processor 220 to achieve better and more comprehensive image processing optimization, so that the image data processed by the multimedia processing chip 200 has better quality and a better display effect.
  • the multimedia processing chip 200 may transmit the image data in the photo shooting mode through the third interface 203.
  • the multimedia processing chip 200 can transmit image data in the video recording mode through the second interface 202.
  • the multimedia processing chip 200 can transmit image data in the preview image mode through the second interface 202.
  • the multimedia processing chip 200 can transmit the image data of the played video through the third interface 203.
  • the multimedia processing chip 200 can output the image data of the displayed photo through the third interface 203.
  • the third interface 203 can transmit image data in real time or offline, and can also transmit data such as configuration parameters.
  • the third interface 203 has high data transmission efficiency. Based on this, the embodiment of the present application can allocate different data to the second interface 202 and the third interface 203 for transmission, in order to improve the efficiency of data transmission.
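The interface allocation just described can be summarized as a routing table. The mode names are illustrative labels, and the mapping follows the passage: real-time streams use the second interface 202, while still and offline data use the third interface 203:

```python
# Assumed split from the passage above.
ROUTE = {
    "photo_capture": "third_interface_203",
    "video_record": "second_interface_202",
    "preview": "second_interface_202",
    "video_playback": "third_interface_203",
    "photo_display": "third_interface_203",
}

def pick_interface(mode):
    """Return which interface carries the data for a given mode."""
    return ROUTE[mode]
```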
  • the main control processor 260 can determine which type of image data the multimedia processing chip 200 has received, or in which mode the received image data was acquired. When the multimedia processing chip 200 receives image data, the main control processor 260 can determine from the image data the mode in which it was acquired.
  • the main control processor 260 may control the neural network processor 220 to process the image data.
  • the main control processor 260 may control the digital signal processor 270 to process the image data.
  • the image data in the photo shooting mode may also be transmitted through the second interface 202.
  • the difference between the multimedia processing chip 200 shown in FIG. 24 and the multimedia processing chip 200 shown in FIG. 23 is that the multimedia processing chip 200 shown in FIG. 24 is not provided with a third interface.
  • the data processed by the multimedia processing chip 200 on the static image may also be transmitted through the second interface 202.
  • the second interface 202 has multiple channels.
  • After the multimedia processing chip 200 processes the dynamic image, the result can be directly transmitted through one or more channels of the second interface 202. That is, each channel of the second interface 202 is preferentially allocated to the data of dynamic images processed by the multimedia processing chip 200.
  • the main control processor 260 can first determine whether any channel of the second interface 202 is idle, that is, whether there is a channel that is not transmitting dynamic image data. If one or more of the channels of the second interface 202 are idle, the data of the static image processed by the multimedia processing chip 200 can be transmitted through the one or more idle channels.
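The idle-channel check can be sketched as follows; the channel ids and the busy map are hypothetical, standing in for the main control processor's view of the second interface:

```python
def channel_for_static(channel_busy):
    """channel_busy maps a second-interface channel id to True when the
    channel is carrying dynamic (video) image data. Static-image data may
    only use an idle channel; return None when every channel is busy."""
    for cid, busy in sorted(channel_busy.items()):
        if not busy:
            return cid
    return None

free = channel_for_static({0: True, 1: False, 2: True})
none_free = channel_for_static({0: True, 1: True})
```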
  • the first interface 201 and the second interface 202 of the multimedia processing chip 200 can also be directly connected, so that the first interface 201 can directly transmit some image data, such as static image data, to the second interface 202 without the image data being processed by the image signal processor 210 and/or the neural network processor 220.
  • when the multimedia processing chip 200 receives the image data of a recorded video, the image data may be transmitted to the image signal processor 210 through the first interface 201 for processing.
  • the image data can be directly transmitted to the second interface 202 through the first interface 201.
  • the image data can also be transmitted to the image signal processor 210 through the first interface 201 for processing, so that the problem of image consistency can be solved.
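One plausible reading of the routing just described: recorded video always passes through the image signal processor so that consecutive frames stay consistent, while other data may take the direct bypass path. The mode names below are illustrative:

```python
def route_from_first_interface(mode):
    """Decide where data arriving at the first interface 201 goes next.
    Recorded video goes through the ISP (image consistency); other data
    may bypass it straight to the second interface."""
    if mode == "video_record":
        return "image_signal_processor_210"
    return "second_interface_202"   # direct first-interface -> second-interface path

path_video = route_from_first_interface("video_record")
path_still = route_from_first_interface("static_image")
```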
  • the multimedia processing chip 200 can be applied to an electronic device such as a smart phone, a tablet computer, and the like.
  • the electronic device 20 may include a camera 600, a multimedia processing chip 200, and an application processing chip 400.
  • the camera 600 can collect image data.
  • the camera 600 may be a front camera or a rear camera.
  • the camera 600 may include an image sensor and a lens, and the image sensor may be a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor, a Charge Coupled Device (CCD) image sensor, or the like.
  • the camera 600 may be electrically connected to the multimedia processing chip 200, for example, the camera 600 is connected to the first interface 201 of the multimedia processing chip 200.
  • the camera 600 can collect raw image data such as RAW image data, and transmit it to the multimedia processing chip 200 through the first interface 201, for processing by the image processors inside the multimedia processing chip 200, such as the image signal processor 210 and the neural network processor 220 .
  • the multimedia processing chip 200 is any multimedia processing chip 200 as described above, and will not be repeated here.
  • the application processing chip 400 can realize the control of various functions of the electronic device 20.
  • the application processing chip 400 can control the camera 600 of the electronic device 20 to collect images, and the application processing chip 400 can also control the multimedia processing chip 200 to process the images collected by the camera 600.
  • the application processing chip 400 can also process image data.
  • the image data collected by the camera 600 can be transmitted to the interface of the multimedia processing chip 200.
  • the multimedia processing chip 200 can preprocess the image data
  • the application processing chip 400 can postprocess the image data.
  • the processing of image data between the multimedia processing chip 200 and the application processing chip 400 may be differentiated processing, or the same processing may exist.
  • the application processing chip 400 of the electronic device 20 may include an application processor 410, an image signal processor 420, a memory 430, a system bus 440, and a fourth interface 401.
  • the application processor 410 may serve as the control center of the electronic device 20. It can also execute some algorithms to process image data.
  • the memory 430 may store various data such as image data, system data, and so on.
  • the memory 430 may be built into the application processor 410, or may be external to the application processor 410.
  • the fourth interface 401 may be a mobile industry processor interface, and the fourth interface 401 is electrically connected to the second interface 202, and can receive data processed by the multimedia processing chip 200.
  • the image signal processor 420 can process image data.
  • the application processor 410 and the image signal processor 420 may jointly perform post-processing on the preprocessed image data. It can be understood that the application processor 410 and the image signal processor 420 may also process the image data collected by the camera 600 together.
  • the result of image data processing by the image signal processor 210 may be transmitted to the memory 430 of the application processing chip 400 through the connection of the second interface 202 and the fourth interface 401.
  • the result of image data processing by the neural network processor 220 may be transmitted to the memory 430 of the application processing chip 400 through the connection of the second interface 202 and the fourth interface 401.
  • the result of the image data processing by the digital signal processor 270 may be transmitted to the memory 430 of the application processing chip 400 through the connection of the second interface 202 and the fourth interface 401.
  • the data received by the multimedia processing chip 200 can be directly transmitted from the first interface 201 to the second interface 202, and then transferred to the memory 430 through the connection between the second interface 202 and the fourth interface 401.
  • the manner in which the multimedia processing chip 200 transmits data to the application processing chip 400 is not limited to this.
  • the multimedia processing chip 200 in the electronic device 20 may refer to FIG. 23.
  • the application processing chip 400 may also include a fifth interface 402.
  • the fifth interface 402 may be referred to as an interconnection bus interface.
  • the interconnection bus interface can also be referred to as a high-speed peripheral component interconnection interface or an external device interconnection bus interface, and is a high-speed serial computer expansion bus standard interface.
  • the fifth interface 402 may also be a low-speed interconnection bus interface.
  • the fifth interface 402 and the third interface 203 are connected.
  • the fifth interface 402 and the third interface 203 are of the same type, for example, the fifth interface 402 and the third interface 203 are both high-speed interconnection bus interfaces.
  • the multimedia processing chip 200 can transmit some image data such as static image data and preview image data to the memory 430 through the connection of the third interface 203 and the fifth interface 402.
  • the multimedia processing chip 200 may also transmit some data, such as data of photo albums and video playback, to the memory 430 through the connection of the third interface 203 and the fifth interface 402.
  • When the multimedia processing chip 200 transmits the processed image data to the application processing chip 400, the application processing chip 400 performs post-processing on the data processed by the multimedia processing chip 200, stores the post-processed data, and displays it on the display screen.
  • the multimedia processing chip 200 acquires dynamic image data such as video recorded image data.
  • the multimedia processing chip 200 performs statistics on the status information of the dynamic image data according to the acquired dynamic image data, and the multimedia processing chip 200 preprocesses the dynamic image data.
  • the multimedia processing chip 200 can send the statistical status information and the preprocessed dynamic image data to the application processing chip 400.
  • the application processing chip 400 performs post-processing on the pre-processed dynamic image data based on the state information. Thereby, the quality of the image can be improved.
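The division of labour above (multimedia chip: statistics plus preprocessing; application chip: statistics-driven post-processing) can be sketched with placeholder callables. None of these stand for the chips' actual algorithms:

```python
def process_dynamic_image(frame, chip_stats, chip_preprocess, ap_postprocess):
    """The multimedia chip gathers state information and preprocesses the
    frame; the application chip post-processes using that state information."""
    state_info = chip_stats(frame)          # e.g. exposure statistics
    pre = chip_preprocess(frame)            # e.g. NPU noise reduction
    return ap_postprocess(pre, state_info)  # e.g. 3A-driven correction

out = process_dynamic_image(
    [1, 2, 3],
    lambda f: {"mean": sum(f) / len(f)},
    lambda f: [x * 2 for x in f],
    lambda f, s: [x + int(s["mean"]) for x in f],
)
```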
  • the application processor 410 receives the third instruction for starting the camera 600.
  • the application processor 410 starts the camera 600 based on the third instruction for starting the camera 600.
  • the camera 600 may be configured by the application processor 410 to realize the activation of the camera 600.
  • the camera 600 collects image data and transmits the image data to the first interface 201 of the multimedia processing chip 200.
  • the statistical module 212 of the image signal processor 210 performs statistical processing on the image data to calculate the state information, and transmits the statistical state information to the fourth interface 401 through the second interface 202.
  • the optimization module 214 of the image signal processor 210 performs optimization processing on the image data, such as linearization processing, dead pixel compensation, and black level correction, and transmits the optimized data to the neural network processor 220. It is understandable that the optimization module 214 can directly transmit the optimized data to the neural network processor 220, or store it in the memory 230, from which the neural network processor 220 can obtain it.
  • the neural network processor 220 processes the data optimized by the optimization module 214, such as by neural network algorithm processing, and transmits the processed data to the memory 230.
  • the multimedia processing chip 200 transmits the processed data to the fourth interface 401 through the second interface 202.
  • the application processor 410 and the image signal processor 420 perform post-processing, such as 3A processing, on the data processed by the neural network processor 220 based on the state information.
  • the image signal processor of the application processing chip needs to perform statistics on the state information of the image data, and the application processor of the application processing chip executes some algorithms based on the state information to calculate some parameters, such as focus parameters, exposure parameters, white balance parameters, lens shading correction parameters, etc.
  • the application processor can configure the camera, and the image signal processor can perform correction processing on the image data.
  • the entire processing process is executed by the application processing chip, resulting in high power consumption of the application processing chip.
  • the application processing chip often needs to control various other functions, which may affect the performance of the application processing chip during the entire image processing process.
  • part of the processing of the image data is handed over to the multimedia processing chip 200, and another part of the processing is handed over to the application processing chip 400, so that on the basis of saving power consumption of the application processing chip 400, the image quality can also be improved.
  • the image signal processor 210 can send the statistical status information to the application processing chip 400, and the application processor 410 executes some algorithms to calculate some parameters based on the status information, such as focus parameters, exposure parameters, white balance parameters, lens shading correction parameters, etc.
  • the application processor 410 may configure the camera 600, and the image signal processor 420 may perform correction processing on the image data.
  • the parameters can also be calculated without the application processor 410 executing the algorithms.
  • the main control processor 260 executes some algorithms and calculates some parameters based on the status information. Then, the parameters are transmitted to the application processing chip 400, and the application processor 410 can configure the camera 600, and the image signal processor 420 can perform correction processing on the image data.
  • the image signal processor 420 can also execute some algorithms to calculate some parameters based on the status information. Based on the parameters, the application processor 410 can configure the camera 600, and the image signal processor 420 performs correction processing on the image data.
  • the algorithms executed by the application processor 410 and the main control processor 260 can be updated, but the algorithms executed by the image signal processor 420 often cannot be updated. Therefore, in actual application, the application processor 410 or the main control processor 260 executes the related algorithms to perform calculations based on the status information.
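As one concrete example of computing a parameter from statistics, here is a gray-world white balance calculation. The channel means stand in for the statistics the image signal processor might report; the actual algorithms run on the application processor or main control processor and are not specified here:

```python
def white_balance_gains(r_mean, g_mean, b_mean):
    """Gray-world white balance: scale R and B so their means match G."""
    return {"r": g_mean / r_mean, "g": 1.0, "b": g_mean / b_mean}

# Hypothetical channel means from the statistics module.
gains = white_balance_gains(r_mean=80.0, g_mean=120.0, b_mean=60.0)
```

The resulting gains would then be configured to the correction stage, analogous to configuring focus or exposure parameters to the camera.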
  • the state information may include auto-focus state information
  • the application processor 410 may execute related algorithms, calculate a focus parameter based on the auto-focus state information, and configure the focus parameter to the camera 600.
  • the camera 600 can focus based on the focus parameter.
  • the main control processor 260 may also execute related algorithms to calculate the focus parameters based on the auto-focus state information, and then configure the focus parameters to the camera 600 or send them to the application processing chip 400, where the application processor 410 configures the focus parameters to the camera 600.
  • the image signal processor 420 may execute related algorithms to calculate the focus parameters.
  • the auto focus state information may include one or more of phase focus state information, contrast focus state information, laser focus state information, and TOF (Time of Flight) focus state information.
  • the auto focus state information may include contrast focus state information, and the image data such as dynamic image data may be processed by a preset algorithm through the image signal processor 210 to calculate the contrast focus state information.
  • the application processor 410 may execute related algorithms, calculate a contrast focus parameter based on the contrast focus state information, and configure the contrast focus parameter to the camera 600.
  • the camera 600 can perform focusing based on the contrast focusing parameter.
  • the main control processor 260 may also execute the relevant algorithm, calculate the contrast focus parameter based on the contrast focus state information, and then configure the contrast focus parameter to the camera 600 or send it to the application processing chip 400, where the application processor 410 configures the contrast focus parameter to the camera 600.
  • It is also possible for the image signal processor 420 to execute related algorithms to calculate the contrast focus parameters.
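The essence of contrast-detection autofocus is searching for the lens position with the highest contrast score. The sketch below illustrates that search; `contrast_at` stands in for the per-position contrast statistics, and the toy curve is hypothetical:

```python
def contrast_autofocus(contrast_at, positions):
    """Pick the lens position whose frame has the highest contrast score."""
    return max(positions, key=contrast_at)

# Toy contrast curve that peaks at lens position 3.
best_pos = contrast_autofocus(lambda p: -(p - 3) ** 2, range(7))
```

A real implementation would step the lens and re-measure rather than evaluate a known curve, but the selection criterion is the same.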
  • the auto-focusing state information may also include phase focus state information, and the phase focus state information may be extracted from image data such as dynamic image data through the image signal processor 210, for example by marking and distinguishing the image data.
  • the application processor 410 may execute related algorithms, calculate a phase focus parameter based on the phase focus state information, and configure the phase focus parameter to the camera 600.
  • the camera 600 can perform focusing based on the phase focusing parameter.
  • the main control processor 260 may also execute related algorithms, calculate the phase focus parameters based on the phase focus state information, and then configure the phase focus parameters to the camera 600 or send them to the application processing chip 400, in which case the application processor 410 configures the phase focus parameters to the camera 600.
  • it is also possible for the image signal processor 420 to execute related algorithms to calculate the phase focus parameters.
  • the auto focus state information may also include laser focus state information, and the image data such as dynamic image data may be processed by a preset algorithm through the image signal processor 210 to calculate the laser focus state information.
  • the application processor 410 may execute related algorithms, calculate laser focus parameters based on the laser focus state information, and configure the laser focus parameters to the camera 600.
  • the camera 600 can focus based on the laser focus parameter.
  • the main control processor 260 may also execute related algorithms, calculate the laser focus parameters based on the laser focus state information, and then configure the laser focus parameters to the camera 600 or send them to the application processing chip 400, in which case the application processor 410 configures the laser focus parameters to the camera 600.
  • the image signal processor 420 may execute related algorithms to calculate the laser focus parameters.
  • the auto focus state information may also include TOF focus state information, and the image data such as dynamic image data may be processed by a preset algorithm through the image signal processor 210 to calculate the TOF focus state information.
  • the application processor 410 may execute related algorithms, calculate TOF focus parameters based on the TOF focus state information, and configure the TOF focus parameters to the camera 600.
  • the camera 600 can focus based on the TOF focus parameter.
  • the main control processor 260 may also execute related algorithms, calculate TOF focus parameters based on the TOF focus state information, and then configure the TOF focus parameters to the camera 600 or send them to the application processing chip 400, in which case the application processor 410 configures the TOF focus parameters to the camera 600.
  • the image signal processor 420 may execute related algorithms to calculate the TOF focus parameters.
  • the state information may also include automatic white balance state information.
  • the application processor 410 may execute related algorithms and calculate white balance parameters based on the automatic white balance state information.
  • the image signal processor 420 may, based on the white balance parameters, perform white balance processing, that is, image correction processing, on the image data preprocessed by the multimedia processing chip 200.
  • the main control processor 260 may also execute related algorithms, calculate white balance parameters based on the automatic white balance state information, and then send the white balance parameters to the application processing chip 400, and the image signal processor 420 performs white balance processing on the image data preprocessed by the multimedia processing chip 200 based on the white balance parameters.
  • the state information may also include automatic exposure state information, and the application processor 410 may execute related algorithms, calculate exposure parameters based on the automatic exposure state information, and configure the exposure parameters to the camera 600.
  • the camera 600 can perform exposure based on the exposure parameter.
  • the main control processor 260 may also execute related algorithms to calculate the exposure parameters based on the automatic exposure state information, and then configure the exposure parameters to the camera 600 or send them to the application processing chip 400, in which case the application processor 410 configures the exposure parameters to the camera 600.
  • the image signal processor 420 may execute related algorithms to calculate the exposure parameters.
  • the image signal processor 420 can perform compensation processing on the exposure parameters, the application processor 410 can then configure the compensated exposure parameters to the camera 600, and the camera 600 performs exposure based on the compensated exposure parameters.
  • the status information also includes lens shading correction status information.
  • the application processor 410 can execute related algorithms and calculate lens shading correction parameters based on the lens shading correction status information.
  • the image signal processor 420 can perform lens shading correction on the image data preprocessed by the multimedia processing chip 200 based on the lens shading correction parameters.
  • the main control processor 260 may also execute related algorithms, calculate the lens shading correction parameters based on the lens shading correction state information, and then send the lens shading correction parameters to the application processing chip 400, and the image signal processor 420 performs lens shading correction on the image data preprocessed by the multimedia processing chip 200 based on the lens shading correction parameters.
  • it is also possible for the image signal processor 420 to execute related algorithms to calculate the lens shading correction parameters.
  • the image signal processor 210 collecting statistics on the state information of the image data can be understood as: using an algorithm to calculate some state information and/or using an extraction method to extract some state information.
  • the image signal processor 420 does not need to repeat the processing already performed by the optimization module 214; for example, if the optimization module 214 performs dead pixel compensation, linearization processing, and black level correction on the image data, the image signal processor 420 does not need to perform dead pixel compensation, linearization processing, or black level correction.
  • the multimedia processing chip 200 and the application processing chip 400 of the embodiment of the present application perform differential processing on image data.
  • the power consumption of the application processing chip 400 can be saved.
  • the application processing chip 400 and the multimedia processing chip 200 may also perform part of the same processing on the image data.
  • the multimedia processing chip 200 performs noise reduction processing on the dynamic image data, and the application processing chip 400 also performs noise reduction processing on the dynamic image data.
  • the multimedia processing chip 200 performs statistical processing on the dynamic image data, and the application processing chip 400 also performs statistical processing on the dynamic image data.
  • the image signal processor 420 sends the processed data to the display screen and the memory 430 to display and store the image.
  • the image can be encoded by an encoder before being stored, and stored after the encoding is completed; if the image is a static image, it can first be compressed, for example by JPEG compression, and then stored once compression is complete.
  • the image data processed by the multimedia processing chip 200 can be RAW image data
  • the application processing chip 400 can process the RAW image data, such as 3A processing, and can also convert the RAW format to the YUV format and further process the YUV-format image.
  • the image signal processor 420 performs RGBToYUV processing on the image in the YUV format.
  • the main control processor 260 may first determine whether the bit width of the data processed by the neural network processor 220 is the same as the bit width of the data to be processed by the application processing chip 400. If they are the same, the image signal processor 210 transmits the data processed by the neural network processor 220 to the fourth interface 401 through the second interface 202. If they are not the same, the optimization module 214 of the image signal processor 210 performs bit width adjustment processing on the data processed by the neural network processor 220, so that the bit width of the adjusted data is the same as the bit width of the data to be processed by the application processing chip 400. This ensures that the application processing chip 400 can normally process the data transmitted by the multimedia processing chip 200.
  • when the multimedia processing chip 200 processes the image data, the original image may not be optimized by the optimization module 214; instead, the neural network processor 220 may directly perform the processing.
  • the method by which the application processing chip 400 processes image data includes:
  • the fourth interface 401 of the application processing chip 400 receives the status information of the dynamic image data collected by the statistics module 212.
  • the fourth interface 401 of the application processing chip 400 receives the result of the neural network processor 220 performing neural network algorithm processing on the dynamic image data.
  • the optimization module 214 may perform optimization processing on the dynamic image data.
  • the application processing chip 400 performs secondary processing on the result of processing the dynamic image data by the neural network processor 220 based on the state information.
  • the optimization module 214 may perform bit width adjustment processing on the data processed by the neural network processor 220.
  • the application processing chip 400 performs postprocessing on the image data to improve the image quality.
  • the first diagram of FIG. 30 shows a frame of image displayed when the multimedia processing chip 200 and the application processing chip 400 of the embodiment of the present application jointly process the image, which includes HDR algorithm processing performed on the image data by the neural network processor 220 in the embodiment of the present application.
  • the second diagram of FIG. 30 shows a frame of image displayed when only the application processing chip processes the image alone. From the comparison between the first image and the second image, it can be seen that the two frames of images differ in multiple aspects and multiple regions.
  • the brightness around the character in the second image is too bright, and objects close to the character are displayed too clearly.
  • the sharpness of the object in the second area B is greater than the sharpness of the second area A, resulting in the character not being prominent enough.
  • the details of the surrounding area of the second image, such as the first area B, are not as good as the details of the first area A.
  • the third diagram of FIG. 31 shows a frame of image displayed by the multimedia processing chip 200 and the application processing chip 400 of the embodiment of the present application when the image is processed together.
  • the image signal is processed by the video night scene algorithm.
  • the fourth diagram of FIG. 31 shows a frame of image displayed when only the application processing chip processes the image alone. It can be seen from the comparison of the third image and the fourth image that the two images differ in multiple regions.
  • the third area A in the third figure is clearer than the third area B in the fourth figure.
  • the fourth area A in the third image shows more details than the fourth area B in the fourth image.
  • the camera 600, the multimedia processing chip 200, and the application processing chip 400 defined in the embodiment of the present application can be installed together; for example, the camera 600, the multimedia processing chip 200, and the application processing chip 400 are mounted on one circuit board.
  • the circuit board 22 is mounted with the camera 600, the multimedia processing chip 200, and the application processing chip 400.
  • the camera 600, the multimedia processing chip 200, and the application processing chip 400 are all connected by signal lines to realize signal transmission.
  • the circuit board 22 may also be equipped with other components, which are not enumerated here.
  • the camera 600 can also be mounted on the same circuit board as the multimedia processing chip 200 and the application processing chip 400.
  • the camera 600 is separately mounted on a circuit board, and the multimedia processing chip 200 and the application processing chip 400 are mounted on the same circuit board.
  • the camera 600 and the multimedia processing chip 200 are connected through a signal line.
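The contrast focusing described in the list above searches for the lens position that maximizes image contrast. The sketch below is an illustrative assumption, not the patent's algorithm: `focus_measure` and `best_focus_position` are hypothetical names, and the gradient-energy measure is one common choice of contrast statistic.

```python
# Illustrative sketch (not from the patent) of contrast focus state
# information: a contrast measure per frame, and selection of the lens
# position whose frame maximizes it.

def focus_measure(gray):
    """Sum of squared horizontal and vertical differences: higher = sharper."""
    h = sum((row[x + 1] - row[x]) ** 2
            for row in gray for x in range(len(row) - 1))
    v = sum((gray[y + 1][x] - gray[y][x]) ** 2
            for y in range(len(gray) - 1) for x in range(len(gray[0])))
    return h + v

def best_focus_position(frames_by_position):
    """Pick the lens position whose frame has the highest contrast measure."""
    return max(frames_by_position,
               key=lambda pos: focus_measure(frames_by_position[pos]))

sharp = [[0, 9, 0], [9, 0, 9], [0, 9, 0]]   # high local contrast
blurry = [[4, 5, 4], [5, 4, 5], [4, 5, 4]]  # low local contrast
print(best_focus_position({10: blurry, 20: sharp}))  # → 20
```

In the patent's terms, the measure corresponds to the contrast focus state information and the selected position to the contrast focus parameter configured to the camera 600.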

Abstract

Embodiments of the present application provide a multimedia processing chip, an electronic device, and a dynamic image processing method. The multimedia processing chip includes an image signal processor and a neural network processor. The image signal processor collects statistics on state information of image data; the neural network processor performs neural network algorithm processing on the image data; the multimedia processing chip preprocesses the image data at least through the neural network processor, and sends the state information and the preprocessed image data to an application processing chip.

Description

Multimedia processing chip, electronic device, and dynamic image processing method
This application claims priority to Chinese Patent Application No. 202010478366.5, entitled "Multimedia Processing Chip, Electronic Device, and Dynamic Image Processing Method", filed with the China National Intellectual Property Administration on May 29, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of image processing technology, and in particular to a multimedia processing chip, an electronic device, and a dynamic image processing method.
Background
Various multimedia devices capable of video shooting and photographing (such as digital cameras, smartphones, and tablet computers) generally include an image sensor for acquiring images, a multimedia processing chip capable of image processing, and an Application Processor (AP). The image sensor may be connected to the multimedia processing chip through a MIPI (Mobile Industry Processor Interface) line, and the multimedia processing chip may be connected to the AP through a MIPI line.
The image sensor may include a Complementary Metal-Oxide-Semiconductor (CMOS) image sensor, a Charge Coupled Device (CCD) image sensor, or the like. The multimedia processing chip generally uses an Image Signal Processor (ISP) to process the images acquired by the image sensor, obtains a processing result after the image processing is completed, and transmits the processing result to the AP. However, the image processing capability of multimedia processing chips in the related art is limited.
Summary
Embodiments of the present application provide a multimedia processing chip, an electronic device, and a dynamic image processing method, which can improve the image processing capability of the multimedia processing chip.
An embodiment of the present application discloses a multimedia processing chip, including:
an image signal processor configured to collect statistics on state information of image data; and
a neural network processor configured to perform neural network algorithm processing on the image data;
wherein the multimedia processing chip is configured to preprocess the image data at least through the neural network processor, and to send the state information and the preprocessed image data to an application processing chip.
An embodiment of the present application discloses an electronic device, including:
a multimedia processing chip as described above; and
an application processing chip configured to obtain the result of the preprocessing and the collected state information from the multimedia processing chip, wherein the application processing chip performs post-processing on the result of the preprocessing based on the state information.
An embodiment of the present application discloses a dynamic image processing method, including:
acquiring dynamic image data;
according to the dynamic image data, collecting statistics on state information of the dynamic image data through a multimedia processing chip, and preprocessing the dynamic image data;
sending the state information collected by the multimedia processing chip and the preprocessed dynamic image data to an application processing chip; and
performing post-processing on the preprocessed dynamic image data through the application processing chip based on the state information.
Brief Description of the Drawings
FIG. 1 is a first schematic structural diagram of an image processing apparatus provided by an embodiment of the present application.
FIG. 2 is a second schematic structural diagram of the image processing apparatus provided by an embodiment of the present application.
FIG. 3 is a schematic diagram of a first application scenario of the image processing apparatus shown in FIG. 1.
FIG. 4 is a schematic diagram of a second application scenario of the image processing apparatus shown in FIG. 1.
FIG. 5 is a schematic flowchart of a method for processing video images by the image processing apparatus provided by an embodiment of the present application.
FIG. 6 is a first schematic structural diagram of a multimedia processing chip provided by an embodiment of the present application.
FIG. 7 is a second schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 8 is a third schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 9 is a fourth schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 10 is a schematic diagram of a first data flow of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 11 is a schematic diagram of a first method of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 12 is a schematic diagram of a second data flow of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 13 is a schematic diagram of a second method of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 14 is a schematic diagram of a third data flow of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 15 is a schematic diagram of a third method of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 16 is a schematic diagram of a fourth data flow of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 17 is a schematic diagram of a fourth method of the multimedia processing chip processing image data according to an embodiment of the present application.
FIG. 18 is a fifth schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 19 is a schematic flowchart of an offline static image processing method provided by an embodiment of the present application.
FIG. 20 is a schematic flowchart of a method for editing a RAW image using the multimedia processing chip according to an embodiment of the present application.
FIG. 21 is a schematic flowchart of an offline dynamic image processing method provided by an embodiment of the present application.
FIG. 22 is a schematic flowchart of a method for processing image data of video playback using the multimedia processing chip according to an embodiment of the present application.
FIG. 23 is a sixth schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 24 is a seventh schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 25 is an eighth schematic structural diagram of the multimedia processing chip provided by an embodiment of the present application.
FIG. 26 is a first schematic structural diagram of an electronic device provided by an embodiment of the present application.
FIG. 27 is a second schematic structural diagram of the electronic device provided by an embodiment of the present application.
FIG. 28 is a third schematic structural diagram of the electronic device provided by an embodiment of the present application.
FIG. 29 is a schematic flowchart of the image signal processor in the application processing chip processing image data according to an embodiment of the present application.
FIG. 30 is a comparison diagram between an embodiment of the present application and the related art.
FIG. 31 is a comparison diagram between an embodiment of the present application and the related art.
FIG. 32 is a first schematic structural diagram of a circuit board provided by an embodiment of the present application.
FIG. 33 is a second schematic structural diagram of the circuit board provided by an embodiment of the present application.
Detailed Description
Embodiments of the present application provide a multimedia processing chip, an electronic device, and a dynamic image processing method. The multimedia processing chip may be integrated into a circuit board, such as a main board, and applied to an electronic device to process images and improve image quality.
In the embodiments of the present application, the multimedia processing chip includes:
an image signal processor configured to collect statistics on state information of image data; and
a neural network processor configured to perform neural network algorithm processing on the image data;
wherein the multimedia processing chip is configured to preprocess the image data at least through the neural network processor, and to send the state information and the preprocessed image data to an application processing chip.
In the embodiments of the present application, before the application processing chip processes image data such as dynamic image data, the state information may first be obtained by statistics from the image data such as the dynamic image data; when the application processing chip processes the image data, it can process the image data based on this state information, thereby improving the processing capability. Meanwhile, in the embodiments of the present application, the multimedia processing chip first processes the image data and the application processing chip then further processes the image, which can save power consumption of the application processing chip.
In an optional embodiment of the present application, the image signal processor is further configured to perform first preprocessing on the image data; the neural network processor is further configured to perform second preprocessing on the image data after the first preprocessing.
In an optional embodiment of the present application, the image signal processor is further configured to perform third preprocessing on the image data after the second preprocessing, and the third preprocessing includes performing bit width adjustment processing on the image data, so that the bit width of the adjusted image data is the same as the bit width of the image data processed by the application processing chip.
In an optional embodiment of the present application, the first preprocessing performed by the image signal processor on the image data includes at least one of dead pixel compensation, linearization processing, and black level correction.
In an optional embodiment of the present application, the first preprocessing performed by the image signal processor on the image data further includes image cropping processing and/or image downscaling processing.
In an optional embodiment of the present application, the image signal processor is further configured to perform bit width adjustment processing on the image data after the neural network algorithm processing, so that the bit width of the adjusted image data is the same as the bit width of the image data processed by the application processing chip.
In an optional embodiment of the present application, the image data includes dynamic image data, the multimedia processing chip is configured to process dynamic image data, and the neural network algorithms used by the neural network processor to process the dynamic image data include at least one of a night scene algorithm, an HDR algorithm, a blurring algorithm, a noise reduction algorithm, a super-resolution algorithm, and a semantic segmentation algorithm.
In an optional embodiment of the present application, the multimedia processing chip is configured to process the dynamic image data in real time and to transmit the processed dynamic image data to the application processing chip in real time.
In an optional embodiment of the present application, the image data includes static image data, the multimedia processing chip is configured to process static image data, and the neural network algorithms used by the neural network processor to process the static image data include at least one of a night scene algorithm, an HDR algorithm, a blurring algorithm, a noise reduction algorithm, a super-resolution algorithm, and a semantic segmentation algorithm.
In an optional embodiment of the present application, the multimedia processing chip is further configured to process static image data and/or dynamic image data offline.
In an optional embodiment of the present application, the image data is RAW image data, and the multimedia processing chip is configured to process the RAW image data.
In an optional embodiment of the present application, the state information includes at least one of automatic exposure state information, automatic white balance state information, and automatic focus state information.
In an optional embodiment of the present application, the state information further includes lens shading correction state information.
An embodiment of the present application further provides an electronic device, including:
a multimedia processing chip according to any one of claims 1 to 13; and
an application processing chip configured to obtain the result of the preprocessing and the collected state information from the multimedia processing chip, wherein the application processing chip performs post-processing on the result of the preprocessing based on the state information.
In an optional embodiment of the present application, the state information includes at least one of automatic focus state information, automatic white balance state information, and automatic exposure state information, and the application processing chip is configured to:
calculate focus parameters based on the automatic focus state information and configure the focus parameters to a camera of the electronic device;
calculate white balance parameters based on the automatic white balance state information and perform white balance processing on the result of the preprocessing based on the white balance parameters; and
calculate exposure parameters based on the automatic exposure state information and configure the exposure parameters to the camera of the electronic device, or compensate the exposure parameters and then configure them to the camera of the electronic device.
In an optional embodiment of the present application, the automatic focus state information includes phase focus state information and contrast focus state information, and the image signal processor of the multimedia processing chip is configured to:
process the image data with a preset algorithm to obtain the contrast focus state information; and
extract the phase focus state information from the image data;
the application processing chip is further configured to:
calculate a contrast focus parameter based on the contrast focus state information and configure the contrast focus parameter to the camera of the electronic device; and
calculate a phase focus parameter based on the phase focus state information and configure the phase focus parameter to the camera of the electronic device.
In an optional embodiment of the present application, the state information further includes lens shading correction state information, and the application processing chip is further configured to:
calculate lens shading correction parameters based on the lens shading correction state information, and perform lens shading correction on the result of the preprocessing based on the lens shading correction parameters.
An embodiment of the present application further provides a dynamic image processing method, including:
acquiring dynamic image data;
according to the dynamic image data, collecting statistics on state information of the dynamic image data through a multimedia processing chip, and preprocessing the dynamic image data;
sending the state information collected by the multimedia processing chip and the preprocessed dynamic image data to an application processing chip; and
performing post-processing on the preprocessed dynamic image data through the application processing chip based on the state information.
In an optional embodiment of the present application, preprocessing the dynamic image data through the multimedia processing chip includes:
performing optimization processing on the dynamic image data; and
performing neural network algorithm processing on the optimized dynamic image data.
In an optional embodiment of the present application, the state information includes at least one of automatic focus state information, automatic white balance state information, and automatic exposure state information, and performing post-processing on the preprocessed dynamic image data through the application processing chip based on the state information includes:
calculating focus parameters based on the automatic focus state information and configuring the focus parameters to a camera;
calculating white balance parameters based on the automatic white balance state information and performing white balance processing on the result of the preprocessing based on the white balance parameters; and
calculating exposure parameters based on the automatic exposure state information and configuring the exposure parameters to the camera, or compensating the exposure parameters and then configuring them to the camera.
For example, referring to FIG. 1, the image processing apparatus 110 may process the data it acquires, such as RAW data, so that another image processor can further process the image data to improve image quality.
The image processing apparatus 110 may process static image data, such as static image data acquired by a user in a photographing mode. The image processing apparatus 110 may also process dynamic image data, such as dynamic image data acquired by a user in a preview mode or a video recording mode.
It can be understood that both static image data and dynamic image data can be processed by a processor on the platform side (a System-on-a-Chip, or SoC chip). The platform side can be understood as the application processing chip, and the processor on the platform side can be understood as an Image Signal Processor (ISP) and an Application Processor (AP). However, the platform side often has limited image data processing capability. As users demand ever higher image quality, processing image data only on the platform side often cannot meet user needs.
In order to improve image quality, which can be understood as the quality of the image when displayed, some embodiments of the present application may provide an image pre-processor (pre-ISP), such as a Neural-network Processing Unit (NPU), to preprocess the image first and transmit the preprocessing result to the platform side. The platform side takes the processing result of the pre-ISP as input data and performs post-processing, thereby improving image quality.
During actual research and development of the present application, it was found that for static image data, when the pre-ISP first preprocesses the static image data, the preprocessing operation generally does not destroy the state information of the static image data. After the pre-ISP preprocesses the static image data, the data can be directly transmitted to the platform side, and the platform side can directly post-process the static image data processed by the pre-ISP.
The state information can be understood as the information required by the platform side to post-process the image data; that is, based on this state information, the platform side can post-process the preprocessing result of the image data.
The state information may include Automatic White Balance (AWB) state information, Automatic Exposure (AE) state information, and Automatic Focus (AF) state information, which may be referred to as 3A state information for short. The state information can also be understood as state data. It should be noted that the state information is not limited to this; for example, the state information may further include Lens Shading Correction (LSC) state information. The automatic white balance state information can be understood as the state information required for white balance processing, the automatic exposure state information as the state information required for exposure, the automatic focus state information as the state information required for focusing, and the lens shading correction state information as the state information required for lens shading correction.
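The kind of 3A state information described here can be illustrated with a minimal sketch. This is an assumption for illustration only: the real statistics module operates on RAW Bayer data in hardware, while `collect_3a_stats` and `awb_gains` below are hypothetical names working on a simplified RGB frame, using gray-world means for AWB and mean luma for AE.

```python
# Hypothetical illustration of 3A state statistics: per-channel averages
# for AWB and mean luminance for AE, computed from an RGB frame.

def collect_3a_stats(pixels):
    """pixels: list of (r, g, b) tuples; returns AWB and AE state info."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) / n
    g = sum(p[1] for p in pixels) / n
    b = sum(p[2] for p in pixels) / n
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # BT.601 luma weighting
    return {"awb_means": (r, g, b), "ae_mean_luma": luma}

def awb_gains(stats):
    """Gray-world gains: scale R and B so their means match the G mean."""
    r, g, b = stats["awb_means"]
    return g / r, 1.0, g / b

stats = collect_3a_stats([(100, 200, 50), (100, 200, 50)])
print(awb_gains(stats))  # → (2.0, 1.0, 4.0)
```

In the patent's terms, the dictionary returned by `collect_3a_stats` plays the role of the state information sent to the platform side, which then derives processing parameters such as the white balance gains.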
However, for dynamic image data such as video image data, preprocessing by the pre-ISP such as the neural network processor destroys the state information of the dynamic image data, such as image color, image brightness, and the data required for focusing. Even if the pre-ISP transmits its preprocessing result for the dynamic image data to the platform side, because the preprocessing result obtained by the pre-ISP such as the neural network processor has destroyed the state information, the platform side cannot perform post-processing on the basis of the preprocessing performed by the pre-ISP.
Based on this, some embodiments of the present application may use the statistics module 112 of an image processing apparatus, such as the image processing apparatus 110 shown in FIG. 1, to collect statistics on the dynamic image data, so as to obtain the state information from the dynamic image data. After the pre-ISP such as the neural network processor preprocesses the dynamic image data, the platform side can post-process the preprocessing result based on the state information collected by the statistics module 112 of the image processing apparatus 110, so as to improve the dynamic image quality.
It can be understood, however, that for dynamic image data, whether a video is being played or recorded, stuttering and similar phenomena have a significant impact on the user. In order to keep the video images as continuous as possible, or in other words to reduce or even eliminate the stuttering of video images, some embodiments of the present application first perform optimization processing on the dynamic images during processing, so as to reduce or even eliminate the playback stuttering problem.
Based on this, some embodiments of the present application may use the optimization module 114 of an image processing apparatus, such as the image processing apparatus 110 shown in FIG. 1, to perform optimization processing on the dynamic image data, so as to solve problems such as dead pixels in the dynamic image data. The optimized data is then transmitted to the pre-ISP such as the neural network processor, which can speed up the convergence of the neural network processor and reduce the time the neural network processor takes to process one frame of image, thereby ensuring that the neural network processor can finish processing one frame of dynamic image data within a preset period of time, such as 33 ms (milliseconds).
The optimization processing performed by the optimization module 114 on the dynamic image data may include at least one of Bad Pixel Correction (BPC), linearization processing, and Black Level Correction (BLC). The algorithms used by the optimization module 114 to optimize the dynamic image data may include at least one of a black level correction algorithm, a dead pixel compensation algorithm, and a linearization algorithm. By executing the black level correction algorithm, the optimization module 114 can implement black level correction on the dynamic image data; by executing the dead pixel compensation algorithm, it can implement dead pixel compensation on the dynamic image data; and by executing the linearization algorithm, it can implement linearization processing on the dynamic image data.
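Two of the optimizations just listed can be sketched minimally. This is an assumption for illustration, not the patent's algorithm: black level correction subtracts a sensor pedestal, and dead pixel compensation replaces an outlier pixel with the average of its horizontal neighbours; the function names and thresholds are hypothetical.

```python
# Minimal sketch (assumed, not the patent's exact method) of black level
# correction and dead pixel compensation on one row of RAW samples.

def black_level_correction(row, black_level=64):
    """Subtract the sensor's black level pedestal, clamping at zero."""
    return [max(p - black_level, 0) for p in row]

def dead_pixel_compensation(row, threshold=200):
    """Replace a pixel that differs from both neighbours by > threshold."""
    out = list(row)
    for i in range(1, len(row) - 1):
        if (abs(row[i] - row[i - 1]) > threshold and
                abs(row[i] - row[i + 1]) > threshold):
            out[i] = (row[i - 1] + row[i + 1]) // 2
    return out

row = [80, 82, 1023, 84, 86]          # 1023 is a stuck ("dead") pixel
row = dead_pixel_compensation(row)    # → [80, 82, 83, 84, 86]
print(black_level_correction(row))    # → [16, 18, 19, 20, 22]
```

Cleaning such defects before the neural network pass is what the text credits with speeding up the network's convergence.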
It should be noted that the optimization processing performed by the optimization module 114 on the dynamic image data is not limited to this; for example, it may further include at least one of image cropping (Crop) processing and image downscaling (Bayerscaler) processing. The algorithms used by the optimization module 114 may include at least one of an image cropping algorithm and an image downscaling algorithm. By executing the image cropping algorithm, the optimization module 114 can crop the dynamic image; by executing the image downscaling algorithm, it can downscale the dynamic image.
In some embodiments of the present application, several different optimization modules may be used to execute different algorithms to achieve different optimization results. The optimization module may also be divided into several optimization sub-modules that execute different algorithms to achieve different optimization results.
Referring to FIG. 2, the optimization module 114 of the image processing apparatus 110 may include a plurality of optimization sub-modules, which may here be defined as optimization units. For example, the optimization module 114 includes a first optimization unit 1142 and a second optimization unit 1144. The first optimization unit 1142 may perform dead pixel compensation on the dynamic image data, and the second optimization unit 1144 may perform linearization processing on the dynamic image data. This ensures that the data optimized by the optimization module 114 speeds up the convergence of the pre-ISP such as the neural network processor, thereby ensuring that the pre-ISP such as the neural network processor can complete the processing of one frame of image within the preset period of time, so as to solve the problem of playback stuttering.
It can be understood that the optimization units of the optimization module 114 are not limited to the first optimization unit 1142 and the second optimization unit 1144. For example, the optimization module 114 may further include a third optimization unit that performs black level correction on the dynamic image data, a fourth optimization unit that performs image cropping processing on the dynamic image data, and a fifth optimization unit that performs image downscaling processing on the dynamic image data.
It should be noted that the number and functions of the optimization units of the optimization module 114 are not limited to this; the above are merely some examples given in some embodiments of the present application. Any functional sub-module through which the optimization module 114, after optimizing the dynamic image data, can speed up the convergence of the pre-ISP such as the neural network processor in processing the dynamic image data falls within the protection scope of the present application.
It should also be noted that the optimization processing performed by the optimization module 114 on the dynamic image data may also not be intended to speed up the convergence of the pre-ISP such as the neural network processor; the optimization processing of the dynamic image data by the optimization module 114 may be designed according to actual needs.
The image processing apparatus 110 provided by the embodiments of the present application may also collect statistics on static image data to obtain state information. The image processing apparatus 110 provided by the embodiments of the present application may also perform optimization processing on static image data to improve static image quality.
The above defines the embodiments of the present application from the perspective of the statistics module 112 and the optimization module 114. To further explain the data flow of the image processing apparatus 110 defined by the embodiments of the present application during data processing, the image processing apparatus 110 of the embodiments of the present application is described below in combination with other circuits.
Referring to FIG. 3, the image processing apparatus 110 may be connected to one or more cameras 120 to acquire from the camera 120 the image data, such as dynamic image data, collected by the camera 120. It can also be understood that the image processing apparatus 110 is connected to the camera 120 and can receive the dynamic image data sent by the camera 120 to the image processing apparatus 110. The dynamic image data can be divided into two paths: one path can be transmitted to the statistics module 112, and the other path can be transmitted to the optimization module 114.
It can be understood that the electrical connection between two devices defined in the embodiments of the present application means that the two devices are connected through signal lines such as wires, enabling signal transmission. Of course, it can also mean that the two devices are connected together, for example welded together through solder joints.
After receiving the dynamic image data, the statistics module 112 can collect statistics on some information in it based on the dynamic image data, which may here be defined as state information, such as 3A state information. When the statistics module 112 has completed the statistics, that is, when the statistics module 112 has obtained the state information, it can send the state information directly to the first image processor 130. The first image processor 130 can be understood as a platform-side processor such as an ISP and an AP.
After receiving the dynamic image data, the optimization module 114 can perform one or more optimization processes on the dynamic image data, such as dead pixel compensation and linearization processing. After the optimization module 114 completes the optimization processing of the dynamic image data, the optimized dynamic image data can be transmitted to the second image processor 140. The second image processor 140 can be understood as a pre-ISP, such as a neural network processor.
It should be noted that in the actual production process, the first image processor 130, the second image processor 140, and the image processing apparatus 110 need to be completed separately, which adds extra cost in the production stage. In the signal transmission and processing stage, some signals additionally travel from one device to another, which adds extra time and power consumption.
Based on this, in order to save cost, time, and power consumption, some other embodiments of the present application may integrate the second image processor 140, the statistics module 112, and the optimization module 114 on one device.
Referring to FIG. 4, the difference between FIG. 4 and FIG. 3 is that the second image processor 140, the statistics module 112, and the optimization module 114 are integrated on one device, which may for example be named the image processing apparatus 110. Structurally, integrating the statistics module 112, the optimization module 114, and the second image processor 140 together can save cost, speed up the rate of mutual data transmission, and save time and power consumption.
In summary, through the image processing apparatus 110, some embodiments of the present application can not only collect some state information by statistics, but also perform optimization processing on the dynamic image data. This can speed up the convergence of the pre-ISP such as the neural network processor, ensuring that it completes the processing of one frame of image within the preset period of time, and can also ensure that the first image processor 130 can perform post-processing, on the basis of the preprocessing performed by the pre-ISP, based on the state information collected by the statistics module 112, thereby improving dynamic image quality. After the first image processor 130 finishes processing the dynamic image data in RAW format, it can perform format conversion on it, such as converting RAW-format image data into YUV-format image data. The first image processor 130 can also process the YUV-format image data, such as RGBToYUV. The first image processor 130 can display the processed image data on a display screen and store it in a memory.
To further explain the data flow of the image processing apparatus 110 defined by the embodiments of the present application during data processing, the following describes it from the perspective of the method by which the image processing apparatus 110 processes data.
Referring to FIG. 5, the dynamic image processing method includes:
1001: The image processing apparatus 110 acquires dynamic image data. The image processing apparatus 110 may acquire the dynamic image data from the camera 120. The dynamic image data may be RAW data.
1002: The statistics module 112 of the image processing apparatus 110 obtains state information by statistics from the dynamic image data. The state information may include 3A state information.
1003: The image processing apparatus 110 sends the state information collected by the statistics module to the first image processor 130. The first image processor 130 can be understood as the platform-side AP and ISP, which can perform image processing based on the state information. For example, if the state information includes 3A state information, the first image processor 130 can perform 3A processing based on the 3A state information; 3A processing can be understood as processing performed based on the 3A state information.
1004: The optimization module 114 of the image processing apparatus 110 performs optimization processing on the dynamic image data. The optimization processing may include at least one of dead pixel compensation, linearization processing, and black level correction of the dynamic image data.
1005: The image processing apparatus 110 sends the optimized dynamic image data to the second image processor 140. The second image processor 140 can be understood as a neural network processor. Providing the optimized dynamic image data to the second image processor 140 enables the second image processor 140 to process one frame of dynamic image data within the preset period of time, or in other words speeds up its convergence on the dynamic image data. This allows the second image processor 140 to transmit its processed dynamic image data to the first image processor 130 in real time, so as to solve the problem of playback stuttering.
The neural network processor can perform neural network algorithm processing on the optimized dynamic image data and transmit the processed image data to the first image processor 130, and the first image processor 130 can perform post-processing, such as 3A processing, on the dynamic image data processed by the neural network algorithm based on the state information.
It should be noted that when the second image processor, such as the neural network processor, applies some algorithms to the dynamic image data, the bit width of the resulting data may become greater than the bit width of the image data processed by the first image processor. Based on this, after receiving the result of the second image processor such as the neural network processor processing the dynamic image data, the embodiments of the present application can perform bit width adjustment processing on the processing result of the second image processor through the optimization module 114, so that the adjusted data conforms to the bit width of the data processed by the first image processor. The dynamic image data after the bit width adjustment processing is then sent to the first image processor, so that the first image processor can further process the bit-width-adjusted data based on the reference data.
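Steps 1001 to 1005 above can be sketched as one pipeline. The objects below are toy stand-ins, not the hardware blocks themselves: `process_frame` and its callable parameters are hypothetical names corresponding to the statistics module 112, the optimization module 114, the second image processor 140, and the first image processor 130.

```python
# Illustrative pipeline for the method of FIG. 5: statistics and optimization
# run on the raw frame, the NPU preprocesses the optimized frame, and the
# platform-side processor post-processes using the collected state info.

def process_frame(raw_frame, stats_module, optimizer, npu, platform_isp):
    state_info = stats_module(raw_frame)           # 1002: collect state info
    optimized = optimizer(raw_frame)               # 1004: BPC/linearization/BLC
    preprocessed = npu(optimized)                  # neural network algorithm pass
    return platform_isp(preprocessed, state_info)  # platform-side post-processing

# Toy stand-ins for each stage:
result = process_frame(
    [1, 2, 3],
    stats_module=lambda f: {"ae_mean": sum(f) / len(f)},
    optimizer=lambda f: [p * 2 for p in f],
    npu=lambda f: [p + 1 for p in f],
    platform_isp=lambda f, s: (f, s),
)
print(result)  # → ([3, 5, 7], {'ae_mean': 2.0})
```

Note that the state information is computed from the raw frame before any optimization or neural network pass, which is the core point of the method: the preprocessing stages cannot corrupt it.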
The following provides a detailed description from the perspective of the statistics module, the optimization module, and the neural network processor being integrated together.
Referring to FIG. 6, a multimedia processing chip such as the multimedia processing chip 200 can process the image data it acquires, such as RAW data, to improve image quality. It should be noted that the multimedia processing chip 200 can transmit its processing result to the application processing chip, so that the application processing chip can post-process the image data for display or storage. The image data can also be understood as image information.
Compared with other image data such as YUV data, RAW data retains more details.
The multimedia processing chip 200 may include a Neural-network Processing Unit (NPU) 220. The neural network processor 220 can perform enhancement processing on the image data acquired by the multimedia processing chip 200; the neural network processor 220 can run an artificial intelligence training network image processing algorithm to enhance the image data. The neural network processor 220 processes image data efficiently and improves image quality significantly.
In some embodiments of the present application, the neural network processor 220 may be a dedicated processor for processing images, which may be referred to as a dedicated processor for short. It can be hardened during hardware configuration such as circuit layout and programming, which can ensure the stability of the neural network processor 220 when processing image data, and reduce the power consumption and time required for the neural network processor 220 to process image data. It can be understood that when the neural network processor 220 is a dedicated processor, its function is to process image data, and it cannot process some other data such as text information. It should be noted that in some other embodiments the neural network processor 220 may also process other data such as text information.
The neural network processor 220 may process image data by reading data blocks row by row and processing the data blocks row by row. For example, the neural network processor 220 reads data blocks in a multi-row manner and processes them in a multi-row manner. It can be understood that one frame of image may have multiple rows of data blocks; that is, the neural network processor 220 may process a part of one frame of image, such as 1/n of a frame, where n is a positive integer such as 2, 4, or 5. When the neural network processor 220 has not finished processing a whole frame of image, a built-in cache of the neural network processor 220 may store the data of the multiple rows of data blocks processed during the processing of the frame. When the neural network processor 220 has finished processing one frame of image, it can write the processed data to a memory, such as the memory 230 of the multimedia processing chip 200. The memory 230 may be built into the multimedia processing chip 200 or external to it. A memory controller may be used to implement the data transmission.
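The row-block (1/n frame) processing style just described can be sketched as follows. The block size and the buffering scheme are assumptions for illustration; `process_in_row_blocks` is a hypothetical name, and the list `cache` merely stands in for the NPU's built-in buffer.

```python
# Sketch of row-block processing: the frame is consumed a few rows at a
# time, partial results are buffered, and the assembled frame is written
# out only once all blocks of the frame are done.

def process_in_row_blocks(frame, rows_per_block, process_block):
    cache = []  # stands in for the NPU's built-in line buffer
    for start in range(0, len(frame), rows_per_block):
        block = frame[start:start + rows_per_block]
        cache.extend(process_block(block))
    return cache  # written to memory (e.g. memory 230) when the frame completes

frame = [[1, 1], [2, 2], [3, 3], [4, 4]]
out = process_in_row_blocks(
    frame, rows_per_block=2,
    process_block=lambda b: [[p * 10 for p in r] for r in b])
print(out)  # → [[10, 10], [20, 20], [30, 30], [40, 40]]
```

Processing a fraction of a frame at a time keeps the on-chip buffer small while still producing a complete frame for the downstream memory write.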
The neural network processor 220 can process RAW data. It can be understood that RAW data contains relatively complete information; by processing RAW data rather than YUV data, the neural network processor 220 can improve image quality in more details.
It should be noted that in the data stream, the neural network processor 220 can complete processing within a preset time, such as 30 fps = 33 ms (milliseconds). In other words, the preset time for the neural network processor 220 to process one frame of image is 33 ms, which ensures that, on the basis of fast image data processing, the neural network processor 220 can achieve real-time data transmission.
It can be understood that some neural network processors process images by loading one frame of image from a memory storing image data and performing the corresponding algorithm processing on that frame. During processing, the temporary data computed by the convolution layers of such a neural network processor often needs to be saved to that memory. It can thus be seen that, compared with such neural network processors, the neural network processor 220 defined in some embodiments of the present application is a dedicated neural network processor, which can speed up image data processing and ensure that the processing of one frame of image is completed within the preset time.
The neural network processor 220 can process dynamic image data, such as dynamic image data acquired by a user in a video recording mode. The neural network processor 220 may include algorithms for processing dynamic image data, such as a night scene algorithm, an HDR algorithm, a blurring algorithm, a noise reduction algorithm, and a super-resolution algorithm. The dynamic image data may include image data of recorded video, image data of video playback, and preview image data. In the embodiments of the present application, dynamic image data can be understood as video image data.
The neural network processor 220 can also process static image data, such as static image data acquired by a user in a photographing mode. The neural network processor 220 may include algorithms for processing static image data, such as an HDR algorithm, a night scene algorithm, a blurring algorithm, a noise reduction algorithm, a super-resolution algorithm, and a semantic segmentation algorithm. It should be noted that the static image data may also include the images displayed when an album application is opened.
The neural network processor 220 defined in the embodiments of the present application can process both dynamic image data and static image data, so that the multimedia processing chip 200 can be applied to different scenarios, such as photographing scenarios and video recording scenarios. It should be noted that the neural network processor 220 defined in the embodiments of the present application may also process only dynamic image data and not static image data. The following takes the neural network processor 220 processing dynamic image data as an example.
It should be noted that after the multimedia processing chip 200 acquires image data such as dynamic image data, if the neural network processor 220 directly processes the dynamic image data, applying a preset algorithm to the dynamic image data as required to obtain a processing result, the preset-algorithm processing often distorts the dynamic image data. If the multimedia processing chip 200 sends the distorted data formed after the neural network processor 220 finishes processing to the application processing chip, the state information of the dynamic image data on the application processing chip, such as the state information required for automatic focusing, will be wrong, which in turn leads to focus failure, so that the camera cannot focus. Here, the dynamic image data can be understood as data received by the multimedia processing chip 200 but not yet processed; for example, the data sent by the image sensor to the multimedia processing chip 200 is defined as the initial dynamic image data.
Based on this, in some embodiments of the present application, a statistics module may be integrated into the multimedia processing chip 200, and this statistics module collects the data required by the application processing chip for image data processing, or in other words the state information required by the application processing chip for image data processing. After the statistics module has collected the data required by the application processing chip, it can send the collected data to the application processing chip, so as to ensure that the application processing chip can successfully complete image data processing such as 3A processing.
Continuing to refer to FIG. 6, in some embodiments of the present application, the statistics module may be integrated into an Image Signal Processor (ISP) 210; in other words, the multimedia processing chip 200 further includes an image signal processor 210, and the image signal processor 210 includes a statistics module 212. After acquiring the initial dynamic image data, the multimedia processing chip 200 can first transmit it to the image signal processor 210, and the statistics module 212 of the image signal processor 210 collects statistics on the initial dynamic image data to obtain the state information required by the application processing chip, such as 3A state information. This ensures that the application processing chip, based on the state information collected by the statistics module 212, can post-process the processing result sent by the multimedia processing chip 200 to the application processing chip.
It can be understood that the state information collected by the statistics module 212 of the image signal processor 210 is not limited to 3A state information; for example, the statistics module 212 of the image signal processor 210 may also collect state information such as lens shading correction state information.
It should also be noted that after the multimedia processing chip 200 acquires the dynamic image data, if the neural network processor 220 directly processes the dynamic image data, applying a preset algorithm as required to obtain a processing result, problems such as dead pixels often exist in the dynamic image data. Directly processing the dynamic image data with a preset algorithm slows down the convergence of the neural network processor 220, thereby increasing the time required for the neural network processor 220 to process one frame of image, making it difficult to achieve the goals of fast image data processing and effective image quality improvement.
Based on this, in some embodiments of the present application, an optimization module may be integrated into the image signal processor 210. The optimization module can perform first preprocessing, such as dead pixel compensation, on the dynamic image data to obtain a first processing result. The neural network processor 220 then performs second preprocessing on the first preprocessing result. This can not only solve problems such as dead pixels in the image, but also improve the convergence speed of the neural network algorithm of the neural network processor 220, ensuring that the neural network processor 220 can complete the processing of one frame of image within the preset time, thereby achieving fast, real-time image processing.
Referring to FIG. 7, the image signal processor 210 further includes an optimization module 214. The optimization module 214 can perform dead pixel compensation on the dynamic image data by executing a dead pixel compensation algorithm; it can perform linearization processing on the dynamic image data by executing a linearization algorithm; and it can perform black level correction on the dynamic image data by executing a black level correction algorithm.
It can be understood that the first preprocessing performed by the optimization module 214 of the image signal processor 210 on the dynamic image is not limited to this. For example, the optimization module 214 can perform image cropping processing on the initial image data by executing an image cropping algorithm, or perform image downscaling processing on the dynamic image data by executing an image downscaling algorithm.
It should also be noted that after the neural network processor 220 finishes processing the image data, the multimedia processing chip 200 can directly send the data processed by the neural network processor 220 to the application processing chip. However, in some cases, the bit width of the data processed by the neural network processor 220 differs from the bit width of the data processed by the application processing chip. For example, after the neural network processor 220 processes dynamic image data with a video HDR (High-Dynamic Range) algorithm, the bit width is 20 bits, while the bit width of the data to be processed by the application processing chip is 14 bits. Thus the bit width of the data processed by the neural network processor 220 exceeds the bit width of the data to be processed by the application processing chip, and a bit width adjustment operation needs to be performed on the data processed by the neural network processor 220, so that the bit width of the data transmitted by the multimedia processing chip 200 to the application processing chip is consistent.
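The 20-bit to 14-bit mismatch described above can be illustrated with a simple sketch. This is an assumption for illustration: the patent calls the operation tone mapping but does not specify the curve, so a plain right shift is shown alongside a hypothetical gamma-style remap; `shift_bit_width` and `tone_map` are invented names.

```python
# Two hedged sketches of bit width adjustment from 20-bit to 14-bit data:
# a linear right shift, and a gamma-style tone mapping curve that keeps
# more shadow detail than the shift.

def shift_bit_width(value, src_bits=20, dst_bits=14):
    """Drop the least significant bits (linear rescale by 2**(src-dst))."""
    return value >> (src_bits - dst_bits)

def tone_map(value, src_bits=20, dst_bits=14, gamma=0.5):
    """Nonlinear remap into the destination range."""
    src_max, dst_max = (1 << src_bits) - 1, (1 << dst_bits) - 1
    return round((value / src_max) ** gamma * dst_max)

print(shift_bit_width((1 << 20) - 1))  # → 16383 (fits in 14 bits)
print(shift_bit_width(0))              # → 0
```

Either way, the adjusted samples fit the 14-bit range the application processing chip expects, which is the property the text requires.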
Based on this, in some embodiments of the present application, after the neural network processor 220 of the multimedia processing chip 200 processes the dynamic image data, the optimization module 214 of the image signal processor 210 may first perform bit width adjustment processing (tone mapping), so that the bit width of the data adjusted by the optimization module 214 is the same as the bit width of the data to be processed by the application processing chip. This ensures that after the data processed by the multimedia processing chip 200 is transmitted to the application processing chip, the application processing chip can post-process the data to improve image quality.
In some embodiments of the present application, several different optimization modules may be used to execute different algorithms to achieve different optimization results. The optimization module 214 may also be divided into several optimization sub-modules that execute different algorithms to achieve different optimization results. For example, one sub-module of the optimization module 214 can perform dead pixel compensation on the dynamic image data, one sub-module can perform linearization processing on the dynamic image data, one sub-module can perform black level correction on the dynamic image data, one sub-module can perform image cropping processing on the dynamic image data, one sub-module can perform image downscaling processing on the dynamic image data, and one sub-module can perform bit width adjustment processing on the image data.
It should be noted that the optimization module 214 may have one or more of the above sub-modules and may perform one or more of the above operations, which ensures that the data transmitted by the multimedia processing chip 200 to the application processing chip can be further processed by the application processing chip. Of course, it can also ensure that the neural network processor 220 converges faster so as to achieve the goal of improving image quality. It can be understood that the optimization module 214 may also have other sub-modules, which are not enumerated here one by one.
Continuing to refer to FIG. 6 and FIG. 7, the multimedia processing chip 200 may include a first interface 201 and a second interface 202, both of which may be Mobile Industry Processor Interfaces (MIPI). The first interface 201 can receive image data such as RAW data; for example, the first interface 201 can receive RAW data acquired from a camera. The image data such as RAW data received by the first interface 201 may be original image data; that is, the image data received by the first interface 201 is unprocessed image data, where original image data can specifically be understood as image data that has not been processed by an image processor. After receiving the image data such as original image data, the first interface 201 can transmit the image data to the image signal processor 210.
The second interface 202 can receive the result of the image signal processor 210 processing the image data, and can also receive the result of the neural network processor 220 processing the image data. The second interface 202 can be connected to the application processing chip, so as to transmit the image data received by the second interface 202, such as dynamic image data, to the application processing chip.
The first interface 201 and the second interface 202 can be connected through the image signal processor 210. The data received by the first interface 201 can be divided into at least two paths for transmission: for example, one path of data is transmitted to the statistics module 212 of the image signal processor 210, and the other path of data is stored in the memory 230, or the other path of data is processed by the optimization module 214. The second interface 202 can transmit the data collected by the statistics module 212, and the second interface 202 can also transmit the data processed by the optimization module 214.
Continuing to refer to FIG. 6 and FIG. 7, the memory 230 stores various data and instructions of the multimedia processing chip 200. For example, the memory 230 can store the original image data, the data processed by the optimization module 214, and the data processed by the neural network processor 220, and the memory 230 can also store the operating system of the multimedia processing chip 200. The number of memories 230 may be one, two, three, or even more. The memory 230 may be a static memory or a dynamic memory, such as DDR (Double Data Rate SDRAM). The memory 230 may be built-in or external; for example, during packaging, devices such as the image signal processor 210 and the neural network processor 220 are packaged first and then packaged with the memory 230.
The data transmission of the multimedia processing chip 200 can be implemented by one or more memory access controllers.
Referring to FIG. 8, the multimedia processing chip 200 may further include a memory access controller 250, which may be a Direct Memory Access (DMA) controller. It moves data efficiently and can move large amounts of data. The direct memory access controller 250 can move data from one address space to another address space; for example, the direct memory access controller 250 can move data stored in the memory 230 to the neural network processor 220.
The direct memory access controller 250 may include an AHB (Advanced High performance Bus) direct memory access controller, and may also include an AXI (Advanced eXtensible Interface) direct memory access controller.
Continuing to refer to FIG. 6 to FIG. 8, the components of the multimedia processing chip 200 can be connected by a system bus 240. For example, the image signal processor 210 is connected to the system bus 240, the neural network processor 220 is connected to the system bus 240, the memory 230 is connected to the system bus 240, and the memory access controller 250 is connected to the system bus 240.
The operation of the system of the multimedia processing chip 200 can be implemented by a control processor.
Referring to FIG. 9, the multimedia processing chip 200 may further include a main control processor 260 (Central Processing Unit, CPU), which is used to control the operation of the system of the multimedia processing chip 200, such as peripheral parameter configuration and interrupt response control.
To further explain the process of the multimedia processing chip provided by the embodiments of the present application processing image data, especially dynamic image data, the data flow and methods of data processing by the multimedia processing chip are described below with reference to FIG. 5 to FIG. 8.
Referring to FIG. 10 and FIG. 11, the method by which the multimedia processing chip 200 processes data includes:
2011: The first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
2012: The original data is transmitted through one path to the statistics module 212 of the image signal processor 210, and the statistics module 212 performs statistical processing on the received original data to obtain state information. It should be noted that the original data may also first be stored in the memory 230, and the statistics module 212 then performs statistical processing on the original data stored in the memory 230 to obtain the state information.
2013: The data collected by the statistics module 212 is transmitted out through the second interface 202, for example to the application processing chip. It should be noted that the data collected by the statistics module 212, such as the state information, may also first be stored in the memory 230 and then transmitted out through the second interface 202.
2014: The original data is stored in the memory 230 through another path.
2015: The original data stored in the memory 230 is sent to the neural network processor 220 and processed by the neural network processor 220. Alternatively, the neural network processor 220 acquires the original data from the memory 230 and processes the original data, for example with neural network algorithm processing.
2016: The data processed by the neural network processor 220 is stored in the memory 230. Here, the result of the neural network processor 220 processing the data may be defined as the preprocessing result.
2017: The data processed by the neural network processor 220 is transmitted out through the second interface 202, for example to the application processing chip.
The above is the first way in which the multimedia processing chip 200 of the embodiments of the present application processes data. The application processing chip can further process the processing result of the neural network processor 220 based on the state information to improve image quality, for example to improve video playback quality.
Referring to FIG. 12 and FIG. 13, the method by which the multimedia processing chip 200 processes data includes:
2021: The first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
2022: The original data is transmitted through one path to the statistics module 212 of the image signal processor 210, and the statistics module 212 performs statistical processing on the received original data to obtain state information. It should be noted that the original data may also first be stored in the memory 230, and the statistics module 212 then performs statistical processing on the original data stored in the memory 230 to obtain the state information.
2023: The data collected by the statistics module 212 is transmitted out through the second interface 202, for example to the application processing chip. It should be noted that the state information collected by the statistics module 212 may also first be stored in the memory 230 and then transmitted out through the second interface 202.
2024: The original data is transmitted through another path to the optimization module 214 and optimized by the optimization module 214, for example by dead pixel compensation and linearization processing.
2025: The data processed by the optimization module 214 is sent to the neural network processor 220 and processed by the neural network processor 220. It should be noted that the data processed by the optimization module 214 may first be sent to the memory 230, and the data stored in the memory 230 and processed by the optimization module 214 is then transmitted to the neural network processor 220, which processes the data processed by the optimization module 214.
2026: The data processed by the neural network processor 220 is stored in the memory 230. Here, the result of the neural network processor 220 processing the data may be defined as the preprocessing result.
2027: The data processed by the neural network processor 220 is transmitted out through the second interface 202, for example to the application processing chip.
The above is the second way in which the multimedia processing chip 200 of the embodiments of the present application processes data. The multimedia processing chip 200 can transmit the original data through different paths to the statistics module 212 for data statistics and to the optimization module 214 for optimization processing. The optimized data can then be processed by the neural network processor 220, and the data processed by the neural network processor 220 and the state information are transmitted to the application processing chip. This not only ensures that the application processing chip further processes the processing result of the neural network processor 220 based on the state information to improve image quality, such as video playback quality, but can also speed up the convergence of the neural network processor 220 to improve the smoothness of video playback.
It should also be noted that after the optimization module 214 of the multimedia processing chip 200 optimizes the data, for example by dead pixel compensation and linearization processing, the application processing chip does not need to perform the corresponding processing on the image data it receives. For example, if the optimization module 214 performs dead pixel compensation, linearization processing, and black level correction on the image data, the application processing chip does not need to perform dead pixel compensation, linearization processing, or black level correction on the received image data, which reduces the power consumption of the application processing chip.
Referring to FIG. 14 and FIG. 15, the method by which the multimedia processing chip 200 processes data includes:
2031: The first interface 201 of the multimedia processing chip 200 receives original data, such as dynamic image data.
2032: The original data is transmitted through one path to the statistics module 212 of the image signal processor 210, and the statistics module 212 performs statistical processing on the received original data to obtain data such as state information. It should be noted that the original data may also first be stored in the memory 230, and the statistics module 212 then performs statistical processing on the original data stored in the memory 230 to obtain the state information.
2033: The data collected by the statistics module 212 is transmitted out through the second interface 202, for example to the application processing chip. It should be noted that the state information collected by the statistics module 212 may also first be stored in the memory 230 and then transmitted out through the second interface 202.
2034: The original data is stored in the memory 230 through another path.
2035: The original data stored in the memory 230 is sent to the neural network processor 220 and processed by the neural network processor 220. Alternatively, the neural network processor 220 acquires the original data from the memory 230 and processes the original data, for example with neural network algorithm processing.
2036: The data processed by the neural network processor 220 is transmitted to the optimization module 214, and the optimization module 214 performs bit width adjustment processing on the data processed by the neural network processor 220, so that the adjusted bit width is the same as the bit width of the data to be processed by the application processing chip. Here, the result of the optimization module 214 processing the data may be defined as the preprocessing result. It should be noted that the data processed by the neural network processor 220 may first be sent to the memory 230, and the data stored in the memory 230 and processed by the neural network processor 220 is then transmitted to the optimization module 214, which performs the bit width adjustment processing on the data processed by the neural network processor 220.
2037: The data after the bit width adjustment processing by the optimization module 214 is transmitted out through the second interface 202, for example to the application processing chip.
The above is the third way in which the multimedia processing chip 200 of the embodiments of the present application processes data, which ensures that the application processing chip can further process the bit-width-adjusted data based on the state information to improve image quality, for example to improve video playback quality.
请参阅图16和图17,多媒体处理芯片200对数据的处理的方法包括:
2041,多媒体处理芯片200的第一接口201接收原始数据,该原始数据诸如为动态图像数据。
2042,将原始数据通过一路传输到图像信号处理器210的统计模块212，并由统计模块212对其接收到的原始数据进行统计处理，以统计出状态信息。需要说明的是，原始数据也可以先存储到存储器230中，然后由统计模块212对存储于存储器230的原始数据进行统计处理，以统计出状态信息。
2043,将统计模块212所统计到的数据通过第二接口202传输出去,诸如传输到应用处理芯片。需要说明的是,统计模块212所统计到的状态信息也可以先存储到存储器230中,然后再通过第二接口202传输出去。
2044,将原始数据通过另一路传输到优化模块214,并由优化模块214进行第一次优化处理,诸如坏点补偿、线性化处理、黑电平校正等。
2045,将优化模块214第一次优化处理后的数据发送到神经网络处理器220,并由神经网络处理器220进行处理。需要说明的是,可以先将优化模块214处理后的数据发送到存储器230,然后再将存储在存储器230、且由优化模块214进行第一次优化处理过的数据传输到神经网络处理器220,并由神经网络处理器220对优化模块214进行第一次优化处理过的数据进行处理。
2046,将神经网络处理器220处理完的数据传输到优化模块214,优化模块214对神经网络处理器220处理过的数据进行第二次优化处理。在此可以将优化模块214进行第二次优化处理的结果定义为预处理结果。需要说明的是,可以先将神经网络处理器220处理后的数据存储到存储器230,然后再将存储在存储器230、且由神经网络处理器220处理过的数据传输到优化模块214,并由优化模块214对神经网络处理器220处理过的数据进行第二次优化处理。其中,第二次优化处理可以包括对数据的位宽进行调整,以使得调整后的位宽与应用处理芯片所需要处理数据的位宽相同。
2047,将优化模块214第二次优化处理完的数据通过第二接口202传输出去，诸如传输到应用处理芯片。
以上为本申请实施例多媒体处理芯片200进行数据处理的第四种方式,可以确保应用处理芯片基于该状态信息对位宽调整处理后的数据进行进一步的处理,以提高图像质量,诸如提高视频播放的质量。还可以加快神经网络处理器220的收敛速度,以提高视频播放的流畅度。
对于本申请实施例多媒体处理芯片200的以上四种处理数据的方式，还需要说明的是，当多媒体处理芯片200接收到图像数据后，可以由主控处理器260确定出该图像数据是否存在坏点等问题，若存在则可以启动优化模块214对图像数据先进行坏点等优化处理，若不存在则可以直接由神经网络处理器220进行处理。当神经网络处理器220对数据处理完成后，可以由主控处理器260确定出该神经网络处理器220所处理后的数据的位宽与预设位宽是否相同，如果相同则可以直接将该神经网络处理器220处理过的数据传输到应用处理芯片；如果不相同则可以经过优化模块214进行位宽调整的优化处理，以使得优化模块214进行位宽调整后的数据的位宽与预设位宽相同。可以理解的是，该预设位宽可以理解为应用处理芯片对数据进行处理所需的位宽。
需要说明的是,本申请实施例多媒体处理芯片200与其他器件诸如应用处理芯片的连接方式并不限于此,诸如多媒体处理芯片200还可以包括与应用处理芯片连接的第三接口。
请参阅图18,多媒体处理芯片200还可以包括第三接口203,第三接口203可以称为互连总线接口,诸如第三接口203为高速互连总线接口(Peripheral Component Interconnect Express,PCIE)203,也可以称为高速外围组件互连接口,外部设备互连总线接口,其是一种高速串行计算机扩展总线标准的接口。需要说明的是,第三接口203也可以为低速互连总线接口。
第三接口203与系统总线240连接，第三接口203可以通过系统总线240与其他器件实现数据的传输。诸如第三接口203可以接收图像信号处理器210对图像数据进行处理的结果，第三接口203也可以接收神经网络处理器220对图像数据进行处理的结果。第三接口203还可以与应用处理芯片连接，以将多媒体处理芯片200处理过的数据传输到应用处理芯片。
第三接口203可以离线传输图像数据。诸如第三接口203可以离线传输静态图像的数据，第三接口203也可以离线传输动态图像的数据。本申请实施例多媒体处理芯片200不仅可以处理由摄像头所采集到的图像数据，还可以离线处理静态图像数据和/或动态图像数据，以实现对图片质量的增强，以及提升视频播放的质量。
请参阅图19,离线静态图像处理方法包括:
3011,接收相册查看指令。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器接收该相册查看指令。
3012,根据该相册查看指令，确定是否进入图片增强模式。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器来确定是否进入图片增强模式。比如用户进入相册界面后，该相册界面显示出“增强模式”和“普通模式”两个虚拟控件。当用户触控“增强模式”虚拟控件时，则应用处理器确定出进入增强模式，则执行步骤3013。当用户触控“普通模式”虚拟控件时，则应用处理器确定出不进入增强模式，则执行步骤3016。需要说明的是，确定是否进入图片增强模式的方式并不限于此，其仅为举例说明。
其中,图片增强模式可以理解为改善图片质量的模式,即由本申请实施例所限定的多媒体处理芯片200对图片数据进行处理的模式。普通模式可以理解为图片数据未被本申请所限定的多媒体处理芯片200进行处理的模式。
3013,将所需显示的图片数据发送至多媒体处理芯片200。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器发出指令,以将所需显示的图片数据发送到多媒体处理芯片200。
3014,通过多媒体处理芯片200对所需显示的图片数据进行增强处理，以提升图片质量。
3015,显示由多媒体处理芯片200进行增强处理的图片。可以由多媒体芯片200所应用到的电子设备诸如智能手机的显示屏显示由多媒体处理芯片200进行增强处理的图片。
3016,显示图片。可以由多媒体芯片200所应用到的电子设备诸如智能手机的显示屏直接显示图片,而不经过多媒体处理芯片200进行增强处理。
比如用户使用一电子设备诸如智能手机打开相册时,应用处理芯片可以将相册照片的图像数据通过第三接口203传输到多媒体处理芯片200,多媒体处理芯片200可以对该图像数据进行RAW图像编辑处理。当多媒体处理芯片200对该RAW图像数据处理完成后,再通过第三接口203将其处理过的数据传输出去,以通过电子设备的显示屏进行显示。从而本申请一些实施例的多媒体处理芯片200可以实现对图库RAW图像的处理。
需要说明的是,用户使用一电子设备诸如智能手机在拍摄图像时,可以将图像存储为RAW数据,以便于多媒体处理芯片200可以实现对相册中照片,或者说图库中照片的数据进行RAW图像编辑处理。
请参阅图20,采用多媒体处理芯片对RAW图像编辑处理的方法包括:
3021,应用处理芯片的应用处理器接收打开相册的第一指令;
3022,应用处理芯片的应用处理器根据所述第一指令将相册中所需处理的照片的RAW图像数据通过第三接口203传输到多媒体处理芯片200。
3023,多媒体处理芯片200对所述RAW图像数据进行RAW图像编辑处理。
3024,多媒体处理芯片200将其对RAW图像数据进行RAW图像编辑处理后的数据通过第三接口203传输到外部存储器。
需要说明的是,多媒体处理芯片200对RAW图像数据进行RAW图像编辑处理后,可以通过第三接口203传输到外部存储器中,诸如电子设备用来存储相册中照片的存储器。然后,可以在电子设备的显示屏中显示出多媒体处理芯片200对RAW图像数据进行处理过的照片。可以理解的是,该外部存储器可以理解为多媒体处理芯片外的存储器。
请参阅图21,离线动态图像处理方法包括:
4011,接收播放指令。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器接收该播放指令。
4012,根据所述播放指令确定是否进入视频增强模式。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器来确定是否进入视频增强模式。比如用户进入视频界面后，该视频界面显示出“增强模式”和“普通模式”两个虚拟控件。当用户触控“增强模式”虚拟控件时，则应用处理器确定出进入增强模式，则执行步骤4013。当用户触控“普通模式”虚拟控件时，则应用处理器确定出不进入增强模式，则执行步骤4016。需要说明的是，确定是否进入视频增强模式的方式并不限于此，其仅为举例说明。
其中,视频增强模式可以理解为改善视频播放质量的模式,即由本申请实施例所限定的多媒体处理芯片200对视频播放数据进行处理的模式。普通模式可以理解为视频播放数据未被本申请所限定的多媒体处理芯片200进行处理的模式。
4013,根据所述播放指令将所需播放的视频数据发送至多媒体处理芯片。可以由多媒体芯片200所应用到的电子设备诸如智能手机的应用处理器发出指令,以将所需播放的视频数据发送到多媒体处理芯片200。
4014,通过多媒体处理芯片200对所需播放的视频数据进行增强处理，以提升视频播放质量。
4015,播放由多媒体处理芯片200进行增强处理的视频。可以由多媒体芯片200所应用到的电子设备诸如智能手机的显示屏播放由多媒体处理芯片200进行增强处理的视频数据。
4016,播放视频。可以由多媒体芯片200所应用到的电子设备诸如智能手机的显示屏直接播放视频,而不经过多媒体处理芯片200进行增强处理。
比如,用户使用一电子设备诸如智能手机播放视频时,或者说电子设备在播放视频的模式下,应用处理芯片可以将播放视频的图像数据通过第三接口203传输到多媒体处理芯片200,多媒体处理芯片200可以对该图像数据进行处理,诸如通过神经网络处理器220对该图像数据进行处理。可以提高视频播放的分辨率,以及解决视频播放过程中出现颗粒的问题。当多媒体处理芯片200对该图像数据处理完成后,多媒体处理芯片200再通过第三接口203将其处理过的数据传输出去,以通过电子设备的显示屏进行播放。从而本申请一些实施例的多媒体处理芯片200可以实现对视频播放的处理。
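“提高视频播放的分辨率”这类超分辨率（SR）处理，可以用一段最近邻上采样的 Python 草图直观示意。需要说明的是，真实的SR处理由神经网络处理器220的超分辨率算法完成，下面的放大方式与函数名仅为便于理解的假设性示例，并非芯片的实际算法。

```python
def upscale_nearest(frame, factor):
    # frame 为二维列表（行 x 列）的灰度帧，按整数倍 factor 做最近邻放大
    out = []
    for row in frame:
        # 水平方向：每个像素重复 factor 次
        wide = [p for p in row for _ in range(factor)]
        # 垂直方向：整行重复 factor 次
        out.extend([list(wide) for _ in range(factor)])
    return out
```

例如一帧 1x2 的图像按 2 倍放大后成为 2x4，分辨率提升而像素内容保持一致；真实的SR算法则会在放大的同时重建细节、抑制颗粒感。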
请参阅图22,采用多媒体处理芯片对视频播放的图像数据进行处理的方法包括:
4021,应用处理芯片的应用处理器接收视频播放的第二指令;
4022,应用处理芯片的应用处理器根据所述第二指令将视频播放过程中的图像数据通过第三接口203传输到多媒体处理芯片200。
4023,多媒体处理芯片200通过神经网络处理器220对所述视频播放过程中的图像数据进行增强处理,诸如SR(Super Resolution)处理,以提高视频播放的分辨率,以及解决视频播放过程中出现颗粒的问题。
4024,多媒体处理芯片200将其对所述视频播放过程中的图像数据进行处理后的数据通过第三接口203传输到外部存储器。
需要说明的是,多媒体处理芯片200对视频播放的图像数据进行图像处理后,可以通过第三接口203传输到外部存储器中,诸如电子设备用来存储视频的存储器。然后,可以在电子设备的显示屏中显示出多媒体处理芯片200对图像数据进行处理过的视频。可以理解的是,该外部存储器可以理解为多媒体处理芯片外的存储器。
其中，多媒体处理芯片200对图像数据进行处理可以理解为两种：一种是由图像信号处理器210对图像数据进行统计，以统计出状态信息。另一种是由多媒体处理芯片200内的所有或部分图像处理器诸如图像信号处理器210和神经网络处理器220对图像数据进行预处理。该预处理可以理解为先由图像信号处理器210对图像数据进行第一次预处理诸如优化处理，然后由神经网络处理器220对第一次预处理后的图像数据进行第二次预处理诸如神经网络算法处理，再由图像信号处理器210对第二次预处理后的图像数据进行第三次预处理诸如位宽调整处理。需要说明的是，后一种由多媒体处理芯片200内的所有或部分图像处理器对图像数据进行预处理至少包括神经网络处理器220对图像数据进行神经网络算法处理，在神经网络处理器220处理之前可以由图像信号处理器210先对图像数据进行优化处理，在神经网络处理器220处理之后还可以由图像信号处理器210对图像数据进行位宽调整处理。
可以理解的是,本申请实施例多媒体处理芯片200在处理离线图片或离线视频时,可以由第三接口203进行数据的传输,其不会占用到第二接口202。第二接口202可以进行实时数据的传输。
需要说明的是,本申请实施例的多媒体处理芯片200中处理图像数据的模块并不限于此。多媒体处理芯片200还可以包括其他处理模块来对图像数据进行处理,诸如多媒体处理芯片200还可以包括数字信号处理器。
参阅图23，多媒体处理芯片200还可以包括数字信号处理器(Digital Signal Processor, DSP)270，数字信号处理器270可以用来协助图像信号处理器210和神经网络处理器220。此外，数字信号处理器270也可以对计算量较小的图像数据进行处理。
数字信号处理器270采用一些通用算法对图像数据进行处理，诸如数字信号处理器270可以采用图像质量检测算法从多帧图像中选择出一帧图像。需要说明的是，在一些情况下，神经网络处理器220无法支持一些算法，诸如对于超广角的摄像头，如果需要畸变校正处理，神经网络处理器220可能无法实现，则可以采用数字信号处理器270来进行处理。
由此可见,本申请实施例数字信号处理器270主要用来处理一些数据量较小的图像数据,神经网络处理器220主要用来处理一些数据量较大的图像数据。诸如数字信号处理器270可以用来处理静态图像,神经网络处理器220用来处理动态图像诸如视频图像。再比如数字信号处理器270用来在拍照模式下处理图像数据,神经网络处理器220用来在视频录制模式、视频播放模式和预览图像模式下处理图像数据。从而,本申请实施例采用数字信号处理器270和神经网络处理器220相结合的方式,能够实现更好的、更全面的图像处理优化,以使得多媒体处理芯片200处理过的图像数据的质量更好,显示的效果更佳。
在一些实施例中，在拍摄照片模式下，多媒体处理芯片200可以通过第三接口203对拍摄照片模式下的图像数据进行传输。在视频录制模式下，多媒体处理芯片200可以通过第二接口202对视频录制模式下的图像数据进行传输。在预览图像模式下，多媒体处理芯片200可以通过第二接口202对预览图像模式下的图像数据进行传输。在视频播放模式下，多媒体处理芯片200可以通过第三接口203对播放视频的图像数据进行传输。在相册显示照片模式下，多媒体处理芯片200可以通过第三接口203对显示照片的图像数据进行传输。
第三接口203可以实时或离线传输图像数据，也可以传输配置参数等数据，第三接口203传输数据的效率高。基于此，本申请实施例可以将第二接口202和第三接口203分配不同的数据进行传输，以提高数据的传输效率。可以通过主控处理器260确定出多媒体处理芯片200所接收到的图像数据是哪一种类型的图像数据，或者说可以通过主控处理器260确定出多媒体处理芯片200所接收到的图像数据是在哪一个模式下获取的图像数据。当多媒体处理芯片200接收到图像数据时，主控处理器260可以根据该图像数据确定该图像数据是哪一个模式下获取的图像数据。当主控处理器260确定出多媒体处理芯片200接收的图像数据为视频录制模式和预览图像模式下的图像数据时，主控处理器260可以控制神经网络处理器220对该图像数据进行处理。当主控处理器260确定出多媒体处理芯片200接收的图像数据为拍照模式下的图像数据时，主控处理器260可以控制数字信号处理器270对该图像数据进行处理。
需要说明的是,拍摄照片模式下的图像数据也可以通过第二接口202进行传输。
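主控处理器260按模式选择处理器与传输接口的逻辑，可以用一个简单的查表函数示意。以下模式名、返回结构均为本说明所假设，仅用于说明“视频录制/预览走NPU与第二接口202，拍照走DSP与第三接口203，视频播放与相册显示走第三接口203”的分配思路，并非芯片的真实实现。

```python
def route(mode):
    # 假设的模式到（处理器, 接口）的映射表，依据正文描述整理
    table = {
        "video_record": ("NPU", "MIPI-202"),   # 视频录制：NPU处理，经第二接口202传输
        "preview":      ("NPU", "MIPI-202"),   # 预览图像：NPU处理，经第二接口202传输
        "photo":        ("DSP", "PCIE-203"),   # 拍照：DSP处理，经第三接口203传输（也可改走202）
        "video_play":   ("NPU", "PCIE-203"),   # 视频播放：NPU处理，经第三接口203传输
        "album":        ("NPU", "PCIE-203"),   # 相册显示照片：经第三接口203传输
    }
    proc, iface = table[mode]
    return {"processor": proc, "interface": iface}
```

这样第二接口202主要承担实时数据，第三接口203承担离线数据与配置参数，两个接口分担不同的数据以提高传输效率。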
请参阅图24,图24所示的多媒体处理芯片200与图23所示的多媒体处理芯片200的区别在于:图24所示的多媒体处理芯片200未设置第三接口。多媒体处理芯片200对静态图像进行处理的数据也可以通过第二接口202传输出去。诸如第二接口202具有多个通路,当多媒体处理芯片200对动态图像进行处理时,可以直接通过第二接口202的一个或多个通路传输。即第二接口202的各个通路优先配置给多媒体处理芯片200对动态图像进行处理的数据。当多媒体处理芯片200对静态图像进行处理时,可以通过主控处理器260先确定出第二接口202的各个通路是否有空闲的通路,即是否有未正在传输动态图像数据的通路。如果第二接口202的多个通路中有一个或多个处于空闲状态,则多媒体处理芯片200对静态图像处理的数据可以通过处于空闲状态的一个或多个通路传输出去。
需要说明的是,当第二接口202的所有通路均未处于空闲状态,则可以等到第二接口202至少有一路通路处于空闲状态再将静态图像的数据通过处于空闲状态的通路传输出去。当然,本申请其他一些实施例还可以采用其他方式传输静态图像的数据,而无需根据第二接口202的通路状态再确定是否传输静态图像的数据。
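第二接口202各通路“优先服务动态图像、静态图像仅占用空闲通路”的分配策略，可以用下面的 Python 草图示意。其中以布尔列表表示各通路是否被动态图像占用，函数名与数据结构均为本说明所假设的示例。

```python
def pick_idle_lane(lanes_busy):
    # lanes_busy[i] 为 True 表示通路 i 正在传输动态图像数据
    for i, busy in enumerate(lanes_busy):
        if not busy:
            return i        # 找到空闲通路：静态图像数据可经由该通路传出
    return None             # 无空闲通路：等待，直至至少有一路通路空闲
```

当返回 None 时，静态图像的数据按正文所述等待，直到第二接口202至少有一路通路处于空闲状态再传输。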
请参阅图25,多媒体处理芯片200的第一接口201和第二接口202还可以直接连接,从而第一接口201在接收到一些图像数据诸如静态图像数据可以直接传输到第二接口202,而不经过图像信号处理器210和/或神经网络处理器220对图像数据进行处理。
在一些实施例中,当多媒体处理芯片200接收到录制视频的图像数据时,可以通过第一接口201将图像数据传输到图像信号处理器210进行处理。当多媒体处理芯片200接收预览图像的数据时,可以通过第一接口201直接将图像数据传输到第二接口202。当多媒体处理芯片200接收拍照模式下的拍照图像时,可以通过第一接口201直接将图像数据传输到第二接口202。
在其他一些实施例中，当多媒体处理芯片200接收预览模式下的预览图像时，也可以通过第一接口201将图像数据传输到图像信号处理器210进行处理，从而可以解决画面一致性的问题。
为了进一步说明本申请实施例所提供的多媒体处理芯片与其他器件的数据交互,下面从多媒体处理芯片的应用的角度进行描述。可以理解的是,多媒体处理芯片200可以应用于一电子设备诸如智能手机、平板电脑等设备中。
请参阅图26，电子设备20可包括摄像头600、多媒体处理芯片200和应用处理芯片400。
其中,摄像头600可以采集图像数据。该摄像头600可以为前置摄像头,也可以为后置摄像头。摄像头600可以包括图像传感器和镜头,图像传感器可以为互补金属氧化物半导体(Complementary Metal-Oxide-Semiconductor,CMOS)图像传感器、电荷藕合器件(Charge Coupled Device,CCD)图像传感器等。摄像头600可以与多媒体处理芯片200电连接,诸如摄像头600与多媒体处理芯片200的第一接口201电连接。摄像头600可以采集原始图像数据诸如RAW图像数据,并通过第一接口201传输到多媒体处理芯片200,以供多媒体处理芯片200内部的图像处理器诸如图像信号处理器210和神经网络处理器220进行处理。
其中,多媒体处理芯片200为如上任一多媒体处理芯片200,在此不再赘述。
其中,应用处理芯片400可以实现对电子设备20各种功能的控制。诸如应用处理芯片400可以控制电子设备20的摄像头600采集图像,应用处理芯片400还可以控制多媒体处理芯片200对摄像头600采集的图像进行处理等。应用处理芯片400还可以对图像数据进行处理。
摄像头600采集到图像数据可以传输到多媒体处理芯片200的接口,多媒体处理芯片200可以对图像数据进行预处理,应用处理芯片400可以对图像数据进行后处理。多媒体处理芯片200和应用处理芯片400之间对图像数据的处理可以是差异化的处理,也可以存在相同的处理。
下面从其中一种多媒体处理芯片应用到电子设备20中进行详细说明。
请参阅图27,电子设备20的应用处理芯片400可以包括应用处理器410、图像信号处理器420、存储器430、系统总线440和第四接口401。
其中,应用处理器410可以作为电子设备20的控制中心。其也可以执行一些算法以对图像数据进行处理。
其中，存储器430可以存储各种数据诸如图像数据、系统数据等。该存储器430可以内置于应用处理芯片400，也可以外置于应用处理芯片400。
其中，第四接口401可以为移动产业处理器接口，第四接口401与第二接口202电连接，可以接收由多媒体处理芯片200所处理过的数据。
其中,图像信号处理器420可以对图像数据进行处理。
本申请一些实施例可以由应用处理器410和图像信号处理器420共同对预处理后的图像数据进行后处理。可以理解的是,应用处理器410和图像信号处理器420也可以共同对摄像头600所采集的图像数据进行处理。
在一些实施例中,图像信号处理器210对图像数据处理的结果可以通过第二接口202和第四接口401的连接传输到应用处理芯片400的存储器430。神经网络处理器220对图像数据处理的结果可以通过第二接口202和第四接口401的连接传输到应用处理芯片400的存储器430。数字信号处理器270对图像数据处理的结果可以通过第二接口202和第四接口401的连接传输到应用处理芯片400的存储器430。
结合图25,需要说明的是,在一些情况下,多媒体处理芯片200所接收到的数据诸如拍摄照片模式下的图像数据可以直接由第一接口201传输到第二接口202,然后通过第二接口202和第四接口401的连接传输到存储器430。
需要说明的是,多媒体处理芯片200将数据传输到应用处理芯片400的方式并不限于此。
请参阅图28,电子设备20中的多媒体处理芯片200可以参阅图23,应用处理芯片400还可以包括第五接口402,第五接口402可称为互连总线接口,诸如第五接口402为高速互连总线接口,也可以称为高速外围组件互连接口,外部设备互连总线接口,其是一种高速串行计算机扩展总线标准的接口。需要说明的是,第五接口402也可以为低速互连总线接口。
第五接口402和第三接口203连接。在一些实施例中,第五接口402和第三接口203的类型相同,诸如第五接口402和第三接口203均为高速互连总线接口。多媒体处理芯片200可以将一些图像数据诸如静态图像数据、预览图像数据通过第三接口203和第五接口402的连接传输到存储器430。当然,多媒体处理芯片200还可以将一些数据诸如相册照片的数据、视频播放的数据通过第三接口203和第五接口402的连接传输到存储器430。
在一些实施例中，当多媒体处理芯片200将其处理过的图像数据传输到应用处理芯片400后，应用处理芯片400对多媒体处理芯片200处理过的数据进行后处理，并将处理完的数据进行存储和通过显示屏显示。
下面从数据处理的过程的角度进行描述。
多媒体处理芯片200获取动态图像数据诸如视频录制的图像数据,多媒体处理芯片200根据其获取到的动态图像数据,通过多媒体处理芯片200统计该动态图像数据的状态信息,以及通过多媒体处理芯片200对该动态图像数据进行预处理。待多媒体处理芯片200统计出状态信息,以及对动态图像数据进行预处理完成,多媒体处理芯片200可以将其所统计的状态信息和预处理后的动态图像数据发送至应用处理芯片400。应用处理芯片400基于该状态信息对预处理后的动态图像数据进行后处理。从而可以提升图像的质量。
应用处理器410接收启动摄像头600的第三指令。
应用处理器410基于该启动摄像头600的第三指令启动摄像头600。可以由应用处理器410对摄像头600进行配置,以实现摄像头600的启动。
摄像头600采集图像数据,并将该图像数据传输到多媒体处理芯片200的第一接口201。
图像信号处理器210的统计模块212对该图像数据进行统计处理,以统计出状态信息,并将其统计出的状态信息通过第二接口202传输到第四接口401。
图像信号处理器210的优化模块214对该图像数据进行优化处理诸如线性化处理、坏点补偿、黑电平校正等处理，并将优化处理后的数据传输到神经网络处理器220。可以理解的是，优化模块214可以将其优化处理后的数据直接传输到神经网络处理器220，也可以存储到存储器230中，由神经网络处理器220从存储器230获取。
神经网络处理器220对优化模块214优化处理后的数据进行处理，诸如神经网络算法处理，并将处理后的数据传输到存储器230。
多媒体处理芯片200将其处理后的数据通过第二接口202传输到第四接口401。
应用处理器410和图像信号处理器420基于状态信息,对神经网络处理器220处理过的数据进行后处理,诸如3A处理。
可以理解的是，若单独由应用处理芯片对图像数据进行处理，则需要由应用处理芯片的图像信号处理器对图像数据进行状态信息的统计，由应用处理芯片的应用处理器执行一些算法基于状态信息计算出一些参数诸如对焦参数、曝光参数、白平衡参数、镜头阴影校正参数等。基于所计算出的参数应用处理器可以对摄像头进行配置，以及由图像信号处理器对图像数据进行校正处理。整个处理过程均是由应用处理芯片执行，导致应用处理芯片的功耗较高。而应用处理芯片往往还需要对其他各种功能进行管控，从而在整个图像处理过程中，可能对应用处理芯片的性能产生影响。
本申请实施例将图像数据的一部分处理交由多媒体处理芯片200进行处理，另一部分处理交由应用处理芯片400处理，从而在节省应用处理芯片400功耗的基础上，还可以提升图像质量。其中，图像信号处理器210统计出状态信息后，可以将其统计的状态信息发送到应用处理芯片400，由应用处理器410执行一些算法基于状态信息计算出一些参数诸如对焦参数、曝光参数、白平衡参数、镜头阴影校正参数等。基于所计算出的参数，应用处理器410可以对摄像头600进行配置，以及由图像信号处理器420对图像数据进行校正处理。需要说明的是，图像信号处理器210统计出状态信息后，也可以不通过应用处理器410执行一些算法来进行计算，比如由主控处理器260来执行一些算法，基于状态信息计算出一些参数，然后将参数传输到应用处理芯片400，并由应用处理器410对摄像头600进行配置，以及由图像信号处理器420对图像数据进行校正处理。另外，图像信号处理器210统计出状态信息后还可以由图像信号处理器420执行一些算法，基于状态信息计算出一些参数，并基于该参数由应用处理器410对摄像头600进行配置，以及由图像信号处理器420对图像数据进行校正处理。可以理解的是，应用处理器410和主控处理器260所执行的算法可以进行更新，而图像信号处理器420所执行的算法往往无法更新，在实际应用过程中可以优先选择应用处理器410或主控处理器260执行相关算法对状态信息进行计算。
下面针对不同的状态信息具体举例说明。
状态信息可包括自动对焦状态信息,应用处理器410可以执行相关算法,基于该自动对焦状态信息计算出对焦参数,并将该对焦参数配置给摄像头600。摄像头600可以基于该对焦参数进行对焦。也可以由主控处理器260执行相关算法,基于该自动对焦状态信息计算出对焦参数,然后将对焦参数配置给摄像头600,或者发送到应用处理芯片400,由应用处理器410将对焦参数配置给摄像头600。当然,由图像信号处理器420执行相关算法以计算出对焦参数也是可以的。其中,自动对焦状态信息可以包括相位对焦状态信息、反差对焦状态信息、激光对焦状态信息和TOF(Time of Flight)对焦状态信息中的一种或多种。
自动对焦状态信息可以包括反差对焦状态信息,可以通过图像信号处理器210对图像数据诸如动态图像数据进行预设算法处理以计算出反差对焦状态信息。应用处理器410可以执行相关算法,基于该反差对焦状态信息计算出反差对焦参数,并将该反差对焦参数配置给摄像头600。摄像头600可以基于该反差对焦参数进行对焦。也可以由主控处理器260执行相关算法,基于该反差对焦状态信息计算出反差对焦参数,然后将反差对焦参数配置给摄像头600,或者发送到应用处理芯片400,由应用处理器410将反差对焦参数配置给摄像头600。当然,由图像信号处理器420执行相关算法以计算出反差对焦参数也是可以的。
自动对焦状态信息还可以包括相位对焦状态信息，可以通过图像信号处理器210对图像数据诸如动态图像数据进行抽取，诸如对图像数据进行标记区分，以抽取出相位对焦状态信息。应用处理器410可以执行相关算法，基于该相位对焦状态信息计算出相位对焦参数，并将该相位对焦参数配置给摄像头600。摄像头600可以基于该相位对焦参数进行对焦。也可以由主控处理器260执行相关算法，基于该相位对焦状态信息计算出相位对焦参数，然后将相位对焦参数配置给摄像头600，或者发送到应用处理芯片400，由应用处理器410将相位对焦参数配置给摄像头600。当然，由图像信号处理器420执行相关算法以计算出相位对焦参数也是可以的。
自动对焦状态信息还可以包括激光对焦状态信息，可以通过图像信号处理器210对图像数据诸如动态图像数据进行预设算法处理以计算出激光对焦状态信息。应用处理器410可以执行相关算法，基于该激光对焦状态信息计算出激光对焦参数，并将该激光对焦参数配置给摄像头600。摄像头600可以基于该激光对焦参数进行对焦。也可以由主控处理器260执行相关算法，基于该激光对焦状态信息计算出激光对焦参数，然后将激光对焦参数配置给摄像头600，或者发送到应用处理芯片400，由应用处理器410将激光对焦参数配置给摄像头600。当然，由图像信号处理器420执行相关算法以计算出激光对焦参数也是可以的。
自动对焦状态信息还可以包括TOF对焦状态信息,可以通过图像信号处理器210对图像数据诸如动态图像数据进行预设算法处理以计算出TOF对焦状态信息。应用处理器410可以执行相关算法,基于该TOF对焦状态信息计算出TOF对焦参数,并将该TOF对焦参数配置给摄像头600。摄像头600可以基于该TOF对焦参数进行对焦。也可以由主控处理器260执行相关算法,基于该TOF对焦状态信息计算出TOF对焦参数,然后将TOF对焦参数配置给摄像头600,或者发送到应用处理芯片400,由应用处理器410将TOF对焦参数配置给摄像头600。当然,由图像信号处理器420执行相关算法以计算出TOF对焦参数也是可以的。
状态信息还可以包括自动白平衡状态信息,应用处理器410可以执行相关算法,基于该自动白平衡状态信息计算出白平衡参数,图像信号处理器420可以基于白平衡参数对多媒体处理芯片200预处理过的图像数据进行白平衡处理,或者说图像校正处理。也可以由主控处理器260执行相关算法,基于该自动白平衡状态信息计算出白平衡参数,然后将白平衡参数发送到应用处理芯片400,由图像信号处理器420基于白平衡参数对多媒体处理芯片200预处理过的图像数据进行白平衡处理。当然,由图像信号处理器420执行相关算法以计算出白平衡参数也是可以的。
状态信息还可以包括自动曝光状态信息,应用处理器410可以执行相关算法,基于该自动曝光状态信息计算出曝光参数,并将该曝光参数配置给摄像头600。摄像头600可以基于该曝光参数进行曝光。也可以由主控处理器260执行相关算法,基于该自动曝光状态信息计算出曝光参数,然后将曝光参数配置给摄像头600,或者发送到应用处理芯片400,由应用处理器410将曝光参数配置给摄像头600。当然,由图像信号处理器420执行相关算法以计算出曝光参数也是可以的。需要说明的是,在需要对曝光参数进行补偿时,可以由图像信号处理器420对曝光参数进行补偿处理,然后可以由应用处理器410将补偿后的曝光参数配置给摄像头600,摄像头600基于补偿后的曝光参数进行曝光。
状态信息还包括镜头阴影校正状态信息，应用处理器410可以执行相关算法，基于该镜头阴影校正状态信息计算出镜头阴影校正参数，图像信号处理器420可以基于镜头阴影校正参数对多媒体处理芯片200预处理过的图像数据进行镜头阴影校正。也可以由主控处理器260执行相关算法，基于该镜头阴影校正状态信息计算出镜头阴影校正参数，然后将镜头阴影校正参数发送到应用处理芯片400，由图像信号处理器420基于镜头阴影校正参数对多媒体处理芯片200预处理过的图像数据进行镜头阴影校正。当然，由图像信号处理器420执行相关算法以计算出镜头阴影校正参数也是可以的。
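“基于状态信息计算出参数”的过程可以用一段简化的 Python 草图来示意。需要强调的是，下面的公式与字段名（ae_luma、awb_g_avg、awb_r_avg、af_phase_diff 等）均为本说明所假设的玩具模型，并非应用处理器410或主控处理器260真实采用的3A算法，仅用于说明“状态信息进、参数出”的数据流向。

```python
def compute_params(state):
    # 假设的状态信息字段：
    #   ae_luma       当前画面归一化平均亮度（自动曝光状态信息）
    #   awb_r_avg/awb_g_avg  R/G 通道均值（自动白平衡状态信息）
    #   af_phase_diff 相位差（自动对焦状态信息）
    return {
        # 自动曝光：向约 18% 灰靠拢的曝光缩放因子（玩具公式）
        "exposure": 0.18 / max(state["ae_luma"], 1e-6),
        # 自动白平衡：以 G 通道为基准的 R 通道增益（玩具公式）
        "awb_gain_r": state["awb_g_avg"] / max(state["awb_r_avg"], 1e-6),
        # 相位对焦：按相位差折算对焦马达步数（玩具公式）
        "focus_step": state["af_phase_diff"] * 4,
    }
```

计算所得的对焦参数、曝光参数可配置给摄像头600，白平衡参数、镜头阴影校正参数则用于图像信号处理器420对预处理结果的校正处理。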
其中,图像信号处理器420统计图像数据的状态信息可以理解为:采用算法的方式计算出一些状态信息和/或采用抽取的方式抽取出一些状态信息。
优化模块214对图像数据进行的处理，图像信号处理器420无需再进行处理。比如优化模块214对图像数据进行坏点补偿、线性化处理、黑电平校正，图像信号处理器420无需再进行坏点补偿、线性化处理、黑电平校正。本申请实施例多媒体处理芯片200和应用处理芯片400对图像数据进行差异化处理，从而可以节省应用处理芯片400的功耗。然而，应用处理芯片400和多媒体处理芯片200对图像数据也可以进行一部分相同的处理。诸如多媒体处理芯片200对动态图像数据进行降噪处理，应用处理芯片400也对动态图像数据进行降噪处理。再诸如多媒体处理芯片200对动态图像数据进行统计处理，应用处理芯片400也对动态图像数据进行统计处理。
图像信号处理器420将处理后的数据发送至显示屏和存储器430,以显示和存储该图像。
需要说明的是,若该图像为动态图像则在存储前可以先通过编码器对其进行编码处理,等待编码完成后再进行存储。若该图像为静态图像则在存储器可以先进行压缩诸如JPEG压缩,等待压缩后再进行存储。
还需要说明的是，多媒体处理芯片200所处理的图像数据可以是RAW图像数据，应用处理芯片400可以对RAW图像数据进行处理诸如3A处理，也可以将RAW格式转换为YUV格式，以对YUV格式的图像进行处理，诸如图像信号处理器420对图像数据进行RGBToYUV处理以得到YUV格式的图像。
在图像信号处理器210将神经网络处理器220处理后的数据通过第二接口202传输到第四接口401之前，可以由主控处理器260先确定神经网络处理器220处理后的数据的位宽与应用处理芯片400所要处理数据的位宽是否相同，如相同则图像信号处理器210将神经网络处理器220处理后的数据通过第二接口202传输到第四接口401。若不相同，则图像信号处理器210的优化模块214对神经网络处理器220处理后的数据进行位宽调整处理，以使得调整后的数据的位宽与应用处理芯片400所要处理数据的位宽相同，以确保应用处理芯片400可以正常处理由多媒体处理芯片200传输过来的数据。
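位宽调整处理可以用下面的 Python 草图示意：把一批整数像素从源位宽对齐到应用处理芯片所需的目标位宽。以移位实现位宽转换只是一种常见做法，函数名与实现细节均为本说明所假设，并非优化模块214的真实电路逻辑。

```python
def adjust_bitwidth(pixels, src_bits, dst_bits):
    # 位宽相同：直接透传，无需调整
    if src_bits == dst_bits:
        return list(pixels)
    # 压缩位宽：右移丢弃低位，例如 10bit -> 8bit 右移 2 位
    if src_bits > dst_bits:
        return [p >> (src_bits - dst_bits) for p in pixels]
    # 扩展位宽：左移补零，例如 8bit -> 10bit 左移 2 位
    return [p << (dst_bits - src_bits) for p in pixels]
```

例如神经网络处理器220输出10bit数据而应用处理芯片400按8bit处理时，先做一次 10bit 到 8bit 的调整再经第二接口202传出即可。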
还需要说明的是,在其他一些实施例中,多媒体处理芯片200在对图像数据进行处理时,也可以不通过优化模块214对原始图像进行优化处理,而直接由神经网络处理器220进行处理。
本申请实施例由多媒体处理芯片200对图像数据进行处理的方式可以参阅图10至图17,在此不再赘述。
下面从应用处理芯片400处理图像数据的过程进行描述。
请参阅图29,应用处理芯片400对图像数据进行处理的方法包括:
5011,应用处理芯片400的第四接口401接收由统计模块212对动态图像数据进行统计的状态信息。
5012,应用处理芯片400的第四接口401接收由神经网络处理器220对动态图像数据进行神经网络算法处理的结果。
需要说明的是,神经网络处理器220对动态图像数据进行神经网络算法处理之前,可以由优化模块214对动态图像数据进行优化处理。
5013,应用处理芯片400基于状态信息对神经网络处理器220对动态图像数据进行处理的结果进行二次处理。
需要说明的是,在应用处理芯片400进行处理之前,可以由优化模块214对神经网络处理器220处理后的数据进行位宽调整处理。
为了进一步说明本申请实施例通过多媒体处理芯片200对图像数据进行预处理后，再由应用处理芯片400对图像数据进行后处理以提升图像质量，下面请参阅图30和图31。图30的第一图，示出了本申请实施例多媒体处理芯片200和应用处理芯片400共同对图像进行处理而显示的一帧图像，其包括本申请实施例通过神经网络处理器220对图像数据进行HDR算法处理。图30的第二图，示出了仅由应用处理芯片单独对图像进行处理而显示的一帧图像。由第一图和第二图的比较可以看出，两帧图像在多个方面和多个区域存在差异。诸如第二图中人物周围的亮度过亮，而靠近人物的物品展示又过于清晰，诸如第二区域B的物体的清晰度大于第二区域A的清晰度，导致人物不够突出。且第二图的周围诸如第一区域B的细节展示不如第一区域A的细节展示。
其中,图31的第三图,示出了本申请实施例多媒体处理芯片200和应用处理芯片400共同对图像进行处理而显示的一帧图像,其包括本申请实施例通过神经网络处理器220对图像信号进行视频夜景算法处理。图31的第四图,示出了仅由应用处理芯片单独对图像进行处理而显示的一帧图像。由第三图和第四图的比较可以看出,两帧图像在多个区域存在差异。诸如第三图的第三区域A比第四图的第三区域B的清晰。再比如第三图的第四区域A比第四图的第四区域B展示出更多的细节。
可以理解的是,本申请实施例所限定的摄像头600、多媒体处理芯片200和应用处理芯片400可以安装在一起,诸如摄像头600、多媒体处理芯片200和应用处理芯片400安装在一个电路板上。
请参阅图32，电路板22安装有摄像头600、多媒体处理芯片200和应用处理芯片400。摄像头600、多媒体处理芯片200和应用处理芯片400均通过信号线连接，以实现信号的传输。
可以理解的是,电路板22还可以安装有其他元器件,在此不再一一举例。
请参阅图33,摄像头600也可以与多媒体处理芯片200和应用处理芯片400不安装在一个电路板上,诸如摄像头600单独安装在一个电路板上,多媒体处理芯片200和应用处理芯片400安装在一个电路板22上,摄像头600与多媒体处理芯片200通过信号线连接。
以上对本申请实施例提供的多媒体处理芯片、电子设备及动态图像处理方法进行了详细介绍。本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请。同时,对于本领域的技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内容不应理解为对本申请的限制。

Claims (20)

  1. 一种多媒体处理芯片,其中,所述多媒体处理芯片包括:
    图像信号处理器,用于统计图像数据的状态信息;和
    神经网络处理器,用于对图像数据进行神经网络算法处理;
    其中,所述多媒体处理芯片用于至少通过所述神经网络处理器对所述图像数据进行预处理,并将所述状态信息和预处理过的图像数据发送至应用处理芯片。
  2. 根据权利要求1所述的多媒体处理芯片,其中,所述图像信号处理器还用于对所述图像数据进行第一次预处理;
    所述神经网络处理器还用于对所述第一次预处理后的图像数据进行第二次预处理。
  3. 根据权利要求2所述的多媒体处理芯片,其中,所述图像信号处理器还用于对所述第二次预处理后的图像数据进行第三次预处理,所述图像信号处理器对图像数据进行第三次预处理包括对图像数据进行位宽调整处理,以使得位宽调整后的图像数据的位宽与所述应用处理芯片所处理图像数据的位宽相同。
  4. 根据权利要求2所述的多媒体处理芯片,其中,所述图像信号处理器对所述图像数据进行第一次预处理包括坏点补偿、线性化处理和黑电平校正中的至少一者。
  5. 根据权利要求4所述的多媒体处理芯片,其中,所述图像信号处理器对所述图像数据进行第一次预处理还包括图像裁剪处理和/或图像缩小处理。
  6. 根据权利要求1所述的多媒体处理芯片,其中,所述图像信号处理器还用于对所述神经网络算法处理后的图像数据进行位宽调整处理,以使得位宽调整后的图像数据的位宽与所述应用处理芯片所处理图像数据的位宽相同。
  7. 根据权利要求1至6任一项所述的多媒体处理芯片，其中，所述图像数据包括动态图像数据，所述多媒体处理芯片用于处理动态图像数据，所述神经网络处理器用于对所述动态图像数据进行处理的神经网络算法包括夜景算法、HDR算法、虚化算法、降噪算法、超分辨率算法、语义分割算法中的至少一个。
  8. 根据权利要求7所述的多媒体处理芯片,其中,所述多媒体处理芯片用于实时处理所述动态图像数据,并实时将处理过的动态图像数据传输到所述应用处理芯片。
  9. 根据权利要求1至6任一项所述的多媒体处理芯片，其中，所述图像数据包括静态图像数据，所述多媒体处理芯片用于处理静态图像数据，所述神经网络处理器用于对所述静态图像数据进行处理的神经网络算法包括夜景算法、HDR算法、虚化算法、降噪算法、超分辨率算法、语义分割算法中的至少一个。
  10. 根据权利要求1至6任一项所述的多媒体处理芯片,其中,所述多媒体处理芯片还用于离线处理静态图像数据和/或动态图像数据。
  11. 根据权利要求1至6任一项所述的多媒体处理芯片,其中,所述图像数据为RAW图像数据,所述多媒体处理芯片用于对RAW图像数据进行处理。
  12. 根据权利要求1至6任一项所述的多媒体处理芯片,其中,所述状态信息包括自动曝光状态信息、自动白平衡状态信息和自动对焦状态信息中的至少一种。
  13. 根据权利要求12所述的多媒体处理芯片,其中,所述状态信息还包括镜头阴影校正状态信息。
  14. 一种电子设备,其中,包括:
    多媒体处理芯片,为如权利要求1-13任一项所述的多媒体处理芯片;和
    应用处理芯片,用于从所述多媒体处理芯片获取所述预处理的结果和统计的状态信息,所述应用处理芯片基于所述状态信息对所述预处理的结果进行后处理。
  15. 根据权利要求14所述的电子设备,其中,所述状态信息包括自动对焦状态信息、自动白平衡状态信息和自动曝光状态信息中的至少一种,所述应用处理芯片用于:
    基于所述自动对焦状态信息计算出对焦参数,并将所述对焦参数配置给所述电子设备的摄像头;
    基于所述自动白平衡状态信息计算出白平衡参数,并基于所述白平衡参数对所述预处理的结果进行白平衡处理;
    基于所述自动曝光状态信息计算出曝光参数,并将所述曝光参数配置给所述电子设备的摄像头,或对所述曝光参数进行补偿后配置给所述电子设备的摄像头。
  16. 根据权利要求15所述的电子设备,其中,所述自动对焦状态信息包括相位对焦状态信息和反差对焦状态信息,所述多媒体处理芯片的图像信号处理器用于:
    对所述图像数据进行预设算法处理以获取反差对焦状态信息;
    从所述图像数据中抽取出相位对焦状态信息;
    所述应用处理芯片还用于:
    基于所述反差对焦状态信息计算出反差对焦参数,并将所述反差对焦参数配置给所述电子设备的摄像头;
    基于所述相位对焦状态信息计算出相位对焦参数,并将所述相位对焦参数配置给所述电子设备的摄像头。
  17. 根据权利要求15所述的电子设备,其中,所述状态信息还包括镜头阴影校正状态信息,所述应用处理芯片还用于:
    基于所述镜头阴影校正状态信息计算出镜头阴影校正参数,并基于所述镜头阴影校正参数对所述预处理的结果进行镜头阴影校正。
  18. 一种动态图像处理方法,其中,所述方法包括:
    获取动态图像数据;
    根据所述动态图像数据,通过多媒体处理芯片统计所述动态图像数据的状态信息,并对所述动态图像数据进行预处理;
    将所述多媒体处理芯片所统计的状态信息和预处理后的动态图像数据发送至应用处理芯片;
    基于所述状态信息通过所述应用处理芯片对所述预处理后的动态图像数据进行后处理。
  19. 根据权利要求18所述的动态图像处理方法,其中,所述通过多媒体处理芯片对动态图像数据进行预处理,包括:
    对所述动态图像数据进行优化处理;
    对优化处理后的动态图像数据进行神经网络算法处理。
  20. 根据权利要求18或19所述的动态图像处理方法,其中,所述状态信息包括自动对焦状态信息、自动白平衡状态信息和自动曝光状态信息中的至少一种,所述基于所述状态信息通过所述应用处理芯片对所述预处理后的动态图像数据进行后处理包括:
    基于所述自动对焦状态信息计算出对焦参数,并将所述对焦参数配置给摄像头;
    基于所述自动白平衡状态信息计算出白平衡参数,并基于所述白平衡参数对所述预处理的结果进行白平衡处理;
    基于所述自动曝光状态信息计算出曝光参数,并将所述曝光参数配置给所述摄像头,或对所述曝光参数进行补偿后配置给所述摄像头。
PCT/CN2021/088513 2020-05-29 2021-04-20 多媒体处理芯片、电子设备及动态图像处理方法 WO2021238506A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21812348.7A EP4148656A4 (en) 2020-05-29 2021-04-20 MULTIMEDIA PROCESSING CHIP, ELECTRONIC DEVICE AND DYNAMIC IMAGE PROCESSING METHOD
US18/059,254 US20230086519A1 (en) 2020-05-29 2022-11-28 Multimedia processing chip, electronic device, and method for dynamic-image processing

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010478366.5 2020-05-29
CN202010478366.5A CN113744117A (zh) 2020-05-29 2020-05-29 多媒体处理芯片、电子设备及动态图像处理方法

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/059,254 Continuation US20230086519A1 (en) 2020-05-29 2022-11-28 Multimedia processing chip, electronic device, and method for dynamic-image processing

Publications (1)

Publication Number Publication Date
WO2021238506A1 true WO2021238506A1 (zh) 2021-12-02

Family

ID=78724882

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/088513 WO2021238506A1 (zh) 2020-05-29 2021-04-20 多媒体处理芯片、电子设备及动态图像处理方法

Country Status (4)

Country Link
US (1) US20230086519A1 (zh)
EP (1) EP4148656A4 (zh)
CN (1) CN113744117A (zh)
WO (1) WO2021238506A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116095355A (zh) * 2023-01-18 2023-05-09 百果园技术(新加坡)有限公司 视频显示控制方法及其装置、设备、介质、产品

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114727082B (zh) * 2022-03-10 2024-01-30 杭州中天微系统有限公司 图像处理装置、图像信号处理器、图像处理方法和介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210727A (zh) * 2016-08-16 2016-12-07 广东中星电子有限公司 基于神经网络处理器阵列的视频分级码流编码方法和架构
CN108711429A (zh) * 2018-06-08 2018-10-26 Oppo广东移动通信有限公司 电子设备及设备控制方法
CN109345556A (zh) * 2017-07-27 2019-02-15 罗克韦尔柯林斯公司 用于混合现实的神经网络前景分离
CN110166708A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 夜景图像处理方法、装置、电子设备以及存储介质
US20200150598A1 (en) * 2018-11-13 2020-05-14 Samsung Electronics Co., Ltd. Method for processing data using neural network and electronic device for supporting the same

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7973823B2 (en) * 2006-12-29 2011-07-05 Nokia Corporation Method and system for image pre-processing
US9466012B2 (en) * 2013-07-11 2016-10-11 Radiological Imaging Technology, Inc. Phantom image classification
JP6725733B2 (ja) * 2018-07-31 2020-07-22 ソニーセミコンダクタソリューションズ株式会社 固体撮像装置および電子機器
KR102075293B1 (ko) * 2019-05-22 2020-02-07 주식회사 루닛 의료 영상의 메타데이터 예측 장치 및 방법

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210727A (zh) * 2016-08-16 2016-12-07 广东中星电子有限公司 基于神经网络处理器阵列的视频分级码流编码方法和架构
CN109345556A (zh) * 2017-07-27 2019-02-15 罗克韦尔柯林斯公司 用于混合现实的神经网络前景分离
CN108711429A (zh) * 2018-06-08 2018-10-26 Oppo广东移动通信有限公司 电子设备及设备控制方法
US20200150598A1 (en) * 2018-11-13 2020-05-14 Samsung Electronics Co., Ltd. Method for processing data using neural network and electronic device for supporting the same
CN110166708A (zh) * 2019-06-13 2019-08-23 Oppo广东移动通信有限公司 夜景图像处理方法、装置、电子设备以及存储介质

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4148656A4 *


Also Published As

Publication number Publication date
US20230086519A1 (en) 2023-03-23
CN113744117A (zh) 2021-12-03
EP4148656A1 (en) 2023-03-15
EP4148656A4 (en) 2023-12-27

Similar Documents

Publication Publication Date Title
WO2021238522A1 (zh) 多媒体处理芯片、电子设备和图像处理方法
CN111028189A (zh) 图像处理方法、装置、存储介质及电子设备
US8194155B2 (en) Information processing apparatus, buffer control method, and computer program
JP2004515981A (ja) 携帯電話用の最適化されたカメラセンサ構造
US20080212888A1 (en) Frame Region Filters
WO2023124123A1 (zh) 图像处理方法及其相关设备
CN113810593B (zh) 图像处理方法、装置、存储介质及电子设备
JP6493454B2 (ja) 電子カメラ
CN113744119A (zh) 多媒体处理芯片和电子设备
US20230086519A1 (en) Multimedia processing chip, electronic device, and method for dynamic-image processing
WO2022151852A1 (zh) 图像处理方法、装置、系统、电子设备以及存储介质
JP4761048B2 (ja) 撮像装置及びそのプログラム
JP2013175824A (ja) 電子カメラ
CN113873141B (zh) 电子设备
CN113744139A (zh) 图像处理方法、装置、电子设备及存储介质
WO2023230393A1 (en) Smart high dynamic range image clamping
CN113873142B (zh) 多媒体处理芯片、电子设备和动态图像处理方法
CN111491101B (zh) 图像处理器、图像处理方法、拍摄装置和电子设备
JP5906846B2 (ja) 電子カメラ
CN113873178B (zh) 多媒体处理芯片、电子设备和图像处理方法
JP2003259161A (ja) 撮像装置および撮像方法
CN113873143B (zh) 多媒体处理芯片和电子设备
CN113744118A (zh) 多媒体处理芯片、电子设备和图像处理方法
US20240214692A1 (en) High dynamic range region based compute gating
US9277119B2 (en) Electronic apparatus, method for controlling the same, and computer readable recording medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21812348

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021812348

Country of ref document: EP

Effective date: 20221208

NENP Non-entry into the national phase

Ref country code: DE