WO2020063505A1 - Image processing method, system, and computer-readable storage medium - Google Patents

Image processing method, system, and computer-readable storage medium - Download PDF

Info

Publication number
WO2020063505A1
WO2020063505A1 (WO 2020/063505 A1) · PCT/CN2019/107192 · CN2019107192W
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing chip
image data
data
image processing
Prior art date
Application number
PCT/CN2019/107192
Other languages
English (en)
French (fr)
Inventor
郑自浩
宋刚
Original Assignee
上海众链科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海众链科技有限公司
Publication of WO2020063505A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording

Definitions

  • The invention relates to the field of intelligent control, and in particular to an image processing method, system, and computer-readable storage medium.
  • Mobile terminal devices such as mobile phones and tablet computers have become an ever larger part of daily life. Such terminals therefore integrate multiple functions such as photo shooting, video recording, positioning, and lighting, so that a single device can satisfy many different needs.
  • When a mobile terminal is used for shooting photos and video, the resolution and pixel count of its camera module keep increasing, so the imaging the terminal can achieve comes ever closer to that of professional camera equipment.
  • In the first existing form, an image sensor processing module (ISP) is built into the central processing unit (CPU) of the mobile terminal.
  • The camera group sends data to the ISP through the MIPI interface.
  • The ISP processes the data and then forwards it to different modules depending on the application.
  • For example, over data path 1 the data is sent to the data processing unit (DPU).
  • After the DPU has processed it, the data is sent to the LCD display, providing the camera module's shooting preview.
  • Over data path 2 the data is sent to the video processing unit (VPU).
  • The VPU encodes the data and stores it in the storage device, providing the camera module's video recording function; over data path 3 the data is sent to the image compressor (JPEG encoder), which compresses it into a JPEG file that is stored in the storage device, providing the camera module's still-photo function.
  • In the second existing form, the image sensing processing module sits outside the CPU.
  • An image sensing processing module is connected externally to the CPU, and the CPU's internal ISP is bypassed.
  • Apart from bypassing the built-in ISP, the data flow is the same as in the first form.
  • Because the existing solutions all rely on the CPU's internal image sensing and processing functions, image data processing is constrained by the limits of that internal function: captured images cannot be adjusted in a more professional and detailed way, and the added data-processing load on the CPU slows the mobile terminal down.
  • To overcome the above defects, an object of the present invention is to provide an image processing method, a system, and a computer-readable storage medium in which preview image data is displayed directly without passing through the CPU, speeding up image preview.
  • The invention discloses an image processing method, which includes the following steps:
  • S100: an image sensor collects image data of a shooting object;
  • S200: the image sensor sends the image data to an image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data;
  • S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device;
  • S400: the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
  • Preferably, step S200 includes:
  • S210: the image sensor sends the image data to the image processing chip through a camera interface;
  • S220: the image sensing module of the image processing chip processes the image data to form enhanced image data.
  • Preferably, step S300 includes:
  • S310: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • S320: the central processing chip encodes the enhanced image data to form video data;
  • S330: the central processing chip records the video data to the storage device.
  • Preferably, step S300 includes: S310': the image processing chip returns the enhanced image data to a central processing chip in a smart terminal; S320': the central processing chip compresses the enhanced image data to form a compressed image; S330': the central processing chip records the compressed image to the storage device.
  • Preferably, step S300 includes: S310'': the central processing chip sends interactive interface data to the image processing chip; S320'': the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  • Preferably, the interactive interface data includes transparency information of the interactive interface.
  • The invention also discloses an image processing system, which includes a display module, an image sensor, an image processing chip, and a central processing chip provided in a terminal;
  • the image processing chip is connected to the display module, the image sensor, and the central processing chip, respectively;
  • the image sensor collects image data of a shooting object and sends the image data to the image processing chip;
  • an image sensing processing module of the image processing chip processes the image data to form enhanced image data;
  • the image processing chip returns the enhanced image data to the central processing chip, the central processing chip records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to a display module, which displays an enhanced image corresponding to the enhanced image data.
  • Preferably, the central processing chip encodes the enhanced image data to form video data and records the video data to the storage device, or
  • the central processing chip compresses the enhanced image data to form a compressed image and records the compressed image to the storage device.
  • Preferably, the central processing chip sends interactive interface data to the image processing chip;
  • the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  • The invention further discloses a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the image processing method described above are implemented.
  • Image preview data is sent directly to the display screen without being processed by the CPU, reducing the CPU load and speeding up preview.
  • FIG. 1 is a schematic structural diagram of a prior-art system for processing captured images;
  • FIG. 2 is a schematic structural diagram of another prior-art system for processing captured images;
  • FIG. 3 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
  • FIG. 4 is a schematic flowchart of an image processing method according to a first preferred embodiment of the present invention;
  • FIG. 5 is a schematic flowchart of an image processing method according to a second preferred embodiment of the present invention;
  • FIG. 6 is a schematic flowchart of an image processing method according to a third preferred embodiment of the present invention;
  • FIG. 7 is a schematic structural diagram of an image processing system according to a preferred embodiment of the present invention.
  • Although the terms first, second, third, and so on may be used in this disclosure to describe various information, such information should not be limited to these terms; the terms are only used to distinguish information of the same type from one another.
  • For example, without departing from the scope of this disclosure, first information may also be referred to as second information and, similarly, second information may also be referred to as first information.
  • Depending on the context, the word "if" as used herein can be interpreted as "when", "while", or "in response to determining".
  • The image processing method includes the following steps:
  • S100: an image sensor collects image data of the shooting object.
  • To form the initial image to be processed, an image sensor such as a photographing camera module, a lens module, or a prism module performs image acquisition on a shooting object, for example shooting stills, recording video, or recording motion frames, thereby forming the original image data.
  • Because different image sensors have different hardware structures, the collected image data may be unretouched image data without any modification, or image data that has already undergone initial processing such as beautification, cropping, optimization, or rendering. Whether or not the image data in this step has been processed, it is further optimized in the subsequent steps.
  • S200: the image sensor sends the image data to the image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data.
  • The image data formed in step S100 is sent by the image sensor to an image processing chip (IPIC) electrically connected to the image sensor.
  • The electrical connection between the image sensor and the image processing chip may take several forms: the two may be independent image sensing and image processing modules integrated on the same printed circuit board, two independent devices connected by wires, or two independent devices arranged in the same local area network or wirelessly connected to each other.
  • After receiving the image data, the image processing chip IPIC performs preprocessing such as black-level correction and dead-pixel correction, RAW data processing such as denoising and lens correction, and fine processing such as image enhancement, thereby forming enhanced image data.
  • Depending on the processing capability of the image processing chip IPIC, the enhanced image data can be optimized to different degrees.
  • In this embodiment the image processing chip is a device separate from any other module, unit, or device and runs independently. Optimizing the image data in this way reduces the load that image processing places on other devices, and dedicates a purpose-built device to image processing rather than an integrated device with some functions cut down, increasing the efficiency and quality of image processing.
  • S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device.
  • The enhanced image data formed after processing is returned from the image processing chip to a central processing chip (the CPU or AP side).
  • The central processing chip is set in a smart terminal, with the image processing chip attached externally to it; the two may also sit in the same smart terminal but be arranged separately from each other in a distributed design.
  • The image processing chip and the central processing chip may be designed on the same printed circuit board (but not integrated), connected by wires, or wirelessly connected through mutually supported wireless protocols, keeping the central processing chip and the image processing chip independent of each other.
  • The central processing chip can centrally relay and issue the instructions for each component in the smart terminal, while the image processing function is split out of the central processing chip and handled by the separate, external image processing chip.
  • After the central processing chip receives the enhanced image data, it records the enhanced image data to a storage device, such as the memory of the smart terminal or an external storage device of the smart terminal (a U disk, a portable hard disk, another device connected via OTG, and so on). In this step, in order to store the processed enhanced image data, the central processing chip performs only a forward-and-record action and does not change the enhanced image data itself, reducing the operating load of the central processing chip.
  • S400: the image processing chip sends the enhanced image data to a display module, and the display module displays the enhanced image corresponding to the enhanced image data.
  • Step S400 and step S300 can be performed synchronously or asynchronously; that is, at any time while, before, or after the central processing chip records the enhanced image data to the storage device, the image processing chip sends the enhanced image data to a display module, specifically a display module such as a display screen or touch screen installed on the smart terminal, which receives and displays the enhanced image corresponding to the enhanced image data, i.e. the image formed from the initial image data by processing such as intelligent optimization, rendering, color-temperature adjustment, fill light, AR enhancement, information recognition, beautification, and cropping.
  • By using the separate image processing chip described above to enhance the image, and sending the processed enhanced image directly along one path to the display module for display and along another path to the storage device for storage, the transmission speed of the processed image and its loading and display speed on the user side are improved. In addition, the camera module's preview, still-photo, and video-recording performance is improved, as is the recording of fast-moving scenes.
  • In a preferred embodiment, step S200 includes:
  • S210: the image sensor sends the image data to the image processing chip through the camera interface.
  • The image sensor sends the image data to the image processing chip through the camera interface CAMIF.
  • CAMIF is the first part of the video front-end (VFE) hardware. Its main task is to synchronize the line and field synchronization signals involved while the sensor is sending data, and it also provides image extraction and image mirroring. To synchronize precisely with the external camera sensor, the CAMIF hardware must provide two programmable interrupt lines, one to control the opening of the shutter and the other to control the flash.
  • The CAMIF hardware inputs include PCLK, HSYNC, VSYNC, a pixel-enable gate, and a 12-bit data line; the input interface type is controlled by the ADSP.
  • At power-on the CAMIF module is closed and does not capture data until the ADSP sets the enable bit to start data capture.
  • Once the enable bit is set, CAMIF waits until the beginning of the next frame to start collecting data.
  • Similarly, when the ADSP sets it to disable, CAMIF stops working only after the next frame of data has finished being sent, ensuring that everything is completed.
  • The CAMIF hardware can also resample the camera sensor data: before CAMIF outputs data, the ADSP can independently choose whether the complete image is output or a sub-sampled one.
  • S220: the image sensing module of the image processing chip processes the image data to form enhanced image data.
  • In different embodiments the image processing chip handles the image data or enhanced image data differently; the details are described below through the different embodiments.
  • In a first embodiment, step S300 includes:
  • S310: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal.
  • The image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing is complete, the preprocessed image data is sent to the image processing unit IPU of the image processing chip IPIC for image enhancement, and the enhanced image data formed after image enhancement is transmitted back to the central processing chip.
  • S320: the central processing chip encodes the enhanced image data to form video data.
  • After the central processing chip receives the enhanced image data, it performs video encoding.
  • The encoding protocol may be the ITU's H.261 or H.263, Motion JPEG (M-JPEG), one of the MPEG series of standards from the ISO Moving Picture Experts Group, or the like.
  • Video data is formed after encoding. Because the video data is based on the enhanced image data formed by the image processing chip, every frame of the resulting video has already been optimized, and the video assembled from these frames looks more vivid on screen.
  • S330: the central processing chip records the video data to a storage device.
  • The final video data is recorded by the central processing chip to a storage device, to be stored and retrieved later.
  • On its other path the IPU is connected to the display module: specifically, the enhanced image is sent to the data processing unit DPU.
  • The DPU adds the transparency-enabled interactive interface and sends the result to the display module to be loaded and displayed.
  • This first embodiment is an application of the technical solution of the present invention in which a camera component or camera module is used for video shooting. An external storage interface (USB, TF, etc.) is added through the image processing chip, so recordings can be stored directly on a storage device outside the image processing chip.
  • In a second embodiment, step S300 includes:
  • S310': the image processing chip returns the enhanced image data to a central processing chip in a smart terminal.
  • As in the first embodiment, the image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing, the preprocessed image data is sent to the image processing unit IPU of the image processing chip IPIC for image enhancement, and the enhanced image data formed after enhancement is transmitted back to the central processing chip.
  • S320': the central processing chip compresses the enhanced image data to form a compressed image.
  • The central processing chip compresses the enhanced image data, for example with a JPEG encoder, to form a compressed image.
  • The JPEG encoding process first converts the RGB format of the enhanced image data to YUV format.
  • When recording computer images, color information is most commonly stored as RGB (red, green, blue) color components; for example, an uncompressed 24-bit BMP image stores the image in RGB space.
  • A pixel occupies 24 bits, with each 8 bits storing one color intensity (0-255); red, for example, is stored as 0xFF0000.
  • YUV is a color coding method adopted by European television systems and is also commonly used in Chinese radio and television.
  • Y represents luminance (luma), i.e. the grayscale value, while U and V represent chrominance (chroma).
  • Color television adopted the YUV space precisely so that the luminance signal Y would solve the compatibility problem between color and black-and-white television sets, allowing black-and-white sets to receive color television signals.
  • Converting RGB to YUV typically uses Y = 0.299R + 0.587G + 0.114B, U = -0.147R - 0.289G + 0.436B, V = 0.615R - 0.515G - 0.100B; the inverse conversion uses R = Y + 1.14V, G = Y - 0.39U - 0.58V, B = Y + 2.03U.
  • The image is then divided into 8x8 blocks and transformed with the discrete cosine transform (DCT).
  • The discrete cosine transform is a transform-coding method commonly used for bit-rate compression. The Fourier transform of any continuous real even function contains only cosine terms, so the cosine transform has the same clear physical meaning as the Fourier transform.
  • DCT first divides the whole image into N x N pixel blocks and then transforms the blocks one by one. Because the high-frequency components of most images are small, the coefficients corresponding to those components are often zero, and because the human eye is less sensitive to distortion of high-frequency components, coarser quantization can be used for them. The bit rate needed to transmit the transform coefficients is therefore much lower than the bit rate needed to transmit the image pixels. At the receiving end the sample values are recovered through the inverse DCT; although there is some distortion, it is acceptable to the human eye.
  • The recovered image information can be written as F(n) = C(n) * E(n), where E(n) is a basis image, C(n) is a DCT coefficient, and F(n) is the image signal.
  • Any image block can be represented as a combination of 64 coefficients of different sizes. Since each basis image corresponds to a single coefficient in the transform domain, any pixel block can also be regarded as a combination of 64 basis images with different amplitudes, which has the same physical meaning as decomposing a signal into a fundamental wave and harmonics of different amplitudes.
  • Quantization discretizes the amplitude of a signal; after quantization the discrete signal becomes a digital signal. Because the human visual system (HVS) is more sensitive to low-frequency signals, a relatively short quantization step is used for the low-frequency part of the signal and a relatively long quantization step for the high-frequency part.
  • The quantized coefficients are read out in zigzag order and the AC coefficients are encoded with run-length encoding, in which one code simultaneously represents a value and the number of zeros preceding it.
  • The zigzag readout increases the chance of runs of consecutive zeros; in particular, if everything at the end is zero, an "end of block" (EOB) code can be emitted after the last non-zero value, ending the output and saving a great deal of bit rate.
  • S330': the central processing chip records the compressed image to a storage device.
  • The final compressed image is recorded by the central processing chip to a storage device, to be stored and retrieved later.
  • This second embodiment is an application of the technical solution of the present invention in which a camera component or camera module is used for image shooting and photography. An external storage interface (USB, TF, etc.) is added through the image processing chip, so photographs can be stored directly on a storage device outside the image processing chip.
  • In a third embodiment, step S300 includes:
  • S310'': after the central processing chip is connected to the image processing chip, the central processing chip sends pre-configured interactive interface (GUI) data for the image-preview interface to the image processing chip through the DSI interface.
  • DSI defines a high-speed serial interface between the processor and the display module. It consists of four layers, the PHY layer, the Lane Management layer, the Low Level Protocol layer, and the Application layer, corresponding to the D-PHY, DSI, and DCS specifications.
  • The PHY layer defines the transmission medium, the input/output circuits, and the clock and signaling mechanism.
  • The Lane Management layer distributes data streams to and collects them from each lane.
  • The Low Level Protocol layer defines how frames are assembled and parsed and how errors are detected.
  • The Application layer describes higher-level encoding and parsing of the data stream. The interactive interface data includes the transparency information of the interactive interface, i.e. the alpha information.
  • S320'': the image processing chip integrates the interactive interface data with the enhanced image data through alpha blending, so that the enhanced image is displayed with the interactive interface as the display interface; that is, an integrated image is displayed.
  • The integrated image is sent to the display module via the DPU for display.
  • This third embodiment is an application of the technical solution of the present invention in which a preview function is provided to the user while images are captured with a camera component or camera module; in cooperation with the interactive interface it gives the user a more familiar and acceptable visual experience.
  • The present invention also discloses an image processing system including a display module, an image sensor, an image processing chip, and a central processing chip provided in a terminal. The image processing chip is connected to the display module, the image sensor, and the central processing chip, respectively. The image sensor collects the image data of the subject and sends it to the image processing chip; the image sensor processing module of the image processing chip processes the image data to form enhanced image data; the image processing chip returns the enhanced image data to the central processing chip, the central processing chip records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to a display module, which displays the enhanced image corresponding to the enhanced image data.
  • The image processing system can likewise carry out the corresponding flows for applications required of the image sensor, such as video shooting, picture display, and photo preview.
  • For example, the central processing chip encodes the enhanced image data to form video data and records the video data to a storage device, or the central processing chip compresses the enhanced image data to form a compressed image and records the compressed image to the storage device; or the central processing chip sends interactive interface data to the image processing chip, and the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  • Based on the image processing method described above, a computer-readable storage medium may be installed in the smart terminal, on which a computer program is stored; when the computer program is executed by a processor, the steps of the image processing method described above are implemented.
  • Smart terminals can be implemented in various forms.
  • For example, the terminals described in the present invention may include smart terminals such as mobile phones, smartphones, notebook computers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • In the following it is assumed that the terminal is a smart terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention provides an image processing method and system, and a computer-readable storage medium. The image processing method includes the following steps: S100: an image sensor collects image data of a shooting object; S200: the image sensor sends the image data to an image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data; S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device; S400: the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data. With the above technical solution, the image processing flow is made independent of the CPU, external hardware is used to the full, and the imaging quality and the display effect of the mobile terminal are improved.

Description

Image processing method, system, and computer-readable storage medium
Technical Field
The present invention relates to the field of intelligent control, and in particular to an image processing method and system and a computer-readable storage medium.
Technical Background
Mobile terminal devices such as mobile phones and tablet computers have become an ever larger part of people's lives. Such mobile terminals therefore integrate multiple functions such as photo shooting, video recording, positioning, and lighting, so that a person carrying a single mobile terminal can satisfy many different needs. When a mobile terminal is used for shooting photos and video, the resolution and pixel count of the camera module keep increasing, bringing the imaging the terminal can achieve ever closer to that of professional camera equipment. Specifically, mobile terminal devices mainly take two imaging forms:
1. An image sensing processing module built into the CPU:
As shown in FIG. 1, an image sensor processing module ISP is built into the central processing unit CPU of the mobile terminal. The camera group sends data to the ISP through the MIPI interface, and the ISP processes the data and then passes it to different modules according to the application. For example, over data path 1 the data is sent to the data processing unit DPU, which processes it and sends it to the LCD display, providing the camera module's shooting preview; over data path 2 the data is sent to the video processing unit VPU, which encodes it and stores the data in the storage device, providing the camera module's video recording function; and over data path 3 the data is sent to the picture compressor (jpeg encoder), which compresses the data into a JPEG file and stores it in the storage device, providing the camera module's still-photo function.
2. An image sensing processing module external to the CPU
As shown in FIG. 2, an image sensing processing module is connected outside the CPU and an internal ISP-bypass approach is used; apart from bypassing the built-in ISP, the data flow is the same as in the first form.
Because the existing solutions all use the CPU's internal image sensing and processing functions, image data processing is constrained by the limits of a certain internal CPU function: captured images cannot be adjusted in a more professional and detailed way, and at the same time the CPU's data-processing load is increased, slowing the mobile terminal down.
A new image processing method and system are therefore needed that take the image processing flow out of the CPU, make full use of external hardware, and improve imaging quality and the display effect of the mobile terminal.
Summary of the Invention
To overcome the above technical defects, an object of the present invention is to provide an image processing method and system and a computer-readable storage medium in which the previewed image data is displayed directly without passing through the CPU, speeding up image preview.
The invention discloses an image processing method comprising the following steps:
S100: an image sensor collects image data of a shooting object;
S200: the image sensor sends the image data to an image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data;
S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device;
S400: the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
Preferably, step S200 includes:
S210: the image sensor sends the image data to the image processing chip through a camera interface;
S220: the image sensing module of the image processing chip processes the image data to form enhanced image data.
Preferably, step S300 includes:
S310: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
S320: the central processing chip encodes the enhanced image data to form video data;
S330: the central processing chip records the video data to the storage device.
Preferably, step S300 includes:
S310': the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
S320': the central processing chip compresses the enhanced image data to form a compressed image;
S330': the central processing chip records the compressed image to the storage device.
Preferably, step S300 includes:
S310'': the central processing chip sends interactive interface data to the image processing chip;
S320'': the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
Preferably, the interactive interface data includes transparency information of the interactive interface.
The invention also discloses an image processing system, which includes a display module, an image sensor, an image processing chip, and a central processing chip provided in a terminal;
the image processing chip is connected to the display module, the image sensor, and the central processing chip, respectively;
the image sensor collects image data of a shooting object and sends it to the image processing chip;
the image sensing processing module of the image processing chip processes the image data to form enhanced image data;
the image processing chip returns the enhanced image data to the central processing chip, the central processing chip records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
Preferably, the central processing chip encodes the enhanced image data to form video data and records the video data to the storage device, or
the central processing chip compresses the enhanced image data to form a compressed image and records the compressed image to the storage device.
Preferably, the central processing chip sends interactive interface data to the image processing chip;
the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
The invention further discloses a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the image processing method described above are implemented.
With the above technical solution, compared with the prior art, the invention has the following beneficial effects:
1. The image preview data is sent directly to the display screen and is not processed by the CPU, reducing the CPU load and speeding up preview;
2. External hardware is used to the full, improving imaging quality and the display effect of the mobile terminal.
Brief Description of the Drawings
FIG. 1 is a schematic structural diagram of a prior-art system for processing captured images;
FIG. 2 is a schematic structural diagram of another prior-art system for processing captured images;
FIG. 3 is a schematic flowchart of an image processing method according to an embodiment of the present invention;
FIG. 4 is a schematic flowchart of an image processing method according to a first preferred embodiment of the present invention;
FIG. 5 is a schematic flowchart of an image processing method according to a second preferred embodiment of the present invention;
FIG. 6 is a schematic flowchart of an image processing method according to a third preferred embodiment of the present invention;
FIG. 7 is a schematic structural diagram of an image processing system according to a preferred embodiment of the present invention.
Detailed Description of the Invention
The advantages of the present invention are further explained below in conjunction with the drawings and specific embodiments.
Exemplary embodiments are described in detail here, examples of which are shown in the drawings. Where the following description refers to the drawings, the same numerals in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the present disclosure; rather, they are merely examples of apparatuses and methods consistent with some aspects of the disclosure as detailed in the appended claims.
The terminology used in this disclosure is for the purpose of describing particular embodiments only and is not intended to limit the disclosure. The singular forms "a", "said", and "the" used in this disclosure and the appended claims are also intended to include the plural forms unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, and so on may be used in this disclosure to describe various information, such information should not be limited to these terms; the terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of this disclosure, first information may also be referred to as second information and, similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "when", "while", or "in response to determining".
In the description of the present invention, it should be understood that orientation or positional terms such as "longitudinal", "transverse", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", and "outer" indicate orientations or positional relationships based on the drawings; they are used only to facilitate and simplify the description of the invention, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation, and therefore cannot be understood as limiting the invention.
In the description of the present invention, unless otherwise specified and limited, it should be noted that the terms "mounted", "linked", and "connected" should be understood broadly; for example, a connection may be mechanical or electrical, may be internal communication between two elements, and may be direct or indirect through an intermediate medium. For those of ordinary skill in the art, the specific meaning of these terms can be understood according to the specific situation.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are adopted only to facilitate the description of the invention and have no specific meaning of their own; "module" and "component" may therefore be used interchangeably.
Referring to FIG. 3, a schematic flowchart of an image processing method according to a preferred embodiment of the present invention, the image processing method in this embodiment includes the following steps:
S100: an image sensor collects image data of a shooting object
To form the initial image to be processed, an image sensor such as a photographing camera module, a lens module, or a prism module performs image acquisition on a shooting object, for example shooting stills, recording video, or recording motion frames, thereby forming the original image data.
It will be understood that, because different image sensors have different hardware structures, the collected image data may be unretouched image data without any modification, or image data that has already undergone initial processing such as beautification, cropping, optimization, or rendering. Whether or not the image data in this step has been processed, it is further optimized in the subsequent steps.
S200: the image sensor sends the image data to the image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data
The image data formed in step S100 is sent by the image sensor to an image processing chip (IPIC) that is electrically connected to the image sensor. The electrical connection between the image sensor and the image processing chip may take the form of two independent image sensing and image processing modules integrated on the same printed circuit board, two independent devices connected by wires, or two independent devices arranged in the same local area network or wirelessly connected to each other. After receiving the image data, the image processing chip IPIC performs preprocessing such as black-level correction and dead-pixel correction, RAW data processing such as denoising and lens correction, and fine processing such as image enhancement, thereby forming enhanced image data. Depending on the processing capability of the image processing chip IPIC, the enhanced image data can be optimized to different degrees. In this embodiment the image processing chip, as a device separate from any other module, unit, or device, optimizes the image data while running independently; on the one hand this reduces the load that the image processing function places on other devices, and on the other hand it dedicates a special-purpose device to image processing rather than an integrated device with some functions cut down, increasing the efficiency and quality of image processing.
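As an illustration of the kind of processing the IPIC might perform, the following Python/NumPy sketch chains simplified versions of black-level correction, dead-pixel correction, denoising, and contrast enhancement on a RAW frame. The function names, the 3x3 median repair, and the fixed black level are assumptions made for this example; the patent does not specify concrete algorithms.

```python
import numpy as np

def black_level_correction(raw: np.ndarray, black_level: int = 64) -> np.ndarray:
    """Subtract the sensor's black level (assumed constant here)."""
    return np.clip(raw.astype(np.int32) - black_level, 0, None)

def dead_pixel_correction(raw: np.ndarray, threshold: int = 200) -> np.ndarray:
    """Replace pixels that differ sharply from their 3x3 neighborhood median (slow, illustrative loop)."""
    out = raw.copy()
    h, w = raw.shape
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            med = np.median(raw[y - 1:y + 2, x - 1:x + 2])
            if abs(int(raw[y, x]) - med) > threshold:
                out[y, x] = med
    return out

def denoise(raw: np.ndarray) -> np.ndarray:
    """Very simple 3x3 box-filter denoising."""
    padded = np.pad(raw, 1, mode="edge").astype(np.float32)
    acc = np.zeros(raw.shape, dtype=np.float32)
    for dy in range(3):
        for dx in range(3):
            acc += padded[dy:dy + raw.shape[0], dx:dx + raw.shape[1]]
    return acc / 9.0

def enhance(img: np.ndarray) -> np.ndarray:
    """Stretch contrast to the full 8-bit range as a stand-in for image enhancement."""
    lo, hi = img.min(), img.max()
    return ((img - lo) / max(hi - lo, 1e-6) * 255.0).astype(np.uint8)

def ipic_process(raw_frame: np.ndarray) -> np.ndarray:
    """Hypothetical IPIC chain: preprocessing, RAW processing, enhancement."""
    x = black_level_correction(raw_frame)
    x = dead_pixel_correction(x)
    x = denoise(x)
    return enhance(x)
```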
S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device
The enhanced image data formed after processing is returned by the image processing chip to a central processing chip (the CPU or AP side). The central processing chip is set in a smart terminal, and the image processing chip is attached externally to the central processing chip; the two may also be set in the same smart terminal but arranged separately from each other in a distributed design. The image processing chip and the central processing chip may be designed on the same printed circuit board (but not integrated), connected by wires, or wirelessly connected through mutually supported wireless protocols. By keeping the central processing chip and the image processing chip independent, the central processing chip can centrally relay and issue the instructions for each component in the smart terminal, while the image processing function is split out of the central processing chip and completed solely by the external image processing chip. This reduces the load on the central processing chip and also simplifies its design requirements: its original internal image-processing capability can be compressed or simplified, and the optimization of externally collected image data is handed over to a more specialized, more highly optimized image processing chip, which speeds up image processing. At the same time, because the image data no longer passes through the central processing chip, the processed image data is also displayed and loaded faster.
After the central processing chip receives the enhanced image data, it records the enhanced image data to a storage device, such as the memory of the smart terminal or an external storage device of the smart terminal (a U disk, a portable hard disk, another device connected via OTG, and so on). In this step, in order to store the processed enhanced image data, the central processing chip performs only a forward-and-record action and makes no change to the enhanced image data itself, reducing the operating load of the central processing chip.
S400: the image processing chip sends the enhanced image data to a display module, and the display module displays the enhanced image corresponding to the enhanced image data.
Step S400 and step S300 may be performed synchronously or asynchronously; that is, at any time while, before, or after the central processing chip records the enhanced image data to the storage device, the image processing chip sends the enhanced image data to a display module, specifically a display module such as a display screen or touch screen installed on the smart terminal, which receives and displays the enhanced image corresponding to the enhanced image data, i.e. the enhanced image formed from the initial image data by image processing such as intelligent optimization, rendering, color-temperature adjustment, fill light, AR enhancement, information recognition, beautification, and cropping.
By having the separate image processing chip described above enhance the image, and sending the processed enhanced image directly along one path to the display module for display and along another path to the storage device for storage, the transmission speed of the processed image and its loading and display speed on the user side are improved. In addition, the camera module's performance in preview, still photography, and video recording is improved, as is the recording of fast-moving scenes.
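To make the two data paths concrete, the following minimal Python sketch models the flow of steps S100 to S400: the image sensor produces a frame, the image processing chip enhances it, and the enhanced frame then goes both directly to the display module and back to the central processing chip, which only forwards it to storage. All class and method names (ImageSensor, DisplayModule, and so on) are hypothetical stand-ins and are not taken from the patent.

```python
import numpy as np

class ImageSensor:
    """Stand-in for the camera/lens/prism module (step S100)."""
    def capture(self) -> np.ndarray:
        return np.random.randint(0, 1024, size=(480, 640), dtype=np.uint16)

class ImageProcessingChip:
    """Stand-in for the external IPIC (step S200)."""
    def enhance(self, raw: np.ndarray) -> np.ndarray:
        # Placeholder for black-level/dead-pixel correction, denoising, enhancement.
        return (raw >> 2).astype(np.uint8)

class CentralProcessingChip:
    """Stand-in for the CPU/AP side (step S300): forward-and-record only."""
    def __init__(self, storage: list):
        self.storage = storage
    def record(self, enhanced: np.ndarray) -> None:
        self.storage.append(enhanced)      # no modification of the data itself

class DisplayModule:
    """Stand-in for the LCD/touch screen (step S400)."""
    def show(self, enhanced: np.ndarray) -> None:
        print(f"displaying {enhanced.shape} frame, mean={enhanced.mean():.1f}")

def run_pipeline() -> None:
    sensor, ipic = ImageSensor(), ImageProcessingChip()
    cpu, display = CentralProcessingChip(storage=[]), DisplayModule()
    raw = sensor.capture()                 # S100
    enhanced = ipic.enhance(raw)           # S200
    cpu.record(enhanced)                   # S300: one path to storage via the CPU
    display.show(enhanced)                 # S400: the other path straight to the display

if __name__ == "__main__":
    run_pipeline()
```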
In a preferred embodiment, step S200 includes:
S210: the image sensor sends the image data to the image processing chip through a camera interface
The image sensor sends the image data to the image processing chip through the camera interface CAMIF. CAMIF is the first part of the video front-end (VFE) hardware; its main task is to synchronize the line and field synchronization signals involved while the sensor is sending data, and it also provides image extraction and image mirroring. To synchronize precisely with the external camera sensor, the CAMIF hardware must provide two programmable interrupt lines, one to control opening the shutter and the other to control the flash. The CAMIF hardware inputs include PCLK, HSYNC, VSYNC, a pixel-enable gate, and a 12-bit data line, and the input interface type is controlled by the ADSP. At power-on the CAMIF module is closed and does not capture data until the ADSP sets the enable bit to start data capture. Once the enable bit is set, CAMIF waits until the start of the next frame before it begins collecting data. Similarly, when the ADSP sets it to disable, CAMIF stops working only after the next frame of data has finished being sent, ensuring that everything is completed. The CAMIF hardware also provides resampling of the camera sensor data: before CAMIF outputs data, the ADSP can independently choose whether to output the complete image or a sub-sampled one.
S220: the image sensing module of the image processing chip processes the image data to form enhanced image data.
In different embodiments the image processing chip handles the image data or enhanced image data differently; the details are explained below through the descriptions of the different embodiments.
Embodiment 1
Referring to FIG. 4, in this embodiment step S300 includes:
S310: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal
The image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing is complete, the preprocessed image data is sent to the image processing unit IPU of the image processing chip IPIC for image enhancement, and the enhanced image data formed after image enhancement is transmitted back to the central processing chip.
S320: the central processing chip encodes the enhanced image data to form video data
After the central processing chip receives the enhanced image data, it performs video encoding. The encoding protocol may be the ITU's H.261 or H.263, Motion JPEG (M-JPEG), one of the MPEG series of standards from the ISO Moving Picture Experts Group, or the like; video data is formed after encoding. Because the video data is based on the enhanced image data formed by the image processing chip, every frame of the resulting video has already been optimized, and the video assembled from these frames looks more vivid on screen.
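As a rough illustration of this CPU-side step, the sketch below receives already-enhanced frames and writes them to a video file using OpenCV's VideoWriter. The codec choice (mp4v), the frame rate, and the synthetic frame generator are assumptions made for the example; the patent only requires that some standard video encoding be applied to the enhanced frames.

```python
import cv2
import numpy as np

def record_enhanced_frames(enhanced_frames, path="enhanced.mp4", fps=30.0):
    """Encode a stream of already-enhanced BGR frames into a video file (S320/S330)."""
    writer = None
    for frame in enhanced_frames:
        if writer is None:
            h, w = frame.shape[:2]
            fourcc = cv2.VideoWriter_fourcc(*"mp4v")   # MPEG-4 codec as one possible choice
            writer = cv2.VideoWriter(path, fourcc, fps, (w, h))
        writer.write(frame)                            # the CPU only encodes and records
    if writer is not None:
        writer.release()

# Example with synthetic "enhanced" frames standing in for IPIC output.
frames = (np.full((480, 640, 3), i % 256, dtype=np.uint8) for i in range(90))
record_enhanced_frames(frames)
```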
S330: the central processing chip records the video data to the storage device.
The final video data is recorded by the central processing chip to the storage device, to be stored and retrieved later.
On its other path the IPU is connected to the display module; specifically, the enhanced image is sent to the data processing unit DPU, which adds the transparency-enabled interactive interface and sends the result to the display module to be loaded and displayed.
This first embodiment is an application of the technical solution of the present invention in which a camera component or camera module is used for video shooting; an external storage interface (USB, TF, etc.) is added through the image processing chip, so recordings can be stored directly on a storage device outside the image processing chip.
Embodiment 2
Referring to FIG. 5, in this embodiment step S300 includes:
S310': the image processing chip returns the enhanced image data to a central processing chip in a smart terminal
As in Embodiment 1, the image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing is complete, the preprocessed image data is sent to the image processing unit IPU of the image processing chip IPIC for image enhancement, and the enhanced image data formed after enhancement is transmitted back to the central processing chip.
S320': the central processing chip compresses the enhanced image data to form a compressed image
The central processing chip compresses the enhanced image data, for example with a JPEG encoder, to form a compressed image. The JPEG encoding process first converts the RGB format of the enhanced image data to YUV format. When recording computer images, color information is most commonly stored as RGB (red, green, blue) color components; for example, an uncompressed 24-bit BMP image stores the image in RGB space. A pixel occupies 24 bits, with each 8 bits storing one color intensity (0-255); red, for example, is stored as 0xFF0000. YUV is a color coding method adopted by European television systems and is also widely used in Chinese radio and television. "Y" represents luminance (luma), i.e. the grayscale value, while "U" and "V" represent chrominance (chroma). Color television adopted the YUV space precisely so that the luminance signal Y would solve the compatibility problem between color and black-and-white television sets, allowing black-and-white sets to receive color signals. Converting RGB to YUV is usually done with Y = 0.299R + 0.587G + 0.114B, U = -0.147R - 0.289G + 0.436B, V = 0.615R - 0.515G - 0.100B, and the inverse conversion with R = Y + 1.14V, G = Y - 0.39U - 0.58V, B = Y + 2.03U.
The image is then divided into 8x8 blocks: after the original image has been converted to YUV format, it is sampled with a certain sampling format, common formats being 4:4:4, 4:2:2, and 4:2:0, and after sampling the image is divided into 8x8-pixel MCUs. The discrete cosine transform (DCT) is then applied. The DCT is a transform-coding method commonly used for bit-rate compression; the Fourier transform of any continuous real even function contains only cosine terms, so the cosine transform has the same clear physical meaning as the Fourier transform. DCT first divides the whole image into N x N pixel blocks and then transforms the blocks one by one. Because the high-frequency components of most images are small, the coefficients corresponding to those components are often zero, and since the human eye is less sensitive to distortion of high-frequency components, coarser quantization can be used for them; the bit rate needed to transmit the transform coefficients is therefore much lower than the bit rate needed to transmit image pixels. At the receiving end the sample values are recovered through the inverse DCT; although there is some distortion, it is acceptable to the human eye. The image signal is decomposed into a DC component and cosine components from low to high frequency, and each DCT coefficient simply expresses how large a share of the original image signal that component has. The recovered image information can therefore be written in the matrix form F(n) = C(n) * E(n), where E(n) is a basis, C(n) is a DCT coefficient, and F(n) is the image signal. If variation in the vertical direction is also considered, a two-dimensional basis is needed that reflects changes in both horizontal and vertical spatial frequency; for an 8x8 pixel block, each spatial basis is an image made up of 64 pixel values, usually called a basis image. They are called basis images because, in the inverse DCT, any pixel block can be represented as a combination of the 64 coefficients at different magnitudes; since a basis image corresponds to a single coefficient in the transform domain, any block can also be regarded as a combination of 64 basis images with different amplitudes, which has the same physical meaning as decomposing a signal into a fundamental wave and harmonics of different amplitudes.
Quantization is the process of discretizing the amplitude of a signal; after quantization the discrete signal becomes a digital signal. Because the human visual system (HVS) is more sensitive to low-frequency signals, a relatively short quantization step is used for the low-frequency part of the signal and a relatively long quantization step for the high-frequency part, which yields a relatively clear image and a higher compression ratio. A zigzag scan is then performed, reading out the quantized data in zigzag order. Finally, run-length encoding (RLE) is applied to the AC coefficients; run-length encoding means that one code simultaneously represents a value and the number of zeros preceding it. This exploits the advantage of the zigzag readout: the zigzag order produces more runs of consecutive zeros, and in particular, if everything at the end is zero, an "end of block" (EOB) code can be emitted after the last non-zero value to terminate the output, saving a great deal of bit rate.
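The following Python/NumPy sketch walks through these stages on a single 8x8 luma block: RGB-to-YUV conversion with the coefficients given above, an 8x8 DCT-II, quantization with the standard JPEG luminance table, zigzag readout, and a simple (zero-run, value) run-length encoding. It is a teaching-oriented sketch of the textbook JPEG steps described here, not the patent's own encoder, and the helper names are assumptions.

```python
import numpy as np

def rgb_to_yuv(rgb: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 RGB image to YUV using the coefficients in the description."""
    m = np.array([[0.299, 0.587, 0.114],
                  [-0.147, -0.289, 0.436],
                  [0.615, -0.515, -0.100]])
    return rgb.astype(np.float64) @ m.T

def dct2(block: np.ndarray) -> np.ndarray:
    """2-D orthonormal DCT-II of an 8x8 block, computed directly from its definition."""
    n = 8
    k = np.arange(n)
    basis = np.cos((2 * k[None, :] + 1) * k[:, None] * np.pi / (2 * n))  # basis[u, x]
    scale = np.full(n, np.sqrt(2.0 / n))
    scale[0] = np.sqrt(1.0 / n)
    c = basis * scale[:, None]
    return c @ block @ c.T

# Standard JPEG luminance quantization table (ITU-T T.81, Annex K).
Q_LUMA = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99]])

def zigzag(block: np.ndarray) -> np.ndarray:
    """Read an 8x8 block in zigzag order so that high-frequency zeros cluster at the end."""
    idx = sorted(((x, y) for x in range(8) for y in range(8)),
                 key=lambda p: (p[0] + p[1], p[0] if (p[0] + p[1]) % 2 else p[1]))
    return np.array([block[x, y] for x, y in idx])

def run_length_encode(ac: np.ndarray):
    """Encode AC coefficients as (zero-run, value) pairs, ending with an EOB marker."""
    out, run = [], 0
    for v in ac:
        if v == 0:
            run += 1
        else:
            out.append((run, int(v)))
            run = 0
    out.append("EOB")
    return out

# Demo on one synthetic 8x8 block.
rgb = np.random.randint(0, 256, size=(8, 8, 3))
y = rgb_to_yuv(rgb)[..., 0] - 128                 # luma channel, level-shifted before the DCT
quantized = np.round(dct2(y) / Q_LUMA).astype(int)
zz = zigzag(quantized)
print("DC:", zz[0], "AC RLE:", run_length_encode(zz[1:]))
```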
S330': the central processing chip records the compressed image to the storage device
The final compressed image is recorded by the central processing chip to the storage device, to be stored and retrieved later.
This second embodiment is an application of the technical solution of the present invention in which a camera component or camera module is used for image shooting and photography; an external storage interface (USB, TF, etc.) is added through the image processing chip, so photographs can be stored directly on a storage device outside the image processing chip.
Embodiment 3
Referring to FIG. 6, in this embodiment step S300 includes:
S310'': the central processing chip sends interactive interface data to the image processing chip;
After the central processing chip is connected to the image processing chip, the central processing chip sends the pre-configured interactive interface (GUI) data of the preview interface used for image preview to the image processing chip through the DSI interface. DSI defines a high-speed serial interface between the processor and the display module. It is divided into four layers, the PHY layer, the Lane Management layer, the Low Level Protocol layer, and the Application layer, corresponding to the D-PHY, DSI, and DCS specifications. The PHY layer defines the transmission medium, the input/output circuits, and the clock and signaling mechanism; the Lane Management layer distributes data streams to and collects them from each lane; the Low Level Protocol layer defines how frames are assembled and parsed and how errors are detected; and the Application layer describes higher-level encoding and parsing of the data stream. The interactive interface data includes the transparency information of the interactive interface, i.e. the alpha information.
S320'': the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image
The image processing chip integrates the interactive interface data with the enhanced image data through alpha blending, so that the enhanced image is displayed with the interactive interface as the display interface; that is, an integrated image is displayed. The integrated image is sent to the display module via the DPU for display.
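A minimal sketch of the alpha blending step, assuming the GUI layer arrives as an RGBA image whose alpha channel carries the transparency information mentioned above and the enhanced frame is plain RGB; the per-pixel rule out = alpha * gui + (1 - alpha) * frame is the standard compositing formula, and the function name is illustrative.

```python
import numpy as np

def alpha_blend(gui_rgba: np.ndarray, enhanced_rgb: np.ndarray) -> np.ndarray:
    """Composite a GUI layer (HxWx4, alpha in [0, 255]) over an enhanced frame (HxWx3)."""
    alpha = gui_rgba[..., 3:4].astype(np.float32) / 255.0
    gui = gui_rgba[..., :3].astype(np.float32)
    frame = enhanced_rgb.astype(np.float32)
    blended = alpha * gui + (1.0 - alpha) * frame   # integrated image for the display path
    return blended.astype(np.uint8)

# Example: a semi-transparent white GUI bar composited over a synthetic enhanced frame.
frame = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
gui = np.zeros((480, 640, 4), dtype=np.uint8)
gui[:60, :, :3] = 255          # white bar
gui[:60, :, 3] = 128           # 50% transparency
integrated = alpha_blend(gui, frame)
```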
This third embodiment is an application of the technical solution of the present invention in which a preview function is provided to the user while images are captured with a camera component or camera module; through cooperation with the interactive interface it gives the user a more familiar and acceptable visual experience.
Referring to FIG. 7, the present invention also discloses an image processing system including a display module, an image sensor, an image processing chip, and a central processing chip provided in a terminal; the image processing chip is connected to the display module, the image sensor, and the central processing chip, respectively; the image sensor collects image data of a shooting object and sends it to the image processing chip; the image sensing processing module of the image processing chip processes the image data to form enhanced image data; the image processing chip returns the enhanced image data to the central processing chip, the central processing chip records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to a display module, which displays the enhanced image corresponding to the enhanced image data.
Preferably, the image processing system can likewise carry out the corresponding operation flows for applications required of the image sensor, such as video shooting, picture display, and photo preview. For example, the central processing chip encodes the enhanced image data to form video data and records the video data to the storage device, or the central processing chip compresses the enhanced image data to form a compressed image and records the compressed image to the storage device; or the central processing chip sends interactive interface data to the image processing chip, and the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
In addition, based on the image processing method described above, a computer-readable storage medium may be installed in the smart terminal, on which a computer program is stored; when the computer program is executed by a processor, the steps of the image processing method described above are implemented.
Smart terminals can be implemented in various forms. For example, the terminals described in the present invention may include smart terminals such as mobile phones, smartphones, notebook computers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following it is assumed that the terminal is a smart terminal; however, those skilled in the art will understand that, apart from elements used specifically for mobile purposes, the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals.
It should be noted that the embodiments of the present invention are preferred implementations and are not intended to limit the invention in any form. Anyone familiar with the field may use the technical content disclosed above to derive equivalent effective embodiments through changes or modifications; however, any modification, equivalent change, or refinement of the above embodiments made in accordance with the technical essence of the present invention without departing from the content of the technical solution still falls within the scope of the technical solution of the present invention.

Claims (10)

  1. An image processing method, characterized in that it comprises the following steps:
    S100: an image sensor collects image data of a shooting object;
    S200: the image sensor sends the image data to an image processing chip, and an image sensing processing module of the image processing chip processes the image data to form enhanced image data;
    S300: the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device;
    S400: the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
  2. The image processing method according to claim 1, characterized in that
    step S200 comprises:
    S210: the image sensor sends the image data to the image processing chip through a camera interface;
    S220: the image sensing module of the image processing chip processes the image data to form the enhanced image data.
  3. The image processing method according to claim 2, characterized in that
    step S300 comprises:
    S310: the image processing chip returns the enhanced image data to the central processing chip in the smart terminal;
    S320: the central processing chip encodes the enhanced image data to form video data;
    S330: the central processing chip records the video data to the storage device.
  4. The image processing method according to claim 2, characterized in that
    step S300 comprises:
    S310': the image processing chip returns the enhanced image data to the central processing chip in the smart terminal;
    S320': the central processing chip compresses the enhanced image data to form a compressed image;
    S330': the central processing chip records the compressed image to the storage device.
  5. The image processing method according to claim 2, characterized in that
    step S300 comprises:
    S310'': the central processing chip sends interactive interface data to the image processing chip;
    S320'': the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  6. The image processing method according to claim 6, characterized in that
    the interactive interface data comprises transparency information of the interactive interface.
  7. An image processing system, characterized in that it comprises a display module, an image sensor, an image processing chip, and a central processing chip provided in a terminal;
    the image processing chip is connected to the display module, the image sensor, and the central processing chip, respectively;
    the image sensor collects image data of a shooting object and sends it to the image processing chip;
    an image sensing processing module of the image processing chip processes the image data to form enhanced image data;
    the image processing chip returns the enhanced image data to the central processing chip, the central processing chip records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
  8. The image processing system according to claim 7, characterized in that
    the central processing chip encodes the enhanced image data to form video data and records the video data to the storage device, or
    the central processing chip compresses the enhanced image data to form a compressed image and records the compressed image to the storage device.
  9. The image processing system according to claim 7, characterized in that
    the central processing chip sends interactive interface data to the image processing chip;
    the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  10. A computer-readable storage medium on which a computer program is stored, characterized in that, when the computer program is executed by a processor, the steps of the image processing method according to any one of claims 1 to 6 are implemented.
PCT/CN2019/107192 2018-09-29 2019-09-23 图像处理方法、系统及计算机可读存储介质 WO2020063505A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811145617.7 2018-09-29
CN201811145617.7A CN109167915A (zh) 2018-09-29 2018-09-29 图像处理方法、系统及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2020063505A1 true WO2020063505A1 (zh) 2020-04-02

Family

ID=64892864

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/107192 WO2020063505A1 (zh) 2018-09-29 2019-09-23 图像处理方法、系统及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN109167915A (zh)
WO (1) WO2020063505A1 (zh)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN110062161B (zh) * 2019-04-10 2021-06-25 Oppo广东移动通信有限公司 图像处理器、图像处理方法、拍摄装置和电子设备
CN111601019B (zh) * 2020-02-28 2021-11-16 北京爱芯科技有限公司 图像数据处理模组及电子设备
KR20230091877A (ko) * 2020-10-19 2023-06-23 퀄컴 인코포레이티드 계층 특성을 우선순위화하는 것에 의한 이미지 데이터의 프로세싱
CN112565603B (zh) * 2020-11-30 2022-05-10 维沃移动通信有限公司 图像处理方法、装置及电子设备
CN114827430B (zh) * 2021-01-19 2024-05-21 Oppo广东移动通信有限公司 一种图像处理方法、芯片及电子设备
CN116095509B (zh) * 2021-11-05 2024-04-12 荣耀终端有限公司 生成视频帧的方法、装置、电子设备及存储介质
CN114286003A (zh) * 2021-12-28 2022-04-05 维沃移动通信有限公司 拍摄方法、拍摄装置和电子设备
CN114285957A (zh) * 2021-12-28 2022-04-05 维沃移动通信有限公司 图像处理电路及数据传输方法
CN114339071A (zh) * 2021-12-28 2022-04-12 维沃移动通信有限公司 图像处理电路、图像处理方法及电子设备
CN115107653A (zh) * 2022-07-29 2022-09-27 山东浪潮科学研究院有限公司 基于fpga的电子后视镜系统

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103220530A (zh) * 2013-04-22 2013-07-24 郑永春 用于智能监控的高分辨率图像处理系统及方法
CN104869381A (zh) * 2014-02-25 2015-08-26 炬芯(珠海)科技有限公司 一种图像处理系统、方法及装置
CN205375584U (zh) * 2016-01-04 2016-07-06 临沂大学 一种计算机内独立图像采集系统
US20160352999A1 (en) * 2014-06-03 2016-12-01 2P & M Holdings, LLC RAW Camera Peripheral With Handheld Mobile Unit Processing RAW Image Data
CN108322648A (zh) * 2018-02-02 2018-07-24 广东欧珀移动通信有限公司 图像处理方法和装置、电子设备、计算机可读存储介质
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109242757A (zh) * 2018-09-29 2019-01-18 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100576164C (zh) * 2006-11-30 2009-12-30 北京思比科微电子技术有限公司 基于usb模式的图像传输方法及装置
KR100969322B1 (ko) * 2008-01-10 2010-07-09 엘지전자 주식회사 멀티 그래픽 컨트롤러를 구비한 데이터 처리 장치 및 이를이용한 데이터 처리 방법
CN102883167A (zh) * 2012-09-19 2013-01-16 旗瀚科技有限公司 一种视频图像数据处理方法及系统
CN203675199U (zh) * 2013-12-31 2014-06-25 冠捷显示科技(厦门)有限公司 一种可升级软硬件性能的电视
TW201637432A (zh) * 2015-04-02 2016-10-16 Ultracker Technology Co Ltd 即時影像縫合裝置及即時影像縫合方法

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103220530A (zh) * 2013-04-22 2013-07-24 郑永春 用于智能监控的高分辨率图像处理系统及方法
CN104869381A (zh) * 2014-02-25 2015-08-26 炬芯(珠海)科技有限公司 一种图像处理系统、方法及装置
US20160352999A1 (en) * 2014-06-03 2016-12-01 2P & M Holdings, LLC RAW Camera Peripheral With Handheld Mobile Unit Processing RAW Image Data
CN205375584U (zh) * 2016-01-04 2016-07-06 临沂大学 一种计算机内独立图像采集系统
CN108322648A (zh) * 2018-02-02 2018-07-24 广东欧珀移动通信有限公司 图像处理方法和装置、电子设备、计算机可读存储介质
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109242757A (zh) * 2018-09-29 2019-01-18 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质

Also Published As

Publication number Publication date
CN109167915A (zh) 2019-01-08

Similar Documents

Publication Publication Date Title
WO2020063505A1 (zh) 图像处理方法、系统及计算机可读存储介质
WO2020063507A1 (zh) 图像处理方法、系统及计算机可读存储介质
WO2020063508A1 (zh) 智能终端、图像处理方法及计算机可读存储介质
WO2020063506A1 (zh) 智能终端、图像处理方法及计算机可读存储介质
US10326904B2 (en) On-chip image sensor data compression
JP3995595B2 (ja) 携帯電話用の最適化されたカメラセンサ構造
US8558909B2 (en) Method and apparatus for generating compressed file, camera module associated therewith, and terminal including the same
US8179452B2 (en) Method and apparatus for generating compressed file, and terminal comprising the apparatus
EP1667457A1 (en) Image processing display device and image processing display method
US6697106B1 (en) Apparatus for processing image signals representative of a still picture and moving pictures picked up
US20070065022A1 (en) Image signal processing apparatus and method
JP2001238190A (ja) 画像処理装置及びその制御処理方法
US20050190270A1 (en) Method and apparatus for processing image signal
US20120262603A1 (en) Image Capturing Device and Image Processing Method Thereof
US20090167888A1 (en) Methods of processing imaging signal and signal processing devices performing the same
JP4302661B2 (ja) 画像処理システム
KR101666927B1 (ko) 압축 파일 생성 방법 및 장치, 이를 포함하는 단말기
US20050068336A1 (en) Image overlay apparatus and method for operating the same
US8154749B2 (en) Image signal processor and deferred vertical synchronous signal outputting method
JP2018207424A (ja) 情報転送装置
TWI493502B (zh) 處理影像旋轉的方法與裝置
KR100902421B1 (ko) 캡쳐 영상을 시간 지연 없이 표시할 수 있는 영상 처리장치, 방법 및 상기 방법을 프로그램화하여 수록한컴퓨터로 읽을 수 있는 기록매체
JPH02107094A (ja) 静止画のディジタル記録装置
JPH11220643A (ja) デジタルスチルカメラ
JPH05103294A (ja) 画像処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19867557

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.06.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19867557

Country of ref document: EP

Kind code of ref document: A1
