WO2020063506A1 - Intelligent terminal, image processing method and computer-readable storage medium - Google Patents

Intelligent terminal, image processing method and computer-readable storage medium

Info

Publication number
WO2020063506A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
processing chip
data
image data
central processing
Prior art date
Application number
PCT/CN2019/107193
Other languages
English (en)
French (fr)
Inventor
郑自浩
宋刚
Original Assignee
上海众链科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 上海众链科技有限公司 filed Critical 上海众链科技有限公司
Publication of WO2020063506A1 publication Critical patent/WO2020063506A1/zh

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/62Control of parameters via user interfaces
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/10Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/76Television signal recording

Definitions

  • the invention relates to the field of intelligent control, in particular to an intelligent terminal, an image processing method, and a computer-readable storage medium.
  • mobile terminal devices such as mobile phones and tablet computers have become an increasingly common part of daily life. Such terminals therefore integrate multiple functions, such as photography, video recording, positioning, and lighting, so that a single device can meet many different needs.
  • the resolution and pixel count of camera modules keep increasing, so the imaging that a mobile terminal can achieve with this function comes ever closer to that of professional camera equipment.
  • in the first form, an image sensor processing module (ISP) is built into the mobile terminal's CPU.
  • the camera module sends its data to the ISP over a MIPI interface.
  • the ISP processes the data and, depending on the application, routes it to different modules.
  • over data path 1, the data is sent to the data processing unit DPU; after DPU processing it is sent to the LCD for display, providing the camera module's preview.
  • over data path 2, the data is sent to the video processing unit VPU; the VPU encodes the data and stores it in the storage device, providing the camera module's video-recording function.
  • over data path 3, the data is sent to the picture compressor (jpeg encoder); after the data is compressed into a JPEG file, it is stored in the storage device, providing the camera module's photo function.
  • in the second form, the image sensor processing module is external to the CPU.
  • an image sensor processing module is connected outside the CPU, and the CPU's internal ISP is operated in bypass mode.
  • apart from bypassing the built-in ISP, the data flow is the same as in the first form.
  • because both existing solutions rely on the CPU's internal image sensing and processing functions, image data processing is constrained by the CPU's internal capabilities and captured images cannot be adjusted in a more professional and detailed way; at the same time, the added data-processing load on the CPU slows the mobile terminal down.
  • an object of the present invention is to provide an intelligent terminal, an image processing method, and a computer-readable storage medium, so that the preview image data is directly displayed without going through the CPU, and the image preview speed is accelerated.
  • the invention discloses an intelligent terminal, which includes a display module, an image sensor, and a central processing chip.
  • the smart terminal further includes an image processing chip independent of the central processing chip;
  • the image processing chip is connected to the display module, the image sensor and the central processing chip, respectively;
  • the image sensor collects image data of a shooting object and sends the image data to an image processing chip
  • An image sensing processing module of the image processing chip processes the image data to form enhanced image data
  • the image processing chip returns the enhanced image data to the central processing chip, which records the enhanced image data to a storage device; the image processing chip also sends the enhanced image data to a display module, which displays the enhanced image corresponding to the enhanced image data.
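  • to make the data flow above concrete, the following minimal Python sketch models the two paths: the image processing chip enhances the sensor data, hands it back to the central processing chip for recording, and sends it to the display module in parallel. The class and method names are illustrative assumptions, not part of the disclosed hardware.

```python
# Minimal toy model of the disclosed data flow (names are illustrative only).

class ImageProcessingChip:
    """Stands in for the independent image processing chip (IPIC)."""
    def enhance(self, raw_frame: bytes) -> bytes:
        # Placeholder for ISP preprocessing + IPU image enhancement.
        return b"enhanced:" + raw_frame

class CentralProcessingChip:
    """Stands in for the CPU/AP side; it only forwards and records."""
    def __init__(self, storage: list):
        self.storage = storage
    def record(self, enhanced_frame: bytes) -> None:
        # Forward-and-record: the data itself is not modified here.
        self.storage.append(enhanced_frame)

class DisplayModule:
    def show(self, enhanced_frame: bytes) -> None:
        print("displaying", enhanced_frame[:30])

def handle_frame(raw_frame: bytes, ipic: ImageProcessingChip,
                 cpu: CentralProcessingChip, lcd: DisplayModule) -> None:
    enhanced = ipic.enhance(raw_frame)
    cpu.record(enhanced)   # path 1: back to the central processing chip -> storage
    lcd.show(enhanced)     # path 2: directly to the display module

storage: list = []
handle_frame(b"raw sensor frame", ImageProcessingChip(),
             CentralProcessingChip(storage), DisplayModule())
```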
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • the central processing chip encodes the enhanced image data to form video data
  • the central processing chip records video data to the storage device.
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • the central processing chip compresses the enhanced image data to form a compressed image;
  • the central processing chip records the compressed image to the storage device.
  • the central processing chip sends interactive interface data to the image processing chip
  • the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  • the image processing chip includes:
  • an image processing unit and a first data processing unit;
  • the image sensing processing module is connected to the image sensor and the image processing unit, the image processing unit is connected to the first data processing unit, and the first data processing unit is connected to the display module;
  • the central processing chip includes:
  • an image sensing processing module and a second data processing unit;
  • the image sensing processing module is connected to the image processing unit, and receives enhanced image data
  • the second data processing unit is connected to the image processing unit and sends interactive interface data.
  • the central processing chip further includes:
  • a picture compressor, which is connected to the image sensing processing module, receives the enhanced image data, and compresses it;
  • the picture compressor is also connected to the storage device, and sends a compressed image to the storage device.
  • the invention also discloses an image processing method, which includes the following steps:
  • the image sensor collects image data of the shooting object
  • the image sensor sends the image data to an image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data;
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device;
  • the image processing chip sends the enhanced image data to a display module, and the display module displays an enhanced image corresponding to the enhanced image data.
  • the step S300 includes:
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • S320 the central processing chip encodes the enhanced image data to form video data
  • the central processing chip records video data to the storage device.
  • the step S300 includes:
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • the central processing chip compresses the enhanced image data to form a compressed image;
  • the central processing chip records the compressed image to the storage device.
  • the present invention also discloses a computer-readable storage medium on which a computer program is stored;
  • when the computer program is executed by a processor, the steps of the image processing method described above are implemented.
  • the image preview data is sent directly to the display screen without being processed by the CPU, reducing the CPU load and speeding up previews; in addition, external hardware is fully utilized, improving imaging quality and the display effect of the mobile terminal;
  • FIG. 1 is a schematic structural diagram of a system for realizing captured image processing in the prior art
  • FIG. 2 is a schematic structural diagram of a system for realizing captured image processing in the prior art
  • FIG. 3 is a schematic structural diagram of a smart terminal according to a preferred embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of a module according to a preferred embodiment of the present invention.
  • FIG. 5 is a schematic flowchart of an image processing method according to an embodiment of the present invention.
  • FIG. 6 is a schematic flowchart of an image processing method according to the first preferred embodiment of the present invention.
  • FIG. 7 is a schematic flowchart of an image processing method in accordance with a second preferred embodiment of the present invention.
  • FIG. 8 is a schematic flowchart of an image processing method according to a third preferred embodiment of the present invention.
  • although the terms first, second, third, etc. may be used in this disclosure to describe various information, such information should not be limited by these terms; the terms are only used to distinguish information of the same type from each other.
  • first information may also be referred to as the second information, and similarly, the second information may also be referred to as the first information.
  • the word “if” as used herein may be interpreted as “upon”, “when”, or “in response to determining”
  • the intelligent terminal includes: an image sensor, a central processing chip, and a display module.
  • an image sensor, such as a camera module, lens module, or prism module, performs image acquisition on a photographic subject (shooting photos, recording video, capturing motion frames, and so on) to form the original image data.
  • because different image sensors have different hardware, the collected image data may be raw data without any modification, or it may already have undergone initial processing such as beautification, cropping, optimization, or rendering; either way, the image data is further optimized in the subsequent steps.
  • the image processing chip is a component independent of the central processing chip, and the image processing chip is connected to the display module, the image sensor, and the central processing chip respectively; the image sensor collects image data of the shooting object and sends it to the image processing chip; the image sensing processing module of the image processing chip processes the image data to form enhanced image data; the image processing chip returns the enhanced image data to the central processing chip, which records the enhanced image data to a storage device, and the image processing chip sends the enhanced image data to the display module, which displays the enhanced image corresponding to the enhanced image data.
  • the image sensor and the image processing chip may be electrically connected as two independent modules (an image sensing module and an image processing module) integrated on the same printed circuit board, as two independent devices connected by wires, or as two independent devices in the same local area network or connected to each other wirelessly.
  • after receiving the image data, the image processing chip IPIC performs preprocessing such as black-level correction and dead-pixel correction, RAW-data processing such as denoising and lens correction, and fine processing such as image enhancement, forming enhanced image data.
  • depending on the processing capability of the image processing chip IPIC, the image can be optimized to different degrees.
  • in this embodiment, the image processing chip is a device separate from any other module, unit, or device and runs independently; optimizing the image data in this way reduces the image-processing load on other devices, and dedicates a specialized device to image processing instead of an integrated device whose image-processing functions are cut down, increasing the efficiency and quality of image processing.
  • the enhanced image data formed after processing is returned from the image processing chip to a central processing chip (CPU or AP side).
  • the central processing chip is located in a smart terminal, with the image processing chip externally attached to it; alternatively both may be located in the same smart terminal but set up separately, the two forming a distributed design.
  • the image processing chip and the central processing chip may be designed on the same printed circuit board (but not integrated), connected by a wire, or connected wirelessly through mutually supported wireless protocols.
  • with the central processing chip separated from the image processing chip, the central processing chip can concentrate on relaying and issuing instructions to the components of the smart terminal, while the image processing function is split off and handled by the separate external image processing chip. This reduces the load on the central processing chip and simplifies its requirements: its native image-processing capability can be reduced or simplified, and optimization of the externally collected image data is handed to a more specialized, better-optimized image processing chip, speeding up image processing. In addition, because the image data no longer passes through the central processing chip, the processed image data can be displayed and loaded faster.
  • after the central processing chip receives the enhanced image data, it records the enhanced image data to a storage device, such as the internal memory of the smart terminal or an external storage device (a USB flash drive, a portable hard disk, another device connected via OTG, etc.). In this step the central processing chip only performs a forward-and-record action in order to store the processed enhanced image data; it does not change the enhanced image data itself, which reduces its operating load.
  • at any time (before, while, or after the central processing chip records the enhanced image data to the storage device), the image processing chip sends the enhanced image data to a display module, such as a display screen, touch screen, or other display module installed on the smart terminal.
  • the display module receives and displays the enhanced image corresponding to the enhanced image data, that is, the image formed by applying processing such as intelligent optimization, rendering, color-temperature adjustment, fill light, AR enhancement, information recognition, beautification, and cropping to the initial image data.
  • using the separate image processing chip described above to enhance the image, and sending the processed enhanced image along one path directly to the display module for display and along another to the storage device for storage, improves the transmission speed of the processed images and their loading and display speed on the user side. It also improves the camera module's preview, photo, and video performance, and improves the recording of fast-moving scenes.
  • the image sensor sends the image data to the image processing chip through the camera interface CAMIF.
  • CAMIF is the first part of the video front-end (VFE) hardware. The main task is to synchronize the line and field synchronization signals involved in the process of the sensor sending data. It also has image extraction and image mirroring capabilities. In order to accurately synchronize with the external camera sensor, the CAMIF hardware must provide two programmable interrupt lines, one to control the opening of the shutter and the other to control the flash.
  • the CAMIF hardware inputs include PCLK, HSYNC, VSYNC, a pixel enable gate, and a 12-bit data bus; the input interface type is controlled by the adsp.
  • at power-on, the CAMIF module is disabled and does not capture data until the adsp enables data capture.
  • once the enable bit is set, CAMIF waits until the beginning of the next frame to start collecting data.
  • similarly, when the adsp disables it, CAMIF stops working only after the next frame of data has been completely sent, ensuring everything is finished.
  • the CAMIF hardware can resample the camera sensor data; before CAMIF outputs data, the adsp can independently choose whether to output the complete image or a sub-sampled version.
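  • the frame-boundary behaviour described above (capture starts only at the next frame after enabling, and stops only after the current frame completes when disabling) can be illustrated with the toy model below; it is a behavioural sketch only, not a real CAMIF driver, and all names are invented for illustration.

```python
class ToyCamIF:
    """Illustrative model of CAMIF enable/disable semantics (not a real driver)."""
    def __init__(self):
        self.enabled_request = False
        self.capturing = False

    def set_enable(self, enable: bool) -> None:
        # The request only takes effect at a frame boundary.
        self.enabled_request = enable

    def on_frame_start(self) -> None:
        if self.enabled_request and not self.capturing:
            self.capturing = True      # start collecting from the next frame

    def on_frame_end(self) -> None:
        if not self.enabled_request and self.capturing:
            self.capturing = False     # stop only after the current frame finishes

cam = ToyCamIF()
cam.set_enable(True)
cam.on_frame_start()   # capture begins here
cam.set_enable(False)
cam.on_frame_end()     # capture ends only after this frame completes
```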
  • in different embodiments, the image processing chip handles the image data or the enhanced image data differently; this is described in detail in the following embodiments.
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal.
  • the central processing chip encodes the enhanced image data to form video data, and records the video data to a storage device.
  • the image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing, the preprocessed image data is sent to the image processing unit IPU of the IPIC for image enhancement, and the enhanced image data is returned to the central processing chip, which then performs video encoding.
  • the encoding protocol may be, for example, ITU H.261 or H.263, M-JPEG, or the MPEG series of standards of the ISO Moving Picture Experts Group.
  • video data is formed after encoding; because it is based on the enhanced image data produced by the image processing chip, every frame of the resulting video has already been optimized, and the composited video looks richer on screen. The final video data is recorded by the central processing chip to a storage device for storage and later retrieval.
  • along another path, the IPU is connected to the display module: the enhanced image is sent to the data processing unit DPU, which overlays the interactive interface (with its transparency) and sends the result to the display module for loading and display.
  • the first embodiment is an application that uses a camera component or a camera module for video shooting based on the technical solution of the present invention.
  • an external storage interface (USB, TF, etc.) is added through the image processing chip, so recorded video can be stored directly on a storage device external to the image processing chip.
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal.
  • the central processing chip compresses the enhanced image data to form a compressed image, and records the compressed image to a storage device.
  • the image sensor processing module ISP of the image processing chip IPIC preprocesses the image as described above; after preprocessing, the preprocessed image data is sent to the image processing unit IPU of the IPIC for image enhancement, and the enhanced image data is returned to the central processing chip.
  • the central processing chip compresses the enhanced image data, for example with a JPEG encoder, to form a compressed image.
  • the JPEG encoding process first needs to convert the RGB format in the enhanced image data to YUV format.
  • when recording computer images, color information is most commonly stored as RGB (red, green, blue) components; for example, uncompressed 24-bit BMP images use RGB space to save the image.
  • a pixel is 24 bits, and every 8 bits store one color's intensity (0-255); for example, pure red is stored as 0xFF0000.
  • YUV is a color coding method adopted by the European television system, and this method is also commonly used in China's radio and television. Wherein "Y” represents the brightness (Luminance or Luma), that is, the grayscale value; and "U” and "V” represent the chrominance (Chrominance or Chroma).
  • color television adopted YUV space precisely so that the luminance signal Y could solve the compatibility problem between color and black-and-white televisions, allowing black-and-white televisions to receive color TV signals.
  • when converting RGB to YUV, the following relations are typically used:
  • Y = 0.299R + 0.587G + 0.114B
  • U = -0.147R - 0.289G + 0.436B
  • V = 0.615R - 0.515G - 0.100B
  • and for the inverse conversion: R = Y + 1.14V
  • G = Y - 0.39U - 0.58V
  • B = Y + 2.03U
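  • as a minimal illustration (not taken from the patent), the conversion can be written directly from the coefficients listed above:

```python
def rgb_to_yuv(r: float, g: float, b: float) -> tuple:
    """RGB -> YUV using the coefficients given in the text."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return y, u, v

def yuv_to_rgb(y: float, u: float, v: float) -> tuple:
    """Approximate inverse, as listed in the text."""
    r = y + 1.14 * v
    g = y - 0.39 * u - 0.58 * v
    b = y + 2.03 * u
    return r, g, b

# Pure red (0xFF0000): only the R terms contribute.
print(rgb_to_yuv(255, 0, 0))   # ~ (76.2, -37.5, 156.8)
```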
  • after the original image is converted to YUV format, it is sampled according to a chosen sampling format (commonly 4:4:4, 4:2:2, or 4:2:0), and after sampling the image is divided into 8*8-pixel blocks (MCUs).
  • a discrete cosine transform (DCT) is then applied; the DCT is a transform-coding method commonly used for bit-rate compression.
  • the Fourier transform of any continuous real symmetric function contains only the cosine term, so the cosine transform has the same clear physical meaning as the Fourier transform.
  • DCT first divides the overall image into N * N pixel blocks, and then performs DCT transformation on the N * N pixel blocks one by one.
  • because the high-frequency components of most images are small, the corresponding coefficients are often zero, and the human eye is not very sensitive to high-frequency distortion, so coarser quantization can be used there; as a result, the bit rate needed to transmit the transform coefficients is much smaller than the bit rate needed to transmit the image pixels.
  • the recovered image information can be expressed in matrix form as F(n) = C(n) * E(n), where E(n) is a basis, C(n) is the DCT coefficient, and F(n) is the image signal.
  • for an 8*8 block, any image block can be represented as a combination of 64 coefficients of different sizes; since each basis image corresponds to a single coefficient in the transform domain, any block can equally be regarded as a combination of 64 basis images with different amplitudes, which has the same physical meaning as decomposing a signal into a fundamental wave plus harmonics of different amplitudes.
  • quantization discretizes the amplitude of the signal; after quantization the discrete signal becomes a digital signal. Because the human visual system (HVS) is more sensitive to low-frequency signals, a relatively small quantization step is used for the low-frequency part and a relatively large step for the high-frequency part, which yields a relatively clear image together with a higher compression ratio.
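  • the 8*8 DCT and the frequency-dependent quantization described above can be sketched with numpy as below; the orthonormal DCT-II basis is standard, but the quantization table here is a simple illustrative choice rather than the table mandated by the JPEG standard.

```python
import numpy as np

N = 8

# Orthonormal DCT-II basis matrix: row u holds the u-th cosine basis vector.
T = np.array([[(np.sqrt(1 / N) if u == 0 else np.sqrt(2 / N))
               * np.cos((2 * x + 1) * u * np.pi / (2 * N))
               for x in range(N)] for u in range(N)])

def dct2(block: np.ndarray) -> np.ndarray:
    """2-D DCT of an 8x8 block: C = T @ block @ T.T."""
    return T @ block @ T.T

def idct2(coeffs: np.ndarray) -> np.ndarray:
    """Inverse 2-D DCT: block = T.T @ C @ T."""
    return T.T @ coeffs @ T

# Toy quantization table: small steps for low frequencies, larger steps for high ones.
u, v = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
Q = 8 + 4 * (u + v)

block = np.random.randint(0, 256, (N, N)).astype(float) - 128   # level-shifted samples
coeffs = dct2(block)
quantized = np.round(coeffs / Q)             # most high-frequency entries become 0
reconstructed = idct2(quantized * Q) + 128   # lossy but visually close reconstruction
```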
  • after quantization, the coefficients are read out in zigzag order and the AC coefficients are run-length encoded; run-length coding means that a single code simultaneously represents a value and the number of zeros preceding it.
  • the zigzag readout makes runs of consecutive zeros more likely, especially toward the end of the block; if the remaining coefficients are all zero, an “end of block” (EOB) code is emitted after the last non-zero value and the output can stop, saving a large amount of bit rate.
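  • a minimal sketch of the zigzag readout and the (zeros-before, value) / EOB coding follows; the ordering and code format are simplified illustrations, not the exact JPEG entropy-coding format.

```python
def zigzag_order(n: int = 8):
    """One common zigzag ordering of an n x n block (anti-diagonal sweeps)."""
    order = []
    for s in range(2 * n - 1):
        diag = [(i, s - i) for i in range(n) if 0 <= s - i < n]
        if s % 2 == 0:
            diag.reverse()
        order.extend(diag)
    return order

def run_length_encode(block) -> list:
    """Encode as (zeros_before, value) pairs, ending with an 'EOB' marker."""
    flat = [block[i][j] for i, j in zigzag_order(len(block))]
    codes, zeros = [], 0
    for value in flat:
        if value == 0:
            zeros += 1
        else:
            codes.append((zeros, value))
            zeros = 0
    codes.append("EOB")  # trailing zeros are dropped entirely
    return codes

demo = [[0] * 8 for _ in range(8)]
demo[0][0], demo[0][1], demo[2][0] = 31, -3, 5
print(run_length_encode(demo))  # [(0, 31), (0, -3), (1, 5), 'EOB']
```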
  • the final compressed image is recorded by the central processing chip into a storage device and stored for subsequent recall.
  • the second embodiment is an application that uses a camera component or a camera module to perform image shooting and photography based on the technical solution of the present invention.
  • an external storage interface (USB, TF, etc.) is added through the image processing chip, so captured photos can be stored directly on a storage device external to the image processing chip.
  • the central processing chip sends the interactive interface data to the image processing chip, and the image processing chip then integrates the interactive interface data with the enhanced image data to form an integrated image. Specifically, after the central processing chip is connected to the image processing chip, the central processing chip sends pre-configured interactive interface (GUI) data for the image preview interface to the image processing chip through the DSI interface.
  • DSI defines a high-speed serial interface between the processor and the display module. It consists of four layers, the PHY layer, Lane Management layer, Low Level Protocol layer, and Application layer, which correspond to the D-PHY, DSI, and DCS specifications, respectively.
  • the PHY defines the transmission medium, input / output circuits, and clock and signal mechanisms.
  • the Lane Management layer distributes and collects data streams to and from each lane;
  • the Low Level Protocol layer defines how to frame and parse data and how to detect errors;
  • the Application layer describes how higher-level data streams are encoded and parsed;
  • the interactive interface data includes transparency information for the interactive interface, that is, alpha information.
  • the image processing chip integrates the interactive interface data with the enhanced image data through alpha blending, so that the enhanced image is displayed with the interactive interface as the display interface; that is, an integrated image is displayed.
  • the integrated image will be sent to the display module via the DPU for display.
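  • the alpha-blending step can be illustrated in a few lines of numpy; this is a generic compositing sketch assuming out = alpha * GUI + (1 - alpha) * image, not the IPIC's actual implementation.

```python
import numpy as np

def alpha_blend(gui_rgba: np.ndarray, enhanced_rgb: np.ndarray) -> np.ndarray:
    """Composite a GUI layer (H x W x 4 RGBA, 0-255) over an enhanced image (H x W x 3)."""
    alpha = gui_rgba[..., 3:4].astype(float) / 255.0        # per-pixel GUI transparency
    gui_rgb = gui_rgba[..., :3].astype(float)
    blended = alpha * gui_rgb + (1.0 - alpha) * enhanced_rgb.astype(float)
    return blended.astype(np.uint8)

# Tiny demo: a fully transparent GUI leaves the enhanced image unchanged.
gui = np.zeros((2, 2, 4), dtype=np.uint8)          # alpha channel = 0 everywhere
frame = np.full((2, 2, 3), 200, dtype=np.uint8)    # "enhanced" image
assert (alpha_blend(gui, frame) == frame).all()
```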
  • the third embodiment is an application, based on the technical solution of the present invention, that provides a preview function to the user when images are captured with a camera component or camera module; by working together with the interactive interface, it gives the user a more familiar and comfortable visual experience.
  • the image processing chip includes:
  • an image sensing processing module ISP, an image processing unit IPU, and a first data processing unit DPU; the ISP is connected to the image sensor and to the image processing unit IPU, the IPU is connected to the first data processing unit DPU, and the first data processing unit DPU is connected to the display module LCD.
  • the central processing chip (CPU or AP with integrated CPU) includes:
  • Image sensing processing module ISP (bypass), second data processing unit DPU, picture compressor Jpeg Encoder.
  • the image sensing processing module ISP (bypass) is connected to the image processing unit IPU and receives the enhanced image data; the second data processing unit DPU is connected to the image processing unit IPU and sends the interactive interface data; the picture compressor Jpeg Encoder is connected to the image sensing processing module ISP (bypass), receives the enhanced image data, and compresses it; the picture compressor is also connected to the storage device and sends the compressed image to the storage device.
  • when AP-side content needs to be displayed, the AP-side picture is processed by the DPU and transmitted to the image processing chip IPIC over mipi (DSI); the IPU then performs image enhancement on the data sent from the AP side, and the enhanced data is passed through the DPU to the LCD for display;
  • when a preview of the captured picture is needed, the image sensor data is sent to the IPIC through CamIF and the AP-side GUI data (carrying alpha information) is sent to the IPIC through DSI; the IPIC processes the image sensor data, alpha-blends it with the GUI sent from the AP side, and the blended data is output to the LCD through the DPU for display;
  • when video recording is needed, the image sensor data is sent to the IPIC through CamIF; the ISP processes the image sensor data and the processed data is sent to the IPU for image enhancement; one path of the IPU output is sent back to the AP, which uses the VPU for video encoding and then writes the result to the storage device; the other path of the IPU output is alpha-blended with the GUI data sent by the AP (the GUI data contains alpha information), and the blended data is output to the LCD through the IPIC's DPU for display;
  • when a photo is taken, the image sensor data is sent to the IPIC through CamIF; the IPIC's ISP processes the image sensor data and the processed data is sent to the IPU for image enhancement; the IPU output is returned to the AP side through CamIF, and the AP uses the jpeg encoder to compress the image and write it to the storage device.
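  • the four data paths just listed can be summarized in the hypothetical routing sketch below; the function names mirror the blocks in the figures (ISP, IPU, DPU, VPU, jpeg encoder) but are stubs invented for illustration, not real drivers.

```python
# Illustrative stubs only; each function stands in for a hardware block.
def isp(x): return f"isp({x})"
def ipu(x): return f"ipu({x})"
def dpu_to_lcd(x): print("LCD <-", x)
def blend(img, gui): return f"blend({img},{gui})"
def vpu_encode(x): return f"h26x({x})"
def jpeg_encode(x): return f"jpeg({x})"

def preview(sensor_frame, gui):
    # CamIF -> ISP -> IPU -> alpha blend with AP GUI -> DPU -> LCD
    dpu_to_lcd(blend(ipu(isp(sensor_frame)), gui))

def record_video(sensor_frame, gui, storage):
    enhanced = ipu(isp(sensor_frame))
    storage.append(vpu_encode(enhanced))   # path 1: AP-side VPU -> storage
    dpu_to_lcd(blend(enhanced, gui))       # path 2: blended preview on the LCD

def take_photo(sensor_frame, storage):
    storage.append(jpeg_encode(ipu(isp(sensor_frame))))  # AP-side jpeg encoder -> storage

storage = []
preview("frame0", "gui")
record_video("frame1", "gui", storage)
take_photo("frame2", storage)
```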
  • the present invention also discloses an image processing method, including the following steps:
  • the image sensor collects image data of the shooting object
  • the image sensor sends the image data to the image processing chip, and the image sensing processing module of the image processing chip processes the image data to form enhanced image data;
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal, and the central processing chip records the enhanced image data to a storage device;
  • the image processing chip sends the enhanced image data to a display module, and the display module displays the enhanced image corresponding to the enhanced image data.
  • step S300 in a preferred embodiment includes:
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • S320 The central processing chip encodes the enhanced image data to form video data
  • the central processing chip records video data to a storage device.
  • step S300 includes:
  • the image processing chip returns the enhanced image data to a central processing chip in a smart terminal;
  • the central processing chip compresses the enhanced image data to form a compressed image
  • the central processing chip records the compressed image to a storage device.
  • step S300 includes:
  • S310” the central processing chip sends interactive interface data to the image processing chip;
  • S320” the image processing chip integrates the interactive interface data and the enhanced image data to form an integrated image.
  • a computer-readable storage medium may be installed in the smart terminal, and a computer program is stored thereon, and when the computer program is executed by a processor, the steps of the image processing method as described above are implemented.
  • Smart terminals can be implemented in various forms.
  • the terminals described in the present invention may include smart terminals such as mobile phones, smartphones, notebook computers, PDAs (personal digital assistants), PADs (tablets), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • in the following, the terminal is assumed to be a smart terminal.
  • however, those skilled in the art will understand that, apart from elements intended specifically for mobile use, the configuration according to the embodiments of the present invention can also be applied to fixed-type terminals.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

An intelligent terminal, an image processing method, and a computer-readable storage medium. The intelligent terminal includes a display module, an image sensor, and a central processing chip, and further includes an image processing chip independent of the central processing chip. The image processing chip is connected to the display module, the image sensor, and the central processing chip. The image sensor collects image data of a photographed subject and sends it to the image processing chip; the image sensing processing module of the image processing chip processes the image data to form enhanced image data; the image processing chip returns the enhanced image data to the central processing chip, which records it to a storage device, and also sends the enhanced image data to the display module, which displays the enhanced image corresponding to the enhanced image data. With this technical solution, the image processing flow is moved out of the CPU and external hardware is fully utilized, improving imaging quality and the display effect of the mobile terminal.

Description

智能终端、图像处理方法及计算机可读存储介质 技术领域
本发明涉及智能控制领域,尤其涉及一种智能终端、图像处理方法及计算机可读存储介质。
技术背景
例如手机、平板电脑等移动终端设备已越来越走入人们的生活。因此在此类移动终端上,也集成有如拍摄、摄像、定位、照明等多功能,以方便人们持有一件移动终端后可完成多件需求。在使用移动终端进行拍摄及摄像时,摄像模组的分辨率及像素越来越高,使得移动终端在该功能上所能实现的成像形态越发接近于专业的相机设备。具体地,移动终端设备的成像形式,主要有两种:
1.CPU内置有图像感应处理模块:
如图1所示,在移动终端的中央处理单元CPU内,内置有一图像感应处理模块ISP,摄像组将将数据通过mipi接口送到ISP,ISP会根据不同的应用把数据处理之后送给不同的模块。如通过数据路径1送给数据处理单元DPU,DPU处理之后送给显示屏LCD显示,用于摄像模块的摄像预览;如通过数据路径2送给视频处理单元VPU,VPU进行编码之后把数据存储到存储设备中,用于摄像模块的录像功能;又如通过数据路径3送给图片压缩器jpeg encoder,jpeg encoder把数据压缩成jpeg文件之后,把数据存储到存储设备中,用于摄像模块的拍照功能;。
2.CPU外置有图像感应处理模块
如图2所示,在CPU外连接有一图像感应处理模块,并采用内部ISP bypass的方式,数据流程除了内置ISP bypass外,其他与方式一等同。
由于现有的方案中,均采用CPU的内部图像感应与处理功能,受到CPU的内部某功能的限制,对于图像数据的处理受限,无法对拍摄图像进行更加专业和细致的调整,同时,加重了CPU的数据处理负载,使得移动终端的运行速度变慢。
因此,需要一种新型的智能终端、图像处理方法及计算机可读存储介质,可将图像处理流程独立出CPU,充分利用外置硬件,改进成像质量和移动终端的显示效果。
发明概要
为了克服上述技术缺陷,本发明的目的在于提供一智能终端、图像处理方法及计算机可读存储介质,使得预览的图像数据不经过CPU直接显示,加快图像预览速度。
本发明公开了一种智能终端,包括包括显示模块、图像传感器及中央处理芯片,
所述智能终端还包括独立于所述中央处理芯片的图像处理芯片;
所述图像处理芯片分别与所述显示模块、图像传感器及中央处理芯片连接;
所述图像传感器采集拍摄对象的图像数据并发送至图像处理芯片;
所述图像处理芯片的图像传感处理模块对所述图像数据进行处理,形成增强图像数据;
所述图像处理芯片将所述增强图像数据回传至所述中央处理芯片,由所述中央处理芯片记录所述增强图像数据至一存储设备,且所述图像处理芯片将所述增强图像数据发送至一显示模块,由所述显示模块显示对应所述增强图像数据的增强图像。
优选地,所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
所述中央处理芯片对所述增强图像数据进行编码,形成视频数据;
所述中央处理芯片记录视频数据至所述存储设备。
优选地,所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
所述中央处理芯片对所述增强图像数据进行压缩,形成压缩图像;
所述中央处理芯片记录压缩图像至所述存储设备。
优选地,所述中央处理芯片发送交互界面数据至所述图像处理芯片;
所述图像处理芯片整合所述交互界面数据及增强图像数据,形成整合图像。
优选地,所述图像处理芯片包括:
图像处理单元、第一数据处理单元;
所述图像传感处理模块分别与所述图像传感器及所述图像处理单元连接,所述图像处理单元与所述第一数据处理单元连接,所述第一数据处理单元与所述显示模块连接;
所述中央处理芯片包括:
图像感应处理模块、第二数据处理单元;
所述图像感应处理模块与所述图像处理单元连接,接收增强图像数据;
所述第二数据处理单元与所述图像处理单元连接,发送交互界面数据。
优选地,所述中央处理芯片还包括:
图片压缩器,与所述图像感应处理模块连接,接收图像增强数据并压缩;
所述图片压缩器还与所述存储设备连接,发送压缩图像至存储设备。
本发明还公开了一种图像处理方法,包括以下步骤:
S100:图像传感器采集拍摄对象的图像数据;
S200:图像传感器将所述图像数据发送至图像处理芯片,由所述图像处理芯片的图像传感处理模块对所述图像数据进行处理,形成增强图像数据;
S300:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片,由所述中央处理芯片记录所述增强图像数据至一存储设备;
S400:所述图像处理芯片将所述增强图像数据发送至一显示模块,由所述显示模块显示对应所述增强图像数据的增强图像。
优选地,所述步骤S300包括:
S310:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
S320:所述中央处理芯片对所述增强图像数据进行编码,形成视频数据;
S330:所述中央处理芯片记录视频数据至所述存储设备。
优选地,所述步骤S300包括:
S310’:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
S320’:所述中央处理芯片对所述增强图像数据进行压缩,形成压缩图像;
S330’:所述中央处理芯片记录压缩图像至所述存储设备。
本发明又公开了一种计算机可读存储介质,其上存储有计算机程序,所述计算机程序被处理器执行时实现如上所述的图像处理方法的步骤。
采用了上述技术方案后,与现有技术相比,具有以下有益效果:
1.图像预览数据将直接发送至显示屏显示,不由CPU处理,降低CPU负载,加快预览速度;
2.充分利用外置硬件,改进成像质量和移动终端的显示效果。
附图说明
图1为现有技术中实现拍摄图像处理的系统结构示意图;
图2为现有技术中实现拍摄图像处理的系统结构示意图;
图3为符合本发明一优选实施例中智能终端的结构示意图;
图4为符合本发明一优选实施例中智能终端的模块结构示意图;
图5为符合本发明一实施例中图像处理方法的流程示意图;
图6为符合本发明第一优选实施例中图像处理方法的流程示意图;
图7为符合本发明第二优选实施例中图像处理方法的流程示意图;
图8为符合本发明第三优选实施例中图像处理方法的流程示意图。
发明内容
以下结合附图与具体实施例进一步阐述本发明的优点。
这里将详细地对示例性实施例进行说明,其示例表示在附图中。下面的描述涉及附图时,除非另有表示,不同附图中的相同数字表示相同或相似的要素。以下示例性实施例中所描述的实施方式并不代表与本公开相一致的所有实施方式。相反,它们仅是与如所附权利要求书中所详述的、本公开的一些方面相一致的装置和方法的例子。
在本公开使用的术语是仅仅出于描述特定实施例的目的,而非旨在限制本公开。在本公开和所附权利要求书中所使用的单数形式的“一种”、“所述”和“该”也旨在包括多数形式,除非上下文清楚地表示其他含义。还应当理解,本文中使用的术语“和/或”是指并包含一个或多个相关联的列出项目的任何或所有可能组合。
应当理解,尽管在本公开可能采用术语第一、第二、第三等来描述各种信息,但这些信息不应限于这些术语。这些术语仅用来将同一类型的信息彼此区分开。例如,在不脱离本公开范围的情况下,第一信息也可以被称为第二信息,类似地,第二信息也可以被称为第一信息。取决于语境,如在此所使用的词语“如果”可以被解释成为“在……时”或“当……时”或“响应于确定”
在本发明的描述中,需要理解的是,术语“纵向”、“横向”、“上”、“下”、“前”、“后”、“左”、“右”、“竖直”、“水平”、“顶”、“底”“内”、“外”等指示的方位或位置关系为基于附图所示的方位或位置关系,仅是为了便于描述本发明和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此不能理解为对本发明的限制。
在本发明的描述中,除非另有规定和限定,需要说明的是,术语“安装”、“相连”、“连接”应做广义理解,例如,可以是机械连接或电连接,也可以是两个元件内部的连通,可以是直接相连,也可以通过中间媒介间接相连,对于本领域的普通技术人员而言,可以根据具体情况理解上述术语的具体含义。
在后续的描述中,使用用于表示元件的诸如“模块”、“部件”或“单元”的后缀仅为了有利于本发明的说明,其本身并没有特定的意义。因此,“模块”与“部件”可以混合地使用。
参阅图3,为符合本发明一优选实施例中智能终端的结构示意图。智能终端包括:图像传感器、中央处理芯片及显示模块。图像传感器,如摄影摄像模组、镜头模组、棱镜模组等对一拍摄对象进行如拍摄、录像、记录运动帧等图像采集,从而形成原始的图像数据。可以理解的是,由于不同图像传感器的硬件结构不同,所采集的图像数据,可能是未作任何修饰的未修片型的图像数据,也可能是已进行如美颜、裁剪、优化、渲染等初期处理的图像数据。不论在该步骤内的图像数据是否已作处理,均将在后续步骤内作进一步的优化。
图像处理芯片为独立于中央处理芯片的元件,且图像处理芯片分别与显示模块、图像传感器及中央处理芯片连接;图像传感器采集拍摄对象的图像数据并发送至图像处理芯片;图像处理芯片的图像传感处理模块对图像数据进行处理,形成增强图像数据;图像处理芯片将增强图像数据回传至中央处理芯片,由中央处理芯片记录增强图像数据至一存储设备,且图像处理芯片将增强图像数据发送至一显示模块,由显示模块显示对应增强图像数据的增强图像。
具体地,图像传感器与图像处理芯片的电连接方式,可以是集成在同一印刷电路板上的两个独立的图像传感模块及图像处理模块,也可以是以导线连接的两个独立的器件,也可以是设置在同一局域网内或互相无线连接的两个独立的器件。图像处理芯片IPIC在接收到图像数据后,将对其进行如黑色矫正、坏点矫正的预处理,去噪、镜头矫正的RAW数据处理,图像增强等精处理,从而形成一增强图像数据。增强图像数据基于图像处理芯片IPIC的处理能力的不同,可进行不同程度的图像优化,在本实施例中,图像处理芯片作为单独于其他任何模块、模组、单元、设备的器件,以独立运行的方式对图像数据进行优化处理,一方面可减少图像处理功能对其他设备的负载压力,另一方面可专能专用,不由阉割部分功能的集成设备对图像处理,增加图像处理的效率和效果。
处理完毕后形成的增强图像数据将由图像处理芯片回传至一中央处理芯片(CPU或AP端)。该中央处理芯片设置在一智能终端内,而图像处理芯片外挂于中央处理芯片,也可设置在同一智能终端内,但与中央处理芯片独立设置,两者分布式设计。图像处理芯片可与中央处理芯片设计在同一印刷电路板上(但不集成),或图像处理芯片与中央处理芯片通过导线连接,或是通过互相支持的无线协议无线连接。将中央处理芯片与图像处 理芯片独立,中央处理芯片可集中处理智能终端内的对各部件的指令转达及下发,而图像处理的功能,则分割出中央处理芯片,单独由外挂的图像处理芯片完成,减少中央处理芯片负载的同时,也可简化中央处理芯片的制式要求,其内部的原图像处理能力可压缩或简化,对于外部采集的图像数据的优化处理工作,交由更为专业化、优化度更高的图像处理芯片完成,可加快图像处理速度。同时,由于图像数据不再经过中央处理芯片,对于处理完毕后的图像数据的显示和加载也将更快。
中央处理芯片接收到增强图像数据后,将记录增强图像数据至一存储设备,如智能终端的内存,智能终端的外接存储装置(U盘、移动硬盘,OTG连接的其他设备等)。在该步骤中,为存储已处理完毕的增强图像数据,中央处理芯片执行一转发记录的动作,对于增强图像数据本身不作任何变化,减少中央处理芯片的运行负载。
在中央处理芯片记录增强图像数据至存储设备的同时、之前或之后等任意时刻,图像处理芯片将增强图像数据发送至一显示模块,具体地如智能终端上安装的显示屏、触摸屏等显示模块,由显示模块接收并显示对应增强图像数据的增强图像,即在初始的图像数据的基础上进行智能优化、渲染、调色温、补光、AR增强、信息识别、美颜、裁剪等各项图像处理后形成的增强图像。
通过上述单独的图像处理芯片对图像作增强处理,并直接将处理后的增强图像一路发送至显示模块显示,一路发送至存储设备存储,以提高处理后图像的传输速度,及在用户侧的加载和显示速度。此外还可提升摄像模块在预览、拍照、录像时的效果,及提升快速运动画面的录像效果。
图像传感器Image sensor通过摄像接口CAMIF,将图像数据发送至图像处理芯片。CAMIF是video front-end(VFE)硬件的的第一部分,主要任务是同步sensor发送数据过程中涉及到的行、场同步信号。另外它还具有图像提取和图像镜像能力。为了和外部camera sensor精确的同步,CAMIF硬件必须提供两根可编程的中断线,一根用来控制快门的打开,另一根用来控制闪光。CAMIF硬件输入设备包括PCLK、HSYNC、VSYNC、一个像素使能闸门、和12bit数据线。输入接口类型是由adsp控制的。在上电时,CAMIF模块是关闭的,也不会抓取数据,直到adsp设置使能为enable,开启数据捕获。当设置使能数据位后,CAMIF会等到下一个帧开始时才开始收集数据。类似的,当adps设置为disable时,CAMIF会等到下一帧数据发送完成时停止工作,确保所有的事情都处理完。CAMIF硬件提供了对camera sensor数据重采样的功能,在CAMIF输出数据之前,adsp可以独立的控制是完整的输出图像还是做二次采样。
在不同实施例中,图像处理芯片对于图像数据或增强图像数据的处理方式不同,具体地,将通过以下不同实施例的介绍详细说明。
实施例一
在该实施例中,图像处理芯片将增强图像数据回传至一智能终端内的中央处理芯片。中央处理芯片对增强图像数据进行编码,形成视频数据,并记录视频数据至存储设备。具体地,图像处理芯片IPIC的图像传感处理模块ISP将对图像进行如上文所述的预处理,预处理完毕后,再发送预处理后的图像数据至图像处理芯片IPIC的图像处理单元IPU作图像增强,图像增强处理完毕后形成的增强图像数据将回传至中央处理芯片。中央处理芯片收到增强图像数据后进行视频编码,编码协议可以是国际电联的H.261、H.263,运动静止图像专家组的M-JPEG和国际标准化组织运动图像专家组的MPEG系列标准等,编码后形成有视频数据。由于该视频数据基于图像处理芯片处理后形成的增强图像数据,因此所形成的视频数据的每一帧均为已执行过优化处理的画面帧,各个帧合成后的视频数据在画面上更有色彩感。最终形成的视频数据由中央处理芯片记录至存储设备内存储,共后续调用查看。IPU另一路与显示模块连接,具体地,将增强图像发送至数据处理单元DPU,由DPU增加透明度交互界面,并发送至显示模块加载显示。
该实施例一为基于本发明技术方案的利用摄像组件或摄像模块进行视频拍摄的应用,通过图像处理芯片增加外部存储接口(USB,TF等),录像时可直接存储在图像处理芯片外部的存储设备上。
实施例二
在该实施例中,图像处理芯片将增强图像数据回传至一智能终端内的中央处理芯片,中央处理芯片对增强图像数据进行压缩,形成压缩图像,并记录压缩图像至存储设备。具体地,与实施例一相同,图像处理芯片IPIC的图像传感处理模块ISP将对图像进行如上文所述的预处理,预处理完毕后,再发送预处理后的图像数据至图像处理芯片IPIC的图像处理单元IPU作图像增强,图像增强处理完毕后形成的增强图像数据将回传至中央处理芯片。中央处理芯片将对增强图像数据进行压缩,如jpeg编码器jpeg encoder,形成压缩图像。JPEG编码过程,首先需将增强图像数据内的RGB格式转换为YUV格式,在记录计算机图像时,最常见的是采用RGB(红、绿,蓝)颜色分量来保存颜色信息,例如非压缩的24位的BMP图像就采用RGB空间来保存图像。一个像素24位,每8位保存一种颜色强度(0-255),例如红色保存为0xFF0000。YUV是被欧洲电视系统所采用的一种颜色编码方法,我国广播电视也普遍采用这类方法。其中“Y”表示明亮度(Luminance或Luma), 也就是灰阶值;而“U”和“V”表示的则是色度(Chrominance或Chroma)。彩色电视采用YUV空间正是为了用亮度信号Y解决彩色电视机与黑白电视机的兼容问题,使黑白电视机也能接收彩色电视信号。在将RGB转化为YUV时,通常采取如Y=0.299R+0.587G+0.114B、U=-0.147R-0.289G+0.436B、V=0.615R-0.515G-0.100B、R=Y+1.14V、G=Y-0.39U-0.58V、B=Y+2.03U的方式进行。后将图像8*8分块,在原始图像转换为YUV格式后,对图像按一定的采样格式进行采样,常见的格式有4:4:4,4:2:2和4:2:0。取样完成后,将图像按8*8(pixel)划分成MCU。随后便进行离散余弦变换(DCT),离散余弦变换DCT(Discrete Cosine Transform)是数码率压缩需要常用的一个变换编码方法。任何连续的实对称函数的付立叶变换中只含余弦项,因此余弦变换与付立叶变换一样有明确的物理意义。DCT是先将整体图像分成N*N像素块,然后对N*N像素块逐一进行DCT变换。由于大多数图像的高频分量较小,相应于图像高频分量的系数经常为零,加上人眼对高频成分的失真不太敏感,所以可用更粗的量化。因此,传送变换系数的数码率要大大小于传送图像像素所用的数码率。到达接收端后通过反离散余弦变换回到样值,虽然会有一定的失真,但人眼是可以接受的。图像信号被分解成为直流成分;以及从低频到高频的各种余弦成分;而DCT系数只是表示了该种成分所占原图像信号的份额大小;显然,恢复图像信息可以表示为这样一个矩阵形式:F(n)=C(n)*E(n),式中E(n)是一个基底,C(n)是DCT系数,F(n)则是图像信号。如果再考虑垂直方向上的变化,那么,就需要一个二维的基底,即该基底不仅要反映水平方向频率的变化;而且要反映垂直空间频率的变化;对应于8*8的像素块;空间基底是由64个像素值所组成的图像,通常也称之为基本图像。把它们称为基本图像是因为在离散余弦变换的反变换式中,任何像块都可以表示成64个系数的不同大小的组合。既然基本图像相当于变换域中的单一的系数,那么任何像元也可以看成由64个不同幅度的基本图像的组合。这与任何信号可以分解成基波和不同幅度的谐波的组合具有相同的物理意义。量化过程是一个将信号的幅度离散化的过程,离散信号经过量化后变为数字信号。由于HVS对低频信号更为敏感,所以对信号的低频部分采用相对短的量化步长,对信号的高频部分采用相对长的量化步长。这样可以在一定程度上,得到相对清晰的图像和更高的压缩率。之后执行Z字形编码(zigzag scan),按Z字形把量化后的数据读出。最后,使用行程长度编码(RLE)对交流系数(AC)进行编码,所谓游程长度编码是指一个码可以同时表示码的值和前面有几个零。这样就发挥了Z字型读出的优点,因为Z字型读出,出现连零的机会比较多,特别到最后,如果都是零,在读到最后一个数后,只要给出“块结 束”(EOB)码,就可以结束输出,因此节省了很多码率。最终形成的压缩图像由中央处理芯片记录至存储设备内存储,共后续调用查看。
该实施例二为基于本发明技术方案的利用摄像组件或摄像模块进行图像拍摄、拍照的应用,通过图像处理芯片增加外部存储接口(USB,TF等),拍照后可直接存储在图像处理芯片外部的存储设备上。
实施例三
在该实施例中,中央处理芯片发送交互界面数据至图像处理芯片,后图像处理芯片整合交互界面数据及增强图像数据,形成整合图像。具体地,中央处理芯片与图像处理芯片连接后,中央处理芯片将预先配置的对于图像预览的预览界面的交互界面GUI数据通过DSI接口发送至图像处理芯片。DSI定义了一个位于处理器和显示模组之间的高速串行接口,其分四层,PHY层、LaneManagement层、Low Level Protocol层、Application层,分别对应D-PHY、DSI、DCS规范,其中PHY定义了传输媒介,输入/输出电路和和时钟和信号机制,Lane Management层发送和收集数据流到每条lane,Low Level Protocol层定义了如何组帧和解析以及错误检测等,Application层描述高层编码和解析数据流,且该交互界面数据包括交互界面的透明度信息,即Alpha信息。图像处理芯片通过Alpha blending将交互界面数据与增强图像数据整合,使得增强图像以交互界面为显示界面显示,即显示一整合图像。该整合图像将通过DPU发送至显示模块以显示。
该实施例三为基于本发明技术方案的利用摄像组件或摄像模块进行图像拍摄时提供预览功能至用户的应用,通过与交互界面的配合,向用户提供更易接受和习惯的视觉观感。
为实现上述任一实施例中智能终端的各部件功能,对于智能终端内模块的设计及连接,参阅图4,实施方案如下:
图像处理芯片包括:
图像传感处理模块ISP、图像处理单元IPU、第一数据处理单元DPU。图像传感处理模块ISP分别与图像传感器Image sensor及图像处理单元IPU连接,图像处理单元IPU与第一数据处理单元DPU连接,第一数据处理单元DPU与显示模块LCD连接。
中央处理芯片(CPU或集成有CPU的AP端)包括:
图像感应处理模块ISP(by pass)、第二数据处理单元DPU、图片压缩器Jpeg Encoder。图像感应处理模块ISP(by pass)与图像处理单元IPU连接,接收增强图像数据;第二数据处理单元DPU与图像处理单元IPU连接,发送交互界面数据;图片压缩器图片压缩 器Jpeg Encoder与图像感应处理模块ISP(by pass)连接,接收图像增强数据并压缩;图片压缩器还与存储设备连接,发送压缩图像至存储设备。
当需要在AP端显示数据流程时,AP端的画面经过DPU处理后,通过mipi(DSI)传给图像处理芯片IPIC,IPU对AP端送来的数据进行图像增强,图像增强后的数据通过DPU传给LCD显示;
当需要对拍摄照片进行预览时,image sensor的数据通过CamIF送至IPIC,AP端的GUI数据通过DSI送至IPIC,GUI数据带有alpha信息,IPIC对image sensor的数据进行处理后与AP端送来的GUI做alpha blending,alpha blending之后的数据通过DPU输出给LCD显示;
当需要进行拍摄录像时,image sensor的数据通过CamIF送至IPIC,ISP对image sensor的数据进行处理,ISP处理后的数据送给IPU做图像增强,IPU处理后的数据一路回传给AP端,AP端收到数据后使用VPU进行视频编码,然后写入存储设备;IPU处理后的数据另外一路与AP端送来的GUI数据(GUI数据含有alpha信息)做alpha blending,alpha blending之后的数据通过IPIC的DPU输出给LCD显示;
当需要进行拍摄照相时,image sensor的数据通过CamIF送至IPIC,IPIC的ISP对image sensor的数据进行处理,ISP处理之后的数据送给IPU做图像增强,IPU处理之后的数据通过CamIF回传给AP端,AP端收到数据后使用jpeg encoder进行图像压缩,写入存储设备。
参阅图5,本发明还公开了一种图像处理方法,包括以下步骤:
S100:图像传感器采集拍摄对象的图像数据;
S200:图像传感器将图像数据发送至图像处理芯片,由图像处理芯片的图像传感处理模块对图像数据进行处理,形成增强图像数据;
S300:图像处理芯片将增强图像数据回传至一智能终端内的中央处理芯片,由中央处理芯片记录增强图像数据至一存储设备;
S400:图像处理芯片将增强图像数据发送至一显示模块,由显示模块显示对应增强图像数据的增强图像。
参阅图6,一优选实施例中步骤S300包括:
S310:图像处理芯片将增强图像数据回传至一智能终端内的中央处理芯片;
S320:中央处理芯片对增强图像数据进行编码,形成视频数据;
S330:中央处理芯片记录视频数据至存储设备。
参阅图7,一优选实施例中,步骤S300包括:
S310’:图像处理芯片将增强图像数据回传至一智能终端内的中央处理芯片;
S320’:中央处理芯片对增强图像数据进行压缩,形成压缩图像;
S330’:中央处理芯片记录压缩图像至存储设备。
参阅图8,一优选实施例中,步骤S300包括:
S310”:中央处理芯片发送交互界面数据至图像处理芯片;
S320”:图像处理芯片整合交互界面数据及增强图像数据,形成整合图像。
此外,基于上述图像处理方法,可在智能终端内安装计算机可读存储介质,其上存储有计算机程序,当该计算机程序被处理器执行时,实现如上文所述的图像处理方法的步骤。
智能终端可以以各种形式来实施。例如,本发明中描述的终端可以包括诸如移动电话、智能电话、笔记本电脑、PDA(个人数字助理)、PAD(平板电脑)、PMP(便携式多媒体播放器)、导航装置等等的智能终端以及诸如数字TV、台式计算机等等的固定终端。下面,假设终端是智能终端。然而,本领域技术人员将理解的是,除了特别用于移动目的的元件之外,根据本发明的实施方式的构造也能够应用于固定类型的终端。
应当注意的是,本发明的实施例有较佳的实施性,且并非对本发明作任何形式的限制,任何熟悉该领域的技术人员可能利用上述揭示的技术内容变更或修饰为等同的有效实施例,但凡未脱离本发明技术方案的内容,依据本发明的技术实质对以上实施例所作的任何修改或等同变化及修饰,均仍属于本发明技术方案的范围内。

Claims (10)

  1. 一种智能终端,包括包括显示模块、图像传感器及中央处理芯片,其特征在于,
    所述智能终端还包括独立于所述中央处理芯片的图像处理芯片;
    所述图像处理芯片分别与所述显示模块、图像传感器及中央处理芯片连接;
    所述图像传感器采集拍摄对象的图像数据并发送至图像处理芯片;
    所述图像处理芯片的图像传感处理模块对所述图像数据进行处理,形成增强图像数据;
    所述图像处理芯片将所述增强图像数据回传至所述中央处理芯片,由所述中央处理芯片记录所述增强图像数据至一存储设备,且所述图像处理芯片将所述增强图像数据发送至一显示模块,由所述显示模块显示对应所述增强图像数据的增强图像。
  2. 如权利要求1所述的智能终端,其特征在于,
    所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
    所述中央处理芯片对所述增强图像数据进行编码,形成视频数据;
    所述中央处理芯片记录视频数据至所述存储设备。
  3. 如权利要求1所述的智能终端,其特征在于,
    所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
    所述中央处理芯片对所述增强图像数据进行压缩,形成压缩图像;
    所述中央处理芯片记录压缩图像至所述存储设备。
  4. 如权利要求1所述的智能终端,其特征在于,
    所述中央处理芯片发送交互界面数据至所述图像处理芯片;
    所述图像处理芯片整合所述交互界面数据及增强图像数据,形成整合图像。
  5. 如权利要求2-4任一项所述的智能终端,其特征在于,
    所述图像处理芯片包括:
    图像处理单元、第一数据处理单元;
    所述图像传感处理模块分别与所述图像传感器及所述图像处理单元连接,所述图像处理单元与所述第一数据处理单元连接,所述第一数据处理单元与所述显示模块连接;
    所述中央处理芯片包括:
    图像感应处理模块、第二数据处理单元;
    所述图像感应处理模块与所述图像处理单元连接,接收增强图像数据;
    所述第二数据处理单元与所述图像处理单元连接,发送交互界面数据。
  6. 如权利要求5所述的智能终端,其特征在于,
    所述中央处理芯片还包括:
    图片压缩器,与所述图像感应处理模块连接,接收图像增强数据并压缩;
    所述图片压缩器还与所述存储设备连接,发送压缩图像至存储设备。
  7. 一种图像处理方法,其特征在于,包括以下步骤:
    S100:图像传感器采集拍摄对象的图像数据;
    S200:图像传感器将所述图像数据发送至图像处理芯片,由所述图像处理芯片的图像传感处理模块对所述图像数据进行处理,形成增强图像数据;
    S300:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片,由所述中央处理芯片记录所述增强图像数据至一存储设备;
    S400:所述图像处理芯片将所述增强图像数据发送至一显示模块,由所述显示模块显示对应所述增强图像数据的增强图像。
  8. 如权利要求7所述的图像处理方法,其特征在于,
    所述步骤S300包括:
    S310:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
    S320:所述中央处理芯片对所述增强图像数据进行编码,形成视频数据;
    S330:所述中央处理芯片记录视频数据至所述存储设备。
  9. 如权利要求7所述的图像处理方法,其特征在于,
    所述步骤S300包括:
    S310’:所述图像处理芯片将所述增强图像数据回传至一智能终端内的中央处理芯片;
    S320’:所述中央处理芯片对所述增强图像数据进行压缩,形成压缩图像;
    S330’:所述中央处理芯片记录压缩图像至所述存储设备。
  10. 一种计算机可读存储介质,其上存储有计算机程序,其特征在于,所述计算机程序被处理器执行时实现如权利要求7-9任一项所述的图像处理方法的步骤。
PCT/CN2019/107193 2018-09-29 2019-09-23 智能终端、图像处理方法及计算机可读存储介质 WO2020063506A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811145686.8 2018-09-29
CN201811145686.8A CN109167916A (zh) 2018-09-29 2018-09-29 智能终端、图像处理方法及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2020063506A1 (zh)

Family

ID=64892835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/107193 WO2020063506A1 (zh) 2018-09-29 2019-09-23 智能终端、图像处理方法及计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN109167916A (zh)
WO (1) WO2020063506A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN110062161B (zh) * 2019-04-10 2021-06-25 Oppo广东移动通信有限公司 图像处理器、图像处理方法、拍摄装置和电子设备
CN111601019B (zh) * 2020-02-28 2021-11-16 北京爱芯科技有限公司 图像数据处理模组及电子设备
CN114285957A (zh) * 2021-12-28 2022-04-05 维沃移动通信有限公司 图像处理电路及数据传输方法

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101192136A (zh) * 2006-11-30 2008-06-04 北京思比科微电子技术有限公司 基于usb模式的图像传输方法及装置
CN101202880A (zh) * 2007-06-08 2008-06-18 深圳市德诺通讯技术有限公司 一种具有摄像功能的移动终端
JP2009207014A (ja) * 2008-02-28 2009-09-10 Sharp Corp 画像処理装置
CN107220208A (zh) * 2017-07-07 2017-09-29 深圳市图芯智能科技有限公司 一种图像处理系统及方法
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109242757A (zh) * 2018-09-29 2019-01-18 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100969322B1 (ko) * 2008-01-10 2010-07-09 엘지전자 주식회사 멀티 그래픽 컨트롤러를 구비한 데이터 처리 장치 및 이를이용한 데이터 처리 방법
CN102883167A (zh) * 2012-09-19 2013-01-16 旗瀚科技有限公司 一种视频图像数据处理方法及系统
CN203675199U (zh) * 2013-12-31 2014-06-25 冠捷显示科技(厦门)有限公司 一种可升级软硬件性能的电视
TW201637432A (zh) * 2015-04-02 2016-10-16 Ultracker Technology Co Ltd 即時影像縫合裝置及即時影像縫合方法

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101192136A (zh) * 2006-11-30 2008-06-04 北京思比科微电子技术有限公司 基于usb模式的图像传输方法及装置
CN101202880A (zh) * 2007-06-08 2008-06-18 深圳市德诺通讯技术有限公司 一种具有摄像功能的移动终端
JP2009207014A (ja) * 2008-02-28 2009-09-10 Sharp Corp 画像処理装置
CN107220208A (zh) * 2017-07-07 2017-09-29 深圳市图芯智能科技有限公司 一种图像处理系统及方法
CN109167915A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质
CN109167916A (zh) * 2018-09-29 2019-01-08 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109242757A (zh) * 2018-09-29 2019-01-18 南昌黑鲨科技有限公司 智能终端、图像处理方法及计算机可读存储介质
CN109286753A (zh) * 2018-09-29 2019-01-29 南昌黑鲨科技有限公司 图像处理方法、系统及计算机可读存储介质

Also Published As

Publication number Publication date
CN109167916A (zh) 2019-01-08

Similar Documents

Publication Publication Date Title
WO2020063505A1 (zh) 图像处理方法、系统及计算机可读存储介质
WO2020063507A1 (zh) 图像处理方法、系统及计算机可读存储介质
WO2020063508A1 (zh) 智能终端、图像处理方法及计算机可读存储介质
WO2020063506A1 (zh) 智能终端、图像处理方法及计算机可读存储介质
US10326904B2 (en) On-chip image sensor data compression
US8179452B2 (en) Method and apparatus for generating compressed file, and terminal comprising the apparatus
US8558909B2 (en) Method and apparatus for generating compressed file, camera module associated therewith, and terminal including the same
EP1667457A1 (en) Image processing display device and image processing display method
US6697106B1 (en) Apparatus for processing image signals representative of a still picture and moving pictures picked up
KR20090015402A (ko) Jpeg 캡쳐 시간 단축을 위한 영상 처리 장치 및 그영상 처리 장치에서 jpeg 캡쳐 방법
WO2023160295A1 (zh) 视频处理方法和装置
US20070065022A1 (en) Image signal processing apparatus and method
JP2001238190A (ja) 画像処理装置及びその制御処理方法
US10304213B2 (en) Near lossless compression scheme and system for processing high dynamic range (HDR) images
US20120262603A1 (en) Image Capturing Device and Image Processing Method Thereof
JP4302661B2 (ja) 画像処理システム
KR101666927B1 (ko) 압축 파일 생성 방법 및 장치, 이를 포함하는 단말기
US8154749B2 (en) Image signal processor and deferred vertical synchronous signal outputting method
US20050068336A1 (en) Image overlay apparatus and method for operating the same
JP2001024929A (ja) ディジタルカメラ
US20080225165A1 (en) Image Pickup Device and Encoded Data Transferring Method
JP2018207424A (ja) 情報転送装置
WO2023016043A1 (zh) 视频处理方法、装置、电子设备和存储介质
KR20080113649A (ko) 캡쳐 영상을 시간 지연 없이 표시할 수 있는 영상 처리장치, 방법 및 상기 방법을 프로그램화하여 수록한컴퓨터로 읽을 수 있는 기록매체
JPH11220643A (ja) デジタルスチルカメラ

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19867096

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 25.06.2021)

122 Ep: pct application non-entry in european phase

Ref document number: 19867096

Country of ref document: EP

Kind code of ref document: A1