WO2017041714A1 - Method and device for acquiring RGB data - Google Patents

Method and device for acquiring RGB data

Info

Publication number
WO2017041714A1
Authority
WO
WIPO (PCT)
Prior art keywords
channel
pixel
image
gradient
interpolation result
Prior art date
Application number
PCT/CN2016/098318
Other languages
English (en)
Chinese (zh)
Inventor
朱德志
Original Assignee
努比亚技术有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 努比亚技术有限公司
Publication of WO2017041714A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/10 Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4015 Image demosaicing, e.g. colour filter arrays [CFA] or Bayer patterns
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/64 Circuits for processing colour signals
    • H04N 9/646 Circuits for processing colour signals for image enhancement, e.g. vertical detail restoration, cross-colour elimination, contour correction, chrominance trapping filters

Definitions

  • This document relates to, but is not limited to, image processing technology, and more particularly to a method and apparatus for acquiring data of RGB (Red Green Blue).
  • In a camera using a CMOS (Complementary Metal Oxide Semiconductor) image sensor, the sensor outputs raw image data in the Bayer format, and RGB data is obtained from it by interpolation. Methods for obtaining RGB data generally include:
  • First, the pixel values of the G channel of the pixels corresponding to the R channel and the B channel in the image are reconstructed.
  • The process of reconstructing the pixel value of the G channel of a pixel corresponding to the R channel in the image and the process of reconstructing the pixel value of the G channel of a pixel corresponding to the B channel in the image are similar.
  • ΔH_{i1,j1} is the horizontal gradient of the pixel in the i1-th row and j1-th column, and ΔV_{i1,j1} is the vertical gradient of the pixel in the i1-th row and j1-th column; G_{i1,j1-1} is the pixel value of the G channel of the pixel in the i1-th row and (j1-1)-th column, and G_{i1,j1+1} is the pixel value of the G channel of the pixel in the i1-th row and (j1+1)-th column; R_{i1,j1} is the pixel value of the R channel of the pixel in the i1-th row and j1-th column, R_{i1,j1-2} is the pixel value of the R channel of the pixel in the i1-th row and (j1-2)-th column, and R_{i1,j1+2} is the pixel value of the R channel of the pixel in the i1-th row and (j1+2)-th column.
  • Then, the pixel values of the R channel and of the B channel of the pixels corresponding to the G channel in the image are reconstructed: according to one formula, the pixel value of the R channel of the pixel in the i2-th row and j2-th column is reconstructed, and according to another formula, the pixel value of the B channel of the pixel in the i2-th row and j2-th column is reconstructed.
  • r_{i2,j2} is the reconstructed pixel value of the R channel of the pixel in the i2-th row and j2-th column; R_{i2-1,j2} is the pixel value of the R channel of the pixel in the (i2-1)-th row and j2-th column, and R_{i2+1,j2} is the pixel value of the R channel of the pixel in the (i2+1)-th row and j2-th column; G_{i2,j2} is the pixel value of the G channel of the pixel in the i2-th row and j2-th column; g_{i2-1,j2} is the reconstructed pixel value of the G channel of the pixel in the (i2-1)-th row and j2-th column, and g_{i2+1,j2} is the reconstructed pixel value of the G channel of the pixel in the (i2+1)-th row and j2-th column; b_{i2,j2} is the reconstructed pixel value of the B channel of the pixel in the i2-th row and j2-th column.
  • The process of reconstructing the pixel values of the R channel of the pixels corresponding to the B channel in the image is similar.
  • The 45-degree gradient D_45(i3,j3) and the 135-degree gradient D_135(i3,j3) of the pixel in the i3-th row and j3-th column are calculated according to corresponding formulas, where D_45(i3,j3) is the 45-degree gradient of the pixel in the i3-th row and j3-th column and D_135(i3,j3) is the 135-degree gradient of the pixel in the i3-th row and j3-th column; B_{i3-1,j3+1} is the pixel value of the B channel of the pixel in the (i3-1)-th row and (j3+1)-th column, and B_{i3+1,j3-1} is the pixel value of the B channel of the pixel in the (i3+1)-th row and (j3-1)-th column; g_{i3,j3} is the reconstructed pixel value of the G channel of the pixel in the i3-th row and j3-th column, g_{i3-1,j3+1} is the reconstructed pixel value of the G channel of the pixel in the (i3-1)-th row and (j3+1)-th column, and g_{i3+1,j3-1} is the reconstructed pixel value of the G channel of the pixel in the (i3+1)-th row and (j3-1)-th column; B_{i3-1,j3-1} is the pixel value of the B channel of the pixel in the (i3-1)-th row and (j3-1)-th column, and B_{i3+1,j3+1} is the pixel value of the B channel of the pixel in the (i3+1)-th row and (j3+1)-th column; g_{i3-1,j3-1} is the reconstructed pixel value of the G channel of the pixel in the (i3-1)-th row and (j3-1)-th column, and g_{i3+1,j3+1} is the reconstructed pixel value of the G channel of the pixel in the (i3+1)-th row and (j3+1)-th column.
  • Embodiments of the present invention provide a method and apparatus for acquiring RGB data, which can reduce false color and moiré, thereby improving visual quality of an image.
  • An embodiment of the present invention provides an apparatus for acquiring red, green, and blue RGB data, including:
  • an obtaining module configured to acquire a first horizontal interpolation result and a first vertical interpolation result of pixels corresponding to the red R channel/blue B channel in the image, and to acquire a horizontal gradient and a vertical gradient of the pixels corresponding to the R channel/B channel in the image;
  • a first reconstruction module configured to, when it is determined that the absolute value of the difference between the obtained horizontal gradient and vertical gradient is greater than 0 and less than a first preset threshold, reconstruct the pixel values of the green G channel of the pixels corresponding to the R channel/B channel in the image according to a weighted average of the obtained first horizontal interpolation result and first vertical interpolation result;
  • a second reconstruction module configured to reconstruct a pixel value of the R channel/B channel of the pixels corresponding to the G channel in the image, and to reconstruct, according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image, the pixel value of the B channel of the pixels corresponding to the R channel and the pixel value of the R channel of the pixels corresponding to the B channel in the image.
  • the first reconstruction module is further configured to: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is equal to 0, reconstruct the pixel values of the G channel of the pixels corresponding to the R channel/B channel in the image according to the average of the first horizontal interpolation result and the first vertical interpolation result.
  • the first reconstruction module is further configured to: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than the first preset threshold and the horizontal gradient is smaller than the vertical gradient, reconstruct the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the first horizontal interpolation result; or, when the vertical gradient is smaller than the horizontal gradient, reconstruct it according to the first vertical interpolation result;
  • the second reconstruction module is configured to acquire a second horizontal interpolation result or a second vertical interpolation result of the pixel corresponding to the G channel in the image, and reconstruct according to the second horizontal interpolation result or the second vertical interpolation result a pixel value of an R channel/B channel of a pixel corresponding to the G channel in the image;
  • the second reconstruction module is configured to reconstruct a pixel value of an R channel/B channel of a pixel corresponding to the G channel in the image;
  • determine that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than 0 and less than a second preset threshold, and reconstruct the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the weighted average of the 45-degree interpolation result and the 135-degree interpolation result.
  • the second reconstruction module is further configured to: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is equal to 0, reconstruct the pixel values of the B channel/R channel of the pixels corresponding to the R channel/B channel in the image according to the average of the 45-degree interpolation result and the 135-degree interpolation result.
  • the second reconstruction module is further configured to: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 45-degree gradient is smaller than the 135-degree gradient, reconstruct the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the 45-degree interpolation result; or, when the 135-degree gradient is smaller than the 45-degree gradient, reconstruct it according to the 135-degree interpolation result.
  • the obtaining module is configured to obtain a first horizontal interpolation result and a first vertical interpolation result of the pixel corresponding to the red R channel/blue B channel in the image by:
  • g1_{i1,j1} is the first horizontal interpolation result and g2_{i1,j1} is the first vertical interpolation result; G_{i1,j1-1} is the pixel value of the G channel of the pixel in the i1-th row and (j1-1)-th column of the image, and G_{i1,j1+1} is the pixel value of the G channel of the pixel in the i1-th row and (j1+1)-th column of the image; A_{i1,j1} is the pixel value of the R channel/B channel of the pixel in the i1-th row and j1-th column of the image, A_{i1,j1-2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1-2)-th column of the image, and A_{i1,j1+2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1+2)-th column of the image; G_{i1-1,j1} is the pixel value of the G channel of the pixel in the (i1-1)-th row and j1-th column of the image, and G_{i1+1,j1} is the pixel value of the G channel of the pixel in the (i1+1)-th row and j1-th column of the image; A_{i1-2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1-2)-th row and j1-th column of the image, and A_{i1+2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1+2)-th row and j1-th column of the image.
  • the acquiring module is configured to obtain a horizontal gradient and a vertical gradient of pixels corresponding to the R channel/B channel in the image by:
  • ΔH1_{i1,j1} is the horizontal gradient and ΔH2_{i1,j1} is the vertical gradient.
  • the first reconstruction module is configured to reconstruct the pixel values of the G channel of the pixels corresponding to the R channel/B channel in the image according to the weighted average of the obtained first horizontal interpolation result and first vertical interpolation result by:
  • the embodiment of the invention further provides a method for acquiring red, green and blue RGB data, comprising:
  • the pixel value of the B channel of the pixel corresponding to the R channel in the image and the pixel value of the R channel of the pixel corresponding to the B channel in the image are reconstructed according to the pixel values of the reconstructed G channel of the pixel corresponding to the R channel and the B channel in the image.
  • the method further includes: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is equal to 0, reconstructing the pixel values of the G channel of the pixels corresponding to the R channel/B channel in the image according to the average of the first horizontal interpolation result and the first vertical interpolation result.
  • the method further includes:
  • Reconstructing the pixel values of the R channel/B channel of the pixel corresponding to the G channel in the image includes:
  • Reconstructing the pixel values of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image includes:
  • reconstructing, according to the pixel value of the pixel corresponding to the G channel in the image and the pixel value of the reconstructed G channel, the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image further includes: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is equal to 0, reconstructing the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the average of the 45-degree interpolation result and the 135-degree interpolation result.
  • reconstructing, according to the pixel value of the pixel corresponding to the G channel in the image and the pixel value of the reconstructed G channel, the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image further includes: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 45-degree gradient is smaller than the 135-degree gradient, reconstructing the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the 45-degree interpolation result; or, when the 135-degree gradient is smaller than the 45-degree gradient, reconstructing it according to the 135-degree interpolation result.
  • Obtaining a first horizontal interpolation result and a first vertical interpolation result of the pixel corresponding to the red R channel/blue B channel in the image includes:
  • g1_{i1,j1} is the first horizontal interpolation result and g2_{i1,j1} is the first vertical interpolation result; G_{i1,j1-1} is the pixel value of the G channel of the pixel in the i1-th row and (j1-1)-th column of the image, and G_{i1,j1+1} is the pixel value of the G channel of the pixel in the i1-th row and (j1+1)-th column of the image; A_{i1,j1} is the pixel value of the R channel/B channel of the pixel in the i1-th row and j1-th column of the image, A_{i1,j1-2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1-2)-th column of the image, and A_{i1,j1+2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1+2)-th column of the image; G_{i1-1,j1} is the pixel value of the G channel of the pixel in the (i1-1)-th row and j1-th column of the image, and G_{i1+1,j1} is the pixel value of the G channel of the pixel in the (i1+1)-th row and j1-th column of the image; A_{i1-2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1-2)-th row and j1-th column of the image, and A_{i1+2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1+2)-th row and j1-th column of the image.
  • Obtaining the horizontal and vertical gradients of the pixels corresponding to the R channel/B channel in the image includes:
  • ΔH1_{i1,j1} is the horizontal gradient and ΔH2_{i1,j1} is the vertical gradient.
  • Reconstructing the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the weighted average of the obtained first horizontal interpolation result and first vertical interpolation result includes:
  • The technical solution of the embodiments of the present invention includes: acquiring a first horizontal interpolation result and a first vertical interpolation result of the pixels corresponding to the R channel/B channel in the image; acquiring a horizontal gradient and a vertical gradient of the pixels corresponding to the R channel/B channel in the image; when the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than 0 and less than a first preset threshold, reconstructing the pixel value of the G channel of the pixels corresponding to the R channel/B channel in the image according to a weighted average of the first horizontal interpolation result and the first vertical interpolation result; reconstructing the pixel value of the R channel/B channel of the pixels corresponding to the G channel in the image; and reconstructing, according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image, the pixel value of the B channel of the pixels corresponding to the R channel and the pixel value of the R channel of the pixels corresponding to the B channel in the image. Reconstructing the G channel by a weighted average when the two gradients are close reduces false color and moiré, thereby improving the visual quality of the image.
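The weighting used when the two gradients are close is central to this solution, but the exact formula appears in the application only as an equation image that is not reproduced in this text. A commonly used inverse-gradient weighting, shown below purely as an illustrative assumption (g1, g2 are the first horizontal and vertical interpolation results, ΔH1, ΔH2 the horizontal and vertical gradients, and T_1 the first preset threshold), weights each directional result by the gradient of the opposite direction:

```latex
g_{i1,j1} \;=\; \frac{\Delta H2_{i1,j1}\, g1_{i1,j1} + \Delta H1_{i1,j1}\, g2_{i1,j1}}{\Delta H1_{i1,j1} + \Delta H2_{i1,j1}},
\qquad 0 < \bigl|\Delta H1_{i1,j1} - \Delta H2_{i1,j1}\bigr| < T_1 .
```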
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a flowchart of a method for acquiring RGB data according to an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of an image according to an embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of an apparatus for acquiring RGB data according to an embodiment of the present invention.
  • the mobile terminal can be implemented in various forms.
  • the terminal described in the embodiments of the present invention may include, for example, mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (Personal Digital Assistant), a PAD (tablet computer), a PMP (Portable Multimedia Player), a navigation device, and the like, as well as fixed terminals such as a digital TV, a desktop computer, and the like.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal embodying various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 can receive digital broadcasting by using a digital broadcasting system such as multimedia broadcast-terrestrial (DMB-T), digital multimedia broadcast-satellite (DMB-S), digital video broadcast-handheld (DVB-H), the data broadcasting system of forward link media (MediaFLO), integrated services digital broadcasting-terrestrial (ISDB-T), and the like.
  • the broadcast receiving module 111 can be constructed to be suitable for various broadcast systems that provide broadcast signals, as well as the above-described digital broadcast systems.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies involved in the module may include WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless Broadband), Wimax (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), etc. .
  • the short range communication module 114 is a module that is configured to support short range communication.
  • Some examples of short-range communication technology include Bluetooth™, radio frequency identification (RFID), infrared data association (IrDA), ultra wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is a module configured to check or acquire location information of the mobile terminal.
  • a typical example of a location information module is GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is arranged to receive an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 and output.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information, and may include a keyboard, a pot, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of contact (i.e., touch input) by the user with the mobile terminal 100, the orientation of the mobile terminal 100, the acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling the operation of the mobile terminal 100.
  • For example, when the mobile terminal 100 is implemented as a slide type mobile phone, the sensing unit 140 can sense whether the slide type phone is slid open or closed.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
  • the identification module may store various information for verifying the use of the mobile terminal 100 by the user, and may include a User Identification Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 may be arranged to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more components within the mobile terminal 100, or may be used to transfer data between the mobile terminal and an external device.
  • the interface unit 170 may function as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a user interface (UI) or graphical user interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be set to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alert unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibrations, and when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (ie, vibration) to notify the user of it. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 can also provide an output of the notification event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for the processing and control operations performed by the controller 180, or may temporarily store data that has been output or is to be output (for example, a phone book, messages, still images, videos, etc.). Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card type memory (e.g., SD or DX memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 181 that is configured to reproduce (or play back) multimedia data, and the multimedia module 181 can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the various elements and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • the embodiments described herein may be implemented by using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • In the following, a slide type mobile terminal among various types of mobile terminals, such as folding type, bar type, swing type, and slide type mobile terminals, will be described as an example. However, embodiments of the present invention can be applied to any type of mobile terminal and are not limited to a slide type mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • a communication system in which a mobile terminal is operable according to an embodiment of the present invention will now be described with reference to FIG.
  • Such communication systems may use different air interfaces and/or physical layers.
  • air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), Global System for Mobile Communications (GSM), and the like.
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 may include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas that are set to receive diversity. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • BTS Base Transceiver Subsystem
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station.”
  • Alternatively, the partitions of a particular BS 270 may be referred to as a plurality of cell stations.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • A GPS (Global Positioning System) satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • a first embodiment of the present invention provides a method for acquiring RGB data, including:
  • Step 300: Acquire a first horizontal interpolation result and a first vertical interpolation result of the pixel corresponding to the R channel/B channel in the image, and acquire a horizontal gradient and a vertical gradient of the pixel corresponding to the R channel/B channel in the image.
  • "/" means replaceable, that is, the process of reconstructing the pixel value of the G channel of the pixel corresponding to the R channel in the reconstructed image and the pixel value of the G channel of the pixel corresponding to the B channel in the reconstructed image are the same.
  • FIG. 4 is a schematic illustration of an image. As shown in FIG. 4, the format of the odd lines in the image is RGRG..., and the format of the even lines is GBGB.... FIG. 4 is only a schematic diagram of an image and does not limit the format of images that can be processed by the embodiments of the present invention; any image in the Bayer format can be interpolated by the method of the embodiments of the present invention.
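As a small illustration of the layout described for FIG. 4 (odd lines RGRG..., even lines GBGB...), the following sketch maps a 0-indexed pixel position to its Bayer channel; other Bayer phases are handled analogously, and the function is an illustrative assumption rather than part of the application:

```python
def bayer_channel(row, col):
    """Channel at (row, col) for the RGRG/GBGB layout of FIG. 4 (0-indexed)."""
    if row % 2 == 0:                      # "odd lines" of FIG. 4: R G R G ...
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'   # "even lines" of FIG. 4: G B G B ...
```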
  • Obtaining the first horizontal interpolation result and the first vertical interpolation result of the pixel corresponding to the R channel/B channel in the image includes:
  • g1_{i1,j1} is the first horizontal interpolation result and g2_{i1,j1} is the first vertical interpolation result; G_{i1,j1-1} is the pixel value of the G channel of the pixel in the i1-th row and (j1-1)-th column of the image, and G_{i1,j1+1} is the pixel value of the G channel of the pixel in the i1-th row and (j1+1)-th column of the image; A_{i1,j1} is the pixel value of the R channel/B channel of the pixel in the i1-th row and j1-th column of the image, A_{i1,j1-2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1-2)-th column of the image, and A_{i1,j1+2} is the pixel value of the R channel/B channel of the pixel in the i1-th row and (j1+2)-th column of the image; G_{i1-1,j1} is the pixel value of the G channel of the pixel in the (i1-1)-th row and j1-th column of the image, and G_{i1+1,j1} is the pixel value of the G channel of the pixel in the (i1+1)-th row and j1-th column of the image; A_{i1-2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1-2)-th row and j1-th column of the image, and A_{i1+2,j1} is the pixel value of the R channel/B channel of the pixel in the (i1+2)-th row and j1-th column of the image.
  • obtaining horizontal and vertical gradients of pixels corresponding to the R channel/B channel in the image includes:
  • ΔH1_{i1,j1} is the horizontal gradient and ΔH2_{i1,j1} is the vertical gradient.
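The first horizontal/vertical interpolation results and the gradients are given in the application as equation images that are not reproduced here. The sketch below is a minimal Python illustration built from the same neighbours the text names (G samples at columns j1±1 and rows i1±1, and R/B samples at the centre and at ±2); the specific gradient-corrected expressions are classical Hamilton-Adams style choices and are assumptions, not the application's literal formulas:

```python
def first_interpolations_and_gradients(raw, i1, j1):
    """Step 300 sketch at an R/B site (i1, j1) of the Bayer mosaic `raw`
    (a 2-D array, e.g. a NumPy array)."""
    g_l, g_r = raw[i1, j1 - 1], raw[i1, j1 + 1]        # G neighbours left/right
    g_u, g_d = raw[i1 - 1, j1], raw[i1 + 1, j1]        # G neighbours up/down
    a_c = raw[i1, j1]                                  # R/B value at the centre
    a_l2, a_r2 = raw[i1, j1 - 2], raw[i1, j1 + 2]      # same-channel neighbours, horizontal
    a_u2, a_d2 = raw[i1 - 2, j1], raw[i1 + 2, j1]      # same-channel neighbours, vertical

    g1 = (g_l + g_r) / 2 + (2 * a_c - a_l2 - a_r2) / 4   # first horizontal interpolation result
    g2 = (g_u + g_d) / 2 + (2 * a_c - a_u2 - a_d2) / 4   # first vertical interpolation result
    dh1 = abs(g_l - g_r) + abs(2 * a_c - a_l2 - a_r2)    # horizontal gradient (ΔH1)
    dh2 = abs(g_u - g_d) + abs(2 * a_c - a_u2 - a_d2)    # vertical gradient (ΔH2)
    return g1, g2, dh1, dh2
```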
  • Step 301: When it is determined that the absolute value of the difference between the obtained horizontal gradient and the vertical gradient is greater than 0 and less than the first preset threshold, reconstruct the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the weighted average of the obtained first horizontal interpolation result and first vertical interpolation result.
  • Reconstructing the pixel values of the G channel of the pixel corresponding to the R channel/B channel in the image according to the weighted average of the obtained first horizontal interpolation result and first vertical interpolation result includes:
  • g_{i1,j1} is the reconstructed pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image.
  • When it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is equal to 0, the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the average of the first horizontal interpolation result and the first vertical interpolation result.
  • Reconstructing the pixel values of the G channel of the pixel corresponding to the R channel/B channel in the image according to the average of the first horizontal interpolation result and the first vertical interpolation result includes:
  • When it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than the first preset threshold and the horizontal gradient is smaller than the vertical gradient, the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the first horizontal interpolation result.
  • When it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than the first preset threshold and the vertical gradient is smaller than the horizontal gradient, the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the first vertical interpolation result.
  • When the horizontal gradient and the vertical gradient differ little (for example, the horizontal gradient is 10 and the vertical gradient is 13), the interpolation result in the direction with the smaller gradient (i.e., the edge direction) could be used as the final result if the two gradients could be determined with absolute accuracy. The problem is that, because of factors such as noise (the main factor, since noise is always present), the computed edge direction is not reliable in this case; if the interpolation result of that direction were still taken as the final result, its accuracy could not be guaranteed, and false color and moiré would appear in the picture. Therefore, when the gradients are close, the weighted average of the two interpolation results is used.
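Step 301 and its two special cases can be summarised by the following decision sketch. The weighted-average form is the same assumed inverse-gradient weighting used above; `t1` stands for the first preset threshold, whose value is not specified in this text:

```python
def reconstruct_g(g1, g2, dh1, dh2, t1):
    """Combine the horizontal (g1) and vertical (g2) interpolation results
    using the gradients dh1, dh2 (sketch of step 301 and its special cases)."""
    diff = abs(dh1 - dh2)
    if diff == 0:                          # gradients equal: plain average
        return (g1 + g2) / 2
    if diff < t1:                          # gradients close: weighted average
        return (dh2 * g1 + dh1 * g2) / (dh1 + dh2)
    return g1 if dh1 < dh2 else g2         # gradients very different: follow the edge direction
```

Using the opposite gradient as the weight makes the result lean toward the smoother direction while still blending both, which is what suppresses false color and moiré when the edge direction cannot be trusted.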
  • Step 302: Reconstruct a pixel value of the R channel/B channel of the pixel corresponding to the G channel in the image.
  • This step can be performed before or after any step, and is not affected by the order of execution of the other steps.
  • Reconstructing the pixel values of the R channel/B channel of the pixel corresponding to the G channel in the image includes:
  • Obtaining the second horizontal interpolation result or the second vertical interpolation result of the pixel corresponding to the G channel in the image includes:
  • a1_{i2,j2} is the second horizontal interpolation result and a2_{i2,j2} is the second vertical interpolation result; A_{i2-1,j2} is the pixel value of the R channel/B channel of the pixel in the (i2-1)-th row and j2-th column of the image, and A_{i2+1,j2} is the pixel value of the R channel/B channel of the pixel in the (i2+1)-th row and j2-th column of the image; G_{i2,j2} is the pixel value of the G channel of the pixel in the i2-th row and j2-th column of the image, G_{i2-2,j2} is the pixel value of the G channel of the pixel in the (i2-2)-th row and j2-th column of the image, and G_{i2+2,j2} is the pixel value of the G channel of the pixel in the (i2+2)-th row and j2-th column of the image; A_{i2,j2-1} is the pixel value of the R channel/B channel of the pixel in the i2-th row and (j2-1)-th column of the image, and A_{i2,j2+1} is the pixel value of the R channel/B channel of the pixel in the i2-th row and (j2+1)-th column of the image; G_{i2,j2-2} is the pixel value of the G channel of the pixel in the i2-th row and (j2-2)-th column of the image, and G_{i2,j2+2} is the pixel value of the G channel of the pixel in the i2-th row and (j2+2)-th column of the image.
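The second horizontal/vertical interpolation results of step 302 are likewise given as equation images. A plausible sketch using the neighbours listed above (R/B samples at ±1 and G samples at the centre and ±2 around a G pixel) is shown below as an assumption; which of the two results reconstructs R and which reconstructs B depends on the Bayer phase of the row containing the pixel:

```python
def second_interpolations(raw, i2, j2):
    """Second horizontal (a1) and vertical (a2) interpolation results of the
    missing R/B values at a G site (i2, j2) of the Bayer mosaic `raw`."""
    a1 = (raw[i2, j2 - 1] + raw[i2, j2 + 1]) / 2 \
        + (2 * raw[i2, j2] - raw[i2, j2 - 2] - raw[i2, j2 + 2]) / 4   # uses left/right R or B samples
    a2 = (raw[i2 - 1, j2] + raw[i2 + 1, j2]) / 2 \
        + (2 * raw[i2, j2] - raw[i2 - 2, j2] - raw[i2 + 2, j2]) / 4   # uses up/down B or R samples
    return a1, a2
```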
  • Step 303: Reconstruct, according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image, the pixel value of the B channel of the pixel corresponding to the R channel in the image and the pixel value of the R channel of the pixel corresponding to the B channel in the image.
  • the pixel values of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image are reconstructed according to the pixel values of the reconstructed G channel of the pixel corresponding to the R channel and the B channel in the image, including:
  • Calculating the 45-degree interpolation result and the 135-degree interpolation result of the pixel corresponding to the R channel/B channel in the image according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image includes:
  • a1_{i3,j3} is the 45-degree interpolation result and a2_{i3,j3} is the 135-degree interpolation result; A_{i3-1,j3+1} is the pixel value of the R channel/B channel of the pixel in the (i3-1)-th row and (j3+1)-th column of the image, and A_{i3+1,j3-1} is the pixel value of the R channel/B channel of the pixel in the (i3+1)-th row and (j3-1)-th column of the image; g_{i3,j3} is the reconstructed pixel value of the G channel of the pixel in the i3-th row and j3-th column of the image, g_{i3-1,j3+1} is the reconstructed pixel value of the G channel of the pixel in the (i3-1)-th row and (j3+1)-th column, and g_{i3+1,j3-1} is the reconstructed pixel value of the G channel of the pixel in the (i3+1)-th row and (j3-1)-th column; A_{i3-1,j3-1} is the pixel value of the R channel/B channel of the pixel in the (i3-1)-th row and (j3-1)-th column, and A_{i3+1,j3+1} is the pixel value of the R channel/B channel of the pixel in the (i3+1)-th row and (j3+1)-th column; g_{i3-1,j3-1} is the reconstructed pixel value of the G channel of the pixel in the (i3-1)-th row and (j3-1)-th column, and g_{i3+1,j3+1} is the reconstructed pixel value of the G channel of the pixel in the (i3+1)-th row and (j3+1)-th column of the image.
  • Calculating a 45-degree gradient and a 135-degree gradient of the pixel corresponding to the B channel/R channel in the image includes:
  • D_45(i3,j3) is the 45-degree gradient, D_135(i3,j3) is the 135-degree gradient, and C_{i3,j3} is the pixel value of the B channel/R channel of the pixel in the i3-th row and j3-th column of the image.
  • Reconstructing the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the weighted average of the 45-degree interpolation result and the 135-degree interpolation result includes:
  • a_{i3,j3} is the reconstructed pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image.
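The 45-degree and 135-degree interpolation results and gradients are again defined by equation images not reproduced here. The sketch below uses the diagonal neighbours and reconstructed G values named in the text in a colour-difference form; both the interpolation and the gradient expressions are assumptions:

```python
def diag_interpolations_and_gradients(raw, g, i3, j3):
    """45/135-degree interpolation results and gradients at an R/B site
    (i3, j3); `raw` is the Bayer mosaic and `g` the G plane from step 301."""
    # 45-degree diagonal: neighbours (i3-1, j3+1) and (i3+1, j3-1)
    a45 = g[i3, j3] + ((raw[i3 - 1, j3 + 1] - g[i3 - 1, j3 + 1])
                       + (raw[i3 + 1, j3 - 1] - g[i3 + 1, j3 - 1])) / 2
    # 135-degree diagonal: neighbours (i3-1, j3-1) and (i3+1, j3+1)
    a135 = g[i3, j3] + ((raw[i3 - 1, j3 - 1] - g[i3 - 1, j3 - 1])
                        + (raw[i3 + 1, j3 + 1] - g[i3 + 1, j3 + 1])) / 2
    d45 = abs(raw[i3 - 1, j3 + 1] - raw[i3 + 1, j3 - 1])    # 45-degree gradient D_45
    d135 = abs(raw[i3 - 1, j3 - 1] - raw[i3 + 1, j3 + 1])   # 135-degree gradient D_135
    return a45, a135, d45, d135
```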
  • When it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is equal to 0, the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the average of the 45-degree interpolation result and the 135-degree interpolation result.
  • Reconstructing the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the average of the 45-degree interpolation result and the 135-degree interpolation result includes:
  • When it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 45-degree gradient is smaller than the 135-degree gradient, the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the 45-degree interpolation result.
  • When it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 135-degree gradient is smaller than the 45-degree gradient, the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image is reconstructed according to the 135-degree interpolation result.
  • step 302 "/" means replaceable, that is, the process of reconstructing the pixel value of the B channel of the pixel corresponding to the R channel in the image, and the pixel value of the R channel of the pixel corresponding to the B channel in the image is similar. .
  • Similarly, when the 45-degree gradient and the 135-degree gradient differ little (for example, the 45-degree gradient is 10 and the 135-degree gradient is 13), the interpolation result in the direction with the smaller gradient (i.e., the edge direction) could be used as the final interpolation result, and would be the most accurate, only if the two gradients could be determined with absolute accuracy; because of noise this cannot be guaranteed, so the weighted average of the two interpolation results is used instead. When the gradients differ greatly, however, the edge direction can be accurately determined, and the interpolation result of the edge direction can be used as the final result; if the weighted average of the two directions were still used in that case, image quality would decrease (as experimental results also show). Therefore, when the gradients differ greatly, the interpolation result in the direction with the smaller gradient (i.e., the edge direction) is used as the final interpolation result.
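The selection logic for the diagonal directions mirrors that of step 301; a minimal sketch with the same assumed inverse-gradient weighting and the second preset threshold `t2` follows:

```python
def reconstruct_diag(a45, a135, d45, d135, t2):
    """Combine the 45-degree (a45) and 135-degree (a135) interpolation results
    using the gradients d45, d135 (sketch of the step 303 selection logic)."""
    diff = abs(d45 - d135)
    if diff == 0:                            # gradients equal: plain average
        return (a45 + a135) / 2
    if diff < t2:                            # gradients close: weighted average
        return (d135 * a45 + d45 * a135) / (d45 + d135)
    return a45 if d45 < d135 else a135       # otherwise follow the edge direction
```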
  • In this way, when the gradients are close, the weighted average is used to reconstruct the pixel values of the G channel of the pixels corresponding to the R channel or the B channel in the image, which reduces false color and moiré and thereby improves the visual quality of the image.
  • a second embodiment of the present invention further provides an apparatus for acquiring RGB data, including:
  • an obtaining module configured to acquire a first horizontal interpolation result and a first vertical interpolation result of pixels corresponding to the R channel/B channel in the image, and to acquire a horizontal gradient and a vertical gradient of the pixels corresponding to the R channel/B channel in the image;
  • a first reconstruction module configured to, when it is determined that the absolute value of the difference between the obtained horizontal gradient and vertical gradient is greater than 0 and less than a first preset threshold, reconstruct the pixel values of the G channel of the pixels corresponding to the R channel/B channel in the image according to a weighted average of the obtained first horizontal interpolation result and first vertical interpolation result;
  • a second reconstruction module configured to reconstruct a pixel value of the R channel/B channel of the pixels corresponding to the G channel in the image, and to reconstruct, according to the pixel values of the reconstructed G channel of the pixels corresponding to the R channel and the B channel in the image, the pixel value of the B channel of the pixels corresponding to the R channel and the pixel value of the R channel of the pixels corresponding to the B channel in the image.
  • the first reconstruction module is further configured to: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is equal to 0, reconstruct the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the average of the first horizontal interpolation result and the first vertical interpolation result.
  • the first reconstruction module is further configured to: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than the first preset threshold and the horizontal gradient is smaller than the vertical gradient, reconstruct the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the first horizontal interpolation result.
  • the first reconstruction module is further configured to: when it is determined that the absolute value of the difference between the horizontal gradient and the vertical gradient is greater than the first preset threshold and the vertical gradient is smaller than the horizontal gradient, reconstruct the pixel value of the G channel of the pixel corresponding to the R channel/B channel in the image according to the first vertical interpolation result.
  • the second reconstruction module is configured to:
  • the pixel value of the B channel of the pixel corresponding to the R channel in the image and the pixel value of the R channel of the pixel corresponding to the B channel in the image are reconstructed according to the pixel values of the reconstructed G channel of the pixel corresponding to the R channel and the B channel in the image.
  • the second reconstruction module is configured to:
  • Determining that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than 0 and less than the second predetermined threshold, and corresponding to the R channel/B channel in the weighted average reconstructed image according to the 45-degree interpolation result and the 135-degree interpolation result The pixel value of the B channel/R channel of the pixel.
  • the second reconstruction module is further configured to: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is equal to 0, reconstruct the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the average of the 45-degree interpolation result and the 135-degree interpolation result.
  • the second reconstruction module is further configured to: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 45-degree gradient is smaller than the 135-degree gradient, reconstruct the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the 45-degree interpolation result.
  • the second reconstruction module is further configured to: when it is determined that the absolute value of the difference between the 45-degree gradient and the 135-degree gradient is greater than the second preset threshold and the 135-degree gradient is smaller than the 45-degree gradient, reconstruct the pixel value of the B channel/R channel of the pixel corresponding to the R channel/B channel in the image according to the 135-degree interpolation result.
  • the acquisition module, the first reconstruction module, and the second reconstruction module may all be disposed in the controller of FIG. 1.
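Tying the modules of FIG. 5 to the step sketches above, the following structural sketch models the device as a class whose methods call the assumed helper functions defined earlier (first_interpolations_and_gradients, reconstruct_g, second_interpolations, diag_interpolations_and_gradients, reconstruct_diag); the grouping of calls is illustrative only:

```python
class RgbAcquisitionDevice:
    """Structural sketch of the apparatus for acquiring RGB data (FIG. 5)."""

    def __init__(self, first_threshold, second_threshold):
        self.t1 = first_threshold    # first preset threshold (horizontal/vertical case)
        self.t2 = second_threshold   # second preset threshold (45/135-degree case)

    def acquisition_module(self, raw, i1, j1):
        # first horizontal/vertical interpolation results and gradients at an R/B pixel
        return first_interpolations_and_gradients(raw, i1, j1)

    def first_reconstruction_module(self, g1, g2, dh1, dh2):
        # G value at an R/B pixel, blended or selected according to the gradients
        return reconstruct_g(g1, g2, dh1, dh2, self.t1)

    def second_reconstruction_module_rb_at_g(self, raw, i2, j2):
        # R/B values at a G pixel
        return second_interpolations(raw, i2, j2)

    def second_reconstruction_module_diag(self, raw, g_plane, i3, j3):
        # B at R pixels / R at B pixels via the 45/135-degree step
        a45, a135, d45, d135 = diag_interpolations_and_gradients(raw, g_plane, i3, j3)
        return reconstruct_diag(a45, a135, d45, d135, self.t2)
```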
  • the above technical solution reduces false color and moiré, thereby improving the visual quality of the image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Color Television Image Signal Generators (AREA)

Abstract

The invention relates to a method and device for acquiring RGB data. The device comprises: an acquisition module configured to acquire a first horizontal interpolation result and a first vertical interpolation result of pixels corresponding to an R channel/B channel in an image, and to acquire a horizontal gradient and a vertical gradient of the pixels corresponding to the R channel/B channel in the image; a first reconstruction module configured to determine that the absolute value of a difference between the acquired horizontal gradient and vertical gradient is greater than 0 and less than a first preset threshold, and to reconstruct a pixel value of the green G channel of the pixels corresponding to the R channel/B channel in the image according to a weighted average of the acquired first horizontal interpolation result and first vertical interpolation result; and a second reconstruction module configured to reconstruct a pixel value of the R channel/B channel of the pixels corresponding to the G channel in the image, and to reconstruct a pixel value of the B channel of the pixels corresponding to the R channel in the image and a pixel value of the R channel of the pixels corresponding to the B channel in the image according to the reconstructed pixel value of the G channel of the pixels corresponding to the R channel and the B channel in the image.
PCT/CN2016/098318 2015-09-07 2016-09-07 Procédé et dispositif d'acquisition de données rvb WO2017041714A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510562752.1A CN105160628B (zh) 2015-09-07 2015-09-07 一种获取rgb数据的方法和装置
CN201510562752.1 2015-09-07

Publications (1)

Publication Number Publication Date
WO2017041714A1 true WO2017041714A1 (fr) 2017-03-16

Family

ID=54801471

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/098318 WO2017041714A1 (fr) 2015-09-07 2016-09-07 Procédé et dispositif d'acquisition de données rvb

Country Status (2)

Country Link
CN (1) CN105160628B (fr)
WO (1) WO2017041714A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105160628B (zh) * 2015-09-07 2018-09-14 努比亚技术有限公司 一种获取rgb数据的方法和装置
CN105578160A (zh) * 2015-12-23 2016-05-11 天津天地伟业数码科技有限公司 一种基于fpga平台的高清晰度去马赛克插值方法
CN108615227B (zh) * 2018-05-08 2021-02-26 浙江大华技术股份有限公司 一种图像摩尔纹的抑制方法及设备
CN112218062B (zh) * 2020-10-12 2022-04-22 Oppo广东移动通信有限公司 图像缩放装置、电子设备、图像缩放方法及图像处理芯片
CN112712467B (zh) * 2021-01-11 2022-11-11 郑州科技学院 基于计算机视觉与色彩滤波阵列的图像处理方法
CN113259635B (zh) * 2021-06-15 2021-10-01 珠海亿智电子科技有限公司 去马赛克方法、装置、设备及存储介质
CN113242413B (zh) * 2021-07-12 2021-09-21 深圳市艾为智能有限公司 抗锯齿的rccb滤镜阵列插值计算方法及系统
CN114004769B (zh) * 2021-12-30 2022-03-15 江苏游隼微电子有限公司 一种基于离散权重的Bayer去噪颜色插值方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090066821A1 (en) * 2007-09-07 2009-03-12 Jeffrey Matthew Achong Method And Apparatus For Interpolating Missing Colors In A Color Filter Array
US20100182466A1 (en) * 2009-01-16 2010-07-22 Samsung Digital Imaging Co., Ltd. Image interpolation method and apparatus using pattern characteristics of color filter array
CN104240182A (zh) * 2013-06-06 2014-12-24 富士通株式会社 图像处理装置、图像处理方法以及电子设备
CN104732561A (zh) * 2013-12-18 2015-06-24 展讯通信(上海)有限公司 一种图像的插值方法及装置
CN105160628A (zh) * 2015-09-07 2015-12-16 努比亚技术有限公司 一种获取rgb数据的方法和装置

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5603676B2 (ja) * 2010-06-29 2014-10-08 オリンパス株式会社 画像処理装置及びプログラム
CN102630019B (zh) * 2012-03-27 2014-09-10 上海算芯微电子有限公司 去马赛克的方法和装置
CN103561255B (zh) * 2013-10-24 2016-01-27 洪丹 一种裸眼立体显示方法

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090066821A1 (en) * 2007-09-07 2009-03-12 Jeffrey Matthew Achong Method And Apparatus For Interpolating Missing Colors In A Color Filter Array
US20100182466A1 (en) * 2009-01-16 2010-07-22 Samsung Digital Imaging Co., Ltd. Image interpolation method and apparatus using pattern characteristics of color filter array
CN104240182A (zh) * 2013-06-06 2014-12-24 富士通株式会社 图像处理装置、图像处理方法以及电子设备
CN104732561A (zh) * 2013-12-18 2015-06-24 展讯通信(上海)有限公司 一种图像的插值方法及装置
CN105160628A (zh) * 2015-09-07 2015-12-16 努比亚技术有限公司 一种获取rgb数据的方法和装置

Also Published As

Publication number Publication date
CN105160628B (zh) 2018-09-14
CN105160628A (zh) 2015-12-16

Similar Documents

Publication Publication Date Title
WO2017041714A1 (fr) Procédé et dispositif d'acquisition de données rvb
WO2018019124A1 (fr) Procédé de traitement d'image et dispositif électronique et support d'informations
WO2017050115A1 (fr) Procédé de synthèse d'image
WO2017020836A1 (fr) Dispositif et procédé pour traiter une image de profondeur par estompage
US8780258B2 (en) Mobile terminal and method for generating an out-of-focus image
WO2017067526A1 (fr) Procédé d'amélioration d'image et terminal mobile
WO2016180325A1 (fr) Procédé et dispositif de traitement d'images
WO2016058458A1 (fr) Procédé de gestion de la quantité d'électricité d'une batterie, terminal mobile et support de stockage informatique
WO2018019128A1 (fr) Procédé de traitement d'image de scène de nuit et terminal mobile
WO2017071500A1 (fr) Antenne et terminal mobile
WO2017071481A1 (fr) Terminal mobile et procédé de mise en œuvre d'écran divisé
WO2017088629A1 (fr) Procédé et appareil de suppression de bruit de couleurs d'image, terminal mobile et support d'informations
WO2017143855A1 (fr) Dispositif doté d'une fonction de capture d'écran et procédé de capture d'écran
WO2017071542A1 (fr) Procédé et appareil de traitement d'image
WO2017071310A1 (fr) Système, dispositif et procédé d'appels vidéo
WO2018076938A1 (fr) Procédé et dispositif de traitement d'image et support de mise en mémoire informatique
WO2017008722A1 (fr) Procédé de communication basé sur la séparation intranet-extranet, serveur et système
WO2017071476A1 (fr) Procédé et dispositif de synthèse d'image et support de stockage
WO2017071475A1 (fr) Procédé de traitement d'image, et terminal et support d'informations
WO2016070681A1 (fr) Procédé et dispositif pour régler une température de couleur d'un écran, et support de stockage informatique
WO2017143854A1 (fr) Terminal mobile, procédé de commande de volume associé, et support de stockage lisible par ordinateur
WO2017071532A1 (fr) Procédé et appareil de prise de selfie de groupe
WO2017071592A1 (fr) Procédé et appareil de mise au point, procédé et appareil de photographie
WO2017071469A1 (fr) Terminal mobile, procédé de capture d'image et support d'enregistrement informatique
WO2017113893A1 (fr) Procédé et dispositif de recherche de réseau

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16843652

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 23.07.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16843652

Country of ref document: EP

Kind code of ref document: A1