WO2017067523A1 - Image processing method, apparatus and mobile terminal - Google Patents


Info

Publication number
WO2017067523A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
depth
correction
original image
eyepiece
Application number
PCT/CN2016/103062
Other languages
English (en)
French (fr)
Inventor
张登康
黄德文
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Application filed by 努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Publication of WO2017067523A1

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H04N 23/84: Camera processing pipelines; Components thereof for processing colour signals
    • H04N 23/88: Camera processing pipelines; Components thereof for processing colour signals for colour balance, e.g. white-balance circuits or colour temperature control
    • H04N 9/00: Details of colour television systems
    • H04N 9/64: Circuits for processing colour signals
    • H04N 9/73: Colour balance circuits, e.g. white balance circuits or colour temperature control

Description

  • Embodiments of the present invention relate to, but are not limited to, the field of mobile terminal technologies, and in particular, to an image processing method and apparatus, and a mobile terminal.
  • The camera function of mobile phones is now in widespread use, and photography has gradually become one of the most important features of a smartphone, as well as a function most users rely on every day to capture the interesting moments of daily life.
  • Users therefore also expect the camera function of a mobile phone to deliver better results.
  • Although various photo-editing applications can be used to retouch pictures taken with a phone camera and improve their appearance, the camera function is limited by the hardware and cannot record depth information well. Post-processing software cannot compensate for the shooting limitations caused by the missing depth information, which undoubtedly restricts further improvement of the camera's performance.
  • the embodiments of the present invention provide an image processing method, an image processing apparatus, and a mobile terminal, offering users a new image processing mode and a better experience.
  • An embodiment of the present invention provides an image processing apparatus, including:
  • a photographing unit configured to capture a plurality of images through a plurality of eyepieces;
  • an acquiring unit configured to acquire an original image according to the plurality of captured images, and depth information corresponding to the original image;
  • a determining unit configured to determine correction information corresponding to the depth information according to a preset depth correction matching relationship;
  • a correction unit configured to process the original image according to the correction information and acquire a corresponding output image.
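Taken together, the four units form a capture, acquire, determine, correct pipeline. The following sketch is illustrative only: the class name, the placeholder depth map, and the channel-shift correction are assumptions rather than the patent's implementation.

```python
import numpy as np

# Hypothetical sketch of the four-unit pipeline described above.
# All names and numeric values are invented for illustration.

class ImagePipeline:
    def __init__(self, a1=-8.0, b1=1200.0):
        # a1, b1: preset parameters of the depth -> color-temperature mapping
        self.a1, self.b1 = a1, b1

    def acquire(self, left, right):
        """Acquiring unit: derive the 'original image' and a depth map.
        Here we simply average the two views and use a constant fake depth."""
        original = (left.astype(np.float64) + right) / 2.0
        depth = np.full(original.shape[:2], 100.0)  # placeholder depth (cm)
        return original, depth

    def determine(self, depth):
        """Determining unit: per-pixel correction info K = a1*S + b1."""
        return self.a1 * depth + self.b1

    def correct(self, original, correction):
        """Correction unit: apply the per-pixel adjustment (illustrative:
        shift the red channel by a scaled amount of the correction)."""
        out = original.copy()
        out[..., 0] += correction * 1e-3
        return np.clip(out, 0, 255)

left = np.zeros((4, 4, 3))
right = np.ones((4, 4, 3)) * 10
pipe = ImagePipeline()
orig, depth = pipe.acquire(left, right)
out = pipe.correct(orig, pipe.determine(depth))
```

Any real implementation would replace the placeholder depth map with stereo-derived depth and the channel shift with a proper white-balance adjustment.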
  • the eyepiece comprises a first eyepiece and a second eyepiece
  • the plurality of images include: a first image obtained by the first eyepiece, and a second image obtained by the second eyepiece;
  • the obtaining unit includes:
  • the original image obtaining module is configured to acquire a corresponding original image according to the first image and the second image;
  • the depth information acquiring module is configured to obtain depth information corresponding to each pixel in the original image.
  • the distance between the first eyepiece and the second eyepiece is related to a precision range of the depth information.
  • the original image obtaining module includes:
  • a corrected image acquisition sub-module configured to acquire, according to the first image and the second image respectively, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, where the first corrected image and the second corrected image are aligned in the x-axis direction;
  • an original image acquisition sub-module configured to use the common portion of the first corrected image and the second corrected image as the original image.
  • the first corrected image and the second corrected image are acquired by means of a stereo rectification (stereo correction) algorithm.
  • the depth information acquiring module includes:
  • a disparity map acquisition sub-module configured to acquire a disparity map corresponding to the original image from the first corrected image and the second corrected image by using a stereo matching algorithm;
  • a depth information acquisition sub-module configured to obtain the corresponding depth information according to the disparity map.
  • the disparity map is two-dimensional data, and its pixel points are in one-to-one correspondence with the pixel points in the original image.
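Because each pixel of this map corresponds one-to-one with a pixel of the original image, per-pixel depth follows from the standard stereo relation Z = f·B/d (focal length times baseline over disparity). A numpy sketch, with the focal length and baseline values invented for illustration:

```python
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_mm):
    """Convert a disparity map (pixels) to a depth map (mm) via Z = f*B/d.
    Zero disparity means 'no match'; mark those pixels as invalid (inf)."""
    disparity = np.asarray(disparity, dtype=np.float64)
    depth = np.full(disparity.shape, np.inf)
    valid = disparity > 0
    depth[valid] = focal_px * baseline_mm / disparity[valid]
    return depth

# Example: 700 px focal length, 20 mm baseline (illustrative values).
disp = np.array([[14.0, 7.0],
                 [0.0, 28.0]])
depth = disparity_to_depth(disp, focal_px=700.0, baseline_mm=20.0)
```

Note the inverse relationship: halving the disparity doubles the computed depth, which is why distant objects (small disparity) have coarser depth resolution.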
  • the correction information is a color temperature adjustment value
  • the depth information is a depth value
  • the determining unit includes a correction matching relationship setting module configured to set the depth correction matching relationship;
  • the depth correction matching relationship is set to: K = f(S) = a1 × S + b1, where:
  • S is the depth value;
  • K is the color temperature adjustment value;
  • f(S) is the transformation function corresponding to the depth correction matching relationship;
  • a1 and b1 are preset parameters.
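Reading the depth correction matching relationship as the linear form K = f(S) = a1 × S + b1 suggested by the two preset parameters, the correction can be computed and applied per pixel. In this sketch the parameter values and the warm/cool channel scaling are invented for illustration:

```python
import numpy as np

def color_temp_adjustment(depth, a1=-2.0, b1=300.0):
    """K = f(S) = a1*S + b1: per-pixel color temperature adjustment.
    With a1 < 0, small depths (foreground) get a positive (warm) K and
    large depths (background) a negative (cool) K."""
    return a1 * np.asarray(depth, dtype=np.float64) + b1

def apply_adjustment(image, k):
    """Illustrative warm/cool shift: push red up and blue down by K (scaled)."""
    out = image.astype(np.float64).copy()
    out[..., 0] += k * 0.05   # red channel
    out[..., 2] -= k * 0.05   # blue channel
    return np.clip(out, 0, 255)

depth = np.array([[50.0, 250.0]])   # foreground vs background (cm)
img = np.full((1, 2, 3), 128.0)
k = color_temp_adjustment(depth)
warmed = apply_adjustment(img, k)
```

Here the foreground pixel (depth 50) gets a positive K and is warmed, while the background pixel (depth 250) gets a negative K and is cooled, which is exactly the foreground/background separation effect the description aims at.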
  • the apparatus further includes a slider processing unit configured to acquire a slider adjustment value, adjust the range of the correction information corresponding to the acquired depth information according to the slider adjustment value, and process the original image according to the adjusted correction information.
  • An embodiment of the present invention further provides a mobile terminal, including the image processing apparatus according to any one of the preceding claims.
  • An embodiment of the present invention further provides an image processing method, including:
  • capturing a plurality of images through a plurality of eyepieces;
  • acquiring an original image according to the plurality of captured images, and depth information corresponding to the original image;
  • determining correction information corresponding to the depth information according to a preset depth correction matching relationship;
  • processing the original image according to the correction information, and acquiring a corresponding output image.
  • the eyepiece includes a first eyepiece and a second eyepiece; wherein the plurality of images comprises: a first image obtained by the first eyepiece, and a second image obtained by the second eyepiece;
  • acquiring the original image according to the plurality of captured images, and the depth information corresponding to the original image, includes: acquiring a corresponding original image according to the first image and the second image; and obtaining depth information corresponding to each pixel in the original image.
  • the distance between the first eyepiece and the second eyepiece is related to a precision range of the depth information.
  • acquiring the corresponding original image according to the first image and the second image includes:
  • acquiring, according to the first image and the second image respectively, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, where the first corrected image and the second corrected image are aligned in the x-axis direction; and
  • taking the common portion of the first corrected image and the second corrected image as the original image.
  • the first corrected image and the second corrected image are acquired by means of a stereo rectification (stereo correction) algorithm.
  • acquiring the depth information corresponding to each pixel in the original image includes:
  • acquiring a disparity map corresponding to the original image from the first corrected image and the second corrected image by using a stereo matching algorithm; and
  • obtaining the corresponding depth information according to the disparity map.
  • the disparity map is two-dimensional data, and its pixel points are in one-to-one correspondence with the pixel points in the original image.
  • the correction information is a color temperature adjustment value
  • the depth information is a depth value
  • the method further includes setting the depth correction matching relationship to: K = f(S) = a1 × S + b1, where:
  • S is the depth value;
  • K is the color temperature adjustment value;
  • f(S) is the transformation function corresponding to the depth correction matching relationship;
  • a1 and b1 are preset parameters.
  • the method further includes: acquiring a slider adjustment value; adjusting the range of the correction information corresponding to the acquired depth information according to the slider adjustment value; and processing the original image according to the adjusted correction information.
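One plausible reading of this step is that the slider value scales the range of the corrections before they are applied. A sketch assuming a 0 to 1 slider convention (invented here):

```python
import numpy as np

def rescale_correction(correction, slider):
    """Scale the per-pixel correction values by a slider value in [0, 1]:
    0 disables the adjustment entirely, 1 keeps the full preset range.
    (The 0-1 convention is an assumption for illustration.)"""
    slider = min(max(float(slider), 0.0), 1.0)  # clamp user input
    return np.asarray(correction, dtype=np.float64) * slider

k = np.array([200.0, -200.0, 50.0])   # correction values from the depth mapping
half = rescale_correction(k, 0.5)     # halve the warm/cool effect
```

Scaling all corrections by one factor preserves the relative foreground/background separation while letting the user tune its overall strength.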
  • the embodiment of the present invention further provides a computer readable storage medium storing computer executable instructions for performing the image processing method according to any one of the above.
  • With the embodiments of the present invention, depth information can be acquired and the image optimized according to that depth information. For example, by treating the foreground with warm tones and the background with cool tones, the foreground subject of the image becomes more prominent while the background visually recedes; the advantage of binocular-lens shooting is thus used to create a more precise separation of foreground and background, achieve the effect users expect, and improve the user experience.
  • FIG. 1 is a schematic structural diagram of hardware of a mobile terminal that implements various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 4 is a schematic view showing the arrangement of the first eyepiece F1 and the second eyepiece F2;
  • FIG. 5 is a schematic flowchart diagram of an image processing method according to an embodiment of the present invention.
  • the mobile terminal can be implemented in various forms.
  • the terminals described in the present invention may include mobile terminals such as mobile phones, smart phones, notebook computers, digital broadcast receivers, PDAs (Personal Digital Assistants), PADs (tablet computers), PMPs (Portable Multimedia Players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers.
  • in the following description, it is assumed that the terminal is a mobile terminal.
  • however, those skilled in the art will appreciate that configurations according to embodiments of the present invention can also be applied to fixed-type terminals, except for components intended specifically for mobile use.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal embodying various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an A/V (Audio/Video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, it may exist in the form of Digital Multimedia Broadcasting (DMB) Electronic Program Guide (EPG), Digital Video Broadcasting Handheld (DVB-H) Electronic Service Guide (ESG), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the MediaFLO (Media Forward Link Only) data broadcast system, and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T).
  • the broadcast receiving module 111 can be constructed as various broadcast systems suitable for providing broadcast signals as well as the above-described digital broadcast system.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies supported by the module may include WLAN (Wireless LAN, Wi-Fi), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technology include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee™, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is GPS (Global Positioning System).
  • the GPS module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current position information based on longitude, latitude, and altitude.
  • the method for calculating position and time information uses three satellites and corrects the calculated position and time information errors by using another satellite.
  • the GPS module 115 is capable of calculating speed information by continuously calculating current position information in real time.
  • the A/V input unit 120 is for receiving an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) via a microphone in an operation mode of a telephone call mode, a recording mode, a voice recognition mode, and the like, and can process such sound as audio data.
  • the processed audio (voice) data can be converted to a format transmittable to the mobile communication base station via the mobile communication module 112 in the case of a telephone call mode.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. caused by contact), a scroll wheel, a rocker, and the like. In particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of contact (i.e., touch input) by the user with the mobile terminal 100, the orientation of the mobile terminal 100, and acceleration or deceleration movement of the mobile terminal 100, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 provides power or whether the interface unit 170 is coupled to an external device.
  • Sensing unit 140 may include proximity sensor 1410 which will be described below in connection with a touch screen.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, and an audio input/output. (I/O) port, video I/O port, headphone port, and more.
  • the identification module may store various information for verifying the user's use of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 can be configured to receive input from an external device (e.g., data information, power, etc.) and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • In addition, the interface unit 170 may serve as a path through which power is supplied from a base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can serve as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 can display a user interface (UI) or a graphical user interface (GUI) related to a call or other communication (eg, text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a liquid crystal display (LCD), a thin film transistor LCD (TFT-LCD), an organic light emitting diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as a transparent display, and a typical transparent display may be, for example, a TOLED (Transparent Organic Light Emitting Diode) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide an output in the form of vibration, i.e., a tactile output, to notify the user. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying of event occurrence via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or XD memory), a random access memory (RAM), a static random access memory (SRAM), a read only memory (ROM), an electrically erasable programmable read only memory (EEPROM), a programmable read only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like. Additionally, the controller 180 can include a multimedia module 1810 for reproducing (or playing back) multimedia data, which can be constructed within the controller 180 or can be configured to be separate from the controller 180. The controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • the various embodiments described herein can be implemented in a computer readable medium using, for example, computer software, hardware, or any combination thereof.
  • the embodiments described herein may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, microcontrollers, microprocessors, and electronic units designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • a slide type mobile terminal among various types of mobile terminals such as folding, bar, swing, and slide type mobile terminals will be described as an example. However, the present invention can be applied to any type of mobile terminal and is not limited to a slide type mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • Such communication systems may use different air interfaces and/or physical layers.
  • air interfaces used by communication systems include, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), and Global System for Mobile Communications (GSM).
  • the following description relates to a CDMA communication system, but such teachings are equally applicable to other types of systems.
  • a CDMA wireless communication system can include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a public switched telephone network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 may include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), each of which is covered by a multi-directional antenna or an antenna directed to a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • a BS 270 can also be referred to as a Base Transceiver Station (BTS) or by other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station.”
  • each partition of a particular BS 270 may be referred to as a plurality of cellular stations.
  • a broadcast transmitter (BT) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • one or more Global Positioning System (GPS) satellites 300 help locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the GPS module 115 as shown in Figure 1 is typically configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC 275 provides call resource allocation and mobility management functions, including coordination of soft handoff procedures between BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • an embodiment of the present invention provides an image processing apparatus, where the image processing apparatus includes:
  • the photographing unit 10 is configured to capture images through a plurality of eyepieces to obtain a plurality of images;
  • the obtaining unit 20 is configured to acquire an original image of the photograph according to the captured plurality of images, and depth information corresponding to the original image;
  • the determining unit 30 is configured to acquire, according to the preset depth-correction matching relationship, the correction information corresponding to the depth information;
  • the correction unit 40 is configured to process the original image according to the correction information and acquire a corresponding output image.
  • the eyepiece includes a first eyepiece and a second eyepiece; as shown in FIG. 4, it is a schematic diagram of the arrangement of the first eyepiece F1 and the second eyepiece F2.
  • the distance between the first eyepiece F1 and the second eyepiece F2 needs to be set appropriately. If the binocular lenses are too close together, the depth information of objects cannot be computed effectively, although image quality can be improved; if they are too far apart, the obtainable object depth information is limited, and the structural design of the handset is also affected. It should be noted that the distance between the first eyepiece F1 and the second eyepiece F2 must strike a balance between the accuracy range of the depth information and the structural constraints; a suitable setting can be determined by the camera module manufacturer from experimental experience and may vary slightly between vendors according to their needs. The specific implementation is not intended to limit the protection scope of the invention.
  • the photographing unit acquires the first image through the first eyepiece and the second image through the second eyepiece;
  • the obtaining unit 20 includes:
  • the original image obtaining module 21 is configured to acquire a corresponding original image according to the first image and the second image;
  • the depth information obtaining module 22 is configured to acquire depth information corresponding to each pixel point in the original image.
  • the original image obtaining module 21 includes: an aligning sub-module and an original image acquiring sub-module;
  • the alignment sub-module is configured to acquire, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image; the first corrected image and the second corrected image are images that have been aligned in the x direction.
  • the original image acquisition sub-module is configured to use a common portion of the first corrected image and the second corrected image as the original image.
  • the first image and the second image are I_L and I_R, respectively, and the first corrected image and the second corrected image are I_L-adj and I_R-adj, respectively; rectifying with a stereo rectification algorithm yields a first corrected image I_L-adj and a second corrected image I_R-adj whose image contents differ only by a translation in the x direction.
  • the x direction refers to the direction of the line connecting the centers of the first eyepiece and the second eyepiece.
  • the common portion of the first corrected image I_L-adj and the second corrected image I_R-adj is the original image D_original.
  • a simple example: suppose the first image I_L is a 1024*800 image and the second image I_R is a 1024*800 image; the first corrected image I_L-adj and the second corrected image I_R-adj are likewise 1024*800 images. Because of the difference in position of the first eyepiece and the second eyepiece along the x axis, I_L-adj and I_R-adj cover correspondingly offset positions on the x axis. This difference is usually small, so the original image generated from their common portion becomes slightly smaller; for example, the original image D_original is a 1000*800 image, and the disparity map D corresponding to D_original is likewise 1000*800 image data.
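To make the 1024*800 → 1000*800 example concrete, here is a minimal numpy sketch of cropping the common portion of two rectified images; the 24-pixel offset and the function name `crop_common_region` are hypothetical, chosen only to reproduce the sizes above.

```python
import numpy as np

def crop_common_region(left_adj, right_adj, x_offset):
    """Keep only the columns covered by both rectified images.

    left_adj, right_adj: rectified images of equal shape (H, W).
    x_offset: horizontal shift (in pixels) of the right view relative
              to the left view along the eyepiece baseline (x direction).
    """
    w = left_adj.shape[1]
    # The left image loses `x_offset` columns on its left edge,
    # the right image loses the same number on its right edge.
    common_left = left_adj[:, x_offset:]
    common_right = right_adj[:, : w - x_offset]
    return common_left, common_right

# 1024*800 inputs with a hypothetical 24-pixel baseline offset
I_L_adj = np.zeros((800, 1024))
I_R_adj = np.zeros((800, 1024))
D_orig_L, D_orig_R = crop_common_region(I_L_adj, I_R_adj, 24)
print(D_orig_L.shape)  # (800, 1000) -- matches the 1000*800 original image
```

In practice the offset would come from the stereo rectification step itself rather than a fixed constant.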
  • the depth information acquisition module 22 includes: a disparity map acquisition sub-module and a depth information acquisition sub-module; wherein
  • the disparity map acquisition sub-module is configured to acquire, using the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm;
  • the depth information acquisition sub-module is configured to acquire the corresponding depth information from the disparity map.
  • using a stereo matching algorithm, the disparity map D corresponding to the original image D_original can be obtained.
  • the disparity map D is two-dimensional data, and its pixels correspond one-to-one with the pixels in the original image D_original.
  • from the disparity D[i, j] of any pixel in the disparity map D, the depth of field Z[i, j] of the corresponding pixel in the depth map Z is calculated with the following formula; computing Z[i, j] pixel by pixel yields the depth map Z corresponding to the original image D_original:
  • D[i, j] = x_l - x_r
  • Z[i, j] = f*B / D[i, j]
  • where f is the distance from the image planes of the two digital cameras to the principal plane in the stereoscopic imaging device, i.e., the focal length of the pinhole imaging model (here the two cameras are assumed to have identical parameters); B is the center distance between the left and right lenses; D is the disparity map corresponding to the original image; and x_l and x_r are the abscissas of the same scene point in the left and right camera images, respectively.
  • the depth map Z records the depth value corresponding to each pixel in the original image D_original.
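The depth-of-field computation above can be sketched in a few lines of numpy. The focal length f = 800 pixels, the baseline B = 6, and the sample disparities are illustrative values only, not ones given by the patent.

```python
import numpy as np

def depth_map_from_disparity(D, f, B, eps=1e-6):
    """Z[i,j] = f * B / D[i,j] for every pixel of the disparity map D.

    f : focal length of the pinhole model (in pixels)
    B : center distance between the left and right lenses
    """
    D = np.asarray(D, dtype=float)
    # Guard against zero disparity (points at infinity).
    return f * B / np.maximum(D, eps)

disparity = np.array([[10.0, 20.0],
                      [40.0, 80.0]])
Z = depth_map_from_disparity(disparity, f=800.0, B=6.0)
print(Z)  # doubling the disparity halves the depth
```

This is the standard triangulation relation for a rectified stereo pair; nearby objects produce large disparities and small depths, distant objects the reverse.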
  • acquiring the original image from the first image and the second image, and acquiring the disparity map corresponding to the original image, can be accomplished with algorithms in the prior art; the specific implementations are not intended to limit the protection scope of the embodiments of the present invention and are not described further here.
  • the correction information is a color temperature adjustment value; and the depth information is a depth value.
  • with the image processing apparatus provided by the present invention, the image can be optimized while or after the user shoots with the binocular lens; warm/cool toning of foreground and background can make the foreground subject of the image more prominent and push the background farther back.
  • exploiting the advantages of binocular lens shooting, this creates a more precise warm/cool separation of foreground and background, achieves the effect the user desires, and improves the user experience.
  • increasing the color temperature value shifts a region toward warm tones, and decreasing it shifts the region toward cool tones; with the depth information, the foreground image and the background image can be distinguished and treated differently.
  • generally, the foreground is the subject of the shot and the background is the backdrop of the shot.
  • to make the foreground subject of the image more prominent, it can be set to increase the color temperature value of the foreground image and reduce the color temperature value of the background image, achieving the effect of separating foreground from background.
  • the determining unit 30 includes a correction matching relationship setting module, configured to set the depth-correction matching relationship, where the depth-correction matching relationship is set as:
  • K = a1*S + b1; or, K = f(S);
  • S is the depth value, K is the color temperature adjustment value, f(S) is the transformation function corresponding to the depth-correction matching relationship, and a1 and b1 are preset parameters.
  • to achieve the effect of separating foreground from background, a1 can be set to a positive value; similarly, f(S) is set to an increasing function, so that the foreground is warmed and the background is cooled, and as the depth of field increases the color temperature adjustment becomes increasingly cool.
  • Table 1 below gives an example of the correspondence between the depth value S and the color temperature adjustment value K:

    Depth value S | Color temperature adjustment value K
    0.1 | -50K
    0.3 | 0K
    0.5 | 50K
    0.7 | 100K

    Table 1

  • as Table 1 shows, as the depth value S decreases, the color temperature adjustment value K falls from positive values to negative values, and the magnitude of the negative adjustment grows as S decreases further.
  • the mapping table shown in Table 1 can be pre-stored in the system; during image processing, the one-to-one mapping onto the warm/cool tone map is completed simply by looking up the mapping table by depth value.
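A Table 1-style lookup might be implemented as follows. The table rows are those of Table 1; the linear interpolation between rows is an assumption of this sketch, since the text only says the mapping table is searched by depth value.

```python
# Warm/cool tone lookup: depth value S -> color temperature adjustment K.
# Table entries follow Table 1; interpolating between rows is an assumption.
DEPTH_TO_K = [(0.1, -50.0), (0.3, 0.0), (0.5, 50.0), (0.7, 100.0)]

def color_temp_adjustment(s):
    """Return the adjustment K for depth value s, clamping outside the table."""
    depths = [d for d, _ in DEPTH_TO_K]
    ks = [k for _, k in DEPTH_TO_K]
    if s <= depths[0]:
        return ks[0]
    if s >= depths[-1]:
        return ks[-1]
    # Linear interpolation between the two surrounding table rows.
    for (d0, k0), (d1, k1) in zip(DEPTH_TO_K, DEPTH_TO_K[1:]):
        if d0 <= s <= d1:
            t = (s - d0) / (d1 - d0)
            return k0 + t * (k1 - k0)

print(color_temp_adjustment(0.3))  # 0.0 (exact table row)
print(color_temp_adjustment(0.6))  # ~75.0, midway between the 0.5 and 0.7 rows
```

Running this per pixel over the depth map is what produces the warm/cool gradient map described below.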
  • a specific example follows.
  • the range of color temperature can be set from 1000K to 8000K, i.e., from the warm tone corresponding to 1000K up to the cool tone at a color temperature of 8000K.
  • the mapping table also stores the correspondence between color temperature (or hue) and RGB color. Taking a range from the warm tone corresponding to 1000K up to the cool tone at 8000K as an example, the correspondence between color temperature (or hue) and RGB color can be set as:
  • 1000K corresponds to RGB 0xff3300
  • 1100K corresponds to RGB 0xff3800
  • 1200K corresponds to RGB 0xff5200
  • 1300K corresponds to RGB 0xff5d00
  • ……
  • 7900K corresponds to RGB 0xe7eaff
  • 8000K corresponds to RGB 0xe5e9ff
  • the above is only illustrative; the RGB values may also be computed in other ways to obtain the mapping table's commonly used correspondence between color temperature (or hue) and RGB color.
  • in addition, instead of storing the S-to-K correspondence as table entries, the relationship may be stored as a function such as K = a1*S + b1 or K = f(S).
  • once the functional correspondence between the depth value S and the color temperature adjustment value K has been set and stored, the color temperature adjustment value K for each pixel can be computed from that pixel's depth value in the original image, yielding the warm/cool gradient map corresponding to the original image; the warm/cool gradient map stores the color temperature adjustment value K corresponding to each pixel of the original image.
  • after the determining unit generates the warm/cool gradient map described above, the correction unit combines the image data of each pixel of the original image with the corresponding map information in the warm/cool map, processing that pixel's data to generate the corresponding composited image data D_processed.
  • as a concrete example of the compositing process: suppose a pixel D_original[i, j] has depth information value S[i, j]; the corresponding color temperature adjustment value K[i, j] is found from the table above according to S[i, j]; D_original[i, j] is then adjusted according to K[i, j], yielding the corresponding processed pixel D_processed[i, j].
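The text does not specify how K[i, j] numerically modifies D_original[i, j]. Purely as an assumption, the sketch below blends the pixel toward the mapping table's warm (1000K) or cool (8000K) RGB color with a strength proportional to |K|; the `k_max` normalization is likewise hypothetical.

```python
WARM_RGB = (0xFF, 0x33, 0x00)   # mapping-table color for 1000K
COOL_RGB = (0xE5, 0xE9, 0xFF)   # mapping-table color for 8000K

def adjust_pixel(rgb, k, k_max=100.0):
    """Blend a pixel toward warm (k > 0) or cool (k < 0) by |k| / k_max.

    The blend rule itself is an assumption of this sketch; the patent
    only states that D_original[i, j] is adjusted according to K[i, j].
    """
    target = WARM_RGB if k > 0 else COOL_RGB
    t = min(abs(k) / k_max, 1.0)
    return tuple(round((1 - t) * c + t * tc) for c, tc in zip(rgb, target))

print(adjust_pixel((128, 128, 128), 0))     # (128, 128, 128): no change
print(adjust_pixel((128, 128, 128), 100))   # (255, 51, 0): fully warm
```

Applying `adjust_pixel` to every pixel with its own K[i, j] yields D_processed.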
  • an embodiment of the present invention further includes a slider processing unit, where the slider processing unit is configured to acquire a slider adjustment value; and, according to the slider adjustment value, to adjust the range of the correction information corresponding to the acquired depth information and to process the original image according to the adjusted correction information.
  • a specific example follows.
  • the slider value H ranges from 1 to 100 and is converted so that H corresponds to 1/100.0 to 100/100.0.
  • that is, the slider's starting position is 1/100.0 and its ending position is 100/100.0.
  • the starting color temperature T1 and the ending color temperature T2 after slider adjustment can be calculated from the following formula:
  • T = H*(T_end - T_begin) + T_begin
  • assuming the whole slider range 1/100.0 to 100/100.0 corresponds to color temperatures of 1000K to 8000K respectively, the default starting color temperature and default ending color temperature are T_begin = 1000K and T_end = 8000K.
  • the value of H is the input value of the slider adjustment, ranging from 1/100.0 to 100/100.0.
  • Example 1: suppose the user drags the slider so that the starting position is 0.2 and the ending position is 0.88.
  • the corresponding starting color temperature value T1 is: T1 = 0.2*(T_end - T_begin) + T_begin
  • the ending color temperature value T2 is: T2 = 0.88*(T_end - T_begin) + T_begin
  • where T_begin is the default starting color temperature, 1000K, and T_end is the default ending color temperature, 8000K.
  • that is, according to the slider value H, the default color temperature range of 1000K to 8000K is adjusted to (0.2*(T_end - T_begin) + T_begin) to (0.88*(T_end - T_begin) + T_begin).
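The slider arithmetic of Example 1 can be checked directly; T_begin = 1000K, T_end = 8000K, and the slider positions 0.2 and 0.88 are the values used above.

```python
def slider_color_temp(h, t_begin=1000.0, t_end=8000.0):
    """T = H * (T_end - T_begin) + T_begin, with H in [1/100.0, 100/100.0]."""
    return h * (t_end - t_begin) + t_begin

T1 = slider_color_temp(0.2)   # starting position of the slider
T2 = slider_color_temp(0.88)  # ending position of the slider
print(T1, T2)  # 2400.0 7160.0 -- the adjusted range replacing 1000K..8000K
```

So dragging the slider narrows the working color temperature interval from [1000K, 8000K] to [2400K, 7160K] in this example.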
  • according to this adjusted range of color temperature values, the correspondence between the depth value S and the color temperature adjustment value K is adjusted accordingly, which further rescales the values of the warm/cool gradient map and thereby changes the effect of the image adjustment.
  • the embodiments of the present invention are described in terms of color temperature adjustment; in the process of setting up the mapping table, the warm/cool adjustment of the image's foreground and background may also be performed according to hue information.
  • based on the same or similar concepts as the above embodiments, an embodiment of the present invention further provides a mobile terminal, which includes any of the image processing apparatuses provided by the embodiments of the present invention.
  • FIG. 5 is a schematic flowchart of an image processing method according to an embodiment of the present invention. As shown in FIG. 5, the image processing method includes:
  • Step 100: capture images through a plurality of eyepieces to obtain a plurality of images;
  • Step 200: acquire, from the captured images, the original image of the shot and the depth information corresponding to the original image;
  • Step 400: acquire, according to the preset depth-correction matching relationship, the correction information corresponding to the depth information;
  • Step 500: process the original image according to the correction information, and acquire the corresponding output image.
  • the eyepiece includes a first eyepiece and a second eyepiece; wherein the first image is obtained by the first eyepiece, and the second image is obtained by the second eyepiece;
  • Obtaining the original image of the photographed according to the plurality of captured images in step 200, and the depth information corresponding to the original image includes:
  • Step 210 Acquire a corresponding original image according to the first image and the second image.
  • Step 220 Obtain depth information corresponding to each pixel in the original image.
  • step 210 the acquiring the corresponding original image according to the first image and the second image includes:
  • Step 211: acquire, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, respectively; the first corrected image and the second corrected image are images already aligned in the x direction;
  • Step 212 the common portion of the first corrected image and the second corrected image is taken as the original image.
  • step 220 the acquiring depth information corresponding to each pixel in the original image includes:
  • Step 221: acquire, using the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm;
  • Step 222: acquire the corresponding depth information from the disparity map.
  • the correction information is a color temperature adjustment value
  • the depth information is a depth value
  • the method further includes:
  • Step 300: set the depth-correction matching relationship.
  • it should be noted that there is no required ordering among step 300, step 200, and step 100.
  • the depth-correction matching relationship may be set as:
  • K = a1*S + b1; or, K = f(S);
  • where S is the depth value, K is the color temperature adjustment value, f(S) is the transformation function corresponding to the depth-correction matching relationship, and a1 and b1 are preset parameters.
  • in an embodiment of the present invention, the method further includes: acquiring a slider adjustment value; and, according to the acquired slider adjustment value, adjusting the range of the correction information corresponding to the acquired depth information and processing the original image according to the adjusted correction information.
  • the value of the slider value H can be set from 1/100.0 to 100/100.0, that is, the starting position of the slider is 1/100.0, and the ending position is 100/100.0.
  • assuming the default color temperature range is 1000K to 8000K, the slider range 1/100.0 to 100/100.0 corresponds to color temperatures of 1000K to 8000K, respectively;
  • the default starting color temperature T_begin and the default ending color temperature T_end are:
  • T_begin = 1000K; T_end = 8000K.
  • the starting color temperature T1 and the ending color temperature T2 after slider adjustment can be calculated from the formula: T = H*(T_end - T_begin) + T_begin;
  • suppose the user drags the slider, setting its starting position to 0.2 and its ending position to 0.88; the corresponding starting color temperature value T1 is then: T1 = 0.2*(T_end - T_begin) + T_begin
  • and the ending color temperature value T2 is: T2 = 0.88*(T_end - T_begin) + T_begin
  • according to the adjusted range of color temperature values, the correspondence between the depth value S and the color temperature adjustment value K is adjusted accordingly; the value range of K is 1000K to 8000K before the slider adjustment and T1 to T2 after it, which further rescales the values of the warm/cool gradient map and thereby changes the effect of the image adjustment.
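The adjustment of K's value range from 1000K–8000K to T1–T2 can be sketched as a linear remap; the exact remapping formula is an assumption, since the text only states that the correspondence between S and K is adjusted accordingly.

```python
def remap_k(k, t1, t2, k_min=1000.0, k_max=8000.0):
    """Linearly remap k from the default [k_min, k_max] onto [t1, t2]."""
    return (k - k_min) / (k_max - k_min) * (t2 - t1) + t1

# With the Example 1 slider positions (T1 = 2400K, T2 = 7160K):
print(remap_k(1000.0, 2400.0, 7160.0))  # 2400.0 -- old minimum -> new minimum
print(remap_k(8000.0, 2400.0, 7160.0))  # 7160.0 -- old maximum -> new maximum
print(remap_k(4500.0, 2400.0, 7160.0))  # 4780.0 -- midpoint maps to midpoint
```

Feeding the remapped K values back into the warm/cool gradient map is what narrows or widens the visible toning effect.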
  • from the description of the above embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware, though in many cases the former is the better implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored on a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) that includes a number of instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the various embodiments of the present invention.
  • with the image processing method and apparatus and the mobile terminal according to the embodiments of the present invention, when the user shoots with the binocular lens or afterwards, the image can be optimized according to the depth information: warm/cool toning of foreground and background makes the foreground subject of the image more prominent and pushes the background farther back. Exploiting the advantage of binocular lens shooting creates a more precise warm/cool separation of foreground and background, achieves the effect the user desires, and improves the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Telephone Function (AREA)

Abstract

An image processing method and apparatus, and a mobile terminal. With the technical solution of the embodiments of the present invention, while or after the user shoots with a binocular lens, the image can be optimized according to depth information: warm/cool toning of foreground and background makes the foreground subject more prominent and pushes the background farther back. Exploiting the advantages of binocular shooting creates a more precise warm/cool separation of foreground and background, achieves the effect the user desires, and improves the user experience.

Description

Image processing method and apparatus, and mobile terminal

Technical Field

Embodiments of the present invention relate to, but are not limited to, the field of mobile terminal technologies, and in particular to an image processing method and apparatus, and a mobile terminal.

Background

The photographing function of mobile phones is widely used; photography has gradually become one of the most important attributes of today's smartphones and a feature most users rely on every day, capturing interesting moments of daily life at any time — we take many photos with our phones every day.

With the wide use of phone cameras, users also expect better results from them. Although various photo-retouching applications can be applied to pictures taken by a phone camera to improve their appearance, the phone's camera is constrained by its hardware and cannot record depth information well; even post-processing with retouching software cannot compensate for the limitations in shooting effect caused by the missing depth information, which undoubtedly limits further improvement of phone photography.

Summary

The following is an overview of the subject matter described in detail herein. This overview is not intended to limit the protection scope of the claims.

Embodiments of the present invention provide an image processing method and apparatus, and a mobile terminal, which can offer users a new way of processing images and a better user experience.
An embodiment of the present invention provides an image processing apparatus, including:

a photographing unit, configured to capture images through a plurality of eyepieces to obtain a plurality of images;

an acquiring unit, configured to acquire, from the captured images, the original image of the shot and the depth information corresponding to the original image;

a determining unit, configured to determine, according to a preset depth-correction matching relationship, the correction information corresponding to the depth information;

a correction unit, configured to process the original image according to the correction information and acquire the corresponding output image.

Optionally, the eyepieces include a first eyepiece and a second eyepiece;

the plurality of images includes a first image obtained through the first eyepiece and a second image obtained through the second eyepiece;

the acquiring unit includes:

an original image acquisition module, configured to acquire the corresponding original image from the first image and the second image;

a depth information acquisition module, configured to acquire the depth information corresponding to each pixel in the original image.

Optionally, the distance between the first eyepiece and the second eyepiece is related to the accuracy range of the depth information.

Optionally, the original image acquisition module includes:

an alignment sub-module, configured to acquire, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, the first and second corrected images being images already aligned in the x-axis direction;

an original image acquisition sub-module, configured to take the common portion of the first corrected image and the second corrected image as the original image.

Optionally, the method of acquiring the first corrected image and the second corrected image is a stereo rectification algorithm.

Optionally, the depth information acquisition module includes:

a disparity map acquisition sub-module, configured to acquire, using the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm;

a depth information acquisition sub-module, configured to acquire the corresponding depth information from the disparity map.

Optionally, the disparity map is two-dimensional data whose pixels correspond one-to-one with the pixels of the original image.

Optionally, the correction information is a color temperature adjustment value and the depth information is a depth value;

the determining unit includes a correction matching relationship setting module, configured to set the depth-correction matching relationship, where the depth-correction matching relationship is set as:

K = a1*S; or, K = f(S) + b1;

where S is the depth value, K is the color temperature adjustment value, f(S) is the transformation function corresponding to the depth-correction matching relationship, and a1 and b1 are preset parameters.

Optionally, the apparatus further includes a slider processing unit, configured to acquire a slider adjustment value, adjust the range of the correction information corresponding to the acquired depth information according to the slider adjustment value, and process the original image according to the adjusted correction information.

An embodiment of the present invention further provides a mobile terminal including any of the image processing apparatuses described above.

An embodiment of the present invention further provides an image processing method, including:

capturing images through a plurality of eyepieces to obtain a plurality of images;

acquiring, from the captured images, the original image of the shot and the depth information corresponding to the original image;

acquiring, according to a preset depth-correction matching relationship, the correction information corresponding to the depth information;

processing the original image according to the correction information and acquiring the corresponding output image.

Optionally, the eyepieces include a first eyepiece and a second eyepiece; the plurality of images includes a first image obtained through the first eyepiece and a second image obtained through the second eyepiece;

acquiring the original image of the shot and the corresponding depth information from the captured images includes:

acquiring the corresponding original image from the first image and the second image;

acquiring the depth information corresponding to each pixel in the original image.

Optionally, the distance between the first eyepiece and the second eyepiece is related to the accuracy range of the depth information.

Optionally, acquiring the corresponding original image from the first image and the second image includes:

acquiring, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, the first and second corrected images being images already aligned in the x-axis direction;

taking the common portion of the first corrected image and the second corrected image as the original image.

Optionally, the method of acquiring the first corrected image and the second corrected image is a stereo rectification algorithm.

Optionally, acquiring the depth information corresponding to each pixel in the original image includes:

acquiring, using the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm;

acquiring the corresponding depth information from the disparity map.

Optionally, the disparity map is two-dimensional data whose pixels correspond one-to-one with the pixels of the original image.

Optionally, the correction information is a color temperature adjustment value and the depth information is a depth value;

before acquiring, according to the preset depth-correction matching relationship, the correction information corresponding to the depth information, the method further includes:

setting the depth-correction matching relationship, where the depth-correction matching relationship is set as:

K = a1*S + b1; or, K = f(S);

where S is the depth value, K is the color temperature adjustment value, f(S) is the transformation function corresponding to the depth-correction matching relationship, and a1 and b1 are preset parameters.

Optionally, the method further includes:

acquiring a slider adjustment value;

adjusting, according to the slider adjustment value, the range of the correction information corresponding to the acquired depth information, and processing the original image according to the adjusted correction information.

An embodiment of the present invention further provides a computer-readable storage medium storing computer-executable instructions for performing any of the image processing methods described above.
Compared with the prior art, with the technical solution provided by the embodiments of the present invention, while or after the user shoots with a binocular lens, depth information can be acquired and the image optimized according to it; for example, warm/cool toning of foreground and background makes the foreground subject more prominent and pushes the background farther back. Exploiting the advantages of binocular shooting creates a more precise warm/cool separation of foreground and background, achieves the effect the user desires, and improves the user experience.

Other aspects will become apparent upon reading and understanding the drawings and the detailed description.
Brief Description of the Drawings

The drawings described here are provided for a further understanding of the present invention and constitute a part of this application; the illustrative embodiments of the present invention and their description explain the invention and do not unduly limit it. In the drawings:

FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing the embodiments of the present invention;

FIG. 2 is a schematic diagram of the wireless communication system of the mobile terminal shown in FIG. 1;

FIG. 3 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of the arrangement of the first eyepiece F1 and the second eyepiece F2;

FIG. 5 is a schematic flowchart of an image processing method according to an embodiment of the present invention.

The realization of the objects, the functional features, and the advantages of the present invention are further described with reference to the accompanying drawings in conjunction with the embodiments.
Preferred Embodiments of the Invention

To facilitate understanding by those skilled in the art, the present invention is further described below with reference to the accompanying drawings; this description is not intended to limit the protection scope of the invention. It should be noted that, where no conflict arises, the embodiments of this application and the various features within them may be combined with one another.

It should be understood that the specific embodiments described here are only used to explain the present invention and are not intended to limit it.

Mobile terminals implementing the embodiments of the present invention will now be described with reference to the drawings. In the following description, suffixes such as 'module', 'component', or 'unit' used for elements are given only to facilitate the description of the invention and have no specific meaning in themselves; 'module' and 'component' may therefore be used interchangeably.

Mobile terminals may be implemented in various forms. For example, the terminals described in the present invention may include mobile terminals such as mobile phones, smartphones, notebook computers, digital broadcast receivers, PDAs (personal digital assistants), PADs (tablet computers), PMPs (portable multimedia players), and navigation devices, as well as fixed terminals such as digital TVs and desktop computers. In the following, the terminal is assumed to be a mobile terminal; however, those skilled in the art will understand that, apart from elements used particularly for mobile purposes, the constructions according to the embodiments of the present invention can also be applied to fixed types of terminals.
FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing the embodiments of the present invention.

The mobile terminal 100 may include a wireless communication unit 110, an A/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. FIG. 1 shows a mobile terminal with various components, but it should be understood that not all of the illustrated components are required; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.

The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114, and a location information module 115.

The broadcast receiving module 111 receives broadcast signals and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits broadcast signals and/or broadcast-related information, or a server that receives previously generated broadcast signals and/or broadcast-related information and transmits them to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like, and may further include a broadcast signal combined with a TV or radio broadcast signal. Broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms, for example as an electronic program guide (EPG) of digital multimedia broadcasting (DMB) or an electronic service guide (ESG) of digital video broadcasting-handheld (DVB-H). The broadcast receiving module 111 may receive signal broadcasts using various types of broadcast systems; in particular, it may receive digital broadcasts using digital broadcast systems such as digital multimedia broadcasting-terrestrial (DMB-T), digital multimedia broadcasting-satellite (DMB-S), digital video broadcasting-handheld (DVB-H), the MediaFLO data broadcast system, and integrated services digital broadcasting-terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit the above digital broadcast systems as well as other broadcast systems providing broadcast signals. Broadcast signals and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.

The wireless Internet module 113 supports wireless Internet access of the mobile terminal and may be internally or externally coupled to the terminal. The wireless Internet access technologies involved may include WLAN (wireless LAN, Wi-Fi), Wibro (wireless broadband), Wimax (worldwide interoperability for microwave access), HSDPA (high-speed downlink packet access), and the like.

The short-range communication module 114 is a module supporting short-range communication. Examples of short-range communication technologies include Bluetooth™, radio frequency identification (RFID), the Infrared Data Association (IrDA), ultra-wideband (UWB), ZigBee™, and the like.

The location information module 115 is a module for checking or acquiring location information of the mobile terminal, a typical example being GPS (Global Positioning System). According to current technology, the GPS module 115 calculates distance information from three or more satellites and accurate time information and applies triangulation to the calculated information, thereby accurately calculating three-dimensional current position information by longitude, latitude, and altitude. Currently, the method for calculating position and time information uses three satellites and corrects the error of the calculated position and time information using one further satellite. In addition, the GPS module 115 can calculate speed information by continuously calculating current position information in real time.

The A/V input unit 120 is used to receive audio or video signals and may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by the image capture device in a video capture mode or an image capture mode; the processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sound (audio data) in operating modes such as a phone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted for output into a format transmittable to a mobile communication base station via the mobile communication module 112. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to eliminate (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component detecting changes in resistance, pressure, capacitance, and so on caused by contact), a scroll wheel, a joystick, and the like. In particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen may be formed.

The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., its open or closed state), the position of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled with an external device. The sensing unit 140 may include a proximity sensor 141, which will be described below in connection with the touch screen.

The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for verifying the user's use of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, a device having an identification module (hereinafter, an 'identification device') may take the form of a smart card, so the identification device may be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and an external device.

In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. The various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is constructed to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner, and may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a user interface (UI) or graphical user interface (GUI) related to the call or other communication (e.g., text messaging, multimedia file downloading, etc.); when the mobile terminal 100 is in a video call mode or an image capture mode, it may display captured images and/or received images, or a UI or GUI showing video or images and related functions, and so on.

Meanwhile, when the display unit 151 and the touch pad are superposed on each other in the form of a layer to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a liquid crystal display (LCD), a thin-film transistor LCD (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, and a three-dimensional (3D) display. Some of these displays may be constructed to be transparent to allow viewing from the outside; these may be called transparent displays, a typical example being a TOLED (transparent organic light-emitting diode) display. According to the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display devices); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.

The audio output module 152 may, when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like, convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound. Moreover, the audio output module 152 may provide audio output related to a particular function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and so on.

The alarm unit 153 may provide output to notify the occurrence of an event of the mobile terminal 100. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different ways to notify the occurrence of an event; for example, it may provide output in the form of vibration. When a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to notify the user; by providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output notifying the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs of the processing and control operations executed by the controller 180, or may temporarily store data that has been or will be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on the various forms of vibration and audio signals output when a touch is applied to the touch screen.

The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, card-type memory (e.g., SD or DX memory, etc.), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic memory, a magnetic disk, an optical disk, and so on. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 via a network connection.

The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communication, video calls, and so on. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture-drawing input performed on the touch screen as characters or images.

The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the various elements and components.

The various embodiments described here may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described here may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, and an electronic unit designed to perform the functions described here; in some cases, such an implementation may be carried out in the controller 180. For a software implementation, an embodiment such as a procedure or a function may be implemented with a separate software module that permits at least one function or operation to be performed. Software code may be implemented by a software application (or program) written in any suitable programming language; the software code may be stored in the memory 160 and executed by the controller 180.

Up to this point, the mobile terminal has been described in terms of its functions. Below, for brevity, the slide-type mobile terminal among the various types of mobile terminals — such as folder-type, bar-type, swing-type, and slide-type — is described as an example; the present invention can therefore be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in FIG. 1 may be constructed to operate with communication systems, such as wired and wireless communication systems and satellite-based communication systems, that transmit data via frames or packets.

A communication system in which a mobile terminal according to the present invention can operate will now be described with reference to FIG. 2.

Such communication systems may use different air interfaces and/or physical layers. For example, the air interfaces used by communication systems include frequency division multiple access (FDMA), time division multiple access (TDMA), code division multiple access (CDMA), the universal mobile telecommunications system (UMTS) (in particular, long term evolution (LTE)), the global system for mobile communications (GSM), and so on. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to FIG. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is constructed to interface with the public switched telephone network (PSTN) 290, and also to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be constructed according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that a system as shown in FIG. 2 may include a plurality of BSCs 275.

Each BS 270 may serve one or more sectors (or areas), each sector being covered by an omnidirectional antenna or an antenna pointed in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be constructed to support a plurality of frequency assignments, each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).

The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. A BS 270 may also be referred to as a base transceiver subsystem (BTS) or by another equivalent term; in such a case, the term 'base station' may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a 'cell station', or each sector of a particular BS 270 may be referred to as a cell station.
Based on the mobile terminal hardware structure and communication system described above, the various embodiments of the method of the present invention are proposed.
如图3所示,本发明实施例提出一种图像处理装置,该图像处理装置包括:
拍照单元10,设置为通过多个目镜拍照得到多个图像;
获取单元20,设置为根据拍到的多个图像获取拍照的原始图像,以及原始图像对应的深度信息;
确定单元30,设置为根据预设的深度修正匹配关系,获取所述深度信息对应的修正信息;
修正单元40,设置为根据所述修正信息,对原始图像进行处理,并获取对应的输出图像。
本发明实施例中,所述目镜包括第一目镜和第二目镜;如图4所示,为第一目镜F1和第二目镜F2的设置示意图。第一目镜F1和第二目镜F2之间的距离需要经过合适的设定,双目镜头的距离太近的话,不能有效的计算出的物体的深度信息,但能提高图像质量;双目镜头的距离太远的话,得到的物体深度信息有限,同时影响手机结构的设置。需要说明的是,第一目镜F1和第二目镜F2之间的距离需要根据深度信息的精度范围以及结构上做出一个平衡,这个合适的设定可以由摄像头模组厂商根据实验经验值所定,不同厂商可能根据需求可能略有不同。具体实现并不用于限定本发明的保护范围。
拍照单元通过第一目镜获取第一图像,通过第二目镜获取第二图像;
所述获取单元20包括:
原始图像获取模块21,设置为根据所述第一图像和所述第二图像,获取对应的原始图像;
深度信息获取模块22,设置为获取原始图像中各个像素点对应的深度信息。
本发明实施例中,所述原始图像获取模块21包括:对齐子模块、原始图像获取子模块;其中,
对齐子模块,设置为根据第一图像和第二图像,分别获取第一图像对应的第一校正图像和第二图像对应的第二校正图像;其中,第一校正图像和第二校正图像为在x方向已经对齐的图像。
原始图像获取子模块,设置为将第一校正图像和第二校正图像的公共部分做为原始图像。
其中,第一图像和第二图像分别为IL和IR,第一校正图像和第二校正图像分别为IL-adj和IR-adj,利用立体校正算法校正,得到图像内容只有x方向平移的第一校正图像IL-adj和第二校正图像IR-adj
其中,x方向指第一目镜和第二目镜的中心的连线所在的方向。
第一校正图像IL-adj和第二校正图像IR-adj的公共部分为原始图像D原始
下面举一个简单的例子进行说明,假设第一图像IL为1024*800的图像,第二图像IR为1024*800的图像,第一校正图像IL-adj和第二校正图像IR-adj同样为1024*800的图像,由于第一目镜和第二目镜在x轴上位置的差异,第一校正图像IL-adj和第二校正图像IR-adj对应于x轴上具有对应差异的不同位置的图像,通常这个差异较小,因此,在生成原始图像之后,原始图像的尺寸会变小,例如,原始图像D原始为1000*800的图像,同时,原始图像D原始对应的视察图D同样为1000*800的图像数据。
本发明实施例中,所述深度信息获取模块22包括:视察图获取子模块、深度信息获取子模块;其中,
视察图获取子模块,设置为利用所述第一校正图像和第二校正图像,通过立体匹配算法获取所述原始图像对应的视察图;
深度信息获取子模块,设置为根据所述视察图获取对应的深度信息。
利用立体匹配算法,可以获取原始图像D原始对应的视察图D。
视察图D是一个二维数据,视察图D中像素点与原始图像D原始中的像素点一一对应。
由D中任意像素点的视差D[i,j],使用以下公式计算出景深图Z中对应的像素点的景深Z[i,j],逐一计算各个像素点对应的Z[i,j],从而得出原始图像D原始对应的景深图Z。
Figure PCTCN2016103062-appb-000001
从上述公式中可以推出:
Figure PCTCN2016103062-appb-000002
其中,f是立体成像装置中两个数码摄像头像平面到主平面的距离,即小孔成像模型的焦距(这里以两个摄像头参数一样为例进行说明),B是左右镜头之间的中心距离,D是原始图像对应的视察图。xl和xr分别为左右摄像头的横坐标位置。
景深图Z中记录了原始图像D原始中每个像素点对应的深度值。
其中,根据第一图像和第二图像获取原始图像,以及获取原始图像对应 的视察图可以通过现有技术中的算法完成,具体实现方式并不用于限定本发明实施例的保护范围,这里不再赘述。
本发明实施例中,所述修正信息为色温调节值;所述深度信息为深度值。
通过本发明提供的图像处理装置,用户使用双目镜头拍摄时或拍摄后,可以对图像进行优化的处理,而前后冷暖色调的处理可以使图像的前景主体物更突出,使得背景退得更远,利用双目镜头拍摄的优势,创造了更加精准的前后景分离调整冷暖,达到了用户期望的效果,提高了用户体验。
增加色温值,使得更加偏暖色调,减小色温值,使得更加偏冷色调,通过深度信息,可以区分前景图像和后景图像,并针对前景图像和后景图像作出不同的处理,例如,一般来说,前景为拍摄的主题图像,后景为拍摄的背景图像,为了使图像的前景主体物更突出,可以设置为增加前景图像的色温值,并减小后景图像色温值,从而达到前后景分离的效果。
所述确定单元30包括修正匹配关系设置模块,设置为对深度修正匹配关系进行设置,其中;
将所述深度修正匹配关系设置为:
K=a1*S+b1;或,K=f(S);
其中,S为深度值,K为色温调节值,f(S)为深度修正匹配关系对应的变换函数。其中a1和b1为预设的参数。
为了达到前后景分离的效果,可以将a1设置为正值,同样,f(S)设置为增函数,这样,使得前景边变暖,后景变冷,随着景深的加大,色温调节为偏冷的程度更大。
下面结合表1,给出一组深度值S和色温调节值K的对应关系的示例。
深度值S 色温调节值K
0.1 -50k
0.3 0k
0.5 50k
0.7 100K
表1
如表1所示,随着深度值S越来越小,色温调节值K逐渐从正值变为负值,并且随着深度值S的进一步增加,色温调节值K负值的绝对值也越来越大。
表1所示的映射表可以预先存储在系统中,在图像处理过程中,根据深度值直接查找该映射表即可完成对冷暖色调贴图的一一映射过程。
下面结合一个具体的示例进行说明。
色温的取值范围可以设置为1000K~8000K,即范围可由1000K对应的暖色调,直至色温为8000K的冷色调。
该映射表还存储有色温或者色调与RGB颜色之间的对应关系,以色温的范围可由1000K对应的暖色调,直至色温为8000K的冷色调为例,色温或者色调与RGB颜色之间的对应关系可以设置为:
1000K对应RGB 0xff3300
1100K对应RGB 0xff3800
1200K对应RGB 0xff5200
1300K对应RGB 0xff5d00
………
7900K对应RGB 0xe7eaff
8000K对应RGB 0xe5e9ff
以上举例仅为说明,也可由其他方式计算出RGB值,得到映射表中常用的色温或者色调与RGB颜色之间的对应关系。
此外,除了通过上述表格内容来存储深度值S和色温调节值K的对应关系,在实际处理过程中,也可以存储K=a1*S+b1;或K=f(S)之类的深度值S和色温调节值K的对应关系。
只要设置并存储了深度值S和色温调节值K的函数对应关系,那么,通过原始图像中各个像素点对应的深度值便可以计算得出各个像素点对应的色温调节值K,并获取原始图像对应的冷暖渐变贴图,冷暖渐变贴图中存储有原始图像中各个像素点对应的色温调节值K。
确定单元在生成上述冷暖渐变贴图之后,修正单元针对原始图像各像素点和冷暖贴图对应的贴图图像信息,合成/处理该像素点的图像数据,生成对应的合成效果图像数据D处理
对具体合成过程给一个例子。例如,一个像素点D原始[i,j]对应的深度信息值是S[i,j],根据S[i,j]从上述表格找出对应的色温调节值K[i,j],然后根据K[i,j]对D原始[i,j]进行调节,得到对应的处理后的像素点D处理[i,j]。
本发明实施例中,还包括滑块处理单元,所述滑块处理单元用于获取滑块调节值;以及,根据所述滑块调节值,对获取的深度信息对应的修正信息的范围进行调整,并根据调整后的修正信息对原始图像进行处理。
下面给出一个具体的示例,
滑块值H的取值为1~100,将滑块值H转换,将H对应转换为1/100.0到100/100.0。也就是说,滑块的起始位置为1/100.0,结束位置为100/100.0。
根据下面公式可以计算滑块调节后的开始色温T1和结束色温T2;
T=H*(Tend-Tbegin)+Tbegin
假设整个滑块范围为1/100.0到100/100.0,分别对应着1000K到8000K的色温,则默认开始色温Tbegin和默认结束色温Tend分别为:
Tbegin=1000K;
Tend=8000K
H的取值为滑块调节的输入值,范围为1/100.0到100/100.0。
实例一、
假设用户滑动滑块,起始位置为0.2,结束位置为0.88,
那么此时分别对应的
起始色温值T1为:
0.2*(Tend-Tbegin)+Tbegin
结束色温值T2为:
0.88*(Tend-Tbegin)+Tbegin
其中
Tbegin为默认开始色温1000K
Tend为默认结束色温8000K
也就是说,根据滑块值H将默认的色温方式按照以上由1000K到8000K调节到(0.2*(Tend-Tbegin)+Tbegin)K到(0.88*(Tend-Tbegin)+Tbegin
根据该调节后的色温值的取值范围,将对应调整深度值S和色温调节值K的对应关系,从而进一步调整冷暖渐变贴图的大小,从而改变图像调节的效果。
本发明实施例中,均以色温调节的方式进行说明,在设置映射表的过程中,也可根据色调信息执行对图像的前后景的冷暖调节。
基于与上述实施例相同或相似的构思,本发明实施例还提供一种移动终端,所述移动终端包括本发明实施例提供的任一种图像处理装置。
基于与上述实施例相同或相似的构思,本发明实施例还提供一种图像处理方法,参见图5,为本发明实施例提供的图像处理方法的流程示意图,如图5所示,所述图像处理方法包括:
步骤100,通过多个目镜拍照得到多个图像;
步骤200,根据拍到的多个图像获取拍照的原始图像,以及原始图像对应的深度信息;
步骤400,根据预设的深度修正匹配关系,获取所述深度信息对应的修正信息;
步骤500,根据所述修正信息,对原始图像进行处理,并获取对应的输出图像。
本发明实施例中,所述目镜包括第一目镜和第二目镜;其中,通过第一目镜获取第一图像,通过第二目镜获取第二图像;
步骤200中的根据拍到的多个图像获取拍照的原始图像,以及原始图像对应的深度信息包括:
步骤210,根据所述第一图像和所述第二图像,获取对应的原始图像;
步骤220,获取原始图像中各个像素点对应的深度信息。
本发明实施例中,步骤210中,所述根据所述第一图像和所述第二图像,获取对应的原始图像包括:
步骤211,根据第一图像和第二图像,分别获取第一图像对应的第一校正图像和第二图像对应的第二校正图像;其中,第一校正图像和第二校正图像为在x方向已经对齐的图像;
步骤212,将第一校正图像和第二校正图像的公共部分作为原始图像。
本发明实施例中,步骤220中,所述获取原始图像中各个像素点对应的深度信息包括:
步骤221,利用所述第一校正图像和第二校正图像,通过立体匹配算法获取所述原始图像对应的视察图;
步骤222,根据所述视察图获取对应的深度信息。
本发明实施例中,所述修正信息为色温调节值;所述深度信息为深度值;
在步骤400之前,还包括:
步骤300,对深度修正匹配关系进行设置。
需要说明,步骤300和步骤200以及步骤100之间没有先后顺序关系。
The depth-correction matching relation may be set as:
K=a1*S+b1; or, K=f(S);
where S is the depth value, K is the color-temperature adjustment value, f(S) is the transform function corresponding to the depth-correction matching relation, and a1 and b1 are preset parameters.
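A minimal sketch of the two forms of the matching relation named above; the parameter values and the particular f(S) chosen are placeholders, not values from the patent.

```python
def k_linear(s, a1=0.5, b1=1000.0):
    """Linear form K = a1*S + b1, with preset parameters a1 and b1."""
    return a1 * s + b1

def k_transform(s):
    """General form K = f(S); here f is a ramp clamped to [1000 K, 8000 K]."""
    return max(1000.0, min(8000.0, 1000.0 + 0.7 * s))
```

Any stored function of this kind lets the device turn a per-pixel depth value into a per-pixel color-temperature adjustment value.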
In an embodiment of the present invention, the method further includes:
obtaining a slider adjustment value;
adjusting, according to the obtained slider adjustment value, the range of the correction information corresponding to the obtained depth information, and processing the original image according to the adjusted correction information.
The slider value H may be set to range from 1/100.0 to 100/100.0; that is, the slider's start position is 1/100.0 and its end position is 100/100.0.
Assuming the default color-temperature range is 1000 K to 8000 K, the slider range 1/100.0 to 100/100.0 corresponds to color temperatures from 1000 K to 8000 K, and the default start color temperature Tbegin and the default end color temperature Tend are:
Tbegin=1000K;
Tend=8000K.
The start color temperature T1 and the end color temperature T2 after slider adjustment can be computed from the following formula:
T=H*(Tend-Tbegin)+Tbegin
Suppose the user moves the slider, setting its start position to 0.2 and its end position to 0.88.
The corresponding start color temperature T1 is then:
0.2*(Tend-Tbegin)+Tbegin
and the end color temperature T2 is:
0.88*(Tend-Tbegin)+Tbegin
According to this adjusted range of color-temperature values, the correspondence between the depth value S and the color-temperature adjustment value K is adjusted accordingly, the range of K being 1000 K to 8000 K before the slider adjustment and T1 to T2 after it; this in turn adjusts the span of the warm-cool gradient map and thereby changes the effect of the image adjustment.
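The text states that the K range of 1000 K to 8000 K becomes T1 to T2 after the slider is moved; a proportional (linear) rescale is one natural reading of that adjustment, assumed here since the exact rule is not given. T1 = 2400 K and T2 = 7160 K are the values from the example above.

```python
def rescale_k(k, t1=2400.0, t2=7160.0, k_min=1000.0, k_max=8000.0):
    """Map an adjustment value K from the default range [k_min, k_max]
    onto the slider-adjusted range [t1, t2]."""
    return (k - k_min) / (k_max - k_min) * (t2 - t1) + t1
```

The endpoints map to the endpoints and every intermediate K is compressed proportionally, shrinking the span of the warm-cool gradient map.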
It should be noted that, as used herein, the terms "comprise", "include", or any variant thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. In the absence of further limitation, an element defined by the phrase "including a ..." does not exclude the presence of additional identical elements in the process, method, article, or apparatus that includes the element.
The serial numbers of the above embodiments of the present invention are for description only and do not indicate the relative merits of the embodiments.
Through the description of the above embodiments, those skilled in the art will clearly understand that the methods of the above embodiments can be implemented by software plus a necessary general-purpose hardware platform, or by hardware, the former being the better implementation in many cases. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, can be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and including several instructions that cause a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to execute the methods described in the embodiments of the present invention.
The above are only preferred embodiments of the present invention and do not thereby limit its patent scope; any equivalent structural or process transformation made using the contents of the specification and drawings of the present invention, applied directly or indirectly in other related technical fields, falls likewise within the patent protection scope of the present invention.
Industrial applicability
With the image processing method, apparatus, and mobile terminal proposed by the embodiments of the present invention, a user shooting with a binocular lens can have the image optimized, during or after capture, according to depth information. Warm-cool processing of foreground and background makes the foreground subject stand out and pushes the background farther away. By exploiting the advantages of binocular shooting, more precise foreground-background warm-cool adjustment is achieved, the effect the user expects is produced, and the user experience is improved.

Claims (20)

  1. An image processing apparatus, comprising:
    a photographing unit, configured to capture multiple images through multiple eyepieces;
    an obtaining unit, configured to obtain, from the captured images, the original image and the depth information corresponding to the original image;
    a determining unit, configured to determine, according to a preset depth-correction matching relation, the correction information corresponding to the depth information;
    a correction unit, configured to process the original image according to the correction information and obtain the corresponding output image.
  2. The image processing apparatus according to claim 1, wherein the eyepieces include a first eyepiece and a second eyepiece;
    the multiple images include a first image obtained through the first eyepiece and a second image obtained through the second eyepiece; and
    the obtaining unit includes:
    an original-image obtaining module, configured to obtain the corresponding original image from the first image and the second image; and
    a depth-information obtaining module, configured to obtain the depth information corresponding to each pixel of the original image.
  3. The image processing apparatus according to claim 2, wherein the distance between the first eyepiece and the second eyepiece is related to the precision range of the depth information.
  4. The image processing apparatus according to claim 2, wherein the original-image obtaining module includes:
    an alignment submodule, configured to obtain, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, the first corrected image and the second corrected image being aligned in the x-axis direction; and
    an original-image obtaining submodule, configured to take the common part of the first corrected image and the second corrected image as the original image.
  5. The image processing apparatus according to claim 4, wherein the first corrected image and the second corrected image are obtained by a stereo rectification algorithm.
  6. The image processing apparatus according to claim 2, wherein the depth-information obtaining module includes:
    a disparity-map obtaining submodule, configured to obtain, from the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm; and
    a depth-information obtaining submodule, configured to obtain the corresponding depth information from the disparity map.
  7. The image processing apparatus according to claim 6, wherein the disparity map is two-dimensional data, the pixels in the disparity map being in one-to-one correspondence with the pixels in the original image.
  8. The image processing apparatus according to claim 1, wherein the correction information is a color-temperature adjustment value, and the depth information is a depth value;
    the determining unit includes a correction-matching-relation setting module, configured to set the depth-correction matching relation, wherein
    the depth-correction matching relation is set as:
    K=a1*S+b1; or, K=f(S);
    where S is the depth value, K is the color-temperature adjustment value, f(S) is the transform function corresponding to the depth-correction matching relation, and a1 and b1 are preset parameters.
  9. The image processing apparatus according to claim 8, further comprising a slider processing unit, configured to obtain a slider adjustment value, adjust, according to the slider adjustment value, the range of the correction information corresponding to the obtained depth information, and process the original image according to the adjusted correction information.
  10. A mobile terminal, comprising the image processing apparatus according to any one of claims 1 to 9.
  11. An image processing method, comprising:
    capturing multiple images through multiple eyepieces;
    obtaining, from the captured images, the original image and the depth information corresponding to the original image;
    obtaining, according to a preset depth-correction matching relation, the correction information corresponding to the depth information; and
    processing the original image according to the correction information, and obtaining the corresponding output image.
  12. The image processing method according to claim 11, wherein the eyepieces include a first eyepiece and a second eyepiece, and the multiple images include a first image obtained through the first eyepiece and a second image obtained through the second eyepiece;
    obtaining the original image and the corresponding depth information from the captured images includes:
    obtaining the corresponding original image from the first image and the second image; and
    obtaining the depth information corresponding to each pixel of the original image.
  13. The image processing method according to claim 11, wherein the distance between the first eyepiece and the second eyepiece is related to the precision range of the depth information.
  14. The image processing method according to claim 11, wherein obtaining the corresponding original image from the first image and the second image includes:
    obtaining, from the first image and the second image, a first corrected image corresponding to the first image and a second corrected image corresponding to the second image, the first corrected image and the second corrected image being aligned in the x-axis direction; and
    taking the common part of the first corrected image and the second corrected image as the original image.
  15. The image processing method according to claim 14, wherein the first corrected image and the second corrected image are obtained by a stereo rectification algorithm.
  16. The image processing method according to claim 11, wherein obtaining the depth information corresponding to each pixel of the original image includes:
    obtaining, from the first corrected image and the second corrected image, the disparity map corresponding to the original image through a stereo matching algorithm; and
    obtaining the corresponding depth information from the disparity map.
  17. The image processing method according to claim 16, wherein the disparity map is two-dimensional data, the pixels in the disparity map being in one-to-one correspondence with the pixels in the original image.
  18. The image processing method according to claim 11, wherein the correction information is a color-temperature adjustment value, and the depth information is a depth value;
    before obtaining, according to the preset depth-correction matching relation, the correction information corresponding to the depth information, the method further includes:
    setting the depth-correction matching relation;
    wherein the depth-correction matching relation is set as:
    K=a1*S+b1; or, K=f(S);
    where S is the depth value, K is the color-temperature adjustment value, f(S) is the transform function corresponding to the depth-correction matching relation, and a1 and b1 are preset parameters.
  19. The image processing method according to claim 18, further comprising:
    obtaining a slider adjustment value; and
    adjusting, according to the slider adjustment value, the range of the correction information corresponding to the obtained depth information, and processing the original image according to the adjusted correction information.
  20. A computer-readable storage medium storing computer-executable instructions for executing the image processing method according to any one of claims 11 to 19.
PCT/CN2016/103062 2015-10-22 2016-10-24 Image processing method and device, and mobile terminal WO2017067523A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201510690792.4 2015-10-22
CN201510690792.4A CN106612393B (zh) 2015-10-22 2015-10-22 Image processing method and device, and mobile terminal

Publications (1)

Publication Number Publication Date
WO2017067523A1 true WO2017067523A1 (zh) 2017-04-27

Family

ID=58556662

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/103062 WO2017067523A1 (zh) 2015-10-22 2016-10-24 Image processing method and device, and mobile terminal

Country Status (2)

Country Link
CN (1) CN106612393B (zh)
WO (1) WO2017067523A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109785226A (zh) * 2018-12-28 2019-05-21 维沃移动通信有限公司 Image processing method and device, and terminal equipment

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN107222734A (zh) * 2017-06-30 2017-09-29 联想(北京)有限公司 Image acquisition device and electronic apparatus
CN107707834B (zh) * 2017-09-11 2020-07-17 Oppo广东移动通信有限公司 Image processing method and device, electronic device, and computer-readable storage medium
CN107613224A (zh) * 2017-09-11 2018-01-19 广东欧珀移动通信有限公司 Image processing method and device, electronic device, and computer-readable storage medium
WO2019047985A1 (zh) 2017-09-11 2019-03-14 Oppo广东移动通信有限公司 图像处理方法和装置、电子装置和计算机可读存储介质
CN107948516A (zh) * 2017-11-30 2018-04-20 维沃移动通信有限公司 Image processing method and device, and mobile terminal
CN109697957B (zh) * 2019-01-07 2020-11-03 京东方科技集团股份有限公司 Image pixel correction method and system

Citations (4)

Publication number Priority date Publication date Assignee Title
CN102523464A (zh) * 2011-12-12 2012-06-27 上海大学 Depth image estimation method for binocular stereoscopic video
US20130101169A1 (en) * 2011-10-20 2013-04-25 Lg Innotek Co., Ltd. Image processing method and apparatus for detecting target
CN104506768A (zh) * 2014-11-28 2015-04-08 广东欧珀移动通信有限公司 Image selection method and device, and terminal
CN104639926A (zh) * 2013-11-11 2015-05-20 聚晶半导体股份有限公司 Method and device for processing image according to depth information

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
JP2008172523A (ja) * 2007-01-11 2008-07-24 Fujifilm Corp Multifocal camera apparatus, and control method and program used therefor
US8913154B2 (en) * 2010-02-04 2014-12-16 Canon Kabushiki Kaisha Image processing apparatus
KR101901184B1 (ko) * 2012-09-20 2018-09-21 삼성전자주식회사 Apparatus and method for processing color image using depth image

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
US20130101169A1 (en) * 2011-10-20 2013-04-25 Lg Innotek Co., Ltd. Image processing method and apparatus for detecting target
CN102523464A (zh) * 2011-12-12 2012-06-27 上海大学 Depth image estimation method for binocular stereoscopic video
CN104639926A (zh) * 2013-11-11 2015-05-20 聚晶半导体股份有限公司 Method and device for processing image according to depth information
CN104506768A (zh) * 2014-11-28 2015-04-08 广东欧珀移动通信有限公司 Image selection method and device, and terminal

Cited By (2)

Publication number Priority date Publication date Assignee Title
CN109785226A (zh) * 2018-12-28 2019-05-21 维沃移动通信有限公司 Image processing method and device, and terminal equipment
CN109785226B (zh) * 2018-12-28 2023-11-17 维沃移动通信有限公司 Image processing method and device, and terminal equipment

Also Published As

Publication number Publication date
CN106612393B (zh) 2019-10-15
CN106612393A (zh) 2017-05-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16856954

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16856954

Country of ref document: EP

Kind code of ref document: A1