WO2017206656A1 - Image processing method, terminal, and computer storage medium - Google Patents

Image processing method, terminal, and computer storage medium Download PDF

Info

Publication number
WO2017206656A1
WO2017206656A1 (PCT/CN2017/082941)
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
frame
data stream
reference frame
Prior art date
Application number
PCT/CN2017/082941
Other languages
French (fr)
Chinese (zh)
Inventor
戴向东 (DAI Xiangdong)
Original Assignee
努比亚技术有限公司 (Nubia Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201610377764.1A external-priority patent/CN105915796A/en
Priority claimed from CN201610375523.3A external-priority patent/CN105898159B/en
Application filed by Nubia Technology Co., Ltd. (努比亚技术有限公司)
Publication of WO2017206656A1 publication Critical patent/WO2017206656A1/en

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Definitions

  • the present invention relates to image processing technology in the field of photographing, and in particular to an image processing method and terminal, and a computer storage medium.
  • the camera function is one of the commonly used functions of the mobile terminal, and the camera function of some mobile terminals has an electronic aperture mode.
  • in the electronic aperture mode, after the user adjusts the aperture value, the mobile terminal shoots continuously and without interruption during the exposure time, then applies transparency processing to the captured images before superimposing them; the overall effect closely matches a real "slow shutter" effect, mainly highlighting the long exposure time, which can even reach bulb (B) exposure.
  • the algorithm used in the electronic aperture mode does not cause overexposure, and the overall effect of the picture is relatively natural.
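As a rough illustration of this idea (a minimal sketch, not the patented algorithm itself), a long exposure can be simulated by giving each of the N captured frames a transparency of 1/N and superimposing them, so the accumulated brightness never overexposes:

```python
import numpy as np

def electronic_aperture(frames):
    """Simulate a long exposure: weight each frame by 1/N and superimpose,
    so the accumulated image cannot overexpose.

    `frames` is a non-empty list of equally sized uint8 images (H x W [x C]).
    """
    if not frames:
        raise ValueError("need at least one frame")
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for f in frames:
        acc += f.astype(np.float64) / len(frames)  # transparency = 1/N
    return np.clip(np.rint(acc), 0, 255).astype(np.uint8)
```

Because each frame contributes only 1/N of its brightness, static scene content keeps its exposure while moving highlights (e.g. light trails) blend smoothly, which matches the "natural overall effect" described above.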
  • an embodiment of the present invention provides an image processing method, a terminal, and a computer storage medium.
  • An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • a registration unit configured to register image data of each frame in the image data stream
  • the fusion unit is configured to perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the acquiring unit includes:
  • a shooting subunit configured to capture an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of raw image data;
  • a pre-processing unit configured to pre-process each frame of the original image data stream to obtain the image data stream; wherein the pre-processing comprises at least one of the following: image filtering and contrast stretching.
  • the terminal further includes:
  • the prompting unit is configured to display a prompt box on the display interface, and the prompting box is configured to prompt the orientation of the handheld shaking when the electronic aperture is used in the handheld mode.
  • the registration unit is further configured to: use the first frame of image data in the image data stream as a reference frame, and align each frame of image data in the image data stream other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
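One simple way to realize such an alignment (an illustrative sketch under the assumption of purely translational handheld shake and single-channel frames; the patent does not prescribe this particular method) is to estimate a global translation between each frame and the first frame via FFT phase correlation, then shift the frame accordingly:

```python
import numpy as np

def estimate_shift(ref, img):
    """Estimate the integer (dy, dx) translation aligning `img` to `ref`
    using FFT-based phase correlation."""
    f = np.fft.fft2(ref) * np.conj(np.fft.fft2(img))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-9)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:          # unwrap: large positive peaks mean negative shift
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def register_to_first(frames):
    """Align every frame to the first frame (the reference frame) so that
    pixels at the same index correspond to the same spatial location."""
    ref = frames[0]
    aligned = [ref]
    for img in frames[1:]:
        dy, dx = estimate_shift(ref, img)
        aligned.append(np.roll(img, (dy, dx), axis=(0, 1)))
    return aligned
```

`np.roll` wraps pixels around the border; a real implementation would instead leave the uncovered border black, which is exactly the black-area problem the later embodiments address.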
  • the image data stream comprising a plurality of frames of image data
  • the image data of each frame after registration is subjected to fusion processing to obtain a target image.
  • the acquiring an image data stream by using an electronic aperture includes:
  • an original image data stream is obtained by taking an electronic aperture, the original image data stream comprising a plurality of frames of original image data;
  • the preprocessing comprises at least one of the following: image filtering, contrast stretching.
  • the method further includes:
  • a prompt box is displayed on the display interface; the prompt box is used to indicate the orientation of the handheld shake when the electronic aperture is used in the handheld mode.
  • the registering each frame image data in the image data stream includes:
  • the alignment refers to aligning pixel points of the same spatial position.
  • the performing fusion processing on the image data of each frame after registration includes:
  • Each pixel point of the image data of each frame after registration is superimposed according to the spatial position correspondence.
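Since registration guarantees that equal indices refer to equal spatial locations, the superposition step above reduces to a per-pixel average over the frame stack; a minimal sketch (assuming 8-bit frames):

```python
import numpy as np

def superimpose(aligned):
    """Superimpose registered frames pixel by pixel: because the frames are
    aligned, index (y, x) names the same spatial location in every frame,
    so the mean over the stack is the fused value at that location."""
    stack = np.stack([f.astype(np.float64) for f in aligned])
    return np.clip(np.rint(stack.mean(axis=0)), 0, 255).astype(np.uint8)
```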
  • An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • a registration unit configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
  • the merging unit is configured to perform merging processing on the image data of each frame after the registration, and replace the pixels in the black area at the image boundary with the pixels corresponding to the reference frame, and perform fusion processing to obtain a target image.
  • the fusion unit includes:
  • the analysis subunit is configured to analyze the image data of each frame after registration to determine a black area at a boundary of each frame image
  • the replacement and fusion subunit is configured to replace, in the registered image data of each frame, the pixels in the black area at the image boundary with the corresponding pixels of the reference frame, and then perform fusion processing to obtain a target image.
  • the replacing and merging subunit is further configured to: determine, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; replace the pixels of the black area with the pixels in the reference area of the reference frame; and then perform fusion processing to obtain the target image.
  • the registration unit is further configured to: use the first frame of image data in the image data stream as a reference frame, and align each frame of image data in the image data stream other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • the image data stream comprising a plurality of frames of image data
  • the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  • the pixel of the black area at the image boundary is replaced with the pixel corresponding to the reference frame, and then the fusion process is performed to obtain the target image, including:
  • the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  • the fusion process is performed to obtain the target image, including:
  • the pixel in the black area at the image boundary is replaced with the pixel in the reference area in the reference frame, and then the fusion processing is performed to obtain the target image.
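The border-filling step described above can be sketched as follows (an illustrative simplification: here a pixel counts as part of the black area simply when its value is zero, whereas a real implementation would derive the area from the registration warp geometry):

```python
import numpy as np

def fuse_with_border_fill(aligned, ref):
    """Fuse registered frames by averaging, after replacing the black border
    that registration leaves at each frame's boundary with the corresponding
    pixels from the reference frame, so border transitions stay natural."""
    fixed = []
    for img in aligned:
        black = img == 0                 # crude black-area detection
        out = img.copy()
        out[black] = ref[black]          # take reference pixels in the hole
        fixed.append(out.astype(np.float64))
    return np.clip(np.rint(sum(fixed) / len(fixed)), 0, 255).astype(np.uint8)
```

Without the replacement, the averaged border would be darkened by the zero pixels; filling from the reference frame keeps the pixel transition at every position of the image natural, as the embodiment claims.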
  • the determining a reference frame from the image data stream, and registering each frame image data of the image data stream except the reference frame with the reference frame includes:
  • the alignment refers to aligning pixel points of the same spatial position.
  • the performing fusion processing on the image data of each frame after registration includes:
  • Each pixel point of the image data of each frame after registration is superimposed according to the spatial position correspondence.
  • the terminal provided by the embodiment of the present invention includes: a camera and a processor;
  • the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor is configured to register image data of each frame in the image data stream, and perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the terminal provided by the embodiment of the present invention includes: a camera and a processor;
  • the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor is configured to: determine a reference frame from the image data stream; register each frame of image data in the image data stream other than the reference frame with the reference frame; and replace the pixels of the black area at the image boundary of the registered image data of each frame with the corresponding pixels of the reference frame, and then perform fusion processing to obtain a target image.
  • the computer storage medium provided by the embodiment of the present invention stores computer executable instructions configured to perform any of the image processing methods described.
  • an image data stream is acquired by using an electronic aperture, the image data stream including multi-frame image data; each frame of image data in the image data stream is registered; and the registered image data of each frame is subjected to fusion processing to obtain a target image.
  • the embodiment of the present invention adds a handheld mode to the electronic aperture, which improves the convenience of shooting, and introduces image registration to ensure the shooting effect and improve the user's shooting experience.
  • the embodiment of the present invention further adds a prompt box on the display interface to prevent the handheld terminal from shaking beyond the acceptable range while the user shoots.
  • an image data stream is acquired by using an electronic aperture, the image data stream including multi-frame image data; a reference frame is determined from the image data stream, and each frame of image data in the image data stream other than the reference frame is registered with the reference frame; the pixels of the black area at the image boundary of the registered image data of each frame are replaced with the corresponding pixels of the reference frame, and fusion processing is then performed to obtain a target image.
  • thus, the user can shoot with the electronic aperture while holding the terminal by hand, which improves the convenience of shooting, avoids the problem of blurred images caused by handheld shake, ensures the shooting effect, and improves the user's shooting experience.
  • when image fusion is performed, the black edges caused by image registration are processed, which ensures that the pixel transitions at every position of the entire image are natural.
  • FIG. 1 is a schematic structural diagram of hardware of an optional mobile terminal embodying various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention.
  • FIG. 4 is a schematic diagram of pixel point matching of an optical flow image
  • Figure 5 is a schematic diagram of a simple mobile phone motion model
  • FIG. 6 is a flowchart of aligning a multi-frame image by using an optical flow according to an embodiment of the present invention
  • FIG. 7 is a schematic flowchart diagram of an image processing method according to Embodiment 2 of the present invention.
  • FIG. 8 is a schematic diagram of an interface of a prompt box according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention.
  • FIG. 12 is a schematic flowchart diagram of an image processing method according to Embodiment 4 of the present invention.
  • FIG. 13 is a schematic diagram of image fusion according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention.
  • Figure 15 is a block diagram of the electrical structure of the camera.
  • A mobile terminal embodying various embodiments of the present invention will now be described with reference to the accompanying drawings.
  • suffixes such as “module,” “component,” or “unit” used to denote an element are merely illustrative of the embodiments of the present invention, and do not have a specific meaning per se. Therefore, “module” and “component” can be used in combination.
  • the mobile terminal can be implemented in various forms.
  • the terminal described in the embodiments of the present invention may include mobile devices such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like, as well as fixed terminals such as digital TVs, desktop computers, and the like.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal embodying various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190. and many more.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms, for example, in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 can receive digital broadcasts by using systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
  • the broadcast receiving module 111 can be configured to be suitable for various broadcast systems providing broadcast signals as well as the above-described digital broadcast systems.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits the radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies involved in the module may include wireless LAN (Wi-Fi/WLAN), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and the like.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technologies include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Zigbee, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is the Global Positioning System (GPS).
  • as a GPS module, the location information module 115 calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current location information based on longitude, latitude, and altitude.
  • typically, the method used to calculate position and time information uses three satellites and corrects the error of the calculated position and time information by using another satellite.
  • as a GPS module, the location information module 115 can also calculate speed information by continuously calculating current position information in real time.
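The triangulation mentioned above can be illustrated in two dimensions (a toy sketch, not the actual GPS algorithm): given squared-distance equations to three or more known anchor points, subtracting one equation from the others cancels the quadratic terms and yields a linear least-squares system for the receiver position:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Solve for a 2-D position from >= 3 anchor points and measured
    distances: linearize the circle equations |x - a_i|^2 = d_i^2 against
    the last anchor and solve the resulting least-squares system."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    ref = anchors[-1]
    # 2 (a_i - a_r) . x = d_r^2 - d_i^2 + |a_i|^2 - |a_r|^2
    A = 2 * (anchors[:-1] - ref)
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1) - np.sum(ref ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With more than three anchors the least-squares solution averages out measurement error, which is the role the "another satellite" plays in the correction described above.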
  • the A/V input unit 120 is for receiving an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capturing mode or an image capturing mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) in an operation mode such as a telephone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data.
  • the processed audio (voice) data can be converted to a format output that can be transmitted to the mobile communication base station via the mobile communication module 112 in the case of a telephone call mode.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like.
  • in particular, when the touch pad is overlaid on the display unit 151 in a layered manner, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates commands or signals for controlling the operation of the mobile terminal 100.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.
  • the sensing unit 140 may include a proximity sensor 141.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the identification module may store various information for verifying the use of the mobile terminal 100 by the user, and may include a User Identification Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • various command signals or power input from the base can be used as signals for recognizing whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the phone call mode, the display unit 151 can display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • some of these displays may be configured to be transparent to allow viewing of the exterior through them; such displays may be referred to as transparent displays, and a typical transparent display may be, for example, a Transparent Organic Light-Emitting Diode (TOLED) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibration: when a call, message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
  • the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides appropriate power required to operate the various components and components.
  • the various embodiments described herein can be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof.
  • for a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • implementations such as procedures or functions may be implemented with separate software modules that permit the execution of at least one function or operation.
  • the software code can be implemented by a software application (or program) written in any suitable programming language, which can be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • hereinafter, a slide type mobile terminal among various types of mobile terminals such as folder type, bar type, swing type, and slide type mobile terminals will be described as an example. However, the present invention can be applied to any type of mobile terminal, and is not limited to a slide type mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • a communication system in which a mobile terminal is operable according to an embodiment of the present invention will now be described with reference to FIG.
  • Such communication systems may use different air interfaces and/or physical layers.
  • the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like.
  • the following description relates to CDMA communication systems, but such teachings are equally applicable to other types of systems.
  • the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), with each partition covered by a multi-directional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell site."
  • alternatively, each partition of a particular BS 270 may be referred to as a cell site.
  • a broadcast transmitter (BT, Broadcast Transmitter) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • several satellites 300 are shown; for example, GPS satellites 300 may be employed. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the position information module 115 (e.g., a GPS module) as shown in FIG. 1 is generally configured to cooperate with the satellites 300 to obtain desired positioning information. Instead of, or in addition to, GPS tracking technology, other techniques that can track the location of the mobile terminal may be used. Additionally, at least one GPS satellite 300 may selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • the mobile communication module 112 of the wireless communication unit 110 in the mobile terminal accesses the mobile communication network (such as a 2G/3G/4G mobile communication network) based on necessary data built into the mobile terminal (including user identification information and authentication information), and transmits mobile communication data (including uplink and downlink mobile communication data) over the mobile communication network for services such as web browsing and network multimedia playback for the mobile terminal user.
  • the wireless internet module 113 of the wireless communication unit 110 implements the function of a wireless hotspot by running the related protocol functions of a wireless hotspot; the wireless hotspot supports access by a plurality of mobile terminals (any mobile terminal other than this one) and, by multiplexing the mobile communication connection between the mobile communication module 112 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for the web browsing, network multimedia playback, and other services of those mobile terminal users. Since the mobile terminal essentially transmits this mobile communication data over its own connection to the communication network, the mobile communication data traffic consumed by the mobile terminal is counted against the mobile terminal's communication tariff by the charging entity on the communication network side.
  • FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 3, the image processing method includes the following steps:
  • Step 301 Acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • in the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then fuses the multiple captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting. In the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream differs in illumination, noise, sharpness, and focus point.
  • the preprocessing includes image filtering to eliminate noise and contrast stretching to improve image sharpness and reduce illumination differences between images. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
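As an illustrative sketch only (not the patent's implementation), the two preprocessing steps named above, noise filtering and contrast stretching, can be written with plain NumPy; the 3x3 mean filter and the 2/98 percentile bounds are assumptions chosen for this example:

```python
import numpy as np

def mean_filter_3x3(img):
    """Simple 3x3 mean filter to suppress noise (edges handled by padding)."""
    padded = np.pad(img.astype(np.float64), 1, mode="edge")
    out = np.zeros_like(img, dtype=np.float64)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out += padded[1 + dy : 1 + dy + img.shape[0],
                          1 + dx : 1 + dx + img.shape[1]]
    return out / 9.0

def contrast_stretch(img, low_pct=2, high_pct=98):
    """Linearly stretch intensities between two percentiles to [0, 255]."""
    lo, hi = np.percentile(img, [low_pct, high_pct])
    if hi <= lo:
        return img.astype(np.float64)
    return np.clip((img - lo) * 255.0 / (hi - lo), 0, 255)
```

Applying both steps to each raw frame before registration reduces the per-frame differences the surrounding text describes.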
  • Step 302 Register each frame image data in the image data stream.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • the registration effect of this type of method can generally meet the requirements of many complex scenes, such as changes in illumination (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise).
  • the other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by random feature points.
  • the optical flow field can also be used as a point-based matching algorithm; optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observation imaging plane, found from the change of pixels in the time domain and the correlation between adjacent frames.
  • the purpose of studying the optical flow field is to approximate the motion field that cannot be directly obtained from the sequence of pictures.
  • the motion field is actually the motion of the object in the three-dimensional real world; the optical flow field is the projection of the motion field on the two-dimensional image plane (human eye or camera).
  • finding the motion velocity and direction of each pixel in each image yields the optical flow field.
  • I(x, y, t) is the pixel value of the image at position (x, y) at time t.
  • Vx and Vy are the x and y components of the optical flow vector of I(x, y, t), respectively.
  • Ix, Iy, and It are the partial derivatives of the image in the corresponding directions at (x, y, t). The optical flow constraint equation is therefore: Ix·Vx + Iy·Vy = −It.
  • the Lucas-Kanade optical flow method assumes that spatial pixel points move in unison: adjacent points in a scene are projected onto the image as neighboring points, and those neighboring points have the same velocity. This is a unique assumption of the Lucas-Kanade method, because the optical flow method has only one basic constraint equation but two unknowns (the velocities in the x and y directions). Assuming similar motion in the neighborhood of a feature point, the velocity in the x and y directions can be found from n equations (n is the total number of points in the neighborhood, including the feature point itself), giving the overdetermined system Ix(qi)·Vx + Iy(qi)·Vy = −It(qi), i = 1, …, n, which is solved in the least-squares sense.
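The least-squares solve described above can be sketched as follows; this is a minimal single-window illustration under the small-motion assumption (the window size and gradient scheme are illustrative choices, not the patent's implementation):

```python
import numpy as np

def lucas_kanade_window(frame0, frame1, y, x, half=2):
    """Estimate (Vx, Vy) at pixel (y, x) from the optical flow constraint
    Ix*Vx + Iy*Vy = -It, solved by least squares over a small window.
    Assumes the window lies fully inside both frames."""
    f0 = frame0.astype(np.float64)
    f1 = frame1.astype(np.float64)
    Iy_full, Ix_full = np.gradient(f0)   # spatial derivatives (rows, cols)
    It_full = f1 - f0                    # temporal derivative
    win = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    A = np.stack([Ix_full[win].ravel(), Iy_full[win].ravel()], axis=1)
    b = -It_full[win].ravel()
    v, *_ = np.linalg.lstsq(A, b, rcond=None)
    return v  # [Vx, Vy]
```

For example, on a horizontal intensity ramp shifted right by one pixel between frames, the solve recovers a flow of roughly one pixel in x and zero in y.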
  • the small-motion assumption mentioned in the above scheme does not hold when the target moves quickly; a multi-scale (pyramidal) approach can solve this problem.
  • Figure 6 shows the alignment process for multi-frame images using optical flow.
  • the sparse matching points between the two images can be obtained, and then the image mapping model is calculated using the coordinates of these points.
  • in the image alignment step, it is important to choose the correct image alignment transformation model.
  • Common spatial transformation models include affine transformation and perspective transformation models.
  • the affine transformation can be visually represented in the following form: any parallelogram in one plane can be mapped to another parallelogram by an affine transformation.
  • the mapping operation on the image is performed within the same spatial plane, and different transformation parameters deform it into different types of parallelograms.
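For illustration, the six affine parameters a00..a12 can be estimated from sparse matching points by linear least squares; this sketch assumes at least three non-collinear point pairs and is not the patent's implementation:

```python
import numpy as np

def estimate_affine(src_pts, dst_pts):
    """Least-squares affine transform mapping src_pts -> dst_pts.
    Each point pair gives two linear equations in the six parameters
    a00..a12; at least three non-collinear pairs are needed."""
    src = np.asarray(src_pts, dtype=np.float64)
    dst = np.asarray(dst_pts, dtype=np.float64)
    n = src.shape[0]
    A = np.zeros((2 * n, 6))
    b = dst.reshape(-1)          # [x0, y0, x1, y1, ...]
    A[0::2, 0:2] = src           # x' = a00*x + a01*y + a02
    A[0::2, 2] = 1.0
    A[1::2, 3:5] = src           # y' = a10*x + a11*y + a12
    A[1::2, 5] = 1.0
    params, *_ = np.linalg.lstsq(A, b, rcond=None)
    return params.reshape(2, 3)  # [[a00 a01 a02], [a10 a11 a12]]
```

With more than three pairs, the least-squares fit averages out small matching errors in the sparse point set.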
  • perspective transformation is a more general transformation model; compared with the affine transformation, it is more flexible.
  • a perspective transformation can transform a rectangle into a trapezoid. It describes the projection of one plane in space onto another spatial plane; the affine transformation is a special case of the perspective transformation.
  • a02 and a12 are displacement parameters;
  • a00, a01, a10, and a11 are scaling and rotation parameters;
  • a20 and a21 are the amounts of deformation in the horizontal and vertical directions.
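Arranging the parameters above in a 3x3 matrix H = [[a00, a01, a02], [a10, a11, a12], [a20, a21, 1]], a point (x, y) maps to x' = (a00*x + a01*y + a02)/(a20*x + a21*y + 1) and y' = (a10*x + a11*y + a12)/(a20*x + a21*y + 1). A minimal sketch of applying such a mapping to one point (illustrative, not the patent's code):

```python
import numpy as np

def perspective_map(H, x, y):
    """Apply a 3x3 perspective (homography) matrix H to point (x, y),
    dividing by the projective coordinate."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]
```

When a20 = a21 = 0 the denominator is 1 and the mapping reduces to the affine case described above.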
  • the perspective transformation model is chosen mainly in consideration of the handheld terminal: the jitter motion of a handheld mobile phone is basically not confined to a single plane.
  • the simple motion model is shown in Figure 5.
  • Step 303 Perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the registered frames are fused by equal-weight averaging, I_fused = (1/N) Σ I_m over m = 1..N, which can be computed incrementally as I_k = ((k−1)·I_{k−1} + I_m)/k, where I denotes an image, m is the m-th image, k is the number of images synthesized so far, and N is the total number of images to composite.
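The running equal-weight average can be sketched as below; the incremental form I_k = ((k-1)*I_{k-1} + I_m)/k is one common way to realize the averaging and is an assumption for this example, not necessarily the patent's exact weighting:

```python
import numpy as np

def fuse_frames(frames):
    """Fuse registered frames by a running equal-weight average:
    I_k = ((k-1) * I_{k-1} + I_m) / k, equivalent to the mean of all frames."""
    acc = None
    for k, frame in enumerate(frames, start=1):
        f = frame.astype(np.float64)
        acc = f if acc is None else ((k - 1) * acc + f) / k
    return acc
```

The incremental form needs only one accumulator image in memory, which matters when the electronic aperture captures many frames over a long exposure.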
  • an image processing method which uses the image registration principle to align the multi-frame image data to be synthesized; the alignment method tolerates image jitter error within a certain range, and the pixel deviation appearing in the final composite image is small.
  • FIG. 7 is a schematic flowchart of an image processing method according to Embodiment 2 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 7, the image processing method includes the following steps:
  • Step 701 In the handheld mode, the original image data stream is obtained by using an electronic aperture, the original image data stream comprising a plurality of frames of original image data; the original image data of each frame in the original image data stream is preprocessed to obtain the image data stream.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • in the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then fuses the multiple captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting. In the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream is different in illumination, noise, sharpness, and focus point.
  • the original image data of each frame in the original image data stream needs to be preprocessed through a necessary preprocessing process.
  • the preprocessing process includes image filtering to eliminate noise and contrast stretching to improve image sharpness and reduce illumination differences between images. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
  • Step 702 Align each frame image data of the image data stream except the first frame image data with the reference frame by using the first frame image data in the image data stream as a reference frame.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • the registration effect of this type of method can generally meet the requirements of many complex scenes, such as changes in illumination (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise).
  • the other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by random feature points.
  • the optical flow field can also be used as a point-based matching algorithm; optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observation imaging plane, found from the change of pixels in the time domain and the correlation between adjacent frames.
  • Step 703 Display a prompt box on the display interface; the prompt box is used to prompt the user with the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
  • a user prompt box is displayed, which indicates the direction of handheld shake so that the user can correct it in time.
  • Step 704 Superimpose each pixel point of each frame image data after registration according to spatial position correspondence to obtain a target image.
  • after the image data of each frame is registered (that is, aligned), the images need to be fused; here, a fusion method in which image pixels are sequentially superimposed is adopted.
  • an image processing method which uses the image registration principle to align the multi-frame image data to be synthesized; the alignment method tolerates image jitter error within a certain range, and the pixel deviation appearing in the final composite image is small.
  • a prompt box is added to prevent the terminal from shaking beyond the allowable range when shooting.
  • FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention. As shown in FIG. 9, the terminal includes:
  • the obtaining unit 91 is configured to acquire an image data stream by using an electronic aperture, the image data stream comprising multi-frame image data;
  • the registration unit 92 is configured to register image data of each frame in the image data stream
  • the merging unit 93 is configured to perform merging processing on the image data of each frame after registration to obtain a target image.
  • the obtaining unit 91 includes:
  • the photographing subunit 911 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of original image data;
  • the pre-processing sub-unit 912 is configured to pre-process the original image data of each frame in the original image data stream to obtain the image data stream; wherein the pre-processing includes at least one of the following: image filtering, contrast stretching.
  • the terminal further includes:
  • the prompting unit 94 is configured to display a prompt box on the display interface; the prompt box is used to prompt the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
  • the registration unit 92 is further configured to use the first frame image data in the image data stream as a reference frame, and align each frame of image data in the image data stream except the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • the merging unit 93 is further configured to superimpose each pixel point of each frame of image data after registration according to spatial position correspondence to obtain a target image.
  • FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention.
  • the terminal includes: a processor 1001, a camera 1002, and a display screen 1003.
  • the processor 1001, the camera 1002, and the display screen 1003 are all connected by a bus 1004.
  • the camera 1002 is configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor 1001 is configured to register image data of each frame in the image data stream, and perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the camera 1002 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, where the original image data stream includes multiple frames of original image data;
  • the processor 1001 is configured to pre-process each frame of original image data in the original image data stream to obtain the image data stream, where the pre-processing includes at least one of the following: image filtering, contrast stretching; to register the image data of each frame in the image data stream; and to perform fusion processing on the registered image data of each frame to obtain a target image.
  • the display screen 1003 is configured to display a prompt box on the display interface; the prompt box is used to prompt the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
  • FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 11, the image processing method includes the following steps:
  • Step 1101 Acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • in the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then fuses the multiple captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting. In the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream is different in illumination, noise, sharpness, and focus point.
  • the original image data of each frame in the original image data stream needs to be preprocessed through a necessary preprocessing process.
  • the preprocessing process includes image filtering to eliminate noise and contrast stretching to improve image sharpness and reduce illumination differences between images. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
  • Step 1102 Determine a reference frame from the image data stream, and register each frame image data of the image data stream except the reference frame with the reference frame.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • the registration effect of this type of method can generally meet the requirements of many complex scenes, such as changes in illumination (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise).
  • the other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by random feature points.
  • the optical flow field can also be used as a point-based matching algorithm; optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observation imaging plane, found from the change of pixels in the time domain and the correlation between adjacent frames.
  • the purpose of studying the optical flow field is to approximate the motion field that cannot be directly obtained from the sequence of pictures.
  • the motion field is actually the motion of the object in the three-dimensional real world; the optical flow field is the projection of the motion field on the two-dimensional image plane (human eye or camera).
  • finding the motion velocity and direction of each pixel in each image yields the optical flow field.
  • I(x, y, t) is the pixel value of the image at position (x, y) at time t.
  • Vx and Vy are the x and y components of the optical flow vector of I(x, y, t), respectively.
  • Ix, Iy, and It are the partial derivatives of the image in the corresponding directions at (x, y, t). The optical flow constraint equation is therefore: Ix·Vx + Iy·Vy = −It.
  • the Lucas-Kanade optical flow method assumes that spatial pixel points move in unison: adjacent points in a scene are projected onto the image as neighboring points, and those neighboring points have the same velocity. This is a unique assumption of the Lucas-Kanade method, because the basic equation of the optical flow method provides only one constraint while there are two unknowns (the velocities in the x and y directions). Assuming similar motion in the neighborhood of a feature point, the velocity in the x and y directions can be found from n equations (n is the total number of points in the neighborhood, including the feature point itself), giving the overdetermined system Ix(qi)·Vx + Iy(qi)·Vy = −It(qi), i = 1, …, n, which is solved in the least-squares sense.
  • the small-motion assumption mentioned in the above scheme does not hold when the target moves quickly; a multi-scale (pyramidal) approach can solve this problem.
  • Figure 6 shows the alignment process for multi-frame images using optical flow.
  • the sparse matching points between the two images can be obtained, and then the image mapping model is calculated by using the coordinates of these points.
  • the image alignment step it is important to choose the correct image alignment transformation model.
  • Common spatial transformation models include affine transformation and perspective transformation models.
  • the affine transformation can be visually represented in the following form: any parallelogram in one plane can be mapped to another parallelogram by an affine transformation. The mapping operation on the image is performed within the same spatial plane, and different deformation parameters deform it into different types of parallelograms.
  • perspective transformation is a more general transformation model; compared with the affine transformation, it is more flexible.
  • a perspective transformation can transform a rectangle into a trapezoid. It describes the projection of one plane in space onto another spatial plane; the affine transformation is a special case of the perspective transformation.
  • a02 and a12 are displacement parameters;
  • a00, a01, a10, and a11 are scaling and rotation parameters;
  • a20 and a21 are the amounts of deformation in the horizontal and vertical directions.
  • the perspective transformation model is chosen mainly in consideration of the handheld terminal: the jitter motion of a handheld mobile phone is basically not confined to a single plane.
  • the simple motion model is shown in Figure 5.
  • Step 1103 When performing the fusion processing on the image data of each frame after the registration, the pixels in the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  • black edges are formed at the image boundary after the image mapping transformation; if the pixels at these black edges are fused in a weighted-average manner, a luminance difference is formed, which affects the overall appearance of the image.
  • taking the reference frame as the first frame of image data in the image data stream as an example: all other frames of image data are registered and fused to the first frame, and the reference frame itself has no black-edge pixels. For an image with black edges, the pixels at the black edge are replaced with the pixels at the corresponding positions of the reference frame before participating in the weighted average, thereby effectively solving the black-edge problem.
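A minimal sketch of the black-edge replacement step, assuming black-edge pixels can be identified by a simple intensity threshold (the detection rule is an assumption for this example; in practice the warp's valid-region mask would be used):

```python
import numpy as np

def fill_black_edges(warped, reference, threshold=0):
    """Replace black-edge pixels (introduced by warping a frame) with the
    pixels at the corresponding positions of the reference frame, so the
    subsequent weighted average has no luminance step at the border."""
    out = warped.astype(np.float64).copy()
    mask = warped <= threshold          # black border left by the warp
    out[mask] = reference.astype(np.float64)[mask]
    return out
```

Each warped frame is passed through this replacement before fusion, so the border pixels contribute reference-frame values rather than zeros to the average.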
  • an image processing method which uses the image registration principle to align the multi-frame image data to be synthesized; the alignment method tolerates image jitter error within a certain range, and the pixel deviation appearing in the final composite image is small. More importantly, during image fusion, the black edges caused by image registration are processed to ensure that the pixels of the entire image transition naturally.
  • FIG. 12 is a schematic flowchart of an image processing method according to Embodiment 4 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 12, the image processing method includes the following steps:
  • Step 1201 In the handheld mode, the original image data stream is obtained by using an electronic aperture, the original image data stream comprising a plurality of frames of original image data; the original image data of each frame in the original image data stream is preprocessed to obtain the image data stream.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • in the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then fuses the multiple captured images.
• the embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently hold the terminal by hand while the terminal shoots with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
• the frames of raw image data in the synchronously acquired original image data stream differ in illumination, noise, sharpness, and focus point.
• therefore, each frame of original image data in the original image data stream needs to go through a necessary preprocessing process.
• the preprocessing process includes: image filtering to eliminate noise, and contrast stretching to improve image clarity and reduce illumination differences between images. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
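The preprocessing described above (filtering plus contrast stretching) can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the patent's implementation: a 3x3 mean filter stands in for the unspecified image filtering, and a linear min-max stretch stands in for the contrast stretching.

```python
import numpy as np

def preprocess(frame: np.ndarray) -> np.ndarray:
    """Denoise with a 3x3 mean filter, then linearly stretch contrast to [0, 255].

    The mean filter and min-max stretch are simple stand-ins for the
    filtering and contrast-stretching steps the method leaves unspecified.
    """
    f = frame.astype(np.float64)
    h, w = f.shape
    padded = np.pad(f, 1, mode="edge")
    # 3x3 box filter: average each pixel with its 8 neighbours
    smoothed = sum(padded[dy:dy + h, dx:dx + w]
                   for dy in range(3) for dx in range(3)) / 9.0
    lo, hi = smoothed.min(), smoothed.max()
    if hi > lo:  # avoid division by zero on flat frames
        smoothed = (smoothed - lo) / (hi - lo) * 255.0
    return smoothed.astype(np.uint8)
```

Applied per frame before registration, this narrows the illumination and noise differences between frames, which is the stated purpose of the preprocessing step.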
• Step 1202: Using the first-frame image data in the image data stream as a reference frame, align each frame of image data in the image data stream, other than the first-frame image data, with the reference frame.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
• the registration effect of this type of method can generally meet the requirements of many complex scenes, such as changes in illumination (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise).
• the other type is a search-based alignment method using global mutual-information matching, which can reduce the matching error caused by unreliable feature points.
• the optical flow field is also a point-based matching approach; optical flow is the instantaneous velocity, on the observation imaging plane, of the pixel motion of a spatially moving object, computed from the change of pixels in the time domain and the correlation between adjacent frames.
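As a concrete illustration of the search-based alignment family described above, the sketch below exhaustively searches integer translations that minimize the mean squared difference over the overlap with the reference frame. It is a deliberately simplified stand-in: a production pipeline would estimate a full mapping matrix from feature points or optical flow rather than a pure translation.

```python
import numpy as np

def align_translation(ref: np.ndarray, frame: np.ndarray, max_shift: int = 4):
    """Return the integer shift (dy, dx) such that np.roll(frame, (dy, dx),
    axis=(0, 1)) best matches `ref` by mean squared error over the overlap.

    Brute-force translation search: a toy stand-in for the global-matching
    registration methods described in the text.
    """
    h, w = ref.shape
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping windows of ref and the shifted frame
            r = ref[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
            f = frame[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
            err = np.mean((r - f) ** 2)
            if err < best_err:
                best_err, best = err, (dy, dx)
    return best
```

In the patent's terms, each non-reference frame would then be warped by the recovered transform so that pixels at the same spatial position line up before fusion.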
• Step 1203: Display a prompt box on the display interface; the prompt box is used to indicate the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
• a prompt box is displayed that indicates to the user the direction of handheld shake, so that the user can correct it in time.
• Step 1204: Analyze each frame of registered image data to determine the black area at the boundary of each frame; when fusing the registered frames of image data, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame before performing the fusion, to obtain a target image.
• performing the fusion processing to obtain the target image includes:
• replacing the pixels of the black area at the image boundary with the pixels of the corresponding reference area in the reference frame, and then performing the fusion processing to obtain the target image.
• the black area at the image boundary can be segmented by a clustering algorithm, such as the K-means algorithm.
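As an illustration of how a clustering algorithm could separate the black boundary area, the sketch below runs a tiny two-cluster k-means on pixel intensities and returns the darker cluster as a mask. This is a hypothetical simplification: intensities are clustered in 1-D, and a real implementation would further restrict the mask to regions touching the image border.

```python
import numpy as np

def dark_region_mask(img: np.ndarray, iters: int = 10) -> np.ndarray:
    """Two-cluster 1-D k-means on pixel intensity; returns a boolean mask
    of the darker cluster (candidate black-edge pixels)."""
    x = img.astype(np.float64).ravel()
    centers = np.array([x.min(), x.max()])  # initialise centres at the extremes
    labels = np.zeros(x.shape[0], dtype=int)
    for _ in range(iters):
        # Assign each pixel to the nearest centre, then recompute centres
        labels = np.abs(x[:, None] - centers[None, :]).argmin(axis=1)
        for k in range(2):
            if np.any(labels == k):
                centers[k] = x[labels == k].mean()
    return (labels == np.argmin(centers)).reshape(img.shape)
```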
• after the frames of image data are registered (that is, aligned), the images need to be fused; here, a fusion method in which image pixels are superimposed in sequence is adopted.
• black edges are formed after the image mapping transformation, and if the pixels at these black edges are fused in a weighted-average manner, a brightness difference is formed, which affects the overall look of the image.
• the image data of every other frame is registered to and fused under the first-frame image data, and the reference frame contains no black-edge pixels; therefore, the pixels at the black edges are replaced with the pixels at the corresponding positions of the reference frame before taking part in the weighted average, which effectively solves the problem of black edges in the image.
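Putting the replacement rule above into code: before averaging, any black-edge pixel in a registered frame is swapped for the pixel at the same position in the reference frame, so the black area never drags down the average. A minimal sketch, assuming equal weights, grayscale frames, and a precomputed boolean black-edge mask per frame (for example from the boundary analysis step):

```python
import numpy as np

def fuse_with_reference(ref: np.ndarray, frames, masks) -> np.ndarray:
    """Average-fuse registered frames; pixels flagged by each frame's
    black-edge mask are first replaced by the reference frame's pixels."""
    ref_f = ref.astype(np.float64)
    acc = np.zeros_like(ref_f)
    for frame, mask in zip(frames, masks):
        # Substitute reference pixels wherever the mask marks a black edge
        filled = np.where(mask, ref_f, frame.astype(np.float64))
        acc += filled
    return (acc / len(frames)).astype(np.uint8)
```

With all frames weighted equally this reduces to a mean; the same substitution applies unchanged under arbitrary fusion weights.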
• an image processing method that uses the image registration principle to align the multiple frames of image data to be synthesized; the alignment method tolerates image jitter within a certain range, so the pixel deviation in the final composite image is small.
  • the black edges of the pixels caused by image registration are processed to ensure that the pixel points of the entire image are naturally transitioned.
• a prompt box is added to prevent the terminal from moving beyond the registration range when shooting.
  • FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention. As shown in FIG. 14, the terminal includes:
  • An acquiring unit 41 configured to acquire an image data stream by using an electronic aperture, where the image data stream includes multi-frame image data;
  • the registration unit 42 is configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
• the fusion unit 43 is configured to, when fusing the registered frames of image data, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain a target image.
  • the fusion unit 43 includes:
  • the analyzing sub-unit 431 is configured to analyze the image data of each frame after registration, and determine a black area at a boundary of each frame image;
• the replacement and fusion sub-unit 432 is configured to, when fusing the registered frames of image data, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain a target image.
• the replacement and fusion subunit 432 is further configured to: determine, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; and replace the pixels of the black area at the image boundary with the pixels of the reference area in the reference frame before performing the fusion processing, to obtain a target image.
• the registration unit 42 is further configured to use the first-frame image data in the image data stream as a reference frame and align each frame of image data in the image data stream, other than the first-frame image data, with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
• the fusion unit 43 is further configured to superimpose the pixel points of each registered frame of image data according to their spatial position correspondence.
  • the terminal includes: a processor 1001, a camera 1002, and a display screen 1003.
  • the processor 1001, the camera 1002, and the display screen 1003 are all connected by a bus 1004.
• the camera 1002 is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
• the processor 1001 is configured to determine a reference frame from the image data stream, and register each frame of image data in the image data stream, other than the reference frame, with the reference frame; when the registered frames of image data are fused, the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame before the fusion is performed, to obtain the target image.
  • the camera 1102 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, where the original image data stream includes multiple frames of original image data;
• the processor 1101 is configured to preprocess each frame of original image data in the original image data stream to obtain the image data stream, wherein the preprocessing includes at least one of the following: image filtering and contrast stretching; determine a reference frame from the image data stream, and register each frame of image data in the image data stream, other than the reference frame, with the reference frame; and, when fusing the registered frames of image data, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain the target image.
  • the display screen 1103 is configured to display a prompt box on the display interface; the prompt box is used to prompt the orientation of the handheld shake when photographing with the electronic aperture in the handheld mode.
  • Figure 15 is a block diagram of the electrical structure of the camera.
  • the lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single focus lens or a zoom lens.
  • the lens 1211 is movable in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focus position of the lens 1211 in accordance with a control signal from the lens driving control circuit 1222, and in the case of the zoom lens, the focus distance can also be controlled.
  • the lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microprocessor 1217.
  • An imaging element 1212 is disposed in the vicinity of the position of the subject image formed by the lens 1211 on the optical axis of the lens 1211.
  • the imaging element 1212 is for capturing an image of a subject and acquiring captured image data.
  • Photodiodes constituting each pixel are arranged two-dimensionally and in a matrix on the imaging element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode.
  • a red-green-blue (RGB) color filter of a Bayer arrangement is disposed on the front surface of each pixel.
  • the imaging element 1212 is connected to the imaging circuit 1213.
  • the imaging circuit 1213 performs charge accumulation control and image signal readout control in the imaging element 1212, and performs waveform shaping after reducing the reset noise of the read image signal (analog image signal). Further, gain improvement or the like is performed to obtain an appropriate level signal.
• the imaging circuit 1213 is connected to an A/D converter 1214 that performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
  • the bus 1227 is a transmission path for transmitting various data read or generated inside the camera.
• the A/D converter 1214 is connected to the bus 1227; an image processor 1215, a JPEG processor 1216, a microprocessor 1217, a Synchronous Dynamic Random Access Memory (SDRAM) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and a liquid crystal display (LCD) driver 1220 are also connected to the bus 1227.
• the image processor 1215 performs optical black (OB, Optical Black) subtraction processing, white balance adjustment, color matrix calculation, gamma conversion, color-difference signal processing, noise removal processing, and simultaneous processing on the image data output from the imaging element 1212.
  • the JPEG processor 1216 compresses the image data read out from the SDRAM 1218 in accordance with the JPEG compression method when the image data is recorded on the storage medium 1225. Further, the JPEG processor 1216 performs decompression of JPEG image data for image reproduction display.
• a file recorded in the storage medium 1225 is read, decompression processing is performed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226.
  • the JPEG method is adopted as the image compression/decompression method.
  • the compression/decompression method is not limited thereto, and other compression/decompression methods such as MPEG, TIFF, and H.264 may be used.
  • the microprocessor 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera.
  • the microprocessor 1217 is connected to the operating unit 1223 and the flash memory 1224.
• the operating unit 1223 includes, but is not limited to, physical or virtual buttons; these may be a power button, a camera button, an edit button, a dynamic image button, a reproduction button, a menu button, a cross button, an OK button, a delete button, an enlarge button, and other input buttons and keys. The operating unit detects the operational state of these operation controls and outputs the detection result to the microprocessor 1217.
• the LCD 1226 serves as a display; its front surface is provided with a touch panel that detects the user's touch position and outputs the touch position to the microprocessor 1217.
  • the microprocessor 1217 executes various processing sequences corresponding to the user's operation in accordance with the detection result from the operation position of the operation unit 1223.
  • Flash memory 1224 stores programs for executing various processing sequences of microprocessor 1217.
  • the microprocessor 1217 performs overall control of the camera in accordance with the program. Further, the flash memory 1224 stores various adjustment values of the camera, and the microprocessor 1217 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
• the SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like.
  • the SDRAM 1218 temporarily stores image data output from the analog/digital (A/D) converter 1214 and image data processed in the image processor 1215, the JPEG processor 1216, and the like.
  • the memory interface 1219 is connected to the storage medium 1225, and performs control for writing image data and a file header attached to the image data to the storage medium 1225 and reading out from the storage medium 1225.
  • the storage medium 1225 can be implemented as a storage medium such as a memory card that can be detachably attached to the camera body.
  • the storage medium 1225 is not limited thereto, and may be a hard disk or the like built in the camera body.
• the LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218; for display, the image data stored in the SDRAM 1218 is read out and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; the JPEG processor 1216 then reads the compressed image data from the SDRAM 1218, decompresses it, and the decompressed image data is displayed on the LCD 1226.
  • the LCD 1226 is disposed on the back side of the camera body for image display.
  • image display may be performed using various display panels based on organic EL, that is, OLED.
• as for the terminal in the embodiment of the present invention, if the terminal is implemented in the form of a software function module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium.
• a computer device executing the stored software may be a personal computer, a server, a network device, or the like.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a mobile hard disk, a ROM, a magnetic disk, or an optical disk.
  • embodiments of the invention are not limited to any specific combination of hardware and software.
  • an embodiment of the present invention further provides a computer storage medium in which a computer program is stored, the computer program being configured to perform an image processing method according to an embodiment of the present invention.
  • the disclosed method and smart device may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
• the division of the units is only a logical function division; in actual implementation there may be another division manner, for example: multiple units or components may be combined, or may be integrated into another system, or some features may be ignored or not executed.
• the coupling, direct coupling, or communication connection between the components shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
• the units described above as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
• each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit;
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
• the technical solution of the embodiment of the present invention adds a handheld mode to the electronic aperture, improving the convenience of the user's shooting; the image registration introduced in the embodiment of the present invention guarantees the shooting effect and improves the user's shooting experience.
• the embodiment of the present invention further adds a prompt box on the display interface to prevent the handheld terminal from moving beyond the registration range when the user shoots.
• the user can perform electronic aperture shooting with a handheld terminal, which improves the convenience of shooting, avoids image blur caused by handheld shake, guarantees the shooting effect, and improves the user's shooting experience.
• when image fusion is performed, the black pixel edges caused by image registration are processed, ensuring that the pixel transitions at every position of the entire image are natural.

Abstract

Disclosed are an image processing method and a terminal, comprising: obtaining an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data; registering the respective frames of image data in the image data stream; and fusing the registered respective frames of image data to obtain a target image. By means of adding a handheld mode of the electronic aperture, embodiments of the present invention increase user convenience in photographing, guarantee the photographing effect, and improve user photographing experience.

Description

一种图像处理方法及终端、计算机存储介质Image processing method and terminal, computer storage medium
相关申请的交叉引用Cross-reference to related applications
本申请基于申请号为201610375523.3、申请日为2016年05月31日的中国专利申请、以及申请号为201610377764.1、申请日为2016年05月31日的中国专利申请提出,并要求该中国专利申请的优先权,该中国专利申请的全部内容在此引入本申请作为参考。This application is based on and claims priority to Chinese patent application No. 201610375523.3, filed on May 31, 2016, and Chinese patent application No. 201610377764.1, filed on May 31, 2016, the entire contents of which are incorporated herein by reference.
技术领域Technical field
本发明涉及拍照领域中的图像处理技术,尤其涉及一种图像处理方法及终端、计算机存储介质。The present invention relates to image processing technology in the field of photographing, and in particular to an image processing method and terminal, and a computer storage medium.
背景技术Background technique
拍照功能是移动终端常用的功能之一,部分移动终端的拍照功能具有电子光圈模式。在电子光圈模式下,用户调整光圈值之后,在曝光时间内移动终端会进行连续不间断的拍摄,然后对拍摄出来的多张图像透明化之后再进行叠加处理,整体效果与“慢快门”所带来的实际效果十分一致,主要突出超长甚至可以达到B门级别的曝光时间。电子光圈模式采用的算法不会产生过曝,画面整体呈现的效果比较自然。The camera function is one of the commonly used functions of a mobile terminal, and on some mobile terminals the camera function has an electronic aperture mode. In the electronic aperture mode, after the user adjusts the aperture value, the mobile terminal shoots continuously and without interruption during the exposure time, makes the captured images transparent, and then superimposes them. The overall effect is very consistent with the actual effect of a "slow shutter", mainly highlighting an ultra-long exposure time that can even reach the bulb (B) level. The algorithm used in the electronic aperture mode does not cause overexposure, and the overall picture looks natural.
采用电子光圈模式进行拍照时,由于需要连续拍摄多张图像进行图像融合,因此,各个图像需要保证对齐,移动终端不能出现晃动,否则图像像素点将会出现模糊,基于此,通常需要配合三脚架进行电子光圈模式的拍摄,但是这样会限制电子光圈模式的易用性和用户体验性。此外,在图像的边界处由于图像对齐会形成黑边,这些黑边处的像素如果按照加权平均的方式进行融合会形成亮度的差异,给图像的整体视觉带来影响。 When taking pictures in the electronic aperture mode, multiple images need to be captured continuously for image fusion; therefore, the images must stay aligned and the mobile terminal must not shake, otherwise the image pixels will be blurred. For this reason, a tripod is usually needed for electronic aperture shooting, which limits the ease of use and user experience of the electronic aperture mode. In addition, black edges are formed at the image boundaries as a result of image alignment, and if the pixels at these black edges are fused in a weighted-average manner, a brightness difference is formed, which affects the overall vision of the image.
发明内容Summary of the invention
为解决上述技术问题,本发明实施例提供了一种图像处理方法及终端、计算机存储介质。To solve the above technical problem, an embodiment of the present invention provides an image processing method, a terminal, and a computer storage medium.
本发明实施例提供的终端,包括:The terminal provided by the embodiment of the present invention includes:
获取单元,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
配准单元,配置为将所述图像数据流中的各帧图像数据进行配准;a registration unit configured to register image data of each frame in the image data stream;
融合单元,配置为将配准后的各帧图像数据进行融合处理,得到目标图像。The fusion unit is configured to perform fusion processing on the image data of each frame after registration to obtain a target image.
本发明实施例中,所述获取单元包括:In the embodiment of the present invention, the acquiring unit includes:
拍摄子单元,配置为在手持模式下,利用电子光圈拍摄得到原始图像数据流,所述原始图像数据流包括多帧原始图像数据;a shooting subunit configured to capture an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of raw image data;
预处理子单元,配置为对所述原始图像数据流中的各帧原始图像数据进行预处理,得到所述图像数据流;其中,所述预处理包括以下至少之一:图像滤波、对比度拉伸。a pre-processing unit configured to pre-process each frame of the original image data stream to obtain the image data stream; wherein the pre-processing comprises at least one of: image filtering, contrast stretching .
本发明实施例中,所述终端还包括:In the embodiment of the present invention, the terminal further includes:
提示单元,配置为在显示界面上显示提示框;所述提示框用于提示在所述在手持模式下,利用电子光圈拍摄时手持抖动的方位。The prompting unit is configured to display a prompt box on the display interface, and the prompting box is configured to prompt the orientation of the handheld shaking when the electronic aperture is used in the handheld mode.
本发明实施例中,所述配准单元,还配置为以所述图像数据流中的第一帧图像数据作为参考帧,将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐;其中,所述对齐是指将相同空间位置的像素点对齐。In the embodiment of the present invention, the registration unit is further configured to: use the first frame image data in the image data stream as a reference frame, and divide each of the image data streams except the first frame image data. The frame image data is aligned with the reference frame; wherein the alignment refers to aligning pixel points of the same spatial location.
本发明实施例提供的图像处理方法,包括:An image processing method provided by an embodiment of the present invention includes:
利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;Acquiring an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
将所述图像数据流中的各帧图像数据进行配准; Registering each frame of image data in the image data stream;
将配准后的各帧图像数据进行融合处理,得到目标图像。The image data of each frame after registration is subjected to fusion processing to obtain a target image.
本发明实施例中,所述利用电子光圈获取图像数据流,包括:In the embodiment of the present invention, the acquiring an image data stream by using an electronic aperture includes:
在手持模式下,利用电子光圈拍摄得到原始图像数据流,所述原始图像数据流包括多帧原始图像数据;In the handheld mode, an original image data stream is obtained by taking an electronic aperture, the original image data stream comprising a plurality of frames of original image data;
对所述原始图像数据流中的各帧原始图像数据进行预处理,得到所述图像数据流;Performing pre-processing on each frame of the original image data in the original image data stream to obtain the image data stream;
其中,所述预处理包括以下至少之一:图像滤波、对比度拉伸。Wherein, the preprocessing comprises at least one of the following: image filtering, contrast stretching.
本发明实施例中,所述方法还包括:In the embodiment of the present invention, the method further includes:
在显示界面上显示提示框;所述提示框用于提示在所述在手持模式下,利用电子光圈拍摄时手持抖动的方位。A prompt box is displayed on the display interface; the prompt box is used to indicate the orientation of the handheld shake when the electronic aperture is used in the handheld mode.
本发明实施例中,所述将所述图像数据流中的各帧图像数据进行配准,包括:In the embodiment of the present invention, the registering each frame image data in the image data stream includes:
以所述图像数据流中的第一帧图像数据作为参考帧,将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐;Aligning, by using the first frame image data in the image data stream as a reference frame, each frame image data of the image data stream except the first frame image data, and the reference frame;
其中,所述对齐是指将相同空间位置的像素点对齐。Wherein, the alignment refers to aligning pixel points of the same spatial position.
本发明实施例中,所述将配准后的各帧图像数据进行融合处理,包括:In the embodiment of the present invention, the image processing of each frame image after the registration is performed, including:
将配准后的各帧图像数据的各个像素点按照空间位置对应进行叠加。Each pixel point of the image data of each frame after registration is superimposed according to the spatial position correspondence.
本发明另一实施例提供的终端包括:A terminal provided by another embodiment of the present invention includes:
获取单元,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
配准单元,配置为从所述图像数据流中确定出参考帧,将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准;a registration unit, configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
融合单元,配置为对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。 The merging unit is configured to perform merging processing on the image data of each frame after the registration, and replace the pixels in the black area at the image boundary with the pixels corresponding to the reference frame, and perform fusion processing to obtain a target image.
本发明实施例中,所述融合单元包括:In the embodiment of the present invention, the fusion unit includes:
分析子单元,配置为对配准后的各帧图像数据进行分析,确定出各帧图像边界处的黑色区域;The analysis subunit is configured to analyze the image data of each frame after registration to determine a black area at a boundary of each frame image;
替换及融合子单元,配置为对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。The replacement and fusion subunits are configured to perform fusion processing on the image data of each frame after the registration, and replace the pixels in the black area at the image boundary with the pixels corresponding to the reference frame, and perform fusion processing to obtain a target image.
本发明实施例中,所述替换及融合子单元,还配置为根据所述图像边界的处黑色区域,确定在所述参考帧中与所述黑色区域相对应的参考区域;将图像边界处黑色区域的像素替换为所述参考帧中参考区域中的像素后进行融合处理,得到目标图像。In the embodiment of the present invention, the replacing and merging subunit is further configured to determine, according to a black area at the image boundary, a reference area corresponding to the black area in the reference frame; The pixel of the region is replaced with the pixel in the reference region in the reference frame, and then the fusion processing is performed to obtain the target image.
本发明实施例中,所述配准单元,还配置为以所述图像数据流中的第一帧图像数据作为参考帧,将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐;其中,所述对齐是指将相同空间位置的像素点对齐。In the embodiment of the present invention, the registration unit is further configured to: use the first frame image data in the image data stream as a reference frame, and divide each of the image data streams except the first frame image data. The frame image data is aligned with the reference frame; wherein the alignment refers to aligning pixel points of the same spatial location.
本发明另一实施例提供的图像处理方法,所述方法包括:An image processing method according to another embodiment of the present invention includes:
利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;Acquiring an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
从所述图像数据流中确定出参考帧,将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准;Determining a reference frame from the image data stream, and registering each frame image data of the image data stream except the reference frame with the reference frame;
对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。When the image data of each frame after the registration is subjected to the fusion processing, the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
本发明实施例中,所述对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像,包括:In the embodiment of the present invention, when performing the merging process on the image data of each frame after the registration, the pixel of the black area at the image boundary is replaced with the pixel corresponding to the reference frame, and then the fusion process is performed to obtain the target image, including:
对配准后的各帧图像数据进行分析,确定出各帧图像边界处的黑色区域; Performing analysis on the image data of each frame after registration to determine a black area at the boundary of each frame image;
对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。When the image data of each frame after the registration is subjected to the fusion processing, the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
In an embodiment of the present invention, replacing the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion to obtain the target image includes:
determining, according to the black region at the image boundary, a reference region in the reference frame that corresponds to the black region;
replacing the pixels of the black region at the image boundary with the pixels of the reference region in the reference frame before performing the fusion, to obtain the target image.
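The replacement step above can be sketched as follows. This is a minimal NumPy illustration under an assumption (not stated in the claims) that the registration warp leaves zero-valued pixels where a shifted frame has no data, so the black region can be detected as a zero mask and filled from the spatially corresponding reference region:

```python
import numpy as np

def replace_black_border(frame, reference):
    """Replace black (zero-valued) boundary pixels of a registered frame
    with the pixels at the same spatial positions in the reference frame."""
    black = frame == 0            # pixels left black by the registration warp
    out = frame.copy()
    out[black] = reference[black]  # the reference region corresponds spatially
    return out

# Reference frame, and a registered frame whose content shifted right by
# 2 px during alignment, leaving a black band on its left boundary.
reference = np.full((4, 6), 100, dtype=np.uint8)
registered = np.full((4, 6), 90, dtype=np.uint8)
registered[:, :2] = 0  # black region at the image boundary

fixed = replace_black_border(registered, reference)
print(fixed[0])  # the black band is filled from the reference frame
```

The fixed frames can then enter the fusion step with no dark bands, giving the natural boundary transition the embodiment describes.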
In an embodiment of the present invention, determining a reference frame from the image data stream and registering each frame of image data in the image data stream other than the reference frame with the reference frame includes:
taking the first frame of image data in the image data stream as the reference frame, and aligning each frame of image data in the image data stream other than the first frame with the reference frame;
where the alignment refers to aligning pixels at the same spatial position.
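The alignment can be sketched as follows. In practice the displacement of each frame would be estimated, e.g., by the optical flow of FIG. 4 and FIG. 6; this toy sketch assumes the displacement is already known and is a pure integer translation, and simply undoes it so that pixels of the same spatial position coincide with the reference frame:

```python
import numpy as np

def align_to_reference(frame, shift):
    """Undo an estimated (dy, dx) translation so that each pixel lands at
    the same spatial position as in the reference frame. In practice the
    shift would come from optical flow; here it is assumed given."""
    dy, dx = shift
    return np.roll(frame, (-dy, -dx), axis=(0, 1))

reference = np.arange(25, dtype=np.float32).reshape(5, 5)
# Simulate handheld shake: the captured frame is the scene shifted by (1, 2).
shaky = np.roll(reference, (1, 2), axis=(0, 1))

aligned = align_to_reference(shaky, (1, 2))
print(np.array_equal(aligned, reference))  # True
```

A real implementation would use a sub-pixel warp (e.g., a homography) rather than a circular integer shift; the circular shift is what introduces the boundary regions without valid data that the black-border replacement step handles.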
In an embodiment of the present invention, fusing the registered frames of image data includes:
superimposing the pixels of the registered frames of image data according to their corresponding spatial positions.
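The superposition can be sketched as below. The claims do not fix the superposition operator; averaging is one plausible choice for simulating a long exposure with an electronic aperture, and is the assumption made here:

```python
import numpy as np

def fuse_frames(frames):
    """Fuse registered frames by superimposing pixels at corresponding
    spatial positions; superposition is taken as the per-pixel mean,
    one plausible way to simulate a long exposure."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).astype(np.uint8)

# Three registered frames of the same 2x2 scene with slight sensor noise.
frames = [
    np.array([[100, 50], [200, 10]], dtype=np.uint8),
    np.array([[102, 48], [198, 12]], dtype=np.uint8),
    np.array([[101, 52], [202, 11]], dtype=np.uint8),
]
target = fuse_frames(frames)
print(target)  # per-pixel mean of the three frames
```

Averaging in float before converting back to uint8 avoids the overflow that would occur if the frames were summed directly in 8-bit arithmetic.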
The terminal provided by an embodiment of the present invention includes a camera and a processor;
the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
the processor is configured to register the frames of image data in the image data stream, and to fuse the registered frames of image data to obtain a target image.
The terminal provided by an embodiment of the present invention includes a camera and a processor;
the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data;
the processor is configured to determine a reference frame from the image data stream and to register each frame of image data in the image data stream other than the reference frame with the reference frame; and, when fusing the registered frames of image data, to replace the pixels of the black region at the image boundary with the corresponding pixels of the reference frame before performing the fusion, to obtain a target image.
The computer storage medium provided by an embodiment of the present invention stores computer-executable instructions configured to perform any of the image processing methods described above.
In the technical solutions of the embodiments of the present invention, in a handheld mode, an image data stream is acquired using an electronic aperture, the image data stream comprising multiple frames of image data; the frames of image data in the image data stream are registered; and the registered frames are fused to obtain a target image. The embodiments of the present invention thus add a handheld mode to the electronic aperture, improving shooting convenience for the user; to counter the shake introduced by handheld shooting, the embodiments introduce image registration, which safeguards image quality and improves the user's shooting experience. In addition, the embodiments of the present invention add a prompt box to the display interface to prevent the handheld terminal from shaking over too large a range while the user is shooting.
In the technical solutions of the embodiments of the present invention, in a handheld mode, an image data stream is acquired using an electronic aperture, the image data stream comprising multiple frames of image data; a reference frame is determined from the image data stream, and each frame of image data in the image data stream other than the reference frame is registered with the reference frame; when the registered frames are fused, the pixels of the black region at the image boundary are replaced with the corresponding pixels of the reference frame before the fusion is performed, to obtain a target image. By implementing these technical solutions, the user can shoot with the electronic aperture while holding the terminal, which improves shooting convenience while avoiding the blur caused by handheld shake, safeguarding image quality and improving the user's shooting experience. During image fusion, the black pixel borders introduced by image registration are handled, ensuring a natural pixel transition at every position of the whole image.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram of the hardware structure of an optional mobile terminal implementing various embodiments of the present invention;
FIG. 2 is a schematic diagram of a wireless communication system for the mobile terminal shown in FIG. 1;
FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention;
FIG. 4 is a schematic diagram of pixel matching between optical flow images;
FIG. 5 is a schematic diagram of a simple mobile phone motion model;
FIG. 6 is a flowchart of aligning multiple frames of images using optical flow according to an embodiment of the present invention;
FIG. 7 is a schematic flowchart of an image processing method according to Embodiment 2 of the present invention;
FIG. 8 is a schematic diagram of the interface of a prompt box according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention;
FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention;
FIG. 12 is a schematic flowchart of an image processing method according to Embodiment 4 of the present invention;
FIG. 13 is a schematic diagram of image fusion according to an embodiment of the present invention;
FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention;
FIG. 15 is a block diagram of the electrical structure of a camera.
DETAILED DESCRIPTION
A mobile terminal implementing various embodiments of the present invention will now be described with reference to the accompanying drawings. In the following description, suffixes such as "module", "component", or "unit" used to denote elements are given merely to facilitate the description of the embodiments of the present invention and carry no particular meaning in themselves; therefore, "module" and "component" may be used interchangeably.
The mobile terminal may be implemented in various forms. For example, the terminal described in the embodiments of the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet computer (PAD), a Portable Media Player (PMP), or a navigation device, as well as fixed terminals such as a digital TV or a desktop computer. In the following, it is assumed that the terminal is a mobile terminal. However, those skilled in the art will appreciate that, apart from elements intended specifically for mobile purposes, configurations according to the embodiments of the present invention can also be applied to fixed-type terminals.
FIG. 1 is a schematic diagram of the hardware structure of a mobile terminal implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and so on. FIG. 1 shows a mobile terminal having various components, but it should be understood that not all of the illustrated components are required to be implemented; more or fewer components may be implemented instead. The elements of the mobile terminal are described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
The broadcast receiving module 111 receives a broadcast signal and/or broadcast-related information from an external broadcast management server via a broadcast channel. The broadcast channel may include a satellite channel and/or a terrestrial channel. The broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast-related information, or a server that receives a previously generated broadcast signal and/or broadcast-related information and transmits it to the terminal. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like; moreover, the broadcast signal may further include a broadcast signal combined with a TV or radio broadcast signal. The broadcast-related information may also be provided via a mobile communication network, in which case it may be received by the mobile communication module 112. The broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and so on. The broadcast receiving module 111 may receive signal broadcasts using various types of broadcast systems. In particular, the broadcast receiving module 111 may receive digital broadcasts using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, and Integrated Services Digital Broadcasting-Terrestrial (ISDB-T). The broadcast receiving module 111 may be constructed to suit the various broadcast systems providing broadcast signals as well as the above-described digital broadcast systems. The broadcast signal and/or broadcast-related information received via the broadcast receiving module 111 may be stored in the memory 160 (or another type of storage medium).
The mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server. Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received according to text and/or multimedia messages.
The wireless internet module 113 supports wireless internet access for the mobile terminal. The module may be internally or externally coupled to the terminal. The wireless internet access technologies involved in this module may include Wireless Local Area Networks (Wi-Fi, WLAN), Wireless Broadband (Wibro), Worldwide Interoperability for Microwave Access (Wimax), High Speed Downlink Packet Access (HSDPA), and so on.
The short-range communication module 114 is a module for supporting short-range communication. Some examples of short-range communication technologies include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, and the like.
The location information module 115 is a module for checking or acquiring location information of the mobile terminal. A typical example of a location information module is the Global Positioning System (GPS). According to current technology, the GPS location information module 115 computes distance information and accurate time information from three or more satellites and applies triangulation to the computed information, thereby accurately calculating three-dimensional current location information in terms of longitude, latitude, and altitude. Currently, the method used to compute position and time information uses three satellites and corrects the errors in the computed position and time information by using one additional satellite. Furthermore, the GPS location information module 115 can compute speed information by continuously calculating the current location in real time.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 151. The image frames processed by the camera 121 may be stored in the memory 160 (or another storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided depending on the configuration of the mobile terminal. The microphone 122 may receive sound (audio data) in operating modes such as a phone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 for output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to commands input by the user to control various operations of the mobile terminal. The user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, and the like caused by contact), a scroll wheel, a joystick, and so on. In particular, when the touch pad is superimposed on the display unit 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact with the mobile terminal 100 (i.e., touch input), the orientation of the mobile terminal 100, the acceleration or deceleration and direction of movement of the mobile terminal 100, and so on, and generates commands or signals for controlling the operation of the mobile terminal 100. For example, when the mobile terminal 100 is implemented as a slide-type mobile phone, the sensing unit 140 may sense whether the slide-type phone is open or closed. In addition, the sensing unit 140 can detect whether the power supply unit 190 supplies power and whether the interface unit 170 is coupled to an external device. The sensing unit 140 may include a proximity sensor 141.
The interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and so on. The identification module may store various information for authenticating the user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; therefore, the identification device may be connected to the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and an external device.
In addition, when the mobile terminal 100 is connected to an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100, or as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may serve as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner. The output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and so on.
The display unit 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display unit 151 may display a User Interface (UI) or Graphical User Interface (GUI) related to the call or other communication (e.g., text messaging, multimedia file downloading, etc.). When the mobile terminal 100 is in a video call mode or an image capture mode, the display unit 151 may display captured and/or received images, a UI or GUI showing video or images and related functions, and so on.
Meanwhile, when the display unit 151 and the touch pad are superimposed on each other in the form of layers to form a touch screen, the display unit 151 may serve as both an input device and an output device. The display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be constructed to be transparent to allow the user to view from the outside; these may be called transparent displays, a typical example being a Transparent Organic Light-Emitting Diode (TOLED) display. Depending on the particular desired implementation, the mobile terminal 100 may include two or more display units (or other display means); for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown). The touch screen may be used to detect touch input pressure as well as touch input position and touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, or the like. Moreover, the audio output module 152 may provide audio output related to specific functions performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The alarm unit 153 may provide output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and so on. In addition to audio or video output, the alarm unit 153 may provide output in different manners to signal the occurrence of an event. For example, the alarm unit 153 may provide output in the form of vibration; when a call, a message, or some other incoming communication is received, the alarm unit 153 may provide a tactile output (i.e., vibration) to notify the user. By providing such tactile output, the user can recognize the occurrence of various events even when the user's mobile phone is in the user's pocket. The alarm unit 153 may also provide output signaling the occurrence of an event via the display unit 151 or the audio output module 152.
The memory 160 may store software programs for the processing and control operations performed by the controller 180, or may temporarily store data that has been output or is to be output (e.g., a phone book, messages, still images, video, etc.). Moreover, the memory 160 may store data on the various forms of vibration and audio signals output when a touch is applied to the touch screen.
The memory 160 may include at least one type of storage medium, including flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read-Only Memory (ROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), magnetic memory, a magnetic disk, an optical disc, and the like. Moreover, the mobile terminal 100 may cooperate with a network storage device that performs the storage function of the memory 160 over a network connection.
The controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing related to voice calls, data communication, video calls, and the like. In addition, the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data; the multimedia module 181 may be constructed within the controller 180 or may be constructed separately from the controller 180. The controller 180 may perform pattern recognition processing to recognize handwriting input or picture drawing input performed on the touch screen as characters or images.
The power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the various elements and components.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such an implementation may be realized in the controller 180. For a software implementation, an implementation such as a procedure or function may be realized with a separate software module that allows at least one function or operation to be performed. The software code may be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
The mobile terminal has thus far been described in terms of its functions. In the following, for brevity, a slide-type mobile terminal among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type mobile terminals, will be described as an example. The present invention can therefore be applied to any type of mobile terminal and is not limited to the slide-type mobile terminal.
The mobile terminal 100 as shown in FIG. 1 may be configured to operate with communication systems that transmit data via frames or packets, such as wired and wireless communication systems and satellite-based communication systems.
A communication system in which a mobile terminal according to an embodiment of the present invention is operable will now be described with reference to FIG. 2.
Such communication systems may use different air interfaces and/or physical layers. For example, air interfaces used by communication systems include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like. As a non-limiting example, the following description relates to a CDMA communication system, but such teachings apply equally to other types of systems.
Referring to FIG. 2, the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, base station controllers (BSC) 275, and a mobile switching center (MSC) 280. The MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290. The MSC 280 is also configured to interface with the BSCs 275, which may be coupled to the base stations 270 via backhaul lines. The backhaul lines may be configured according to any of several known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be understood that the system as shown in FIG. 2 may include a plurality of BSCs 275.
Each BS 270 may serve one or more sectors (or regions), each sector being covered by an omni-directional antenna or by an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each sector may be covered by two or more antennas for diversity reception. Each BS 270 may be configured to support a plurality of frequency assignments, each frequency assignment having a particular spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
The intersection of a sector and a frequency assignment may be referred to as a CDMA channel. The BS 270 may also be referred to as a Base Transceiver Station (BTS) or by other equivalent terms. In such a case, the term "base station" may be used to refer collectively to a single BSC 275 and at least one BS 270. A base station may also be referred to as a "cell site". Alternatively, the individual sectors of a particular BS 270 may be referred to as a plurality of cell sites.
As shown in FIG. 2, a Broadcast Transmitter (BT) 295 transmits a broadcast signal to the mobile terminals 100 operating within the system. The broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive the broadcast signal transmitted by the BT 295. In FIG. 2, several satellites 300 are shown; for example, GPS satellites 300 may be employed. The satellites 300 help locate at least one of the plurality of mobile terminals 100.
In FIG. 2, a plurality of satellites 300 are depicted, but it will be understood that useful positioning information may be obtained with any number of satellites. The location information module 115 as shown in FIG. 1, implemented as a GPS module, is typically configured to cooperate with the satellites 300 to obtain the desired positioning information. Instead of or in addition to GPS tracking techniques, other techniques capable of tracking the location of the mobile terminal may be used. In addition, at least one GPS satellite 300 may selectively or additionally handle satellite DMB transmission.
As one typical operation of the wireless communication system, the BS 270 receives reverse-link signals from various mobile terminals 100. The mobile terminals 100 typically engage in calls, messaging, and other types of communication. Each reverse-link signal received by a particular BS 270 is processed within that BS 270, and the resulting data is forwarded to the associated BSC 275. The BSC 275 provides call resource allocation and mobility management functions, including the coordination of soft handoff procedures between BSs 270. The BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290. Similarly, the PSTN 290 interfaces with the MSC 280, the MSC 280 interfaces with the BSCs 275, and the BSCs 275 in turn control the BSs 270 to transmit forward-link signals to the mobile terminals 100.
The mobile communication module 112 of the wireless communication unit 110 in the mobile terminal accesses a mobile communication network (such as a 2G/3G/4G mobile communication network) on the basis of the data required for network access (including user identification information and authentication information) built into the mobile terminal, and transmits mobile communication data (including uplink and downlink mobile communication data) for services of the mobile terminal user such as web browsing and network multimedia playback.
The wireless internet module 113 of the wireless communication unit 110 implements the function of a wireless hotspot by running the protocol functions associated with a wireless hotspot. The wireless hotspot supports access by a plurality of mobile terminals (any mobile terminals other than the mobile terminal itself) and, by reusing the mobile communication connection between the mobile communication module 112 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for services of the mobile terminal users such as web browsing and network multimedia playback. Since the mobile terminal essentially transmits the mobile communication data by reusing the mobile communication connection between the mobile terminal and the communication network, the traffic of the mobile communication data consumed by the mobile terminal is counted toward the communication tariff of the mobile terminal by the charging entity on the communication network side, thereby consuming the data traffic of the mobile communication data included in the communication tariff to which the mobile terminal has subscribed.
Based on the above hardware structure of the mobile terminal 100 and the communication system, various embodiments of the method of the present invention are proposed.
FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention. The image processing method in this example is applied to a terminal. As shown in FIG. 3, the image processing method includes the following steps:
Step 301: Acquire an image data stream by using an electronic aperture, the image data stream including a plurality of frames of image data.
In this embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then performs fusion processing on the plurality of captured images. When photographing with the electronic aperture, a plurality of images must be captured continuously for fusion processing, so the individual images need to be kept aligned. To ensure the usability and user experience of the electronic aperture, this embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently hold the terminal and shoot with the electronic aperture.
At the time of shooting, an image data stream is first acquired, the image data stream including a plurality of frames of image data. Specifically, the captured original image data stream is first acquired synchronously; data reading and image preprocessing are then performed on the original image data stream to obtain the image data stream. Here, owing to the Image Signal Processing (ISP) pipeline of the camera and unpredictable changes in the external environment, the frames of original image data in the synchronously acquired original image data stream differ in illumination, noise, sharpness, and focus point. Before the fusion processing is performed, the frames of original image data in the original image data stream need to be preprocessed through the necessary preprocessing steps. Here, the preprocessing includes image filtering to remove noise and contrast stretching to improve image sharpness and to even out illumination differences between images. After such preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
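The preprocessing described above (filtering to suppress noise, followed by contrast stretching) can be sketched as follows. This is a minimal illustration rather than the patented implementation: it assumes grayscale frames held in NumPy arrays, with a plain box filter standing in for the image filter and a linear min-max stretch standing in for the contrast stretch.

```python
import numpy as np

def mean_filter(img, k=3):
    """k x k box filter to suppress noise; borders handled by edge padding."""
    pad = k // 2
    padded = np.pad(img.astype(np.float64), pad, mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for dy in range(-pad, pad + 1):
        for dx in range(-pad, pad + 1):
            out += padded[pad + dy: pad + dy + img.shape[0],
                          pad + dx: pad + dx + img.shape[1]]
    return out / (k * k)

def contrast_stretch(img, lo=0.0, hi=255.0):
    """Linearly stretch intensities to the full [lo, hi] range."""
    mn, mx = img.min(), img.max()
    if mx == mn:
        return np.full(img.shape, lo, dtype=np.float64)
    return (img - mn) / (mx - mn) * (hi - lo) + lo

def preprocess(frame):
    """Filter first, then stretch, as in the preprocessing step above."""
    return contrast_stretch(mean_filter(frame))
```

In practice each frame of the original image data stream would be passed through `preprocess` before registration.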
Step 302: Register the frames of image data in the image data stream.
In this embodiment of the present invention, after the frames of image data in the image data stream are registered, the pixels at the same spatial positions in the frames of image data are aligned, which avoids blurring in the subsequent image fusion.
In this embodiment of the present invention, registration is also referred to as alignment. There are many algorithms for image alignment, falling mainly into local-feature-based methods and global-feature-based methods. A typical local-feature-based method extracts key feature points of the images, uses these key feature points to compute the mapping matrix of the image spatial alignment model, and finally uses the mapping matrix to align the images. The registration performance of such methods can generally meet the requirements of many scenarios, such as illumination changes (synthesis of differently exposed images), large image offsets (panoramic image stitching), and low-light images (increased noise). The other class consists of search-based alignment methods built on global mutual-information matching, which can reduce matching errors caused by random feature points.
The optical flow field is likewise a point-based matching method. Optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane; it is a method that uses the changes of pixels in the time domain and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, and thereby computes the motion information of objects between adjacent frames. The purpose of studying the optical flow field is to approximate, from an image sequence, the motion field that cannot be obtained directly. Here, the motion field is simply the motion of objects in the three-dimensional real world, while the optical flow field is the projection of the motion field onto the two-dimensional image plane (the human eye or a camera).
Finding, for a sequence of pictures, the motion speed and motion direction of every pixel in each image yields the optical flow field. As shown in FIG. 4, if the position of point A in frame T is (x1, y1), and point A is then found again in frame T+1 at position (x2, y2), the motion vector of point A can be determined as: V = (x2, y2) - (x1, y1).
Finding the position of point A in frame t+1 can be achieved by the Lucas-Kanade optical flow method; the basic procedure is as follows.
Assume that the color of an object does not change greatly or noticeably between two consecutive frames. Based on this idea, an image constraint equation can be obtained. Different optical flow algorithms solve the optical flow problem under different additional assumptions. Using partial derivatives with respect to the spatial and temporal coordinates, the image constraint equation can be written as:
I(x, y, t) = I(x + dx, y + dy, t + dt)                    (1a)
where I(x, y, t) is the pixel value of the image at position (x, y) at time t,
and where the displacement (dx, dy) is the motion of the pixel during the time interval dt.
Assuming the movement is sufficiently small, applying the Taylor expansion to the image constraint equation yields:
I(x + dx, y + dy, t + dt) = I(x, y, t) + (∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt + HOT                    (2a)
where HOT denotes the higher-order terms, which can be neglected when the movement is sufficiently small. From equation (2a) it follows that:
(∂I/∂x)dx + (∂I/∂y)dy + (∂I/∂t)dt = 0                    (3a)
Vx = dx/dt, Vy = dy/dt                    (4a)
Ix = ∂I/∂x, Iy = ∂I/∂y, It = ∂I/∂t                    (5a)
where Vx and Vy are the x and y components of the optical flow vector of I(x, y, t), and Ix and Iy are the differences of the image at (x, y, t) in the corresponding directions. Hence:
Ix*Vx + Iy*Vy = -It                    (6a)
∇I · V = -It, with ∇I = (Ix, Iy) and V = (Vx, Vy)                    (7a)
There are two unknowns in the above equation, so at least two independent equations are needed to solve it. The Lucas-Kanade optical flow method assumes that spatially neighboring pixels move consistently: neighboring points in a scene project onto neighboring points in the image, and those neighboring points share the same velocity. This assumption is specific to the Lucas-Kanade optical flow method, because the basic optical flow equation provides only one constraint while the velocities in the x and y directions constitute two unknown variables. Assuming that the neighborhood of a feature point undergoes similar motion, n equations can be combined to solve for the velocity in the x and y directions (n being the total number of points in the neighborhood of the feature point, including the feature point itself). The following equations are obtained:
Ix1*Vx + Iy1*Vy = -It1
Ix2*Vx + Iy2*Vy = -It2
...
Ixn*Vx + Iyn*Vy = -Itn                    (8a)

which can be written in matrix form as A·v = -b, where

A = [Ix1 Iy1; Ix2 Iy2; ...; Ixn Iyn], v = [Vx; Vy], b = [It1; It2; ...; Itn]                    (9a)
To solve this overdetermined system, the least squares method is used:
A^T·A·v = A^T·(-b)                    (10a)
from which the optical flow vector V is obtained:
v = (A^T·A)^(-1)·A^T·(-b)                    (11a)
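The least-squares solution above can be sketched in a few lines, assuming the derivatives Ix, Iy, It have already been computed at the n points of the neighborhood. NumPy is used for illustration, and the function name is illustrative rather than taken from the source.

```python
import numpy as np

def lucas_kanade_step(Ix, Iy, It):
    """Solve the least-squares system for one neighborhood.

    Ix, Iy, It are the spatial/temporal derivatives at the n points of the
    neighborhood, flattened into length-n vectors; returns (Vx, Vy)."""
    A = np.stack([np.ravel(Ix), np.ravel(Iy)], axis=1)  # n x 2 matrix
    b = np.ravel(It)                                    # length-n vector
    # Normal equations: (A^T A) v = A^T (-b)
    return np.linalg.solve(A.T @ A, A.T @ (-b))
```

With noise-free derivatives that satisfy the constraint exactly, the true flow is recovered; with real data the result is the least-squares best fit over the neighborhood.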
The scheme above relies on the small-motion assumption, which breaks down when the target moves quickly; a multi-scale approach solves this problem. First, a Gaussian pyramid is built for each frame, with the coarsest-scale image at the very top and the original image at the bottom. Then, starting from the top level, the position in the next frame is estimated and used as the initial position for the next level down; the search proceeds down the pyramid, repeating the estimation, until the bottom of the pyramid is reached. Such a search can quickly locate the motion direction and position of a pixel.
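The pyramid construction described above can be sketched as follows. This is an illustrative sketch rather than the patented implementation: a plain 2x2 average stands in for the Gaussian blur-and-downsample step, and the coarse-to-fine flow refinement loop itself is omitted for brevity.

```python
import numpy as np

def build_pyramid(img, levels=3):
    """Index 0 is the original (bottom); the coarsest level sits at index -1.

    Each level up averages 2x2 blocks, which both anti-aliases and halves
    the resolution, so large motions shrink to small ones at the top."""
    pyramid = [img.astype(np.float64)]
    for _ in range(levels - 1):
        cur = pyramid[-1]
        h, w = cur.shape[0] // 2 * 2, cur.shape[1] // 2 * 2  # even crop
        down = (cur[0:h:2, 0:w:2] + cur[1:h:2, 0:w:2] +
                cur[0:h:2, 1:w:2] + cur[1:h:2, 1:w:2]) / 4.0
        pyramid.append(down)
    return pyramid
```

In the coarse-to-fine scheme, flow is estimated at the top level, doubled, and used as the starting point one level below, down to the original resolution.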
FIG. 6 shows the procedure for aligning multiple frames of images using optical flow. With the optical-flow computation described above, sparse matching points between two images can be obtained, and the coordinates of these points are then used to compute the image mapping model. In the image alignment step, choosing the correct image alignment transformation model is important. Common spatial transformation models include the affine transformation and the perspective transformation.
The affine transformation can be pictured as follows: any parallelogram in a plane can be mapped by an affine transformation to another parallelogram. The image mapping operates within the same spatial plane, and different transformation parameters deform the image into different types of parallelograms.
[x'; y'; 1] = [a00 a01 a02; a10 a11 a12; 0 0 1] · [x; y; 1]                    (12a)
The perspective transformation is a more general transformation model. Compared with the affine transformation, the perspective transformation is more flexible: a perspective transformation can turn a rectangle into a trapezoid. It describes the projection of one plane in space onto another spatial plane, and the affine transformation is a special case of the perspective transformation.
[x'·w; y'·w; w] = [a00 a01 a02; a10 a11 a12; a20 a21 1] · [x; y; 1]                    (13a)
The meanings of the elements of the above matrix are as follows:
a02 and a12 are the translation parameters;
a00, a01, a10, and a11 are the scaling and rotation parameters;
a20 and a21 are the amounts of deformation in the horizontal and vertical directions.
The perspective transformation model is chosen here mainly because, when a handheld terminal such as a mobile phone captures multiple images in succession, the shake motions of the phone are generally not confined to a single plane; a simplified motion model is shown in FIG. 5.
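Applying the perspective model above to pixel coordinates amounts to a matrix product in homogeneous coordinates followed by division by the w component; the affine model is the special case in which a20 = a21 = 0. A minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def apply_perspective(H, pts):
    """Map N x 2 pixel coordinates through a 3x3 perspective matrix H."""
    pts = np.asarray(pts, dtype=np.float64)
    homog = np.hstack([pts, np.ones((pts.shape[0], 1))])  # to homogeneous
    mapped = homog @ H.T                                  # rows: (x'w, y'w, w)
    return mapped[:, :2] / mapped[:, 2:3]                 # divide by w
```

In the alignment step, H would be estimated from the sparse optical-flow matches and then used to warp every pixel of a frame onto the reference.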
Step 303: Perform fusion processing on the registered frames of image data to obtain a target image.
In this embodiment of the present invention, after the frames of image data have been registered (that is, aligned), the images need to be fused; here, a fusion method that superimposes the image pixels one frame after another is adopted. The fusion is performed according to formula (14a):
I_k = ((k - 1)·I_(k-1) + I_m) / k, k = 1, 2, ..., N                    (14a)
where I denotes an image, m indexes the m-th image, k is the number of images that have already been composited, and N is the total number of images to be composited.
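Since the exact form of formula (14a) is not reproducible from the source, the sketch below assumes equal-weight sequential superposition, in which after the k-th step the accumulator holds the average of the first k registered frames; this is an illustrative assumption, not necessarily the patented weighting.

```python
import numpy as np

def fuse_frames(frames):
    """Sequentially superimpose registered frames into one target image."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    for k, frame in enumerate(frames, start=1):
        # running average: fold the k-th frame into the first k-1 frames
        acc = ((k - 1) * acc + frame) / k
    return acc
```

Because the frames have been aligned beforehand, each pixel of the accumulator combines samples of the same scene point, which is what suppresses noise and produces the long-exposure effect.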
The technical solution of this embodiment of the present invention provides an image processing method that uses the image registration principle to align multiple frames of image data to be composited. This alignment method tolerates image shake errors within a certain range, and the final composite image exhibits only a small pixel deviation.
FIG. 7 is a schematic flowchart of an image processing method according to Embodiment 2 of the present invention. The image processing method in this example is applied to a terminal. As shown in FIG. 7, the image processing method includes the following steps:
Step 701: In the handheld mode, capture an original image data stream by using the electronic aperture, the original image data stream including a plurality of frames of original image data; preprocess the frames of original image data in the original image data stream to obtain the image data stream.
In this embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal performs continuous, uninterrupted shooting during the exposure time and then performs fusion processing on the plurality of captured images. When photographing with the electronic aperture, a plurality of images must be captured continuously for fusion processing, so the individual images need to be kept aligned. To ensure the usability and user experience of the electronic aperture, this embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently hold the terminal and shoot with the electronic aperture.
At the time of shooting, an image data stream is first acquired, the image data stream including a plurality of frames of image data. Specifically, the captured original image data stream is first acquired synchronously; data reading and image preprocessing are then performed on the original image data stream to obtain the image data stream. Here, owing to the camera's ISP processing pipeline and unpredictable changes in the external environment, the frames of original image data in the synchronously acquired original image data stream differ in illumination, noise, sharpness, and focus point. Before the fusion processing is performed, the frames of original image data in the original image data stream need to be preprocessed through the necessary preprocessing steps. Here, the preprocessing includes image filtering to remove noise and contrast stretching to improve image sharpness and to even out illumination differences between images. After such preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
Step 702: Taking the first frame of image data in the image data stream as a reference frame, align each frame of image data in the image data stream other than the first frame of image data with the reference frame.
In this embodiment of the present invention, after the frames of image data in the image data stream are registered, the pixels at the same spatial positions in the frames of image data are aligned, which avoids blurring in the subsequent image fusion.
In this embodiment of the present invention, registration is also referred to as alignment. There are many algorithms for image alignment, falling mainly into local-feature-based methods and global-feature-based methods. A typical local-feature-based method extracts key feature points of the images, uses these key feature points to compute the mapping matrix of the image spatial alignment model, and finally uses the mapping matrix to align the images. The registration performance of such methods can generally meet the requirements of many scenarios, such as illumination changes (synthesis of differently exposed images), large image offsets (panoramic image stitching), and low-light images (increased noise). The other class consists of search-based alignment methods built on global mutual-information matching, which can reduce matching errors caused by random feature points.
The optical flow field is likewise a point-based matching method. Optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane; it is a method that uses the changes of pixels in the time domain and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, and thereby computes the motion information of objects between adjacent frames.
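The align-to-reference step of this embodiment can be sketched as follows. For illustration only: it assumes each frame's displacement relative to the first frame has already been estimated by the registration algorithm and happens to be a whole-pixel translation, whereas the embodiment itself uses a full perspective warp; the wrap-around border of `np.roll` is a further simplification.

```python
import numpy as np

def align_to_reference(frames, shifts):
    """Undo each frame's estimated (dy, dx) displacement so that every
    frame lines up with frames[0]; shifts[0] is (0, 0) for the reference."""
    aligned = []
    for frame, (dy, dx) in zip(frames, shifts):
        aligned.append(np.roll(frame, shift=(-dy, -dx), axis=(0, 1)))
    return aligned
```

After this step, pixels at the same array coordinates refer to the same scene point in every frame, which is the precondition for the pixel-wise superposition of Step 704.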
Step 703: Display a prompt box on the display interface, the prompt box being used to indicate the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
Referring to FIG. 8, a user prompt box is displayed on the application interaction interface. The prompt box can indicate to the user the current direction of handheld shake, making it easy for the user to correct it in time.
Step 704: Superimpose the pixels of the registered frames of image data according to their corresponding spatial positions to obtain the target image.
In this embodiment of the present invention, after the frames of image data have been registered (that is, aligned), the images need to be fused; here, a fusion method that superimposes the image pixels one frame after another is adopted.
The technical solution of this embodiment of the present invention provides an image processing method that uses the image registration principle to align multiple frames of image data to be composited. This alignment method tolerates image shake errors within a certain range, and the final composite image exhibits only a small pixel deviation. In addition, a prompt box is added during electronic aperture shooting to prevent the terminal from shaking over too large a range while the user is shooting.
FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention. As shown in FIG. 9, the terminal includes:
an acquisition unit 91 configured to acquire an image data stream by using an electronic aperture, the image data stream including a plurality of frames of image data;
a registration unit 92 configured to register the frames of image data in the image data stream; and
a fusion unit 93 configured to perform fusion processing on the registered frames of image data to obtain a target image.
The acquisition unit 91 includes:
a shooting subunit 911 configured to capture, in the handheld mode, an original image data stream by using the electronic aperture, the original image data stream including a plurality of frames of original image data; and
a preprocessing subunit 912 configured to preprocess the frames of original image data in the original image data stream to obtain the image data stream, the preprocessing including at least one of the following: image filtering and contrast stretching.
The terminal further includes:
a prompt unit 94 configured to display a prompt box on the display interface, the prompt box being used to indicate the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
The registration unit 92 is further configured to take the first frame of image data in the image data stream as a reference frame and align each frame of image data in the image data stream other than the first frame of image data with the reference frame, where the alignment refers to aligning pixels at the same spatial positions.
所述融合单元93,还配置为将配准后的各帧图像数据的各个像素点按照空间位置对应进行叠加。The merging unit 93 is further configured to superimpose each pixel point of each frame image data after registration according to spatial position correspondence.
It should be noted that the above description of the terminal embodiment is similar to the description of the method embodiments and has similar beneficial effects. For technical details not disclosed in the terminal embodiments of the present invention, refer to the description of the method embodiments of the present invention.
FIG. 10 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention. The terminal includes a processor 1001, a camera 1002, and a display screen 1003, all connected through a bus 1004.
The camera 1002 is configured to acquire an image data stream by using an electronic aperture, the image data stream including multiple frames of image data.
The processor 1001 is configured to register the frames of image data in the image data stream, and to perform fusion processing on the registered frames of image data to obtain a target image.
Specifically, the camera 1002 is configured to capture, in a handheld mode, an original image data stream by using the electronic aperture, the original image data stream including multiple frames of original image data.
The processor 1001 is configured to preprocess each frame of original image data in the original image data stream to obtain the image data stream, where the preprocessing includes at least one of the following: image filtering and contrast stretching; to register the frames of image data in the image data stream; and to perform fusion processing on the registered frames of image data to obtain the target image.
The display screen 1003 is configured to display a prompt box on the display interface, where the prompt box indicates the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention. The image processing method in this example is applied to a terminal. As shown in FIG. 11, the image processing method includes the following steps:
Step 1101: Acquire an image data stream by using an electronic aperture, the image data stream including multiple frames of image data.
In this embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function with an electronic aperture mode; to shoot with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal shoots continuously and without interruption during the exposure time and then fuses the captured images. Because shooting with an electronic aperture requires capturing multiple images in succession for fusion, the images must be kept aligned. To ensure the ease of use and user experience of the electronic aperture, this embodiment of the present invention adds a handheld mode for electronic aperture shooting, in which the user can conveniently hold the terminal and shoot with the electronic aperture.
During shooting, an image data stream is first acquired, the image data stream including multiple frames of image data. Specifically, the captured original image data stream is first acquired synchronously; data reading and image preprocessing are then performed on the original image data stream to obtain the image data stream. Because of the camera's ISP pipeline and unpredictable changes in the external environment, the frames of original image data in the synchronously acquired stream differ in illumination, noise, sharpness, and focus. Before fusion processing, each frame of original image data must therefore be preprocessed: image filtering removes noise, and contrast stretching improves image sharpness and reduces the illumination differences between images. After this preprocessing, the differences between the frames of image data in the image data stream are reduced, which improves the effectiveness of the subsequent image registration algorithm.
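As an illustration of the preprocessing described above, the sketch below applies a simple mean filter for noise suppression and a min-max contrast stretch to each frame. It is a hypothetical NumPy example; the function names and the choice of filter are the editor's own, not the patent's.

```python
import numpy as np

def mean_filter(frame, k=3):
    """Suppress noise with a simple k x k mean filter (edge-padded borders)."""
    pad = k // 2
    padded = np.pad(frame.astype(np.float64), pad, mode="edge")
    out = np.zeros(frame.shape, dtype=np.float64)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return out / (k * k)

def contrast_stretch(frame, lo=0.0, hi=255.0):
    """Linearly stretch pixel values to the [lo, hi] range."""
    fmin, fmax = frame.min(), frame.max()
    if fmax == fmin:                       # flat frame: nothing to stretch
        return np.full_like(frame, lo, dtype=np.float64)
    return (frame - fmin) / (fmax - fmin) * (hi - lo) + lo

def preprocess(frames):
    """Preprocess every frame of the raw stream: filter, then stretch."""
    return [contrast_stretch(mean_filter(f)) for f in frames]
```

After this step the frames of the stream differ less in noise and illumination, which helps the subsequent registration.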
Step 1102: Determine a reference frame from the image data stream, and register each frame of image data in the image data stream other than the reference frame with the reference frame.
In this embodiment of the present invention, after the frames of image data in the image data stream are registered, pixels at the same spatial position in the frames are aligned, which avoids blurring in the subsequent image fusion.
In this embodiment of the present invention, registration is also called alignment. There are many image alignment algorithms, mainly divided into methods based on local features and methods based on global features. A typical local-feature method extracts key feature points from the images, uses these key points to compute the mapping matrix of an image-space alignment model, and finally uses the mapping matrix to align the images. The registration quality of such methods generally meets the requirements of many complex scenes, such as illumination changes (fusing differently exposed images), large image shifts (panoramic image stitching), and low-light images (increased noise). The other class consists of search-based alignment methods built on global mutual-information matching, which can reduce the matching errors caused by random feature points.
The optical flow field also underlies a point-based matching algorithm. Optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane. It uses the variation of pixels in the time domain and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, and thereby computes the motion information of objects between adjacent frames. The purpose of studying the optical flow field is to approximate, from an image sequence, the motion field that cannot be obtained directly. The motion field is the motion of objects in the three-dimensional real world; the optical flow field is the projection of the motion field onto the two-dimensional image plane (the human eye or a camera).
Finding the velocity and direction of motion of every pixel in every image of an image sequence yields the optical flow field. As shown in FIG. 4, if the position of point A in frame T is (x1, y1) and its position in frame T+1 is (x2, y2), then the motion vector of point A is V = (x2, y2) - (x1, y1).
The position of point A in frame T+1 can be found by the Lucas-Kanade optical flow method. The basic process is as follows:
Assume that the color of an object does not change greatly or abruptly between two consecutive frames. Based on this idea, an image constraint equation can be obtained. Different optical flow algorithms solve the optical flow problem under different additional assumptions. Using partial derivatives with respect to the spatial and temporal coordinates, the image constraint equation can be written as:
I(x, y, t) = I(x + dx, y + dy, t + dt)    (1b)
where I(x, y, t) is the pixel value of the image at position (x, y) at time t.
where the pixel at (x, y) is displaced by (dx, dy) over the time interval dt:
dx = Vx·dt, dy = Vy·dt
Assuming the movement is small enough, applying a Taylor expansion to the image constraint equation gives:
I(x + dx, y + dy, t + dt) = I(x, y, t) + (∂I/∂x)·dx + (∂I/∂y)·dy + (∂I/∂t)·dt + H.O.T.    (2b)
where H.O.T. denotes the higher-order terms, which can be ignored when the movement is sufficiently small. From equation (2b) it follows that:
(∂I/∂x)·dx + (∂I/∂y)·dy + (∂I/∂t)·dt = 0    (3b)
Vx = dx/dt, Vy = dy/dt    (4b)
(∂I/∂x)·Vx + (∂I/∂y)·Vy + ∂I/∂t = 0    (5b)
where Vx and Vy are the x and y components of the optical flow vector of I(x, y, t), and Ix, Iy, and It are the differences of the image at (x, y, t) in the corresponding directions. Therefore:
Ix·Vx + Iy·Vy = -It    (6b)
which in vector form reads:
(∇I)ᵀ·v = -It,  with ∇I = (Ix, Iy)ᵀ and v = (Vx, Vy)ᵀ    (7b)
The above equation has two unknowns, so at least two independent equations are needed to solve it. The Lucas-Kanade method assumes that pixels in a spatial neighborhood move consistently: neighboring points in a scene project to neighboring points in the image, and those neighboring points have the same velocity. This assumption is specific to the Lucas-Kanade method, because the basic optical flow equation provides only one constraint while the velocities in the x and y directions are two unknowns. Assuming similar motion within the neighborhood of a feature point, n equations can be set up jointly to solve for the velocity in the x and y directions (n is the total number of points in the neighborhood, including the feature point itself). This yields the following equations:
Ix(q1)·Vx + Iy(q1)·Vy = -It(q1)
Ix(q2)·Vx + Iy(q2)·Vy = -It(q2)
...
Ix(qn)·Vx + Iy(qn)·Vy = -It(qn)    (8b)
where q1, q2, ..., qn are the points in the neighborhood. In matrix form this is A·v = b, with:
A = [Ix(q1) Iy(q1); Ix(q2) Iy(q2); ...; Ix(qn) Iy(qn)],  v = [Vx; Vy],  b = [-It(q1); -It(q2); ...; -It(qn)]    (9b)
To solve this overdetermined system, the least squares method is used:
AᵀA·v = Aᵀb    (10b)
from which the optical flow vector v is obtained:
v = (AᵀA)⁻¹·Aᵀb    (11b)
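The least-squares solution of the Lucas-Kanade system above can be written in a few lines of NumPy. The following sketch is illustrative only: the gradients Ix and Iy are synthetic, and It is constructed to be consistent with a known velocity so that the recovery can be checked.

```python
import numpy as np

def lucas_kanade_velocity(Ix, Iy, It):
    """Solve Ix*Vx + Iy*Vy = -It over a neighborhood by least squares."""
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)  # n x 2 gradient matrix
    b = -It.ravel()                                 # n-vector of -It values
    v, *_ = np.linalg.lstsq(A, b, rcond=None)       # v = (A^T A)^-1 A^T b
    return v                                        # (Vx, Vy)

# Synthetic 5 x 5 neighborhood whose true velocity is (Vx, Vy) = (2, -1):
rng = np.random.default_rng(0)
Ix = rng.normal(size=(5, 5))
Iy = rng.normal(size=(5, 5))
It = -(2.0 * Ix - 1.0 * Iy)   # temporal difference consistent with (2, -1)
v = lucas_kanade_velocity(Ix, Iy, It)
```

Here v recovers (2, -1) up to numerical precision; in practice the gradients come from image differences and the system is solved per feature point.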
The scheme above relies on the small-motion assumption, which breaks down when the target moves quickly; a multi-scale approach solves this problem. First, a Gaussian pyramid is built for each frame, with the coarsest-scale image at the top layer and the original image at the bottom. Then, starting from the top layer, the position in the next frame is estimated and used as the initial position for the layer below; the search proceeds down the pyramid, repeating the estimation until the bottom layer is reached. This search quickly locates the direction and position of the pixel motion.
FIG. 6 shows the flow of aligning multiple frames using optical flow. With the optical flow field computation, sparse matching points between two images are obtained, and the coordinates of these points are then used to compute the image mapping model. In the image alignment step, choosing the correct alignment transformation model is important. Common spatial transformation models include the affine transformation and the perspective transformation.
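As a minimal illustration of this step, the sketch below fits a 6-parameter affine mapping model to sparse point matches by least squares. This is hypothetical NumPy code with synthetic matches, not the patent's implementation; the embodiment ultimately fits a perspective model from optical-flow matches, which is analogous but has eight parameters.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of [x', y'] = [[a00, a01, a02], [a10, a11, a12]] @ [x, y, 1]."""
    M = np.hstack([src, np.ones((src.shape[0], 1))])    # n x 3 design matrix
    px, *_ = np.linalg.lstsq(M, dst[:, 0], rcond=None)  # a00, a01, a02
    py, *_ = np.linalg.lstsq(M, dst[:, 1], rcond=None)  # a10, a11, a12
    return np.vstack([px, py])                          # 2 x 3 affine matrix

def warp_points(A, pts):
    """Apply a 2 x 3 affine matrix to an n x 2 array of points."""
    return pts @ A[:, :2].T + A[:, 2]

# Sparse matches generated by a known shear-plus-shift transform:
true_A = np.array([[1.0, 0.2,  5.0],
                   [0.1, 1.0, -3.0]])
src = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 7.0]])
dst = warp_points(true_A, src)   # coordinates of the matched points
A_est = fit_affine(src, dst)
```

With real optical-flow matches the fit is done robustly (e.g. with outlier rejection), since mismatched points would otherwise bias the model.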
The affine transformation can be visualized as follows: any parallelogram in a plane can be mapped by an affine transformation to another parallelogram. The mapping operates within the same spatial plane, and different transformation parameters deform the shape into different types of parallelograms.
[x'; y'] = [a00 a01 a02; a10 a11 a12] · [x; y; 1]    (12b)
The perspective (projective) transformation is a more general transformation model. Compared with the affine transformation, it is more flexible: a perspective transformation can turn a rectangle into a trapezoid. It describes the projection of one plane in space onto another plane; the affine transformation is a special case of the perspective transformation.
[x'; y'; w'] = [a00 a01 a02; a10 a11 a12; a20 a21 1] · [x; y; 1],  with the mapped point given by (x'/w', y'/w')    (13b)
The elements of the above matrix have the following meanings:
a02 and a12 are the displacement (translation) parameters;
a00, a01, a10, and a11 are the scaling and rotation parameters;
a20 and a21 are the deformation amounts in the horizontal and vertical directions.
Here the perspective transformation model is chosen, mainly because when a handheld terminal such as a mobile phone captures multiple images in succession, the shake motion of the phone is generally not confined to a single plane. A simplified motion model is shown in FIG. 5.
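The rectangle-to-trapezoid behavior of the perspective model can be checked numerically. The sketch below (with illustrative matrix values chosen by the editor) applies a 3x3 perspective matrix that has a nonzero horizontal deformation term to the corners of a unit square, including the homogeneous divide:

```python
import numpy as np

def perspective_map(H, pts):
    """Apply a 3 x 3 perspective matrix to n x 2 points, with homogeneous divide."""
    hom = np.hstack([pts, np.ones((pts.shape[0], 1))])   # to homogeneous coords
    mapped = hom @ H.T
    return mapped[:, :2] / mapped[:, 2:3]                # divide by w'

H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.5, 0.0, 1.0]])    # nonzero horizontal deformation term a20
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
quad = perspective_map(H, square)
```

The two vertical edges remain parallel but end up with different lengths (1 versus 2/3), so the square maps to a trapezoid; with both deformation terms set to zero, the same matrix reduces to an affine map.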
Step 1103: When performing fusion processing on the registered frames of image data, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame, and then perform the fusion to obtain the target image.
In this embodiment of the present invention, after the frames of image data are registered (that is, aligned), the images need to be fused. Here, a fusion method is adopted in which the image pixels are superimposed in sequence, according to formula (14b):
I_k = (1/k) · Σ (m = 1 to k) I_m,  k = 1, 2, ..., N    (14b)
where I_m is the m-th image, k is the number of images already merged, and N is the total number of images to be merged.
Referring to FIG. 13, black borders form at the image boundaries after the image mapping transformation. If the pixels at these black borders are fused by weighted averaging, they create brightness differences that affect the overall appearance of the image. Taking the first frame of image data in the image data stream as the reference frame as an example, all other frames are registered and fused against the first frame, so the reference frame itself has no black-border pixels. For image data with black borders, the pixels at the black borders are replaced with the pixels at the corresponding positions in the reference frame before participating in the weighted average, which effectively solves the black-border problem.
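A minimal sketch of the black-border handling described above (hypothetical code, not the patent's implementation): pixels that are exactly zero in a warped frame are substituted with the reference-frame pixels before the frames are averaged. Detecting the border as exact zeros is a simplification; a genuinely black scene pixel would be misclassified by this test.

```python
import numpy as np

def fuse_with_border_fix(reference, warped_frames):
    """Average the reference frame and the warped frames; zero (black-border)
    pixels in a warped frame are replaced by reference pixels before averaging."""
    acc = reference.astype(np.float64).copy()
    for frame in warped_frames:
        f = frame.astype(np.float64)
        f = np.where(f == 0.0, reference, f)   # substitute reference pixels
        acc += f
    return acc / (1 + len(warped_frames))
```

Without the substitution, the averaged border would be visibly darker than the rest of the composite.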
The technical solution of this embodiment of the present invention provides an image processing method that uses the image registration principle to align the multiple frames of image data to be fused. The alignment tolerates image shake errors within a certain range, so the pixel deviation in the final composite is small. More importantly, during image fusion the black pixel borders caused by image registration are handled, ensuring natural pixel transitions at every position of the image.
FIG. 12 is a schematic flowchart of an image processing method according to Embodiment 4 of the present invention. The image processing method in this example is applied to a terminal. As shown in FIG. 12, the image processing method includes the following steps:
Step 1201: In a handheld mode, capture an original image data stream by using an electronic aperture, the original image data stream including multiple frames of original image data; and preprocess each frame of original image data in the original image data stream to obtain the image data stream.
In this embodiment of the present invention, the terminal may be an electronic device such as a mobile phone or a tablet computer. The terminal has a photographing function with an electronic aperture mode; to shoot with the electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
In the electronic aperture mode, after the user adjusts the aperture value, the terminal shoots continuously and without interruption during the exposure time and then fuses the captured images. Because shooting with an electronic aperture requires capturing multiple images in succession for fusion, the images must be kept aligned. To ensure the ease of use and user experience of the electronic aperture, this embodiment of the present invention adds a handheld mode for electronic aperture shooting, in which the user can conveniently hold the terminal and shoot with the electronic aperture.
During shooting, an image data stream is first acquired, the image data stream including multiple frames of image data. Specifically, the captured original image data stream is first acquired synchronously; data reading and image preprocessing are then performed on the original image data stream to obtain the image data stream. Because of the camera's ISP processing pipeline and unpredictable changes in the external environment, the frames of original image data in the synchronously acquired stream differ in illumination, noise, sharpness, and focus. Before fusion processing, each frame of original image data must therefore be preprocessed: image filtering removes noise, and contrast stretching improves image sharpness and reduces the illumination differences between images. After this preprocessing, the differences between the frames of image data in the image data stream are reduced, which improves the effectiveness of the subsequent image registration algorithm.
Step 1202: Use the first frame of image data in the image data stream as a reference frame, and align each frame of image data in the image data stream other than the first frame with the reference frame.
In this embodiment of the present invention, after the frames of image data in the image data stream are registered, pixels at the same spatial position in the frames are aligned, which avoids blurring in the subsequent image fusion.
In this embodiment of the present invention, registration is also called alignment. There are many image alignment algorithms, mainly divided into methods based on local features and methods based on global features. A typical local-feature method extracts key feature points from the images, uses these key points to compute the mapping matrix of an image-space alignment model, and finally uses the mapping matrix to align the images. The registration quality of such methods generally meets the requirements of many complex scenes, such as illumination changes (fusing differently exposed images), large image shifts (panoramic image stitching), and low-light images (increased noise). The other class consists of search-based alignment methods built on global mutual-information matching, which can reduce the matching errors caused by random feature points.
The optical flow field also underlies a point-based matching algorithm. Optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane. It uses the variation of pixels in the time domain and the correlation between adjacent frames to find the correspondence between the previous frame and the current frame, and thereby computes the motion information of objects between adjacent frames.
Step 1203: Display a prompt box on the display interface, where the prompt box indicates the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
Referring to FIG. 8, a user prompt box is displayed on the application interaction interface. The prompt box indicates to the user the current direction of handheld shake, so that the user can correct it in time.
Step 1204: Analyze the registered frames of image data to determine the black regions at the boundaries of each frame; when performing fusion processing on the registered frames, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame, and then perform the fusion to obtain the target image.
In this embodiment of the present invention, replacing the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame and then performing the fusion to obtain the target image includes:
determining, according to the black regions at the image boundaries, the reference regions in the reference frame corresponding to the black regions; and
replacing the pixels of the black regions at the image boundaries with the pixels of the reference regions in the reference frame, and then performing the fusion to obtain the target image.
Here, the black regions at the image boundaries can be segmented by a clustering algorithm, for example the K-means algorithm.
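As a sketch of how such clustering might look, the following hypothetical example runs a tiny one-dimensional K-means with two clusters on the pixel intensities of a warped frame and returns the mask of the darker cluster. In practice the result would additionally be intersected with connectivity to the image border, so that dark scene content is not mistaken for a warp border.

```python
import numpy as np

def darker_cluster_mask(values, iters=20):
    """Tiny 1-D K-means with K=2 on pixel intensities; returns the boolean
    mask of the darker cluster."""
    c_dark, c_bright = values.min(), values.max()     # initial cluster centers
    dark = values <= (c_dark + c_bright) / 2.0
    for _ in range(iters):
        dark = np.abs(values - c_dark) <= np.abs(values - c_bright)
        if dark.all() or not dark.any():              # degenerate: one cluster
            break
        c_dark, c_bright = values[dark].mean(), values[~dark].mean()
    return dark

frame = np.full((4, 6), 120.0)    # uniform bright frame...
frame[:, :2] = 0.0                # ...with a black left border left by the warp
dark_mask = darker_cluster_mask(frame.ravel()).reshape(frame.shape)
```

The resulting mask marks the border columns, which can then be filled from the reference frame before fusion.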
In this embodiment of the present invention, after the frames of image data are registered (that is, aligned), the images need to be fused. Here, a fusion method is adopted in which the image pixels are superimposed in sequence.
Referring to FIG. 13, black borders form at the image boundaries after the image mapping transformation. If the pixels at these black borders are fused by weighted averaging, they create brightness differences that affect the overall appearance of the image. Taking the first frame of image data in the image data stream as the reference frame as an example, all other frames are registered and fused against the first frame, so the reference frame itself has no black-border pixels. For image data with black borders, the pixels at the black borders are replaced with the pixels at the corresponding positions in the reference frame before participating in the weighted average, which effectively solves the black-border problem.
The technical solution of this embodiment of the present invention provides an image processing method that uses the image registration principle to align the multiple frames of image data to be fused. The alignment tolerates image shake errors within a certain range, so the pixel deviation in the final composite is small. More importantly, during image fusion the black pixel borders caused by image registration are handled, ensuring natural pixel transitions at every position of the image. In addition, a prompt box is added during electronic aperture shooting to prevent the terminal's shake range from becoming too large while the user is shooting.
FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention. As shown in FIG. 14, the terminal includes:
an obtaining unit 41, configured to acquire an image data stream by using an electronic aperture, the image data stream including multiple frames of image data;
a registration unit 42, configured to determine a reference frame from the image data stream and register each frame of image data in the image data stream other than the reference frame with the reference frame; and
a fusion unit 43, configured to, when performing fusion processing on the registered frames of image data, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame and then perform the fusion to obtain a target image.
The fusion unit 43 includes:
an analysis subunit 431, configured to analyze the registered frames of image data and determine the black regions at the boundaries of each frame; and
a replacement and fusion subunit 432, configured to, when performing fusion processing on the registered frames of image data, replace the pixels of the black regions at the image boundaries with the corresponding pixels of the reference frame and then perform the fusion to obtain the target image.
The replacement and fusion subunit 432 is further configured to determine, according to the black regions at the image boundaries, the reference regions in the reference frame corresponding to the black regions, and to replace the pixels of the black regions at the image boundaries with the pixels of the reference regions in the reference frame before performing the fusion to obtain the target image.
The registration unit 42 is further configured to use the first frame of image data in the image data stream as the reference frame and align each frame of image data in the image data stream other than the first frame with the reference frame, where alignment means aligning pixels at the same spatial position.
The fusion unit 43 is further configured to superimpose the pixels of the registered frames of image data according to their corresponding spatial positions.
It should be noted that the above description of the terminal embodiment is similar to the description of the method embodiments and has similar beneficial effects. For technical details not disclosed in the terminal embodiments of the present invention, refer to the description of the method embodiments of the present invention.
图10为本发明实施例的终端的结构组成示意图,所述终端包括:处理器1001、相机1002、显示屏1003;所述处理器1001、相机1002以及显示屏1003均通过总线1004连接。10 is a schematic structural diagram of a terminal according to an embodiment of the present invention. The terminal includes: a processor 1001, a camera 1002, and a display screen 1003. The processor 1001, the camera 1002, and the display screen 1003 are all connected by a bus 1004.
所述相机1102,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;;The camera 1102 is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
所述处理器1001，配置为从所述图像数据流中确定出参考帧，将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准；对配准后的各帧图像数据进行融合处理时，将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理，得到目标图像。The processor 1001 is configured to determine a reference frame from the image data stream and register each frame of image data in the stream, other than the reference frame, with the reference frame; when performing fusion processing on the registered frames of image data, the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame before fusion, to obtain the target image.
具体地，所述相机1002，配置为在手持模式下，利用电子光圈拍摄得到原始图像数据流，所述原始图像数据流包括多帧原始图像数据；Specifically, the camera 1002 is configured to capture an original image data stream using an electronic aperture in handheld mode, the original image data stream comprising multiple frames of original image data;
所述处理器1001，配置为对所述原始图像数据流中的各帧原始图像数据进行预处理，得到所述图像数据流；其中，所述预处理包括以下至少之一：图像滤波、对比度拉伸；从所述图像数据流中确定出参考帧，将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准；对配准后的各帧图像数据进行融合处理时，将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理，得到目标图像。The processor 1001 is configured to preprocess each frame of original image data in the original image data stream to obtain the image data stream, where the preprocessing includes at least one of image filtering and contrast stretching; to determine a reference frame from the image data stream and register each frame of image data in the stream, other than the reference frame, with the reference frame; and, when performing fusion processing on the registered frames of image data, to replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame before fusion, to obtain the target image.
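The preprocessing step named here might look like the following sketch. The patent specifies only that preprocessing includes at least one of image filtering and contrast stretching; the 3x3 box filter and percentile stretch below are illustrative choices, not the claimed operators:

```python
import numpy as np

def preprocess(frame, low_pct=1.0, high_pct=99.0):
    """Illustrative preprocessing: a 3x3 mean (box) filter for noise
    suppression followed by percentile-based contrast stretching.
    `frame` is a 2-D uint8 array; edges wrap for brevity."""
    f = frame.astype(np.float64)
    # Image filtering: average each pixel with its 8 neighbours.
    acc = np.zeros_like(f)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            acc += np.roll(f, (dy, dx), axis=(0, 1))
    f = acc / 9.0
    # Contrast stretching: map the central percentile range to 0..255.
    lo, hi = np.percentile(f, [low_pct, high_pct])
    f = np.clip((f - lo) / max(hi - lo, 1e-9) * 255.0, 0.0, 255.0)
    return f.astype(np.uint8)
```

Stretching after filtering uses the full 0–255 range, which makes the subsequent registration's feature matching less sensitive to underexposed frames.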
所述显示屏1003，配置为在显示界面上显示提示框；所述提示框用于提示在所述手持模式下，利用电子光圈拍摄时手持抖动的方位。The display screen 1003 is configured to display a prompt box on the display interface; the prompt box indicates the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
图15为相机的电气结构框图。Figure 15 is a block diagram of the electrical structure of the camera.
镜头1211由用于形成被摄体像的多个光学镜头构成，为单焦点镜头或变焦镜头。镜头1211在镜头驱动器1221的控制下能够在光轴方向上移动，镜头驱动器1221根据来自镜头驱动控制电路1222的控制信号，控制镜头1211的焦点位置，在变焦镜头的情况下，也可控制焦点距离。镜头驱动控制电路1222按照来自微处理器1217的控制命令进行镜头驱动器1221的驱动控制。The lens 1211 is composed of a plurality of optical lenses for forming a subject image, and may be a single-focus lens or a zoom lens. The lens 1211 is movable in the optical-axis direction under the control of the lens driver 1221; the lens driver 1221 controls the focus position of the lens 1211 in accordance with a control signal from the lens drive control circuit 1222 and, in the case of a zoom lens, can also control the focal length. The lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with control commands from the microprocessor 1217.
在镜头1211的光轴上、由镜头1211形成的被摄体像的位置附近配置有摄像元件1212。摄像元件1212用于对被摄体像摄像并取得摄像图像数据。在摄像元件1212上二维且呈矩阵状配置有构成各像素的光电二极管。各光电二极管产生与受光量对应的光电转换电流，该光电转换电流由与各光电二极管连接的电容器进行电荷蓄积。各像素的前表面配置有拜耳排列的红绿蓝（RGB）滤色器。An imaging element 1212 is disposed on the optical axis of the lens 1211, near the position where the subject image is formed by the lens 1211. The imaging element 1212 captures the subject image and acquires captured image data. Photodiodes constituting the pixels are arranged two-dimensionally, in a matrix, on the imaging element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and that current is accumulated as charge by a capacitor connected to the photodiode. A Bayer-arranged red-green-blue (RGB) color filter is disposed on the front surface of each pixel.
摄像元件1212与摄像电路1213连接，该摄像电路1213在摄像元件1212中进行电荷蓄积控制和图像信号读出控制，对该读出的图像信号（模拟图像信号）降低重置噪声后进行波形整形，进而进行增益提高等以成为适当的电平信号。The imaging element 1212 is connected to the imaging circuit 1213, which performs charge-accumulation control and image-signal readout control in the imaging element 1212, reduces the reset noise of the read image signal (an analog image signal), performs waveform shaping, and then applies gain adjustment and the like to bring the signal to an appropriate level.
摄像电路1213与A/D转换器1214连接，该A/D转换器1214对模拟图像信号进行模数转换，向总线1227输出数字图像信号（以下称之为图像数据）。The imaging circuit 1213 is connected to an A/D converter 1214, which performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
总线1227是用于传送在相机的内部读出或生成的各种数据的传送路径。在总线1227连接着上述A/D转换器1214，此外还连接着图像处理器1215、JPEG处理器1216、微处理器1217、同步动态随机存取内存（SDRAM，Synchronous Dynamic Random Access Memory）1218、存储器接口（以下称之为存储器I/F）1219、液晶显示器（LCD，Liquid Crystal Display）驱动器1220。The bus 1227 is a transmission path for transferring various data read out or generated inside the camera. Connected to the bus 1227 are the A/D converter 1214 described above, as well as an image processor 1215, a JPEG processor 1216, a microprocessor 1217, a synchronous dynamic random access memory (SDRAM) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and a liquid crystal display (LCD) driver 1220.
图像处理器1215对基于摄像元件1212的输出的图像数据进行光学黑色（OB，Optical Black）相减处理、白平衡调整、颜色矩阵运算、伽马转换、色差信号处理、噪声去除处理、同时化处理、边缘处理等各种图像处理。JPEG处理器1216在将图像数据记录于存储介质1225时，按照JPEG压缩方式压缩从SDRAM 1218读出的图像数据。此外，JPEG处理器1216为了进行图像再现显示而进行JPEG图像数据的解压缩。进行解压缩时，读出记录在存储介质1225中的文件，在JPEG处理器1216中实施了解压缩处理后，将解压缩的图像数据暂时存储于SDRAM 1218中并在LCD 1226上进行显示。另外，在本实施方式中，作为图像压缩解压缩方式采用的是JPEG方式，然而压缩解压缩方式不限于此，当然可以采用MPEG、TIFF、H.264等其他的压缩解压缩方式。The image processor 1215 performs various image processing on the image data output from the imaging element 1212, such as optical black (OB) subtraction, white balance adjustment, color matrix calculation, gamma conversion, color-difference signal processing, noise removal, synchronization processing, and edge processing. When recording image data on the storage medium 1225, the JPEG processor 1216 compresses the image data read from the SDRAM 1218 according to the JPEG compression method. The JPEG processor 1216 also decompresses JPEG image data for image playback and display. For decompression, a file recorded on the storage medium 1225 is read out, decompression processing is performed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226. In this embodiment, the JPEG method is adopted for image compression and decompression; however, the compression/decompression method is not limited thereto, and other methods such as MPEG, TIFF, and H.264 may of course be used.
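Of the pipeline stages listed above, the first two can be illustrated in isolation. The OB level and white-balance gains below are invented example values, not taken from the patent:

```python
import numpy as np

def isp_sketch(raw, ob_level=64, wb_gains=(1.9, 1.0, 1.6)):
    """Toy sketch of two early ISP stages: optical-black (OB)
    subtraction and per-channel white-balance gain. `raw` is an
    HxWx3 float array; the OB level and gains are made-up values."""
    out = np.clip(raw.astype(np.float64) - ob_level, 0, None)  # OB subtraction
    out *= np.asarray(wb_gains)  # per-channel white-balance gains
    return np.clip(out, 0, 1023)  # keep within a 10-bit range
```

Subtracting the sensor's black level before applying gains keeps dark regions neutral instead of tinting them by the white-balance ratios.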
微处理器1217发挥作为该相机整体的控制部的功能,统一控制相机的各种处理序列。微处理器1217连接着操作单元1223和闪存1224。The microprocessor 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera. The microprocessor 1217 is connected to the operating unit 1223 and the flash memory 1224.
操作单元1223包括但不限于实体按键或者虚拟按键，该实体或虚拟按键可以为电源按钮、拍照键、编辑按键、动态图像按钮、再现按钮、菜单按钮、十字键、OK按钮、删除按钮、放大按钮等各种输入按钮和各种输入键等操作控件，检测这些操作控件的操作状态。The operating unit 1223 includes, but is not limited to, physical or virtual keys; these may be operation controls such as a power button, a shutter key, an edit key, a moving-image button, a playback button, a menu button, a cross key, an OK button, a delete button, an enlarge button, and various other input buttons and keys, and the operating unit detects the operation states of these controls.
将检测结果向微处理器1217输出。此外，在作为显示器的LCD 1226的前表面设有触摸面板，检测用户的触摸位置，将该触摸位置向微处理器1217输出。微处理器1217根据来自操作单元1223的操作位置的检测结果，执行与用户的操作对应的各种处理序列。The detection results are output to the microprocessor 1217. In addition, a touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the position touched by the user and outputs that position to the microprocessor 1217. The microprocessor 1217 executes various processing sequences corresponding to the user's operation in accordance with the detection results from the operation positions of the operating unit 1223.
闪存1224存储用于执行微处理器1217的各种处理序列的程序。微处理器1217根据该程序进行相机整体的控制。此外,闪存1224存储相机的各种调整值,微处理器1217读出调整值,按照该调整值进行相机的控制。 Flash memory 1224 stores programs for executing various processing sequences of microprocessor 1217. The microprocessor 1217 performs overall control of the camera in accordance with the program. Further, the flash memory 1224 stores various adjustment values of the camera, and the microprocessor 1217 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
SDRAM 1218是用于对图像数据等进行暂时存储的可电改写的易失性存储器。该SDRAM 1218暂时存储从模拟/数字（A/D）转换器1214输出的图像数据和在图像处理器1215、JPEG处理器1216等中进行了处理后的图像数据。The SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. It temporarily stores the image data output from the analog/digital (A/D) converter 1214 and the image data processed in the image processor 1215, the JPEG processor 1216, and so on.
存储器接口1219与存储介质1225连接,进行将图像数据和附加在图像数据中的文件头等数据写入存储介质1225和从存储介质1225中读出的控制。存储介质1225可以实施为能够在相机主体上自由拆装的存储器卡等存储介质,然而不限于此,也可以是内置在相机主体中的硬盘等。The memory interface 1219 is connected to the storage medium 1225, and performs control for writing image data and a file header attached to the image data to the storage medium 1225 and reading out from the storage medium 1225. The storage medium 1225 can be implemented as a storage medium such as a memory card that can be detachably attached to the camera body. However, the storage medium 1225 is not limited thereto, and may be a hard disk or the like built in the camera body.
LCD驱动器1220与LCD 1226连接，将由图像处理器1215处理后的图像数据存储于SDRAM 1218，需要显示时，读取SDRAM 1218存储的图像数据并在LCD 1226上显示；或者，JPEG处理器1216压缩过的图像数据存储于SDRAM 1218，在需要显示时，JPEG处理器1216读取SDRAM 1218中压缩过的图像数据，再进行解压缩，将解压缩后的图像数据通过LCD 1226进行显示。The LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218; when display is required, the image data stored in the SDRAM 1218 is read and displayed on the LCD 1226. Alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218; when display is required, the JPEG processor 1216 reads the compressed image data from the SDRAM 1218, decompresses it, and displays the decompressed image data on the LCD 1226.
LCD 1226配置在相机主体的背面进行图像显示，然而不限于此，也可以采用基于有机EL也就是OLED的各种显示面板进行图像显示。The LCD 1226 is disposed on the back of the camera body and performs image display; however, the display is not limited thereto, and various display panels based on organic EL, that is, OLED, may also be used for image display.
本发明实施例上述终端如果以软件功能模块的形式实现并作为独立的产品销售或使用时，也可以存储在一个计算机可读取存储介质中。基于这样的理解，本发明实施例的技术方案本质上或者说对现有技术做出贡献的部分可以以软件产品的形式体现出来，该计算机软件产品存储在一个存储介质中，包括若干指令用以使得一台计算机设备（可以是个人计算机、服务器、或者网络设备等）执行本发明各个实施例所述方法的全部或部分。而前述的存储介质包括：U盘、移动硬盘、ROM、磁碟或者光盘等各种可以存储程序代码的介质。这样，本发明实施例不限制于任何特定的硬件和软件结合。If the terminal of the embodiments of the present invention is implemented in the form of software function modules and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present invention, or the part thereof contributing to the prior art, may in essence be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a ROM, a magnetic disk, or an optical disc. Thus, the embodiments of the present invention are not limited to any specific combination of hardware and software.
相应地,本发明实施例还提供一种计算机存储介质,其中存储有计算机程序,该计算机程序配置为执行本发明实施例的图像处理方法。Accordingly, an embodiment of the present invention further provides a computer storage medium in which a computer program is stored, the computer program being configured to perform an image processing method according to an embodiment of the present invention.
本发明实施例所记载的技术方案之间,在不冲突的情况下,可以任意组合。The technical solutions described in the embodiments of the present invention can be arbitrarily combined without conflict.
在本发明所提供的几个实施例中，应该理解到，所揭露的方法和智能设备，可以通过其它的方式实现。以上所描述的设备实施例仅仅是示意性的，例如，所述单元的划分，仅仅为一种逻辑功能划分，实际实现时可以有另外的划分方式，如：多个单元或组件可以结合，或可以集成到另一个系统，或一些特征可以忽略，或不执行。另外，所显示或讨论的各组成部分相互之间的耦合、或直接耦合、或通信连接可以是通过一些接口，设备或单元的间接耦合或通信连接，可以是电性的、机械的或其它形式的。In the several embodiments provided by the present invention, it should be understood that the disclosed method and smart device may be implemented in other manners. The device embodiments described above are merely illustrative; for example, the division of the units is only a division by logical function, and in actual implementation there may be other divisions, e.g., multiple units or components may be combined, may be integrated into another system, or some features may be omitted or not executed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
上述作为分离部件说明的单元可以是、或也可以不是物理上分开的,作为单元显示的部件可以是、或也可以不是物理单元,即可以位于一个地方,也可以分布到多个网络单元上;可以根据实际的需要选择其中的部分或全部单元来实现本实施例方案的目的。The units described above as separate components may or may not be physically separated, and the components displayed as the unit may or may not be physical units, that is, may be located in one place or distributed to multiple network units; Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
另外,在本发明各实施例中的各功能单元可以全部集成在一个第二处理单元中,也可以是各单元分别单独作为一个单元,也可以两个或两个以上单元集成在一个单元中;上述集成的单元既可以采用硬件的形式实现,也可以采用硬件加软件功能单元的形式实现。 In addition, each functional unit in each embodiment of the present invention may be integrated into one second processing unit, or each unit may be separately used as one unit, or two or more units may be integrated into one unit; The above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。The above is only a specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily think of changes or substitutions within the technical scope of the present invention. It should be covered by the scope of the present invention.
工业实用性Industrial applicability
本发明实施例的技术方案，增加了电子光圈的手持模式，提高了用户拍摄的便利性，对于手持而引起的抖动，本发明实施例引入了图像配准，保障了拍摄的效果，提升了用户拍摄体验。此外，本发明实施例还在显示界面上增加了提示框，防止用户拍摄时手持终端抖动范围过大。此外，用户可以手持终端进行电子光圈的拍摄，在提高用户拍摄便利性的同时，避免了由于手持而引起的图像不清楚的问题，保障了拍摄的效果，提升了用户拍摄体验。在进行图像融合时，对于图像配准引起的像素黑边进行了处理，保证了整个图像各个位置的像素点过渡自然。The technical solutions of the embodiments of the present invention add a handheld mode for the electronic aperture, improving the convenience of shooting for the user. For the shake caused by handheld shooting, the embodiments of the present invention introduce image registration, which guarantees the shooting result and improves the user's shooting experience. In addition, the embodiments of the present invention add a prompt box on the display interface to prevent the handheld terminal from shaking over too large a range while the user is shooting. Furthermore, the user can perform electronic-aperture shooting with a handheld terminal, which improves shooting convenience while avoiding the unclear images caused by handheld shake, guaranteeing the shooting result and improving the shooting experience. During image fusion, the black pixel borders caused by image registration are processed, ensuring natural pixel transitions at every position of the whole image.

Claims (22)

  1. 一种终端,所述终端包括:A terminal, the terminal comprising:
    获取单元,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    配准单元,配置为将所述图像数据流中的各帧图像数据进行配准;a registration unit configured to register image data of each frame in the image data stream;
    融合单元,配置为将配准后的各帧图像数据进行融合处理,得到目标图像。The fusion unit is configured to perform fusion processing on the image data of each frame after registration to obtain a target image.
  2. 根据权利要求1所述的终端,其中,所述获取单元包括:The terminal according to claim 1, wherein the obtaining unit comprises:
    拍摄子单元,配置为在手持模式下,利用电子光圈拍摄得到原始图像数据流,所述原始图像数据流包括多帧原始图像数据;a shooting subunit configured to capture an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of raw image data;
    预处理子单元,配置为对所述原始图像数据流中的各帧原始图像数据进行预处理,得到所述图像数据流;其中,所述预处理包括以下至少之一:图像滤波、对比度拉伸。a pre-processing unit configured to pre-process each frame of the original image data stream to obtain the image data stream; wherein the pre-processing comprises at least one of: image filtering, contrast stretching .
  3. 根据权利要求2所述的终端,其中,所述终端还包括:The terminal of claim 2, wherein the terminal further comprises:
    提示单元，配置为在显示界面上显示提示框；所述提示框用于提示在所述手持模式下，利用电子光圈拍摄时手持抖动的方位。The prompting unit is configured to display a prompt box on the display interface; the prompt box indicates the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
  4. 根据权利要求1至3任一项所述的终端，其中，所述配准单元，还配置为以所述图像数据流中的第一帧图像数据作为参考帧，将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐；其中，所述对齐是指将相同空间位置的像素点对齐。The terminal according to any one of claims 1 to 3, wherein the registration unit is further configured to take the first frame of image data in the image data stream as a reference frame and align each frame of image data in the stream, other than the first frame image data, with the reference frame; wherein the alignment refers to aligning pixels at the same spatial position.
  5. 一种图像处理方法,所述方法包括:An image processing method, the method comprising:
    利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;Acquiring an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    将所述图像数据流中的各帧图像数据进行配准;Registering each frame of image data in the image data stream;
    将配准后的各帧图像数据进行融合处理,得到目标图像。The image data of each frame after registration is subjected to fusion processing to obtain a target image.
  6. 根据权利要求5所述的图像处理方法，其中，所述利用电子光圈获取图像数据流，包括：The image processing method according to claim 5, wherein the acquiring an image data stream using an electronic aperture comprises:
    在手持模式下,利用电子光圈拍摄得到原始图像数据流,所述原始图像数据流包括多帧原始图像数据;In the handheld mode, an original image data stream is obtained by taking an electronic aperture, the original image data stream comprising a plurality of frames of original image data;
    对所述原始图像数据流中的各帧原始图像数据进行预处理,得到所述图像数据流;Performing pre-processing on each frame of the original image data in the original image data stream to obtain the image data stream;
    其中,所述预处理包括以下至少之一:图像滤波、对比度拉伸。Wherein, the preprocessing comprises at least one of the following: image filtering, contrast stretching.
  7. 根据权利要求6所述的图像处理方法,其中,所述方法还包括:The image processing method according to claim 6, wherein the method further comprises:
    在显示界面上显示提示框；所述提示框用于提示在所述手持模式下，利用电子光圈拍摄时手持抖动的方位。Displaying a prompt box on the display interface, the prompt box indicating the direction of handheld shake when shooting with the electronic aperture in the handheld mode.
  8. 根据权利要求5至7任一项所述的图像处理方法,其中,所述将所述图像数据流中的各帧图像数据进行配准,包括:The image processing method according to any one of claims 5 to 7, wherein the registering each frame image data in the image data stream comprises:
    以所述图像数据流中的第一帧图像数据作为参考帧，将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐；Aligning, with the first frame image data in the image data stream as a reference frame, each frame of image data in the image data stream, except the first frame image data, with the reference frame;
    其中,所述对齐是指将相同空间位置的像素点对齐。Wherein, the alignment refers to aligning pixel points of the same spatial position.
  9. 根据权利要求5至7任一项所述的图像处理方法,其中,所述将配准后的各帧图像数据进行融合处理,包括:The image processing method according to any one of claims 5 to 7, wherein the performing the fusion processing on the image data of each frame after registration comprises:
    将配准后的各帧图像数据的各个像素点按照空间位置对应进行叠加。Each pixel point of the image data of each frame after registration is superimposed according to the spatial position correspondence.
  10. 一种终端,所述终端包括:A terminal, the terminal comprising:
    获取单元,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    配准单元,配置为从所述图像数据流中确定出参考帧,将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准;a registration unit, configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
    融合单元，配置为对配准后的各帧图像数据进行融合处理时，将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理，得到目标图像。The fusion unit is configured to, when performing fusion processing on the registered frames of image data, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame before fusion, to obtain the target image.
  11. 根据权利要求10所述的终端,其中,所述融合单元包括:The terminal according to claim 10, wherein the fusion unit comprises:
    分析子单元,配置为对配准后的各帧图像数据进行分析,确定出各帧图像边界处的黑色区域;The analysis subunit is configured to analyze the image data of each frame after registration to determine a black area at a boundary of each frame image;
    替换及融合子单元,配置为对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。The replacement and fusion subunits are configured to perform fusion processing on the image data of each frame after the registration, and replace the pixels in the black area at the image boundary with the pixels corresponding to the reference frame, and perform fusion processing to obtain a target image.
  12. 根据权利要求11所述的终端，其中，所述替换及融合子单元，还配置为根据所述图像边界处的黑色区域，确定在所述参考帧中与所述黑色区域相对应的参考区域；将图像边界处黑色区域的像素替换为所述参考帧中参考区域中的像素后进行融合处理，得到目标图像。The terminal according to claim 11, wherein the replacement and fusion subunit is further configured to determine, according to the black area at the image boundary, a reference area in the reference frame corresponding to the black area, and to replace the pixels of the black area at the image boundary with the pixels of the reference area in the reference frame before performing fusion processing, to obtain the target image.
  13. 根据权利要求10至12任一项所述的终端，其中，所述配准单元，还配置为以所述图像数据流中的第一帧图像数据作为参考帧，将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐；其中，所述对齐是指将相同空间位置的像素点对齐。The terminal according to any one of claims 10 to 12, wherein the registration unit is further configured to take the first frame of image data in the image data stream as a reference frame and align each frame of image data in the stream, other than the first frame image data, with the reference frame; wherein the alignment refers to aligning pixels at the same spatial position.
  14. 一种图像处理方法,所述方法包括:An image processing method, the method comprising:
    利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;Acquiring an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    从所述图像数据流中确定出参考帧,将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准;Determining a reference frame from the image data stream, and registering each frame image data of the image data stream except the reference frame with the reference frame;
    对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。When the image data of each frame after the registration is subjected to the fusion processing, the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  15. 根据权利要求14所述的图像处理方法，其中，所述对配准后的各帧图像数据进行融合处理时，将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理，得到目标图像，包括：The image processing method according to claim 14, wherein the replacing, when performing fusion processing on the registered frames of image data, the pixels of the black area at the image boundary with the corresponding pixels of the reference frame before fusion to obtain the target image comprises:
    对配准后的各帧图像数据进行分析,确定出各帧图像边界处的黑色区域;Performing analysis on the image data of each frame after registration to determine a black area at the boundary of each frame image;
    对配准后的各帧图像数据进行融合处理时,将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像。When the image data of each frame after the registration is subjected to the fusion processing, the pixels of the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  16. 根据权利要求15所述的图像处理方法,其中,所述将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理,得到目标图像,包括:The image processing method according to claim 15, wherein the replacing the pixels of the black area at the image boundary with the pixels corresponding to the reference frame and performing the merging process to obtain the target image comprises:
    根据所述图像边界处的黑色区域，确定在所述参考帧中与所述黑色区域相对应的参考区域；Determining, according to the black area at the image boundary, a reference area in the reference frame corresponding to the black area;
    将图像边界处黑色区域的像素替换为所述参考帧中参考区域中的像素后进行融合处理,得到目标图像。The pixel in the black area at the image boundary is replaced with the pixel in the reference area in the reference frame, and then the fusion processing is performed to obtain the target image.
  17. 根据权利要求14至16任一项所述的图像处理方法，其中，所述从所述图像数据流中确定出参考帧，将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准，包括：The image processing method according to any one of claims 14 to 16, wherein the determining a reference frame from the image data stream and registering each frame of image data in the image data stream, other than the reference frame, with the reference frame comprises:
    以所述图像数据流中的第一帧图像数据作为参考帧，将所述图像数据流中除所述第一帧图像数据以外的各帧图像数据与所述参考帧进行对齐；Aligning, with the first frame image data in the image data stream as a reference frame, each frame of image data in the image data stream, except the first frame image data, with the reference frame;
    其中,所述对齐是指将相同空间位置的像素点对齐。Wherein, the alignment refers to aligning pixel points of the same spatial position.
  18. 根据权利要求14至16任一项所述的图像处理方法,其中,所述将配准后的各帧图像数据进行融合处理,包括:The image processing method according to any one of claims 14 to 16, wherein the performing the fusion processing on the image data of each frame after registration comprises:
    将配准后的各帧图像数据的各个像素点按照空间位置对应进行叠加。Each pixel point of the image data of each frame after registration is superimposed according to the spatial position correspondence.
  19. 一种终端,所述终端包括:相机、处理器;A terminal, the terminal comprising: a camera and a processor;
    所述相机,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据; The camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    所述处理器,配置为将所述图像数据流中的各帧图像数据进行配准;将配准后的各帧图像数据进行融合处理,得到目标图像。The processor is configured to register image data of each frame in the image data stream, and perform fusion processing on the image data of each frame after registration to obtain a target image.
  20. 一种终端,所述终端包括:相机、处理器;A terminal, the terminal comprising: a camera and a processor;
    所述相机,配置为利用电子光圈获取图像数据流,所述图像数据流包括多帧图像数据;The camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
    所述处理器，配置为从所述图像数据流中确定出参考帧，将所述图像数据流中除所述参考帧以外的各帧图像数据与所述参考帧进行配准；对配准后的各帧图像数据进行融合处理时，将图像边界处黑色区域的像素替换为所述参考帧对应的像素后进行融合处理，得到目标图像。The processor is configured to determine a reference frame from the image data stream and register each frame of image data in the stream, other than the reference frame, with the reference frame; when performing fusion processing on the registered frames of image data, the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame before fusion, to obtain the target image.
  21. 一种计算机存储介质,所述计算机存储介质中存储有计算机可执行指令,该计算机可执行指令配置为执行权利要求5-9任一项所述的图像处理方法。A computer storage medium having stored therein computer executable instructions configured to perform the image processing method of any of claims 5-9.
  22. 一种计算机存储介质,所述计算机存储介质中存储有计算机可执行指令,该计算机可执行指令配置为执行权利要求14-18任一项所述的图像处理方法。 A computer storage medium having stored therein computer executable instructions configured to perform the image processing method of any of claims 14-18.
PCT/CN2017/082941 2016-05-31 2017-05-03 Image processing method, terminal, and computer storage medium WO2017206656A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610377764.1A CN105915796A (en) 2016-05-31 2016-05-31 Electronic aperture shooting method and terminal
CN201610375523.3A CN105898159B (en) 2016-05-31 2016-05-31 A kind of image processing method and terminal
CN201610375523.3 2016-05-31
CN201610377764.1 2016-05-31

Publications (1)

Publication Number Publication Date
WO2017206656A1 true WO2017206656A1 (en) 2017-12-07

Family

ID=60479709

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/082941 WO2017206656A1 (en) 2016-05-31 2017-05-03 Image processing method, terminal, and computer storage medium

Country Status (1)

Country Link
WO (1) WO2017206656A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109819163A (en) * 2019-01-23 2019-05-28 努比亚技术有限公司 A kind of image processing control, terminal and computer readable storage medium
CN110598712A (en) * 2019-08-28 2019-12-20 万维科研有限公司 Object position identification method and device, computer equipment and storage medium
CN112106357A (en) * 2018-05-11 2020-12-18 杜比实验室特许公司 High fidelity full reference and efficient partial reference coding in an end-to-end single layer backward compatible coding pipeline
CN112261290A (en) * 2020-10-16 2021-01-22 海信视像科技股份有限公司 Display device, camera and AI data synchronous transmission method
CN113114947A (en) * 2021-04-20 2021-07-13 重庆紫光华山智安科技有限公司 Focusing adjustment method and device, electronic equipment and storage medium
CN113192101A (en) * 2021-05-06 2021-07-30 影石创新科技股份有限公司 Image processing method, image processing device, computer equipment and storage medium
CN113518243A (en) * 2020-04-10 2021-10-19 Tcl科技集团股份有限公司 Image processing method and device
CN113706421A (en) * 2021-10-27 2021-11-26 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium
WO2022262599A1 (en) * 2021-06-18 2022-12-22 影石创新科技股份有限公司 Image processing method and apparatus, and computer device and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070053536A (en) * 2005-11-21 2007-05-25 한국전자통신연구원 Apparatus and method for taking a clear image in slow shutter speed
CN104751488A (en) * 2015-04-08 2015-07-01 努比亚技术有限公司 Photographing method for moving track of moving object and terminal equipment
CN105488756A (en) * 2015-11-26 2016-04-13 努比亚技术有限公司 Picture synthesizing method and device
CN105611181A (en) * 2016-03-30 2016-05-25 努比亚技术有限公司 Multi-frame photographed image synthesizer and method
CN105898159A (en) * 2016-05-31 2016-08-24 努比亚技术有限公司 Image processing method and terminal
CN105915796A (en) * 2016-05-31 2016-08-31 努比亚技术有限公司 Electronic aperture shooting method and terminal


Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112106357A (en) * 2018-05-11 2020-12-18 杜比实验室特许公司 High fidelity full reference and efficient partial reference coding in an end-to-end single layer backward compatible coding pipeline
CN112106357B (en) * 2018-05-11 2024-03-12 杜比实验室特许公司 Method and apparatus for encoding and decoding image data
CN109819163A (en) * 2019-01-23 2019-05-28 努比亚技术有限公司 An image processing control method, terminal, and computer-readable storage medium
CN110598712A (en) * 2019-08-28 2019-12-20 万维科研有限公司 Object position identification method and device, computer equipment and storage medium
CN110598712B (en) * 2019-08-28 2022-06-03 万维科研有限公司 Object position identification method and device, computer equipment and storage medium
CN113518243A (en) * 2020-04-10 2021-10-19 Tcl科技集团股份有限公司 Image processing method and device
CN112261290A (en) * 2020-10-16 2021-01-22 海信视像科技股份有限公司 Display device, camera and AI data synchronous transmission method
CN113114947A (en) * 2021-04-20 2021-07-13 重庆紫光华山智安科技有限公司 Focusing adjustment method and device, electronic equipment and storage medium
CN113192101A (en) * 2021-05-06 2021-07-30 影石创新科技股份有限公司 Image processing method, image processing device, computer equipment and storage medium
CN113192101B (en) * 2021-05-06 2024-03-29 影石创新科技股份有限公司 Image processing method and apparatus, computer device, and storage medium
WO2022262599A1 (en) * 2021-06-18 2022-12-22 影石创新科技股份有限公司 Image processing method and apparatus, and computer device and storage medium
CN113706421A (en) * 2021-10-27 2021-11-26 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium
CN113706421B (en) * 2021-10-27 2022-02-22 深圳市慧鲤科技有限公司 Image processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
WO2017206656A1 (en) Image processing method, terminal, and computer storage medium
CN106454121B (en) Double-camera shooting method and device
KR102381713B1 (en) Photographic method, photographic apparatus, and mobile terminal
CN105898159B (en) An image processing method and terminal
CN108605097B (en) Optical imaging method and device
ES2784905T3 (en) Image processing method and device, computer-readable storage medium and electronic device
CN106937039B (en) Imaging method based on double cameras, mobile terminal and storage medium
US9159169B2 (en) Image display apparatus, imaging apparatus, image display method, control method for imaging apparatus, and program
WO2018019124A1 (en) Image processing method and electronic device and storage medium
WO2017045650A1 (en) Picture processing method and terminal
WO2017140182A1 (en) Image synthesis method and apparatus, and storage medium
WO2017050115A1 (en) Image synthesis method
CN114092364B (en) Image processing method and related device
WO2017016511A1 (en) Image processing method and device, and terminal
WO2018076938A1 (en) Method and device for processing image, and computer storage medium
CN106612397A (en) Image processing method and terminal
CN107040723B (en) Imaging method based on double cameras, mobile terminal and storage medium
WO2017071542A1 (en) Image processing method and apparatus
CN111064895B (en) Virtual shooting method and electronic equipment
WO2017045647A1 (en) Method and mobile terminal for processing image
CN106954020B (en) An image processing method and terminal
CN109120858B (en) Image shooting method, device, equipment and storage medium
CN106303229A (en) A photographing method and device
CN113810590A (en) Image processing method, electronic device, medium, and system
CN105915796A (en) Electronic aperture shooting method and terminal

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17805601

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.04.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17805601

Country of ref document: EP

Kind code of ref document: A1