WO2017206656A1 - Image processing method, terminal and computer storage medium - Google Patents

Image processing method, terminal and computer storage medium

Info

Publication number
WO2017206656A1
Authority
WO
WIPO (PCT)
Prior art keywords
image data
image
frame
data stream
reference frame
Application number
PCT/CN2017/082941
Other languages
English (en)
Chinese (zh)
Inventor
Dai Xiangdong (戴向东)
Original Assignee
Nubia Technology Co., Ltd. (努比亚技术有限公司)
Priority claimed from CN201610377764.1A external-priority patent/CN105915796A/zh
Priority claimed from CN201610375523.3A external-priority patent/CN105898159B/zh
Application filed by Nubia Technology Co., Ltd. (努比亚技术有限公司)
Publication of WO2017206656A1


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/67Focus control based on electronic image sensor signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing

Definitions

  • the present invention relates to image processing technology in the field of photography, and in particular to an image processing method, a terminal, and a computer storage medium.
  • the camera function is one of the commonly used functions of the mobile terminal, and the camera function of some mobile terminals has an electronic aperture mode.
  • In the electronic aperture mode, after the user adjusts the aperture value, the mobile terminal shoots continuously and without interruption during the exposure time, applies transparency processing to the captured images, and then superimposes them; the overall effect is very consistent with the actual effect of a "slow shutter", chiefly in its long exposure time, which can even reach bulb (B) exposure.
  • the algorithm used in the electronic aperture mode does not cause overexposure, and the overall effect of the picture is relatively natural.
  • an embodiment of the present invention provides an image processing method, a terminal, and a computer storage medium.
  • An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • a registration unit configured to register image data of each frame in the image data stream
  • the fusion unit is configured to perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the acquiring unit includes:
  • a shooting subunit configured to capture an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of raw image data;
  • a pre-processing unit configured to pre-process each frame of the original image data stream to obtain the image data stream; wherein the pre-processing comprises at least one of the following: image filtering, contrast stretching.
  • the terminal further includes:
  • the prompting unit is configured to display a prompt box on the display interface, and the prompt box is configured to indicate the direction of hand-held shake when shooting with the electronic aperture in the handheld mode.
  • the registration unit is further configured to: use the first frame of image data in the image data stream as a reference frame, and align each frame of image data in the image data stream other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • In the image processing method of an embodiment of the present invention, an image data stream is acquired by using an electronic aperture, the image data stream comprising a plurality of frames of image data; each frame of image data in the image data stream is registered; and the registered image data of each frame is subjected to fusion processing to obtain a target image.
  • the acquiring an image data stream by using an electronic aperture includes:
  • In the handheld mode, an original image data stream is captured by using the electronic aperture, the original image data stream comprising a plurality of frames of original image data; each frame of original image data in the original image data stream is pre-processed to obtain the image data stream; wherein the preprocessing comprises at least one of the following: image filtering, contrast stretching.
  • the method further includes:
  • a prompt box is displayed on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the handheld mode.
  • the registering of each frame of image data in the image data stream includes: using the first frame of image data in the image data stream as a reference frame, and aligning each frame of image data other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • the performing of fusion processing on the image data of each frame after registration includes: superimposing each pixel point of the registered image data of each frame according to the spatial position correspondence.
  • An acquiring unit configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • a registration unit configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
  • the fusion unit is configured to replace, in the registered image data of each frame, the pixels of the black area at the image boundary with the corresponding pixels of the reference frame, and then perform fusion processing to obtain a target image.
  • the fusion unit includes:
  • the analysis subunit is configured to analyze the image data of each frame after registration to determine a black area at a boundary of each frame image
  • the replacement and fusion subunit is configured to replace, in the registered image data of each frame, the pixels of the black area at the image boundary with the corresponding pixels of the reference frame, and then perform fusion processing to obtain a target image.
  • the replacement and fusion subunit is further configured to determine, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; replace the pixels of the black area with the pixels of the reference area in the reference frame; and then perform fusion processing to obtain the target image.
  • the registration unit is further configured to: use the first frame of image data in the image data stream as a reference frame, and align each frame of image data in the image data stream other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • In the image processing method of another embodiment of the present invention, an image data stream is acquired by using an electronic aperture, the image data stream comprising a plurality of frames of image data; a reference frame is determined from the image data stream, and each frame of image data other than the reference frame is registered with the reference frame; the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame, and fusion processing is then performed to obtain the target image.
  • the replacing of the pixels of the black area at the image boundary with the pixels corresponding to the reference frame, followed by fusion processing to obtain the target image, includes: analyzing the registered image data of each frame to determine the black area at the boundary of each frame image; determining, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; replacing the pixels in the black area with the pixels of the reference area in the reference frame; and then performing fusion processing to obtain the target image.
  • the determining of a reference frame from the image data stream, and the registering of each frame of image data in the image data stream other than the reference frame with the reference frame, includes: using the first frame of image data in the image data stream as the reference frame, and aligning each frame of image data other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • the performing of fusion processing on the image data of each frame after registration includes: superimposing each pixel point of the registered image data of each frame according to the spatial position correspondence.
  • the terminal provided by the embodiment of the present invention includes: a camera and a processor;
  • the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor is configured to register image data of each frame in the image data stream, and perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the terminal provided by the embodiment of the present invention includes: a camera and a processor;
  • the camera is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor is configured to determine a reference frame from the image data stream, and to register each frame of image data in the image data stream other than the reference frame with the reference frame; and to perform fusion processing on the registered image data of each frame, replacing the pixels of the black area at the image boundary with the corresponding pixels of the reference frame before fusion, to obtain a target image.
  • the computer storage medium provided by the embodiment of the present invention stores computer executable instructions configured to perform any of the image processing methods described.
  • In the image processing method provided by the embodiments of the present invention, an image data stream is acquired by using an electronic aperture, the image data stream comprising multiple frames of image data; each frame of image data in the image data stream is registered; and the registered image data of each frame is subjected to fusion processing to obtain a target image.
  • the embodiment of the present invention adds a handheld mode to the electronic aperture, which improves the convenience of shooting; image registration is introduced to guarantee the shooting effect and improve the user's shooting experience.
  • the embodiment of the present invention further adds a prompt box on the display interface to prevent the hand-held terminal from drifting out of range while the user shoots.
  • In the image processing method provided by the embodiments of the present invention, an image data stream is acquired by using an electronic aperture, the image data stream comprising multiple frames of image data; a reference frame is determined from the image data stream, and each frame of image data in the image data stream other than the reference frame is registered with the reference frame; the registered image data of each frame is subjected to fusion processing, in which the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame before fusion, to obtain a target image.
  • In this way, the user can shoot with the electronic aperture while holding the terminal by hand, which improves the convenience of shooting, avoids blurred images caused by hand shake, guarantees the shooting effect, and improves the user's shooting experience. When image fusion is performed, the black border pixels introduced by image registration are processed, ensuring that the pixel transitions at every position of the final image are natural.
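  • As a rough sketch of the black-area replacement just described (assuming NumPy, 8-bit BGR frames, and that pure-black pixels occur only in the warp-induced border, which is a simplification), the replacement step could look like this:

```python
import numpy as np

def fill_black_border(warped, reference):
    """Replace the black border introduced by registration/warping with
    the co-located pixels of the reference frame before fusion."""
    # Black area at the image boundary: pixels whose channels are all zero
    mask = warped.sum(axis=2) == 0
    out = warped.copy()
    out[mask] = reference[mask]
    return out
```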
  • FIG. 1 is a schematic structural diagram of hardware of an optional mobile terminal embodying various embodiments of the present invention
  • FIG. 2 is a schematic diagram of a wireless communication system of the mobile terminal shown in FIG. 1;
  • FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention.
  • FIG. 4 is a schematic diagram of pixel point matching of an optical flow image
  • Figure 5 is a schematic diagram of a simple mobile phone motion model
  • FIG. 6 is a flowchart of aligning a multi-frame image by using an optical flow according to an embodiment of the present invention
  • FIG. 7 is a schematic flowchart diagram of an image processing method according to Embodiment 2 of the present invention.
  • FIG. 8 is a schematic diagram of an interface of a prompt box according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention.
  • FIG. 12 is a schematic flowchart diagram of an image processing method according to Embodiment 4 of the present invention.
  • FIG. 13 is a schematic diagram of image fusion according to an embodiment of the present invention.
  • FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention.
  • Figure 15 is a block diagram of the electrical structure of the camera module.
  • A mobile terminal embodying various embodiments of the present invention will now be described with reference to the accompanying drawings.
  • suffixes such as “module,” “component,” or “unit” used to denote an element are merely illustrative of the embodiments of the present invention, and do not have a specific meaning per se. Therefore, “module” and “component” can be used in combination.
  • the mobile terminal can be implemented in various forms.
  • the terminal described in the embodiments of the present invention may include mobile devices such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a Personal Digital Assistant (PDA), a tablet (PAD), a Portable Multimedia Player (PMP), a navigation device, and the like, as well as fixed terminals such as digital TVs, desktop computers, and the like.
  • FIG. 1 is a schematic diagram showing the hardware structure of a mobile terminal embodying various embodiments of the present invention.
  • the mobile terminal 100 may include a wireless communication unit 110, an audio/video (A/V) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, a power supply unit 190, and the like.
  • Figure 1 illustrates a mobile terminal having various components, but it should be understood that not all illustrated components are required to be implemented. More or fewer components can be implemented instead. The elements of the mobile terminal will be described in detail below.
  • Wireless communication unit 110 typically includes one or more components that permit radio communication between mobile terminal 100 and a wireless communication system or network.
  • the wireless communication unit may include at least one of a broadcast receiving module 111, a mobile communication module 112, a wireless internet module 113, a short-range communication module 114, and a location information module 115.
  • the broadcast receiving module 111 receives a broadcast signal and/or broadcast associated information from an external broadcast management server via a broadcast channel.
  • the broadcast channel can include a satellite channel and/or a terrestrial channel.
  • the broadcast management server may be a server that generates and transmits a broadcast signal and/or broadcast associated information or a server that receives a previously generated broadcast signal and/or broadcast associated information and transmits it to the terminal.
  • the broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and the like.
  • the broadcast signal may further include a data broadcast signal combined with a TV or radio broadcast signal.
  • the broadcast associated information may also be provided via a mobile communication network, and in this case, the broadcast associated information may be received by the mobile communication module 112.
  • the broadcast signal may exist in various forms; for example, it may exist in the form of an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB), an Electronic Service Guide (ESG) of Digital Video Broadcasting-Handheld (DVB-H), and the like.
  • the broadcast receiving module 111 can receive a signal broadcast by using various types of broadcast systems.
  • the broadcast receiving module 111 can receive digital broadcasts by using digital broadcast systems such as Digital Multimedia Broadcasting-Terrestrial (DMB-T), Digital Multimedia Broadcasting-Satellite (DMB-S), Digital Video Broadcasting-Handheld (DVB-H), the Media Forward Link Only (MediaFLO) data broadcasting system, Integrated Services Digital Broadcasting-Terrestrial (ISDB-T), and the like.
  • the broadcast receiving module 111 can be constructed to be suitable for various broadcast systems that provide broadcast signals, in addition to the above-described digital broadcast systems.
  • the broadcast signal and/or broadcast associated information received via the broadcast receiving module 111 may be stored in the memory 160 (or other type of storage medium).
  • the mobile communication module 112 transmits radio signals to and/or receives radio signals from at least one of a base station (e.g., an access point, a Node B, etc.), an external terminal, and a server.
  • Such radio signals may include voice call signals, video call signals, or various types of data transmitted and/or received in accordance with text and/or multimedia messages.
  • the wireless internet module 113 supports wireless internet access of the mobile terminal.
  • the module can be internally or externally coupled to the terminal.
  • the wireless Internet access technologies involved in the module may include Wireless Local Area Network (Wi-Fi, WLAN), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), and so on.
  • the short range communication module 114 is a module for supporting short range communication.
  • Some examples of short-range communication technologies include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Zigbee, and the like.
  • the location information module 115 is a module for checking or acquiring location information of the mobile terminal.
  • a typical example of a location information module is the Global Positioning System (GPS).
  • the location information module 115, implemented as a GPS module, calculates distance information and accurate time information from three or more satellites and applies triangulation to the calculated information to accurately calculate three-dimensional current location information based on longitude, latitude, and altitude.
  • Currently, the method used for calculating position and time information uses three satellites and corrects the error of the calculated position and time information by using one additional satellite.
  • the position information module 115 as a GPS can calculate the speed information by continuously calculating the current position information in real time.
  • the A/V input unit 120 is for receiving an audio or video signal.
  • the A/V input unit 120 may include a camera 121 and a microphone 122; the camera 121 processes image data of still pictures or video obtained by an image capture device in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 151.
  • the image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110, and two or more cameras 121 may be provided according to the configuration of the mobile terminal.
  • the microphone 122 can receive sound (audio data) in an operating mode such as a telephone call mode, a recording mode, or a voice recognition mode, and can process such sound into audio data.
  • In the case of the telephone call mode, the processed audio (voice) data can be converted into a format that can be transmitted to a mobile communication base station via the mobile communication module 112 for output.
  • the microphone 122 can implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated during the process of receiving and transmitting audio signals.
  • the user input unit 130 may generate key input data according to a command input by the user to control various operations of the mobile terminal.
  • the user input unit 130 allows the user to input various types of information, and may include a keyboard, a dome switch, a touch pad (e.g., a touch-sensitive component that detects changes in resistance, pressure, capacitance, etc. due to contact), a scroll wheel, a rocker, and the like.
  • In particular, when the touch pad is superposed on the display unit 151 in the form of a layer, a touch screen can be formed.
  • the sensing unit 140 detects the current state of the mobile terminal 100 (e.g., the open or closed state of the mobile terminal 100), the location of the mobile terminal 100, the presence or absence of user contact (i.e., touch input) with the mobile terminal 100, and the like.
  • the sensing unit 140 can sense whether the slide type phone is turned on or off.
  • the sensing unit 140 can detect whether the power supply unit 190 supplies power or whether the interface unit 170 is coupled to an external device.
  • the sensing unit 140 may include a proximity sensor 141.
  • the interface unit 170 serves as an interface through which at least one external device can connect with the mobile terminal 100.
  • the external device may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, a headphone port, and the like.
  • the identification module may store various information for verifying the user of the mobile terminal 100 and may include a User Identity Module (UIM), a Subscriber Identity Module (SIM), a Universal Subscriber Identity Module (USIM), and the like.
  • the device having the identification module may take the form of a smart card, and thus the identification device may be connected to the mobile terminal 100 via a port or other connection device.
  • the interface unit 170 can be configured to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more components within the mobile terminal 100, or can be used to transfer data between the mobile terminal and an external device.
  • the interface unit 170 may serve as a path through which power is supplied from the base to the mobile terminal 100, or as a path through which various command signals input from the base are transmitted to the mobile terminal.
  • Various command signals or power input from the base can be used as signals for identifying whether the mobile terminal is accurately mounted on the base.
  • Output unit 150 is configured to provide an output signal (eg, an audio signal, a video signal, an alarm signal, a vibration signal, etc.) in a visual, audio, and/or tactile manner.
  • the output unit 150 may include a display unit 151, an audio output module 152, an alarm unit 153, and the like.
  • the display unit 151 can display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in the telephone call mode, the display unit 151 can display a User Interface (UI) or a Graphical User Interface (GUI) related to a call or other communication (e.g., text messaging, multimedia file download, etc.). When the mobile terminal 100 is in a video call mode or an image capturing mode, the display unit 151 may display a captured image and/or a received image, a UI or GUI showing a video or image and related functions, and the like.
  • the display unit 151 can function as an input device and an output device.
  • the display unit 151 may include at least one of a Liquid Crystal Display (LCD), a Thin Film Transistor LCD (TFT-LCD), an Organic Light-Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like.
  • Some of these displays may be configured to be transparent to allow the user to view the outside through them; such a display may be referred to as a transparent display, and a typical transparent display may be, for example, a Transparent Organic Light-Emitting Diode (TOLED) display or the like.
  • the mobile terminal 100 may include two or more display units (or other display devices), for example, the mobile terminal may include an external display unit (not shown) and an internal display unit (not shown) .
  • the touch screen can be used to detect touch input pressure as well as touch input position and touch input area.
  • the audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output it as sound when the mobile terminal is in a call signal receiving mode, a call mode, a recording mode, a voice recognition mode, a broadcast receiving mode, and the like.
  • the audio output module 152 can provide audio output (eg, call signal reception sound, message reception sound, etc.) associated with a particular function performed by the mobile terminal 100.
  • the audio output module 152 can include a speaker, a buzzer, and the like.
  • the alarm unit 153 can provide an output to notify the mobile terminal 100 of the occurrence of an event. Typical events may include call reception, message reception, key signal input, touch input, and the like. In addition to audio or video output, the alarm unit 153 can provide an output in a different manner to notify of the occurrence of an event. For example, the alarm unit 153 can provide an output in the form of vibration: when a call, a message, or some other incoming communication is received, the alarm unit 153 can provide a tactile output (i.e., vibration) to notify the user. By providing such a tactile output, the user is able to recognize the occurrence of various events even when the mobile phone is in the user's pocket. The alarm unit 153 can also provide an output notifying of the occurrence of an event via the display unit 151 or the audio output module 152.
  • the memory 160 may store a software program or the like for processing and control operations performed by the controller 180, or may temporarily store data (for example, a phone book, a message, a still image, a video, etc.) that has been output or is to be output. Moreover, the memory 160 can store data regarding vibrations and audio signals of various manners that are output when a touch is applied to the touch screen.
  • the memory 160 may include at least one type of storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (for example, SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like.
  • the mobile terminal 100 can cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
  • the controller 180 typically controls the overall operation of the mobile terminal. For example, the controller 180 performs the control and processing associated with voice calls, data communications, video calls, and the like.
  • the controller 180 may include a multimedia module 181 for reproducing (or playing back) multimedia data, which may be constructed within the controller 180 or may be configured to be separate from the controller 180.
  • the controller 180 may perform a pattern recognition process to recognize a handwriting input or a picture drawing input performed on the touch screen as a character or an image.
  • the power supply unit 190 receives external power or internal power under the control of the controller 180 and provides the appropriate power required to operate the various elements and components.
  • the various embodiments described herein can be implemented in, for example, computer software, hardware, or a computer-readable medium using any combination thereof.
  • the embodiments described herein may be implemented using at least one of an Application Specific Integrated Circuit (ASIC), a Digital Signal Processor (DSP), a Digital Signal Processing Device (DSPD), a Programmable Logic Device (PLD), a Field Programmable Gate Array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180.
  • For software implementation, implementations such as procedures or functions may be implemented with separate software modules that perform at least one function or operation.
  • the software code can be implemented by a software application (or program) written in any suitable programming language, and may be stored in the memory 160 and executed by the controller 180.
  • the mobile terminal has been described in terms of its function.
  • Here, a slide-type mobile terminal among various types of mobile terminals such as folding-type, bar-type, swing-type, and slide-type mobile terminals will be described as an example. However, the present invention can be applied to any type of mobile terminal and is not limited to a slide-type mobile terminal.
  • the mobile terminal 100 as shown in FIG. 1 may be configured to operate using a communication system such as a wired and wireless communication system and a satellite-based communication system that transmits data via frames or packets.
  • a communication system in which a mobile terminal is operable according to an embodiment of the present invention will now be described with reference to FIG.
  • Such communication systems may use different air interfaces and/or physical layers.
  • the air interface used by the communication system includes, for example, Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), the Universal Mobile Telecommunications System (UMTS) (in particular, Long Term Evolution (LTE)), the Global System for Mobile Communications (GSM), and the like.
  • the following description relates to CDMA communication systems, but such teachings are equally applicable to other types of systems.
  • the CDMA wireless communication system may include a plurality of mobile terminals 100, a plurality of base stations (BS) 270, a base station controller (BSC) 275, and a mobile switching center (MSC) 280.
  • the MSC 280 is configured to interface with a Public Switched Telephone Network (PSTN) 290.
  • the MSC 280 is also configured to interface with a BSC 275 that can be coupled to the base station 270 via a backhaul line.
  • the backhaul line can be constructed in accordance with any of a number of well known interfaces including, for example, E1/T1, ATM, IP, PPP, Frame Relay, HDSL, ADSL, or xDSL. It will be appreciated that the system as shown in FIG. 2 can include multiple BSCs 275.
  • Each BS 270 can serve one or more partitions (or regions), with each partition covered by a multi-directional antenna or an antenna pointing in a particular direction radially away from the BS 270. Alternatively, each partition may be covered by two or more antennas for diversity reception. Each BS 270 can be configured to support multiple frequency allocations, and each frequency allocation has a particular frequency spectrum (eg, 1.25 MHz, 5 MHz, etc.).
  • BS 270 may also be referred to as a Base Transceiver Subsystem (BTS) or other equivalent terminology.
  • the term "base station” can be used to generally refer to a single BSC 275 and at least one BS 270.
  • a base station can also be referred to as a "cell station.”
  • alternatively, each partition of a particular BS 270 may be referred to as a plurality of cell stations.
  • a broadcast transmitter (BT, Broadcast Transmitter) 295 transmits a broadcast signal to the mobile terminal 100 operating within the system.
  • a broadcast receiving module 111 as shown in FIG. 1 is provided at the mobile terminal 100 to receive a broadcast signal transmitted by the BT 295.
  • In FIG. 2, several satellites 300 are shown; for example, GPS satellites 300 may be employed. The satellite 300 helps locate at least one of the plurality of mobile terminals 100.
  • a plurality of satellites 300 are depicted, but it is understood that useful positioning information can be obtained using any number of satellites.
  • the position information module 115 as a GPS as shown in FIG. 1 is generally configured to cooperate with the satellite 300 to obtain desired positioning information. Instead of GPS tracking technology or in addition to GPS tracking technology, other techniques that can track the location of the mobile terminal can be used. Additionally, at least one GPS satellite 300 can selectively or additionally process satellite DMB transmissions.
  • BS 270 receives reverse link signals from various mobile terminals 100.
  • Mobile terminal 100 typically participates in calls, messaging, and other types of communications.
  • Each reverse link signal received by a particular base station 270 is processed within a particular BS 270.
  • the obtained data is forwarded to the relevant BSC 275.
  • the BSC provides call resource allocation and coordinated mobility management functions including a soft handoff procedure between the BSs 270.
  • the BSC 275 also routes the received data to the MSC 280, which provides additional routing services for interfacing with the PSTN 290.
  • PSTN 290 interfaces with MSC 280, which forms an interface with BSC 275, and BSC 275 controls BS 270 accordingly to transmit forward link signals to mobile terminal 100.
  • the mobile communication module 112 of the wireless communication unit 110 accesses the mobile communication network (such as a 2G/3G/4G mobile communication network) on the basis of the necessary data built into the mobile terminal (including user identification information and authentication information), and transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playback for the mobile terminal user.
  • the wireless internet module 113 of the wireless communication unit 110 implements the function of a wireless hotspot by running the related protocol functions of a wireless hotspot; the wireless hotspot supports access by a plurality of mobile terminals (any mobile terminals other than this one) and, by multiplexing the mobile communication connection between the mobile communication module 112 and the mobile communication network, transmits mobile communication data (including uplink and downlink mobile communication data) for services such as web browsing and network multimedia playback for those users; since this transmission essentially multiplexes the mobile communication connection of the terminal itself, the mobile communication data traffic consumed is counted by the charging entity on the network side against the communication tariff of this terminal, thereby consuming the data traffic included in the tariff to which the terminal subscribes.
  • FIG. 3 is a schematic flowchart of an image processing method according to Embodiment 1 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 3, the image processing method includes the following steps:
  • Step 301 Acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • In the electronic aperture mode, after the user adjusts the aperture value, the terminal shoots continuously and without interruption during the exposure time, and then fuses the captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream differs in illumination, noise, sharpness, and focus point.
  • the preprocessing includes image filtering to suppress noise and contrast stretching to reduce differences in image sharpness and illumination. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
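  • A minimal sketch of this pre-processing pass, assuming OpenCV/NumPy and 8-bit frames (the Gaussian kernel size and the 0-255 stretch range are illustrative choices, not values from the patent):

```python
import cv2
import numpy as np

def preprocess(frame):
    """Illustrative pre-processing: image filtering plus contrast stretching."""
    # Image filtering: small Gaussian blur to suppress sensor noise
    filtered = cv2.GaussianBlur(frame, (3, 3), 0)
    # Contrast stretching: linearly rescale intensities to the full 0..255
    # range, reducing sharpness and illumination differences between frames
    stretched = cv2.normalize(filtered, None, 0, 255, cv2.NORM_MINMAX)
    return stretched.astype(np.uint8)
```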
  • Step 302 Register each frame image data in the image data stream.
  • In registration, the pixels at the same spatial position in each frame of image data are aligned, thereby avoiding blurring when the images are fused.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • the registration effect of this type of method can generally meet the requirements of many complex scenes, such as illumination changes (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise). The other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by random feature points.
  • the optical flow field is also a point-based matching approach; optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane, obtained from the change of pixels in the time domain and the correlation between adjacent frames.
  • the purpose of studying the optical flow field is to approximate the motion field that cannot be directly obtained from the sequence of pictures.
  • the motion field is actually the motion of the object in the three-dimensional real world; the optical flow field is the projection of the motion field on the two-dimensional image plane (human eye or camera).
  • finding the motion velocity and motion direction of each pixel in each image yields the optical flow field.
  • Let I(x, y, t) be the pixel value of the image at position (x, y) at time t; let V_x and V_y be the x and y components of the optical flow vector of I(x, y, t); and let I_x, I_y and I_t be the differences of the image in the corresponding directions at (x, y, t). The optical flow constraint equation is then:

    I_x V_x + I_y V_y = -I_t
  • the Lucas-Kanade optical flow method assumes that spatial pixel points move in unison: adjacent points in a scene are projected onto the image as neighboring points, and neighboring points have the same velocity. This is the distinctive assumption of the Lucas-Kanade method, needed because the basic optical flow equation provides only one constraint while the velocities in the x and y directions are two unknowns. Assuming similar motion in the neighborhood of a feature point, the velocities in the x and y directions can be found from n equations (n is the total number of points in the neighborhood, including the feature point itself), giving the over-determined system

    [I_x(p_1) I_y(p_1); ...; I_x(p_n) I_y(p_n)] [V_x; V_y] = -[I_t(p_1); ...; I_t(p_n)],

which is solved in the least-squares sense.
  • the small-motion assumption mentioned above no longer holds when the target moves quickly; a multi-scale approach can solve this problem.
  • Figure 6 shows the alignment process for multi-frame images using optical flow.
  • the sparse matching points between the two images can be obtained, and then the image mapping model is calculated by using the coordinates of these points, as sketched below.
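  • As an illustration of this tracking step, the sketch below (assuming OpenCV) detects corners in the reference frame and tracks them into the current frame with pyramidal Lucas-Kanade; all parameter values are illustrative, not taken from the patent:

```python
import cv2

def sparse_matches(ref_gray, cur_gray):
    """Track corners from the reference frame into the current frame with
    pyramidal Lucas-Kanade optical flow; returns the matched point pairs."""
    pts_ref = cv2.goodFeaturesToTrack(ref_gray, maxCorners=500,
                                      qualityLevel=0.01, minDistance=8)
    # maxLevel > 0 selects the multi-scale (pyramidal) variant, which
    # copes with motion too fast for the small-motion assumption
    pts_cur, status, _err = cv2.calcOpticalFlowPyrLK(
        ref_gray, cur_gray, pts_ref, None, winSize=(21, 21), maxLevel=3)
    ok = status.ravel() == 1
    return pts_ref[ok], pts_cur[ok]
```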
  • In the image alignment step, it is important to choose the correct image alignment transformation model.
  • Common spatial transformation models include affine transformation and perspective transformation models.
  • the affine transformation can be represented visually as follows: any parallelogram in one plane can be mapped to another parallelogram by an affine transformation.
  • the mapping operation of the image is performed within the same spatial plane, and different transformation parameters deform the shape into different types of parallelograms.
  • the perspective transformation is a more general transformation model; compared with the affine transformation, the perspective transformation is more flexible.
  • a perspective transformation can transform a rectangle into a trapezoid; it describes the projection of one plane in space onto another spatial plane. The affine transformation is a special case of the perspective transformation.
  • the perspective transformation can be written as a 3×3 matrix H = [a_00 a_01 a_02; a_10 a_11 a_12; a_20 a_21 1], where a_02 and a_12 are the displacement parameters; a_00, a_01, a_10 and a_11 are the scaling and rotation parameters; and a_20 and a_21 are the amounts of deformation in the horizontal and vertical directions.
  • the embodiment of the present invention mainly adopts the perspective transformation model in consideration of the hand-held terminal, since the jitter motion of the mobile phone is generally not confined to a single plane.
  • the simple motion model is shown in Figure 5.
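  • Continuing the sketch, the perspective (homography) model can be fitted to the sparse matches and each frame warped into the reference frame's coordinate system; this reuses the hypothetical sparse_matches helper above, with RANSAC standing in for the rejection of unreliable matches:

```python
import cv2

def align_to_reference(ref_bgr, cur_bgr):
    """Warp the current frame onto the reference frame using an estimated
    perspective transformation."""
    ref_gray = cv2.cvtColor(ref_bgr, cv2.COLOR_BGR2GRAY)
    cur_gray = cv2.cvtColor(cur_bgr, cv2.COLOR_BGR2GRAY)
    pts_ref, pts_cur = sparse_matches(ref_gray, cur_gray)
    # RANSAC suppresses matching errors from random feature points
    H, _inliers = cv2.findHomography(pts_cur, pts_ref, cv2.RANSAC, 3.0)
    h, w = ref_bgr.shape[:2]
    # Regions with no source pixels come out black at the image boundary
    return cv2.warpPerspective(cur_bgr, H, (w, h))
```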
  • Step 303 Perform fusion processing on the image data of each frame after registration to obtain a target image.
  • In the fusion formula, I denotes an image, m indexes the m-th image, k is the number of images already synthesized, and N is the total number of images to be synthesized.
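  • The exact fusion formula is not reproduced in this text; one plausible reading of the per-pixel superposition is an equal-weight average over the N registered frames, as in this sketch (the weighting is an assumption):

```python
import numpy as np

def fuse(frames):
    """Fuse the N registered frames by accumulating in floating point
    and averaging, then converting back to 8-bit."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    for frame in frames:
        acc += frame          # superimpose pixels at the same position
    return np.clip(acc / len(frames), 0, 255).astype(np.uint8)
```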
  • the embodiment of the present invention provides an image processing method that uses the image registration principle to align the multiple frames of image data to be synthesized; the alignment method tolerates image jitter error within a certain range, and the pixel deviation appearing in the final composite image is small.
  • FIG. 7 is a schematic flowchart of an image processing method according to Embodiment 2 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 7, the image processing method includes the following steps:
  • Step 701 In the handheld mode, an original image data stream is captured by using the electronic aperture, the original image data stream comprising a plurality of frames of original image data; each frame of original image data in the original image data stream is pre-processed to obtain the image data stream.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • In the electronic aperture mode, after the user adjusts the aperture value, the terminal shoots continuously and without interruption during the exposure time, and then fuses the captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream is different in illumination, noise, sharpness, and focus point.
  • Therefore, each frame of original image data in the original image data stream needs to go through a necessary preprocessing process.
  • the preprocessing process includes image filtering to suppress noise and contrast stretching to reduce differences in image sharpness and illumination. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
  • Step 702 Align each frame image data of the image data stream except the first frame image data with the reference frame by using the first frame image data in the image data stream as a reference frame.
  • In registration, the pixels at the same spatial position in each frame of image data are aligned, thereby avoiding blurring when the images are fused.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • the registration effect of this type of method can generally meet the requirements of many complex scenes, such as illumination changes (fusion of differently exposed images), large-scale image shift (panoramic image stitching), and dark-light images (increased noise). The other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by random feature points.
  • the optical flow field is also a point-based matching approach; optical flow is the instantaneous velocity of the pixel motion of a spatially moving object on the observed imaging plane, obtained from the change of pixels in the time domain and the correlation between adjacent frames.
  • Step 703 Display a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the handheld mode.
  • During shooting, a prompt box is displayed that indicates the direction of hand-held shake to the user, helping the user make a timely correction.
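  • The text does not specify how the shake direction shown in the prompt box is computed; one hypothetical approach derives it from the mean displacement of the feature points matched during registration (the function name and image-coordinate convention are assumptions):

```python
import numpy as np

def shake_direction(pts_ref, pts_cur):
    """Estimate the dominant hand-shake direction from the mean feature
    displacement; image x grows rightward and y grows downward."""
    d = (np.asarray(pts_cur) - np.asarray(pts_ref)).reshape(-1, 2).mean(axis=0)
    if abs(d[0]) >= abs(d[1]):
        return "right" if d[0] > 0 else "left"
    return "down" if d[1] > 0 else "up"
```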
  • Step 704 Superimpose each pixel point of each frame image data after registration according to spatial position correspondence to obtain a target image.
  • After the image data of each frame has been registered (that is, aligned), the images need to be fused; here, a fusion method in which the image pixels are superimposed in sequence is adopted.
  • the embodiment of the present invention provides an image processing method that uses the image registration principle to align the multiple frames of image data to be synthesized; the alignment method tolerates image jitter error within a certain range, and the pixel deviation appearing in the final composite image is small.
  • In addition, a prompt box is added to prevent the hand-held terminal from drifting out of range during shooting.
  • FIG. 9 is a schematic structural diagram of a terminal according to Embodiment 1 of the present invention. As shown in FIG. 9, the terminal includes:
  • the obtaining unit 91 is configured to acquire an image data stream by using an electronic aperture, the image data stream comprising multi-frame image data;
  • the registration unit 92 is configured to register image data of each frame in the image data stream
  • the merging unit 93 is configured to perform merging processing on the image data of each frame after registration to obtain a target image.
  • the obtaining unit 91 includes:
  • the photographing subunit 911 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, the original image data stream comprising a plurality of frames of original image data;
  • the pre-processing sub-unit 912 is configured to pre-process the original image data of each frame in the original image data stream to obtain the image data stream; wherein the pre-processing includes at least one of the following: image filtering, contrast stretching.
  • the terminal further includes:
  • the prompting unit 94 is configured to display a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the handheld mode.
  • the registration unit 92 is further configured to use the first frame of image data in the image data stream as a reference frame, and to align each frame of image data in the image data stream other than the first frame with the reference frame; wherein the alignment refers to aligning pixel points at the same spatial location.
  • the merging unit 93 is further configured to superimpose each pixel point of each frame image data after registration according to spatial position correspondence.
  • FIG. 10 is a schematic structural diagram of a terminal according to an embodiment of the present invention.
  • the terminal includes: a processor 1001, a camera 1002, and a display screen 1003.
  • the processor 1001, the camera 1002, and the display screen 1003 are all connected by a bus 1004.
  • the camera 1002 is configured to acquire an image data stream by using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor 1001 is configured to register image data of each frame in the image data stream, and perform fusion processing on the image data of each frame after registration to obtain a target image.
  • the camera 1002 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, where the original image data stream includes multiple frames of original image data;
  • the processor 1001 is configured to pre-process each frame of original image data in the original image data stream to obtain the image data stream, where the pre-processing includes at least one of the following: image filtering, contrast stretching; to register each frame of image data in the image data stream; and to perform fusion processing on the registered image data of each frame to obtain a target image.
  • the display screen 1003 is configured to display a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when shooting with the electronic aperture in the handheld mode.
  • FIG. 11 is a schematic flowchart of an image processing method according to Embodiment 3 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 11, the image processing method includes the following steps:
  • Step 1101 Acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • In the electronic aperture mode, after the user adjusts the aperture value, the terminal shoots continuously and without interruption during the exposure time, and then fuses the captured images.
  • the embodiment of the present invention adds a handheld mode for electronic aperture shooting; in the handheld mode, the user can conveniently shoot with the electronic aperture.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • the raw image data of each frame in the synchronously acquired original image data stream differs in illumination, noise, sharpness, and focus point.
  • Therefore, each frame of original image data in the original image data stream needs to go through a necessary preprocessing process.
  • the preprocessing process includes image filtering to suppress noise and contrast stretching to reduce differences in image sharpness and illumination. After preprocessing, the differences between the frames of image data in the image data stream are reduced, which helps improve the effect of the subsequent image registration algorithm.
  • Step 1102 Determine a reference frame from the image data stream, and register each frame image data of the image data stream except the reference frame with the reference frame.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • The registration effect of this type of method generally meets the requirements of many complex scenes, such as illumination changes (fusion of differently exposed images), large image shifts (panoramic stitching), and low-light images (increased noise). The other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by unreliable feature points.
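  • As a sketch of the local-feature route described above (the detector, matcher, and model choices are assumptions, not prescribed by the patent), one can extract keypoints, match them, estimate the mapping matrix, and warp:

```python
import cv2
import numpy as np

def register_to_reference(ref, frame):
    """Align `frame` to the reference frame `ref` with local features.

    ORB keypoints + RANSAC homography are an illustrative choice; the
    text above only requires key feature points and a mapping matrix.
    """
    g_ref = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)
    g_frm = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(g_ref, None)
    kp2, des2 = orb.detectAndCompute(g_frm, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des2, des1), key=lambda m: m.distance)
    src = np.float32([kp2[m.queryIdx].pt for m in matches])
    dst = np.float32([kp1[m.trainIdx].pt for m in matches])
    # Mapping matrix of the spatial alignment model (perspective model).
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    h, w = g_ref.shape
    return cv2.warpPerspective(frame, H, (w, h))
```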
  • Optical flow is also a point-based matching method: the optical flow field is the instantaneous velocity of the pixel motion of a moving object projected onto the imaging plane, characterized by the change of pixels in the time domain and the correlation between adjacent frames.
  • the purpose of studying the optical flow field is to approximate the motion field that cannot be directly obtained from the sequence of pictures.
  • the motion field is actually the motion of the object in the three-dimensional real world; the optical flow field is the projection of the motion field on the two-dimensional image plane (human eye or camera).
  • Finding the motion speed and direction of every pixel in each image yields the optical flow field.
  • Let I(x, y, t) be the pixel value of the image at position (x, y) at time t, let $V_x$ and $V_y$ be the x and y components of the optical flow vector of I(x, y, t), and let $I_x$, $I_y$ and $I_t$ be the partial differences of the image in the corresponding directions at (x, y, t). Under the assumptions of brightness constancy and small motion, the optical flow constraint equation follows:

  $I_x V_x + I_y V_y = -I_t$
  • The Lucas-Kanade optical flow method further assumes that spatially neighboring pixels move in unison: adjacent points in the scene project to adjacent points on the image and share the same velocity. This additional assumption is needed because the basic optical flow constraint is a single equation, while the velocity in the x and y directions involves two unknowns. Assuming the same motion across the neighborhood of a feature point yields n equations (n is the total number of points in the neighborhood, including the feature point itself):

  $\begin{bmatrix} I_x(q_1) & I_y(q_1) \\ \vdots & \vdots \\ I_x(q_n) & I_y(q_n) \end{bmatrix} \begin{bmatrix} V_x \\ V_y \end{bmatrix} = \begin{bmatrix} -I_t(q_1) \\ \vdots \\ -I_t(q_n) \end{bmatrix}$

  This overdetermined system is solved for $V_x$ and $V_y$ in the least-squares sense.
  • The small-motion assumption above breaks down when the target moves quickly; a multi-scale (pyramidal) approach solves this problem.
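  • OpenCV's pyramidal Lucas-Kanade tracker implements this multi-scale idea; the sketch below, with illustrative parameter values, recovers sparse point matches between the reference frame and the current frame:

```python
import cv2
import numpy as np

def sparse_flow_matches(ref_gray, cur_gray):
    """Track feature points from the reference frame into the current
    frame with pyramidal Lucas-Kanade, which tolerates larger motion
    than the single-scale small-motion assumption allows.
    """
    pts = cv2.goodFeaturesToTrack(ref_gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=8)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(
        ref_gray, cur_gray, pts, None,
        winSize=(21, 21), maxLevel=3)  # 3 pyramid levels = multi-scale
    good = status.ravel() == 1
    return pts[good].reshape(-1, 2), nxt[good].reshape(-1, 2)
```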
  • Figure 6 shows the alignment process for multi-frame images using optical flow.
  • the sparse matching points between the two images can be obtained, and then the image mapping model is calculated by using the coordinates of these points.
  • In the image alignment step, it is important to choose the correct spatial transformation model.
  • Common spatial transformation models include affine transformation and perspective transformation models.
  • An affine transformation can be visualized as follows: any parallelogram in a plane can be mapped to another parallelogram in the same plane, with different deformation parameters producing different kinds of parallelograms.
  • The perspective (projective) transformation is a more general model and is more flexible than the affine transformation: a perspective transformation can map a rectangle to a trapezoid, describing the projection of one plane in space onto another plane. The affine transformation is a special case of the perspective transformation.
  • The perspective model maps a point (x, y) to (x', y') via

  $\begin{bmatrix} x' \\ y' \\ w \end{bmatrix} \sim \begin{bmatrix} a_{00} & a_{01} & a_{02} \\ a_{10} & a_{11} & a_{12} \\ a_{20} & a_{21} & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$

  where $a_{02}$ and $a_{12}$ are the displacement parameters, $a_{00}$, $a_{01}$, $a_{10}$, $a_{11}$ are the scaling and rotation parameters, and $a_{20}$, $a_{21}$ are the amounts of deformation in the horizontal and vertical directions.
  • The perspective transformation model is adopted here mainly because the jitter of a handheld terminal is generally not confined to a single plane. The simplified motion model is shown in Figure 5.
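  • To make the role of these parameters concrete, a small sketch follows; the 3×3 matrix composes the eight parameters just described, and the numeric values are arbitrary illustrations, not values from the patent:

```python
import numpy as np

# Perspective (projective) model with the parameters described above:
# a02, a12 translate; a00, a01, a10, a11 scale/rotate/shear; a20, a21
# produce the out-of-plane (trapezoidal) deformation. Values are made up.
H = np.array([[1.02,  0.01,  5.0],
              [-0.01, 1.02, -3.0],
              [1e-5,  2e-5,  1.0]])

def warp_point(H, x, y):
    """Map (x, y) through the homogeneous perspective transform."""
    d = H[2, 0] * x + H[2, 1] * y + H[2, 2]  # a20 = a21 = 0 recovers the affine case
    return ((H[0, 0] * x + H[0, 1] * y + H[0, 2]) / d,
            (H[1, 0] * x + H[1, 1] * y + H[1, 2]) / d)

print(warp_point(H, 100.0, 200.0))
```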
  • Step 1103 When performing the fusion processing on the image data of each frame after the registration, the pixels in the black area at the image boundary are replaced with the pixels corresponding to the reference frame, and then the fusion processing is performed to obtain the target image.
  • Black edges are formed after the image mapping transformation; if the pixels at these black edges are fused by weighted averaging, a luminance difference appears that harms the overall appearance of the image.
  • Taking the first frame of image data in the image data stream as the reference frame as an example, all other frames are registered and fused against the first frame, and the reference frame itself has no black-edge pixels. For a frame with black edges, each pixel at a black edge is replaced with the pixel at the corresponding position of the reference frame before taking part in the weighted average, which effectively solves the black-edge problem.
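  • A sketch of this black-edge-aware fusion is given below; it assumes each frame was warped to the reference with a known homography and that invalid border pixels were filled with zeros (the names and the equal-weight average are illustrative assumptions):

```python
import cv2
import numpy as np

def fuse_with_black_edge_fix(ref, warped_frames, Hs):
    """Average the registered frames, substituting reference-frame
    pixels wherever warping left black borders."""
    h, w = ref.shape[:2]
    acc = ref.astype(np.float64)  # reference frame participates once
    for frame, H in zip(warped_frames, Hs):
        # Warp an all-ones mask with the same homography to find where
        # this frame actually has valid (non-border) pixels.
        valid = cv2.warpPerspective(np.ones((h, w), np.uint8), H, (w, h)) > 0
        valid3 = valid[..., None]
        # Black-border pixels take the reference frame's value instead.
        acc += np.where(valid3, frame, ref)
    n = len(warped_frames) + 1
    return (acc / n).astype(np.uint8)
```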
  • The embodiment thus provides an image processing method that uses image registration to align the multi-frame image data to be synthesized; the alignment tolerates image jitter within a certain range, so the final composite exhibits only small pixel deviations. More importantly, during image fusion the black edges introduced by registration are handled so that pixels transition naturally across the entire image.
  • FIG. 12 is a schematic flowchart of an image processing method according to Embodiment 4 of the present invention.
  • the image processing method in this example is applied to a terminal. As shown in FIG. 12, the image processing method includes the following steps:
  • Step 1201 In the handheld mode, an original image data stream is obtained using the electronic aperture, the original image data stream including multiple frames of original image data; each frame of original image data in the original image data stream is preprocessed to obtain the image data stream.
  • the terminal may be an electronic device such as a mobile phone or a tablet computer.
  • the terminal has a photographing function, and the photographing function of the terminal has an electronic aperture mode; when photographing by using an electronic aperture, the user needs to set the photographing function to the electronic aperture mode.
  • In the electronic aperture mode, after the user sets the aperture value, the terminal shoots continuously and without interruption during the exposure time, and then fuses the captured images.
  • The embodiment of the present invention adds a handheld mode for electronic aperture shooting. In the handheld mode, the user can conveniently shoot with the electronic aperture while holding the terminal.
  • an image data stream is first acquired, the image data stream comprising a plurality of frames of image data.
  • the captured original image data stream is first acquired synchronously; then the original image data stream is subjected to data reading and image preprocessing to obtain the image data stream.
  • The frames of the synchronously acquired original image data stream differ in illumination, noise, sharpness, and focus, so each frame of original image data in the original image data stream needs to go through a preprocessing stage.
  • the preprocessing process includes image filtering to suppress noise and contrast stretching to improve image clarity and reduce illumination differences between frames. After preprocessing, the differences between the frames of the image data stream are reduced, which improves the performance of the subsequent image registration algorithm.
  • Step 1202 Align each frame image data of the image data stream except the first frame image data with the reference frame by using the first frame image data in the image data stream as a reference frame.
  • the pixels in the same spatial position in each frame image data are aligned, thereby avoiding blurring when the image is merged.
  • registration is also called alignment, and there are many algorithms for image alignment, which are mainly divided into a local feature based method and a global feature based method.
  • the typical method based on local features is to extract the key feature points of the image, and then use these key feature points to calculate the mapping matrix of the image space alignment model, and finally use the mapping matrix to perform image alignment.
  • The registration effect of this type of method generally meets the requirements of many complex scenes, such as illumination changes (fusion of differently exposed images), large image shifts (panoramic stitching), and low-light images (increased noise). The other type is a search-based alignment method using global mutual information matching, which can reduce the matching errors caused by unreliable feature points.
  • Optical flow is also a point-based matching method: the optical flow field is the instantaneous velocity of the pixel motion of a moving object projected onto the imaging plane, characterized by the change of pixels in the time domain and the correlation between adjacent frames.
  • Step 1203 Display a prompt box on the display interface; the prompt box is used to prompt the orientation of the handheld shake when shooting in the handheld mode with the electronic aperture.
  • A user prompt box is displayed that indicates the direction of hand-held shake, so the user can correct it in time.
  • Step 1204 Analyze the registered image data of each frame to determine the black area at each frame's boundary; when fusing the registered frames, replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain a target image.
  • Specifically, the fusion process includes: determining, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; replacing the pixels of the black area at the image boundary with the pixels of that reference area; and then performing the fusion processing to obtain the target image.
  • the black area at the boundary of the image can be divided by a clustering algorithm, such as the K-means algorithm.
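  • For illustration, a two-cluster K-means split on pixel colors can isolate the dark border region, as sketched below; in practice the warped validity mask is an equally valid alternative, and the parameters here are assumptions:

```python
import cv2
import numpy as np

def black_border_mask(img):
    """Split pixels into a dark and a bright cluster with K-means (K=2),
    one way to isolate the black area described in the text."""
    samples = img.reshape(-1, 3).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, centers = cv2.kmeans(samples, 2, None, criteria,
                                    3, cv2.KMEANS_PP_CENTERS)
    dark = np.argmin(centers.sum(axis=1))  # cluster with lower intensity
    return (labels.ravel() == dark).reshape(img.shape[:2])
```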
  • After the image data of each frame is registered (that is, aligned), the images need to be fused; here, a fusion method that sequentially superimposes the image pixels is adopted.
  • Black edges are formed after the image mapping transformation; if the pixels at these black edges are fused by weighted averaging, a luminance difference appears that harms the overall appearance of the image.
  • Taking the first frame as the reference frame, the image data of every other frame is registered and fused against the first frame, and the reference frame has no black-edge pixels. Each pixel at a black edge is replaced with the pixel at the corresponding position of the reference frame before taking part in the weighted average, which effectively solves the black-edge problem.
  • The embodiment thus provides an image processing method that uses image registration to align the multi-frame image data to be synthesized; the alignment tolerates image jitter within a certain range, so the final composite exhibits only small pixel deviations. During image fusion, the black edges introduced by registration are handled so that pixels transition naturally across the entire image.
  • In addition, a prompt box is added to prevent excessive hand-held movement of the terminal during shooting.
  • FIG. 14 is a schematic structural diagram of a terminal according to Embodiment 2 of the present invention. As shown in FIG. 14, the terminal includes:
  • An acquiring unit 41 configured to acquire an image data stream by using an electronic aperture, where the image data stream includes multi-frame image data;
  • the registration unit 42 is configured to determine a reference frame from the image data stream, and register, in the image data stream, image data of each frame other than the reference frame with the reference frame;
  • the merging unit 43 is configured to, when fusing the registered image data of each frame, replace the pixels in the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain a target image.
  • the fusion unit 43 includes:
  • the analyzing sub-unit 431 is configured to analyze the image data of each frame after registration, and determine a black area at a boundary of each frame image;
  • the replacement and fusion sub-unit 432 is configured to, when fusing the registered image data of each frame, replace the pixels in the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain a target image.
  • the replacement and fusion subunit 432 is further configured to: determine, according to the black area at the image boundary, a reference area corresponding to the black area in the reference frame; and replace the pixels of the black area at the image boundary with the pixels in that reference area of the reference frame before performing the fusion processing to obtain the target image.
  • the registration unit 42 is further configured to take the first frame of image data in the image data stream as the reference frame and register each other frame of image data in the image data stream with the reference frame; wherein registration refers to aligning pixel points at the same spatial location.
  • the merging unit 43 is further configured to superimpose each pixel point of each frame image data after registration according to spatial position correspondence.
  • the terminal includes: a processor 1001, a camera 1002, and a display screen 1003.
  • the processor 1001, the camera 1002, and the display screen 1003 are all connected by a bus 1004.
  • the camera 1002 is configured to acquire an image data stream using an electronic aperture, the image data stream comprising a plurality of frames of image data;
  • the processor 1001 is configured to determine a reference frame from the image data stream, and register each frame of image data in the image data stream other than the reference frame with the reference frame; when the registered image data of each frame is subjected to the fusion processing, the pixels of the black area at the image boundary are replaced with the corresponding pixels of the reference frame, and then the fusion processing is performed to obtain the target image.
  • the camera 1002 is configured to obtain an original image data stream by using an electronic aperture in a handheld mode, where the original image data stream includes multiple frames of original image data;
  • the processor 1001 is configured to perform pre-processing on each frame of the original image data stream to obtain the image data stream, where the pre-processing includes at least one of the following: image filtering and contrast stretching; to determine a reference frame from the image data stream and register each frame of image data in the image data stream other than the reference frame with the reference frame; and, when fusing the registered image data of each frame, to replace the pixels of the black area at the image boundary with the corresponding pixels of the reference frame and then perform the fusion processing to obtain the target image.
  • the display screen 1003 is configured to display a prompt box on the display interface; the prompt box is used to indicate the direction of hand-held shake when photographing with the electronic aperture in the handheld mode.
  • Figure 15 is a block diagram of the electrical structure of the camera.
  • the lens 1211 is composed of a plurality of optical lenses for forming a subject image, and is a single focus lens or a zoom lens.
  • the lens 1211 is movable in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focus position of the lens 1211 in accordance with a control signal from the lens driving control circuit 1222, and in the case of the zoom lens, the focus distance can also be controlled.
  • the lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microprocessor 1217.
  • An imaging element 1212 is disposed in the vicinity of the position of the subject image formed by the lens 1211 on the optical axis of the lens 1211.
  • the imaging element 1212 is for capturing an image of a subject and acquiring captured image data.
  • Photodiodes constituting each pixel are arranged two-dimensionally and in a matrix on the imaging element 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and the photoelectric conversion current is charged by a capacitor connected to each photodiode.
  • a red-green-blue (RGB) color filter of a Bayer arrangement is disposed on the front surface of each pixel.
  • the imaging element 1212 is connected to the imaging circuit 1213.
  • the imaging circuit 1213 performs charge accumulation control and image signal readout control in the imaging element 1212, and performs waveform shaping after reducing the reset noise of the read image signal (analog image signal). Further, gain improvement or the like is performed to obtain an appropriate level signal.
  • the imaging circuit 1213 is connected to an A/D converter 1214 that performs analog-to-digital conversion on the analog image signal and outputs the resulting digital image signal (hereinafter referred to as image data) to the bus 1227.
  • the bus 1227 is a transmission path for transmitting various data read or generated inside the camera.
  • the A/D converter 1214 is connected to the bus 1227; an image processor 1215, a JPEG processor 1216, a microprocessor 1217, a Synchronous Dynamic Random Access Memory (SDRAM) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and a liquid crystal display (LCD) driver 1220 are also connected to the bus 1227.
  • the image processor 1215 performs optical black (OB, Optical Black) subtraction processing, white balance adjustment, color matrix calculation, gamma conversion, color difference signal processing, noise removal processing, and simultaneous processing on the image data based on the output of the imaging element 1212.
  • the JPEG processor 1216 compresses the image data read out from the SDRAM 1218 in accordance with the JPEG compression method when the image data is recorded on the storage medium 1225. Further, the JPEG processor 1216 performs decompression of JPEG image data for image reproduction display.
  • For image reproduction and display, a file recorded in the storage medium 1225 is read out and decompressed in the JPEG processor 1216, and the decompressed image data is temporarily stored in the SDRAM 1218 and displayed on the LCD 1226.
  • the JPEG method is adopted as the image compression/decompression method.
  • the compression/decompression method is not limited thereto, and other compression/decompression methods such as MPEG, TIFF, and H.264 may be used.
  • the microprocessor 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera.
  • the microprocessor 1217 is connected to the operating unit 1223 and the flash memory 1224.
  • the operating unit 1223 includes, but is not limited to, physical or virtual buttons, such as a power button, a camera button, an edit button, a dynamic image button, a reproduction button, a menu button, a cross button, an OK button, a delete button, an enlarge button, and various other input keys; the operating unit 1223 detects the operational state of these controls and outputs the detection result to the microprocessor 1217.
  • the front surface of the LCD 1226 serving as the display is provided with a touch panel that detects the user's touch position and outputs it to the microprocessor 1217.
  • the microprocessor 1217 executes various processing sequences corresponding to the user's operation in accordance with the detection result from the operation position of the operation unit 1223.
  • Flash memory 1224 stores programs for executing various processing sequences of microprocessor 1217.
  • the microprocessor 1217 performs overall control of the camera in accordance with the program. Further, the flash memory 1224 stores various adjustment values of the camera, and the microprocessor 1217 reads out the adjustment value, and performs control of the camera in accordance with the adjustment value.
  • the SDRAM 1218 is an electrically rewritable volatile memory for temporarily storing image data or the like.
  • the SDRAM 1218 temporarily stores image data output from the analog/digital (A/D) converter 1214 and image data processed in the image processor 1215, the JPEG processor 1216, and the like.
  • the memory interface 1219 is connected to the storage medium 1225, and performs control for writing image data and a file header attached to the image data to the storage medium 1225 and reading out from the storage medium 1225.
  • the storage medium 1225 can be implemented as a storage medium such as a memory card that can be detachably attached to the camera body.
  • the storage medium 1225 is not limited thereto, and may be a hard disk or the like built in the camera body.
  • the LCD driver 1220 is connected to the LCD 1226. Image data processed by the image processor 1215 is stored in the SDRAM 1218 and then read out and displayed on the LCD 1226; alternatively, image data compressed by the JPEG processor 1216 is stored in the SDRAM 1218, after which the JPEG processor 1216 reads the compressed image data from the SDRAM 1218, decompresses it, and the decompressed image data is displayed on the LCD 1226.
  • the LCD 1226 is disposed on the back side of the camera body for image display.
  • image display may be performed using various display panels based on organic EL, that is, OLED.
  • If the terminal in the embodiment of the present invention is implemented in the form of a software function module and sold or used as a stand-alone product, it may also be stored in a computer-readable storage medium.
  • Based on such understanding, the technical solution may be embodied as a software product stored in a storage medium, comprising instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the method described in the embodiments of the present invention.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a mobile hard disk, a ROM, a magnetic disk, or an optical disk.
  • embodiments of the invention are not limited to any specific combination of hardware and software.
  • an embodiment of the present invention further provides a computer storage medium in which a computer program is stored, the computer program being configured to perform an image processing method according to an embodiment of the present invention.
  • the disclosed method and smart device may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of the units is only a logical function division; in actual implementation there may be other division manners, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices or units, and may be electrical, mechanical, or in other forms.
  • the units described above as separate components may or may not be physically separated, and the components displayed as the unit may or may not be physical units, that is, may be located in one place or distributed to multiple network units; Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may serve as a separate unit, or two or more units may be integrated into one unit;
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • The technical solution of the embodiments of the invention adds a hand-held mode to the electronic aperture, which improves the convenience of the user's shooting, and introduces image registration, which guarantees the shooting effect and improves the user's shooting experience.
  • The embodiment of the present invention further adds a prompt box on the display interface to prevent excessive movement of the handheld terminal when the user shoots.
  • In this way, the user can shoot with the electronic aperture while holding the terminal, which improves shooting convenience, avoids blurred images caused by hand-held shake, guarantees the shooting effect, and improves the user's shooting experience.
  • When image fusion is performed, the pixel black edges caused by image registration are processed, ensuring natural pixel transitions at every position of the final image.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

Disclosed are an image processing method and a terminal, comprising: obtaining an image data stream using an electronic aperture, the image data stream comprising multiple frames of image data; registering the respective frames of image data in the image data stream; and fusing the registered frames of image data to obtain a target image. By adding a handheld mode for the electronic aperture, the embodiments of the present invention increase the user's convenience when photographing, thereby guaranteeing the photographic effect and improving the user's photographic experience.
PCT/CN2017/082941 2016-05-31 2017-05-03 Procédé de traitement d'image, terminal et support d'informations informatique WO2017206656A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN201610375523.3 2016-05-31
CN201610377764.1A CN105915796A (zh) 2016-05-31 2016-05-31 一种电子光圈拍摄方法及终端
CN201610377764.1 2016-05-31
CN201610375523.3A CN105898159B (zh) 2016-05-31 2016-05-31 一种图像处理方法及终端

Publications (1)

Publication Number Publication Date
WO2017206656A1 true WO2017206656A1 (fr) 2017-12-07

Family

ID=60479709

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/082941 WO2017206656A1 (fr) 2016-05-31 2017-05-03 Procédé de traitement d'image, terminal et support d'informations informatique

Country Status (1)

Country Link
WO (1) WO2017206656A1 (fr)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070053536A (ko) * 2005-11-21 2007-05-25 한국전자통신연구원 느린 셔터 속도에서 선명한 영상 검출 장치 및 방법
CN104751488A (zh) * 2015-04-08 2015-07-01 努比亚技术有限公司 运动物体的运动轨迹的拍摄方法及终端设备
CN105488756A (zh) * 2015-11-26 2016-04-13 努比亚技术有限公司 图片合成方法及装置
CN105611181A (zh) * 2016-03-30 2016-05-25 努比亚技术有限公司 多帧拍摄图像合成装置和方法
CN105898159A (zh) * 2016-05-31 2016-08-24 努比亚技术有限公司 一种图像处理方法及终端
CN105915796A (zh) * 2016-05-31 2016-08-31 努比亚技术有限公司 一种电子光圈拍摄方法及终端

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112106357A (zh) * 2018-05-11 2020-12-18 杜比实验室特许公司 端对端单层后向兼容编码流水线中的高保真度全参考和高效部分参考编码
CN112106357B (zh) * 2018-05-11 2024-03-12 杜比实验室特许公司 用于对图像数据进行编码和解码的方法及装置
CN109819163A (zh) * 2019-01-23 2019-05-28 努比亚技术有限公司 一种图像处理控制方法、终端及计算机可读存储介质
CN110598712A (zh) * 2019-08-28 2019-12-20 万维科研有限公司 物体位置识别方法、装置、计算机设备及存储介质
CN110598712B (zh) * 2019-08-28 2022-06-03 万维科研有限公司 物体位置识别方法、装置、计算机设备及存储介质
CN113518243A (zh) * 2020-04-10 2021-10-19 Tcl科技集团股份有限公司 一种图像处理方法及装置
CN112261290A (zh) * 2020-10-16 2021-01-22 海信视像科技股份有限公司 显示设备、摄像头以及ai数据同步传输方法
CN113114947A (zh) * 2021-04-20 2021-07-13 重庆紫光华山智安科技有限公司 对焦调节方法和装置、电子设备及存储介质
CN113192101A (zh) * 2021-05-06 2021-07-30 影石创新科技股份有限公司 图像处理方法、装置、计算机设备和存储介质
CN113192101B (zh) * 2021-05-06 2024-03-29 影石创新科技股份有限公司 图像处理方法、装置、计算机设备和存储介质
WO2022262599A1 (fr) * 2021-06-18 2022-12-22 影石创新科技股份有限公司 Procédé et appareil de traitement d'image, et dispositif informatique et support de stockage
CN113706421A (zh) * 2021-10-27 2021-11-26 深圳市慧鲤科技有限公司 一种图像处理方法及装置、电子设备和存储介质
CN113706421B (zh) * 2021-10-27 2022-02-22 深圳市慧鲤科技有限公司 一种图像处理方法及装置、电子设备和存储介质
CN114785957A (zh) * 2022-05-26 2022-07-22 维沃移动通信有限公司 拍摄方法及其装置

Similar Documents

Publication Publication Date Title
CN105898159B (zh) 一种图像处理方法及终端
WO2017206656A1 (fr) Procédé de traitement d'image, terminal et support d'informations informatique
CN106454121B (zh) 双摄像头拍照方法及装置
CN108605097B (zh) 光学成像方法及其装置
CN106937039B (zh) 一种基于双摄像头的成像方法、移动终端及存储介质
CN106612397A (zh) 一种图像处理方法及终端
US9159169B2 (en) Image display apparatus, imaging apparatus, image display method, control method for imaging apparatus, and program
WO2018019124A1 (fr) Procédé de traitement d'image et dispositif électronique et support d'informations
WO2017050115A1 (fr) Procédé de synthèse d'image
WO2017016511A1 (fr) Procédé et dispositif de traitement d'image ainsi que terminal
CN105915796A (zh) 一种电子光圈拍摄方法及终端
WO2018076938A1 (fr) Procédé et dispositif de traitement d'image et support de mise en mémoire informatique
CN114092364A (zh) 图像处理方法及其相关设备
CN105187724B (zh) 一种处理图像的移动终端和方法
CN106303290B (zh) 一种终端及获取视频的方法
CN107040723B (zh) 一种基于双摄像头的成像方法、移动终端及存储介质
CN107133939A (zh) 一种照片合成方法、设备及计算机可读存储介质
WO2017071542A1 (fr) Procédé et appareil de traitement d'image
CN111064895B (zh) 一种虚化拍摄方法和电子设备
CN106954020B (zh) 一种图像处理方法及终端
CN112995467A (zh) 图像处理方法、移动终端及存储介质
CN109120858B (zh) 一种图像拍摄方法、装置、设备及存储介质
CN106851125B (zh) 一种移动终端及多重曝光拍摄方法
CN106303229A (zh) 一种拍照方法及装置
CN113810590A (zh) 图像处理方法、电子设备、介质和系统

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17805601

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 30.04.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 17805601

Country of ref document: EP

Kind code of ref document: A1