CN107085841B - Picture zooming processing method and terminal - Google Patents

Info

Publication number: CN107085841B (granted publication of application CN201710113484.4A; earlier published as CN107085841A)
Authority: CN (China)
Prior art keywords: picture, zoom, pixel, optical flow, value
Legal status: Active (the legal status is an assumption and is not a legal conclusion; no legal analysis has been performed and no representation is made as to the accuracy of the status listed)
Inventor: 孟勇
Current and original assignee: Nubia Technology Co Ltd
Other languages: Chinese (zh)
Events: application filed by Nubia Technology Co Ltd; publication of CN107085841A; application granted; publication of CN107085841B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/50: Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10004: Still image; Photographic image
    • G06T2207/20: Special algorithmic details
    • G06T2207/20212: Image combination
    • G06T2207/20221: Image fusion; Image merging

Abstract

The invention discloses a picture zooming processing method and a terminal. An original picture to be zoomed is copied to obtain a copy picture; the optical flow pixel value of each pixel point on the copy picture is then calculated according to the zoom parameters, and the pixel value of each pixel point on the copy picture is updated to its optical flow pixel value to obtain an optical flow picture; finally, the optical flow picture and the original picture are weighted and superposed to obtain a zoom composite picture, i.e. the zoomed picture. The method can process a captured picture to simulate a picture shot with a zoom lens, so a zoomed picture can be obtained without an expensive professional camera, reducing cost and improving user satisfaction.

Description

Picture zooming processing method and terminal
Technical Field
The invention relates to the technical field of terminals, in particular to a picture zooming processing method and a terminal.
Background
Traditional zoom-burst photography is possible only on a professional camera, realized through its zoom lens. For example, on a single-lens reflex camera, a zoom lens is used with a small aperture and a long exposure. After the shutter is pressed, the camera exposes for one or two seconds while the zoom ring is carefully rotated to change the focal length; the larger the zoom travel, the more pronounced the effect. The resulting picture shows radial streaking of the scene, which is visually striking. However, terminals with photographing functions but without zoom lenses, such as mobile phones, iPads, and e-readers, use fixed-focus lenses, so zoomed pictures cannot be obtained on such terminals, and user experience satisfaction is poor.
Disclosure of Invention
The technical problem to be solved by the invention is that a photographing terminal equipped only with a fixed-focus lens cannot obtain a picture with a zoom effect, giving poor user experience satisfaction; the invention therefore provides a picture zooming processing method and a terminal.
In order to solve the above technical problem, the present invention provides a method for processing picture zooming, comprising:
obtaining a copy picture of the original picture;
acquiring zoom parameters of the copied picture;
calculating the optical flow pixel value of each pixel point on the copy picture according to the zoom parameter, and updating the pixel value of each pixel point on the copy picture to its optical flow pixel value to obtain an optical flow picture;
and performing weighted superposition on the optical flow picture and the original picture to obtain a zoom synthetic picture.
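A minimal sketch of the four steps above, assuming a grayscale image held in a NumPy array; the function name `zoom_blur`, the equal per-sample weights, and the rounding choices are illustrative assumptions, not part of the claimed method:

```python
import numpy as np

def zoom_blur(original, center, k, w1=1.0, w2=0.0):
    """Sketch of the claimed pipeline: copy the picture, build the
    optical flow picture by averaging along the line toward the zoom
    center, then weight and superpose with the original."""
    src = np.asarray(original, dtype=np.float64)       # step 1: copy picture
    h, w = src.shape
    cy, cx = center
    flow = np.empty_like(src)
    for y in range(h):                                 # steps 2-3: optical flow picture
        for x in range(w):
            L = np.hypot(y - cy, x - cx)               # line length Lj to the center
            if L == 0:
                flow[y, x] = src[y, x]                 # center pixel: nothing to average
                continue
            d = (k - 1.0) / k * L                      # zoom length dj (dj < Lj)
            n = max(int(round(d)), 1)
            s = np.linspace(0.0, d, n)                 # sample positions along the line
            ys = np.clip(np.round(y + s * (cy - y) / L).astype(int), 0, h - 1)
            xs = np.clip(np.round(x + s * (cx - x) / L).astype(int), 0, w - 1)
            flow[y, x] = src[ys, xs].mean()            # P(j) with equal sample weights
    return w1 * flow + w2 * src                        # step 4: weighted superposition
```

With `w1=1.0, w2=0.0` this corresponds to the daytime special-effect blend described by the claims; other weight choices produce a partial mix of the streaked and original pictures.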
Optionally, the weighted superposition of the optical flow picture and the original picture comprises:
multiplying the optical flow pixel value P(j) of a pixel point j on the optical flow picture by an optical flow pixel weight W1(j), multiplying the original pixel value Src(j) of the corresponding pixel point j on the original picture by an original pixel weight W2(j), and updating the pixel value of pixel point j on the optical flow picture to the sum of the two products, where 1 ≤ j ≤ N and N is the total number of pixel points on the copy picture.
Optionally, the zoom parameter comprises a special effect mode of the zoom processing;
in the daytime special effect mode, W1(j) is 1 and W2(j) is 0;
in the night special effect mode, the ratio of W1(j) to W2(j) equals the ratio of the gray value of pixel point j on the optical flow picture to the gray value of pixel point j on the original picture, and W1(j) + W2(j) = 1.
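The two special-effect weighting rules can be written as a small helper; the even split when both gray values are zero is an added assumption to avoid division by zero, not something the source specifies:

```python
def blend_weights(mode, gray_flow=None, gray_orig=None):
    """Per-pixel weights (W1, W2) for the weighted superposition.
    Daytime mode: the optical flow picture fully replaces the original.
    Night mode: W1/W2 equals the ratio of the gray value on the optical
    flow picture to the gray value on the original, with W1 + W2 = 1."""
    if mode == "day":
        return 1.0, 0.0
    total = gray_flow + gray_orig
    if total == 0:                      # both pixels black: split evenly (assumption)
        return 0.5, 0.5
    return gray_flow / total, gray_orig / total
```

Solving W1/W2 = gf/go together with W1 + W2 = 1 gives W1 = gf/(gf + go) and W2 = go/(gf + go), which is what the night branch computes.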
Optionally, the zoom parameter includes position information of the zoom center point on the copy picture and a zoom factor of the copy picture;
calculating the optical flow pixel value of each pixel point on the copy picture according to the zoom parameter comprises:
acquiring a zoom length dj corresponding to pixel point j on the copy picture according to the zoom factor;
acquiring the pixel values and weights of all pixel points within the zoom length dj along the straight line connecting pixel point j and the zoom center point;
summing the products of the pixel value and weight of each pixel point within the zoom length dj, and dividing the sum by the zoom length dj to obtain the optical flow pixel value P(j) of pixel point j; the zoom length dj is less than the length Lj of the connecting line, and 1 ≤ j ≤ N, where N is the total number of pixel points on the copy picture.
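The averaging step, taken in isolation, is a sketch like the following; note that, as stated, the sum of products is divided by the zoom length dj rather than by the sum of the weights, so with unit weights and dj samples it reduces to an ordinary mean (the function name is illustrative):

```python
def optical_flow_value(samples, weights, d):
    """P(j) = sum_i(pixel_i * weight_i) / dj, where `samples` and
    `weights` are the pixel values and weights of the pixel points
    lying within the zoom length d along the line toward the center."""
    return sum(p * w for p, w in zip(samples, weights)) / d
```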
Optionally, the zoom factor comprises a zoom factor kl in the length direction of the picture and a zoom factor kw in the width direction;
obtaining the zoom length dj corresponding to pixel point j on the copy picture according to the zoom factor comprises:
acquiring the length Lj of the straight line connecting pixel point j and the zoom center point;
calculating k as the square root of the product of the zoom factor kl and the zoom factor kw;
taking dj = ((k - 1) / k) × Lj as the zoom length.
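The zoom-length calculation can be written directly (a sketch; `zoom_length` is an illustrative name). Since (k - 1)/k < 1 for any k ≥ 1, the resulting dj is always less than Lj, as required:

```python
import math

def zoom_length(Lj, kl, kw):
    """dj = ((k - 1) / k) * Lj, with k the geometric mean of the
    length-direction and width-direction zoom factors."""
    k = math.sqrt(kl * kw)
    return (k - 1.0) / k * Lj
```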
Optionally, the zoom factor kl is equal to the zoom factor kw; the zoom parameter comprises a special effect mode of the zoom processing. In the daytime special effect mode, the zoom factors kl and kw take values between m1 and m2 inclusive; in the night special effect mode, they take values between m3 and m4 inclusive;
where 1 ≤ m1 ≤ m3 and m2 < m4.
Optionally, the zoom parameter comprises a special effect mode of the zoom processing; obtaining the weight of pixel point j on the copy picture comprises the following steps:
judging whether the special effect mode of the current zoom processing is the daytime special effect mode; if so, taking the weight as 1; otherwise, acquiring the gray value of pixel point j on the copy picture and obtaining the weight from a preset correspondence between gray values and weights.
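A sketch of this weight lookup; the preset gray-value-to-weight correspondence is not specified by the source, so the linear table below is purely a placeholder assumption:

```python
def pixel_weight(mode, gray, gray_to_weight=None):
    """Weight of pixel point j on the copy picture. Daytime mode: 1.
    Otherwise the gray value is looked up in a preset gray-to-weight
    table; the linear 0..255 -> 0..1 default here is an assumption."""
    if mode == "day":
        return 1.0
    table = gray_to_weight or {g: g / 255.0 for g in range(256)}
    return table[gray]
```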
The present invention also provides a terminal, comprising:
the picture acquisition module is used for acquiring a copy picture of an original picture;
the parameter acquisition module is used for acquiring the zoom parameter of the copied picture;
the optical flow picture generation module is used for calculating the optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter and updating the pixel value of each pixel point on the copy picture into the respective optical flow image pixel value to obtain an optical flow picture;
and the synthesis module is used for weighting and superposing the optical flow picture and the original picture to obtain a zoom synthesis picture.
Optionally, the synthesizing module is configured to multiply the optical flow pixel value P(j) of a pixel point j on the optical flow picture by an optical flow pixel weight W1(j), multiply the original pixel value Src(j) of the corresponding pixel point j on the original picture by an original pixel weight W2(j), and update the pixel value of pixel point j on the optical flow picture to the sum of the two products, where 1 ≤ j ≤ N and N is the total number of pixel points on the copy picture.
Optionally, the zoom parameter includes position information of the zoom center point on the copy picture and a zoom factor of the copy picture;
the optical flow picture generation module is used for acquiring a zoom length dj corresponding to pixel point j on the copy picture according to the zoom factor, acquiring the pixel values and weights of all pixel points within the zoom length dj along the straight line connecting pixel point j and the zoom center point, summing the products of pixel value and weight, and dividing the sum by the zoom length dj to obtain the optical flow pixel value P(j) of pixel point j; the zoom length dj is less than the length Lj of the connecting line, and 1 ≤ j ≤ N, where N is the total number of pixel points on the copy picture.
Advantageous effects
The picture zooming processing method and terminal provided by the invention copy an original picture to be zoomed (which can be a captured picture or any picture from another source) to obtain a copy picture; the optical flow pixel value of each pixel point on the copy picture is then calculated according to the zoom parameters, and the pixel value of each pixel point is updated to its optical flow pixel value to obtain an optical flow picture; finally, the optical flow picture and the original picture are weighted and superposed to obtain a zoom composite picture, i.e. the zoomed picture. The method can process a captured picture to simulate a picture shot with a zoom lens, so a zoomed picture can be obtained without an expensive professional camera, reducing cost and improving user satisfaction.
Drawings
The invention will be further described with reference to the accompanying drawings and examples, in which:
fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram illustrating an alternative camera hardware configuration for implementing various embodiments of the invention;
fig. 3 is a schematic flow chart of a picture zooming processing method according to an embodiment of the present invention;
FIG. 4 is a schematic flow chart illustrating a process of calculating an optical flow map pixel value of each pixel point on a copy picture according to an embodiment of the present invention;
fig. 5 is a schematic flow chart of calculating the zoom length according to the first embodiment of the present invention;
FIG. 6 is a schematic drawing of the zoom length according to the first embodiment of the present invention;
fig. 7 is a schematic diagram of distribution of pixel points of a picture according to an embodiment of the present invention;
fig. 8 is a schematic flow chart of a weight obtaining process according to a first embodiment of the present invention;
FIG. 9 is a diagram illustrating an original picture according to a second embodiment of the present invention;
fig. 10 is a schematic view of a zoom composite picture according to a second embodiment of the present invention;
fig. 11 is a schematic structural diagram of a terminal according to a third embodiment of the present invention;
fig. 12 is a schematic diagram of an original picture according to a fourth embodiment of the present invention;
fig. 13 is a schematic view of a zoom composite picture according to the fourth embodiment of the present invention.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The terminal copies an original picture to be zoomed (which can be a captured picture or any picture from another source) to obtain a copy picture, calculates the optical flow pixel value of each pixel point on the copy picture according to the zoom parameters, updates the pixel value of each pixel point on the copy picture to its optical flow pixel value to obtain an optical flow picture, and finally weights and superposes the optical flow picture and the original picture to obtain the zoom composite picture, i.e. the zoomed picture. In this way a captured picture can be processed to simulate a picture shot with a zoom lens, so a zoomed picture can be obtained without an expensive professional camera, reducing cost and improving user satisfaction.
The terminal in the invention comprises mobile terminals with a camera application and fixed terminals with a camera application. The following description takes a mobile terminal as an example. The mobile terminal may be implemented in various forms. For example, the terminal described in the present invention may include mobile terminals such as a mobile phone, a smart phone, a notebook computer, a digital broadcast receiver, a PDA (personal digital assistant), a PAD (tablet computer), a PMP (portable multimedia player), and a navigation device, as well as fixed terminals such as a digital TV and a desktop computer. In the following, it is assumed that the terminal is a mobile terminal. However, it will be understood by those skilled in the art that the configuration according to the embodiments of the present invention can also be applied to fixed terminals, apart from elements specifically intended for mobile use.
Fig. 1 is a schematic diagram of a hardware structure of an optional mobile terminal for implementing various embodiments of the present invention.
The mobile terminal 100 may include a wireless communication unit 110, an a/V (audio/video) input unit 120, a user input unit 130, a sensing unit 140, an output unit 150, a memory 160, an interface unit 170, a controller 180, and a power supply unit 190, etc. Fig. 1 illustrates a mobile terminal having various components, but it is to be understood that not all illustrated components are required to be implemented. More or fewer components may alternatively be implemented. Elements of the mobile terminal will be described in detail below.
The wireless communication unit 110 typically includes one or more components that allow radio communication between the mobile terminal 100 and a wireless communication system or network. For example, the wireless communication unit may be a mobile communication module, a wireless internet module, a short-range communication module, or the like.
The A/V input unit 120 is used to receive audio or video signals. The A/V input unit 120 may include a camera 121 and a microphone 122. The camera 121 processes image data of still pictures or video obtained by an image capturing apparatus in a video capturing mode or an image capturing mode, and the processed image frames may be displayed on the display module 151. The image frames processed by the camera 121 may be stored in the memory 160 (or other storage medium) or transmitted via the wireless communication unit 110; two or more cameras 121 may be provided according to the construction of the mobile terminal. The microphone 122 may receive sounds (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and can process such sounds into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the mobile communication module 112 and output. The microphone 122 may implement various types of noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated in the course of receiving and transmitting audio signals.
The user input unit 130 may generate key input data according to a command input by a user to control various operations of the mobile terminal. The user input unit 130 allows a user to input various types of information, and may include a keyboard, dome sheet, touch pad (e.g., a touch-sensitive member that detects changes in resistance, pressure, capacitance, and the like due to being touched), scroll wheel, joystick, and the like. In particular, when the touch pad is superimposed on the display module 151 in the form of a layer, a touch screen may be formed.
The sensing unit 140 detects a current state of the mobile terminal 100 (e.g., an open or closed state of the mobile terminal 100), a position of the mobile terminal 100, presence or absence of contact (i.e., touch input) by a user with the mobile terminal 100, an orientation of the mobile terminal 100, acceleration or deceleration movement and direction of the mobile terminal 100, and the like, and generates a command or signal for controlling an operation of the mobile terminal 100.
The interface unit 170 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The identification module may store various information for authenticating a user of the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM), and the like. In addition, a device having an identification module (hereinafter referred to as an "identification device") may take the form of a smart card; the identification device may therefore be connected with the mobile terminal 100 via a port or other connection means. The interface unit 170 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the mobile terminal 100, or may be used to transmit data between the mobile terminal and the external device.
In addition, when the mobile terminal 100 is connected with an external cradle, the interface unit 170 may serve as a path through which power is supplied from the cradle to the mobile terminal 100 or may serve as a path through which various command signals input from the cradle are transmitted to the mobile terminal. Various command signals or power input from the cradle may be used as signals for recognizing whether the mobile terminal is accurately mounted on the cradle. The output unit 150 is configured to provide output signals (e.g., audio signals, video signals, alarm signals, vibration signals, etc.) in a visual, audio, and/or tactile manner.
The output unit 150 may include a display module 151, an audio output module 152, and the like.
The display module 151 may display information processed in the mobile terminal 100. For example, when the mobile terminal 100 is in a phone call mode, the display module 151 may display a user interface (UI) or graphical user interface (GUI) associated with a call or other communication (e.g., text messaging, multimedia file download, etc.). As another example, it may play and display various video files stored in the terminal, including but not limited to recorded video files and video files acquired from a network or other terminals. The display module 151 may also display a captured image and/or a received image, or a UI or GUI showing a video or image and related functions, when the mobile terminal 100 is in a video call mode or an image capturing mode.
Meanwhile, when the display module 151 and the touch pad are stacked on each other in the form of layers to form a touch screen, the display module 151 may serve as an input device and an output device. The display module 151 may include at least one of a Liquid Crystal Display (LCD), a thin film transistor LCD (TFT-LCD), an Organic Light Emitting Diode (OLED) display, a flexible display, a three-dimensional (3D) display, and the like. Some of these displays may be configured to be transparent to allow a user to view from the outside, which may be referred to as transparent displays, and a typical transparent display may be, for example, a TOLED (transparent organic light emitting diode) display or the like. The mobile terminal 100 may include two or more display modules (or other display devices) according to a particular desired implementation, for example, the mobile terminal may include an external display module (not shown) and an internal display module (not shown). The touch screen may be used to detect a touch input pressure as well as a touch input position and a touch input area.
The audio output module 152 may convert audio data received by the wireless communication unit 110 or stored in the memory 160 into an audio signal and output as sound when the mobile terminal is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output module 152 may provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output module 152 may include a speaker, a buzzer, and the like.
The memory 160 may store software programs and the like for processing and controlling operations performed by the controller 180, or may temporarily store data (e.g., a phonebook, messages, still images, videos, and the like) that has been or will be output. Also, the memory 160 may store data regarding various ways of vibration and audio signals output when a touch is applied to the touch screen. The manner in which the memory 160 stores data may be stored in a data buffer queue, which may be generated by a queue generating module in the controller 180, and the rules for storing data in the data buffer queue may be controlled by a storage control module in the controller 180. It should be understood that the queue generating module and the storage control module may be built in the controller 180 or may be separately provided from the controller 180.
The memory 160 may include at least one type of storage medium including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory, etc.), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. Also, the mobile terminal 100 may cooperate with a network storage device that performs a storage function of the memory 160 through a network connection.
The controller 180 generally controls the overall operation of the mobile terminal. For example, the controller 180 performs control and processing related to voice calls, data communications, video calls, and the like. In addition, the controller 180 may include a document scanning module 1810 for performing document scanning, and a processing module 1820 for performing document recording processing.
The power supply unit 190 receives external power or internal power and provides appropriate power required to operate various elements and components under the control of the controller 180.
The various embodiments described herein may be implemented in a computer-readable medium using, for example, computer software, hardware, or any combination thereof. For a hardware implementation, the embodiments described herein may be implemented using at least one of an application-specific integrated circuit (ASIC), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field-programmable gate array (FPGA), a processor, a controller, a microcontroller, a microprocessor, or an electronic unit designed to perform the functions described herein; in some cases, such embodiments may be implemented in the controller 180. For a software implementation, an implementation such as a process or a function may be realized as a separate software module that performs at least one function or operation. The software code may be implemented by a software application (or program) written in any suitable programming language, stored in the memory 160, and executed by the controller 180.
Up to this point, mobile terminals have been described in terms of their functionality. Among the various types of mobile terminals, such as folder-type, bar-type, swing-type, and slide-type terminals, the slide-type mobile terminal will be described below as an example for the sake of brevity. The present invention is not, however, limited to the slide-type mobile terminal and can be applied to any type of mobile terminal.
An electrical configuration block diagram of the camera will now be described with reference to fig. 2.
The photographing lens 1211 is composed of a plurality of optical lenses for forming an object image, and is a single focus lens or a zoom lens. The photographing lens 1211 is movable in the optical axis direction under the control of the lens driver 1221, and the lens driver 1221 controls the focal position of the photographing lens 1211 in accordance with a control signal from the lens driving control circuit 1222. The lens drive control circuit 1222 performs drive control of the lens driver 1221 in accordance with a control command from the microcomputer 1217.
An image pickup device 1212 is disposed on the optical axis of the photographing lens 1211, near the position of the object image formed by the photographing lens 1211. The image pickup device 1212 is used to pick up an image of an object and acquire picked-up image data. Photodiodes constituting each pixel are two-dimensionally arranged in a matrix on the image pickup device 1212. Each photodiode generates a photoelectric conversion current corresponding to the amount of received light, and this current is accumulated by a capacitor connected to each photodiode. A Bayer RGB color filter is disposed on the front surface of each pixel.
The image pickup device 1212 is connected to an image pickup circuit 1213, and the image pickup circuit 1213 performs charge accumulation control and image signal reading control in the image pickup device 1212, performs waveform shaping after reducing reset noise for the read image signal (analog image signal), and further performs gain improvement or the like so as to obtain an appropriate signal level.
The imaging circuit 1213 is connected to an a/D converter 1214, and the a/D converter 1214 performs analog-to-digital conversion on the analog image signal and outputs a digital image signal (hereinafter referred to as image data) to the bus 1227.
The bus 1227 is a transfer path for transferring various data read out or generated inside the camera. The A/D converter 1214 described above is connected to the bus 1227, as are an image processor 1215, a JPEG processor 1216, a microcomputer 1217, an SDRAM (synchronous dynamic random access memory) 1218, a memory interface (hereinafter referred to as memory I/F) 1219, and an LCD (Liquid Crystal Display) driver 1220.
The image processor 1215 performs various image processing, such as OB subtraction processing, white balance adjustment, color matrix operation, gamma conversion, color difference signal processing, noise removal processing, synchronization processing, and edge processing, on image data output from the image pickup device 1212. The JPEG processor 1216 compresses image data read out from the SDRAM 1218 using the JPEG compression method when recording the image data on the recording medium 1225, and decompresses JPEG image data for image reproduction and display. During decompression, a file recorded on the recording medium 1225 is read out, decompressed in the JPEG processor 1216, temporarily stored in the SDRAM 1218, and displayed on the LCD 1226. In the present embodiment the JPEG system is used for image compression and decompression, but the compression/decompression system is not limited to JPEG; other systems such as MPEG, TIFF, and H.264 may be used.
The microcomputer 1217 functions as a control unit of the entire camera, and collectively controls various processing sequences of the camera. The microcomputer 1217 is connected to an operation unit 1223 and a flash memory 1224.
The operation unit 1223 includes, but is not limited to, physical or virtual operation controls, such as a power button, a photographing key, an editing key, a moving image button, a reproduction button, a menu button, a cross key, an OK button, a delete button, and an enlargement button, and detects the operation states of these controls.
The detection results are output to the microcomputer 1217. A touch panel is provided on the front surface of the LCD 1226 serving as the display; it detects the position touched by the user and outputs it to the microcomputer 1217. The microcomputer 1217 executes various processing sequences corresponding to the user's operation according to the detected operation position from the operation unit 1223.
The flash memory 1224 stores programs for executing various processing sequences of the microcomputer 1217. The microcomputer 1217 controls the entire camera according to the program. The flash memory 1224 stores various adjustment values of the camera, and the microcomputer 1217 reads the adjustment values and controls the camera in accordance with the adjustment values.
The SDRAM1218 is an electrically rewritable volatile memory for temporarily storing image data and the like. The SDRAM1218 temporarily stores the image data output from the a/D converter 1214 and the image data processed by the image processor 1215, JPEG processor 1216, and the like.
The memory interface 1219 is connected to the recording medium 1225, and performs control for writing and reading image data and data such as a file header added to the image data to and from the recording medium 1225. The recording medium 1225 is, for example, a recording medium such as a memory card that can be attached to and detached from the camera body, but is not limited to this, and may be a hard disk or the like that is built in the camera body.
The LCD driver 1210 is connected to the LCD1226, and stores the image data processed by the image processor 1215 in the SDRAM1218, and when display is required, reads the image data stored in the SDRAM1218 and displays the image data on the LCD1226, or the image data compressed by the JPEG processor 1216 is stored in the SDRAM1218, and when display is required, the JPEG processor 1216 reads the compressed image data in the SDRAM1218, decompresses the data, and displays the decompressed image data through the LCD 1226.
The LCD1226 is disposed on the back surface of the camera body and displays an image. The LCD1226LCD is not limited to this, and various display panels (LCD1226) such as organic EL may be used.
The camera shown in fig. 2 can complete video recording under the control of the recording module 1830 to obtain corresponding video data, and store the video data in a data buffer queue in the memory according to a certain rule under the control of the storage control module 1820. For convenience of understanding, the following presents various embodiments of the present invention based on the above-described hardware structure of the mobile terminal and an electrical schematic diagram of the camera.
First embodiment
The present embodiment provides a method for processing picture zooming, which is suitable for performing zooming processing on any picture, including a picture taken through a fixed-focus lens or a zoom lens, and an effect of the picture obtained by performing the zooming processing on the picture through the method for processing picture zooming provided by the present embodiment can achieve an effect of a zoomed picture taken through the zoom lens. Referring to fig. 3, the method for processing picture zooming in the present embodiment includes:
s301: a copy of the original picture is obtained.
The original picture in this embodiment may be a picture taken by a camera, or may be a picture or a photograph that is local to the terminal or acquired from another terminal or a network. And copying the original picture to obtain a copy picture of the original picture.
It should be understood that in some application scenarios, the original picture may not be copied, but may be directly used, for example, when the weighted values of the original picture are all 0 when the final weighted overlap is performed, the original picture may not be copied. Of course, if archiving is considered, a copy of the original picture can still be archived at this point.
It should be understood that the copied picture and the original picture are identical, and any one of the copied picture and the original picture can be selected for use in the subsequent use.
S302: and acquiring zoom parameters of the copied picture.
The zoom-out parameter in this embodiment may be obtained according to a selection instruction issued by a user, or may be automatically obtained according to a preset rule, for example, the picture is automatically analyzed, and a preset zoom-out parameter meeting the condition is automatically obtained according to an analysis result.
The zoom-in parameter in this embodiment includes, but is not limited to, a special effect mode of the zoom-in process (for example, including, but not limited to, a day special effect mode, a night special effect mode), a zoom-in center point, at least one of a zoom-in multiple, and in some examples, a weight value.
S303: and calculating the optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter, and updating the pixel value of each pixel point on the copy picture into the respective optical flow image pixel value to obtain the optical flow picture.
In this embodiment, the method for processing the copied picture according to the zoom parameter to obtain the optical flow picture may be implemented by any method for obtaining the optical flow picture through calculation.
S304: and weighting and superposing the obtained optical flow picture and the original picture to obtain a zoom synthetic picture, wherein the display effect of the obtained zoom synthetic picture is the same as the zoom effect of the zoom lens.
According to the method, the effect of zooming photos shot by the zoom lens can be simulated by processing the common pictures through the process, so that the terminal without the zoom lens can obtain the photos with the zooming photo effect, and the user experience is improved.
In one example, the obtained zoom parameters include zoom center point location information on the copy picture and a zoom multiple of the copy picture. The zoom-in central point can be flexibly set by a user, and can also be automatically set according to preset conditions, for example, the geometric central point on the copy picture can be defaulted as the zoom-in central point. The zoom-in central point can also be designated by the user in the copy picture, and specifically, the position clicked by the user on the picture can be detected and taken as the zoom-in central point. The specific detection and position calculation methods are not described herein.
The zoom factor can be freely set by the user or automatically set. In one example, two modes can be provided, in which a user can freely set, in which a zoom factor reference example can be provided at the same time for the user to select setting, and in which the zoom factor reference example can be automatically set according to a preset condition.
In S303, a process of calculating an optical flow image pixel value of each pixel point on the copy image according to the obtained zoom parameter is shown in fig. 4, and includes:
s401: and acquiring the zoom length dj corresponding to the pixel point j on the copy picture according to the zoom multiple.
S402: and acquiring pixel values and weights of all pixel points in a focus pulling length dj area on a straight line connecting the pixel point j and the focus pulling central point.
S403: adding the product of the pixel value and the weight value of each pixel point in the area of the zoom length dj, and dividing the sum by the zoom length dj to obtain the optical flow pixel value P (j) of the pixel point j; wherein the zoom length dj is less than the linear connecting line length Lj, j is more than or equal to 1 and less than or equal to the total number N of the pixel points on the copy picture.
The zoom factor in this embodiment includes a zoom factor kl in the picture length direction and a zoom factor kw in the picture width direction. It should be understood that the picture length direction zoom factor kl and the width direction zoom factor kw may be set to be the same or different. In order to ensure that the picture is not distorted after being processed, the zoom factor kl in the length direction of the picture and the zoom factor kw in the width direction of the picture are preferably set to be equal.
In S401, obtaining the zoom length dj corresponding to the pixel point j on the copy picture according to the zoom multiple is shown in fig. 5, and includes:
s501: acquiring the linear connection length Lj of the pixel point j and the focus pulling central point;
s502: calculating the square root k of the product of the zoom factor kl and the zoom factor kw;
s503: dividing k-1 by k, and multiplying the result by the linear link length Lj to obtain a value which is used as the zoom length dj, namely:
Figure BDA0001235071020000171
in the above formula, Lj is the length of the straight line connecting the pixel point j and the focus pulling central point, and k is the square root k of the product of the focus pulling multiple kl and the focus pulling multiple kw.
The process of obtaining the above formula (1) will be described below with reference to the drawings.
The zooming process is equivalent to the image enlarging process, please refer to fig. 6, where the zooming times kl in the length direction of the image and the zooming times kw in the width direction of the image are set to be equal to k, and k is greater than or equal to 1, and the origin in the image is the center point of the image. Under the zoom multiple, the pixel point j 'can move to the pixel point j, the moving distance of the pixel point j' is dj, and the moving distance is equal to:
dj=(k-1)*L…………………………………………………(2)
in the above formula, L is the linear distance from the pixel point j' to the focus pulling central point.
The linear distance Lj from the pixel point j to the focus pulling central point is equal to:
Lj=L+dj………………………………………………………(3)
the above formula (1) can be obtained by combining the above formula (2) and formula (3). In this embodiment, to avoid that the pixel points at the edge of the picture will move out of the picture due to the movement from j 'to j, the moving direction is taken as the movement from j to j'. Lj in the pattern (1) can be directly calculated by a linear distance calculation formula of two points, and k is known, so that dj, namely the zoom length, can be calculated.
In the above S402, specifically, the pixel point j may be used as a starting point, and the pixel values and weights of all the pixel points in the area of the focus pulling length dj on the straight line connecting the pixel point j with the focus pulling central point are sequentially obtained. Of course, in some examples, the pixel point j may not be used as a starting point, for example, one or 2 pixel points adjacent to the pixel point j may be used as a starting point. See, for example, fig. 7: assuming that the current pixel point is j ═ 0, the calculated d0 is shown in fig. 7, and the pixel values P (0), P (1), P (2), P (3), and P (4) of all the pixel points 0, 1, 2, 3, and 4 in d0 and the weights W0(0), W0(1), W0(2), W0(3), and W0(4) of the respective pixel points are obtained from the pixel point 0, and the optical flow pixel value P (0) of the pixel point 0 can be calculated according to the following formula:
Figure BDA0001235071020000181
in the above formula, p (j) is the original pixel value of the pixel, and W0(j) is the weight of the pixel j.
It should be understood that for a pixel point with a distance Lj from the focus center point of 0, the corresponding focus length dj is also 0, and therefore the optical flow pixel value p (j) of the pixel point is equal to the original pixel value.
In this embodiment, when the zoom-in parameter includes a day special effect mode and a night special effect mode, the special effect mode may be flexibly selected by a user, or may be automatically selected after an image is automatically analyzed. Wherein:
in the daytime special effect mode, the zoom factor kl and the zoom factor kw are taken to be more than or equal to m1M is less than or equal to2A value of (d);
in the night special effect mode, taking the zoom factor kl and the zoom factor kw as m or more3M is less than or equal to4A value of (d);
m11 or more and m or less3;m2Less than m4For example:
in a daytime special effect mode, taking the zoom factor kl and the zoom factor kw as values which are more than or equal to 1 and less than or equal to 2;
and under the night special effect mode, taking the zoom factor kl and the zoom factor kw as values which are more than or equal to 1 and less than or equal to 5.
In this embodiment, the process of obtaining the weight W0(j) of the pixel point j on the copy picture is shown in fig. 8, and includes:
s801: judging whether the special effect mode of the current zooming processing is the daytime special effect mode, if so, turning to S802; otherwise, go to S803.
S802: the weight W0(j) is set to 1.
S803: and acquiring the gray value of a pixel point j on the copy picture, and acquiring the value of the weight according to the gray value and the corresponding relation between the preset gray value and the weight.
The manner of obtaining the gray value of the pixel point j in this embodiment includes, but is not limited to, the following manners:
if the color of pixel j is RGB (R, G, B), we can convert it into gray scale by the following methods:
1. floating point arithmetic: gray (j) ═ R0.3 + G0.59 + B0.11
2. Integer method: gray (j) ═ (R30 + G59 + B11)/100
3. The shifting method comprises the following steps: gray (j) ═ (R77 + G151 + B28) > > 8;
4. average value method: gray (j) ═ R + G + B)/3;
5. taking green only: gray (j) ═ G.
The correspondence between W0(j) and gray (j) is proportional, for example, when gray (j) is 200, W0(j) is 1, gray (j) is 220, W0(j) is 1.1, gray (j) is 250, W0(j) is 1.5, and so on. The specific corresponding relationship between the two can be flexibly set according to the actual application requirements, and is not described herein again.
In this embodiment, after processing the copy picture to obtain the optical flow picture, performing weighted superposition on the optical flow picture and the original picture includes:
multiplying the optical flow pixel value P (j) of the pixel point j on the optical flow picture by the optical flow pixel weight W1(j), multiplying the original pixel value src (j) of the corresponding pixel point j on the original picture by the original pixel weight W2(j), and updating the pixel value of the pixel point j on the optical flow picture (of course, the original picture) into the sum R (j); j is more than or equal to 1 and less than or equal to the total number N of the pixel points on the copy picture. Specifically, it can be expressed by the following formula:
R(j)=W1(j)*P(j)+W2(j)*src(j)…………………………(5)
r (j) is the pixel value of each pixel point on the final output zoom synthesis picture.
In this embodiment, in the daytime special effect mode, W1(j) takes a value of 1, and W2(j) takes a value of 0; the actual output at this time is the resulting optical flow picture.
In the night special effect mode, the ratio of W1(j) to W2(j) is equal to the ratio of the gray value of the pixel point j on the optical flow picture to the gray value of the pixel point j on the original picture, and the sum of W1(j) and W2(j) is equal to 1. At this time, the weight value is determined according to the proportion of the two gray values.
The image zooming processing method provided by the embodiment is suitable for zooming any image, including a picture obtained by shooting through a fixed-focus lens or a zoom lens, and the image effect obtained by zooming the image through the image zooming processing method provided by the embodiment can achieve the zooming picture effect obtained by zooming and shooting through the zoom lens, so that the satisfaction degree of user experience can be improved.
Second embodiment
In this embodiment, based on the above embodiment, a terminal is taken as a mobile phone, and a picture to be zoomed is taken as an original picture for explanation. The process of the decoking treatment is completed on the photo as follows:
and taking a common original photo (src) by using the mobile phone, and copying the original photo to obtain a copy photo (C). The original photograph is shown in fig. 9.
The user selects the special effect mode, and the user is supposed to select the daytime special effect mode.
And setting different zoom parameters according to the selection of the user. Different special effect modes correspond to different zoom parameters, such as zoom multiples (kl and kw), weight W0(j) and the like. When the daytime special effect mode is set, the value range of the zoom factors (kl and kw) is set to be greater than 1 and less than 2, and then W0(j) is set to 1. When setting the night special effect, the zoom magnification (K) is set to a value range of more than 1 and less than 5 (or more), w (j) is set according to the brightness Gray (Gray value) of p (j), and the value of w (j) is a variable which becomes larger as the Gray value increases. In this example, the zoom factor kl — kw — 1.5 and the weight W0(j) may be 1.
The copied image C is subjected to a zoom process. From the zoom factor kl kw 1.5, the zoom length dj can be calculated from the above formula. Traversing the whole image C pixel by pixel, calculating all optical flow pixel values of C, calculating the weighted sum of the pixel values and the weights of all pixels on a straight line of a certain distance dj from the current pixel point j of the image C to the focus center, and dividing the value of sum by dj to obtain the optical flow pixel value P (j) of the current pixel point sum/dj. After the zoom of the entire image C is calculated, an optical flow picture (B) is generated.
The generated optical flow picture B and the original picture src are subjected to weighted superposition, and the formula is referred to. If the day special effect mode is set, the weight W2(j) of src (j) is 0, and the weight of p (j) is 1. If the night special effect is set, the weight of src (j) and the weight of p (j) are variable, and the weights can be superimposed according to the proportion of gray values to form the final zoom-in picture with different effects.
As shown in fig. 10, the picture effect shown in fig. 10 is substantially the same as the picture effect shot through the zoom lens, so that the user can also obtain the zoom effect picture through a shooting terminal of a fixed-focus lens such as a mobile phone, and the satisfaction degree of user experience is improved.
Third embodiment
The embodiment provides a terminal, which may be a terminal with a photographing function or a terminal without the photographing function, as shown in fig. 11, and includes:
the picture obtaining module 111 is used for obtaining a copy picture of the original picture.
The original picture in this embodiment may be a picture taken by a camera, or may be a picture or a photograph acquired locally by the terminal or acquired by the picture acquisition module 111 from another terminal or a network. And copying the original picture to obtain a copy picture of the original picture. The copy picture and the original picture are identical, and one of the copy picture and the original picture can be arbitrarily selected for use in subsequent use.
A parameter obtaining module 112, configured to obtain a zoom parameter of the copy picture.
In this embodiment, the parameter obtaining module 112 may obtain the zoom parameter according to a selection instruction issued by a user, or may automatically obtain the zoom parameter according to a preset rule, for example, automatically analyze the picture, and automatically obtain the preset zoom parameter meeting the condition according to the analysis result.
The zoom-in parameter in this embodiment includes, but is not limited to, a special effect mode of the zoom-in process (for example, including, but not limited to, a day special effect mode, a night special effect mode), a zoom-in center point, at least one of a zoom-in multiple, and in some examples, a weight value.
The parameter obtaining module 112 may specifically display a parameter setting interface for a user to issue a selection instruction for parameter setting.
And the optical flow picture generating module 113 is configured to calculate an optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter, and update the pixel value of each pixel point on the copy picture to a respective optical flow image pixel value to obtain an optical flow picture. The optical flow picture generating module 113 may be implemented by processing the copied picture according to the zoom parameter to obtain the optical flow picture through any method of obtaining the optical flow picture through calculation.
And the synthesis module 114 is configured to perform weighted superposition on the optical flow picture and the original picture to obtain a zoom synthesis picture. The display effect of the obtained zoom synthetic picture is the same as the zoom effect through the zoom lens.
Through the process, the effect of zooming the picture shot by the zoom lens can be simulated by processing the common picture, so that the terminal without the zoom lens can obtain the picture with the zooming picture effect, and the user experience is improved.
In one example, the obtained zoom parameters include zoom center point location information on the copy picture and a zoom multiple of the copy picture. The focus pulling central point can be flexibly set by a user and can also be automatically set according to preset conditions. The zoom factor can also be freely set by a user or automatically set. The parameter acquiring module 112 may provide two modes, one mode may be freely set by the user, the other mode may be automatically set according to a preset condition, and the zoom factor reference example may be provided at the same time for the user to select the setting.
The optical flow picture generating module 113 is configured to obtain a zoom length dj corresponding to a pixel point j on the copy picture according to the zoom multiple, obtain pixel values and weights of all pixel points in a zoom length dj region on a straight line connecting the pixel point j and a zoom center point, add a product of the pixel value and the weight of each pixel point in the zoom length dj region, and divide the sum by the zoom length dj to obtain an optical flow pixel value p (j) of the pixel point j; the zoom length dj is less than the linear connecting line length Lj, j is more than or equal to 1 and less than or equal to the total number N of pixel points on the copy picture.
The zoom factor in this embodiment includes a zoom factor kl in the picture length direction and a zoom factor kw in the picture width direction. It should be understood that the picture length direction zoom factor kl and the width direction zoom factor kw may be set to be the same or different. In order to ensure that the picture is not distorted after being processed, the zoom factor kl in the length direction of the picture and the zoom factor kw in the width direction of the picture are preferably set to be equal.
The optical flow picture generating module 113 obtaining the zoom length dj corresponding to the pixel point j on the copy picture according to the zoom multiple includes:
the optical flow picture generating module 113 obtains the length Lj of a straight line connecting the pixel point j and the focus pulling central point;
the optical flow picture generation module 113 calculates the square root k of the product of the zoom factor kl and the zoom factor kw;
the optical flow picture generation module 113 divides k by k and multiplies the result by the straight-line link length Lj to obtain a value as the zoom length dj.
Specifically, the optical flow image generation module 113 may use the pixel point j as a starting point to sequentially obtain pixel values and weights of all pixel points in the area of the zoom length dj on the straight line connecting the pixel point and the zoom center point. Of course, in some examples, the pixel point j may not be used as a starting point, for example, one or 2 pixel points adjacent to the pixel point j may be used as a starting point. It should be understood that for a pixel point with a distance Lj from the focus center point of 0, the corresponding focus length dj is also 0, and therefore the optical flow pixel value p (j) of the pixel point is equal to the original pixel value.
In this embodiment, when the zoom-in parameter includes a day special effect mode and a night special effect mode, the special effect mode may be flexibly selected by a user, or may be automatically selected after an image is automatically analyzed. Wherein:
in the daytime special effect mode, the zoom factor kl and the zoom factor kw are taken to be more than or equal to m1M is less than or equal to2A value of (d);
in the night special effect mode, taking the zoom factor kl and the zoom factor kw as m or more3M is less than or equal to4A value of (d);
m11 or more and m or less3;m2Less than m4For example:
in a daytime special effect mode, taking the zoom factor kl and the zoom factor kw as values which are more than or equal to 1 and less than or equal to 2;
and in the night special effect mode, the zoom factor kl and the zoom factor kw are taken as values which are more than or equal to 1.5 and less than or equal to 6.
The optical flow picture generation module 113 obtaining the weight value W0(j) of the pixel point j on the copy picture includes:
the optical flow image generation module 113 determines whether the current special effect mode of the zoom processing is the daytime special effect mode, and if so, takes the weight value W0(j) and takes the weight value as 1. Otherwise, acquiring the gray value of the pixel point j on the copy picture, and acquiring the value of the weight according to the gray value and the corresponding relation between the preset gray value and the weight.
In this embodiment, the synthesizing module 114 is configured to multiply the optical flow pixel value p (j) of the pixel j on the optical flow picture by the optical flow pixel weight W1(j), multiply the original pixel value src (j) of the corresponding pixel j on the original picture by the original pixel weight W2(j), and update the pixel value of the pixel j on the optical flow picture (which may be the original picture) to be the sum of the two values. Specifically, it can be expressed by the following formula:
R(j)=W1(j)*P(j)+W2(j)*src(j)…………………………(6)
r (j) is the pixel value of each pixel point on the final output zoom synthesis picture.
In this embodiment, in the daytime special effect mode, W1(j) takes a value of 1, and W2(j) takes a value of 0; the actual output at this time is the resulting optical flow picture.
In the night special effect mode, the ratio of W1(j) to W2(j) is equal to the ratio of the gray value of the pixel point j on the optical flow picture to the gray value of the pixel point j on the original picture, and the sum of W1(j) and W2(j) is equal to 1. At this time, the weight value is determined according to the proportion of the two gray values.
The terminal in this embodiment may be the terminal shown in fig. 1, and the functions of the picture acquiring module 111, the parameter acquiring module 112, the optical flow picture generating module 113, and the synthesizing module 114 may be specifically implemented by a controller or a processor of the terminal, and each module may be embedded in the controller or the processor. The terminal provided by the embodiment can perform zoom processing on any picture, including a picture obtained by shooting through the fixed-focus lens or the zoom lens, and the picture effect obtained after the zoom processing can achieve the zoom picture effect obtained by shooting through the zoom lens, so that the satisfaction degree of user experience can be improved to a greater extent.
Fourth embodiment
In this embodiment, based on the above embodiment, the terminal is an IPAD, and the picture to be subjected to zoom processing is an original picture. The process of the decoking treatment is completed on the photo as follows:
and taking a common original photo (src) by using the mobile phone, and copying the original photo to obtain a copy photo (C). The original photograph is shown in fig. 12.
The user selects the special effect mode, assuming that the user selects the night special effect mode.
And setting different zoom parameters according to the selection of the user. Different special effect modes correspond to different zoom parameters, such as zoom multiples (kl and kw), weight W0(j) and the like. When the daytime special effect mode is set, the value range of the zoom factors (kl and kw) is set to be greater than 1 and less than 2, and then W0(j) is set to 1. When the night special effect is set, the value range of the zoom magnification (K) is set to be more than 1.5 and less than 6, the setting of W (j) is determined by the brightness Gray (Gray value) of p (j), and the value of W (j) is a variable which is increased along with the increase of the Gray value. In this example, the zoom factor kl-kw-2 may be taken.
The copied image C is subjected to a zoom process. From the zoom factor kl — kw — 2, the zoom length d j can be calculated from the above equation. Traversing the whole image C pixel by pixel, calculating all optical flow pixel values of C, calculating the weighted sum of the pixel values and the weights of all pixels on a straight line of a certain distance dj from the current pixel point j of the image C to the focus center, and dividing the value of sum by dj to obtain the optical flow pixel value P (j) of the current pixel point sum/dj. After the zoom of the entire image C is calculated, an optical flow picture (B) is generated.
The generated optical flow picture B and the original picture src are subjected to weighted superposition, and the formula is referred to. If the daytime special effect mode is set, the weight W2(j) of src (j) and the weight W1(j) of p (j) are determined according to the gray value proportion of src (j) and p (j), the weights of src (j) and p (j) are uncertain, and the images are superposed according to the proportion of gray value to form the final zoom images with different effects. Wherein, the weight W2(j) and the weight W1(j) should satisfy the following conditions:
the ratio of W1(j) to W2(j) is equal to the ratio of the gray value of pixel point j on the optical flow picture to the gray value of pixel point j on the original picture, and the sum of W1(j) and W2(j) is equal to 1.
After the authentication, superposition and synthesis, a zoom-out synthesized picture is output, as shown in fig. 13, the picture effect shown in fig. 13 is basically consistent with the picture effect shot through the zoom-out lens, so that a user can also obtain the zoom-out effect picture through a shooting terminal of a fixed-focus lens such as an IPAD, and the satisfaction degree of user experience is improved.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (8)

1. A picture zooming processing method is characterized by comprising the following steps:
copying the original picture to obtain a copied picture;
acquiring zoom parameters of the copied picture;
calculating the optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter, and updating the pixel value of each pixel point on the copy picture into the respective optical flow image pixel value to obtain an optical flow picture;
weighting and superposing the optical flow picture and the original picture to obtain a zoom synthetic picture;
wherein the zoom parameter includes position information of a zoom center point on the copy picture and a zoom multiple of the copy picture, and the calculating an optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter includes:
acquiring a zoom length dj corresponding to a pixel point j on the copy picture according to the zoom multiple;
acquiring the pixel values and weights of all pixel points within the zoom length dj region on the straight line connecting the pixel point j and the zoom center point;
summing the products of the pixel value and the weight of each pixel point within the zoom length dj region, and dividing the sum by the zoom length dj to obtain the optical flow pixel value P(j) of the pixel point j; wherein the zoom length dj is less than the straight-line length Lj, and j is greater than or equal to 1 and less than or equal to the total number N of pixel points on the copy picture.
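The claim-1 computation of P(j) can be sketched as follows. The patent does not fix an image representation or a sampling step along the line from pixel j toward the zoom center, so the dict-based toy image, the one-pixel sampling step, and all names below are illustrative assumptions.

```python
import math

def optical_flow_pixel(img, j, center, dj, weight_fn):
    """Sketch of the optical-flow pixel value P(j) from claim 1.

    img       : dict mapping (x, y) -> pixel value (toy image model)
    j, center : (x, y) coordinates of pixel j and the zoom center point
    dj        : zoom length for pixel j (must satisfy dj < |j - center|)
    weight_fn : weight of each sampled pixel (claim 6 would derive it
                from the special-effect mode and the gray value)
    """
    # Straight-line length Lj from pixel j to the zoom center.
    lj = math.hypot(j[0] - center[0], j[1] - center[1])
    # Unit vector pointing from j toward the zoom center.
    ux, uy = (center[0] - j[0]) / lj, (center[1] - j[1]) / lj

    total = 0.0
    steps = max(1, round(dj))        # sample roughly one pixel per step
    for s in range(steps):
        x = round(j[0] + ux * s)
        y = round(j[1] + uy * s)
        # Accumulate pixel value times weight along the dj segment.
        total += img[(x, y)] * weight_fn((x, y))
    return total / dj                # P(j) = sum(value * weight) / dj
```

On a constant-valued line segment with unit weights, P(j) reduces to the constant pixel value, which gives a quick sanity check.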
2. The picture-zooming processing method of claim 1, wherein the weighted superposition of the optical flow picture and the original picture comprises:
multiplying the optical flow pixel value P(j) of a pixel point j on the optical flow picture by an optical flow pixel weight W1(j), multiplying the original pixel value src(j) of the corresponding pixel point j on the original picture by an original pixel weight W2(j), and updating the pixel value of the pixel point j on the optical flow picture to the sum of the two values; and j is greater than or equal to 1 and less than or equal to the total number N of pixel points on the copy picture.
3. The picture zooming processing method according to claim 2, wherein the zooming parameters include a special effect mode of the zooming processing;
in the daytime special effect mode, the value of W1(j) is 1, and the value of W2(j) is 0;
in the night special effect mode, the ratio of W1(j) to W2(j) is equal to the ratio of the gray value of the pixel point j on the optical flow picture to the gray value of the pixel point j on the original picture, and the sum of W1(j) and W2(j) is equal to 1.
4. The picture zooming processing method according to any one of claims 1 to 3, wherein the zoom multiple includes a zoom factor kl in a length direction and a zoom factor kw in a width direction of the picture;
obtaining a zoom length dj corresponding to a pixel point j on the copy picture according to the zoom factor comprises:
acquiring the straight-line length Lj between the pixel point j and the zoom center point;
calculating the square root k of the product of the zoom factor kl and the zoom factor kw;
and dividing k-1 by k, and multiplying the result by the straight-line length Lj to obtain the zoom length dj.
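Claim 4's zoom-length formula is dj = ((k - 1) / k) * Lj with k = sqrt(kl * kw). The function below is an illustrative transcription of that formula; the function and parameter names are not from the patent.

```python
import math

def zoom_length(lj, kl, kw):
    """Zoom length dj from claim 4.

    lj : straight-line length from pixel j to the zoom center point
    kl : zoom factor in the length direction of the picture
    kw : zoom factor in the width direction of the picture
    """
    k = math.sqrt(kl * kw)           # square root of the product kl * kw
    return (k - 1.0) / k * lj        # dj = ((k - 1) / k) * Lj
```

For example, with kl = kw = 4 the formula gives dj = (3/4)·Lj, and with kl = kw = 1 (no zoom) it gives dj = 0, so no optical-flow streak is produced, which matches the intent of the formula.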
5. The picture zooming processing method according to claim 4, wherein the zoom factor kl is equal to the zoom factor kw; the zoom parameter includes a special-effect mode of the zoom processing; in the daytime special-effect mode, the zoom factors kl and kw take a value greater than or equal to m1 and less than or equal to m2; in the night special-effect mode, the zoom factors kl and kw take a value greater than or equal to m3 and less than or equal to m4;
the m1 is greater than or equal to 1 and less than or equal to the m3; and the m2 is less than the m4.
6. The picture zooming processing method according to any one of claims 1 to 3, wherein the zoom parameter includes a special-effect mode of the zoom processing; and obtaining the weight of the pixel point j on the copy picture comprises:
judging whether the special-effect mode of the current zoom processing is the daytime special-effect mode; if so, taking the weight as 1; otherwise, acquiring the gray value of the pixel point j on the copy picture, and obtaining the weight from the gray value according to a preset correspondence between gray values and weights.
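A minimal sketch of the claim-6 weight selection: the patent leaves the concrete gray-value-to-weight correspondence to the implementer, so the mapping is passed in as a parameter here, and the function and mode names are illustrative assumptions.

```python
def pixel_weight(mode, gray_value, gray_to_weight):
    """Weight of pixel j per claim 6.

    mode           : 'day' or 'night' special-effect mode (names assumed)
    gray_value     : gray value of pixel j on the copy picture
    gray_to_weight : preset gray-value -> weight correspondence, supplied
                     by the caller since the patent does not fix it
    """
    if mode == 'day':
        return 1.0                       # daytime mode: weight is always 1
    return gray_to_weight(gray_value)    # night mode: look up from gray value
```

For instance, a simple linear correspondence such as `lambda g: g / 255.0` would weight brighter pixels more heavily in the night mode.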
7. A terminal, comprising:
the image acquisition module is used for copying the original image to obtain a copied image;
the parameter acquisition module is used for acquiring the zoom parameter of the copied picture;
the optical flow picture generation module is used for calculating the optical flow image pixel value of each pixel point on the copy picture according to the zoom parameter and updating the pixel value of each pixel point on the copy picture into the respective optical flow image pixel value to obtain an optical flow picture;
the synthesis module is used for weighting and superposing the optical flow picture and the original picture to obtain a zoom synthesis picture;
wherein the zoom parameter includes position information of a zoom center point on the copy picture and a zoom multiple of the copy picture;
the optical flow picture generation module is used for acquiring a zoom length dj corresponding to a pixel point j on the copy picture according to the zoom multiple, acquiring the pixel values and weights of all pixel points within the zoom length dj region on the straight line connecting the pixel point j and the zoom center point, summing the products of the pixel value and the weight of each of those pixel points, and dividing the sum by the zoom length dj to obtain the optical flow pixel value P(j) of the pixel point j; wherein the zoom length dj is less than the straight-line length Lj, and j is greater than or equal to 1 and less than or equal to the total number N of pixel points on the copy picture.
8. The terminal of claim 7, wherein the synthesis module is configured to multiply the optical flow pixel value P(j) of a pixel point j on the optical flow picture by an optical flow pixel weight W1(j), multiply the original pixel value src(j) of the corresponding pixel point j on the original picture by an original pixel weight W2(j), and update the pixel value of the pixel point j on the optical flow picture to the sum of the two; and j is greater than or equal to 1 and less than or equal to the total number N of pixel points on the copy picture.
CN201710113484.4A 2017-02-28 2017-02-28 Picture zooming processing method and terminal Active CN107085841B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710113484.4A CN107085841B (en) 2017-02-28 2017-02-28 Picture zooming processing method and terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710113484.4A CN107085841B (en) 2017-02-28 2017-02-28 Picture zooming processing method and terminal

Publications (2)

Publication Number Publication Date
CN107085841A CN107085841A (en) 2017-08-22
CN107085841B true CN107085841B (en) 2020-04-28

Family

ID=59614160

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710113484.4A Active CN107085841B (en) 2017-02-28 2017-02-28 Picture zooming processing method and terminal

Country Status (1)

Country Link
CN (1) CN107085841B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109872271A (en) * 2019-01-28 2019-06-11 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101233745A (en) * 2005-11-22 2008-07-30 松下电器产业株式会社 Imaging device, portable terminal, imaging method, and program
CN101572804A (en) * 2009-03-30 2009-11-04 浙江大学 Multi-camera intelligent control method and device
CN102265215A (en) * 2008-12-05 2011-11-30 索尼爱立信移动通讯有限公司 Camera system with touch focus and method

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101233745A (en) * 2005-11-22 2008-07-30 松下电器产业株式会社 Imaging device, portable terminal, imaging method, and program
CN102265215A (en) * 2008-12-05 2011-11-30 索尼爱立信移动通讯有限公司 Camera system with touch focus and method
CN101572804A (en) * 2009-03-30 2009-11-04 浙江大学 Multi-camera intelligent control method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mobile Phone Photography Post-Processing Tutorial (手机摄影后期教程); 黎胄1964521; https://wenku.baidu.com/view/3952500a4afe04a1b171de00.html; 2016-07-08; pp. 1-13 *

Also Published As

Publication number Publication date
CN107085841A (en) 2017-08-22

Similar Documents

Publication Publication Date Title
JP5567235B2 (en) Image processing apparatus, photographing apparatus, program, and image processing method
US9389758B2 (en) Portable electronic device and display control method
JP5657182B2 (en) Imaging apparatus and signal correction method
US10158798B2 (en) Imaging apparatus and method of controlling the same
CN106534667B (en) Distributed collaborative rendering method and terminal
US20130120617A1 (en) Zoom control method and apparatus, and digital photographing apparatus
JP5864037B2 (en) Image processing apparatus, imaging apparatus, image processing method, and program
JPWO2015045829A1 (en) Imaging apparatus and imaging method
CN104853091A (en) Picture taking method and mobile terminal
US9111129B2 (en) Subject detecting method and apparatus, and digital photographing apparatus
CN111385470B (en) Electronic device, control method of electronic device, and computer-readable medium
JP2014146989A (en) Image pickup device, image pickup method, and image pickup program
US20180204311A1 (en) Image processing device, image processing method, and program
CN106713656B (en) Shooting method and mobile terminal
CN103945116A (en) Apparatus and method for processing image in mobile terminal having camera
WO2017088600A1 (en) Method and device for enlarging imaging range, mobile terminal and computer storage medium
EP2890116A1 (en) Method of displaying a photographing mode by using lens characteristics, computer-readable storage medium of recording the method and an electronic apparatus
CN107085841B (en) Picture zooming processing method and terminal
KR20220102401A (en) Electronic device and operating method thereof
CN105426081B (en) Interface switching device and method of mobile terminal
CN106569666B (en) Mask processing control device and method and terminal
CN108370415A (en) Image processing apparatus and image processing method
CN107071293B (en) Shooting device, method and mobile terminal
CN106993138B (en) Time-gradient image shooting device and method
CN110933300B (en) Image processing method and electronic terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant