CN106851115A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN106851115A
CN106851115A (application CN201710207173.4A)
Authority
CN
China
Prior art keywords
image
template area
edge region
exposure
template
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710207173.4A
Other languages
Chinese (zh)
Other versions
CN106851115B (en)
Inventor
王东 (Wang Dong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201710207173.4A priority Critical patent/CN106851115B/en
Publication of CN106851115A publication Critical patent/CN106851115A/en
Application granted granted Critical
Publication of CN106851115B publication Critical patent/CN106851115B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Abstract

The embodiment provides a kind of image processing method and device, described image processing method includes:The first image and the second image are obtained, wherein, the time for exposure of described first image is shorter than the time for exposure of second image;The fringe region of described first image and the fringe region of second image are extracted respectively;The fringe region of fringe region and second image to described first image is processed, and obtains template area;Fusion treatment is carried out to described first image and second image based on the template area.

Description

Image processing method and device
Technical field
Embodiments of the invention relate to the technical field of image processing, and more particularly to an image processing method and device.
Background technology
With the development and popularization of smart terminal technology, image capture based on smart terminals has become increasingly common. In general, capturing an image places high demands on the shooting environment. For example, when ambient light is insufficient, the exposure time may have to be extended to obtain enough shadow detail, and the camera often moves during this exposure, causing image blur.
In the prior art, computational methods can be chosen to deblur the captured image. The main current approach is to deblur the blurred image directly by computation, but the computational cost of this method is huge, making it unusable on portable terminals, and its deblurring results are unsatisfactory.
Summary of the invention
According to an aspect of the invention, there is provided an image processing method, including: obtaining a first image and a second image, where the exposure time of the first image is shorter than that of the second image; extracting the edge region of the first image and the edge region of the second image respectively; processing the edge region of the first image and the edge region of the second image to obtain a template area; and performing fusion processing on the first image and the second image based on the template area.
According to another aspect of the invention, there is provided an image processing apparatus, including: an acquiring unit configured to obtain a first image and a second image, where the exposure time of the first image is shorter than that of the second image; an extraction unit configured to extract the edge region of the first image and the edge region of the second image respectively; a processing unit configured to process the edge region of the first image and the edge region of the second image to obtain a template area; and a fusion unit configured to perform fusion processing on the first image and the second image based on the template area.
In the image processing method and device provided by the present invention, a long-exposure image and a short-exposure image can be obtained, and the edge regions of the two images extracted and processed respectively to produce a fusion result. The image processing method and device of the embodiments thus avoid the blur caused by long-exposure images, and the loss of shadow detail and the high noise caused by short-exposure images, improving the quality of the final image and the user experience.
Brief description of the drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those of ordinary skill in the art can derive other drawings from them without creative effort.
Fig. 1 shows a schematic diagram of a photo taken with a long exposure time;
Fig. 2 shows a schematic diagram of a photo taken with a short exposure time;
Fig. 3 shows a flowchart of an image processing method according to an embodiment of the present invention;
Fig. 4 shows a schematic diagram of the template area obtained after superimposing the edge regions of the first image and the second image in an embodiment of the present invention;
Fig. 5 shows a schematic diagram of the result of fusing the first image and the second image based on the template area in an embodiment of the present invention;
Fig. 6 shows a block diagram of an image processing apparatus according to an embodiment of the present invention;
Fig. 7 shows a block diagram of an image processing apparatus according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention.
When a photo is taken with insufficient ambient light, a long exposure can capture more shadow detail but is prone to blur caused by camera movement. Fig. 1 shows a schematic diagram of a photo taken with a long exposure time. As shown in Fig. 1, because the scene is very dark the exposure time is long, and it is difficult to keep the camera still for the whole exposure, so the edges in Fig. 1 are blurred and the shooting result suffers.
Fig. 2 shows a schematic diagram of a photo taken with a short exposure time. The photo in Fig. 2 avoids the edge blur produced by camera movement because the exposure is short, but the insufficient ambient light causes a loss of shadow detail: the photo is too dark and too noisy.
To address the respective problems of the long-exposure and short-exposure images shown in Fig. 1 and Fig. 2, one would normally consider applying an image deblurring algorithm to the long-exposure image of Fig. 1 to improve the result. However, because the edges of the long-exposure image are very blurred, a common deblurring algorithm has difficulty estimating the blur kernel, requires many iterations, is computationally expensive, and produces very unsatisfactory results.
In view of the above problems, the following image processing method is proposed. Fig. 3 shows a flowchart of an image processing method 300 according to an embodiment of the present invention. The method may be applied to an electronic device, which may be any terminal capable of capturing and processing images, such as a mobile phone, PDA, tablet computer or notebook, and may be portable, pocket-sized, handheld, computer-embedded or vehicle-mounted.
As shown in Fig. 3, the image processing method 300 may include step S301: obtaining a first image and a second image, where the exposure time of the first image is shorter than that of the second image. Since the exposure time of the first image is short relative to the second image, the first image may be called the short-exposure image and the second image the long-exposure image; examples of the two are shown in Fig. 2 and Fig. 1 respectively. Note that the exposure times of the first and second images are relative values, and no specific values are imposed here. For example, the exposure time of the first image may be 1/60 second while that of the second image is 1/8 second; alternatively, the first image may be exposed for 1/250 second and the second for 1/60 second. In practical applications of the embodiments, any exposure times may be chosen for the two images as long as the first image's exposure time is shorter than the second's; preferably, the choice of exposure times depends on conditions such as ambient lighting and whether the subject is static or moving. The first and second images may cover the same object or objects, or the same field of view. Preferably, the two images are captured consecutively without moving the camera, by an automatic or manual preset, i.e. a single press of the shutter button may capture both the short-exposure first image and the long-exposure second image. In one embodiment of the invention, the first image may be captured before the second image; of course, the second image may also be captured before the first. Because its exposure time is short, the first image is relatively unlikely to be blurred; because its exposure time is relatively long, the second image has better brightness and colour but is more likely to be blurred.
In step S302, the edge region of the first image and the edge region of the second image are extracted respectively; that is, edge extraction is performed on each of the two captured images. Within each object in a captured image, places where the grey value changes sharply across a boundary can be identified as edges. In embodiments of the invention, various edge-detection methods may be used, based on the variation of the grey value and similar parameters in the first and second images, to extract the edge contours of the captured objects: for example, edge detection based on first-order or second-order differential operators, edge detection based on mathematical morphology, and/or edge detection based on wavelet transforms. These edge-extraction methods for the first and second images are merely examples; in practical applications any edge-extraction method may be chosen, as long as edge extraction can be performed on the two images, and the methods used for the first image and the second image may be the same or different.
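The patent leaves the edge detector open; as one concrete instance of the first-order differential operator option, a minimal Sobel-style gradient sketch in NumPy might look like this (the function name and the relative threshold are illustrative assumptions, not part of the patent):

```python
import numpy as np

def sobel_edges(img, thresh=0.25):
    """Binary edge mask from first-order gradient magnitude.

    img: 2-D float array (grayscale), values in [0, 1].
    thresh: fraction of the maximum gradient magnitude kept as edge.
    """
    img = img.astype(np.float64)
    # Sobel kernels for horizontal and vertical gradients.
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    pad = np.pad(img, 1, mode="edge")
    h, w = img.shape
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    # Correlate with each kernel by summing shifted windows of the padded image.
    for i in range(3):
        for j in range(3):
            patch = pad[i:i + h, j:j + w]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()
```

Applied to the pair from step S301, `sobel_edges(first_image)` and `sobel_edges(second_image)` would give the two edge regions that step S303 superimposes.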
In step S303, the edge region of the first image and the edge region of the second image are processed to obtain a template area. Specifically, in embodiments of the invention, the edge region of the first image and the edge region of the second image may be superimposed, and the superimposed result taken as the template area. Because the first and second images capture the same object or the same field of view, the edge regions extracted from them are similar. In this step, the edge region extracted from the first image and the edge region extracted from the second image are superimposed, and the superimposed edge region serves as the template area. Fig. 4 shows a schematic diagram of the template area obtained after superimposing the edge regions of the first image and the second image in an embodiment of the present invention. Preferably, the superimposed template area may also be filtered to achieve a better image effect. This way of processing the edge regions of the two images is merely an example; in practical applications, any processing of the edge regions of the first and second images that yields a template area may be used.
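As an illustration of the superposition step, the two binary edge masks can simply be OR-ed and then thickened so the template covers a small band around every edge; the dilation here is an assumed stand-in for the optional filtering the text mentions:

```python
import numpy as np

def build_template(edges_a, edges_b, grow=2):
    """Superimpose (logical OR) two binary edge masks into a template.

    grow: number of one-pixel 4-neighbour dilation passes used to
    thicken the template (an illustrative choice, not fixed by the text).
    """
    template = edges_a | edges_b
    for _ in range(grow):
        # Dilate by OR-ing the mask with its four shifted copies.
        # np.roll wraps around the borders, which is acceptable for a sketch.
        template = (template
                    | np.roll(template, 1, axis=0) | np.roll(template, -1, axis=0)
                    | np.roll(template, 1, axis=1) | np.roll(template, -1, axis=1))
    return template
```

The resulting boolean array plays the role of the template area shown in Fig. 4.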
In step S304, fusion processing is performed on the first image and the second image based on the template area. Specifically, the part of the first image inside the template area (the template-area image) may be extracted, along with the part of the second image outside the template area (the non-template-area image); the template-area image of the first image and the non-template-area image of the second image are then merged. When extracting the template-area image of the first image, its brightness may be corrected according to the brightness information of the second image within the template area.
In this step, after the template area has been determined from the edge regions of the first and second images, the edge-region part of the first image (the short-exposure image) and the non-edge-region part of the second image (the long-exposure image) are extracted respectively and combined into a fused image. Because the first and second images may be captured consecutively, they share the same field of view, so the edge region of the first image and the non-edge region of the second image can be merged directly into a single fused image covering that field of view; the fused image has the same field of view and subject as both source images. In particular, because the fused image contains the edge region of the first image and the non-edge region of the second image, it combines the sharp edges of the short-exposure image with the non-edge region of the long-exposure image, which has sufficient shadow detail and low noise.
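Step S304's per-pixel selection can be sketched as a mask-weighted combination (the function name and signature are illustrative; the patent describes the selection itself, not this particular API):

```python
import numpy as np

def fuse(short_img, long_img, template):
    """Take template-area pixels from the short exposure and
    non-template pixels from the long exposure.

    short_img, long_img: float arrays of identical shape, (H, W) or (H, W, C).
    template: boolean (H, W) mask marking the template area.
    """
    mask = template.astype(np.float64)
    if short_img.ndim == 3:
        # Broadcast the 2-D mask over colour channels.
        mask = mask[..., None]
    return mask * short_img + (1.0 - mask) * long_img
```

Because the mask is binary here, the result is a hard cut between the two exposures; the gradual-change processing mentioned later softens this boundary.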
In another embodiment of the invention, the edge region of the first image used for fusion may be corrected using the brightness information in the edge region of the second image, so that the brightened edge region of the first image is merged with the non-edge region of the second image, achieving a better fusion result. For example, parameters such as the mean and/or variance of the brightness around each pixel in the edge region of the second image may be collected, and the corresponding positions in the edge region of the first image processed so that the means and/or variances around those pixels match the second image's as closely as possible, thereby brightening the edge region of the first image.
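The mean/variance matching described above might be sketched as follows; note this applies one global gain over the whole template rather than the per-pixel-neighbourhood statistics the text describes, a simplifying assumption:

```python
import numpy as np

def match_edge_brightness(short_img, long_img, template, eps=1e-8):
    """Rescale the short exposure's template pixels so their mean and
    standard deviation match those of the long exposure's template
    pixels, brightening the edge region before fusion."""
    s = short_img[template].astype(np.float64)
    l = long_img[template].astype(np.float64)
    gain = l.std() / (s.std() + eps)          # match the spread (variance)
    corrected = short_img.astype(np.float64).copy()
    corrected[template] = (s - s.mean()) * gain + l.mean()  # match the mean
    return corrected
```

The corrected short-exposure image would then be passed to the fusion step in place of the original.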
Fig. 5 shows a schematic diagram of the result of fusing the first image and the second image based on the template area in an embodiment of the present invention. Edge extraction is performed on the long-exposure image of Fig. 1 and the short-exposure image of Fig. 2 respectively, and the superimposed edge region shown in Fig. 4 is used as the template area; fusing the non-template region of Fig. 1 with the template area of Fig. 2 yields the fused image shown in Fig. 5. During fusion, gradual-change (feathering) processing may be applied at the junction of the first and second images to improve the fusion result.
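The gradual-change processing at the junction can be illustrated by softening the binary template into a smooth weight before blending; the diffusion-style softening used here is an assumption, since the patent does not specify the feathering method:

```python
import numpy as np

def feathered_fuse(short_img, long_img, template, passes=3):
    """Blend two same-size grayscale images with a soft transition.

    The hard boolean template is softened by a few 4-neighbour
    averaging (diffusion) passes, so pixels near the template boundary
    mix both exposures instead of switching abruptly.  np.roll wraps
    around the image borders, which is acceptable for a sketch.
    """
    soft = template.astype(np.float64)
    for _ in range(passes):
        soft = (soft
                + np.roll(soft, 1, axis=0) + np.roll(soft, -1, axis=0)
                + np.roll(soft, 1, axis=1) + np.roll(soft, -1, axis=1)) / 5.0
    return soft * short_img + (1.0 - soft) * long_img
```

More `passes` widens the transition band; `passes=0` reduces to the hard mask-based fusion.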
According to the image processing method provided by the present invention, a long-exposure image and a short-exposure image can be obtained, and the edge regions of the two images extracted and processed respectively to produce a fusion result. The image processing method of the embodiments thus avoids the blur caused by long-exposure images and the shadow-detail loss and high noise caused by short-exposure images, improving the quality of the final image and the user experience.
A block diagram of an image processing apparatus according to an embodiment of the present invention is now described with reference to Fig. 6. The apparatus can perform the image processing method described above. Because its operation is essentially identical to the steps of the image processing method described with reference to Fig. 3, only a brief description is given here and repeated content is omitted.
As shown in Fig. 6, the image processing apparatus 600 includes an acquiring unit 610, an extraction unit 620, a processing unit 630 and a fusion unit 640. It will be appreciated that Fig. 6 shows only the parts relevant to the embodiments of the invention and omits other components; this is schematic, and the apparatus 600 may include other components as needed.
The electronic device housing the image processing apparatus 600 of Fig. 6 may be any terminal capable of capturing and processing images, such as a mobile phone, PDA, tablet computer or notebook, and may be portable, pocket-sized, handheld, computer-embedded or vehicle-mounted.
As shown in Fig. 6, the acquiring unit 610 obtains a first image and a second image, where the exposure time of the first image is shorter than that of the second image. Since the first image's exposure time is short relative to the second's, the first image may be called the short-exposure image and the second the long-exposure image; examples of the two are shown in Fig. 2 and Fig. 1 respectively. Note that the exposure times of the two images are relative values and no specific values are imposed: for example, the first image may be exposed for 1/60 second and the second for 1/8 second, or the first for 1/250 second and the second for 1/60 second. In practical applications of the embodiments, any exposure times may be chosen as long as the first image's exposure time is shorter than the second's; preferably, the choice depends on conditions such as ambient lighting and whether the subject is static or moving. The two images may cover the same object or objects, or the same field of view. Preferably, they are captured consecutively without moving the camera, by an automatic or manual preset, i.e. a single press of the shutter button captures both the short-exposure first image and the long-exposure second image. In one embodiment of the invention, the first image may be captured before the second image; of course, the second image may also be captured before the first. Because its exposure time is short, the first image is relatively unlikely to be blurred; because its exposure time is relatively long, the second image has better brightness and colour but is more likely to be blurred.
The extraction unit 620 extracts the edge region of the first image and the edge region of the second image respectively, performing edge extraction on each of the two captured images. Within each object in a captured image, places where the grey value changes sharply across a boundary can be identified as edges. In embodiments of the invention, the extraction unit 620 may use various edge-detection methods based on the variation of the grey value and similar parameters in the two images to extract the edge contours of the captured objects: for example, edge detection based on first-order or second-order differential operators, on mathematical morphology, and/or on wavelet transforms. These edge-extraction methods are merely examples; in practical applications any edge-extraction method may be chosen, as long as edge extraction can be performed on the two images, and the extraction unit 620 may use the same or different methods for the first and second images.
The processing unit 630 processes the edge region of the first image and the edge region of the second image to obtain a template area. Specifically, in embodiments of the invention, the processing unit 630 may superimpose the edge region of the first image and the edge region of the second image and take the superimposed result as the template area. Because the first and second images capture the same object or the same field of view, the edge regions extracted from them by the processing unit 630 are similar. Preferably, the processing unit 630 superimposes the edge region extracted from the first image and that extracted from the second image, and the superimposed edge region serves as the template area. Fig. 4 shows a schematic diagram of the template area obtained by the processing unit 630 after superimposing the edge regions of the two images in an embodiment of the present invention. Preferably, the processing unit 630 may also filter the superimposed template area to achieve a better image effect. This way of processing the edge regions of the two images is merely an example; in practical applications, the processing unit 630 may use any processing of the edge regions of the first and second images that yields a template area.
The fusion unit 640 performs fusion processing on the first image and the second image based on the template area. Specifically, the fusion unit 640 may extract the part of the first image inside the template area (the template-area image) and the part of the second image outside the template area (the non-template-area image), and merge the template-area image of the first image with the non-template-area image of the second image. When extracting the template-area image of the first image, the fusion unit 640 may correct its brightness according to the brightness information of the second image within the template area.
After the fusion unit 640 has determined the template area from the edge regions of the first and second images, it extracts the edge-region part of the first image (the short-exposure image) and the non-edge-region part of the second image (the long-exposure image) and combines them into a fused image. Because the first and second images may be captured consecutively, they share the same field of view, so the edge region of the first image and the non-edge region of the second image can be merged directly into a single fused image covering that field of view; the fused image has the same field of view and subject as both source images. In particular, because the fused image produced by the fusion unit 640 contains the edge region of the first image and the non-edge region of the second image, it combines the sharp edges of the short-exposure image with the non-edge region of the long-exposure image, which has sufficient shadow detail and low noise. In another embodiment, the fusion unit 640 may correct the edge region of the first image used for fusion with the brightness information in the edge region of the second image, merging the brightened edge region of the first image with the non-edge region of the second image for a better fusion result. The fusion unit 640 may collect parameters such as the mean and/or variance of the brightness around each pixel in the edge region of the second image, and process the corresponding positions in the edge region of the first image so that the means and/or variances around those pixels match the second image's as closely as possible, thereby brightening the edge region of the first image.
Fig. 5 shows a schematic diagram of the result of fusing the first image and the second image based on the template area in an embodiment of the present invention. The extraction unit 620 performs edge extraction on the long-exposure image of Fig. 1 and the short-exposure image of Fig. 2 obtained by the acquiring unit 610; the processing unit 630 produces the superimposed edge region of Fig. 4 as the template area; and the fusion unit 640 fuses the non-template region of Fig. 1 with the template area of Fig. 2, giving the fused image shown in Fig. 5. During fusion, gradual-change (feathering) processing may be applied at the junction of the two images to improve the fusion result.
According to the image processing apparatus provided by the present invention, a long-exposure image and a short-exposure image can be obtained, and the edge regions of the two images extracted and processed respectively to produce a fusion result. The image processing apparatus of the embodiments thus avoids the blur caused by long-exposure images and the shadow-detail loss and high noise caused by short-exposure images, improving the quality of the final image and the user experience.
A block diagram of an image processing apparatus according to another embodiment of the present invention is now described with reference to Fig. 7. This apparatus can also perform the image processing method described above. Because its operation is essentially identical to the steps of the image processing method described with reference to Fig. 3, only a brief description is given here and repeated content is omitted.
The image processing apparatus 700 of Fig. 7 may include one or more processors 710 and a memory 720; of course, it may also include other components such as an input unit and an output unit (not shown), interconnected by a bus system and/or other connection mechanisms (not shown). Note that the components and structure of the image processing apparatus 700 shown in Fig. 7 are illustrative, not restrictive; the apparatus may have other components and structures as needed.
The processor 710 is the control centre, connecting the various parts of the whole apparatus through various interfaces. It performs the various functions of the image processing apparatus 700 and processes data by running or executing software programs and/or modules stored in the memory 720 and calling data stored in the memory 720, thereby monitoring the apparatus as a whole. Preferably, the processor 710 may include one or more processing cores. Preferably, the processor 710 may integrate an application processor, which mainly handles the operating system, user interface and applications, and a modem processor, which mainly handles wireless communication. It will be understood that the modem processor need not be integrated into the processor 710.
The memory 720 may include one or more computer program products, which may include various forms of computer-readable storage media, such as volatile and/or non-volatile memory. The volatile memory may include, for example, random access memory (RAM) and/or cache memory; the non-volatile memory may include, for example, read-only memory (ROM), a hard disk or flash memory. One or more computer program instructions may be stored on the computer-readable storage medium.
The processor 710 may run the program instructions to carry out the following steps: obtaining a first image and a second image, wherein the exposure time of the first image is shorter than the exposure time of the second image; extracting an edge region of the first image and an edge region of the second image, respectively; processing the edge region of the first image and the edge region of the second image to obtain a template area; and fusing the first image and the second image based on the template area.
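As a rough illustration only, the first two of these steps — extracting each image's edge region from its gray-value variation and superposing the two regions into a template area — might be sketched as follows. This is not the patented implementation: the simple finite-difference gradient and the fixed threshold of 30 are assumptions made for the sketch.

```python
import numpy as np

def edge_region(gray_img, threshold=30):
    """Mark pixels with large gray-value variation as the edge region."""
    g = gray_img.astype(np.float64)
    # Absolute gray-value differences along rows and columns (crude gradient)
    dy = np.abs(np.diff(g, axis=0, prepend=g[:1, :]))
    dx = np.abs(np.diff(g, axis=1, prepend=g[:, :1]))
    return (dx + dy) > threshold  # boolean mask of the edge region

def template_area(first_img, second_img, threshold=30):
    """Superpose (union) the two edge regions to obtain the template area."""
    return edge_region(first_img, threshold) | edge_region(second_img, threshold)
```

With a short-exposure first image (sharp but dark) and a long-exposure second image (bright but possibly blurred near edges), taking the union keeps every edge detected in either image inside the template.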
The input unit, not shown, may be used to receive input numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control. Specifically, the input unit may include a touch-sensitive surface and other input devices. The touch-sensitive surface, also called a touch display screen or a touchpad, can collect touch operations by the user on or near it (such as operations performed on or near the touch-sensitive surface using a finger, a stylus or any other suitable object or accessory) and drive the corresponding connection devices according to a preset scheme. Preferably, the touch-sensitive surface may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 710, and can receive and execute commands sent by the processor 710. In addition, the touch-sensitive surface may be implemented in various types, such as resistive, capacitive, infrared and surface acoustic wave. Besides the touch-sensitive surface, the input unit may also include other input devices, which may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick and the like.
Output unit can export various information, such as image information, application control information etc. to outside (such as user). For example, output unit can be display unit, can be used for show by user input information or be supplied to user information and The various graphical user interface of image processing apparatus 700, these graphical user interface can by figure, text, icon, video and It is combined to constitute.Display unit may include display panel, it is preferred that LCD (Liquid Crystal can be used Display, liquid crystal display), the form such as OLED (Organic Light-Emitting Diode, Organic Light Emitting Diode) comes Configuration display panel.Further, Touch sensitive surface can cover display panel, when Touch sensitive surface is detected thereon or neighbouring is touched After touching operation, processor 710 is sent to determine the type of touch event, with preprocessor 710 according to the type of touch event Corresponding visual output is provided on a display panel.Touch sensitive surface can be realized with display panel as two independent parts Input and input function, in certain embodiments, it is also possible to by Touch sensitive surface and display panel it is integrated and realize input and export Function.
One of ordinary skill in the art will appreciate that all or part of the steps of the above embodiments may be implemented by hardware, or may be implemented by a program instructing the relevant hardware. The program may be stored in a computer-readable storage medium, and the storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disk or the like.
Those of ordinary skill in the art will appreciate that the units and algorithm steps of the examples described in connection with the embodiments disclosed herein can be implemented in electronic hardware, or in a combination of computer software and electronic hardware. Whether these functions are performed in hardware or in software depends on the particular application and design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present invention.
It will be clear to those skilled in the art that, for convenience and brevity of description, for the specific implementation of the information processing method described above, reference may be made to the corresponding description in the product embodiments.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely schematic; the division of the units is only a division of logical functions, and other divisions are possible in actual implementation. For example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not performed.
The units described as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the technical field can readily conceive of changes or substitutions within the technical scope disclosed by the present invention, and these should all be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention should be defined by the scope of the claims.
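The fusion based on the template area described above — short-exposure pixels inside the template area, long-exposure pixels outside it, with the template-area brightness corrected using the second image's luminance — might be sketched as below. The mean-ratio gain is an assumed, simplified form of the luminance correction, chosen only to illustrate the idea.

```python
import numpy as np

def fuse_images(first_img, second_img, template_mask):
    """Fuse a short-exposure first image with a long-exposure second image.

    Inside the template area the (sharper) first image is used, with its
    brightness corrected toward the second image; outside it, the
    (better-exposed) second image is used.
    """
    f = first_img.astype(np.float64)
    s = second_img.astype(np.float64)
    # Assumed correction: scale the first image so its mean luminance inside
    # the template area matches that of the second image there.
    gain = s[template_mask].mean() / max(f[template_mask].mean(), 1e-6)
    corrected = np.clip(f * gain, 0, 255)
    return np.where(template_mask, corrected, s).astype(np.uint8)
```

The short exposure freezes motion and keeps edges sharp but is dark; scaling it toward the long exposure's luminance before compositing avoids a visible brightness seam at the template boundary.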

Claims (10)

1. An image processing method, comprising:
obtaining a first image and a second image, wherein an exposure time of the first image is shorter than an exposure time of the second image;
extracting an edge region of the first image and an edge region of the second image, respectively;
processing the edge region of the first image and the edge region of the second image to obtain a template area; and
fusing the first image and the second image based on the template area.
2. The method of claim 1, wherein processing the edge region of the first image and the edge region of the second image to obtain a template area comprises:
superposing the edge region of the first image and the edge region of the second image to obtain the superposed template area.
3. The method of claim 1, wherein fusing the first image and the second image based on the template area comprises:
extracting a template area image of the first image within the template area, and extracting a non-template area image of the second image outside the template area; and
fusing the template area image of the first image with the non-template area image of the second image.
4. The method of claim 1, wherein extracting the edge region of the first image and the edge region of the second image respectively comprises:
extracting the edge regions of the first image and the second image respectively according to gray-value variations of the first image and the second image.
5. The method of claim 3, wherein extracting the template area image of the first image within the template area comprises:
correcting a brightness of the template area image of the first image according to luminance information of the second image within the template area.
6. An image processing apparatus, comprising:
an acquiring unit configured to obtain a first image and a second image, wherein an exposure time of the first image is shorter than an exposure time of the second image;
an extraction unit configured to extract an edge region of the first image and an edge region of the second image, respectively;
a processing unit configured to process the edge region of the first image and the edge region of the second image to obtain a template area; and
a fusion unit configured to fuse the first image and the second image based on the template area.
7. The apparatus of claim 6, wherein
the processing unit superposes the edge region of the first image and the edge region of the second image to obtain the superposed template area.
8. The apparatus of claim 6, wherein
the fusion unit extracts a template area image of the first image within the template area, and extracts a non-template area image of the second image outside the template area; and
fuses the template area image of the first image with the non-template area image of the second image.
9. The apparatus of claim 6, wherein
the extraction unit extracts the edge regions of the first image and the second image respectively according to gray-value variations of the first image and the second image.
10. The apparatus of claim 8, wherein
the fusion unit corrects a brightness of the template area image of the first image according to luminance information of the second image within the template area.
CN201710207173.4A 2017-03-31 2017-03-31 Image processing method and device Active CN106851115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710207173.4A CN106851115B (en) 2017-03-31 2017-03-31 Image processing method and device

Publications (2)

Publication Number Publication Date
CN106851115A true CN106851115A (en) 2017-06-13
CN106851115B CN106851115B (en) 2020-05-26

Family

ID=59142020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710207173.4A Active CN106851115B (en) 2017-03-31 2017-03-31 Image processing method and device

Country Status (1)

Country Link
CN (1) CN106851115B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114926351A (en) * 2022-04-12 2022-08-19 荣耀终端有限公司 Image processing method, electronic device, and computer storage medium

Citations (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10233966A (en) * 1997-02-21 1998-09-02 Matsushita Electric Ind Co Ltd Solid-state image pickup device
CN101222584A (en) * 2007-01-12 2008-07-16 三洋电机株式会社 Apparatus and method for blur detection, and apparatus and method for blur correction
CN101390384A (en) * 2005-12-27 2009-03-18 京瓷株式会社 Imaging device and its image processing method
US20090086174A1 (en) * 2007-09-28 2009-04-02 Sanyo Electric Co., Ltd. Image recording apparatus, image correcting apparatus, and image sensing apparatus
CN101424856A (en) * 2007-10-31 2009-05-06 华晶科技股份有限公司 Image- acquiring device for providing image compensating function and image compensation process thereof
US20100123807A1 (en) * 2008-11-19 2010-05-20 Seok Lee Image processing apparatus and method
CN101867721A (en) * 2010-04-15 2010-10-20 青岛海信网络科技股份有限公司 Implement method, implement device and imaging device for wide dynamic images
CN102082909A (en) * 2009-11-26 2011-06-01 三星电子株式会社 Digital photographing apparatus and method of controlling the same
CN103259976A (en) * 2012-02-17 2013-08-21 佳能株式会社 Image processing apparatus, image pickup apparatus, image processing method
US20140205193A1 (en) * 2013-01-24 2014-07-24 Fujitsu Semiconductor Limited Image processing apparatus, method and imaging apparatus
US8965120B2 (en) * 2012-02-02 2015-02-24 Canon Kabushiki Kaisha Image processing apparatus and method of controlling the same
CN104851079A (en) * 2015-05-06 2015-08-19 中国人民解放军国防科学技术大学 Noise/blurred image pair-based low-illumination license plate image restoration method
CN104966071A (en) * 2015-07-03 2015-10-07 武汉烽火众智数字技术有限责任公司 Infrared light supplement based night license plate detection and recognition method and apparatus
CN105791659A (en) * 2014-12-19 2016-07-20 联想(北京)有限公司 Image processing method and electronic device

Also Published As

Publication number Publication date
CN106851115B (en) 2020-05-26

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant