CN105912257A - Method and device for image hybrid processing - Google Patents
- Publication number
- CN105912257A CN105912257A CN201610224740.2A CN201610224740A CN105912257A CN 105912257 A CN105912257 A CN 105912257A CN 201610224740 A CN201610224740 A CN 201610224740A CN 105912257 A CN105912257 A CN 105912257A
- Authority
- CN
- China
- Prior art keywords
- touch point
- image
- original image
- touch
- pattern
- Prior art date
- Legal status: Granted (the status is an assumption, not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/60—Editing figures and text; Combining figures or text
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Processing Or Creating Images (AREA)
Abstract
The invention relates to a method and device for image hybrid processing. The method comprises: acquiring and displaying an original image; acquiring a first touch point generated on the original image by a touch event, and acquiring a second touch point generated by movement on the original image starting from the first touch point; acquiring the distance between the second touch point and the first touch point; judging whether the distance is greater than or equal to a distance parameter value randomly generated in advance for this iteration, and if so, recording the image coordinates of the second touch point; drawing a selected pattern at the second touch point; compositing the drawn pattern with the original image to obtain a composited image; and then continuing the movement with the second touch point as the new starting point. Randomly spaced, random patterns are thus drawn on the original image, implementing random hybrid processing of the image.
Description
Technical field
The present invention relates to the field of image processing, and in particular to a method and device for image hybrid processing.
Background technology
With the development of technology, more and more users can conveniently shoot images with their own terminal devices or obtain images from the network. To obtain the image they want, users apply various image processing tools to the original image to achieve the desired effect. Many image processing applications provide a paintbrush (or doodle) function. The traditional paintbrush function displays a photo shot by the user, or a picture chosen from the photo album, on the screen; the image can be cropped, rotated, or given filters, but it cannot be subjected to random hybrid processing.
Summary of the invention
In view of this, it is necessary to address the problem that traditional image processing tools cannot dynamically apply random hybrid processing to an image, and to provide an image hybrid processing method capable of randomly hybrid-processing an image.
In addition, it is necessary to provide an image hybrid processing device capable of randomly hybrid-processing an image.
An image hybrid processing method includes:
acquiring an original image, and displaying the original image;
receiving a touch event on the original image;
acquiring a first touch point generated by the touch event on the original image, and acquiring a second touch point generated by movement on the original image starting from the first touch point;
acquiring the distance between the second touch point and the first touch point generated on the original image;
judging whether the distance is greater than or equal to a distance parameter value randomly generated in advance for this iteration, and if so, recording the image coordinates of the second touch point;
drawing the selected pattern at the second touch point;
compositing the drawn pattern with the original image to obtain a composited image;
and continuing the movement with the second touch point as the new starting point.
An image hybrid processing device includes:
a display module, configured to acquire an original image and display the original image;
an event receiving module, configured to receive a touch event on the original image;
a parameter acquisition module, configured to acquire a first touch point generated by the touch event on the original image, and to acquire a second touch point generated by movement on the original image starting from the first touch point;
a distance acquisition module, configured to acquire the distance between the second touch point and the first touch point generated on the original image;
a judging module, configured to judge whether the distance is greater than or equal to a distance parameter value randomly generated in advance for this iteration, and if so, to record the image coordinates of the second touch point;
a drawing module, configured to draw the selected pattern at the second touch point;
a compositing module, configured to composite the drawn pattern with the original image to obtain a composited image;
wherein the parameter acquisition module is further configured to acquire a new second touch point generated on the original image by movement with the second touch point as the new starting point.
In the above image hybrid processing method, the original image is acquired and displayed; the first and second touch points generated on the original image are acquired, along with the distance between them; whether that distance is greater than or equal to the distance parameter value randomly generated for this iteration is judged, and if so the selected pattern is drawn at the second touch point. The movement then continues with the second touch point as the new starting point: a new second touch point is acquired, its distance from the new starting point is computed, and that distance is compared with a newly randomized distance parameter value, with a newly selected pattern drawn at the new second touch point when the comparison succeeds. Repeating this drawing process places randomly spaced, random patterns on the original image, implementing random hybrid processing of the image.
Brief description of the drawings
Fig. 1 is a schematic diagram of the internal structure of the terminal in an embodiment;
Fig. 2 is a flowchart of the image hybrid processing method in an embodiment;
Fig. 3 is a schematic diagram of pressing the original image in an embodiment;
Fig. 4 is a flowchart of the image hybrid processing method in another embodiment;
Fig. 5 is a schematic diagram of the effect of a circular paintbrush that follows the track of a finger moving on an image, with random gaps, changing colors, and lighten blending;
Fig. 6 is a structural block diagram of the image hybrid processing device in an embodiment;
Fig. 7 is a structural block diagram of the image hybrid processing device in another embodiment;
Fig. 8 is a structural block diagram of the image hybrid processing device in another embodiment;
Fig. 9 is a structural block diagram of the image hybrid processing device in another embodiment.
Detailed description of the invention
To make the purpose, technical scheme, and advantages of the present invention clearer, the present invention is further elaborated below with reference to the accompanying drawings and embodiments. It should be appreciated that the specific embodiments described herein serve only to explain the present invention and are not intended to limit it.
Fig. 1 is a schematic diagram of the internal structure of the terminal in an embodiment. As shown in Fig. 1, the terminal includes a processor, a non-volatile storage medium, an internal memory, a display screen, and an input unit connected through a system bus. The non-volatile storage medium of the terminal stores an operating system. The processor provides computing and control capability and supports the operation of the whole terminal; it is used to carry out the image hybrid processing method, including: acquiring an original image and displaying the original image; receiving a touch event on the original image; acquiring a first touch point generated by the touch event on the original image, and acquiring a second touch point generated by movement on the original image starting from the first touch point; acquiring the distance between the second touch point and the first touch point generated on the original image; judging whether the distance is greater than or equal to the distance parameter value randomly generated in advance, and if so, recording the image coordinates of the second touch point; drawing the selected pattern at the second touch point; compositing the drawn pattern with the original image to obtain a composited image; and continuing the movement with the second touch point as the starting point. The display screen of the terminal may be a liquid crystal display screen or an electronic ink display screen; the input unit may be a touch layer covering the display screen, a button, trackball, or trackpad arranged on the terminal enclosure, or an external keyboard, trackpad, or mouse. The terminal may be a mobile phone, tablet computer, personal digital assistant, or wearable device. Those skilled in the art will appreciate that the structure shown in Fig. 1 is only a block diagram of the parts relevant to the present application and does not limit the terminal to which the present application is applied; a concrete terminal may include more or fewer parts than shown, combine some parts, or have a different arrangement of parts.
Fig. 2 is a flowchart of the image hybrid processing method in an embodiment. As shown in Fig. 2, an image hybrid processing method, running on the terminal of Fig. 1, includes:
Step 202: acquire an original image, and display the original image.
In the present embodiment, a shot photo, a picture chosen from the photo album, or a picture downloaded from the network serves as the original image; the original image is loaded into the image processing application and displayed on the screen of the terminal.
Step 204: receive a touch event on the original image.
In the present embodiment, the terminal receives the user's touch event on the original image through a touch object such as a finger or a stylus. The touch event refers to a pressing operation on the original image.
Step 206: acquire the first touch point generated by the touch event on the original image, and acquire a second touch point generated by movement on the original image starting from the first touch point.
In the present embodiment, the image coordinates of the first touch point pressed by the user on the original image are acquired. Two variables, p1 and p2, store the image coordinates of touch points: p1 stores the image coordinates of the first touch point, and p2 stores the image coordinates of the second touch point.
Further, the moving direction is obtained from the first and second touch points. The moving direction may be changing or constant: if the user moves in a straight line the moving direction does not change, whereas if the movement track is a curve the moving direction changes dynamically.
In the present embodiment, the moving direction is computed as angle = atan(-(y2 - y1) / (x2 - x1)), where the coordinates of the first touch point P1 are (x1, y1) and those of the second touch point P2 are (x2, y2). Because the codomain of the arctangent function is (-π/2, π/2), the angle is corrected according to the coordinate quadrant: if (x2 < x1) { angle = angle + π }.
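The direction computation above can be sketched in Python as follows (a minimal sketch; the guard for purely vertical movement, where x2 = x1, is an added assumption, since the patent's formula would divide by zero there):

```python
import math

def moving_direction(p1, p2):
    """Movement angle from first touch point p1 = (x1, y1) to second
    touch point p2 = (x2, y2), per the formula
    angle = atan(-(y2 - y1) / (x2 - x1)), corrected by +pi when
    x2 < x1 because atan's codomain is only (-pi/2, pi/2)."""
    (x1, y1), (x2, y2) = p1, p2
    if x2 == x1:
        # Vertical movement: the patent's formula would divide by
        # zero here; returning +/- pi/2 is an added assumption.
        return math.pi / 2 if y2 < y1 else -math.pi / 2
    angle = math.atan(-(y2 - y1) / (x2 - x1))
    if x2 < x1:
        angle += math.pi
    return angle
```

The minus sign compensates for image coordinates growing downward, so moving "up" on screen yields a positive angle.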
Fig. 3 is a schematic diagram of pressing the original image in an embodiment. As shown in Fig. 3, the user generates a touch event by pressing the displayed original image with a finger; the image coordinates of the first touch point P1 generated by the touch event on the original image, and of the second touch point P2 the finger moves to, are recorded.
Step 208: acquire the distance between the second touch point and the first touch point generated on the original image.
In the present embodiment, the distance between the first and second touch points is calculated from their image coordinates. For example, if the image coordinates of the first touch point are (x1, y1) and those of the second touch point are (x2, y2), then the distance between them is d = √((x2 - x1)² + (y2 - y1)²).
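Using image coordinates, the distance check of step 208 reduces to the Euclidean distance, for example:

```python
import math

def touch_distance(p1, p2):
    # Euclidean distance between the image coordinates of the first
    # and second touch points.
    (x1, y1), (x2, y2) = p1, p2
    return math.hypot(x2 - x1, y2 - y1)
```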
Step 210: judge whether the distance is greater than or equal to the distance parameter value randomly generated in advance for this iteration; if so, perform step 212, and if not, perform step 206.
In the present embodiment, the distance parameter value is randomly generated in advance.
Step 212: record the image coordinates of the second touch point.
In the present embodiment, the image coordinates of the second touch point are stored in p2.
Step 214: draw the selected pattern at the second touch point.
In the present embodiment, the selected pattern is drawn with the second touch point as its center. The selected pattern may be an arbitrary pattern, such as a circular pattern, a triangular pattern, or a flower pattern, without being limited thereto.
Step 216: composite the drawn pattern with the original image to obtain a composited image.
In the present embodiment, the drawn pattern is attached to the original image to obtain the composited image.
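As a minimal sketch of drawing a pattern centered on the second touch point, the pixels of a circular pattern can be enumerated with a naive bounding-box scan (the patent does not prescribe a rasterization method, and circles are only one of the allowed pattern shapes):

```python
def circle_pixels(cx, cy, radius):
    """Integer pixels covered by a circular pattern centered on the
    second touch point (cx, cy); a naive scan over the bounding box."""
    pts = []
    for y in range(int(cy - radius), int(cy + radius) + 1):
        for x in range(int(cx - radius), int(cx + radius) + 1):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                pts.append((x, y))
    return pts
```

Compositing then amounts to writing these pixels onto the original image, subject to the blend mode and transparency chosen below.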
Step 218: continue the movement with the second touch point as the new starting point, then perform step 206.
Specifically, the coordinates of point p2 are stored into p1, and steps 206 to 218 are repeated until a lift event is detected, which ends the process. With the second touch point as the new starting point, a new second touch point generated on the original image by the movement is acquired; the distance between the new second touch point and the new starting point is calculated; whether the distance is greater than or equal to the randomly generated distance threshold for this iteration is judged; if so, the new second touch point is recorded, the newly selected pattern is drawn at the new second touch point, and the drawn pattern is composited with the original image to obtain the composited image.
The randomly generated distance thresholds may be the same or different, each being randomly generated from a random integer. The pattern selected each time may be the same or different.
In the above image hybrid processing method, the original image is acquired and displayed; the first and second touch points generated on the original image are acquired, along with the distance between them; whether that distance is greater than or equal to the distance parameter value randomly generated for this iteration is judged, and if so the selected pattern is drawn at the second touch point. The movement then continues with the second touch point as the new starting point: a new second touch point is acquired, its distance from the new starting point is computed, and that distance is compared with a newly randomized distance parameter value, with a newly selected pattern drawn at the new second touch point when the comparison succeeds. Repeating this drawing process places randomly spaced, random patterns on the original image, implementing random hybrid processing of the image.
In one embodiment, randomly generating the distance parameter value in advance includes: generating a random integer; and taking the remainder of the random integer modulo n, obtaining an integer in the range 0 to n-1. When the remainder is 0, the generated distance parameter value is greater than the diameter of the selected pattern; when the remainder is not 0, the generated distance parameter value is less than the diameter of the selected pattern, where n is a natural number greater than 2.
In the present embodiment, if n is 10, the integer obtained lies in [0, 9]; when the remainder is 0 the generated distance parameter value exceeds the diameter of the selected pattern, and when the remainder is not 0 it is less than the diameter. This amounts to a 10% chance of drawing a pattern with a gap and a 90% chance of drawing an overlapping pattern, producing an effect with more dense regions.
If n is 5, the integer obtained lies in [0, 4]; when the remainder is 0 the generated distance parameter value exceeds the diameter of the selected pattern, and when the remainder is not 0 it is less than the diameter. This amounts to a 20% chance of drawing a pattern with a gap and an 80% chance of drawing an overlapping pattern, again producing an effect with more dense regions.
In other embodiments, when the remainder is 0, the generated distance parameter value is greater than the diameter of the selected pattern but less than a predetermined multiple of that diameter.
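The remainder scheme can be sketched as follows. The 1.5 and 0.8 multipliers are illustrative assumptions; the patent only requires the gap value to exceed the pattern diameter (while staying under a predetermined multiple of it) and the overlap value to stay below the diameter:

```python
import random

def random_distance_parameter(diameter, n=10):
    """Generate the distance parameter value: take a random integer
    modulo n; remainder 0 (probability 1/n) yields a value above the
    pattern diameter (a visible gap), any other remainder yields a
    value below it (overlap)."""
    remainder = random.randint(0, 10**6) % n
    if remainder == 0:
        return diameter * 1.5   # gap, with probability 1/n
    return diameter * 0.8       # overlap, with probability (n-1)/n
```

Lowering n (e.g. n = 5) raises the gap probability and thins out the dense regions accordingly.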
In one embodiment, before the step of drawing the selected pattern at the second touch point, the image hybrid processing method further includes: randomly selecting a color from pre-configured colors.
In the present embodiment, multiple colors are pre-configured, and one is then selected from them at random. The pre-configured colors may be a value range.
After the step of drawing the selected pattern at the second touch point, the image hybrid processing method further includes: mapping the selected color onto the drawn pattern.
In the present embodiment, the selected color is mapped onto the drawn pattern according to the lighten (brightening) blend mode, and the pattern is drawn at the corresponding position on the original image.
Lighten mode: formula C = Max(A, B). Contrary to Darken, the brighter of the two pixels is taken as the result. The color information of each channel is checked, and for each pixel the two colors are compared; the brighter one is used as the final color of that pixel, that is, the lighter of the two colors becomes the final color. Colors in the base that are brighter than the drawing color are retained, while darker ones are replaced.
The selected color may also be mapped onto the drawn pattern using other blend modes, including dissolve mode, behind mode, and darken mode. In dissolve mode, the final color equals the drawing color, but the transparency varies randomly with the position of each pixel, so the drawing color and the base color randomly replace each other. In behind mode, the final color also equals the drawing color; this mode appears when operating on a layer with transparent regions, and it places the drawn lines on a layer behind the image. Darken mode selects the darker of the two images as the result: it looks up the color information in each color channel and, comparing the base color and the drawing color per pixel, takes the darker color as the final color of the image. Bright colors in the base are replaced, while dark colors in the base remain unchanged.
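The lighten and darken formulas can be expressed per channel, for instance:

```python
def lighten(a, b):
    # Lighten: C = max(A, B) per channel; the brighter value wins.
    return tuple(max(ca, cb) for ca, cb in zip(a, b))

def darken(a, b):
    # Darken: the darker channel value wins; bright base values are
    # replaced, dark ones kept.
    return tuple(min(ca, cb) for ca, cb in zip(a, b))
```

For example, blending drawing color (200, 10, 50) onto base color (100, 80, 50) in lighten mode yields (200, 80, 50): each channel keeps whichever input is brighter.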
In one embodiment, before the step of drawing the selected pattern at the second touch point, the image hybrid processing method further includes: randomly selecting a transparency from pre-configured transparencies.
In the present embodiment, multiple transparency values are pre-configured, and one can be randomly selected. The configured transparency range may be [40, 255], for example, without being limited thereto.
After the step of drawing the selected pattern at the second touch point, the image hybrid processing method further includes: adjusting the transparency of the drawn pattern to the selected transparency.
In one embodiment, the step of acquiring the image coordinates of the first touch point generated by the touch event on the original image includes: acquiring the screen coordinates of the first touch point generated by the touch event on the screen; and converting the screen coordinates of the first touch point generated on the screen into the image coordinates of the first touch point generated on the original image.
In the present embodiment, the conversion relation between the screen coordinate system and the image coordinate system is established in advance; after the screen coordinates are obtained, they are converted into image coordinates according to this conversion relation.
The step of acquiring the image coordinates of the second touch point generated on the original image by movement starting from the first touch point includes: acquiring the screen coordinates of the second touch point generated on the screen by the move event starting from the first touch point; and converting the screen coordinates of the second touch point generated on the screen into the image coordinates of the second touch point generated on the original image.
In the present embodiment, the conversion relation between the screen coordinate system and the image coordinate system is likewise established in advance, and the screen coordinates are converted into image coordinates according to it.
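The patent only states that a conversion relation between the two coordinate systems is established in advance; a simple offset-plus-uniform-scale mapping is one assumed form such a relation could take:

```python
def screen_to_image(sx, sy, offset_x, offset_y, scale):
    # Assumed mapping: the image is displayed at (offset_x, offset_y)
    # on screen, magnified by a uniform factor `scale`.
    return ((sx - offset_x) / scale, (sy - offset_y) / scale)

def image_to_screen(ix, iy, offset_x, offset_y, scale):
    # Inverse relation, for drawing back onto the screen.
    return (ix * scale + offset_x, iy * scale + offset_y)
```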
Fig. 4 is a flowchart of the image hybrid processing method in another embodiment. As shown in Fig. 4, an image hybrid processing method includes:
Step 402: acquire an original image, and display the original image.
In the present embodiment, a shot photo, a picture chosen from the photo album, or a picture downloaded from the network serves as the original image; the original image is loaded into the image processing application and displayed on the screen of the terminal.
Step 404: receive a touch event on the original image.
In the present embodiment, the terminal receives the user's touch event on the original image through a touch object such as a finger or a stylus. The touch event refers to a pressing operation on the original image.
Step 406: acquire the first touch point generated by the touch event on the original image.
In the present embodiment, the image coordinates of the first touch point generated on the original image are acquired. Two variables, p1 and p2, store the image coordinates of touch points: p1 stores the image coordinates of the first touch point, and p2 stores the image coordinates of the second touch point.
Step 408: acquire a second touch point generated on the original image by movement starting from the first touch point.
In the present embodiment, the image coordinates of the second touch point generated on the original image by movement starting from the first touch point are acquired.
Step 410: judge whether a lift event has occurred; if so, perform step 412, and if not, perform step 414.
The lift event may be the user lifting a finger or stylus, indicating that the touch has ended.
Step 412: drawing ends.
Step 414: acquire the distance between the second touch point and the first touch point generated on the original image.
Step 416: judge whether the distance is greater than or equal to the distance parameter value randomly generated in advance for this iteration; if so, perform step 418, and if not, perform step 408.
Randomly generating the distance parameter value in advance includes: generating a random integer; and taking the remainder of the random integer modulo n, obtaining an integer in the range 0 to n-1. When the remainder is 0, the generated distance parameter value is greater than the diameter of the selected pattern; when the remainder is not 0, it is less than the diameter of the selected pattern, where n is a natural number greater than 2.
Step 418: record the image coordinates of the second touch point.
Step 420: draw the selected pattern at the second touch point.
In the present embodiment, the selected pattern is drawn with the second touch point as its center.
Step 422: composite the drawn pattern with the original image to obtain a composited image.
Step 424: continue the movement with the second touch point as the new starting point, then perform step 408.
Specifically, with the second touch point as the new starting point, a new second touch point generated on the original image by the movement is acquired; the distance between the new second touch point and the new starting point is calculated; whether the distance is greater than or equal to the randomly generated distance threshold for this iteration is judged; if so, the new second touch point is recorded, the newly selected pattern is drawn at the new second touch point, and the drawn pattern is composited with the original image to obtain the composited image.
The randomly generated distance thresholds may be the same or different, each being randomly generated from a random integer. The pattern selected each time may be the same or different.
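The loop of steps 408 to 424 can be sketched as follows, with `draw` standing in for pattern drawing plus compositing, and with the n = 10 remainder scheme and illustrative 1.5/0.8 multipliers assumed (the patent fixes neither):

```python
import math
import random

def next_threshold(diameter, n=10):
    # Remainder scheme: probability 1/n of a gap (value above the
    # diameter), otherwise an overlapping value below the diameter.
    return diameter * (1.5 if random.randrange(n) == 0 else 0.8)

def handle_touch_stream(points, diameter, draw):
    """Walk the ordered touch points (the stream ends at the lift
    event); whenever the finger has moved at least the current random
    threshold from the start point, draw a pattern there and make that
    point the new start point (steps 408-424)."""
    if not points:
        return []
    start = points[0]
    threshold = next_threshold(diameter)
    drawn = []
    for p in points[1:]:
        if math.hypot(p[0] - start[0], p[1] - start[1]) >= threshold:
            draw(p)                       # steps 418-422
            drawn.append(p)
            start = p                     # step 424
            threshold = next_threshold(diameter)
    return drawn
```

Because the threshold is re-randomized after every drawn pattern, successive patterns land at irregular spacings along the finger's track.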
In the above image hybrid processing method, the original image is acquired and displayed; the first and second touch points generated on the original image are acquired, along with the distance between them; whether that distance is greater than or equal to the distance parameter value randomly generated for this iteration is judged, and if so the selected pattern is drawn at the second touch point. The movement then continues with the second touch point as the new starting point: a new second touch point is acquired, its distance from the new starting point is computed, and that distance is compared with a newly randomized distance parameter value, with a newly selected pattern drawn at the new second touch point when the comparison succeeds. Repeating this drawing process places randomly spaced, random patterns on the original image, implementing random hybrid processing of the image.
Fig. 5 is a schematic diagram of the effect of a circular paintbrush that follows the track of a finger moving on an image, adding random gaps, changing colors, and lighten blending. As shown in Fig. 5, a picture the user selects from the photo album of the terminal is obtained and displayed on the screen of the terminal. When the user's finger is detected pressing the screen, the image coordinates of the first touch point pressed by the finger are acquired, and the image coordinates of the second touch point generated as the finger moves are recorded. The distance between the first and second touch points is obtained from their image coordinates, and it is judged whether this distance is greater than or equal to the distance parameter value randomly generated for this iteration. If it is, a circular pattern is drawn centered on the second touch point and processed with the selected color in lighten blend mode; the movement then continues with the second touch point as the new starting point, a new second touch point is obtained, and the above process is repeated, adding circular patterns to the picture at random gaps and mapping randomly selected colors onto them at the corresponding positions, until the finger is detected leaving the screen. For example, the first touch point draws a red circular pattern, the second touch point draws a green circular pattern, and the third touch point draws a blue circular pattern; the distance between the first and second touch points is 0.8 times the circular pattern diameter, and the spacing between the second and third touch points is 1.1 times the circular pattern diameter. A hybrid processing effect with random gaps and changing colors is thus formed on the picture.
Fig. 6 is the structured flowchart of image blend processing means in an embodiment.As shown in Figure 6, Yi Zhongtu
As mixed processing device, including the virtual module corresponding to the image blend processing method institute framework in Fig. 2,
Including display module 602, event receiver module 604, parameter acquisition module 606, distance acquisition module 608,
Judge module 610, drafting module 612 and synthesis module 614.Wherein:
Display module 602 is used for obtaining original image, and shows this original image.
In the present embodiment, the photo of shooting or the picture chosen from photograph album or the picture downloaded from network are made
For original image, in loading original image to image procossing application program, and original image is shown in terminal
Screen on.
Event receiver module 604 is for receiving the touch event to this original image.
In the present embodiment, terminal receives user by the touch control object such as finger or the pointer touch-control to original image
Event.This touch event refers to the pressing operation to original image.
The parameter acquisition module 606 is configured to obtain the first touch point that the touch event produces on the original image, and to obtain the second touch point produced on the original image as the touch object moves with the first touch point as the starting point.

In this embodiment, the image coordinates of the first touch point pressed by the user on the original image are obtained, as are the image coordinates of the second touch point produced on the original image. Two variables, p1 and p2, are used to store the image coordinates of the touch points: p1 stores the image coordinates of the first touch point, and p2 stores the image coordinates of the second touch point.
The distance acquisition module 608 is configured to obtain the distance between the second touch point produced on the original image and the first touch point.

In this embodiment, the distance between the first touch point and the second touch point is calculated from their image coordinates. For example, if the image coordinates of the first touch point are (x1, y1) and the image coordinates of the second touch point are (x2, y2), the distance between the first touch point and the second touch point is √((x2 - x1)² + (y2 - y1)²).
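The distance computation described above is the standard Euclidean distance between the two image coordinates; a minimal sketch (the names p1 and p2 follow the variables in the description):

```python
import math

def touch_distance(p1, p2):
    """Euclidean distance between two touch points given as (x, y) image coordinates."""
    x1, y1 = p1
    x2, y2 = p2
    return math.hypot(x2 - x1, y2 - y1)

# Example: first touch point at (10, 20), second at (13, 24)
d = touch_distance((10, 20), (13, 24))  # -> 5.0
```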
The judging module 610 is configured to judge whether the distance is greater than or equal to the distance parameter value generated randomly in advance; if so, the image coordinates of the second touch point are recorded.

In this embodiment, the distance parameter value is generated randomly in advance.
The drawing module 612 is configured to draw the chosen pattern at the second touch point.

In this embodiment, the chosen pattern is drawn with the second touch point as the center of the pattern. The chosen pattern can be any pattern, such as a circular pattern, a triangular pattern or a flower pattern, without being limited thereto.
The synthesis module 614 is configured to combine the drawn pattern with the original image to obtain a synthesized image. In this embodiment, the drawn pattern is attached to the original image to obtain the synthesized image.
The parameter acquisition module 606 is further configured to obtain a new second touch point produced on the original image as the touch object moves with the second touch point as a new starting point.

Specifically, with the second touch point as the new starting point, the parameter acquisition module 606 obtains the new second touch point produced on the original image as the touch object moves; the distance acquisition module 608 calculates the distance between the new second touch point and the new starting point; the judging module 610 judges whether the distance is greater than or equal to the randomly generated distance threshold and, if so, records the new second touch point; the drawing module 612 draws the newly chosen pattern at the new second touch point; and the synthesis module 614 combines the drawn pattern with the original image to obtain the synthesized image.

The randomly generated distance threshold may be the same or different each time, being generated randomly from random integers. Likewise, the pattern chosen each time may be the same or different.
The above image blend processing device obtains and displays an original image, obtains the first touch point and the second touch point produced on the original image, obtains the distance between the first touch point and the second touch point, and judges whether the distance is greater than or equal to the randomly generated distance parameter value; if so, the chosen pattern is drawn at the second touch point. The touch object then moves with the second touch point as a new starting point, a new second touch point is obtained, the distance of the new touch point from the new starting point is calculated, and it is again judged whether the distance is greater than or equal to the randomly generated distance parameter value; if so, the newly chosen pattern is drawn at the new second touch point. By repeating the drawing in this way, randomly spaced patterns are drawn onto the original image, achieving random blend processing of the image.
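The repeated draw-and-move flow summarized above can be sketched as an event loop. This is an illustrative reconstruction, not the patented implementation: the actual pattern drawing and compositing are omitted, the function simply returns the points at which patterns would be drawn, and `random_distance_threshold` (with its 1.5x/0.8x scale factors and n = 4) is an assumed stand-in for the pre-generated distance parameter value:

```python
import math
import random

PATTERN_DIAMETER = 40  # assumed pattern size in pixels

def random_distance_threshold(n=4, diameter=PATTERN_DIAMETER):
    # Sketch of the modulo rule described in the embodiments: remainder 0 ->
    # threshold larger than the pattern diameter, otherwise smaller.
    r = random.randint(0, 10**6) % n
    return diameter * 1.5 if r == 0 else diameter * 0.8

def blend(move_events, start):
    """Return the touch points at which a pattern would be drawn."""
    draw_points = []
    p1 = start                      # current starting point
    threshold = random_distance_threshold()
    for p2 in move_events:          # second touch points produced while moving
        if math.dist(p1, p2) >= threshold:
            draw_points.append(p2)  # record the second touch point and draw there
            p1 = p2                 # second touch point becomes the new starting point
            threshold = random_distance_threshold()
    return draw_points
```

Because the threshold is regenerated after every drawn pattern, the gaps between consecutive patterns vary randomly, which is what produces the random spacing described above.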
Fig. 7 is a structural block diagram of the image blend processing device in another embodiment. As shown in Fig. 7, an image blend processing device includes the virtual modules corresponding to the framework of the image blend processing method in Fig. 2: in addition to the display module 602, event receiving module 604, parameter acquisition module 606, distance acquisition module 608, judging module 610, drawing module 612 and synthesis module 614, it also includes a parameter generation module 616. Wherein:
The parameter generation module 616 is configured to generate the distance parameter value randomly in advance, including:

generating a random integer;

taking the random integer modulo n to obtain an integer in the range 0 to n-1; when the remainder is 0, the generated distance parameter value is greater than the diameter of the chosen pattern, and when the remainder is not 0, the generated distance parameter value is smaller than the diameter of the chosen pattern, where n is a natural number.
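A minimal sketch of this modulo rule follows. The patent does not specify how much greater or smaller the value is than the diameter, so the 1.5x and 0.5x scale factors below are illustrative assumptions:

```python
import random

def generate_distance_parameter(n, pattern_diameter):
    """Randomly generate the distance parameter value as described:
    remainder 0 -> value above the pattern diameter, otherwise below it.
    The 1.5x / 0.5x scale factors are illustrative assumptions."""
    remainder = random.randint(0, 2**31 - 1) % n  # integer in 0..n-1
    if remainder == 0:
        return pattern_diameter * 1.5   # greater than the diameter
    return pattern_diameter * 0.5       # smaller than the diameter
```

With n = 4, roughly one gap in four will be wider than a full pattern, which yields the randomly varying spacing described above.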
Fig. 8 is a structural block diagram of the image blend processing device in yet another embodiment. As shown in Fig. 8, in addition to the display module 602, event receiving module 604, parameter acquisition module 606, distance acquisition module 608, judging module 610, drawing module 612 and synthesis module 614, the image blend processing device also includes a color selection module 618. Wherein:

The color selection module 618 is configured to randomly select a color from pre-configured colors before the chosen pattern is drawn at the second touch point.

The drawing module 612 is further configured to map the selected color onto the drawn pattern after the chosen pattern is drawn at the second touch point.
Fig. 9 is a structural block diagram of the image blend processing device in yet another embodiment. As shown in Fig. 9, in addition to the display module 602, event receiving module 604, parameter acquisition module 606, distance acquisition module 608, judging module 610, drawing module 612 and synthesis module 614, the image blend processing device also includes a transparency selection module 620. Wherein:

The transparency selection module 620 is configured to randomly select a transparency from pre-configured transparencies before the chosen pattern is drawn at the second touch point.

The drawing module 612 is further configured to adjust the transparency of the drawn pattern to the selected transparency after the chosen pattern is drawn at the second touch point.
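The random color and transparency selection performed by modules 618 and 620 can be sketched as follows; the pre-configured palette and alpha levels are illustrative placeholders, not values from the patent:

```python
import random

# Pre-configured colors (RGB) and transparency levels -- illustrative values.
COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0)]
ALPHAS = [0.25, 0.5, 0.75, 1.0]

def pick_pattern_style(rng=random):
    """Randomly select a color and a transparency for the next drawn pattern."""
    return rng.choice(COLORS), rng.choice(ALPHAS)

color, alpha = pick_pattern_style()
```

Drawing each pattern with a freshly picked style produces the color-varying blended effect described earlier.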
In other embodiments, an image blend processing device may include any possible combination of the display module 602, event receiving module 604, parameter acquisition module 606, distance acquisition module 608, judging module 610, drawing module 612, synthesis module 614, parameter generation module 616, color selection module 618 and transparency selection module 620.
In one embodiment, the parameter acquisition module 606 is further configured to obtain the screen coordinates of the first touch point that the touch event produces on the screen, and to convert the screen coordinates of the first touch point produced on the screen into the image coordinates of the first touch point produced on the original image; and to obtain the screen coordinates of the second touch point that the move event, with the first touch point as the starting point, produces on the screen, and to convert the screen coordinates of the second touch point produced on the screen into the image coordinates of the second touch point produced on the original image.
Those of ordinary skill in the art will appreciate that all or part of the flows in the methods of the above embodiments can be completed by a computer program instructing the relevant hardware. The program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the flows of the embodiments of the above methods. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM) or the like.
The above embodiments express only several implementations of the present invention, and their description is relatively specific and detailed, but they should not therefore be construed as limiting the scope of the claims of the present invention. It should be pointed out that those of ordinary skill in the art can also make several variations and improvements without departing from the concept of the invention, and these all fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. An image blend processing method, comprising:
obtaining an original image, and displaying the original image;
receiving a touch event on the original image;
obtaining a first touch point that the touch event produces on the original image, and obtaining a second touch point produced on the original image by moving with the first touch point as a starting point;
obtaining the distance between the second touch point produced on the original image and the first touch point;
judging whether the distance is greater than or equal to a distance parameter value generated randomly in advance, and if so, recording the image coordinates of the second touch point;
drawing a chosen pattern at the second touch point;
combining the drawn pattern with the original image to obtain a synthesized image; and
moving with the second touch point as a new starting point.
2. The method according to claim 1, further comprising:
generating the distance parameter value randomly in advance, including:
generating a random integer; and
taking the random integer modulo n to obtain an integer in the range 0 to n-1, wherein when the remainder is 0 the generated distance parameter value is greater than the diameter of the chosen pattern, and when the remainder is not 0 the generated distance parameter value is smaller than the diameter of the chosen pattern, where n is a natural number.
3. The method according to claim 1, wherein before the step of drawing the chosen pattern at the second touch point, the method further comprises:
randomly selecting a color from pre-configured colors;
and after the step of drawing the chosen pattern at the second touch point, the method further comprises:
mapping the selected color onto the drawn pattern.
4. The method according to claim 1, wherein before the step of drawing the chosen pattern at the second touch point, the method further comprises:
randomly selecting a transparency from pre-configured transparencies;
and after the step of drawing the chosen pattern at the second touch point, the method further comprises:
adjusting the transparency of the drawn pattern to the selected transparency.
5. The method according to claim 1, wherein the step of obtaining the image coordinates of the first touch point that the touch event produces on the original image comprises:
obtaining the screen coordinates of the first touch point that the touch event produces on the screen; and
converting the screen coordinates of the first touch point produced on the screen into the image coordinates of the first touch point produced on the original image;
and the step of obtaining the image coordinates of the second touch point that the move event, with the first touch point as a starting point, produces on the original image comprises:
obtaining the screen coordinates of the second touch point that the move event, with the first touch point as a starting point, produces on the screen; and
converting the screen coordinates of the second touch point produced on the screen into the image coordinates of the second touch point produced on the original image.
6. An image blend processing device, comprising:
a display module, configured to obtain an original image and display the original image;
an event receiving module, configured to receive a touch event on the original image;
a parameter acquisition module, configured to obtain a first touch point that the touch event produces on the original image, and to obtain a second touch point produced on the original image by moving with the first touch point as a starting point;
a distance acquisition module, configured to obtain the distance between the second touch point produced on the original image and the first touch point;
a judging module, configured to judge whether the distance is greater than or equal to a distance parameter value generated randomly in advance, and if so, to record the image coordinates of the second touch point;
a drawing module, configured to draw a chosen pattern at the second touch point; and
a synthesis module, configured to combine the drawn pattern with the original image to obtain a synthesized image;
wherein the parameter acquisition module is further configured to obtain a new second touch point produced on the original image by moving with the second touch point as a new starting point.
7. The device according to claim 6, further comprising:
a parameter generation module, configured to generate the distance parameter value randomly in advance, including:
generating a random integer; and
taking the random integer modulo n to obtain an integer in the range 0 to n-1, wherein when the remainder is 0 the generated distance parameter value is greater than the diameter of the chosen pattern, and when the remainder is not 0 the generated distance parameter value is smaller than the diameter of the chosen pattern, where n is a natural number.
8. The device according to claim 6, further comprising:
a color selection module, configured to randomly select a color from pre-configured colors before the chosen pattern is drawn at the second touch point;
wherein the drawing module is further configured to map the selected color onto the drawn pattern after the chosen pattern is drawn at the second touch point.
9. The device according to claim 6, further comprising:
a transparency selection module, configured to randomly select a transparency from pre-configured transparencies before the chosen pattern is drawn at the second touch point;
wherein the drawing module is further configured to adjust the transparency of the drawn pattern to the selected transparency after the chosen pattern is drawn at the second touch point.
10. The device according to claim 6, wherein the parameter acquisition module is further configured to:
obtain the screen coordinates of the first touch point that the touch event produces on the screen, and convert the screen coordinates of the first touch point produced on the screen into the image coordinates of the first touch point produced on the original image; and
obtain the screen coordinates of the second touch point that the move event, with the first touch point as a starting point, produces on the screen, and convert the screen coordinates of the second touch point produced on the screen into the image coordinates of the second touch point produced on the original image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610224740.2A CN105912257B (en) | 2016-04-11 | 2016-04-11 | Image blend treating method and apparatus |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105912257A true CN105912257A (en) | 2016-08-31 |
CN105912257B CN105912257B (en) | 2019-03-05 |
Family
ID=56745989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610224740.2A Active CN105912257B (en) | 2016-04-11 | 2016-04-11 | Image blend treating method and apparatus |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105912257B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110162258A (en) * | 2018-07-03 | 2019-08-23 | 腾讯数码(天津)有限公司 | The processing method and processing device of individual scene image |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103246466A (en) * | 2012-02-14 | 2013-08-14 | 深圳富泰宏精密工业有限公司 | Touch screen unlocking system and method |
CN103257828A (en) * | 2013-05-30 | 2013-08-21 | 无锡久源软件科技有限公司 | Graffiti type full screen sliding and unlocking method |
CN103309557A (en) * | 2012-03-06 | 2013-09-18 | 卡西欧计算机株式会社 | Image processing apparatus and image processing method |
CN105320434A (en) * | 2014-06-16 | 2016-02-10 | 中兴通讯股份有限公司 | Curve drawing method and device based on android system and terminal |
Also Published As
Publication number | Publication date |
---|---|
CN105912257B (en) | 2019-03-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11481097B2 (en) | User interface tools for cropping and straightening image | |
CN105723445B (en) | Display device and its control method | |
US20080307341A1 (en) | Rendering graphical objects based on context | |
DE102011108861A1 (en) | Electronic device and method for generating a graphical user interface thereof | |
CN105894554A (en) | Image processing method and image processing device | |
CN110427131B (en) | Animation display method and device based on pressing | |
JP2012213019A (en) | Image processing apparatus, image processing method, and program | |
KR20150114434A (en) | System generating display overlay parameters utilizing touch inputs and method thereof | |
US7064753B2 (en) | Image generating method, storage medium, image generating apparatus, data signal and program | |
US20210026508A1 (en) | Method, device and computer program for overlaying a graphical image | |
US7327364B2 (en) | Method and apparatus for rendering three-dimensional images of objects with hand-drawn appearance in real time | |
CN106297477A (en) | A kind of method and device generating digitized copybook | |
US20150042675A1 (en) | Pattern Based Design Application | |
CN111158840B (en) | Image carousel method and device | |
CN105912257A (en) | Method and device for image hybrid processing | |
CN104574473B (en) | Method and device for generating dynamic effect on basis of static image | |
CN103823651B (en) | The method and electronic equipment of a kind of information processing | |
CN109766530B (en) | Method and device for generating chart frame, storage medium and electronic equipment | |
JP2008059540A (en) | Image coloring device using computer | |
CN105892663A (en) | Information processing method and electronic device | |
KR20090122805A (en) | Mobile terminal capable of controlling operation using a proximity sensor and control method thereof | |
US7432939B1 (en) | Method and apparatus for displaying pixel images for a graphical user interface | |
CN106325745A (en) | Screen capturing method and device | |
JP2014527234A (en) | User interface for drawing with electronic devices | |
JP2004239998A (en) | Display device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||