CN106933474A - Image blend processing method and processing device - Google Patents
- Publication number
- CN106933474A CN106933474A CN201511021796.XA CN201511021796A CN106933474A CN 106933474 A CN106933474 A CN 106933474A CN 201511021796 A CN201511021796 A CN 201511021796A CN 106933474 A CN106933474 A CN 106933474A
- Authority
- CN
- China
- Prior art keywords
- pressure
- brush
- touch
- parameter
- area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
Abstract
The disclosure relates to an image blend processing method and device. The method includes: creating a background base image and drawing a dynamic texture on it, with at least part of the dynamic texture in an opaque state; receiving an input touch event and obtaining its location parameter and pressure parameter; creating a brush tool and dynamically adjusting the brush tool's transparency increment according to the pressure parameter of the touch event; dynamically adjusting the brush tool's movement track according to the location parameter in the touch event; and, in the region the brush tool's movement track passes through, using the brush tool to increase the transparency of the dynamic texture and blending the dynamic texture with the background base image. By adjusting the transparency increment according to the pressing pressure and the erase track according to the pressing location, the disclosure simulates a realistic pressure-sensitive erasing operation, so that the visual feedback is more consistent with the user's operation.
Description
Technical field
This disclosure relates to the field of human-computer interaction, and in particular to an image blend processing method and an image blend processing device.
Background
With the rapid development of mobile communication technology, more and more game applications have appeared on touch terminals. One type of game involves making erasing moves with a finger on the screen, simulating the real-life process of removing a covering layer from a surface, for example scratch-card games or erasing handwriting with a rubber. At present, related game applications mainly use one of the following two technical schemes:
In the first scheme, when the finger slides on the screen, the covering picture in the finger's contact area is deleted, revealing the background picture beneath it. Regardless of the finger pressure during erasing, the transparency of the covering picture jumps directly from 0 to 100%; visually the covering picture is deleted completely in one pass, which makes the implementation feel mechanical and quite different from the real scene.
In the second scheme, a single erase pass can only increase the covering picture's transparency by a fixed amount, and only repeated erasing removes the covering picture completely. This scheme simulates a "light touch" in the erase process, but it cannot give different feedback for different user pressures: when the user erases forcefully, the result is the same as when touching lightly, which does not match the user's real operation.
In both erasing schemes, the covering layer is either wiped away directly in the finger's contact area, or must be rubbed back and forth repeatedly before it is fully removed. Neither accounts for the fact that, in a real scene, erasing generally responds to force, and different amounts of force produce different erasing effects; the result therefore feels stiff.
It should be noted that the information disclosed in the background section above is only intended to enhance understanding of the background of the disclosure, and may therefore include information that does not constitute prior art known to a person of ordinary skill in the art.
Summary of the invention
In view of some or all of the problems in the prior art, the disclosure provides an image blend processing method and an image blend processing device.
According to a first aspect of the embodiments of the disclosure, an image blend processing method is provided, including:
S10. Creating a background base image and drawing a dynamic texture on the background base image, with the dynamic texture in an opaque state;
S20. Receiving an input touch event, and obtaining the location parameter and pressure parameter in the touch event;
S30. Creating a brush tool, and dynamically adjusting the transparency increment of the brush tool according to the pressure parameter of the touch event;
S40. Dynamically adjusting the movement track of the brush tool according to the location parameter in the touch event;
S50. In the region the movement track of the brush tool passes through, using the brush tool to increase the transparency of the dynamic texture, and blending the dynamic texture with the background base image.
In an exemplary embodiment of the disclosure, step S30 further includes:
Dynamically adjusting the area of the brush tool according to the pressure parameter of the touch event.
In an exemplary embodiment of the disclosure, step S30 includes:
Judging whether the pressure parameter falls in a first pressure interval, a second pressure interval or a third pressure interval, where the maximum of the first pressure interval is less than the minimum of the second pressure interval, and the maximum of the second pressure interval is less than the minimum of the third pressure interval;
When the pressure parameter is in the first pressure interval, setting the brush tool to a first transparency increment;
When the pressure parameter is in the second pressure interval, setting the brush tool to a second transparency increment, the second transparency increment being higher than the first;
When the pressure parameter is in the third pressure interval, setting the brush tool to a third transparency increment, the third transparency increment being higher than the second.
In an exemplary embodiment of the disclosure:
When the pressure parameter is in the first pressure interval, setting the brush tool to a first area;
When the pressure parameter is in the second pressure interval, setting the brush tool to a second area, the second area being larger than the first;
When the pressure parameter is in the third pressure interval, setting the brush tool to a third area, the third area being larger than the second.
In an exemplary embodiment of the disclosure:
Steps S10 and S30 to S50 are performed by a transparency control module, and step S20 is performed by a touch receiver module;
The image blend processing method further includes:
S01. The touch receiver module registers touch events with the operating system of the touch terminal, so that the operating system passes a touch event to the touch receiver module when it is detected;
S02. The transparency control module registers a parameter notification event with the touch receiver module of the touch terminal, so that the touch receiver module, upon receiving a touch event, passes the location parameter and pressure parameter of the touch event to the transparency control module.
In an exemplary embodiment of the disclosure, the touch event includes a touch start event, a touch move event and a touch end event; the image blend processing method further includes:
S60. When the touch end event occurs, ending this image blend.
In an exemplary embodiment of the disclosure, the image blend processing method further includes:
During the image blend process, detecting whether all regions of the dynamic texture are completely transparent;
When all regions of the dynamic texture are completely transparent, ending the image blend.
According to a second aspect of the embodiments of the disclosure, an image blend processing device is also provided, applied to a touch terminal capable of pressure sensing. The image blend processing device includes:
An image creation unit, for creating a background base image and drawing a dynamic texture on the background base image, with the dynamic texture in an opaque state;
A touch receiver module, for receiving an input touch event and obtaining the location parameter and pressure parameter in the touch event;
A brush control unit, for creating a brush tool and dynamically adjusting the transparency increment of the brush tool according to the pressure parameter of the touch event;
A trajectory control unit, for dynamically adjusting the movement track of the brush tool according to the location parameter in the touch event;
An image blending unit, for using the brush tool, in the region the brush tool's movement track passes through, to increase the transparency of the dynamic texture, and blending the dynamic texture with the background base image.
In an exemplary embodiment of the disclosure, the brush control unit is further configured to dynamically adjust the area of the brush tool according to the pressure parameter of the touch event.
In an exemplary embodiment of the disclosure, dynamically adjusting the transparency increment of the brush tool according to the pressure parameter of the touch event includes:
Judging whether the pressure parameter falls in a first pressure interval, a second pressure interval or a third pressure interval, where the maximum of the first pressure interval is less than the minimum of the second pressure interval, and the maximum of the second pressure interval is less than the minimum of the third pressure interval;
When the pressure parameter is in the first pressure interval, setting the brush tool to a first transparency increment;
When the pressure parameter is in the second pressure interval, setting the brush tool to a second transparency increment, the second transparency increment being higher than the first;
When the pressure parameter is in the third pressure interval, setting the brush tool to a third transparency increment, the third transparency increment being higher than the second.
In an exemplary embodiment of the disclosure, dynamically adjusting the area of the brush tool according to the pressure parameter of the touch event includes:
When the pressure parameter is in the first pressure interval, setting the brush tool to a first area;
When the pressure parameter is in the second pressure interval, setting the brush tool to a second area, the second area being larger than the first;
When the pressure parameter is in the third pressure interval, setting the brush tool to a third area, the third area being larger than the second.
In an exemplary embodiment of the disclosure:
The image creation unit, brush control unit, trajectory control unit and image blending unit are packaged in a transparency control module;
The touch receiver module is further configured to register touch events with the operating system of the touch terminal, so that the operating system passes a touch event to the touch receiver module when it is detected;
The transparency control module is further configured to register a parameter notification event with the touch receiver module of the touch terminal, so that the touch receiver module, upon receiving a touch event, passes the location parameter and pressure parameter of the touch event to the transparency control module.
In an exemplary embodiment of the disclosure, the touch event includes a touch start event, a touch move event and a touch end event; the image blend processing device further includes:
A finishing control unit, for ending this image blend when the touch end event occurs.
In an exemplary embodiment of the disclosure, the finishing control unit is further configured to, during the image blend process, detect whether all regions of the dynamic texture are completely transparent, and to end the image blend when all regions of the dynamic texture are completely transparent.
The image blend processing method and device in the embodiments of the disclosure are based on the screen's sensing of different pressures: the transparency increment is adjusted according to the pressing pressure parameter, and the erase track is adjusted according to the pressing location parameter. Intelligent feedback is thus given for different pressures, solving the problem of uniform feedback for different erasing styles in screen-erasing games. Pressure control is realized in the erase process, so that the real usage scene is better simulated, a realistic pressure-sensitive erasing operation is reproduced, and the feedback effect is more consistent with the user's operation.
It should be understood that the general description above and the detailed description below are merely exemplary and explanatory, and do not limit the disclosure.
Brief description of the drawings
The accompanying drawings herein are incorporated into and constitute part of this specification, illustrate embodiments consistent with the disclosure, and together with the specification serve to explain the principles of the disclosure. Obviously, the drawings in the following description are only some embodiments of the disclosure; for a person of ordinary skill in the art, other drawings can be obtained from them without creative effort.
Fig. 1 schematically shows a flowchart of an image blend processing method in an exemplary embodiment of the disclosure.
Fig. 2 schematically shows an operation interface of a game application in an exemplary embodiment of the disclosure.
Fig. 3 schematically shows an operation interface of a game application in an exemplary embodiment of the disclosure.
Fig. 4 schematically shows a flowchart of step S30 in an exemplary embodiment of the disclosure.
Fig. 5 schematically shows a flowchart of an image blend processing method in an exemplary embodiment of the disclosure.
Fig. 6 schematically shows a block diagram of an image blend processing device in an exemplary embodiment of the disclosure.
Fig. 7 schematically shows a block diagram of an image blend processing device in an exemplary embodiment of the disclosure.
Detailed description of the embodiments
Example embodiments will now be described more fully with reference to the accompanying drawings. However, example embodiments can be implemented in a variety of forms and should not be understood as limited to the examples set forth herein; rather, these embodiments are provided so that the disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, many specific details are provided to give a full understanding of the embodiments of the disclosure. However, those skilled in the art will realize that the technical scheme of the disclosure can be practiced while omitting one or more of the specific details, or with other methods, components, devices, steps, and so on. In other cases, well-known solutions are not shown or described in detail, to avoid obscuring aspects of the disclosure.
In addition, the drawings are merely schematic illustrations of the disclosure and are not necessarily drawn to scale. Identical reference numerals in the drawings denote identical or similar parts, and repeated description of them is omitted. Some of the blocks shown in the drawings are functional entities that do not necessarily correspond to physically or logically independent entities. These functional entities may be implemented in software, in one or more hardware modules or integrated circuits, or in different network and/or processor devices and/or microcontroller devices.
With the development of electronic technology, touch terminals capable of pressure sensing have been realized, bringing new manipulation and input modes to users. For example, Huawei and Apple both released pressure-sensing touch smartphones in 2015. Such a touch terminal can not only sense a user's touch operation as input, but can also perceive the magnitude, position and duration of pressure, so that pressure can serve as an input on its own, or be combined with other input modes as input to the touch terminal, bringing convenience and interest to the user's operations.
This example embodiment first provides an image blend processing method, which can be applied to a touch terminal capable of pressure sensing as described above. The touch terminal may be any electronic device with a touch screen, such as a mobile phone, tablet computer, notebook computer, game machine or PDA. It should also be recognized that some non-touch terminals can model keyboard and mouse operations as touch operations through means such as simulators; such a device can likewise be regarded as a touch terminal in the sense of this disclosure. With reference to Fig. 1, Fig. 2 and Fig. 3, the image blend processing method may include the following steps:
S10. Creating a background base image and drawing a dynamic texture on the background base image, with the dynamic texture in an opaque state.
With reference to Fig. 2, a game application controls the touch screen of the touch terminal through the terminal's application programming interface (API) to display a game operation interface 1. In this example embodiment, operation interface 1 may occupy the entire displayable area of the touch terminal, i.e. full-screen display, or only part of it, i.e. windowed display. Operation interface 1 includes at least the created background base image 101 and the dynamic texture 102 drawn on it. In addition, operation interface 1 may also include other parts such as control buttons, information panels and explanatory text. In this example embodiment the dynamic texture is in an opaque state; it may cover the background base image completely, or cover only part of it and leave the rest of the background base image exposed.
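The layering in step S10 can be sketched as follows — a minimal Python illustration under assumed conventions (pixel grids as nested lists, 0-255 alpha), not the patent's implementation. The background base image is plain RGB; the dynamic texture is RGBA and starts fully opaque, so the background is hidden:

```python
# Hypothetical sketch of step S10: a background base image plus an opaque
# dynamic texture layer drawn over it. Names and values are illustrative.

def create_layers(width, height):
    # Background base image: flat RGB pixels (the picture hidden underneath).
    background = [[(60, 120, 180) for _ in range(width)] for _ in range(height)]
    # Dynamic texture: RGBA pixels with alpha 255, i.e. fully opaque cover.
    texture = [[(200, 200, 200, 255) for _ in range(width)] for _ in range(height)]
    return background, texture

background, texture = create_layers(4, 3)
# Every texel starts opaque, so the background is completely covered.
assert all(px[3] == 255 for row in texture for px in row)
```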
S20. Receiving an input touch event, and obtaining the location parameter and pressure parameter in the touch event.
Touch events are detected periodically. A touch event may be a simple slide, a simple press, or a press combined with a slide performed by the user on operation interface 1. According to the pressure value of the pressing operation, presses can be divided into multiple grades, for example heavy press, light press and extremely light press (which can be regarded as no press); depending on the sensitivity of the pressure sensing module, finer divisions are also possible. In this example embodiment, when a sliding erase operation with pressing force is performed, a series of location parameters in the sliding touch event can be obtained, for example the erase track data, together with the pressure parameters, for example the specific pressure value at each position.
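The data obtained in step S20 can be pictured as a series of samples, each carrying a location and a pressure reading. The following sketch uses hypothetical names (`TouchSample`, the field names, the units) purely for illustration:

```python
from dataclasses import dataclass

# Illustrative shape of a press-and-slide gesture's data (assumed names):
# the positions trace the erase track, each with its own pressure value.

@dataclass
class TouchSample:
    x: float
    y: float
    pressure: float  # e.g. in newtons, as reported by the pressure sensor

track = [TouchSample(10.0, 20.0, 0.4),
         TouchSample(12.5, 21.0, 1.2),
         TouchSample(15.0, 23.5, 2.1)]

erase_track = [(s.x, s.y) for s in track]   # location parameters
pressures = [s.pressure for s in track]     # pressure parameters
```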
S30. Creating a brush tool, and dynamically adjusting the transparency increment of the brush tool according to the pressure parameter of the touch event.
When the user performs a touch operation on the touch screen, a brush tool is created immediately. In this example embodiment, the brush tool is an image processing control with a preset shape and color, for example a circular white texture brush. Note that the brush tool may appear to the user with a specific shape, or may not appear to the user at all; this is not specially limited in this example embodiment. The pressure parameter of the touch event was obtained in step S20 above; in this step, the transparency increment of the created brush tool can be adjusted based on that pressure parameter. By increasing the transparency of the dynamic texture, the corresponding pixels of the dynamic texture visually appear to be erased. The higher the transparency increment, the more transparent the dynamic texture becomes and the more clearly the exposed background base image shows through. Therefore, in this example embodiment, the larger the pressure parameter of the touch operation, the larger the transparency increment.
S40. Dynamically adjusting the movement track of the brush tool according to the location parameter in the touch event.
The location parameter of the touch event was obtained in step S20 above, so the movement track of brush tool 301 can be adjusted according to that location parameter to simulate the erase track of the finger.
S50. In the region the movement track of the brush tool passes through, using the brush tool to increase the transparency of the dynamic texture, and blending the dynamic texture with the background base image.
As stated above, after the erase operation of brush tool 301, the transparency of the dynamic texture increases; blending the dynamic texture with the background base image then produces an effect close to real erasing. Throughout the erase operation, the transparency increment of each erase pass is not fixed by the system but adjusted in real time according to the user's pressing force, so that the feedback of the erasing is consistent with the user's operation.
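The blending in step S50 amounts to standard per-pixel "source over" compositing of the dynamic texture over the background base image. A minimal sketch, assuming a 0-255 alpha convention and hypothetical names (the patent does not specify the blend formula):

```python
# Illustrative per-pixel blend: the dynamic texture's alpha decides how much
# of the background base image shows through. Names/conventions are assumed.

def blend_pixel(bg_rgb, tex_rgba):
    r, g, b, a = tex_rgba
    cover = a / 255.0  # 255 = fully opaque cover, 0 = fully erased
    return tuple(round(cover * c + (1.0 - cover) * bc)
                 for c, bc in zip((r, g, b), bg_rgb))

# An opaque texel hides the background completely ...
assert blend_pixel((0, 0, 255), (200, 200, 200, 255)) == (200, 200, 200)
# ... and a fully transparent texel reveals it completely.
assert blend_pixel((0, 0, 255), (200, 200, 200, 0)) == (0, 0, 255)
```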
The touch event may include a touch start event (for example, the finger starts to contact the touch screen), a touch move event (for example, the finger moves on the touch screen) and a touch end event (for example, the finger leaves the touch screen); the image blend processing method may also include:
S60. When the touch end event occurs, ending this image blend.
Further, in this example embodiment, step S30 may also include: dynamically adjusting the area of the brush tool according to the pressure parameter of the touch event. The area of brush tool 301 corresponds to the extent of each erase pass. The area of the brush tool created at the beginning may be the system default or the area used in the user's last operation, but the area of brush tool 301 can be adjusted in real time with the pressure parameter of the touch event, so that the simulated effect of brush tool 301 matches the erasing effect of the user's finger more closely.
In addition, the image blend processing method may also include: during the image blend process, detecting whether all regions of the dynamic texture are completely transparent, and, when all regions of the dynamic texture are completely transparent, ending the image blend and completing the entire erase process. When partial regions of the dynamic texture are detected to be not yet fully transparent, steps S20 to S60 above are repeated, until all regions of the dynamic texture are detected to be completely transparent.
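The completion test above reduces to checking that every texel's alpha has reached zero. A minimal sketch with assumed names and the same nested-list pixel convention as before:

```python
# Illustrative completion check: the erase loop (S20-S60) repeats until every
# texel of the dynamic texture is completely transparent. Names are assumed.

def fully_erased(texture):
    return all(px[3] == 0 for row in texture for px in row)

covered = [[(200, 200, 200, 255), (200, 200, 200, 0)]]
cleared = [[(200, 200, 200, 0), (200, 200, 200, 0)]]
assert not fully_erased(covered)  # one texel still opaque: keep looping
assert fully_erased(cleared)      # all transparent: end the image blend
```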
The image blend processing method in the embodiments of the disclosure is based on the screen's sensing of different pressures: the transparency increment is adjusted according to the pressing pressure parameter, and the erase track is adjusted according to the pressing location parameter. Intelligent feedback is thus given for different pressures, solving the problem of uniform feedback for different erasing styles in screen-erasing games and realizing pressure control in the erase process, so that the real usage scene is better simulated, a realistic pressure-sensitive erasing operation is reproduced, and the feedback effect is more consistent with the user's operation.
With reference to Fig. 4, step S30 may include:
S31. Judging whether the pressure parameter of the touch event falls in a first pressure interval, a second pressure interval or a third pressure interval. The maximum of the first pressure interval is less than the minimum of the second pressure interval, the maximum of the second pressure interval is less than the minimum of the third pressure interval, and the three intervals are contiguous. For example, the first pressure interval may be 0 to X1 newtons, the second X1 to X2 newtons, and the third X2 to X3 newtons.
S32. When the pressure parameter is in the first pressure interval, setting the brush tool to a first transparency increment.
S33. When the pressure parameter is in the second pressure interval, setting the brush tool to a second transparency increment, higher than the first; for example, the second transparency increment may be 1.5 times the first.
S34. When the pressure parameter is in the third pressure interval, setting the brush tool to a third transparency increment, higher than the second; for example, the third transparency increment may be 1.5 times the second.
In this embodiment only three pressure intervals are distinguished. It should be understood that more pressure intervals can also be distinguished, each corresponding to a different transparency increment: the larger the pressure values of the interval, the larger the corresponding transparency increment. It is equally possible to make each individual pressure value correspond to its own transparency increment. In a specific implementation this can be set as needed by the user, the touch terminal manufacturer or the game service provider, and is not specially limited in this example embodiment.
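The three-band scheme of steps S31 to S34 can be sketched as a simple mapping. The 1.5x ratio is the patent's own example; the function name, the thresholds and the base increment are assumptions for illustration:

```python
def transparency_increment(pressure, x1=1.0, x2=2.0, base=0.1):
    """Map a pressure value (newtons) to a transparency increment using three
    contiguous bands (0-x1, x1-x2, above x2). Following the example in
    S33/S34, each band's increment is 1.5 times the previous band's."""
    if pressure <= x1:
        return base                 # first pressure interval
    elif pressure <= x2:
        return base * 1.5           # second interval: 1.5x the first
    else:
        return base * 1.5 * 1.5     # third interval: 1.5x the second

# Heavier presses yield strictly larger increments, band by band.
assert (transparency_increment(2.5)
        > transparency_increment(1.5)
        > transparency_increment(0.5))
```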
In the exemplary embodiment corresponding to Fig. 4, each pressure interval corresponds to one transparency increment; in this illustrative embodiment, each pressure interval can also correspond to one brush tool area, for example:
When the pressure parameter is in the first pressure interval, setting the brush tool to a first area.
When the pressure parameter is in the second pressure interval, setting the brush tool to a second area, larger than the first; for example, the second area may be 1.5 times the first.
When the pressure parameter is in the third pressure interval, setting the brush tool to a third area, larger than the second; for example, the third area may be 1.5 times the second.
As the pressure parameter of the touch operation increases, the area of the brush tool also grows. Combined with the transparency increment scheme above, the effect produced is that the larger the pressure parameter of the touch operation, the more obvious the erasing effect, i.e. the feedback of the game application conforms more closely to the user's operation.
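The two schemes combine naturally when the brush is stamped along the track: a heavier press stamps a larger circle and subtracts more alpha per pass. The sketch below is an assumption-laden illustration (square pixel grid, circular brush, hypothetical names), not the patent's rendering code:

```python
# Illustrative brush stamp: reduce the alpha of every texel within `radius`
# of the stamp centre by `alpha_decrease`. With pressure, both the radius
# (brush area) and the decrease (transparency increment) grow together.

def stamp(texture, cx, cy, radius, alpha_decrease):
    for y, row in enumerate(texture):
        for x, (r, g, b, a) in enumerate(row):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                row[x] = (r, g, b, max(0, a - alpha_decrease))

texture = [[(200, 200, 200, 255) for _ in range(5)] for _ in range(5)]
# A light press: small brush, small increment ...
stamp(texture, 2, 2, radius=1, alpha_decrease=60)
# ... versus a heavy press on the same spot: bigger brush, bigger increment.
stamp(texture, 2, 2, radius=2, alpha_decrease=135)
assert texture[2][2][3] == 255 - 60 - 135  # centre hit by both stamps
assert texture[2][0][3] == 255 - 135       # edge reached only by the heavy press
```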
Steps S10, S30, S40 and S50 may be performed by a transparency control module, and step S20 by a touch receiver module. With reference to Fig. 5, the image blend processing method may also include:
S01. The touch receiver module registers touch events with the operating system of the touch terminal, so that the operating system passes a touch event to the touch receiver module when it is detected.
The touch receiver module registers touch events with the operating system in advance; the touch events may include touch start events, touch move events and touch end events. By informing the operating system in advance of the touch events that may occur, the operating system will notify the touch receiver module whenever a relevant touch event occurs.
S02. The transparency control module registers a parameter notification event with the touch receiver module of the touch terminal, so that when the touch receiver module receives a touch event, it inputs the location parameter and pressure parameter in the touch event to the transparency control module.
Similarly, the transparency control module notifies the touch receiver module in advance of the parameters it requires, so that once the touch receiver module obtains a touch event, it can pass the relevant parameter information to the transparency control module, allowing the transparency control module to perform the corresponding control processing.
For example, during gameplay, the touch module receives touch events from the operating system, converts them into game-related touch notification events, and notifies the transparency control module.
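Steps S01 and S02 together form a two-level observer pattern. A minimal sketch follows; all class and method names are invented for illustration, since the disclosure does not name an API.

```python
class TouchReceiver:
    """Stands in for the touch receiver module of steps S01/S02."""

    def __init__(self):
        self.listeners = []

    def register(self, callback):
        # step S02: the transparency control module registers the
        # parameters it needs (here: event type, position, pressure)
        self.listeners.append(callback)

    def on_os_touch_event(self, event):
        # step S01: the OS delivers a registered touch event; forward
        # the location and pressure parameters to every listener
        for callback in self.listeners:
            callback(event["type"], event["position"], event["pressure"])

received = []
receiver = TouchReceiver()
receiver.register(lambda kind, pos, pressure: received.append((kind, pos, pressure)))
receiver.on_os_touch_event({"type": "move", "position": (10, 20), "pressure": 0.5})
print(received)  # [('move', (10, 20), 0.5)]
```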
If the transparency control module receives a touch start event, it sets the initial position of the image brush tool according to the initial position parameter in the touch start event.
If the transparency control module receives a touch move event, it sets the area and transparency increment of the image brush tool according to the pressure parameter in the touch move event, and draws the image brush tool onto the dynamic texture between the position parameter in the touch move event and the initial position, forming the erase process.
If the transparency control module receives a touch end event, it marks the erase operation as completed and determines whether the game background base map has been fully erased clear (i.e., the game background base map has been fully revealed, or the dynamic texture is fully transparent); if so, the erase operation ends.
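The three handlers above (touch start, touch move, touch end) can be sketched as follows. The cell-based texture model, the pressure threshold, and the increments are simplifying assumptions; a real implementation would rasterize the brush along the whole stroke path.

```python
class TransparencyControl:
    """Alpha convention: 0.0 = fully opaque cover, 1.0 = fully
    transparent (background base map revealed at that cell)."""

    def __init__(self, cells):
        self.alpha = {c: 0.0 for c in cells}
        self.origin = None
        self.erase_done = False

    def on_touch_start(self, position):
        # touch start event: set the brush tool's initial position
        self.origin = position

    def on_touch_move(self, position, pressure):
        # touch move event: pressure selects the transparency increment,
        # and the brush is drawn between the previous and new position
        increment = 0.1 if pressure < 0.5 else 0.3   # illustrative intervals
        for cell in (self.origin, position):          # simplified stroke path
            if cell in self.alpha:
                self.alpha[cell] = min(1.0, self.alpha[cell] + increment)
        self.origin = position

    def on_touch_end(self):
        # touch end event: erasing is complete once the whole
        # dynamic texture is fully transparent
        self.erase_done = all(a >= 1.0 for a in self.alpha.values())

ctrl = TransparencyControl(cells=[(0, 0), (0, 1)])
ctrl.on_touch_start((0, 0))
for _ in range(4):                       # four firm strokes over both cells
    ctrl.on_touch_move((0, 1), pressure=0.9)
    ctrl.on_touch_move((0, 0), pressure=0.9)
ctrl.on_touch_end()
print(ctrl.erase_done)  # True
```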
Further, this exemplary embodiment also provides an image blend processing apparatus, applied to a touch terminal capable of pressure sensing. With reference to Fig. 6, the image blend processing apparatus 6 may include an image creation unit 61, a touch receiver module 62, a brush control unit 63, a trajectory control unit 64, and an image blending unit 65. Wherein:
The image creation unit 61 is mainly configured to create a background base map and draw a dynamic texture on the background base map, the dynamic texture being in an opaque state.
The touch receiver module 62 is mainly configured to receive an input touch event, and to obtain the location parameter and pressure parameter in the touch event.
The brush control unit 63 is mainly configured to create an image brush tool, and to dynamically adjust the transparency increment of the image brush tool according to the pressure parameter of the touch event.
The trajectory control unit 64 is mainly configured to dynamically adjust the motion track of the image brush tool according to the location parameter of the touch event.
The image blending unit 65 is mainly configured to use the image brush tool to increase the transparency of the dynamic texture in the region that the motion track of the image brush tool passes through, and to blend the dynamic texture with the background base map.
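The blending performed by the image blending unit 65 can be sketched as standard per-pixel alpha compositing of the dynamic texture over the background base map; erasing lowers the texture's opacity so more of the base map shows through. This is the generic compositing formula, not necessarily the patented implementation.

```python
def blend_pixel(background, texture, texture_opacity):
    """Composite one RGB texture pixel over a background pixel.
    texture_opacity: 1.0 = fully opaque (base map hidden),
    0.0 = fully transparent (base map fully revealed by erasing)."""
    return tuple(
        round(t * texture_opacity + b * (1.0 - texture_opacity))
        for b, t in zip(background, texture)
    )

# a half-erased black cover over one background pixel
print(blend_pixel((200, 100, 50), (0, 0, 0), 0.5))  # (100, 50, 25)
```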
In one exemplary embodiment, the brush control unit 63 may further be configured to dynamically adjust the area of the image brush tool according to the pressure parameter of the touch event.
Wherein, the process by which the brush control unit 63 dynamically adjusts the transparency increment of the image brush tool according to the pressure parameter of the touch event may include:
Determining whether the pressure parameter is in a first pressure interval, a second pressure interval, or a third pressure interval; the maximum value of the first pressure interval is less than the minimum value of the second pressure interval, and the maximum value of the second pressure interval is less than the minimum value of the third pressure interval.
When the pressure parameter is in the first pressure interval, the image brush tool is adjusted to a first transparency increment.
When the pressure parameter is in the second pressure interval, the image brush tool is adjusted to a second transparency increment; the second transparency increment is higher than the first transparency increment.
When the pressure parameter is in the third pressure interval, the image brush tool is adjusted to a third transparency increment; the third transparency increment is higher than the second transparency increment.
Wherein, the process by which the brush control unit 63 dynamically adjusts the area of the image brush tool according to the pressure parameter of the touch event may include:
When the pressure parameter is in the first pressure interval, the image brush tool is adjusted to a first area.
When the pressure parameter is in the second pressure interval, the image brush tool is adjusted to a second area; the second area is larger than the first area.
When the pressure parameter is in the third pressure interval, the image brush tool is adjusted to a third area; the third area is larger than the second area.
With continued reference to Fig. 6, the touch event includes a touch start event, a touch move event, and a touch end event; the image blend processing apparatus 6 may further include:
A finishing control unit 66, configured to terminate the current image blend when the touch end event occurs.
Wherein, the finishing control unit 66 may further be configured to, during the image blending, detect whether all regions of the dynamic texture have become completely transparent, and to terminate the image blend when all regions of the dynamic texture are completely transparent.
With reference to Fig. 7, the image creation unit 61, brush control unit 63, trajectory control unit 64, and image blending unit 65 are packaged in a transparency control module 71.
The touch receiver module 62 is further configured to register touch events with the operating system of the touch terminal, so that the operating system inputs a touch event to the touch receiver module when the touch event is detected.
The transparency control module 71 is further configured to register a parameter notification event with the touch receiver module of the touch terminal, so that when the touch receiver module receives a touch event, it inputs the location parameter and pressure parameter in the touch event to the transparency control module.
With the image blend processing apparatus provided by the present disclosure, when erasing on a game screen, different feedback effects can be given according to the different pressures the user applies to the touch terminal: a light touch adds a smaller transparency to the cover (the dynamic texture), while a firmer wipe adds more transparency, achieving a realistic erasing feedback effect.
In different scenes, more usage scenarios can be simulated through dynamic control of the transparency parameter. For example, for an easily erased cover such as a cloud, a light swipe can already add much transparency; for a hard-to-erase material such as the scratch-off award area of a scratch card, a light touch adds only a little transparency, and firmer wiping is needed to add more. This solves the problem in the related art that different feedback cannot be given for different user pressures.
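The scene-dependent behaviour above amounts to a per-material profile for the transparency increment. A sketch follows; the material names, threshold, and numbers are invented for illustration.

```python
# Per-material transparency increments; all values are illustrative.
MATERIALS = {
    "cloud":        {"light": 0.5,  "firm": 0.8},   # easy to erase
    "scratch_card": {"light": 0.05, "firm": 0.3},   # needs firm wiping
}

def transparency_increment(material, pressure):
    """Pick the per-stroke increment from the material profile."""
    profile = MATERIALS[material]
    return profile["light"] if pressure < 0.5 else profile["firm"]

# a light swipe clears a cloud much faster than a scratch-card cover
print(transparency_increment("cloud", 0.2))         # 0.5
print(transparency_increment("scratch_card", 0.2))  # 0.05
```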
The details of each module in the above image blend processing apparatus have already been described in detail in the corresponding image blend processing method, and are therefore not repeated here.
It should be noted that although the above detailed description refers to several modules or units of the device for action execution, this division is not mandatory. In fact, according to the embodiments of the present disclosure, the features and functions of two or more modules or units described above may be embodied in a single module or unit. Conversely, the features and functions of one module or unit described above may be further divided so as to be embodied by multiple modules or units.
In addition, although the steps of the method of the present disclosure are depicted in the accompanying drawings in a particular order, this does not require or imply that the steps must be performed in that particular order, or that all of the steps shown must be performed to achieve the desired result. Additionally or alternatively, some steps may be omitted, multiple steps may be merged into one step for execution, and/or one step may be decomposed into multiple steps for execution, etc.
Through the above description of the embodiments, those skilled in the art will readily understand that the example embodiments described here may be implemented in software, or in software combined with the necessary hardware. Therefore, the technical solution according to the embodiments of the present disclosure may be embodied in the form of a software product, which may be stored in a non-volatile storage medium (which may be a CD-ROM, a USB flash drive, a removable hard disk, etc.) or on a network, and which includes several instructions to cause a computing device (which may be a personal computer, a server, a touch terminal, a network device, etc.) to perform the method according to the embodiments of the present disclosure.
Those skilled in the art will easily conceive of other embodiments of the present disclosure after considering the specification and practicing the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the present disclosure that follow the general principles of the present disclosure and include common knowledge or customary technical means in the art not disclosed by the present disclosure. The specification and embodiments are to be considered exemplary only, with the true scope and spirit of the present disclosure indicated by the following claims.
It should be understood that the present disclosure is not limited to the precise structures described above and shown in the accompanying drawings, and that various modifications and changes may be made without departing from its scope. The scope of the present disclosure is limited only by the appended claims.
Claims (14)
1. An image blend processing method, applied to a touch terminal capable of pressure sensing, characterized in that the image blend processing method comprises:
S10. creating a background base map and drawing a dynamic texture on the background base map, at least part of the dynamic texture being in an opaque state;
S20. receiving an input touch event, and obtaining a location parameter and a pressure parameter in the touch event;
S30. creating an image brush tool, and dynamically adjusting a transparency increment of the image brush tool according to the pressure parameter of the touch event;
S40. dynamically adjusting a motion track of the image brush tool according to the location parameter in the touch event;
S50. in the region that the motion track of the image brush tool passes through, using the image brush tool to increase the transparency of the dynamic texture, and blending the dynamic texture with the background base map.
2. The image blend processing method according to claim 1, characterized in that the step S30 further comprises:
dynamically adjusting an area of the image brush tool according to the pressure parameter of the touch event.
3. The image blend processing method according to claim 2, characterized in that the step S30 comprises:
determining whether the pressure parameter is in a first pressure interval, a second pressure interval, or a third pressure interval; the maximum value of the first pressure interval being less than the minimum value of the second pressure interval, and the maximum value of the second pressure interval being less than the minimum value of the third pressure interval;
when the pressure parameter is in the first pressure interval, adjusting the image brush tool to a first transparency increment;
when the pressure parameter is in the second pressure interval, adjusting the image brush tool to a second transparency increment, the second transparency increment being higher than the first transparency increment;
when the pressure parameter is in the third pressure interval, adjusting the image brush tool to a third transparency increment, the third transparency increment being higher than the second transparency increment.
4. The image blend processing method according to claim 3, characterized in that:
when the pressure parameter is in the first pressure interval, the image brush tool is adjusted to a first area;
when the pressure parameter is in the second pressure interval, the image brush tool is adjusted to a second area, the second area being larger than the first area;
when the pressure parameter is in the third pressure interval, the image brush tool is adjusted to a third area, the third area being larger than the second area.
5. The image blend processing method according to claim 1, characterized in that:
the step S10 and the steps S30 to S50 are performed by a transparency control module, and the step S20 is performed by a touch receiver module;
the image blend processing method further comprises:
S01. the touch receiver module registering touch events with the operating system of the touch terminal, so that the operating system inputs a touch event to the touch receiver module when the touch event is detected;
S02. the transparency control module registering a parameter notification event with the touch receiver module of the touch terminal, so that when the touch receiver module receives the touch event, it inputs the location parameter and pressure parameter in the touch event to the transparency control module.
6. The image blend processing method according to any one of claims 1 to 5, characterized in that the touch event comprises a touch start event, a touch move event, and a touch end event; the image blend processing method further comprises:
S60. terminating the current image blend when the touch end event occurs.
7. The image blend processing method according to claim 6, characterized in that the image blend processing method further comprises:
during the image blending, detecting whether all regions of the dynamic texture have become completely transparent;
terminating the image blend when all regions of the dynamic texture are completely transparent.
8. An image blend processing apparatus, applied to a touch terminal capable of pressure sensing, characterized in that the image blend processing apparatus comprises:
an image creation unit, configured to create a background base map and draw a dynamic texture on the background base map, the dynamic texture being in an opaque state;
a touch receiver module, configured to receive an input touch event, and to obtain a location parameter and a pressure parameter in the touch event;
a brush control unit, configured to create an image brush tool, and to dynamically adjust a transparency increment of the image brush tool according to the pressure parameter of the touch event;
a trajectory control unit, configured to dynamically adjust a motion track of the image brush tool according to the location parameter in the touch event;
an image blending unit, configured to use the image brush tool to increase the transparency of the dynamic texture in the region that the motion track of the image brush tool passes through, and to blend the dynamic texture with the background base map.
9. The image blend processing apparatus according to claim 8, characterized in that the brush control unit is further configured to dynamically adjust an area of the image brush tool according to the pressure parameter of the touch event.
10. The image blend processing apparatus according to claim 9, characterized in that dynamically adjusting the transparency increment of the image brush tool according to the pressure parameter of the touch event comprises:
determining whether the pressure parameter is in a first pressure interval, a second pressure interval, or a third pressure interval; the maximum value of the first pressure interval being less than the minimum value of the second pressure interval, and the maximum value of the second pressure interval being less than the minimum value of the third pressure interval;
when the pressure parameter is in the first pressure interval, adjusting the image brush tool to a first transparency increment;
when the pressure parameter is in the second pressure interval, adjusting the image brush tool to a second transparency increment, the second transparency increment being higher than the first transparency increment;
when the pressure parameter is in the third pressure interval, adjusting the image brush tool to a third transparency increment, the third transparency increment being higher than the second transparency increment.
11. The image blend processing apparatus according to claim 10, characterized in that dynamically adjusting the area of the image brush tool according to the pressure parameter of the touch event comprises:
when the pressure parameter is in the first pressure interval, adjusting the image brush tool to a first area;
when the pressure parameter is in the second pressure interval, adjusting the image brush tool to a second area, the second area being larger than the first area;
when the pressure parameter is in the third pressure interval, adjusting the image brush tool to a third area, the third area being larger than the second area.
12. The image blend processing apparatus according to claim 8, characterized in that:
the image creation unit, the brush control unit, the trajectory control unit, and the image blending unit are packaged in a transparency control module;
the touch receiver module is further configured to register touch events with the operating system of the touch terminal, so that the operating system inputs a touch event to the touch receiver module when the touch event is detected;
the transparency control module is further configured to register a parameter notification event with the touch receiver module of the touch terminal, so that when the touch receiver module receives the touch event, it sends the location parameter and pressure parameter in the touch event to the transparency control module.
13. The image blend processing apparatus according to any one of claims 8 to 12, characterized in that the touch event comprises a touch start event, a touch move event, and a touch end event; the image blend processing apparatus further comprises:
a finishing control unit, configured to terminate the current image blend when the touch end event occurs.
14. The image blend processing apparatus according to claim 13, characterized in that the finishing control unit is further configured to, during the image blending, detect whether all regions of the dynamic texture have become completely transparent, and to terminate the image blend when all regions of the dynamic texture are completely transparent.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201511021796.XA CN106933474B (en) | 2015-12-30 | 2015-12-30 | Image mixing processing method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201511021796.XA CN106933474B (en) | 2015-12-30 | 2015-12-30 | Image mixing processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106933474A true CN106933474A (en) | 2017-07-07 |
CN106933474B CN106933474B (en) | 2020-02-04 |
Family
ID=59441092
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201511021796.XA Active CN106933474B (en) | 2015-12-30 | 2015-12-30 | Image mixing processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106933474B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108733293A (en) * | 2018-06-11 | 2018-11-02 | 广州视源电子科技股份有限公司 | writing track processing method and device |
CN110083302A (en) * | 2019-04-30 | 2019-08-02 | 维沃移动通信有限公司 | A kind of method, apparatus and terminal executing predetermined registration operation |
WO2020015724A1 (en) * | 2018-07-20 | 2020-01-23 | 华为技术有限公司 | Picture acquisition method, and picture processing method and device |
CN110910307A (en) * | 2019-11-29 | 2020-03-24 | 珠海豹趣科技有限公司 | Image processing method, device, terminal and storage medium |
CN111221461A (en) * | 2019-11-19 | 2020-06-02 | 北京字节跳动网络技术有限公司 | Method, apparatus, device and storage medium for gradually presenting image presentation content |
WO2023193663A1 (en) * | 2022-04-06 | 2023-10-12 | 华为技术有限公司 | Stylus and terminal device |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102681678A (en) * | 2011-03-08 | 2012-09-19 | 汉王科技股份有限公司 | Electromagnetic pen with pressure-sensing erasing function and realization method thereof |
CN102939575A (en) * | 2010-06-14 | 2013-02-20 | 微软公司 | Ink rendering |
US9027153B2 (en) * | 2013-03-15 | 2015-05-05 | Google Technology Holdings LLC | Operating a computer with a touchscreen |
CN105183349A (en) * | 2015-08-27 | 2015-12-23 | 广东欧珀移动通信有限公司 | Image editing tool display method and mobile terminal |
-
2015
- 2015-12-30 CN CN201511021796.XA patent/CN106933474B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102939575A (en) * | 2010-06-14 | 2013-02-20 | 微软公司 | Ink rendering |
CN102681678A (en) * | 2011-03-08 | 2012-09-19 | 汉王科技股份有限公司 | Electromagnetic pen with pressure-sensing erasing function and realization method thereof |
US9027153B2 (en) * | 2013-03-15 | 2015-05-05 | Google Technology Holdings LLC | Operating a computer with a touchscreen |
CN105183349A (en) * | 2015-08-27 | 2015-12-23 | 广东欧珀移动通信有限公司 | Image editing tool display method and mobile terminal |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108733293A (en) * | 2018-06-11 | 2018-11-02 | 广州视源电子科技股份有限公司 | writing track processing method and device |
WO2020015724A1 (en) * | 2018-07-20 | 2020-01-23 | 华为技术有限公司 | Picture acquisition method, and picture processing method and device |
US11302286B2 (en) | 2018-07-20 | 2022-04-12 | Huawei Technologies Co., Ltd. | Picture obtaining method and apparatus and picture processing method and apparatus |
CN110083302A (en) * | 2019-04-30 | 2019-08-02 | 维沃移动通信有限公司 | A kind of method, apparatus and terminal executing predetermined registration operation |
CN111221461A (en) * | 2019-11-19 | 2020-06-02 | 北京字节跳动网络技术有限公司 | Method, apparatus, device and storage medium for gradually presenting image presentation content |
CN110910307A (en) * | 2019-11-29 | 2020-03-24 | 珠海豹趣科技有限公司 | Image processing method, device, terminal and storage medium |
CN110910307B (en) * | 2019-11-29 | 2023-09-22 | 珠海豹趣科技有限公司 | Image processing method, device, terminal and storage medium |
WO2023193663A1 (en) * | 2022-04-06 | 2023-10-12 | 华为技术有限公司 | Stylus and terminal device |
Also Published As
Publication number | Publication date |
---|---|
CN106933474B (en) | 2020-02-04 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106933474A (en) | Image blend processing method and processing device | |
Xia et al. | Object-oriented drawing | |
CN103365595B (en) | Gesture for touch sensitive input devices | |
Akaoka et al. | DisplayObjects: prototyping functional physical interfaces on 3d styrofoam, paper or cardboard models | |
CN104718528B (en) | Determine the method, apparatus and terminal device of the color of interface control | |
CN104737096B (en) | Display device | |
CN106933526A (en) | A kind of method of dynamic regulation screen refresh rate, device and mobile terminal | |
CN101315586B (en) | Electronic pen for interactive electronic white board and interaction control method thereof | |
CN102939575B (en) | Ink presents | |
CN104254831B (en) | System and method for visual interface content to be presented | |
CN104199550B (en) | Virtual keyboard operation device, system and method | |
CN106952235B (en) | A kind of image processing method and mobile terminal | |
CN104423836B (en) | Information processing unit | |
CN102096548A (en) | Method and system for duplicating an object using a touch-sensitive display | |
CN109254766A (en) | Visual programming platform and two-dimentional drawing three-dimensional visualization method based on mobile terminal | |
CN103092518A (en) | Moving cloud desktop accurate touch method based on remote desktop protocol (RDP) | |
CN107533426A (en) | The method and apparatus that simulation is used using numeral flag device, touch-control input and low-power reflected displaying device wall chart | |
US9075438B2 (en) | Systems and related methods involving stylus tactile feel | |
CN106569708B (en) | Method and terminal for realizing pressing simulation display | |
CN102096551B (en) | Method and device for implementing multi-pointer device operation in same work window | |
CN106855772A (en) | A kind of information displaying method and device | |
CN107454321A (en) | A kind of image pickup method, mobile terminal and computer-readable recording medium | |
CN105843539A (en) | Information processing method and electronic device | |
CN103064619A (en) | Method and system for performing slingshot unlocking on touch screen mobile device | |
CN107621910A (en) | Keyboard pad and terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||