CN115344187A - Method and device for identifying brush style of Android drawing software - Google Patents

Publication number
CN115344187A
CN115344187A (application CN202210968412.9A)
Authority
CN
China
Prior art keywords: touch area, calculating, level, tmp, variance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210968412.9A
Other languages
Chinese (zh)
Other versions
CN115344187B (en)
Inventor
袁伟铨
程泉森
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Konka Electronic Technology Co Ltd
Original Assignee
Shenzhen Konka Electronic Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Konka Electronic Technology Co Ltd
Priority to CN202210968412.9A
Publication of CN115344187A
Application granted
Publication of CN115344187B
Active legal status
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures

Abstract

The present disclosure provides a method and a device for identifying a brush style of Android drawing software, wherein the method comprises the following steps: when a touch event of a target object is detected, acquiring the touch area of the target object; determining a target level corresponding to the target object according to the touch area, wherein the Android drawing software is provided with a plurality of levels, each level corresponds to a brush style, and the plurality of levels comprise the target level; and identifying the target brush style corresponding to the target level. The method and device solve the problem that the touch area standards reported by different touch frames in the related art are not uniform, allow drawing with different brush styles on the basis of touch area intervals, and avoid the limitation of having to use a customized stylus.

Description

Method and device for identifying brush style of Android drawing software
Technical Field
The invention relates to the technical field of computers, in particular to a method and a device for identifying a brush style of Android drawing software.
Background
With the popularization of commercial displays and touch televisions, more and more Android drawing software is being designed for their large screens. Drawing a line, and changing the color, thickness, or tip of the drawn line, are among the core functions of such software.
In the prior art, the color, thickness, pen tip, transparency, and so on of drawn lines can be modified, giving the drawn lines different "styles", but such modification usually applies globally and assumes a single user. However, large screens such as commercial displays and touch televisions have usage scenarios in which multiple users draw on the screen at the same time. In that case, every user operating simultaneously is given the same style. The user experience would be better if the user were given one more choice, so that each simultaneous user could draw lines in a style of their own.
To achieve this, a set of pens with different widths can be supplied when the commercial display or touch television is sold, and the built-in Android drawing software can compare the touch width obtained when a pen touches the screen against preset touch widths to distinguish the users operating simultaneously. The defect is obvious, however: the preset touch widths cannot be modified, and an infrared frame and a capacitive screen report different touch widths for the same physical width; infrared frames from different manufacturers report differently, and even different firmware versions of the same manufacturer's infrared frame report differently. A scheme that draws lines of different widths by supplying pens of different widths is therefore severely limited by hardware, cumbersome, and poor in user experience.
At present, no effective solution has been proposed to the problem that the touch area standards reported by different touch frames in the related art are not uniform.
Disclosure of Invention
The present disclosure aims to overcome the defects in the prior art, and provides a method and an apparatus for identifying a brush style of Android drawing software, so as to at least solve the problem that the touch area standards reported by different touch frames in the related art are not uniform.
According to one aspect of the disclosure, a method for identifying a brush style of Android drawing software is provided, which includes:
when a touch event of a target object is detected, acquiring a touch area of the target object;
determining a target level corresponding to the target object according to the touch area, wherein the android drawing software is provided with a plurality of levels, each level corresponds to a brush style, and the plurality of levels comprise the target level;
and identifying a target brush style corresponding to the target level.
According to another aspect of the present disclosure, there is provided an apparatus for identifying a brush style of Android drawing software, including:
a first acquiring unit, configured to acquire the touch area of a target object when a touch event of the target object is detected;
the determining unit is used for determining a target level corresponding to the target object according to the touch area, wherein the android drawing software is provided with a plurality of levels, each level corresponds to one brush style, and the plurality of levels comprise the target level;
and the identification unit is used for identifying the target brush style corresponding to the target level.
According to another aspect of the present disclosure, there is provided an electronic device including:
a processor; and
a memory for storing a program,
wherein the program comprises instructions which, when executed by the processor, cause the processor to perform the method of identifying the android drawing software brush style in the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer readable storage medium storing computer instructions for causing the computer to perform the method of identifying the android drawing software brush style in the present disclosure.
According to one or more technical solutions provided in the embodiments of the present disclosure, when a touch event of a target object is detected, the touch area of the target object is obtained; a target level corresponding to the target object is determined according to the touch area, wherein the Android drawing software is provided with a plurality of levels, each level corresponds to a brush style, and the plurality of levels comprise the target level; and the target brush style corresponding to the target level is identified. This solves the problem that the touch area standards reported by different touch frames in the related art are not uniform, allows drawing with different brush styles on the basis of touch area intervals, and avoids the limitation of having to use a customized stylus.
Drawings
Further details, features and advantages of the disclosure are disclosed in the following description of exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 shows a flowchart of a method for identifying a brush style of android drawing software according to an exemplary embodiment of the present disclosure;
FIG. 2 shows a schematic diagram of a multi-level calibration process according to an example embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of a multi-level style selection process according to an example embodiment of the present disclosure;
FIG. 4 shows a schematic diagram of a multi-stage pattern recognition process according to an example embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of an apparatus for recognition of a brush style of android drawing software according to an exemplary embodiment of the present disclosure;
FIG. 6 illustrates a block diagram of an exemplary electronic device that can be used to implement embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein, but rather are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the disclosure are for illustration purposes only and are not intended to limit the scope of the disclosure.
It should be understood that the various steps recited in method embodiments of the present disclosure may be performed in a different order, and/or performed in parallel. Moreover, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "include" and variations thereof as used herein are open-ended, i.e., "including but not limited to". The term "based on" is "based, at least in part, on". The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments". Relevant definitions for other terms will be given in the following description. It should be noted that the terms "first", "second", and the like in the present disclosure are only used for distinguishing different devices, modules or units, and are not used for limiting the order or interdependence relationship of the functions performed by the devices, modules or units.
It is noted that references to "a", "an", and "the" modifications in this disclosure are intended to be illustrative rather than limiting, and that those skilled in the art will recognize that "one or more" may be used unless the context clearly dictates otherwise.
The names of messages or information exchanged between devices in the embodiments of the present disclosure are for illustrative purposes only, and are not intended to limit the scope of the messages or information.
Aspects of the present disclosure are described below with reference to the accompanying drawings.
The exemplary embodiments of the present disclosure provide a method for identifying a brush style of Android drawing software. Fig. 1 is a flowchart illustrating a method for identifying a brush style of Android drawing software according to an exemplary embodiment of the present disclosure; as shown in Fig. 1, the method includes the following steps:
step S101, when a touch event of a target object is detected, acquiring a touch area of the target object;
step S102, determining a target level corresponding to the target object according to the touch area, wherein the android drawing software is provided with a plurality of levels, each level corresponds to a brush style, and the plurality of levels comprise the target level;
and step S103, identifying a target brush style corresponding to the target level.
Through the above steps, drawing in different brush styles is allowed on the basis of touch area intervals, the problem that different touch frames report touch areas to different standards is solved, and the limitation of having to use a customized stylus is avoided.
In some embodiments, the method for identifying the brush style of the android drawing software further comprises the following steps:
acquiring a touch area set of each level in the plurality of levels;
judging whether the touch area set of each level in the multiple levels meets a preset requirement or not;
and if the touch area set of each of the plurality of levels meets the preset requirement, calculating the boundary values of the touch area sets of adjacent levels.
In some embodiments, determining whether the set of touch areas of each of the plurality of levels meets a preset requirement includes:
judgment S n And S n-1 Whether the cross section of (a) exceeds the fault tolerance range, wherein S n Set of touch areas for n levels, S n-1 Is a set of touch areas of n-1 levels, n being an integer greater than 1;
if S n And S n-1 Does not exceed the fault-tolerant range, S n Whether the data in (1) is centralized;
if S n In the data set of (1), then S is determined n And the preset requirement is met.
In some of these embodiments, judging whether the intersection of S_n and S_{n-1} exceeds the fault-tolerance range includes:
obtaining the maximum value MAX_{n-1} of S_{n-1};
calculating the proportion of elements in S_n that are less than MAX_{n-1};
if the proportion is greater than a first threshold, determining that the intersection of S_n and S_{n-1} exceeds the fault-tolerance range;
if the proportion is less than or equal to the first threshold, determining that the intersection of S_n and S_{n-1} does not exceed the fault-tolerance range.
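This overlap check can be sketched in Python (an illustrative reconstruction, not code from the patent; the function and parameter names are invented for clarity, and the treatment of the lowest level follows the embodiments described later):

```python
def intersection_exceeds_tolerance(s_prev, s_cur, c1):
    """Return True when the touch area sets of two adjacent levels overlap
    beyond the fault-tolerance ratio c1 (the first threshold).

    s_prev is the set of level n-1, s_cur the set of level n. For level 1
    there is no lower set, so its maximum is taken as 0, as in the description.
    """
    max_prev = max(s_prev) if s_prev else 0
    # proportion of the current set's elements below the lower set's maximum
    ratio = sum(1 for a in s_cur if a < max_prev) / len(s_cur)
    return ratio > c1
```

For example, with c1 = 0.2, the sets [1, 2, 3] and [2.5, 4, 5, 6] fail the check (one of four elements of the upper set lies below the lower set's maximum), while [1, 2, 3] and [4, 5, 6, 7] pass.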
In some of these embodiments, judging whether the data in S_n are concentrated includes:
calculating the variance σ of S_n;
if the variance σ is greater than a second threshold, repeatedly executing the following steps until the variance σ' is less than the second threshold: calculating the variance σ_1 of S_n with its maximum removed and the variance σ_2 of S_n with its minimum removed; obtaining S_n' according to the relative magnitudes of σ_1 and σ_2, wherein S_n' is S_n with its maximum removed if σ_1 is less than σ_2, and S_n with its minimum removed if σ_2 is less than σ_1; calculating the variance σ' of S_n' and taking S_n' as the new S_n;
calculating the percentage of S_n's data discarded in turning S_n into S_n';
if the percentage is greater than the first threshold, determining that the data in S_n are not concentrated.
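A minimal sketch of this variance-driven reduction (illustrative Python; the function name, the (flag, set) return convention, and the use of population variance are assumptions not fixed by the claims):

```python
from statistics import pvariance

def reduce_until_concentrated(s, c1, c2):
    """Drop, one at a time, whichever extreme (max or min) lowers the
    variance more, until the variance is at most c2 (the second threshold).
    The data count as concentrated only if at most a c1 fraction of the
    original set had to be discarded; the caller keeps the original set."""
    cur = sorted(s)
    while len(cur) > 1 and pvariance(cur) > c2:
        without_max, without_min = cur[:-1], cur[1:]
        # keep whichever reduced set has the smaller variance
        cur = without_max if pvariance(without_max) < pvariance(without_min) else without_min
    discarded = (len(s) - len(cur)) / len(s)
    return discarded <= c1, cur
```

With [10, 10, 10, 10, 100] and c2 = 1.0, one removal of the outlier 100 drives the variance to zero; whether that counts as concentrated then depends only on whether the 20% discarded exceeds c1.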
In some embodiments, if the touch area set of each of the plurality of levels meets the preset requirement, calculating the boundary value of the touch area sets of adjacent levels includes:
calculating the average Avg_{n-1} of S_{n-1} and assigning Avg_{n-1} to P_left; calculating the average Avg_n of S_n and assigning Avg_n to P_right;
calculating P_tmp = (P_left + P_right) / 2;
calculating the ratio A_1 of data in S_{n-1} greater than P_tmp, and the ratio A_2 of data in S_n less than P_tmp;
if both A_1 and A_2 are less than or equal to the first threshold, assigning P_tmp to D_{n-1}, wherein D_{n-1} is the boundary value of S_{n-1} and S_n;
if A_1 is greater than the first threshold and/or A_2 is greater than the first threshold, taking P_left or P_right as P;
if P is greater than P_tmp, assigning P to P_right' and P_tmp to P_left';
if P is less than or equal to P_tmp, assigning P to P_left' and P_tmp to P_right';
calculating P_tmp' = (P_left' + P_right') / 2;
assigning P_tmp' to P_tmp, and returning to the step of calculating the ratio A_1 of data in S_{n-1} greater than P_tmp and the ratio A_2 of data in S_n less than P_tmp.
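The bisection loop these steps describe might look as follows (an illustrative Python sketch; the iteration cap is an added safety assumption, not part of the claims):

```python
def boundary_value(s_prev, s_cur, c1, max_iter=64):
    """Find the boundary D between the touch area sets of two adjacent levels
    by bisection: seek P_tmp such that at most a c1 fraction of s_prev lies
    above it and at most a c1 fraction of s_cur lies below it."""
    p_left = sum(s_prev) / len(s_prev)   # average of the lower level (Avg_{n-1})
    p_right = sum(s_cur) / len(s_cur)    # average of the higher level (Avg_n)
    p_tmp = (p_left + p_right) / 2
    for _ in range(max_iter):
        a1 = sum(1 for a in s_prev if a > p_tmp) / len(s_prev)
        a2 = sum(1 for a in s_cur if a < p_tmp) / len(s_cur)
        if a1 <= c1 and a2 <= c1:
            return p_tmp
        # Take the endpoint closer to the other set; after the earlier overlap
        # and concentration checks, at most one of the two ratios exceeds c1.
        p = p_right if a1 > c1 else p_left
        if p > p_tmp:
            p_left, p_right = p_tmp, p
        else:
            p_left, p_right = p, p_tmp
        p_tmp = (p_left + p_right) / 2
    return p_tmp
```

Well-separated sets such as [1, 2, 3] and [10, 11, 12] converge immediately to the midpoint of the averages; a straggler like 9 in the lower set pushes the boundary upward on the next iteration.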
In some embodiments, determining the target level corresponding to the target object according to the touch area includes:
judging which level interval the touch area falls in;
if the touch area is less than D_1, determining that the target level corresponding to the target object is level 1;
if the touch area is greater than or equal to D_{n-2} and less than D_{n-1}, determining that the target level corresponding to the target object is level n-1;
if the touch area is greater than or equal to D_{n-1}, determining that the target level corresponding to the target object is level n.
The embodiments of the disclosure can be divided into a multi-level calibration module, a multi-level style selection module, and a multi-level recognition module, all based on the acquisition and calculation of touch events.
In practical applications, usually only two levels are calibrated: level 1 is commonly called the thin pen and level 2 the thick pen. More levels can be enabled according to customer requirements, and the level names can also be set to the customer's preference. During calibration, a light gray transparent circle is shown in the middle of the screen to guide the user to touch it with different gestures; after each tap the circle appears in a different place, again guiding the user to touch with varied gestures. The number of touches per calibration round is also configurable, for example 10, or 30 if the customer requires it.
The multi-level style selection module can switch between modifying the brush style of level 1 or level 2; it can select the brush type, thickness, and color, and more style attributes can be added, such as transparency and granularity.
The specific implementation method of the embodiment of the disclosure is described as follows:
as shown in fig. 2, the implementation process of the multi-stage calibration module specifically includes:
1. Select the multi-level recognition level count N (the drawing is divided into N sets of different drawing styles according to the different brush widths used when drawing).
2. If N is 1, calibration is finished; if N is greater than or equal to 2, perform N rounds of data acquisition.
3. Each round of calibration corresponds to one level of data acquisition, with levels going from low to high, corresponding to widths going from thin to thick. The first round, level 1 data acquisition, is taken as the example below.
a. The software prompts the user that this is the level 1 calibration round and guides the user to make multiple touches with different gestures using the same object.
a-1. For a given object, touching with different gestures yields different touch areas. Taking a pen as an example, a different gesture can be understood as a different angle between the pen and the screen: the tip touching perpendicular to the screen gives one area, the tip at 45 degrees to the screen another, and the tip at 10 degrees yet another. Since users typically touch the screen with varying gestures, the resulting areas vary, but for the user these touches must all be classified into the same level, so data from touches with different gestures must be collected for the subsequent calculation.
b. The touch area set collected for level 1 is denoted S_1 (by analogy, that of level 2 is S_2, and that of level N is S_N).
4. After acquisition finishes, S_1, S_2, ..., S_N are obtained.
5. Judge whether S_1, S_2, ..., S_N meet the requirements: whether the data are concentrated enough, whether the intersection of adjacent sets exceeds the fault-tolerance range, and possibly other criteria.
6. If the requirements are not met, a friendly prompt pops up advising the user to use another object as the drawing tool or to reduce the multi-level recognition level count.
7. If the data meet the requirements, calculate a boundary value from each pair of adjacent sets, denoted D_{n-1}. Taking S_1 and S_2 as the example, the boundary value D_1 is calculated.
567-1. First, judge whether the intersection of the current set and the set one level lower exceeds the fault-tolerance range. The criterion: compute the proportion of elements in the current set S_n that are below the maximum value of the set one level lower (for S_1, this maximum is taken as 0). If the proportion is greater than C1 (the first threshold), the data are considered unsatisfactory and the process goes to step 6; otherwise, proceed to 567-2.
567-2. Judge whether the data are sufficiently concentrated.
a. Compute the variance σ of the current set; if σ is greater than C2 (the second threshold), the data need further reduction.
b. Compute the variance σ_1 of the set with its current maximum removed and the variance σ_2 of the set with its current minimum removed; compare σ_1 and σ_2 and take the smaller as the basis for reducing the set, obtaining a new set. For example, if σ_1 is smaller, the current maximum is removed to obtain the new, more concentrated set.
c. Compute the variance σ of the new set; if σ is greater than C2, the data need further reduction, so continue with b.
d. If σ is less than or equal to C2, the reduction is complete (in the worst case only one element remains, so condition d is always eventually met); proceed to e.
e. Compute the percentage of the original set's data removed in turning the original set into the reduced set. If the percentage is greater than C1, the data are considered unsatisfactory and the process goes to step 6; otherwise, proceed to step 7. (Note: step 7 uses the original set from before the reduction.)
567-3. Calculate the boundary value D_1:
a. Compute the average Avg_1 of set S_1 and assign it to P_left; compute the average Avg_2 of S_2 and assign it to P_right.
b. Obtain P_tmp = (P_left + P_right) / 2.
c. Compute the ratio of data in S_1 greater than P_tmp and the ratio of data in S_2 less than P_tmp. If both ratios are less than or equal to C1, assign P_tmp to D_1, obtaining the boundary value D_1. If not, go to d and find D_1 by bisection.
d. If one ratio is greater than C1, take as P the endpoint (P_left or P_right) closer to the other set for subsequent processing. If P is greater than P_tmp, assign P to P_right' and P_tmp to P_left'; otherwise, assign P to P_left' and P_tmp to P_right'. Then compute P_tmp' = (P_left' + P_right') / 2, assign P_tmp' to P_tmp, and return to step c. (Because of 567-1 and 567-2, the ratios of both sets will not be greater than C1 at the same time.)
For example, suppose the ratio computed for S_1 (the ratio of data in S_1 greater than P_tmp) is greater than C1. We then take the endpoint closer to the other set, S_2. Since the levels are defined from thin to thick, the data in S_2 are all larger; and by the way P_left and P_right were assigned, P_right is larger than P_left and therefore closer to S_2, so P is P_right.
e. Repeat b, c, and d; D_1 is always found eventually.
8. Through calculation, D_1, D_2, ..., D_{N-1} are obtained.
9. After successful calibration, the calibration is complete.
As shown in fig. 3, the process of the multi-level style selection module specifically includes:
1. Provide the corresponding style selection interface according to the level count of the most recent successful run of the multi-level calibration module.
2. Select a level on the interface to enter that level's style selection interface (for example, if the last successful level count is 3, there are three levels in total: 1, 2, and 3).
3. The style corresponding to each level can be adjusted, including width, color, transparency, stroke edge, primitive drawing points, and other customized style effects; the styles corresponding to the levels are denoted Y_1, Y_2, ..., Y_N.
As shown in fig. 4, the process of the multi-stage identification module specifically includes:
1. When an ACTION_POINTER_DOWN event occurs (ACTION_POINTER_DOWN represents the touch event fired when an object is pressed), take the touch area recorded by the touch event, denoted area, judge which level the touch area belongs to, and record the PointerID (the PointerID is a value indicating which finger or object a single touch event belongs to).
2. The judgment rule: if area < D_1, the level is 1; if D_1 ≤ area < D_2, the level is 2; ...; if D_{N-1} ≤ area, the level is N.
3. After the level is judged, lines drawn by subsequent touch events with the same PointerID use that level's style; for example, when the level is judged to be 2, style Y_2 is used.
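One possible shape for this recognition step (a hypothetical Python sketch; the class, its method names, and the dict-based styles are invented for illustration, and real Android code would read the area and pointer id from a MotionEvent instead):

```python
# Hypothetical sketch of the multi-level recognition step. Plain floats stand
# in for the reported touch area, and the style objects Y_1..Y_n are
# represented as simple dicts.
class BrushStyleRecognizer:
    def __init__(self, boundaries, styles):
        self.boundaries = sorted(boundaries)  # boundary values [D_1, ..., D_{n-1}]
        self.styles = styles                  # styles[k] plays the role of Y_{k+1}
        self.pointer_styles = {}              # PointerID -> style chosen at pointer-down

    def on_pointer_down(self, pointer_id, area):
        """On an ACTION_POINTER_DOWN-like event, grade the touch area and
        remember the matching style for this PointerID."""
        level = 1
        for d in self.boundaries:  # area < D_1 -> 1, D_1 <= area < D_2 -> 2, ...
            if area >= d:
                level += 1
        self.pointer_styles[pointer_id] = self.styles[level - 1]
        return level

    def style_for(self, pointer_id):
        """Later events with the same PointerID draw lines using this style."""
        return self.pointer_styles[pointer_id]
```

With one boundary D_1 = 5.0 and two styles, a pointer whose area is 3.0 is graded level 1 (thin) and a simultaneous pointer with area 9.0 level 2 (thick), each keeping its own style for the rest of its stroke.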
Using touch area intervals as the basis, drawing with different brush styles is allowed; the calibration module solves the problem that the touch area standards reported by different touch frames differ, and the limitation of having to use a customized stylus is avoided.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The exemplary embodiments of the present disclosure further provide a device for identifying a brush style of Android drawing software. The device is used to implement the foregoing embodiments and preferred embodiments, and what has already been described is omitted for brevity. As used hereinafter, the terms "module", "unit", "subunit", and the like may implement a combination of software and/or hardware for a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware or a combination of software and hardware is also possible and contemplated.
Fig. 5 is a schematic block diagram illustrating an apparatus for identifying a brush style of android drawing software according to an exemplary embodiment of the present disclosure, as shown in fig. 5, the apparatus includes:
a first acquiring unit 51 configured to acquire a touch area of a target object when a touch event of the target object is detected;
a determining unit 52, configured to determine a target level corresponding to the target object according to the touch area, where the android drawing software is provided with multiple levels, each level corresponds to one brush style, and the multiple levels include the target level;
and the identifying unit 53 is configured to identify a target brush style corresponding to the target level.
In some embodiments, the device for identifying the brush style of the android drawing software further includes:
a second acquisition unit configured to acquire a set of touch areas for each of the plurality of levels;
the judging unit is used for judging whether the touch area set of each level in the multiple levels meets the preset requirement or not;
and the calculating unit is used for calculating the boundary value of the touch area set of the adjacent level if the touch area set of each level in the multiple levels meets the preset requirement.
In some embodiments, the determining unit includes:
a first judging module, configured to judge whether the intersection of S_n and S_{n-1} exceeds the fault-tolerance range, wherein S_n is the touch area set of level n, S_{n-1} is the touch area set of level n-1, and n is an integer greater than 1;
a second judging module, configured to judge whether the data in S_n are concentrated if the intersection of S_n and S_{n-1} does not exceed the fault-tolerance range;
a first determining module, configured to determine that S_n meets the preset requirement if the data in S_n are concentrated.
In some embodiments, the first determining module comprises:
an obtaining submodule, configured to obtain the maximum value MAX_{n-1} of S_{n-1};
a first calculating submodule, configured to calculate the proportion of elements in S_n that are less than MAX_{n-1};
a first determining submodule, configured to determine that the intersection of S_n and S_{n-1} exceeds the fault-tolerance range if the proportion is greater than a first threshold;
a second determining submodule, configured to determine that the intersection of S_n and S_{n-1} does not exceed the fault-tolerance range if the proportion is less than or equal to the first threshold.
In some embodiments, the second determining module comprises:
a second calculation submodule, used to calculate the variance σ of S_n;
an execution submodule, used to repeatedly execute the following steps, if the variance σ is greater than a second threshold, until the variance σ' is less than the second threshold: calculate the variance σ_1 of S_n with its maximum value removed and the variance σ_2 of S_n with its minimum value removed; obtain S_n' according to the magnitude relation between σ_1 and σ_2, wherein if σ_1 is less than σ_2, S_n' is the set obtained by removing the maximum value from S_n, and if σ_2 is less than σ_1, S_n' is the set obtained by removing the minimum value from S_n; and calculate the variance σ' of S_n' and take S_n' as the new S_n;
a third calculation submodule, used to calculate the percentage of S_n that was discarded in reducing S_n to S_n';
a third determination submodule, used to determine that the data in S_n is not concentrated if the percentage is greater than the first threshold.
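A minimal sketch of this concentration check, assuming population variance (`statistics.pvariance`) and treating both thresholds as free parameters; the helper name and the guard that stops stripping at two elements are illustrative additions:

```python
from statistics import pvariance

def data_is_concentrated(s_n, first_threshold, second_threshold):
    """Judge whether the touch areas in S_n cluster tightly enough."""
    working = sorted(s_n)  # sorting makes max/min removal simple slices
    while len(working) > 2 and pvariance(working) > second_threshold:
        var_no_max = pvariance(working[:-1])  # σ_1: variance without max
        var_no_min = pvariance(working[1:])   # σ_2: variance without min
        # keep whichever reduced set has the smaller variance as the new S_n
        working = working[:-1] if var_no_max < var_no_min else working[1:]
    discarded = (len(s_n) - len(working)) / len(s_n)
    return discarded <= first_threshold
```

Intuitively, a tight cluster needs few or no outliers removed before its variance falls below the second threshold, so the discarded fraction stays under the first threshold.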
In some of these embodiments, the computing unit comprises:
a first calculation module, used to calculate the average value Avg_{n-1} of S_{n-1} and assign Avg_{n-1} to P_left, and to calculate the average value Avg_n of S_n and assign Avg_n to P_right;
a second calculation module, used to calculate P_tmp = (P_left + P_right)/2;
a third calculation module, used to calculate the ratio A_1 of data in S_{n-1} that is greater than P_tmp, and the ratio A_2 of data in S_n that is less than P_tmp;
a first assignment module, used to assign P_tmp to D_{n-1} if the ratios A_1 and A_2 are both less than or equal to the first threshold, wherein D_{n-1} is the boundary (cut-off) value between S_{n-1} and S_n;
a second assignment module, used to take P_left or P_right as the value P if the ratio A_1 is greater than the first threshold and/or the ratio A_2 is greater than the first threshold;
a third assignment module, used to assign P to P_right' and assign P_tmp to P_left' if P is greater than P_tmp;
a fourth assignment module, used to assign P to P_left' and assign P_tmp to P_right' if P is less than or equal to P_tmp;
a fourth calculation module, used to calculate P_tmp' = (P_left' + P_right')/2;
a fifth assignment module, used to assign P_tmp' to P_tmp and return to the step of calculating the ratio A_1 of data in S_{n-1} that is greater than P_tmp and the ratio A_2 of data in S_n that is less than P_tmp.
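The boundary computation above is effectively a bisection between the two level averages. A sketch under assumptions: the translation does not state which of P_left or P_right is taken as P, so this version takes P_right when ratio A_1 exceeds the threshold (moving the candidate boundary up, away from S_{n-1}) and P_left otherwise; the `max_iter` cap is an added safeguard, not part of the patent:

```python
def find_boundary(s_prev, s_n, first_threshold, max_iter=50):
    """Bisect for a cut-off D_{n-1} that leaves at most `first_threshold`
    of S_{n-1} above it and at most `first_threshold` of S_n below it."""
    p_left = sum(s_prev) / len(s_prev)   # Avg_{n-1}
    p_right = sum(s_n) / len(s_n)        # Avg_n
    for _ in range(max_iter):
        p_tmp = (p_left + p_right) / 2
        a1 = sum(1 for a in s_prev if a > p_tmp) / len(s_prev)
        a2 = sum(1 for a in s_n if a < p_tmp) / len(s_n)
        if a1 <= first_threshold and a2 <= first_threshold:
            return p_tmp                 # accept P_tmp as D_{n-1}
        # assumption: move up when S_{n-1} spills over, down when S_n does
        p = p_right if a1 > first_threshold else p_left
        if p > p_tmp:
            p_left, p_right = p_tmp, p   # search the upper half
        else:
            p_left, p_right = p, p_tmp   # search the lower half
    return p_tmp
```

With well-separated sets such as S_{n-1} = {1, 2, 3, 4} and S_n = {10, 11, 12, 13}, the first midpoint (7.0) already satisfies both ratio tests and is returned as the boundary.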
In some of these embodiments, the determining unit 52 includes:
a third judgment module, used to judge which grading interval the touch area falls in;
a second determination module, used to determine that the target level corresponding to the target object is the first level if the touch area is smaller than D_1;
a third determination module, used to determine that the target level corresponding to the target object is the (n-1)th level if the touch area is greater than or equal to D_{n-2} and less than D_{n-1};
a fourth determination module, used to determine that the target level corresponding to the target object is the nth level if the touch area is greater than or equal to D_{n-1}.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. Modules implemented by hardware may be located in the same processor, or distributed across different processors in any combination.
An exemplary embodiment of the present disclosure also provides an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor. The memory stores a computer program executable by the at least one processor which, when executed by the at least one processor, causes the electronic device to perform a method according to an embodiment of the disclosure.
The disclosed exemplary embodiments also provide a non-transitory computer readable storage medium storing a computer program, wherein the computer program, when executed by a processor of a computer, is adapted to cause the computer to perform a method according to an embodiment of the present disclosure.
The exemplary embodiments of the present disclosure also provide a computer program product comprising a computer program, wherein the computer program, when executed by a processor of a computer, is adapted to cause the computer to perform a method according to an embodiment of the present disclosure.
Referring to fig. 6, a block diagram of the structure of an electronic device 600 will now be described. The electronic device 600 may be a server or a client of the present disclosure, and is an example of a hardware device that may be applied to aspects of the present disclosure. The term "electronic device" is intended to represent various forms of digital electronic computer devices, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. The electronic device may also represent various forms of mobile devices, such as personal digital assistants, cellular phones, smart phones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are meant to be examples only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 6, the electronic device 600 includes a computing unit 601 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 602 or loaded from a storage unit 608 into a Random Access Memory (RAM) 603. The RAM 603 can also store various programs and data necessary for the operation of the device 600. The computing unit 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Various components in the electronic device 600 are connected to the I/O interface 605, including: an input unit 606, an output unit 607, a storage unit 608, and a communication unit 609. The input unit 606 may be any type of device capable of inputting information to the electronic device 600, and the input unit 606 may receive input numeric or character information and generate key signal inputs related to user settings and/or function controls of the electronic device. Output unit 607 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, a video/audio output terminal, a vibrator, and/or a printer. The storage unit 608 may include, but is not limited to, a magnetic disk, an optical disk. The communication unit 609 allows the electronic device 600 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunications networks, and may include, but is not limited to, a modem, a network card, an infrared communication device, a wireless communication transceiver, and/or a chipset, such as a bluetooth device, a WiFi device, a WiMax device, a cellular communication device, and/or the like.
The computing unit 601 may be any of various general and/or special purpose processing components with processing and computing capabilities. Some examples of the computing unit 601 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various dedicated Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 601 performs the respective methods and processes described above. For example, in some embodiments, the android drawing software brush style identification method may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 600 via the ROM 602 and/or the communication unit 609. In some embodiments, the computing unit 601 may be configured to perform the android drawing software brush style identification method in any other suitable manner (e.g., by way of firmware).
Program code for implementing the methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/acts specified in the flowchart and/or block diagram to be performed. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. A machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
As used in this disclosure, the terms "machine-readable medium" and "computer-readable medium" refer to any computer program product, apparatus, and/or device (e.g., magnetic discs, optical disks, memory, Programmable Logic Devices (PLDs)) used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term "machine-readable signal" refers to any signal used to provide machine instructions and/or data to a programmable processor.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and a pointing device (e.g., a mouse or a trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic, speech, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), and the Internet.
The computer system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.

Claims (10)

1. A method for identifying a brush style of Android drawing software, characterized by comprising the following steps:
when a touch event of a target object is detected, acquiring a touch area of the target object;
determining a target level corresponding to the target object according to the touch area, wherein the android drawing software is provided with a plurality of levels, each level corresponds to a brush style, and the plurality of levels comprise the target level;
and identifying a target brush style corresponding to the target level.
2. The method for identifying the brush style of the android drawing software of claim 1, further comprising:
acquiring a touch area set of each level in the plurality of levels;
judging whether the touch area set of each level in the multiple levels meets a preset requirement or not;
and if the touch area set of each level in the multiple levels meets the preset requirement, calculating the boundary values of touch area sets of adjacent levels.
3. The android drawing software brush style identification method of claim 2, wherein determining whether the set of touch areas at each of the plurality of levels meets a preset requirement comprises:
judging whether the intersection of S_n and S_{n-1} exceeds the fault tolerance range, wherein S_n is the touch area set of level n, S_{n-1} is the touch area set of level n-1, and n is an integer greater than 1;
if the intersection of S_n and S_{n-1} does not exceed the fault tolerance range, judging whether the data in S_n is concentrated;
if the data in S_n is concentrated, determining that S_n meets the preset requirement.
4. The method for identifying the brush style of the android drawing software as claimed in claim 3, characterized in that judging whether the intersection of S_n and S_{n-1} exceeds the fault tolerance range comprises:
acquiring the maximum value MAX_{n-1} of S_{n-1};
calculating the proportion of elements in S_n that are smaller than MAX_{n-1};
if the proportion is greater than a first threshold, determining that the intersection of S_n and S_{n-1} exceeds the fault tolerance range;
if the proportion is less than or equal to the first threshold, determining that the intersection of S_n and S_{n-1} does not exceed the fault tolerance range.
5. The android drawing software brush style identification method of claim 3, characterized in that judging whether the data in S_n is concentrated comprises:
calculating the variance σ of S_n;
if the variance σ is greater than a second threshold, repeatedly executing the following steps until the variance σ' is less than the second threshold: calculating the variance σ_1 of S_n with its maximum value removed and the variance σ_2 of S_n with its minimum value removed; obtaining S_n' according to the magnitude relation between σ_1 and σ_2, wherein if σ_1 is less than σ_2, S_n' is the set obtained by removing the maximum value from S_n, and if σ_2 is less than σ_1, S_n' is the set obtained by removing the minimum value from S_n; and calculating the variance σ' of S_n' and taking S_n' as the new S_n;
calculating the percentage of S_n that was discarded in reducing S_n to S_n';
if the percentage is greater than the first threshold, determining that the data in S_n is not concentrated.
6. The method for identifying the brush style of the android drawing software of claim 3, wherein if the touch area set of each of the plurality of levels meets the preset requirement, calculating the boundary value of the touch area sets of adjacent levels comprises:
calculating the average value Avg_{n-1} of S_{n-1} and assigning Avg_{n-1} to P_left, and calculating the average value Avg_n of S_n and assigning Avg_n to P_right;
calculating P_tmp = (P_left + P_right)/2;
calculating the ratio A_1 of data in S_{n-1} that is greater than P_tmp, and the ratio A_2 of data in S_n that is less than P_tmp;
if the ratios A_1 and A_2 are both less than or equal to the first threshold, assigning P_tmp to D_{n-1}, wherein D_{n-1} is the boundary (cut-off) value between S_{n-1} and S_n;
if the ratio A_1 is greater than the first threshold and/or the ratio A_2 is greater than the first threshold, taking P_left or P_right as the value P;
if P is greater than P_tmp, assigning P to P_right' and assigning P_tmp to P_left';
if P is less than or equal to P_tmp, assigning P to P_left' and assigning P_tmp to P_right';
calculating P_tmp' = (P_left' + P_right')/2;
assigning P_tmp' to P_tmp and returning to the step of calculating the ratio A_1 of data in S_{n-1} that is greater than P_tmp and the ratio A_2 of data in S_n that is less than P_tmp.
7. The android drawing software brush style identification method of claim 6, wherein determining the target level corresponding to the target object based on the touch area comprises:
judging which grading interval the touch area falls in;
if the touch area is smaller than D_1, determining that the target level corresponding to the target object is the first level;
if the touch area is greater than or equal to D_{n-2} and less than D_{n-1}, determining that the target level corresponding to the target object is the (n-1)th level;
if the touch area is greater than or equal to D_{n-1}, determining that the target level corresponding to the target object is the nth level.
8. A device for identifying a brush style of Android drawing software, characterized by comprising:
a first acquisition unit, used to acquire the touch area of a target object when a touch event of the target object is detected;
a determining unit, used to determine a target level corresponding to the target object according to the touch area, wherein the android drawing software is provided with a plurality of levels, each level corresponds to one brush style, and the plurality of levels comprise the target level;
and an identification unit, used to identify the target brush style corresponding to the target level.
9. An electronic device, comprising:
a processor; and
a memory for storing a program, wherein the program is stored in the memory,
wherein the program comprises instructions which, when executed by the processor, cause the processor to perform the method of identifying an android drawing software brush style of any of claims 1-7.
10. A non-transitory computer readable storage medium storing computer instructions for causing a computer to perform the method for identifying an android drawing software brush style as claimed in any one of claims 1-7.
CN202210968412.9A 2022-08-12 2022-08-12 Method and device for identifying brush style of Android drawing software Active CN115344187B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210968412.9A CN115344187B (en) 2022-08-12 2022-08-12 Method and device for identifying brush style of Android drawing software


Publications (2)

Publication Number Publication Date
CN115344187A true CN115344187A (en) 2022-11-15
CN115344187B CN115344187B (en) 2024-03-12

Family

ID=83951070

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210968412.9A Active CN115344187B (en) Method and device for identifying brush style of Android drawing software

Country Status (1)

Country Link
CN (1) CN115344187B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164158A (en) * 2013-01-10 2013-06-19 深圳市欧若马可科技有限公司 Method, system and device of creating and teaching painting on touch screen
CN106547433A (en) * 2016-11-07 2017-03-29 青岛海信电器股份有限公司 Written handwriting determines method and device
CN106843650A (en) * 2017-01-17 2017-06-13 创维光电科技(深圳)有限公司 The touch identification method and system of a kind of touch screen integrated machine
US20190004698A1 (en) * 2013-04-15 2019-01-03 Carnegie Mellon University Virtual tools for use with touch-sensitive surfaces
CN109885201A (en) * 2019-02-19 2019-06-14 Oppo广东移动通信有限公司 Touch screen touches area detection method, electronic device and computer readable storage medium
CN113296616A (en) * 2021-05-12 2021-08-24 深圳市宝视达光电有限公司 Pen point selection method and device and intelligent terminal
CN114385098A (en) * 2020-10-20 2022-04-22 全方位语言解决方案有限责任公司 Computerized method and apparatus for determining accuracy of written characters
CN114816130A (en) * 2022-06-29 2022-07-29 长沙朗源电子科技有限公司 Writing recognition method and system of electronic whiteboard, storage medium and electronic whiteboard


Also Published As

Publication number Publication date
CN115344187B (en) 2024-03-12

Similar Documents

Publication Publication Date Title
CN106156791B (en) Business data classification method and device
CN108628656A (en) Interface adaptation method, device, computer equipment and storage medium on ios device
US9285969B2 (en) User interface navigation utilizing pressure-sensitive touch
CN110427232B (en) Page management method and device, computer equipment and storage medium
CN105117062A (en) Screen luminance regulation method and mobile terminal
EP2869174A1 (en) Method and device for text input and display of intelligent terminal
CN109118447B (en) Picture processing method, picture processing device and terminal equipment
CN104898981A (en) Gesture recognizing method, device and terminal
CN110377215B (en) Model display method and device and terminal equipment
CN112163642A (en) Wind control rule obtaining method, device, medium and equipment
CN114943673A (en) Defect image generation method and device, electronic equipment and storage medium
CN108153454B (en) Multi-touch switching method and device, storage medium and terminal equipment
CN111603767A (en) Method, terminal and storage medium for adjusting resolution
CN111178017A (en) Method and device for generating flow chart, storage medium and electronic equipment
CN114565701A (en) Line segment drawing method and device, electronic equipment and computer readable storage medium
CN107146098B (en) Advertisement operation configuration method and equipment
CN107870685B (en) Touch operation identification method and device
CN115344187A Method and device for identifying brush style of Android drawing software
CN111913644B (en) Line drawing method and device for whiteboard and readable storage medium
CN106843714B (en) Method and system for optimizing handwriting of touch pen
CN107992348B (en) Dynamic cartoon plug-in processing method and system based on intelligent terminal
CN108288298B (en) Method and device for drawing function image, computer equipment and storage medium
CN107831935B (en) Touch point trajectory tracking method and device based on fitting and intelligent equipment
CN111754061A (en) Method and device for controlling man-machine distribution, server equipment and storage medium
CN110689922A (en) Method and system for GC content analysis of automatic parallelization knockout strategy

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant