CN107123152B - Editing processing method and device - Google Patents

Editing processing method and device

Info

Publication number
CN107123152B
CN107123152B (application CN201710219879.2A)
Authority
CN
China
Prior art keywords
drawing tool
sub-region
target
touch position
Prior art date
Legal status
Active
Application number
CN201710219879.2A
Other languages
Chinese (zh)
Other versions
CN107123152A (en)
Inventor
匡皓琦
苏凌枫
梁颖蕾
汤晨韵
符德恩
石琳
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201710219879.2A
Publication of CN107123152A
Application granted
Publication of CN107123152B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 - 2D [Two Dimensional] image generation
    • G06T 11/40 - Filling a planar surface by adding surface attributes, e.g. colour or texture
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482 - Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484 - Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 - Selection of displayed objects or displayed text elements
    • G06F 3/0487 - Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides an editing processing method and apparatus. The method includes: acquiring a first operation in a control region, where the control region comprises at least a first sub-region and a second sub-region, the first sub-region corresponds to a first class of drawing tool set, and the second sub-region corresponds to a second class of drawing tool set; determining touch position information of the first operation on the control region; if the first operation is located in the first sub-region, determining a target drawing tool corresponding to the touch position information from the first class of drawing tool set corresponding to the first sub-region; if the first operation is located in the second sub-region, determining a target drawing tool corresponding to the touch position information from the second class of drawing tool set corresponding to the second sub-region; and displaying a preview of the target drawing tool.

Description

Editing processing method and device
Technical Field
The invention relates to the field of image processing, in particular to an editing processing method and device.
Background
At present, some applications have gradually added color filling and related functions to picture editing. For example, some applications support selecting a graffiti brush tool (also called a common color brush tool) in multiple colors for color filling, where the filled region is fixed; other applications support color filling that is not tied to a fixed region, using what is called a personalized brush tool.
For applications that support only a common color brush tool, although the brush color can change with the color the user selects during picture editing, the tool lacks personalized design in terms of functional experience, cannot offer the user more creative possibilities, and makes graffiti less interesting.
For applications that support a personalized brush tool, common color brush selection and personalized brush selection are designed separately, so the module as a whole is heavily fragmented; when there are too many personalized brushes, the user has to slide through multiple pages to make a selection while editing a picture, the operation is cumbersome, and the user cannot edit the picture quickly and intuitively.
Disclosure of Invention
In view of this, the present invention is intended to provide an editing processing method and apparatus that can improve the efficiency of selecting a target drawing tool by designing the first class of drawing tools and the second class of drawing tools into one integrated control.
The technical solution of the invention is implemented as follows:
the embodiment of the invention provides an editing processing method, which comprises the following steps:
acquiring a first operation in a control area, wherein the control area at least comprises a first sub-area and a second sub-area, the first sub-area corresponds to a first class of drawing tool set, and the second sub-area corresponds to a second class of drawing tool set;
determining touch position information of the first operation on the control area;
if the first operation is located in a first sub-region, determining a target drawing tool corresponding to the touch position information from a first class drawing tool set corresponding to the first sub-region; if the first operation is located in a second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area;
and previewing and displaying the target drawing tool.
An embodiment of the present invention further provides an editing processing apparatus, where the apparatus includes:
the device comprises a first acquisition unit, a second acquisition unit and a control unit, wherein the first acquisition unit is used for acquiring a first operation in a control area, the control area at least comprises a first sub-area and a second sub-area, the first sub-area corresponds to a first class of drawing tool set, and the second sub-area corresponds to a second class of drawing tool set;
a first determination unit configured to determine touch position information of the first operation on the control area;
a second determining unit, configured to determine, if the first operation is located in the first sub-region, a target drawing tool corresponding to the touch position information from a first class of drawing tool set corresponding to the first sub-region; if the first operation is located in a second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area;
and the control unit is used for performing preview display on the target drawing tool.
With the technical solution of the embodiment of the invention, the first class of drawing tool set and the second class of drawing tool set are integrated into the same control area. When a first operation in the control area is acquired, the touch position information of the first operation on the control area is determined; if the first operation is located in the first sub-region, a target drawing tool corresponding to the touch position information is determined from the first class of drawing tool set corresponding to the first sub-region; if the first operation is located in the second sub-region, a target drawing tool corresponding to the touch position information is determined from the second class of drawing tool set corresponding to the second sub-region; and the target drawing tool is displayed as a preview. Because the two classes of drawing tool sets are integrated into the same control area, drawing tools of different classes can be managed in a unified way, the user can select a target drawing tool from a single control area, and the efficiency of selecting the target drawing tool is improved. Displaying the target drawing tool as a preview shows, in a targeted way, the tool corresponding to the current touch position, so that the target drawing tool corresponding to the first operation is clear at a glance, which helps the user select it quickly and improves usage efficiency. Since a drawing tool from either the first class or the second class can be selected as the target drawing tool within the same control area, the fun of image editing such as graffiti is increased; and compared with placing the two classes of drawing tool sets on different control modules, no switching between control modules is needed, which makes quick operation possible.
Drawings
FIG. 1 is a schematic diagram of a picture editing interface in which a picture is edited with a common color brush tool;
FIG. 2 is a schematic diagram of a picture editing interface in which a picture is edited with a personalized brush tool;
FIG. 3 is a schematic flow chart of an implementation of an editing processing method according to an embodiment of the present invention;
FIG. 4(a) is a schematic diagram of a control region comprising a plurality of sub-regions according to an embodiment of the present invention;
FIG. 4(b) is a schematic diagram of a control region comprising two sub-regions according to an embodiment of the present invention;
FIG. 4(c) is another schematic diagram of a control region comprising two sub-regions according to an embodiment of the present invention;
FIG. 5(a) is a schematic diagram of a first display area displaying a preview image of a target drawing tool when a first operation is performed according to an embodiment of the present invention;
FIG. 5(b) is a schematic diagram of displaying the target drawing tool at a first position in the control region when a second operation is performed according to an embodiment of the present invention;
FIG. 5(c) is another schematic diagram of a first display area displaying a preview of a target drawing tool when a first operation is performed according to an embodiment of the present invention;
FIG. 5(d) is another schematic diagram of displaying the target drawing tool at the first position of the control region when the second operation is performed according to an embodiment of the present invention;
FIG. 6 is a schematic block diagram of a process for implementing preview display according to an embodiment of the present invention;
FIG. 7(a) is a schematic diagram of a bar-shaped control region that pops up after a user clicks a preset editing function button according to an embodiment of the present invention;
FIG. 7(b) is a schematic diagram of a preview of a target drawing tool displayed in a first display area according to an embodiment of the present invention;
FIG. 7(c) is a schematic diagram of the target drawing tool displayed at a first position in the control region according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an editing processing apparatus according to an embodiment of the present invention;
FIG. 9 is a schematic diagram of an alternative hardware structure for implementing the editing processing apparatus in an embodiment of the present invention.
Detailed Description
In order to make the features and technical contents of the present invention more comprehensible, important abbreviations and key terms used in the present invention are introduced as follows.
A brush stroke refers to the trace left where the brush touches the picture during drawing, such as the lines, colors, and images formed by that contact.
A brush, which is one of the tools in image editing software (e.g., Photoshop, PS), is a set of preset patterns that can be applied directly in the form of a brush.
A drawing tool refers to a tool used to draw on a picture.
To better explain the invention, a schematic diagram of editing a picture with a common color brush tool is introduced first. As shown in fig. 1, in that picture editing interface a brush is placed at the upper right corner of the picture with a color bar below it, and the brush color changes as the user selects different colors on the color bar. However, the common color brush offers only color selection; in terms of functional experience it does not take user personalization or multi-element design into account, and therefore cannot provide the user with more creative possibilities.
Next, a schematic diagram of editing a picture with a personalized brush tool is introduced. As shown in fig. 2, in this picture editing interface the editing tool selection window is located below the picture and contains both common color brushes and personalized brushes: in the row of editing tools above the solid circle, counting from the right, the first and second are common color brushes, and the third, fourth, and fifth are personalized brushes. When a common color brush (such as the rightmost one) is selected, a color bar pops up; when a personalized brush is selected, no color bar pops up. It can be seen that color selection and personalized brush strokes are designed separately, which fragments the whole module. In addition, when there are too many personalized brushes, the user has to slide through multiple pages to make a selection and cannot quickly and intuitively browse and choose among all of them.
The technical scheme of the invention is further elaborated by combining the attached drawings and specific embodiments.
An embodiment of the present invention provides an editing processing method, which may be applied at the terminal side. As shown in fig. 3, the method mainly includes:
step 301, a first operation in the control area is acquired.
The control area at least comprises a first sub-area and a second sub-area, wherein the first sub-area corresponds to a first class of drawing tool set, and the second sub-area corresponds to a second class of drawing tool set.
Therefore, the first type drawing tool set and the second type drawing tool set are integrated into the same control area, so that drawing tools of different types can be managed uniformly, a user can select a target drawing tool from the same control area conveniently, and the efficiency of selecting the target drawing tool is improved.
It should be noted that the control region is not limited to containing only the first sub-region and the second sub-region; it may further include M other sub-regions, where M is a positive integer greater than or equal to 1. Each of the other sub-regions may correspond to a drawing tool set of another class different from both the first class and the second class of drawing tool sets.
Fig. 4 (a) is a schematic diagram of a control region including a plurality of sub-regions according to an embodiment of the present invention. As shown in fig. 4 (a), the control region includes x sub-regions, where x = 2 + M and x is a positive integer greater than or equal to 3.
For example, the control region may include 1 other sub-region, denoted as a third sub-region, in addition to the first sub-region and the second sub-region, where the third sub-region corresponds to a third class of drawing tool set. For another example, the control region may include 2 other sub-regions besides the first sub-region and the second sub-region, which are denoted as a third sub-region and a fourth sub-region, where the third sub-region corresponds to a third class of drawing tool set, and the fourth sub-region corresponds to a fourth class of drawing tool set. For another example, the control region may include 3 other sub-regions, which are denoted as a third sub-region, a fourth sub-region, and a fifth sub-region, in addition to the first sub-region and the second sub-region; the third sub-region corresponds to a third class drawing tool set, the fourth sub-region corresponds to a fourth class drawing tool set, and the fifth sub-region corresponds to a fifth class drawing tool set.
Here, the first operation is a touch operation of an operation body on the control area, such as an operation of a user directly using a finger on the control area or an operation using a touch pen on the control area.
The first operation includes, but is not limited to, a click operation, a slide operation, and a trigger operation corresponding to a preset trajectory.
For example, when a click operation in the control area is detected, it is determined that the first operation is received in the control area. For another example, when a slide operation in the control area is detected, it is determined that the first operation is received in the control area. For another example, when a trigger operation in the control area is detected, if the trigger operation belongs to a trigger operation corresponding to the preset trajectory, it is determined that the first operation is received in the control area.
Here, the first class of drawing tool set includes first class drawing tools corresponding to different colors. For example, the first class of drawing tool is a common color brush tool that supports multiple color selections.
Here, the second class of drawing tool set includes second class drawing tools with different brush strokes, where a brush stroke is the lines, colors, and images formed by the brush contacting the picture. For example, the second class of drawing tool is a personalized brush tool, which supports selection of various figures, patterns, lines, colors, and the like.
In the above scheme, before step 301, the method further includes:
and displaying the control area when the operation of starting the preset editing function is detected.
For example, the predetermined editing function may be a graffiti function.
Illustratively, an image editing application or a video editing application with the preset editing function is installed on the terminal. The control area is displayed when that application receives an operation request, input by the user, to start the preset editing function, for example when the user clicks a preset editing shortcut key of the application on the terminal desktop.
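For illustration only, a minimal Kotlin sketch of this step on an Android-style UI might register a click listener on an editing entry button and make the control area visible; all identifiers here are hypothetical and not taken from the patent.

```kotlin
// Hypothetical sketch: showing the control area when the preset editing
// function (e.g. graffiti) is started. All identifiers are illustrative.
import android.view.View
import android.widget.ImageButton

fun bindGraffitiEntry(graffitiButton: ImageButton, controlArea: View) {
    graffitiButton.setOnClickListener {
        // The bar-shaped control area (first + second sub-regions) pops up.
        controlArea.visibility = View.VISIBLE
    }
}
```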
Step 302, determining the touch position information of the first operation on the control area.
As an embodiment, the determining the touch position information of the first operation on the control area includes:
acquiring first data information corresponding to the first operation; the first data information is used for representing operation attribute information corresponding to the first operation;
and obtaining the touch position information of the first operation on the control area according to the first data information.
Wherein, the operation attribute at least comprises one or more of the following:
the number of touch points, touch time, position coordinates and pressure.
In this way, by determining the touch position information of the first operation on the control area, a basis can be provided for subsequently judging whether the first operation corresponds to the first type drawing tool set or the second type drawing tool set, so that the system can more specifically determine the target drawing tool in the corresponding drawing tool set.
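As a hypothetical illustration of step 302 (not the patent's implementation), the operation attributes listed above could be read from a touch event on the control area roughly as follows in Kotlin on Android; the names and callback shape are assumptions.

```kotlin
// Hypothetical sketch of step 302: reading the operation attributes named in
// the text (number of touch points, touch time, position coordinates, pressure)
// from a touch event on the control area. Identifiers are illustrative.
import android.view.MotionEvent
import android.view.View

data class FirstOperationInfo(
    val pointerCount: Int,
    val eventTimeMs: Long,
    val x: Float,          // touch position relative to the control area
    val y: Float,
    val pressure: Float
)

fun observeControlArea(controlArea: View, onTouchInfo: (FirstOperationInfo) -> Unit) {
    controlArea.setOnTouchListener { _, event ->
        if (event.actionMasked == MotionEvent.ACTION_DOWN ||
            event.actionMasked == MotionEvent.ACTION_MOVE
        ) {
            onTouchInfo(
                FirstOperationInfo(
                    pointerCount = event.pointerCount,
                    eventTimeMs = event.eventTime,
                    x = event.x,
                    y = event.y,
                    pressure = event.pressure
                )
            )
        }
        true
    }
}
```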
Step 303, if the first operation is located in the first sub-region, determining a target drawing tool corresponding to the touch position information from a first class of drawing tool set corresponding to the first sub-region; and if the first operation is positioned in the second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area.
Step 303 above is described taking the case where the control region includes only the first sub-region and the second sub-region as an example. It should be noted that when the control region includes more than two sub-regions, that is, when the control region includes M other sub-regions in addition to the first and second sub-regions, the target drawing tool corresponding to the touch position information is determined from the drawing tool set corresponding to whichever sub-region the first operation is located in. As shown in fig. 4 (a), the control region includes x sub-regions, where x = 2 + M and x is a positive integer greater than or equal to 3. Here, M may be a positive integer greater than or equal to 1, and each of the other sub-regions may correspond to a drawing tool set of another class different from both the first class and the second class of drawing tool sets. If the first operation is located in the y-th sub-region, the target drawing tool corresponding to the touch position information is determined from the y-th class of drawing tool set corresponding to the y-th sub-region, where y is a positive integer greater than or equal to 1 and less than or equal to x.
For example, if the first operation is located in a first sub-region, a target drawing tool corresponding to the touch position information is determined from a first class of drawing tool set corresponding to the first sub-region; if the first operation is located in a second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area; if the first operation is located in a third sub-area, determining a target drawing tool corresponding to the touch position information from a third drawing tool set corresponding to the third sub-area; if the first operation is located in a fourth sub-region, determining a target drawing tool corresponding to the touch position information from a fourth drawing tool set corresponding to the fourth sub-region; and so on in the manner described above.
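A minimal, self-contained Kotlin sketch of this dispatch logic is given below, assuming a bar-shaped control area whose sub-regions are laid out horizontally; the layout, widths, and names are assumptions rather than details from the patent.

```kotlin
// Hypothetical sketch of step 303 for a bar-shaped control area laid out as
// x horizontal sub-regions. The mapping from touch position to sub-region
// (and thus to a class of drawing tool set) is illustrative only.
enum class ToolSetKind { COMMON_COLOR, PERSONALIZED, OTHER }

data class SubRegion(val startX: Float, val endX: Float, val kind: ToolSetKind)

fun subRegionAt(touchX: Float, subRegions: List<SubRegion>): SubRegion? =
    subRegions.firstOrNull { touchX >= it.startX && touchX < it.endX }

fun main() {
    val controlWidth = 900f
    val subRegions = listOf(
        SubRegion(0f, controlWidth * 0.5f, ToolSetKind.COMMON_COLOR),        // first sub-region
        SubRegion(controlWidth * 0.5f, controlWidth, ToolSetKind.PERSONALIZED) // second sub-region
    )
    println(subRegionAt(220f, subRegions)?.kind)  // COMMON_COLOR
    println(subRegionAt(760f, subRegions)?.kind)  // PERSONALIZED
}
```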
Wherein, the presentation form of the control area includes but is not limited to: strip-shaped and ring-shaped.
In this embodiment, the presentation form, such as the shape, of the control area is not limited as long as the control area can integrate the first type drawing tool set and the second type drawing tool set and can implement touch operation.
Taking the case where the control region includes only two sub-regions as an example, fig. 4 (b) is a schematic diagram of such a control region according to an embodiment of the present invention; in the figure, the control region is bar-shaped and consists of two parts, a first sub-region A1 and a second sub-region A2.
Likewise, fig. 4 (c) is another schematic diagram of a control region including two sub-regions according to an embodiment of the present invention; in the figure, the control region is ring-shaped and consists of two parts, a first sub-region B1 and a second sub-region B2.
Of course, the presentation form of the control region is not limited to that shown in fig. 4 (b) and fig. 4 (c), and is not listed here.
As an optional implementation, determining the target drawing tool corresponding to the touch position information from the first class of drawing tool set corresponding to the first sub-region includes:
calling a system interface to acquire a color value corresponding to the touch position information;
generating a picture of a preset size and shape using the color value and setting the picture on a first class drawing tool;
taking the first class drawing tool provided with the picture as the target drawing tool;
where the first class of drawing tool set includes first class drawing tools corresponding to different colors.
Optionally, all color values supported by the first class of drawing tools are integrated onto one picture.
Specifically, calling the system interface to acquire the color value corresponding to the touch position information includes: obtaining, through the system interface, the picture that integrates all color values supported by the first class of drawing tools, locating the touch position information on that picture, and obtaining the color value at that position.
Illustratively, the first class of drawing tool is a common color brush tool. If the color corresponding to the color value is red, a red brush tool capable of drawing in red is determined as the target drawing tool; if the color corresponding to the color value is green, a green brush tool capable of drawing in green is determined as the target drawing tool.
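On Android, one plausible (but purely illustrative) reading of "calling a system interface" is sampling the pixel color of the color-bar picture at the touch position and then generating a small solid-color swatch of a preset size; the sketch below assumes that interpretation and hypothetical names.

```kotlin
// Hypothetical sketch: sampling the color value at the touch position from the
// picture that integrates all colors supported by the first class of drawing
// tools, then generating a small solid-color swatch for the brush. Illustrative only.
import android.graphics.Bitmap

fun colorValueAt(colorBar: Bitmap, touchX: Float, touchY: Float): Int {
    val x = touchX.toInt().coerceIn(0, colorBar.width - 1)
    val y = touchY.toInt().coerceIn(0, colorBar.height - 1)
    return colorBar.getPixel(x, y)          // ARGB color value at the touch position
}

fun makeColorSwatch(color: Int, sizePx: Int = 64): Bitmap =
    Bitmap.createBitmap(sizePx, sizePx, Bitmap.Config.ARGB_8888).apply {
        eraseColor(color)                    // picture of preset size filled with the color
    }
```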
As an optional implementation manner, the determining, from the second set of drawing tools corresponding to the second sub-region, a target drawing tool corresponding to the touch position information includes:
obtaining drawing tool resources corresponding to the touch position information;
determining a target drawing tool based on the drawing tool resource;
and the second drawing tool set comprises second drawing tools with different strokes.
And the brush strokes comprise lines, colors and images formed by the brush tool contacting the picture.
The second sub-region comprises a plurality of drawing tool resources; each drawing tool resource corresponds to its own picture, and dynamic configuration is supported.
Illustratively, the second class of drawing tool is a personalized brush tool. When the drawing tool resource corresponding to the touch position information is a rainbow, the personalized brush tool capable of drawing the rainbow is determined as the target drawing tool; when the resource is a purple lip, the personalized brush tool capable of drawing the purple lip is determined as the target drawing tool; and when the resource is a red heart, the personalized brush tool capable of drawing the red heart is determined as the target drawing tool.
Therefore, the second class of drawing tool set can be continuously expanded with new drawing tool resources, increasing the variety of second class drawing tools, so the set is highly extensible. Because the first class and second class of drawing tool sets are merged into the same control area, the user can quickly preview all the second class drawing tools and thus quickly select a target drawing tool.
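A minimal Kotlin sketch of a dynamically configurable resource list and a position-to-resource lookup is shown below; the equal-width slicing of the second sub-region, the resource names, and the picture file names are all assumptions made for illustration.

```kotlin
// Hypothetical sketch: the second sub-region holds a dynamically configurable
// list of drawing tool resources, each backed by its own picture; the resource
// under the touch position becomes the target drawing tool. Names are illustrative.
data class BrushResource(val name: String, val pictureFile: String)

class PersonalizedToolSet(private var resources: List<BrushResource>) {

    // Dynamic configuration: the resource list can be replaced at runtime.
    fun update(newResources: List<BrushResource>) { resources = newResources }

    // Each resource occupies an equal slice of the second sub-region.
    fun resourceAt(touchX: Float, regionStartX: Float, regionWidth: Float): BrushResource? {
        if (resources.isEmpty() || regionWidth <= 0f) return null
        val slice = regionWidth / resources.size
        val index = ((touchX - regionStartX) / slice).toInt()
        return resources.getOrNull(index)
    }
}

fun main() {
    val toolSet = PersonalizedToolSet(
        listOf(
            BrushResource("rainbow", "brush_rainbow.png"),
            BrushResource("purple_lip", "brush_purple_lip.png"),
            BrushResource("red_heart", "brush_red_heart.png")
        )
    )
    println(toolSet.resourceAt(touchX = 610f, regionStartX = 450f, regionWidth = 450f)?.name)
    // -> "purple_lip" (second slice of the second sub-region)
}
```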
As an embodiment, the determining a target drawing tool based on the drawing tool resource includes:
if a fourth operation is received, calling a property bar corresponding to the drawing tool resource based on the fourth operation; wherein the property bar comprises editing properties supported by the current drawing tool resources;
acquiring selection information selected from the attribute bar;
and determining a target drawing tool adapted to the touch position information by combining the selection information and the drawing tool resource.
Therefore, when the drawing tool resource corresponds to the attribute bar, the types of the second type drawing tools included in the second type drawing tool set can be enriched.
Illustratively, the second class of drawing tool is a personalized brush tool. When the personalized brush tool is in a selected state and a fourth operation is received, a property bar corresponding to the personalized brush tool pops up, where the property bar includes at least one of: color and shape extensions, and static or dynamic graphics supported by the personalized brush tool.
For example, if the personalized brush tool can draw a red heart, its property bar may include color options that allow the color of the current red heart to be adjusted; or, if the current red heart is static, the property bar may further include frequency values that allow the red heart to be presented dynamically at different frequencies.
For another example, if the personalized brush tool can draw a mosaic, its property bar may include options for adjusting the shape of the current square mosaic, for example changing it into a mosaic of another geometric figure such as a diamond or a parallelogram.
In this way, by determining the target drawing tool adapted to the touch position information from the combination of the selection information and the drawing tool resource, the variety of selectable target drawing tools is increased and the user's interest in do-it-yourself (DIY) creation is enhanced.
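One way to picture "selection information combined with the drawing tool resource" is a small data model, sketched below in Kotlin; the particular attributes (colors, shapes, animation frequencies) and all names are illustrative assumptions, not the patent's data structures.

```kotlin
// Hypothetical sketch: a property bar exposing the editing attributes supported
// by the current drawing tool resource, and combining the user's selection with
// the resource to derive the target drawing tool. All names are illustrative.
data class PropertyBar(
    val colors: List<Int> = emptyList(),          // supported color adjustments
    val shapes: List<String> = emptyList(),       // e.g. "square", "diamond", "parallelogram"
    val frequenciesHz: List<Float> = emptyList()  // dynamic-presentation frequencies
)

data class TargetTool(
    val resourceName: String,
    val color: Int? = null,
    val shape: String? = null,
    val frequencyHz: Float? = null
)

fun applySelection(resourceName: String, bar: PropertyBar,
                   colorIndex: Int? = null, shapeIndex: Int? = null,
                   frequencyIndex: Int? = null): TargetTool =
    TargetTool(
        resourceName = resourceName,
        color = colorIndex?.let { bar.colors.getOrNull(it) },
        shape = shapeIndex?.let { bar.shapes.getOrNull(it) },
        frequencyHz = frequencyIndex?.let { bar.frequenciesHz.getOrNull(it) }
    )

fun main() {
    val mosaicBar = PropertyBar(shapes = listOf("square", "diamond", "parallelogram"))
    println(applySelection("mosaic", mosaicBar, shapeIndex = 1)) // diamond-shaped mosaic
}
```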
And step 304, performing preview display on the target drawing tool.
As an embodiment, the preview displaying the target drawing tool includes:
acquiring a preset corresponding relation between the touch position information and the first display area;
determining a first display area corresponding to the touch position information based on the preset corresponding relation;
and displaying a preview of the target drawing tool in the first display area.
Wherein the preset correspondence includes:
the region position of the first display area changes as the touch position information changes; and/or
the region shape of the first display area changes as the touch position information and/or the target drawing tool changes.
Here, the shape of the first display area can be set by the user according to the user's needs, such as a drop shape, a bubble shape, a regular N-sided polygon, an irregular N-sided polygon, an animal pattern shape, and the like, where N is a positive integer greater than or equal to 3.
Of course, the shape of the region of the first display region corresponding to the first type of drawing tool set may be the same as or different from the shape of the region of the first display region corresponding to the second type of drawing tool set.
Therefore, while the preview of the target drawing tool is displayed in the first display area, the display form of the first display area is also enriched, which helps make image editing more interesting for the user.
In one embodiment, the first display area is located at one side of the control area, such as at the left side, or the right side, or above, or below the control area.
Here, the first display region may also be moved by a user through a drag operation according to his or her own needs, thereby displaying a preview image of the target drawing tool in a viewing region to which the user is accustomed.
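As a purely illustrative Kotlin sketch of the "region position follows the touch position" behavior, the preview view's horizontal offset could be updated as the finger slides over the control area; the view types, icon resource, and centering logic are assumptions.

```kotlin
// Hypothetical sketch: keeping the first display area (the preview bubble)
// aligned with the current touch position as the finger slides over the
// control area. Identifiers and layout assumptions are illustrative.
import android.view.View
import android.widget.ImageView

fun updatePreview(previewArea: View, previewImage: ImageView,
                  touchX: Float, toolIconRes: Int) {
    previewImage.setImageResource(toolIconRes)        // preview of the target drawing tool
    previewArea.visibility = View.VISIBLE
    // Center the preview area horizontally over the touch position.
    previewArea.translationX = touchX - previewArea.width / 2f
}
```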
In the above scheme, the method further comprises:
acquiring a second operation in the control area; wherein the second operation is a continuation of the first operation and is an operation that ends the touch operation in the control area;
displaying the target drawing tool at a first location of the control region; wherein the first position comprises a touch position corresponding to the first operation.
For example, the first location is a certain area centered on the touch location.
In this way, after the touch operation is detected to have left the control area, the icon of the target drawing tool is displayed at the first position of the control area, so that the user can easily know the currently selected target drawing tool.
Fig. 5 (a) is a schematic diagram of displaying, in a first display area, a preview diagram of a target drawing tool when a first operation is performed, where in the diagram, a touch position corresponding to the first operation is located in a first sub-area, that is, the target drawing tool is one of a first class of drawing tools, and as can be seen from the diagram, the target drawing tool is a common color brush tool capable of drawing a certain color. With respect to fig. 5 (a), fig. 5 (b) is a schematic diagram of displaying the target drawing tool at the first position of the control area when the second operation is performed according to the embodiment of the present invention, and it can be seen from the diagram that when the second operation is received, the target drawing tool is displayed at the first position of the control area, and is matched with the preview image in the first display area in fig. 5 (a). In this way, the user can know the form or style of the currently selected target drawing tool, whether the user performs the first operation or after performing the second operation.
Fig. 5 (c) is another schematic diagram of displaying, in the first display area, a preview of a target drawing tool when the first operation is performed, where in the diagram, the touch position corresponding to the first operation is located in the second sub-area, that is, the target drawing tool is one of a second type of drawing tool set, and as can be seen from the diagram, the target drawing tool is a personality brush tool capable of drawing a solid heart shape. With respect to fig. 5 (c), fig. 5 (d) is another schematic diagram of displaying the target drawing tool at the first position of the control area when the second operation is performed according to the embodiment of the present invention, and it can be seen from the diagram that, when the second operation is received, the target drawing tool is displayed at the first position of the control area, and is matched with the preview image in the first display area in fig. 5 (c). In this way, the user can know the form or style of the currently selected target drawing tool, whether the user performs the first operation or after performing the second operation.
It should be noted that although the preview of the target drawing tool may already be displayed at the first position in the control area during the first operation, that preview may not be visible to the user while the operating body is still touching the control area; the preview displayed in the first display area, however, makes the form or style of the target drawing tool corresponding to the current touch clearly visible.
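A hedged Kotlin sketch of handling the second operation (the touch ending) is given below, assuming an Android touch event: on lift-off the floating preview is hidden and the selected tool icon is shown at the first position inside the control area. All identifiers are hypothetical.

```kotlin
// Hypothetical sketch: when the touch leaves the control area (the second
// operation), hide the floating preview and show the selected target drawing
// tool at the first position, i.e. the touch position inside the control area.
import android.view.MotionEvent
import android.view.View
import android.widget.ImageView

fun onControlAreaEvent(event: MotionEvent, previewArea: View,
                       selectedToolIcon: ImageView, toolIconRes: Int) {
    when (event.actionMasked) {
        MotionEvent.ACTION_MOVE -> previewArea.visibility = View.VISIBLE
        MotionEvent.ACTION_UP -> {                        // second operation: touch ends
            previewArea.visibility = View.GONE
            selectedToolIcon.setImageResource(toolIconRes)
            // Center the icon on the touch position (the first position).
            selectedToolIcon.translationX = event.x - selectedToolIcon.width / 2f
            selectedToolIcon.visibility = View.VISIBLE
        }
    }
}
```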
In the above scheme, the method further comprises:
detecting a third operation on the target drawing tool; wherein the third operation is an operation for determining to use the target drawing tool;
controlling the target drawing tool in the control region to be in an enabled state;
and when the target drawing tool is in an enabled state, allowing the target drawing tool to be utilized to edit the image to be processed.
Wherein the third operation includes, but is not limited to, a single click operation, a double click operation, and a slide operation.
In a specific embodiment, when the terminal detects a double-click operation of the target drawing tool displayed at the first position of the control area by the user, the terminal controls the target drawing tool in the control area to be in an enabled state.
For example, when the target drawing tool is enabled, a user uses the target drawing tool to scribble on an image.
The technical scheme can be applied to picture editing and video editing scenes. Here, the video includes, but is not limited to, a short video.
In the technical solution of this embodiment, the first class of drawing tool set and the second class of drawing tool set are integrated into the same control area. When a first operation in the control area is acquired, the touch position information of the first operation on the control area is determined; if the first operation is located in the first sub-region, a target drawing tool corresponding to the touch position information is determined from the first class of drawing tool set corresponding to the first sub-region; if the first operation is located in the second sub-region, a target drawing tool corresponding to the touch position information is determined from the second class of drawing tool set corresponding to the second sub-region; and the target drawing tool is displayed as a preview. Because the two classes of drawing tool sets are integrated into the same control area, drawing tools of different classes can be managed in a unified way, the user can select a target drawing tool from a single control area, and the efficiency of selecting the target drawing tool is improved. Displaying the target drawing tool as a preview shows, in a targeted way, the tool corresponding to the current touch position, so that the target drawing tool corresponding to the first operation is clear at a glance, which helps the user select it quickly and improves usage efficiency. Since a drawing tool from either the first class or the second class can be selected as the target drawing tool within the same control area, the fun of image editing such as graffiti is increased; and compared with placing the two classes of drawing tool sets on different control modules, no switching between control modules is needed, which makes quick operation possible.
Fig. 6 is a schematic block diagram of implementing the preview display processing according to an embodiment of the present invention. As shown in fig. 6, the solid-color strokes corresponding to the first class of drawing tool set and the personalized strokes corresponding to the second class of drawing tool set are designed as one integrated control, with the solid-color stroke style and the personalized stroke styles placed in the same layout file, so that the control area contains both. The solid-color stroke background is a single picture; each personalized stroke style corresponds to its own matching picture, these are placed after the solid-color strokes, and dynamic configuration is supported. The touch position is acquired through the touch event of the preview (View), and the selected stroke type can be calculated from the touch position. If the touch position falls on the solid-color strokes, the system interface is called to obtain the color value corresponding to the touch position, and a picture is generated from that color value and set on the first display area corresponding to the touch position and on the corresponding brush to produce the preview effect. If the touch falls on a personalized stroke, the corresponding resource is used directly to set the first display area corresponding to the touch position and the preview effect of the corresponding brush. Meanwhile, when the touch position changes, the position offsets of the first display area and the corresponding brush within the layout file are changed in real time.
In addition, the first display area and the preview image of the corresponding brush are arranged in the same layout layer. When the operating body slides over the control area, the backgrounds of the first display area and of the brush are set separately, and the offset of the overall layout is changed so that the preview effect stays consistent.
In an alternative embodiment, the following code may be referred to for determining whether the stroke is a personalized stroke or a solid-color stroke:
[The code listing is embedded in the original publication as an image (Figure BDA0001263394410000141) and is not reproduced here.]
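Since the original listing is available only as an image, the following Kotlin sketch is a hypothetical reconstruction of the logic described above, not the patent's actual code; the region layout, parameter names, and use of a color-bar bitmap are assumptions.

```kotlin
// Hypothetical reconstruction (not the patent's code): decide from the touch
// position whether a solid-color stroke or a personalized stroke is selected.
import android.graphics.Bitmap

sealed class StrokeSelection {
    data class SolidColor(val colorValue: Int) : StrokeSelection()
    data class Personalized(val resourceIndex: Int) : StrokeSelection()
}

fun selectStroke(
    touchX: Float,
    touchY: Float,
    solidRegionWidth: Float,        // width of the first (solid-color) sub-region
    colorBar: Bitmap,               // picture integrating all supported colors
    personalizedCount: Int,         // number of personalized stroke resources
    totalWidth: Float
): StrokeSelection {
    require(personalizedCount > 0) { "at least one personalized stroke resource is assumed" }
    return if (touchX < solidRegionWidth) {
        // Solid-color stroke: read the color value under the touch position.
        val x = touchX.toInt().coerceIn(0, colorBar.width - 1)
        val y = touchY.toInt().coerceIn(0, colorBar.height - 1)
        StrokeSelection.SolidColor(colorBar.getPixel(x, y))
    } else {
        // Personalized stroke: map the remaining width onto the resource list.
        val slice = (totalWidth - solidRegionWidth) / personalizedCount
        val index = ((touchX - solidRegionWidth) / slice).toInt()
            .coerceIn(0, personalizedCount - 1)
        StrokeSelection.Personalized(index)
    }
}
```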
It should be noted that the personalized brush is extensible rather than fixed. For example, in addition to its own characteristics, a personalized brush may also support color selection, so that its color value can be changed while the personalized brush is selected, with multiple selection modes supported, for example: first select the personalized brush, and the color bar can then still be used while the brush remains in the selected state. Illustratively, the personalized brush that draws mosaics can be extended so that the shape of the square mosaic being painted can be changed, for example into other geometric figures such as diamonds and triangles, adding a more playable mosaic painting mode. The control area may also support a dynamic picture mode, in which the graffiti produces a dynamic picture in Graphics Interchange Format (GIF) rather than only a change to a single graphic, for example by combining a previous graffiti picture and a next graffiti picture into a GIF animation, thereby increasing the variety of DIY options available to the user and improving user satisfaction.
For example, on a picture editing or short-video preview screen, the user clicks a preset editing function key and a bar-shaped control area pops up; a return/cancel key is also arranged on the control area, which can cancel the current operation and return to the state corresponding to the previous operation, as shown in fig. 7 (a). When a first operation by the user on the control area is detected (such as touching the first sub-region), the terminal pops up the preview state of the target drawing tool in the first display area (for example, on the left side of the control area): if a color in the first sub-region is selected, a color preview picture is displayed; if a personalized brush in the second sub-region is selected, the personalized brush preview picture is displayed. After the user finishes the selection, that is, when the terminal receives the second operation, the preview state of the target drawing tool is presented at the first position on the control area (again, the color preview image if a color in the first sub-region was selected, or the personalized brush preview if a brush in the second sub-region was selected). In this way, the preview state of the target drawing tool corresponding to the current touch position can be shown clearly to the user in each operation state. Fig. 7 (b) illustrates displaying a preview image of the target drawing tool in the first display area; here the touch position corresponding to the first operation is located in the second sub-region, that is, the target drawing tool is a personalized brush tool in the second class of drawing tool set, and as can be seen from the figure it is a personalized brush tool capable of drawing a mosaic. Relative to fig. 7 (b), fig. 7 (c) shows the target drawing tool displayed at the first position of the control region: when the second operation is received, the target drawing tool is displayed at the first position of the control region, matching the preview image in the first display area in fig. 7 (b). In this way, the preview state of the target drawing tool corresponding to the current touch position is shown clearly to the user in each operation state.
Fig. 8 is a schematic diagram of the composition structure of an editing processing apparatus according to an embodiment of the present invention. As shown in fig. 8, the apparatus includes a first acquisition unit 81, a first determination unit 82, a second determination unit 83, and a control unit 84, wherein:
the first obtaining unit 81 is configured to obtain a first operation in a control region, where the control region includes a first sub-region and a second sub-region, the first sub-region corresponds to a first class of drawing tool set, and the second sub-region corresponds to a second class of drawing tool set;
the first determining unit 82 is configured to determine touch position information of the first operation on the control area;
the second determining unit 83 is configured to determine, if the first operation is located in the first sub-region, a target drawing tool corresponding to the touch position information from a first class of drawing tool set corresponding to the first sub-region; if the first operation is located in a second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area;
the control unit 84 is configured to perform preview display on the target drawing tool.
Further, the apparatus further comprises:
a second acquisition unit 85, configured to acquire a second operation in the control area; wherein the second operation is a continuation of the first operation and is an operation that ends the touch operation in the control area;
the control unit 84, further configured to display the target rendering tool at a first position of the control region; wherein the first position comprises a touch position corresponding to the first operation.
Further, the apparatus further comprises:
a detection unit 86 for detecting a third operation on the target drawing tool; wherein the third operation is an operation for determining to use the target drawing tool;
the control unit 84 is further configured to control the target rendering tool in the control region to be in an enabled state; and when the target drawing tool is in an enabling state, allowing the target drawing tool to be utilized to edit the image to be processed.
As an optional implementation, the control unit 84 is specifically configured to:
acquiring a preset corresponding relation between the touch position information and the first display area;
determining a first display area corresponding to the touch position information based on the preset corresponding relation;
and displaying a preview of the target drawing tool in the first display area.
As an optional implementation manner, the second determining unit 83 is specifically configured to:
calling a system interface to acquire a color value corresponding to the touch position information;
generating a picture with a preset size and shape by using the color values and setting the picture on a first class of drawing tools;
taking a first class of drawing tools provided with the pictures as target drawing tools;
the first class drawing tool set comprises first class drawing tools corresponding to different colors.
In this way, the second determining unit 83 may determine the target drawing tool corresponding to the touch position information from the first drawing tool set corresponding to the first sub-region.
As an optional implementation manner, the second determining unit 83 is specifically configured to:
obtaining drawing tool resources corresponding to the touch position information;
determining a target drawing tool based on the drawing tool resources;
and the second drawing tool set comprises second drawing tools with different strokes.
Further, the second determining unit 83 is further configured to:
if a fourth operation is received, calling a property bar corresponding to the drawing tool resource based on the fourth operation; the attribute bar comprises editing attributes supported by the current drawing tool resource;
acquiring selection information selected from the attribute bar;
and determining a target drawing tool adapted to the touch position information by combining the selection information and the drawing tool resource.
In this way, the second determining unit 83 may determine the target drawing tool corresponding to the touch position information from the second type of drawing tool set corresponding to the second sub-region.
In the above scheme, the shape of the control area is a strip or a ring, the strip or ring control area is composed of at least a first sub-area and a second sub-area, and the first sub-area displays a color schematic diagram supported by the first class of drawing tool set; and the drawing tool schematic diagram supported by the second drawing tool set is displayed in the second sub-area.
It should be understood by those skilled in the art that the functions of the units in the editing processing apparatus of the present embodiment can be understood by referring to the related description of the editing processing method.
In practical applications, the specific structures of the first acquisition unit 81, the first determination unit 82, the second determination unit 83, the control unit 84, the second acquisition unit 85, and the detection unit 86 may all correspond to a processor. The processor may specifically be a Central Processing Unit (CPU), a Micro Controller Unit (MCU), a Digital Signal Processor (DSP), a Programmable Logic Controller (PLC), or another electronic component or collection of electronic components with processing capability. The processor runs executable code stored in a storage medium; the processor can be connected to the storage medium through a communication interface such as a bus, and when performing the function of a specific unit it reads the executable code from the storage medium and executes it. The storage medium used to store this executable code is preferably a non-transitory storage medium.
The editing processing apparatus of this embodiment may be disposed at the terminal side.
The editing processing device is particularly suitable for image editing or video editing application scenes.
Fig. 9 is a schematic diagram illustrating an alternative hardware structure for implementing the editing processing apparatus, which includes a processor 11, an input/output interface 13 (e.g., a display screen, a touch screen, and a speaker), a storage medium 14, and a network interface 12, and the components may be connected to communicate via a system bus 15. Accordingly, the storage medium 14 of the editing processing apparatus stores therein executable instructions for executing the editing processing method provided by the embodiment of the present invention.
The storage medium 14 may be various media capable of storing program codes, such as a removable storage device, a Random Access Memory (RAM), a Read-Only Memory (ROM), a magnetic disk, or an optical disk. Preferably, the storage medium 14 may be a non-volatile storage medium.
The processor 11 may be a CPU, an MCU, a DSP, a PLC, or a processing circuit, such as an Application Specific Integrated Circuit (ASIC).
Specifically, the processor 11 reads and executes the executable instructions of the editing processing method from the storage medium 14 through the system bus 15, and may execute the following steps:
acquiring a first operation in a control area, wherein the control area comprises a first sub-area and a second sub-area, the first sub-area corresponds to a first class of drawing tool set, and the second sub-area corresponds to a second class of drawing tool set;
determining touch position information of the first operation on the control area;
if the first operation is located in a first sub-region, determining a target drawing tool corresponding to the touch position information from a first class drawing tool set corresponding to the first sub-region; if the first operation is located in a second sub-area, determining a target drawing tool corresponding to the touch position information from a second drawing tool set corresponding to the second sub-area;
and previewing and displaying the target drawing tool.
According to the editing processing device, the first type of drawing tool and the second type of drawing tool are designed in an integrated mode, the efficiency of selecting the target drawing tool can be improved, and the use experience of a user is improved.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only one logical function division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; they can be located in one place or distributed over a plurality of network units; and some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present invention may all be integrated into one processing unit, each unit may serve as a separate unit, or two or more units may be integrated into one unit; the integrated unit may be implemented in the form of hardware, or in the form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that all or part of the steps of the above method embodiments may be implemented by hardware related to program instructions. The program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The aforementioned storage medium includes a removable storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or various other media that can store program code.
Alternatively, if the integrated unit of the present invention is implemented in the form of a software functional module and sold or used as an independent product, it may be stored in a computer-readable storage medium. Based on such an understanding, the technical solutions of the embodiments of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. The aforementioned storage medium includes a removable storage device, a ROM, a RAM, a magnetic disk, an optical disk, or various other media that can store program code.
The above description covers only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (15)

1. An editing processing method, characterized in that the method comprises:
acquiring a first operation in a control region, wherein the control region at least comprises a first sub-region and a second sub-region, the first sub-region corresponds to a first class of drawing tool set, and the second sub-region corresponds to a second class of drawing tool set;
determining touch position information of the first operation on the control region;
if the first operation is located in the first sub-region, determining a target drawing tool corresponding to the touch position information from the first class of drawing tool set corresponding to the first sub-region; if the first operation is located in the second sub-region, determining a target drawing tool corresponding to the touch position information from the second class of drawing tool set corresponding to the second sub-region;
displaying a preview of the target drawing tool in a first display area;
acquiring a second operation in the control region, wherein the second operation is a continuation of the first operation, and the second operation is an operation for ending a touch operation in the control region; and
displaying the target drawing tool at a first position of the control region, wherein the first position comprises a touch position corresponding to the first operation, and a position of the first display area is different from the first position.
2. The method of claim 1, further comprising:
detecting a third operation on the target drawing tool, wherein the third operation is an operation for confirming use of the target drawing tool;
controlling the target drawing tool in the control region to be in an enabled state;
and when the target drawing tool is in the enabled state, allowing the target drawing tool to be used to edit an image to be processed.
3. The method of claim 1, wherein the displaying a preview of the target drawing tool comprises:
acquiring a preset correspondence between the touch position information and the first display area;
determining the first display area corresponding to the touch position information based on the preset correspondence;
and displaying a preview of the target drawing tool in the first display area.
4. The method of claim 3, wherein the preset correspondence comprises:
a region position of the first display area changes following a change in the touch position information; and/or,
a region shape of the first display area changes following a change in the touch position information and/or the target drawing tool.
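For illustration only and not as part of the claims, the correspondence described in claims 3 and 4 can be sketched as a small Kotlin function that derives the first display area from the touch position: the preview rectangle follows the finger horizontally, is clamped to the screen, and is offset upward so that the finger never covers it. The names, the preview size, and the offsets are assumptions made for this sketch.

// The first display area (preview) is derived from the touch position.
data class PreviewArea(val left: Int, val top: Int, val right: Int, val bottom: Int)

fun previewAreaFor(
    touchX: Int,
    touchY: Int,
    screenWidth: Int,
    previewSize: Int = 120,  // could also vary with the target drawing tool (claim 4)
    fingerOffset: Int = 160, // vertical gap between the touch point and the preview
): PreviewArea {
    val half = previewSize / 2
    val centerX = touchX.coerceIn(half, screenWidth - half)          // clamp horizontally
    val top = (touchY - fingerOffset - previewSize).coerceAtLeast(0) // keep above the finger
    return PreviewArea(centerX - half, top, centerX + half, top + previewSize)
}

fun main() {
    // The region position of the first display area changes as the touch position changes.
    println(previewAreaFor(touchX = 40, touchY = 900, screenWidth = 1080))
    println(previewAreaFor(touchX = 600, touchY = 900, screenWidth = 1080))
}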
5. The method according to claim 1, wherein the determining a target drawing tool corresponding to the touch position information from the first class of drawing tool set corresponding to the first sub-region comprises:
calling a system interface to acquire a color value corresponding to the touch position information;
generating a picture with a preset size and shape by using the color value, and setting the picture on a first class drawing tool;
taking the first class drawing tool provided with the picture as the target drawing tool;
the first class of drawing tool set comprises first class drawing tools corresponding to different colors.
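For illustration only and not as part of the claims, the swatch generation of claim 5 can be sketched in Kotlin. A real system interface for reading the color under the touch position is platform specific; the sketch replaces it with a plain pixel array, and all names and sizes are assumptions.

// A pixel buffer stands in for the platform bitmap / system interface in this sketch.
class PixelBuffer(val width: Int, val height: Int, val pixels: IntArray)

// Reads the color value under the touch position (the "system interface" call of claim 5).
fun colorAt(buffer: PixelBuffer, x: Int, y: Int): Int = buffer.pixels[y * buffer.width + x]

// Generates a picture of a preset size and shape (a filled circle) from the color value.
fun makeSwatch(argb: Int, size: Int = 48): PixelBuffer {
    val pixels = IntArray(size * size)
    val r = size / 2
    for (y in 0 until size) {
        for (x in 0 until size) {
            val dx = x - r
            val dy = y - r
            pixels[y * size + x] = if (dx * dx + dy * dy <= r * r) argb else 0x00000000
        }
    }
    return PixelBuffer(size, size, pixels)
}

fun main() {
    // A 4x1 "control region" whose pixels are four solid colors.
    val controlRegion = PixelBuffer(
        4, 1,
        intArrayOf(0xFFFF0000.toInt(), 0xFF00FF00.toInt(), 0xFF0000FF.toInt(), 0xFFFFFF00.toInt())
    )
    val picked = colorAt(controlRegion, x = 2, y = 0) // color under the touch position
    val swatch = makeSwatch(picked)                   // picture set on the target pen
    println("picked=%08X, swatch=${swatch.width}x${swatch.height}".format(picked))
}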
6. The method according to claim 1, wherein the determining a target drawing tool corresponding to the touch position information from the second class of drawing tool set corresponding to the second sub-region comprises:
acquiring drawing tool resources corresponding to the touch position information;
determining the target drawing tool based on the drawing tool resources;
and the second class of drawing tool set comprises second class drawing tools with different strokes.
7. The method of claim 6, wherein determining the target drawing tool based on the drawing tool resources comprises:
if a fourth operation is received, calling a property bar corresponding to the drawing tool resource based on the fourth operation, wherein the property bar comprises editing properties supported by the current drawing tool resource;
acquiring selection information selected from the property bar;
and determining, by combining the selection information and the drawing tool resource, the target drawing tool adapted to the touch position information.
8. The method according to any one of claims 1 to 7, wherein the control region is strip-shaped or ring-shaped, the strip-shaped or ring-shaped control region is composed of at least two parts, namely the first sub-region and the second sub-region, the first sub-region displays a color schematic supported by the first class of drawing tool set, and the second sub-region displays drawing tool schematics supported by the second class of drawing tool set.
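For illustration only and not as part of the claims, hit-testing a ring-shaped control region such as the one described in claim 8 can be sketched in Kotlin: the touch point is converted to a distance and an angle relative to the ring center, the distance decides whether the ring was hit at all, and the angle decides whether the color sub-region or the drawing-tool sub-region was touched. The arc split and all names are assumptions made for this sketch.

import kotlin.math.atan2
import kotlin.math.hypot

enum class SubRegion { COLOR, TOOL, OUTSIDE }

// A ring-shaped control region split into two arcs: one for the color (first class)
// sub-region and one for the drawing-tool (second class) sub-region.
class RingControlRegion(
    private val centerX: Float,
    private val centerY: Float,
    private val innerRadius: Float,
    private val outerRadius: Float,
    private val colorArcDegrees: Float = 240f, // assumed split between the two sub-regions
) {
    // Returns which sub-region was touched and the normalized position within that arc.
    fun hitTest(x: Float, y: Float): Pair<SubRegion, Float> {
        val dx = x - centerX
        val dy = y - centerY
        val dist = hypot(dx, dy)
        if (dist < innerRadius || dist > outerRadius) return SubRegion.OUTSIDE to 0f
        // Angle in [0, 360) degrees around the ring center.
        val angle = (Math.toDegrees(atan2(dy, dx).toDouble()).toFloat() + 360f) % 360f
        return if (angle < colorArcDegrees) {
            SubRegion.COLOR to angle / colorArcDegrees
        } else {
            SubRegion.TOOL to (angle - colorArcDegrees) / (360f - colorArcDegrees)
        }
    }
}

fun main() {
    val ring = RingControlRegion(centerX = 0f, centerY = 0f, innerRadius = 80f, outerRadius = 120f)
    println(ring.hitTest(100f, 10f)) // inside the ring, small angle: color sub-region
    println(ring.hitTest(30f, -95f)) // inside the ring, large angle: drawing-tool sub-region
}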
9. An editing processing apparatus, characterized in that the apparatus comprises:
a first obtaining unit, configured to obtain a first operation in a control region, where the control region at least includes a first sub-region and a second sub-region, the first sub-region corresponds to a first class of drawing tool set, and the second sub-region corresponds to a second class of drawing tool set;
a first determining unit, configured to determine touch position information of the first operation on the control region;
a second determining unit, configured to: if the first operation is located in the first sub-region, determine a target drawing tool corresponding to the touch position information from the first class of drawing tool set corresponding to the first sub-region; and if the first operation is located in the second sub-region, determine a target drawing tool corresponding to the touch position information from the second class of drawing tool set corresponding to the second sub-region;
a control unit, configured to display a preview of the target drawing tool in a first display area; and
a second acquisition unit, configured to acquire a second operation in the control region, wherein the second operation is a continuation of the first operation, and the second operation is an operation for ending a touch operation in the control region;
wherein the control unit is further configured to display the target drawing tool at a first position of the control region, the first position comprises a touch position corresponding to the first operation, and a position of the first display area is different from the first position.
10. The apparatus of claim 9, further comprising:
a detection unit configured to detect a third operation on the target drawing tool; wherein the third operation is an operation for determining to use the target drawing tool;
the control unit is further configured to control the target drawing tool in the control region to be in an enabled state, and, when the target drawing tool is in the enabled state, to allow the target drawing tool to be used to edit an image to be processed.
11. The apparatus of claim 9, wherein the control unit is further configured to: acquire a preset correspondence between the touch position information and the first display area;
determine the first display area corresponding to the touch position information based on the preset correspondence;
and display a preview of the target drawing tool in the first display area.
12. The apparatus of claim 9, wherein the second determining unit is further configured to:
acquire drawing tool resources corresponding to the touch position information;
determine the target drawing tool based on the drawing tool resources;
and the second class of drawing tool set comprises second class drawing tools with different strokes.
13. The apparatus of claim 12, wherein the second determining unit is further configured to:
if a fourth operation is received, call a property bar corresponding to the drawing tool resource based on the fourth operation, wherein the property bar comprises editing properties supported by the current drawing tool resource;
acquire selection information selected from the property bar;
and determine, by combining the selection information and the drawing tool resource, the target drawing tool adapted to the touch position information.
14. A computer device, characterized in that the computer device comprises:
a memory for storing executable instructions;
a processor for implementing the editing processing method of any one of claims 1 to 8 when executing the executable instructions or computer program stored in the memory.
15. A computer-readable storage medium having computer-executable instructions stored thereon, wherein the computer-executable instructions, when executed by a processor, implement the edit processing method of any one of claims 1 to 8.
CN201710219879.2A 2017-04-06 2017-04-06 Editing processing method and device Active CN107123152B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710219879.2A CN107123152B (en) 2017-04-06 2017-04-06 Editing processing method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710219879.2A CN107123152B (en) 2017-04-06 2017-04-06 Editing processing method and device

Publications (2)

Publication Number Publication Date
CN107123152A CN107123152A (en) 2017-09-01
CN107123152B (en) 2023-01-06

Family

ID=59726240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710219879.2A Active CN107123152B (en) 2017-04-06 2017-04-06 Editing processing method and device

Country Status (1)

Country Link
CN (1) CN107123152B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109102556A (en) * 2018-07-23 2018-12-28 上海掌门科技有限公司 The configuration method of edit tool and the generation method of configuration parameter
CN109445623A (en) * 2018-09-10 2019-03-08 广东智媒云图科技股份有限公司 A kind of system and method for drawing a picture based on touch screen and mechanical arm
CN109448078B (en) * 2018-10-19 2022-11-04 珠海金山数字网络科技有限公司 Image editing system, method and equipment
CN109739597B (en) * 2018-12-21 2022-05-27 上海商汤智能科技有限公司 Image tool acquisition method and device, image equipment and storage medium


Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7503493B2 (en) * 1999-10-25 2009-03-17 Silverbrook Research Pty Ltd Method and system for digitizing freehand graphics with user-selected properties
US20150058753A1 (en) * 2013-08-22 2015-02-26 Citrix Systems, Inc. Sharing electronic drawings in collaborative environments
KR20150071971A (en) * 2013-12-19 2015-06-29 삼성전자주식회사 Electronic Device And Method For Providing Graphical User Interface Of The Same
US20190347865A1 (en) * 2014-09-18 2019-11-14 Google Inc. Three-dimensional drawing inside virtual reality environment
US9857888B2 (en) * 2015-03-17 2018-01-02 Behr Process Corporation Paint your place application for optimizing digital painting of an image
US20160349979A1 (en) * 2015-05-28 2016-12-01 Adobe Systems Incorporated Multiple Brush Strokes Preview

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101867744A (en) * 2009-04-20 2010-10-20 Tcl集团股份有限公司 TV set having electronic drawing function and realizing method thereof
CN104850408A (en) * 2015-05-28 2015-08-19 深圳市陨石通信设备有限公司 Method and device for drawing pictures on smartwatch

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Projector-guided painting; Matthew Flagg et al.; Proceedings of the 19th Annual ACM Symposium on User Interface Software and Technology; 2006-10-31; pp. 235-244 *
Practical Word 97 Tips: Clever Use of the Extended Functions of the Drawing Tool (Part 1); Guo Jianhou; Knowledge is Power (《知识就是力量》); 1998-10-15 (No. 10); pp. 26-27 *

Also Published As

Publication number Publication date
CN107123152A (en) 2017-09-01

Similar Documents

Publication Publication Date Title
US11412292B2 (en) Video processing method, video processing device, and storage medium
JP6868659B2 (en) Image display method and electronic device
CN107123152B (en) Editing processing method and device
CN105204745A (en) Screen capture method and device for mobile terminal
JP2020021516A (en) Apparatus and method for supplying content recognition photo filters
CN107678644B (en) Image processing method and mobile terminal
CN107613203B (en) Image processing method and mobile terminal
CN116243801A (en) Apparatus, method and graphical user interface for manipulating user interface objects with visual and/or tactile feedback
JP2016529635A (en) Gaze control interface method and system
US20100097339A1 (en) Image processing apparatus, image processing method, and program
CN109448050B (en) Method for determining position of target point and terminal
US10095940B2 (en) Image processing apparatus, image processing method and non-transitory computer readable medium
JP6598984B2 (en) Object selection system and object selection method
JP2003050653A (en) Method for generating input event and information terminal equipment with the same method
CN110174984B (en) Information processing method and electronic equipment
US10891768B2 (en) Annotating an image with a texture fill
CN112596643A (en) Application icon management method and device
JP6889686B2 (en) Information processing equipment, information processing system, information processing program and information processing method
US9965147B2 (en) Display method and electronic device
CN111242712A (en) Commodity display method and device
US10185457B2 (en) Information processing apparatus and a method for controlling the information processing apparatus
CN113986080A (en) Multimedia file editing method and device and electronic equipment
CN113592983A (en) Image processing method and device and computer readable storage medium
CN110568972A (en) Method and device for presenting shortcut
CN104571844B (en) A kind of information processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant