CN109634504B - Selectively enabling touchpad functionality in a graphical interface


Info

Publication number
CN109634504B
CN109634504B
Authority
CN
China
Prior art keywords: input, typeface, design, character, graphical interface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810846942.XA
Other languages
Chinese (zh)
Other versions
CN109634504A (en)
Inventor
T·T·唐纳修
R·西恩
A·M·杨
G·尼古拉斯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Adobe Inc
Original Assignee
Adobe Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 15/726,973 (US11488053B2)
Priority claimed from US 15/726,832 (US10339680B2)
Priority claimed from US 15/726,909 (US10983679B2)
Application filed by Adobe Systems Inc filed Critical Adobe Systems Inc
Publication of CN109634504A
Application granted
Publication of CN109634504B
Status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842 Selection of displayed objects or displayed text elements
    • G06F3/04845 Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/0486 Drag-and-drop
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text

Abstract

A content manipulation application provides a graphical interface for editing graphical content. The graphical interface includes first and second control elements for performing first and second manipulations of the graphical content. If the first control element is selected, the content manipulation application switches the graphical interface to a touch pad mode. The touch pad mode disables the second control element and thereby prevents the second control element from performing the second manipulation. While the graphical interface is in the touch pad mode, the content manipulation application receives an input in an input area lacking the first control element and performs the first manipulation of the graphical content in response to receiving the input. After the first manipulation is performed, the graphical interface is switched away from the touch pad mode, thereby re-enabling the second control element to perform the second manipulation.

Description

Selectively enabling touchpad functionality in a graphical interface
Technical Field
The present disclosure relates generally to functionality for implementing graphical interfaces for data provision, commands and other interactions with content management applications. More particularly, but not by way of limitation, the present disclosure relates to selectively enabling touch pad functionality in a graphical interface.
Background
Typeface design involves creating a set of letters or other characters with unique aesthetic qualities. It involves shaping the individual glyphs (i.e., curves) of different characters and eventually merging the multiple curves into a pleasing font or other typeface. Further, typography, which involves arranging letters so that a set of characters is readable and attractive when presented, involves a variety of factors (e.g., point size, line length, line spacing, letter spacing, etc.) that may be affected by typeface design choices.
To support typeface design, certain design applications are used to modify the appearance of various characters of various typefaces. For example, a typeface design application displays a character, such as a "b" character, in a graphical interface. The typeface design application modifies one or more visual properties of the character design in response to input received via the graphical interface. Examples of such properties include the width of a component shape of the character (e.g., the width of the stem in a "b" character), the curvature of a component shape of the character (e.g., the curvature of the bowl in a "b" character), and so on. Modifying these visual properties results in a new design of the character for a typeface.
Existing typeface design applications impose certain restrictions on design modifications. For example, some design applications are limited to modifying typefaces with straight lines, and thus have a limited ability to develop unique fonts with glyphs of pronounced curvature. Other design applications are used to modify a typeface design by moving the outline of the typeface, where the outline is a set of lines defining the perimeter of a particular character. Modifying the outline of a character can produce unexpected errors. For example, if changes are applied non-uniformly (e.g., because one side of the character is tilted with respect to the other), moving one or more points along the outline in isolation may result in an unrecognizable character design. The user must therefore manually make the same changes to the corresponding portions of the outline (e.g., the top and bottom of a stem or other component shape), which increases the time required for typeface design and reduces the accuracy with which changes are applied to different portions of the character.
Furthermore, some typeface design applications lack constraints on the typeface design modifications that a user can make. For example, the angle of the stem in a "b" character may be changed from vertical to a more oblique direction. At some point, the slant of the stem may change so drastically that the average reader can no longer recognize the character. A typeface designer may not have the experience to recognize that such design variations are aesthetically unattractive.
For these and other reasons, existing typeface design applications have drawbacks.
Content management applications are used to enhance or otherwise modify text, images, and other graphical content. Examples of content management applications include graphic editing applications, text editors, image editors, and the like. Content management applications are typically used to select certain graphical content and perform one or more manipulations of the graphical content. For example, the content management application receives, via the graphical interface, an input selecting a particular shape and one or more additional inputs changing the location or size of the shape.
However, the size of the input received via the graphical interface can negatively affect the operation of the content management application. In one example, if a graphical interface is presented via a touch screen, certain shapes or other graphical content may be inadvertently selected or manipulated. For example, the content management application may be a typeface design application for changing the manner in which fonts are displayed. A specific character from a font includes a collection of shapes (e.g., a bowl and a stem) manipulated by the typeface design application. However, if the bowl and the stem occupy an input area on the graphical interface that is smaller than the user's finger, a touch input intended only for the stem may inadvertently select or manipulate the bowl (or vice versa).
Some existing solutions to this problem have drawbacks. For example, a user may zoom in on a particular portion of the graphical interface that includes the target graphical content (e.g., via a zoom command) and then provide touch input to the zoomed-in portion of the graphical interface. But doing so may prevent the user from viewing the context of the resulting graphical manipulation. In the typeface design example provided above, while a user may zoom in on the portion of the graphical interface displaying the character's stem and then manipulate the stem in that zoomed-in view, doing so may prevent the user from assessing how the manipulation of the stem affects the overall appearance of the character (e.g., because other portions of the character are no longer displayed while the stem is zoomed in). For these and other reasons, existing graphical interfaces for content management applications have drawbacks.
Disclosure of Invention
Some embodiments relate to selectively enabling a touch pad function in a graphical interface. For example, a content manipulation application provides a graphical interface for editing graphical content. The graphical interface includes first and second control elements for performing first and second manipulations of the graphical content. If the first control element is selected, the content manipulation application switches the graphical interface to a touch pad mode. The touch pad mode disables the second control element and thereby prevents the second control element from performing the second manipulation. While the graphical interface is in the touch pad mode, the content manipulation application receives an input in an input area lacking the first control element and performs the first manipulation of the graphical content in response to receiving the input. After the first manipulation is performed, the graphical interface is switched away from the touch pad mode, thereby re-enabling the second control element to perform the second manipulation.
These illustrative embodiments are mentioned not to limit or define the disclosure, but to provide examples that aid understanding of it. Additional embodiments are discussed in the Detailed Description, and further description is provided there.
Drawings
The features, embodiments and advantages of the present disclosure will be better understood when the following detailed description is read with reference to the accompanying drawings.
FIG. 1 depicts an example of an operating environment for implementing a typeface development platform, in accordance with certain embodiments of the present disclosure.
Fig. 2 depicts an example of a process for generating graphics control data for performing skeleton-based modifications of typeface designs, in accordance with certain embodiments of the present disclosure.
FIG. 3 depicts an example of a development interface used by a typeface processing application to calculate control point parameters of characters for the various design parameters available in a typeface design application, in accordance with certain embodiments of the present disclosure.
FIG. 4 depicts an example of control point location data that has been modified via the development interface of FIG. 3, in accordance with certain embodiments of the present disclosure.
Fig. 5 depicts an example in which different sets of control point values are calculated via the process depicted in fig. 2 using different scales and different ranges of thickness parameter values, in accordance with certain embodiments of the present disclosure.
FIG. 6 depicts an example of previewing how graphics control data generated via the process in FIG. 2 behaves when used to generate a character design for given design parameters, in accordance with certain embodiments of the present disclosure.
FIG. 7 depicts an example of a design interface from a typeface design application that uses a graphical control data set output by the process of FIG. 2 to modify typefaces, in accordance with certain embodiments of the present disclosure.
FIG. 8 depicts an example of character skeletons of characters that may be mapped to each other to perform linked design modifications in accordance with certain embodiments of the present disclosure.
FIG. 9 depicts an example of a mapping linking constituent shapes of the characters in FIG. 8 based on a character skeleton of the characters in accordance with certain embodiments of the present disclosure.
Fig. 10 depicts an example of linked constituent shapes, in which modifications to a constituent shape of a first character are applied to a second character, in accordance with certain embodiments of the present disclosure.
Fig. 11 depicts another example of linked constituent shapes, in which modifications to a constituent shape of a first character are applied to a second character, in accordance with certain embodiments of the present disclosure.
FIG. 12 depicts an example of a process for automatically controlling modification of typeface designs using a machine learning model, in accordance with certain embodiments of the present disclosure.
FIG. 13 depicts an example of a design interface for modifying the design of an input character according to the process depicted in FIG. 12, in accordance with certain embodiments of the present disclosure.
Fig. 14 depicts an example of modifying the input of the input character depicted in fig. 13, in accordance with certain embodiments of the present disclosure.
Fig. 15 depicts an example in which design modifications to the input characters depicted in fig. 13 are rejected by the process depicted in fig. 12, in accordance with certain embodiments of the disclosure.
Fig. 16 depicts an example in which design modifications to the input characters depicted in fig. 13 result in an alert being output by the process depicted in fig. 12, in accordance with certain embodiments of the present disclosure.
Fig. 17 depicts an example of iteratively training a machine learning model used by the process depicted in fig. 12, in accordance with certain embodiments of the present disclosure.
Fig. 18 depicts an example of feature vector space used by the machine learning model of fig. 17, in accordance with certain embodiments of the present disclosure.
Fig. 19 depicts an example of a process for selectively enabling a touch pad function in a graphical interface in accordance with certain embodiments of the present disclosure.
FIG. 20 depicts an example of a graphical interface for modifying a typeface design using a touch pad function from the process of FIG. 19, in accordance with certain embodiments of the present disclosure.
Fig. 21 depicts an example in which the graphical interface of fig. 20 is switched to a touch pad mode according to the process of fig. 19, in accordance with certain embodiments of the present disclosure.
FIG. 22 depicts an example of modifying typeface designs using the touch pad functionality enabled in FIG. 21, in accordance with certain embodiments of the present disclosure.
Fig. 23 depicts a graphical interface of an email client using the touch pad functionality from the process of fig. 19, in accordance with certain embodiments of the present disclosure.
Fig. 24 depicts an example in which the graphical interface of fig. 23 is switched to a touch pad mode according to the process of fig. 19, in accordance with certain embodiments of the present disclosure.
Fig. 25 depicts an example of manipulating email content using the touchpad functionality enabled in fig. 24, in accordance with certain embodiments of the present disclosure.
FIG. 26 depicts a graphical interface of a text editor using a touch pad function from the process of FIG. 19, in accordance with certain embodiments of the present disclosure.
Fig. 27 depicts an example in which the graphical interface of fig. 26 is switched to a touch pad mode according to the process of fig. 19, in accordance with certain embodiments of the present disclosure.
Fig. 28 depicts an example of manipulating text content using the touch pad functionality enabled in fig. 27, in accordance with certain embodiments of the present disclosure.
FIG. 29 depicts an example of a computing system for implementing one or more embodiments described herein.
Detailed Description
Some embodiments relate to selectively enabling a touch pad function in a graphical interface. For example, the content management application temporarily switches a portion of the graphical interface to a "touch pad" mode that disables certain control elements in the graphical interface and causes input received via the graphical interface to be applied only to the control elements that remain enabled.
The following non-limiting example is intended to introduce certain embodiments of a content management application that implements a touch pad function in a graphical interface. In this example, the content management application provides a graphical interface for editing graphical content. The graphical interface includes various control elements for manipulating different portions of the graphical content. For example, if the content management application is a typeface design application, the graphical interface displays the particular character being edited. The character includes graphical content, such as the various curves defining the character, and the graphical interface presents various control elements, such as control points positioned along the various curves. Moving a control point may change the length, curvature, or other attributes of a curve. To support manipulation according to the user's intent, the content management application switches the graphical interface to a touch pad mode. To this end, the content management application responds to a touch pad enablement input, such as a selection of one control element, by disabling certain other control elements and thereby preventing those control elements from performing manipulations of the graphical content. For example, the typeface design application responds to selection of a first control point (such as a control point along the stem of the character) by disabling a second control point (such as a control point along the bowl of the character) while the graphical interface is in the touch pad mode.
The touch pad mode allows input received in any touch-pad-enabled area of the graphical interface to be applied to the active control element, thereby performing a manipulation of the graphical content in accordance with the input. For example, a drag input received via the graphical interface is applied to the first control point of the character even if the drag input occurs in an area of the graphical interface that includes the second control point. The drag input causes the first control point to be moved, thereby changing the design of the character. The content management application may then switch the graphical interface away from the touch pad mode, thereby re-enabling one or more control elements that were disabled in the touch pad mode.
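For illustration (not part of the original disclosure), the following Python sketch shows one way such a mode switch could be structured; all class and method names are hypothetical:

```python
# Illustrative sketch only; class and method names are hypothetical.

class ControlPoint:
    """A control element positioned along a character's curves."""
    def __init__(self, name, x, y):
        self.name, self.x, self.y = name, x, y
        self.enabled = True

class GraphicalInterface:
    def __init__(self, control_points):
        self.control_points = control_points
        self.active = None  # the control element selected for touch pad mode

    def select(self, point):
        """Touch pad enablement input: activate one element, disable the rest."""
        self.active = point
        for cp in self.control_points:
            cp.enabled = cp is point

    def on_drag(self, dx, dy):
        """In touch pad mode, drag input received anywhere in the touch pad
        area is applied to the active element, even if the drag occurs over
        a disabled element."""
        if self.active is not None:
            self.active.x += dx
            self.active.y += dy

    def deselect(self):
        """Switch away from touch pad mode and re-enable all elements."""
        self.active = None
        for cp in self.control_points:
            cp.enabled = True

# Usage: select the stem control point, then drag anywhere on the interface.
stem = ControlPoint("stem", 10.0, 40.0)
bowl = ControlPoint("bowl", 30.0, 20.0)
ui = GraphicalInterface([stem, bowl])
ui.select(stem)
ui.on_drag(0.0, -5.0)  # moves the stem even if the finger is over the bowl
ui.deselect()
```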
In some embodiments, the touch pad functionality described above improves the operation of graphical interfaces displayed on certain computing devices, such as tablet computers and smart phones. For example, the touch pad functionality can enhance the operation of a touch-based interface in which control elements (e.g., control points, buttons, selectable text, etc.) are smaller than the area occupied by a touch input (e.g., the size of a fingertip). If a particular control element in a given region of the graphical interface is selected, touch input may be received in any other touch-pad-enabled region of the graphical interface. A particular control element can thus be manipulated without positioning the touch input on the control element itself, allowing the user to see the result of the manipulation while the touch input is being received.
The disclosed aspects relate generally to digital data processing systems or methods for creating, manipulating, and otherwise processing graphics data that cause the graphics data to be displayed or modified in a specified manner. More particularly, the disclosed aspects relate to generating graphics control data for performing skeleton-based modifications of typeface designs.
Some embodiments relate to generating graphics control data for performing skeleton-based modifications of typeface designs. For example, a character skeleton, which is a set of one or more centerlines of the constituent shapes of a character, may be modified by a designer using a design application. Modifying a character skeleton involves changing the shape defined by the skeleton, the width (i.e., outline) of the shape surrounding the skeleton, etc., by modifying different design parameters (e.g., curvature, inclination, etc.). The described embodiments enable this skeleton-based modification via a design application by creating graphics control data that indicates a range of skeleton behaviors and corresponding outline behaviors for a range of design parameter values.
The following non-limiting example is intended to introduce certain embodiments in which a typeface processing application uses a character graphic (such as a graphic of a "T" character) to generate graphics control data for use by a design application. The character graphic includes a character skeleton and a character outline. The character skeleton includes a set of curves forming the centerlines of the constituent shapes of the character. The character outline surrounds the character skeleton and has a shape controlled by the character skeleton. For example, in a "T" character, the character skeleton includes a single horizontally oriented curve intersecting a single vertically oriented curve, and the outline includes a horizontal rectangle of variable width around the horizontally oriented curve and a vertical rectangle of variable width around the vertically oriented curve.
Design applications include various design parameters that can be modified by a designer to change the appearance of a character (e.g., the width of the character, the contrast between the widths of the character's horizontal and vertical portions, the inclination of the character, etc.). Graphics control data provided from the typeface processing application to the design application indicates how the character skeleton should move in response to various changes in the design parameters. In a simplified example involving a tilt design parameter, the graphics control data indicates different angles of the vertically oriented curve of the "T" character, where each angle corresponds to a different tilt value.
Continuing with this example, the typeface processing application generates the graphics control data based on an input character graphic of a typeface character (e.g., the letter "T" of a font). The input character graphic includes a character skeleton, which has a set of control points and curves defined by those control points, and a character outline, which has a shape controlled at least in part by the positions of the skeleton's control points. In a simplified example involving one of the control points and one of the available design parameters, the typeface processing application identifies at least two locations of the control point corresponding to at least two design parameter values, respectively. For example, one or more of these control point locations may be specified via user input to the typeface processing application (e.g., by dragging the control point to a new location, entering different coordinates for the control point, etc.). The typeface processing application also identifies at least two expansion-related parameters of the control point for the identified design parameter values. The expansion-related parameters describe the behavior of the portion of the character outline affected by the control point along the character skeleton, such as changes in the angle of the character outline near the control point, changes in the width of the outline near the control point, and so on.
The typeface processing application uses the identified control point locations and expansion-related parameters to automatically calculate a set of control point data for the particular control point, where the set of control point data corresponds to the set of available values for the design parameter associated with the control point. For example, if the set of design parameter values d is [1, 2, 3, ..., 10], and control point locations (p1 and p10) and expansions (e1 and e10) have been specified for the design parameter values 1 and 10, then the typeface processing application interpolates the intermediate locations [p2, ..., p9] and intermediate expansions [e2, ..., e9]. The resulting set of control point data includes the set of control point locations [p1, ..., p10] and expansions [e1, ..., e10], where each location pi and expansion ei is associated with a design parameter value di.
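For illustration (not part of the original disclosure), a minimal Python sketch of this interpolation step, assuming simple linear interpolation between the two specified endpoints (the disclosure does not mandate a particular interpolation scheme); all numeric values are hypothetical:

```python
def lerp(a, b, t):
    """Linear interpolation between scalars a and b for t in [0, 1]."""
    return a + (b - a) * t

def interpolate_series(v_first, v_last, d_values):
    """Interpolate a tuple-valued quantity across all design parameter values."""
    d0, dN = d_values[0], d_values[-1]
    return [tuple(lerp(a, b, (d - d0) / (dN - d0)) for a, b in zip(v_first, v_last))
            for d in d_values]

d_values = list(range(1, 11))            # design parameter values d = 1..10
p1, p10 = (12.0, 40.0), (18.0, 52.0)     # specified control point locations (x, y)
e1, e10 = (2.0, 0.0), (4.5, 0.02)        # specified expansions (width, angle delta)

positions = interpolate_series(p1, p10, d_values)    # [p1, p2, ..., p10]
expansions = interpolate_series(e1, e10, d_values)   # [e1, e2, ..., e10]

# Each entry of the control point data set pairs a design parameter value d_i
# with its location p_i and expansion e_i.
control_point_data = list(zip(d_values, positions, expansions))
```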
The typeface processing application outputs a graphical control data set that includes the control point data for the control point and design parameters, and other control point data for other control points and other design parameters. The design application may access the output graphical control data set and thereby allow a designer to create a unique typeface by changing the relevant design parameters, which causes the design application to retrieve corresponding control point data for the character skeleton and calculate a modified skeleton curve and associated outline curve from the retrieved control point data.
Certain embodiments described herein provide improved computing systems and devices for creating unique typeface designs. For example, graphics control data generated by a typeface processing application enables a typeface design application to automatically calculate a particular design based on specified behavior of a character skeleton (i.e., control points of the skeleton) for one or more different combinations of design parameters. Doing so simplifies creation of unique character designs on mobile devices or other computing devices via design applications. Further, the generated control point data allows similarly shaped characters to be mapped to each other, thereby allowing specific changes to the design of the characters (e.g., lowercase "b") to be applied to the design of the similarly shaped characters (e.g., lowercase "d").
Further aspects of the disclosure relate generally to the use of artificial intelligence. More particularly, but not by way of limitation, these aspects relate to automatically controlling modification of typeface designs using machine learning models.
Certain embodiments relate to automatically controlling modification of typeface designs using machine learning models. For example, a typeface design application uses a machine learning model, trained to recognize specific characters across a variety of different typefaces, to evaluate changes to the design of a character. If the typeface design application determines that a particular design change prevents (or is likely to prevent) the trained machine learning model from recognizing the character, the typeface design application notifies the user. In this way, the typeface design application provides users with automatic suggestions as to how far a typeface design can be changed. Doing so may assist the user by automatically determining whether a typeface with a modified design is acceptable for the user's purpose.
The following non-limiting example is intended to introduce certain embodiments of typeface design applications that use machine learning to guide or control modification of typeface designs. In this example, the typeface design application provides a design interface for modifying the design of an input character (e.g., the letter "b") from a typeface. The design interface is configured to modify different design parameters in response to user input received via the design interface. Examples of such design parameters include thickness, width, contrast, inclination, curvature, etc. The typeface design application uses a trained machine learning model to analyze one or more inputs indicating changes to the design of the input character. For example, the machine learning model is trained using a large number of typefaces (e.g., 500 to 1000) to identify different designs of reference characters (e.g., different typefaces' depictions of the letter "b"). The typeface design application provides the machine learning model with the input character having the modified design. If the machine learning model is unable to identify the input character having the modified design as a reference character, the typeface design application outputs one or more warnings or other indicators via the design interface. The warning or other indicator informs the user that the modified character design is not recognized as the reference character.
Any suitable process may be used to train the machine learning model. In some embodiments, the machine learning model transforms the characters of a particular typeface into corresponding feature vectors. The training process adjusts the machine learning model such that the same reference character (e.g., the "b" character) from different typefaces (e.g., Arial, Courier, etc.) is transformed into feature vectors that lie in the same region of a vector space. Thus, the feature vector of the "b" character of any typeface should define a point in a first region of the vector space, and the feature vector of the "x" character of any typeface should define a point in a second region of the vector space.
To apply the machine learning model to the input character, the typeface design application transforms the input character with the modified design into a corresponding input feature vector. If the input feature vector defines a point that is too far from the relevant region of the vector space (e.g., the modified "b" character falls outside of the "b" region of the vector space), the typeface design application determines that the modified design renders the character unrecognizable. Additionally or alternatively, if the input feature vector defines a point located near the boundary of the relevant region of the vector space (e.g., the modified "b" character lies at the edge of the "b" region of the vector space), the typeface design application determines that the modified design is nearly unrecognizable. The typeface design application may output a warning to indicate that further changes to the character will render it unrecognizable.
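For illustration (not part of the original disclosure), one way to implement this recognizability check is sketched below, assuming the trained model exposes an embedding function and that each reference character's region is summarized by a centroid and radius; these specifics are assumptions, not taken from the patent:

```python
import math

def distance(u, v):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def check_design(embed, modified_glyph, centroid, radius, margin=0.8):
    """Classify a modified character design as "ok", "warn", or "reject"
    based on how far its feature vector falls from the reference
    character's region (summarized here as a centroid and radius)."""
    d = distance(embed(modified_glyph), centroid)
    if d > radius:
        return "reject"        # no longer recognizable as the character
    if d > margin * radius:
        return "warn"          # near the region boundary; further changes risky
    return "ok"

# Usage with a stand-in embedding function:
embed = lambda glyph: glyph    # pretend the glyph is already a feature vector
print(check_design(embed, (0.9, 0.2), centroid=(1.0, 0.0), radius=0.5))  # "ok"
```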
Examples of operating environments
Referring now to the drawings, FIG. 1 depicts an example of an operating environment 100 for implementing one or more embodiments described herein. In this example, the creative apparatus 102 provides one or more content manipulation services, such as (but not limited to) the typeface development platform 104, via one or more data networks 118. The creative apparatus 102 executes suitable program code, such as various applications or other software modules of the typeface development platform 104. The typeface development platform 104 includes executable code for different computing modules that perform one or more of the functions described herein. For example, the executable code includes the typeface processing application 106, the typeface design application 108, and the typeface training module 110.
One or more of these modules use data stored in the typeface repository 112. The typeface repository 112 includes one or more databases (or other suitable structures) stored in data storage units. Such data storage units may be implemented as one or more data servers. Examples of data stored in the typeface repository 112 include one or more template typefaces 114, one or more machine learning models 115, and one or more training typefaces 116.
The typeface processing application 106 is used to generate template typefaces 114 from the input typeface design. For example, template typeface 114 comprises a set of characters having a particular design. In some embodiments, the design of each character is based on the skeleton of the character. Template typeface 114 also includes data that may be used by an end user to modify the design of one or more characters. One example of this data is parameter data. The data set calculated by the typeface processing application 106 allows the user to adjust or otherwise modify the design of the template typeface 114 and thereby create a unique typeface suitable for the user's purpose.
For example, the typeface development platform 104 may be used to modify various typesetting parameters of characters, such as thickness, width, contrast, inclination, tortuosity, x-height, taper, and spacing (tracking). The typeface development platform 104 may also be used to modify various serif parameters of characters, such as width, height, bracket radius, bracket angle, and slab angle. Before the typeface development platform 104 is used to create a new typeface, the typeface processing application 106 calculates various data sets corresponding to these parameters. The typeface processing application 106 calculates a range of values for each parameter for each character of the template typeface 114. For example, the typeface processing application 106 calculates control point locations and corresponding curve ranges for a range of thickness values, a range of width values, and so forth. A detailed example of the operations performed by the typeface processing application 106 is described in further detail with respect to FIGS. 2-7.
The typeface design application 108 is used to manipulate graphical content via one or more graphical interfaces presented to an end user. In some embodiments, the typeface design application 108 performs one or more functions that allow a user to create unique typefaces from one or more template typefaces 114 (e.g., by selecting different combinations of parameter values pre-computed by the typeface processing application 106). For example, the typeface design application 108 may provide a touch-based interface to manipulate control points of a typeface's characters, combine different portions of characters (e.g., constituent shapes such as bowls, stems, arms, legs, etc.), link characters together for manipulation purposes, and so forth. Examples of operations performed by the typeface design application 108 are described in further detail herein. In some embodiments, the typeface design application 108 implements the touch pad functionality described with respect to FIGS. 19-28.
In some embodiments, one or more applications included in the typeface development platform 104 use model-based machine learning functionality to guide or control how users modify typeface designs. For example, the typeface training module 110 trains the machine learning model 115 using various training typefaces 116. Through the training process, the machine learning model 115 learns to recognize various characters across many different typefaces. The trained machine learning model 115 is provided to the typeface design application 108 (or another suitable module of the typeface development platform 104), which uses it to automatically provide feedback to the user regarding the aesthetic quality of different design changes. For example, if a user-specified design change would render a character unrecognizable to the trained machine learning model 115, the typeface development platform 104 may output a warning or other indication to the user regarding the design change, or may simply reject the design change. Detailed examples of operations performed using the machine learning model 115 are described in further detail herein with respect to FIGS. 12-18.
Creative apparatus 102 may be implemented using one or more servers, one or more processing devices, one or more platforms with corresponding application programming interfaces, cloud infrastructure, etc. Furthermore, each module described herein may also execute on one or more servers, one or more processing devices, one or more platforms with corresponding application programming interfaces, cloud infrastructure, and the like.
Some embodiments of the operating environment include user devices, such as developer devices 120a through 120n that access the typeface processing application 106 and designer devices 122a through 122n that access the typeface design application 108. In some embodiments, the same user device may serve as both a developer device and a designer device. Examples of user devices include, but are not limited to, personal computers, tablet computers, desktop computers, processing units, any combination of these devices, or any other suitable device having one or more processors. Each user device includes at least one application supported by the creative apparatus 102. The user devices correspond to various users. Examples of users include, but are not limited to, creative professionals or amateurs who use creative tools to generate, edit, track, or manage creative content, marketing professionals or end users who use marketing tools to generate, edit, track, or manage online content or manage online tagging processes, administrators, users who use image tools to create, edit, track, or manage images, advertisers, publishers, developers, content owners, content managers, content creators, content viewers, content consumers, designers, editors, any combination of these users, or any other user who uses digital tools to create, edit, track, or manage digital experiences. In one example, the developer devices 120a through 120n correspond to developers who use the typeface processing application 106 to generate the graphics control data used by the typeface design application 108 to perform skeleton-based typeface design, as described in detail herein. In another example, the designer devices 122a through 122n correspond to designers who use the typeface design application 108 to create custom typeface designs.
As described herein, digital tools include tools such as a typeface development platform 104 for electronically executing functions or workflows. Examples of digital tools include, but are not limited to, creation tools, content editing tools, content publishing tools, content tracking tools, content management tools, content printing tools, content consumption tools, any combination of these tools, or any other tool that may be used to create, edit, manage, generate, track, consume, or perform any other function or workflow associated with content. As described herein, a digital experience includes an experience that can be consumed by an electronic device. Examples of digital experiences include content creation, content editing, content tracking, content publishing, content printing, content management, content viewing, content consumption, any combination of these experiences, or any other workflow or function that may be performed in connection with the content. As described herein, content includes electronic content. Examples of content include, but are not limited to, images, videos, websites, webpages, user interfaces, menu items, tool menus, magazines, slides, animations, social posts, comments, blogs, data feeds, audio, advertisements, vector graphics, bitmaps, documents, any combination of one or more content, or any other electronic content.
In this example, each of the user devices 120a through 120n is communicatively coupled to the creative apparatus 102 via one or more data networks 118. The user of the user device may use various products, applications, or services supported by the creative device 102 via the data network 118. Examples of data network 118 include, but are not limited to, the Internet, a local area network, a wireless area network, a wired area network, a wide area network, and the like.
The example depicted in fig. 1 is provided for illustration purposes. However, other implementations, with or without certain features, are possible beyond the context of typeface design. Any suitable software module may implement one or more of the operations described herein. For example, text editors, email clients, graphic design programs, and other content management applications may apply the touch pad functionality described with respect to FIGS. 19-28 to other types of content in addition to or instead of typefaces.
Example of generating graphics control data for skeleton-based typeface design
As described in detail with respect to the various examples below, the creative apparatus 102 is used to generate graphics control data for skeleton-based typeface designs. For example, the typeface design application 108 includes various design controls. Examples of such controls include design parameters that control visual properties of a character's design, such as thickness, width, contrast, inclination, tortuosity, x-height, taper, serif width, serif height, and so on. These design parameters depend on the graphics control data generated by the typeface processing application 106. The graphics control data is generated automatically based on a developer adjusting, via one or more development interfaces of the typeface processing application 106, various visual attributes of graphics that include characters from a typeface. These graphics include curves defined by control points.
In some embodiments, the typeface processing application 106 receives input specifying how certain control points and curves should behave in response to exemplary values of the "thickness" parameter (e.g., minimum thickness and maximum thickness). Based on these inputs, the typeface processing application 106 generates graphics control data that identifies a set of control point locations for a corresponding set of "thickness" parameter values. The typeface processing application 106 repeats this process for one or more other design parameters of the typeface design application 108. Various sets of control point locations are included in the graphics control data output from the typeface processing application 106 to the typeface design application 108 for use by the designer.
For example, fig. 2 depicts an example of a process 200 for generating graphics control data for performing skeleton-based modification of a typeface design. In some embodiments, one or more processors of the creative apparatus 102, one or more developer devices 120 a-120 n, or some combination thereof, implement the operations depicted in FIG. 2 by executing suitable program code (such as the typeface processing application 106 of the typeface development platform 104). For illustrative purposes, the process 200 is described with reference to certain examples depicted in the accompanying drawings. Other implementations are possible.
At block 202, process 200 involves accessing a character graphic from a typeface, the character graphic including a character skeleton having a set of control points and corresponding curves and further including a character outline having one or more shapes surrounding the character skeleton. To implement block 202, the typeface processing application 106 retrieves the character graphic from a suitable non-transitory computer readable medium, such as a local memory device on a computing device executing the typeface processing application 106, a remote memory device accessible by such computing device over a data network, or some combination thereof. The character patterns may be included in a set of character patterns from template typeface 114. Each character graphic may include a set of control points defining one or more curves providing the shape of the character graphic.
In some embodiments, the creative apparatus 102 receives the character graphic from another computing device (such as a designer device associated with a typeface designer). In a simplified example involving one character, the typeface processing application 106 receives an input data set that includes a character skeleton graphic for a "thin" version of the character, a corresponding character outline graphic for the "thin" version of the character, a character skeleton graphic for a "thick" version of the character, and a corresponding character outline graphic for the "thick" version of the character. The "thin" graphics of the character (i.e., the character skeleton and character outline) may be manipulated by a developer or other user via the typeface processing application 106. The "thick" graphics of the character (i.e., the character skeleton and character outline) are a designer-provided guide that allows the developer to visually examine how far a manipulation of the typeface design departs from an aesthetically ideal typeface design. In some embodiments, the "thick" graphics are omitted.
FIG. 3 depicts an example of a development interface 300 used by the typeface processing application 106 to calculate control point parameters 304 of characters from an input dataset for various design parameters 302 that may be manipulated in an end user application (e.g., typeface design application 108). In the example depicted in fig. 3, a set of control points (including a particular control point 306) are connected by various curves. These curves are calculated from the position of the control points. The curve defines a character skeleton 308. Character skeleton 308 is surrounded by character outline 310. For example, the character outline includes another set of control points (e.g., control points 307a and 307b on opposite sides of control point 306) connected by another set of curves. Another set of curves defines a character outline 310.
In some embodiments, the sets of control points defining the character skeleton 308 and the character outline 310 may overlap. For example, control points 306 are included in one set of control points defining character skeleton 308 and another set of control points defining character outline 310.
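For illustration (not part of the original disclosure), the overlap between the skeleton and outline control point sets might be represented as follows; all names and coordinate values are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ControlPoint:
    x: float
    y: float

@dataclass
class CharacterGraphic:
    skeleton_points: list   # centerline control points defining skeleton curves
    outline_points: list    # control points defining the surrounding outline

# A skeleton control point can also belong to the outline's control point set,
# mirroring the overlap between points 306, 307a, and 307b described above.
p306 = ControlPoint(50.0, 120.0)
p307a = ControlPoint(44.0, 120.0)
p307b = ControlPoint(56.0, 120.0)

glyph = CharacterGraphic(
    skeleton_points=[p306],
    outline_points=[p307a, p306, p307b],   # p306 appears in both sets
)
```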
Design parameters 302 are parameters from the typeface design application 108 that are adjusted using the development interface 300 of the typeface processing application 106. Each design parameter 302 controls certain aesthetic features of a typeface design that may be controlled by a designer using the typeface development platform 104. For example, the "thickness" parameter may control the width of the character outline, the "x-height" parameter may control the vertical position of one or more horizontally oriented curves, and so on.
A particular value of the thickness parameter 312 is assigned by moving a slider 314. The thickness parameter is associated with various control point parameters 304, such as an x-coordinate parameter 316, a y-coordinate parameter 318, and an expansion angle parameter 320. The map 322 indicates that the value of the x-coordinate parameter 316 of the control point 306 should be controlled at least in part by changes in the value of the thickness parameter 312. Likewise, maps 324 and 326 indicate that the values of the y-coordinate parameter 318 and the expansion angle parameter 320, respectively, should be controlled at least in part by changes in the value of the thickness parameter 312.
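For illustration (not part of the original disclosure), the maps 322, 324, and 326 can be thought of as a small table linking a design parameter to the control point parameters it drives; the following Python sketch, with hypothetical names and sample values, shows one such representation:

```python
# Hypothetical representation of maps 322, 324, and 326: the thickness
# parameter drives three control point parameters of control point 306.
mappings = {
    "thickness": [
        ("control_point_306", "x"),
        ("control_point_306", "y"),
        ("control_point_306", "expansion_angle"),
    ],
}

# Precomputed data: (control point, attribute) -> {design parameter value: value}
control_point_data = {
    ("control_point_306", "x"): {18: 120.0, 19: 121.5},
    ("control_point_306", "y"): {18: 80.0, 19: 78.4},
    ("control_point_306", "expansion_angle"): {18: 2.225, 19: 2.227},
}

def on_parameter_change(param_name, param_value, data):
    """Apply a design parameter change to every mapped control point
    parameter, reading the precomputed value for this parameter value."""
    for point_id, attr in mappings.get(param_name, []):
        value = data[(point_id, attr)][param_value]
        print(f"set {point_id}.{attr} = {value}")

on_parameter_change("thickness", 19, control_point_data)
```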
In some embodiments, the development interface 300 also displays a guide 315, as well as a character skeleton 308 and a character outline 310. For example, the guide 315 may be a "thick" version of the character outline provided by the designer device. The guides 315 are visual aids that allow a developer to evaluate the appearance of different manipulations of the character skeleton 308, the character outline 310, or both. For example, one or more control point parameters may be modified in a manner that causes the character outline 310 to extend beyond the guide 315. The developer may observe this effect and further adjust various control point parameters and their associated behavior so that the character outline 310 remains within the guide 315. In some embodiments, the guide 315 is omitted.
Returning to FIG. 2, the process 200 also involves calculating a set of intermediate character graphics based on particular control points from the set of control points, for design parameters of the computer-implemented typeface design application. For example, the typeface processing application 106 calculates the intermediate graphics based on the control point parameters 304 that may be modified (e.g., by a developer) in the typeface processing application 106. The typeface processing application 106 generates an intermediate character graphic for a given design parameter by, for example, performing blocks 204, 206, and 208 of the process 200.
At block 204, process 200 involves identifying a pair of locations of a particular control point that respectively correspond to a pair of design parameter values. The control point locations identify the location of a particular control point in a suitable plane or space. The control point location may be specified in any suitable manner. In some embodiments, the control point locations are specified as a set of coordinates (e.g., xy coordinate pairs) in a Cartesian plane or space. In various other embodiments, the control point locations may be modified using other coordinate systems (e.g., spherical coordinates, cylindrical coordinates, etc.).
In some embodiments, block 204 involves the typeface processing application 106 receiving, via the development interface 300, input data indicating one or more locations of the particular control point 306. The typeface processing application 106 also receives, via the development interface 300, input data indicating one or more parameter values of the design parameter. The typeface processing application 106 stores this control point data in a suitable memory device. In some embodiments, the control point data also includes control point parameter values for certain specified values of the design parameter (e.g., minimum or maximum values of the design parameter). At block 204, the typeface processing application 106 identifies the pair of locations and corresponding parameter values by referencing the stored control point data.
For example, FIG. 4 depicts an example of location data that has been modified via the development interface 300. In this example, the typeface processing application 106 has received an input 402 that reduces the value of the y-coordinate parameter 318. The typeface processing application 106 responds to the input 402 by lowering the control point 306, as indicated by the dashed arrow in FIG. 4. Modifying the position of the control point 306 also changes at least one curve of the character skeleton 308, such as a curve ending at the control point 306. The typeface processing application 106 identifies a first parameter value of 18 for the thickness parameter 312 based on the position of the slider 314. For this first parameter value, the typeface processing application 106 also identifies a first control point location of the control point 306. The first control point location includes the value of the x-coordinate parameter 316 and the value of the y-coordinate parameter 318 as modified by the input 402.
The typeface processing application 106 also identifies a second parameter value and control point location for the one or more design parameters under consideration. In one example, the typeface processing application 106 receives a second input that moves the slider 314 and thereby selects a second parameter value for the thickness parameter 312. The typeface processing application 106 may then receive one or more inputs modifying the position of the control point 306 while the slider 314 indicates the second thickness parameter value. In another example, the typeface processing application 106 may use the minimum or maximum value of the thickness parameter 312 as the second parameter value, together with a default control point location. For example, an input dataset with a "thin" version of a character graphic may include a default position of the control point 306 for the minimum parameter value. Additionally or alternatively, an input dataset with a "thick" version of the character graphic may include a default position of the control point 306 for the maximum parameter value. At block 204, the typeface processing application 106 may identify one or more of these minimum and maximum parameter values as the second parameter value. The typeface processing application 106 may also identify one or more default control point locations from the "thin" and "thick" versions of the character graphic in the input dataset as second control point locations.
Returning to FIG. 2, at block 206, process 200 involves identifying a pair of expansions of the character outline for the particular control point, the pair of expansions corresponding to the pair of design parameter values, respectively. In some embodiments, an expansion includes a width parameter, an angle parameter, or both. A first width or angle parameter value is identified for the first design parameter value, and a second width or angle parameter value is identified for the second design parameter value.
In some embodiments, the width parameter indicates the width of the character outline with respect to a particular control point. For example, in FIGS. 3 and 4, control points 307a and 307b define the width of the character outline 310 with respect to the control point 306. The typeface processing application 106 may determine, based on one or more user inputs (e.g., changes to a "scale" associated with the width parameter), that the width defined by the control points 307a and 307b should change if the design parameter value changes. In these embodiments, the expansion of the character outline involves a corresponding rate of change in the width of the character outline as the design parameter changes.
In some embodiments, the angle parameter indicates an expansion angle of a character outline portion with respect to a particular control point. For example, in FIGS. 3 and 4, control points 307a and 307b define a curve that intersects the control point 306 at an angle of 2.225 degrees for a specified design parameter value. The typeface processing application 106 may determine, based on one or more user inputs (e.g., a change in a "scale" associated with the angle parameter), that the specified angle should change if the design parameter value changes. In these embodiments, the expansion of the character outline involves modifying the expansion angle according to a particular scale. For example, in the example of FIGS. 3 and 4, each incremental change in the thickness parameter 312 causes a corresponding change of -0.002 degrees in the angle of the curve defined by the points 307a and 307b.
To implement block 206, the typeface processing application 106 receives input data indicating various expansion-related values via a development interface. The typeface processing application 106 stores the input data in a suitable memory device. The typeface processing application 106 identifies the pair of expansions and corresponding parameter values at block 206 by referencing the stored input data.
For example, in the examples depicted in FIGS. 3 and 4, the typeface processing application 106 may receive one or more inputs indicating certain expansion data via the development interface 300. Examples of input expansion data include a specified expansion angle, a specified scale for the expansion angle, a specified expansion width, a specified scale for the expansion width, and so on. In FIGS. 3 and 4, the typeface processing application 106 has received input expansion data indicating that the expansion angle with respect to the control point 306 is 2.225 degrees for the value of the thickness parameter 312 indicated by the slider 314. The typeface processing application 106 has also received input expansion data indicating that each incremental change in the thickness parameter 312 should modify the expansion angle with respect to the control point 306 on a scale of -0.002 degrees.
The typeface processing application 106 may also identify the second parameter value and expansion. In one example, the typeface processing application 106 may receive a second input that moves the slider 314 and thereby selects a second parameter value for the thickness parameter 312. While the slider 314 indicates the second parameter value for the thickness parameter 312, the typeface processing application 106 may receive one or more inputs modifying an expansion angle with respect to the control point 306, an expansion width with respect to the control point 306, or both. In another example, the typeface processing application 106 may use the minimum or maximum value of the thickness parameter 312 as the second parameter value, together with a default expansion. For example, an input dataset having a "thin" version of a character graphic may include a default expansion angle and width with respect to the control point 306 for the minimum parameter value. Additionally or alternatively, an input dataset with a "thick" version of the character graphic may include a default expansion angle and width with respect to the control point 306 for the maximum parameter value. At block 206, the typeface processing application 106 may identify one or more of the minimum and maximum design parameter values as the second parameter value. The typeface processing application 106 may identify one or more expansions (e.g., expansion angle, expansion width, etc.) from the "thin" and "thick" versions of the character graphic in the input dataset as the second expansion.
At block 208, process 200 involves generating a graphical control data set that includes (i) intermediate positions of the particular control point between the pair of locations and (ii) intermediate expansions of the character outline between the pair of expansions. To implement block 208, the typeface processing application 106 identifies a set of available design parameter values. The typeface processing application 106 also defines a range of locations bounded by the pair of locations and an expansion range bounded by the pair of expansions. The typeface processing application 106 then calculates, for each range, a set of values falling between the respective boundaries.
In some embodiments, the typeface processing application 106 uses a scale associated with a control point parameter to calculate a set of control point parameter values (e.g., intermediate positions). The scale indicates the spacing between an adjacent pair of control point parameter values (e.g., two adjacent position coordinates, two adjacent expansion angles, two adjacent expansion widths, etc.) corresponding to an adjacent pair of values of the user-modifiable parameter. For example, the thickness parameter 312 may have a set of values w, such as [0, 1, 2, 3, 4]. The typeface processing application 106 may determine, based on one or more user inputs, that the x-coordinate parameter 316 of the particular control point 306 has a scale of 0.5 and a specified value of 1 for w1 = 0. Thus, the typeface processing application 106 calculates an x-coordinate value of 1.5 for w2 = 1, an x-coordinate value of 2 for w3 = 2, and so on, until each value in the set w has a corresponding value of the x-coordinate parameter 316.
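To make the scale calculation concrete, the following minimal sketch (with assumed function and variable names; not the patent's implementation) derives the full set of x-coordinate values from the design parameter set, the base value, and the scale described above.

```python
# Illustrative sketch only: mapping each design parameter value to a
# control point parameter value using a base value and a scale.

def values_from_scale(design_values, base_value, scale):
    """design_values: ordered values of the user-modifiable parameter;
    base_value: control point parameter value at design_values[0];
    scale: spacing between adjacent control point parameter values."""
    return {w: base_value + i * scale for i, w in enumerate(design_values)}

# Thickness set w = [0, 1, 2, 3, 4], x-coordinate scale of 0.5, x = 1 at w = 0:
x_values = values_from_scale([0, 1, 2, 3, 4], base_value=1, scale=0.5)
# -> {0: 1.0, 1: 1.5, 2: 2.0, 3: 2.5, 4: 3.0}
```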
Although changing one control point parameter is described above for illustrative purposes, multiple control point parameters may be associated with a given design parameter. The value of each control point parameter may vary independently with respect to the design parameter. For example, FIG. 5 depicts an example in which different scales and different ranges of thickness parameter values are used to calculate different sets of control point values 502, 504, and 506. In this example, as the thickness value increases from 0 to 3.75, the x-coordinate parameter 316 of the control point 306 changes from 14 to 25 on a scale of 1. As the thickness value increases from 0.5 to 2.5, the y-coordinate parameter 318 of the control point 306 changes from 19 to 23 on a scale of 0.5. As the thickness value increases from 2 to 3.25, the expansion angle 320 with respect to the control point 306 changes from 45 to 44.5 on a scale of -0.1. In the resulting graphics control data generated by the typeface processing application 106, a thickness value of 2 results in a control point position of (18, 22) with an expansion angle of 45 degrees, and a thickness value of 2.5 results in a control point position of (20, 23) with an expansion angle of 44.8 degrees.
In additional or alternative embodiments, the typeface processing application 106 calculates intermediate positions by interpolating between the pair of locations, intermediate expansions by interpolating between the pair of expansions, or both. The typeface processing application 106 determines that interpolation should be used based on receiving a pair of inputs. The pair of inputs includes a first input assigning one or more first control point parameter values to a first design parameter value. The pair of inputs further includes a second input assigning one or more second control point parameter values to a second design parameter value. The first input indicates that the first control point parameter value should be used as one boundary of the interpolation, and the second input indicates that the second control point parameter value should be used as the other boundary of the interpolation.
As a simplified example, the pair of control point locations for the thickness parameter 312 may be a first control point location (x1, y1) and a second control point location (x2, y2). The first control point location is assigned to the minimum thickness value via user input to the development interface 300. Via user input to the development interface 300, the second control point location (x2, y2) is assigned to the maximum thickness value. Each intermediate point location is calculated based on where the thickness value lies between the minimum and maximum thickness values. If, for example, the thickness value is halfway between the minimum and maximum thickness values, the typeface processing application 106 calculates the midpoint between the first and second control point locations (e.g., ((x1 + x2)/2, (y1 + y2)/2)). Similarly, the pair of expansions for the thickness parameter may include a first expansion angle θ1 for the minimum thickness value and a second expansion angle θ2 for the maximum thickness value. For a thickness value halfway between the minimum and maximum thickness values, the typeface processing application 106 calculates the angle (θ1 + θ2)/2 between the first and second expansion angles.
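The interpolation described above can be sketched as follows; this is an illustrative linear interpolation under the assumption that intermediate values vary linearly between the boundary pair, with all names hypothetical.

```python
# Minimal interpolation sketch: intermediate control point location and
# expansion angle for a thickness value w between w_min and w_max.

def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + t * (b - a)

def intermediate(w, w_min, w_max, p1, p2, theta1, theta2):
    t = (w - w_min) / (w_max - w_min)   # 0 at w_min, 1 at w_max
    x = lerp(p1[0], p2[0], t)           # intermediate position
    y = lerp(p1[1], p2[1], t)
    theta = lerp(theta1, theta2, t)     # intermediate expansion angle
    return (x, y), theta

# Halfway between w_min and w_max (t = 0.5), this yields the midpoint
# ((x1 + x2)/2, (y1 + y2)/2) and the mean angle (theta1 + theta2)/2.
```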
In some embodiments, the typeface processing application 106 provides a preview function that displays the effect of changing a design parameter with a particular graphical control data set. For example, FIG. 6 depicts an example in which the slider 314 is moved to a new position 602 by one or more user inputs received via the development interface 300. In response to the slider 314 being moved, the typeface processing application 106 identifies the value of the thickness parameter 312 corresponding to the position 602. The typeface processing application 106 determines from the graphics control data generated at block 208 that the control point 306 is associated with the thickness parameter 312 and selects the control point parameter values corresponding to the identified value of the thickness parameter 312. In this example, the x-coordinate parameter 316, the y-coordinate parameter 318, and the expansion angle parameter 320 are mapped to the thickness parameter 312.
The typeface processing application 106 selects the corresponding values of these control point parameters and calculates a new set of curves based on the control point parameter values. For example, the control point 306 moves to a new position specified by the retrieved values of the x-coordinate parameter 316 and the y-coordinate parameter 318, as indicated by the downward one-way arrow in FIG. 6. Likewise, the control points 307a and 307b move to new positions according to the retrieved value of the expansion angle, as indicated by the double-headed arrow in FIG. 6. The typeface processing application 106 calculates a new curve based on the changed positions of the control points 306, 307a, and 307b, thereby modifying the shape of the character skeleton 308 and the corresponding shape of the character outline 310.
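As a rough illustration of the preview recomputation, the sketch below re-samples a single skeleton curve from its control points; a quadratic Bézier segment stands in for one curve of the character skeleton, and the representation is an assumption rather than the patent's data model.

```python
# Illustrative sketch: sampling a quadratic Bezier curve defined by three
# control points. Moving a control point and re-sampling yields the
# modified skeleton curve shown in the preview.

def quadratic_bezier(p0, p1, p2, steps=32):
    """Sample the curve B(t) = (1-t)^2 p0 + 2(1-t)t p1 + t^2 p2."""
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, y))
    return pts

# e.g., moving the middle control point (standing in for control point 306)
# and calling quadratic_bezier again recomputes the previewed curve.
```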
At block 210, process 200 involves outputting the graphical control data set from the typeface processing application 106 to the typeface design application 108. The typeface processing application 106 implements block 210 by causing the graphics control data to be stored in a location accessible to the typeface design application 108. In some embodiments, the typeface processing application 106 configures the creative apparatus 102 to store the graphics control data as part of a template typeface 114 in the typeface design repository 112. The graphics control data is accessible by the typeface design application 108 when the typeface design application 108 is used by one or more user devices 120a through 120n to create new typefaces from the template typeface 114. In additional or alternative embodiments, the typeface processing application 106 configures the creative apparatus 102 to transmit the graphics control data to one or more computing devices executing a graphic design module (e.g., one or more user devices 120a through 120n executing a local typeface design application).
The typeface design application 108, with the outputted graphical control data, may be accessed by one or more user devices 120a through 120n to create a new typeface. For example, the creative apparatus 102 executes the typeface design application 108 to establish a session with a user device. The typeface design application 108 receives a selection of a parameter value for a design parameter during the session. The typeface design application 108 responds to the selection by displaying a modified character design that includes a modified curve generated from a portion of the graphical control data set.
FIG. 7 depicts an example of a design interface 700 provided by the typeface design application 108 for modifying a typeface using a graphical control data set output by process 200. In this example, the design interface 700 displays a set of design parameters 702 (e.g., character metrics, character serifs, etc.). The design parameters 702 include one or more of the same design parameters used to generate the graphical control data set. For example, the design interface 700 includes a control element for modifying the thickness parameter 704 of one or more characters (e.g., the "a" character) in the template typeface 114. Modifying the thickness parameter 704 causes one or more control points of the character skeleton to be moved, one or more control points of the character outline to be moved, or both. The modified control point locations are determined based on a portion of the graphical control data set (i.e., certain control point parameter values associated with a particular thickness value) in a manner similar to the example described above with respect to FIG. 6.
In some embodiments, the typeface design application 108 allows one or more typeface design aspects, such as the locations of control points, to be manually changed via the design interface 700. For example, the typeface design application 108 may allow one or more control points of the "a" character depicted in FIG. 7 to be moved without changing any of the design parameters 702. The typeface design application 108 responds to these movements of the control points by modifying the graphical control data set accordingly. In a simplified example, the graphical control data set output at block 210 specifies that if the thickness parameter value is set to 1, the x-coordinate of the control point 306 is set to 5 and incremented according to a scale of 0.5, as shown in Table 1 below.
Table 1

thickness parameter value | x-coordinate of control point 306
------------------------- | ---------------------------------
1                         | 5
2                         | 5.5
3                         | 6
4                         | 6.5
The typeface design application 108 may then receive a drag input on the control point 306 via the design interface 700 while the thickness is set to 1. The drag input increases the x-coordinate by 2. The typeface design application 108 responds to this drag input by recalculating one or more graphical control data sets that involve the x-coordinate of the control point 306. For example, in Table 2 below, the thickness parameter value 1 now corresponds to an x-coordinate of 7, which is again incremented according to a scale of 0.5.

Table 2

thickness parameter value | x-coordinate of control point 306
------------------------- | ---------------------------------
1                         | 7
2                         | 7.5
3                         | 8
4                         | 8.5
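In this simplified example, the recalculation amounts to shifting every mapped x-coordinate by the drag offset while leaving the scale unchanged; the sketch below (assumed names, table values as reconstructed above) illustrates this.

```python
# Illustrative sketch: a drag that adds 2 to the x-coordinate at one
# thickness value shifts the whole mapped value set uniformly, since the
# 0.5 scale itself is unchanged (Table 1 -> Table 2).

def rebase_after_drag(x_by_thickness, delta):
    """Shift all mapped x-coordinate values by the drag offset."""
    return {w: x + delta for w, x in x_by_thickness.items()}

table1 = {1: 5.0, 2: 5.5, 3: 6.0, 4: 6.5}   # x = 5 at thickness 1, scale 0.5
table2 = rebase_after_drag(table1, delta=2)
# -> {1: 7.0, 2: 7.5, 3: 8.0, 4: 8.5}        # x = 7 at thickness 1, scale 0.5
```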
For illustrative purposes, the operation of process 200 is described above with respect to a single control point and a single design parameter. The operations above may be applied to multiple control points and multiple design parameters. In one example, the typeface processing application 106 may be used to map the thickness parameter 312 to control point parameters of one or more additional control points along the character skeleton 308. Thus, for each additional control point, the graphics control data includes one or more sets of control point parameter values corresponding to respective values of the thickness parameter 312.
In another example, the typeface processing application 106 may be used to map a plurality of design parameters 302 to a particular control point 306. For example, both the "tortuosity" parameter and the "thickness" parameter may be assigned to control point parameters (e.g., x-coordinate, y-coordinate, spread angle, etc.) of a particular control point 306. The typeface processing application 106 calculates a first set of control point parameter values (e.g., a first set of x-coordinate values) for a first design parameter (e.g., a "tortuosity" parameter) by performing operations from one or more of blocks 202 through 208. The typeface processing application 106 calculates a second set of control point parameter values (e.g., a second set of x-coordinate values) for a second design parameter (e.g., a "thickness" parameter) by performing operations from one or more of blocks 202-208.
In some embodiments, the first and second sets of control point parameter values may be calculated using a common scale. Thus, in this example, both the incremental change in the tortuosity parameter and the incremental change in the thickness parameter result in the control point moving the same horizontal distance. In additional or alternative embodiments, the first and second sets of control point parameter values may be calculated using different scales that specify different intervals between adjacent pairs of control point parameter values. Thus, in this example, an incremental change in the tortuosity parameter causes the control point to move a horizontal distance that is different than the horizontal distance associated with the incremental change in the thickness parameter.
Mapping multiple design parameters to a given control point may potentially create conflicts with respect to the control point when calculating curves to be displayed in the development interface 300, the design interface 700, or both. In a simplified example, Table 3 depicts multiple sets of x-coordinate values for the control point 306 for a tortuosity parameter and a thickness parameter.
In this example, the two sets of x-coordinate values are generated using different scales. Thus, although the same x-coordinate results if both the tortuosity parameter and the thickness parameter are set to "1", different x-coordinates result when the tortuosity parameter and the thickness parameter have different values (e.g., a tortuosity of 3 and a thickness of 1) or even the same value (e.g., a tortuosity of 2 and a thickness of 2).
One or more modules of the typeface development platform 104 (e.g., the typeface processing application 106, the typeface design application 108, etc.) resolve these potential conflicts by using combined control point parameter values for a particular set of user parameter values of the design parameters. For example, the typeface development platform 104 receives a first user parameter value (e.g., a tortuosity of 3) for a first design parameter and a second user parameter value (e.g., a thickness of 1) for a second design parameter. The typeface development platform 104 selects a first control point parameter value corresponding to the first user parameter value and a second control point parameter value corresponding to the second user parameter value. The first control point parameter value is selected from graphics control data (e.g., a set of intermediate positions, a set of intermediate expansions, etc.) specifying characteristics of the control point with respect to the first design parameter (e.g., an x-coordinate of 6 for a tortuosity of 3). The second control point parameter value is selected from graphics control data (e.g., a set of intermediate positions, a set of intermediate expansions, etc.) specifying characteristics of the control point with respect to the second design parameter (e.g., an x-coordinate of 2 for a thickness of 1).
The typeface development platform 104 calculates a combined control point parameter value from the first and second control point parameter values. Examples of computing a combined control point parameter value include averaging the first and second control point parameter values, calculating a weighted average of the first and second control point parameter values, and so on. The typeface development platform 104 assigns the combined control point parameter value to the particular control point and calculates a modified curve from the particular control point having the combined control point parameter value. For example, in the present example, if the tortuosity parameter is set to 3 and the thickness parameter is set to 1, the control point 306 may be assigned an x-coordinate of 4 (i.e., the average of 6 and 2). The typeface development platform 104 modifies the character skeleton to calculate a new curve defined by the control point 306 with an x-coordinate of 4.
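A minimal sketch of this conflict resolution, assuming a plain or weighted average as the combining function, might look as follows (names are illustrative).

```python
# Illustrative sketch: two design parameters select different values for the
# same control point parameter, and the platform combines them.

def combined_value(candidates, weights=None):
    """Combine per-design-parameter control point parameter values."""
    if weights is None:
        return sum(candidates) / len(candidates)
    return sum(v * w for v, w in zip(candidates, weights)) / sum(weights)

# Tortuosity of 3 selects x = 6; thickness of 1 selects x = 2:
x = combined_value([6, 2])    # -> 4.0, assigned to control point 306
```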
In some embodiments, the typeface development platform 104 allows feedback to be provided from one or more designer devices 122 a-122 n using the typeface design application 108 to one or more developer devices 120 a-120 n using the typeface processing application 106. For example, typeface design application 108 may perform one or more comment capture operations with respect to the design of one or more characters. The comment capture operation involves receiving comment input indicating a portion of a character including one or more control points.
For example, during a session between the creative apparatus 102 and a designer device, the typeface design application 108 may receive comment input via the design interface 700. The comment input may identify a portion of the "a" character, such as a character skeleton portion including the control point 306, a character outline portion calculated based on the control point 306, or some combination thereof. The typeface design application 108 responds to the comment input by capturing comment data. For example, during comment input, the typeface design application 108 receives and stores in a memory device one or more current values of one or more design parameters. The typeface design application 108 also receives and stores the corresponding character graphic for the current design parameter values. The corresponding character graphic is one that the typeface design application 108 generated from the selected graphics control data in response to the current design parameter values being selected. One example of such a character graphic is a particular character outline portion. The typeface design application 108 stores the comment data (e.g., the design parameter values and corresponding character graphics) in memory locations accessible to the typeface processing application 106.
The typeface design application 108 also performs one or more functions (e.g., interprocess communications) that inform the typeface processing application 106 that comment data is available. The typeface processing application 106 responds to this notification by making the comment data available to one or more developer devices 120a through 120n. For example, the typeface processing application 106 configures the creative apparatus 102 to transmit the comment data to a developer device, to transmit a user notification to a developer device indicating that the comment data is available, or some combination thereof. The developer device may access the comment data from the typeface processing application 106 and use the comment data to update the graphics control data for the character.
Link-based modification of typeface design
In some embodiments, the typeface development platform 104 is used to link character design modifications of different characters of a typeface. For example, the typeface development platform 104 creates a mapping between control points of different characters in a typeface. The mapping is created based on, for example, one or more user inputs specifying at least two characters to link, received via a suitable graphical interface (e.g., the development interface 300, the design interface 700, etc.). In the mapping, a first control point from a first character is linked to a second control point from a second character. The mapping indicates a similarity between a first constituent shape from the first character and a second constituent shape from the second character.
For example, FIG. 8 depicts an example of a character skeleton for characters 802 and 804 (e.g., "b" and "d" for a certain template typeface 114). Character 802 includes two constituent shapes: a trunk comprising a curve defined by control points 806, 808, and 810; and semicircular arcs including curves defined by control points 810, 812, 814, and 816. Character 804 also includes two constituent shapes: a trunk comprising a curve defined by control points 818, 820, and 822; and a semicircular arc including a curve defined by control points 822, 824, 826 and 828.
In this example, typeface development platform 104 is used to generate map 902, as shown in FIG. 9. Mapping 902 includes an association (represented by a double-headed arrow) between pairs of control points from characters 802 and 804. For example, two trunks are linked via a map 902, the map 902 associating points 806 and 818, points 808 and 820, and points 810 and 822. Similarly, the map 902 links the semicircular arcs via associations between various control points defining the curves of the semicircular arcs.
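One plausible representation of such a mapping, sketched below with an assumed structure, links each control point identifier of character 802 to its counterpart in character 804.

```python
# Illustrative sketch of mapping 902: linked control point pairs between the
# constituent shapes of characters 802 ("b") and 804 ("d").

MAPPING_902 = {
    # trunk of character 802  <->  trunk of character 804
    806: 818, 808: 820, 810: 822,
    # semicircular arc of 802 <->  semicircular arc of 804
    812: 824, 814: 826, 816: 828,
}

def linked_point(point_id):
    """Look up the control point linked to the given one, in either direction."""
    reverse = {v: k for k, v in MAPPING_902.items()}
    return MAPPING_902.get(point_id) or reverse.get(point_id)
```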
Mapping 902 allows a change in one of the constituent shapes (e.g., the semicircular arc of character 802) to be applied to the corresponding constituent shape (e.g., the semicircular arc of character 804).
For example, FIG. 10 depicts an example in which a change in trunk length from character 802 is also applied to the trunk length of character 804. The typeface development platform 104 receives an input 1002 moving the control point 806. In response to receiving the input 1002, the typeface development platform 104 determines that the control point 806 is mapped to the control point 818 by referencing the mapping 902. The typeface development platform 104 applies a corresponding modification to the character 804 based on the mapping between the control points 806 and 818. For example, if the input 1002 changes the y-coordinate of the control point 806, the modification 1004 includes an equal or proportional change in the y-coordinate of the control point 818.
FIG. 11 depicts an example in which a change in semicircular arc length from character 804 is also applied to the semicircular arc length of character 802. The typeface development platform 104 receives an input 1102 moving the control point 826. In response to receiving the input 1102, the typeface development platform 104 determines, by referencing the mapping 902, that the control point 826 is mapped to the control point 814. The typeface development platform 104 applies a corresponding modification to the character 802 based on the mapping between the control points 826 and 814. For example, if the input 1102 changes the x-coordinate of the control point 826, the modification 1104 includes an equal or proportional change in the x-coordinate of the control point 814.
In some embodiments, the typeface development platform 104 stores the orientation of each constituent shape of a character in the mapping 902 or another suitable data structure. For example, the typeface development platform 104 stores data indicating that the trunks of characters 802 and 804 are oriented at an angle of 90 degrees with respect to the horizontal axis, the semicircular arc of character 802 is oriented at an angle of 0 degrees with respect to the horizontal axis, and the semicircular arc of character 804 is oriented at an angle of 180 degrees with respect to the horizontal axis. To apply the changes in FIGS. 10 and 11, the typeface development platform 104 uses the stored orientations. For example, in FIG. 10, since the trunks of characters 802 and 804 are both oriented at a 90-degree angle, the modification 1004 moves the control point 818 in that direction. In FIG. 11, the input 1102 moves the control point 826 a distance at an angle of 180 degrees (i.e., the orientation of the semicircular arc of character 804). The corresponding modification 1104, however, moves the control point 814 a corresponding distance at an angle of 0 degrees (i.e., the orientation of the semicircular arc of character 802).
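The orientation-aware propagation can be sketched as follows; the data layout and helper are assumptions, but the example reproduces the behavior described for FIGS. 10 and 11.

```python
# Illustrative sketch: a movement along a source shape's orientation is
# re-expressed along the linked target shape's stored orientation.
import math

orientations = {"trunk_802": 90.0, "trunk_804": 90.0,
                "arc_802": 0.0, "arc_804": 180.0}

def propagate(distance, target_shape):
    """Return the (dx, dy) applied to the linked control point, moving the
    given distance along the target shape's stored orientation."""
    angle = math.radians(orientations[target_shape])
    return (distance * math.cos(angle), distance * math.sin(angle))

# Input 1102 moves control point 826 a distance d at 180 degrees (arc_804);
# the corresponding modification 1104 moves linked control point 814 the
# same distance at 0 degrees, the stored orientation of arc_802:
propagate(5.0, "arc_802")   # -> (5.0, 0.0)
```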
Although fig. 10 and 11 depict examples in which the constituent shape is modified via movement of a control point, any suitable modification may be used. For example, a change in the expansion of the first constituent shape (e.g., a change in the expansion width, a change in the expansion angle, etc.) may result in a corresponding change in the expansion of the second constituent shape (e.g., an equal or proportional change in the expansion width, an equal or proportional change in the size of the expansion angle, etc.).
In some embodiments, changes in control point location, expansion, etc. may result from user input modifying one or more of the design parameters described above with respect to fig. 2-7. For example, the thickness parameter value may be modified for the first character via the design interface 700, causing the typeface development platform 104 to change the control point parameters (e.g., the location of the control point, the expansion of the control point, or both). Changing the control point parameter may involve selecting an intermediate control point parameter value (e.g., intermediate position or extension) corresponding to the modified value of the design parameter from the graphical control data set output by process 200. The typeface development platform 104 applies the corresponding changes to the second constituent shape from the linked character. In some embodiments, applying the corresponding change may involve selecting an intermediate control point parameter value for the linked control point from the graphic control data set, wherein the intermediate control point parameter value also corresponds to the modified value of the design parameter. The typeface development platform 104 updates the design of each linked character (e.g., including or derived from the selected control point parameter values) by computing one or more new curves based on the control points having the modified control point parameter values.
In some embodiments, the typeface development platform 104 performs one or more validation operations when generating mappings between characters. For example, the typeface development platform 104 receives one or more inputs including a request to map a first constituent shape from a first character to a second constituent shape from a second character. The typeface development platform 104 validates the requested mapping based on one or more attributes of the two characters.
In one example, the typeface development platform 104 compares constituent shapes. If the constituent shapes are sufficiently similar (e.g., two semicircular arcs), the typeface development platform 104 performs the requested mapping. If the constituent shapes are not sufficiently similar (e.g., semicircular arcs and trunks), the typeface development platform 104 rejects the requested mapping.
In another example, the typeface development platform 104 compares the orientation angles of the constituent shapes. If the constituent shapes share a common orientation about any axis, the typeface development platform 104 performs the requested mapping. For example, in fig. 8, the semicircular arcs of characters 802 and 804 share the same orientation about the vertical axis. If the constituent shapes lack a common orientation about any axis, the typeface development platform 104 rejects the requested mapping. For example, while the "t" character and the "x" character both contain two intersecting stems, the stems of the "t" character are oriented at 0 degrees and 90 degrees, while the stems of the "x" character are oriented at 45 degrees and 135 degrees. Thus, the typeface development platform 104 will reject the requested mapping between the "t" character and the "x" character.
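A hedged sketch of this validation, assuming that "sufficiently similar" means the same kind of constituent shape and that a common orientation is checked up to reflection about the horizontal or vertical axis, might look like this.

```python
# Illustrative sketch: accept a requested mapping only if the constituent
# shapes are of the same kind and share an orientation about some axis.

def shares_axis_orientation(angles_a, angles_b):
    """True if any orientation of shape A coincides with one of shape B,
    directly or after reflection about the vertical or horizontal axis."""
    candidates = set()
    for a in angles_a:
        candidates.add(a % 360)            # same orientation
        candidates.add((180 - a) % 360)    # reflected about the vertical axis
        candidates.add((-a) % 360)         # reflected about the horizontal axis
    return any(b % 360 in candidates for b in angles_b)

def validate_mapping(kind_a, angles_a, kind_b, angles_b):
    if kind_a != kind_b:                   # e.g., arc vs. trunk: reject
        return False
    return shares_axis_orientation(angles_a, angles_b)

# Arcs of "b" (0 deg) and "d" (180 deg) share an orientation about the
# vertical axis -> accepted. Stems of "t" (0, 90) vs. "x" (45, 135) -> rejected.
```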
Examples of model-based control of typeface design modifications
As described in detail with respect to the various examples below, in some embodiments the creative apparatus 102 is also used for model-based control of typeface designs. For example, the machine learning model 115 is trained to identify certain expected visual features of different typeface characters across different fonts. When a typeface manipulation application, such as the typeface processing application 106 or the typeface design application 108, receives a modification to the design of a particular character, the modification is analyzed based on the trained machine learning model 115. If the modification reduces the ability of the machine learning model 115 to recognize the character, the typeface manipulation application notifies the user that the modification may be undesirable.
FIG. 12 depicts an example of a process 1200 for automatically controlling modification of a typeface design using a machine learning model 115. In some embodiments, the one or more processors of the creative apparatus 102, the one or more user devices 120 a-120 n, or some combination thereof implement the operations depicted in fig. 12 by executing suitable program code (e.g., the typeface development platform 104). For illustrative purposes, the process 1200 is described with reference to certain examples depicted in the accompanying drawings. Other implementations are possible.
At block 1202, process 1200 involves providing a design interface for modifying a design of an input character from a typeface. To implement block 1202, the typeface development platform 104 generates a graphical interface for modifying the design of one or more characters in the template typeface 114. Fig. 13 depicts an example of a design interface 1300 generated by the typeface development platform 104. In this example, design interface 1300 is used to modify the design of input character 1302 (e.g., character "b"). For example, the typeface development platform 104 may receive one or more inputs via the design interface 1300 that modify the stem 1304, the stem 1304 being the constituent shape of the input character 1302.
In some embodiments, the design interface provided at block 1202 is a graphical interface (such as the design interface 700) used by the typeface design application 108. Other implementations are possible. For example, the design interface provided at block 1202 may include the preview functionality of the development interface 300 used by the typeface processing application 106 in process 200.
In some embodiments, providing the graphical interface involves transmitting the graphical interface from the creative apparatus 102 to one or more user devices, such as the developer devices 120a through 120n or the designer devices 122a through 122n. For example, the one or more processors of the creative apparatus 102 configure the transmission device of the creative apparatus 102 to transmit the graphical interface to the user devices via the one or more data networks 118. The creative apparatus 102 then receives input from the user device via the data network 118, where the input includes (or is derived from) user input received at the user device via the graphical interface provided by the creative apparatus 102.
In additional or alternative embodiments, providing the graphical interface involves displaying the graphical interface locally at a computing system executing the content management application. For example, at least one processor included in the creative apparatus 102 or the user device transmits the graphical interface to the display device via a bus connecting the processor to the display device, where the display device is also included in the creative apparatus 102 or the user device. The processor then receives input via the bus from one or more input devices (e.g., touch screen, mouse, etc.) for interacting with the graphical interface.
Returning to FIG. 12, at block 1204, process 1200 involves accessing the machine learning model 115, trained with the training typefaces 116, to identify the input character as a reference character. For example, the typeface development platform 104 retrieves the trained machine learning model 115 from the typeface design repository 112. Retrieving the trained machine learning model 115 from the typeface design repository 112 may involve accessing a remote data source via a data network (e.g., a particular server device, a volume in a storage area network, network-attached storage, etc.) or accessing a local data source via a data bus (e.g., a local non-transitory computer-readable medium communicatively coupled to a processing device via a data bus).
In some embodiments, the typeface training module 110 trains the machine learning model 115. During training, various training typefaces 116 are accessed by the typeface training module 110 and used to train the machine learning model 115 to classify particular shapes as certain reference characters. For example, the typeface training module 110 accesses different training graphics of a reference character, where each graphic is taken from a different training typeface 116. Each training graphic includes a set of Bézier curves defined by control points (e.g., Bézier curves defining the skeleton of a character, curves defining the contour of a character, etc.). The typeface training module 110 trains the machine learning model 115 to classify each set of Bézier curves, each set of control points, or some combination thereof as the reference character.
In some embodiments, the typeface training module 110 trains the machine learning model 115 to associate feature vectors of different typeface patterns with particular reference characters. For example, to train the machine learning model 115 to identify a particular reference character, the typeface development platform 104 generates feature vectors for various different patterns depicting the reference character, wherein the different patterns are acquired from different typefaces. The feature vectors define clusters of points in the vector space. The machine learning model 115 is trained to associate reference characters with regions of vector space that include at least a portion of the clusters.
At block 1206, process 1200 involves receiving, via a design interface, an input modifying a design of an input character. The input indicates one or more changes to one or more parameters of the input character (e.g., angle of the trunk, curvature of the semicircular arc, thickness of one or more constituent shapes, etc.).
For example, FIG. 14 depicts an example of an input 1402 modifying the design of an input character 1302. In this illustrative example, input 1402 involves: positioning a mouse cursor over a control point of the trunk 1304, selecting the control point, and changing the angle of the trunk 1304 by dragging the cursor in a given direction, as shown by the dashed arrow in fig. 14. However, any suitable input (e.g., clicking, pinching, scrolling, typing parameter values, etc.) may be received via the design interface 1300.
In some embodiments, the input received at block 1206 relates to one or more design inputs received by a typeface design application, as described above with respect to FIG. 7. In additional or alternative embodiments, the input received at block 1206 includes one or more adjustment inputs received by the typeface processing application 106 during process 200, as described above with respect to FIGS. 3 through 6. For example, the input may involve modifying the position of a control point, modifying an expansion angle, changing the scale of a control point parameter, and so on. The typeface processing application 106 generates one or more temporary character graphics based on the graphics control data generated by these inputs. The typeface processing application 106 uses the machine learning model 115 to analyze one or more of these temporary character graphics.
Returning to FIG. 12, at block 1208, process 1200 involves determining that the machine learning model 115 cannot match the reference character with the input character having the modified design. For example, the typeface development platform 104 creates a comparative version of the input character from the modified design. The comparative version of the input character is stored in memory accessible to the typeface development platform 104. The typeface development platform 104 applies the machine learning model 115 to the input character. For example, the typeface development platform 104 provides the comparative version of the input character, data derived from the comparative version of the input character (e.g., one or more feature vectors), or some combination thereof as input data to the machine learning model. The machine learning model 115 attempts to classify the input data as a particular reference character. If the machine learning model 115 cannot classify the input data, or classifies the input data as an incorrect reference character, the typeface development platform 104 determines that the input character with the modified design cannot be classified as the correct reference character by the machine learning model 115. (In various embodiments, the comparative version of the input character may or may not be displayed in the design interface before the typeface development platform 104 applies the machine learning model 115 to it.)
The machine learning model 115 may be used to perform any suitable comparison of an input character having a modified design with one or more reference versions of the same character. In some embodiments, the typeface development platform 104 determines whether a particular control point or group of control points has been moved outside of a particular region defined by the machine learning model 115. For example, the machine learning model 115 may be trained to classify various sets of control points from different typefaces as related reference characters. The input received at block 1206 may move one or more input control points of the input character from an initial position to a changed position. The initial position of an input control point is within a region defined by various reference control points corresponding to the reference character in the machine learning model 115. The changed position of the input control point is outside this region. The typeface development platform 104 determines that the machine learning model 115 cannot recognize the input character as the reference character based on the input control point being outside this region.
In an additional or alternative embodiment, the typeface development platform 104 uses the vector-based clusters described above to determine whether an input character with a modified design can be classified as a reference character using the machine learning model 115. For example, the typeface development platform 104 generates input feature vectors from input characters having a modified design. The input feature vectors define points in the vector space used by the machine learning model 115. As explained above in the example with respect to block 1204, the machine learning model 115 associates a particular region of vector space with a reference character. If the point defined by the generated input feature vector is outside this region of vector space, the typeface development platform 104 determines that the input character with the modified design cannot be classified as a reference character using the machine learning model 115.
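For illustration, the region test can be sketched as below, modeling each reference character's region of the vector space as a ball around its cluster centroid; the geometry and names are assumptions, not the patent's model.

```python
# Illustrative sketch: the modified character's feature vector must fall
# inside the reference character's region of the vector space, modeled here
# as a ball around the cluster centroid learned from the training typefaces.
import numpy as np

def classify(feature_vec, centroids, radii):
    """Return the reference character whose region contains the vector
    (all arguments are numpy arrays / dicts of arrays), or None if the
    vector falls outside every region."""
    for char, centroid in centroids.items():
        if np.linalg.norm(feature_vec - centroid) <= radii[char]:
            return char
    return None

# If classify(...) returns None or a different character than expected, the
# platform treats the input character with the modified design as unrecognized.
```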
At block 1210, process 1200 involves outputting, via the design interface, an indicator that the input character having the modified design is not recognized as the reference character. To implement block 1210, the typeface development platform 104 generates an updated version of the graphical interface that includes one or more indicators that the modified design was not identified as the reference character. Various options for these indicators include adding a warning to the graphical interface, rejecting the proposed design modification, substituting an alternative design modification, and so on.
In some embodiments, the typeface development platform 104 accepts the modified design even though the modified design prevents the machine learning model 115 from recognizing the input character as the reference character. For example, the typeface development platform 104 continues to update the design interface to display the input character with the modified design. The typeface development platform 104 also presents a warning that the input character is not recognized as the reference character. Examples of warnings include graphical indicators displayed in the typeface development platform 104, warning text displayed in the typeface development platform 104, audible indicators, and the like.
For example, FIG. 15 depicts an example of a design interface 1300 in which an input 1502 results in a modified design that prevents a machine learning model 115 from identifying an input character 1302 as an appropriate reference character. Thus, at block 1210, the typeface development platform 104 updates the interface 1300 to include the alert 1504, the alert 1504 informing the user that the proposed change is "invalid". Other examples of suitable alerts include modifying one or more visual features in interface 1300 (such as highlighting a portion of interface 1300 where a design change occurred) and modifying one or more visual features of input character 1302 (such as changing a color of trunk 1304) to indicate that the design change renders input character 1302 unrecognizable.
In additional or alternative embodiments, the typeface development platform 104 rejects the modified design if the modified design prevents the machine learning model 115 from recognizing the input character as the reference character. In one example, the typeface development platform 104 maintains the design of the input character as it was displayed before the modification input was received. The typeface development platform 104 outputs a message or other indicator informing the user that the modification has been rejected. In another example, the typeface development platform 104 responds to the input by applying an alternative modification to the design of the input character. For example, if the input moves a portion of the input character along a particular path (e.g., drags a stem of the character), the typeface development platform 104 identifies different modifications of the design corresponding to different points along the path (e.g., different orientations of the stem as the stem is dragged along the path). The typeface development platform 104 uses the machine learning model 115 to analyze one or more of these different modifications to determine whether the various modified designs prevent the input character from being identified as the reference character. The typeface development platform 104 selects the last modified design along the path of the input before the input character becomes unrecognizable as the alternative modification to the design (e.g., the maximum angle of the trunk that still allows the input character to be identified as the reference character).
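The search for the alternative modification can be sketched as follows, assuming candidate designs are sampled at successive points along the drag path (all names hypothetical).

```python
# Illustrative sketch: keep the last design along the drag path that the
# model still recognizes as the reference character.

def last_valid_design(designs_along_path, recognizes):
    """designs_along_path: candidate designs ordered along the input path;
    recognizes: callable returning True while the model still matches the
    reference character. Returns the last recognizable design, or None."""
    best = None
    for design in designs_along_path:
        if not recognizes(design):
            break           # beyond this point the character is unrecognizable
        best = design
    return best
```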
In additional or alternative embodiments, the typeface development platform 104 provides temporary warnings. The temporary warning indicates that the modification to the design has not made the trained machine learning model 115 unable to recognize the character, but may do so if the modification continues. As a simplified example, if a character has a trunk (e.g., a "b" character or a "h" character), then modification to the design may involve changing the angle of the trunk. If the angle of the stem exceeds a particular angle, the machine learning model 115 is unable to recognize the character. The typeface development platform 104 provides the user with a temporary warning that the angle has not been exceeded but is approaching being exceeded.
For example, fig. 16 depicts an example of a design interface 1300 in which an input 1602 results in a modified design that reduces the ability of the machine learning model 115 to recognize an input character 1302 as an appropriate reference character. Input 1602 has modified the angle of trunk 1304 to a point that is close to the position depicted in fig. 15. Thus, at block 1210, the typeface development platform 104 updates the interface 1300 to include an alert 1604, the alert 1604 informing the user that the proposed modification is close to "invalidating" the modified design according to the machine learning model 115. Other examples of suitable alerts include modifying one or more visual features in interface 1300 (such as highlighting a portion of interface 1300 where a design change occurred) and modifying one or more visual features of input character 1302 (such as changing a color of trunk 1304) to indicate that the design modification is close to making input character 1302 unrecognizable.
The typeface development platform 104 may use the machine learning model 115 to alert the user to any design changes in the characters that may prevent the machine learning model 115 from recognizing the characters. For example, the typeface development platform 104 receives temporary modifications to the design of the input characters. The temporary modification may be a modification that does not result in unrecognizable input characters. The typeface development platform 104 generates the input feature vector from the input characters having the design with the temporary modifications, as discussed above with respect to block 1206. Further, as explained above in the example with respect to block 1204, the machine learning model 115 associates a particular region of vector space with a reference character.
The typeface development platform 104 identifies the boundary of this region. Input feature vectors defining points outside the boundary are not identifiable as the reference character. The typeface development platform 104 also identifies a threshold distance from the boundary of the region in the vector space. The typeface development platform 104 uses the threshold distance to determine whether a temporary modification indicates a potential design change that would render the input character unrecognizable. For example, while the machine learning model 115 still matches the reference character with the input character having the design with the temporary modification, the typeface development platform 104 determines that the input feature vector identifies a position in the vector space within the threshold distance from the boundary. Based on this determination, the typeface development platform 104 outputs a warning in the design interface. The warning indicates that the temporary modification has reduced the ability of the machine learning model 115 to classify the input character with the temporary modification as the reference character.
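Continuing the centroid-ball assumption from the earlier sketch, the temporary-warning test might be implemented along these lines (illustrative only).

```python
# Illustrative sketch: the modified design is still recognized, but its
# feature vector lies within a threshold distance of the region boundary,
# so a warning is raised before recognition is lost.
import numpy as np

def check_modification(feature_vec, centroid, radius, threshold):
    """Classify the modification as ok, near the boundary, or invalid."""
    dist = np.linalg.norm(feature_vec - centroid)
    if dist > radius:
        return "invalid"                 # outside the region: unrecognized
    if radius - dist < threshold:
        return "warning"                 # recognized, but near the boundary
    return "ok"
```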
FIG. 17 depicts an example of iteratively training the machine learning model 115 so that training graphics 1702a through 1702c, which depict the same reference character 1706, generate a set of training feature vectors 1704a through 1704c occupying a common region of the feature vector space. The machine learning model 115 performs one or more suitable transformation functions (e.g., neural network operations) to compute the training feature vectors 1704a through 1704c from the training graphics 1702a through 1702c. In this example, each training graphic depicts the same reference character "b" in a different typeface. Any number of training graphics (e.g., 500 to 1000) may be used to train the machine learning model 115.
In this example, the creative apparatus 102 provides data from the training graphics 1702a through 1702c, retrieved from the training typefaces 116, as input to the machine learning model 115. The machine learning model 115 includes one or more layers, each of which includes one or more nodes. Applying the machine learning model 115 to a particular training graphic generates a corresponding training feature vector. For example, the training feature vectors 1704a through 1704c are vector representations of features from the training graphics 1702a through 1702c.
Fig. 18 depicts an example of feature vector space 1802 occupied by points defined by various training feature vectors. For illustrative purposes, feature vector space 1802 is depicted as having three dimensions. However, the feature vector space used by the typeface training module 110 may have any number of dimensions (e.g., the number of dimensions in each training feature vector).
The typeface training module 110 performs a validation operation 1708 that determines whether the machine learning model 115 should be adjusted. In some embodiments, the validation operation involves determining whether points in the feature vector space 1802 that correspond to the same reference character 1706 are sufficiently close together. For example, if the machine learning model 115 is optimized (or otherwise adjusted) to be suitable for use by the typeface training module 110, training feature vectors corresponding to similar constituent shapes should define points that are close together in the feature vector space 1802. In the example depicted in FIG. 18, region 1804 of the feature vector space 1802 includes a set of "b" points defined by feature vectors of various "b" character graphics that are clustered around a first centroid. Region 1806 of the feature vector space 1802 includes a set of "z" points defined by feature vectors of various "z" character graphics that are clustered around a second centroid. Region 1808 of the feature vector space 1802 includes a set of "t" points defined by feature vectors of various "t" character graphics that are clustered around a third centroid.
As depicted in fig. 18, trained machine learning model 115 generates feature vectors defining clusters of points corresponding to the same reference character for different typefaces of the reference character. For example, four different typefaces of the "b" character result in a set of points within region 1804. Further, different regions, such as region 1806, include clusters of points defined by feature vectors corresponding to different typefaces of the "z" character. Thus, the machine learning model 115 is able to recognize characters depicted using different typefaces, and is also able to distinguish between two different characters, even though they use the same typeface.
Returning to fig. 17, if verification operation 1708 identifies one or more errors in training feature vectors (e.g., feature vectors that are not sufficiently close, feature vectors that are not sufficiently far apart, etc.), typeface training module 110 performs model adjustment 1710. Model adjustment 1710 includes one or more modifications such as, but not limited to, changing the number of nodes in machine learning model 115, changing the number of layers in machine learning model 115, changing one or more mapping functions used in machine learning model 115, changing the number of dimensions included in the training feature vectors output by machine learning model 115, and the like.
Automatically controlling modifications to typeface designs may support various enhancements to typeface design applications. As one example, one or more of the embodiments described above may implement automatic suggestions for manipulating various typographic parameters (e.g., weight, width, contrast, slant, bow, x-height, taper, and letter spacing). As another example, one or more of the embodiments described above may enable automatic suggestions for manipulating a typeface's serifs, including their width, height, bracket radius, bracket angle, and slab angle.
Examples of selectively enabling touchpad functionality in a graphical interface
In some embodiments, the creative apparatus 102 or another computing system implements touchpad functionality. For example, some content management applications, such as (but not limited to) typeface design applications, may execute on devices having smaller displays. The touchpad functionality can improve the end-user experience on these devices by switching a graphical interface, such as (but not limited to) a typeface design interface, between a touchpad mode and a non-touchpad mode. In the touchpad mode, an input area of the graphical interface other than an active control element (e.g., a selected control point) is configured to ignore inputs directed to other control elements, and the input area is instead treated as a touchpad that affects the active control element. In the non-touchpad mode, the various control elements are enabled according to their standard functionality.
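The following Python sketch captures the two modes and the routing rule they imply. The class and attribute names are hypothetical, chosen only to mirror the description above.

from enum import Enum, auto

class InterfaceMode(Enum):
    NON_TOUCHPAD = auto()   # every control element handles its own input
    TOUCHPAD = auto()       # the input area drives only the active element

class GraphicalInterface:
    def __init__(self):
        self.mode = InterfaceMode.NON_TOUCHPAD
        self.active_element = None

    def route(self, event, element_under_pointer):
        """Return the control element that should receive this input."""
        if self.mode is InterfaceMode.TOUCHPAD:
            return self.active_element          # other elements are ignored
        return element_under_pointer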
FIG. 19 depicts an example of a process 1900 for selectively enabling touchpad functionality in a graphical interface. In some embodiments, one or more processors of the creative apparatus 102, one or more of the user devices 120a-120n, or some combination thereof, implement the operations depicted in FIG. 19 by executing suitable program code (e.g., the typeface design application 108 of the typeface development platform 104, a separate content management application, etc.). For illustrative purposes, the process 1900 is described with reference to certain examples depicted in the accompanying drawings. Other implementations are possible.
At block 1902, the process 1900 involves providing a graphical interface that includes a first control element for performing a first manipulation of graphical content and a second control element for performing a second manipulation of the graphical content. For example, a content management application, such as the typeface development platform 104 with the typeface design application 108, generates a graphical interface for manipulating one or more types of graphical content. Examples of graphical content include combinations of shapes (e.g., a set of curves defining characters in a template typeface 114), text, image content, and so on. One or more regions of the graphical interface display the graphical content. One or more regions of the graphical interface also include control elements. Examples of control elements include control points along a curve, buttons, selectable regions (e.g., selectable text), text cursors, and the like.
In some embodiments, providing the graphical interface involves transmitting the graphical interface from the creative apparatus 102 to one or more user devices 120a-120n. For example, one or more processors of the creative apparatus 102 configure a transmission device of the creative apparatus 102 to transmit the graphical interface to a user device via the data network 118. The creative apparatus 102 then receives input from the user device via the data network 118, where the input includes (or is derived from) user input received at the user device via the graphical interface provided by the creative apparatus 102. In additional or alternative embodiments, providing the graphical interface involves displaying the graphical interface locally at a computing system executing the content management application. For example, at least one processor included in the creative apparatus 102 or the user device transmits the graphical interface to a display device via a bus connecting the processor to the display device, where the display device is also included in the creative apparatus 102 or the user device. The processor then receives, via the bus, input from one or more input devices (e.g., a touch screen, a mouse, etc.) used to interact with the graphical interface.
At block 1904, the process 1900 involves switching the graphical interface to a touchpad mode that disables the second control element and thereby prevents the second control element from performing the second manipulation. For example, the content management application (e.g., the typeface development platform 104 with the typeface design application 108) generates an updated version of the graphical interface in which control elements are enabled or disabled according to the touchpad mode.
FIG. 20 depicts an example of a graphical interface 2000 in which the graphical content is a character 2002 from one of the template typefaces 114. The graphical interface 2000 includes a touchpad area 2001 in which the character 2002 is displayed. The character 2002 includes various control points, such as control points 2004 and 2006, and various curves defined with respect to those control points. For example, control point 2004 is located in the middle of one curve and control point 2006 is located in the middle of another curve. If the graphical interface 2000 is switched to the touchpad mode, one or more control elements displayed in the touchpad area 2001, including control points 2004 and 2006, may be selectively disabled.
In some embodiments, a graphical interface with a touchpad area also includes a non-touchpad area. Additional control elements are located in the non-touchpad area. The graphical interface maintains the functionality of these additional control elements whether or not the graphical interface is in the touchpad mode. For example, in the example depicted in FIG. 20, the graphical interface 2000 also includes a non-touchpad area 2008. The non-touchpad area 2008 includes additional control elements 2010, such as an "undo" button, an "upload" button, and the like. The control elements included in the non-touchpad area 2008 are not affected by the touchpad mode. Thus, the graphical interface maintains the functionality of these control elements (e.g., the "undo" button, the "upload" button, etc.) both in the touchpad mode and in the non-touchpad mode.
In some embodiments, the typeface design application 108 switches the graphical interface to the touchpad mode in response to selection of the first control element. In one example involving modification of a typeface design, the graphical interface is switched to the touchpad mode in response to receiving a selection of one or more first control points (i.e., the first control element of block 1902). For instance, FIG. 21 depicts an example in which selection of control point 2006 causes the graphical interface 2000 to be switched to the touchpad mode. For illustrative purposes, the selection of control point 2006 is depicted using a cursor input 2102, and control point 2006 is enlarged and changes color. Any suitable input (e.g., a double tap or double click) may select a control element, and any suitable visual indicator may be used to identify a control element that remains enabled in the touchpad mode.
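Extending the mode sketch above, block 1904 might be implemented as follows; the event object, its kind attribute, and the highlight() call are hypothetical.

def on_control_point_selected(interface, control_point, event):
    if event.kind == "double_tap":               # one possible trigger
        interface.active_element = control_point
        interface.mode = InterfaceMode.TOUCHPAD
        control_point.highlight(enlarged=True)   # visual indicator, e.g.
                                                 # enlarged and recolored
                                                 # as in FIG. 21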
Returning to FIG. 19, at block 1906, the process 1900 involves receiving, while the graphical interface is in the touchpad mode, an input in an input area of the graphical interface that lacks the first control element. For example, the content management application detects or otherwise receives one or more input events via the graphical interface. An input event is generated by interactions occurring in an input area of the graphical interface. The input area includes any portion of the graphical interface having event-based code (e.g., an event listener, event handler, event source, etc.) for generating input events. Examples of input events include touch events (e.g., taps, swipes, etc.), drag inputs, mouse clicks, typed inputs (e.g., tabs or spaces), and so on.
In embodiments involving a cloud-based or other remote content management application (e.g., the typeface development platform 104), receiving the input involves a computing system, such as the creative apparatus 102, communicating with one or more user devices and thereby receiving the input via the data network 118. For example, one or more processors of the creative apparatus 102 configure a receiver device of the creative apparatus 102 to receive communications from a user device via the data network 118 (e.g., during a session established between the typeface design application 108 and a client application on the user device). The receiver device receives a communication via the data network 118, and the processor obtains data describing the input from the communication. The input includes (or is derived from) user input received at the user device via the graphical interface. In additional or alternative embodiments, receiving the input involves transmitting the input locally from an input device of a local computing system (e.g., a tablet computer) to a processor of the same computing system via a bus. For example, at least one input device included in the creative apparatus 102 or the user device is used to interact with the graphical interface displayed on a display device. These interactions are captured as inputs provided to the processor via a bus connecting the processor to the input device.
At block 1908, the process 1900 involves performing the first manipulation of the graphical content in response to receiving the input. In some embodiments, a content management application, such as the typeface processing application 106 or the typeface design application 108, responds to the input by updating the graphical interface. To update the graphical interface, the content management application moves the first control element in a manner responsive to the input received at block 1906. Moving the first control element causes a corresponding manipulation of the graphical content to be performed.
For example, FIG. 22 depicts an example of receiving an input in the touchpad mode that causes a corresponding manipulation of graphical content. In this example, an input 2202 is received in an input area (i.e., a portion of the touchpad area 2001) that includes various other control points, such as the disabled control point 2004. The input 2202 is a drag action in the input area, which is indicated by a dashed arrow in FIG. 22. The typeface design application 108 responds to the input 2202 by performing a manipulation 2204. The manipulation 2204 involves moving control point 2006 along a path, indicated by another dashed arrow in FIG. 22, that corresponds to the input 2202. Moving control point 2006 causes the curve containing control point 2006 to lengthen. (For illustrative purposes, FIG. 22 depicts the input 2202 using a mouse cursor, but a touch input or another input without a mouse cursor may also be used.)
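For concreteness, the geometry of manipulation 2204 might look like the following sketch, which applies the drag delta to the active control point and re-evaluates the curve through it. Treating the curve as a quadratic Bezier segment is an assumption for illustration.

import numpy as np

def apply_drag(control_point, drag_start, drag_end):
    """Translate the active control point by the drag delta."""
    return control_point + (drag_end - drag_start)

def quadratic_bezier(p0, p1, p2, samples=32):
    t = np.linspace(0.0, 1.0, samples)[:, None]
    return (1 - t) ** 2 * p0 + 2 * (1 - t) * t * p1 + t ** 2 * p2

p0, p2 = np.array([0.0, 0.0]), np.array([10.0, 0.0])   # fixed endpoints
control_point_2006 = np.array([5.0, 4.0])
moved = apply_drag(control_point_2006,
                   drag_start=np.array([2.0, 2.0]),
                   drag_end=np.array([2.0, 6.0]))
curve = quadratic_bezier(p0, moved, p2)   # the curve lengthens as the
                                          # control point moves outward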
The touchpad mode allows inputs received anywhere in the touchpad area to affect the active control element, even if those portions of the touchpad area include other control elements that would otherwise respond to inputs received there. In one example, the first control element is active and the graphical interface is in the touchpad mode. An area that includes the disabled second control element receives an input. The typeface design application 108 applies the received input to the first control element (i.e., the active control element) rather than the second control element (i.e., the disabled control element). For instance, in the example of FIG. 22, the input 2202 includes a path through control point 2004. But because control point 2004 is disabled, control point 2006 is active, and the interface 2000 is in the touchpad mode, the input 2202 is applied to control point 2006 instead of control point 2004.
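In terms of the mode sketch above, this routing behavior reduces to the following usage; the string identifiers stand in for the control point objects of FIG. 22.

interface = GraphicalInterface()
interface.mode = InterfaceMode.TOUCHPAD
interface.active_element = "control_point_2006"

# A drag that begins over the disabled control point 2004 is still
# delivered to the active control point 2006.
target = interface.route(event="drag",
                         element_under_pointer="control_point_2004")
assert target == "control_point_2006"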
At block 1910, the process 1900 involves switching the graphical interface away from the touchpad mode after performing the first manipulation, thereby enabling the second control element to perform the second manipulation. To implement block 1910, the content management application detects one or more inputs or other events (e.g., expiration of a timer, selection of a different active window, etc.) indicating that the touchpad mode should be deactivated. The content management application responds to the input or other event by enabling one or more control elements that were disabled in the touchpad mode.
In some embodiments, at least one input used to perform the manipulation also indicates that the touchpad mode should be deactivated. In this way, the content management application enters and exits the touchpad mode in a manner specific to a particular content manipulation. For example, if the touchpad mode is enabled in response to a particular control element being selected (e.g., clicking the control point in FIG. 21), the touchpad mode is disabled in response to completion of a particular manipulation involving that control element. For instance, the input for performing the manipulation may be a drag input, which involves a start input (e.g., pressing a mouse button or making contact with a touch screen), a movement input, and an end input (e.g., releasing the mouse button or breaking contact with the touch screen). The content management application may respond to detecting the end input by completing the manipulation and deactivating (i.e., exiting) the touchpad mode.
In additional or alternative embodiments, a separate input or event indicates that the touchpad mode should be deactivated. In one example, if the touchpad mode is enabled in response to a particular control element being selected (e.g., a first click performed on the control point in FIG. 21), the touchpad mode is disabled in response to the control element being deselected (e.g., a second click performed on the control point). In another example, the touchpad mode may be deactivated by receiving some "cancel" command (e.g., an "escape" key press, a click on a "back" button, etc.). In another example, the touchpad mode may be deactivated in response to some other event, such as a timer that is started when the touchpad mode is activated or when an input is received in the touchpad mode, and that expires without another input being received in the touchpad mode. In another example, the touchpad mode is deactivated in response to an event indicating that the user's attention has shifted, such as a different application or other window being selected as the active window on the display device.
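The deactivation triggers listed above might be handled in one place, as in the following sketch; the event kinds are hypothetical names for the triggers just described.

def exit_touchpad_mode(interface):
    interface.mode = InterfaceMode.NON_TOUCHPAD
    interface.active_element = None              # disabled elements re-enable

def on_event(interface, event):
    if event.kind in ("drag_end",                   # manipulation completed
                      "deselect",                   # second click on the point
                      "escape_key", "back_button",  # explicit cancel commands
                      "idle_timer_expired",         # timer ran out without input
                      "window_deactivated"):        # user's attention shifted
        exit_touchpad_mode(interface)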
The touchpad mode may be implemented in any suitable manner. In some embodiments, activating the touchpad mode involves deactivating event-based code (e.g., event listeners, event handlers, event sources, etc.) associated with the disabled control elements. For example, in the example depicted in FIGS. 20-22, the event-based code of control point 2004 is enabled in the non-touchpad mode. That event-based code responds to events involving manipulation of the control point. The event-based code of control point 2004 is disabled in the touchpad mode. Thus, the content management application using graphical interface 2000 does not recognize "clicks" or other events that may occur with respect to control point 2004. In additional or alternative embodiments, the touchpad mode involves activating dedicated event-based code (e.g., event listeners, event handlers, event sources, etc.) associated with the enabled control element. For example, in the example depicted in FIGS. 20-22, the event-based code of control point 2006 is enabled in the touchpad mode. That event-based code listens for events (e.g., drag inputs) occurring throughout the touchpad area 2001 and performs the applicable processes (e.g., movement of control point 2006) in response to those events.
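A minimal dispatcher makes this listener bookkeeping concrete: entering the touchpad mode detaches the handlers of disabled control points and attaches a touchpad-wide handler for the active one. The registry API is an assumption, not the application's actual event system.

class EventDispatcher:
    def __init__(self):
        self.handlers = {}   # (element_id, event_kind) -> callable

    def add(self, element_id, kind, handler):
        self.handlers[(element_id, kind)] = handler

    def remove(self, element_id, kind):
        self.handlers.pop((element_id, kind), None)

    def dispatch(self, element_id, kind, event):
        handler = self.handlers.get((element_id, kind))
        if handler is not None:
            handler(event)

dispatcher = EventDispatcher()
# Entering touchpad mode: control point 2004 stops hearing clicks, while
# drags anywhere in touchpad area 2001 move control point 2006.
dispatcher.remove("control_point_2004", "click")
dispatcher.add("touchpad_area_2001", "drag",
               lambda event: print("move control point 2006 along", event))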
FIGS. 23-25 depict another example of using the process 1900. In this example, FIG. 23 depicts a graphical interface 2302 for an email application, where the graphical interface 2302 includes a non-touchpad area 2304 and a touchpad area 2306. The touchpad area 2306 displays graphical content that includes visual representations 2310, 2312, 2314, and 2316. The visual representations 2310, 2312, 2314, and 2316 correspond to different email messages, where each visual representation is a row identifying a message's sender, subject, and received date. The visual representations 2310, 2312, 2314, and 2316 may be (or include) control elements. For example, moving or otherwise manipulating such a control element causes a change with respect to the represented message. For instance, moving a message's control element (e.g., the visual representation of the message) to the "delete" icon 2308 causes the message to be deleted.
In FIG. 24, the graphical interface 2302 receives an input 2402 that causes an appropriate content manipulation application (e.g., the email application) to switch the graphical interface 2302 to the touchpad mode. In this example, the input 2402 involves selecting the visual representation 2310 (e.g., using a double click or other input indicating that the touchpad mode should be activated). Switching the graphical interface 2302 to the touchpad mode causes inputs received in various regions of the touchpad area (e.g., regions where visual representation 2312, 2314, or 2316 is located) to be applied to the active control element (i.e., visual representation 2310). For example, FIG. 25 depicts a drag input 2502 that begins with a click in the region of the graphical interface 2302 where visual representation 2312 is located. The drag input 2502 ends at the region of the graphical interface 2302 where visual representation 2316 is located. (For illustrative purposes, FIG. 25 depicts the input 2502 using a mouse cursor, but a touch input or another input without a mouse cursor may also be used.) The touchpad mode disables the control elements of visual representations 2312, 2314, and 2316. Thus, the drag input is applied to the active control element (i.e., visual representation 2310), which causes the visual representation 2310 to be moved to the "delete" icon 2308.
FIGS. 26-28 depict another example of using the process 1900. In this example, FIG. 26 depicts a graphical interface 2602 displaying graphical content that includes a text line 2604 and a text line 2606. The graphical interface 2602 also includes control elements, such as a cursor 2608 with an end 2610. Another example of a control element is selectable text. For instance, in the non-touchpad mode, any point on text line 2606 may receive a touch input that causes a particular character to be selected.
In FIG. 27, the graphical interface 2602 receives an input 2702 that causes an appropriate content manipulation application (e.g., a text editing application) to switch the graphical interface 2602 to the touchpad mode. For example, the input 2702 may include a double click or double tap on the end 2610 of the cursor 2608. (Although FIG. 27 depicts the input 2702 using a mouse cursor, a touch input or another input without a mouse cursor may also be used.) Switching the graphical interface 2602 to the touchpad mode causes inputs received in various regions of the graphical interface 2602 (e.g., a touch input on text line 2606) to be applied to the active control element (i.e., the cursor 2608). For example, FIG. 28 depicts a drag input 2802 that moves along text line 2606.
The touchpad mode disables one or more control elements that would otherwise apply this drag input 2802 to text line 2606. For example, in the non-touchpad mode, the content management application displaying the interface 2602 would execute event-specific code that listens for events involving the displayed text of text line 2606, such as clicks, drags, keyboard entries, and so on. In the touchpad mode, the content management application disables this event-specific code, executes alternative event-specific code that ignores inputs affecting the text of line 2606, or some combination thereof. The alternative event-specific code manages how, in the touchpad mode, events involving inputs in regions of the graphical interface 2602 are applied to the active element (such as the cursor 2608). Because the content management application executes the alternative event-specific code, the drag input 2802 is applied to the active control element (i.e., the cursor 2608), which causes the cursor to move along the path of text line 2604 instead of the path of text line 2606, thereby selecting text line 2604 via the drag input 2802 received in the touchpad mode.
Examples of computing systems for implementing various embodiments
Any suitable computing system or group of computing systems may be used to perform the operations described herein. For example, FIG. 29 depicts an example of a computing system 2900 for implementing one or more embodiments of the disclosure. In some embodiments, the computing system 2900 is the creative apparatus 102 and includes all of the computing modules and datasets depicted in FIG. 29. In other embodiments, different computing systems (e.g., the creative apparatus 102 and one or more user devices) having devices similar to those depicted in FIG. 29 (e.g., processors, memory, etc.) execute different subsets of the modules depicted in FIG. 29, execute alternative applications (e.g., content management applications other than typeface design applications), store different subsets of the datasets depicted in FIG. 29, or some combination thereof.
The depicted example of a computing system 2900 includes a processor 2902 communicatively coupled to one or more memory devices 2904. The processor 2902 executes computer-executable program code stored in the memory device 2904, accesses information stored in the memory device 2904, or both. Examples of processor 2902 include a microprocessor, an application specific integrated circuit ("ASIC"), a field programmable gate array ("FPGA"), or any other suitable processing device. The processor 2902 may include any number of processing devices, including a single processing device.
The memory device 2904 includes any suitable non-transitory computer-readable medium for storing data, program code, or both. A computer-readable medium can include any electronic, optical, magnetic, or other storage device capable of providing computer-readable instructions or other program code to a processor. Non-limiting examples of a computer-readable medium include a magnetic disk, a memory chip, ROM, RAM, an ASIC, optical storage, magnetic tape or other magnetic storage, or any other medium from which a processing device can read instructions. The instructions may include processor-specific instructions generated by a compiler or interpreter from code written in any suitable computer programming language, including, for example, C, C++, C#, Visual Basic, Java, Python, Perl, JavaScript, and ActionScript.
Computing system 2900 may also include a number of external or internal devices, such as input or output devices. For example, computing system 2900 is shown with one or more input/output ("I/O") interfaces 2908. The I/O interface 2908 may receive input from an input device (e.g., a mouse, touch screen, keyboard, microphone, etc.), or provide output to an output device (e.g., a display device, touch screen, speaker, etc.). One or more buses 2906 are also included in computing system 2900. Bus 2906 communicatively couples one or more components of computing system 2900.
The computing system 2900 executes program code that configures the processor 2902 to perform one or more operations described herein. For example, the program code includes a typeface development platform 104, one or more component modules thereof (e.g., typeface processing application 106, typeface design application 108, typeface training module 110, etc.), or other suitable applications that perform one or more of the operations described herein. The program code may reside in the memory device 2904 or any suitable computer readable medium and may be executed by the processor 2902 or any other suitable processor. In some embodiments, the program code is stored in the memory device 2904, as depicted in fig. 29. In additional or alternative embodiments, the program code described above is stored in one or more other memory devices accessible via a data network.
The computing system 2900 may access various data objects and data structures, such as the template typefaces 114, the machine learning model 115, and the training typefaces 116. These data objects and data structures (the use of which is described above) may be accessed in any suitable manner. In some embodiments, some or all of these data objects and data structures are stored in the memory device 2904, as in the example depicted in FIG. 29. For example, a computing system 2900 executing the typeface development platform 104 may provide external systems with access to the template typefaces 114, the machine learning model 115, or other data stored in the typeface design repository 112.
In additional or alternative embodiments, some or all of these data objects and data structures are stored in the same memory device (e.g., one of the memory devices 2904). For example, a common computing system, such as the creative apparatus 102 depicted in FIG. 1, may host the typeface development platform 104 and store one or more of the datasets included in the typeface design repository 112. In additional or alternative embodiments, some or all of these data objects and data structures are stored in one or more other memory devices accessible via a data network.
The computing system 2900 also includes a network interface device 2910. The network interface device 2910 includes any device or group of devices suitable for establishing a wired or wireless data connection to one or more data networks. Non-limiting examples of the network interface device 2910 include an Ethernet network adapter, a modem, and the like. The computing system 2900 is able to communicate with one or more other computing devices (e.g., computing devices executing the typeface development platform 104) via a data network using the network interface device 2910.
In some embodiments, the computing system 2900 also includes the presentation device 2912 depicted in FIG. 29. A presentation device 2912 can include any device or group of devices suitable for providing visual, auditory, or other suitable sensory output. Non-limiting examples of the presentation device 2912 include a touch screen, a monitor, a speaker, a separate mobile computing device, and the like. In some aspects, the presentation device 2912 can include a remote client computing device, such as a user device depicted in FIG. 1, that communicates with the computing system 2900 using one or more data networks described herein. Other aspects can omit the presentation device 2912.
Additional features, aspects, and embodiments are provided in the following entities:
entity 1. A method for generating graphics control data for performing skeleton-based modification of a typeface design, the method comprising:
Accessing, by a typeface processing application executed by one or more processing devices, a character graphic of a character from a typeface, the character graphic comprising (i) a character skeleton comprising a set of control points and a set of curves defined by the set of control points, and (ii) a character outline comprising one or more shapes surrounding the character skeleton;
calculating, for a design parameter of a computer-implemented typeface design application, a graphic control data set based on a particular control point from the set of control points, wherein calculating the graphic control data set comprises:
identifying a pair of positions of the particular control point that respectively correspond to a pair of design parameter values of the design parameter,
identifying a pair of extensions of the character outline with respect to the particular control point, wherein the pair of extensions respectively correspond to the pair of design parameter values, and
generating the graphic control data set comprising (i) intermediate positions of the particular control point between the pair of positions, and (ii) intermediate extensions of the character outline between the pair of extensions; and
outputting the graphic control data set from the typeface processing application to the typeface design application, wherein the typeface design application is configured to display, in response to selection of a design parameter value, a modified character design comprising a modified curve generated from a portion of the graphic control data set.
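Outside the claim language, the computation recited in entity 1 can be sketched as follows: given a control point's positions and outline extensions at the two ends of a design parameter's range, intermediate values are generated between them. Linear interpolation and the step count are illustrative assumptions; the claim only requires intermediates between the pair.

import numpy as np

def graphic_control_data(pos_lo, pos_hi, ext_lo, ext_hi, steps=5):
    """Return intermediate positions and outline extensions of one
    control point across `steps` values of a design parameter."""
    t = np.linspace(0.0, 1.0, steps)[:, None]
    positions = (1 - t) * np.asarray(pos_lo) + t * np.asarray(pos_hi)
    extensions = (1 - t) * np.asarray(ext_lo) + t * np.asarray(ext_hi)
    return positions, extensions

# Hypothetical values for a "weight" parameter: the control point's
# position and the stroke width (extension) at the thin and thick ends.
positions, extensions = graphic_control_data(
    pos_lo=(10.0, 40.0), pos_hi=(12.0, 44.0),
    ext_lo=(1.5,), ext_hi=(6.0,))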
Entity 2. The method of entity 1, wherein each extension from the pair of extensions includes a respective angle, with respect to the particular control point, at which the character outline expands or contracts in response to a change in a design parameter value.
Entity 3. The method of entity 1, wherein each extension from the pair of extensions includes a respective rate at which a width of the character outline changes in response to a change in a design parameter value.
Entity 4. The method of entity 1, wherein generating the graphic control data set further comprises identifying a scale associated with the particular control point, wherein:
(a) the scale indicates (i) a spacing between a pair of adjacent intermediate positions corresponding to a pair of adjacent values of the design parameter, or (ii) a spacing between a pair of adjacent intermediate extensions corresponding to the pair of adjacent values of the design parameter, and
(b) each intermediate position or intermediate extension is generated by modifying an adjacent position or adjacent extension according to the spacing.
Entity 5. The method according to entity 4, further comprising:
calculating, for an additional design parameter of the typeface design application, additional graphic control data based on the particular control point, wherein calculating the additional graphic control data comprises:
identifying a pair of additional positions of the particular control point that respectively correspond to a pair of design parameter values of the additional design parameter,
identifying a pair of additional extensions of the character outline with respect to the particular control point that respectively correspond to the pair of design parameter values of the additional design parameter, and
generating the additional graphic control data comprising (i) additional intermediate positions of the particular control point between the pair of additional positions, and (ii) additional intermediate extensions of the character outline between the pair of additional extensions,
wherein each additional intermediate position or additional intermediate extension is generated by modifying an additional adjacent position or extension according to an additional spacing,
wherein the additional spacing is different from the spacing described above and is specified by an additional scale associated with the additional design parameter,
wherein the additional graphic control data is included in the graphic control data set output to the typeface design application.
Entity 6. The method of entity 5, further comprising generating, by the typeface processing application or the typeface design application, the modified character design by performing operations comprising:
receiving a first user parameter value of the design parameter and a second user parameter value of the additional design parameter;
selecting a first control point parameter value corresponding to the first user parameter value and a second control point parameter value corresponding to the second user parameter value, wherein the first control point parameter value is one of the intermediate positions or one of the intermediate extensions, and the second control point parameter value is one of the additional intermediate positions or one of the additional intermediate extensions;
computing a combined control point parameter value from the first control point parameter value and the second control point parameter value;
assigning the combined control point parameter value to the particular control point; and
computing one of the modified curves from the particular control point having the combined control point parameter value.
Entity 7. The method of entity 1, further comprising applying, by the typeface design application, linked character design modifications to the character and an additional character from the typeface, wherein the linked character design modifications comprise:
mapping the particular control point of the character to an additional control point of the additional character, wherein the mapping indicates a similarity between a first constituent shape from the character and a second constituent shape from the additional character;
modifying the first constituent shape from the character, wherein modifying the first constituent shape includes one or more of:
a movement in a position of the particular control point, and
a change in an extension with respect to the particular control point; and
modifying the second constituent shape from the additional character based on the mapping, wherein modifying the second constituent shape includes one or more of:
moving a position of the additional control point by a distance or in a direction corresponding to the movement in the position of the particular control point, and
changing a width or angle of an extension with respect to the additional control point to correspond to the change in the extension with respect to the particular control point.
Entity 8. The method of entity 7, wherein modifying the first constituent shape includes selecting, from the graphic control data set and in response to a changed value of the design parameter, a first intermediate position or extension of the particular control point,
wherein modifying the second constituent shape includes selecting, from the graphic control data set, a second intermediate position or extension of the additional control point corresponding to the design parameter having the changed value,
wherein the design parameter corresponds to both (i) the first intermediate position or extension and (ii) the second intermediate position or extension.
Entity 9. The method of entity 7, further comprising:
receiving, by the typeface design application, a request to map the first constituent shape from the character to a third constituent shape from a third character; and
rejecting, by the typeface design application, the request based on one or more of:
the first constituent shape being different from the third constituent shape, and
the first constituent shape lacking a common orientation about any axis with the third constituent shape.
Entity 10. The method of entity 1, further comprising performing, by the typeface design application, a comment capture operation, wherein the comment capture operation comprises:
receiving, from a first client computing device accessing the typeface design application, a comment input indicating a portion of the character that includes the particular control point;
capturing comment data that includes a current value of the design parameter and a corresponding character graphic generated for the design parameter during a period in which the comment input is received; and
transmitting the comment data to a second client computing device capable of accessing the typeface processing application.
An entity 11, a system comprising:
A processing device; and
A non-transitory computer readable medium communicatively coupled to a processing device, wherein the processing device is configured to execute a typeface processing application stored in the non-transitory computer readable medium and thereby perform operations comprising:
accessing a character graphic of a character from a typeface, the character graphic comprising (i) a character skeleton comprising a set of control points and a set of curves defined by the set of control points, and (ii) a character outline comprising one or more shapes surrounding the character skeleton;
calculating, for a design parameter of a computer-implemented typeface design application, a graphic control data set based on a particular control point from the set of control points, wherein calculating the graphic control data set comprises:
identifying a pair of positions of the particular control point that respectively correspond to a pair of design parameter values of the design parameter,
identifying a pair of extensions of the character outline with respect to the particular control point, wherein the pair of extensions respectively correspond to the pair of design parameter values, and
generating the graphic control data set comprising (i) intermediate positions of the particular control point between the pair of positions, and (ii) intermediate extensions of the character outline between the pair of extensions; and
outputting the graphic control data set to the typeface design application.
Entity 12. The system of entity 11, wherein each extension from the pair of extensions includes a respective angle, with respect to the particular control point, at which the character outline expands or contracts in response to a change in a design parameter value.
Entity 13. The system of entity 11, wherein each extension from the pair of extensions includes a respective rate at which a width of the character outline changes in response to a change in a design parameter value.
Entity 14. The system of entity 11, wherein generating the graphic control data set further comprises identifying a scale associated with the particular control point, wherein:
(a) the scale indicates (i) a spacing between a pair of adjacent intermediate positions corresponding to a pair of adjacent values of the design parameter, or (ii) a spacing between a pair of adjacent intermediate extensions corresponding to the pair of adjacent values of the design parameter, and
(b) each intermediate position or intermediate extension is generated by modifying an adjacent position or adjacent extension according to the spacing.
Entity 15. The system of entity 14, wherein the operations further comprise:
calculating, for an additional design parameter of the typeface design application, additional graphic control data based on the particular control point, wherein calculating the additional graphic control data comprises:
identifying a pair of additional positions of the particular control point that respectively correspond to a pair of design parameter values of the additional design parameter,
identifying a pair of additional extensions of the character outline with respect to the particular control point that respectively correspond to the pair of design parameter values of the additional design parameter, and
generating the additional graphic control data comprising (i) additional intermediate positions of the particular control point between the pair of additional positions, and (ii) additional intermediate extensions of the character outline between the pair of additional extensions,
wherein each additional intermediate position or additional intermediate extension is generated by modifying an additional adjacent position or extension according to an additional spacing,
wherein the additional spacing is different from the spacing described above and is specified by an additional scale associated with the additional design parameter,
wherein the additional graphic control data is included in the graphic control data set output to the typeface design application.
Entity 16. A non-transitory computer readable medium having program code for a typeface processing application, the program code stored on the non-transitory computer readable medium and executable by one or more processing devices to perform operations comprising:
accessing a character graphic of a character from a typeface, the character graphic comprising (i) a character skeleton comprising a set of control points and a set of curves defined by the set of control points, and (ii) a character outline comprising one or more shapes surrounding the character skeleton;
calculating, for a design parameter of a typeface design application, a graphic control data set based on a particular control point from the set of control points, wherein calculating the graphic control data set comprises:
identifying a pair of positions of the particular control point that respectively correspond to a pair of design parameter values of the design parameter,
identifying a pair of extensions of the character outline with respect to the particular control point, wherein the pair of extensions respectively correspond to the pair of design parameter values, and
generating the graphic control data set comprising (i) intermediate positions of the particular control point between the pair of positions, and (ii) intermediate extensions of the character outline between the pair of extensions; and
outputting the graphic control data set from the typeface processing application to the typeface design application.
Entity 17. The non-transitory computer-readable medium of entity 16, wherein generating the graphic control data set further comprises identifying a scale associated with the particular control point, wherein:
(a) the scale indicates (i) a spacing between a pair of adjacent intermediate positions corresponding to a pair of adjacent values of the design parameter, or (ii) a spacing between a pair of adjacent intermediate extensions corresponding to the pair of adjacent values of the design parameter, and
(b) each intermediate position or intermediate extension is generated by modifying an adjacent position or adjacent extension according to the spacing.
Entity 18. The non-transitory computer-readable medium of entity 17, wherein the program code further comprises the typeface design application, and wherein the operations further comprise generating, in response to selection of a design parameter value, a modified character design comprising a modified curve generated from a portion of the graphic control data set, wherein generating the modified character design comprises:
receiving a first user parameter value of the design parameter and a second user parameter value of the design parameter;
selecting a first control point parameter value corresponding to the first user parameter value and a second control point parameter value corresponding to the second user parameter value, wherein the first control point parameter value is one of the intermediate positions or one of the intermediate extensions, and the second control point parameter value is one of the additional intermediate positions or one of the additional intermediate extensions;
computing a combined control point parameter value from the first control point parameter value and the second control point parameter value;
assigning the combined control point parameter value to the particular control point; and
computing one of the modified curves from the particular control point having the combined control point parameter value.
Entity 19. The non-transitory computer-readable medium of entity 16, wherein the program code further comprises the typeface design application, and wherein the operations further comprise applying linked character design modifications to the character and an additional character from the typeface, wherein the linked character design modifications comprise:
mapping the particular control point of the character to an additional control point of the additional character, wherein the mapping indicates a similarity between a first constituent shape from the character and a second constituent shape from the additional character;
modifying the first constituent shape from the character, wherein modifying the first constituent shape includes one or more of:
a movement in a position of the particular control point, and
a change in an extension with respect to the particular control point; and
modifying the second constituent shape from the additional character based on the mapping, wherein modifying the second constituent shape includes one or more of:
moving a position of the additional control point by a distance or in a direction corresponding to the movement in the position of the particular control point, and
changing a width or angle of an extension with respect to the additional control point to correspond to the change in the extension with respect to the particular control point.
Entity 20. The non-transitory computer-readable medium of entity 16, wherein the program code further comprises the typeface design application, and wherein the operations further comprise performing a comment capture operation, wherein the comment capture operation comprises:
receiving, from a first client computing device accessing the typeface design application, a comment input indicating a portion of the character that includes the particular control point;
capturing comment data that includes a current value of the design parameter and a corresponding character graphic generated for the design parameter during a period in which the comment input is received; and
transmitting the comment data to a second client computing device capable of accessing the typeface processing application.
Additional features, aspects, and embodiments are provided in the following items:
item 1. A method for automatically controlling modification of a typeface design, the method comprising:
providing, by a typeface design application executed by one or more processing devices, a design interface for modifying a design of an input character from a typeface;
Accessing, by the typeface design application, a machine learning model trained with a plurality of training typefaces to identify an input character as a reference character;
receiving, via a design interface, an input modifying a design of an input character;
Determining that the machine learning model cannot match the reference character with the input character having the modified design; and
Outputting, via the design interface, an indicator that the input character having the modified design is not recognized as the reference character.
Item 2. The method of item 1, wherein outputting the indicator comprises:
Updating the design interface to display the input character with the modified design; and
Presenting, via the design interface, a warning that the input character is not recognized as the reference character.
Item 3. The method of item 1, wherein outputting the indicator comprises outputting, via the design interface, a rejection of the modification of the design specified by the input.
Item 4. The method of item 3, wherein outputting the indicator further comprises:
applying an alternative modification to the design of the input character in response to the input;
determining that the machine learning model matches the reference character with the input character having the alternative modification to the design; and
updating the design interface to display the input character with the alternative modification to the design.
Item 5. The method of item 1, further comprising:
prior to accessing the machine learning model:
accessing a first graphic of the reference character from a first training typeface and a second graphic of the reference character from a second training typeface, and
Training a machine learning model to classify (i) a first set of control points from a first graphic as reference characters and (ii) a second set of control points from a second graphic as reference characters;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
Identifying a changed position of an input control point from the input character, wherein the changed position is indicated by an input modifying the design of the input character, wherein a previous position of the input control point is within an area defined by (i) a first reference control point from a first set of control points and (ii) a second reference control point from a second set of control points, and
Determining that the changed position of the input control point is outside of the area defined by the first reference control point and the second reference control point.
Item 6. The method of item 1, further comprising:
prior to accessing the machine learning model:
Accessing a first graphic of a reference character from a first training typeface and a second graphic of a reference character from a second training typeface,
Generating a first feature vector from the first graphic and a second feature vector from the second graphic,
Assigning the first feature vector and the second feature vector to clusters in a region of vector space, and
Training a machine learning model to associate regions of vector space with reference characters;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
Generating an input feature vector from an input character having a modified design, and
Determining that the input feature vector is outside of the region of vector space associated with the reference character.
Item 7. The method of item 1, further comprising: before receiving input to modify the design:
Receiving additional input via the design interface, the additional input applying a temporary modification to the design of the input character;
generating an input feature vector from an input character having a design with temporary modifications;
identifying, from the machine learning model, a boundary of a region of vector space associated with the reference character and a threshold distance from the boundary;
determining (i) that the machine learning model matches the reference character with an input character having a design with temporary modifications, and (ii) that the input feature vector identifies a position in vector space within a threshold distance from the boundary; and
Outputting, via the design interface and based on the input feature vector being within the threshold distance from the boundary, an alert indicating that the temporary modification reduces the machine learning model's ability to classify the input character with the temporary modification as the reference character.
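Outside the claim language, items 6 and 7 can be illustrated with a distance test: membership in the reference character's region is decided by distance to its centroid, and a warning fires when a still-matching vector drifts within the threshold distance of the boundary. Modeling the region as a ball around a centroid is an assumption for illustration.

import numpy as np

def classify_with_warning(input_vec, centroid, radius, warn_margin):
    d = np.linalg.norm(np.asarray(input_vec) - np.asarray(centroid))
    if d > radius:
        return "rejected"               # no longer recognized as the character
    if d > radius - warn_margin:
        return "matched_with_warning"   # within the threshold distance of
                                        # the boundary (item 7's alert)
    return "matched"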
Item 8. A system, comprising:
A processing device; and
A non-transitory computer readable medium communicatively coupled to a processing device, wherein the processing device is configured to execute a typeface design application stored in the non-transitory computer readable medium and thereby perform operations comprising:
providing a design interface for modifying a design of an input character from a typeface,
accessing a machine learning model trained with a plurality of training typefaces to identify the input character as a reference character,
receiving, via the design interface, an input modifying the design of the input character,
determining that the machine learning model cannot match the reference character with the input character having the modified design, and
outputting, via the design interface, an indicator that the input character having the modified design is not recognized as the reference character.
Item 9. The system of item 8, wherein outputting the indicator comprises:
Updating the design interface to display the input character with the modified design; and
Presenting, via the design interface, a warning that the input character is not recognized as the reference character.
Item 10. The system of item 8, wherein outputting the indicator includes outputting, via the design interface, a rejection of the modification of the design specified by the input.
Item 11. The system of item 10, wherein outputting the indicator further comprises:
applying an alternative modification to the design of the input character in response to the input;
determining that the machine learning model matches the reference character with the input character having the alternative modification to the design; and
updating the design interface to display the input character with the alternative modification to the design.
Item 12. The system of item 8, the operations further comprising:
prior to accessing the machine learning model:
accessing a first graphic of the reference character from a first training typeface and a second graphic of the reference character from a second training typeface, and
Training a machine learning model to classify (i) a first set of control points from a first graphic as reference characters and (ii) a second set of control points from a second graphic as reference characters;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
Identifying a changed position of an input control point from the input character, wherein the changed position is indicated by an input modifying the design of the input character, wherein a previous position of the input control point is within an area defined by (i) a first reference control point from a first set of control points and (ii) a second reference control point from a second set of control points, and
Determining that the changed position of the input control point is outside of the area defined by the first reference control point and the second reference control point.
Item 13. The system of item 8, the operations further comprising:
prior to accessing the machine learning model:
Accessing a first graphic of a reference character from a first training typeface and a second graphic of a reference character from a second training typeface,
Generating a first feature vector from the first graphic and a second feature vector from the second graphic,
Assigning the first feature vector and the second feature vector to clusters in a region of vector space, and
Training a machine learning model to associate regions of vector space with reference characters;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
Generating an input feature vector from an input character having a modified design, and
Determining that the input feature vector is outside of the region of vector space associated with the reference character.
Item 14. The system of item 8, the operations further comprising: before receiving input to modify the design:
Receiving additional input via the design interface, the additional input applying a temporary modification to the design of the input character;
generating an input feature vector from an input character having a design with temporary modifications;
identifying, from the machine learning model, a boundary of a region of vector space associated with the reference character and a threshold distance from the boundary;
determining (i) that the machine learning model matches the reference character with an input character having a design with temporary modifications, and (ii) that the input feature vector identifies a position in vector space within a threshold distance from the boundary; and
Outputting, via the design interface and based on the input feature vector being within the threshold distance from the boundary, an alert indicating that the temporary modification reduces the machine learning model's ability to classify the input character with the temporary modification as the reference character.
Item 15. A non-transitory computer readable medium having program code for a typeface design application, the program code stored on the non-transitory computer readable medium and executable by one or more processing devices to perform operations comprising:
providing a design interface for modifying a design of an input character from the typeface;
Accessing a machine learning model trained with a plurality of training typefaces to identify the input character as a reference character;
receiving, via a design interface, an input modifying a design of an input character;
Determining that the machine learning model cannot match the reference character with the input character having the modified design; and
Outputting, via the design interface, an indicator that the input character having the modified design is not recognized as the reference character.
Item 16. The non-transitory computer-readable medium of item 15, wherein outputting the indicator comprises:
Updating the design interface to display the input character with the modified design; and
Displaying, via the design interface, a warning that the input character is not recognized as the reference character.
Item 17. The non-transitory computer-readable medium of item 15, wherein outputting the indicator comprises:
outputting, via the design interface, a rejection of the modification of the design specified by the input;
applying a substitute modification to the design of the input character in response to the input;
determining that the machine learning model matches the reference character with the input character having the substitute modification to the design; and
updating the design interface to display the input character with the substitute modification to the design.
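One hypothetical realization of the substitute-modification flow of item 17: reject the requested edit, then scale the edit back along its own direction until the model matches again. The `matches` callback stands in for any recognizer; nothing here is prescribed by the item itself.

```python
from typing import Callable
import numpy as np

def substitute_modification(original: np.ndarray, requested: np.ndarray,
                            matches: Callable[[np.ndarray], bool]) -> np.ndarray:
    """Reject a requested edit the model cannot match, then search back
    along the edit direction for the largest partial edit that is still
    recognized as the reference character."""
    if matches(requested):
        return requested                   # no substitution needed
    for t in np.linspace(0.9, 0.0, 10):    # shrink the edit step by step
        candidate = original + t * (requested - original)
        if matches(candidate):
            return candidate               # the substitute modification
    return original                        # fall back to a full rejection

# Toy recognizer: anything within unit distance of the origin matches.
matches = lambda v: float(np.linalg.norm(v)) <= 1.0
print(substitute_modification(np.array([0.2, 0.0]),
                              np.array([3.0, 0.0]), matches))  # ~[0.76, 0.]
```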
Item 18. The non-transitory computer-readable medium of item 15, the operations further comprising:
prior to accessing the machine learning model:
accessing a first graphic of a reference character from a first training typeface and a second graphic of the reference character from a second training typeface, and
training the machine learning model to classify (i) a first set of control points from the first graphic as the reference character and (ii) a second set of control points from the second graphic as the reference character;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
identifying a changed position of an input control point from the input character, wherein the changed position is indicated by the input modifying the design of the input character, and wherein a previous position of the input control point is within an area defined by (i) a first reference control point from the first set of control points and (ii) a second reference control point from the second set of control points, and
determining that the changed position of the input control point is outside of the area defined by the first reference control point and the second reference control point.
Item 19. The non-transitory computer-readable medium of item 15, the operations further comprising:
prior to accessing the machine learning model:
accessing a first graphic of a reference character from a first training typeface and a second graphic of the reference character from a second training typeface,
generating a first feature vector from the first graphic and a second feature vector from the second graphic,
assigning the first feature vector and the second feature vector to a cluster in a region of vector space, and
training the machine learning model to associate the region of vector space with the reference character;
wherein determining that the machine learning model is unable to match the reference character with the input character having the modified design comprises:
generating an input feature vector from the input character having the modified design, and
determining that the input feature vector is outside of the region of vector space associated with the reference character.
Item 20. The non-transitory computer-readable medium of item 15, the operations further comprising: before receiving input to modify the design:
receiving additional input via the design interface, the additional input applying a temporary modification to the design of the input character;
generating an input feature vector from the input character having the temporarily modified design;
identifying, from the machine learning model, a boundary of a region of vector space associated with the reference character and a threshold distance from the boundary;
determining (i) that the machine learning model matches the reference character with the input character having the temporarily modified design, and (ii) that the input feature vector identifies a position in vector space within the threshold distance from the boundary; and
outputting an alert via the design interface and based on the input feature vector being within the threshold distance from the boundary, the alert indicating that the temporary modification reduces the ability of the machine learning model to classify the input character with the temporary modification as the reference character.
General considerations
Numerous specific details are set forth herein to provide a thorough understanding of the claimed subject matter. However, it will be understood by those skilled in the art that the claimed subject matter may be practiced without these specific details. In other instances, methods, apparatuses, or systems known by those of ordinary skill have not been described in detail so as not to obscure the claimed subject matter.
It should be appreciated that, throughout the specification, discussions utilizing terms such as "processing," "computing," "calculating," "determining," and "identifying" or the like refer to the actions and processes of a computing device, such as one or more computers or similar electronic computing devices, that manipulates and transforms data represented as physical electronic or magnetic quantities within the memories, registers, or other information storage devices, transmission devices, or display devices of the computing platform.
The one or more systems discussed herein are not limited to any particular hardware architecture or configuration. The computing device may include any suitable arrangement of components that provide results conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computer systems that access stored software that programs or configures the computing system from a general-purpose computing device to a special-purpose computing device that implements one or more embodiments of the present subject matter. The teachings contained herein may be implemented in software for programming or configuring a computing device using any suitable programming, scripting, or other type of language or combination of languages.
Embodiments of the methods disclosed herein may be performed in the operation of such a computing device. The order of the blocks presented in the above examples may be varied, e.g., the blocks may be reordered, combined, and/or divided into sub-blocks. Some blocks or processes may be performed in parallel.
The use of "adapted to" or "configured to" herein is meant as open and inclusive language that does not exclude devices adapted or configured to perform additional tasks or steps. Likewise, the use of "based on" is meant to be open and inclusive, in that a process, step, calculation, or other action "based on" one or more stated conditions or values may, in practice, be based on additional conditions or values beyond those stated. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude the inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art.

Claims (15)

1. A method for selectively enabling a touch pad function in a graphical interface, the method comprising:
providing, by a content manipulation application executed by one or more processing devices, a graphical interface for editing graphical content, the graphical interface having a first control element for performing a first manipulation of the graphical content and a second control element for performing a second manipulation of the graphical content;
switching the graphical interface to a touch pad mode in response to a selection of the first control element, wherein the touch pad mode disables the second control element and thereby prevents the second control element from performing the second manipulation, and wherein the second control element remains disabled while additional input is provided in the graphical interface until the graphical interface is switched out of the touch pad mode;
receiving a first input different from the selection of the first control element in an input region of the graphical interface lacking the first control element while the graphical interface is in the touch pad mode;
performing the first manipulation of the graphical content in response to receiving the first input in the input region of the graphical interface that lacks the first control element; and
after performing the first manipulation, switching the graphical interface out of the touch pad mode and thereby enabling the second control element to perform the second manipulation.
2. The method according to claim 1, wherein:
the graphical content includes characters from a typeface;
the first control element is a first control point;
the first manipulation includes modifying a curve with the first control point;
the second control element is a second control point; and
the second manipulation includes modifying a curve with the second control point.
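Claims 2-4 tie each control element to curve editing through control points. As a generic illustration only (a quadratic Bézier segment; the claims do not fix a curve model), moving a control point reshapes the curve it governs:

```python
def quadratic_bezier(p0, p1, p2, t):
    """Evaluate a quadratic Bezier curve at parameter t in [0, 1];
    p1 is the control point whose position shapes the curve."""
    x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
    y = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
    return (x, y)

anchor_start, anchor_end = (0.0, 0.0), (10.0, 0.0)
print(quadratic_bezier(anchor_start, (5.0, 4.0), anchor_end, 0.5))  # (5.0, 2.0)
print(quadratic_bezier(anchor_start, (5.0, 8.0), anchor_end, 0.5))  # (5.0, 4.0)
# Dragging the control point upward pulls the midpoint of the curve upward.
```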
3. The method of claim 2, wherein the graphical interface is switched to the touch pad mode in response to receiving a selection of the first control point via the graphical interface,
Wherein the graphical interface is switched out of the touch pad mode in response to receiving a deselection of the first control point via the graphical interface.
4. The method of claim 3, wherein the deselection of the first control point comprises a selection of the second control point.
5. The method of claim 1, wherein the content manipulation application updates the graphical interface to move the first control element in response to the first input, wherein moving the first control element effects the first manipulation.
6. The method of claim 1, wherein the input area comprises the second control element, wherein the first input is applied to the first control element instead of the second control element.
7. The method of claim 1, wherein the graphical interface comprises (i) a touch pad area in which the first and second control elements are located, and (ii) a non-touch pad area in which a third control element is located, wherein the graphical interface retains functionality of the third control element both in the touch pad mode and out of the touch pad mode.
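The mode switching of claims 1, 3, 4, and 7 can be pictured with the following toy model (all class and method names are invented): selecting a control point enters the touch pad mode and disables the other control elements, and deselecting it, including by selecting another control point, switches the interface back out.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ControlElement:
    name: str
    enabled: bool = True

@dataclass
class GraphicalInterface:
    elements: dict[str, ControlElement] = field(default_factory=dict)
    touchpad_mode: bool = False
    selected: Optional[str] = None

    def select(self, name: str) -> None:
        """Enter touch pad mode: disable every other control element."""
        self.selected = name
        self.touchpad_mode = True
        for other, elem in self.elements.items():
            elem.enabled = (other == name)

    def deselect(self) -> None:
        """Leave touch pad mode: re-enable all control elements."""
        self.touchpad_mode = False
        self.selected = None
        for elem in self.elements.values():
            elem.enabled = True

    def drag(self, dx: float, dy: float) -> None:
        """In touch pad mode, input anywhere in the input region --
        even over a disabled element -- manipulates only the selection."""
        if self.touchpad_mode:
            print(f"moving {self.selected} by ({dx}, {dy})")

ui = GraphicalInterface({n: ControlElement(n) for n in ("p1", "p2")})
ui.select("p1")   # enter touch pad mode; p2 is disabled
ui.drag(4, -2)    # a drag in the empty input region moves p1
ui.select("p2")   # selecting another point moves the selection over
ui.deselect()     # leave touch pad mode; everything is re-enabled
```

A non-touch pad area as in claim 7 would simply hold control elements that `select` never touches.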
8. An electronic system, comprising:
a processing device; and
a non-transitory computer readable medium communicatively coupled to the processing device, wherein the processing device is configured to execute a content manipulation application stored in the non-transitory computer readable medium and thereby perform operations comprising:
providing a graphical interface for editing graphical content, the graphical interface displaying a first control element for performing a first manipulation of the graphical content, an input area devoid of the first control element, and a second control element for performing a second manipulation of the graphical content,
switching the graphical interface to a touch pad mode in response to a selection of the first control element, wherein the touch pad mode disables the second control element and thereby prevents the second control element from performing the second manipulation, and wherein the second control element remains disabled while additional input is provided in the graphical interface until the graphical interface is switched out of the touch pad mode,
receiving a first input different from the selection of the first control element in the input area of the graphical interface lacking the first control element while the graphical interface is in the touch pad mode,
performing the first manipulation of the graphical content in response to receiving the first input in the input area of the graphical interface lacking the first control element, and
after performing the first manipulation, switching the graphical interface out of the touch pad mode and thereby enabling the second control element to perform the second manipulation.
9. The system of claim 8, wherein:
the graphical content includes characters from a typeface;
the first control element is a first control point;
the first manipulation includes modifying a curve with the first control point;
the second control element is a second control point; and
the second manipulation includes modifying a curve with the second control point.
10. The system of claim 9, wherein the processing device is configured to switch the graphical interface to the touch pad mode in response to receiving a selection of the first control point via the graphical interface,
Wherein the processing device is configured to switch the graphical interface out of the touch pad mode in response to receiving a deselection of the first control point via the graphical interface.
11. The system of claim 10, wherein the deselection of the first control point comprises a selection of the second control point.
12. The system of claim 8, wherein the processing device is configured to update the graphical interface to move the first control element in response to the first input, wherein moving the first control element effects the first manipulation.
13. The system of claim 8, wherein the input area comprises the second control element, wherein the first input is applied to the first control element instead of the second control element.
14. The system of claim 8, wherein the graphical interface comprises (i) a touch pad area in which the first and second control elements are located, and (ii) a non-touch pad area in which a third control element is located, wherein the graphical interface is configured to maintain functionality of the third control element both in the touch pad mode and out of the touch pad mode.
15. A non-transitory computer readable medium having program code of a content manipulation application stored thereon and executable by one or more processing devices to perform the method of any of claims 1-7.
CN201810846942.XA 2017-10-06 2018-07-27 Selectively enabling touchpad functionality in a graphical interface Active CN109634504B (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US15/726,909 2017-10-06
US15/726,973 US11488053B2 (en) 2017-10-06 2017-10-06 Automatically controlling modifications to typeface designs with machine-learning models
US15/726,832 2017-10-06
US15/726,832 US10339680B2 (en) 2017-10-06 2017-10-06 Graphics control data for performing skeleton-based modifications of a typeface design
US15/726,909 US10983679B2 (en) 2017-10-06 2017-10-06 Selectively enabling trackpad functionality in graphical interfaces
US15/726,973 2017-10-06

Publications (2)

Publication Number Publication Date
CN109634504A (en) 2019-04-16
CN109634504B (en) 2024-04-19

Family

ID=63518052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810846942.XA Active CN109634504B (en) 2017-10-06 2018-07-27 Selectively enabling touchpad functionality in a graphical interface

Country Status (4)

Country Link
CN (1) CN109634504B (en)
AU (1) AU2018206708B2 (en)
DE (1) DE102018005621A1 (en)
GB (1) GB2567283B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110941388A (en) * 2019-11-27 2020-03-31 北京字节跳动网络技术有限公司 Interface control method, device, terminal and storage medium in task execution process


Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4580231A (en) * 1978-09-15 1986-04-01 Alphatype Corporation Ultrahigh resolution photocomposition system employing electronic character generation from magnetically stored data
US5754187A (en) * 1994-05-16 1998-05-19 Agfa Division, Bayer Corporation Method for data compression of digital data to produce a scaleable font database
US7769222B2 (en) * 2006-10-27 2010-08-03 Mitutoyo Corporation Arc tool user interface
US8754855B2 (en) * 2008-06-27 2014-06-17 Microsoft Corporation Virtual touchpad
KR101504209B1 (en) * 2008-10-16 2015-03-19 엘지전자 주식회사 Mobile terminal having input device with touch sensor and control method thereof
EP2686758B1 (en) * 2011-03-17 2020-09-30 Laubach, Kevin Input device user interface enhancements
JP2013015890A (en) * 2011-06-30 2013-01-24 Toshiba Corp Information processor and method for controlling the same
US9141280B2 (en) * 2011-11-09 2015-09-22 Blackberry Limited Touch-sensitive display method and apparatus
JP2014120091A (en) * 2012-12-19 2014-06-30 Riso Kagaku Corp Electronic apparatus
JP5780438B2 (en) * 2013-05-21 2015-09-16 カシオ計算機株式会社 Electronic device, position designation method and program
CN104166517A (en) * 2014-07-31 2014-11-26 中兴通讯股份有限公司 Method and device for operating touch screen device
JP6430197B2 (en) * 2014-09-30 2018-11-28 株式会社東芝 Electronic apparatus and method

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6459439B1 (en) * 1998-03-09 2002-10-01 Macromedia, Inc. Reshaping of paths without respect to control points
CN1254902A (en) * 1998-11-20 2000-05-31 三星电子株式会社 Device and method for recognizing characters input from touch screen
CN102884491A (en) * 2010-03-18 2013-01-16 克里斯·阿基罗 Actionable-object controller and data-entry attachment for touchscreen-based electronics
CN105144037A (en) * 2012-08-01 2015-12-09 苹果公司 Device, method, and graphical user interface for entering characters

Also Published As

Publication number Publication date
CN109634504A (en) 2019-04-16
DE102018005621A1 (en) 2019-04-11
AU2018206708A1 (en) 2019-05-02
GB201812394D0 (en) 2018-09-12
GB2567283A (en) 2019-04-10
GB2567283B (en) 2021-08-11
AU2018206708B2 (en) 2021-07-22

Similar Documents

Publication Publication Date Title
US20220405647A1 (en) Automatically controlling modifications to typeface designs with machine-learning models
KR102417783B1 (en) Systems and methods for managing digital ink typesetting
US10528236B2 (en) Creating a display pattern for multiple data-bound graphic objects
US10339680B2 (en) Graphics control data for performing skeleton-based modifications of a typeface design
JP3775076B2 (en) Method and system for dynamic grouping of multiple graphic objects
KR101159325B1 (en) System and method for automatic label placement on charts
US9619435B2 (en) Methods and apparatus for modifying typographic attributes
KR102381801B1 (en) Systems and methods for guiding handwriting input
US10101891B1 (en) Computer-assisted image cropping
JP2018533782A (en) Digital notebook taking system and method
US10839139B2 (en) Glyph aware snapping
US20120293558A1 (en) Manipulating graphical objects
JP2019507915A (en) Apparatus and method for note taking using gestures
US10983679B2 (en) Selectively enabling trackpad functionality in graphical interfaces
US11475617B2 (en) Path-constrained drawing with visual properties based on drawing tool
US20190114057A1 (en) Fixing spaced relationships between graphic objects
CN110663017B (en) Multi-stroke intelligent ink gesture language
CN107817935A (en) Display methods, device, terminal and the computer-readable recording medium of application interface
US10475223B2 (en) Generating multiple data-bound graphic objects
CN109634504B (en) Selectively enabling touchpad functionality in a graphical interface
JPH05108786A (en) Method and apparatus for transforming graphic form
JPH064607A (en) Data display device
US10592087B1 (en) System and method for creating fluid design keyframes on graphical user interface
US10410387B2 (en) System and method for generating user interface elements
US20190095084A1 (en) System and Method for Selecting a Time Stamp and Generating User Interface Elements

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant