US20210224946A1 - Glyph Transformations as Editable Text - Google Patents
- Publication number
- US20210224946A1 (U.S. application Ser. No. 16/749,563)
- Authority
- US
- United States
- Prior art keywords
- glyph
- glyphs
- transformation
- bounding box
- relative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G06T 3/10
- G06T 3/0056 — Geometric image transformation in the plane of the image, the transformation method being selected according to the characteristics of the input image
- G06T 11/20 — 2D [Two Dimensional] image generation; drawing from basic elements, e.g. lines or circles
- G06T 3/40 — Geometric image transformation in the plane of the image; scaling the whole image or part thereof
- G06T 3/60 — Geometric image transformation in the plane of the image; rotation of a whole image or part thereof
- G06T 2200/24 — Indexing scheme for image data processing or generation, involving graphical user interfaces [GUIs]
- G06T 2210/12 — Indexing scheme for image generation or computer graphics; bounding box
Definitions
- Conventional systems for creating and/or editing digital content enable users of such systems to modify default positions of glyphs included as part of the digital content. These systems can provide this functionality based on inputs received from user interactions with an input device such as manipulations of the input device to select a glyph and “drag” the glyph within the digital content.
- Some conventional systems generate guides to facilitate precise positioning or transforming of glyphs relative to objects within the digital content. These systems utilize outlines of the glyphs which are compared to outlines of other objects to generate the guides. Once generated, the guides are rendered in a user interface to indicate that the outlines of the glyphs are oriented in a particular manner relative to the outlines of the objects.
- a computing device implements a transformation system to generate bounding boxes for a first glyph and a second glyph of multiple glyphs.
- the transformation system leverages the bounding boxes for the first and second glyphs to generate a multiple glyph bounding box for the multiple glyphs together by concatenating the bounding boxes of the first and second glyphs. In this way, the transformation system generates the multiple glyph bounding box to address both the first and second glyphs together, which is not possible in conventional techniques.
- a user input is received defining a transformation of the multiple glyph bounding box relative to an object.
- the transformation system determines a mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first glyph and the second glyph. For example, the transformation system determines the mapping without converting the multiple glyphs into outlines, thus improving computational efficiency of the computing device.
- the multiple glyphs are rendered in a user interface as editable text having the transformation based on the mapping without further user intervention, which is also not possible in conventional techniques.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ digital systems and techniques for glyph transformations as editable text as described herein.
- FIG. 2 depicts a system in an example implementation showing operation of a transformation module for transforming glyphs as editable text.
- FIG. 3 is an illustration of representations of functionality described with respect to FIG. 2 .
- FIG. 4 is a flow diagram depicting a procedure in an example implementation in which a bounding box for multiple glyphs is generated by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs, a user input defining a transformation of the multiple glyphs is received, and the multiple glyphs are generated for display in a user interface as editable text having the transformation.
- FIG. 5 is an illustration of a representation of a displacement of a single glyph relative to an object using an alignment guide.
- FIG. 6 is an illustration of a representation of a displacement of a single glyph of a text object within the text object using a center alignment guide.
- FIG. 7 is an illustration of a representation of a displacement of a single glyph relative to an object using an equal spacing guide.
- FIG. 8 is an illustration of a representation of a rotation of a single glyph relative to an object using an angle guide.
- FIG. 9 is an illustration of a representation of generation of a bounding box for multiple glyphs by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs.
- FIG. 10 is an illustration of a representation of an example of a displacement of multiple glyphs relative to objects and an example of a horizontal scaling of the multiple glyphs relative to the objects.
- FIG. 11 is an illustration of a representation of an example of a vertical scaling of multiple glyphs relative to objects and an example of a uniform scaling of the multiple glyphs relative to the objects.
- FIG. 12 is an illustration of a representation of an example of character rotation as editable text and an example of word rotation as editable text.
- FIGS. 13A and 13B are illustrations of representations of aspects of word rotation as editable text.
- FIG. 14 is an illustration of a representation of a displacement of multiple glyphs relative to an object using an alignment guide.
- FIG. 15 is an illustration of a representation of a rotation of multiple glyphs relative to an object using an angle guide.
- FIG. 16 illustrates an example system that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein.
- a computing device implements a transformation system to generate bounding boxes for a first glyph and a second glyph of multiple glyphs. These bounding boxes may fully envelop outlines of the first and second glyphs and also precisely bound the outlines such that sides of the bounding boxes are tangent to extreme portions of the outlines.
- the transformation system concatenates the bounding boxes for the first glyph and the second glyph together as a multiple glyph bounding box for the multiple glyphs.
- By concatenating the bounding boxes for the first and second glyphs, for instance, the transformation system generates the multiple glyph bounding box as having a right side equal to a right side of the rightmost bounding box for the first and second glyphs.
- the multiple glyph bounding box is generated as having a left side equal to a left side of the leftmost bounding box for the first and second glyphs.
- Extreme portions of the multiple glyph bounding box are used to identify potential alignments between the first and second glyphs and other objects. These potential alignments can be identified based on transformations of the first and second glyphs relative to the other objects and/or transformations of the other objects relative to the first and second glyphs.
- the transformation system can generate guides based on the potential alignments without converting the first and second glyphs into outlines.
- positions of corners of the multiple glyph bounding box are used to identify potential horizontal, vertical, center, and equal spacing alignments between the first and second glyphs and the other objects.
- Angle line data, e.g., generated due to a rotation of the multiple glyph bounding box, can be used to identify potential angle alignments between the first and second glyphs and the other objects.
- the transformation system renders indications as guides in a user interface based on the identified potential alignments.
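The corner-based identification of potential alignments described above can be sketched as follows. This is a minimal illustration in Python, not the patented implementation; the `Box` record, the tolerance value, and the guide labels are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class Box:
    left: float
    bottom: float
    right: float
    top: float

def potential_alignments(glyph_box: Box, obj_box: Box, tol: float = 0.5):
    """Identify alignment guides from bounding-box extremes alone,
    without converting glyphs into outlines."""
    guides = []
    if abs(glyph_box.left - obj_box.left) <= tol:
        guides.append("vertical-left")
    if abs(glyph_box.right - obj_box.right) <= tol:
        guides.append("vertical-right")
    if abs(glyph_box.top - obj_box.top) <= tol:
        guides.append("horizontal-top")
    # Bottom of the glyph box meeting the top of the object box.
    if abs(glyph_box.bottom - obj_box.top) <= tol:
        guides.append("horizontal-bottom-to-top")
    # Center alignment between the two boxes.
    glyph_cx = (glyph_box.left + glyph_box.right) / 2
    obj_cx = (obj_box.left + obj_box.right) / 2
    if abs(glyph_cx - obj_cx) <= tol:
        guides.append("center-vertical")
    return guides

print(potential_alignments(Box(0.0, 10.0, 5.0, 20.0), Box(0.0, 0.0, 8.0, 10.0)))
# ['vertical-left', 'horizontal-bottom-to-top']
```

A real system would collect these matches per object and hand them to a rendering step that draws the corresponding guide lines in the user interface.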
- a user input is received defining a transformation of the multiple glyph bounding box relative to any object of the other objects.
- This transformation can include a displacement, a scaling, a rotation, etc.
- the transformation system maps the transformation of the multiple glyph bounding box to the bounding boxes for the first glyph and the second glyph.
- the multiple glyphs are rendered in the user interface as editable text having the transformation based on the mapping.
- the described systems improve conventional technology for creating and/or editing digital content by rendering guides for guiding transformations of a glyph without converting the glyph into an outline. This enables the glyph to be precisely transformed as editable text which is not possible using conventional systems.
- the described systems also enable precise transformations of a single glyph or multiple glyphs any of which can be transformed as editable text. By enabling multiple glyphs to be transformed as editable text, the described systems and techniques improve computational efficiency of computing devices by preserving computational resources used to reposition glyphs one glyph at a time using conventional systems.
- Examples of functionality made possible by preserving editability of transformed glyphs as text which are not possible after converting a glyph into an outline include an ability to change fonts, font sizes, or any text characteristic used to render glyphs before and after transformation of the glyphs.
- glyphs can be rendered using a first font, transformed as editable text, and then glyphs may be rendered using a second font having the transformation as editable text.
- transformed glyphs can be modified to add or remove features such as boldface, italic, strikethrough, underline, etc.
- all features and functionality of TrueType and OpenType fonts remain available to editable text; these features are lost when glyphs are converted into outlines using conventional systems.
- the described systems improve operation of computing devices by significantly increasing functionality available to create and/or edit digital content including editable text.
- Example procedures are also described which may be performed in the example environment as well as other environments. Consequently, performance of the example procedures is not limited to the example environment and the example environment is not limited to performance of the example procedures.
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ digital systems and techniques as described herein.
- the illustrated environment 100 includes a computing device 102 connected to a network 104 .
- the computing device 102 may be configured as a desktop computer, a laptop computer, a mobile device (e.g., assuming a handheld configuration such as a tablet or mobile phone), and so forth.
- the computing device 102 may range from a full resource device with substantial memory and processor resources (e.g., personal computers, game consoles) to a low-resource device with limited memory and/or processing resources (e.g., mobile devices).
- the computing device 102 may be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations “over the cloud.”
- the illustrated environment 100 also includes a display device 106 that is communicatively coupled to the computing device 102 via a wired or a wireless connection.
- a variety of device configurations may be used to implement the computing device 102 and/or the display device 106 .
- the computing device 102 includes a storage device 108 and a transformation module 110 .
- the storage device 108 is illustrated to include digital content 112 .
- the transformation module 110 is illustrated as having, receiving, and/or transmitting glyph data 114 and input data 116 .
- the glyph data 114 includes information usable by an application for creating and/or editing the digital content 112 to render glyphs and other objects as part of creation and/or editing of the digital content 112 .
- the glyph data 114 includes both general properties of glyphs such as font file data as well as data that is specific to glyphs included in the digital content 112 such as outline data describing the glyphs included in the digital content 112 .
- the glyph data 114 also includes general properties of objects as well as data that is specific to objects included in the digital content 112 , e.g., outline data describing the objects included in the digital content 112 .
- the glyph data 114 is illustrated to include an example 118 of glyphs 120 and an object 122 .
- the glyph data 114 includes data that is general to the glyphs 120 such as font file data corresponding to a font used to render the glyphs 120 as well as data that is specific to the glyphs 120 as depicted in the example 118 such as outline data, font size data, horizontal scale data, vertical scale data, etc.
- the example 118 depicts 11 glyphs 120 and the object 122 is an arrow oriented at an angle relative to the glyphs 120 .
- the input data 116 includes information usable by the application for creating and/or editing the digital content 112 to render additions and modifications to glyphs and other objects as part of creation and/or editing of the digital content 112 .
- An input device such as a stylus, a touchscreen, an electronic pen, or a mouse that receives data, e.g., from a user manipulation, can transmit the received data to the computing device 102 as part of the input data 116 .
- the input data 116 includes a user input defining a transformation of the glyphs 120 relative to the object 122 depicted in the example 118 .
- the transformation module 110 processes the glyph data 114 and the input data 116 to generate the glyphs 120 depicted in the example 118 having the transformation defined by the user input included as part of the input data 116 . Once generated, the transformation module 110 can render transformed glyphs 124 in a user interface 126 of the display device 106 .
- the rendering of the transformed glyphs 124 depicts three glyphs 128 , 130 , 132 of the glyphs 120 depicted in the example 118 rotated relative to the object 122 .
- the transformation defined in the input data 116 is a rotation of the three glyphs 128 , 130 , 132 relative to the object 122 .
- the transformation module 110 generates the transformed glyphs 124 as editable text which can be edited, e.g., by an additional user input.
- the transformation module 110 generates the transformed glyphs 124 without converting the three glyphs 128 , 130 , 132 to outlines. In this manner, the computing device 102 can implement the transformation module 110 to precisely snap the three glyphs 128 , 130 , 132 to the object 122 as editable text.
- FIG. 2 depicts a system 200 in an example implementation showing operation of a transformation module 110 .
- FIG. 3 is an illustration of representations 300 of functionality described with respect to FIG. 2 .
- the transformation module 110 is illustrated to include a generation module 202 , a mapping module 204 , and a rendering module 206 .
- the computing device 102 implements the transformation module 110 to receive the glyph data 114 and the input data 116 .
- the generation module 202 receives the glyph data 114 and the input data 116 and processes the glyph data 114 and the input data 116 to generate bounding box data 208 .
- the generation module 202 generates the bounding box data 208 by accessing glyph identification data, font size data, horizontal scale data, vertical scale data, orientation data, and/or glyph outline data describing the glyphs 120 from the glyph data 114 .
- the generation module 202 then generates a zero angled reference bounding box for each of the glyphs 120 included in the example 118 to fully envelop an outline corresponding to each of the glyphs 120 . To do so, the generation module 202 generates the zero angled reference bounding boxes for the glyphs 120 as having heights and widths defined with respect to a reference origin, e.g., (0,0) in a font space.
- the generation module 202 generates a particular zero angled reference bounding box for a particular glyph of the glyphs 120 to precisely bound the outline of the particular glyph such that sides of the particular zero angled reference bounding box are tangent to extreme portions of the particular glyph's outline.
- the particular zero angled reference bounding box is usable to precisely transform the particular glyph.
- the particular zero angled reference bounding box may differ from an em-box or em-square associated with the particular glyph of the glyphs 120 because the em-box or em-square may not necessarily fully envelop the outline of the particular glyph of the glyphs 120 .
- the particular zero angled reference bounding box may differ from the em-box or em-square associated with the particular glyph because the em-box or em-square may fully envelop the outline of the particular glyph but with spacing between the sides of the em-box or em-square and the extreme portions of the particular glyph's outline.
- the generation module 202 generates the bounding box data 208 to include data describing the zero angled reference bounding box for each of the glyphs 120 included in the example 118 .
- the generation module 202 generates zero angled reference bounding boxes 302 , 304 , 306 for the glyphs 128 , 130 , 132 , respectively.
- sides of the zero angled reference bounding box 302 are tangent to extreme portions of the glyph 128 .
- the zero angled reference bounding box 302 also fully envelops an outline of the glyph 128 .
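A zero angled reference bounding box of this kind can be sketched by taking minima and maxima over a glyph's outline points. This is a simplified illustration: real glyph outlines are Bézier curves, so a production system would evaluate curve extrema rather than sampled points, and the sample coordinates below are hypothetical.

```python
def zero_angled_reference_box(outline_points):
    """Tight axis-aligned box whose sides are tangent to the extreme
    portions of a glyph outline, unlike an em-box, which may leave
    spacing between its sides and the outline."""
    xs = [x for x, _ in outline_points]
    ys = [y for _, y in outline_points]
    return (min(xs), min(ys), max(xs), max(ys))  # (left, bottom, right, top)

# Hypothetical sample points on the outline of a glyph such as "A":
outline = [(1.0, 0.0), (4.0, 0.0), (2.5, 7.0), (1.5, 3.0), (3.5, 3.0)]
print(zero_angled_reference_box(outline))  # (1.0, 0.0, 4.0, 7.0)
```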
- the generation module 202 also generates a theta angled reference bounding box for each of the glyphs 120 included in the example 118 , e.g., if any rotation transformation is applied to the glyphs 120 via the input data 116 . To do so, the generation module 202 processes the input data 116 to extract any rotation applied to each glyph of the glyphs 120 depicted in the example 118 . The generation module 202 generates the theta angled reference bounding boxes by applying rotations described by rotation data extracted from the input data 116 to the zero angled reference bounding boxes.
- a particular theta angled reference bounding box for the particular glyph is the same as the particular zero angled reference bounding box if the particular glyph is not rotated in the example 118 .
- the generation module 202 generates the bounding box data 208 to include data describing the theta angled reference bounding box for each of the glyphs 120 included in the example 118 .
- the generation module 202 generates theta angled reference bounding boxes 308 , 310 , 312 for the glyphs 128 , 130 , 132 , respectively.
- sides of the theta angled reference bounding box 308 are tangent to the extreme portions of the glyph 128 .
- the theta angled reference bounding box 308 also fully envelops the outline of the glyph 128 .
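Generating a theta angled reference bounding box from a zero angled one amounts to rotating the box's corners by the extracted rotation angle, as in this sketch; the corner ordering and the choice of the box center as the rotation origin are assumptions made for illustration.

```python
import math

def theta_angled_corners(box, theta_deg):
    """Rotate the corners of a zero angled reference box by theta about
    the box center; at theta = 0 the result equals the original corners."""
    left, bottom, right, top = box
    cx, cy = (left + right) / 2, (bottom + top) / 2
    t = math.radians(theta_deg)
    cos_t, sin_t = math.cos(t), math.sin(t)
    rotated = []
    for x, y in [(left, bottom), (right, bottom), (right, top), (left, top)]:
        dx, dy = x - cx, y - cy
        rotated.append((cx + dx * cos_t - dy * sin_t,
                        cy + dx * sin_t + dy * cos_t))
    return rotated

# At zero degrees the theta angled box coincides with the reference box:
print(theta_angled_corners((0.0, 0.0, 2.0, 2.0), 0.0))
# [(0.0, 0.0), (2.0, 0.0), (2.0, 2.0), (0.0, 2.0)]
```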
- the generation module 202 additionally generates a zero angled non-reference bounding box for each glyph of the glyphs 120 by processing the glyph data 114 to extract glyph outline information for the glyphs 120 as depicted in the example 118 .
- the generation module 202 then generates the zero angled non-reference bounding boxes using the extracted glyph outline information based on orientations of the outlines of the glyphs 120 as depicted in the example 118 .
- the generation module 202 generates the bounding box data 208 to include data describing the zero angled non-reference bounding box for each of the glyphs 120 as depicted in the example 118 .
- the generation module 202 generates a zero angled non-reference bounding box 314 for the glyph 128 .
- sides of the zero angled non-reference bounding box 314 are tangent to portions of the theta angled reference bounding box 308 .
- the zero angled non-reference bounding box 314 fully envelops the theta angled reference bounding box 308 for the glyph 128 .
- the generation module 202 generates a bounding box for multiple glyphs of the glyphs 120 by concatenating bounding boxes of individual glyphs of the multiple glyphs. To do so, the generation module 202 generates the bounding box for the multiple glyphs to fully envelop outlines corresponding to each of the multiple glyphs based on extremes of the individual bounding boxes of the glyphs of the multiple glyphs. For example, the generation module 202 generates a left side of the bounding box for the multiple glyphs as a left side of the bounding box of a leftmost individual glyph of the multiple glyphs.
- the generation module 202 generates a right side of the bounding box for the multiple glyphs as a right side of the bounding box of a rightmost individual glyph of the multiple glyphs.
- the generation module 202 generates the bounding box data 208 to include data describing the zero angled non-reference bounding box for the multiple glyphs.
- the generation module 202 generates a multiple glyph bounding box 316 for multiple glyphs by concatenating the bounding boxes 302 , 304 , 306 .
- the generation module 202 generates the multiple glyph bounding box 316 as having a left side based on a left side of bounding box 302 .
- the generation module 202 generates the multiple glyph bounding box 316 as having a right side based on a right side of bounding box 306 .
- the generation module 202 generates the multiple glyph bounding box 316 as having a bottom side based on a bottom side of bounding box 304 .
- the generation module 202 generates the multiple glyph bounding box 316 as having a top side based on a top side of bounding box 302 .
- the generation module 202 generates the multiple glyph bounding box 316 to fully envelop the bounding boxes 302 , 304 , 306 .
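The concatenation just described reduces to taking extremes across the per-glyph boxes. In this sketch the boxes are `(left, bottom, right, top)` tuples and the numeric values are hypothetical, chosen so that each side of the result comes from a different glyph box, as in the description of FIG. 9.

```python
def concatenate_boxes(boxes):
    """Multiple glyph bounding box: the left side of the leftmost box,
    the right side of the rightmost box, the lowest bottom, and the
    highest top, so the result fully envelops every per-glyph box."""
    lefts, bottoms, rights, tops = zip(*boxes)
    return (min(lefts), min(bottoms), max(rights), max(tops))

# Hypothetical (left, bottom, right, top) boxes standing in for the
# bounding boxes 302, 304, and 306:
b302 = (0.0, 1.0, 3.0, 8.0)   # supplies the left and top sides
b304 = (3.5, 0.0, 6.0, 7.0)   # supplies the bottom side
b306 = (6.5, 1.0, 9.0, 7.5)   # supplies the right side
print(concatenate_boxes([b302, b304, b306]))  # (0.0, 0.0, 9.0, 8.0)
```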
- the mapping module 204 receives the bounding box data 208 and the input data 116 and the mapping module 204 processes the bounding box data 208 and the input data 116 to generate transformation data 210 .
- the mapping module 204 includes a guide generating module 212 and a guide rendering module 214 which may be implemented to generate and render a guide, respectively, for guiding a transformation of the glyphs 120 .
- guide refers to an indication such as a visual indication that a glyph is oriented in a particular manner relative to an object or that an object is oriented in a particular manner relative to a glyph in the user interface 126 .
- a guide can indicate that a glyph is precisely oriented or aligned relative to an object.
- the mapping module 204 processes the input data 116 to map transformations, e.g., user inputs, to the glyphs 120 based on the bounding box data 208 .
- the mapping module 204 maps a horizontal displacement of a particular glyph of the glyphs 120 to kerning and the mapping module 204 maps a vertical displacement of the particular glyph into a baseline shift.
- the mapping module 204 maps a rotation of the particular glyph as a displacement angle from a center of the particular glyph.
- the mapping module 204 maps a horizontal scaling of the particular glyph as a scale percentage mapped onto a horizontal scale of the particular glyph.
- mapping module 204 maps a vertical scaling of the particular glyph as a scale percentage mapped onto a vertical scale of the particular glyph.
- the mapping module 204 maps a uniform scaling of the particular glyph as a scale percentage mapped onto the horizontal scale and the vertical scale of the particular glyph.
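The attribute mappings enumerated above can be sketched as a single dispatch function; the attribute names (`kerning`, `baseline_shift`, `rotation`, `horizontal_scale`, `vertical_scale`) and the percentage-based scale representation are assumptions for illustration, not the patent's actual data model.

```python
def map_transformation_to_text(kind, glyph_attrs, **params):
    """Map a bounding-box transformation onto editable-text attributes
    instead of converting the glyph into an outline."""
    attrs = dict(glyph_attrs)
    if kind == "displace":
        # Horizontal displacement maps to kerning, vertical to baseline shift.
        attrs["kerning"] = attrs.get("kerning", 0.0) + params.get("dx", 0.0)
        attrs["baseline_shift"] = attrs.get("baseline_shift", 0.0) + params.get("dy", 0.0)
    elif kind == "rotate":
        # Rotation maps to a displacement angle from the glyph center.
        attrs["rotation"] = attrs.get("rotation", 0.0) + params["angle"]
    elif kind == "scale":
        # Scaling maps to percentages on the horizontal/vertical scales;
        # passing equal sx and sy gives a uniform scaling.
        attrs["horizontal_scale"] = attrs.get("horizontal_scale", 100.0) * params.get("sx", 1.0)
        attrs["vertical_scale"] = attrs.get("vertical_scale", 100.0) * params.get("sy", 1.0)
    return attrs

glyph = {"kerning": 0.0, "baseline_shift": 0.0}
print(map_transformation_to_text("displace", glyph, dx=2.0, dy=-1.5))
# {'kerning': 2.0, 'baseline_shift': -1.5}
```

Because only text attributes change, the glyph stays editable after the transformation, which is the key property the described system preserves.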
- As the mapping module 204 maps the transformations described in the input data 116 to the glyphs 120 , it also updates the zero angled non-reference bounding boxes of the glyphs 120 based on the mapped transformations.
- the guide generating module 212 uses the updated zero angled non-reference bounding boxes to identify potential alignments of the glyphs 120 , e.g., with outlines of objects.
- the guide generating module 212 generates these potential alignments as guides and the guide rendering module 214 renders the guides in the user interface 126 .
- the transformation module 110 renders the guides to guide transformations in substantially real time as data describing the transformations is received as part of the input data 116 .
- the mapping module 204 generates the transformation data 210 using the zero angled reference bounding boxes, the zero angled non-reference bounding boxes, and/or the theta angled reference bounding boxes to transform the glyphs 120 as editable text.
- the transformation data 210 describes these editable text transformations and the rendering module 206 receives the transformation data 210 .
- the rendering module 206 processes the transformation data 210 to render the transformed glyphs 124 as editable text.
- FIG. 4 is a flow diagram depicting a procedure 400 in an example implementation in which a bounding box for multiple glyphs is generated by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs, a user input defining a transformation of the multiple glyphs is received, and the multiple glyphs are generated for display in a user interface as editable text having the transformation. Bounding boxes for a first glyph and a second glyph of multiple glyphs are generated (block 402 ). For example, the transformation module 110 generates the bounding boxes for the first and second glyphs. In another example, the transformation module 110 can generate bounding boxes for all glyphs of multiple glyphs selected for transformation.
- a multiple glyph bounding box for the multiple glyphs is generated (block 404 ) by concatenating the bounding boxes for the first and second glyphs.
- the computing device 102 implements the transformation module 110 to generate the multiple glyph bounding box by concatenating the bounding boxes for the first and second glyphs in one example.
- a user input defining a transformation of the multiple glyph bounding box relative to an object is received (block 406 ).
- the transformation module 110 receives the user input defining the transformation.
- a mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first and second glyphs is determined (block 408 ).
- the computing device 102 can implement the transformation module 110 to determine the mapping.
- the multiple glyphs are generated for display in a user interface (block 410 ) as editable text having the transformation based on the mapping.
- the transformation module 110 may generate the multiple glyphs for display in the user interface in an example.
- FIG. 5 is an illustration of a representation 500 of a displacement of a single glyph relative to an object using an alignment guide.
- the representation 500 includes an object 502 and a single glyph 504 .
- the object 502 includes six glyphs and the single glyph 504 is a registered trademark symbol.
- a bounding box 506 for the single glyph 504 is generated, e.g., the computing device 102 may implement the transformation module 110 to generate the bounding box 506 for the single glyph 504 .
- the transformation module 110 may generate the bounding box 506 as a zero angled non-reference bounding box for the single glyph 504 and the transformation module 110 may utilize a corresponding zero angled reference bounding box for the single glyph 504 to transform the single glyph 504 as editable text.
- the bounding box 506 and the single glyph 504 are vertically displaced relative to the object 502 based on the input data 116 .
- the computing device 102 may implement the transformation module 110 to map a user input defining a vertical displacement of the single glyph 504 relative to the object 502 into a baseline shift.
- the transformation module 110 can generate an updated bounding box 506 based on this mapping.
- a potential alignment between the single glyph 504 and the object 502 is identified based on the updated bounding box 506 , e.g., the potential alignment may be identified based on updated corner positions of the updated bounding box 506 .
- the transformation module 110 provides data describing the updated corner positions to a snapping module of the application for creating and/or editing the digital content 112 .
- the transformation module 110 renders the potential alignment as an alignment guide 508 usable to guide the vertical displacement of the single glyph 504 relative to the object 502 .
- the alignment guide 508 is usable to precisely align a top portion of the object 502 with a bottom portion of the single glyph 504 such that the transformation of the single glyph 504 appears visually pleasing.
- the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 504 relative to the object 502 using the alignment guide 508 .
- by using the corner positions of the bounding box 506 to generate the alignment guide 508 instead of converting the single glyph 504 into an outline to generate the alignment guide 508 , the single glyph 504 is transformed relative to the object 502 as editable text.
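A corner-based snap test of the kind described above might be sketched as follows; the tolerance value and function names are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of corner-based snap detection: compare updated
# bounding-box corner coordinates against an object's edge coordinates
# and report a potential alignment when any pair is within tolerance.
SNAP_TOLERANCE = 2.0  # illustrative value in document units

def find_vertical_alignment(corner_ys, object_edge_ys, tol=SNAP_TOLERANCE):
    """Return (corner_y, edge_y) for the first near-coincident pair,
    or None when nothing is close enough to suggest a guide."""
    for cy in corner_ys:
        for ey in object_edge_ys:
            if abs(cy - ey) <= tol:
                return cy, ey
    return None
```

When a pair is found, a snapping module could render it as a horizontal alignment guide and snap the glyph's baseline shift so the two coordinates coincide exactly.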
- FIG. 6 is an illustration of a representation 600 of a displacement of a single glyph of a text object within the text object using a center alignment guide.
- the representation 600 includes a single text object 602 which includes nine glyphs, and these nine glyphs are editable as text.
- a single glyph 604 of the nine glyphs of the text object 602 is also illustrated.
- the single glyph 604 is the letter “N” and the single glyph 604 is disposed between the other eight glyphs of the single text object 602 .
- the transformation module 110 generates a bounding box 606 for the single glyph 604 .
- the transformation module 110 may generate the bounding box 606 as a zero angled non-reference bounding box and the transformation module 110 can utilize a corresponding zero angled reference bounding box for the single glyph 604 to transform the single glyph 604 as editable text within the text object 602 .
- the single text object 602 remains editable as text both before and after a transformation of the single glyph 604 within the text object 602 .
- the bounding box 606 and the single glyph 604 are vertically displaced within the single text object 602 based on a user input defining this transformation included in the input data 116 .
- the transformation module 110 maps the user input into a baseline shift of the single glyph 604 .
- the transformation module 110 also generates an updated bounding box 606 based on this mapping.
- a potential alignment between the single glyph 604 and the other glyphs of the text object 602 is identified using corner positions of the updated bounding box 606 .
- the transformation module 110 can provide data describing the updated corner positions to the snapping module of the application for creating and/or editing the digital content 112 .
- the transformation module 110 renders the identified potential alignment as a center alignment guide 608 which can be used to precisely align a portion of the single glyph 604 with portions of the other glyphs of the text object 602 .
- the center alignment guide 608 is used to guide an alignment between a bottom portion of the single glyph 604 and central portions of the other glyphs of the single text object 602 .
- the transformation of the single glyph 604 within the text object 602 is precise and visually pleasing.
- the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 604 to the other glyphs of the text object 602 using the center alignment guide 608 .
- the single glyph 604 is transformed within the text object 602 without converting the single glyph 604 or the other glyphs of the text object 602 into outlines.
- the single glyph 604 and the other eight glyphs are editable as the single text object 602 and these glyphs are not converted into separate text objects.
- the single glyph 604 is transformed within the other glyphs of the text object 602 and all nine glyphs of the text object 602 remain as editable text.
- FIG. 7 is an illustration of a representation 700 of a displacement of a single glyph relative to an object using an equal spacing guide.
- the representation 700 includes an object 702 , a single glyph 704 , and an additional glyph 706 .
- the object 702 is a square with rounded corners, the single glyph 704 is the numeral “9,” and the additional glyph 706 is the numeral “1.”
- the computing device 102 implements the transformation module 110 to generate a bounding box 708 for the single glyph 704 .
- the transformation module 110 generates the bounding box 708 as a zero angled non-reference bounding box for the single glyph 704 .
- the transformation module 110 can utilize the zero angled non-reference bounding box along with a corresponding zero angled reference bounding box to generate transformations of the single glyph 704 as editable text.
- the bounding box 708 and the single glyph 704 are horizontally displaced relative to the object 702 and the additional glyph 706 based on data describing a user input defining this transformation included in the input data 116 .
- the transformation module 110 maps the transformation defined in the input data 116 to a kerning space between the single glyph 704 and the additional glyph 706 to perform the horizontal displacement.
- the transformation module 110 also generates an updated bounding box 708 as part of the horizontal displacement and provides data describing corner positions of the updated bounding box 708 to the snapping module of the application for creating and/or editing the digital content 112 .
- a potential alignment of the single glyph 704 relative to the object 702 is identified based on the corner positions of the updated bounding box 708 .
- the transformation module 110 can identify the potential alignment without converting the single glyph 704 into an outline. In this way, the single glyph 704 retains its editability as text which would be lost if the single glyph 704 was converted into an outline to identify the potential alignment.
- the transformation module 110 renders the potential alignment as an equal spacing guide 710 usable to align the single glyph 704 relative to the object 702 such that left and right sides of the bounding box 708 are spaced equally from left and right sides of the object 702 , respectively.
- the equal spacing guide 710 is rendered as two lines having an equal length and the equal spacing guide 710 is a visual indication of an equal spacing alignment between the single glyph 704 and the object 702 .
- the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 704 to the object 702 using the equal spacing guide 710 .
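The equal-spacing condition behind the guide 710 reduces to comparing the left and right gaps between the glyph's bounding box and the object. A minimal sketch, with an assumed tolerance:

```python
# Hypothetical equal-spacing check: the glyph is "equally spaced" when
# the gap to the object's left side matches the gap to its right side.
def is_equally_spaced(glyph_left, glyph_right,
                      obj_left, obj_right, tol=0.5):
    """True when the left and right gaps between the glyph's bounding
    box and the object's sides agree within a tolerance."""
    left_gap = glyph_left - obj_left
    right_gap = obj_right - glyph_right
    return abs(left_gap - right_gap) <= tol
```

Because the check uses only bounding-box coordinates mapped from kerning, no outline conversion is needed and the glyph stays editable.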
- FIG. 8 is an illustration of a representation 800 of a rotation of a single glyph relative to an object using an angle guide.
- the representation 800 includes an object 802 and a single glyph 804 .
- the object 802 is a lightning bolt and the single glyph 804 is the letter “H.”
- the transformation module 110 is implemented to generate a bounding box 806 for the single glyph 804 .
- the transformation module 110 generates the bounding box 806 as a theta angled reference bounding box for the single glyph 804 .
- the bounding box 806 and the single glyph 804 are rotated relative to the object 802 based on a user input defining this transformation included in the input data 116 .
- the transformation module 110 maps the transformation defined in the input data 116 as a displacement angle from a center of the single glyph 804 .
- the transformation module 110 also generates a zero angled non-reference bounding box 808 for the single glyph 804 and the transformation module 110 can utilize a corresponding zero angled reference bounding box to map the transformation of the single glyph 804 as editable text.
- the transformation module 110 generates an updated bounding box 806 as part of the rotation of the single glyph 804 .
- the transformation module 110 provides angular line data describing lines parallel to the updated bounding box 806 to an angular snapping module of the application for creating and/or editing the digital content 112 .
- the angular snapping module searches the object 802 for lines parallel to the lines described by the angular line data of the updated bounding box 806 .
- a potential alignment of the single glyph 804 relative to the object 802 is identified based on the angular line data describing lines parallel to the updated bounding box 806 .
- the transformation module 110 identifies the potential alignment without converting the single glyph 804 into an outline.
- the transformation module 110 identifies the potential alignment and the single glyph 804 is transformed as editable text.
- the transformation module 110 renders the potential alignment as an angle guide 810 which is usable to align the single glyph 804 relative to the object 802 .
- the transformation module 110 also renders an alignment indication 812 which visually indicates a portion of the object 802 identified as part of identifying the potential alignment of the single glyph 804 relative to the object 802 .
- the angle guide 810 provides a visual indication that the updated bounding box 806 is parallel to a line extending from a tip of the object 802 and the alignment indication 812 illustrates the line extending from the tip of the object 802 .
- the angular snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 804 to the object 802 using the angle guide 810 .
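The parallel-line search performed by the angular snapping module can be sketched as an angle comparison modulo 180 degrees; the tolerance and names below are illustrative assumptions:

```python
# Hypothetical angular snap test: bounding-box edge angles are compared
# against object edge angles; lines are parallel when their angles
# agree modulo 180 degrees.
def are_parallel(angle_a_deg, angle_b_deg, tol_deg=1.0):
    """Two lines are parallel when their angles agree modulo 180°."""
    diff = abs(angle_a_deg - angle_b_deg) % 180.0
    return min(diff, 180.0 - diff) <= tol_deg

def find_parallel_edge(bbox_edge_angles, object_edge_angles, tol_deg=1.0):
    """Return the first (bbox_angle, object_angle) pair that is
    parallel within tolerance, or None when no alignment exists."""
    for a in bbox_edge_angles:
        for b in object_edge_angles:
            if are_parallel(a, b, tol_deg):
                return a, b
    return None
```

A found pair could then be rendered as an angle guide, with the object edge highlighted as the alignment indication.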
- FIG. 9 is an illustration of a representation 900 of generation of a bounding box for multiple glyphs by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs.
- the representation 900 includes a first glyph 902 and a second glyph 904 .
- the first glyph 902 is the letter “O” and the second glyph 904 is the letter “F.”
- the computing device 102 implements the transformation module 110 to generate a bounding box 906 for the first glyph 902 and a bounding box 908 for the second glyph 904 .
- the transformation module 110 may generate bounding boxes 906 , 908 as zero angled non-reference bounding boxes.
- the transformation module 110 generates a multiple glyph bounding box 910 by concatenating the bounding box 906 of the first glyph 902 and the bounding box 908 of the second glyph 904 . As shown, the transformation module 110 generates the multiple glyph bounding box 910 as having a left side 912 equal to the left side 912 of bounding box 906 because the first glyph 902 is a leftmost glyph of the glyphs 902 , 904 . The transformation module 110 generates the multiple glyph bounding box 910 as having a right side 914 equal to the right side 914 of bounding box 908 because the second glyph 904 is a rightmost glyph of the glyphs 902 , 904 .
- the glyphs 902 , 904 are illustrated as having a same cap height and the multiple glyph bounding box 910 is generated as having a top side equal to top sides of bounding boxes 906 , 908 and as having a bottom side equal to bottom sides of bounding boxes 906 , 908 . It should be appreciated that in an example in which the glyphs 902 , 904 have different cap heights, the transformation module 110 generates the multiple glyph bounding box 910 to have a top side equal to a greater top side of the top sides of the bounding boxes 906 , 908 . Similarly, in this example, the transformation module 110 generates the multiple glyph bounding box 910 to have a bottom side equal to a lower bottom side of the bottom sides of the bounding boxes 906 , 908 .
- the multiple glyph bounding box 910 fully envelops outlines of the first glyph 902 and the second glyph 904 .
- the transformation module 110 can be implemented to generate guides for transforming the first glyph 902 and the second glyph 904 simultaneously without converting the glyphs 902 , 904 into outlines.
- the glyphs 902 , 904 can be transformed as editable text.
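The concatenation rule of FIG. 9, including the different-cap-height case, reduces to taking extrema over the per-glyph boxes. A hypothetical sketch:

```python
# Hypothetical concatenation of per-glyph bounding boxes, each given as
# (left, bottom, right, top), into one multiple-glyph bounding box.
def concat_bounding_boxes(boxes):
    """Return the smallest box that envelops every per-glyph box: the
    leftmost left, lowest bottom, rightmost right, and highest top."""
    lefts, bottoms, rights, tops = zip(*boxes)
    return (min(lefts), min(bottoms), max(rights), max(tops))
```

With equal cap heights the top sides coincide, and with unequal cap heights `max`/`min` select the greater top and lower bottom, matching the behavior described above.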
- FIG. 10 is an illustration of a representation 1000 of an example 1002 of a displacement of multiple glyphs relative to objects and an example 1004 of a horizontal scaling of the multiple glyphs relative to the objects.
- the examples 1002 , 1004 include a first object 1006 and a second object 1008 as well as a first glyph 1010 and a second glyph 1012 .
- the first object 1006 includes four glyphs and the second object 1008 includes seven glyphs.
- the first glyph 1010 is the letter “O,” the second glyph 1012 is the letter “F,” and both of the glyphs 1010 , 1012 are disposed between the first object 1006 and the second object 1008 .
- the transformation module 110 is implemented to generate a multiple glyph bounding box 1014 for the glyphs 1010 , 1012 .
- the transformation module 110 may generate a bounding box for the first glyph 1010 and a bounding box for the second glyph 1012 .
- the transformation module 110 then concatenates the bounding box for the first glyph 1010 and the bounding box for the second glyph 1012 as the multiple glyph bounding box 1014 .
- the transformation module 110 receives a user input defining a transformation of the glyphs 1010 , 1012 .
- the user input can be described by data included in the input data 116 .
- the transformation is a horizontal displacement and a vertical displacement of the glyphs 1010 , 1012 .
- the transformation module 110 maps the horizontal displacement to kerning and the transformation module 110 maps the vertical displacement as a baseline shift of the glyphs 1010 , 1012 . As shown, increasing the kerning between the glyph “E” of the first object 1006 and the first glyph 1010 automatically shifts the second object 1008 due to the increased horizontal displacement in the text reflow direction.
- the transformation module 110 receives the user input defining the horizontal displacement and the vertical displacement with respect to the multiple glyph bounding box 1014 .
- the transformation module 110 maps the horizontal displacement and the vertical displacement to the glyphs 1010 , 1012 to transform the glyphs 1010 , 1012 as editable text.
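A minimal sketch of this two-axis mapping, under the assumption that kerning and baseline shift are simple additive text attributes, might be:

```python
# Hypothetical mapping of a two-axis drag onto text attributes: the
# horizontal component becomes additional kerning before the glyph run
# and the vertical component becomes a baseline shift.
def map_drag_to_text_attributes(dx, dy, kerning, baseline_shift):
    """Return updated (kerning, baseline_shift); both attributes keep
    the glyphs editable as text."""
    return kerning + dx, baseline_shift + dy

def reflowed_x(following_object_x, dx):
    """Text reflow: increasing kerning by dx advances everything after
    the glyph run by the same amount in the reflow direction."""
    return following_object_x + dx
```

The second helper models the automatic shift of the second object 1008: because kerning changes the advance width of the line, subsequent text reflows without any separate transformation.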
- the transformation is a horizontal scaling of the glyphs 1010 , 1012 .
- the transformation module 110 maps the horizontal scaling as a scale percentage onto a horizontal scale of the glyphs 1010 , 1012 .
- the transformation module 110 receives the user input defining an increase in the horizontal scale with respect to the multiple glyph bounding box 1014 .
- the transformation module 110 maps the increase in the horizontal scale to the glyphs 1010 , 1012 to transform the glyphs 1010 , 1012 as editable text.
- the transformation module 110 shifts the second object 1008 in response to the transformation of the glyphs 1010 , 1012 .
- the transformation module 110 automatically displaces the second object 1008 via horizontal displacement due to automatic text reflow of the glyphs 1010 , 1012 .
- FIG. 11 is an illustration of a representation 1100 of an example 1102 of a vertical scaling of multiple glyphs relative to objects and an example 1104 of a uniform scaling of the multiple glyphs relative to the objects.
- the examples 1102 , 1104 include a first object 1106 and a second object 1108 as well as a first glyph 1110 and a second glyph 1112 .
- the first object 1106 includes four glyphs and the second object 1108 includes seven glyphs.
- the first glyph 1110 is the letter “O,” the second glyph 1112 is the letter “F,” and both of the glyphs 1110 , 1112 are disposed between the first object 1106 and the second object 1108 .
- the transformation module 110 is implemented to generate a multiple glyph bounding box 1114 for the glyphs 1110 , 1112 by generating a bounding box for the first glyph 1110 and a bounding box for the second glyph 1112 .
- the transformation module 110 then concatenates the bounding box for the first glyph 1110 and the bounding box for the second glyph 1112 as the multiple glyph bounding box 1114 .
- the transformation module 110 receives a user input defining a transformation of the glyphs 1110 , 1112 .
- the user input can be described by data included in the input data 116 and the transformation module 110 receives the user input with respect to the multiple glyph bounding box 1114 .
- the transformation is a vertical scaling of the glyphs 1110 , 1112 .
- the transformation module 110 maps the vertical scaling as a scale percentage onto a vertical scale of the glyphs 1110 , 1112 .
- the user input defines an increase in the vertical scale with respect to the multiple glyph bounding box 1114 .
- the transformation module 110 maps the increase in the vertical scale to the glyphs 1110 , 1112 to transform the glyphs 1110 , 1112 as editable text.
- the transformation is a uniform scaling of the glyphs 1110 , 1112 .
- the transformation module 110 maps the uniform scaling as a scale percentage onto a vertical scale and a horizontal scale of the glyphs 1110 , 1112 .
- the user input defines an increase in the uniform scale with respect to the multiple glyph bounding box 1114 .
- the transformation module 110 maps the increase in the uniform scale to the glyphs 1110 , 1112 to transform the glyphs 1110 , 1112 as editable text.
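Both scale mappings amount to converting a bounding-box resize along an axis into an updated scale percentage; for uniform scaling the same factor is applied to both axes. A hypothetical sketch:

```python
# Hypothetical mapping of a bounding-box resize onto a glyph-run scale
# percentage. For uniform scaling, apply the same factor to both the
# horizontal and vertical scale.
def map_resize_to_scale_percent(new_extent, old_extent, scale_percent):
    """Scale the current percentage by the ratio of the resized
    bounding-box extent to the original extent along that axis."""
    return scale_percent * (new_extent / old_extent)
```

Because the result is stored as a character-scale attribute rather than baked into outlines, the glyphs 1110 , 1112 remain editable text after the resize.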
- the transformation module 110 shifts the second object 1108 in response to the transformation of the glyphs 1110 , 1112 .
- FIG. 12 is an illustration of a representation 1200 of an example 1202 of character rotation as editable text and an example 1204 of word rotation as editable text.
- the examples 1202 , 1204 include glyphs 1206 , 1208 , 1210 and an object 1212 .
- the computing device 102 implements the transformation module 110 to generate a multiple glyph bounding box 1214 for the glyphs 1206 , 1208 , 1210 . To do so, the transformation module 110 generates a bounding box for each of the glyphs 1206 , 1208 , 1210 and the transformation module 110 concatenates these bounding boxes as the multiple glyph bounding box 1214 .
- the transformation module 110 generates a theta angled reference bounding box for each of the glyphs 1206 , 1208 , 1210 and the transformation module 110 concatenates the theta angled reference bounding boxes as a theta angled multiple glyph bounding box 1214 .
- the transformation module 110 receives a user input defining a transformation of the glyphs 1206 , 1208 , 1210 .
- the user input may be described by data included in the input data 116 .
- the transformation module 110 can receive the user input with respect to the theta angled multiple glyph bounding box 1214 .
- the user input defines the transformation of the glyphs 1206 , 1208 , 1210 relative to the object 1212 .
- the user input defines a character rotation of the glyphs 1206 , 1208 , 1210 relative to the object 1212 .
- in response to receiving the user input, the transformation module 110 generates an updated theta angled multiple glyph bounding box 1214 that is rotated based on the user input.
- the transformation module 110 determines an amount of rotation based on the updated theta angled multiple glyph bounding box 1214 and the transformation module 110 maps the determined amount of rotation to the glyphs 1206 , 1208 , 1210 . To do so, the transformation module 110 maps the rotation applied to the updated theta angled multiple glyph bounding box 1214 to the theta angled bounding boxes of the glyphs 1206 , 1208 , 1210 .
- the transformation module 110 can also map a vertical displacement of the glyphs 1206 , 1208 , 1210 by mapping a baseline shift to zero angled non-reference bounding boxes of the glyphs 1206 , 1208 , 1210 .
- the transformation module 110 can map a horizontal displacement of the glyphs 1206 , 1208 , 1210 to kerning by mapping the horizontal displacement to the zero angled non-reference bounding boxes of the glyphs 1206 , 1208 , 1210 .
- the transformation module 110 maps the rotation defined by the user input as a character rotation applied to the glyphs 1206 , 1208 , 1210 such that the glyphs 1206 , 1208 , 1210 are individually rotated relative to the object 1212 .
- the user input defines a word rotation of the glyphs 1206 , 1208 , 1210 relative to the object 1212 .
- the transformation module 110 determines a transformation for the glyphs 1206 , 1208 , 1210 such that the glyphs 1206 , 1208 , 1210 are rotated relative to the object 1212 as a group. For example, the transformation module 110 determines this transformation to collectively rotate the glyphs 1206 , 1208 , 1210 relative to the object 1212 .
- the transformation module 110 determines horizontal and/or vertical displacements for each of the glyphs 1206 , 1208 , 1210 based on the determined transformation.
- the transformation module 110 maps these horizontal and/or vertical displacements to kerning and/or baseline shifts to generate the glyphs 1206 , 1208 , 1210 having the word rotation relative to the object 1212 .
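The decomposition of a word rotation into per-glyph displacements can be sketched as a rotation of each glyph's center about a shared pivot; the resulting (dx, dy) pairs are what get mapped to kerning and baseline shifts. This is an illustrative reconstruction, not the patented implementation:

```python
import math

# Hypothetical word-rotation decomposition: rotate each glyph center
# about a shared pivot and return the per-glyph displacement needed to
# move it to its rotated position.
def word_rotation_offsets(glyph_centers, pivot, angle_deg):
    """Return a list of (dx, dy) displacements, one per glyph, that
    collectively rotate the word about the pivot."""
    theta = math.radians(angle_deg)
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    px, py = pivot
    offsets = []
    for x, y in glyph_centers:
        # Standard 2D rotation of (x, y) about (px, py).
        rx = px + (x - px) * cos_t - (y - py) * sin_t
        ry = py + (x - px) * sin_t + (y - py) * cos_t
        offsets.append((rx - x, ry - y))
    return offsets
```

Mapping each dx to kerning and each dy to a baseline shift then reproduces the group rotation while every glyph remains editable text.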
- FIGS. 13A and 13B are illustrations of representations of aspects of word rotation as editable text.
- FIG. 13A illustrates a representation 1300 in which a transformation is determined for a word rotation.
- FIG. 13B illustrates a representation 1302 in which horizontal and vertical displacements are determined based on the determined transformation.
- Representation 1304 illustrates a mapping for word rotation.
- the transformation module 110 generates a zero angled non-reference bounding box for each of the glyphs 1206 , 1208 , 1210 .
- the transformation module 110 then generates the multiple glyph bounding box 1214 by concatenating the zero angled non-reference bounding boxes for the glyphs 1206 , 1208 , 1210 .
- in response to receiving the user input defining the word rotation, the transformation module 110 generates theta angled reference bounding boxes 1306 , 1308 , 1310 having an amount of rotation defined as part of the user input.
- the transformation module 110 maps the theta angled reference bounding boxes 1306 , 1308 , 1310 to the multiple glyph bounding box 1214 and extracts an updated multiple glyph bounding box 1214 having the amount of rotation.
- the transformation module 110 determines the transformation for the glyphs 1206 , 1208 , 1210 such that the glyphs 1206 , 1208 , 1210 are rotated relative to the object 1212 as the group using the updated multiple glyph bounding box 1214 and the theta angled reference bounding boxes 1306 , 1308 , 1310 .
- Representation 1314 illustrates a vertical displacement and a horizontal displacement for glyph 1206 based on the transformation.
- the transformation module 110 maps the vertical displacement for glyph 1206 into a baseline shift and the transformation module 110 maps the horizontal displacement for glyph 1206 to kerning.
- Representation 1316 illustrates a vertical displacement and a horizontal displacement for glyph 1208 based on the transformation. As shown, the transformation module 110 maps the vertical displacement for glyph 1208 into a baseline shift and the transformation module 110 maps the horizontal displacement for glyph 1208 to kerning.
- Representation 1318 illustrates a vertical displacement and a horizontal displacement for glyph 1210 based on the transformation.
- the transformation module 110 maps the vertical displacement for glyph 1210 into a baseline shift and the transformation module 110 maps the horizontal displacement for glyph 1210 to kerning.
- Representation 1320 illustrates the glyphs 1206 , 1208 , 1210 having the word rotation transformation as editable text.
- FIG. 14 is an illustration of a representation 1400 of a displacement of multiple glyphs relative to an object using an alignment guide.
- the representation 1400 includes an object 1402 , a first glyph 1404 , and a second glyph 1406 .
- the object 1402 includes eight glyphs.
- the first glyph 1404 is the letter “T” and the second glyph 1406 is the letter “M” in this example.
- a multiple glyph bounding box 1408 for the glyphs 1404 , 1406 is generated, e.g., the computing device 102 may implement the transformation module 110 to generate the multiple glyph bounding box 1408 by concatenating bounding boxes of the glyphs 1404 , 1406 .
- the multiple glyph bounding box 1408 and the glyphs 1404 , 1406 are vertically displaced relative to the object 1402 based on the input data 116 .
- the input data 116 can include data describing a user input which defines a vertical displacement of the glyphs 1404 , 1406 relative to the object 1402 .
- the computing device 102 may implement the transformation module 110 to map the user input defining the vertical displacement into a baseline shift.
- the transformation module 110 can generate an updated multiple glyph bounding box 1408 based on this mapping.
- a potential alignment between the glyphs 1404 , 1406 and the object 1402 is identified based on the updated multiple glyph bounding box 1408 , e.g., the potential alignment may be identified based on updated corner positions of the updated multiple glyph bounding box 1408 .
- the transformation module 110 provides data describing the updated corner positions to a snapping module of the application for creating and/or editing the digital content 112 .
- the transformation module 110 renders the potential alignment as an alignment guide 1410 usable to guide the vertical displacement of the glyphs 1404 , 1406 relative to the object 1402 .
- the alignment guide 1410 is usable to precisely align a top portion of the object 1402 with a bottom portion of the glyphs 1404 , 1406 such that the transformation of the glyphs 1404 , 1406 appears visually pleasing.
- the snapping module of the application for creating and/or editing the digital content 112 can snap the glyphs 1404 , 1406 relative to the object 1402 using the alignment guide 1410 .
- FIG. 15 is an illustration of a representation 1500 of a rotation of multiple glyphs relative to an object using an angle guide.
- the representation 1500 includes glyphs 1502 , 1504 , 1506 and an object 1508 .
- the transformation module 110 is implemented to generate a multiple glyph bounding box 1510 for the glyphs 1502 , 1504 , 1506 .
- the transformation module 110 generates the multiple glyph bounding box 1510 by concatenating bounding boxes of the glyphs 1502 , 1504 , 1506 .
- the multiple glyph bounding box 1510 and the glyphs 1502 , 1504 , 1506 are rotated relative to the object 1508 based on a user input defining this transformation included in the input data 116 .
- the transformation module 110 maps the transformation defined in the input data 116 as a word rotation for the glyphs 1502 , 1504 , 1506 .
- the transformation module 110 generates an updated multiple glyph bounding box 1510 as part of the word rotation of the glyphs 1502 , 1504 , 1506 .
- the transformation module 110 provides angular line data describing lines parallel to the updated multiple glyph bounding box 1510 to an angular snapping module of the application for creating and/or editing the digital content 112 .
- the angular snapping module searches the object 1508 for lines parallel to the lines described by the angular line data of the updated multiple glyph bounding box 1510 .
- a potential alignment of the glyphs 1502 , 1504 , 1506 relative to the object 1508 is identified based on the angular line data describing lines parallel to the updated multiple glyph bounding box 1510 .
- the transformation module 110 identifies the potential alignment without converting the glyphs 1502 , 1504 , 1506 into outlines.
- the transformation module 110 identifies the potential alignment and the glyphs 1502 , 1504 , 1506 are transformed as editable text.
- the transformation module 110 renders the potential alignment as an angle guide 1512 which is usable to align the glyphs 1502 , 1504 , 1506 relative to the object 1508 .
- the transformation module 110 also renders an alignment indication 1514 which visually indicates a portion of the object 1508 identified as part of identifying the potential alignment of the glyphs 1502 , 1504 , 1506 relative to the object 1508 .
- the angle guide 1512 provides a visual indication that the updated multiple glyph bounding box 1510 is parallel to a line extending from a base of the object 1508 and the alignment indication 1514 illustrates the line extending from the base of the object 1508 .
- the angular snapping module of the application for creating and/or editing the digital content 112 can snap the glyphs 1502 , 1504 , 1506 to the object 1508 using the angle guide 1512 .
- FIG. 16 illustrates an example system 1600 that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. This is illustrated through inclusion of the transformation module 110 .
- the computing device 1602 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system.
- the example computing device 1602 as illustrated includes a processing system 1604 , one or more computer-readable media 1606 , and one or more I/O interfaces 1608 that are communicatively coupled, one to another.
- the computing device 1602 may further include a system bus or other data and command transfer system that couples the various components, one to another.
- a system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures.
- a variety of other examples are also contemplated, such as control and data lines.
- the processing system 1604 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1604 is illustrated as including hardware elements 1610 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors.
- the hardware elements 1610 are not limited by the materials from which they are formed or the processing mechanisms employed therein.
- processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)).
- processor-executable instructions may be electronically-executable instructions.
- the computer-readable media 1606 is illustrated as including memory/storage 1612 .
- the memory/storage 1612 represents memory/storage capacity associated with one or more computer-readable media.
- the memory/storage component 1612 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth).
- the memory/storage component 1612 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth).
- the computer-readable media 1606 may be configured in a variety of other ways as further described below.
- Input/output interface(s) 1608 are representative of functionality to allow a user to enter commands and information to computing device 1602 , and also allow information to be presented to the user and/or other components or devices using various input/output devices.
- input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth.
- Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, tactile-response device, and so forth.
- the computing device 1602 may be configured in a variety of ways as further described below to support user interaction.
- modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types.
- module generally represent software, firmware, hardware, or a combination thereof.
- the features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- Computer-readable media may include a variety of media that may be accessed by the computing device 1602 .
- computer-readable media may include “computer-readable storage media” and “computer-readable signal media.”
- Computer-readable storage media may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media.
- the computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data.
- Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- Computer-readable signal media may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the computing device 1602 , such as via a network.
- Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism.
- Signal media also include any information delivery media.
- modulated data signal means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal.
- communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media.
- hardware elements 1610 and computer-readable media 1606 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions.
- Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware.
- ASIC application-specific integrated circuit
- FPGA field-programmable gate array
- CPLD complex programmable logic device
- hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware, as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously.
- software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or more hardware elements 1610 .
- the computing device 1602 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1602 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1610 of the processing system 1604 .
- the instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1602 and/or processing systems 1604 ) to implement techniques, modules, and examples described herein.
- the techniques described herein may be supported by various configurations of the computing device 1602 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 1614 as described below.
- the cloud 1614 includes and/or is representative of a platform 1616 for resources 1618 .
- the platform 1616 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 1614 .
- the resources 1618 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 1602 .
- Resources 1618 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network.
- the platform 1616 may abstract resources 1618 and functions to connect the computing device 1602 with other computing devices.
- the platform may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources that are implemented via the platform. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 1600 . For example, the functionality may be implemented in part on the computing device 1602 as well as via the platform 1616 that abstracts the functionality of the cloud 1614 .
Abstract
In implementations of precise glyph transformations as editable text, a computing device implements a transformation system to generate bounding boxes for a first glyph and a second glyph of multiple glyphs. The bounding boxes are concatenated as a multiple glyph bounding box for the multiple glyphs. The transformation system receives a user input defining a transformation of the multiple glyph bounding box relative to an object, and the system maps the transformation of the multiple glyph bounding box to the bounding boxes for the first glyph and the second glyph. The multiple glyphs are rendered in a user interface as the editable text having the transformation based on the mapping.
Description
- Conventional systems for creating and/or editing digital content enable users of such systems to modify default positions of glyphs included as part of the digital content. These systems can provide this functionality based on inputs received from user interactions with an input device such as manipulations of the input device to select a glyph and “drag” the glyph within the digital content. Some conventional systems generate guides to facilitate precise positioning or transforming of glyphs relative to objects within the digital content. These systems utilize outlines of the glyphs which are compared to outlines of other objects to generate the guides. Once generated, the guides are rendered in a user interface to indicate that the outlines of the glyphs are oriented in a particular manner relative to the outlines of the objects.
- One shortcoming of conventional systems that compare glyph outlines with other outlines to generate the guides is that these systems need glyphs to be converted into outlines for performing the comparisons. The glyphs, once converted into outlines, are no longer editable as text. As a result, functionality available as editable text such as functionality of TrueType and OpenType fonts is lost as soon as the glyphs are converted into outlines.
- Although conventional systems facilitate repositioning of glyphs without the benefit of the guides, these systems only support repositioning of glyphs one glyph at a time. Thus, in order to reposition three glyphs relative to an object as part of editing digital content, each of the three glyphs must be individually repositioned relative to the object. This manual process often requires several attempts before a user is satisfied with a repositioning of the glyphs relative to the object because the guides are not available and because the glyphs can only be repositioned one at a time, which increases the likelihood of user error.
- Systems and techniques are described for glyph transformations as editable text. In an example, a computing device implements a transformation system to generate bounding boxes for a first glyph and a second glyph of multiple glyphs. The transformation system leverages the bounding boxes for the first and second glyphs to generate a multiple glyph bounding box for the multiple glyphs together by concatenating the bounding boxes of the first and second glyphs. In this way, the transformation system generates the multiple glyph bounding box to address both the first and second glyphs together, which is not possible in conventional techniques.
- In one example, a user input is received defining a transformation of the multiple glyph bounding box relative to an object. The transformation system determines a mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first glyph and the second glyph. For example, the transformation system determines the mapping without converting the multiple glyphs into outlines, thus improving computational efficiency of the computing device. The multiple glyphs are rendered in a user interface as editable text having the transformation based on the mapping without further user intervention, which is also not possible in conventional techniques.
- This Summary introduces a selection of concepts in a simplified form that are further described below in the Detailed Description. As such, this Summary is not intended to identify essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- The detailed description is described with reference to the accompanying figures. Entities represented in the figures may be indicative of one or more entities and thus reference may be made interchangeably to single or plural forms of the entities in the discussion.
- FIG. 1 is an illustration of an environment in an example implementation that is operable to employ digital systems and techniques for glyph transformations as editable text as described herein.
- FIG. 2 depicts a system in an example implementation showing operation of a transformation module for transforming glyphs as editable text.
- FIG. 3 is an illustration of representations of functionality described with respect to FIG. 2.
- FIG. 4 is a flow diagram depicting a procedure in an example implementation in which a bounding box for multiple glyphs is generated by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs, a user input defining a transformation of the multiple glyphs is received, and the multiple glyphs are generated for display in a user interface as editable text having the transformation.
- FIG. 5 is an illustration of a representation of a displacement of a single glyph relative to an object using an alignment guide.
- FIG. 6 is an illustration of a representation of a displacement of a single glyph of a text object within the text object using a center alignment guide.
- FIG. 7 is an illustration of a representation of a displacement of a single glyph relative to an object using an equal spacing guide.
- FIG. 8 is an illustration of a representation of a rotation of a single glyph relative to an object using an angle guide.
- FIG. 9 is an illustration of a representation of generation of a bounding box for multiple glyphs by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs.
- FIG. 10 is an illustration of a representation of an example of a displacement of multiple glyphs relative to objects and an example of a horizontal scaling of the multiple glyphs relative to the objects.
- FIG. 11 is an illustration of a representation of an example of a vertical scaling of multiple glyphs relative to objects and an example of a uniform scaling of the multiple glyphs relative to the objects.
- FIG. 12 is an illustration of a representation of an example of character rotation as editable text and an example of word rotation as editable text.
- FIGS. 13A and 13B are illustrations of representations of aspects of word rotation as editable text.
- FIG. 14 is an illustration of a representation of a displacement of multiple glyphs relative to an object using an alignment guide.
- FIG. 15 is an illustration of a representation of a rotation of multiple glyphs relative to an object using an angle guide.
- FIG. 16 illustrates an example system that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein.
- Overview
- Conventional systems for creating and editing digital content enable users of such systems to modify default positions of glyphs included as part of the digital content through interaction with an input device to select a glyph and “drag” the glyph within the digital content. However, these conventional systems are limited to transforming or repositioning glyphs one glyph at a time. As a result, transforming multiple glyphs is burdensome, and in practice it is difficult to precisely align the repositioned glyphs because each glyph must be aligned individually, which is both manually and computationally inefficient.
- Although some conventional systems provide guides for repositioning objects included in the digital content, the glyphs are converted into outlines in order to use these guides. Although conversion of glyphs into outlines enables use of the guides, the converted glyphs are no longer editable as text. As a result, the editability of the converted glyphs as text is lost along with a variety of functionality available in editable text.
- Systems and techniques are described for glyph transformations as editable text. In an example, a computing device implements a transformation system to generate bounding boxes for a first glyph and a second glyph of multiple glyphs. These bounding boxes may fully envelop outlines of the first and second glyphs and also precisely bound the outlines such that sides of the bounding boxes are tangent to extreme portions of the outlines.
- The transformation system concatenates the bounding boxes for the first glyph and the second glyph together as a multiple glyph bounding box for the multiple glyphs. By concatenating the bounding boxes for the first and second glyphs, for instance, the transformation system generates the multiple glyph bounding box as having a right side equal to a right side of the rightmost bounding box for the first and second glyphs. Similarly, the multiple glyph bounding box is generated as having a left side equal to a left side of the leftmost bounding box for the first and second glyphs.
- Extreme portions of the multiple glyph bounding box are used to identify potential alignments between the first and second glyphs and other objects. These potential alignments can be identified based on transformations of the first and second glyphs relative to the other objects and/or transformations of the other objects relative to the first and second glyphs. By using the multiple glyph bounding box to identify potential alignments, the transformation system can generate guides based on the potential alignments without converting the first and second glyphs into outlines.
- For example, positions of corners of the multiple glyph bounding box are used to identify potential horizontal, vertical, center, and equal spacing alignments between the first and second glyphs and the other objects. Angle line data, e.g., due to a rotation of the multiple glyph bounding box, can be used to identify potential angle alignments between the first and second glyphs and the other objects. The transformation system renders indications as guides in a user interface based on the identified potential alignments.
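The corner-based identification of potential alignments can be sketched as a comparison of box extremes within a snapping tolerance. This is a minimal illustration, assuming axis-aligned boxes given as (left, bottom, right, top) tuples; the function name, guide labels, and tolerance value are assumptions chosen for illustration, not identifiers from the patent:

```python
def potential_alignments(glyph_box, object_box, tol=0.5):
    """Boxes are (left, bottom, right, top) tuples; returns guide labels."""
    guides = []
    gl, gb, gr, gt = glyph_box
    ol, ob, or_, ot = object_box
    if abs(gl - ol) <= tol or abs(gr - or_) <= tol:
        guides.append("vertical")    # left or right edges line up
    if abs(gb - ob) <= tol or abs(gt - ot) <= tol:
        guides.append("horizontal")  # bottom or top edges line up
    if (abs((gl + gr) / 2 - (ol + or_) / 2) <= tol
            or abs((gb + gt) / 2 - (ob + ot) / 2) <= tol):
        guides.append("center")      # box centers line up on either axis
    return guides

# A glyph box whose left edge and horizontal center nearly match the object's.
print(potential_alignments((0, 0, 10, 10), (0.2, 12, 9.8, 20)))
# ['vertical', 'center']
```

Because only the corners of the multiple glyph bounding box are compared, a check of this kind needs no glyph outlines at all, which is what allows the guides to be generated without converting glyphs into outlines.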
- A user input is received defining a transformation of the multiple glyph bounding box relative to any object of the other objects. This transformation can include a displacement, a scaling, a rotation, etc. The transformation system maps the transformation of the multiple glyph bounding box to the bounding boxes for the first glyph and the second glyph. The multiple glyphs are rendered in the user interface as editable text having the transformation based on the mapping.
- The described systems improve conventional technology for creating and/or editing digital content by rendering guides for guiding transformations of a glyph without converting the glyph into an outline. This enables the glyph to be precisely transformed as editable text, which is not possible using conventional systems. The described systems also enable precise transformations of a single glyph or multiple glyphs, any of which can be transformed as editable text. By enabling multiple glyphs to be transformed as editable text, the described systems and techniques improve computational efficiency of computing devices by preserving computational resources that conventional systems expend repositioning glyphs one glyph at a time.
- Preserving editability of transformed glyphs as text enables functionality that is not possible after converting a glyph into an outline, including an ability to change fonts, font sizes, or any text characteristic used to render glyphs before and after transformation of the glyphs. For example, using the described systems, glyphs can be rendered using a first font, transformed as editable text, and then rendered using a second font while retaining the transformation as editable text. As editable text, transformed glyphs can be modified to add or remove features such as boldface, italic, strikethrough, underline, etc. Additionally, all features and functionality of TrueType and OpenType fonts remain available to editable text, whereas these features are lost by converting glyphs into outlines using conventional systems. Thus, the described systems improve operation of computing devices by significantly increasing functionality available to create and/or edit digital content including editable text.
- In the following discussion, an example environment is first described that may employ the techniques described herein. Example procedures are also described which may be performed in the example environment as well as other environments. Consequently, performance of the example procedures is not limited to the example environment and the example environment is not limited to performance of the example procedures.
- Example Environment
- FIG. 1 is an illustration of an environment 100 in an example implementation that is operable to employ digital systems and techniques as described herein. The illustrated environment 100 includes a computing device 102 connected to a network 104. The computing device 102 may be configured as a desktop computer, a laptop computer, a mobile device (e.g., assuming a handheld configuration such as a tablet or mobile phone), and so forth. Thus, the computing device 102 may range from a full resource device with substantial memory and processor resources (e.g., personal computers, game consoles) to a low-resource device with limited memory and/or processing resources (e.g., mobile devices). Additionally, the computing device 102 may be representative of a plurality of different devices, such as multiple servers utilized by a business to perform operations “over the cloud.”
- The illustrated environment 100 also includes a display device 106 that is communicatively coupled to the computing device 102 via a wired or a wireless connection. A variety of device configurations may be used to implement the computing device 102 and/or the display device 106. The computing device 102 includes a storage device 108 and a transformation module 110. The storage device 108 is illustrated to include digital content 112.
- The transformation module 110 is illustrated as having, receiving, and/or transmitting glyph data 114 and input data 116. The glyph data 114 includes information usable by an application for creating and/or editing the digital content 112 to render glyphs and other objects as part of creation and/or editing of the digital content 112. For example, the glyph data 114 includes both general properties of glyphs such as font file data as well as data that is specific to glyphs included in the digital content 112 such as outline data describing the glyphs included in the digital content 112. The glyph data 114 also includes general properties of objects as well as data that is specific to objects included in the digital content 112, e.g., outline data describing the objects included in the digital content 112.
- The glyph data 114 is illustrated to include an example 118 of glyphs 120 and an object 122. Thus, the glyph data 114 includes data that is general to the glyphs 120 such as font file data corresponding to a font used to render the glyphs 120 as well as data that is specific to the glyphs 120 as depicted in the example 118 such as outline data, font size data, horizontal scale data, vertical scale data, etc. As shown, the example 118 depicts 11 glyphs 120 and the object 122 is an arrow oriented at an angle relative to the glyphs 120.
- The input data 116 includes information usable by the application for creating and/or editing the digital content 112 to render additions and modifications to glyphs and other objects as part of creation and/or editing of the digital content 112. An input device such as a stylus, a touchscreen, an electronic pen, or a mouse that receives data, e.g., from a user manipulation, can transmit the received data to the computing device 102 as part of the input data 116. As illustrated, the input data 116 includes a user input defining a transformation of the glyphs 120 relative to the object 122 depicted in the example 118. The transformation module 110 processes the glyph data 114 and the input data 116 to generate the glyphs 120 depicted in the example 118 having the transformation defined by the user input included as part of the input data 116. Once generated, the transformation module 110 can render transformed glyphs 124 in a user interface 126 of the display device 106.
- In the illustrated example, the rendering of the transformed glyphs 124 depicts three of the glyphs 120 from the example 118 rotated relative to the object 122. In this example, the transformation defined in the input data 116 is a rotation of the three glyphs relative to the object 122. The transformation module 110 generates the transformed glyphs 124 as editable text which can be edited, e.g., by an additional user input. In one example, the transformation module 110 generates the transformed glyphs 124 without converting the three glyphs into outlines, and the computing device 102 can implement the transformation module 110 to precisely snap the three glyphs relative to the object 122 as editable text.
- FIG. 2 depicts a system 200 in an example implementation showing operation of a transformation module 110. FIG. 3 is an illustration of representations 300 of functionality described with respect to FIG. 2. The transformation module 110 is illustrated to include a generation module 202, a mapping module 204, and a rendering module 206. The computing device 102 implements the transformation module 110 to receive the glyph data 114 and the input data 116. In the illustrated example, the generation module 202 receives the glyph data 114 and the input data 116 and processes the glyph data 114 and the input data 116 to generate bounding box data 208.
- The generation module 202 generates the bounding box data 208 by accessing glyph identification data, font size data, horizontal scale data, vertical scale data, orientation data, and/or glyph outline data describing the glyphs 120 from the glyph data 114. The generation module 202 then generates a zero angled reference bounding box for each of the glyphs 120 included in the example 118 to fully envelop an outline corresponding to each of the glyphs 120. To do so, the generation module 202 generates the zero angled reference bounding boxes for the glyphs 120 as having heights and widths defined with respect to a reference origin, e.g., (0,0) in a font space. For example, the generation module 202 generates a particular zero angled reference bounding box for a particular glyph of the glyphs 120 to precisely bound the outline of the particular glyph such that sides of the particular zero angled reference bounding box are tangent to extreme portions of the particular glyph's outline. By precisely bounding the outline of the particular glyph, the particular zero angled bounding box is usable to precisely transform the particular glyph.
- In one example, the particular zero angled reference bounding box may differ from an em-box or em-square associated with the particular glyph of the glyphs 120 because the em-box or em-square may not necessarily fully envelop the outline of the particular glyph. In another example, the particular zero angled reference bounding box may differ from the em-box or em-square associated with the particular glyph because the em-box or em-square may fully envelop the outline of the particular glyph but with spacing between the sides of the em-box or em-square and the extreme portions of the particular glyph's outline. The generation module 202 generates the bounding box data 208 to include data describing the zero angled reference bounding box for each of the glyphs 120 included in the example 118.
- As shown in FIG. 3, the generation module 202 generates zero angled reference bounding boxes for the glyphs, and sides of the zero angled reference bounding box 302 are tangent to extreme portions of the glyph 128. The zero angled reference bounding box 302 also fully envelops an outline of the glyph 128.
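The tangent, fully enveloping reference box described above amounts to an extreme-point computation. The following is a minimal sketch, assuming glyph outlines are available as lists of (x, y) points in font units; the function name is an assumption for illustration, not an identifier from the patent:

```python
def tight_bbox(outline_points):
    """Smallest axis-aligned (left, bottom, right, top) box whose sides
    are tangent to the extreme points of a glyph outline."""
    xs = [x for x, _ in outline_points]
    ys = [y for _, y in outline_points]
    return (min(xs), min(ys), max(xs), max(ys))

# Example: a triangular outline measured from the (0, 0) font-space origin.
print(tight_bbox([(10, 0), (110, 0), (60, 700)]))  # (10, 0, 110, 700)
```

For curved (Bezier) outlines the true extremes can fall between on-curve points, so a production implementation would evaluate curve extrema rather than sampled points alone.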
- The generation module 202 also generates a theta angled reference bounding box for each of the glyphs 120 included in the example 118, e.g., if any rotation transformation is applied to the glyphs 120 via the input data 116. To do so, the generation module 202 processes the input data 116 to extract any rotation applied to each glyph of the glyphs 120 depicted in the example 118. The generation module 202 generates the theta angled reference bounding boxes by applying rotations described by rotation data extracted from the input data 116 to the zero angled reference bounding boxes. Thus, a particular theta angled reference bounding box for the particular glyph is the same as the particular zero angled reference bounding box if the particular glyph is not rotated in the example 118. For example, if the particular glyph was rotated by 45 degrees in the example 118, then the particular theta angled reference bounding box would be rotated by 45 degrees relative to the particular zero angled reference bounding box. The generation module 202 generates the bounding box data 208 to include data describing the theta angled reference bounding box for each of the glyphs 120 included in the example 118.
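Applying a rotation to a zero angled box to obtain the theta angled box can be sketched by rotating its four corners. This is a hedged illustration that assumes rotation about the box center (the function name and the choice of rotation origin are assumptions for illustration):

```python
import math

def theta_angled_corners(zero_box, theta_degrees):
    """Rotate the corners of a zero angled reference box, given as
    (left, bottom, right, top), about the box center; the rotated corners
    describe the theta angled reference bounding box."""
    left, bottom, right, top = zero_box
    cx, cy = (left + right) / 2, (bottom + top) / 2
    t = math.radians(theta_degrees)
    cos_t, sin_t = math.cos(t), math.sin(t)
    corners = [(left, bottom), (right, bottom), (right, top), (left, top)]
    return [(cx + (x - cx) * cos_t - (y - cy) * sin_t,
             cy + (x - cx) * sin_t + (y - cy) * cos_t) for x, y in corners]

# A zero rotation leaves the corners unchanged, matching the text above.
print(theta_angled_corners((0, 0, 2, 4), 0))
```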
- As illustrated in FIG. 3, the generation module 202 generates theta angled reference bounding boxes for the glyphs, and sides of the theta angled reference bounding box 308 are tangent to the extreme portions of the glyph 128. The theta angled reference bounding box 308 also fully envelops the outline of the glyph 128.
- The generation module 202 additionally generates a zero angled non-reference bounding box for each glyph of the glyphs 120 by processing the glyph data 114 to extract glyph outline information for the glyphs 120 as depicted in the example 118. The generation module 202 then generates the zero angled non-reference bounding boxes using the extracted glyph outline information based on orientations of the outlines of the glyphs 120 as depicted in the example 118. For example, the generation module 202 generates the bounding box data 208 to include data describing the zero angled non-reference bounding box for each of the glyphs 120 as depicted in the example 118.
- As shown in FIG. 3, the generation module 202 generates a zero angled non-reference bounding box 314 for the glyph 128. In an example, sides of the zero angled non-reference bounding box 314 are tangent to portions of the theta angled reference bounding box 308. As illustrated, the zero angled non-reference bounding box 314 fully envelops the theta angled reference bounding box 308 for the glyph 128.
- In one example, the generation module 202 generates a bounding box for multiple glyphs of the glyphs 120 by concatenating bounding boxes of individual glyphs of the multiple glyphs. To do so, the generation module 202 generates the bounding box for the multiple glyphs to fully envelop outlines corresponding to each of the multiple glyphs based on extremes of the individual bounding boxes of the glyphs of the multiple glyphs. For example, the generation module 202 generates a left side of the bounding box for the multiple glyphs as a left side of the bounding box of a leftmost individual glyph of the multiple glyphs. Similarly, the generation module 202 generates a right side of the bounding box for the multiple glyphs as a right side of the bounding box of a rightmost individual glyph of the multiple glyphs. The generation module 202 generates the bounding box data 208 to include data describing the zero angled non-reference bounding box for the multiple glyphs.
- As illustrated in FIG. 3, the generation module 202 generates a multiple glyph bounding box 316 for multiple glyphs by concatenating the bounding boxes. For example, the generation module 202 generates the multiple glyph bounding box 316 as having a left side based on a left side of bounding box 302. In this example, the generation module 202 generates the multiple glyph bounding box 316 as having a right side based on a right side of bounding box 306. As shown, the generation module 202 generates the multiple glyph bounding box 316 as having a bottom side based on a bottom side of bounding box 304. In a similar way, the generation module 202 generates the multiple glyph bounding box 316 as having a top side based on a top side of bounding box 302. Thus, the generation module 202 generates the bounding box 316 for the multiple glyphs to fully envelop the bounding boxes.
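The concatenation step above can be sketched as taking the extreme side among the per-glyph boxes for each of the four sides. This is a minimal sketch using (left, bottom, right, top) tuples; the function name is an assumption for illustration:

```python
def concatenate_boxes(glyph_boxes):
    """Concatenate per-glyph (left, bottom, right, top) boxes into a single
    multiple glyph bounding box: each side is taken from the extreme side
    among the individual boxes, so the result envelops all of them."""
    lefts, bottoms, rights, tops = zip(*glyph_boxes)
    return (min(lefts), min(bottoms), max(rights), max(tops))

# Three side-by-side glyph boxes; the result spans all of them.
print(concatenate_boxes([(0, 0, 10, 20), (12, -2, 22, 18), (24, 0, 34, 20)]))
# (0, -2, 34, 20)
```

Because the concatenated box depends only on the individual boxes, it can be recomputed cheaply as glyphs are transformed, without touching glyph outlines.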
- The mapping module 204 receives the bounding box data 208 and the input data 116, and the mapping module 204 processes the bounding box data 208 and the input data 116 to generate transformation data 210. In one example, the mapping module 204 includes a guide generating module 212 and a guide rendering module 214 which may be implemented to generate and render a guide, respectively, for guiding a transformation of the glyphs 120. As used herein, the term “guide” refers to an indication such as a visual indication that a glyph is oriented in a particular manner relative to an object or that an object is oriented in a particular manner relative to a glyph in the user interface 126. For example, a guide can indicate that a glyph is precisely oriented or aligned relative to an object.
- The mapping module 204 processes the input data 116 to map transformations, e.g., user inputs, to the glyphs 120 based on the bounding box data 208. For example, the mapping module 204 maps a horizontal displacement of a particular glyph of the glyphs 120 to kerning, and the mapping module 204 maps a vertical displacement of the particular glyph into a baseline shift. In another example, the mapping module 204 maps a rotation of the particular glyph as a displacement angle from a center of the particular glyph. The mapping module 204 maps a horizontal scaling of the particular glyph as a scale percentage mapped onto a horizontal scale of the particular glyph. Similarly, the mapping module 204 maps a vertical scaling of the particular glyph as a scale percentage mapped onto a vertical scale of the particular glyph. The mapping module 204 maps a uniform scaling of the particular glyph as a scale percentage mapped onto the horizontal scale and the vertical scale of the particular glyph.
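The mappings above can be sketched as attribute updates on a per-glyph record, which is what keeps the glyph editable as text. This is a hedged illustration: the attribute names and the dictionary representation are assumptions chosen for clarity, not the patent's data structures:

```python
def map_transformation(attrs, dx=0.0, dy=0.0, rotation=0.0,
                       scale_x=1.0, scale_y=1.0):
    """Fold a bounding box transformation into per-glyph text attributes:
    horizontal displacement -> kerning, vertical displacement -> baseline
    shift, rotation -> displacement angle, scaling -> scale percentages."""
    attrs = dict(attrs)  # copy so the original attributes are unchanged
    attrs["kerning"] += dx
    attrs["baseline_shift"] += dy
    attrs["rotation"] = (attrs["rotation"] + rotation) % 360
    attrs["horizontal_scale"] *= scale_x
    attrs["vertical_scale"] *= scale_y
    return attrs

glyph = {"kerning": 0.0, "baseline_shift": 0.0, "rotation": 0.0,
         "horizontal_scale": 100.0, "vertical_scale": 100.0}
# Drag 5 units right and 3 units up, then scale uniformly by 150%.
print(map_transformation(glyph, dx=5, dy=3, scale_x=1.5, scale_y=1.5))
```

A uniform scaling is simply the case where the same factor is applied to both the horizontal and vertical scales, as the paragraph above notes.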
mapping module 204 is mapping transformations described in theinput data 116 to theglyphs 120, themapping module 204 also updates the zero angled non-reference bounding boxes of theglyphs 120 based on the mapped transformations. Theguide generating module 212 uses the updated zero angled non-reference bounding boxes to identify potential alignments of theglyphs 120, e.g., with outlines of objects. Theguide generating module 212 generates these potential alignments as guides and theguide rendering module 214 renders the guides in theuser interface 126. In this way, thetransformation module 110 renders the guides to guide transformations in substantially real time as data describing the transformations is received as part of theinput data 116. - The
mapping module 204 generates the transformation data 210 using the zero angled reference bounding boxes, the zero angled non-reference bounding boxes, and/or the theta angled reference bounding boxes to transform the glyphs 120 as editable text. The transformation data 210 describes these editable text transformations and the rendering module 206 receives the transformation data 210. The rendering module 206 processes the transformation data 210 to render the transformed glyphs 124 as editable text. - In general, functionality, features, and concepts described in relation to the examples above and below may be employed in the context of the example procedures described in this section. Further, functionality, features, and concepts described in relation to different figures and examples in this document may be interchanged among one another and are not limited to implementation in the context of a particular figure or procedure. Moreover, blocks associated with different representative procedures and corresponding figures herein may be applied together and/or combined in different ways. Thus, individual functionality, features, and concepts described in relation to different example environments, devices, components, figures, and procedures herein may be used in any suitable combinations and are not limited to the particular combinations represented by the enumerated examples in this description.
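The displacement-to-attribute mappings described above can be sketched as follows. This is a minimal illustration, not the patent's actual implementation: the attribute names, units, and function signature are assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class GlyphAttributes:
    """Editable-text attributes of a single glyph (names are illustrative)."""
    kerning: float = 0.0         # horizontal offset, in text-space units
    baseline_shift: float = 0.0  # vertical offset
    rotation: float = 0.0        # displacement angle from the glyph center, degrees
    h_scale: float = 1.0         # horizontal scale factor (1.0 = 100%)
    v_scale: float = 1.0         # vertical scale factor

def map_transformation(attrs, dx=0.0, dy=0.0, angle=0.0, sx=1.0, sy=1.0):
    """Map a free-form manipulation of a glyph's bounding box onto text
    attributes, so the glyph remains editable text rather than an outline."""
    attrs.kerning += dx           # horizontal displacement -> kerning
    attrs.baseline_shift += dy    # vertical displacement -> baseline shift
    attrs.rotation += angle      # rotation -> displacement angle from center
    attrs.h_scale *= sx           # horizontal scaling -> horizontal scale percentage
    attrs.v_scale *= sy           # vertical scaling -> vertical scale percentage
    return attrs
```

A uniform scaling, in this sketch, simply passes the same factor as both `sx` and `sy`.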
- Example Procedures
- The following discussion describes techniques that may be implemented utilizing the previously described systems and devices. Aspects of each of the procedures may be implemented in hardware, firmware, software, or a combination thereof. The procedures are shown as a set of blocks that specify operations performed by one or more devices and are not necessarily limited to the orders shown for performing the operations by the respective blocks. In portions of the following discussion, reference will be made to
FIG. 1 and FIG. 2. -
FIG. 4 is a flow diagram depicting a procedure 400 in an example implementation in which a bounding box for multiple glyphs is generated by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs, a user input defining a transformation of the multiple glyphs is received, and the multiple glyphs are generated for display in a user interface as editable text having the transformation. Bounding boxes for a first glyph and a second glyph of multiple glyphs are generated (block 402). For example, the transformation module 110 generates the bounding boxes for the first and second glyphs. In another example, the transformation module 110 can generate bounding boxes for all glyphs of multiple glyphs selected for transformation. A multiple glyph bounding box for the multiple glyphs is generated (block 404) by concatenating the bounding boxes for the first and second glyphs. The computing device 102 implements the transformation module 110 to generate the multiple glyph bounding box by concatenating the bounding boxes for the first and second glyphs in one example. - A user input defining a transformation of the multiple glyph bounding box relative to an object is received (block 406). In one example, the
transformation module 110 receives the user input defining the transformation. A mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first and second glyphs is determined (block 408). The computing device 102 can implement the transformation module 110 to determine the mapping. The multiple glyphs are generated for display in a user interface (block 410) as editable text having the transformation based on the mapping. The transformation module 110 may generate the multiple glyphs for display in the user interface in an example. -
FIG. 5 is an illustration of a representation 500 of a displacement of a single glyph relative to an object using an alignment guide. The representation 500 includes an object 502 and a single glyph 504. In the illustrated example, the object 502 includes six glyphs and the single glyph 504 is a registered trademark symbol. A bounding box 506 for the single glyph 504 is generated, e.g., the computing device 102 may implement the transformation module 110 to generate the bounding box 506 for the single glyph 504. The transformation module 110 may generate the bounding box 506 as a zero angled non-reference bounding box for the single glyph 504 and the transformation module 110 may utilize a corresponding zero angled reference bounding box for the single glyph 504 to transform the single glyph 504 as editable text. - The
bounding box 506 and the single glyph 504 are vertically displaced relative to the object 502 based on the input data 116. For example, the computing device 102 may implement the transformation module 110 to map a user input defining a vertical displacement of the single glyph 504 relative to the object 502 into a baseline shift. The transformation module 110 can generate an updated bounding box 506 based on this mapping. - A potential alignment between the
single glyph 504 and the object 502 is identified based on the updated bounding box 506, e.g., the potential alignment may be identified based on updated corner positions of the updated bounding box 506. For example, the transformation module 110 provides data describing the updated corner positions to a snapping module of the application for creating and/or editing the digital content 112. The transformation module 110 renders the potential alignment as an alignment guide 508 usable to guide the vertical displacement of the single glyph 504 relative to the object 502. - In the illustrated example, the
alignment guide 508 is usable to precisely align a top portion of the object 502 with a bottom portion of the single glyph 504 such that the transformation of the single glyph 504 appears visually pleasing. In one example, in response to receipt of a user input, the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 504 relative to the object 502 using the alignment guide 508. By using the corner positions of the bounding box 506 to generate the alignment guide 508 instead of converting the single glyph 504 into an outline to generate the alignment guide 508, the single glyph 504 is transformed relative to the object 502 as editable text. -
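The corner-based guide identification described above can be sketched as follows, assuming boxes represented as (left, bottom, right, top) tuples and an assumed snapping tolerance; the function name and tolerance are illustrative, not the patent's implementation.

```python
def horizontal_alignment_guide(glyph_box, object_box, tol=0.5):
    """Compare the horizontal edges (bottom/top y values) of the glyph's
    updated bounding box against the object's edges; if a pair coincides
    within tolerance, return the y coordinate at which to render the guide.
    Boxes are (left, bottom, right, top); tol is an assumed tolerance."""
    for gy in (glyph_box[1], glyph_box[3]):        # glyph bottom, glyph top
        for oy in (object_box[1], object_box[3]):  # object bottom, object top
            if abs(gy - oy) <= tol:
                return oy
    return None  # no alignment guide to render
```

Because only the four corner positions are compared, the glyph never needs to be converted into an outline to detect the alignment.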
FIG. 6 is an illustration of a representation 600 of a displacement of a single glyph of a text object within the text object using a center alignment guide. The representation 600 includes a single text object 602 which includes nine glyphs, and these nine glyphs are editable as text. A single glyph 604 of the nine glyphs of the text object 602 is also illustrated. The single glyph 604 is the letter "N" and the single glyph 604 is disposed between the other eight glyphs of the single text object 602. - The
transformation module 110 generates a bounding box 606 for the single glyph 604. For example, the transformation module 110 may generate the bounding box 606 as a zero angled non-reference bounding box and the transformation module 110 can utilize a corresponding zero angled reference bounding box for the single glyph 604 to transform the single glyph 604 as editable text within the text object 602. By generating the bounding box 606 in this manner, the single text object 602 remains editable as text both before and after a transformation of the single glyph 604 within the text object 602. - The
bounding box 606 and the single glyph 604 are vertically displaced within the single text object 602 based on a user input defining this transformation included in the input data 116. The transformation module 110 maps the user input into a baseline shift of the single glyph 604. The transformation module 110 also generates an updated bounding box 606 based on this mapping. - A potential alignment between the
single glyph 604 and the other glyphs of the text object 602 is identified using corner positions of the updated bounding box 606. For example, the transformation module 110 can provide data describing the updated corner positions to the snapping module of the application for creating and/or editing the digital content 112. As shown, the transformation module 110 renders the identified potential alignment as a center alignment guide 608 which can be used to precisely align a portion of the single glyph 604 with portions of the other glyphs of the text object 602. - In this example, the
center alignment guide 608 is used to guide an alignment between a bottom portion of the single glyph 604 and central portions of the other glyphs of the single text object 602. As a result, the transformation of the single glyph 604 within the text object 602 is precise and visually pleasing. In one example, in response to receiving a user input defining a snapping operation, the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 604 to the other glyphs of the text object 602 using the center alignment guide 608. - Through use of corner positions of the
bounding box 606 to generate the center alignment guide 608, the single glyph 604 is transformed within the text object 602 without converting the single glyph 604 or the other glyphs of the text object 602 into outlines. Thus, the single glyph 604 and the other eight glyphs are editable as the single text object 602 and these glyphs are not converted into separate text objects. In this way, the single glyph 604 is transformed within the other glyphs of the text object 602 and all nine glyphs of the text object 602 remain as editable text. -
FIG. 7 is an illustration of a representation 700 of a displacement of a single glyph relative to an object using an equal spacing guide. The representation 700 includes an object 702, a single glyph 704, and an additional glyph 706. As illustrated, the object 702 is a square with rounded corners, the single glyph 704 is the numeral "9," and the additional glyph 706 is the numeral "1." The computing device 102 implements the transformation module 110 to generate a bounding box 708 for the single glyph 704. For example, the transformation module 110 generates the bounding box 708 as a zero angled non-reference bounding box for the single glyph 704. In this example, the transformation module 110 can utilize the zero angled non-reference bounding box along with a corresponding zero angled reference bounding box to generate transformations of the single glyph 704 as editable text. - The
bounding box 708 and the single glyph 704 are horizontally displaced relative to the object 702 and the additional glyph 706 based on data describing a user input defining this transformation included in the input data 116. The transformation module 110 maps the transformation defined in the input data 116 to a kerning space between the single glyph 704 and the additional glyph 706 to perform the horizontal displacement. The transformation module 110 also generates an updated bounding box 708 as part of the horizontal displacement and provides data describing corner positions of the updated bounding box 708 to the snapping module of the application for creating and/or editing the digital content 112. - A potential alignment of the
single glyph 704 relative to the object 702 is identified based on the corner positions of the updated bounding box 708. By leveraging the corner positions of the updated bounding box 708, the transformation module 110 can identify the potential alignment without converting the single glyph 704 into an outline. In this way, the single glyph 704 retains its editability as text, which would be lost if the single glyph 704 was converted into an outline to identify the potential alignment. - The
transformation module 110 renders the potential alignment as an equal spacing guide 710 usable to align the single glyph 704 relative to the object 702 such that left and right sides of the bounding box 708 are spaced equally from left and right sides of the object 702, respectively. The equal spacing guide 710 is rendered as two lines having an equal length and the equal spacing guide 710 is a visual indication of an equal spacing alignment between the single glyph 704 and the object 702. For example, in response to receiving a user input defining a snapping operation, the snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 704 to the object 702 using the equal spacing guide 710. -
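The equal spacing check described above can be sketched as follows, again using (left, bottom, right, top) box tuples and an assumed tolerance; this is an illustrative reconstruction, not the patent's implementation.

```python
def equal_spacing_guide(glyph_box, object_box, tol=0.5):
    """Return the shared gap width when the glyph's bounding box is spaced
    equally from the left and right sides of the enclosing object, else None.
    Boxes are (left, bottom, right, top); tol is an assumed tolerance."""
    left_gap = glyph_box[0] - object_box[0]    # object left edge to glyph left edge
    right_gap = object_box[2] - glyph_box[2]   # glyph right edge to object right edge
    if abs(left_gap - right_gap) <= tol:
        return (left_gap + right_gap) / 2.0    # render as two equal-length guide lines
    return None
```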
FIG. 8 is an illustration of a representation 800 of a rotation of a single glyph relative to an object using an angle guide. The representation 800 includes an object 802 and a single glyph 804. In this example, the object 802 is a lightning bolt and the single glyph 804 is the letter "H." The transformation module 110 is implemented to generate a bounding box 806 for the single glyph 804. For example, the transformation module 110 generates the bounding box 806 as a theta angled reference bounding box for the single glyph 804. The bounding box 806 and the single glyph 804 are rotated relative to the object 802 based on a user input defining this transformation included in the input data 116. The transformation module 110 maps the transformation defined in the input data 116 as a displacement angle from a center of the single glyph 804. The transformation module 110 also generates a zero angled non-reference bounding box 808 for the single glyph 804 and the transformation module 110 can utilize a corresponding zero angled reference bounding box to map the transformation of the single glyph 804 as editable text. - The
transformation module 110 generates an updated bounding box 806 as part of the rotation of the single glyph 804. In one example, the transformation module 110 provides angular line data describing lines parallel to the updated bounding box 806 to an angular snapping module of the application for creating and/or editing the digital content 112. The angular snapping module searches the object 802 for lines parallel to the lines described by the angular line data of the updated bounding box 806. - A potential alignment of the
single glyph 804 relative to the object 802 is identified based on the angular line data describing lines parallel to the updated bounding box 806. By leveraging the angular line data based on the updated bounding box 806, the transformation module 110 identifies the potential alignment without converting the single glyph 804 into an outline. Thus, the transformation module 110 identifies the potential alignment and the single glyph 804 is transformed as editable text. - The
transformation module 110 renders the potential alignment as an angle guide 810 which is usable to align the single glyph 804 relative to the object 802. The transformation module 110 also renders an alignment indication 812 which visually indicates a portion of the object 802 identified as part of identifying the potential alignment of the single glyph 804 relative to the object 802. As shown, the angle guide 810 provides a visual indication that the updated bounding box 806 is parallel to a line extending from a tip of the object 802 and the alignment indication 812 illustrates the line extending from the tip of the object 802. In an example, in response to receiving a user input defining a snapping operation, the angular snapping module of the application for creating and/or editing the digital content 112 can snap the single glyph 804 to the object 802 using the angle guide 810. -
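The angular search described above can be sketched as follows: compare the rotated bounding box's edge angle against the angles of the object's outline segments, modulo 180 degrees so opposite directions still count as parallel. The function name and tolerance are illustrative assumptions.

```python
import math

def find_parallel_segment(box_angle_deg, object_segments, tol_deg=1.0):
    """Search an object's outline segments for one parallel to an edge of
    the rotated (theta angled) bounding box. Each segment is a pair of
    (x, y) endpoints; tol_deg is an assumed angular snapping tolerance."""
    target = box_angle_deg % 180.0
    for (x0, y0), (x1, y1) in object_segments:
        seg_angle = math.degrees(math.atan2(y1 - y0, x1 - x0)) % 180.0
        diff = abs(seg_angle - target)
        if min(diff, 180.0 - diff) <= tol_deg:
            return ((x0, y0), (x1, y1))  # segment to highlight via the angle guide
    return None
```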
FIG. 9 is an illustration of a representation 900 of generation of a bounding box for multiple glyphs by concatenating bounding boxes of a first glyph and a second glyph of the multiple glyphs. The representation 900 includes a first glyph 902 and a second glyph 904. In the illustrated example, the first glyph 902 is the letter "O" and the second glyph 904 is the letter "F." The computing device 102 implements the transformation module 110 to generate a bounding box 906 for the first glyph 902 and a bounding box 908 for the second glyph 904. For example, the transformation module 110 may generate the bounding boxes 906 and 908 as zero angled non-reference bounding boxes for the glyphs 902 and 904. - The
transformation module 110 generates a multiple glyph bounding box 910 by concatenating the bounding box 906 of the first glyph 902 and the bounding box 908 of the second glyph 904. As shown, the transformation module 110 generates the multiple glyph bounding box 910 as having a left side 912 equal to the left side 912 of bounding box 906 because the first glyph 902 is a leftmost glyph of the glyphs 902 and 904. Similarly, the transformation module 110 generates the multiple glyph bounding box 910 as having a right side 914 equal to the right side 914 of bounding box 908 because the second glyph 904 is a rightmost glyph of the glyphs 902 and 904. - The
glyphs 902 and 904 are also equally sized in this example and the multiple glyph bounding box 910 is generated as having a top side equal to top sides of bounding boxes 906 and 908. Similarly, bottom sides of the bounding boxes 906 and 908 are also equal in this example. In an example in which the glyphs 902 and 904 are not equally sized, the transformation module 110 generates the multiple glyph bounding box 910 to have a top side equal to a greater top side of the top sides of the bounding boxes 906 and 908. In this example, the transformation module 110 generates the multiple glyph bounding box 910 to have a bottom side equal to a lower bottom side of the bottom sides of the bounding boxes 906 and 908. - As shown, the multiple
glyph bounding box 910 fully envelops outlines of the first glyph 902 and the second glyph 904. By generating the multiple glyph bounding box 910 in this manner, the transformation module 110 can be implemented to generate guides for transforming the first glyph 902 and the second glyph 904 simultaneously without converting the glyphs 902 and 904 into outlines. In this way, the glyphs 902 and 904 are transformed as editable text. -
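The concatenation rules described for FIG. 9 amount to taking the union of the per-glyph boxes, and can be sketched as follows (boxes as (left, bottom, right, top) tuples, an assumed convention):

```python
def concatenate_bounding_boxes(boxes):
    """Concatenate per-glyph bounding boxes into a multiple glyph bounding
    box: the leftmost left side, the rightmost right side, and, when glyph
    sizes differ, the greater top side and the lower bottom side."""
    return (
        min(b[0] for b in boxes),  # left side of the leftmost glyph
        min(b[1] for b in boxes),  # lowest bottom side
        max(b[2] for b in boxes),  # right side of the rightmost glyph
        max(b[3] for b in boxes),  # greatest top side
    )
```

The resulting box fully envelops the outlines of all concatenated glyphs, which is what allows guides to be generated for the group as a whole.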
FIG. 10 is an illustration of a representation 1000 of an example 1002 of a displacement of multiple glyphs relative to objects and an example 1004 of a horizontal scaling of the multiple glyphs relative to the objects. As illustrated, the examples 1002, 1004 include a first object 1006 and a second object 1008 as well as a first glyph 1010 and a second glyph 1012. The first object 1006 includes four glyphs and the second object 1008 includes seven glyphs. The first glyph 1010 is the letter "O" and the second glyph 1012 is the letter "F" and both of the glyphs 1010 and 1012 are disposed between the first object 1006 and the second object 1008. - The
transformation module 110 is implemented to generate a multiple glyph bounding box 1014 for the glyphs 1010 and 1012. For example, the transformation module 110 may generate a bounding box for the first glyph 1010 and a bounding box for the second glyph 1012. The transformation module 110 then concatenates the bounding box for the first glyph 1010 and the bounding box for the second glyph 1012 as the multiple glyph bounding box 1014. The transformation module 110 receives a user input defining a transformation of the glyphs 1010 and 1012 included in the input data 116. - In example 1002, the transformation is a horizontal displacement and a vertical displacement of the
glyphs 1010 and 1012. The transformation module 110 maps the horizontal displacement to kerning and the transformation module 110 maps the vertical displacement as a baseline shift of the glyphs 1010 and 1012. As shown, because the transformation increases a horizontal displacement between the first object 1006 and the first glyph 1010, this automatically shifts the second object 1008 due to increased horizontal displacement in the text reflow direction. For example, the transformation module 110 receives the user input defining the horizontal displacement and the vertical displacement with respect to the multiple glyph bounding box 1014. The transformation module 110 maps the horizontal displacement and the vertical displacement to the glyphs 1010 and 1012 and generates the glyphs 1010 and 1012 as editable text having the transformation. - In example 1004, the transformation is a horizontal scaling of the
glyphs 1010 and 1012. The transformation module 110 maps the horizontal scaling as a scale percentage onto a horizontal scale of the glyphs 1010 and 1012. For example, the transformation module 110 receives the user input defining an increase in the horizontal scale with respect to the multiple glyph bounding box 1014. The transformation module 110 maps the increase in the horizontal scale to the glyphs 1010 and 1012 and generates the glyphs 1010 and 1012 as editable text having the horizontal scaling. As in example 1002, the transformation module 110 shifts the second object 1008 in response to the transformation of the glyphs 1010 and 1012. For example, the transformation module 110 automatically displaces the second object 1008 via horizontal displacement due to automatic text reflow of the glyphs 1010 and 1012. -
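The reflow behavior in examples 1002 and 1004 (a kerning change on one glyph shifting everything after it) can be sketched in one dimension as follows; advance widths and kerning values are illustrative units, not the patent's data model.

```python
def layout_x_positions(advance_widths, kernings):
    """One-dimensional text reflow: each glyph's x position accumulates the
    preceding advance widths plus kerning values, so increasing one glyph's
    kerning shifts that glyph and every glyph (or trailing object) after it."""
    positions, x = [], 0.0
    for advance, kerning in zip(advance_widths, kernings):
        x += kerning               # kerning displaces this glyph and everything after
        positions.append(x)
        x += advance
    return positions
```

For example, with three glyphs of advance width 10, adding 5 units of kerning to the second glyph moves the second and third glyphs but leaves the first in place.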
FIG. 11 is an illustration of a representation 1100 of an example 1102 of a vertical scaling of multiple glyphs relative to objects and an example 1104 of a uniform scaling of the multiple glyphs relative to the objects. The examples 1102, 1104 include a first object 1106 and a second object 1108 as well as a first glyph 1110 and a second glyph 1112. The first object 1106 includes four glyphs and the second object 1108 includes seven glyphs. The first glyph 1110 is the letter "O" and the second glyph 1112 is the letter "F" and both of the glyphs 1110 and 1112 are disposed between the first object 1106 and the second object 1108. - The
transformation module 110 is implemented to generate a multiple glyph bounding box 1114 for the glyphs 1110 and 1112, e.g., by generating a bounding box for the first glyph 1110 and a bounding box for the second glyph 1112. The transformation module 110 then concatenates the bounding box for the first glyph 1110 and the bounding box for the second glyph 1112 as the multiple glyph bounding box 1114. The transformation module 110 receives a user input defining a transformation of the glyphs 1110 and 1112 included in the input data 116 and the transformation module 110 receives the user input with respect to the multiple glyph bounding box 1114. - In example 1102, the transformation is a vertical scaling of the
glyphs 1110 and 1112. The transformation module 110 maps the vertical scaling as a scale percentage onto a vertical scale of the glyphs 1110 and 1112 based on the user input received with respect to the multiple glyph bounding box 1114. Thus, the transformation module 110 maps the increase in the vertical scale to the glyphs 1110 and 1112 and generates the glyphs 1110 and 1112 as editable text having the vertical scaling. - In example 1104, the transformation is a uniform scaling of the
glyphs 1110 and 1112. The transformation module 110 maps the uniform scaling as a scale percentage onto a vertical scale and a horizontal scale of the glyphs 1110 and 1112 based on the user input received with respect to the multiple glyph bounding box 1114. The transformation module 110 maps the increase in the uniform scale to the glyphs 1110 and 1112 and generates the glyphs 1110 and 1112 as editable text having the uniform scaling. As in example 1102, the transformation module 110 shifts the second object 1108 in response to the transformation of the glyphs 1110 and 1112. -
FIG. 12 is an illustration of a representation 1200 of an example 1202 of character rotation as editable text and an example 1204 of word rotation as editable text. As illustrated, the examples 1202, 1204 include glyphs 1206, 1208, and 1210 and an object 1212. The computing device 102 implements the transformation module 110 to generate a multiple glyph bounding box 1214 for the glyphs 1206, 1208, and 1210. In one example, the transformation module 110 generates a bounding box for each of the glyphs 1206, 1208, and 1210 and the transformation module 110 concatenates these bounding boxes as the multiple glyph bounding box 1214. For example, the transformation module 110 generates a theta angled reference bounding box for each of the glyphs 1206, 1208, and 1210 and the transformation module 110 concatenates the theta angled reference bounding boxes as a theta angled multiple glyph bounding box 1214. - The
transformation module 110 receives a user input defining a transformation of the glyphs 1206, 1208, and 1210 included in the input data 116. The transformation module 110 can receive the user input with respect to the theta angled multiple glyph bounding box 1214. In examples 1202, 1204 the user input defines the transformation of the glyphs 1206, 1208, and 1210 relative to the object 1212. - In example 1202, the user input defines a character rotation of the
glyphs 1206, 1208, and 1210 relative to the object 1212. In response to receiving the user input, the transformation module 110 generates an updated theta angled multiple glyph bounding box 1214 that is rotated based on the user input. The transformation module 110 determines an amount of rotation based on the updated theta angled multiple glyph bounding box 1214 and the transformation module 110 maps the determined amount of rotation to the glyphs 1206, 1208, and 1210. For example, the transformation module 110 maps the rotation applied to the updated theta angled multiple glyph bounding box 1214 to the theta angled bounding boxes of the glyphs 1206, 1208, and 1210. - As illustrated in example 1202, the
transformation module 110 can also map a vertical displacement of the glyphs 1206, 1208, and 1210 as a baseline shift of the glyphs 1206, 1208, and 1210. Similarly, the transformation module 110 can map a horizontal displacement of the glyphs 1206, 1208, and 1210 to kerning of the glyphs 1206, 1208, and 1210. In this manner, the transformation module 110 maps the rotation defined by the user input as a character rotation applied to the glyphs 1206, 1208, and 1210 relative to the object 1212. - In example 1204, the user input defines a word rotation of the
glyphs 1206, 1208, and 1210 relative to the object 1212. In response to receiving the user input, the transformation module 110 determines a transformation for the glyphs 1206, 1208, and 1210 to rotate the glyphs 1206, 1208, and 1210 relative to the object 1212 as a group. For example, the transformation module 110 determines this transformation to collectively rotate the glyphs 1206, 1208, and 1210 relative to the object 1212. The transformation module 110 then determines horizontal and/or vertical displacements for each of the glyphs 1206, 1208, and 1210 and the transformation module 110 maps these horizontal and/or vertical displacements to kerning and/or baseline shifts to generate the glyphs 1206, 1208, and 1210 as editable text having the word rotation relative to the object 1212. -
FIGS. 13A and 13B are illustrations of representations of aspects of word rotation as editable text. FIG. 13A illustrates a representation 1300 in which a transformation is determined for a word rotation. FIG. 13B illustrates a representation 1302 in which horizontal and vertical displacements are determined based on the determined transformation. Representation 1304 illustrates a mapping for word rotation. As shown, the transformation module 110 generates a zero angled non-reference bounding box for each of the glyphs 1206, 1208, and 1210. The transformation module 110 then generates the multiple glyph bounding box 1214 by concatenating the zero angled non-reference bounding boxes for the glyphs 1206, 1208, and 1210. - In response to receiving the user input defining the word rotation, the
transformation module 110 generates theta angled reference bounding boxes for the glyphs 1206, 1208, and 1210. The transformation module 110 maps the theta angled reference bounding boxes to the multiple glyph bounding box 1214 and extracts an updated multiple glyph bounding box 1214 having the amount of rotation. As shown in representation 1312, the transformation module 110 determines the transformation for the glyphs 1206, 1208, and 1210 to rotate the glyphs 1206, 1208, and 1210 relative to the object 1212 as the group using the updated multiple glyph bounding box 1214 and the theta angled reference bounding boxes. -
Representation 1314 illustrates a vertical displacement and a horizontal displacement for glyph 1206 based on the transformation. The transformation module 110 maps the horizontal displacement for glyph 1206 to kerning and the transformation module 110 maps the vertical displacement for glyph 1206 into a baseline shift. Representation 1316 illustrates a vertical displacement and a horizontal displacement for glyph 1208 based on the transformation. As shown, the transformation module 110 maps the horizontal displacement for glyph 1208 to kerning and the transformation module 110 maps the vertical displacement for glyph 1208 into a baseline shift. Representation 1318 illustrates a vertical displacement and a horizontal displacement for glyph 1210 based on the transformation. The transformation module 110 maps the horizontal displacement for glyph 1210 to kerning and the transformation module 110 maps the vertical displacement for glyph 1210 into a baseline shift. Representation 1320 illustrates the glyphs 1206, 1208, and 1210 generated as editable text having the word rotation. -
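The word-rotation decomposition described above can be sketched as follows: rotate each glyph's center about the group center, then express the move as per-glyph (dx, dy) displacements, where dx maps to kerning and dy to a baseline shift. The representation of glyphs as center points is an assumption made for the example.

```python
import math

def word_rotation_offsets(glyph_centers, angle_deg):
    """Word rotation as editable text: rotate each glyph center about the
    centroid of the group and return per-glyph (dx, dy) displacements to be
    mapped onto kerning (dx) and baseline shift (dy)."""
    cx = sum(x for x, _ in glyph_centers) / len(glyph_centers)
    cy = sum(y for _, y in glyph_centers) / len(glyph_centers)
    t = math.radians(angle_deg)
    offsets = []
    for x, y in glyph_centers:
        # Rotate (x, y) about the group center (cx, cy) by angle t.
        rx = cx + (x - cx) * math.cos(t) - (y - cy) * math.sin(t)
        ry = cy + (x - cx) * math.sin(t) + (y - cy) * math.cos(t)
        offsets.append((rx - x, ry - y))
    return offsets
```

Each glyph may additionally receive a character rotation so its own orientation follows the group, while the glyphs all remain part of one editable text object.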
FIG. 14 is an illustration of a representation 1400 of a displacement of multiple glyphs relative to an object using an alignment guide. The representation 1400 includes an object 1402, a first glyph 1404, and a second glyph 1406. As shown, the object 1402 includes eight glyphs. The first glyph 1404 is the letter "T" and the second glyph 1406 is the letter "M" in this example. A multiple glyph bounding box 1408 for the glyphs 1404 and 1406 is generated, e.g., the computing device 102 may implement the transformation module 110 to generate the multiple glyph bounding box 1408 by concatenating bounding boxes of the glyphs 1404 and 1406. - The multiple
glyph bounding box 1408 and the glyphs 1404 and 1406 are vertically displaced relative to the object 1402 based on the input data 116. For example, the input data 116 can include data describing a user input which defines a vertical displacement of the glyphs 1404 and 1406 relative to the object 1402. The computing device 102 may implement the transformation module 110 to map the user input defining the vertical displacement into a baseline shift. The transformation module 110 can generate an updated multiple glyph bounding box 1408 based on this mapping. - A potential alignment between the
glyphs 1404 and 1406 and the object 1402 is identified based on the updated multiple glyph bounding box 1408, e.g., the potential alignment may be identified based on updated corner positions of the updated multiple glyph bounding box 1408. For example, the transformation module 110 provides data describing the updated corner positions to a snapping module of the application for creating and/or editing the digital content 112. The transformation module 110 renders the potential alignment as an alignment guide 1410 usable to guide the vertical displacement of the glyphs 1404 and 1406 relative to the object 1402. In the illustrated example, the alignment guide 1410 is usable to precisely align a top portion of the object 1402 with a bottom portion of the glyphs 1404 and 1406. In one example, in response to receiving a user input defining a snapping operation, the snapping module of the application for creating and/or editing the digital content 112 can snap the glyphs 1404 and 1406 to the object 1402 using the alignment guide 1410. -
FIG. 15 is an illustration of a representation 1500 of a rotation of multiple glyphs relative to an object using an angle guide. The representation 1500 includes multiple glyphs and an object 1508. The transformation module 110 is implemented to generate a multiple glyph bounding box 1510 for the glyphs, e.g., the transformation module 110 generates the multiple glyph bounding box 1510 by concatenating bounding boxes of the glyphs. The multiple glyph bounding box 1510 and the glyphs are rotated relative to the object 1508 based on a user input defining this transformation included in the input data 116. The transformation module 110 maps the transformation defined in the input data 116 as a word rotation for the glyphs. - The
transformation module 110 generates an updated multiple glyph bounding box 1510 as part of the word rotation of the glyphs. In one example, the transformation module 110 provides angular line data describing lines parallel to the updated multiple glyph bounding box 1510 to an angular snapping module of the application for creating and/or editing the digital content 112. The angular snapping module searches the object 1508 for lines parallel to the lines described by the angular line data of the updated multiple glyph bounding box 1510. - A potential alignment of the
glyphs relative to the object 1508 is identified based on the angular line data describing lines parallel to the updated multiple glyph bounding box 1510. By leveraging the angular line data based on the updated multiple glyph bounding box 1510, the transformation module 110 identifies the potential alignment without converting the glyphs into outlines. Thus, the transformation module 110 identifies the potential alignment and the glyphs are transformed as editable text. - The
transformation module 110 renders the potential alignment as an angle guide 1512 which is usable to align the glyphs relative to the object 1508. The transformation module 110 also renders an alignment indication 1514 which visually indicates a portion of the object 1508 identified as part of identifying the potential alignment of the glyphs relative to the object 1508. As shown, the angle guide 1512 provides a visual indication that the updated multiple glyph bounding box 1510 is parallel to a line extending from a base of the object 1508 and the alignment indication 1514 illustrates the line extending from the base of the object 1508. In an example, in response to receiving a user input defining a snapping operation, the angular snapping module of the application for creating and/or editing the digital content 112 can snap the glyphs to the object 1508 using the angle guide 1512. -
-
FIG. 16 illustrates an example system 1600 that includes an example computing device that is representative of one or more computing systems and/or devices that may implement the various techniques described herein. This is illustrated through inclusion of the transformation module 110. The computing device 1602 may be, for example, a server of a service provider, a device associated with a client (e.g., a client device), an on-chip system, and/or any other suitable computing device or computing system. - The
example computing device 1602 as illustrated includes a processing system 1604, one or more computer-readable media 1606, and one or more I/O interfaces 1608 that are communicatively coupled, one to another. Although not shown, the computing device 1602 may further include a system bus or other data and command transfer system that couples the various components, one to another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. A variety of other examples are also contemplated, such as control and data lines. - The
processing system 1604 is representative of functionality to perform one or more operations using hardware. Accordingly, the processing system 1604 is illustrated as including hardware elements 1610 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 1610 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, processors may be comprised of semiconductor(s) and/or transistors (e.g., electronic integrated circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions. - The computer-
readable media 1606 is illustrated as including memory/storage 1612. The memory/storage 1612 represents memory/storage capacity associated with one or more computer-readable media. The memory/storage component 1612 may include volatile media (such as random access memory (RAM)) and/or nonvolatile media (such as read only memory (ROM), Flash memory, optical disks, magnetic disks, and so forth). The memory/storage component 1612 may include fixed media (e.g., RAM, ROM, a fixed hard drive, and so on) as well as removable media (e.g., Flash memory, a removable hard drive, an optical disc, and so forth). The computer-readable media 1606 may be configured in a variety of other ways as further described below. - Input/output interface(s) 1608 are representative of functionality to allow a user to enter commands and information to
computing device 1602, and also allow information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone, a scanner, touch functionality (e.g., capacitive or other sensors that are configured to detect physical touch), a camera (e.g., which may employ visible or non-visible wavelengths such as infrared frequencies to recognize movement as gestures that do not involve touch), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a tactile-response device, and so forth. Thus, the computing device 1602 may be configured in a variety of ways as further described below to support user interaction. - Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, such modules include routines, programs, objects, elements, components, data structures, and so forth that perform particular tasks or implement particular abstract data types. The terms “module,” “functionality,” and “component” as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of commercial computing platforms having a variety of processors.
- An implementation of the described modules and techniques may be stored on or transmitted across some form of computer-readable media. The computer-readable media may include a variety of media that may be accessed by the
computing device 1602. By way of example, and not limitation, computer-readable media may include “computer-readable storage media” and “computer-readable signal media.” - “Computer-readable storage media” may refer to media and/or devices that enable persistent and/or non-transitory storage of information in contrast to mere signal transmission, carrier waves, or signals per se. Thus, computer-readable storage media refers to non-signal bearing media. The computer-readable storage media includes hardware such as volatile and non-volatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer readable instructions, data structures, program modules, logic elements/circuits, or other data. Examples of computer-readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage device, tangible media, or article of manufacture suitable to store the desired information and which may be accessed by a computer.
- “Computer-readable signal media” may refer to a signal-bearing medium that is configured to transmit instructions to the hardware of the
computing device 1602, such as via a network. Signal media typically may embody computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as carrier waves, data signals, or other transport mechanism. Signal media also include any information delivery media. The term “modulated data signal” means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared, and other wireless media. - As previously described,
hardware elements 1610 and computer-readable media 1606 are representative of modules, programmable device logic and/or fixed device logic implemented in a hardware form that may be employed in some embodiments to implement at least some aspects of the techniques described herein, such as to perform one or more instructions. Hardware may include components of an integrated circuit or on-chip system, an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a complex programmable logic device (CPLD), and other implementations in silicon or other hardware. In this context, hardware may operate as a processing device that performs program tasks defined by instructions and/or logic embodied by the hardware as well as hardware utilized to store instructions for execution, e.g., the computer-readable storage media described previously. - Combinations of the foregoing may also be employed to implement various techniques described herein. Accordingly, software, hardware, or executable modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage media and/or by one or
more hardware elements 1610. The computing device 1602 may be configured to implement particular instructions and/or functions corresponding to the software and/or hardware modules. Accordingly, implementation of a module that is executable by the computing device 1602 as software may be achieved at least partially in hardware, e.g., through use of computer-readable storage media and/or hardware elements 1610 of the processing system 1604. The instructions and/or functions may be executable/operable by one or more articles of manufacture (for example, one or more computing devices 1602 and/or processing systems 1604) to implement techniques, modules, and examples described herein. - The techniques described herein may be supported by various configurations of the
computing device 1602 and are not limited to the specific examples of the techniques described herein. This functionality may also be implemented all or in part through use of a distributed system, such as over a “cloud” 1614 as described below. - The
cloud 1614 includes and/or is representative of a platform 1616 for resources 1618. The platform 1616 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 1614. The resources 1618 may include applications and/or data that can be utilized while computer processing is executed on servers that are remote from the computing device 1602. Resources 1618 can also include services provided over the Internet and/or through a subscriber network, such as a cellular or Wi-Fi network. - The
platform 1616 may abstract resources 1618 and functions to connect the computing device 1602 with other computing devices. The platform may also serve to abstract scaling of resources to provide a corresponding level of scale to encountered demand for the resources that are implemented via the platform. Accordingly, in an interconnected device embodiment, implementation of functionality described herein may be distributed throughout the system 1600. For example, the functionality may be implemented in part on the computing device 1602 as well as via the platform 1616 that abstracts the functionality of the cloud 1614. - Although implementations of glyph transformations as editable text have been described in language specific to structural features and/or methods, it is to be understood that the appended claims are not necessarily limited to the specific features or methods described. Rather, the specific features and methods are disclosed as example implementations of glyph transformations as editable text, and other equivalent features and methods are intended to be within the scope of the appended claims. Further, various different examples are described and it is to be appreciated that each described example can be implemented independently or in connection with one or more other described examples.
Claims (20)
1. In a digital medium environment to transform multiple glyphs as editable text relative to an object, a method implemented by a computing device, the method comprising:
generating, by the computing device, bounding boxes for a first glyph and a second glyph of the multiple glyphs;
generating, by the computing device, a multiple glyph bounding box for the multiple glyphs by concatenating the bounding boxes for the first and second glyphs;
receiving, by the computing device, a user input defining a transformation of the multiple glyph bounding box relative to the object;
determining, by the computing device, a mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first and second glyphs; and
generating, for display in a user interface, the multiple glyphs as the editable text having the transformation based on the mapping.
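Read as a pipeline, the steps of claim 1 map naturally to a few small functions. The sketch below is illustrative only: the names, the axis-aligned (x0, y0, x1, y1) box representation, and the displacement example are assumptions, not the claimed implementation.

```python
def glyph_bounds(glyphs):
    """Step 1: one bounding box (x0, y0, x1, y1) per glyph outline."""
    return [(g["x"], g["y"], g["x"] + g["w"], g["y"] + g["h"]) for g in glyphs]

def concatenate(boxes):
    """Step 2: the multiple glyph bounding box is the union of the per-glyph boxes."""
    x0 = min(b[0] for b in boxes)
    y0 = min(b[1] for b in boxes)
    x1 = max(b[2] for b in boxes)
    y1 = max(b[3] for b in boxes)
    return (x0, y0, x1, y1)

def map_displacement(boxes, dx, dy):
    """Steps 3-4: a displacement of the group box maps to an identical
    displacement of each per-glyph box, keeping the text editable."""
    return [(b[0] + dx, b[1] + dy, b[2] + dx, b[3] + dy) for b in boxes]

# Two hypothetical glyphs laid out side by side.
glyphs = [{"x": 0, "y": 0, "w": 10, "h": 12}, {"x": 12, "y": 0, "w": 8, "h": 12}]
boxes = glyph_bounds(glyphs)
group = concatenate(boxes)            # the multiple glyph bounding box
moved = map_displacement(boxes, 5, 3) # group displacement mapped per glyph
```

Step 5 (rendering) would then draw each glyph at its mapped box, so the result remains selectable, editable text rather than rasterized artwork.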
2. The method as described in claim 1, wherein the transformation is a rotation of the multiple glyphs relative to the object.
3. The method as described in claim 1, wherein the transformation is a displacement of the multiple glyphs relative to the object.
4. The method as described in claim 1, wherein the transformation is a scaling of the multiple glyphs relative to the object.
5. The method as described in claim 1, further comprising rendering an indication in the user interface as a guide for the transformation.
6. The method as described in claim 5, wherein the guide is an alignment guide rendered in the user interface as a line parallel to a portion of an object.
7. The method as described in claim 6, wherein the alignment guide is an angle alignment guide.
8. The method as described in claim 5, wherein the guide is an equal spacing guide rendered in the user interface as a pair of lines having an equal length.
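The angle alignment guide of claims 5-7 implies a parallelism test between the multiple glyph bounding box and a portion of the object, with optional snapping. A minimal sketch, assuming angles in degrees and a hypothetical snapping tolerance; none of these names or values come from the patent:

```python
def snap_angle(glyph_box_angle_deg, object_base_angle_deg, tolerance_deg=2.0):
    """Snap the glyph bounding box angle to the object's base angle when
    the two are within a small tolerance (hypothetical value)."""
    # signed difference wrapped into (-180, 180] so 359 vs 0 counts as close
    delta = (glyph_box_angle_deg - object_base_angle_deg + 180.0) % 360.0 - 180.0
    if abs(delta) <= tolerance_deg:
        return object_base_angle_deg  # snapped: box is now parallel to the base
    return glyph_box_angle_deg        # outside tolerance: leave unchanged
```

In a UI, the angle guide would be rendered as the line at `object_base_angle_deg` whenever `abs(delta)` falls inside the tolerance, signaling that the snap is available.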
9. In a digital medium environment to transform multiple glyphs as editable text relative to an object, a system comprising:
a generation module implemented at least partially in hardware of a computing device to:
generate bounding boxes for a first glyph and a second glyph of the multiple glyphs; and
generate a multiple glyph bounding box for the multiple glyphs by concatenating the bounding boxes for the first and second glyphs;
a mapping module implemented at least partially in the hardware of the computing device to:
receive a user input defining a transformation of the multiple glyph bounding box relative to the object; and
determine a mapping of the transformation of the multiple glyph bounding box to the bounding boxes for the first and second glyphs; and
a rendering module implemented at least partially in the hardware of the computing device to generate the multiple glyphs as the editable text having the transformation.
10. The system as described in claim 9, further comprising a guide module implemented at least partially in the hardware of the computing device to:
calculate positions of corners of the multiple glyph bounding box; and
generate an indication for display in a user interface as a guide for the transformation of the multiple glyphs based on the positions of the corners.
11. The system as described in claim 10, wherein the guide is an alignment guide generated for display in the user interface as a line parallel to a portion of an object.
12. The system as described in claim 11, wherein the alignment guide is an angle alignment guide.
13. The system as described in claim 10, wherein the guide is an equal spacing guide generated for display in the user interface as a pair of lines having an equal length.
14. The system as described in claim 9, wherein the transformation is a rotation of the multiple glyphs relative to the object.
15. The system as described in claim 9, wherein the transformation is a scaling of the multiple glyphs relative to the object.
16. The system as described in claim 9, wherein the transformation is a displacement of the multiple glyphs relative to the object.
17. One or more computer-readable storage media comprising instructions stored thereon that, responsive to execution by a computing device in a digital medium environment to transform multiple glyphs as editable text relative to an object, cause operations of the computing device including:
generating a first bounding box for a first glyph and a second bounding box for a second glyph of the multiple glyphs, the first bounding box enveloping an outline of the first glyph and the second bounding box enveloping an outline of the second glyph;
generating a multiple glyph bounding box for the multiple glyphs by concatenating the first bounding box and the second bounding box;
receiving a user input defining a transformation of the multiple glyph bounding box relative to the object;
determining a mapping of the transformation of the multiple glyph bounding box to the first bounding box and the second bounding box; and
generating, for display in a user interface, the multiple glyphs as the editable text having the transformation relative to the object based on the mapping.
18. The one or more computer-readable storage media as described in claim 17, wherein the transformation is a rotation of the multiple glyphs relative to the object.
19. The one or more computer-readable storage media as described in claim 17, wherein the transformation is a displacement of the multiple glyphs relative to the object.
20. The one or more computer-readable storage media as described in claim 17, wherein the transformation is a scaling of the multiple glyphs relative to the object.
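The rotation, displacement, and scaling recited across claims 18-20 are all affine transformations, so one mapping routine can cover the three cases: build a matrix for the group-box transform, then apply that same matrix to each glyph's box corners. A sketch under that assumption; the 2x3 matrix layout and function names are hypothetical:

```python
import math

def rotate_about(cx, cy, angle_deg):
    """2x3 affine matrix (a, b, tx, d, e, ty) for a rotation about (cx, cy)."""
    c = math.cos(math.radians(angle_deg))
    s = math.sin(math.radians(angle_deg))
    # translate-to-origin, rotate, translate-back, folded into one matrix
    return (c, -s, cx - c * cx + s * cy,
            s,  c, cy - s * cx - c * cy)

def apply_affine(m, x, y):
    a, b, tx, d, e, ty = m
    return (a * x + b * y + tx, d * x + e * y + ty)

def map_to_glyphs(m, glyph_corner_boxes):
    """Apply the group-box transform to every glyph box corner, so each
    glyph rotates (or scales) about the shared group-box center."""
    return [[apply_affine(m, x, y) for (x, y) in box] for box in glyph_corner_boxes]

# Rotate one glyph box 90 degrees about a group center at (10, 6).
m = rotate_about(10, 6, 90)
corners = [[(0, 0), (10, 0), (10, 12), (0, 12)]]
rotated = map_to_glyphs(m, corners)
```

Displacement is the special case where the matrix is (1, 0, dx, 0, 1, dy), and uniform scaling about the center replaces the cosine/sine terms with a scale factor; the per-glyph mapping loop is unchanged.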
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/749,563 US11069027B1 (en) | 2020-01-22 | 2020-01-22 | Glyph transformations as editable text |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/749,563 US11069027B1 (en) | 2020-01-22 | 2020-01-22 | Glyph transformations as editable text |
Publications (2)
Publication Number | Publication Date |
---|---|
US11069027B1 US11069027B1 (en) | 2021-07-20 |
US20210224946A1 true US20210224946A1 (en) | 2021-07-22 |
Family
ID=76857174
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/749,563 Active US11069027B1 (en) | 2020-01-22 | 2020-01-22 | Glyph transformations as editable text |
Country Status (1)
Country | Link |
---|---|
US (1) | US11069027B1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2021201352A1 (en) * | 2021-03-02 | 2022-09-22 | Canva Pty Ltd | Systems and methods for extracting text from portable document format data |
US20220337709A1 (en) * | 2021-04-15 | 2022-10-20 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120001922A1 (en) * | 2009-01-26 | 2012-01-05 | Escher Marc | System and method for creating and sharing personalized fonts on a client/server architecture |
AU2009202377A1 (en) * | 2009-06-15 | 2011-01-06 | Canon Kabushiki Kaisha | Combining overlapping objects |
US9286861B2 (en) * | 2011-01-28 | 2016-03-15 | Provo Craft And Novelty, Inc. | System and method for providing digital content |
US9250786B2 (en) * | 2013-07-16 | 2016-02-02 | Adobe Systems Incorporated | Snapping of object features via dragging |
US20150301721A1 (en) * | 2014-01-02 | 2015-10-22 | n2y LLC | Desktop publishing tool |
US9235757B1 (en) * | 2014-07-24 | 2016-01-12 | Amazon Technologies, Inc. | Fast text detection |
US9842251B2 (en) * | 2016-01-29 | 2017-12-12 | Konica Minolta Laboratory U.S.A., Inc. | Bulleted lists |
US10115374B2 (en) * | 2016-05-18 | 2018-10-30 | Blackberry Limited | Variable glyph encoding |
US10830594B2 (en) * | 2017-11-06 | 2020-11-10 | Mitac International Corp. | Updating missing attributes in navigational map data via polyline geometry matching |
US10157331B1 (en) * | 2018-03-08 | 2018-12-18 | Capital One Services, Llc | Systems and methods for image preprocessing to improve accuracy of object recognition |
- 2020-01-22: US application 16/749,563 filed; granted as US11069027B1 (status: Active)
Also Published As
Publication number | Publication date |
---|---|
US11069027B1 (en) | 2021-07-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10839139B2 (en) | Glyph aware snapping | |
US10127221B2 (en) | Detection and reconstruction of East Asian layout features in a fixed format document | |
WO2019119966A1 (en) | Text image processing method, device, equipment, and storage medium | |
US7750924B2 (en) | Method and computer-readable medium for generating graphics having a finite number of dynamically sized and positioned shapes | |
US7982737B2 (en) | System and method for independent font substitution of string characters | |
US11403794B2 (en) | Glyph weight modification | |
US8922582B2 (en) | Text rendering and display using composite bitmap images | |
TW200306489A (en) | Interfacing with ink | |
JP2006114013A (en) | System and method for automatic label placement on chart | |
US10319129B2 (en) | Snapping line generation | |
US11069027B1 (en) | Glyph transformations as editable text | |
US20180357206A1 (en) | Content inker | |
US10403040B2 (en) | Vector graphics rendering techniques | |
US11030388B2 (en) | Live text glyph modifications | |
US10366518B2 (en) | Extension of text on a path | |
US9965457B2 (en) | Methods and systems of applying a confidence map to a fillable form | |
EP3422251A1 (en) | Typesetness score for a table | |
US11763064B2 (en) | Glyph accessibility and swash control system | |
CN112256175B (en) | Text display method, text display device, electronic equipment and computer readable storage medium | |
US10984173B2 (en) | Vector-based glyph style transfer | |
US10930045B2 (en) | Digital ink based visual components | |
US11755817B2 (en) | Systems for generating snap guides relative to glyphs of editable text | |
US11900510B2 (en) | Glyph size control for digital content | |
JP7430219B2 (en) | Document information structuring device, document information structuring method and program | |
US11631206B2 (en) | Glyph selection tool for digital text content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
 | AS | Assignment | Owner name: ADOBE INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JAIN, ARUSHI;DHANUKA, PRAVEEN KUMAR;JAIN, ASHISH;SIGNING DATES FROM 20200123 TO 20200129;REEL/FRAME:051663/0452 |
 | STCF | Information on status: patent grant | Free format text: PATENTED CASE |