CN108241594B - Character editing method, electronic device, and recording medium - Google Patents

Character editing method, electronic device, and recording medium Download PDF

Info

Publication number
CN108241594B
Authority
CN
China
Prior art keywords
character
image
images
format
formats
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710975869.1A
Other languages
Chinese (zh)
Other versions
CN108241594A (en)
Inventor
冈野満
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Casio Computer Co Ltd
Original Assignee
Casio Computer Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Casio Computer Co Ltd filed Critical Casio Computer Co Ltd
Publication of CN108241594A
Application granted granted Critical
Publication of CN108241594B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/103 Formatting, i.e. changing of presentation of documents
    • G06F40/109 Font handling; Temporal or kinetic typography
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B41 PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41F PRINTING MACHINES OR PRESSES
    • B41F33/00 Indicating, counting, warning, control or safety devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/10 Text processing
    • G06F40/103 Formatting, i.e. changing of presentation of documents
    • G06F40/106 Display of layout of documents; Previewing

Abstract

A character editing method, an electronic device, and a recording medium. The character editing method comprises the following steps: acquiring character information of a reference character from a reference image including an image of the reference character; generating, based on the character information of the reference character and a plurality of character formats stored in a storage unit, a plurality of comparison images obtained by applying each of the plurality of character formats to the reference character; calculating a similarity between the image of the reference character and each of the plurality of comparison images; and displaying, on a display unit, based on the similarity, an image of at least one candidate character obtained by applying at least one of the plurality of character formats to an input editing target character.

Description

Character editing method, electronic device, and recording medium
Technical Field
The invention relates to a character editing method, an electronic device and a recording medium.
Background
Conventionally, there are known application programs that run on an electronic device such as a smartphone or a personal computer and edit characters to be printed on a print medium by a printing device such as a label printer. Such an application program is described in Patent Document 1 (Japanese Patent Application Laid-Open No. 2013-037393), for example.
However, such electronic devices have various character editing functions, for example a function of setting a character format such as a font, italics, or bold. Editing characters can be a time-consuming operation; in the case of font setting, for example, a large number of fonts is generally registered in the electronic device, and it is not easy to find and set a desired font.
Disclosure of Invention
In view of the above circumstances, an object of one aspect of the present invention is to reduce the operation load on the user required for character editing.
The present invention provides a character editing method comprising: acquiring character information of a reference character from a reference image including an image of the reference character; generating, based on the character information of the reference character and a plurality of character formats stored in a storage unit, a plurality of comparison images obtained by applying each of the plurality of character formats to the reference character; calculating a similarity between the image of the reference character and each of the plurality of comparison images; and displaying, on a display unit, based on the similarity, an image of at least one candidate character obtained by applying at least one of the plurality of character formats to an input editing target character.
The present invention also provides an electronic device including: a storage unit that stores information on a plurality of character formats; a display unit; and a calculation unit that acquires character information of a reference character from a reference image including an image of the reference character, generates a plurality of comparison images obtained by applying each of the plurality of character formats to the reference character based on the character information of the reference character and the plurality of character formats stored in the storage unit, calculates a similarity between the image of the reference character and each of the plurality of comparison images, and causes the display unit to display, based on the similarity, an image of at least one candidate character obtained by applying at least one of the plurality of character formats to an input editing target character.
The present invention also provides a recording medium recording a program that causes a computer to execute processing of: acquiring character information of a reference character from a reference image including an image of the reference character; generating, based on the character information of the reference character and a plurality of character formats stored in a storage unit, a plurality of comparison images obtained by applying each of the plurality of character formats to the reference character; calculating a similarity between the image of the reference character and each of the plurality of comparison images; and displaying on a display unit, based on the similarity, an image of at least one candidate character obtained by applying at least one of the plurality of character formats to an input editing target character.
According to the above aspect, the operation load of the user required for character editing can be reduced.
Drawings
Fig. 1 is a diagram illustrating a configuration of a printing system according to the present embodiment.
Fig. 2 is a block diagram showing the configuration of the printing apparatus 1 and the electronic device 100.
Fig. 3 is an example of a flowchart of the character editing process.
Fig. 4 is an example of the label editing screen before font setting.
Fig. 5 is a diagram for explaining a method of acquiring a reference image.
Fig. 6 is an example of a reference image display screen.
Fig. 7 is an example of a font selection screen.
Fig. 8 is an example of a label editing screen after font selection.
Fig. 9 is a modified example of the flowchart of the character editing process.
Fig. 10 shows another example of the reference image display screen.
Fig. 11 is another example of the font selection screen.
Fig. 12 is another modification of the flowchart of the character editing process.
Fig. 13 is a diagram for explaining a method of calculating a size ratio between a printed matter and a reference character.
Detailed Description
Fig. 1 is a diagram illustrating a configuration of a printing system according to the present embodiment. Fig. 2 is a block diagram showing the configuration of the printing apparatus 1 and the electronic device 100. The printing system shown in fig. 1 includes a printing apparatus 1 and an electronic device 100 that transmits print data to the printing apparatus 1. The printing apparatus 1 and the electronic device 100 exchange data by wireless communication or wired communication.
The printing apparatus 1 is a printing apparatus that prints on a print medium M, and is, for example, a label printer that prints on a long print medium M in a single pass. In the following, a case where the printing apparatus 1 is a thermal transfer type label printer using an ink ribbon will be described as an example, but the printing apparatus 1 may be a thermal type label printer using thermal paper. The printing apparatus 1 as a label printer is not limited to a thermal printer, and may be an inkjet printer.
The print medium M is, for example, a tape member including a base material having an adhesive layer and a release paper releasably attached to the base material so as to cover the adhesive layer. The print medium M may also be a tape member without release paper.
As shown in fig. 1, in the printing apparatus 1, a lid 10 and a plurality of buttons (a button 21, a button 22a, a button 22b, a button 22c, and a button 22d) are provided on an upper surface of a tubular apparatus housing 20. The button 21 is an opening/closing button of the lid 10. The buttons 22a to 22d are a power button, a wireless communication button, a supply button, and a cut button, respectively. Although not shown, the apparatus housing 20 is provided with a power line connection terminal, an external device connection terminal, and the like.
The lid 10 is provided to be openable and closable. The lid portion 10 is opened by pressing the button 21, and exposes the cartridge housing portion for housing the tape cartridge to the outside. The lid portion 10 is formed with a window 11 so that whether or not the tape cassette is stored in the cassette storage portion can be confirmed by visual observation even in a state where the lid portion 10 is closed. Further, a discharge port 20a is formed in a side surface of the apparatus housing 20. The printing medium M printed in the printing apparatus 1 is discharged from the discharge port 20a to the outside of the apparatus.
In addition to the above configuration, as shown in fig. 2, the printing apparatus 1 includes a control unit 2, a communication interface 3, a ROM (Read Only Memory) 4, a RAM (Random Access Memory) 5, a thermistor 23, a head drive circuit 24, a thermal head 25, a platen roller 26, a conveying motor 27, a conveying motor drive circuit 28, a cutter motor drive circuit 29, a cutter motor 30, a full-cut mechanism 31, a half-cut mechanism 32, and a tape width detection switch 33.
The control unit 2 includes a processor 2a such as a CPU (Central Processing Unit). The control unit 2 controls the operations of the respective units of the printing apparatus 1 by loading a program stored in the ROM 4 into the RAM 5 and executing it. The communication interface 3 transmits and receives data to and from an external device (e.g., the electronic apparatus 100) by wired or wireless communication.
The ROM4 stores a print program for printing on the print medium M and various data (for example, fonts and the like) necessary for execution of the print program. The ROM4 also functions as a storage medium storing a program that can be read by the control unit 2. The RAM5 is a print data memory for storing print data, and is a work memory used for executing programs.
The head drive circuit 24 energizes the heating elements 25a of the thermal head 25 based on the print data and a strobe signal. The thermal head 25 is a print head having a plurality of heating elements 25a arranged in the main scanning direction. A thermistor 23 for measuring the temperature of the thermal head 25 is embedded in the thermal head 25. The thermal head 25 heats the ink ribbon with the heating elements 25a and prints on the print medium M one line at a time by thermal transfer.
The conveying motor drive circuit 28 drives the conveying motor 27. The conveyance motor 27 is, for example, a stepping motor, and rotates the platen roller 26. The platen roller 26 is rotated by power of the conveyance motor 27, and conveys the printing medium M along the longitudinal direction (sub-scanning direction, conveyance direction) of the printing medium M.
The cutter motor drive circuit 29 drives the cutter motor 30. The full-cut mechanism 31 and the half-cut mechanism 32 are operated by the power of the cutter motor 30 to perform a full cut or a half cut of the print medium M. A full cut is an operation of cutting the base material of the print medium M together with the release paper along the width direction of the print medium M, and a half cut is an operation of cutting only the base material along the width direction. For example, when the button 22d is pressed, the cutter motor drive circuit 29 may drive the cutter motor 30 to cause the full-cut mechanism 31 to perform a full cut of the print medium M.
The tape width detection switches 33 are switches for detecting the width of the print medium M based on the shape of the tape cassette. A plurality of tape width detection switches 33 are provided in the cassette housing portion. Tape cassettes with different tape widths are shaped so that the plurality of tape width detection switches 33 are pressed in different combinations. The control unit 2 identifies the type of the tape cassette based on the combination of pressed tape width detection switches 33, and thereby detects the width (tape width) of the print medium M.
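As a concrete illustration of this detection scheme, the following Python sketch maps a combination of pressed tape width detection switches to a tape width; the number of switches and the width values are assumptions made for the example, not values taken from the patent.

    # Hypothetical mapping from switch states to tape width (mm); the switch
    # count and the width values are illustrative assumptions only.
    TAPE_WIDTH_MM_BY_SWITCHES = {
        (False, False, True): 6,
        (False, True, False): 9,
        (False, True, True): 12,
        (True, False, False): 18,
        (True, True, False): 24,
    }

    def detect_tape_width(pressed_switches):
        """Return the tape width in mm for a tuple of switch states, or None if unknown."""
        return TAPE_WIDTH_MM_BY_SWITCHES.get(tuple(pressed_switches))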
As shown in figs. 1 and 2, the electronic apparatus 100 includes a display unit 101, an input unit 103, and an imaging device 104, and is a portable computer such as a smartphone or a tablet terminal. The display unit 101 may be, for example, a liquid crystal display or an organic electroluminescence (organic EL) display. The input unit 103 is, for example, a touch panel. The imaging device 104 is a camera module including an imaging element such as a CMOS (Complementary Metal Oxide Semiconductor) sensor, for example, and acquires a reference image described later.
In addition to the above configuration, the electronic apparatus 100 includes a display drive circuit 102, a register 105, a communication interface 106, a ROM 107, a RAM 108, and a control unit 109. The display drive circuit 102 is, for example, a liquid crystal display driver circuit or an organic EL display driver circuit. The register 105 is a storage unit that stores a plurality of character formats. A character format is a format set for characters, that is, a setting for the form of characters. The character format includes font, font size, character color, italics, underlining, and bold. The control unit 109 includes a processor 110. The processor 110 is a calculation unit that, by executing an application program, edits the characters to be printed on the print medium M by the printing apparatus 1.
Fig. 3 is an example of a flowchart of the character editing process. Fig. 4 is an example of the label editing screen before font setting. Fig. 5 is a diagram for explaining a method of acquiring a reference image. Fig. 6 is an example of a reference image display screen. Fig. 7 is an example of a font selection screen. Fig. 8 is an example of the label editing screen after font selection. The following describes the character editing process shown in fig. 3 performed by the electronic apparatus 100, with reference to figs. 3 to 8.
The character editing process shown in fig. 3 is a process of editing characters by changing the font setting. For example, while the electronic apparatus 100 is executing an application program for editing characters to be printed on the print medium M, pressing the font setting button B1 on the label editing screen shown in fig. 4 starts the character editing process shown in fig. 3. Fig. 4 shows an example in which an editing target character C1 (here, "ABC algorithm") has been input in the editing area ER of the label editing screen. The editing target characters are the characters to be edited by changing a character format such as the font.
When the character editing process is started, first, the processor 110 acquires character information of the editing target character C1 (step S1). The character information is the content of a character, independent of any character format setting, and is specified by, for example, a character code. A character has a form in addition to its content (the character information); even characters with the same character information are treated as different characters if their forms (formats such as fonts) differ.
Next, the processor 110 acquires the reference image M1 (step S2). Here, the reference image M1 is an image containing characters (hereinafter, reference characters) rendered in the character format (here, the font) that the user wants to set for the editing target characters. The reference image M1 may be, for example, an image obtained from the Internet by the user operating the electronic apparatus 100, or an image obtained by the user photographing an object with the imaging device 104 of the electronic apparatus 100.
Fig. 5 shows an example in which the user photographs a poster (object) 200 with the imaging device 104 of the electronic apparatus 100 to obtain a reference image M1. Fig. 6(a) shows an example in which the acquired reference image M1 is displayed on the display unit 101.
When the reference image M1 is acquired, the processor 110 acquires character information of the reference character C11 from the reference image M1 (step S3). The character information of the reference character C11 (here, "XYZ car") is acquired using a technique such as Optical Character Recognition (OCR) for extracting character information from an image.
In the case where the reference image M1 contains characters or the like other than the reference character C11, the processor 110 may first perform preprocessing that sets a target region on the reference image M1 in accordance with an input from the user, and then acquire the character information of the reference character C11 from that target region. Here, the target region is a region selected in the reference image and is a region including the reference character C11. This makes it possible to acquire the character information of the reference character C11 more reliably. Fig. 6(b) shows an example in which a target region ROI (the region surrounded by a broken line) including the reference character C11 is set in the reference image M2.
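A minimal sketch of this step is shown below in Python, assuming Pillow for image handling and pytesseract as the OCR engine; the patent does not prescribe a specific OCR library, so the function and parameter names here are illustrative.

    from PIL import Image
    import pytesseract  # one possible OCR backend; any OCR technique could be used

    def extract_reference_text(image_path, roi_box=None):
        """Extract the reference character's text, optionally from a target region.

        roi_box is an optional (left, upper, right, lower) pixel box selected by the user.
        """
        image = Image.open(image_path)
        if roi_box is not None:
            image = image.crop(roi_box)  # restrict OCR to the target region (ROI)
        return pytesseract.image_to_string(image).strip()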
When the character information of the reference character C11 is acquired, the processor 110 generates a plurality of images for comparison (step S4). Specifically, the processor 110 generates a plurality of images for comparison each including a character corresponding to the reference character C11 based on the character information of the reference character C11 and the plurality of fonts stored in the register 105. Fig. 6(c) shows an example of a plurality of comparison images P1 to Pn. Here, the character corresponding to the reference character C11 included in each of the comparison images is a character having the same character information as the reference character C11, and is a character obtained by applying each of a plurality of different fonts stored in the register 105 to the reference character C11. That is, the plurality of comparison images are a plurality of images in which each of the plurality of fonts is applied to the reference character C11.
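The sketch below (Python with Pillow) illustrates one way such comparison images could be generated, by rendering the recognized text once per registered font; the canvas size, font size, and the idea of passing font file paths are assumptions made for the example.

    from PIL import Image, ImageDraw, ImageFont

    def render_comparison_images(text, font_paths, canvas=(400, 100), font_size=64):
        """Render `text` once per font file; returns one grayscale image per font."""
        # Illustrative sketch: canvas size and font file paths are assumptions.
        images = []
        for path in font_paths:
            font = ImageFont.truetype(path, font_size)
            img = Image.new("L", canvas, color=255)             # white background
            ImageDraw.Draw(img).text((10, 10), text, font=font, fill=0)
            images.append(img)
        return images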
When the plurality of comparison images have been generated, the processor 110 calculates the similarity between the image of the reference character C11 in the reference image M1 acquired in step S2 and each of the plurality of comparison images generated in step S4 (step S5). The similarity quantitatively indicates how similar two images are. Various methods are known for calculating the similarity between images, and the calculation method used in step S5 is not limited to a specific one. For example, the similarity between images may be calculated based on LBPs (Local Binary Patterns) generated from the images. The similarity may also be calculated using a similarity calculation method known as Average Hash, or by computing, for example, the Sum of Squared Differences (SSD), the Sum of Absolute Differences (SAD), or the Normalized Cross-Correlation (NCC). Here, when the reference image M1 is obtained by photographing an object with the imaging device 104, the object may have been photographed from a position offset from the front, so that it appears distorted in the image. In such a case, it is preferable to deform the reference image M1 so that the object approximates its appearance when photographed from the front, and then perform the above-described processing.
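Of the measures listed above, SSD and NCC are straightforward to compute on two grayscale arrays of the same shape (for example, after resizing the image of the reference character to the comparison image size); the following is a minimal numpy sketch, not the patent's prescribed method.

    import numpy as np

    # Minimal sketch; both arrays are assumed to already have the same shape.
    def ssd(a, b):
        """Sum of squared differences; smaller means more similar."""
        d = a.astype(np.float64) - b.astype(np.float64)
        return float(np.sum(d * d))

    def ncc(a, b):
        """Normalized cross-correlation in [-1, 1]; larger means more similar."""
        a = a.astype(np.float64) - a.mean()
        b = b.astype(np.float64) - b.mean()
        denom = np.sqrt(np.sum(a * a) * np.sum(b * b))
        return float(np.sum(a * b) / denom) if denom else 0.0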
When the similarity has been calculated, the processor 110 selects at least one font from the plurality of fonts stored in the register 105 based on the similarity calculated in step S5 (step S6). Here, the processor 110 may identify, among the plurality of comparison images, the comparison images whose similarity to the image of the reference character C11 is equal to or higher than a threshold value, and select the fonts applied to those comparison images as the at least one specific font. This prevents similar fonts from being overlooked. The threshold value may be changed arbitrarily by the user. Alternatively, the processor 110 may identify a predetermined number of comparison images taken in descending order of similarity to the image of the reference character C11 in the reference image M1, and select the fonts applied to those comparison images as the at least one specific font. This reduces the number of similar fonts in advance to a number suited to, for example, the size of the display unit 101. The predetermined number may be changed arbitrarily by the user.
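A minimal Python sketch of these two selection strategies is shown below, assuming a similarity score where a larger value means more similar; the threshold and count are user-adjustable as described above.

    def select_fonts(similarities, fonts, threshold=None, top_n=None):
        """Select fonts whose comparison images scored >= threshold, or the top_n best."""
        # Illustrative sketch; a larger score is assumed to mean more similar.
        ranked = sorted(zip(similarities, fonts), key=lambda pair: pair[0], reverse=True)
        if threshold is not None:
            return [font for score, font in ranked if score >= threshold]
        return [font for score, font in ranked[:top_n]]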
Then, the processor 110 displays a list of images of candidate characters to which different fonts are applied (step S7). Here, the processor 110 applies the at least one specific font selected in step S6 to the editing target character C1 to obtain at least one candidate character corresponding to the editing target character C1. Each candidate character has the same character information as the editing target character C1. The processor 110 then displays the images of the at least one candidate character in a list on the display unit 101. Thereby, an image of at least one candidate character, one per applied font, is displayed on the display unit 101. Fig. 7 shows an example in which images Q1 to Q4 of four candidate characters (images of candidate characters C2, C3, C4, and C5) are displayed in a list on the display unit 101.
Next, the processor 110 monitors the selection of a candidate character by the user (step S8). When the processor 110 receives a selection of one candidate character image from among the at least one candidate character image displayed on the display unit 101, it applies the font that was applied to the selected candidate character to the editing target character C1 (step S9). By applying the font, the processor 110 edits the editing target character C1 into the printing target character and updates the display of the label editing screen. Fig. 8 shows an example in which the candidate character C2 is selected on the font selection screen shown in fig. 7, the editing target character C1 in the editing area ER of the label editing screen is edited into the printing target character C2, and the printing target character C2, obtained by applying the selected font to the editing target character C1, is set as the character to be printed.
Finally, the processor 110 determines whether or not to end the character editing process shown in fig. 3 (step S10), and repeatedly executes the processes of steps S1 to S10 until it is determined to end the character editing process.
In the electronic apparatus 100, when the user selects an image containing a character (reference character) rendered in the font that the user wants to set for the editing target character, a list of images of candidate characters in which the same or a similar font is applied to the editing target character is displayed. The user can therefore edit the editing target character into a preferred font simply by selecting an arbitrary character from the images of candidate characters displayed in the list. Therefore, according to the electronic apparatus 100, the operation load on the user required for character editing can be reduced.
Although fig. 3 illustrates a process of editing a character by changing the font setting, the electronic apparatus 100 may edit a character by changing the setting of the entire character format without being limited to the font setting.
Fig. 9 is a modified example of the flowchart of the character editing process. Fig. 10 shows another example of the reference image display screen. Fig. 11 is another example of the font selection screen. The following describes the character editing process shown in fig. 9 performed by the electronic apparatus 100, with reference to figs. 9 to 11.
The character editing process shown in fig. 9 is a process of editing characters by changing the setting of the entire character format (including italics, underlining, bold, and the like in addition to the font). For example, the character editing process shown in fig. 9 is started by pressing a setting button for setting a character format on the label editing screen while the electronic device 100 is executing an application program for editing characters to be printed on the print medium M.
When the character editing process is started, first, the processor 110 acquires character information of the editing target character C1 (step S11). Next, the reference image M3 is acquired (step S12), and character information of the reference character C12 is acquired from the reference image M3 (step S13). The processing from step S11 to step S13 is the same as the processing from step S1 to step S3 shown in fig. 3. Fig. 10 shows an example in which the reference image M3 acquired in step S12 is displayed on the display unit 101. The reference image M3 contains an underlined reference character C12.
Next, the processor 110 generates a plurality of images for comparison (step S14). Specifically, the processor 110 generates a plurality of images for comparison each including a character corresponding to the reference character C12 based on the character information of the reference character C12 and the plurality of character formats stored in the register 105. The character formats applied to the characters corresponding to the reference character C12 included in the comparison images are different from each other, and are stored in the register 105.
When a plurality of images for comparison are generated, the processor 110 calculates the degree of similarity between the image of the reference character C12 of the reference image M3 acquired in step S12 and each of the plurality of images for comparison generated in step S14 (step S15), and selects at least one specific character format based on the degree of similarity (step S16). The processing of step S15 is the same as the processing of step S5 shown in fig. 3. The processing of step S16 is similar to the processing of step S6 shown in fig. 3, except that a character format is selected instead of a font.
Then, the processor 110 displays a list of images of candidate characters to which different character formats are applied (step S17). Here, the processor 110 applies the at least one specific character format selected in step S16 to the editing target character C1 to obtain at least one candidate character corresponding to the editing target character C1. The processor 110 then displays the images of the at least one candidate character in a list on the display unit 101. Thereby, an image of at least one candidate character, one per applied character format, is displayed on the display unit 101. Fig. 11 shows an example in which images Q5 to Q8 of four candidate characters (images of candidate characters C6, C7, C8, and C9) are displayed in a list on the display unit 101.
Next, the processor 110 monitors the selection of a candidate character by the user (step S18), and when the selection of one candidate character image is accepted, applies the character format that was applied to the selected candidate character to the editing target character C1 (step S19). The editing target character C1 is thereby edited, and the display of the label editing screen is updated. The processor 110 then repeats the processing of steps S11 to S20 until it determines that the character editing process should end (step S20: YES).
By performing the character editing process shown in fig. 9, the user can easily edit the editing target character to a preferred character format by simply selecting an arbitrary character from the listed candidate characters. Therefore, according to the electronic apparatus 100, the operation load of the user required for character editing can be reduced.
In fig. 9, the process of editing characters by changing the setting of the character format is illustrated, but the electronic apparatus 100 may set the margin in addition to the setting of the character format.
Fig. 12 is another modification of the flowchart of the character editing process. Fig. 13 is a diagram for explaining a method of calculating the size ratio of a printed matter to a reference character. The following describes the character editing process shown in fig. 12 performed by the electronic device 100, with reference to figs. 12 and 13.
In the character editing process shown in fig. 12, when the print data is transmitted, margins are set in addition to the character format. The character editing process shown in fig. 12 is started by pressing a setting button for setting a character format on the label editing screen, in the same way as the character editing process shown in fig. 9. The processing of steps S21 to S30 is the same as the processing of steps S11 to S20 shown in fig. 9. In the following, a case where the reference image M4 shown in fig. 13 is acquired in step S22 will be described as an example. The reference image M4 is an image obtained by photographing the printed matter P, which is a label on which the reference character C14 is printed.
When the processor 110 determines in step S30 that editing is complete, it determines whether a print instruction has been input (step S31). Here, the processor 110 determines, for example, whether the print button B2 of the label editing screen has been pressed.
When it is determined that no print instruction has been input, the processor 110 ends the character editing process shown in fig. 12. On the other hand, when it is determined that a print instruction has been input, the processor 110 acquires the size information of the print medium M from the printing apparatus 1 (step S32). Here, the processor 110 requests the information from the printing apparatus 1 via, for example, the communication interface (communication IF) 106, thereby acquiring the size information of the print medium M from the printing apparatus 1. If the print medium M is a tape member, for example, the size information of the print medium M is the tape width.
Next, the processor 110 calculates the ratio of the size L1 of the printed matter P shown in the reference image M4 to the size L2 of the reference character C14 printed on the printed matter P, based on the reference image M4 acquired in step S22 (step S33). Both of these sizes (size L1 and size L2) may be sizes measured within the reference image M4.
Then, the processor 110 sets the size of the characters to be printed on the print medium M (step S34). Here, the processor 110 determines and sets the size of characters to be printed on the print medium M, that is, the size of candidate characters edited in step S29, based on the size information of the print medium M acquired in step S32 and the ratio value calculated in step S33. Specifically, the processor 110 sets, for example, the product of the size of the print medium M and the ratio (L2/L1) as the size of the candidate character.
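As a worked example of this calculation (the numbers are illustrative, not from the patent): if the printed matter P measures L1 = 240 pixels across in the reference image M4, the reference character C14 measures L2 = 120 pixels, and the detected tape width is 18 mm, the candidate character size is set to 18 mm × (120 / 240) = 9 mm. A one-line Python sketch follows.

    def candidate_character_size(tape_width_mm, printed_matter_px, reference_char_px):
        """Illustrative: implements size = tape width x (L2 / L1); e.g. (18, 240, 120) -> 9.0 mm."""
        return tape_width_mm * (reference_char_px / printed_matter_px)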
Finally, the processor 110 transmits print data including the candidate character edited in step S29 and the character size set in step S34 to the printing apparatus 1 (step S35), and ends the character editing process shown in fig. 12. Thus, the edited candidate character is printed on the print medium M with the same balance between character and margin as that between the reference character C14 and the margins in the printed matter P.
By performing the character editing process shown in fig. 12, the user can easily edit the editing target character into a preferred character format simply by selecting an arbitrary character from the listed candidate characters. In addition, the user can easily obtain margins with the same balance as in the reference image used for setting them. Therefore, according to the electronic apparatus 100, the operation load on the user required for character editing can be reduced.
The above embodiments show specific examples for facilitating understanding of the invention, but the invention is not limited to the above embodiments. The program, the character editing method, and the electronic device can be modified and changed in various ways without departing from the scope of the claims. For example, fig. 1 shows an example in which the electronic apparatus 100 is a portable computer, but it may also be a desktop computer. In addition, although figs. 3, 9, and 12 show examples in which a plurality of fonts or character formats are selected, a single font or character format may be selected instead, and the editing target characters may be edited by applying that one selected character format to them.

Claims (10)

1. A method of text editing comprising:
acquiring character information of a reference character from a reference image including an image of the reference character;
generating a plurality of images for comparison obtained by applying each of the plurality of character formats to the reference character, based on the character information of the reference character and the plurality of character formats stored in the storage unit;
calculating a similarity between the image of the reference character and each of the plurality of images for comparison;
applying at least one of the plurality of character formats to an inputted editing target character based on the similarity to obtain images of a plurality of candidate characters, and causing a display unit to display the images of the plurality of candidate characters in a list; and
setting, as a printing target character, a character obtained by applying a specific character format to the editing target character, when an image of a specific candidate character obtained by applying the specific character format to the editing target character is selected from the images of the plurality of candidate characters displayed on the display unit.
2. The text editing method of claim 1, further comprising:
acquiring size information of a print medium from a printing apparatus;
calculating a ratio of a size of a printed matter on which the reference characters are printed in the reference image to a size of the reference characters printed on the printed matter, based on the reference image; and
determining the size of the specific candidate character based on the size information of the print medium and the value of the ratio.
3. The text editing method of claim 1, further comprising:
selecting the at least one character format from the plurality of character formats based on the similarity.
4. The text editing method of claim 3,
the step of selecting the text format comprises:
selecting, as the at least one character format, a character format applied to a comparison image whose similarity to the image of the reference character is equal to or greater than a threshold value.
5. The text editing method according to claim 3 or claim 4,
the step of selecting the text format comprises:
selecting, as the at least one character format, the character formats applied to a predetermined number of comparison images taken in descending order of similarity to the image of the reference character.
6. The text editing method of claim 1, further comprising:
acquiring the character information of the reference character from the image of a selected region in the reference image.
7. The text editing method of claim 1,
the character format includes at least one of a font style, a font size, a character color, an italic font, an underline, and a bold font.
8. An electronic device is provided with:
a storage unit for storing information in a plurality of character formats;
a display unit; and
the calculation part is used for calculating the calculation result,
the above-mentioned calculating portion is that,
acquiring character information of a reference character from a reference image including an image of the reference character,
generating a plurality of images for comparison obtained by applying each of the plurality of character formats to the reference character based on the character information of the reference character and the plurality of character formats stored in the storage unit,
calculating a similarity between the image of the reference character and each of the plurality of images for comparison,
applying at least one of the plurality of character formats to an inputted editing target character based on the similarity to obtain images of a plurality of candidate characters, and causing the display unit to display the images of the plurality of candidate characters in a list,
when an image of a specific candidate character obtained by applying a specific character format to the editing target character is selected from the images of the plurality of candidate characters displayed on the display unit, a character obtained by applying the specific character format to the editing target character is set as a printing target character.
9. The electronic device as set forth in claim 8,
the image pickup apparatus further includes an image pickup device for acquiring the reference image.
10. A recording medium recording a program that causes a computer to execute:
acquiring character information of a reference character from a reference image including an image of the reference character,
generating a plurality of images for comparison obtained by applying each of the plurality of character formats to the reference character based on the character information of the reference character and the plurality of character formats stored in the storage unit,
calculating a similarity between the image of the reference character and each of the plurality of images for comparison,
applying at least one of the plurality of character formats to an inputted editing target character based on the similarity to obtain images of a plurality of candidate characters, and causing a display unit to display the images of the plurality of candidate characters in a list,
and setting, as a printing target character, a character obtained by applying a specific character format to the editing target character, when an image of the specific candidate character obtained by applying the specific character format to the editing target character is selected from the images of the plurality of candidate characters displayed on the display unit.
CN201710975869.1A 2016-12-26 2017-10-19 Character editing method, electronic device, and recording medium Active CN108241594B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016252073A JP6866636B2 (en) 2016-12-26 2016-12-26 Character editing methods, electronic devices, and programs
JP2016-252073 2016-12-26

Publications (2)

Publication Number Publication Date
CN108241594A CN108241594A (en) 2018-07-03
CN108241594B (en) 2022-05-03

Family

ID=62700859

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710975869.1A Active CN108241594B (en) 2016-12-26 2017-10-19 Character editing method, electronic device, and recording medium

Country Status (2)

Country Link
JP (1) JP6866636B2 (en)
CN (1) CN108241594B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111402367B (en) * 2020-03-27 2023-09-26 维沃移动通信有限公司 Image processing method and electronic equipment

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400127A (en) * 2013-08-05 2013-11-20 苏州鼎富软件科技有限公司 Picture and text identifying method
CN103488608A (en) * 2013-09-27 2014-01-01 杨昕吉 Method and system for quickly creating visual-style electronic business card
CN104954605A (en) * 2014-03-31 2015-09-30 京瓷办公信息系统株式会社 Image forming apparatus, image forming system, and image forming method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4624611B2 (en) * 2001-09-06 2011-02-02 セイコーエプソン株式会社 CHARACTER INPUT DEVICE, TAPE PRINTING DEVICE HAVING THE SAME, AND CHARACTER INPUT METHOD FOR CHARACTER INPUT DEVICE
JP4788106B2 (en) * 2004-04-12 2011-10-05 富士ゼロックス株式会社 Image dictionary creation device, encoding device, image dictionary creation method and program thereof
CN101000601B (en) * 2007-01-10 2010-05-19 方正国际软件(北京)有限公司 Device and method for automatic changing type face in type-setting process
CN101226596B (en) * 2007-01-15 2012-02-01 夏普株式会社 Document image processing apparatus and document image processing process
CN101354703B (en) * 2007-07-23 2010-11-17 夏普株式会社 Apparatus and method for processing document image
CN101493811A (en) * 2008-01-24 2009-07-29 鸿富锦精密工业(深圳)有限公司 Font automatic identification and conversion system and method
JP2010146185A (en) * 2008-12-17 2010-07-01 Sharp Corp Image processing apparatus, image reading apparatus, image sending apparatus, image processing method, program, and recording medium thereof
JPWO2015136692A1 (en) * 2014-03-14 2017-04-06 株式会社日立製作所 Electronic image document editing system

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103400127A (en) * 2013-08-05 2013-11-20 苏州鼎富软件科技有限公司 Picture and text identifying method
CN103488608A (en) * 2013-09-27 2014-01-01 杨昕吉 Method and system for quickly creating visual-style electronic business card
CN104954605A (en) * 2014-03-31 2015-09-30 京瓷办公信息系统株式会社 Image forming apparatus, image forming system, and image forming method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"How to use Photoshop’s ‘Match Font’ Feature";Luke O’Neill;《https://www.creativebloq.com/how-to/how-to-use-photoshops-match-font-feature》;20160825;第1-10页 *

Also Published As

Publication number Publication date
CN108241594A (en) 2018-07-03
JP6866636B2 (en) 2021-04-28
JP2018106422A (en) 2018-07-05

Similar Documents

Publication Publication Date Title
US9120329B2 (en) Printing apparatus for printing on print receiving tape, printing method and recording medium
US20080304745A1 (en) Image display apparatus, image display method, and recording medium containing an image-display-method program
JP6784021B2 (en) Information processing device, program, cut setting method by information processing device, and printing system
US20110222946A1 (en) Tape printer, method for controlling tape printer, and computer program product
JP2000036930A (en) Print method and printer
US8587821B2 (en) Tape printer, method for controlling tape printer, and computer program product
JP7130948B2 (en) ELECTRONIC DEVICE, PRINTING SUPPORT METHOD AND PROGRAM
CN108241594B (en) Character editing method, electronic device, and recording medium
US20190023027A1 (en) Printer and method of controlling printer
US20180357023A1 (en) Image forming apparatus and image forming method
US11399102B2 (en) Print information processing system detecting cassette by analyzing image and determining whether detected cassette satisfies specific condition
JP7070627B2 (en) Image display program
JP4774805B2 (en) File search device, printing device, file search method and program thereof
EP1607226B1 (en) Printing controller and printing control program
JP6686746B2 (en) Program, label image creating method, and label image creating apparatus
JP2012123704A (en) Control apparatus, control method of control apparatus, and program
US20190126651A1 (en) Printer apparatus, printing method, and computer readable medium
US7986423B2 (en) Image reprinting apparatus and method
JP7293900B2 (en) PRINT IMAGE GENERATION DEVICE, PRINT IMAGE GENERATION METHOD, AND PROGRAM
US9444968B2 (en) Image forming apparatus that continues without halt to perform print job including sign where glyph is invalid data, and recording medium
JP3597038B2 (en) Character information processing device
US20240086127A1 (en) Printer
CN111332015B (en) Print data generation device, print data generation method, and recording medium
JP2003226056A (en) Long medium printer
JP2011088373A (en) Recorder, control method for the same and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant