CN112346671A - Storage medium - Google Patents

Storage medium

Info

Publication number
CN112346671A
Authority
CN
China
Prior art keywords
display
image
display frame
images
storage medium
Prior art date
Legal status
Pending
Application number
CN202010787489.7A
Other languages
Chinese (zh)
Inventor
渡边真知子
小田昌范
阮龙龙
Current Assignee
Brother Industries Ltd
Original Assignee
Brother Industries Ltd
Priority date
Filing date
Publication date
Application filed by Brother Industries Ltd
Publication of CN112346671A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12 Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201 Dedicated interfaces to print systems
    • G06F3/1223 Dedicated interfaces to print systems specifically adapted to use a particular technique
    • G06F3/1237 Print job management
    • G06F3/1242 Image or content composition onto a page
    • G06F3/125 Page layout or assigning input pages onto output media, e.g. imposition
    • G06F3/1253 Configuration of print job parameters, e.g. using UI at the client
    • G06F3/1257 Configuration of print job parameters, e.g. using UI at the client, by using pre-stored settings, e.g. job templates, presets, print styles

Abstract

A storage medium stores a program executable by a computer of an apparatus having a function of displaying a plurality of images. The program causes the computer to execute: an insertion process of inserting at least a part of one of a plurality of images having a predetermined display order into each of a plurality of display frames whose sizes are determined at random; a display process of causing a display section of the apparatus to display an interface in which the plurality of images inserted into the display frames are arranged in the display order; a selection instruction accepting process of accepting, via an input interface of the apparatus, a selection instruction to select one of the plurality of images while the plurality of images are displayed by the display process; and a determination process of determining an image to be a printing target based on the image selected in the selection instruction accepting process. According to the present invention, a visual sense of freshness can be given to the user when a plurality of images is displayed.

Description

Storage medium
Technical Field
The technology disclosed in this specification relates to a storage medium storing a program that causes a device to display a plurality of images.
Background
A technique is known for changing the display mode of images in a device having a function of displaying a plurality of images. For example, a configuration has been disclosed in which an entire image is divided and the divided pieces are displayed in random order. In a program that causes a device to display a plurality of images, a sense of freshness is lost when the images are displayed with the same layout every time. Displaying the divided pieces in random order, as in the above configuration, gives the user some visual freshness while the images are displayed, but the layout of each individual image does not change. There is therefore room for improvement in how a plurality of images is displayed so as to give the user a sense of freshness.
Disclosure of Invention
A storage medium made to solve the above problem stores a program executable by a computer of an apparatus having a function of displaying a plurality of images, wherein the program causes the computer to execute: an insertion process of inserting at least a part of one of a plurality of images having a predetermined display order into each of a plurality of display frames whose sizes are determined at random; a display process of causing a display section of the apparatus to display an interface in which the plurality of images inserted into the display frames are arranged in the display order; a selection instruction accepting process of accepting, via an input interface of the apparatus, a selection instruction to select one of the plurality of images while the plurality of images are displayed by the display process; and a determination process of determining an image to be a printing target based on the image selected in the selection instruction accepting process.
Drawings
Fig. 1 is a schematic configuration diagram of an apparatus according to an embodiment.
Fig. 2 is a flowchart showing the sequence of the label creation process.
Fig. 3 is an explanatory diagram showing an example of the top-level interface.
Fig. 4 is a flowchart showing the procedure of object arrangement processing.
Fig. 5 is a flowchart showing the procedure of the display frame setting process.
Fig. 6 is an explanatory diagram showing an example of the first frame group.
Fig. 7 is an explanatory diagram showing an example of the second frame group.
Fig. 8 is an explanatory diagram showing an example of clipping.
Fig. 9 is an explanatory diagram showing an example of the top-level interface.
Fig. 10 is an explanatory diagram showing an example of the editing interface.
Detailed Description
Hereinafter, an embodiment of a program installed in a device will be described in detail with reference to the drawings. The present embodiment relates to an application program (hereinafter, referred to as an "application") installed in a portable device capable of displaying images, such as a smartphone.
As shown in fig. 1, the device 1 of the present embodiment includes a controller 10. The controller 10 includes a CPU11 and a memory 12 and is connectable to the printer 2. The device 1 further includes a user interface (hereinafter, referred to as "user IF") 20 and a communication interface (hereinafter, referred to as "communication IF") 30, which are electrically connected to the controller 10. The device 1 is, for example, a device capable of executing various applications for causing the printer 2 to print. Note that the controller 10 in fig. 1 is a generic term that collectively refers to the hardware and software used to control the device 1, and does not necessarily refer to a single piece of hardware actually present in the device 1.
The CPU11 executes various processes in accordance with programs read out from the memory 12 or based on user operations. The CPU11 is an example of a computer. The memory 12 includes a ROM and a RAM, and also includes a nonvolatile memory such as an HDD and a flash memory, and the memory 12 stores various programs and data.
The user IF20 includes a touch panel having both a display function and an operation reception function. The user IF20 is an example of a display unit and an example of an input interface. The user IF20 may include a combination of a display for displaying information, a keyboard, a mouse, and the like for receiving an input operation by a user.
The communication IF30 includes hardware for communicating with an external device such as the printer 2. The communication IF30 may be wireless or wired, and may conform to any standard such as Wi-Fi (registered trademark), Bluetooth (registered trademark), USB, or LAN. The device 1 of the present embodiment may have a function of connecting to the internet via the communication IF30.
As shown in fig. 1, an operating system (hereinafter, referred to as "OS") 41, a label creation application 42, and an image database (hereinafter, referred to as "image DB") 43 are stored in the memory 12 of the device 1 of the present embodiment. The OS41 is, for example, any one of iOS (registered trademark), Android (registered trademark), Windows (registered trademark), MacOS (registered trademark), and Linux (registered trademark).
The printer 2 of the present embodiment is, for example, a so-called label printer that includes a thermal transfer print head, stores label paper wound into a roll, and prints on the label paper while unwinding it. Based on, for example, a print job received from the device 1, the printer 2 prints an image on the stored label paper, conveys the label paper, and discharges the printed portion out of the printer.
The label creation application 42 of the present embodiment is an application for creating various labels using the printer 2. The label creation application 42 is an example of a program. The label creation application 42 accepts instructions to create and edit an image to be printed by the printer 2 and displays the resulting image on the user IF 20. The label creation application 42 also accepts an instruction to print the image being displayed, generates a print job based on that image, and transmits the print job to the printer 2. The label creation application 42 of the present embodiment may be a program that is executed on its own based on a user's execution instruction, or a program that is called and executed from another program while that program is running.
The image DB43 is a storage area that stores image data of various images for the label creation application 42. The label creation application 42 displays images from the image data stored in the image DB43 on the user IF 20 based on the user's instructions. The image data stored in the image DB43 may be stored permanently or may be acquired from a server or the like as needed.
In this embodiment, the image DB43 stores, for example, a plurality of templates selectable by the label creation application 42 and image data of a plurality of use case images representing use cases in association with the templates. The template used in the label creation application 42 is image data of a prototype for label creation, and includes, for example, a character string, a coded image, a frame image, and a sample of an illustration. The user can select a template similar to a desired label from the templates with reference to the use case images, edit the selected template, and print the edited template. For example, the user can easily create a desired label by changing the character string of the template to a desired character and printing the character string.
Next, the sequence of the label creation process performed by the label creation application 42 according to the present embodiment will be described with reference to the flowchart of fig. 2. The label creation process is executed by the CPU11 of the device 1 upon accepting an execution instruction for the label creation application 42. The following processes and the processing steps in the flowcharts basically represent the processing of the CPU11 in accordance with the instructions described in the programs. The processing performed by the CPU11 includes hardware control using an API of the OS41 of the device 1. In this specification, the operation of each program is described with the description of the OS41 omitted.
In the label creation process, first, the CPU11 executes object arrangement processing (S101), which is a process for displaying a top-level interface on the user IF 20. As shown in fig. 3, the top-level interface 50 of the present embodiment includes, for example, a title 51, a top-level image 52, and an image area 53 containing a plurality of use case images G1, G2, and so on, each representing a use case of a label. The title 51 includes a setting button 511. Each use case image of the present embodiment is a target for accepting designation of a template; when one of the use case images is tapped, the application transitions to an editing interface for the template of the label shown in that use case image.
The label creation application 42 of the present embodiment supports scrolling in two directions, vertical and horizontal. Hereinafter, as indicated by the arrows in fig. 3, the scroll direction is referred to as the X direction, and the direction orthogonal to the scroll direction is referred to as the Y direction. In the image area 53, the use case images G1, G2, and so on are divided into a predetermined number of columns, two or more, arranged side by side in the Y direction, with each column extending in the X direction. In the following, the number of columns in the Y direction in which the use case images are arranged in the image area 53 is referred to as the "display column number". In the example of fig. 3, the scroll direction is vertical and the number of display columns is two.
The label creation application 42 accepts an instruction to switch the scroll direction and the number of display columns through operation of the setting button 511. The label creation application 42 stores the accepted setting contents in the memory 12, and reads them out from the memory 12 at startup or as needed after startup. The range of display column numbers that can be set may be limited by the scroll direction, the interface size of the user IF 20 of the device 1, and the like.
The procedure of the object arrangement process executed in S101 of the label creation process will be described with reference to the flowchart of fig. 4. The object arrangement processing is processing for determining the arrangement of each use case image displayed in the image area 53. In the object arrangement processing, first, the CPU11 executes display frame setting processing (S201). The display frame setting process is a process of setting a display frame as a display range of each use case image.
The procedure of the display frame setting process will be described with reference to the flowchart of fig. 5. In the display frame setting process, first, the CPU11 acquires the scroll direction, i.e., the X direction, and the number of display columns (S301). Next, the CPU11 determines whether the X direction is vertical (S302). When determining that scrolling is vertical (yes in S302), the CPU11 decides to use the first frame group, which is the display frame group for vertical scrolling, as the display frame group (S303). As shown in fig. 6, the first frame group is, for example, a display frame group including four display frames A, B, C, D whose Y-direction sizes are all the same width W and whose X-direction sizes are mutually different heights T1 to T4. The first frame group is an example of a first display frame group.
The sizes of the display frames in each display frame group are chosen so that the differences between them are visually distinct. For example, in the example of fig. 6, the display frame B is a square, that is, T2 = W, and the heights of the other display frames have the following relationships.
T1=T2×2/3
T3=T2×4/3
T4=T2×5/3
On the other hand, when determining that the X direction is horizontal, that is, that scrolling is horizontal (no in S302), the CPU11 decides to use the second frame group, which is the display frame group for horizontal scrolling (S304). As shown in fig. 7, the second frame group is, for example, a display frame group including four display frames E, F, G, H whose Y-direction sizes are all the same height T and whose X-direction sizes are mutually different widths W1 to W4. The second frame group is an example of a second display frame group. As with the heights T1 to T4 of the first frame group shown in fig. 6, the widths W1 to W4 of the display frames of the second frame group differ from one another by visually distinct amounts.
Next, the CPU11 acquires the size of the display interface of the user IF 20 (S305). The CPU11 then determines the Y-direction size of each display frame based on the number of display columns acquired in S301 and the interface size acquired in S305 (S306). In S306, the width W shown in fig. 6 is determined when the selected display frame group is the first frame group, and the height T shown in fig. 7 is determined when it is the second frame group. Further, the CPU11 determines the X-direction size of each display frame from the determined Y-direction size and the relationships described above (S307), ends the display frame setting process, and returns to the object arrangement processing.
The display frame setting process thus determines the size of each display frame used to display the use case images. In this respect, since the display frame group includes the first frame group and the second frame group, the Y-direction size is determined according to the scroll direction and the number of display columns even in a device capable of switching between vertical and horizontal scrolling. Further, since the Y-direction size is fixed within each display frame group, it is easy to arrange the display frames in parallel in the X direction and to keep the arrangement balanced in the Y direction. The number of display frame types included in each display frame group is not limited to four and may be any plural number.
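As a concrete illustration of the display frame setting process described above, the following Python sketch derives the frame sizes from the interface size and the number of display columns. It is a minimal sketch under assumptions: the function and variable names, the fixed gap handling, and the use of Python are illustrative and are not taken from the actual label creation application 42.

```python
from dataclasses import dataclass

# X-direction size ratios of the four frames relative to the square frame B
# (T2 = W), following T1 = T2*2/3, T3 = T2*4/3, T4 = T2*5/3 above.
X_RATIOS = {"A": 2 / 3, "B": 1.0, "C": 4 / 3, "D": 5 / 3}

@dataclass
class DisplayFrame:
    name: str
    x_size: int  # size along the scroll direction
    y_size: int  # size along the direction orthogonal to scrolling

def build_frame_group(interface_y_size: int, display_columns: int, gap: int = 8):
    """S305-S307: fix the common Y-direction size from the interface size and
    the number of display columns, then derive each frame's X-direction size
    from the predetermined ratios."""
    y_size = (interface_y_size - gap * (display_columns + 1)) // display_columns
    return [DisplayFrame(name, round(y_size * ratio), y_size)
            for name, ratio in X_RATIOS.items()]

# Example: vertical scrolling (first frame group), two display columns on an
# interface that is 400 units wide in the Y direction.
for frame in build_frame_group(interface_y_size=400, display_columns=2):
    print(frame)
```

For horizontal scrolling the same computation would apply to the second frame group; only the interpretation of the X and Y axes on the screen changes.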
Returning to the object arrangement processing of fig. 4: after the display frame setting process of S201, the CPU11 acquires the image data of the series of images to be displayed as the use case images (S202). Each item of image data may be stored in the memory 12 of the device 1 or acquired from an external device such as a server via the communication IF 30. The display order of the use case images is predetermined, and the CPU11 acquires each item of image data together with its display order information. The CPU11 may instead acquire the image data one item at a time in display order and process them sequentially.
The CPU11 acquires, from the image data acquired in S202, the image data of the use case image to be displayed next (S203). In the first execution of S203, the CPU11 acquires the image data of the use case image to be displayed first. The CPU11 then extracts, from the display frames determined in the display frame setting process, the display frames that are compatible with the image data acquired in S203 (S204). Each item of image data is associated with, for example, information on the range of compatible display frame sizes as an attribute.
Depending on the image, it is difficult to recognize the image when the display frame is too small. Conversely, if the display frame is too large compared to the image, the margins become large and space is wasted. In this embodiment, therefore, each image is provided with information indicating which display frames are compatible with it, and one display frame is selected from among the compatible display frames. The CPU11 may instead determine compatibility from the size of the image data; for example, it may extract display frames whose X-direction size is at least half the X-direction size of the image data.
Next, the CPU11 randomly selects one of the extracted display frames (S205) and inserts the image of the image data acquired in S203 into the selected display frame (S206). S205 is an example of the selection process, and S206 is an example of the insertion process. Since compatibility information is set for each image and the CPU11 randomly selects one of the compatible display frames in S205, a display frame suitable for displaying each image is selected at random.
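The extraction and random selection of S204 and S205 can be pictured as in the following sketch. The tuple representation of the frames, the fallback when no frame matches, and the compatibility rule (an X-direction frame size of at least half the image size, one possibility mentioned above) are assumptions made for illustration.

```python
import random

def select_display_frame(image_x_size, frames):
    """S204: extract the compatible display frames; S205: pick one at random.

    `frames` is a list of (name, x_size, y_size) tuples.
    """
    # Example compatibility rule: the frame's X-direction size must be at
    # least half the image's X-direction size.
    compatible = [f for f in frames if f[1] * 2 >= image_x_size]
    if not compatible:
        compatible = frames  # assumed fallback, not specified in the text
    return random.choice(compatible)

frames = [("A", 120, 180), ("B", 180, 180), ("C", 240, 180), ("D", 300, 180)]
print(select_display_frame(image_x_size=400, frames=frames))
```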
Further, when the selected display frame is smaller than the image, the CPU11 crops the image to generate an image of the size of the display frame. As shown in fig. 8 (A), for example, the CPU11 deletes the left and right end portions of the horizontally long image 61 and uses the central portion of the image 61 as the image 62 inserted into the display frame B. As shown in fig. 8 (B), the CPU11 deletes, for example, the upper and lower end portions of the vertically long image 63 to generate the image 64 inserted into the display frame B. The cropping range may be designated in advance or chosen at random. On the other hand, when the selected display frame is larger than the image, the CPU11 places the image at the center of the display frame and adds a margin or a solid image of a predetermined ground color around it to generate an image of the size of the display frame. Instead of cropping or adding a solid image, the image may be scaled.
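The cropping and padding just described might look like the following sketch, which uses the Pillow imaging library purely for brevity. The function name, the symmetric center crop, and the white ground color are assumptions; as noted above, the cropping range may also be designated in advance or chosen at random, and scaling is another option.

```python
from PIL import Image  # Pillow, used here only for illustration

def fit_to_frame(img, frame_w, frame_h, ground_color=(255, 255, 255)):
    """Center-crop an image that is larger than the display frame, or center a
    smaller image on a solid background, as in fig. 8."""
    w, h = img.size
    # Remove the left/right or top/bottom end portions symmetrically.
    left = max((w - frame_w) // 2, 0)
    top = max((h - frame_h) // 2, 0)
    cropped = img.crop((left, top, left + min(w, frame_w), top + min(h, frame_h)))
    # Place the (possibly cropped) image at the center of a frame-sized canvas.
    canvas = Image.new("RGB", (frame_w, frame_h), ground_color)
    canvas.paste(cropped, ((frame_w - cropped.width) // 2,
                           (frame_h - cropped.height) // 2))
    return canvas
```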
Next, the CPU11 selects the column whose already arranged use case images end at the smallest position, i.e., the currently shortest column (S207), and appends the image inserted into the display frame this time after the last use case image arranged in the selected column (S208). In S207, the CPU11 selects the column whose arranged use case images end at the smallest X-direction position. When two or more columns end at the same X-direction position, the CPU11 selects the column at the smaller Y-direction position. The end of a use case image is the maximum X coordinate included in the display range of that use case image in the X direction.
After arranging the image acquired in S203, the CPU11 determines whether any unarranged image remains in the image group acquired in S202 (S209). When determining that an image remains (yes in S209), the CPU11 returns to S203 and acquires the image that is next in the display order. When determining that no image remains (no in S209), the CPU11 ends the object arrangement processing and returns to the label creation process.
An example of the arrangement of the use case images will be described with reference to fig. 3. The first use case image G1 and the second use case image G2 are arranged in this order from the left at the top of the image area 53. Specifically, the use case image G1 is arranged in the left column in a range extending down and to the right from the point (X1, Y1), and the next use case image G2 is arranged in the right column in a range extending down and to the right from the point (X1, Y2), with a predetermined interval from the display range of the use case image G1. The display frame of the use case image G1 is larger than that of the use case image G2, and the X-direction end of the use case image G1 is therefore located lower than the X-direction end of the use case image G2; consequently, the third use case image G3 is arranged in the right column at the point (X2, Y2), spaced a prescribed interval below the display range of the use case image G2. Further, the position X at the end of the use case image G1 in the left column is smaller than that at the end of the use case image G3 in the right column, so the next use case image G4 is appended at the point (X3, Y1) in the left column. Since each next use case image is appended to the currently shortest column, the arrangement stays well balanced across the columns. No interval need be provided between the use case images.
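The shortest-column placement of S207 and S208 amounts to the simple procedure sketched below. The fixed gap, the representation of a placement as a column index plus an X offset, and the function name are assumptions made for illustration; the actual processing also carries the Y positions and the inserted images themselves.

```python
def arrange_in_columns(frame_x_sizes, columns, gap=8):
    """Append each framed use case image (given by its X-direction size, in the
    predetermined display order) to the currently shortest column (S207-S208).

    Returns a list of (column_index, x_position) placements; ties between
    columns are broken in favor of the smaller column index, i.e. the smaller
    Y-direction position."""
    column_ends = [0] * columns   # X coordinate of the end of each column
    placements = []
    for x_size in frame_x_sizes:  # images arrive in display order
        col = min(range(columns), key=lambda c: (column_ends[c], c))
        placements.append((col, column_ends[col]))
        column_ends[col] += x_size + gap
    return placements

# Display frames chosen in the order C, B, D, A (X sizes 240, 180, 300, 120)
# with two display columns, reproducing the kind of layout shown in fig. 3.
print(arrange_in_columns([240, 180, 300, 120], columns=2))
```

With these inputs the fourth image returns to the left column, just as G4 does in fig. 3, because at that point the left column ends higher up, i.e. at a smaller X position, than the right column.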
The explanation returns to the label creation process of fig. 2. After the object arrangement processing of S101, the CPU11 displays the top-level interface 50 on the user IF 20 (S102). S102 is an example of the display process. As shown in fig. 3, for example, the use case images are displayed in the image area 53 of the top-level interface 50, in their display order, with the arrangement decided in the object arrangement processing. Although only part of the top-level interface 50 may fit on the screen at one time, the top-level interface 50 includes all the use case images acquired in S202 of the object arrangement processing, and all of them can be displayed by scrolling.
Since each next use case image is added to the shortest column by the object arrangement processing, the use case images on the top-level interface 50 are displayed in the predetermined display order. Further, since the display frame of each use case image is randomly selected from four kinds of display frames with visually distinct size differences, the use case images clearly differ in size from one another and are displayed at different sizes each time the application is executed. Fig. 3 shows an example in which the display frames are selected in the order C, B, D, A; if the display frames are instead selected in the order A, C, B, B, for example, the use case images are arranged as shown in fig. 9. In this way, the arrangement of the top-level interface 50 differs greatly each time it is displayed, which gives strong visual freshness.
The CPU11 determines whether a selection instruction selecting one of the objects, including the use case images, has been accepted on the displayed top-level interface 50 (S103). S103 is an example of the selection instruction accepting process. When determining that an instruction selecting a use case image has been accepted (yes in S103), the CPU11 determines the template represented by the selected use case image (S104). S104 is an example of the determination process.
Next, the CPU11 displays an editing interface including the determined template on the user IF 20 (S105). As shown in fig. 10, the editing interface 70 includes, for example, a use case image 71, a template 72, an edit button 73, a print button 74, and a return button 75. The use case image 71 is, for example, the same image as the use case image selected on the top-level interface 50. The template 72 is an image from which the print target for the label shown in the use case image is created. The edit button 73 is a button for accepting an edit instruction for the template 72. The print button 74 is a button for accepting a print instruction for the template 72. The return button 75 is a button for accepting an instruction to close the editing interface 70 and return to the top-level interface 50.
The CPU11 determines whether or not a print instruction is accepted on the editing interface 70 (S106). The CPU11 accepts a print instruction based on an operation of the print button 74. Upon determining that the print instruction is accepted (S106: YES), the CPU11 generates a print job based on the image displayed in the template 72, and transmits the print job to the printer 2 via the communication IF30 (S107). The template 72 to be printed may be an image edited by the user. The CPU11 accepts an edit instruction of the template 72 based on, for example, an operation of the edit button 73.
When it is determined that the print instruction has not been accepted (no in S106), the CPU11 determines whether a return instruction has been accepted (S108). After S107, or in the case where it is determined that a return instruction is accepted (S108: yes), the CPU11 returns to S102 and causes the top-level interface 50 to be displayed. The top-level interface 50 at this time is displayed in the same manner as the top-level interface 50 at the time of startup.
When determining that the return instruction has not been accepted (no in S108), the CPU11 determines whether an end instruction has been accepted (S109). When determining that the end instruction has not been accepted (no in S109), the CPU11 returns to S106 to accept the print instruction, the return instruction, and the end instruction.
On the other hand, when determining that selection of a use case image has not been accepted while the top-level interface 50 is displayed (no in S103), the CPU11 determines whether an operation of the setting button 511 (see fig. 3) has been accepted (S110). When determining that an operation of the setting button 511 has been accepted (yes in S110), the CPU11 executes the display frame setting process shown in fig. 5 (S111). Note that, after the setting button 511 is operated, the CPU11 may execute the display frame setting process only when the scroll direction or the number of display columns has been changed. When the determination in S110 is yes, only the designated setting contents may be stored in the memory 12 instead of executing S111. In this case, the setting contents are reflected the next time the label creation application 42 is executed.
After S111, or when determining that an operation of the setting button 511 has not been accepted (no in S110), the CPU11 determines whether an end instruction has been accepted (S112). When determining that an end instruction has not been accepted (no in S112), the CPU11 returns to S103 and waits until selection of a use case image, an operation of the setting button 511, or an end instruction is accepted. When determining that an end instruction has been accepted on the top-level interface 50 or the editing interface 70 (yes in S109 or yes in S112), the CPU11 ends the label creation process.
As described above in detail, according to the label creation application 42 described in this specification, for the use case images G1, G2, and so on, which are a plurality of images whose display order is determined in advance, the CPU11 inserts each use case image into one of a plurality of display frames whose sizes are determined at random, arranges the use case images in the display order, and displays them on the user IF 20. Accordingly, the sizes of the display frames change each time the images are displayed, and so do the sizes of the use case images inserted into the display frames and displayed. A sense of freshness could also be given by changing the display order of the use case images, but when the display order changes, the user is likely to have trouble finding an image when selecting one. In the present invention, since the display order of the images does not change, such trouble finding an image is unlikely to occur.
The present embodiment is merely an example and does not limit the present invention. Needless to say, various improvements and modifications can be made without departing from the scope of the invention. For example, the device 1 is not limited to a portable device and may be a stationary device such as a personal computer. The number of printers connected to the device 1 is not limited to one and may be plural. The printing method of the printer 2 is not limited to the thermal transfer method and may be, for example, a thermal method, an ink jet method, or an electrophotographic method. The print medium used in the printer 2 is not limited to label paper and may be, for example, cut sheets or plain roll paper.
In this embodiment, one display frame is randomly selected from a display frame group whose size relationships are determined in advance, that is, from a plurality of display frames prepared in advance, but the present invention is not limited to this. For example, the size of each display frame may be made completely random without preparing a display frame group. For example, a range of selectable X-direction and Y-direction sizes may be set for each display frame, and the size may be determined at random within that range. However, when the sizes are completely random, sizes at which the images are hard to recognize may be chosen, or the differences between the display frame sizes may be so small that little visual freshness is given. When the display frames are instead selected from a plurality of display frames prepared in advance with clearly different sizes, the images are displayed at sizes within an appropriate range and with clearly different display frame sizes, so a visual sense of freshness is easily given.
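A sketch of this fully random alternative is shown below. The size ranges are made-up numbers used only to illustrate the drawback noted above, namely that two independently drawn sizes may differ by too little to be visually distinct.

```python
import random

def random_frame_size(x_range=(120, 300), y_range=(150, 200)):
    """Alternative discussed above: pick each frame size completely at random
    within selectable X/Y ranges (the ranges here are illustrative)."""
    return random.randint(*x_range), random.randint(*y_range)

# Two consecutive draws may be nearly identical, e.g. (201, 173) and (204, 171),
# in which case the size difference between the frames is hard to perceive.
print(random_frame_size(), random_frame_size())
```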
In this embodiment, the label creation application 42 accepts settings for the scroll direction and the number of display columns and determines the size of each display frame based on the accepted scroll direction and number of display columns, but the invention is not limited to this. For example, the scroll direction and the number of display columns may be determined in advance, in which case each display frame of the display frame group may have a predetermined size. Alternatively, a display frame group may be prepared in advance for each settable combination of scroll direction and number of display columns, and one display frame group may be selected from among them. The information on each display frame group may be stored in the memory 12 or in an external device, such as a server, that the device 1 can refer to.
In this embodiment, images for label creation are used as an example, but the display target is not limited to these, as long as it is a plurality of images whose display order is determined. The display frames also do not need to have the function of accepting an instruction by an operation.
In this embodiment, information on the range of compatible display frame sizes is associated as an attribute with each item of use case image data, but there is no limitation to this, and a use case image may be displayed in a display frame of any size. In that case, for example, the display frames may first be randomly selected and arranged on the interface, and the use case images may then be inserted into the arranged display frames in the predetermined display order.
In this embodiment, a display frame is randomly selected for each use case image, but display frames of the same size may be prevented from being adjacent to each other. Alternatively, a plurality of display order sets in which the selection order of the display frames is determined in advance may be prepared, and one of them may be selected at random. The display frames in each display frame group have the same Y-direction size, but their Y-direction sizes may differ. Each display frame also need not be rectangular.
In this embodiment, the image area 53 in which the use case images are displayed is included in the top-level interface 50, but the invention is not limited to this. For example, the image area may instead be displayed when a display instruction is accepted on the top-level interface 50. The arrangement of the title 51 and the top-level image 52 of the top-level interface 50 shown in fig. 3 is only an example; they may be arranged arbitrarily, some of them may be omitted, and other images may be included.
In this embodiment, the size and arrangement of the display frames are determined by executing the object arrangement processing when the label creation application 42 is started, but the execution condition of the object arrangement processing is not limited to startup. For example, the object arrangement processing may be executed each time a transition is made from an interface other than the top-level interface 50 to the top-level interface 50; for example, the process may return to S101 after S107 of the label creation process or when the determination in S108 is yes. However, if the insertion into and arrangement of the display frames of the use case images are determined when the label creation application 42 is started and the arrangement is then fixed, the processing load is smaller than when the display frames are changed every time the top-level interface 50 is displayed. On the other hand, if the display frames are changed every time the top-level interface 50 is displayed, there are more opportunities for the display frames to change, which can be expected to give the user an even stronger visual sense of freshness.
In any of the flowcharts disclosed in the embodiment, the execution order of the processes in any plurality of steps may be changed arbitrarily, or those processes may be executed in parallel, as long as the contents of the processes do not contradict one another.
The processing disclosed in the embodiment may be executed by hardware such as a single CPU, a plurality of CPUs, an ASIC, or a combination thereof. The processing disclosed in the embodiment can also be implemented in various other forms, such as a storage medium storing a program for executing the processing, or a method.

Claims (9)

1. A storage medium storing a program executable by a computer of an apparatus having a function of displaying a plurality of images, the storage medium characterized in that,
the program causes the computer to execute:
an insertion process of inserting at least a part of one of the plurality of images in a predetermined display order into each of a plurality of display frames of which sizes are determined at random;
a display process of causing a display section of the apparatus to display an interface in which the plurality of images inserted into the display frame are arranged in a display order;
a selection instruction accepting process of accepting, via an input interface of the apparatus, a selection instruction that selects one of the plurality of images in a state in which the plurality of images are displayed by the display process; and
a determination process of determining an image to be a printing target based on the image selected by the selection instruction accepting process.
2. The storage medium of claim 1,
the program causes the computer to execute a selection process of randomly selecting one display frame from a display frame group including a plurality of display frames different in size from each other for each of the plurality of images,
in the insertion process, at least a part of the image is inserted into the display frame selected in the selection process.
3. The storage medium of claim 2,
the display section is capable of scrolling the display,
in the selection process, one display frame is selected from among a plurality of display frames whose sizes in a scroll direction, which is the direction of scrolling, differ from each other and whose sizes in an orthogonal direction, which is the direction orthogonal to the scroll direction, are the same.
4. The storage medium of claim 3,
the display frame group includes a first display frame group including a plurality of display frames whose longitudinal sizes are different from each other and whose lateral sizes are the same, and a second display frame group including a plurality of display frames whose lateral sizes are different from each other and whose longitudinal sizes are the same,
the display section is capable of switching between vertical scrolling and horizontal scrolling,
in the selection process, preferably, when the display section is scrolled vertically, one display frame is selected from the first display frame group, and when the display section is scrolled horizontally, one display frame is selected from the second display frame group.
5. The storage medium of claim 3 or 4,
in the display process, preferably, when a position on the display section in the scroll direction is denoted X, a position in the orthogonal direction is denoted Y, and the arrangement of a display frame is expressed by the combination (X, Y),
the interface arranges the display frames in a predetermined number, two or more, of columns in parallel in the orthogonal direction; until the number of columns reaches the predetermined number, a new display frame is arranged at a position Y2 different from the position Y1 in the orthogonal direction of the already arranged display frames; and after the number of columns reaches the predetermined number, the column containing the display frame whose end in the scroll direction is at the smallest position X, among the display frames arranged in the respective columns, is selected, and the new display frame is arranged at a position X2 different from the position X1 in the scroll direction of the display frames already arranged in the selected column.
6. The storage medium of claim 2,
each of the plurality of images is associated with information indicating whether or not each display frame is compatible with the image,
in the selection process, preferably,
for each image, a compatible display frame is randomly selected from among the display frames included in the display frame group.
7. The storage medium of claim 1,
in the insertion process, preferably,
when the display frame is smaller than the image, the image is cropped to the size of the display frame and inserted into the display frame.
8. The storage medium of claim 1,
in the insertion process, preferably,
when the display frame is larger than the image, the image is arranged at the center of the display frame.
9. The storage medium of claim 1,
when the program is started, the program causes the computer to execute the insertion process.
Application CN202010787489.7A, priority date 2019-08-09, filing date 2020-08-07, title "Storage medium", status pending; published as CN112346671A.

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2019-146986 2019-08-09
JP2019146986A JP7379921B2 (en) 2019-08-09 2019-08-09 program

Publications (1)

Publication Number Publication Date
CN112346671A 2021-02-09

Family

ID=74358311

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010787489.7A Pending CN112346671A (en) 2019-08-09 2020-08-07 Storage medium

Country Status (2)

Country Link
JP (1) JP7379921B2 (en)
CN (1) CN112346671A (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005352606A (en) * 2004-06-08 2005-12-22 Sony Corp Image display device, image display method, computer program, and recording medium
JP2006139505A (en) * 2004-11-11 2006-06-01 Fuji Photo Film Co Ltd Print-ordering program and device
CN101753757A (en) * 2008-11-28 2010-06-23 兄弟工业株式会社 Printing apparatus and printing method
CN102339215A (en) * 2010-07-20 2012-02-01 佳能株式会社 Image processing apparatus and method for controlling the image processing apparatus
US20130106913A1 (en) * 2011-10-28 2013-05-02 Microsoft Corporation Image layout for a display
CN103713801A (en) * 2012-09-28 2014-04-09 富士施乐株式会社 Image display control apparatus, image display apparatus, non-transitory computer readable medium, and image display method
CN105205506A (en) * 2014-06-16 2015-12-30 富士胶片株式会社 Image Processing Device, Image Processing Method, And Image Processing Program
US20160292543A1 (en) * 2015-03-31 2016-10-06 Brother Kogyo Kabushiki Kaisha Information processing device and non-transitory computer-readable medium storing instructions for print control
CN106469035A (en) * 2015-08-19 2017-03-01 三星电子株式会社 Imaging method, imaging device and be used for its computer readable recording medium storing program for performing
JP2018049450A (en) * 2016-09-21 2018-03-29 キヤノン株式会社 Display control method and display device
CN108132762A (en) * 2016-12-01 2018-06-08 京瓷办公信息系统株式会社 Image processing apparatus and image forming apparatus
CN108780377A (en) * 2016-03-17 2018-11-09 三星电子株式会社 Object Management group using computing device and visualization

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018093523A (en) 2018-02-02 2018-06-14 カシオ計算機株式会社 Image arrangement determination method and image arrangement determination program

Also Published As

Publication number Publication date
JP7379921B2 (en) 2023-11-15
JP2021026736A (en) 2021-02-22

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination