CN113039583A - Information processing system, information processing apparatus, and recording medium


Publication number: CN113039583A
Authority: CN (China)
Prior art keywords: image, image data, information processing, content, information
Legal status: Withdrawn
Application number: CN201980075011.7A
Other languages: Chinese (zh)
Inventors: 平井知子, 高柳真悟
Current Assignee: Ricoh Co Ltd
Original Assignee: Ricoh Co Ltd
Application filed by Ricoh Co Ltd
Priority claimed from PCT/JP2019/044906 (WO2020101019A1)
Publication of CN113039583A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/60 Editing figures and text; Combining figures or text

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An information processing system is disclosed. The system includes a terminal apparatus and an information processing apparatus. In the information processing apparatus, first image data is stored in a first storage device. Second image data is received from the terminal device. Third image data representing a third image is generated for displaying a part of the second image represented by the second image data in an area inside or outside the shape of the element included in the first image represented by the first image data. The third image data is output to an output device. In the terminal device, an input is received from a user. The second image data is stored in a second storage device. The second image data is transmitted to the information processing apparatus.

Description

Information processing system, information processing apparatus, and recording medium
Technical Field
The present disclosure relates to an information processing system, an information processing apparatus, and a recording medium.
Background
Conventionally, there is known a technique of generating a design by laying out a plurality of digital contents registered in advance and including paid content in a part of the digital contents (Patent Document 1).
Reference list
Patent document
[Patent Document 1] Japanese Laid-Open Patent Publication No. 2009-134762
Disclosure of Invention
Technical problem
In the above-described technology, a plurality of contents are simply arranged along a layout, and it is difficult for a user to generate unique new contents.
Therefore, in view of such circumstances, an object of the present disclosure is to allow a user to easily generate unique new content.
Technical scheme for solving problems
An aspect of the present disclosure provides an information processing system including: a terminal device; and an information processing apparatus, wherein the information processing apparatus performs a first process including storing first image data in a first storage device, receiving second image data from the terminal apparatus, generating third image data representing a third image for displaying a part of the second image represented by the second image data in an area inside or outside a shape of an element included in the first image represented by the first image data, and outputting the third image data to an output apparatus, and wherein the terminal apparatus performs a second process including receiving an input from a user, storing the second image data in a second storage device, and transmitting the second image data to the information processing apparatus.
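As an illustration only, the data flow described in this aspect can be sketched in a few lines of Python. Every class, method, and variable name below is hypothetical and serves only to make the roles of the first, second, and third image data concrete; it is not an implementation prescribed by the patent.

```python
# Illustrative sketch of the claimed processes; all names are hypothetical.

def combine(first_image_data: bytes, second_image_data: bytes) -> bytes:
    """Stand-in for the masking step detailed later (see Figs. 6 and 7)."""
    return first_image_data + second_image_data  # placeholder only


class InformationProcessingApparatus:
    def __init__(self) -> None:
        self.first_storage = {"content-001": b"<first image data>"}  # first storage device

    def first_process(self, second_image_data: bytes) -> bytes:
        first_image_data = self.first_storage["content-001"]
        third_image_data = combine(first_image_data, second_image_data)
        return third_image_data  # passed on to an output apparatus such as a printer


class TerminalApparatus:
    def __init__(self, apparatus: InformationProcessingApparatus) -> None:
        self.second_storage = {}  # second storage device
        self.apparatus = apparatus

    def second_process(self, key: str, image: bytes) -> bytes:
        self.second_storage[key] = image            # store the second image data
        return self.apparatus.first_process(image)  # transmit it to the apparatus


third_image_data = TerminalApparatus(InformationProcessingApparatus()).second_process(
    "photo-1", b"<second image data>")
```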
Drawings
Fig. 1 is a schematic diagram showing an information processing system in a first embodiment.
Fig. 2A is a schematic diagram showing an example of the hardware configuration of the information processing apparatus in the first embodiment.
Fig. 2B is a schematic diagram showing an example of the hardware configuration of the terminal apparatus in the first embodiment.
Fig. 3 is a diagram showing an example of the content image DB in the first embodiment.
Fig. 4 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the first embodiment.
Fig. 5 is a sequence chart for explaining the operation of the information processing system in the first embodiment.
Fig. 6 is a schematic diagram for explaining a combined image data generating process in the first embodiment.
Fig. 7 is a schematic diagram showing generation of combined image data and combined information in the first embodiment.
Fig. 8A is a first diagram showing a display example in the first embodiment.
Fig. 8B is a second diagram showing a display example of the terminal apparatus in the first embodiment.
Fig. 9 is a schematic diagram showing an information processing system in the second embodiment.
Fig. 10A is a first diagram showing an example of an image DB in the second embodiment.
Fig. 10B is a second diagram showing an example of the image DB in the second embodiment.
Fig. 11 is a diagram showing an example of the user DB in the second embodiment.
Fig. 12 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the second embodiment.
Fig. 13 is a sequence chart for explaining the operation of the information processing system in the second embodiment.
Fig. 14 is a flowchart for explaining a combined image data generation process in the second embodiment.
Fig. 15 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the third embodiment.
Fig. 16 is a sequence diagram for explaining the operation of the information processing system in the third embodiment.
Fig. 17 is a schematic diagram showing a combined image data generation process in the third embodiment.
Fig. 18 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the fourth embodiment.
Fig. 19 is a sequence chart showing the operation of the information processing system in the fourth embodiment.
Fig. 20 is a schematic diagram for explaining a process of generating combined image data in the fourth embodiment.
Fig. 21 is a diagram showing a display example of a terminal apparatus in the fourth embodiment.
Fig. 22 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the fifth embodiment.
Fig. 23 is a sequence diagram showing the operation of the information processing system in the fifth embodiment.
Fig. 24 is a schematic diagram for explaining a process of generating combined image data in the fifth embodiment.
Fig. 25 is a diagram showing a display example of a terminal apparatus in the fifth embodiment.
Fig. 26A is a schematic diagram showing an example of editing of a text image by an editing section in the fifth embodiment.
Fig. 26B is a diagram showing another example of editing of a text image by the editing section in the fifth embodiment.
Fig. 27 is a diagram showing functions of an information processing apparatus and a terminal apparatus in the sixth embodiment.
Fig. 28 is a sequence chart showing the operation of the information processing system in the sixth embodiment.
Fig. 29 is a schematic diagram for explaining a process of generating combined image data in the sixth embodiment.
Fig. 30 is a schematic diagram showing a display example of a terminal apparatus in the sixth embodiment.
Fig. 31 is a schematic diagram for explaining a modification of the information processing system in the sixth embodiment.
Detailed Description
(first embodiment)
The first embodiment will be described below with reference to the drawings. Fig. 1 is a schematic diagram showing an information processing system in a first embodiment.
The information processing system 100 in the first embodiment includes an information processing apparatus 200, a terminal apparatus 300, and a printing apparatus 400 that are connected via a network or the like.
In the example of Fig. 1, the information processing system 100 includes the printing apparatus 400; however, the configuration is not limited to this example. The information processing system 100 may include the information processing apparatus 200 and the terminal apparatus 300 without including the printing apparatus 400. Further, the information processing apparatus 200 in the information processing system 100 may include all or a part of the configuration or function of the terminal apparatus 300. Alternatively, the terminal apparatus 300 may include all or a part of the configuration or function of the information processing apparatus 200.
In the information processing system 100 of the first embodiment, the information processing apparatus 200 includes a content image Database (DB)210 and a combined image generation processing section 220. The content image DB210 stores image data representing content images.
Hereinafter, image data representing a content image is referred to as content image data. The content image may be an image of an existing character or an original character, an image of a landscape such as a mountain or sea, an image of one or more letters of a name such as a place name, a game name, or a building name, or the like.
The combined image generation processing section 220 generates combined image data obtained by combining the image data received from the terminal device 300 and the content image data.
The terminal apparatus 300 in the first embodiment includes a combination instruction section 310, and instructs the information processing apparatus 200 to generate combined image data.
In the information processing system 100, when the information processing apparatus 200 receives image data from the terminal apparatus 300, the information processing apparatus 200 transmits a list of content image data sets to be superimposed on the image data to the terminal apparatus 300.
Note that the image data received by the information processing apparatus 200 from the terminal apparatus 300 may be, for example, image data captured by an imaging device or the like included in the terminal apparatus 300, or image data saved in the terminal apparatus 300.
That is, in the first embodiment, the content image data is image data stored in the information processing apparatus 200, and the image data is image data stored in the terminal apparatus 300.
For example, the content image data may be image data collected from another apparatus by the information processing apparatus 200 communicating with another apparatus via the internet or the like. Further, the content image data may be uploaded from another apparatus via a network and stored in the information processing apparatus 200.
For example, the image data may be captured image data acquired by an imaging device included in the terminal apparatus 300. Further, the image data may be image data obtained by the terminal device 300 communicating with another device via a network, acquired from another device, and stored in the terminal device 300.
When content image data is selected at the terminal apparatus 300, the information processing apparatus 200 in the first embodiment generates combined image data by combining the received image data (the second image data) and the content image data (the first image data), and transmits identification information for identifying the combined image data to the terminal apparatus 300.
Further, when the information processing apparatus 200 receives the identification information of the combined image data from the printing apparatus 400, the information processing apparatus 200 transmits the combined image data corresponding to the identification information to the printing apparatus 400. Note that the identification information of the combined image data may be acquired by the printing apparatus 400 through, for example, communication between the printing apparatus 400 and the terminal apparatus 300.
Upon receiving the combined image data, the printing apparatus 400 outputs an image based on the combined image data to a recording medium. That is, the printing apparatus 400 is considered as an example of an output apparatus that performs image formation on a recording medium based on combined image data. In the first embodiment and other embodiments, the recording medium may be, for example, a clothing fabric or a material other than cloth. For example, the recording medium may be a medium that can be processed into a flat fan or a folding fan.
Further, in the information processing system 100, the output destination of the combined image data from the information processing apparatus 200 is the printing apparatus 400; however, the output destination is not limited to the printing apparatus 400. For example, the combined image data may be output to an apparatus that performs embroidery or the like on a cloth according to the image data.
In the first embodiment, as described above, combined image data is generated and output based on image data acquired from the terminal apparatus 300 and selected content image data. Therefore, according to the first embodiment, a new content image different from the content image stored in the content image DB210 can be generated.
Hereinafter, the information processing apparatus 200 in the first embodiment will be described. Fig. 2A is a schematic diagram showing an example of a hardware configuration of an information processing apparatus.
The information processing apparatus 200 in the first embodiment includes an input device 21, an output device 22, a drive device 23, an auxiliary storage device 24, a memory device 25, an arithmetic processing device 26 as a processor, and an interface device 27, which are connected to each other via a bus B.
The input device 21 is a device for inputting various information, and is implemented by a keyboard or a pointing device, for example. The output device 22 is a device for outputting various information items, and is realized by, for example, a display. The interface device 27 includes a LAN card or the like and is used to connect to a network.
The information processing program in the first embodiment is at least a part of various programs that control the information processing apparatus 200. The information processing program is provided by, for example, distributing the recording medium 28 or downloading from a network. As the recording medium 28 on which the information processing program is recorded, various types of recording media can be used, such as a recording medium that records information optically, electrically, or magnetically (a CD-ROM, a floppy disk, a magneto-optical disk, or the like) or a semiconductor memory that records data electrically (a ROM, a flash memory, or the like).
Further, when the recording medium 28 in which the information processing program is recorded is set in the drive device 23, the information processing program is installed in the auxiliary storage device 24 from the recording medium 28 via the drive device 23. The information processing program downloaded from the network is installed in the secondary storage device 24 via the interface device 27.
The auxiliary storage device 24 stores the installed information processing program, and stores necessary files, data, and the like, such as the above-described database. When the information processing apparatus 200 is activated, the memory device 25 reads and stores the information processing program from the auxiliary storage device 24. The arithmetic processing device 26 realizes various processes described later in accordance with an information processing program stored in the memory device 25.
Next, with reference to fig. 2B, the hardware configuration of the terminal apparatus 300 will be described. Fig. 2B is a schematic diagram showing an example of the hardware configuration of the terminal apparatus.
The terminal apparatus 300 in the first embodiment includes a Central Processing Unit (CPU)301, a Read Only Memory (ROM)302, a Random Access Memory (RAM)303, an Electrically Erasable Programmable Read Only Memory (EEPROM)304, a Complementary Metal Oxide Semiconductor (CMOS) sensor 305, an image sensor interface (I/F)306, an acceleration/direction sensor 307, a medium I/F309, and a Global Positioning System (GPS) receiver 321.
Among these hardware components, the CPU 301 is an arithmetic processing device as a processor, which controls the overall operation of the terminal apparatus 300. The ROM 302 stores a program for driving the CPU 301, a program such as an Initial Program Loader (IPL), and the like. The RAM 303 is used as a work area of the CPU 301. The EEPROM 304 reads or writes various data items such as a smart phone program under the control of the CPU 301. ROM 302, RAM 303, and EEPROM 304 are examples of storage devices of the terminal apparatus 300.
The CMOS sensor 305 is a built-in imaging unit that captures a subject (mainly self-timer shooting) under the control of the CPU 301 and obtains image data. Instead of the CMOS sensor 305, an imaging unit such as a Charge Coupled Device (CCD) sensor may be used.
The image sensor I/F306 is a circuit that controls driving of the CMOS sensor 305. The acceleration/direction sensor 307 may be any of various sensors that detect geomagnetism, such as an electronic magnetic compass, a gyro compass, and an acceleration sensor. The media I/F309 controls reading or writing (storage) of data with respect to a recording medium 308 (an example of a second storage device) such as a flash memory. The GPS receiver 321 receives GPS signals from GPS satellites.
Further, the terminal apparatus 300 includes a long-distance communication circuit 322, an antenna 322a of the long-distance communication circuit 322, a CMOS sensor 323, an image sensor I/F324, a microphone 325, a speaker 326, an audio input/output I/F327, a display device 328, and an external device connection I/F329, a near field communication circuit 330, an antenna 330a of the near field communication circuit 330, and a touch panel 331.
Among these hardware components, the long-distance communication circuit 322 is a circuit that communicates with other devices via a communication network. The CMOS sensor 323 is a built-in imaging unit that captures an image of a subject and obtains image data under the control of the CPU 301. The image sensor I/F 324 is a circuit that controls driving of the CMOS sensor 323. The microphone 325 is a built-in circuit that converts audio into an electrical signal. The speaker 326 is a built-in circuit that generates sound such as music and voice by converting an electrical signal into physical vibration. The audio input/output I/F 327 is a circuit that processes input and output of an audio signal between the microphone 325 and the speaker 326 under the control of the CPU 301. The display device 328 is a display portion of liquid crystal, organic electroluminescence (EL), or the like that displays an image of a subject, various icons, or the like. The external device connection I/F 329 is an interface for connecting various external devices. The near field communication circuit 330 is a communication circuit such as Near Field Communication (NFC), Bluetooth (registered trademark), or the like. The touch panel 331 is an input section for operating the terminal apparatus 300 when the user presses the display device 328. The display device 328 is an example of a display section included in the terminal apparatus 300.
In addition, the terminal device 300 includes a bus line 340. The bus line 340 is an address bus or a data bus for electrically connecting each component such as the CPU 301.
Next, the content image DB210 in the first embodiment will be described with reference to fig. 3. In the first embodiment, the content image DB210 is provided in the information processing apparatus 200. However, the first embodiment is not limited thereto. The content image DB210 may be provided in a device different from the information processing device 200.
Fig. 3 is a diagram showing an example of the content image DB in the first embodiment. For example, the content image DB210 in the first embodiment may be provided in the auxiliary storage device 24 (an example of a first storage device) or the like of the information processing apparatus 200. Further, the content image DB210 in the first embodiment may be prepared in advance.
The content image DB210 in the first embodiment includes "content ID" and "content image data" as information items. In the content image DB210, the content ID and the content image data are associated with each other.
The value of the item "content ID" indicates identification information for specifying content image data. The value of the item "content image data" is an entity of the content image data.
For example, the content image data in the first embodiment may be collected from the internet by the information processing apparatus 200, and may be stored in the content image DB210 together with the content ID assigned by the information processing apparatus 200. Further, the information processing apparatus 200 may periodically perform processing for collecting content image data from the internet and for assigning a content ID to the collected image data.
Further, the content image data may be transmitted from a specific server apparatus or the like to the information processing apparatus 200, and may be stored in the content image DB 210.
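For illustration, the content image DB 210 of Fig. 3 can be pictured as a simple keyed table. The sketch below, in Python with invented names, shows one possible in-memory form and the assignment of a content ID to collected image data as described above; it is an assumption, not the patent's storage format.

```python
import itertools

# Hypothetical stand-in for the content image DB 210 (Fig. 3):
# each record associates a content ID with the content image data itself.
content_image_db: dict[str, bytes] = {}
_content_id_seq = itertools.count(1)

def register_content_image(image_bytes: bytes) -> str:
    """Store collected image data under a newly assigned content ID."""
    content_id = f"C{next(_content_id_seq):04d}"
    content_image_db[content_id] = image_bytes
    return content_id

# e.g. image data collected from the internet or received from a server apparatus
content_id = register_content_image(b"\x89PNG...")  # -> "C0001"
```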
Next, the functions of the information processing apparatus 200 and the terminal apparatus 300 in the first embodiment will be described with reference to fig. 4.
Fig. 4 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the first embodiment. First, the function of the information processing apparatus 200 will be described. The functions of the information processing apparatus 200 described below are realized by the arithmetic processing device 26 reading and executing a program stored in the memory device 25.
The combined image generation processing section 220 of the information processing apparatus 200 in the first embodiment includes a communication section 221, an image acquisition section 222, a content list output section 223, a combined information acquisition section 224, a combination processing section 225, and a combined image data output section 226.
The communication section 221 corresponds to an example of a first communication section, and performs data transmission and reception between the information processing apparatus 200 and an external apparatus. Specifically, the communication unit 221 performs reception of image data from the terminal device 300, transmission of combined image data to the printing device 400, and the like.
The image acquisition unit 222 acquires the image data received by the communication unit 221. The content list output section 223 outputs a list of content image data sets stored in the content image DB210 to the terminal device 300.
The combination information acquisition unit 224 acquires the combination information and the image data received by the communication unit 221 from the terminal device 300.
The combination processing section 225 generates combined image data using the image data, the combination information, and the content image data selected from the list output by the content list output section 223. Specifically, the combination processing section 225 generates combined image data (an example of third image data) representing a combined image (an example of a third image) so that at least a part of the image represented by the image data is displayed in an inner region or an outer region of the shape of an element included in the image represented by the content image data.
At this time, the combination processing section 225 assigns identification information for specifying the combined image data to the generated combined image data. Further, the combined image data generated by the combination processing section 225 may be temporarily saved in the auxiliary storage device 24 or the like.
Details of the generation of the combined image data and the combination information will be described later.
The combined image data output section 226 corresponds to an example of an output section. When the combined image data output section 226 receives the identification information of the combined image data through the communication section 221, the combined image data output section 226 transmits the combined image data corresponding to the identification information to the printing apparatus 400.
Next, the functions of the terminal apparatus 300 will be described. The terminal apparatus 300 includes the combination instruction section 310. The combination instruction section 310 is realized by the arithmetic processing device of the terminal apparatus 300 reading and executing a program stored in the memory device.
The combination instruction section 310 includes a display control section 311, an input receiving section 312, an image selecting section 313, a combination information generating section 314, and a communication section 315.
The display control section 311 controls display on the terminal apparatus 300. Specifically, the display control section 311 causes the terminal apparatus 300 to display a list of content image data sets and a content image represented by content image data selected from the list. Further, the display control section 311 displays a preview of an image obtained by combining the content image and the image represented by the selected image data. At this time, the display control section 311 may display a preview in which the image represented by the image data is visible through the content image, based on the shape extracted from the content image data.
The input receiving unit 312 receives an input to the terminal device 300.
The image selecting section 313 selects image data stored in the terminal device 300 in response to the input received by the input receiving section 312.
The combination information generating unit 314 generates information indicating a positional relationship when the content image displayed by the display control unit 311 and the image indicated by the selected image data are superimposed.
The communication section 315 corresponds to an example of a second communication section, and transmits and receives information to and from the information processing apparatus 200 and the printing apparatus 400. Specifically, for example, the communication section 315 transmits the selected image data, combination information, and the like to the information processing apparatus 200.
Next, the operation of the information processing system 100 in the first embodiment will be described with reference to fig. 5. Fig. 5 is a sequence chart for explaining the operation of the information processing system in the first embodiment.
In the information processing system 100 in the first embodiment, when the terminal apparatus 300 receives an instruction to activate the terminal apparatus 300 from the user, the terminal apparatus 300 starts the combination instructing section 310 and causes the display control section 311 to display a setting screen related to image combination on the display device 328 or the like (step S501). Details of the setting screen will be described later.
Subsequently, the terminal apparatus 300 receives the recording medium-related setting on which the combined image is printed on the displayed setting screen through the input receiving section 312 (step S502).
Specifically, for example, the terminal apparatus 300 receives settings such as selection of the shape of the fabric on which the combined image is formed (the shape of a T-shirt, a handbag, or the like), the type (material such as cloth, thick cloth, thin cloth, vinyl, or the like), the color (red, blue, yellow, amber, or the like), and the size (S size, M size, L size, or the like). For example, a T-shirt as the shape, cloth as the type, and L as the size of the fabric may be accepted. Further, the display control section 311 may perform display on the display device 328 based on the settings received by the input receiving section 312.
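Purely as an illustration, the settings received in step S502 could be carried in a small structure such as the following; the key and value names are assumptions, not part of the patent.

```python
# Hypothetical representation of the fabric settings received in step S502.
fabric_settings = {
    "shape": "t-shirt",  # T-shirt, handbag, and so on
    "type": "cloth",     # cloth, thick cloth, thin cloth, vinyl, and so on
    "color": "amber",    # red, blue, yellow, amber, and so on
    "size": "L",         # S, M, L, and so on
}
```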
Subsequently, the terminal apparatus 300 causes the display control section 311 to display a list of image data sets stored in the terminal apparatus 300 on the terminal apparatus 300, and causes the image selection section 313 to recognize image data selected from the list (step S503). Note that the image data to be selected may be image data captured by the terminal apparatus 300.
When the image data is selected, the terminal device 300 transmits an acquisition request of a list of content image data sets to the information processing device 200 through the communication section 315 (step S504).
Upon receiving the acquisition request through the communication unit 221, the information processing apparatus 200 transmits a list of content image data sets to the terminal apparatus 300 through the content list output unit 223 (step S505).
When the list of content image data sets is obtained, the terminal device 300 causes the display control section 311 to display the received list, and causes the input receiving section 312 to receive a selection of content image data (step S506). The list of content image data sets may be displayed in one of the display areas included in the setting screen.
When the content image data is selected, the terminal device 300 causes the display control section 311 to display a preview image in which the content image represented by the selected content image data and the image represented by the image data selected in step S503 are combined in accordance with a user operation (step S507).
Specifically, the input receiving section 312 receives an input for changing the position of the image indicated by the image data selected by the user in step S503, and adjusts and determines the position with respect to the content image represented by the selected content image data.
When the user determines the positional relationship between the content image and the image, the terminal device 300 causes the combination information generation section 314 to generate combination information including information indicating the positional relationship between the content image and the image, and transmits the combination information together with the selected image data as a combination instruction to the information processing device 200 (step S508).
In fig. 5, after selecting image data stored in the terminal apparatus 300, a list of content image data sets is acquired and content image data is selected, however, the order of selection of image data and content image data is not limited to this order.
In fig. 5, the setting of the printing positions of the web and the combined image is performed before the image data is selected, however, the setting order is not limited to this order.
In the present embodiment, for example, first, a list of content image data sets may be acquired, and content image data may be selected. Next, the image data stored in the terminal apparatus 300 may be selected. Finally, settings such as the printing positions of the fabric and the combined image may be made. That is, in the first embodiment, these orders can be arbitrarily changed according to the user's operation.
Upon receiving the combination instruction and the image data from the terminal device 300, the information processing device 200 causes the combination processing section 225 to generate combined image data by superimposing the selected content image data and the selected image data based on the combination information (step S509). Details of step S509 will be described later.
Next, the information processing apparatus 200 transmits, to the terminal apparatus 300 through the communication section 221, the identification information (combined image ID) assigned to the combined image data (step S510).
Next, in the information processing system 100, the printing apparatus 400 reads the identification information of the combined image data from the terminal apparatus 300 (step S511). Upon reading the identification information, the printing apparatus 400 transmits the identification information to the information processing apparatus 200 (step S512).
Note that, in the case where the user of the terminal apparatus 300 purchases a recording medium (T-shirt or the like) on which the combined image is printed, the printing apparatus 400 may transmit the identification information to the information processing apparatus 200.
When the information processing apparatus 200 receives the identification information from the printing apparatus 400, the information processing apparatus 200 transmits the combined image data associated with the identification information to the printing apparatus 400 through the combined image data output part 226 (step S513).
The printing apparatus 400 outputs the combined image data to the prepared recording medium (step S514) and terminates the processing.
In the first embodiment, in the terminal apparatus 300, the information set in step S502 may be retained in the information processing apparatus 200 and may be transmitted to the printing apparatus 400 together with the combined image data. Alternatively, the information set in step S502 may be retained by the terminal apparatus 300, and may be read by the printing apparatus 400 together with the identification information of the combined image data.
The information set in step S502 includes the shape (T-shirt), type, color, size, and print position of the combined image on the fabric on which the combined image is formed.
In the first embodiment, the combination information and the combined image data are retained in the information processing apparatus 200, however, the first embodiment is not limited to this configuration. The information processing apparatus 200 may transmit the generated combined image data to the terminal apparatus 300, and may cause the terminal apparatus 300 to retain the generated combined image data. In this case, the terminal apparatus 300 may transmit the combined image data directly to the printing apparatus 400.
In the first embodiment, the preview image is displayed on the terminal device 300, and the combined image data is generated by the information processing device 200; however, the action related to the generation of the combined image data is not limited thereto. In the first embodiment, the combined image data may be generated by the terminal apparatus 300.
Next, the process of the combination processing section 225 in the first embodiment will be described with reference to fig. 6. Fig. 6 is a schematic diagram for explaining a combined image data generating process in the first embodiment. Fig. 6 shows details of the processing in step S509 of fig. 5.
The combined image generation processing portion 220 in the first embodiment acquires the image data received by the image acquisition portion 222 from the terminal device 300, and acquires the combined information received by the combined information acquisition portion 224 from the terminal device 300 (step S601).
Subsequently, the combined image generation processing portion 220 causes the combination processing portion 225 to superimpose the image represented by the selected content image data and the image represented by the image data, thereby obtaining the positional relationship represented by the combination information (step S602). More specifically, the combination processing section 225 superimposes an image represented by the content image data on an image represented by the image data.
Subsequently, the combination processing section 225 identifies an element existing in the content image, and sets an area indicating the shape of the identified element as a transparent area (step S603). The transparent area is an area processed to be able to visually recognize an underlying image. In the first embodiment, an image in such a state is referred to as a combined image, and image data representing the combined image is referred to as combined image data.
Further, for example, the shape of an element existing in the content image can be extracted from the content image data.
Subsequently, the combination processing section 225 assigns a combined image ID as identification information for specifying the combined image data to the combined image data generated in step S603, and ends the processing. The combined image ID may be, for example, a two-dimensional barcode or the like.
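A minimal sketch of steps S602 and S603 using the Pillow library, under two stated assumptions: the element shape has already been extracted from the content image data and is supplied as a grayscale mask (white marks the element), and the combination information is reduced to a simple non-negative pixel offset. All function and variable names are illustrative, not the patent's.

```python
from PIL import Image

def generate_combined_image(content_path: str, element_mask_path: str,
                            photo_path: str, offset: tuple[int, int] = (0, 0),
                            inside: bool = True) -> Image.Image:
    """Superimpose the content image on the photo (S602) and make the
    element-shaped region transparent so the photo shows through (S603)."""
    content = Image.open(content_path).convert("RGBA")
    photo = Image.open(photo_path).convert("RGBA")
    # Assumed input: a mask whose white pixels mark the element shape;
    # extracting the shape from the content image data is not shown here.
    element = Image.open(element_mask_path).convert("L").resize(content.size)
    if not inside:
        element = element.point(lambda v: 255 - v)  # use the outside of the shape instead

    # S602: place the photo underneath at the position given by the
    # combination information (a non-negative pixel offset in this sketch).
    base = Image.new("RGBA", content.size, (255, 255, 255, 255))
    base.alpha_composite(photo, dest=offset)

    # S603: where the mask is white, force the content layer's alpha to 0 so
    # that the underlying photo is visible through the element shape.
    punched_alpha = Image.composite(Image.new("L", content.size, 0),
                                    content.getchannel("A"), element)
    content.putalpha(punched_alpha)
    return Image.alpha_composite(base, content)

# combined = generate_combined_image("content.png", "element_mask.png", "photo.jpg")
# combined.save("combined.png")  # combined image data to which a combined image ID is assigned
```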
Hereinafter, the process of the combination processing section 225 will be further described with reference to fig. 7. Fig. 7 is a diagram showing generation of combined image data and combined information in the first embodiment.
The content image 71 shown in fig. 7 is an image represented by content image data selected from the list of content image data sets, and the image 72 is an image represented by image data stored in the terminal apparatus 300. A combined image 73 shown in fig. 7 is an image obtained by combining the content image 71 and the image 72.
The content image 71 shown in fig. 7 includes an element 71 a. In fig. 7, the element 71a represents the shape of a mountain, but the element extracted from the content image data may have any shape. The elements in the first embodiment may be, for example, letters and numbers or any graphics. In the first embodiment, there may be a plurality of elements extracted from the content image data.
In the first embodiment, when content image data representing the content image 71 is selected from the list of content image data sets at the terminal apparatus 300, the combination instruction section 310 may display, on the display device 328 of the terminal apparatus 300, a preview image in which the content image 71 is superimposed on the image 72. At this time, the combination instruction section 310 may retain the outline of the element 71a and render the entire content image 71 transparent, so that the user can visually recognize the position of the element 71a in the image 72.
In the terminal apparatus 300, the position of the element 71a in the image 72 is determined by the operation of the user moving the content image 71 on the image 72, and the positional relationship between the content image 71 and the image 72 is determined.
In the combined image 73 shown in fig. 7, the positional relationship between the content image 71 and the image 72 is determined so that the images of the persons P1 and P2 present in the image 72 are included in the element 71a. When the terminal apparatus 300 receives an operation for determining the positional relationship between the content image 71 and the image 72, the area inside the element 71a in the content image 71 is made transparent in this state, while the area outside the element 71a is left opaque. Then, the terminal apparatus 300 generates combination information indicating the positional relationship between the content image 71 and the image 72.
By performing the above-described operations, a combined image 73 can be generated in which the image 72 is cut out in the shape of the element 71 a.
In the first embodiment, since the user of the terminal apparatus 300 can determine the positional relationship between the content image 71 and the image 72, a portion (area) in the image 72 desired by the user can be arranged in the element 71 a.
Thus, for example, in the case where the persons P1 and P2 travel to the area where the mountain range indicated by the element 71a exists and the image data of the image 72 is captured using the imaging function or the like of the terminal apparatus 300 of the person P1, the combined image 73 may be generated in which the image 72 and the element 71a are associated with each other.
More specifically, the person P1 activates the combination instruction section 310 in the terminal device 300, selects image data representing the image 72, and selects content image data representing the content image 71 from the list of content image data sets. Then, the content image 71 displayed on the image 72 may be moved to position a desired area in the image 72 within the element 71a, and an operation for determining the positional relationship may be performed.
In response to this operation, the terminal apparatus 300 transmits image data representing the image 72 and combination information to the information processing apparatus 200.
With regard to the combination information in the first embodiment, for example, a reference point may be defined for each of the content image 71 and the image 72, and information indicating a positional relationship between coordinates of the reference point of the content image 71 and coordinates of the reference point of the image 72 may be obtained as the combination information.
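For example, the combination information could be expressed as the following small structure. The field names and the reference-point convention (top-left corners) are assumptions made only for illustration; the optional scale and rotation fields correspond to the operations mentioned just below.

```python
# Hypothetical combination information generated by the terminal apparatus 300:
# the positional relationship between the selected content image and the image,
# expressed as the offset between their reference points.
combination_info = {
    "content_id": "C0001",           # selected content image data
    "offset_x": 120,                 # image reference point relative to the content image (pixels)
    "offset_y": -45,
    "scale": 1.0,                    # optional enlargement or reduction
    "rotation_deg": 0.0,             # optional rotation
    "transparent_region": "inside",  # "inside" or "outside" the element shape
}
```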
In fig. 7, as a method for determining the positional relationship between the content image 71 and the image 72, an example of moving the content image 71 on the image 72 has been described, however, the method is not limited to the above-described method. In the first embodiment, for example, any one of the content image 71 and the image 72 may be rotated, enlarged, or reduced.
When the information processing apparatus 200 receives the image data representing the image 72 and the combination information, the information processing apparatus 200 acquires the content image data representing the content image 71 from the content image DB210, and generates the combination image data representing the combination image 73. Then, the information processing apparatus 200 transmits the combined image data representing the combined image 73 and the identification information of the combined image data to the terminal apparatus 300.
When the person P1 visits a shop or the like in which the printing apparatus 400 serviced by the information processing system 100 is installed, the person P1 operates the terminal apparatus 300 to transmit identification information of the combined image data from the terminal apparatus 300 to the printing apparatus 400.
Upon receiving the identification information, the printing apparatus 400 transmits the received identification information to the information processing apparatus 200, and receives the combined image data corresponding to the identification information. Then, the printing apparatus 400 prints the combined image 73 represented by the received combined image data on a T-shirt or the like selected by the person P1.
In the first embodiment, the inside of the element 71a of the content image 71 is set as the transparent area, but the transparent area is not limited to this design. The information processing apparatus 200 can use the outside of the element 71a as the transparent area. In this case, the content image 71 and the image 72 may be superimposed so that a portion of the image 72 that the user desires to make visible in the image 72 may be superimposed in a transparent area outside the element 71 a.
Therefore, in this embodiment, by superimposing a content image with an image acquired by the terminal apparatus 300, and by generating a combined image in which one of the inner area and the outer area of the shape of the element extracted from the content image data is a transparent area, a new content image can be generated and output.
Further, in the example of fig. 7, only the content image 71 is superimposed on the image 72, however, the first embodiment is not limited to this example. The plurality of content images may be superimposed on the image 72, and in each of the plurality of content images, the inside or outside of the element shape may be a transparent region.
Next, a display example of the terminal apparatus 300 will be described with reference to fig. 8A and 8B. Fig. 8A is a first diagram showing a display example in the first embodiment. The screen 80 shown in fig. 8A illustrates an example of the setting screen displayed on the terminal apparatus 300 in step S501 of fig. 5.
The screen 80 includes display areas 80-1, 80-2, and 80-3. In the display area 80-1, operation buttons 80-1a, 80-1b, 80-1c, and the like are displayed.
For example, the operation button 80-1a may be a button for displaying a list of image data sets stored in the terminal apparatus 300. In the first embodiment, for example, when the operation button 80-1a is operated, a list of image data sets stored in the terminal apparatus 300 may be displayed in the display area 80-2. The image data selected from the image data list may be displayed in the display area 80-3.
The operation button 80-1b may be, for example, a button for setting a fabric for printing a combined image. In the first embodiment, for example, when the operation button 80-1b is operated, a list of sizes or colors of T-shirts on which the combined image is printed may be displayed in the display area 80-2 or the like.
The operation button 80-1c may be, for example, a button for displaying a list of content image data sets. In the first embodiment, for example, when the operation button 80-1c is operated, a list of content image data sets stored in the content image DB210 may be displayed in the display area 80-2 or the like. In fig. 8A, a list of content image data sets is displayed in the display area 80-2.
In the display area 80-3, an operation for determining a method of combining (superimposing) the image data and the content image data is performed. The display area 80-3 includes operation buttons 80-4.
In the first embodiment, an outline representing the shape of the fabric selected in step S502 is displayed in the display area 80-3. In the display area 80-3, the image data and the content image data are displayed on the outline representing the shape of the fabric.
In the first embodiment, after the operation of superimposing the image data and the content image data is performed in the display area 80-3, the combination information may be generated from the image displayed in the display area 80-3 when the operation button 80-4 is operated.
In the example of fig. 8A, the fabric on which the combined image is printed is a T-shirt, as can be seen from the outline Ta. The example of fig. 8A represents a state in which the image 72 (in fig. 7) and the content image 86 including the element 86a "ABC" are selected and a superimposition operation is performed in the display area 80-3.
In the first embodiment, for example, the positional relationship between the content image 86 and the image 72 can be determined by shifting the position of the image 72 with respect to the content image 86.
Specifically, for example, the position of the content image 86 may be fixed in the display area 80-3, and only the operation of sliding the image 72 may be accepted.
In the display area 80-3, for example, the position at which the content image 86 is displayed may be a position including at least a part of the content image in the outline Ta representing the fabric shape.
In the first embodiment, for example, if the image 72 overlaps the region 86b other than the element 86a in the content image 86 while the position of the image 72 is being adjusted with respect to the content image 86, the display control section 311 may cause the display device 328 to display the image 72 so that the image 72 is visible through the region 86b.
At this time, in the first embodiment, the transparency of the image 72 in the element 86a is set to be different from the transparency of the image 72 in the area 86 b. Specifically, the transparency of the image 72 in the region 86b is made lower than the transparency of the image 72 in the element 86 a. Thus, in the state of fig. 8A, the image 72 superimposed with the element 86a is more clearly visible than the image 72 superimposed with the area 86 b.
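One possible way to realize this preview behaviour, sketched with Pillow purely as an assumption: the image is given a high per-pixel opacity inside the element shape and a much lower opacity where it overlaps region 86b, so it appears clearly through the element and only faintly elsewhere. Names and default values are illustrative.

```python
from PIL import Image

def preview_layer(photo: Image.Image, element_mask: Image.Image,
                  alpha_inside: int = 230, alpha_outside: int = 60) -> Image.Image:
    """Return the photo with position-dependent opacity for the preview of Fig. 8A:
    strong inside the element shape, weak over the rest of the content image."""
    photo = photo.convert("RGBA")
    mask = element_mask.convert("L").resize(photo.size)
    # Opacity map: alpha_inside where the mask is white, alpha_outside elsewhere.
    alpha = Image.composite(Image.new("L", photo.size, alpha_inside),
                            Image.new("L", photo.size, alpha_outside), mask)
    layered = photo.copy()
    layered.putalpha(alpha)
    return layered  # drawn together with the content image in the display area
```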
In the first embodiment, as described above, the display position of the content image is fixed while moving the image, and the positional relationship between the content image and the image is adjusted, however, the method for adjusting the positional relationship is not limited to this example. The positional relationship between the content image and the image may be adjusted by fixing the display position of the image and moving the content image relative to the image, or the positional relationship between the content image and the image may be adjusted by moving both the image and the content image.
In the first embodiment, when an operation for superimposing the image 72 with the content image 86 is performed in the display area 80-3 and the operation button 80-4 is operated, the screen 80 becomes the screen 81 shown in fig. 8B.
Fig. 8B is a second diagram showing a display example of the terminal apparatus in the first embodiment. For example, the screen 81 shown in fig. 8B corresponds to an example of the screen displayed on the terminal apparatus 300 in step S507 of fig. 5.
The screen 81 includes display areas 82-1 and 82-2, a display area 83, and operation buttons 84.
The display area 82-1 displays an operation section for selecting image data and the like. For example, the display area 82-2 displays an operation member for setting the brightness and contrast of an image.
In the display area 83, a combined image 85 is displayed. The combined image 85 is an image in which the content image 86 is superimposed on the image 72. In this combined image 85, each shape of the letter "ABC" as the element 86a extracted from the content image 86 is set as a transparent region in which the image 72 is visible. That is, the combined image 85 is an image obtained by cutting out the image 72 in accordance with the shape shown by the element 86 a. In the example of fig. 8B, the combined image 85 is printed on, for example, a T-shirt, a bag, or the like.
That is, in the first embodiment, in the case where the first image data is the content image data and the second image data is the image data, a part of the second image is displayed in accordance with each shape of the elements included in the first image represented by the first image data.
As described above, in the first embodiment, a content image having a new meaning can be created by combining a content image having a specific meaning and an image acquired by the terminal apparatus 300 to generate a combined image.
For example, in the combined image 73 shown in fig. 7, by combining the element 71a indicating "mountain" and the image 72, the meaning of "image associated with mountain" can be given to the image 72. Further, for example, in the combined image 85 shown in fig. 8B, by combining the element 86a indicating "ABC" and the image 72, the meaning of "image associated with ABC" can be given to the image 72.
In the case where the element is a letter indicating a place name, a combined image in which a part of the image is associated with the letter indicating the place name is generated. In the case where the element is a number indicating a date, a combined image in which a part of the image is associated with the date is generated.
As described above, in the first embodiment, a new content image (combined image) in which the meaning of an element extracted from content image data is associated with an image represented by image data acquired by the terminal apparatus 300 can be generated.
In the first embodiment described above, the combination instruction section 310 of the terminal apparatus 300 displays a setting screen or the like for generating a combined image on the terminal apparatus 300, but the first embodiment is not limited to this functional arrangement. The combination instruction section 310 may be provided on the information processing apparatus 200. In this case, the terminal apparatus 300 can display the above-described setting screen simply by accessing the information processing apparatus 200.
In the first embodiment, the terminal apparatus 300 and the information processing apparatus 200 are described as separate apparatuses, however, the first embodiment is not limited to this configuration. The information processing apparatus 200 may include the functions of the terminal apparatus 300.
(second embodiment)
The second embodiment will be described below with reference to the drawings. The second embodiment is different from the first embodiment in that there is paid content image data or paid image data, and content image data is automatically selected from metadata of the image data. In the following description of the second embodiment, differences from the first embodiment will be described. Components having the same functional configuration as in the first embodiment are denoted by the same reference numerals as those used in the description of the first embodiment, and thus the description thereof will be omitted.
Fig. 9 is a schematic diagram showing an information processing system in the second embodiment. The information processing system 100A in the second embodiment includes an information processing apparatus 200A, a terminal apparatus 300, and a printing apparatus 400.
The information processing apparatus 200A in the second embodiment includes an image DB 210A, a user DB 240, and a combined image generation processing section 220A.
The image DB 210A of the information processing apparatus 200A in the second embodiment stores content image data and image data to be combined with the content image data. Further, the image DB 210A stores information indicating whether or not the content image data and the image data are payment data and the like.
In addition, the user DB 240 in the second embodiment stores information on users who upload content image data and image data to the image DB 210A.
Fig. 10A is a first diagram showing an example of an image DB in the second embodiment. The image DB 210A in the second embodiment includes "image ID", "type", "image data", "registered user ID", "paid or free", and "metadata" as information items, and the item "image ID" is associated with the other items.
The value of the item "image ID" represents identification information for specifying the content image data and the image data stored in the image DB 210A.
The value of the item "type" indicates whether the image data is content image data or image data captured by the terminal apparatus 300, the imaging device, or the like.
The value of the item "image data" is content image data or an entity of image data. The value of the item "registered user ID" represents a user ID for specifying a user who has updated the content image data and the image data to the information processing apparatus 200A. The value of the item "paid or free" indicates whether the content image data or the image data is provided for a fee or for a free.
The value of the item "metadata" indicates metadata attached to image data. The metadata includes, for example, information indicating a location where the image data is acquired, date information indicating a date and time when the image data is acquired, information corresponding to a tag attached to the image data, and the like.
In the example of fig. 10A, for example, the image data of the image ID "101" is the content image data uploaded by the user specified by the user ID "xx" and is provided for a fee.
Note that the information items included in the image DB 210A are examples, and the image DB 210A may include items other than those shown in fig. 10A.
Fig. 10B is a second diagram showing an example of the image DB in the second embodiment. The image DB 210A-1 in the second embodiment includes an item "transparent" as an information item in addition to the items of the image DB 210A, and the item "transparent" is associated with the item "image ID".
The value of the item "transparent" indicates whether the underlying image is allowed to be visible through the shape of the content image data or the element included in the image data. Specifically, in the case where the value of the item "transparent" is "yes", it is allowed to make the underlying image visible in accordance with the shape of the image specified by the image ID or the element included in the content image data.
In addition, when the value of the item "transparent" is "no", the underlying image is not allowed to be visible with respect to the image data or the content image data specified by the image ID and shown.
For example, the value of the item "transparent" may be set by a user who performs uploading or the like when uploading content image data or image data to the image DB 210A-1.
Fig. 11 is a diagram showing an example of the user DB in the second embodiment. The user DB 240 in the second embodiment includes "registered user ID", "image ID", "financial information", "billing count", and the like as information items, and the item "registered user ID" is associated with other items. In the following description, information including the value of the item "registered user ID" and the values of other items in the user DB 240 is referred to as user information.
The value of the item "financial information" is information on a financial institution or the like, which is a payment destination for using the price of the image data associated with the user specified by the registered user ID. Specifically, for example, the value of the item "financial information" may be bank account information or the like.
The value of the item "billing count" indicates the number of times the paid image data uploaded by the user is used.
Next, each function of the information processing system 100A in the second embodiment will be described with reference to fig. 12. Fig. 12 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the second embodiment.
The information processing apparatus 200A in the second embodiment includes a combined image generation processing section 220A. The combined image generation processing section 220A in the second embodiment includes a communication section 221, an image acquisition section 222, a content selection section 223A, a combination processing section 225, a combined image data output section 226, a metadata extraction section 227, a charge determination section 228, a charge history storage section 229, and an image accumulation section 230.
The metadata extraction section 227 extracts metadata attached to the image data acquired by the image acquisition section 222 from the terminal device 300.
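For illustration only (this sketch is not part of the disclosed embodiments), the following Python code shows one way such metadata could be read from the EXIF block of an uploaded image; the patent does not prescribe any particular library, and the tag handling shown here is an assumption.

```python
from PIL import Image, ExifTags

def extract_metadata(path):
    """Minimal sketch: return capture date/time and raw GPS fields, if present."""
    img = Image.open(path)
    exif = img.getexif()
    meta = {}
    if 0x0132 in exif:                      # 0x0132 = EXIF DateTime tag
        meta["datetime"] = exif[0x0132]
    gps_ifd = exif.get_ifd(0x8825)          # 0x8825 = GPSInfo IFD pointer
    if gps_ifd:
        # Map numeric GPS tag ids to readable names (e.g. GPSLatitude).
        meta["gps"] = {ExifTags.GPSTAGS.get(k, k): v for k, v in gps_ifd.items()}
    return meta
```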
The charge determination unit 228 determines whether or not a fee is charged for the selected content image data. When it is determined that a fee is to be charged for the content image data, the charging history storage part 229 increments the billing count in the user DB 240 as the charging history.
The image accumulating section 230 stores the image data uploaded from the terminal device 300 in the image DB 210A.
Next, the operation of the information processing system 100A in the second embodiment will be described with reference to fig. 13. Fig. 13 is a sequence chart for explaining the operation of the information processing system in the second embodiment.
The processing in steps S1301 and S1302 in fig. 13 is the same as the processing in steps S501 and S502 in fig. 5, and thus the description thereof will be omitted.
Subsequently, the terminal device 300 captures image data by the operation of the user, and transmits the captured image data to the information processing device 200A (step S1303).
In the example of fig. 13, the terminal apparatus 300 transmits the image data photographed on the spot to the information processing apparatus 200A, but the second embodiment is not limited to this process. Similar to the process described with reference to fig. 5, in the process of transmitting image data from the terminal device 300 to the information processing device 200A, image data may be selected from among the image data sets stored in the terminal device 300, and the selected image data may be transmitted to the information processing device 200A.
Upon receiving the image data from the terminal device 300, the information processing device 200A in the second embodiment generates combined image data (step S1304). Details of the processing in step S1304 will be described later.
The processing from step S1305 to step S1309 in fig. 13 is the same as the processing from step S510 to step S514 in fig. 5, and therefore, the description thereof will be omitted.
Next, generation of combined image data in the second embodiment will be described with reference to fig. 14. Fig. 14 is a flowchart for explaining a combined image data generation process in the second embodiment.
In the combined image generation processing portion 220A of the information processing apparatus 200A in the second embodiment, the image acquisition portion 222 acquires the image data received by the communication portion 221 (step S1401).
Subsequently, the combined-image generation processing portion 220A extracts the metadata attached to the image data by the metadata extraction portion 227, and selects the content image data based on the extracted metadata by the content selection portion 223A referring to the image DB 210A (step S1402).
Specifically, for example, in the case where the metadata of the image data is the position information, the content selection section 223A may select the content image data including the place name indicated by the position information as an element. In addition, in the case where the metadata of the image data is date and time information, the content selection section 223A may select content image data including, as elements, the time, year, and the like corresponding to the date and time information.
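A minimal sketch of such metadata-driven selection is shown below; the record fields ("type", "elements", "metadata") and the matching rule are illustrative assumptions, not definitions taken from the image DB described above.

```python
def select_content(image_meta, image_db_records):
    """Pick a content image whose elements match the place or year in the metadata."""
    place = image_meta.get("place")                 # e.g. resolved from GPS info
    year = str(image_meta.get("datetime", ""))[:4]  # e.g. "2019"
    for rec in image_db_records:
        if rec.get("type") != "content":
            continue
        elements = rec.get("elements", [])
        if (place and place in elements) or (year and year in elements):
            return rec
    return None  # no matching content image data found
```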
In the second embodiment, the content selection section 223A selects one content image data set from among a plurality of content image data sets stored in the image DB 210A based on metadata, however, the selection of the content image data set is not limited to this selection manner. The content selection section 223A may select content image data other than the content image data stored in the image DB 210A. For example, the content selection section 223A may acquire content image data including elements corresponding to metadata from the internet.
Subsequently, the combined-image generation processing portion 220A causes the combination processing portion 225 to superimpose the image represented by the selected content image data on the image represented by the image data, and makes the image represented by the image data visible within the element region (step S1403).
For example, in the case where the image data acquired from the terminal device 300 includes an image of a face of a person (face image), the combination processing section 225 in the second embodiment may superimpose the content image and the image so as to position the face image within the element region. In this case, in particular, the eye portion of the person may be located within the element region. In the second embodiment, the combination processing section 225 can set an image in advance to be arranged in an element region in an image represented by image data.
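As a purely illustrative sketch of this superimposition (assuming the element shape has already been extracted into a grayscale mask, which the patent does not prescribe), the combination could be performed as follows:

```python
from PIL import Image

def combine_transparent(content_path, element_mask_path, photo_path, out_path):
    """Show the photo (second image) only inside the element shape of the
    content image (first image); elsewhere the content image is kept."""
    content = Image.open(content_path).convert("RGBA")     # first image
    photo = Image.open(photo_path).convert("RGBA")         # second image
    mask = Image.open(element_mask_path).convert("L")      # element shape (white = inside)

    photo = photo.resize(content.size)
    mask = mask.resize(content.size)

    combined = Image.composite(photo, content, mask)       # third image
    combined.save(out_path)
```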
Subsequently, the combined image generation processing unit 220A determines, by the charge determination unit 228, whether or not the selected content image data is provided for a fee (step S1404). When it is determined in step S1404 that no fee is charged for the content image data, that is, when the content image data is provided free of charge, the combined image generation processing part 220A proceeds to step S1406 described later.
When it is determined in step S1404 that payment is to be paid for the content image data, the combined image generation processing part 220A updates the charging history storage part 229 by adding "1" to the billing count of the registered user ID associated with the selected content image data (step S1405).
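For illustration, the charging-history update amounts to incrementing a counter keyed by the registered user ID; in the sketch below the user DB is stood in for by a plain dictionary, and the field name is an assumption.

```python
def record_charge(user_db, registered_user_id):
    """Increment the billing count of the user who registered the paid data."""
    entry = user_db.setdefault(registered_user_id, {"billing_count": 0})
    entry["billing_count"] += 1
    return entry["billing_count"]
```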
Subsequently, the combined-image generation processing unit 220A assigns a combined-image ID to the combined-image data (step S1406), and terminates the processing.
As described above, in the second embodiment, combined image data having content image data suitable for image data can be generated simply by transmitting the image data from the terminal apparatus 300. In the second embodiment, uploaded content image data can be provided for a fee.
Next, uploading of image data in the second embodiment will be described. In the second embodiment, for example, image data captured by the user with the terminal device 300, or content image data created by the user, may be transmitted to the information processing device 200A.
Specifically, for example, the user can upload image data to which a tag (event name, concert name, or the like) is assigned from the terminal apparatus 300 to the information processing apparatus 200A using an application such as SNS (social network service).
In the second embodiment, the type of the image data and whether it is provided for a fee or free of charge can be set by the user at the time of uploading the image data from the terminal apparatus 300. Information indicating the contents of these settings and the registered user ID of the user of the terminal apparatus 300 may be transmitted from the terminal apparatus 300 to the information processing apparatus 200A together with the image data.
In the second embodiment, the sales price of a recording medium (such as a T-shirt) on which a combined image is printed by the printing apparatus 400 may be discounted for a user who has uploaded image data or content image data free of charge.
When the combined image generation processing part 220A of the information processing apparatus 200A in the second embodiment receives the upload of image data from the terminal apparatus 300, the image accumulation part 230 acquires the settings for the type of the image data, the metadata applied to the image data, the registered user ID, whether the data is provided for a fee or free of charge, and the like, and stores these settings in the image DB 210A together with the image data.
Therefore, in the second embodiment, a new content image (combined image) can be generated using one or more image data sets accumulated by a large number of users.
(third embodiment)
The third embodiment will be described below with reference to the drawings. The third embodiment is different from the second embodiment in that the information processing apparatus selects both the image data and the content image data. In the following description, differences from the second embodiment will be described. Components having the same functional configuration as in the second embodiment are denoted by the same reference numerals as those used in the description of the second embodiment, and thus the description thereof will be omitted.
Fig. 15 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the third embodiment. The information processing apparatus 200B in the third embodiment includes a combined image generation processing section 220B. The combined image generation processing section 220B includes a communication section 221, a position information acquisition section 222A, a content selection section 223A, a combination processing section 225, a combined image data output section 226, a metadata extraction section 227, a charge determination section 228, a charge history storage section 229, an image accumulation section 230, and an image selection section 231.
The position information acquisition section 222A acquires the position information received from the terminal device 300A. The image selecting section 231 selects image data whose type represents "image" from the image DB 210A based on the position information. In the case where there is no corresponding image data in the image DB 210A, the image selecting section 231 may select image data from a network such as the Internet.
The terminal device 300A in the third embodiment has a combination instruction section 310A. The combination instruction section 310A in the third embodiment only needs to include the display control section 311, the input reception section 312, and the communication section 315.
Fig. 16 is a sequence diagram for explaining the operation of the information processing system in the third embodiment. The processing in step S1601 and step S1602 in fig. 16 is the same as the processing in step S501 and step S502 in fig. 5, and therefore, the description thereof will be omitted.
Subsequently, the terminal device 300A transmits the position information acquired by the GPS (global positioning system) function or the like to the information processing device 200B (step S1603).
In the information processing apparatus 200B, when the position information acquisition unit 222A acquires the position information received by the communication unit 221, the combination processing unit 225 generates combined image data (step S1604). Details of the processing in step S1604 will be described later.
The processing from step S1605 to step S1609 in fig. 16 is the same as the processing from step S510 to step S514 in fig. 5, and therefore, the description thereof will be omitted.
Next, the combined-image-data generating process in the third embodiment will be described with reference to fig. 17. Fig. 17 is a schematic diagram showing a combined image data generation process in the third embodiment.
In the combined image generation processing unit 220B of the third embodiment, when the position information acquisition unit 222A acquires the position information (step S1701), the image selection unit 231 selects the image data based on the position information (step S1702).
Specifically, the image selecting section 231 may refer to the image DB 210A, and may select image data in which the received position information is included in a predetermined range from the position information included in the metadata and the type indicates "image".
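One illustrative way to realize such a range check (not specified in the patent) is a great-circle distance filter over the stored metadata; the 1 km threshold below is an assumed value.

```python
import math

def within_range(pos, meta_pos, limit_km=1.0):
    """Haversine distance between two (lat, lon) pairs, compared to a limit."""
    lat1, lon1 = map(math.radians, pos)
    lat2, lon2 = map(math.radians, meta_pos)
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a)) <= limit_km

def select_image(received_pos, image_db_records):
    """Pick the first record of type "image" whose metadata position is in range."""
    for rec in image_db_records:
        meta = rec.get("metadata", {})
        if rec.get("type") == "image" and "position" in meta:
            if within_range(received_pos, meta["position"]):
                return rec
    return None
```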
Subsequently, the combined image generation processing portion 220B selects content image data by the content selection portion 223A based on the metadata of the image data selected by the image selection portion 231 (step S1703).
Subsequently, the combined-image generation processing portion 220B causes the combination processing portion 225 to superimpose the image represented by the selected content image data on the image represented by the selected image data, and makes the image represented by the image data visible within the element region (step S1704).
Subsequently, the combined image generation processing portion 220B determines whether each of the image data selected by the image selection portion 231 and the content image data selected by the content selection portion 223A is provided for a fee (step S1705).
When it is determined in step S1705 that no fee is charged for either, that is, when it is determined that the image data and the content image data are both provided free of charge, the combined image generation processing portion 220B proceeds to step S1707 described later.
When it is determined in step S1705 that the image data or the content image data is provided for a fee, the combined image generation processing part 220B updates the charging history storage part 229 by adding "1" to the billing count of the registered user ID associated with the selected image data or content image data (step S1706).
Subsequently, the combined-image generation processing section 220B assigns a combined-image ID to the combined-image data (step S1707), and terminates the processing.
In the third embodiment, the terminal device 300A transmits the position information to the information processing device 200B, but the third embodiment is not limited to this transmission method. In the third embodiment, for example, information included in metadata of image data may be transmitted from the terminal apparatus 300A to the information processing apparatus 200B instead of the position information. In this case, the information processing apparatus 200B may select image data based on information included in the metadata.
As described above, according to the third embodiment, a new content image (combined image) can be generated.
(fourth embodiment)
The fourth embodiment will be described below with reference to the drawings. The fourth embodiment is different from the first embodiment in that it is determined whether or not a region representing the shape of an element is defined as a transparent region when a combined image is generated. Therefore, in the following description of the fourth embodiment, differences from the first embodiment will be described. Components having the same functional configuration as in the first embodiment are denoted by the same reference numerals as those used in the description of the first embodiment, and thus the description thereof will be omitted.
Fig. 18 is a diagram showing functions of an information processing apparatus and a terminal apparatus in the fourth embodiment.
In the information processing apparatus 200C, the combined image generation processing unit 220C includes a communication unit 221, an image acquisition unit 222, a content list output unit 223, a combined information acquisition unit 224, a combination processing unit 225A, a combined image data output unit 226, and a mode determination unit 251.
The mode determination section 251 of the fourth embodiment determines the mode selected in the terminal apparatus 300B based on the mode information transmitted from the terminal apparatus 300B, and instructs the combination processing section 225A to generate the combined image data based on the determination result. Details of the mode information are described later.
The combination processing section 225A in the fourth embodiment generates combined image data in a mode according to the determination result of the mode determination section 251.
The combination instruction section 310B of the terminal device 300B in the fourth embodiment includes a display control section 311, an input receiving section 312, an image selection section 313, a combination information generation section 314, a communication section 315, and a mode selection section 316.
In the fourth embodiment, when the input receiving section 312 receives a selection of a mode indicating a method for combining an image and a content image in a setting screen displayed on the terminal device 300B, the mode selecting section 316 generates mode information according to the selection.
Next, the mode in the fourth embodiment will be described.
The mode in the fourth embodiment represents a method for combining an image and a content image (superimposition method). In the fourth embodiment, there are two modes: a transparent mode and an overlay mode, and the modes are switched by an operation of a user at the terminal apparatus 300B.
The transparent mode corresponds to a method of generating a combined image by using a region indicated by the shape of the element extracted from the content image data (a region inside or outside the element) as a transparent region in a case where an image is superimposed with a content image. In other words, the transparent mode is a mode for allowing generation of third image data representing a third image in which at least a part of the second image represented by the second image data is displayed in an inner region or an outer region of the shape of an element included in the first image represented by the first image data.
In the overlay mode, elements extracted from the content image data are superimposed on the image to generate a combined image. In other words, the overlay mode is a mode for allowing generation of third image data representing a third image in which at least the part of the second image is not displayed in the inner area or the outer area in which it is displayed in the transparent mode, and in which at least another part of the second image is displayed in an area other than that inner area or outer area. That is, in the overlay mode, the second image is not displayed in the area where at least a part of the second image is displayed in the transparent mode.
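The difference between the two modes can be illustrated by the following sketch (not part of the embodiments), which assumes that all inputs are images of the same size and that the element shape is given as a grayscale mask:

```python
from PIL import Image

def combine(content, photo, element_mask, mode):
    """content: first image, photo: second image, element_mask: white = element."""
    if mode == "transparent":
        # The element region is a window: the photo shows through the shape
        # of the element, and the content image is kept everywhere else.
        return Image.composite(photo, content, element_mask)
    if mode == "overlay":
        # The element itself covers the photo; outside the element shape
        # the photo remains visible.
        return Image.composite(content, photo, element_mask)
    raise ValueError("unknown mode: " + mode)
```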
In the terminal device 300B in the fourth embodiment, when the input receiving section 312 receives an operation for selecting a mode, the mode selecting section 316 generates mode information indicating the selected mode and transmits the mode information to the information processing device 200C.
Hereinafter, the operation of the information processing system in the fourth embodiment will be described with reference to fig. 19. Fig. 19 is a sequence chart showing the operation of the information processing system in the fourth embodiment.
Since the processing from step S1901 to step S1906 in fig. 19 is the same as the processing from step S501 to step S506 in fig. 5, the description thereof is omitted.
In step S1906, when receiving the selection of the content image data, the terminal device 300B subsequently receives the selection of the mode through the input receiving section 312, and generates mode information indicating the selected mode through the mode selecting section 316 (step S1907).
Subsequently, in step S1908, the terminal device 300B displays a preview image in which the content image represented by the selected content image data and the image represented by the selected image data are combined in the selected mode according to the user operation.
When the user determines the positional relationship between the content image and the image, the terminal device 300B generates combination information including information indicating the positional relationship between the content image and the image by the combination information generation section 314, and transmits the combination information together with the selected image data and the mode information as a combination instruction to the information processing device 200C (step S1909).
When the information processing apparatus 200C receives the combination instruction and the image data from the terminal apparatus 300B, the combination processing part 225A generates the combined image data by superimposing the selected content image and the image data based on the mode information and the combination information (step S1910). Details of step S1910 are described below.
Since the processing from step S1911 to step S1915 in fig. 19 is the same as the processing from step S510 to step S514 in fig. 5, the description thereof is omitted.
Next, the process of the combination processing section 225A of the information processing apparatus 200C in the fourth embodiment will be described with reference to fig. 20. Fig. 20 is a schematic diagram for explaining the process of generating combined image data in the fourth embodiment. Fig. 20 shows details of the processing in step S1910 of fig. 19.
The combined image generation processing portion 220C in the fourth embodiment acquires image data by the image acquisition portion 222 when receiving image data from the terminal device 300B, acquires combination information by the combination information acquisition portion 224 when receiving combination information from the terminal device 300B, and acquires mode information by the mode determination portion 251 (step S2001).
Subsequently, the combined image generation processing portion 220C superimposes one image represented by the selected content image data and another image represented by the image data so as to correspond to the positional relationship represented by the combination information through the combination processing portion 225A (step S2002). More specifically, the combination processing section 225A superimposes an image represented by the content image data on an image represented by the image data.
Subsequently, the combination processing unit 225A determines whether or not the mode indicated by the mode information is the transparent mode by the mode determination unit 251 (step S2003).
In step S2003, when the mode indicated by the mode information is the transparent mode, the combination processing section 225A proceeds to step S2004.
In step S2003, when the mode indicated by the mode information is not the transparent mode, that is, when the mode indicated by the mode information is the overlay mode, the combination processing section 225A proceeds to step S2005.
Since the processes of step S2004 and step S2005 in fig. 20 are the same as those of step S603 and step S604 in fig. 6, a description thereof will be omitted.
As described above, in the fourth embodiment, the user can select a method for combining image data and content image data.
Hereinafter, mode selection will be described with reference to fig. 21. Fig. 21 is a diagram showing a display example of a terminal apparatus in the fourth embodiment.
The screen 80A shown in fig. 21 is an example of the setting screen displayed in the terminal apparatus 300B in step S1907 in fig. 19. More specifically, after the user determines the positional relationship between the content image and the image, the display area 80-3A of the screen 80A is displayed.
The screen 80A in the fourth embodiment includes display areas 80-1A, 80-2, and 80-3A. The display area 80-1A of the screen 80A in the fourth embodiment displays operation buttons 80-1a, 80-1b, 80-1c, 80-1d, and the like.
The operation button 80-1d is a button for allowing the user to select the overlay mode or the transparent mode.
When the operation button 80-1d is operated in the display area 80-1A of the screen 80A, an area 87 for selecting the overlay mode and an area 88 for selecting the transparent mode are displayed in the display area 80-3A.
In the screen 80A, for example, in the display area 80-3A, when the area 87 for selecting the overlay mode is selected and the operation button 89 is operated, the overlay mode is selected. Likewise, when the area 88 for selecting the transparent mode is selected and the operation button 89 is operated, the transparent mode is selected.
In the fourth embodiment, the area 87 displays a combined image 87c (preview image) in which the image 87a, as an example of the selected image, is combined in the overlay mode with the content image 87b, as an example of the selected content image.
A combined image 88c (preview image) in which the image 87a and the content image 87b are combined in the transparent mode is displayed in the area 88.
It should be understood that the combined image 87c is an image in which the elements in the content image 87b are superimposed on the image 87a, while the combined image 88c is displayed such that the image 87a is visible within the shape of the elements in the content image 87 b.
As described above, in the fourth embodiment, the user can select a method for superimposing an image represented by image data stored in the terminal device 300B and a content image represented by content image data.
In the example of fig. 21, in the display area 80-3A, the transparent mode or the overlay mode is selected after the positional relationship between the image and the content image is determined; however, the mode selection timing is not limited in this manner. The selection of the transparent mode or the overlay mode may be performed before the positional relationship between the image and the content image is determined.
According to the fourth embodiment, for example, in the case where the information processing apparatus 200C refers to the image DB 210A-1 (fig. 10B), when content image data of which the value of the item "transparent" is "no" is selected, the transparent mode may not be selectable on the screen 80A.
Specifically, when content image data of which the value of the item "transparent" is "no" is selected, the area 88 in the screen 80A may be grayed out to indicate that the transparent mode is not selectable.
In addition, even when a pull-down selection area for selecting one of the transparent mode and the overlay mode is displayed by operating the operation button 80-1c, in the case where content image data of which the value of the item "transparent" is "no" is selected, the selection area is displayed in a state in which the transparent mode is not selectable.
In the fourth embodiment, the transparent mode or the superimposition mode can be selected by referring to whether or not the content image data is transparent. Thus, in the fourth embodiment, for example, the author of the content image data can specify the manner of use of the content image data by the user.
(fifth embodiment)
The fifth embodiment will be described below with reference to the drawings. The fifth embodiment is different from the first embodiment in that elements are edited. Therefore, hereinafter, in the fifth embodiment, only the differences from the first embodiment will be described. Components having the same functional configuration as in the first embodiment are denoted by the same reference numerals as those used in the description of the first embodiment, and thus the description thereof will be omitted.
Fig. 22 is a schematic diagram showing functions of an information processing apparatus and a terminal apparatus in the fifth embodiment.
The combined image generation processing section 220D of the information processing apparatus 200D in the fifth embodiment includes a communication section 221, an image acquisition section 222, a content list output section 223, a combination information acquisition section 224, a combination processing section 225B, a combined image data output section 226, and an editing information acquisition section 252.
The editing information acquisition section 252 in the fifth embodiment acquires editing information transmitted from the terminal device 300C. The combination processing section 225B generates combined image data by using the image data and the content image data edited based on the editing information.
The combination instruction section 310C of the terminal device 300C in the fifth embodiment includes a display control section 311, an input receiving section 312, an image selecting section 313, a combination information generating section 314, a communication section 315, and an editing section 317.
In the case where the element extracted from the content image data is a text, the editing section 317 performs predetermined editing on the element and displays the content image data including the edited element. Further, the editing unit 317 generates editing information indicating an editing method of the element.
Specifically, in the case where the element is text such as a letter or a number, the editing section 317 changes the width or font type of the letter or number to a predetermined width or font type.
The predetermined width or font type may be, for example, a width or font type set in advance by the user of the terminal apparatus 300C. The predetermined width or font type may be, for example, a width or font type set in advance by an administrator of the information processing system in the fifth embodiment.
The editing section 317 in the fifth embodiment generates information representing, for example, the width or font type of the edited letter or number as editing information.
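As an illustrative sketch of such editing (the font path, canvas size, and pixel values below are assumptions; the patent expresses line width in centimetres on the printed medium, so the pixel value would depend on the print resolution), the text element could be rendered as follows:

```python
from PIL import Image, ImageDraw, ImageFont

def render_text_element(text, font_path, font_size=360, stroke_px=40):
    """Draw the input text with a chosen font type and a widened stroke."""
    canvas = Image.new("RGBA", (2400, 900), (0, 0, 0, 0))   # transparent canvas
    draw = ImageDraw.Draw(canvas)
    font = ImageFont.truetype(font_path, font_size)
    draw.text((60, 120), text, font=font, fill=(0, 0, 0, 255),
              stroke_width=stroke_px, stroke_fill=(0, 0, 0, 255))
    return canvas   # the alpha channel can later serve as the element mask
```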
Hereinafter, the operation of the information processing system in the fifth embodiment will be described with reference to fig. 23. Fig. 23 is a sequence diagram showing the operation of the information processing system in the fifth embodiment.
Since the processing from step S2301 to step S2306 in fig. 23 is the same as the processing from step S501 to step S506 in fig. 5, the description thereof is omitted.
In the case where the content image data is selected in step S2306, when the element extracted from the content image data by the editing unit 317 is a text, the combination instruction unit 310C of the terminal device 300C generates editing information by editing the element (step S2307).
In the fifth embodiment, the editing process is performed on the content image data selected from the list of content image data sets, however, the editing process is not limited in this manner.
In the fifth embodiment, for example, the setting screen displayed in step S2301 may include an area for inputting text, and the text input in the area may be edited as an element.
Subsequently, the terminal device 300C displays a preview image in which the content image containing the edited element and the image represented by the image data selected in step S2303 are combined in accordance with the user's operation (step S2308).
Subsequently, when the user determines the positional relationship between the content image and the image, the terminal device 300C generates combination information including information indicating the positional relationship between the content image and the image by the combination information generation section 314, and transmits the combination information together with the selected image data and the editing information as a combination instruction to the information processing device 200D (step S2309). At this time, in the case where the element is a text input at the terminal device 300C, the terminal device 300C transmits text data representing the text to the information processing device 200D.
Upon receiving the combination instruction and the image data from the terminal device 300C, the information processing device 200D edits the content image data according to the editing information by the combination processing section 225B, and generates combined image data in which the edited content image and the image represented by the image data are superimposed based on the combination information (step S2310). Details of step S2310 will be described below.
Since the processing from step S2311 to step S2315 in fig. 23 is the same as the processing from step S510 to step S514 in fig. 5, the description thereof is omitted.
Next, the process of the combination processing section 225B of the information processing apparatus 200D in the fifth embodiment will be described with reference to fig. 24. Fig. 24 is a diagram for explaining a process of generating combined image data in the fifth embodiment. Fig. 24 shows details of the process of step S2310 of fig. 23.
The processing of fig. 24 represents the operation of the information processing apparatus 200D in the case where an element is a text and editing processing is performed on the element. The operation of the information processing apparatus 200D in the case where the element is not a text is the same as that of the information processing apparatus 200 in the first embodiment.
The combined image generation processing unit 220D of the fifth embodiment acquires image data received from the terminal device 300C by the image acquisition unit 222, acquires combination information received from the terminal device 300C by the combination information acquisition unit 224, and acquires edit information by the edit information acquisition unit 252.
Subsequently, the combined image generation processing unit 220D edits the selected content image data based on the editing information by the combined processing unit 225B (step S2402).
Subsequently, the combined image generation processing portion 220D causes the combination processing portion 225B to combine the content image data and the image data so as to form the positional relationship, indicated by the combination information, between the edited content image and the image represented by the image data received from the terminal device 300C (step S2403).
Since the processing of step S2404 and step S2405 in fig. 24 is the same as the processing of step S603 and step S604 in fig. 6, the description thereof will be omitted.
Hereinafter, a setting screen in the fifth embodiment will be described with reference to fig. 25. Fig. 25 is a diagram showing a display example of a terminal apparatus in the fifth embodiment.
The screen 80B illustrated in fig. 25 is an example of the setting screen displayed on the terminal apparatus 300C in step S2301 in fig. 23.
The screen 80B in the fifth embodiment includes display areas 80-1, 80-2A, 80-3B. In the display area 80-2A of the screen 80B in the fifth embodiment, for example, an input area 91 and an operation button 92 for editing input text data are displayed.
In the display area 80-3B of the fifth embodiment, in a case where a text is input to the input area 91 and the operation button 92 is operated, an edited image showing the text is displayed.
Note that the text input in the input area 91 is a letter or a character string, and the image representing the text is a text image representing the letter or the character string. Hereinafter, an image representing text may be referred to as a text image.
In the fifth embodiment, an edited text image, obtained by editing the text image representing the input text, is treated as one element.
The example of fig. 25 shows a case where the text "ABC" is entered in the input area 91. When the text "ABC" is input in the input area 91 and the operation button 92 is operated, the editing section 317 edits a text image representing the text "ABC".
In the display area 80-3B of fig. 25, content images 93, 94, 95, and 96 including the edited text image are displayed. For example, the content images 93 to 96 are regarded as candidates of content images to be combined with the image represented by the image data stored in the terminal apparatus 300C.
For example, the content image 93 includes a text image in which the text "ABC" is drawn in the font type "Calibri", and the content image 94 includes a text image in which the text "ABC" is drawn in the font type "HG Marugoshic M-PRO". For example, content image 95 includes a text image in which text "ABC" is drawn in bold form of "Meiryo UI", and content image 96 includes a text image in which text "ABC" is drawn in font type "Meiryo UI".
The font types are not limited to the above font types, and a content image may include a text image drawn using another font type. The font type to be used may also vary depending on the region. For example, in Japan, a font type such as "Otsuka Gothic pro H" may be used.
Each text image included in the content images 93 to 96 is an edited text image in which the text image of the text "ABC" is edited by the editing section 317. In other words, the text images included in the content images 93 to 96 are regarded as edited elements included in the content images.
In the fifth embodiment, for example, when any one of the content images 93 to 96 is selected in the display area 80-3B and the operation button 97 is operated, the selected content image is applied as a target of the combining process.
In the fifth embodiment, the editing section 317 may edit a text image so that the width of the text is drawn with a predetermined width. Specifically, it is preferable that the editing section 317 in the fifth embodiment edits the text image so that, for example, the width of the line of the text image is larger than about 2.5 cm.
Table 1 below shows the results of the inventors' evaluation of the visibility of the image shown through the text for each line width of the text image. The evaluation was performed by changing the line width of a text image drawn in the font type Calibri with a font size of 360.
[Table 1]
Line width of text image   Visibility
0.5 cm                     Not visible
1.0 cm                     Not visible
1.7 cm                     Slightly visible
2.0 cm                     Slightly visible
2.5 cm                     Clearly visible
2.7 cm                     Clearly visible
For example, the width may be set based on the area of the text image. In the case of performing combination processing on a content image including a text image, the larger the area of the text image is, the wider the transparent region corresponding to the shape of the element is.
The editing section 317 in the fifth embodiment can edit each of the images representing the respective letters included in the inputted text to form a continuous image.
Fig. 26A and 26B are diagrams showing examples of editing of a text image by the editing section. In fig. 26A, the text image representing the text "ABC" is drawn as a series of continuous letter images.
In fig. 26B, a text image representing the text "ABC" is also formed of consecutive letter images. However, the editing process is performed so that the area of the text image is wider than that shown in fig. 26A.
In the fifth embodiment, the text image is edited by the editing section 317, however, the editing process is not limited in this manner. The text image can be edited by the user's operation.
Specifically, for example, the user can perform an operation for editing a text image with respect to the content images 93 to 96 displayed in the display area 80-3B of the screen 80B. In this case, for example, the images of the letters "a", "B", and "C" may be individually moved, or a width and a font type may be defined for each letter.
As described above, in the fifth embodiment, in the case where an element is text, the width and font type of a text image representing the text can be edited to create various patterns of the text image and display the various patterns to the user of the terminal apparatus 300C. In other words, in the fifth embodiment, in the case where the element included in the content image is a text image, the width and font type of the text image are edited to create various patterns of the text image.
(sixth embodiment)
The sixth embodiment will be described with reference to the drawings. The sixth embodiment is different from the first embodiment in that image data sets acquired from terminal apparatuses are combined with each other. Therefore, hereinafter, in the sixth embodiment, only the differences from the first embodiment will be described. Components having the same functional configuration as in the first embodiment are denoted by the same reference numerals as those used in the description of the first embodiment, and thus the description thereof will be omitted.
Fig. 27 is a diagram showing functions of an information processing apparatus and a terminal apparatus in the sixth embodiment.
The combined image generation processing section 220E of the information processing apparatus 200E in the sixth embodiment includes a communication section 221, an image acquisition section 222, a content list output section 223, a combined information acquisition section 224, a combination processing section 225C, and a combined image data output section 226.
The combination processing unit 225C combines the two image data sets received from the terminal device 300D. More specifically, the combination processing section 225C generates combined image data by making one of the two image data sets selected at the terminal device 300D visible in accordance with the shape of an element of the other image data set.
The combination instruction section 310D of the terminal device 300D in the sixth embodiment includes a display control section 311, an input receiving section 312, an image selection section 313, a combination information generation section 314, a communication section 315, and an element specification section 318.
The element specification section 318 in the sixth embodiment performs processing such as edge detection on the image data selected by the image selection section 313, and specifies an element in the image in accordance with a shape extracted from the image data. In addition, in response to specifying an element based on the image data, the element specifying section 318 may generate shape image data for drawing the element based on the image data.
Further, when the element specifying section 318 performs edge detection or the like and detects a plurality of elements from the image data based on a line representing an edge, the element specifying section 318 may allow the user to select an element for combination processing from the plurality of elements. In this case, the shape image data may be generated only for the elements selected for use in the combination processing.
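A minimal sketch of such element specification (assuming OpenCV; the thresholds and the "largest contour" rule are illustrative choices, not requirements of the embodiment) is shown below:

```python
import cv2
import numpy as np

def specify_element(photo_path, out_mask_path):
    """Detect edges, take the largest contour as the element, and fill it
    to produce shape image data (a binary mask)."""
    gray = cv2.imread(photo_path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(gray, 50, 150)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    mask = np.zeros_like(gray)
    if contours:
        largest = max(contours, key=cv2.contourArea)
        cv2.drawContours(mask, [largest], -1, 255, thickness=cv2.FILLED)
    cv2.imwrite(out_mask_path, mask)
    return mask
```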
Next, an operation in the information processing system in the sixth embodiment will be described. Fig. 28 is a sequence chart showing the operation of the information processing system in the sixth embodiment.
Since the processing of step S2801 and step S2802 in fig. 28 is similar to the processing of step S501 and step S502 in fig. 5, the description thereof is omitted.
After step S2802, the terminal device 300D receives a selection of a plurality of image data sets to be combined together through the input receiving section 312 (step S2803).
Subsequently, the terminal apparatus 300D specifies an element included in an image from the plurality of selected image data sets by the element specifying section 318 (step S2804). At this time, the element specification unit 318 may generate the shape image data.
In the sixth embodiment, for example, in the case where the user selects, from among the plurality of image data sets selected in step S2803, an image data set from which an element is to be specified, the process of specifying an element may be performed only on that image data set.
Further, in the sixth embodiment, the processing of specifying an element may be performed for all of the plurality of pieces of image data selected in step S2803.
Subsequently, the terminal device 300D displays a preview image in which the plurality of image data sets selected in step S2803 are combined in accordance with the user operation through the display control section 311 (step S2805).
For example, in step S2803, when two image data sets are selected and shape image data is generated from one image data set, the other image data set and the shape image data may be combined, and a preview image may be displayed.
Subsequently, the terminal device 300D transmits the selected image data and combination information together with the combination instruction to the information processing device 200E through the communication section 315 (step S2806). For example, the communication section 315 may transmit the shape image data generated from one image data set, the other image data set, and the combination information as a combination instruction to the information processing apparatus 200E.
Since the processing from step S2807 to step S2812 in fig. 28 is the same as the processing from step S509 to step S514 in fig. 5, the description thereof is omitted.
Next, the process of the combination processing section 225C of the information processing apparatus 200E in the sixth embodiment will be described with reference to fig. 29. Fig. 29 is a schematic diagram for explaining a process of generating combined image data in the sixth embodiment. Fig. 29 shows details of the processing of step S2807 of fig. 28.
The combination processing section 225C in the sixth embodiment receives the combination information and the image data sets selected in step S2803 from the terminal device 300D (step S2901). Subsequently, the combination processing section 225C superimposes the received image data sets on each other based on the combination information (step S2902). Specifically, the combination processing section 225C may superimpose the shape image data on the received image data set.
Subsequently, the combination processing unit 225C generates combined image data in which the lower layer image can be seen inside the shape of the element represented by the shape image data (step S2903).
Subsequently, the combination processing section 225C applies the combined image ID as identification information in order to specify the combined image data generated in step S2903 (step S2904), and terminates the processing.
Next, a setting screen in the sixth embodiment will be described with reference to fig. 30. Fig. 30 is a schematic diagram showing a display example of a terminal apparatus in the sixth embodiment.
The screen 80C shown in fig. 30 is an example of the setting screen displayed on the terminal apparatus 300D in step S2801 in fig. 28.
The screen 80C in the sixth embodiment includes display areas 80-1, 80-2B, and 80-3C. For example, in the sixth embodiment, a list of image data sets stored in the terminal apparatus 300D is displayed in the display area 80-2B of the screen 80C. Specifically, for example, a list of image data captured by the imaging device of the terminal apparatus 300D may be displayed in the display area 80-2B. In the example of fig. 30, the list of image data sets is displayed as a list of thumbnail images.
The display area 80-3C displays an image represented by the image data selected in the display area 80-2B.
In fig. 30, the display area 80-2B displays a state in which the image 72 and the image 101 are selected and the processing is performed on the image 101 by the element specifying section 318.
The display area 80-3C displays the shape image 102 represented by the shape image data representing the shape of the element 101a specified in the image 101, and displays the image 72. The shape image 102 and the image 72 are superimposed by the user's operation.
In the sixth embodiment, when the operation button 80-4 is operated by the user, the image data representing the image 72 displayed in the display area 80-3C, the shape image data representing the shape image 102, and the combination information are transmitted to the information processing apparatus 200E.
As described above, according to the sixth embodiment, any image desired by the user can be defined as an image having any shape without using content image data prepared in advance.
In the above-described embodiment, the information processing apparatuses 200 and 200A to 200E include the combined image generation processing sections 220 and 220A to 220E, respectively, however, each functional configuration of the information processing apparatuses 200 and 200A to 200E is not limited to this functional configuration. A modification of the information processing system will be described with reference to fig. 31. Fig. 31 is a schematic diagram showing a modified embodiment of the information processing system.
In the information processing system shown in fig. 31, the information processing apparatus 200F includes a content image DB210, and the terminal apparatus 300E includes a combination image generation processing section 220 and a combination instruction section 310.
In the information processing system shown in fig. 31, the generation of the combined image data is completed in the terminal device 300E, so that the communication load between the terminal device 300E and the information processing device 200F can be reduced.
In the example of fig. 31, since the combined image data generated by the terminal apparatus 300E can be directly transmitted from the terminal apparatus 300E to the printing apparatus 400, the combined image data can be printed out without using the information processing apparatus 200F.
Although the present invention has been described in terms of embodiments, the invention is not limited to the requirements set forth in the embodiments. Within the scope of the present invention, the subject matter may be varied and may be defined appropriately according to the application thereof.
The invention can be implemented in any convenient form, for example using dedicated hardware or a mixture of dedicated hardware and software. The invention may be implemented as computer software implemented by one or more networked processing devices. The network may comprise any conventional terrestrial or wireless communication network, such as the internet. The processing device may comprise any suitably programmed device such as a general purpose computer, a personal digital assistant, a mobile telephone (such as a WAP or 3G compatible telephone) or the like. Because the present invention can be implemented as software, each aspect of the present invention includes computer software implementable on a programmable device.
The computer software may be provided to the programmable device using any storage medium for storing processor-readable code, such as a floppy disk, a hard disk, a CD ROM, a tape device, or a solid state memory device.
The hardware platform includes any desired hardware resources including, for example, a processor such as a central processing unit (CPU), a random access memory (RAM), and a hard disk drive (HDD). The CPU may be implemented by any desired number of processors. The RAM may be implemented by any desired volatile or non-volatile memory. The HDD may be implemented by any desired nonvolatile memory capable of storing a large amount of data. The hardware resources may additionally include an input device, an output device, or a network device, depending on the type of the apparatus. Alternatively, the HDD may be provided outside the apparatus as long as the HDD is accessible. In this example, the CPU and the RAM, such as a cache memory of the CPU, may be used as a physical memory or a main memory of the apparatus, and the HDD may be used as a secondary memory of the apparatus.
The present application is based on and claims priority to Japanese prior patent application No. 2018-216044, filed on November 16, 2018, and Japanese prior patent application No. 2019-205610, filed on November 13, 2019, the entire contents of which are incorporated herein by reference.
List of reference numerals
100. 100A information processing system
200. 200A to 200F information processing apparatus
210 content image DB
210A image DB
220. 220A to 220E combined image generation processing section
221 communication part
222 image acquisition unit
222A position information acquisition unit
223 content list output unit
223A content selection unit
224 combined information acquiring unit
225 combined processing part
226 combined image data output unit
227 metadata extraction unit
228 fee charge judging part
229 charging history storage unit
230 image accumulating section
231 image selection unit
251 mode determination unit
252 edit information acquisition unit
300. 300A to 300E terminal device
310 combined indication part

Claims (10)

1. An information processing system, comprising:
a terminal device; and
an information processing apparatus for processing an information signal,
wherein the information processing apparatus executes a first process including:
storing the first image data in a first storage device,
receiving second image data from the terminal device,
generating third image data representing a third image for displaying a part of the second image represented by the second image data in an area inside or outside the shape of an element included in the first image represented by the first image data, and
outputting the third image data to an output device, and
wherein the terminal apparatus executes a second process including:
an input is received from a user and the user,
storing the second image data in a second storage device, an
Transmitting the second image data to the information processing apparatus.
2. The information processing system according to claim 1, wherein the third image is an image representing that at least a part of the second image is located in a region inside the shape of the element.
3. The information processing system of claim 1 or 2, wherein receiving the input comprises receiving a selection for one of the following in the generation of the third image data:
a first generation process of generating the third image, wherein at least a part of the second image represented by the second image data is displayed in a region inside or outside a shape of the element included in the first image represented by the first image data, and
a second generation process other than the first generation process.
4. The information processing system according to claim 3, wherein the second generation process generates the third image data in which at least the part of the second image displayed in the area by the first generation process is not displayed and at least another part of the second image is displayed in another area different from the area.
5. The information processing system according to any one of claims 1 to 3, wherein the first storage device stores:
the first image data, and
information associated with the first image data, the information indicating whether execution of the first generation process is permitted, the first generation process generating the third image data representing the third image in which at least a part of the second image represented by the second image data is displayed in an area inside or outside the shape of the element included in the first image represented by the first image data.
6. The information processing system according to any one of claims 1 to 4, wherein the second process of the terminal device further includes displaying at least one of the first image and the second image on a display device,
wherein receiving the input includes receiving an input to change a position of the first image or the second image, and
wherein the displaying is performed such that, based on the input to change the position of the first image or the second image, the second image is visible through the first image with a first transparency in a first region inside the element of the first image, and the second image is visible through the first image with a second transparency, different from the first transparency, in a second region outside the element.
7. The information processing system according to any one of claims 1 to 6, wherein the element is a letter having a line width of 2.5 cm or more.
8. The information processing system according to any one of claims 1 to 7, further comprising an output device,
wherein the first process of the information processing apparatus further includes outputting the third image data to the output device,
wherein the output device performs image formation on a recording medium based on the third image data received from the information processing apparatus.
9. An information processing apparatus, comprising:
a memory; and
a processor coupled to the memory and configured to perform a process including generating third image data representing a third image for displaying a part of a second image represented by second image data in a region inside or outside a shape of an element included in a first image represented by first image data.
10. A computer-readable recording medium for causing a computer to execute a process, the process comprising:
generating third image data representing a third image for displaying a part of a second image represented by second image data in an area inside or outside a shape of an element included in a first image represented by first image data.
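For ease of understanding only, the following is a minimal sketch, in Python using the Pillow library, of one way to realize the generation of the third image recited in claim 1 and a transparency-based preview in the spirit of claim 6. It is not the claimed implementation: the file names, the assumption that the element shape (for example, a large letter in the first image) is supplied as a separate grayscale mask image, and the alpha values are all hypothetical.

# A minimal sketch, not the claimed implementation. It assumes the element
# shape is available as a grayscale mask image; all file names are placeholders.
from PIL import Image

def generate_third_image(first_path, second_path, element_mask_path, inside=True):
    """Generate the third image: a part of the second image is shown in the
    region inside (or outside) the element shape of the first image."""
    first = Image.open(first_path).convert("RGBA")
    second = Image.open(second_path).convert("RGBA").resize(first.size)
    mask = Image.open(element_mask_path).convert("L").resize(first.size)
    if not inside:
        mask = mask.point(lambda v: 255 - v)  # use the region outside the element
    # Where the mask is white, pixels come from the second image;
    # elsewhere the first image is kept unchanged.
    return Image.composite(second, first, mask)

def preview_with_transparency(first_path, second_path, element_mask_path,
                              inside_alpha=80, outside_alpha=220):
    """Rough preview: the first image is drawn over the second image with a
    lower opacity inside the element region, so the second image shows through
    with a different transparency inside and outside the element."""
    first = Image.open(first_path).convert("RGBA")
    second = Image.open(second_path).convert("RGBA").resize(first.size)
    mask = Image.open(element_mask_path).convert("L").resize(first.size)
    # Per-pixel opacity of the first image: inside_alpha inside the element,
    # outside_alpha elsewhere (0 = fully transparent, 255 = fully opaque).
    alpha = mask.point(lambda v: inside_alpha if v > 127 else outside_alpha)
    top = first.copy()
    top.putalpha(alpha)
    return Image.alpha_composite(second, top)

if __name__ == "__main__":
    combined = generate_third_image("first.png", "second.png", "element_mask.png")
    combined.save("third.png")

In this sketch, Image.composite simply switches between the two source images according to the element mask, which corresponds to displaying a part of the second image only within (or outside) the element shape; an actual system would additionally handle scaling and positioning of the second image in response to user input, as well as the permission and charging-related processing associated with the first image data.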
CN201980075011.7A 2018-11-16 2019-11-15 Information processing system, information processing apparatus, and recording medium Withdrawn CN113039583A (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2018-216044 2018-11-16
JP2018216044 2018-11-16
JP2019205610A JP7230780B2 (en) 2018-11-16 2019-11-13 Information processing system, terminal device and program
JP2019-205610 2019-11-13
PCT/JP2019/044906 WO2020101019A1 (en) 2018-11-16 2019-11-15 Information processing system, information processing apparatus, and recording medium

Publications (1)

Publication Number Publication Date
CN113039583A true CN113039583A (en) 2021-06-25

Family

ID=70908524

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980075011.7A Withdrawn CN113039583A (en) 2018-11-16 2019-11-15 Information processing system, information processing apparatus, and recording medium

Country Status (3)

Country Link
EP (1) EP3881291A1 (en)
JP (1) JP7230780B2 (en)
CN (1) CN113039583A (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230298304A1 (en) * 2020-07-08 2023-09-21 Tosyo, Inc. Image data processing method, image data processing apparatus, and commercial use
WO2023166637A1 (en) * 2022-03-02 2023-09-07 Tosyo株式会社 Method and device for running virtual store

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1912920A (en) * 2005-08-11 2007-02-14 兄弟工业株式会社 Information processing device
US8866841B1 (en) * 2009-04-03 2014-10-21 Joshua Distler Method and apparatus to deliver imagery with embedded data
US20160005229A1 (en) * 2014-07-01 2016-01-07 Samsung Electronics Co., Ltd. Electronic device for providing map information
US9332149B2 (en) * 2013-08-23 2016-05-03 Brother Kogyo Kabushiki Kaisha Image processing apparatus capable of generating image including arranged images
CN108476273A (en) * 2015-11-18 2018-08-31 麦克赛尔株式会社 Information processing unit and its control method for image data

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4124084B2 (en) * 2003-10-02 2008-07-23 セイコーエプソン株式会社 Image processing apparatus, image processing method, and image processing program
EP2813929A4 (en) * 2012-02-10 2015-09-30 Sony Corp Information processing device, information processing method, and program

Also Published As

Publication number Publication date
EP3881291A1 (en) 2021-09-22
JP7230780B2 (en) 2023-03-01
JP2020087457A (en) 2020-06-04

Similar Documents

Publication Publication Date Title
US9003330B2 (en) User interface for selecting a photo tag
US8667384B2 (en) User interface for editing photo tags
JP5773618B2 (en) Information processing apparatus, control method for information processing apparatus, and program
US20110099523A1 (en) Product selection and management workflow
CN113039583A (en) Information processing system, information processing apparatus, and recording medium
CA2630944C (en) User interface for editing photo tags
JP5315424B2 (en) Image display control apparatus and control method thereof
CA2630947C (en) User interface for selecting a photo tag
JP2010039583A (en) Method and system for displaying photograph on electronic map, and electronic map therefor
US9824447B2 (en) Information processing apparatus, information processing system, and information processing method
JP2010044625A (en) Information processing apparatus, and method of controlling the same
JP2008145935A (en) Historical map output unit, historical map output method, and program
WO2020101019A1 (en) Information processing system, information processing apparatus, and recording medium
JP6340124B1 (en) Photo production system, sales system, photo production apparatus and program
JP7126392B2 (en) Information processing system, information processing system program, and information processing method
CN112400149A (en) Method for executing operation based on bending and electronic equipment
KR101399498B1 (en) Smartphone and tablet pc applications for the management of the cadastral surveying ground boundary point and operating method thereof
JP2010175896A (en) Information processor, imaging device and program
US20210041867A1 (en) Device and method for providing an enhanced graphical representation based on processed data
JP4378942B2 (en) Information management method, information management apparatus, program for causing computer to execute information management method, and recording medium recording this program
JP2005136604A (en) Device and method for supporting electrophotography album preparation
EP4187470A1 (en) Information processing apparatus, information processing system, information processing method, and carrier means
KR101480400B1 (en) Apparatus for managing image files and the method of the same
JP7464068B2 (en) Trade management system and trade management method
JP2021140390A (en) Program, composite image generation method, terminal device and information processor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20210625