CN108307084B - Information processing apparatus, information processing method, and computer program - Google Patents


Info

Publication number
CN108307084B
Authority
CN
China
Prior art keywords
function, image, information, cooperative, functions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710938467.4A
Other languages
Chinese (zh)
Other versions
CN108307084A (en)
Inventor
得地贤吾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fujifilm Business Innovation Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Business Innovation Corp
Publication of CN108307084A
Application granted
Publication of CN108307084B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127: Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00204: … with a digital computer or a digital computer system, e.g. an internet server
    • H04N1/00244: … with a server, e.g. an internet server
    • H04N1/00249: … with a photographic apparatus, e.g. a photographic printer or a projector
    • H04N1/00251: … with an apparatus for taking photographic images, e.g. a camera
    • H04N1/00326: … with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
    • H04N1/00328: … with an apparatus processing optically-read information
    • H04N1/00334: … with an apparatus processing barcodes or the like
    • H04N1/00336: … with an apparatus performing pattern recognition, e.g. of a face or a geographic feature
    • H04N1/0035: User-machine interface; Control console
    • H04N1/00405: Output means
    • H04N1/00474: Output means outputting a plurality of functional options, e.g. scan, copy or print
    • H04N1/00912: Arrangements for controlling a still picture apparatus or components thereof not otherwise provided for
    • H04N2201/00: Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077: Types of the still picture apparatus
    • H04N2201/0094: Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Facsimiles In General (AREA)
  • Accessory Devices And Overall Control Thereof (AREA)

Abstract

An information processing apparatus and an information processing method are provided. The information processing apparatus includes a controller. If a first image related to a first device required to perform a cooperative function is specified, the controller performs control to present guidance indicating a second device capable of performing the cooperative function together with the first device.

Description

Information processing apparatus, information processing method, and computer program
Technical Field
The invention relates to an information processing apparatus and an information processing method.
Background
Japanese Unexamined Patent Application Publication Nos. 2015-177504 and 2015-223006 disclose techniques for causing a plurality of devices to cooperate with each other.
However, in some cases, a desired cooperative function may not be performed.
Disclosure of Invention
Therefore, an object of the present invention is to increase user convenience in the case of performing a cooperative function.
According to a first aspect of the present invention, there is provided an information processing apparatus including a controller. The controller performs control to present guidance indicating a second device capable of performing the cooperative function together with the first device if a first image related to the first device required to perform the cooperative function is specified.
According to the second aspect of the present invention, if an image related to a device incapable of performing the cooperative function together with the first device is further specified, the controller performs control to present the guidance.
According to the third aspect of the present invention, if an operation of linking the first image and an image related to a device incapable of performing the cooperative function together with the first device to each other is performed, the controller performs control to present the guidance.
According to the fourth aspect of the present invention, the controller performs control to present the guidance if the first image and an image related to a device incapable of performing the cooperative function together with the first device are superimposed on each other.
According to a fifth aspect of the present invention, if a partial image included in the first image is specified, the controller performs control to present the guidance indicating the second device capable of performing the cooperative function together with a function corresponding to the partial image.
According to a sixth aspect of the present invention, as the control for presenting the guidance, the controller performs control to display a candidate list showing information on one or more second devices capable of performing the cooperative function.
According to a seventh aspect of the present invention, if a second device is specified from among the one or more second devices on the candidate list, the controller performs control to display information on the cooperative function using the specified second device.
According to an eighth aspect of the present invention, the controller performs control to display the cooperative function while changing the cooperative function according to an order in which the first device and the second device are specified.
According to a ninth aspect of the present invention, if a first device and a second device are specified, the controller further performs control to present guidance indicating a third device capable of performing a cooperative function together with the first device and the second device.
According to a tenth aspect of the present invention, the controller performs control to present the guidance while changing the third device according to an order in which the first device and the second device are specified.
According to an eleventh aspect of the present invention, there is provided an information processing apparatus including a controller. If a first image related to a first function required to execute a cooperative function is specified, the controller performs control to present guidance indicating a second function capable of executing the cooperative function together with the first function.
According to the twelfth aspect of the present invention, if an image related to a function incapable of executing the cooperative function together with the first function is further specified, the controller performs control to present the guidance.
According to the thirteenth aspect of the present invention, if an operation of linking a first image and an image related to a function which cannot execute the cooperative function together with the first function to each other is performed, the controller performs control to present the guidance.
According to a fourteenth aspect of the present invention, the controller performs control to present the guidance if the first image and an image related to a function that cannot perform the cooperative function together with the first function are superimposed on each other.
According to a fifteenth aspect of the present invention, as the control for presenting the guidance, the controller performs control to display a candidate list showing information on one or more second functions capable of performing the cooperative function.
According to a sixteenth aspect of the present invention, the order in which the one or more second functions are arranged in the candidate list is determined based on past usage records of the one or more second functions.
According to a seventeenth aspect of the present invention, the controller performs control to display the cooperative function while changing the cooperative function in accordance with an order in which the first function and the second function are specified.
According to the eighteenth aspect of the present invention, if the first function and the second function are specified, the controller further performs control to present guidance indicating a third function capable of performing a cooperative function together with the first function and the second function.
According to a nineteenth aspect of the present invention, the controller performs control to present the guidance while changing the third function in accordance with the order in which the first function and the second function are specified.
According to a twentieth aspect of the present invention, the first function and the second function are included in a group of functions registered in advance, a group of functions of one or more identified devices, a group of functions displayed on a display, or a group of functions displayed in a specific area of a screen of the display.
According to a twenty-first aspect of the present invention, there is provided an information processing method comprising: if a first image related to a first device required to perform a cooperative function is specified, control is performed to present guidance indicating a second device capable of performing the cooperative function together with the first device.
According to a twenty-second aspect of the present invention, there is provided an information processing method comprising: if a first image related to a first function required to execute a cooperative function is specified, control is performed to present guidance indicating a second function capable of executing the cooperative function together with the first function.
According to the first, ninth, and twentieth aspects of the present invention, user convenience increases in the case of performing a cooperative function.
According to the second or twelfth aspect of the present invention, the complexity that may occur if guidance is always presented can be avoided.
According to the third or thirteenth aspect of the present invention, control is performed to present guidance by an operation of linking images.
According to the fourth or fourteenth aspect of the present invention, control is performed to present guidance by an operation of superimposing images.
According to a fifth aspect of the present invention, control is performed to present guidance indicating an apparatus capable of performing a cooperative function together with a specific function of the apparatus.
According to a sixth or seventh aspect of the present invention, a list of devices capable of performing a collaboration function is presented.
According to the eighth or seventeenth aspect of the present invention, control is performed to display a cooperative function estimated to be used.
According to a tenth aspect of the present invention, control is performed to present guidance indicating an apparatus estimated to be used.
According to the eleventh, eighteenth, twentieth, or twenty-second aspect of the present invention, in the case where a function required to execute a cooperative function is specified, user convenience increases.
According to a fifteenth aspect of the invention, a list of functions capable of performing a collaboration function is presented.
According to the sixteenth aspect of the present invention, it becomes easy to grasp the relative usage records of the respective functions.
According to the nineteenth aspect of the present invention, control is performed to present guidance indicating a function estimated to be used.
Drawings
Exemplary embodiments of the present invention will be described in detail based on the following drawings, in which:
FIG. 1 is a block diagram illustrating a device system according to an exemplary embodiment of the present invention;
FIG. 2 is a block diagram illustrating an image forming apparatus according to the exemplary embodiment;
FIG. 3 is a block diagram illustrating a server according to the exemplary embodiment;
FIG. 4 is a block diagram illustrating a terminal device according to the exemplary embodiment;
FIG. 5 is a schematic diagram showing the appearance of the image forming apparatus;
FIG. 6 is a diagram showing an example of a device function management table;
FIG. 7 is a diagram showing an example of a cooperation function management table;
FIG. 8 is a diagram showing a device used alone;
FIG. 9 is a diagram showing an example of a function display screen;
FIG. 10 is a diagram showing an example of a function display screen;
FIG. 11 is a diagram showing target devices cooperating with each other;
FIG. 12 is a diagram showing an example of a function display screen;
FIG. 13 is a sequence diagram showing a connection process;
FIGS. 14A and 14B are diagrams showing examples of device display screens;
FIG. 15 is a diagram showing an example of a device display screen according to Example 1;
FIG. 16 is a diagram showing an example of a device display screen according to Example 2;
FIG. 17 is a diagram showing an example of a device display screen according to Example 3;
FIG. 18 is a diagram showing an example of a device display screen according to Example 3;
FIG. 19 is a diagram showing an example of a device display screen according to Example 3;
FIG. 20 is a diagram showing an example of a device display screen according to Example 3;
FIG. 21 is a diagram showing an example of a device display screen according to Example 4;
FIG. 22 is a diagram showing an example of a device display screen according to Example 4;
FIG. 23 is a diagram showing an example of a device display screen according to Example 5;
FIG. 24 is a diagram showing an example of a device display screen according to Example 5;
FIG. 25 is a diagram showing an example of a screen according to Example 5;
FIG. 26 is a diagram showing an example of a device selection screen according to Example 6;
FIG. 27 is a diagram showing an example of a device selection screen according to Example 7;
FIG. 28 is a diagram showing an example of a device selection screen according to Example 7;
FIG. 29 is a diagram showing an example of a function selection screen according to Example 7;
FIG. 30 is a diagram showing an example of a screen according to Example 7;
FIG. 31 is a diagram showing an example of a message according to Example 7;
FIG. 32 is a diagram showing an example of a message according to Example 7;
FIG. 33 is a diagram showing an example of a device selection screen according to Example 8;
FIG. 34 is a diagram showing an example of a device selection screen according to Example 8;
FIG. 35 is a diagram showing an example of a screen according to Example 8;
FIG. 36 is a diagram showing an example of a cooperation function management table;
FIGS. 37A and 37B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
FIGS. 38A and 38B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
FIG. 39 is a diagram showing an example of a device function management table;
FIGS. 40A and 40B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
FIG. 41 is a diagram showing an example of a device function management table;
FIG. 42 is a diagram showing an example of a cooperation function management table;
FIGS. 43A, 43B, and 43C are diagrams showing examples of screens displayed on the terminal device;
FIGS. 44A and 44B are diagrams showing examples of screens displayed on the terminal device; and
FIGS. 45A and 45B are diagrams showing examples of screens displayed on the terminal device.
Detailed Description
A device system serving as an information processing system according to an exemplary embodiment of the present invention will be described with reference to FIG. 1. FIG. 1 shows an example of the device system according to the exemplary embodiment.
The device system according to the exemplary embodiment includes a plurality of devices (for example, devices 10 and 12), a server 14 as an example of an external apparatus, and a terminal device 16 as an example of an information processing apparatus. The devices 10 and 12, the server 14, and the terminal device 16 have a function of communicating with one another through a communication path N such as a network. Of course, the devices 10 and 12, the server 14, and the terminal device 16 may communicate with other devices through different communication paths. In the example shown in FIG. 1, two devices (the devices 10 and 12) are included in the device system, but three or more devices may be included. Further, a plurality of servers 14 and a plurality of terminal devices 16 may be included in the device system.
The devices 10 and 12 are apparatuses having specific functions, and may each be, for example, an image forming apparatus having an image forming function, a personal computer (PC), a projector, a display apparatus such as a liquid crystal display, a telephone, a clock, a surveillance camera, or the like. The devices 10 and 12 also have a function of transmitting data to and receiving data from other apparatuses. In the exemplary embodiment, it is assumed that the device 10 is an image forming apparatus. The image forming apparatus (device 10) is an apparatus having at least one of a scan function, a print function, a copy function, and a facsimile function.
The server 14 is a device that manages functions of the respective apparatuses. For example, the server 14 manages functions of the respective apparatuses, a cooperation function using a plurality of functions, and the like. The server 14 also has a function of transmitting and receiving data to and from another device.
The server 14 may manage, for each user, one or more functions available to that user. For example, the functions available to a user are functions offered free of charge, or functions offered for a charge and purchased by the user. The server 14 may manage, for each user, available function information (e.g., function purchase history information) indicating the one or more functions available to that user. Of course, the server 14 does not have to manage functions according to whether they have been purchased, since there are also free functions, additional update functions, and special functions managed by an administrator. The function purchase processing is executed by, for example, the server 14. Of course, the function purchase processing may be performed by another apparatus.
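As an illustrative sketch only (not part of the patent; the class name, user IDs, and function names are all invented), the server 14's per-user management of available functions could be modeled as a mapping from a user to the set of functions that user may use:

```python
# Hypothetical model of per-user available-function management (cf. server 14).
# All identifiers here are assumptions for illustration, not from the patent.

class FunctionManager:
    """Tracks which functions are available to each user."""

    def __init__(self):
        # user_id -> set of function names available to that user
        self._available = {}

    def register_free_function(self, user_id, function):
        # Free functions become available without a purchase record.
        self._available.setdefault(user_id, set()).add(function)

    def record_purchase(self, user_id, function):
        # A charged function becomes available once the user purchases it
        # (i.e., it enters the function purchase history).
        self._available.setdefault(user_id, set()).add(function)

    def available_functions(self, user_id):
        # Return the user's available functions in a stable order.
        return sorted(self._available.get(user_id, set()))


manager = FunctionManager()
manager.register_free_function("userA", "scan")
manager.record_purchase("userA", "scan-and-transfer")
print(manager.available_functions("userA"))  # ['scan', 'scan-and-transfer']
```

A user with no registered or purchased functions simply gets an empty list, which matches the idea that availability is managed per user rather than per device.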
The terminal device 16 is a device such as a PC, a tablet PC, a smartphone, or a mobile phone, and has a function of transmitting and receiving data to and from another device. When the apparatus is used, the terminal device 16 functions as, for example, a user interface unit (UI unit).
In the device system according to the exemplary embodiment, if a first image related to a first device is specified, control is performed to present guidance indicating a second device capable of performing a cooperative function together with the first device. Alternatively, if a first image related to a first function is specified, a control may be performed to present guidance indicating a second function capable of performing a cooperative function together with the first function.
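The guidance control described above can be sketched as a lookup over a table of device combinations that support a cooperative function: given the specified first device, every other device appearing in a supported combination is a candidate second device. This is a hypothetical illustration; the table contents and device names are invented:

```python
# Illustrative sketch of guidance presentation (not the patent's implementation).
# Combinations of devices that can perform some cooperative function together;
# the entries below are invented examples.
COOPERATIVE_COMBINATIONS = {
    frozenset({"multifunction printer", "PC"}): "scan-and-transfer",
    frozenset({"multifunction printer", "projector"}): "print-and-project",
}

def candidate_second_devices(first_device):
    """Return devices that can perform a cooperative function with first_device."""
    candidates = set()
    for combination in COOPERATIVE_COMBINATIONS:
        if first_device in combination:
            # Every other device in a supported combination is a candidate.
            candidates.update(combination - {first_device})
    return sorted(candidates)

# When the first image (e.g., of the multifunction printer) is specified,
# the guidance would present these candidate second devices.
print(candidate_second_devices("multifunction printer"))
```

The sorted candidate list corresponds to the "candidate list" of the sixth aspect; the sixteenth aspect's ordering by past usage records could replace the alphabetical sort with a sort key over usage counts.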
Hereinafter, each device included in the device system according to the exemplary embodiment will be described in detail.
The configuration of the device 10 as an image forming apparatus will be described in detail with reference to fig. 2. Hereinafter, the device 10 may be referred to as an image forming apparatus 10. Fig. 2 shows a configuration of the image forming apparatus 10.
The communication unit 18 is a communication interface, and has a function of transmitting data to another device and a function of receiving data from another device. The communication unit 18 may be a communication interface having a wireless communication function, or may be a communication interface having a wired communication function.
The image forming unit 20 has an image forming function. Specifically, the image forming unit 20 has at least one of a scan function, a print function, a copy function, and a facsimile function. When the scan function is executed, a document is read and scan data (image data) is generated. When the print function is executed, an image is printed on a recording medium such as paper. When the copy function is executed, a document is read and printed on a recording medium. When the facsimile function is executed, image data is transmitted or received by facsimile. In addition, a cooperative function using a plurality of functions may be performed. For example, a scan-and-transfer function, which is a combination of the scan function and a transfer (transmission) function, may be performed. When the scan-and-transfer function is performed, a document is read, scan data (image data) is generated, and the scan data is transmitted to a destination (for example, an external apparatus such as the terminal device 16). Of course, this cooperative function is merely an example, and other cooperative functions may be performed.
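The scan-and-transfer cooperative function described above can be sketched as a pipeline of two single functions. This is a minimal illustration only; the hardware and network steps are stubbed out, and all names are invented:

```python
# Minimal sketch of the scan-and-transfer cooperative function as a pipeline
# of two single functions. Stubs stand in for hardware/network operations.

def scan(document):
    # In a real device this would drive the scanner and return image data.
    return f"scan-data({document})"

def transfer(data, destination):
    # In a real device this would transmit the data over the network,
    # e.g. to an external apparatus such as the terminal device 16.
    return f"sent {data} to {destination}"

def scan_and_transfer(document, destination):
    """Combine the scan function and the transfer function into one operation."""
    return transfer(scan(document), destination)

print(scan_and_transfer("contract.pdf", "terminal-16"))
```

The point of the pipeline shape is that the cooperative function adds no new primitive capability; it only composes functions that already exist individually.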
The memory 22 is a storage device such as a hard disk or a memory (e.g., a Solid State Drive (SSD), etc.). The memory 22 stores, for example, information indicating an image forming instruction (e.g., job information), image data to be printed, scan data generated by executing a scan function, device address information indicating an address of another device, server address information indicating an address of the server 14, various control data, and various programs. Of course, such information and data may be stored in different storage devices or in one storage device.
The UI unit 24 is a user interface unit, and includes a display and an operation unit. The display is a display device such as a liquid crystal display. The operation unit is an input device such as a touch panel or a keyboard. Of course, a user interface serving as both a display and an operation unit (for example, a touch display, or a device including a display that electronically displays a keyboard or the like) may be used. The image forming apparatus 10 does not necessarily include the UI unit 24; it may instead include a hardware user interface unit (hardware UI unit) without a display. For example, the hardware UI unit is a hardware keypad dedicated to inputting numbers (e.g., a numeric keypad) or a hardware keypad dedicated to indicating directions (e.g., a direction indication keypad).
The controller 26 controls the operations of the respective units of the image forming apparatus 10.
Hereinafter, the configuration of the server 14 will be described in detail with reference to fig. 3. Fig. 3 shows the configuration of the server 14.
The communication unit 28 is a communication interface, and has a function of transmitting data to another device and a function of receiving data from another device. The communication unit 28 may be a communication interface having a wireless communication function, or may be a communication interface having a wired communication function.
The memory 30 is a storage device such as a hard disk or a memory (e.g., SSD or the like). The memory 30 stores, for example, device function management information 32, cooperation function management information 34, various data, various programs, device address information indicating the address of each device, and server address information indicating the address of the server 14. Of course, such information and data may be stored in different storage devices or in one storage device. The device function management information 32 and the cooperation function management information 34 stored in the memory 30 may be provided to the terminal apparatus 16 periodically or at a specified timing, and thus, the information stored in the terminal apparatus 16 may be updated. Hereinafter, the device function management information 32 and the cooperation function management information 34 will be described.
The device function management information 32 is information for managing the functions of the respective devices, and is, for example, information representing a correspondence relationship between device identification information for identifying a device and one or more pieces of function information representing one or more functions of the device. The device identification information includes, for example, a device ID, a device name, information indicating the type of the device, the model number of the device, information indicating the position of the device (device position information), and an appearance image representing the appearance of the device. The function information includes, for example, a function ID and a function name. For example, if the image forming apparatus 10 has a scan function, a print function, a copy function, and a scan-and-transfer function, the device identification information of the image forming apparatus 10 is associated with function information indicating the scan function, function information indicating the print function, function information indicating the copy function, and function information indicating the scan-and-transfer function. The functions of the respective devices are determined (identified) by referring to the device function management information 32.
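As a hypothetical illustration of the device function management information 32 (the keys, IDs, and table contents below are invented for the sketch, not taken from the patent), the correspondence between device identification information and function information can be expressed as a simple mapping:

```python
# Hypothetical sketch of device function management information 32:
# device identification information mapped to function information.
DEVICE_FUNCTIONS = {
    "image-forming-apparatus-10": {
        "device_id": "dev-10",
        "type": "multifunction printer",
        "functions": ["scan", "print", "copy", "scan-and-transfer"],
    },
    "device-12": {
        "device_id": "dev-12",
        "type": "projector",
        "functions": ["project"],
    },
}

def functions_of(device_name):
    """Identify a device's functions by referring to the management table."""
    entry = DEVICE_FUNCTIONS.get(device_name)
    return entry["functions"] if entry else []

print(functions_of("image-forming-apparatus-10"))
```

Registering a new device (per the update mechanism described below) would simply add another entry to this mapping, and an update could add or remove names from a device's `functions` list.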
For example, the devices managed by means of the device function management information 32 are the devices included in the device system (e.g., the devices 10 and 12). Of course, devices not included in the device system may also be managed by means of the device function management information 32. For example, the server 14 may obtain information about a new device not included in the device system (information including device identification information and function information) and may newly register that information in the device function management information 32. The information about the device may be obtained via the Internet or the like, or may be input by an administrator or the like. The server 14 may update the device function management information 32 at an arbitrary timing, periodically, or at a timing specified by an administrator or the like. Accordingly, function information indicating a function that the device did not have before an update but has after the update can be registered in the device function management information 32. Likewise, function information indicating a function that the device had before an update but no longer has after the update may be deleted from the device function management information 32, or may be registered as unavailable information. The information used for updating may be obtained via the Internet or the like, or may be input by an administrator or the like.
The cooperative function management information 34 is information for managing cooperative functions, each of which is executed through cooperation among a plurality of functions. Each cooperative function may be performed by cooperation between functions of one apparatus (e.g., the apparatus 10 or 12), or may be performed by cooperation between functions of a plurality of apparatuses (e.g., the apparatuses 10 and 12). A terminal device that provides the operation instruction (in this exemplary embodiment, the terminal device 16) may be included among the devices to be identified, and a function of the terminal device may be used as a part of a cooperative function.
The cooperative function may be a function performed without using a hardware device. For example, the cooperative function may be a function performed by cooperation between a plurality of software units. Of course, the cooperative function may be a function executed by cooperation between a function of a hardware device and a function realized by software.
For example, the cooperative function management information 34 is information representing a correspondence relationship between a combination of pieces of function information indicating the respective functions used in a cooperative function and cooperative function information indicating the cooperative function. For example, the cooperative function information includes a cooperative function ID and a cooperative function name. If a single function is updated, the cooperative function management information 34 is also updated according to the update. Therefore, a cooperative function using a plurality of functions that cannot cooperate with each other before an update may become available after the update, or a cooperative function available before the update may become unavailable after the update. The cooperative function information indicating a cooperative function that becomes available after the update is registered in the cooperative function management information 34, and the cooperative function information indicating a cooperative function that becomes unavailable after the update is deleted from the cooperative function management information 34 or registered as unavailable information.
In the case of causing a plurality of devices to cooperate with each other, the cooperative function management information 34 is information for managing one or more cooperative functions that use a plurality of functions of the plurality of devices, and is information representing a correspondence relationship between a combination of the pieces of device identification information identifying the respective devices used for the one or more cooperative functions and the cooperative function information. If the device function management information 32 is updated, the cooperative function management information 34 is also updated according to the update. Therefore, a cooperative function using a plurality of devices that cannot cooperate with each other before the update may become available after the update, or a cooperative function available before the update may become unavailable after the update.
The cooperative function may be a function performed by cooperation between a plurality of different functions, or may be a function performed by cooperation between the same functions. The cooperative function may be a function that is not available without cooperation. A function that is not available without cooperation may be a function that becomes available by combining the same function among the functions of the target devices cooperating with each other, or a function that becomes available by combining different functions among the functions of the target devices cooperating with each other. For example, cooperation between an apparatus having a print function (printer) and an apparatus having a scan function (scanner) realizes a copy function as a cooperative function. That is, cooperation between the print function and the scan function realizes the copy function. In this case, the copy function as the cooperative function is associated with a combination of the print function and the scan function. In the cooperative function management information 34, for example, cooperative function information indicating the copy function as a cooperative function is associated with a combination of device identification information for identifying a device having the print function and device identification information for identifying a device having the scan function.
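To make the association concrete, the following Python sketch (not part of the embodiment; every name and entry is illustrative) maps an order-independent combination of single functions to the cooperative function it realizes, mirroring the print-plus-scan-equals-copy example above:

```python
# Hypothetical sketch: cooperative function management entries keyed by a
# combination of the single functions used. A frozenset makes the
# combination order-independent. All names are invented for illustration.
COOPERATIVE_FUNCTIONS = {
    frozenset({"print", "scan"}): "copy",
    frozenset({"scan", "transfer"}): "scan-and-transfer",
}

def find_cooperative_function(functions):
    """Return the cooperative function realized by the given combination
    of functions, or None if the combination is not registered."""
    return COOPERATIVE_FUNCTIONS.get(frozenset(functions))

print(find_cooperative_function(["scan", "print"]))  # -> copy
```

Because the key is a set of functions rather than an ordered list, "print with scan" and "scan with print" resolve to the same cooperative function, which matches the idea that the copy function is associated with the combination itself.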
The memory 30 may store available function management information. The available function management information is information for managing one or more functions available to each user, and is, for example, information representing a correspondence between user identification information for identifying a user and one or more pieces of function information (which may include cooperative function information) representing one or more functions available to the user. As described above, the functions available to the user are, for example, functions provided to the user without payment or functions purchased by the user, and may be single functions or cooperative functions. The user identification information is, for example, user account information such as a user ID and a user name. The functions available to the respective users are determined (identified) by referring to the available function management information. The available function management information is updated each time a function is provided to the user (e.g., each time a function is provided to the user without charge or for a fee).
The controller 36 controls the operations of the respective units of the server 14. The controller 36 comprises a determination unit 38.
The determination unit 38 receives device identification information for identifying a device and determines one or more pieces of function information representing one or more functions associated with the device identification information among the device function management information 32 stored in the memory 30. Thus, one or more functions of the device are determined (identified). For example, device identification information is transmitted from the terminal apparatus 16 to the server 14, and then the determination unit 38 determines one or more pieces of function information representing one or more functions associated with the device identification information. For example, information on the one or more functions (e.g., function information and function specification information) is transmitted from the server 14 to the terminal device 16 and displayed on the terminal device 16. Accordingly, information on one or more functions of the device determined by the device identification information is displayed on the terminal apparatus 16.
Further, the determination unit 38 receives a plurality of pieces of device identification information for identifying target devices that cooperate with each other, and determines one or more pieces of cooperative function information representing one or more cooperative functions associated with the combination of the plurality of pieces of device identification information, among the cooperative function management information 34 stored in the memory 30. Thus, one or more cooperative functions performed by cooperation between the functions of the target devices that cooperate with each other are determined (identified). For example, a plurality of pieces of device identification information are transmitted from the terminal apparatus 16 to the server 14, and then the determination unit 38 determines one or more pieces of cooperative function information representing one or more cooperative functions associated with the plurality of pieces of device identification information. For example, information on the one or more cooperative functions (e.g., cooperative function information and cooperative function specification information) is transmitted from the server 14 to the terminal device 16 and displayed on the terminal device 16. Accordingly, information on one or more cooperative functions performed by the plurality of devices determined by the plurality of pieces of device identification information is displayed on the terminal device 16.
For example, if a device is identified (e.g., if a device is photographed), the determination unit 38 may receive device identification information for identifying the device, and may determine one or more pieces of function information representing one or more functions associated with the device identification information in the device function management information 32. Thus, if a device is identified (e.g., if a device is photographed), one or more functions of the device are determined (identified). If a plurality of devices are identified (e.g., if a plurality of devices are photographed), the determination unit 38 may receive a plurality of pieces of device identification information for identifying respective devices included in the plurality of devices, and may determine one or more pieces of cooperation function information representing one or more cooperation functions associated with a combination of the plurality of pieces of device identification information in the cooperation function management information 34. Accordingly, if a plurality of devices are identified (e.g., if a plurality of devices are photographed), one or more cooperative functions using the functions of the plurality of devices are determined (identified).
The determination unit 38 may receive pieces of function information indicating respective functions used in the cooperative functions, and may determine one or more pieces of cooperative function information indicating one or more cooperative functions associated with combinations of the pieces of function information in the cooperative function management information 34 stored in the memory 30. Thus, one or more cooperative functions performed by cooperation between the target functions are determined (identified). For example, a plurality of pieces of function information are transmitted from the terminal device 16 to the server 14, and then the determination unit 38 determines one or more pieces of cooperative function information representing one or more cooperative functions associated with the plurality of pieces of function information. In a manner similar to the above, information on one or more cooperative functions performed by the plurality of functions determined by the plurality of pieces of function information is displayed on the terminal device 16.
If functions available to the user are managed, the determination unit 38 may receive user identification information for identifying the user, and may determine function information representing respective functions associated with the user identification information among the available function management information stored in the memory 30. Thus, a set of functions available to the user is determined (identified). For example, user identification information is transmitted from the terminal device 16 to the server 14, and function information indicating respective functions associated with the user identification information is determined by the determination unit 38. For example, information on the respective functions available to the user (e.g., information indicating names of the respective functions) is transmitted from the server 14 to the terminal device 16 and displayed on the terminal device 16. Accordingly, information on the respective functions available to the user determined by the user identification information is displayed on the terminal device 16. For example, the determination unit 38 receives the device identification information and the user identification information, determines one or more pieces of function information representing one or more functions associated with the device identification information in the device function management information 32, and also determines one or more pieces of function information representing one or more functions associated with the user identification information in the available function management information. Thus, one or more functions that the device determined by the device identification information has and that are available to the user determined by the user identification information are determined.
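The last step above, determining the functions that a device has and that are also available to the user, can be pictured as a simple set intersection between the two kinds of management information. The sketch below is a hypothetical illustration; the identifiers and data are invented:

```python
# Illustrative sketch of intersecting the device function management
# information with the available function management information to find
# functions that the device has AND that the user may use. All IDs and
# function names are hypothetical.
DEVICE_FUNCTIONS = {            # device ID -> functions of the device
    "B": {"print", "scan", "copy"},
}
USER_FUNCTIONS = {              # user ID -> functions available to the user
    "user1": {"print", "scan"},
}

def functions_for(device_id, user_id):
    """Functions of the identified device that the identified user may use."""
    return DEVICE_FUNCTIONS.get(device_id, set()) & USER_FUNCTIONS.get(user_id, set())

print(sorted(functions_for("B", "user1")))  # -> ['print', 'scan']
```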
The controller 36 may perform the function purchase process and may manage the purchase history. For example, if the user purchases a function for a fee, the controller 36 may apply a charging process to the user.
The controller 36 may execute functions related to image processing, such as a character recognition function, a translation function, an image processing function, and an image forming function. Of course, the controller 36 may execute functions related to processes other than image processing. When the character recognition function is executed, characters in an image are recognized and character data representing the characters is generated. When the translation function is executed, characters in an image are translated into characters expressed in a specific language, and character data representing the translated characters is generated. When the image processing function is executed, the image is processed. For example, the controller 36 may receive scan data generated by execution of a scan function from the image forming apparatus 10, and may execute a function related to image processing, such as the character recognition function, the translation function, or the image processing function, on the scan data. The controller 36 may receive image data from the terminal device 16 and perform various functions on the image data. For example, character data or image data generated by the controller 36 is transmitted from the server 14 to the terminal device 16. The server 14 may function as an external device, and the cooperative function may be a function using functions of a plurality of devices including the server 14.
Hereinafter, the configuration of the terminal device 16 will be described in detail with reference to fig. 4. Fig. 4 shows the configuration of the terminal device 16.
The communication unit 40 is a communication interface, and has a function of transmitting data to another device and a function of receiving data from another device. The communication unit 40 may be a communication interface having a wireless communication function, or may be a communication interface having a wired communication function.
The camera 42 serving as a photographing unit photographs a subject, thereby generating image data (e.g., still image data or moving image data). Alternatively, instead of using the camera 42 of the terminal device 16, image data captured by an external camera connected to a communication path such as a network may be received by the communication unit 40 and may be displayed on the UI unit 46 so that the image data may be operated by the user.
The memory 44 is a storage device such as a hard disk or a memory (e.g., an SSD). The memory 44 stores various programs, various data, address information of the server 14, address information of each device (for example, address information of the devices 10 and 12), information on the identified devices, information on the identified target devices cooperating with each other, information on functions of the identified devices, and information on the cooperative functions.
The UI unit 46 is a user interface unit, and includes a display and an operation unit. The display is a display device such as a liquid crystal display. The operation unit is an input device such as a touch panel, a keyboard, or a mouse. Of course, a user interface serving as both a display and an operation unit (for example, a touch display or a device including a display that electronically displays a keyboard or the like) may be used.
The controller 48 controls the operations of the respective units of the terminal device 16. The controller 48 functions as, for example, a display controller (controller) and causes the display of the UI unit 46 to display various information.
The display of the UI unit 46 displays, for example, an image captured by the camera 42, an image related to a device identified as a target device to be used (for example, a device used alone or a target device collaborated), an image related to a function, and the like. The image related to the apparatus may be an image (still image or moving image) representing the apparatus captured by the camera 42, or may be an image (e.g., icon) schematically representing the apparatus. The data schematically representing the image of the apparatus may be stored in the server 14 and provided from the server 14 to the terminal device 16, may be stored in the terminal device 16 in advance, or may be stored in another device and provided from the other device to the terminal device 16. The image related to the function is, for example, an image such as an icon indicating the function.
The above-described device function management information 32 may be stored in the memory 44 of the terminal apparatus 16. In this case, the device function management information 32 is not necessarily stored in the memory 30 of the server 14. Further, the above-described cooperative function management information 34 may be stored in the memory 44 of the terminal device 16. In this case, the cooperative function management information 34 is not necessarily stored in the memory 30 of the server 14. The controller 48 of the terminal apparatus 16 may include the above-described determination unit 38, which may determine one or more functions of the device by identifying the device based on the device identification information and may determine one or more cooperative functions using the plurality of functions. In this case, the server 14 does not necessarily include the determination unit 38.
If the available function management information is created, the available function management information may be stored in the memory 44 of the terminal device 16. In this case, the available function management information is not necessarily stored in the memory 30 of the server 14. The controller 48 of the terminal device 16 may manage the user's function purchase history. In this case, the controller 36 of the server 14 does not necessarily have a management function therefor. The controller 48 of the terminal device 16 may determine one or more functions available to the user based on the user identification information.
Alternatively, the device function management information 32 and the cooperative function management information 34 may be stored in a device such as the device 10 or 12, and the device such as the device 10 or 12 may include the determination unit 38. That is, the processing by the determination unit 38 of the server 14 (for example, the processing of identifying a device, the processing of identifying a function, or the processing of identifying a cooperative function) may be executed in the server 14, may be executed in the terminal device 16, or may be executed in a device such as the device 10 or 12.
In an exemplary embodiment, for example, Augmented Reality (AR) technologies are applied to obtain device identification information and identify a device. For example, AR technology is applied to obtain device identification information of a device used alone and identify the device, and also to obtain device identification information of target devices cooperating with each other and identify the target devices. AR technology according to the prior art is used. For example, a marker-based AR technique using a marker such as a two-dimensional barcode, a marker-less AR technique using an image recognition technique, a position information AR technique using position information, and the like are used. Of course, device identification information may be obtained and the device may be identified without applying AR technology. For example, in the case of a device connected to a network, the device may be identified based on its IP address or by reading its device ID. In addition, in the case of devices or terminal apparatuses having various types of wireless communication functions based on infrared communication, visible light communication, wireless fidelity (Wi-Fi, registered trademark), or Bluetooth (registered trademark), the devices that cooperate with each other can be identified by obtaining their device IDs via the wireless communication function, and the cooperative function can be executed.
Hereinafter, the process of obtaining the device identification information will be described in detail with reference to fig. 5. As an example, a case of obtaining the device identification information of the image forming apparatus 10 will be described. Fig. 5 schematically shows the appearance of the image forming apparatus 10. Here, a process of obtaining device identification information by applying the marker-based AR technique will be described. A marker 50, such as a two-dimensional barcode, is attached to the housing of the image forming apparatus 10. The marker 50 is information obtained by encoding the device identification information of the image forming apparatus 10. The user starts the camera 42 of the terminal device 16 and uses the camera 42 to photograph the marker 50 attached to the image forming apparatus 10, which is the target to be used. Thus, image data representing the marker 50 is generated. For example, the image data is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 performs decoding processing on the marker image represented by the image data, thereby extracting the device identification information. Thus, the image forming apparatus 10 to be used (the image forming apparatus 10 to which the photographed marker 50 is attached) is identified. The determination unit 38 of the server 14 determines the function information of the functions associated with the extracted device identification information in the device function management information 32. Thus, the functions of the image forming apparatus 10 to be used are determined (identified).
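The marker-based flow above (photograph the marker, decode it, look up the functions) can be sketched as follows. Since a real two-dimensional barcode decoder is out of scope here, a base64-encoded JSON payload stands in for the marker image; all names and data are assumptions for illustration only:

```python
import base64
import json

# Stand-in for the device function management information; device B's
# functions follow the MFP example in the text, but the table is invented.
DEVICE_FUNCTION_TABLE = {"B": ["print", "scan", "copy", "scan-and-transfer"]}

def decode_marker(payload: bytes) -> str:
    """Simulated decoding step: extract the device ID encoded in the
    marker. A real system would decode a two-dimensional barcode image."""
    return json.loads(base64.b64decode(payload))["device_id"]

def identify_functions(payload: bytes):
    """Decode the marker, then look up the functions of the identified device."""
    device_id = decode_marker(payload)
    return DEVICE_FUNCTION_TABLE.get(device_id, [])

marker = base64.b64encode(json.dumps({"device_id": "B"}).encode())
print(identify_functions(marker))  # -> ['print', 'scan', 'copy', 'scan-and-transfer']
```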
Alternatively, the controller 48 of the terminal device 16 may perform the decoding processing on the image data representing the marker 50 to extract the device identification information. In this case, the extracted device identification information is transmitted from the terminal apparatus 16 to the server 14. The determination unit 38 of the server 14 determines the function information indicating the functions associated with the device identification information transmitted from the terminal apparatus 16 in the device function management information 32. If the device function management information 32 is stored in the memory 44 of the terminal apparatus 16, the controller 48 of the terminal apparatus 16 may determine the function information indicating the functions associated with the extracted device identification information in the device function management information 32.
The marker 50 may include encoded function information indicating the functions of the image forming apparatus 10. In this case, by performing the decoding processing on the image data representing the marker 50, the device identification information of the image forming apparatus 10 is extracted and the function information representing the functions of the image forming apparatus 10 is also extracted. Thus, the image forming apparatus 10 is determined (identified) and the functions of the image forming apparatus 10 are also determined (identified). The decoding processing may be performed by the server 14 or the terminal device 16.
In the case of executing a cooperative function using functions of a plurality of devices, the markers of the target devices cooperating with each other are photographed to obtain the device identification information of the devices, thereby determining (identifying) the cooperative function.
In the case of obtaining the device identification information by applying the marker-less AR technique, for example, the user photographs the entire appearance or a partial appearance of the device to be used (e.g., the image forming apparatus 10) with the camera 42 of the terminal apparatus 16. Of course, it is useful to obtain information for determining the device to be used, such as a device name (e.g., a trade name) or a model number, by photographing the appearance of the device. As a result of the photographing, appearance image data representing the entire appearance or a partial appearance of the device to be used is generated. For example, the appearance image data is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 identifies the device to be used based on the appearance image data. For example, the memory 30 of the server 14 stores, for each device, appearance image correspondence information representing a correspondence between appearance image data representing the entire appearance or a partial appearance of the device and the device identification information of the device. The controller 36 compares, for example, the appearance image data received from the terminal device 16 with the pieces of appearance image data included in the appearance image correspondence information, and determines the device identification information of the device to be used based on the comparison result. For example, the controller 36 extracts a feature of the appearance of the device to be used from the appearance image data received from the terminal apparatus 16, determines appearance image data representing a feature identical or similar to that feature among the group of appearance image data included in the appearance image correspondence information, and determines the device identification information associated with that appearance image data. Thus, the device to be used (the device photographed by the camera 42) is identified.
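The comparison of an extracted appearance feature against the stored appearance image correspondence information can be pictured as a nearest-match search. In this toy sketch the feature vectors, the distance measure, and the threshold are invented placeholders for a real image recognition technique:

```python
# Hypothetical appearance image correspondence information:
# device ID -> stored appearance feature (a toy 3-dimensional vector).
APPEARANCE_FEATURES = {
    "A": (0.9, 0.1, 0.3),
    "B": (0.2, 0.8, 0.5),
}

def identify_device(feature, threshold=0.5):
    """Return the device ID whose stored feature is closest to the
    extracted feature, or None if no stored feature is similar enough."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    best_id, best_d = min(
        ((dev, distance(feature, f)) for dev, f in APPEARANCE_FEATURES.items()),
        key=lambda pair: pair[1],
    )
    return best_id if best_d <= threshold else None

print(identify_device((0.25, 0.75, 0.5)))  # -> B
```

The threshold models the "identical or similar" condition in the text: a photographed device whose feature is far from every stored feature is simply not identified.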
As another example, if the name (e.g., a trade name) or the model number of the device is photographed and appearance image data representing the name or model number is generated, the device to be used may be identified based on the name or model number represented by the appearance image data. The determination unit 38 of the server 14 determines the function information indicating each function associated with the determined device identification information in the device function management information 32. Thus, the functions of the device to be used (e.g., the image forming apparatus 10) are determined.
Alternatively, the controller 48 of the terminal apparatus 16 may compare appearance image data representing the entire appearance or a partial appearance of a device to be used (e.g., the image forming apparatus 10) with pieces of appearance image data included in the appearance image correspondence information, and may determine device identification information of the device to be used based on the comparison result. The appearance image correspondence information may be stored in the memory 44 of the terminal device 16. In this case, the controller 48 of the terminal device 16 refers to the appearance image correspondence information stored in the memory 44 of the terminal device 16, thereby determining the device identification information of the device to be used. Alternatively, the controller 48 of the terminal device 16 may obtain the appearance image correspondence information from the server 14 and may refer to the appearance image correspondence information to determine the device identification information of the device to be used.
In the case of performing a cooperative function using a plurality of functions of a plurality of devices, the entire appearance or partial appearance of the respective devices cooperating with each other is photographed to obtain device identification information of the devices, thereby determining (identifying) the cooperative function.
In the case where the device identification information is obtained by applying the position information AR technology, for example, position information indicating the position of a device (for example, the image forming apparatus 10) is obtained using a Global Positioning System (GPS) function. For example, each device has a GPS function and obtains device position information indicating the position of the device. The terminal apparatus 16 outputs information indicating a request to obtain device location information to a device to be used, and receives device location information of the device from the device as a response to the request. For example, the device location information is transmitted from the terminal apparatus 16 to the server 14. In the server 14, the controller 36 identifies a device to be used based on the device location information. For example, the memory 30 of the server 14 stores, for each device, location correspondence information indicating: a correspondence between device location information indicating a location of a device and device identification information of the device. The controller 36 determines device identification information associated with the device location information received from the terminal apparatus 16 among the location correspondence information. Thus, the device to be used is determined (identified). The determination unit 38 of the server 14 determines function information indicating each function associated with the determined device identification information in the device function management information 32. Thus, the function of the device (e.g., the image forming apparatus 10) to be used is determined (identified).
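The lookup of device identification information from device position information can be pictured as finding the registered device nearest to the reported GPS position. This is only a schematic illustration; the coordinates and the simple planar distance measure are assumptions:

```python
import math

# Hypothetical location correspondence information:
# device ID -> (latitude, longitude) of the registered device position.
POSITION_TABLE = {
    "A": (35.6586, 139.7454),
    "B": (35.6590, 139.7460),
}

def identify_by_position(lat, lon):
    """Return the device ID whose registered position is nearest to the
    reported position (planar distance; adequate over short ranges)."""
    return min(
        POSITION_TABLE,
        key=lambda dev: math.hypot(POSITION_TABLE[dev][0] - lat,
                                   POSITION_TABLE[dev][1] - lon),
    )

print(identify_by_position(35.6589, 139.7459))  # -> B
```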
The controller 48 of the terminal apparatus 16 may determine device identification information associated with the location information of the device to be used in the location correspondence information. The location correspondence information may be stored in the memory 44 of the terminal device 16. In this case, the controller 48 of the terminal device 16 refers to the position correspondence information stored in the memory 44 of the terminal device 16, thereby determining the device identification information of the device to be used. Alternatively, the controller 48 of the terminal device 16 may obtain the position correspondence information from the server 14 and refer to the position correspondence information to determine the device identification information of the device to be used.
In the case of performing a cooperation function using a plurality of devices, device location information of devices cooperating with each other is obtained and device identification information of the devices is determined based on the device location information. Thus, the cooperation function is determined (identified).
Hereinafter, the device system according to the exemplary embodiment will be described in further detail.
The device function management information 32 will be described in detail with reference to fig. 6. Fig. 6 shows an example of a device function management table as the device function management information 32. In the device function management table, for example, a device ID, information indicating a device name (e.g., a device type), information indicating one or more functions of the device (function information), and an image ID are associated with each other. The device ID and the device name correspond to an example of device identification information. The image ID is an example of image identification information for identifying an image representing the apparatus (for example, an image representing the appearance of the apparatus or an image (for example, an icon) schematically representing the apparatus). The device function management table does not necessarily include an image ID. For example, the device having the device ID "B" is a multi-function peripheral (MFP, an image forming apparatus having a plurality of image forming functions) and has a printing function, a scanning function, and the like. An image ID for identifying an image representing a device is associated with the device. Data representing an image of the apparatus is stored, for example, in memory 30 of server 14 or in another device.
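One possible in-memory rendering of such a device function management table is sketched below; device B's entry follows the MFP example in the text, while device A's functions, the image IDs, and all field names are illustrative assumptions:

```python
# Sketch of the device function management table of Fig. 6: each row
# associates a device ID, a device name, function information, and an
# image ID. Only device B's functions are taken from the text.
DEVICE_FUNCTION_MANAGEMENT = [
    {"device_id": "A", "name": "PC",  "functions": ["data storage"],  "image_id": "img_a"},
    {"device_id": "B", "name": "MFP", "functions": ["print", "scan"], "image_id": "img_b"},
]

def lookup_device(device_id):
    """Determine the name, functions, and image ID from a device ID."""
    for row in DEVICE_FUNCTION_MANAGEMENT:
        if row["device_id"] == device_id:
            return row
    return None

print(lookup_device("B")["functions"])  # -> ['print', 'scan']
```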
For example, with the AR technology, a device ID for identifying a device to be used is obtained. The determination unit 38 of the server 14 determines the name of the device, one or more functions of the device, and the image ID associated with the device ID by referring to the device function management table. Thus, the device to be used is identified. For example, information indicating the device name and data indicating the image of the device are transmitted from the server 14 to the terminal apparatus 16, and then they are displayed on the UI unit 46 of the terminal apparatus 16. The image representing the device is displayed as an image relating to the device. Of course, the image captured by the camera 42 may be displayed on the UI unit 46 of the terminal device 16. If the user specifies an image related to the device (e.g., an image captured by the camera 42 or an image schematically representing the device) on the UI unit 46 of the terminal apparatus 16, information about one or more functions of the device (e.g., function information or function specification information) may be transmitted from the server 14 to the terminal apparatus 16 and may be displayed on the UI unit 46 of the terminal apparatus 16.
Next, the cooperation function management information 34 will be described in detail with reference to fig. 7. Fig. 7 shows an example of a cooperation function management table as the cooperation function management information 34. In the cooperation function management table, for example, a combination of device IDs, information indicating the names (types) of the target devices cooperating with each other, and information indicating one or more cooperation functions (cooperation function information) are associated with each other. For example, the device having the device ID "A" is a personal computer (PC), and the device having the device ID "B" is an MFP. The cooperation between the PC (A) and the MFP (B) realizes, for example, a scan-and-transfer function and a print function as cooperation functions. The scan-and-transfer function is a function of transferring image data generated by scanning by the MFP (B) to the PC (A). The print function is a function of transmitting data (e.g., image data or document data) stored in the PC (A) to the MFP (B) and printing the data by the MFP (B).
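The table keyed by a combination of device IDs can be sketched as follows. This is a hedged illustration: using an unordered set as the key is one plausible design, and the second table entry is an assumption rather than a row from fig. 7.

```python
# Hypothetical sketch of a cooperation function management table such as the
# one in fig. 7. The key is the unordered combination of device IDs, so the
# order in which the devices are identified does not matter.
COOPERATION_TABLE = {
    frozenset({"A", "B"}): ["scan-and-transfer", "print"],            # PC (A) + MFP (B)
    frozenset({"B", "C"}): ["scan-and-project", "print projection"],  # MFP (B) + projector (C), assumed entry
}

def cooperation_functions(*device_ids):
    """Return the cooperation functions associated with a combination of
    identified devices, or an empty list if the combination is not registered."""
    return COOPERATION_TABLE.get(frozenset(device_ids), [])
```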
Hereinafter, a process in the case of using the device alone will be described with reference to fig. 8. Fig. 8 shows an example of a device used alone. For example, it is assumed that the image forming apparatus 10 is a device used alone. The image forming apparatus 10 is, for example, an MFP. The image forming apparatus 10 is a device existing in a real space. The terminal device 16 shown in fig. 8 is an apparatus existing in a real space, and is, for example, a mobile terminal device such as a smartphone or a mobile phone.
For example, a marker 50 such as a two-dimensional barcode is attached to the housing of the image forming apparatus 10. In the case of using the marker-based AR technology or the markerless AR technology, the user photographs the image forming apparatus 10 to be used with the camera 42 of the terminal apparatus 16 (e.g., a smartphone). Thus, image data representing the marker 50 or appearance image data representing the appearance of the image forming apparatus 10 is generated. A device display screen 52 is displayed on the display of the UI unit 46 of the terminal apparatus 16, and a device image 54 related to the image forming apparatus 10 is displayed on the device display screen 52. The device image 54 is, for example, an image generated by photographing by the camera 42 (having its original size at the time of photographing, or an increased or decreased size).
Image data generated by being captured by the camera 42 is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 performs a decoding process on the image data to extract the device identification information of the image forming apparatus 10, and thus, identifies the image forming apparatus 10. Alternatively, appearance image data representing the appearance of the image forming apparatus 10 may be generated, and the appearance image data may be transmitted from the terminal apparatus 16 to the server 14. In this case, in the server 14, the controller 36 determines the device identification information of the image forming apparatus 10 by referring to the appearance image correspondence information. Thus, the image forming apparatus 10 is identified.
The determination unit 38 of the server 14 determines (identifies) the functions of the image forming apparatus 10 by referring to the device function management information 32 (e.g., the device function management table shown in fig. 6). This will be described in detail with reference to fig. 6. For example, assume that the image forming apparatus 10 is the MFP (B). The determination unit 38 determines the functions associated with the MFP (B) in the device function management table shown in fig. 6. Thus, the functions of the MFP (B) are determined. Information about the determined functions is transmitted from the server 14 to the terminal apparatus 16. Of course, the process for identifying the device and its functions may be performed by the terminal apparatus 16.
On the device display screen 52, instead of an image generated by photographing by the camera 42, a preliminary image related to the identified image forming apparatus 10 (not an image obtained by photographing, but a schematic image such as an icon) or an image generated by photographing by an external camera may be displayed as the device image 54.
For example, in the case of using image data obtained by photographing the device, the appearance of the device in its current state (e.g., including scratches, notes, or stickers attached to the device) is reflected in the image, and thus the user can more clearly visually distinguish the device from another device of the same type.
In the case of using a schematic image, for example, the data of the schematic image is transmitted from the server 14 to the terminal apparatus 16. For example, when the image forming apparatus 10 is identified, the determination unit 38 of the server 14 determines the schematic image related to the image forming apparatus 10 by referring to the device function management table (device function management information 32) shown in fig. 6. The data of the schematic image is transmitted from the server 14 to the terminal apparatus 16, and the schematic image is displayed as the device image 54 on the device display screen 52. The data of the schematic image may be stored in the terminal apparatus 16 in advance. In this case, when the image forming apparatus 10 is identified, the device image 54 stored in the terminal apparatus 16 is displayed on the device display screen 52. The data of the schematic image may also be stored in a device other than the server 14 and the terminal apparatus 16.
Further, when the device is identified, information representing the device name may be transmitted from the server 14 to the terminal apparatus 16, and the device name may be displayed on the device display screen 52 of the terminal apparatus 16. In the example shown in fig. 8, the image forming apparatus 10 is an MFP, and its name "MFP (B)" is displayed.
After determining the functions of the image forming apparatus 10, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a function display screen 56 and causes information about the functions to be displayed on the function display screen 56, as shown in fig. 9. As the information about the functions, for example, button images for providing instructions to execute the functions are displayed. The MFP (B) as the image forming apparatus 10 has, for example, a printing function, a scanning function, a copying function, and a facsimile function, and therefore button images for providing instructions to execute these functions are displayed on the function display screen 56. For example, when the user designates the button image representing the printing function with the terminal apparatus 16 and provides an instruction to execute the printing function, execution instruction information representing the instruction to execute the printing function is transmitted from the terminal apparatus 16 to the image forming apparatus 10. The execution instruction information includes control data for executing the printing function, data such as image data to which the printing function is applied, and the like. In response to receiving the execution instruction information, the image forming apparatus 10 performs printing according to the execution instruction information.
Fig. 10 shows another example of the function display screen. The function display screen 58 is a screen displayed on the UI unit 46 of the terminal device 16 in the case where a single device is used as shown in fig. 8. As described above, the device to be used (for example, the image forming apparatus 10) is determined and the function of the device to be used is determined. Function information representing functions associated with user identification information of a user using the target device (i.e., functions available to the user) may be determined. In addition, since the function of the device to be used is determined, a function that is not possessed by the device to be used among a group of functions to be provided can be determined. Such information may be displayed on the function display screen 58.
On the function display screen 58 shown in fig. 10, a button image 60 representing a function A, a button image 62 representing a function B, and a button image 64 representing a function C are displayed as examples of function information. The function A is a function of the device to be used (for example, the identified image forming apparatus 10) and is a function available to the user. The function B is a function of the device to be used but is a function unavailable to the user. By being provided with the function B, the user becomes able to use it: if the function B is a pay function, the user becomes able to use it by purchasing it, and if the function B is a free function, it is provided to the user free of charge. The function C is a function that the device to be used does not have, i.e., a function incompatible with the device to be used. The controller 48 of the terminal apparatus 16 may change the display form of a button image according to whether the function represented by the button image is a function of the device to be used. Further, the controller 48 may change the display form of a button image according to whether the function represented by the button image is available to the user. For example, the controller 48 may change the color or shape of each button image. In the example shown in fig. 10, the button images 60, 62, and 64 are displayed in different colors. For example, a button image representing a function that the device to be used has and that is available to the user (for example, the button image 60 representing the function A) is displayed in blue. A button image representing a function that the device to be used has but that is unavailable to the user (for example, the button image 62 representing the function B) is displayed in yellow. A button image representing a function that the device to be used does not have (for example, the button image 64 representing the function C) is displayed in gray. Alternatively, the controller 48 may change the shapes of the button images 60, 62, and 64, or may change the font of the displayed function names. Of course, the display form may be changed in another manner. Thus, the user can recognize the usability of each function with enhanced visibility.
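The color scheme described above can be summarized in a small helper. The color names follow the example of fig. 10; the function name and parameters are assumptions of this sketch.

```python
# Hypothetical helper mapping a function's status to the display color of its
# button image, following the color scheme described for fig. 10.
def button_color(device_has_function, available_to_user):
    if device_has_function and available_to_user:
        return "blue"    # e.g. the button image 60 for the function A
    if device_has_function:
        return "yellow"  # e.g. the button image 62 for the function B
    return "gray"        # e.g. the button image 64 for the function C
```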
For example, if the user specifies the button image 60 representing the function A with the terminal apparatus 16 and provides an instruction to execute the function A, execution instruction information representing the instruction to execute the function A is transmitted from the terminal apparatus 16 to the target device to be used. The execution instruction information includes control data for executing the function A, image data to be subjected to the processing of the function A, and the like. In response to receiving the execution instruction information, the target device executes the function A according to the execution instruction information. For example, if the target device is the image forming apparatus 10 and if the function A is the scan-and-transfer function, the image forming unit 20 of the image forming apparatus 10 executes the scanning function to generate scan data (image data). The scan data is then transmitted from the image forming apparatus 10 to a set destination (for example, the terminal apparatus 16).
The providing process is executed if the user specifies the button image 62 representing the function B with the terminal apparatus 16 and provides an instruction to execute the function B. If the providing process is performed by the server 14, the terminal apparatus 16 accesses the server 14. Then, as information enabling the user to use the function B, a screen (e.g., a website) for providing the function B is displayed on the UI unit 46 of the terminal apparatus 16. By carrying out the provision procedure on the screen, the user becomes able to use the function B. For example, the terminal apparatus 16 stores a web browser program. With the web browser, the user can access the server 14 from the terminal apparatus 16. When the user accesses the server 14 using the web browser, a function providing screen (e.g., a website) is displayed on the UI unit 46 of the terminal apparatus 16, and the function is provided to the user through the website. Of course, the providing process may be performed by a server other than the server 14. Alternatively, as information enabling the user to use the function B, a use permission request screen (e.g., a website) for requesting permission from an administrator or the like to use the function B may be displayed on the UI unit 46 of the terminal apparatus 16. If the user requests permission to use the function B from the administrator or the like through the use permission request screen and if the permission is granted, the user can use the function B.
Hereinafter, a process in the case of using the cooperation function will be described with reference to fig. 11. Fig. 11 shows an example of target devices cooperating with each other. For example, it is assumed that the image forming apparatus 10 and a projector as the device 12 (hereinafter may be referred to as the projector 12) are used as target devices. The image forming apparatus 10, the projector 12, and the terminal apparatus 16 are devices existing in a real space.
For example, a marker 50 such as a two-dimensional barcode is attached to the housing of the image forming apparatus 10, and a marker 66 such as a two-dimensional barcode is attached to the housing of the projector 12. The marker 66 is information obtained by encoding the device identification information of the projector 12. In the case of using the marker-based AR technology or the markerless AR technology, the user photographs the image forming apparatus 10 and the projector 12, as target devices cooperating with each other, with the camera 42 of the terminal apparatus 16 (e.g., a smartphone). In the example shown in fig. 11, the image forming apparatus 10 and the projector 12 are photographed together in a state where both are within the field of view of the camera 42. Thus, image data representing the markers 50 and 66 is generated. A device display screen 68 is displayed on the display of the UI unit 46 of the terminal apparatus 16. On the device display screen 68, a device image 70 related to the image forming apparatus 10 and a device image 72 related to the projector 12 are displayed. The device images 70 and 72 are, for example, images generated by photographing by the camera 42 (having their original sizes at the time of photographing, or increased or decreased sizes).
Image data generated by being captured by the camera 42 is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 performs a decoding process on the image data to extract the device identification information of the image forming apparatus 10 and the device identification information of the projector 12, and thus, identifies the image forming apparatus 10 and the projector 12. Alternatively, appearance image data representing the appearances of the image forming apparatus 10 and the projector 12 may be generated and transmitted from the terminal apparatus 16 to the server 14. In this case, in the server 14, the controller 36 determines the device identification information of the image forming apparatus 10 and the device identification information of the projector 12 by referring to the appearance image correspondence information. Thus, the image forming apparatus 10 and the projector 12 are recognized.
The determination unit 38 of the server 14 determines (identifies) one or more cooperation functions using the functions of the image forming apparatus 10 and the projector 12 by referring to the cooperation function management information 34 (e.g., the cooperation function management table shown in fig. 7). This will be described in detail with reference to fig. 7. For example, it is assumed that the image forming apparatus 10 is the MFP (B) and the projector 12 is the projector (C). The determination unit 38 determines the cooperation functions associated with the combination of the MFP (B) and the projector (C) in the cooperation function management table shown in fig. 7. Thus, the cooperation functions executed by cooperation between the MFP (B) and the projector (C) are determined. Information about the determined cooperation functions is transmitted from the server 14 to the terminal apparatus 16. Of course, the process for identifying the devices and the cooperation functions may be performed by the terminal apparatus 16.
On the device display screen 68, instead of an image generated by being captured by the camera 42, a preliminary image (for example, a schematic image such as an icon) related to the recognized image forming apparatus 10 or an image generated by being captured by an external camera may be displayed as the device image 70. Further, a preliminary image related to the identified projector 12 or an image generated by being photographed by an external camera may be displayed as the device image 72. As described above, the data of the schematic image may be transmitted from the server 14 to the terminal device 16, may be stored in the terminal device 16 in advance, or may be stored in another device.
When the devices are identified, information representing the device names may be transmitted from the server 14 to the terminal apparatus 16, and the device names may be displayed on the device display screen 68 of the terminal apparatus 16. In the example shown in fig. 11, the name "MFP (B)" of the image forming apparatus 10 and the name "projector (C)" of the projector 12 are displayed.
If a plurality of devices are photographed, the determination unit 38 of the server 14 may determine the functions of the respective devices by referring to the device function management information 32. In the example shown in fig. 11, the determination unit 38 may determine the functions of the image forming apparatus 10 and the functions of the projector 12. Information about the determined functions may be transmitted from the server 14 to the terminal apparatus 16.
After the cooperation functions are determined, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a function display screen 74 and causes information about the cooperation functions to be displayed on the function display screen 74, as shown in fig. 12. As the information about the cooperation functions, for example, button images for providing instructions to execute the cooperation functions are displayed. The cooperation between the MFP (B) and the projector (C) enables execution of a cooperation function of projecting, by the projector (C), an image generated by scanning by the MFP (B), and a cooperation function of printing, by the MFP (B), an image projected by the projector (C). Button images for providing instructions to execute these cooperation functions are displayed on the function display screen 74. For example, when the user specifies a button image with the terminal apparatus 16 and provides an instruction to execute the cooperation function, execution instruction information representing the instruction to execute the cooperation function is transmitted from the terminal apparatus 16 to the image forming apparatus 10 and the projector 12. In response to receiving the execution instruction information, the image forming apparatus 10 and the projector 12 execute the cooperation function specified by the user.
The target devices cooperating with each other may be specified by a user operation. For example, as a result of photographing the image forming apparatus 10 and the projector 12 with the camera 42, the device image 70 related to the image forming apparatus 10 and the device image 72 related to the projector 12 are displayed on the UI unit 46 of the terminal apparatus 16, as shown in fig. 11. An image related to a device may be an image obtained by photographing by the camera 42, or may be a preliminary image (for example, a schematic image such as an icon) related to the identified device. When the user specifies the device images 70 and 72 on the device display screen 68, the image forming apparatus 10 and the projector 12 are specified as target devices cooperating with each other. For example, when the user specifies the device image 70, the marker-based AR technology or the markerless AR technology is applied to the device image 70, thereby determining (identifying) the image forming apparatus 10. Similarly, when the user specifies the device image 72, the marker-based AR technology or the markerless AR technology is applied to the device image 72, thereby determining (identifying) the projector 12. Accordingly, the cooperation functions executed by the image forming apparatus 10 and the projector 12 are determined, and information about the cooperation functions is displayed on the UI unit 46 of the terminal apparatus 16.
As another example, the user may touch the device image 70 with his/her finger on the device display screen 68 and move the finger to the device image 72, thereby designating the device images 70 and 72 and designating the image forming apparatus 10 and the projector 12 as target devices cooperating with each other. The order in which the user touches the device images 70 and 72, or the direction of the finger movement, may be the reverse of the above example. Of course, a pointing element other than a finger, such as a pen, may be moved on the device display screen 68. The user may link the device images 70 and 72 to each other, or may superimpose them on each other, to specify the device images 70 and 72 and thereby specify the image forming apparatus 10 and the projector 12 as target devices cooperating with each other. In addition, the target devices cooperating with each other may be specified by drawing a circle on the device images, or by specifying the device images related to the devices within a predetermined period of time. In the case of releasing the cooperation, the user may specify the target device to be released on the device display screen 68, or may press a cooperation release button. If an image of a device that is not a cooperation target is on the device display screen 68, the user may specify that device on the device display screen 68 to remove it from the target devices cooperating with each other. The device to be released may be specified by performing a predetermined operation (e.g., drawing a cross on it).
The target devices cooperating with each other may be photographed separately. For example, the target devices cooperating with each other are identified by performing photographing with the camera 42 a plurality of times. When photographing with the camera 42 is performed a plurality of times, the device identification information of the device identified in each photographing operation is stored in the memory of the server 14 or the terminal apparatus 16. For example, the image forming apparatus 10 is photographed in a state where the image forming apparatus 10 is within the field of view of the camera 42, and then the projector 12 is photographed in a state where the projector 12 is within the field of view of the camera 42. Thus, image data representing the image forming apparatus 10 and image data representing the projector 12 are generated. By applying the marker-based AR technology or the markerless AR technology to each piece of image data, the image forming apparatus 10 and the projector 12 are determined (identified), and the cooperation functions using the functions of the image forming apparatus 10 and the projector 12 are determined (identified). The image forming apparatus 10 and the projector 12, as target devices cooperating with each other, are not always close enough to each other to fit within the field of view of the camera 42 together. The angle of view of the camera 42 may be changed, or the field of view may be increased or decreased; if these operations are insufficient, photographing may be performed a plurality of times to identify the target devices cooperating with each other.
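The accumulation of identified devices over several photographing operations can be sketched as follows. The class and the `recognize` callback, which stands in for the marker-based or markerless AR processing, are assumptions of this sketch.

```python
# Hypothetical sketch of identifying target devices over several photographing
# operations. Each operation recognizes at most one device, and the identified
# device IDs are accumulated until the cooperation function is looked up.
class CooperationSession:
    def __init__(self):
        self.identified = []  # device IDs, in the order they were photographed

    def photograph(self, image_data, recognize):
        """Apply the (assumed) AR recognition callback to one image and store
        the resulting device ID; None means the device was not identified."""
        device_id = recognize(image_data)
        if device_id is not None and device_id not in self.identified:
            self.identified.append(device_id)
        return device_id
```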
As another example, a cooperation target device may be set in advance as a basic cooperation device. For example, assume that the image forming apparatus 10 is set in advance as the basic cooperation device. The device identification information of the basic cooperation device may be stored in advance in the memory of the server 14 or the terminal apparatus 16. Alternatively, the user may specify the basic cooperation device using the terminal apparatus 16. If the basic cooperation device is set, the user photographs a target device other than the basic cooperation device with the camera 42 of the terminal apparatus 16. Thus, the target devices cooperating with each other are determined (identified), and one or more cooperation functions using the functions of the basic cooperation device and the photographed device are determined (identified).
In the examples shown in fig. 11 and 12, the respective cooperation functions are functions using hardware devices. Alternatively, a cooperation function may be a function using a function implemented by software (an application). For example, instead of device images, function images (e.g., icon images) related to functions implemented by software may be displayed on the UI unit 46 of the terminal apparatus 16, and the user may specify a plurality of those function images so that a cooperation function using the plurality of functions related to the specified function images is determined (identified). For example, the cooperation function may be determined by specifying function images related to functions displayed on the home screen of a smartphone or the desktop of a PC. Of course, if a device image related to a hardware device and a function image related to a function implemented by software are displayed on the UI unit 46 of the terminal apparatus 16 and if the user specifies the device image and the function image, a cooperation function using the device related to the device image and the function related to the function image can be identified.
In the above examples, the marker-based AR technology or the markerless AR technology is used, but the position information AR technology may be used. For example, the terminal apparatus 16 has a GPS function, obtains terminal position information indicating the position of the terminal apparatus 16, and transmits the terminal position information to the server 14. The controller 36 of the server 14 refers to the position correspondence information indicating the correspondence between device position information (indicating the positions of devices) and device identification information, and determines devices located within a predetermined range of the position of the terminal apparatus 16 as candidate cooperative devices. For example, assume that an MFP, a PC, a printer, and a scanner are located within the predetermined range of the terminal apparatus 16. In this case, the MFP, the PC, the printer, and the scanner are determined as candidate cooperative devices. The device identification information of each candidate cooperative device is transmitted from the server 14 to the terminal apparatus 16 and displayed on the UI unit 46 of the terminal apparatus 16. As the device identification information, an image of the candidate cooperative device may be displayed, or a character string such as a device ID may be displayed. The user specifies the target devices cooperating with each other from among the candidate cooperative devices displayed on the UI unit 46. The device identification information of the target devices specified by the user is transmitted from the terminal apparatus 16 to the server 14. In the server 14, one or more cooperation functions are determined based on the device identification information of the target devices. Information about the one or more cooperation functions is displayed on the UI unit 46 of the terminal apparatus 16.
The process of determining the candidate cooperative device and the process of determining the cooperative function may be performed by the terminal device 16.
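The determination of candidate cooperative devices from the position correspondence information can be sketched as a simple distance filter. The coordinates, the range value, and the device list are illustrative assumptions of this sketch.

```python
import math

# Hypothetical sketch of the position information AR technology: every device
# whose stored position lies within a predetermined range of the terminal
# apparatus becomes a candidate cooperative device.
DEVICE_POSITIONS = {
    "MFP": (0.0, 1.0),
    "PC": (2.0, 2.0),
    "printer": (1.0, 0.0),
    "scanner": (0.5, 0.5),
    "projector": (50.0, 50.0),  # out of range in the example below
}

def candidate_devices(terminal_position, predetermined_range):
    """Return the devices within the predetermined range of the terminal."""
    return [device for device, position in DEVICE_POSITIONS.items()
            if math.dist(terminal_position, position) <= predetermined_range]
```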
If the photographed device is not recognized even if the AR technology or the like is applied, a device image representing the photographed device does not have to be displayed on the device display screen. Thus, the visibility of the identified device may be increased. For example, if there are an identified device and an unidentified device and if both devices are photographed by the camera 42, a device image representing the unidentified device is not displayed. Accordingly, the device image representing the recognized device is displayed while being distinguished from the device image representing the unidentified device, and thus, the visibility of the recognized device can be increased. Alternatively, a device image representing the identified device may be displayed in a highlighted manner. For example, the device image representing the identified device may be displayed in a particular color, may be displayed while highlighting the edges of the device image, may be displayed while enlarging the device image, may be displayed three-dimensionally, or may be displayed while blinking the device image. Thus, the visibility of the identified device may be increased.
Hereinafter, a process for executing a function of a device will be described. As an example, a process for executing a cooperation function will be described. In this case, a connection request is transmitted from the terminal apparatus 16 to the target devices cooperating with each other, and connections between the terminal apparatus 16 and the target devices are established. Hereinafter, this connection process will be described with reference to fig. 13. Fig. 13 is a sequence diagram showing the process.
First, the user provides an instruction to start an application (program) for executing the device function with the terminal device 16. In response to the instruction, the controller 48 of the terminal device 16 starts the application (S01). The application may be stored in advance in the memory 44 of the terminal device 16, or may be downloaded from the server 14 or the like.
Subsequently, target devices cooperating with each other are identified by applying the tag-based AR technology, the tag-free AR technology, or the location information AR technology (S02). Of course, techniques other than the AR technologies may be used to identify the target devices. In the case of applying the tag-based AR technology or the tag-free AR technology, the user photographs the target devices with the camera 42 of the terminal apparatus 16. For example, in the case of using the image forming apparatus 10 (mfp (b)) and the projector 12 (projector (C)) as target devices, the user photographs the image forming apparatus 10 and the projector 12 with the camera 42. Thus, the device identification information of the image forming apparatus 10 and the projector 12 is obtained, and the image forming apparatus 10 and the projector 12 are identified as the target devices. In the case of applying the location information AR technology, position information of the image forming apparatus 10 and the projector 12 is obtained, the device identification information of the image forming apparatus 10 and the projector 12 is determined based on the position information, and the image forming apparatus 10 and the projector 12 are identified.
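The location-information-based identification in S02 can be sketched as a lookup from a measured position to device identification information. This is a minimal illustrative sketch; the registry contents, coordinate scheme, and tolerance are assumptions, not details from this description.

```python
# Hypothetical location-information AR lookup: map a measured position to
# the device identification information registered for that position.

def identify_by_position(position, registry, tolerance=0.5):
    """Return the device ID whose registered position lies within
    `tolerance` of the measured position on both axes, or None."""
    for device_id, (x, y) in registry.items():
        if abs(position[0] - x) <= tolerance and abs(position[1] - y) <= tolerance:
            return device_id
    return None  # unidentified: excluded from target devices

# Illustrative registry of installed devices and their positions.
registry = {"MFP(B)": (10.0, 4.0), "Projector(C)": (12.5, 4.0)}
```

For example, a measurement near (10.2, 3.8) would resolve to the registered mfp (b) entry, while a position far from any registered device yields no identification.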
For example, if the user provides an instruction to display a cooperation function, the cooperation function using the functions of the plurality of identified devices is identified. Information about the identified cooperation function is displayed on the UI unit 46 of the terminal device 16 (S03). The process of identifying the cooperative function may be performed by the server 14 or the terminal device 16.
Subsequently, after the user specifies a target cooperation function to be executed with the terminal apparatus 16, the terminal apparatus 16 transmits information indicating a connection request to the target devices (e.g., the image forming apparatus 10 and the projector 12) that execute the cooperation function (S04). For example, if address information indicating the addresses of the target devices cooperating with each other is stored in the server 14, the terminal apparatus 16 obtains the address information from the server 14. If the address information is included in the device identification information, the terminal apparatus 16 may obtain the address information from the device identification information of the target devices. Alternatively, the address information of the target devices may be stored in the terminal apparatus 16. Of course, the terminal apparatus 16 may obtain the address information of the target devices using another method. Using the address information of the target devices (e.g., the image forming apparatus 10 and the projector 12), the terminal apparatus 16 transmits the information representing the connection request to the target devices.
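The address-resolution fallbacks described above can be sketched as an ordered lookup across stores. The store names, data shapes, and addresses below are illustrative assumptions only.

```python
# Hypothetical address resolution for a target device: try the server's
# stored address information, then the terminal's own store, then any
# address carried inside the device identification information.

def resolve_address(device_id, server_store, local_store, embedded):
    for store in (server_store, local_store, embedded):
        if device_id in store:
            return store[device_id]
    return None  # no address obtainable by these methods

# Illustrative stores.
server_store = {"MFP(B)": "192.0.2.10"}
local_store = {"Projector(C)": "192.0.2.11"}
```

A connection request would then be sent to whichever address the first matching store yields.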
The image forming apparatus 10 and the projector 12 that have received the information indicating the connection request allow or disallow connection to the terminal apparatus 16 (S05). For example, if the image forming apparatus 10 and the projector 12 are devices that are not permitted to be connected or if the number of devices that request connection exceeds an upper limit, connection is not permitted. If the connection from the terminal device 16 is permitted, the operation of changing the setting information unique to the image forming device 10 and the projector 12 may be prohibited so that the setting information is not changed by the terminal device 16. For example, a change in the color parameter of the image forming apparatus 10 or the set time for transition to the power saving mode may be prohibited. Accordingly, the security of the target devices cooperating with each other may be increased. Alternatively, in the case of causing the devices to cooperate with each other, the change of the setting information may be restricted, as compared with the case where the respective devices are used alone without cooperating with another device. For example, fewer setting items may be allowed to be changed than in the case where the device is used alone. Alternatively, viewing personal information (e.g., operational history) of other users may be prohibited. Accordingly, the security of the personal information of the user may be increased.
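The admission decision and the setting-change restriction in S05 can be sketched as two small checks. The setting names and the connection limit are illustrative assumptions; the description only states that some device-specific settings are locked while cooperating.

```python
# Hypothetical names for settings locked during cooperation (the
# description mentions color parameters and the power-saving transition time).
LOCKED_DURING_COOPERATION = {"color_parameters", "power_save_transition_time"}

def allow_connection(device_allowed, current_connections, upper_limit):
    # Refuse the request if the device is not permitted to connect or
    # if the number of requesting connections exceeds the upper limit.
    return device_allowed and current_connections < upper_limit

def changeable_settings(all_settings, cooperating):
    # While devices cooperate, device-specific settings stay locked so
    # the terminal device cannot change them.
    return all_settings - LOCKED_DURING_COOPERATION if cooperating else all_settings
```

This reflects the design choice that cooperation narrows, rather than widens, what a connected terminal may change.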
Result information indicating permission or non-permission of connection is transmitted from the image forming apparatus 10 and the projector 12 to the terminal apparatus 16 (S06). If connection to the image forming apparatus 10 and the projector 12 is permitted, communication is established between the terminal apparatus 16 and each of the image forming apparatus 10 and the projector 12.
Subsequently, the user provides an instruction to execute the cooperation function with the terminal device 16 (S07). In response to the instruction, execution instruction information indicating an instruction to execute the cooperation function is transmitted from the terminal apparatus 16 to the image forming apparatus 10 and the projector 12 (S08). The execution instruction information transmitted to the image forming apparatus 10 includes information (e.g., job information) indicating a process to be executed in the image forming apparatus 10, and the execution instruction information transmitted to the projector 12 includes information (e.g., job information) indicating a process to be executed in the projector 12.
In response to the execution instruction information, the image forming apparatus 10 and the projector 12 execute the respective functions according to the execution instruction information (S09). For example, if the cooperation function includes a process of transmitting/receiving data between the image forming apparatus 10 and the projector 12, as in a function of transferring scan data from the image forming apparatus 10 (mfp (b)) to the projector 12 (projector (C)) and projecting the data with the projector 12, communication is established between the image forming apparatus 10 and the projector 12. In this case, for example, the execution instruction information transmitted to the image forming apparatus 10 includes the address information of the projector 12, and the execution instruction information transmitted to the projector 12 includes the address information of the image forming apparatus 10. Communication is established between the image forming apparatus 10 and the projector 12 using these pieces of address information.
After the execution of the cooperation function is completed, information indicating the completion of execution is transmitted from the image forming apparatus 10 and the projector 12 to the terminal apparatus 16 (S10). The information indicating the completion of execution of the cooperation function is displayed on the UI unit 46 of the terminal device 16 (S11). If this information is not displayed even after a predetermined period of time has elapsed from the time point at which the execution instruction was provided, the controller 48 of the terminal device 16 may cause the UI unit 46 to display information indicating an error, and may again transmit the execution instruction information or the information indicating the connection request to the image forming device 10 and the projector 12.
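The timeout-and-retry behavior of S10/S11 can be sketched as a polling loop with a deadline. The polling interval and callback shape are assumptions for illustration.

```python
import time

def await_completion(poll, timeout_s, on_timeout):
    # Wait for the completion notice; if it does not arrive within the
    # predetermined period, report an error and let the caller resend
    # the execution instruction or the connection request.
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if poll():            # completion notice received?
            return "completed"
        time.sleep(0.001)
    on_timeout()              # e.g. display an error and retransmit
    return "error"
```

Here `poll` stands in for checking whether the completion notice has arrived, and `on_timeout` for the error display and retransmission step.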
Subsequently, the user determines whether to release the cooperation state of the image forming apparatus 10 and the projector 12 (S12), and processing is performed according to the determination result (S13). In the case of releasing the cooperation state, the user provides a release instruction with the terminal device 16. Accordingly, the communication between the terminal device 16 and each of the image forming device 10 and the projector 12 is stopped. Further, the communication between the image forming apparatus 10 and the projector 12 is stopped. If the cooperation state is not released, execution instructions may continue to be provided.
In addition, the number of target devices that cooperate with each other may increase. For example, the device identification information of the third device may be obtained, and the cooperation function performed by cooperation between three devices including the image forming apparatus 10 and the projector 12 may be determined. Information indicating that the image forming apparatus 10 and the projector 12 have been identified as target devices cooperating with each other is stored in the server 14 or the terminal apparatus 16.
Device identification information indicating target devices that cooperate with each other and cooperation function information indicating the executed cooperation function may be stored in the terminal apparatus 16 or the server 14. For example, user account information (user identification information) of a user who uses the terminal apparatus 16 may be obtained, and history information indicating a correspondence relationship between the user account information, device identification information indicating target devices that cooperate with each other, and cooperation function information indicating executed cooperation functions may be created and stored in the terminal apparatus 16 or the server 14. The history information may be created by the terminal device 16 or the server 14. With reference to the history information, the cooperative function that has been executed and the means for the cooperative function are determined.
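The history record described above — the correspondence among user account information, the cooperating devices, and the executed cooperation function — can be sketched as a simple record type. The field names are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class HistoryRecord:
    # Correspondence stored in the terminal device 16 or the server 14.
    user_account: str
    device_ids: tuple
    cooperative_function: str

def record_execution(history, user, devices, function):
    # Append one record per executed cooperation function.
    history.append(HistoryRecord(user, tuple(devices), function))
    return history
```

Referring to such records later makes it possible to determine which cooperation functions were executed and by which devices, as the description states.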
As the history information, the target devices (e.g., the image forming apparatus 10 and the projector 12) that cooperate with each other may store the user account information of each user who has requested a connection and terminal identification information indicating each terminal device 16 from which a connection has been requested. With reference to the history information, the users who have used the apparatus can be determined. The history information can be used to determine the user, for example, when identifying the user who was using the apparatus when it was damaged, or when performing a charging process for consumables or the like. The history information may be stored in the server 14 or the terminal device 16, or may be stored in another device.
For example, the user account information is stored in advance in the memory 44 of the terminal device 16. The controller 48 of the terminal device 16 functions as an example of a user identification unit: it reads the user account information of the user from the memory 44 and identifies the user using the terminal device 16. If the user account information of a plurality of users is stored in the memory 44, the user specifies his/her user account information with the terminal device 16. Accordingly, the user account information of that user is read and the user is identified. Alternatively, the controller 48 of the terminal device 16 may identify the user by reading the user account information of the user logged into the terminal device 16. Alternatively, if only one piece of user account information is stored in the terminal device 16, the controller 48 may identify the user by reading that one piece of user account information. If no user account has been set and no user account information has been created, initial setting is performed to create the user account information.
The usage history of the cooperation function may be managed for each user, and information indicating the cooperation functions previously used by the user indicated by the read user account information may be displayed on the UI unit 46 of the terminal device 16. Information representing the usage history may be stored in the terminal device 16 or the server 14. Further, information indicating a cooperation function used at a predetermined frequency or more may be displayed. In the case where such a shortcut function is provided, the user's operations with respect to the cooperation function can be simplified.
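The frequency-based shortcut above can be sketched as counting a user's history and keeping the functions at or above a threshold. Function names and the threshold are illustrative assumptions.

```python
from collections import Counter

def shortcut_functions(usage_history, min_count):
    # Surface, as shortcuts, the cooperation functions this user has
    # executed at least `min_count` times.
    counts = Counter(usage_history)
    return sorted(f for f, n in counts.items() if n >= min_count)

# Illustrative per-user history.
history = ["scan-and-project", "scan-and-project", "copy"]
```

With a threshold of 2, only the repeatedly used function would be shown as a shortcut.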
In the case of executing the one-device function, information indicating an instruction to execute the one-device function is transmitted from the terminal device 16 to the device that executes the one-device function. The device performs a single device function according to the instruction.
In the above example, the cooperative function may be performed by a plurality of apparatuses. However, the cooperative function is not always executable depending on the combination of devices. Further, according to a combination of a plurality of functions (for example, a combination of functions implemented by software or a combination of functions implemented by software and functions of a hardware device), a cooperative function is not always executable. This will be described in detail below.
Fig. 14A shows an example of a combination of devices that cannot execute a cooperative function. For example, assume that the mfp (b) and the blower (D) are identified as devices. As shown in fig. 14A, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70 and 76 relating to the identified devices (mfp (b) and blower (D)) are displayed on the device display screen 68. If there is no cooperative function executable by the mfp (b) and the blower (D), and if the mfp (b) and the blower (D) are specified as target devices to cooperate with each other, no information about a cooperative function is displayed, and a message screen 78 is displayed on the UI unit 46 of the terminal device 16, as shown in fig. 14B. The message screen 78 displays a message indicating that a cooperative function cannot be executed by the mfp (b) and the blower (D).
The above-described process will be described in more detail. When the mfp (b) and the blower (D) are identified and designated as target devices cooperating with each other, the determination unit 38 of the server 14 determines (identifies) the cooperative functions that use the mfp (b) and the blower (D) by referring to the cooperative function management information 34 (e.g., the cooperative function management table shown in fig. 7), as described above. If such a cooperative function is registered in the cooperative function management table, the determination unit 38 determines the cooperative function using the mfp (b) and the blower (D). On the other hand, if no cooperative function using the mfp (b) and the blower (D) is registered in the cooperative function management table, the determination unit 38 determines that there is no cooperative function using the mfp (b) and the blower (D). In this case, the controller 36 of the server 14 outputs a message indicating that the combination of the mfp (b) and the blower (D) cannot execute a cooperative function. This message is displayed on the UI unit 46 of the terminal device 16, as shown in fig. 14B.
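The lookup in the cooperative function management table can be sketched as a map from an unordered device pair to registered functions. The encoding (a `frozenset` key) and the function names are illustrative assumptions about how such a table might be represented.

```python
# Hypothetical encoding of the cooperative function management table:
# an unordered pair of devices maps to its registered cooperative functions.

COOPERATION_TABLE = {
    frozenset({"MFP(B)", "Projector(C)"}): ["scan-and-project"],
}

def lookup_cooperative_functions(device_a, device_b):
    # An unregistered combination yields no functions; in that case a
    # message like the one on the message screen 78 is shown instead.
    return COOPERATION_TABLE.get(frozenset({device_a, device_b}), [])
```

Using a `frozenset` key makes the lookup order-insensitive, matching the fact that a pairwise combination either is or is not registered.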
Even if no cooperative function is available as in the above case, a cooperative function may become usable depending on the operation state of a device, the environment in which the device is installed (surrounding environment), or a change (update) of a function of the device. In the above example, if condensation occurs in the environment where the mfp (b) is installed, the condensation may be removed or prevented by using the blower (D). In this case, the cooperative function using the mfp (b) and the blower (D) is available, and therefore information indicating the cooperative function is displayed on the UI unit 46 of the terminal device 16. For example, the controller 36 of the server 14 monitors the operation state of each device, the environment (surrounding environment) in which each device is installed, the update state of the functions of each device, and the like, and determines the availability of cooperative functions based on the monitoring result. In the case of the combination of the mfp (b) and the blower (D), if the surrounding environment of the mfp (b) satisfies a certain condition (for example, if condensation occurs in the surrounding environment of the mfp (b)), the controller 36 determines that the cooperative function is available and determines (identifies) the cooperative function that uses the mfp (b) and the blower (D). The same applies to the operation state of the devices: if an identified or designated group of devices is in a particular operation state, the controller 36 determines that a cooperative function using the group of devices is available. The same applies to the case where the functions of the devices are updated and a cooperative function becomes available through the updated functions.
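The conditional availability described here can be sketched as a base table lookup plus an environment-dependent entry. The condition key and function names are illustrative assumptions.

```python
def available_functions(device_a, device_b, environment):
    # Base lookup in a hypothetical cooperation table, plus a conditional
    # entry: the MFP(B)+Blower(D) combination becomes usable only when
    # condensation is detected around the MFP.
    table = {frozenset({"MFP(B)", "Projector(C)"}): ["scan-and-project"]}
    functions = list(table.get(frozenset({device_a, device_b}), []))
    if {device_a, device_b} == {"MFP(B)", "Blower(D)"} and environment.get("condensation"):
        functions.append("remove-condensation")
    return functions
```

So the same pair of devices yields nothing in a dry environment but a condensation-removal function once the monitored condition holds.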
Further, there are cases where the cooperative function is not executable by a plurality of functions realized by a plurality of software units, and there are cases where the cooperative function is not executable by a function realized by a software unit and a hardware device.
Guidance process
In an exemplary embodiment, for example, when an image related to an apparatus is specified, guidance indicating a hardware apparatus or a software-implemented function capable of executing a cooperative function together with that apparatus is presented. The same applies to functions implemented by software: for example, when an image related to a function realized by software is specified, guidance indicating a hardware device or a software-implemented function capable of executing a cooperative function together with that function is presented. Hereinafter, examples of the guidance process according to an exemplary embodiment will be described in detail.
Example 1
The guidance processing according to example 1 will be described with reference to fig. 15. Fig. 15 shows an example of a device display screen according to example 1. For example, assume that the mfp (b), the projector (C), and the blower (D) are identified as devices. A device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70, 72, and 76 relating to the identified devices (mfp (b), projector (C), and blower (D)) are displayed on the device display screen 68.
In this case, for example, it is assumed that the user selects the mfp (b) and the user specifies the device image 70 related to the mfp (b). The mfp (b) corresponds to a first device, and the device image 70 relating to the mfp (b) corresponds to a first image relating to the first device. In response to the user designating the mfp (b) as the first device, the determination unit 38 of the server 14 determines the second device capable of executing the cooperative function together with the mfp (b) as the first device by referring to the cooperative function management information 34 (e.g., the cooperative function management table shown in fig. 7). For example, it is assumed that the combination of mfp (b) and projector (C) is capable of executing the cooperative function and the combination of mfp (b) and blower (D) is incapable of executing the cooperative function. That is, it is assumed that the cooperation functions using the mfps (b) and the projectors (C) are registered in the cooperation function management table and the cooperation functions using the mfps (b) and the blowers (D) are not registered in the cooperation function management table. In this case, the projector (C) is determined as the second device, and the controller 36 of the server 14 performs control to present guidance indicating the projector (C) as the second device. Specifically, device identification information representing the projector (C) is transmitted from the server 14 to the terminal apparatus 16 under the control of the controller 36. The controller 48 of the terminal device 16 presents guidance indicating the projector (C) as the second means. For example, as shown in fig. 15, the controller 48 of the terminal apparatus 16 causes an arrow 80 indicating the projector (C) as a cooperation partner device to be displayed on the device display screen 68. 
An arrow 80 is an image linking the device image 70 (first image relating to the mfp (b) as the first device) and the device image 72 (second image relating to the projector (C) as the second device) to each other. Of course, the controller 48 may present guidance indicating the projector (C) as the second device using methods other than arrows. For example, the controller 48 may present guidance indicating the second device by outputting a sound, may cause a mark superimposed on a second image (e.g., the device image 72) related to the second device to be displayed on the device display screen 68, may cause a second image related to the second device to be displayed on the device display screen 68 so that the second image is distinguishable from another image, or may cause a character string representing a collaboration partner to be displayed on the device display screen 68.
In the above case where the user designates the mfp (b) as the first device, the projector (C), which is a second device capable of executing the cooperative function together with the mfp (b), is recommended as a candidate cooperation partner device. Therefore, the user's convenience in specifying the devices required for the cooperative function can be increased as compared with the case where such a candidate is not recommended.
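The recommendation of second devices can be sketched as collecting every device that appears in a registered combination with the designated first device. The table encoding is the same illustrative assumption as before.

```python
def candidate_partners(first_device, cooperation_table):
    # Every device registered in some combination with the first device
    # is recommended (e.g. shown with a linking arrow) as a second device.
    partners = set()
    for pair in cooperation_table:
        if first_device in pair:
            partners |= set(pair) - {first_device}
    return partners

# Illustrative table: blower (D) has no registered combination with MFP(B),
# so it is not presented as a cooperation partner.
table = {
    frozenset({"MFP(B)", "Projector(C)"}): ["scan-and-project"],
    frozenset({"MFP(B)", "Camera(E)"}): ["shoot-and-print"],
}
```

Devices absent from every registered pair, like the blower (D) here, simply never appear among the guidance candidates.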
If the user designates a function image (image corresponding to the first image) related to the first function realized by the software, a guidance process similar to the above-described guidance process is also executed. That is, if the user designates a function image as the first image, guidance indicating a second function capable of performing the cooperative function together with the first function related to the function image is presented. For example, guidance indicating a second image related to the second function may be presented, or guidance indicating the second function may be presented using a sound or a character string. The second function may be a function realized by software or a function of a hardware device. Of course, if the user designates a device image related to the first device, guidance indicating a second function that can perform a cooperative function together with the first device and is implemented by software may be presented.
Since the blower (D) is a device that cannot perform the cooperation function together with the mfp (b), guidance indicating the blower (D) as a cooperation partner is not presented.
Guidance indicating the blower (D) as the second device may be presented in accordance with the environment (surrounding environment) in which the mfp (b) as the first device is installed, the operation state of the mfp (b) (e.g., the toner amount, the paper amount, whether the mfp (b) is in use, or a process end time), a change (update) of a function of the mfp (b), a change (update) of a function of the blower (D), or the like. For example, if condensation occurs in the surrounding environment of the mfp (b) as the first device, the determination unit 38 of the server 14 determines the blower (D) as the second device necessary for removing the condensation. In this case, guidance indicating the blower (D) as a cooperation partner is presented, as in the above-described case of the projector (C). For example, an arrow linking the device image 70 (the first image related to the mfp (b) as the first device) and the device image 76 (the second image related to the blower (D) as the second device) to each other is displayed, or guidance indicating the blower (D) is presented with sound.
The device or object related to the image displayed on the device display screen 68 is not necessarily recognized. For example, there may be a case where the determination unit 38 of the server 14 does not identify the blower (D). The unrecognized device or object is excluded from the candidate collaboration partner (second device). Images related to unrecognized devices or objects may or may not be displayed on the device display screen 68.
Example 2
The guidance processing according to example 2 will be described with reference to fig. 16. Fig. 16 shows an example of a device display screen according to example 2. In example 2, a plurality of apparatuses correspond to the second apparatus (cooperation partner apparatus). For example, assume that the mfp (b), the projector (C), the blower (D), and the camera (E) are recognized as devices and the foliage plant (F) is recognized as an object. Of course, objects that are not devices do not necessarily have to be identified.
A device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and device images 70, 72, 76, and 82 relating to the identified devices (mfp (b), projector (C), blower (D), and camera (E)) and an image 84 relating to the foliage plant (F) are displayed on the device display screen 68.
In this case, for example, it is assumed that the user selects mfp (b) as the first device and the user designates the device image 70 related to mfp (b) as the first image. Further, it is assumed that the projector (C) and the camera (E) are determined as the second device capable of performing the cooperative function together with the mfp (b) as the first device. In this case, guidance indicating the projector (C) and the camera (E) as the second device is presented. For example, the guidance is presented simultaneously. In the example shown in fig. 16, as in example 1, an arrow 80 linking the device image 70 relating to the mfp (b) and the device image 72 relating to the projector (C) to each other is displayed as a guide. Further, an arrow 86 linking the device image 70 relating to the mfp (b) and the device image 82 relating to the camera (E) to each other is displayed as a guide.
A priority order may be associated with the cooperative functions. In the cooperative function management information 34 (for example, the cooperative function management table shown in fig. 7), information indicating the priority order is associated with each cooperative function. In the case where guidance indicating a plurality of second devices is presented, information indicating the priority order of the respective cooperative functions is transmitted from the server 14 to the terminal apparatus 16, and the priority order is displayed on the device display screen 68. For example, if the first cooperative function, which uses the mfp (b) and the projector (C), has a higher priority than the second cooperative function, which uses the mfp (b) and the camera (E), the controller 48 of the terminal apparatus 16 causes information indicating that the projector (C) used in the first cooperative function has a higher priority than the camera (E) used in the second cooperative function to be displayed on the device display screen 68. The controller 48 may cause a character string indicating the priority order to be displayed on the device display screen 68, may cause the arrows 80 and 86 to be displayed in different colors, may cause the device images 72 and 82 to be displayed in different display forms, or may cause the arrow 80 for the higher-priority device image 72 to be displayed while the arrow 86 for the lower-priority device image 82 is not displayed.
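Ordering the guidance by the registered priorities can be sketched as a simple sort over candidate pairs. The numeric convention (1 = highest) is an illustrative assumption.

```python
def order_candidates_by_priority(candidates):
    # candidates: (second_device, priority) pairs taken from the
    # cooperative function management information; assume 1 is highest.
    return [device for device, _ in sorted(candidates, key=lambda c: c[1])]
```

Applied to the example above, the projector (C) with priority 1 would be listed ahead of the camera (E) with priority 2, which could then drive the arrow colors or the decision to display only the higher-priority arrow.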
Alternatively, instead of displaying an arrow, the controller 48 of the terminal apparatus 16 may display a character string representing the second device in a specific area of the device display screen 68. For example, the character string may be displayed in an area where no device image is displayed. This prevents an arrow from making the information displayed on the screen difficult to see.
In the above case where the user designates the mfp (b) as the first device, the projector (C) and the camera (E) as the second devices capable of performing the cooperative function together with the mfp (b) are recommended as candidate cooperation partner devices.
If a candidate cooperation partner apparatus (second apparatus) is not displayed on the device display screen 68, position information indicating where the second apparatus is installed, or guidance to that position, may be displayed on the UI unit 46 of the terminal device 16. For example, the controller 36 of the server 14 obtains the position information of the second apparatus using a GPS function or the like, and creates information representing guidance to the position of the second apparatus relative to the position of the terminal apparatus 16, based on the obtained position information and the position information of the terminal apparatus 16. The information representing the guidance may be transmitted from the server 14 to the terminal device 16 and displayed on the UI unit 46 of the terminal device 16.
As in example 1, the processing according to example 2 is applicable to the case of using a function image related to a function. For example, if the user designates a function image as a first image, guidance indicating a plurality of second functions capable of performing a cooperative function together with the first function related to the function image may be presented. Of course, if the user designates a device image related to the first device, guidance indicating a plurality of functions capable of performing a cooperative function with the first device may be presented.
Example 3
The guidance processing according to example 3 will be described with reference to fig. 17 to 20. Fig. 17 to 20 each show an example of a device display screen according to example 3. In example 3, when a user designates a first device and then designates a second device, control is performed to present guidance indicating a third device capable of executing a cooperative function together with the first and second devices. The device recommended as the third device may change according to the order in which the first and second devices are designated.
For example, it is assumed that the pc (a), the mfp (b), the projector (C), and the camera (E) are recognized as devices, and device images 70, 72, 82, and 88 and an image 84 related to the recognized devices (pc (a), mfp (b), projector (C), and camera (E)) are displayed on the device display screen 68, as shown in fig. 17.
In this case, for example, if the user specifies the device image 70 relating to the mfp (b), the determination unit 38 of the server 14 recognizes the mfp (b) as the first device, and the pc (a), the projector (C), and the camera (E) capable of performing the cooperative function together with the mfp (b) are recognized as the second devices (candidate cooperative partner devices). As shown in fig. 17, for example, arrows 80, 86, and 90 are displayed as guides indicating the second device. An arrow 90 is an image linking the device image 70 relating to the mfp (b) and the device image 88 relating to the pc (a) to each other.
Subsequently, it is assumed that the user selects the projector (C) as the collaboration partner device from the group of second devices and the user specifies the device image 72 related to the projector (C). In this case, the determination unit 38 of the server 14 determines a third device capable of performing a cooperative function together with the mfp (b) as the first device and the projector (C) as the second device by referring to the cooperative function management information 34. In the cooperation function management table shown in fig. 7, a cooperation function executable by cooperation between two devices is registered. However, it is of course possible to register a cooperative function that can be performed by cooperation between three or more devices. For example, assume that pc (a) is determined as the third device. In this case, guidance indicating the pc (a) as the third device is presented as shown in fig. 18. For example, an arrow 92 linking the device image 72 related to the projector (C) as the second device and the device image 88 related to the pc (a) as the third device to each other is displayed as a guide.
For example, the user may specify the device image 72 related to the projector (C), perform an operation of linking the device image 70 related to the mfp (b) and the device image 72 related to the projector (C) with each other, superimpose the device image 70 on the device image 72, or place a pointer on the device image 70 and then move the pointer to the device image 72 to specify the projector (C) serving as the second device.
The order in which the respective devices (respective device images) are designated corresponds to the order in which the functions of the devices are used or the order in which data is moved between the devices. An operation of specifying a device (for example, an operation of linking images or superimposing an image on another image) serves as an operation of specifying the order in which the functions are used or the data movement order. In the example shown in fig. 18, the mfp (b) as the first device is used first, and the projector (C) as the second device is used second. Guidance indicating a third device that is capable of performing a cooperative function with the first and second devices and is used third in the cooperative function is presented. That is, guidance indicating the third device used third in the cooperative function, in which the mfp (b) is used first and the projector (C) is used second, is presented. In the example shown in fig. 18, the pc (a) is the third device. In example 3, a cooperative function that can be executed by cooperation among a plurality of devices is registered in the cooperative function management information 34, and the order of use of the devices is also registered. The determination unit 38 of the server 14 determines the third device by referring to the cooperative function management information 34.
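The order-sensitive lookup described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the table structure, device names, and function names are all hypothetical, assuming a management table keyed by the ordered sequence of devices specified so far.

```python
# Hypothetical sketch of the cooperative function management information 34.
# Keys are ordered tuples of already-specified devices (the order reflects
# the order in which the devices' functions are used); values are the
# candidate devices that can be used next in a cooperative function.
COOPERATION_TABLE = {
    ("MFP(B)",): ["PC(A)", "Projector(C)", "Camera(E)"],
    ("Projector(C)",): ["PC(A)", "MFP(B)", "Camera(E)"],
    ("MFP(B)", "Projector(C)"): ["PC(A)"],       # cf. fig. 18
    ("Projector(C)", "MFP(B)"): ["Camera(E)"],   # cf. fig. 20
}

def next_candidates(specified_in_order):
    """Return the devices that could be used next, given the devices
    specified so far, in order of use."""
    return COOPERATION_TABLE.get(tuple(specified_in_order), [])
```

Because the key is an ordered tuple, `next_candidates(["MFP(B)", "Projector(C)"])` and `next_candidates(["Projector(C)", "MFP(B)"])` return different third-device candidates, mirroring how the presented guidance changes with the specification order.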
Fig. 19 shows another example. As in fig. 17, device images 70, 72, 82, and 88 and an image 84 are displayed on the device display screen 68.
In this case, for example, if the user specifies the device image 72 relating to the projector (C), the projector (C) is recognized as the first device, and the determination unit 38 of the server 14 recognizes the pc (a), the mfp (b), and the camera (E) capable of performing the cooperative function together with the projector (C) as the second device (candidate cooperative partner device). As shown in fig. 19, for example, arrows 94, 96, and 98 are displayed as guides indicating the second device. An arrow 94 is an image linking the device image 72 relating to the projector (C) as the first device and the device image 70 relating to the mfp (b) as the second device to each other. The arrow 96 is an image linking the device image 72 and the device image 82 related to the camera (E) as the second device to each other. An arrow 98 is an image linking the device image 72 and the device image 88 relating to the pc (a) as the second device to each other.
Subsequently, it is assumed that the user selects mfp (b) as a cooperation partner device from the group of second devices and the user specifies the device image 70 related to the mfp (b). In this case, the determination unit 38 of the server 14 determines a third device capable of performing the cooperation function together with the projector (C) as the first device and the mfp (b) as the second device by referring to the cooperation function management information 34. For example, assume that the camera (E) is determined as the third device. In this case, guidance indicating the camera (E) as the third device is presented, as shown in fig. 20. For example, an arrow 100 linking the device image 70 related to the mfp (b) as the second device and the device image 82 related to the camera (E) as the third device to each other is displayed as a guide. In this case, guidance indicating, as the third device, a device (e.g., the camera (E)) that is used third in the cooperation function in which the projector (C) is used first and the mfp (b) is used second is presented.
For example, the user may specify the device image 70 related to the mfp (b), perform an operation of linking the device image 72 related to the projector (C) and the device image 70 related to the mfp (b) with each other, superimpose the device image 72 on the device image 70, or place a pointer on the device image 72 and then move the pointer to the device image 70 to specify the mfp (b) serving as the second device.
In the above manner, guidance indicating a third device capable of performing a cooperative function together with the first and second devices is presented. The device presented (recommended) as the third device changes according to the order in which the first and second devices are specified. The order in which the devices are designated corresponds to the order in which the functions of the devices are used or the order in which data is moved between the devices. The operation of specifying a device serves as an operation of specifying the order in which the functions are used or the order of data movement. The device used next in the cooperation function, or the device serving as the destination of data, changes according to the order in which the devices are designated. Thus, in example 3, guidance indicating the third device to be used in the cooperative function is presented in accordance with that change.
As in example 1, the processing according to example 3 is applicable to the case of using a function image related to a function. For example, if the user designates a function image related to a first function, guidance indicating a function image related to a second function capable of performing a cooperative function together with the first function is presented. If the user specifies a function image related to the second function, guidance may be presented indicating a function image related to a third function capable of performing a cooperative function together with the first and second functions. In this case, the function presented as the third function is changed according to the order in which the first and second functions are specified. In example 3, the cooperation function may be a function using a function of a hardware device and a function realized by software.
Example 4
The guidance processing according to example 4 will be described with reference to fig. 21 and 22. Fig. 21 and 22 each show an example of a device display screen according to example 4. In example 4, if a device incapable of performing a cooperative function with a first device is specified, guidance indicating a second device (a cooperation partner device) capable of performing a cooperative function with the first device is presented. For example, assume that the mfp (b), the projector (C), and the blower (D) are identified as devices.
As shown in fig. 21, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70, 72, and 76 relating to the identified devices (mfp (b), projector (C), and blower (D)) are displayed on the device display screen 68.
In this case, for example, if the user selects mfp (b) as the first device and the user designates device image 70 related to mfp (b) as the first image, determination unit 38 of server 14 recognizes mfp (b) as the first device. For example, it is assumed that the projector (C) corresponds to a second device capable of performing a cooperative function together with the mfp (b) as the first device and the blower (D) corresponds to a device incapable of performing a cooperative function together with the mfp (b). In this case, for example, it is assumed that the user designates the blower (D), which cannot perform the cooperative function together with the mfp (b), as a cooperation partner device by designating the device image 76 relating to the blower (D), performing an operation of linking the device image 70 relating to the mfp (b) and the device image 76 relating to the blower (D) with each other, superimposing the device image 70 on the device image 76, or placing a pointer on the device image 70 and then moving the pointer to the device image 76. In the example shown in fig. 21, the operation of linking the device image 70 and the device image 76 to each other is performed by the user as indicated by an arrow 102.
When the user designates the blower (D), which is incapable of performing the cooperative function together with the mfp (b) as the first device, as a cooperation partner device, the controller 36 of the server 14 receives the designation and performs control to present guidance indicating the projector (C) as the second device capable of performing the cooperative function together with the mfp (b). Thus, guidance indicating the projector (C) as the second device is presented. For example, as shown in fig. 22, the controller 48 of the terminal apparatus 16 causes an arrow 104 indicating the projector (C) as a cooperation partner apparatus to be displayed on the apparatus display screen 68. For example, the arrow 104 is an image linking the device image 70 relating to the mfp (b) as the first device and the device image 72 relating to the projector (C) as the second device to each other. Of course, guidance may be presented by using sound or by displaying a character string.
As described above, in example 4, if a device incapable of performing a cooperative function with the first device is specified, control is performed to present guidance indicating a second device capable of performing a cooperative function with the first device. If an arrow or the like serving as guidance were displayed every time the first device is specified, the screen could become cluttered. This situation can be avoided in example 4.
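The behavior of example 4 can be sketched as a small check: guidance is returned only when the specified partner cannot cooperate with the first device, and suppressed otherwise so that the screen stays uncluttered. This is an illustrative sketch only; the table and device names are hypothetical.

```python
def guidance_on_specify(first_device, partner, table):
    """Return candidate partners for `first_device` only when the
    specified partner cannot cooperate with it; return None when the
    partner is valid, so no guidance (e.g. an arrow) is displayed."""
    candidates = table.get(first_device, [])
    if partner in candidates:
        return None  # valid cooperation partner: no guidance needed
    return candidates  # presented as guidance, e.g. arrow 104 in fig. 22

# Hypothetical compatibility table: the MFP can cooperate with the projector.
TABLE = {"MFP(B)": ["Projector(C)"]}
```

For instance, specifying the blower yields `["Projector(C)"]` as guidance, while specifying the projector yields `None` and no arrow is drawn.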
As in example 1, the processing according to example 4 is applicable to the case of using a function image related to a function. For example, if a function that cannot perform the cooperative function together with the first function is specified, guidance indicating a second function that can perform the cooperative function together with the first function is presented.
Example 5
Example 5 will be described with reference to fig. 23 to 25. Fig. 23 to 25 each show an example of a screen according to example 5. In example 5, if an apparatus designated as a collaboration partner apparatus by a user cannot perform a collaboration function with a first apparatus because the apparatus is damaged or in use, guidance indicating another apparatus capable of performing the collaboration function with the first apparatus is presented. In this case, guidance indicating the same type of device (e.g., a device having the same type of function) as the device designated as the collaboration partner device by the user may be preferentially presented. The controller 36 of the server 14 obtains information indicating the operation state of each device (e.g., whether the device is performing processing, is broken, or is being maintained) from each device and manages the operation state of each device.
For example, assume that the mfp (b) and the projectors (C) and (F) are identified as devices. As shown in fig. 23, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70, 72, and 106 related to the identified devices (mfp (b) and projectors (C) and (F)) are displayed on the device display screen 68.
In this case, for example, if the user selects mfp (b) as the first device and the user designates device image 70 related to mfp (b) as the first image, determination unit 38 of server 14 recognizes mfp (b) as the first device. For example, it is assumed that the projectors (C) and (F) correspond to a second device capable of performing a cooperative function together with the mfp (b) as the first device. For example, assume that the user designates the projector (F) as a cooperation partner apparatus by designating the device image 106 relating to the projector (F), performing an operation of linking the device image 70 relating to the mfp (b) and the device image 106 relating to the projector (F) with each other, superimposing the device image 70 on the device image 106, or placing a pointer on the device image 70 and then moving the pointer to the device image 106. In the example shown in fig. 23, an operation of linking the device image 70 and the device image 106 to each other is performed by the user as indicated by an arrow 108.
When the user designates the projector (F) as a cooperation partner apparatus, the controller 36 of the server 14 receives the designation and checks the operation state of the projector (F). For example, if the projector (F) is broken or being used, the controller 36 of the server 14 performs control to present guidance indicating a device other than the projector (F), that is, another device capable of performing a cooperative function together with the first device. The controller 36 may preferentially present guidance indicating a device of the same type as the projector (F) (e.g., a device having a function of the same type as the projector (F)). For example, if the projector (C) is the same type of device as the projector (F), guidance indicating the projector (C) as the second device is preferentially presented. In this case, as shown in fig. 24, for example, the controller 48 of the terminal apparatus 16 causes an arrow 110 indicating the projector (C) as a cooperation partner device to be displayed on the device display screen 68. For example, an arrow 110 is an image linking the device image 70 relating to the mfp (b) as the first device and the device image 72 relating to the projector (C) as the second device to each other.
If the user designates a damaged or in-use device as a cooperation partner device, a screen 112 shown in fig. 25, showing a message indicating the reason why cooperation is not possible, may be displayed on the UI unit 46 of the terminal device 16 under the control of the controller 36 of the server 14.
If a damaged device becomes available after repair, or if a device that was performing processing completes that processing, the controller 36 of the server 14 identifies the device as a device capable of performing a cooperative function with the first device.
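The alternative-device selection in example 5 can be sketched as follows: if the requested partner is broken or in use, another available candidate is suggested, with devices of the same type as the requested one preferred. All names and the status/type dictionaries are hypothetical; this is a simplified sketch of the described logic, not the patent's implementation.

```python
def suggest_alternative(requested, candidates, status, dev_type):
    """If the requested partner device is unavailable (broken or in use),
    suggest another available candidate, preferring a device of the same
    type as the one the user requested."""
    if status.get(requested) == "available":
        return requested
    usable = [d for d in candidates
              if d != requested and status.get(d) == "available"]
    same_type = [d for d in usable
                 if dev_type.get(d) == dev_type.get(requested)]
    # Prefer same-type devices; fall back to any usable device, else None.
    return (same_type or usable or [None])[0]

# Hypothetical operation states and device types, cf. figs. 23 and 24.
STATUS = {"Projector(F)": "in use", "Projector(C)": "available"}
TYPES = {"Projector(F)": "projector", "Projector(C)": "projector"}
```

With these data, requesting the in-use projector (F) yields the projector (C), matching the preferential presentation of a same-type device described above.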
According to example 5, guidance indicating a device that is neither damaged nor in use is presented, and therefore, user convenience can be increased. In addition, guidance indicating a device of the same type as the device specified by the user is presented, and therefore, guidance indicating a device that the user is likely to intend to use is presented.
Example 6
The guidance processing according to example 6 will be described with reference to fig. 26. Fig. 26 shows an example of a device selection screen. In example 6, a candidate list showing information on one or more second apparatuses capable of performing a cooperation function together with a first apparatus is displayed on the UI unit 46 of the terminal device 16.
For example, it is assumed that mfp (b) and blower (D) are recognized as devices and a device image 70 related to mfp (b) and a device image 76 related to blower (D) are displayed on the UI unit 46 of the terminal device 16, as shown in fig. 14A. In this case, for example, if the user selects mfp (b) as the first device and the user designates the device image 70 relating to mfp (b), the determination unit 38 of the server 14 identifies mfp (b) as the first device. For example, assume that the blower (D) is a device that cannot perform a cooperative function together with the mfp (b). In this case, for example, it is assumed that the user designates the blower (D) as a cooperation partner device by designating the device image 76 relating to the blower (D), performing an operation of linking the device image 70 relating to the mfp (b) and the device image 76 relating to the blower (D) with each other, superimposing the device image 70 on the device image 76, or placing a pointer on the device image 70 and then moving the pointer to the device image 76.
When the user designates the blower (D) incapable of performing the cooperative function together with the mfp (b) as the first device as the cooperative partner device, the controller 36 of the server 14 receives the designation, and as control to present guidance indicating one or more second devices capable of performing the cooperative function together with the mfp (b), performs control to display a candidate list showing information on the one or more second devices. Therefore, as shown in fig. 26, a device selection screen 114 is displayed on the UI unit 46 of the terminal apparatus 16, and a candidate list is displayed on the device selection screen 114. The message screen 78 shown in fig. 14B may be displayed on the UI unit 46 of the terminal apparatus 16 before the device selection screen 114 is displayed.
As shown in fig. 26, the candidate list includes names of the respective devices capable of executing the cooperation function together with the mfp (b), device images related to the respective devices, examples of the cooperation function (e.g., names of the cooperation function), and the like. Of course, the candidate list may include at least one of these pieces of information. Each device image may be an image representing the appearance of an actual device (an image related to the device in a one-to-one relationship), or may be an image schematically depicting the device (e.g., an icon). For example, the image representing the appearance of the actual device is an image generated by capturing the appearance of the device, and is an image representing the device itself. The image schematically depicting the device corresponds to an image representing the type of device. As an example of the collaboration function, the candidate list includes the name of one collaboration function or the names of a plurality of collaboration functions. In the case where the names of the plurality of cooperation functions are displayed, the names of the respective cooperation functions may be displayed in a display order corresponding to the order in which the plurality of target devices cooperating with each other are specified. For example, if a combination of the MFP and the projector is capable of executing a plurality of cooperation functions (e.g., cooperation functions A and B), the display order of the plurality of cooperation functions included in the candidate list may be different between when the MFP is designated as the first device and when the projector is designated as the first device. For example, if the MFP is designated as the first device, the names of the respective cooperation functions may be displayed in the order of the cooperation function A and the cooperation function B.
If the projector is specified as the first device, the names of the respective cooperation functions may be displayed in the order of the cooperation function B and the cooperation function A.
The order of arrangement of the second devices in the candidate list may be determined based on, for example, past usage records of the respective second devices. For example, the controller 36 of the server 14 manages the usage records of the respective devices by obtaining information indicating the usage records of the devices from the respective devices. For example, the controller 36 displays the second devices in the candidate list in descending order of the frequency of use (descending order of the number of uses). The past usage record may be a usage record of a user who specified the first image (e.g., a user who used the terminal device 16 or a user who logged into the server 14), or may be a usage record including a usage record of another user.
As another example, the controller 36 may display the second devices in the candidate list in descending order of the number of executable cooperative functions. For example, if the number of cooperative functions executable by the projector and the mfp (b) is 3 and if the number of cooperative functions executable by the PC and the mfp (b) is 2, the projector is displayed above the PC in the candidate list.
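Both orderings described above (by past usage frequency, or by the number of executable cooperative functions) reduce to one descending sort over per-device counts. The sketch below is illustrative; the count values and device names are hypothetical.

```python
def order_candidates(candidates, counts):
    """Sort second-device candidates in descending order of `counts`,
    which may hold either past usage numbers or the number of
    cooperative functions executable together with the first device."""
    return sorted(candidates, key=lambda d: counts.get(d, 0), reverse=True)

# Hypothetical example: the projector supports 3 cooperative functions
# with the MFP, the PC supports 2, so the projector is listed above the PC.
FUNCTION_COUNTS = {"Projector(C)": 3, "PC(A)": 2}
```

Devices missing from `counts` default to 0 and therefore sink to the bottom of the candidate list.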
If the functions of the respective devices are updated and if the cooperative function management information 34 is updated, the controller 36 updates the display of the second device in the candidate list in accordance with the update. For example, if a device that cannot execute the cooperation function with the MFP before the update becomes able to execute the cooperation function with the MFP after the update, the controller 36 displays the device as the second device in the candidate list. Further, the usage records of the respective devices are updated over time, and the controller 36 updates the display order of the second devices in the candidate list based on the updated usage records.
The controller 36 of the server 14 may update the second devices included in the candidate list or may update the display order of the second devices in the candidate list according to the operation states or surrounding environments of the first and second devices.
Devices that can perform a cooperative function with the MFP but are damaged or in use need not be displayed in the candidate list. Even in this case, once such a device has been repaired or becomes available, the device is displayed in the candidate list.
If the user specifies a device name or a device image included in the candidate list, the controller 48 of the terminal apparatus 16 may cause the UI unit 46 of the terminal apparatus 16 to display a list of one or more cooperative functions that can be executed with the specified device by the MFP as the first device under the control of the controller 36 of the server 14. For example, if a projector in the candidate list is specified, the controller 48 of the terminal device 16 causes the UI unit 46 to display a list of one or more cooperative functions that can be executed by the MFP and the projector. If the user designates a cooperation function included in the candidate list, control is performed to execute the designated cooperation function.
Each of the second devices displayed in the candidate list may be a device included in a device group registered in advance in the server 14, a device included in a device group identified using AR technology or the like, a device included in a device group displayed on the UI unit 46 of the terminal apparatus 16, or a device included in a device group displayed in a specific area in the screen of the UI unit 46. For example, if the user operates a device image related to the device, the device image is displayed in the specific area. The first device may also be a device included in the group of devices. The same applies to the above examples 1 to 5 and the following examples.
As in example 1, the processing according to example 6 is applicable to the case of using a function image related to a function. For example, if a device or function that cannot perform the cooperative function together with the first function is specified, a candidate list showing devices (candidate second devices) or functions (candidate second functions) that can perform the cooperative function together with the first function may be displayed. For example, the name of the second function, a function image related to the second function, an example of the cooperation function, and the like are included in the candidate list. Alternatively, if a function that cannot perform the cooperative function with the first device is specified, a candidate list may be displayed.
According to example 6, the candidate second device or the candidate second function is displayed as a candidate list, which may be convenient for the user.
Example 7
Example 7 will be described with reference to fig. 26 to 32. In example 7, the candidate list is displayed on the UI unit 46 of the terminal device 16 as in example 6.
As in example 6, it is assumed that the user designates mfp (b) as the first device and the user designates a device that cannot execute the cooperative function together with mfp (b). The controller 36 of the server 14 receives the designation, and as control to present guidance indicating one or more second devices capable of executing the cooperative function with the mfp (b), performs control to display a candidate list showing information about the one or more second devices. Accordingly, the device selection screen 114 is displayed on the UI unit 46 of the terminal apparatus 16, as shown in fig. 26.
For example, if the user specifies a device name, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a list of devices of the same type as the specified device (a list of second devices) under the control of the controller 36 of the server 14. For example, if the user designates a projector, the device selection screen 116 is displayed on the UI unit 46 of the terminal apparatus 16, as shown in fig. 27. A list of projectors as second devices is displayed on the device selection screen 116. For example, if the determination unit 38 of the server 14 determines (identifies) the projectors aaa, bbb, and ccc as projectors corresponding to the second device, a list showing these projectors is displayed. The user selects a projector to be used as the second device from the list.
Further, a message asking the user whether to add a target device for cooperation is displayed on the device selection screen 116. For example, if the user specifies the second device and then the user provides an instruction to add a target device for cooperation (for example, if the user selects "yes" in fig. 27), the device selection screen 118 is displayed on the UI unit 46 of the terminal apparatus 16 as shown in fig. 28, and a candidate list showing information on one or more third devices capable of performing the cooperation function together with the first and second devices is displayed on the device selection screen 118. This candidate list has the same configuration as the candidate list showing information on the one or more second devices. The display order of the third devices may be different from the display order in the candidate list of the second devices. If the user designates a third device, a list of third devices is displayed as in the device selection screen 116 (see fig. 27) displayed in response to the designation of the second device. The same applies to the case where a fourth device, a fifth device, and so on are added.
If an image (e.g., an icon) schematically depicting a device is included in the candidate list as a device image and if the device image is specified by the user, a list of second devices (e.g., projectors) related to the device image is displayed as shown in fig. 27. That is, such a device image represents a type of device, and thus generically represents a device. Accordingly, if the device image is specified, a list of second devices related to the device image is displayed. On the other hand, an image representing an actual device (an image related to the device in a one-to-one relationship) is an image representing the device itself. Therefore, if such a device image is included in the candidate list and is specified by the user, the user has specified the device itself. In this case, the list of second devices shown in fig. 27 is not displayed, and only a message inquiring of the user whether to add a target device for cooperation may be displayed.
If the target device of the cooperation is not added (for example, if the user designates "no" in fig. 27), the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display the function selection screen 120 shown in fig. 29 under the control of the controller 36 of the server 14. A list of the cooperative functions executable by the plurality of devices specified by the user is displayed on the function selection screen 120. For example, if the mfp (b) is designated as the first device and if the projector aaa is designated as the second device, a list of the cooperation functions executable by the mfp (b) and the projector aaa is displayed. If the user specifies the cooperation function from the list and provides an instruction to execute the cooperation function, the cooperation function is executed by the mfp (b) and the projector aaa.
On the other hand, to execute a cooperation function that is not included in the list of cooperation functions, the user requests execution of that cooperation function. For example, as shown in fig. 30, a screen 122 for making a request is displayed on the UI unit 46 of the terminal device 16, and the user inputs the name of the cooperation function to be executed on the screen 122. Information representing the request is sent from the terminal device 16 to the server 14. In response to receipt of the request, the controller 36 of the server 14 determines whether the cooperation function related to the request can be performed by the designated devices (e.g., the first and second devices). If the cooperation function related to the request is not executable, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display a message indicating that the cooperation function related to the request is not executable under the control of the controller 36 of the server 14, as shown in fig. 31. If the cooperation function related to the request is executable, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a message indicating that the cooperation function related to the request is to be executed, under the control of the controller 36 of the server 14, as shown in fig. 32. If the user provides the execution instruction, the cooperative function is executed. In addition, the collaboration function requested by the user may be registered. For example, as shown in fig. 32, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display a message inquiring of the user whether or not to register the requested cooperation function as a candidate cooperation function from now on, under the control of the controller 36 of the server 14.
If the user selects "register", information on the requested cooperative function is registered in the cooperative function management information 34 and is included in the displayed list of cooperative functions from now on. If the user selects "unregister", information on the requested cooperation function is not registered and is not included in the list of cooperation functions. For example, if the requested cooperative function corresponds to exceptional processing, if the frequency of use of other cooperative functions is higher, or if the user wants to prevent a situation in which the number of cooperative functions included in the list increases and the list becomes complicated, the requested cooperative function is not registered.
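The request-and-register flow of example 7 can be sketched as follows. The capability table, function name, and device names are hypothetical; this is a simplified illustration of the described checks, not the patent's implementation.

```python
def handle_request(name, devices, capability, registry, register=True):
    """Handle a user request for a cooperation function not shown in the
    list: check whether the specified devices can execute it, and if so,
    optionally register it so it appears in future candidate lists."""
    if name not in capability.get(frozenset(devices), set()):
        return "not executable"   # cf. the message of fig. 31
    if register and name not in registry:
        registry.append(name)     # included in the list from now on
    return "executable"           # cf. the message of fig. 32

# Hypothetical capability table: which function names each unordered
# device combination can execute together.
CAPABILITY = {frozenset(["MFP(B)", "Projector aaa"]): {"projection print"}}
```

Choosing "unregister" simply corresponds to calling `handle_request(..., register=False)`, so the function executes but is not added to the list.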
As in example 1, the processing according to example 7 is applicable to the case of using a function image related to a function.
According to example 7, the candidate second device or the candidate second function is displayed as a candidate list, which may be convenient for the user. In addition, the target devices of the cooperation can be easily increased using the candidate list.
Example 8
Example 8 will be described with reference to fig. 33 to 35. In example 8, as in examples 6 and 7, the candidate list is displayed on the UI unit 46 of the terminal device 16.
As in examples 6 and 7, it is assumed that the user designates mfp (b) as the first device and the user designates a device that cannot execute the cooperative function together with mfp (b). In this case, as shown in fig. 26, the device selection screen 114 is displayed on the UI unit 46 of the terminal apparatus 16.
If the user designates a cooperative function (e.g., "print display screen") on the device selection screen 114, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a device selection screen 124 shown in fig. 33 under the control of the controller 36 of the server 14. On the device selection screen 124, a list of devices (a list of second devices) necessary to execute the cooperative function designated by the user, that is, devices capable of executing the cooperative function, is displayed. If the user specifies a device (second device) in the list and provides an instruction to execute the cooperative function, the cooperative function specified by the user is executed by the first and second devices specified by the user.
If the user provides an instruction to select a device not included in the device list displayed on the device selection screen 124 (i.e., if "yes" is selected in response to the query "do you want to select another device?"), the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a device selection screen 126 under the control of the controller 36 of the server 14. On the device selection screen 126, a candidate list of other devices capable of executing the cooperative function designated by the user is displayed. If the user specifies a device (second device) on the device selection screen 126 and provides an instruction to execute the cooperative function, the cooperative function specified by the user is executed by the first and second devices specified by the user.
If the user provides an instruction to select a device not included in the device list displayed on the device selection screen 126 (i.e., if "yes" is selected in response to the query "do you want to select another device?"), the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a screen 128 under the control of the controller 36 of the server 14. The user inputs information about the target device of the cooperation (e.g., the name or type of the device) on the screen 128. The information about the target device input by the user is transmitted from the terminal apparatus 16 to the server 14. In response to receipt of the information, the controller 36 of the server 14 determines whether the cooperative function specified by the user can be executed by the first device and the target device specified by the user. If the cooperative function is not executable, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a screen for setting another device as the target device, under the control of the controller 36 of the server 14. If the cooperative function is executable and an instruction to execute the cooperative function is provided, the cooperative function designated by the user is executed by the first and second devices (target devices cooperating with each other) designated by the user.
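The three-stage selection flow of example 8 (the required-device list of screen 124, the candidate list of other devices of screen 126, and free-form input checked by the server on screen 128) can be sketched as follows; the function and parameter names are hypothetical.

```python
# Illustrative sketch of the escalating device-selection flow; all names
# are assumptions, not part of the embodiment.

def resolve_second_device(choice, required_devices, other_candidates, executable_with):
    """Decide how a chosen second device is handled.

    required_devices : devices shown on the device selection screen 124
    other_candidates : devices shown on the device selection screen 126
    executable_with  : devices the server determines can execute the
                       cooperative function with the first device
                       (checked for free-form input on screen 128)
    """
    if choice in required_devices:       # picked from screen 124
        return "execute"
    if choice in other_candidates:       # picked from screen 126
        return "execute"
    if choice in executable_with:        # free-form input, server-checked
        return "execute"
    return "choose another device"       # screen for setting another device
```
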
As in example 1, the processing according to example 8 is applicable to the case of using a function image related to a function.
According to example 8, when a cooperation function is specified in the candidate list, apparatuses required to perform the cooperation function are displayed, and therefore, user convenience in selecting an apparatus can be increased.
The exemplary embodiments are applicable to an environment in which a plurality of users use a plurality of devices. For example, even if a user interface such as a touch screen is removed from the apparatus, the terminal device 16 serves as the user interface. In another case, for example, if the user uses the device temporarily while traveling, a user interface suitable for the user, that is, a user interface that displays one or more functions of the device designated by the user and one or more cooperative functions using the device, is implemented by the terminal device 16.
Hereinafter, processing related to examples 1 to 8 will be described.
Processing for switching display of information on cooperative function
In an exemplary embodiment, the display of the information on the cooperative function may be switched according to an order in which the device images related to the devices are linked to each other. In this case, if the device designated as the cooperation partner device is not capable of performing the cooperation function together with the first device, control is performed to present guidance indicating a second device capable of performing the cooperation function together with the first device, as in the above-described examples 1 to 8. On the other hand, if the user designates a second device capable of performing a cooperative function together with the first device, the display of information on the cooperative function is switched according to the order in which the device images are linked to each other. This process will be described in detail below with reference to fig. 36 to 38B.
Fig. 36 shows a cooperation function management table as another example of the cooperation function management information 34. In the cooperation function management table, for example, information indicating a combination of device IDs, information indicating names (types) of target devices cooperating with each other, information indicating one or more cooperation functions (cooperation function information), information indicating a link order, and information indicating a priority order are associated with each other. The link order corresponds to an order in which device images related to devices are linked to each other. The priority order is a priority order in which information about the cooperative function is displayed. For example, the device having the device ID "a" is a PC, and the device having the device ID "B" is an MFP. The cooperation between the pc (a) and the mfp (b) realizes, for example, a scan-and-transfer function and a print function as cooperation functions. The scan and transfer function is a function of transferring image data generated by scanning by the mfp (b) to the pc (a). The print function is a function of transmitting data (e.g., image data or document data) stored in the pc (a) to the mfp (b) and printing the data by the mfp (b). For example, if a link is made from mfp (b) to pc (a), that is, if a link is made from a device image related to mfp (b) to a device image related to pc (a), the priority order of the scan and transfer function is "1" and the priority order of the print function is "2". In this case, the information on the scan-and-transfer function is displayed in preference to the information on the print function. On the other hand, if a link is made from the pc (a) to the mfp (b), that is, if a link is made from the device image relating to the pc (a) to the device image relating to the mfp (b), the priority order of the print function is "1", and the priority order of the scan-and-transfer function is "2". In this case, the information on the print function is displayed in preference to the information on the scan-and-transfer function.
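The cooperation function management table of fig. 36 can be modeled as a mapping from an ordered device pair (the link order) to a priority-ordered list of cooperative functions. The following Python sketch assumes this structure; the names are illustrative.

```python
# Sketch of the cooperation function management table of fig. 36.
# The data layout is an assumption made for illustration.

COOPERATION_TABLE = {
    # (link source, link destination): cooperative functions,
    # highest display priority first
    ("MFP(B)", "PC(A)"): ["scan and transfer", "print"],
    ("PC(A)", "MFP(B)"): ["print", "scan and transfer"],
}

def functions_in_display_order(first_device, second_device):
    """Return the candidate cooperative functions in the order they are
    displayed, according to the order in which the device images were
    linked (first_device -> second_device)."""
    return COOPERATION_TABLE[(first_device, second_device)]
```

Reversing the link direction reverses the display priority, which is exactly the switching behavior described above.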
Figs. 37A to 38B each show an example of a screen displayed on the UI unit 46 of the terminal device 16. For example, assume that MFP (B) and PC (A) are identified. As shown in fig. 37A, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and a device image 70 related to mfp (b) and a device image 88 related to pc (a) are displayed on the device display screen 68. In this state, the user links the device images representing the target devices to each other using a pointer (e.g., the user's finger, a pen, or a stylus). The controller 48 of the terminal device 16 detects a touch of the pointer on the device display screen 68 and detects movement of the pointer on the device display screen 68. For example, as indicated by arrow 130, the user touches the device image 70 on the device display screen 68 with the pointer and moves the pointer to the device image 88 on the device display screen 68, thereby linking the device image 70 to the device image 88. Accordingly, the mfp (b) related to the device image 70 and the pc (a) related to the device image 88 are specified as target devices that cooperate with each other, and the link order is also specified. The order in which the device images are linked corresponds to the link order. MFP (B) corresponds to the first device, and PC (A) corresponds to the second device. In the example shown in fig. 37A, the link is made from the device image 70 to the device image 88, that is, from the mfp (b) to the pc (a). Information indicating the link order of the devices is transmitted from the terminal apparatus 16 to the server 14. The controller 48 of the terminal apparatus 16 may cause an image representing the trajectory of the movement performed by the user to be displayed on the device display screen 68. After the device images are linked to each other, the controller 48 of the terminal apparatus 16 may replace the trajectory with a predetermined straight line or the like and may cause the straight line to be displayed on the device display screen 68.
When the target devices (for example, mfp (b) and pc (a)) that cooperate with each other are specified in the above-described manner, the determination unit 38 of the server 14 determines the cooperation function associated with the combination of pc (a) and mfp (b) in the cooperation function management table shown in fig. 36. Thus, the cooperation function executed by cooperation between the pc (a) and the mfp (b) is determined. When the user designates the link order of the devices, the determination unit 38 determines the priority order associated with the link order in the cooperation function management table. Specifically, referring to fig. 36, since the pc (a) and the mfp (b) are specified as target devices that cooperate with each other, the cooperation function performed by these devices is a scan-and-transfer function and a print function. In addition, since the link is made from the mfp (B) to the pc (a) (B → a), the priority order of the scan and transfer function is "1", and the priority order of the print function is "2".
Information on the determined cooperation function and information on the determined priority order are transmitted from the server 14 to the terminal device 16. The controller 48 of the terminal device 16 causes the UI unit 46 to display information on the cooperative function as information on the candidate cooperative function according to the priority order.
For example, as shown in fig. 37B, the controller 48 of the terminal apparatus 16 causes the UI unit 46 to display a cooperation function display screen 132 and displays information on the candidate cooperative functions on the cooperation function display screen 132. Since the priority order of the scan-and-transfer function is "1" and the priority order of the print function is "2", information on the scan-and-transfer function is displayed in preference to (e.g., above) information on the print function. For example, as the information on the scan-and-transfer function, a description "transfer data scanned by the mfp (b) to the pc (a)" is displayed. Further, as the information on the print function, a description "print data stored in the pc (a)" is displayed.
If the user specifies a cooperative function and provides an execution instruction, the specified cooperative function is executed. For example, if the user presses the "yes" button, the cooperative function related to the "yes" button is executed. Further, a "back" button is displayed on the cooperation function display screen 132. If the user presses the "back" button, the process of linking the devices is stopped.
The process of determining the cooperative function and the process of determining the order of priority may be performed by the terminal device 16.
Instead of moving the pointer between the device images, the target devices cooperating with each other and their link order may be specified by drawing circles around the device images. In this case, the order of the drawing operations corresponds to the link order. Alternatively, the target devices cooperating with each other and their link order may be specified according to a voice instruction provided by the user.
Figs. 38A and 38B show an example of another operation. For example, as shown in fig. 38A, the user touches the device image 88 on the device display screen 68 with the pointer and moves the pointer to the device image 70 in the direction indicated by the arrow 134, thereby linking the device image 88 to the device image 70. Accordingly, the pc (a) related to the device image 88 and the mfp (b) related to the device image 70 are specified as target devices cooperating with each other, and the link order is also specified. In this example, the link is made from the device image 88 to the device image 70, that is, from the pc (a) to the mfp (b). Referring to the cooperation function management table shown in fig. 36, the priority order of the print function is "1", and the priority order of the scan-and-transfer function is "2". In this case, as shown in fig. 38B, information on the print function is displayed in preference to (e.g., above) information on the scan-and-transfer function on the cooperation function display screen 136.
As described above, the device images related to the devices are linked to each other, thereby determining a cooperative function using the function of the device. The display order of the information on the cooperative function is changed according to the order in which the images are linked to each other, that is, the order in which the devices are linked to each other. The link order of the devices is also considered as an order of using functions in the respective devices or an order of moving data between devices cooperating with each other. The operation of linking the devices (the operation of linking the images) is also regarded as an operation of specifying the order of use of the functions or the order of movement of the data. Therefore, as a result of changing the display order of the information on the cooperative function according to the link order, the information on the cooperative function that the user expects to use is preferentially displayed. In other words, information about a collaboration function that is more likely to be used by the user is preferentially displayed. For example, if linking is made from mfp (b) to pc (a), it is expected that the user will use the cooperation function of "use mfp (b) first, then transfer data from mfp (b) to pc (a)". On the other hand, if linking is made from pc (a) to mfp (b), it is expected that the user will use the cooperative function of "use pc (a) first, and then transfer data from pc (a) to mfp (b)". Therefore, as a result of changing the display order of the information on the cooperation functions in accordance with the link order of the images, the information on the cooperation functions that are more likely to be used by the user is preferentially displayed. 
In addition, the order of use of the functions or the order of data movement is specified without a special operation other than the operation of linking the device images, and information on the cooperative function that the user intends to use is displayed.
The above display switching processing is applicable to a case where a function image related to a function is used. For example, the display of the information on the cooperative function is switched according to the order in which the function image related to the first function and the function image related to the second function are specified.
The above-described display switching process may be applied to information on the cooperation function displayed in the candidate list (see fig. 26, for example) according to example 6 or the like. That is, if the mfp (B) is designated as the first device, the pc (a) is displayed in the candidate list as the candidate second device (candidate cooperation partner), and the information on the plurality of cooperation functions is displayed as the information on the cooperation function related to the pc (a) (the information on the cooperation function executable by the mfp (B) and the pc (a)) in the order shown in fig. 37B. On the other hand, if the pc (a) is designated as the first device, the mfp (B) is displayed in the candidate list as the candidate second device (candidate cooperation partner), and information on the plurality of cooperation functions is displayed in the order shown in fig. 38B as information on the cooperation function relating to the mfp (B).
Cooperative processing using partial images
The function used for cooperation may differ depending on the position designated in the device image related to the device. When the user designates a specific position in a device image, information on one or more cooperative functions using the function corresponding to the designated position is preferentially displayed. Hereinafter, this process will be described in detail.
Fig. 39 shows an example of the device function management table. The data of the device function management table is stored in the server 14 as device function management information 32. In the device function management table, for example, a device ID, information indicating a name (e.g., type) of a device, information indicating a position in a device image, information indicating a function corresponding to the position (function information), and an image ID are associated with each other. The position in the device image is a specific position (specific portion) in the device image related to the device, for example, a specific position in the device image schematically representing the device or a specific position in the device image captured by the camera. Different functions are associated with each particular location in the device image.
Fig. 40A and 40B each show an example of a screen displayed on the UI unit 46 of the terminal device 16. For example, assume that MFP (B) and PC (A) are recognized. As shown in fig. 40A, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70 and 88 are displayed on the device display screen 68. For example, in the device image 70, a specific position (partial image 70a) corresponding to the main body of the mfp (b) is assigned with a print function. In the device image 70, a specific position (partial image 70b) corresponding to the document cover, the document glass, and the automatic document feeder of the mfp (b) is assigned with a scanning function. In the device image 70, a specific position (partial image 70c) corresponding to the post-processing apparatus is assigned a binding function. The binding function is a function of binding output sheets. In the device image 88, a specific position (partial image 88a) corresponding to the body portion of the pc (a) is assigned a data storage function. In the device image 88, a screen display function is assigned to a specific position (partial image 88b) corresponding to the display unit of the pc (a). The data storage function is a function of storing data received from another device in the pc (a). The screen display function is a function of displaying data received from another device in the pc (a).
The controller 48 of the terminal apparatus 16 may cause the name of the function (e.g., print, scan, etc.) assigned to the specific location in the device image to be displayed on the device display screen 68. Therefore, the user is provided with information clearly indicating the correspondence between the specific position and the function. Of course, the names of the functions are not necessarily displayed.
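The assignment of functions to specific positions described above (the device function management table of fig. 39) amounts to a per-device mapping from a partial image to a function. The following is a minimal Python sketch; the part labels are illustrative and taken loosely from fig. 40A.

```python
# Sketch of the position-to-function assignment of fig. 39.
# The dictionary layout and part labels are assumptions for illustration.

PART_FUNCTIONS = {
    "MFP(B)": {
        "body (70a)": "print",
        "document cover/glass/feeder (70b)": "scan",
        "post-processing device (70c)": "staple",
    },
    "PC(A)": {
        "body (88a)": "store data",
        "display (88b)": "display screen",
    },
}

def function_at(device, part):
    """Function assigned to the touched position (partial image) in the
    device image related to the device."""
    return PART_FUNCTIONS[device][part]
```
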
When the user specifies a position in the device image to which a function is assigned, the function assigned to the specified position is specified as a target function of the cooperation. Using the pointer, the user links the specific positions (partial images) to which functions are assigned in the device images representing the target devices that cooperate with each other. For example, as indicated by an arrow 138, the user touches the partial image 70b on the device display screen 68 with the pointer and moves the pointer to the partial image 88b, thereby linking the partial image 70b to the partial image 88b. Accordingly, the mfp (b) related to the device image 70 including the partial image 70b and the pc (a) related to the device image 88 including the partial image 88b are specified as target devices cooperating with each other, and the scanning function assigned to the partial image 70b and the screen display function assigned to the partial image 88b are specified. Further, the link order may be specified by the link operation. In this case, the order in which the partial images are linked corresponds to the link order. In the example shown in fig. 40A, the link is made from the partial image 70b to the partial image 88b, that is, from the mfp (b) to the pc (a), and the scanning function and the screen display function are designated as functions for the cooperative function. Information indicating the link order of the devices and information indicating the specific positions designated by the user in the device images are transmitted from the terminal apparatus 16 to the server 14.
When target devices (e.g., pc (a) and mfp (b)) that cooperate with each other are identified, the determination unit 38 of the server 14 determines the cooperation function realized by the cooperation between pc (a) and mfp (b) in the cooperation function management table shown in fig. 7. Further, the determination unit 38 determines a function assigned to a specific position specified by the user in the device image with reference to the device function management table shown in fig. 39. Further, among the cooperative functions realized by cooperation between the pc (a) and the mfp (b), the determination unit 38 assigns a higher priority to a cooperative function using a function assigned to a position designated by the user, and assigns a lower priority to a cooperative function not using the function.
Information on the cooperative function determined in the above manner and information indicating the order of priority are transmitted from the server 14 to the terminal device 16. The controller 48 of the terminal device 16 causes the UI unit 46 to display information on the cooperation function as information on the candidate cooperation function according to the priority order.
For example, as shown in fig. 40B, the controller 48 of the terminal apparatus 16 causes the UI unit 46 to display a cooperation function display screen 140 and displays information on the candidate cooperative functions on the cooperation function display screen 140. Since the user designates the scan function and the screen display function in this order, information on the scan, transfer, and display function, which is executed by cooperation between the scan function and the screen display function, is displayed in preference to (e.g., above) information on other cooperative functions. For example, information on the scan, transfer, and display function is displayed in preference to information on the scan, transfer, and store function, which is executed by cooperation between the scan function and the data storage function. The scan, transfer, and display function is a function of transferring data generated by scanning by the mfp (b) to the pc (a) and displaying the data on the screen of the pc (a). The scan, transfer, and store function is a function of transferring data generated by scanning by the mfp (b) to the pc (a) and storing the data in the pc (a). In the example shown in fig. 40B, an explanation of each cooperative function is displayed as the information on that cooperative function.
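The prioritization shown in fig. 40B can be sketched as a sort that places cooperative functions using the user-designated functions ahead of the others. The data layout below (each cooperative function described by the set of component functions it uses) is an assumption for illustration.

```python
# Sketch of the priority assignment by the determination unit 38:
# cooperative functions using all user-designated functions come first.
# Layout and names are assumptions, not from the embodiment.

def rank_cooperative_functions(candidates, designated):
    """candidates : {cooperative function name: set of component functions}
    designated : set of functions picked via partial images

    Returns the names in display order: functions that use every designated
    component function first, ties broken alphabetically."""
    def priority(item):
        name, components = item
        return (0 if designated <= components else 1, name)
    return [name for name, _ in sorted(candidates.items(), key=priority)]
```
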
According to the cooperation process using partial images, in the case where the target devices cooperating with each other each have a plurality of functions, the functions are individually specified, and information on one or more cooperative functions using the specified functions is preferentially displayed. Accordingly, the cooperative function that the user intends to use is preferentially displayed.
The cooperation function may be a function using a combination of parts of the apparatus, a function using a combination of the entire apparatus and parts of the apparatus, or a function using a combination of the entire apparatus.
The cooperation process using the partial images is applicable to a case where a function image related to a function is used. For example, different functions are assigned to positions in the function image, and a cooperative function using the function assigned to the position designated by the user is determined.
The above examples 1 to 8 are also applicable to cooperative processing using partial images. For example, if the user designates a partial image included in a first image related to a first device, a control may be performed to present guidance indicating the whole or part of a second device capable of performing a cooperative function together with a function assigned to a part related to the partial image. As another example, if a user specifies an entire first image related to a first device, a control may be performed to present guidance indicating a portion of a second device capable of performing a cooperative function with the first device. To present guidance, all or part of the second apparatus may be displayed while being included in the candidate list described above in example 6. If the user specifies all or part of the first device, guidance may be presented indicating all or part of a second device capable of performing a cooperative function with all or part of the first device. If all or part of a second device that cannot perform a cooperative function with all or part of a first device designated by a user is designated, guidance may be presented indicating all or part of the second device that can perform a cooperative function with all or part of the first device. Hereinafter, these processes will be described in detail.
For example, when a first image (entire image) related to a first apparatus is specified, the controller 48 of the terminal device 16 presents guidance indicating one or more functions of a second apparatus, which are capable of performing a cooperative function with the first apparatus, under the control of the controller 36 of the server 14. More specifically, the controller 48 of the terminal device 16 presents guidance indicating one or more partial images related to one or more portions included in the second apparatus (one or more partial images in the second image related to the second apparatus) assigned with functions capable of performing the cooperative function together with the first apparatus. For example, as in examples 1 to 8, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display an image depicting an arrow serving as a guide, presents the guide with sound, or causes the UI unit 46 to display a character string serving as a guide. Referring to fig. 40A, if the user designates the device image 70 relating to the mfp (b) as the first image, guidance is presented indicating a partial image relating to a portion to which functions of the pc (a) capable of performing cooperative functions together with the mfp (b) are assigned. For example, if the screen display function of the pc (a) is a function capable of executing the cooperation function together with the mfp (b), guidance indicating the partial image 88b related to the screen display function is presented. For example, an arrow linking the device image 70 and the partial image 88b related to the mfp (b) to each other is displayed, guidance indicating a screen display function is presented with sound, a character string indicating the screen display function is displayed, or the partial image 88b is displayed so that the partial image 88b can be distinguished from another partial image. 
When the user designates the device image 70 as the first image or when the user designates a part of the pc (a) assigned with a function incapable of executing the cooperative function together with the mfp (b), guidance may be presented.
As another example, if a partial image included in the first image related to the first device is specified, the controller 48 of the terminal apparatus 16 may present guidance indicating the second device (entire device) capable of performing a cooperative function together with the function assigned to the portion related to the partial image (the function of the first device) under the control of the controller 36 of the server 14. Referring to fig. 40A, for example, if the user designates a partial image 70A relating to the mfp (b), guidance indicating a second image relating to a second device capable of performing a cooperative function together with a print function assigned to a portion relating to the partial image 70A is presented. For example, if the pc (a) is a device capable of executing the cooperative function together with the mfp (b), guidance indicating the device image 88 relating to the pc (a) is presented. For example, an arrow linking the partial image 70a and the device image 88 to each other is displayed, guidance indicating the pc (a) is presented with sound, a character string representing the pc (a) is displayed, or the device image 88 is displayed so that the device image 88 can be distinguished from another device image. Guidance may be presented when the user designates a partial image included in the device image 70 or when the user designates a device image related to a device that cannot perform a cooperative function together with a function related to the partial image.
As another example, if a partial image (first partial image) included in the first image related to the first device is specified, the controller 48 of the terminal apparatus 16 presents guidance indicating one or more functions of the second device that can perform a cooperative function together with the function assigned to the portion related to the first partial image (the function of the first device), under the control of the controller 36 of the server 14. More specifically, the controller 48 of the terminal device 16 presents guidance indicating one or more partial images related to one or more portions included in the second apparatus (one or more partial images included in a second image related to the second apparatus, referred to as second partial images), that is, one or more second partial images related to one or more portions assigned with one or more functions capable of performing a cooperative function together with the function assigned to the portion related to the first partial image. Referring to fig. 40A, for example, if the user designates a partial image 70A relating to the mfp (b), guidance indicating a second partial image relating to a portion assigned with the function of the pc (a) capable of performing a cooperative function together with the print function assigned to the portion relating to the partial image 70A is presented. For example, if the screen display function of the pc (a) is a function capable of executing the cooperation function together with the print function, guidance indicating a partial image 88b related to the screen display function is presented. For example, an arrow linking the partial image 70a and the partial image 88b to each other is displayed, guidance indicating a screen display function is presented with sound, a character string indicating the screen display function is displayed, or the partial image 88b is displayed so that the partial image 88b can be distinguished from another partial image. 
Guidance may be presented when the user designates a partial image included in the device image 70 or when the user designates a partial image related to a function that cannot perform a cooperative function together with the function related to the partial image.
Three or more partial images may be specified, that is, the user may specify three or more functions. For example, if the user designates two partial images (two functions: a first function and a second function), guidance may be presented indicating a portion assigned with a function (a third function) capable of performing a cooperative function together with those two functions. In this case, the determination unit 38 of the server 14 may change the third function indicated by the presented guidance according to the order in which the partial image related to the first function and the partial image related to the second function are specified.
Multiple functions of the same device may be designated as target functions that cooperate with each other. For example, the user may designate the screen display function and the data storage function of pc (a) as target functions that cooperate with each other, or may designate the scan function of mfp (b) together with the screen display function and the data storage function of pc (a). Also in this case, guidance is presented indicating a portion assigned with a function (e.g., a second function) capable of performing a cooperative function together with the first specified function (e.g., a first function). The determination unit 38 of the server 14 may determine cooperative functions that use the respective functions and may assign a higher display priority to a cooperative function according to the order in which the functions are specified.
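The order-dependent priority described above can be pictured as a small ranking routine. The following Python sketch is illustrative only and does not appear in the original disclosure; the function names and table shape are invented for the example.

```python
# Illustrative sketch (hypothetical names, not from the patent): rank
# candidate cooperative functions so that combinations built from
# earlier-specified functions get a higher display priority.

def rank_cooperative_functions(designated, table):
    """designated: function names in the order the user specified them.
    table: maps a frozenset of member functions to a cooperative function."""
    candidates = []
    for members, coop in table.items():
        if members <= set(designated):
            # Sort key: positions of the member functions in the
            # specification order; earlier positions rank first.
            key = tuple(sorted(designated.index(f) for f in members))
            candidates.append((key, coop))
    return [coop for _, coop in sorted(candidates)]

table = {
    frozenset({"scan", "screen display"}): "scan-and-display",
    frozenset({"scan", "data storage"}): "scan-and-store",
}
order = ["scan", "screen display", "data storage"]
print(rank_cooperative_functions(order, table))  # scan-and-display ranked first
```

Specifying "screen display" before "data storage" moves the scan-and-display combination ahead of scan-and-store, matching the order-dependent behavior described above.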
Another example of cooperative processing using partial images
Hereinafter, another example of the cooperation process using the partial images will be described with reference to fig. 41 and 42.
Fig. 41 shows an example of the device function management table. The data of the device function management table is stored in the server 14 as the device function management information 32. In the device function management table, for example, a device ID, information indicating a name of the device (for example, a type of the device), information indicating a name of a part of the device (for example, a type of the part), a part ID as part identification information for identifying the part, information indicating a function assigned to the part (function of the part), and a part image ID for identifying a part image related to the part are associated with each other. The partial image is an image representing the appearance of a portion of the device obtained by shooting with a camera. Of course, a partial image schematically representing a portion of the device may be associated with that portion. For example, different functions are assigned to various portions of the device.
Specifically, the screen display function is assigned to the display section of the pc (a), and information indicating the screen display function is associated with a partial image ID of a partial image related to the display section. The screen display function is a function of displaying information on the pc (a). The data storing function is assigned to the body section of the pc (a), and information indicating the data storing function is associated with the partial image ID of the partial image relating to the body section. The data storage function is a function of storing data in the pc (a).
The print function is assigned to the main body section of the mfp (b), and information indicating the print function is associated with a partial image ID of a partial image relating to the main body section. The scanning function is assigned to a reading section (e.g., a portion corresponding to a document cover, a document glass, and an automatic document feeder of the mfp (b)) of the mfp (b), and information representing the scanning function is associated with a partial image ID of a partial image related to the reading section. The binding function is assigned to the post-processing apparatus of the mfp (b), and information indicating the binding function is associated with the partial image ID of the partial image relating to the post-processing apparatus. The binding function is a function of binding output sheets.
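The associations above can be pictured as rows of a small table. The following Python sketch is illustrative only; the part IDs and part image IDs are invented (the actual values of fig. 41 are not given in the text).

```python
# Hypothetical rows mirroring the device function management table of fig. 41;
# every ID here is invented for illustration.
DEVICE_FUNCTION_TABLE = [
    # (device_id, device_name, part_name, part_id, function, part_image_id)
    ("A", "PC (A)",  "display section",   "a1", "screen display", "img-a1"),
    ("A", "PC (A)",  "body section",      "a2", "data storage",   "img-a2"),
    ("B", "MFP (B)", "main body section", "b1", "print",          "img-b1"),
    ("B", "MFP (B)", "reading section",   "b2", "scan",           "img-b2"),
    ("B", "MFP (B)", "post-processing",   "b3", "binding",        "img-b3"),
]

def function_of_part(part_id):
    """Look up the function assigned to a part by its part ID."""
    for _, _, _, pid, function, _ in DEVICE_FUNCTION_TABLE:
        if pid == part_id:
            return function
    return None

print(function_of_part("b2"))  # scan
```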
The function assigned to a portion of the device is determined (identified) using, for example, a marker-less AR technique. For example, if a portion of the device is photographed by a camera (e.g., the camera 42 of the terminal device 16), appearance image data representing the portion is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines (identifies) the function associated with the appearance image data in the device function management table. Thus, the function assigned to the photographed portion is determined (identified). For example, if the main body of the mfp (b) is photographed by the camera 42, appearance image data representing the main body of the mfp (b) is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines the print function associated with the appearance image data in the device function management table. Therefore, the function assigned to the main body section of the mfp (b) is determined to be the print function.
Of course, the function assigned to a portion of the device may be determined (identified) utilizing a marker-based AR technique. For example, each part of the device is provided with a marker, such as a two-dimensional barcode obtained by encoding part identification information (e.g., a part ID) for identifying the part. If the marker on a part is photographed by a camera and the marker-based AR technique is applied thereto, the part identification information (e.g., the part ID) of the part is obtained. The application of the marker-based AR technique may be performed by the terminal device 16 or the server 14. After the part identification information is obtained in this way, the determination unit 38 of the server 14 determines (identifies) the function associated with the part identification information (e.g., the part ID) in the device function management table.
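The marker-based path reduces to decoding a part ID and resolving it in the management table. The Python sketch below is illustrative only; `decode_marker` is a hypothetical stand-in for real two-dimensional barcode decoding, and the IDs are invented.

```python
# Sketch of the marker-based identification path: the photographed marker
# decodes directly to a part ID, which is then resolved against the device
# function management table. All names and IDs are hypothetical.
PART_FUNCTIONS = {"b1": "print", "b2": "scan", "b3": "binding"}

def decode_marker(photo):
    # A real implementation would decode the 2D barcode in the photo; here
    # the photo is simulated as a dict carrying the encoded part ID.
    return photo["marker"]

def identify_part_function(photo):
    part_id = decode_marker(photo)      # part identification information
    return PART_FUNCTIONS.get(part_id)  # lookup in the management table

print(identify_part_function({"marker": "b2"}))  # scan
```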
Fig. 42 shows an example of the cooperation function management table. The data of the cooperation function management table is stored in the server 14 as the cooperation function management information 34. The cooperation function management table is information indicating cooperative functions, each of which uses the functions of a plurality of parts. In the cooperation function management table, for example, information indicating a combination of parts of devices, information indicating a combination of part IDs, and information indicating a cooperative function using the functions of the plurality of parts included in the combination are associated with each other. Of course, in the cooperation function management table, information indicating a combination of a part of a device and an entire device and information indicating a cooperative function using a function of the part of the device and a function of the entire device may be associated with each other.
Specifically, the print function as the cooperative function is assigned to the combination of the display part of the pc (a) and the main body part of the mfp (b), and the information representing the print function as the cooperative function is associated with the information representing the combination of the partial ID of the display part of the pc (a) and the partial ID of the main body part of the mfp (b). For example, the print function as the cooperation function is a function of transmitting data stored in the pc (a) to the mfp (b) and printing the data by the mfp (b).
The print function as the cooperative function is assigned to the combination of the main body part of the mfp (b) and the main body part of the projector (C), and the information indicating the print function as the cooperative function is associated with the information indicating the combination of the partial ID of the main body part of the mfp (b) and the partial ID of the main body part of the projector (C). For example, the print function as the cooperation function is a function of transmitting data projected by the projector (C) to the mfp (b) and printing the data by the mfp (b).
The scan-and-projection function as the cooperative function is assigned to the combination of the reading section of the mfp (b) and the main body section of the projector (C), and the information indicating the scan-and-projection function as the cooperative function is associated with the information indicating the combination of the partial ID of the reading section of the mfp (b) and the partial ID of the main body section of the projector (C). For example, the scan-and-projection function as the cooperative function is a function of transmitting data generated by scanning by the mfp (b) to the projector (C) and projecting the data by the projector (C).
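The combinations above amount to a lookup keyed by an unordered pair of part IDs. The Python sketch below is illustrative only; the part IDs are invented (here a1 stands for the display of pc (a), b1/b2 for the main body and reading section of mfp (b), and c1 for the main body of the projector (C)).

```python
# Hypothetical encoding of the cooperation function management table of
# fig. 42: a combination of part IDs maps to a cooperative function.
COOPERATION_TABLE = {
    frozenset({"a1", "b1"}): "print",
    frozenset({"b1", "c1"}): "print",
    frozenset({"b2", "c1"}): "scan-and-projection",
}

def cooperative_function(part_ids):
    """Return the cooperative function assigned to a combination of parts."""
    # frozenset makes the lookup order-independent, matching a "combination".
    return COOPERATION_TABLE.get(frozenset(part_ids))

print(cooperative_function(["b2", "c1"]))  # scan-and-projection
```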
The cooperative function may be a function using the functions of a plurality of parts included in the same apparatus, or may be a function using the functions of parts included in a plurality of different apparatuses. The cooperative function may also be a function using the functions of three or more parts.
For example, after determining (identifying) multiple parts (e.g., multiple parts of multiple different devices or multiple parts of the same device) using a marker-based AR technique or a marker-less AR technique, the determination unit 38 of the server 14 determines (identifies) the cooperative function associated with the combination of the identified parts in the cooperation function management table. Thus, a cooperative function using the functions of the plurality of identified (e.g., photographed) parts is determined. For example, if the main body sections of the mfp (b) and the projector (C) are photographed by the camera 42 of the terminal device 16 and are thereby recognized, the determination unit 38 of the server 14 determines, in the cooperation function management table, the print function and the like as the cooperative function associated with the combination of the main body sections of the mfp (b) and the projector (C).
The above-described examples 1 to 8 are also applicable to such cooperative processing. For example, if a first portion of the device is determined (identified), the controller 48 of the terminal device 16, under control of the controller 36 of the server 14, presents guidance indicating a second portion of the device capable of performing a cooperative function along with the function assigned to the first portion.
Specifying target devices that cooperate with each other by superimposing device images
The target devices that cooperate with each other can be specified by superimposing a plurality of device images on each other. This processing will be described below with reference to fig. 43A, 43B, 43C, 44A, and 44B. Fig. 43A, 43B, 43C, 44A, and 44B each show an example of a screen displayed on the UI unit 46 of the terminal device 16.
For example, assume that MFP (B) and PC (A) are identified. As shown in fig. 43A, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70 and 88 related to the identified devices are displayed on the device display screen 68. In this state, the user superimposes the device image related to the first device on the device image related to the cooperation partner device (second device) with a pointer (for example, the user's finger, a pen, or a stylus). For example, as shown in fig. 43B, the user specifies the device image 70 with the pointer and superimposes the device image 70 on the device image 88, as indicated by an arrow 142. For example, the user superimposes the device images on each other by performing a drag-and-drop operation. Specifically, the user drags the device image 70 and drops it at the position where the device image 70 is superimposed on the device image 88. The drag-and-drop operation itself is a known technique. Alternatively, the device images to be superimposed on each other may be specified according to a voice instruction provided by the user. For example, the device images 70 and 88 may be designated as target device images and superimposed on each other according to a voice instruction provided by the user.
As a result of superimposing the device images 70 and 88 on each other, the mfp (b) related to the device image 70 and the pc (a) related to the device image 88 are specified as target devices that cooperate with each other. In this example, the mfp (b), which is designated first, corresponds to the first device. If the pc (a), which is designated second, is a device that cannot execute a cooperative function together with the mfp (b), guidance indicating one or more devices that can execute a cooperative function together with the mfp (b) is presented, as in the above-described examples 1 to 8.
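The drop outcome described above (confirmation screen when the pair can cooperate, guidance otherwise) can be sketched as a small handler. The Python snippet below is illustrative only; the cooperation table and device names are invented.

```python
# Minimal sketch of the drag-and-drop flow: dropping the first device's image
# onto the second device's image either proceeds to the confirmation step or,
# if the pair cannot cooperate, presents guidance. Names are hypothetical.
CAN_COOPERATE = {frozenset({"MFP (B)", "PC (A)"})}

def compatible_partners(device):
    """Devices that can perform some cooperative function with `device`."""
    return {d for pair in CAN_COOPERATE if device in pair
            for d in pair - {device}}

def on_drop(first_device, second_device):
    if frozenset({first_device, second_device}) in CAN_COOPERATE:
        return ("confirm", (first_device, second_device))   # confirmation screen 144
    return ("guidance", compatible_partners(first_device))  # as in examples 1 to 8

print(on_drop("MFP (B)", "PC (A)"))      # confirmation step
print(on_drop("MFP (B)", "camera (D)"))  # guidance toward PC (A)
```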
The controller 48 of the terminal apparatus 16 may cause the device image being dragged to be displayed on the UI unit 46 in a recognizable manner. For example, the device image being dragged may be displayed semi-transparently or in a particular color.
If the device image 70 is superimposed on the device image 88 and if the pc (a) is capable of executing the cooperation function together with the mfp (b), a confirmation screen 144 is displayed on the UI unit 46 of the terminal apparatus 16 as shown in fig. 43C. The confirmation screen 144 is a screen for confirming whether or not the specified devices are caused to cooperate with each other. If the user provides a cooperation instruction on the confirmation screen 144 (for example, if the user presses a "yes" button), information on the cooperation function is displayed on the UI unit 46 of the terminal device 16.
For example, as shown in fig. 44A, the controller 48 of the terminal apparatus 16 causes the UI unit 46 to display a cooperation function display screen 146 and display information on candidate cooperation functions on the cooperation function display screen 146. By causing the pc (a) and the mfp (b) to cooperate with each other, for example, a scan-and-transfer function and a print function are realized. Thus, information on the scan-and-transfer function and information on the print function are displayed on the cooperative function display screen 146.
If the user specifies the cooperation function and the user provides an execution instruction, a connection request is transmitted from the terminal device 16 to the target apparatuses cooperating with each other. As shown in fig. 44B, while the connection request is being processed, a waiting screen 148 is displayed on the UI unit 46 of the terminal device 16. When the connection between the terminal apparatus 16 and the target device is successfully established, the specified cooperation function is executed.
As described above, the device images related to the devices are superimposed on each other, thereby determining the cooperative function using the functions of the devices. Therefore, the functions can be made to cooperate with each other without special operations other than image operations, and the cooperation between the functions can be performed with simple operations. Also in this case, the guidance indicating the second device capable of performing the cooperative function together with the first device is presented, and therefore, user convenience in using the cooperative function can be increased as compared with the case where the guidance is not presented.
The cooperation function may be determined by superimposing the partial image on the device image or the partial image. This process will be described with reference to fig. 45A and 45B. Fig. 45A and 45B each show an example of a screen displayed on the UI unit 46 of the terminal device 16.
As in the cooperative processing using partial images described above, the function assigned to the device differs depending on the position within the device image related to the device. A cooperative function using the functions related to two partial images is determined by superimposing a partial image included in a device image on a partial image included in the same or a different device image. Hereinafter, this process will be described in detail.
For example, assume that MFP (B) and PC (A) are identified. As shown in fig. 45A, a device display screen 68 is displayed on the UI unit 46 of the terminal apparatus 16, and device images 70 and 88 are displayed on the device display screen 68. For example, each of the partial images 70a, 70b, 70c, 88a, and 88b is displayed as an image that can be moved separately from another partial image.
If the user designates a partial image and if the partial image is superimposed on another partial image, a cooperation function using functions related to the two partial images is determined, and information on the cooperation function is displayed on the UI unit 46 of the terminal device 16. This determination process may be performed by the determination unit 38 of the server 14 or the terminal device 16.
For example, as indicated by an arrow 150 in fig. 45B, if the user drags the partial image 70b with the pointer and drops it on the partial image 88b, the mfp (b) related to the device image 70 including the partial image 70b and the pc (a) related to the device image 88 including the partial image 88b are designated as target devices that cooperate with each other. In addition, the scanning function assigned to the partial image 70b and the screen display function assigned to the partial image 88b are designated as target functions that cooperate with each other.
In the server 14, functions assigned to the respective partial images are managed. For example, identification information for identifying partial images, function information indicating a function associated with the partial images, and cooperation function information indicating a cooperative function performed by cooperation between the functions are stored in the server 14 in association with each other. If a partial image is selected on the device display screen 68 and superimposed on another partial image, identification information representing the partial images superimposed on each other is transmitted from the terminal apparatus 16 to the server 14. In the example shown in fig. 45B, identification information indicating the partial images 70b and 88b is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines the functions assigned to the partial images 70b and 88b based on the identification information, and determines the cooperative function using the functions. Information about the cooperative function is transmitted from the server 14 to the terminal device 16 and displayed on the terminal device 16.
As described above, in the case where each of the target devices that cooperate with each other has a plurality of functions, a function is designated in each target device, and information on the cooperative function using the designated functions is preferentially displayed. Accordingly, the cooperative function that the user intends to use is preferentially displayed.
The priority order in which cooperative functions are displayed may be changed according to the order in which the partial images are superimposed on each other. In this case, information on a cooperative function using the functions related to the superimposed partial images is preferentially displayed.
In the case where the partial images are superimposed on each other, the first specified partial image corresponds to the first image, and the function related to that partial image corresponds to the first function. The second specified partial image (the partial image on which the first specified partial image is superimposed) corresponds to the second image, and the function related to that partial image corresponds to the second function. If the second specified function cannot perform a cooperative function together with the first specified function, guidance indicating a function capable of performing a cooperative function together with the first specified function is presented, as in examples 1 to 8 described above. In the above example, if the screen display function related to the second specified partial image 88b cannot perform a cooperative function together with the scan function related to the first specified partial image 70b, guidance indicating a function capable of performing a cooperative function together with the first specified scan function is presented. In this case, guidance indicating the entire device (for example, the device image itself) capable of performing a cooperative function together with the first specified function (for example, the scan function) may be presented, or guidance indicating a portion (for example, a partial image) of a device assigned with a function capable of performing a cooperative function together with the first specified function may be presented.
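The guidance lookup for a first specified function amounts to collecting every part whose function shares some cooperative function with it. The Python sketch below is illustrative only; the table shape mirrors the earlier fig. 42 example, and the part IDs are invented.

```python
# Sketch of the guidance lookup: given a first specified part, find every
# other part that can join a cooperative function with it. IDs hypothetical
# (a1 = display of pc (a), b1/b2 = main body / reading section of mfp (b),
# c1 = main body of the projector (C)).
COOPERATION_TABLE = {
    frozenset({"a1", "b1"}): "print",
    frozenset({"b2", "c1"}): "scan-and-projection",
}

def cooperable_parts(first_part_id):
    """Part IDs that can perform some cooperative function with first_part_id."""
    partners = set()
    for members in COOPERATION_TABLE:
        if first_part_id in members:
            partners |= members - {first_part_id}
    return partners

print(cooperable_parts("b2"))  # {'c1'} -> present guidance for these parts
```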
Processing for switching display of single device function and cooperation function
In the exemplary embodiment, control of switching between display of a function (hereinafter referred to as "single device function") that uses a single device alone and display of a cooperation function may be performed.
For example, if only one device is recognized within a predetermined recognition period, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device (e.g., the image forming apparatus 10) as single-device function information. The length of the identification period may be varied by the user. Devices may be identified by applying AR technology or other techniques. The process of identifying the device may be performed by the server 14 or the terminal device 16. For example, the start point of the recognition period may be a time point at which the one apparatus is recognized or a time point designated by the user (for example, a time point at which the recognition processing starts).
For example, if another device is not recognized within the recognition period from the time point at which the one device is recognized, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information. In this case, the one device is handled as a device recognized during the recognition period. The information on the device may be information transmitted from the server 14 to the terminal apparatus 16 or information stored in advance in the terminal apparatus 16.
As another example, if only one device is identified within the identification period from the time point designated by the user, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information.
As another example, if an instruction to display a single-device function is provided after one device is identified, the controller 48 of the terminal apparatus 16 may cause the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information. The controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display a button image for providing an instruction to display a single-device function, either at all times or in the case where one device is photographed (or recognized). If the user presses the button image, the controller 48 causes the UI unit 46 to display information on one or more functions of the one device.
When the recognition period elapses, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display a confirmation screen. For example, the confirmation screen is a screen used by the user to provide an instruction to extend the recognition period. If the user provides an instruction to extend the recognition period through the confirmation screen and if another device is not photographed within the extended period, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information on one or more functions of the recognized device.
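The recognition-period decision described above can be sketched as a small session object. The Python snippet below is illustrative only; the class, period value, and injected clock are invented for the example.

```python
import time

# Sketch of the recognition-period logic: if only one device is recognized
# before the (user-adjustable) period elapses, single-device function
# information is displayed; recognizing more devices switches to the
# cooperative function mode. All names here are hypothetical.
class RecognitionSession:
    def __init__(self, period_s=10.0, clock=time.monotonic):
        self.period_s = period_s  # length changeable by the user
        self.clock = clock
        self.start = None         # e.g. first recognition, or a user-chosen point
        self.devices = []

    def recognize(self, device):
        now = self.clock()
        if self.start is None:
            self.start = now
        if now - self.start <= self.period_s:
            self.devices.append(device)

    def display_mode(self):
        # One device -> single-device functions; several -> cooperative mode.
        return "single-device" if len(self.devices) == 1 else "cooperative"
```

Injecting the clock keeps the sketch testable; a confirmation screen for extending the period could simply enlarge `period_s` before it elapses.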
The display control of the single-device function information will be further described. For example, assume that the device is identified using a marker-based AR technique or a marker-less AR technique. For example, if only one device is photographed within a predetermined photographing period, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information. The start point of the photographing period may be a time point at which the one device is photographed or a time point designated by the user (e.g., a photographing start point). The length of the photographing period may be changed by the user. After the one device is photographed, the one device is identified using the marker-based AR technique or the marker-less AR technique. The identification process may be performed by the server 14 or the terminal device 16.
For example, if another device is not photographed within the photographing period from the time point at which the one device is photographed, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information. In this case, the one device is treated as a device photographed within the photographing period.
As another example, if only one device is photographed within a photographing period from a time point designated by the user, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information.
As another example, if an instruction to display a single-device function is provided after one device is photographed, the controller 48 of the terminal apparatus 16 may cause the UI unit 46 of the terminal apparatus 16 to display information regarding one or more functions of the one device as single-device function information.
For example, if only one device is photographed within a photographing period (for example, if the second device is not photographed within the photographing period from the time point at which the first device is photographed or if only one device is photographed within the photographing period from the start point specified by the user), the controller 48 of the terminal apparatus 16 transmits image data generated by photographing to the server 14. The photographing period may be measured by the controller 48 or a timer. The determination unit 38 of the server 14 determines (identifies) a device based on the image data and determines one or more functions of the device. The information regarding the one or more functions is transmitted from the server 14 to the terminal device 16 and displayed on the UI unit 46 of the terminal device 16. Of course, the server 14 may manage the time instead of the terminal device 16, and may transmit information about one or more functions of the identified apparatus to the terminal device 16.
When the photographing period elapses, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display a confirmation screen. For example, the confirmation screen is a screen used by the user to provide an instruction to extend the shooting period. If the user provides an instruction to extend the photographing period through the confirmation screen and if another device is not photographed within the extended period, the controller 48 of the terminal apparatus 16 transmits image data obtained by photographing to the server 14 and causes the UI unit 46 of the terminal apparatus 16 to display one or more functions of the one device. The length of the extended period may be varied by the user.
As another example, if an instruction to display a single device function is provided after photographing one device, the controller 48 of the terminal apparatus 16 may transmit image data generated by photographing to the server 14, and thus, may receive information about one or more functions of the photographed device from the server 14.
As another example, each time image data is generated by the camera, the controller 48 of the terminal device 16 may transmit the image data to the server 14, and thus may receive information about one or more functions of the photographed device from the server 14. In this case, if only one device is photographed within the photographing period, the controller 48 of the terminal apparatus 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of the one device as single-device function information.
On the other hand, if a plurality of devices are identified within the identification period, the controller 36 of the server 14 executes the cooperative function mode. In the cooperative function mode, guidance indicating a second device capable of performing a cooperative function together with the first device is presented as in the above-described examples 1 to 8.
For example, if the second device is identified within an identification period from the time point at which the first device is identified, the cooperative function mode is executed. In this case, the first device is also treated as a device identified within the identification period. Further, if the second device is identified within the identification period from the time point at which the first device is identified, the controller 36 of the server 14 may set a new identification period from the time point at which the second device is identified. The same applies to subsequent periods, that is, if a third device is identified within the new identification period, another new identification period is set.
As another example, if a plurality of devices are photographed within an identification period from a time point designated by a user, the cooperation function mode may be performed.
As another example, if an instruction to display a cooperative function is provided after a plurality of devices are identified, the cooperative function mode may be executed. The controller 48 of the terminal device 16 causes the UI unit 46 to display a button image for providing an instruction to display one or more cooperative functions, either at all times or in the case where a plurality of devices are photographed (or recognized). If the user presses the button image, the cooperative function mode is executed.
As another example, if the second device is identified during a period in which no instruction to execute a function of the first device is provided after the first device is identified, the cooperative function mode may be executed.
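The sliding renewal of the identification period (a new period starting from each newly identified device) can be sketched as follows. The Python snippet is illustrative only; the helper name and timestamps are invented.

```python
# Sketch of the sliding renewal: each newly identified device starts a fresh
# identification period, so devices recognized in a chain, each within the
# period of the previous one, all belong to one session. Timestamps are in
# seconds; everything here is hypothetical.
def devices_in_session(events, period_s):
    """events: (timestamp, device) pairs in chronological order."""
    session, last_t = [], None
    for t, device in events:
        if last_t is None or t - last_t <= period_s:
            session.append(device)  # arrived within the renewed period
        else:
            session = [device]      # period expired; start a new session
        last_t = t
    return session

events = [(0.0, "MFP (B)"), (8.0, "PC (A)"), (15.0, "projector (C)")]
print(devices_in_session(events, period_s=10.0))
# all three chain within 10 s of the previous device -> cooperative function mode
```

Two or more devices in the resulting session would trigger the cooperative function mode; exactly one would trigger the single-device display.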
The execution of the cooperative function mode will be further described. For example, assume that multiple devices are identified using either the marker-based AR technique or the marker-less AR technique. For example, if a plurality of devices are photographed within a predetermined photographing period, the cooperative function mode is executed. For example, if the second device is photographed within a photographing period from the time point at which the first device is photographed, the cooperative function mode is executed. In this case, the first device is also treated as a device photographed within the photographing period. Further, if the second device is photographed within the photographing period from the time point at which the first device is photographed, the controller 36 of the server 14 may set a new photographing period from the time point at which the second device is photographed. The same applies to subsequent periods, that is, if a third device is photographed within the new photographing period, another new photographing period is set.
As another example, if a plurality of devices are photographed within a photographing period from a time point designated by the user, the cooperative function mode may be executed.
As another example, if an instruction to display a cooperative function is provided after a plurality of devices are photographed, the cooperative function mode may be executed.
As another example, if the second device is photographed during a period in which no instruction to execute a function of the first device is provided after the first device is photographed, the cooperative function mode may be executed.
As described above, in switching between the display of single-device functions and cooperative functions, if one device is identified (e.g., photographed), information on one or more functions of that device is displayed, and if a plurality of devices are identified (e.g., photographed), the cooperative function mode is executed. Accordingly, information about the functions that can be executed with the identified (e.g., photographed) devices is provided to the user, which may be convenient for the user.
Since the single-device functions or cooperative functions become available merely by identifying the devices through application of the AR technology, the respective functions are available through a simple operation, and the user's time and effort can be reduced compared to the case where the user manually makes settings for executing the functions.
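The display-switching rule above can be expressed as a small dispatch function. This is a minimal sketch with hypothetical lookup tables (a map from each device to its own functions, and a map from a set of devices to their cooperative functions); none of these names come from the embodiment.

```python
def select_display(identified, single_functions, cooperative_functions):
    """Choose what to show for the identified (e.g., photographed) devices.

    identified:             list of recognized device names
    single_functions:       dict mapping a device -> its own functions
    cooperative_functions:  dict mapping a frozenset of devices -> cooperative functions
    """
    if len(identified) == 1:
        # One device recognized: display that device's own functions.
        return {"mode": "single",
                "functions": single_functions.get(identified[0], [])}
    # Two or more devices recognized: execute the cooperative function mode.
    key = frozenset(identified)
    return {"mode": "cooperative",
            "functions": cooperative_functions.get(key, [])}
```

Using a frozenset as the lookup key makes the cooperative-function lookup independent of the order in which the devices were recognized; an order-sensitive variant (as in claim 8) would use a tuple instead.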
In an exemplary embodiment, the device image related to the recognized device and the device images superimposed on one another may be displayed three-dimensionally so as to be distinguished from the background image. That is, these images may be displayed as three-dimensional images. For example, the background image is displayed two-dimensionally while the device images are displayed three-dimensionally, thereby increasing the visibility of the device images. In addition, the color of a device image specified by the user may be changed, or the specified device image may blink, so that the specified device image can be distinguished from the other device images.
According to an exemplary embodiment, a cooperative function that uses the functions of target devices cooperating with each other is determined by applying AR technology, and information on the cooperative function is displayed. Therefore, even if the user cannot tell from the appearance of the devices which cooperative functions the target devices can execute together, the user is provided with information on the cooperative functions. Further, by making a plurality of devices cooperate with each other, functions that cannot be executed by a single device become available, which may be convenient. In addition, the cooperative function becomes available merely by applying the AR technology to identify the target devices that cooperate with each other. Therefore, the cooperative function becomes available through a simple operation, and the user's time and effort can be reduced compared to the case where the user manually makes settings for executing the cooperative function.
For example, each of the image forming apparatus 10, the server 14, and the terminal apparatus 16 is realized through cooperation between hardware and software resources. Specifically, each of the image forming apparatus 10, the server 14, and the terminal apparatus 16 includes one or more processors (not shown), such as a Central Processing Unit (CPU). The one or more processors read and execute a program stored in a storage device (not shown), thereby implementing the functions of the respective units of the image forming apparatus 10, the server 14, and the terminal apparatus 16. The program is stored in the storage device via a recording medium such as a Compact Disc (CD) or a Digital Versatile Disc (DVD), or via a communication path such as a network. Alternatively, the respective units of the image forming apparatus 10, the server 14, and the terminal apparatus 16 may be realized by hardware resources such as processors, electronic circuits, or Application Specific Integrated Circuits (ASICs), and a device such as a memory may also be used in the implementation. As still another alternative, the respective units may be implemented by a Digital Signal Processor (DSP) or a Field Programmable Gate Array (FPGA).
The foregoing description of the exemplary embodiments of the invention has been presented for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, to thereby enable others skilled in the art to understand the invention for various embodiments and with various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (22)

1. An information processing apparatus, comprising:
a controller that performs control to present guidance indicating a second device capable of performing the cooperative function together with the first device if a first image related to the first device required to perform the cooperative function is specified,
wherein the controller monitors at least one of an operation state of the first device, an environment in which the first device is installed, and an update state of the first device, and determines the availability of the cooperative function based on the monitoring result.
2. The information processing apparatus according to claim 1, wherein the controller performs control to present the guidance if an image related to a device incapable of performing the cooperative function with the first device is further specified.
3. The information processing apparatus according to claim 2, wherein the controller performs control to present the guidance if an operation of linking the first image and an image related to a device that cannot perform the cooperative function with the first device to each other is performed.
4. The information processing apparatus according to claim 2, wherein the controller performs control to present the guidance if the first image and an image related to a device that cannot perform the cooperative function with the first device are superimposed on each other.
5. The information processing apparatus according to any one of claims 1 to 4, wherein if a partial image included in the first image is specified, the controller performs control to present the guidance indicating the second device that can perform the cooperative function together with a function corresponding to the partial image.
6. The information processing apparatus according to claim 1, wherein as the control to present the guidance, the controller performs control to display a candidate list showing information on one or more second devices capable of performing the cooperative function.
7. The information processing apparatus according to claim 6, wherein if a second device is specified from among the one or more second devices on the candidate list, the controller performs control to display information on the cooperative function using the specified second device.
8. The information processing apparatus according to claim 1, wherein the controller performs control to display the cooperative function while changing the cooperative function according to an order in which the first device and the second device are specified.
9. The information processing apparatus according to claim 1, wherein if the first device and the second device are specified, the controller further performs control to present guidance indicating a third device capable of performing a cooperative function together with the first device and the second device.
10. The information processing apparatus according to claim 9, wherein the controller performs control to present the guidance while changing the third device according to an order in which the first device and the second device are specified.
11. An information processing apparatus, comprising:
a controller that performs control to present guidance indicating a second function capable of performing the cooperative function together with the first function if a first image related to the first function required to perform the cooperative function is specified,
wherein the controller monitors at least one of an operation state of a device corresponding to the first and second functions, an environment in which the device corresponding to the first and second functions is installed, and an update state of the device corresponding to the first and second functions, and determines the availability of the cooperative function based on the monitoring result.
12. The information processing apparatus according to claim 11, wherein the controller performs control to present the guidance if an image related to a function that cannot execute the cooperative function together with the first function is further specified.
13. The information processing apparatus according to claim 12, wherein the controller performs control to present the guidance if an operation of linking the first image and an image related to a function that cannot perform the cooperative function together with the first function to each other is performed.
14. The information processing apparatus according to claim 12, wherein the controller performs control to present the guidance if the first image and an image related to a function that cannot perform the cooperative function together with the first function are superimposed on each other.
15. The information processing apparatus according to any one of claims 11 to 14, wherein as the control to present the guidance, the controller performs control to display a candidate list showing information on one or more second functions that can execute the cooperative function.
16. The information processing apparatus according to claim 15, wherein an order in which the one or more second functions are arranged in the candidate list is determined based on past usage records of the one or more second functions.
17. The information processing apparatus according to claim 11, wherein the controller performs control to display the cooperative function while changing the cooperative function in accordance with an order in which the first function and the second function are specified.
18. The information processing apparatus according to claim 11, wherein if the first function and the second function are specified, the controller further performs control to present guidance indicating a third function that can perform a cooperative function together with the first function and the second function.
19. The information processing apparatus according to claim 18, wherein the controller performs control to present the guidance while changing the third function in accordance with an order in which the first function and the second function are specified.
20. The information processing apparatus according to claim 11, wherein the first function and the second function are included in a group of functions registered in advance, a group of functions of one or more identified devices, a group of functions displayed on a display, or a group of functions displayed in a specific area of a screen of the display.
21. An information processing method, comprising the steps of:
if a first image related to a first device required to perform a cooperative function is specified, performing control to present guidance indicating a second device capable of performing the cooperative function together with the first device,
wherein at least one of an operation state of the first device, an environment in which the first device is installed, and an update state of the first device is monitored, and availability of the cooperative function is determined based on the monitoring result.
22. An information processing method, comprising the steps of:
if a first image related to a first function required to execute a cooperative function is specified, performing control to present guidance indicating a second function capable of executing the cooperative function together with the first function,
wherein at least one of an operation state of a device to which the first function and the second function correspond, an environment in which the device to which the first function and the second function correspond is installed, and an update state of the device to which the first function and the second function correspond is monitored, and the availability of the cooperative function is determined based on a monitoring result.
CN201710938467.4A 2017-01-11 2017-09-30 Information processing apparatus, information processing method, and computer program Active CN108307084B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-002491 2017-01-11
JP2017002491A JP2018112860A (en) 2017-01-11 2017-01-11 Information processing device and program

Publications (2)

Publication Number Publication Date
CN108307084A (en) 2018-07-20
CN108307084B (en) 2022-06-24

Family

ID=62869993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710938467.4A Active CN108307084B (en) 2017-01-11 2017-09-30 Information processing apparatus, information processing method, and computer program

Country Status (2)

Country Link
JP (1) JP2018112860A (en)
CN (1) CN108307084B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7271157B2 (en) * 2018-12-13 2023-05-11 シャープ株式会社 DISPLAY DEVICE, PROGRAM AND DISPLAY METHOD OF DISPLAY DEVICE

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011182183A (en) * 2010-03-01 2011-09-15 Panasonic Corp Device information display apparatus, television, equipment information display method, program, and recording medium
JP2011253370A (en) * 2010-06-02 2011-12-15 Sony Corp Information processing device, information processing method and program
JP2012213144A (en) * 2011-03-18 2012-11-01 Ricoh Co Ltd Information processor, information processing system, device cooperation method and program
WO2013061517A1 (en) * 2011-10-27 2013-05-02 パナソニック株式会社 Apparatus for executing device coordination service, method for executing device coordination service, and program for executing device coordination service
CN104375948A (en) * 2013-08-14 2015-02-25 佳能株式会社 Image forming apparatus and control method thereof
JP6024848B1 (en) * 2016-05-06 2016-11-16 富士ゼロックス株式会社 Information processing apparatus and program
CN106161834A (en) * 2015-05-11 2016-11-23 富士施乐株式会社 Information processing system, information processor and information processing method
JP6052458B1 (en) * 2016-06-29 2016-12-27 富士ゼロックス株式会社 Information processing apparatus and program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6303709B2 (en) * 2014-03-28 2018-04-04 ブラザー工業株式会社 Image processing apparatus, communication system, and relay apparatus

Also Published As

Publication number Publication date
CN108307084A (en) 2018-07-20
JP2018112860A (en) 2018-07-19

Similar Documents

Publication Publication Date Title
CN107346220B (en) Information processing apparatus, information processing method, and computer program
JP6052459B1 (en) Information processing apparatus and program
JP6052458B1 (en) Information processing apparatus and program
JP6179653B1 (en) Information processing apparatus and program
US10440208B2 (en) Information processing apparatus with cooperative function identification
CN107391061B (en) Information processing apparatus and information processing method
JP6146528B1 (en) Information processing apparatus and program
JP6160761B1 (en) Information processing apparatus and program
CN107346218B (en) Information processing apparatus, information processing method, and computer program
CN108307084B (en) Information processing apparatus, information processing method, and computer program
US10359975B2 (en) Information processing device and non-transitory computer readable medium
JP6327387B2 (en) Information processing apparatus and program
JP6443498B2 (en) Information processing apparatus and program
JP6958680B2 (en) Information processing equipment and programs
JP2018129097A (en) Information processing apparatus and program
JP6443497B2 (en) Information processing apparatus and program
JP6455551B2 (en) Information processing apparatus and program
JP2019068442A (en) Information processing apparatus and program
JP2019114279A (en) Information processing device and program
JP2022027759A (en) Information processing unit, program and control method
JP2018067292A (en) Information processing apparatus and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Tokyo, Japan

Applicant after: Fuji film business innovation Co.,Ltd.

Address before: Tokyo, Japan

Applicant before: Fuji Xerox Co.,Ltd.

GR01 Patent grant