CN108307084A - Information processing equipment and information processing method - Google Patents

Information processing equipment and information processing method

Info

Publication number
CN108307084A
CN108307084A
Authority
CN
China
Prior art keywords
function
image
collaboration feature
information
controller
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710938467.4A
Other languages
Chinese (zh)
Other versions
CN108307084B (en)
Inventor
得地贤吾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fuji Xerox Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd
Publication of CN108307084A
Application granted
Publication of CN108307084B
Legal status: Active
Anticipated expiration

Links

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N1/00127 Connection or combination of a still picture apparatus with another apparatus, e.g. for storage, processing or transmission of still picture signals or of information associated with a still picture
    • H04N1/00204 … with a digital computer or a digital computer system, e.g. an internet server
    • H04N1/00244 … with a server, e.g. an internet server
    • H04N1/00249 … with a photographic apparatus, e.g. a photographic printer or a projector
    • H04N1/00251 … with an apparatus for taking photographic images, e.g. a camera
    • H04N1/00326 … with a data reading, recognizing or recording apparatus, e.g. with a bar-code apparatus
    • H04N1/00328 … with an apparatus processing optically-read information
    • H04N1/00334 … with an apparatus processing barcodes or the like
    • H04N1/00336 … with an apparatus performing pattern recognition, e.g. of a face or a geographic feature
    • H04N1/0035 User-machine interface; Control console
    • H04N1/00405 Output means
    • H04N1/00474 Output means outputting a plurality of functional options, e.g. scan, copy or print
    • H04N1/00912 Arrangements for controlling a still picture apparatus or components thereof not otherwise provided for
    • H04N2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N2201/0077 Types of the still picture apparatus
    • H04N2201/0094 Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Facsimiles In General (AREA)
  • Accessory Devices And Overall Control Thereof (AREA)

Abstract

Information processing equipment and an information processing method are provided. The information processing equipment includes a controller. If a first image related to a first device required for executing a collaboration feature is specified, the controller performs control to present guidance indicating a second device that is capable of executing the collaboration feature together with the first device.

Description

Information processing equipment and information processing method
Technical field
The present invention relates to information processing equipment and an information processing method.
Background art
Japanese Unexamined Patent Application Publication Nos. 2015-177504 and 2015-223006 disclose techniques for enabling multiple devices to interoperate with one another.
In some cases, however, a desired collaboration feature cannot be executed.
Summary of the invention
Accordingly, an object of the present invention is to increase user convenience when a collaboration feature is executed.
According to a first aspect of the invention, there is provided information processing equipment including a controller. If a first image related to a first device required for executing a collaboration feature is specified, the controller performs control to present guidance indicating a second device that is capable of executing the collaboration feature together with the first device.
According to a second aspect of the invention, if an image related to a device that is not capable of executing the collaboration feature together with the first device is further specified, the controller performs control to present the guidance.
According to a third aspect of the invention, if an operation of linking the first image to an image related to a device that is not capable of executing the collaboration feature together with the first device is performed, the controller performs control to present the guidance.
According to a fourth aspect of the invention, if the first image and an image related to a device that is not capable of executing the collaboration feature together with the first device are superimposed on each other, the controller performs control to present the guidance.
According to a fifth aspect of the invention, if a partial image included in the first image is specified, the controller performs control to present guidance indicating the second device that is capable of executing the collaboration feature together with a function corresponding to the partial image.
According to a sixth aspect of the invention, as the control of presenting the guidance, the controller performs control to display a candidate list showing information about one or more second devices that are capable of executing the collaboration feature.
According to a seventh aspect of the invention, if a second device is specified from among the one or more second devices on the candidate list, the controller performs control to display information about the collaboration feature that uses the specified second device.
According to an eighth aspect of the invention, the controller performs control to display the collaboration feature while changing the collaboration feature in accordance with the order in which the first device and the second device are specified.
According to a ninth aspect of the invention, if the first device and the second device are specified, the controller further performs control to present guidance indicating a third device that is capable of executing a collaboration feature together with the first device and the second device.
According to a tenth aspect of the invention, the controller performs control to present the guidance while changing the third device in accordance with the order in which the first device and the second device are specified.
According to an eleventh aspect of the invention, there is provided information processing equipment including a controller. If a first image related to a first function required for executing a collaboration feature is specified, the controller performs control to present guidance indicating a second function that is capable of executing the collaboration feature together with the first function.
According to a twelfth aspect of the invention, if an image related to a function that is not capable of executing the collaboration feature together with the first function is further specified, the controller performs control to present the guidance.
According to a thirteenth aspect of the invention, if an operation of linking the first image to an image related to a function that is not capable of executing the collaboration feature together with the first function is performed, the controller performs control to present the guidance.
According to a fourteenth aspect of the invention, if the first image and an image related to a function that is not capable of executing the collaboration feature together with the first function are superimposed on each other, the controller performs control to present the guidance.
According to a fifteenth aspect of the invention, as the control of presenting the guidance, the controller performs control to display a candidate list showing information about one or more second functions that are capable of executing the collaboration feature.
According to a sixteenth aspect of the invention, the order in which the one or more second functions are arranged in the candidate list is determined based on past usage records of the one or more second functions.
According to a seventeenth aspect of the invention, the controller performs control to display the collaboration feature while changing the collaboration feature in accordance with the order in which the first function and the second function are specified.
According to an eighteenth aspect of the invention, if the first function and the second function are specified, the controller further performs control to present guidance indicating a third function that is capable of executing a collaboration feature together with the first function and the second function.
According to a nineteenth aspect of the invention, the controller performs control to present the guidance while changing the third function in accordance with the order in which the first function and the second function are specified.
According to a twentieth aspect of the invention, the first function and the second function are included in a group of pre-registered functions, a group of functions of one or more identified devices, a group of functions displayed on a display, or a group of functions displayed in a specific region of the screen of a display.
According to a twenty-first aspect of the invention, there is provided an information processing method including: if a first image related to a first device required for executing a collaboration feature is specified, performing control to present guidance indicating a second device that is capable of executing the collaboration feature together with the first device.
According to a twenty-second aspect of the invention, there is provided an information processing method including: if a first image related to a first function required for executing a collaboration feature is specified, performing control to present guidance indicating a second function that is capable of executing the collaboration feature together with the first function.
According to the present invention first, the 9th and the 20th on the one hand, and in the case where executing collaboration feature, user, which facilitates, increases Add.
In terms of according to the present invention second or the 12nd, can avoid being likely to occur in the case where guiding is presented always answers Polygamy.
Third according to the present invention or the 13rd aspect, perform control to and guiding are presented by the operation of concatenated image.
According to the present invention 4th or fourteenth aspect, it performs control to and guiding is presented by the operation of superimposed image.
According to the fifth aspect of the invention, association can be executed together with the specific function of device by performing control to presentation instruction Make the guiding of the device of function.
6th or the 7th aspect according to the present invention, is presented the list for the device for being able to carry out collaboration feature.
In terms of according to the present invention 8th or the 17th, performs control to display and estimate collaboration feature to be used.
According to the tenth aspect of the invention, it performs control to and the guiding that device to be used is estimated in instruction is presented.
11st, the 18th, the 20th or the 22nd aspect according to the present invention, executes specified needed for collaboration feature Function in the case of, user facilitate increase.
According to the fifteenth aspect of the invention, the list for the function of being able to carry out collaboration feature is presented.
According to the sixteenth aspect of the invention, the record for grasping the relative usage of each device becomes easy.
According to the nineteenth aspect of the invention, it performs control to and the guiding that function to be used is estimated in instruction is presented.
Description of the drawings
Exemplary embodiments of the present invention will be described in detail based on the following figures, wherein:
Fig. 1 is a block diagram showing a device system according to an exemplary embodiment of the invention;
Fig. 2 is a block diagram showing an image forming apparatus according to the exemplary embodiment;
Fig. 3 is a block diagram showing a server according to the exemplary embodiment;
Fig. 4 is a block diagram showing a terminal device according to the exemplary embodiment;
Fig. 5 is a schematic diagram showing the appearance of the image forming apparatus;
Fig. 6 is a diagram showing an example of a device function management table;
Fig. 7 is a diagram showing an example of a collaboration feature management table;
Fig. 8 is a diagram showing a device that is used alone;
Fig. 9 is a diagram showing an example of a function display screen;
Fig. 10 is a diagram showing an example of a function display screen;
Fig. 11 is a diagram showing target devices that cooperate with each other;
Fig. 12 is a diagram showing an example of a function display screen;
Fig. 13 is a sequence diagram showing a connection process;
Figs. 14A and 14B are diagrams showing examples of a device display screen;
Fig. 15 is a diagram showing an example of a device display screen according to Example 1;
Fig. 16 is a diagram showing an example of a device display screen according to Example 2;
Fig. 17 is a diagram showing an example of a device display screen according to Example 3;
Fig. 18 is a diagram showing an example of a device display screen according to Example 3;
Fig. 19 is a diagram showing an example of a device display screen according to Example 3;
Fig. 20 is a diagram showing an example of a device display screen according to Example 3;
Fig. 21 is a diagram showing an example of a device display screen according to Example 4;
Fig. 22 is a diagram showing an example of a device display screen according to Example 4;
Fig. 23 is a diagram showing an example of a device display screen according to Example 5;
Fig. 24 is a diagram showing an example of a device display screen according to Example 5;
Fig. 25 is a diagram showing an example of a screen according to Example 5;
Fig. 26 is a diagram showing an example of a device selection screen according to Example 6;
Fig. 27 is a diagram showing an example of a device selection screen according to Example 7;
Fig. 28 is a diagram showing an example of a device selection screen according to Example 7;
Fig. 29 is a diagram showing an example of a function selection screen according to Example 7;
Fig. 30 is a diagram showing an example of a screen according to Example 7;
Fig. 31 is a diagram showing an example of a message according to Example 7;
Fig. 32 is a diagram showing an example of a message according to Example 7;
Fig. 33 is a diagram showing an example of a device selection screen according to Example 8;
Fig. 34 is a diagram showing an example of a device selection screen according to Example 8;
Fig. 35 is a diagram showing an example of a screen according to Example 8;
Fig. 36 is a diagram showing an example of a collaboration feature management table;
Figs. 37A and 37B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
Figs. 38A and 38B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
Fig. 39 is a diagram showing an example of a device function management table;
Figs. 40A and 40B are diagrams showing an example of a device display screen and an example of a function display screen, respectively;
Fig. 41 is a diagram showing an example of a device function management table;
Fig. 42 is a diagram showing an example of a collaboration feature management table;
Figs. 43A, 43B, and 43C are diagrams showing examples of screens displayed on the terminal device;
Figs. 44A and 44B are diagrams showing examples of screens displayed on the terminal device; and
Figs. 45A and 45B are diagrams showing examples of screens displayed on the terminal device.
Detailed description
A device system serving as an information processing system according to an exemplary embodiment of the invention will be described with reference to Fig. 1. Fig. 1 shows an example of the device system according to the exemplary embodiment.
The device system according to the exemplary embodiment includes multiple devices (for example, devices 10 and 12), a server 14 as an example of an external apparatus, and a terminal device 16 as an example of the information processing equipment. The devices 10 and 12, the server 14, and the terminal device 16 have a function of communicating with one another via a communication path N, such as a network. Of course, each of the devices 10 and 12, the server 14, and the terminal device 16 may communicate with another apparatus via a different communication path. In the example shown in Fig. 1, the device system includes two devices (the devices 10 and 12); the device system may include three or more devices. The device system may also include multiple servers 14 and multiple terminal devices 16.
The devices 10 and 12 are apparatuses having specific functions and may each be, for example, an image forming apparatus having an image forming function, a personal computer (PC), a projector, a display apparatus such as a liquid crystal display, a telephone, a clock, a monitoring camera, or the like. The devices 10 and 12 each have a function of transmitting data to and receiving data from another apparatus. In the exemplary embodiment, it is assumed that, for example, the device 10 is an image forming apparatus. The image forming apparatus (device 10) is an apparatus having at least one of a scan function, a print function, a copy function, and a facsimile function.
The server 14 is an apparatus that manages the functions of the devices. For example, the server 14 manages the functions of the devices, collaboration features that use multiple functions, and so forth. The server 14 also has a function of transmitting data to and receiving data from another apparatus.
The server 14 may manage, for each user, one or more functions available to the user. A function available to a user is, for example, a function provided to the user free of charge, or a function provided for a fee and purchased by the user. The server 14 may manage, for each user, available-function information indicating the one or more functions available to the user (for example, function purchase history information). Of course, the server 14 may manage functions regardless of whether they have been purchased, because there are free functions, additional update functions, and specific functions managed by an administrator. A function purchase process is executed by, for example, the server 14. Of course, the function purchase process may be executed by another apparatus.
The terminal device 16 is an apparatus such as a PC, a tablet PC, a smartphone, or a mobile phone, and has a function of transmitting data to and receiving data from another apparatus. When a device is used, the terminal device 16 functions as, for example, a user interface unit (UI unit).
In the device system according to the exemplary embodiment, if a first image related to a first device is specified, control is performed to present guidance indicating a second device that is capable of executing a collaboration feature together with the first device. Alternatively, if a first image related to a first function is specified, control may be performed to present guidance indicating a second function that is capable of executing a collaboration feature together with the first function.
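The guidance control described above can be pictured as a simple lookup. The following sketch is illustrative only: the table contents, device names, and feature names are invented for this example and are not taken from the patent; the code merely shows the shape of a lookup that turns a specified first device into a candidate list of second devices.

```python
# Hypothetical table: (first device type, second device type) -> collaboration feature.
# All entries are invented examples, not from the patent.
COLLABORATION_TABLE = {
    ("multifunction printer", "projector"): "project scanned image",
    ("multifunction printer", "PC"): "scan and transfer to PC",
    ("camera", "multifunction printer"): "print captured photo",
}

def candidate_second_devices(first_device: str) -> list[str]:
    """Return devices able to execute a collaboration feature with first_device."""
    return sorted(
        second
        for (first, second) in COLLABORATION_TABLE
        if first == first_device
    )

def guidance(first_device: str) -> str:
    """Build the guidance text presented when the first device's image is specified."""
    candidates = candidate_second_devices(first_device)
    if not candidates:
        return f"No device can collaborate with the {first_device}."
    return f"The {first_device} can collaborate with: " + ", ".join(candidates)
```

The candidate list returned here corresponds to the list displayed in the sixth aspect; presenting it only after a first device is specified avoids showing guidance unconditionally.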
Hereinafter, the apparatuses included in the device system according to the exemplary embodiment will be described in detail.
The configuration of the device 10 serving as an image forming apparatus will be described in detail with reference to Fig. 2. Hereinafter, the device 10 may be referred to as the image forming apparatus 10. Fig. 2 shows the configuration of the image forming apparatus 10.
A communication unit 18 is a communication interface and has a function of transmitting data to another apparatus and a function of receiving data from another apparatus. The communication unit 18 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.
An image forming unit 20 has an image forming function. Specifically, the image forming unit 20 has at least one of a scan function, a print function, a copy function, and a facsimile function. When the scan function is executed, a document is read and scan data (image data) is generated. When the print function is executed, an image is printed on a recording medium such as paper. When the copy function is executed, a document is read and printed on a recording medium. When the facsimile function is executed, image data is transmitted or received by facsimile. In addition, a collaboration feature that uses multiple functions may be executed. For example, a scan-and-transmit function, which is a combination of the scan function and a transmit (transfer) function, may be executed. When the scan-and-transmit function is executed, a document is read, scan data (image data) is generated, and the scan data is transmitted to a destination (for example, an external apparatus such as the terminal device 16). Of course, this collaboration feature is merely an example, and another collaboration feature may be executed.
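As a rough illustration of the scan-and-transmit collaboration feature just described, the sketch below chains a scan step and a transmit step. The function bodies are placeholder stand-ins, not the patent's implementation; names and return values are invented for the example.

```python
def scan(document: str) -> bytes:
    """Read a document and generate scan data (image data). Placeholder body."""
    return f"image-data:{document}".encode()

def transmit(scan_data: bytes, destination: str) -> str:
    """Send scan data to a destination such as the terminal device 16. Placeholder body."""
    return f"sent {len(scan_data)} bytes to {destination}"

def scan_and_transmit(document: str, destination: str) -> str:
    """Collaboration feature combining the scan function and the transmit function."""
    return transmit(scan(document), destination)
```

The point of the sketch is only the composition: the collaboration feature is not a new primitive but a chaining of two existing functions.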
A memory 22 is a storage device such as a hard disk or a memory (for example, a solid state drive (SSD)). The memory 22 stores, for example, information indicating an image forming instruction (for example, job information), image data to be printed, scan data generated by executing the scan function, device address information indicating the address of another device, server address information indicating the address of the server 14, various control data, and various programs. Of course, these pieces of information and data may be stored in different storage devices or in a single storage device.
A UI unit 24 is a user interface unit and includes a display and an operation unit. The display is a display apparatus such as a liquid crystal display. The operation unit is an input device such as a touch screen or a keyboard. Of course, a user interface serving as both a display and an operation unit may be used (for example, a touch display, or an apparatus including a display that electronically displays a keyboard or the like). The image forming apparatus 10 does not need to include the UI unit 24, and may instead include a hardware user interface unit (hardware UI unit) without a display. The hardware UI unit is, for example, a hardware keypad dedicated to inputting numbers (for example, a numeric keypad) or a hardware keypad dedicated to indicating directions (for example, a direction indication keypad).
A controller 26 controls the operation of each unit of the image forming apparatus 10.
Hereinafter, the configuration of the server 14 will be described in detail with reference to Fig. 3. Fig. 3 shows the configuration of the server 14.
A communication unit 28 is a communication interface and has a function of transmitting data to another apparatus and a function of receiving data from another apparatus. The communication unit 28 may be a communication interface having a wireless communication function or a communication interface having a wired communication function.
A memory 30 is a storage device such as a hard disk or a memory (for example, an SSD). The memory 30 stores, for example, device function management information 32, collaboration feature management information 34, various data, various programs, device address information indicating the addresses of the devices, and server address information indicating the address of the server 14. Of course, these pieces of information and data may be stored in different storage devices or in a single storage device. The device function management information 32 and the collaboration feature management information 34 stored in the memory 30 may be provided to the terminal device 16 periodically or at specified timing, and accordingly the information stored in the terminal device 16 may be updated. Hereinafter, the device function management information 32 and the collaboration feature management information 34 will be described.
The device function management information 32 is information for managing the functions of the devices, and is, for example, information indicating a correspondence between device identification information for identifying a device and one or more pieces of function information indicating one or more functions of the device. The device identification information includes, for example, a device ID, a device name, information indicating the type of the device, the model number of the device, information indicating the position of the device (device position information), and an appearance image representing the appearance of the device. The function information includes, for example, a function ID and a function name. For example, if the image forming apparatus 10 has a scan function, a print function, a copy function, and a scan-and-transmit function, the device identification information of the image forming apparatus 10 is associated with function information indicating the scan function, function information indicating the print function, function information indicating the copy function, and function information indicating the scan-and-transmit function. The functions of each device are specified (identified) by referring to the device function management information 32.
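The correspondence held in the device function management information 32 could be represented as follows. This is a minimal sketch under assumed field names; the patent describes the fields but defines no schema, and all IDs and values below are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class DeviceRecord:
    """One entry of the device function management information (fields assumed)."""
    device_id: str
    name: str
    device_type: str
    position: str                      # device position information
    appearance_image: str              # e.g. a path to the appearance image
    functions: dict = field(default_factory=dict)  # function ID -> function name

# Hypothetical registry keyed by device ID.
device_function_management = {
    "dev-10": DeviceRecord(
        device_id="dev-10",
        name="image forming apparatus 10",
        device_type="multifunction printer",
        position="office, 2nd floor",
        appearance_image="images/mfp.png",
        functions={
            "f-scan": "scan function",
            "f-print": "print function",
            "f-copy": "copy function",
            "f-scan-tx": "scan-and-transmit function",
        },
    ),
}

def functions_of(device_id: str) -> list[str]:
    """Identify a device's functions by referring to the management information."""
    return list(device_function_management[device_id].functions.values())
```

Registering or updating a device then amounts to inserting or mutating a `DeviceRecord`, which mirrors the update behavior described in the next paragraph.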
The devices managed by the device function management information 32 are, for example, the devices included in the device system (e.g., the devices 10 and 12). Of course, devices not included in the device system may also be managed by the device function management information 32. For example, the server 14 may acquire information about a new device not included in the device system (information including device identification information and function information) and may newly register the information in the device function management information 32. The information about the device may be acquired by using the Internet or the like, or may be input by an administrator or the like. The server 14 may update the device function management information 32 at certain timing, periodically, or at timing designated by an administrator or the like. Accordingly, function information indicating a function that a device did not have before an update but has after the update may be registered in the device function management information 32. Likewise, function information indicating a function that a device had before an update but no longer has after the update may be deleted from the device function management information 32, or may be registered as unavailable information. The information used for updating may be acquired by using the Internet or the like, or may be input by an administrator or the like.
The collaboration feature management information 34 is information for managing collaboration features, each of which is executed by cooperation between a plurality of functions. One or more collaboration features are executed by cooperation between a plurality of functions. Each collaboration feature may be executed by cooperation between a plurality of functions of a single device (e.g., the device 10 or 12), or may be executed by cooperation between a plurality of functions of a plurality of devices (e.g., the devices 10 and 12). The terminal device that provides an operation instruction (in the exemplary embodiment, the terminal device 16) may be included among the devices to be identified, and a function of the terminal device may be used as part of a collaboration feature.
A collaboration feature may be a function executed without using a hardware device. For example, a collaboration feature may be a function executed by cooperation between a plurality of pieces of software. Of course, a collaboration feature may be a function executed by cooperation between a function of a hardware device and a function implemented by software.
For example, the collaboration feature management information 34 is information indicating the correspondence between a combination of pieces of function information indicating the individual functions used in a collaboration feature and collaboration feature information indicating the collaboration feature. The collaboration feature information includes, for example, a collaboration feature ID and a collaboration feature name. If an individual function is updated, the collaboration feature management information 34 is also updated accordingly. As a result, a collaboration feature using a plurality of functions that could not cooperate with each other before the update may become available after the update, or a collaboration feature that was available before the update may become unavailable after the update. Collaboration feature information indicating a collaboration feature that becomes available after the update is registered in the collaboration feature management information 34, and collaboration feature information indicating a collaboration feature that becomes unavailable after the update is deleted from the collaboration feature management information 34 or registered as unavailable information.
In the case of causing a plurality of devices to cooperate with each other, the collaboration feature management information 34 is information for managing one or more collaboration features that use a plurality of functions of the plurality of devices, and is information indicating the correspondence between a combination of pieces of device identification information for identifying the individual devices used for the one or more collaboration features and the collaboration feature information. If the device function management information 32 is updated, the collaboration feature management information 34 is also updated accordingly. As a result, a collaboration feature using a plurality of devices that could not cooperate with each other before the update may become available after the update, or a collaboration feature that was available before the update may become unavailable after the update.
A collaboration feature may be a function executed by cooperation between a plurality of different functions, or may be a function executed by cooperation between identical functions. A collaboration feature may be a function that is unavailable without cooperation. A function that is unavailable without cooperation may be one that becomes available by combining identical functions among the functions of the target devices that cooperate with each other, or one that becomes available by combining different functions among the functions of the target devices that cooperate with each other. For example, cooperation between a device having a print function (a printer) and a device having a scan function (a scanner) implements a copy function as a collaboration feature. That is, cooperation between the print function and the scan function implements the copy function. In this case, the copy function as a collaboration feature is associated with the combination of the print function and the scan function. In the collaboration feature management information 34, for example, the collaboration feature information indicating the copy function as a collaboration feature is associated with the combination of the device identification information for identifying the device having the print function and the device identification information for identifying the device having the scan function.
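The printer-plus-scanner example above can be sketched as a lookup keyed by an unordered combination of functions, which is exactly what a frozenset expresses in Python. The feature names below beyond "copy" are hypothetical additions for illustration.

```python
# Sketch of collaboration feature management information 34: an unordered
# combination of functions maps to the collaboration feature it implements.
# "copy" from {print, scan} follows the patent's example; the second entry
# is a hypothetical illustration.
COLLABORATION_FEATURES = {
    frozenset(["print", "scan"]): "copy",
    frozenset(["scan", "translate"]): "scan-and-translate",
}

def collaboration_feature(functions):
    """Return the collaboration feature implemented by the given combination
    of functions, or None if that combination implements none."""
    return COLLABORATION_FEATURES.get(frozenset(functions))
```

Because the key is a frozenset, the order in which the cooperating functions are listed does not matter, matching the idea that the copy function is associated with the combination of the print function and the scan function.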
The memory 30 may store available function management information. The available function management information is information for managing the one or more functions available to each user, and is, for example, information indicating the correspondence between user identification information for identifying a user and one or more pieces of function information (which may include collaboration feature information) indicating one or more functions available to the user. As described above, the functions available to a user are, for example, functions provided to the user free of charge or functions purchased by the user, and may be individual functions or collaboration features. The user identification information is, for example, user account information such as a user ID and a user name. The functions available to each user are determined (identified) by referring to the available function management information. The available function management information may be updated every time a function is provided to a user (for example, every time a function is provided for a charge or free of charge).
The controller 36 controls the operation of each unit of the server 14. The controller 36 includes a determination unit 38.
The determination unit 38 receives device identification information for identifying a device, and determines, in the device function management information 32 stored in the memory 30, one or more pieces of function information indicating one or more functions associated with the device identification information. Accordingly, the one or more functions of the device are determined (identified). For example, device identification information is transmitted from the terminal device 16 to the server 14, and the determination unit 38 then determines one or more pieces of function information indicating one or more functions associated with the device identification information. Information about the one or more functions (e.g., function information and function description information) is transmitted from the server 14 to the terminal device 16 and is displayed on the terminal device 16. Accordingly, the information about the one or more functions of the device identified by the device identification information is displayed on the terminal device 16.
Furthermore, the determination unit 38 receives pieces of device identification information for identifying target devices that cooperate with each other, and determines, in the collaboration feature management information 34 stored in the memory 30, one or more pieces of collaboration feature information indicating one or more collaboration features associated with the combination of the pieces of device identification information. Accordingly, the one or more collaboration features executed by cooperation between the functions of the target devices that cooperate with each other are determined (identified). For example, pieces of device identification information are transmitted from the terminal device 16 to the server 14, and the determination unit 38 then determines one or more pieces of collaboration feature information indicating one or more collaboration features associated with the pieces of device identification information. Information about the one or more collaboration features (e.g., collaboration feature information and collaboration feature description information) is transmitted from the server 14 to the terminal device 16 and is displayed on the terminal device 16. Accordingly, the information about the one or more collaboration features to be executed by the plurality of devices identified by the pieces of device identification information is displayed on the terminal device 16.
For example, if one device is identified (for example, if one device is photographed), the determination unit 38 may receive the device identification information for identifying the device, and may determine, in the device function management information 32, one or more pieces of function information indicating one or more functions associated with the device identification information. Accordingly, if one device is identified (for example, if one device is photographed), the one or more functions of the device are determined (identified). If a plurality of devices are identified (for example, if a plurality of devices are photographed), the determination unit 38 may receive pieces of device identification information for identifying the individual devices among the plurality of devices, and may determine, in the collaboration feature management information 34, one or more pieces of collaboration feature information indicating one or more collaboration features associated with the combination of the pieces of device identification information. Accordingly, if a plurality of devices are identified (for example, if a plurality of devices are photographed), the one or more collaboration features using the functions of the plurality of devices are determined (identified).
The determination unit 38 may receive pieces of function information indicating the individual functions used in a collaboration feature, and may determine, in the collaboration feature management information 34 stored in the memory 30, one or more pieces of collaboration feature information indicating one or more collaboration features associated with the combination of the pieces of function information. Accordingly, the one or more collaboration features executed by cooperation between the target functions are determined (identified). For example, pieces of function information are transmitted from the terminal device 16 to the server 14, and the determination unit 38 then determines one or more pieces of collaboration feature information indicating one or more collaboration features associated with the pieces of function information. In a manner similar to that described above, information about the one or more collaboration features to be executed by the plurality of functions identified by the pieces of function information is displayed on the terminal device 16.
If the functions available to users are managed, the determination unit 38 may receive user identification information for identifying a user, and may determine, in the available function management information stored in the memory 30, the pieces of function information indicating the individual functions associated with the user identification information. Accordingly, the group of functions available to the user is determined (identified). For example, user identification information is transmitted from the terminal device 16 to the server 14, and the determination unit 38 determines the pieces of function information indicating the individual functions associated with the user identification information. Information about the individual functions available to the user (e.g., information indicating the names of the individual functions) is transmitted from the server 14 to the terminal device 16 and is displayed on the terminal device 16. Accordingly, the information about the individual functions available to the user identified by the user identification information is displayed on the terminal device 16. For example, the determination unit 38 receives device identification information and user identification information, determines, in the device function management information 32, one or more pieces of function information indicating one or more functions associated with the device identification information, and also determines, in the available function management information, one or more pieces of function information indicating one or more functions associated with the user identification information. Accordingly, the one or more functions that the device identified by the device identification information has and that are available to the user identified by the user identification information are determined.
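The final step of the paragraph above — combining a device's functions with a user's available functions — amounts to a set intersection. The sketch below illustrates that under hypothetical data; the patent does not prescribe this representation, only the resulting correspondence.

```python
# Hypothetical sketch: the functions a device has, intersected with the
# functions available to a user, as the determination unit 38 might compute.
DEVICE_FUNCTIONS = {"A": {"scan", "print", "copy"}}
USER_FUNCTIONS = {"user-1": {"print", "copy", "translate"}}

def usable_functions(device_id, user_id):
    """Functions the identified device has AND the identified user may use."""
    return (DEVICE_FUNCTIONS.get(device_id, set())
            & USER_FUNCTIONS.get(user_id, set()))
```

Here "translate" is dropped because the device lacks it, and "scan" is dropped because the user has not been granted it, leaving only the functions both conditions permit.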
The controller 36 may execute a function purchase process and may manage a purchase history. For example, if a user purchases a pay function, the controller 36 may execute a charging process for the user.
The controller 36 may execute functions related to image processing, such as a character recognition function, a translation function, an image processing function, and an image forming function. Of course, the controller 36 may execute functions related to processing other than image processing. When the character recognition function is executed, characters in an image are recognized and character data representing the characters is generated. When the translation function is executed, characters in an image are translated into characters expressed in a specific language, and character data representing the translated characters is generated. When the image processing function is executed, an image is processed. For example, the controller 36 may receive, from the image forming apparatus 10, scan data generated by executing a scan function, and may execute a function related to image processing on the scan data, such as the character recognition function, the translation function, or the image processing function. The controller 36 may receive image data from the terminal device 16 and may execute the individual functions on the image data. The character data or image data generated by the controller 36 is transmitted, for example, from the server 14 to the terminal device 16. The server 14 may be used as an external device, and a collaboration feature may be a function that uses the functions of a plurality of devices including the server 14.
Hereinafter, the configuration of the terminal device 16 will be described in detail with reference to Fig. 4. Fig. 4 shows the configuration of the terminal device 16.
The communication unit 40 is a communication interface, and has a function of transmitting data to another apparatus and a function of receiving data from another apparatus. The communication unit 40 may be a communication interface having a wireless communication function, or may be a communication interface having a wired communication function.
The camera 42, which serves as a photographing unit, photographs a subject to generate image data (e.g., still image data or moving image data). Alternatively, instead of using the camera 42 of the terminal device 16, image data captured by an external camera connected to a communication path such as a network may be received by the communication unit 40 and displayed on the UI unit 46, so that the image data can be operated on by the user.
The memory 44 is a storage device such as a hard disk or a memory (e.g., an SSD). The memory 44 stores various programs, various data, the address information of the server 14, the address information of the individual devices (e.g., the address information of the devices 10 and 12), information about identified devices, information about identified target devices that cooperate with each other, information about the functions of identified devices, and information about collaboration features.
The UI unit 46 is a user interface unit, and includes a display and an operation unit. The display is a display device such as a liquid crystal display. The operation unit is an input device such as a touch screen, a keyboard, or a mouse. Of course, a user interface serving as both a display and an operation unit may be used (e.g., a touch display, or a device including a display that electronically displays a keyboard or the like).
The controller 48 controls the operation of each unit of the terminal device 16. The controller 48 serves as, for example, a display controller and causes the display of the UI unit 46 to display various pieces of information.
The display of the UI unit 46 displays, for example, an image captured by the camera 42, an image related to a device identified as a target device to be used (e.g., a device to be used alone or a target device of cooperation), an image related to a function, and the like. The image related to a device may be an image (still image or moving image) representing the device captured by the camera 42, or may be an image schematically representing the device (e.g., an icon). The data of the image schematically representing the device may be stored in the server 14 and supplied from the server 14 to the terminal device 16, may be stored in the terminal device 16 in advance, or may be stored in another apparatus and supplied from that apparatus to the terminal device 16. The image related to a function is, for example, an image such as an icon representing the function.
The above-described device function management information 32 may be stored in the memory 44 of the terminal device 16. In this case, the device function management information 32 need not be stored in the memory 30 of the server 14. Also, the above-described collaboration feature management information 34 may be stored in the memory 44 of the terminal device 16. In this case, the collaboration feature management information 34 need not be stored in the memory 30 of the server 14. The controller 48 of the terminal device 16 may include the above-described determination unit 38, may determine the one or more functions of a device by identifying the device based on device identification information, and may determine one or more collaboration features using a plurality of functions. In this case, the server 14 need not include the determination unit 38.
If available function management information is created, the available function management information may be stored in the memory 44 of the terminal device 16. In this case, the available function management information need not be stored in the memory 30 of the server 14. The controller 48 of the terminal device 16 may manage the function purchase history of the user. In this case, the controller 36 of the server 14 need not have the management function therefor. The controller 48 of the terminal device 16 may determine, based on user identification information, the one or more functions available to the user.
Alternatively, the device function management information 32 and the collaboration feature management information 34 may be stored in a device such as the device 10 or 12, and a device such as the device 10 or 12 may include the determination unit 38. That is, the processing performed by the determination unit 38 of the server 14 (e.g., the processing of identifying a device, the processing of identifying the functions of a device, or the processing of identifying a collaboration feature) may be executed in the server 14, may be executed in the terminal device 16, or may be executed in a device such as the device 10 or 12.
In the exemplary embodiment, for example, an augmented reality (AR) technology is used to obtain device identification information and identify a device. For example, the AR technology is used to obtain the device identification information of a device to be used alone and identify the device, and also to obtain the device identification information of target devices that cooperate with each other and identify the target devices. AR technologies according to the related art are used, for example, a marker-based AR technology that uses a marker such as a two-dimensional barcode, a markerless AR technology that uses an image recognition technique, and a position information AR technology that uses position information. Of course, device identification information may be obtained and a device may be identified without applying any AR technology. For example, a device connected to a network may be identified based on its IP address or by reading its device ID. Furthermore, in the case of devices or terminal devices having various types of wireless communication functions based on infrared communication, visible light communication, Wireless Fidelity (Wi-Fi, registered trademark), or Bluetooth (registered trademark), a device that cooperates by using the wireless communication function may be identified by obtaining its device ID, and a collaboration feature may be executed.
Hereinafter, the process of obtaining device identification information will be described in detail with reference to Fig. 5. As an example, a description will be given of the case of obtaining the device identification information of the image forming apparatus 10. Fig. 5 schematically shows the appearance of the image forming apparatus 10. Here, a description will be given of the process of obtaining device identification information by applying the marker-based AR technology. A marker 50, such as a two-dimensional barcode, is attached to the housing of the image forming apparatus 10. The marker 50 is information obtained by encoding the device identification information of the image forming apparatus 10. The user starts the camera 42 of the terminal device 16 and photographs, with the camera 42, the marker 50 attached to the image forming apparatus 10 as the target to be used. Accordingly, image data representing the marker 50 is generated. For example, the image data is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 executes a decoding process on the marker image represented by the image data, thereby extracting the device identification information. Accordingly, the image forming apparatus 10 to be used (the image forming apparatus 10 to which the photographed marker 50 is attached) is identified. The determination unit 38 of the server 14 determines, in the device function management information 32, the pieces of function information indicating the functions associated with the extracted device identification information. Accordingly, the functions of the image forming apparatus 10 to be used are determined (identified).
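The encode/decode round trip at the heart of the marker-based approach can be illustrated with a toy stand-in: here the "marker" is merely a base64 string rather than a photographed two-dimensional barcode, and both routines are hypothetical simplifications of the patent's decoding process.

```python
import base64

def encode_marker(device_id):
    """Toy stand-in for encoding device identification information into
    marker 50 (in the patent, a two-dimensional barcode on the housing)."""
    return base64.b64encode(device_id.encode("utf-8"))

def decode_marker(marker_payload):
    """Toy stand-in for the decoding process the controller 36 applies to
    the photographed marker image to extract device identification info."""
    return base64.b64decode(marker_payload).decode("utf-8")
```

The essential property is that whatever the encoding, the decode step recovers the original device identification information, which is then looked up in the device function management information 32.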
Alternatively, the controller 48 of the terminal device 16 may execute the decoding process on the image data representing the marker 50 to extract the device identification information. In this case, the extracted device identification information is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines, in the device function management information 32, the pieces of function information indicating the functions associated with the device identification information transmitted from the terminal device 16. If the device function management information 32 is stored in the memory 44 of the terminal device 16, the controller 48 of the terminal device 16 may determine, in the device function management information 32, the pieces of function information indicating the functions associated with the extracted device identification information.
The marker 50 may include encoded function information indicating the functions of the image forming apparatus 10. In this case, by executing the decoding process on the image data representing the marker 50, the device identification information of the image forming apparatus 10 is extracted and also the function information indicating the functions of the image forming apparatus 10 is extracted. Accordingly, the image forming apparatus 10 is determined (identified), and also the functions of the image forming apparatus 10 are determined (identified). The decoding process may be executed by the server 14 or the terminal device 16.
In the case of executing a collaboration feature that uses the functions of a plurality of devices, the markers of the target devices that cooperate with each other are photographed to obtain the device identification information of the devices, and thereby the collaboration feature is determined (identified).
In the case of obtaining device identification information by applying the markerless AR technology, for example, the user photographs the entire appearance or a partial appearance of the device to be used (e.g., the image forming apparatus 10) with the camera 42 of the terminal device 16. Of course, it is useful to obtain, by photographing the appearance of the device, information for identifying the device to be used, such as the name of the device (e.g., a trade name) or the model of the device. As a result of the photographing, appearance image data representing the entire appearance or the partial appearance of the device to be used is generated. For example, the appearance image data is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 identifies the device to be used based on the appearance image data. For example, the memory 30 of the server 14 stores, for each device, appearance image correspondence information indicating the correspondence between appearance image data representing the entire appearance or the partial appearance of the device and the device identification information of the device. The controller 36 compares, for example, the appearance image data received from the terminal device 16 with each piece of appearance image data included in the appearance image correspondence information, and determines the device identification information of the device to be used based on the comparison result. For example, the controller 36 extracts features of the appearance of the device to be used from the appearance image data received from the terminal device 16, determines, in the group of pieces of appearance image data included in the appearance image correspondence information, the appearance image data representing features that are the same as or similar to the extracted appearance features, and determines the device identification information associated with that appearance image data. Accordingly, the device to be used (the device photographed by the camera 42) is identified. As another example, if the name (e.g., the trade name) or the model of the device is photographed and appearance image data representing the name or the model is generated, the device to be used may be identified based on the name or the model represented by the appearance image data. The determination unit 38 of the server 14 determines, in the device function management information 32, the pieces of function information indicating the individual functions associated with the determined device identification information. Accordingly, the functions of the device to be used (e.g., the image forming apparatus 10) are determined.
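The same-or-similar feature comparison described above can be sketched with a deliberately simple model: appearance features as sets of descriptor labels and Jaccard similarity as the comparison. The patent specifies no particular feature representation or similarity measure, so everything below is a hypothetical illustration of the matching step.

```python
# Hypothetical sketch of matching photographed appearance features against
# the appearance image correspondence information. Features are modeled as
# sets of descriptor labels; a real system would use image features.
APPEARANCE_CORRESPONDENCE = {
    "A": {"white-housing", "front-tray", "control-panel"},
    "B": {"gray-housing", "document-feeder", "control-panel"},
}

def jaccard(a, b):
    """Similarity of two feature sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if a | b else 0.0

def identify_by_appearance(photo_features):
    """Return the device ID whose stored appearance features are most
    similar to the features extracted from the photographed image."""
    return max(APPEARANCE_CORRESPONDENCE,
               key=lambda dev: jaccard(photo_features,
                                       APPEARANCE_CORRESPONDENCE[dev]))
```

The device whose registered appearance data is most similar wins, after which its device identification information leads to the function lookup as before.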
Alternatively, the controller 48 of the terminal device 16 may compare the appearance image data representing the entire appearance or the partial appearance of the device to be used (e.g., the image forming apparatus 10) with each piece of appearance image data included in the appearance image correspondence information, and may determine the device identification information of the device to be used based on the comparison result. The appearance image correspondence information may be stored in the memory 44 of the terminal device 16. In this case, the controller 48 of the terminal device 16 refers to the appearance image correspondence information stored in the memory 44 of the terminal device 16, thereby determining the device identification information of the device to be used. Alternatively, the controller 48 of the terminal device 16 may acquire the appearance image correspondence information from the server 14 and may refer to the appearance image correspondence information to determine the device identification information of the device to be used.
In the case of executing a collaboration feature that uses a plurality of functions of a plurality of devices, the entire appearance or the partial appearance of each of the devices that cooperate with each other is photographed to obtain the device identification information of the devices, and thereby the collaboration feature is determined (identified).
In the case of obtaining device identification information by applying the position information AR technology, for example, position information indicating the position of a device (e.g., the image forming apparatus 10) is obtained by using a Global Positioning System (GPS) function. For example, each device has a GPS function and obtains device position information indicating the position of the device. The terminal device 16 outputs, to the device to be used, information indicating a request to obtain the device position information, and receives, as a response to the request, the device position information of the device from the device. For example, the device position information is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 identifies the device to be used based on the device position information. For example, the memory 30 of the server 14 stores, for each device, position correspondence information indicating the correspondence between the device position information indicating the position of the device and the device identification information of the device. The controller 36 determines, in the position correspondence information, the device identification information associated with the device position information received from the terminal device 16. Accordingly, the device to be used is determined (identified). The determination unit 38 of the server 14 determines, in the device function management information 32, the pieces of function information indicating the individual functions associated with the determined device identification information. Accordingly, the functions of the device to be used (e.g., the image forming apparatus 10) are determined (identified).
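The position correspondence lookup can be sketched as a nearest-neighbor search over registered device positions. The coordinates, the distance metric, and the tolerance below are all hypothetical: the patent only says that device position information is matched against position correspondence information.

```python
import math

# Hypothetical position correspondence information: device ID -> (lat, lon).
POSITION_CORRESPONDENCE = {
    "A": (35.6586, 139.7454),
    "B": (35.6762, 139.6503),
}

def identify_by_position(lat, lon, tolerance_deg=0.01):
    """Return the device ID whose registered position is nearest to the
    reported device position information, within a tolerance; None if no
    registered device is close enough."""
    best, best_d = None, tolerance_deg
    for dev, (dlat, dlon) in POSITION_CORRESPONDENCE.items():
        d = math.hypot(lat - dlat, lon - dlon)
        if d <= best_d:
            best, best_d = dev, d
    return best
```

A tolerance is needed in practice because the GPS fix reported by the device will not exactly equal the registered coordinates; positions far from every registered device identify nothing.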
The controller 48 of the terminal device 16 may determine, in the position correspondence information, the device identification information associated with the position information of the device to be used. The position correspondence information may be stored in the memory 44 of the terminal device 16. In this case, the controller 48 of the terminal device 16 refers to the position correspondence information stored in the memory 44 of the terminal device 16, thereby determining the device identification information of the device to be used. Alternatively, the controller 48 of the terminal device 16 may acquire the position correspondence information from the server 14 and may refer to the position correspondence information to determine the device identification information of the device to be used.
In the case of executing a collaboration feature that uses a plurality of devices, the device position information of the devices that cooperate with each other is obtained, and the device identification information of the devices is determined based on the device position information. Accordingly, the collaboration feature is determined (identified).
Hereinafter, the device system according to the exemplary embodiment will be described in further detail.
The device function management information 32 will now be described in detail with reference to Fig. 6. Fig. 6 shows an example of a device function management table serving as the device function management information 32. In the device function management table, for example, a device ID, information indicating a device name (for example, the type of device), information (function information) indicating one or more functions of the device, and an image ID are associated with one another. The device ID and the device name correspond to examples of device identification information. The image ID is an example of image identification information for identifying an image representing the device (for example, an image showing the appearance of the device, or an image (for example, an icon) schematically representing the device). The device function management table need not include image IDs. For example, the device having the device ID "B" is a multifunction peripheral (MFP, an image forming apparatus having multiple image forming functions) and has a print function, a scan function, and so on. An image ID for identifying an image representing the device is associated with the device. The data of the image representing the device is stored, for example, in the memory 30 of the server 14 or in another apparatus.
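The device function management table of Fig. 6 can be sketched as a simple keyed lookup. Field names and the functions listed for device "A" are assumptions for illustration; only the "B" entry's functions are taken from the description.

```python
# Sketch of the device function management table (device function
# management information 32). Field names are hypothetical.
DEVICE_FUNCTION_TABLE = {
    "B": {"name": "MFP (B)",
          "functions": ["print", "scan", "copy", "fax"],
          "image_id": "img-B"},
    "A": {"name": "PC (A)",
          "functions": ["display", "store"],   # assumed for illustration
          "image_id": "img-A"},
}

def lookup_functions(device_id):
    """Return the function list associated with a device ID, or []."""
    entry = DEVICE_FUNCTION_TABLE.get(device_id)
    return entry["functions"] if entry else []
```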
For example, by using an AR technology, the device ID for identifying the device to be used is obtained. The determination unit 38 of the server 14 refers to the device function management table to determine the device name, the one or more functions of the device, and the image ID associated with the device ID. The device to be used is thereby identified. For example, the information indicating the device name and the data of the image representing the device are transmitted from the server 14 to the terminal device 16 and are then displayed on the UI unit 46 of the terminal device 16. The image representing the device is displayed as an image related to the device. Of course, the image captured by the camera 42 may instead be displayed on the UI unit 46 of the terminal device 16. If the user specifies, on the UI unit 46 of the terminal device 16, the image related to the device (for example, the image captured by the camera 42 or the image schematically representing the device), information about the one or more functions of the device (for example, function information or function description information) may be transmitted from the server 14 to the terminal device 16 and displayed on the UI unit 46 of the terminal device 16.
Next, the collaboration feature management information 34 will be described in detail with reference to Fig. 7. Fig. 7 shows an example of a collaboration feature management table serving as the collaboration feature management information 34. In the collaboration feature management table, for example, a combination of device IDs, information indicating the names (types) of the interoperable target devices, and information (collaboration feature information) indicating one or more collaboration features are associated with one another. For example, the device having the device ID "A" is a personal computer (PC), and the device having the device ID "B" is an MFP. Cooperation between the PC (A) and the MFP (B) realizes, as collaboration features, a scan-and-transfer function and a print function. The scan-and-transfer function is a function of transferring image data generated through scanning by the MFP (B) to the PC (A). The print function is a function of transmitting data (for example, image data or document data) stored in the PC (A) to the MFP (B) and printing the data at the MFP (B).
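Because the collaboration feature management table of Fig. 7 is keyed by a *combination* of device IDs, a natural sketch keys the lookup on an order-independent set of IDs. This is an illustrative assumption; the patent does not fix a representation.

```python
# Sketch of the collaboration feature management table (collaboration
# feature management information 34), keyed by device-ID combination.
COLLABORATION_TABLE = {
    frozenset({"A", "B"}): ["scan-and-transfer", "print"],
}

def lookup_collaboration(*device_ids):
    """Return the collaboration features for a combination of devices."""
    return COLLABORATION_TABLE.get(frozenset(device_ids), [])
```

Using `frozenset` makes `lookup_collaboration("A", "B")` and `lookup_collaboration("B", "A")` equivalent, matching the idea that a combination, not an ordering, selects the features.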
Hereinafter, processing in the case where a device is used alone will be described with reference to Fig. 8. Fig. 8 shows an example of a device used alone. For example, it is assumed that the image forming apparatus 10 is the device to be used alone. The image forming apparatus 10 is, for example, an MFP. The image forming apparatus 10 is a device existing in real space. The terminal device 16 shown in Fig. 8 is also a device existing in real space and is, for example, a mobile terminal device such as a smartphone or a mobile phone.
For example, a marker 50 such as a two-dimensional barcode is attached to the housing of the image forming apparatus 10. In the case of using the marker-based AR technology or the markerless AR technology, the user captures an image of the image forming apparatus 10 to be used with the camera 42 of the terminal device 16 (for example, a smartphone). Accordingly, image data representing the marker 50, or appearance image data representing the appearance of the image forming apparatus 10, is generated. A device display screen 52 is displayed on the display of the UI unit 46 of the terminal device 16, and a device image 54 related to the image forming apparatus 10 is displayed on the device display screen 52. The device image 54 is, for example, an image generated through capturing by the camera 42 (having the original size at the time of capture, or an enlarged or reduced size).
The image data generated through capturing by the camera 42 is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 performs a decoding process on the image data to extract the device identification information of the image forming apparatus 10, and the image forming apparatus 10 is thereby identified. Alternatively, appearance image data representing the appearance of the image forming apparatus 10 may be generated and transmitted from the terminal device 16 to the server 14. In that case, in the server 14, the controller 36 determines the device identification information of the image forming apparatus 10 by referring to the appearance image correspondence information. The image forming apparatus 10 is thereby identified.
The determination unit 38 of the server 14 determines (identifies) the functions of the image forming apparatus 10 by referring to the device function management information 32 (for example, the device function management table shown in Fig. 6). This will be described in detail with reference to Fig. 6. For example, it is assumed that the image forming apparatus 10 is the "MFP (B)". The determination unit 38 determines the functions associated with the MFP (B) in the device function management table shown in Fig. 6. The functions of the MFP (B) are thereby determined. Information about the determined functions is transmitted from the server 14 to the terminal device 16. Of course, the processing for identifying the device and its functions may be performed by the terminal device 16.
On the device display screen 52, instead of the image generated through capturing by the camera 42, a prepared image related to the identified image forming apparatus 10 (not an image obtained by capturing, but a schematic image such as an icon) or an image generated through capturing by an external camera may be displayed as the device image 54.
In the case of using image data obtained by capturing the device, the current appearance of the device (including, for example, scratches, notes, and stickers attached to the device) is reflected in the image, so that the user can more clearly visually distinguish the device from other devices of the same type.
In the case of using a schematic image, for example, the data of the schematic image is transmitted from the server 14 to the terminal device 16. For example, when the image forming apparatus 10 is identified, the determination unit 38 of the server 14 determines the schematic image related to the image forming apparatus 10 by referring to the device function management table (the device function management information 32) shown in Fig. 6. The data of the schematic image is transmitted from the server 14 to the terminal device 16, and the schematic image is displayed as the device image 54 on the device display screen 52. The data of schematic images may be stored in the terminal device 16 in advance. In that case, when the image forming apparatus 10 is identified, the device image 54 stored in the terminal device 16 is displayed on the device display screen 52. The data of schematic images may also be stored in an apparatus other than the server 14 and the terminal device 16.
Furthermore, when the device is identified, information indicating the device name may be transmitted from the server 14 to the terminal device 16, and the device name may be displayed on the device display screen 52 in the terminal device 16. In the example shown in Fig. 8, the image forming apparatus 10 is an MFP, and its name "MFP (B)" is displayed.
After the functions of the image forming apparatus 10 are determined, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display a function display screen 56 and causes information about the functions to be displayed on the function display screen 56, as shown in Fig. 9. As the information about the functions, for example, button images for providing instructions to execute the functions are displayed. The MFP (B) serving as the image forming apparatus 10 has, for example, a print function, a scan function, a copy function, and a fax function, and thus button images for providing instructions to execute these functions are displayed on the function display screen 56. For example, when the user specifies the button image representing the print function with the terminal device 16 and provides an instruction to execute the print function, execution instruction information indicating the instruction to execute the print function is transmitted from the terminal device 16 to the image forming apparatus 10. The execution instruction information includes control data for executing the print function, data such as the image data to which the print function is to be applied, and so on. In response to receiving the execution instruction information, the image forming apparatus 10 executes printing in accordance with the execution instruction information.
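The execution instruction information above bundles control data with the data to be processed. A hypothetical shape for such a message might look as follows; the patent defines no wire format, so every field name here is an assumption.

```python
# Hypothetical builder for "execution instruction information": control
# data identifying the function to execute, plus optional payload data
# (e.g. image data for the print function).
def build_execution_instruction(function_name, payload=None):
    instruction = {
        "function": function_name,      # which function to execute
        "control": {"execute": True},   # placeholder control data
    }
    if payload is not None:
        instruction["data"] = payload   # data the function operates on
    return instruction
```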
Fig. 10 shows another example of the function display screen. The function display screen 58 is a screen displayed on the UI unit 46 of the terminal device 16 in the case where a device is used alone, as shown in Fig. 8. As described above, the device to be used (for example, the image forming apparatus 10) is determined, and the functions of the device to be used are determined. Furthermore, function information indicating the functions associated with the user identification information of the user who uses the target device (that is, the functions available to the user) may be determined. In addition, since the functions of the device to be used are determined, the functions that the device to be used does not have, among a group of functions that could be provided, can also be determined. These pieces of information may be displayed on the function display screen 58.
On the function display screen 58 shown in Fig. 10, a button image 60 representing function A, a button image 62 representing function B, and a button image 64 representing function C are displayed as examples of function information. Function A is a function of the device to be used (for example, the identified image forming apparatus 10) and is a function available to the user. Function B is a function of the device to be used but is not available to the user. By providing function B to the user, the user becomes able to use it. If function B is a paid function, the user becomes able to use it by purchasing it. If function B is a free function, the user becomes able to use it by being provided with it free of charge. Function C is a function that the device to be used does not have, that is, a function incompatible with the device to be used. The controller 48 of the terminal device 16 may change the display format of a button image depending on whether the function represented by the button image is a function of the device to be used. The controller 48 may also change the display format of a button image depending on whether the function represented by the button image is available to the user. For example, the controller 48 may change the color or shape of each button image. In the example shown in Fig. 10, the button images 60, 62, and 64 are displayed in different colors. For example, a button image representing a function that the device to be used has and that is available to the user (for example, the button image 60 representing function A) is displayed in blue. A button image representing a function that the device to be used has but that is not available to the user (for example, the button image 62 representing function B) is displayed in yellow. A button image representing a function that the device to be used does not have (for example, the button image 64 representing function C) is displayed in grey. Alternatively, the controller 48 may change the shapes of the button images 60, 62, and 64, or may change the fonts of the displayed function names. Of course, the display format may be changed in other ways. The user can thereby identify the availability of each function with enhanced visibility.
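The display-format rule for the three button images reduces to a two-condition decision. The sketch below encodes the example colors from the description; the function and parameter names are assumptions.

```python
# Color selection for a function button image, following the example:
# blue  = device has the function and the user may use it (function A)
# yellow = device has the function but the user may not use it (function B)
# grey  = device does not have the function (function C)
def button_color(device_has_function, user_may_use):
    if not device_has_function:
        return "grey"
    return "blue" if user_may_use else "yellow"
```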
For example, if the user specifies the button image 60 representing function A with the terminal device 16 and provides an instruction to execute function A, execution instruction information indicating the instruction to execute function A is transmitted from the terminal device 16 to the target device to be used. The execution instruction information includes control data for executing function A, image data to be subjected to the processing of function A, and so on. In response to receiving the execution instruction information, the target device executes function A in accordance with the execution instruction information. For example, if the target device is the image forming apparatus 10 and if function A is the scan-and-transfer function, the image forming unit 20 of the image forming apparatus 10 executes the scan function to generate scan data (image data). The scan data is then transmitted from the image forming apparatus 10 to a set destination (for example, the terminal device 16).
If the user specifies the button image 62 representing function B with the terminal device 16 and provides an instruction to execute function B, a provision process is executed. If the provision process is executed by the server 14, the terminal device 16 accesses the server 14. Accordingly, a screen (for example, a website) for providing function B is displayed on the UI unit 46 of the terminal device 16 as information enabling the user to use function B. By going through the provision process on this screen, the user becomes able to use function B. For example, the terminal device 16 stores a web browser program. With the web browser, the user can access the server 14 from the terminal device 16. When the user accesses the server 14 with the web browser, a function provision screen (for example, a website) is displayed on the UI unit 46 of the terminal device 16, and the function is provided to the user through the website. Of course, the provision process may be executed by a server other than the server 14. Alternatively, as the information enabling the user to use function B, a use permission request screen (for example, a website) for requesting permission to use function B from an administrator or the like may be displayed on the UI unit 46 of the terminal device 16. If the user requests permission to use function B from the administrator or the like through the use permission request screen, and if the permission is granted, the user becomes able to use function B.
Hereinafter, processing in the case of using a collaboration feature will be described with reference to Fig. 11. Fig. 11 shows an example of interoperable target devices. For example, it is assumed that the image forming apparatus 10 and a projector serving as the device 12 (hereinafter referred to as the projector 12) are used as target devices. The image forming apparatus 10, the projector 12, and the terminal device 16 are devices existing in real space.
For example, the marker 50 such as a two-dimensional barcode is attached to the housing of the image forming apparatus 10, and a marker 66 such as a two-dimensional barcode is attached to the housing of the projector 12. The marker 66 is information obtained by encoding the device identification information of the projector 12. In the case of using the marker-based AR technology or the markerless AR technology, the user captures an image of the image forming apparatus 10 and the projector 12, which are the interoperable target devices, with the camera 42 of the terminal device 16 (for example, a smartphone). In the example shown in Fig. 11, an image of the image forming apparatus 10 and the projector 12 is captured with both devices in the field of view of the camera 42. Accordingly, image data representing the markers 50 and 66 is generated. A device display screen 68 is displayed on the display of the UI unit 46 of the terminal device 16. On the device display screen 68, a device image 70 related to the image forming apparatus 10 and a device image 72 related to the projector 12 are displayed. The device images 70 and 72 are, for example, images generated through capturing by the camera 42 (having the original size at the time of capture, or an enlarged or reduced size).
The image data generated through capturing by the camera 42 is transmitted from the terminal device 16 to the server 14. In the server 14, the controller 36 performs a decoding process on the image data to extract the device identification information of the image forming apparatus 10 and the device identification information of the projector 12, and the image forming apparatus 10 and the projector 12 are thereby identified. Alternatively, appearance image data representing the appearances of the image forming apparatus 10 and the projector 12 may be generated and transmitted from the terminal device 16 to the server 14. In that case, in the server 14, the controller 36 determines the device identification information of the image forming apparatus 10 and the device identification information of the projector 12 by referring to the appearance image correspondence information. The image forming apparatus 10 and the projector 12 are thereby identified.
The determination unit 38 of the server 14 determines (identifies) one or more collaboration features that use the functions of the image forming apparatus 10 and the functions of the projector 12 by referring to the collaboration feature management information 34 (for example, the collaboration feature management table shown in Fig. 7). This will be described in detail with reference to Fig. 7. For example, it is assumed that the image forming apparatus 10 is the MFP (B) and the projector 12 is the projector (C). The determination unit 38 determines the collaboration features associated with the combination of the MFP (B) and the projector (C) in the collaboration feature management table shown in Fig. 7. The collaboration features executed through cooperation between the MFP (B) and the projector (C) are thereby determined. Information about the determined collaboration features is transmitted from the server 14 to the terminal device 16. Of course, the processing for identifying the devices and the collaboration features may be performed by the terminal device 16.
On the device display screen 68, instead of the image generated through capturing by the camera 42, a prepared image related to the identified image forming apparatus 10 (for example, a schematic image such as an icon) or an image generated through capturing by an external camera may be displayed as the device image 70. Likewise, a prepared image related to the identified projector 12, or an image generated through capturing by an external camera, may be displayed as the device image 72. As described above, the data of schematic images may be transmitted from the server 14 to the terminal device 16, may be stored in the terminal device 16 in advance, or may be stored in another apparatus.
When the devices are identified, information indicating the device names may be transmitted from the server 14 to the terminal device 16, and the device names may be displayed on the device display screen 68 in the terminal device 16. In the example shown in Fig. 11, the name "MFP (B)" of the image forming apparatus 10 and the name "projector (C)" of the projector 12 are displayed.
If images of multiple devices are captured, the determination unit 38 of the server 14 may determine the functions of each device by referring to the device function management information 32. In the example shown in Fig. 11, the determination unit 38 may determine the functions of the image forming apparatus 10 and the functions of the projector 12. Information about the determined functions may be transmitted from the server 14 to the terminal device 16.
After the collaboration features are determined, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display a function display screen 74 and causes information about the collaboration features to be displayed on the function display screen 74, as shown in Fig. 12. As the information about the collaboration features, for example, button images for providing instructions to execute the collaboration features are displayed. Cooperation between the MFP (B) and the projector (C) makes it possible to execute a collaboration feature of projecting, with the projector (C), an image generated through scanning by the MFP (B), and a collaboration feature of printing, with the MFP (B), an image projected by the projector (C). Button images for providing instructions to execute these collaboration features are displayed on the function display screen 74. For example, when the user specifies a button image with the terminal device 16 and provides an instruction to execute a collaboration feature, execution instruction information indicating the instruction to execute the collaboration feature is transmitted from the terminal device 16 to the image forming apparatus 10 and the projector 12. In response to receiving the execution instruction information, the image forming apparatus 10 and the projector 12 execute the collaboration feature specified by the user.
The interoperable target devices may be specified by a user operation. For example, as a result of capturing an image of the image forming apparatus 10 and the projector 12 with the camera 42, the device image 70 related to the image forming apparatus 10 and the device image 72 related to the projector 12 are displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 11. The image related to a device may be an image obtained through capturing by the camera 42, or a prepared image related to the identified device (for example, a schematic image such as an icon). When the user specifies the device images 70 and 72 on the device display screen 68, the image forming apparatus 10 and the projector 12 are specified as the interoperable target devices. For example, when the user specifies the device image 70, the marker-based AR technology or the markerless AR technology is applied to the device image 70, and the image forming apparatus 10 is thereby determined (identified). Likewise, when the user specifies the device image 72, the marker-based AR technology or the markerless AR technology is applied to the device image 72, and the projector 12 is thereby determined (identified). Accordingly, the collaboration features to be executed by the image forming apparatus 10 and the projector 12 are determined, and information about the collaboration features is displayed on the UI unit 46 of the terminal device 16.
As another example, the user may touch the device image 70 on the device display screen 68 with, for example, his/her finger and move the finger to the device image 72, thereby specifying the device images 70 and 72 and specifying the image forming apparatus 10 and the projector 12 as the interoperable target devices. The order in which the user touches the device images 70 and 72, or the direction of the finger movement, may be the opposite of the above example. Of course, a screen pointing unit other than a finger, such as a pen, may be used for the movement on the device display screen 68. The user may link the device images 70 and 72 to each other to specify them, thereby specifying the image forming apparatus 10 and the projector 12 as the interoperable target devices. The user may also superimpose the device images 70 and 72 on each other to specify them, thereby specifying the image forming apparatus 10 and the projector 12 as the interoperable target devices. In addition, the interoperable target devices may be specified by drawing a circle around them, or by specifying the device images related to the devices within a predetermined period. In the case of canceling the cooperation, the user may specify the target device to be canceled on the device display screen 68, or may press a cooperation cancel button. If the image of a device that is not a target device of the cooperation is displayed on the device display screen 68, the user may specify that device on the device display screen 68 to exclude it from the interoperable target devices. The device to be canceled may be specified by performing a predetermined operation, such as drawing a cross over it.
Images of the interoperable target devices may be captured separately. For example, the interoperable target devices are identified by performing capturing with the camera 42 multiple times. When capturing with the camera 42 is performed multiple times, the device identification information of the device identified in each capturing operation is stored in the memory of the server 14 or the terminal device 16. For example, an image of the image forming apparatus 10 is captured in a state where the image forming apparatus 10 is in the field of view of the camera 42, and then an image of the projector 12 is captured in a state where the projector 12 is in the field of view of the camera 42. Accordingly, image data representing the image forming apparatus 10 and image data representing the projector 12 are generated. By applying the marker-based AR technology or the markerless AR technology to each piece of image data, the image forming apparatus 10 and the projector 12 are determined (identified), and the collaboration features that use the functions of the image forming apparatus 10 and the projector 12 are determined (identified). The image forming apparatus 10 and the projector 12 serving as the interoperable target devices are not always close to each other within the field of view of the camera 42. The angle of view of the camera 42 may be changed, or the field of view may be enlarged or reduced. If these operations are insufficient, capturing may be performed multiple times to identify the interoperable target devices.
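Accumulating the device identified in each separate capturing operation can be sketched as below. The class and method names are assumptions; the point is that per-shot identification results are stored until the set of cooperation targets is complete.

```python
# Sketch: collecting interoperable target devices over repeated
# capturing operations, one recognized device ID per shot.
class TargetCollector:
    def __init__(self):
        self.device_ids = []

    def add_shot(self, recognized_id):
        """Store the device ID recognized in one capturing operation."""
        if recognized_id and recognized_id not in self.device_ids:
            self.device_ids.append(recognized_id)

    def targets(self):
        """Return the accumulated interoperable target devices."""
        return list(self.device_ids)
```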
As another example, a target device of the cooperation may be set in advance as a basic cooperation device. For example, it is assumed that the image forming apparatus 10 is set in advance as the basic cooperation device. The device identification information of the basic cooperation device may be stored in advance in the memory of the server 14 or the terminal device 16. Alternatively, the user may specify the basic cooperation device with the terminal device 16. If the basic cooperation device is set, the user captures an image of a target device other than the basic cooperation device with the camera 42 of the terminal device 16. Accordingly, the interoperable target devices are determined (identified), and one or more collaboration features that use the functions of the basic cooperation device and the captured device are determined (identified).
In the examples shown in Figs. 11 and 12, each collaboration feature is a function that uses hardware devices. Alternatively, a collaboration feature may be a function that uses functions realized by software (applications). For example, instead of device images, function images (for example, icon images) related to functions realized by software may be displayed on the UI unit 46 of the terminal device 16, and the user may specify multiple function images among them, so that a collaboration feature using the multiple functions related to the specified function images is determined (identified). For example, a collaboration feature may be determined by specifying function images related to functions displayed on the home screen of a smartphone or the desktop of a PC. Of course, if a device image related to a hardware device and a function image related to a function realized by software are displayed on the UI unit 46 of the terminal device 16, and if the user specifies the device image and the function image, a collaboration feature that uses the device related to the device image and the function related to the function image may be identified.
In the examples described above, the marker-based AR technology or the markerless AR technology is used, but the location-information AR technology may be used instead. For example, the terminal device 16 has a GPS function, obtains terminal location information indicating the position of the terminal device 16, and transmits the terminal location information to the server 14. The controller 36 of the server 14 refers to the position correspondence information indicating the correspondence between device location information (indicating the positions of devices) and device identification information, and determines the devices located within a preset range relative to the position of the terminal device 16 as candidate cooperation devices. For example, it is assumed that an MFP, a PC, a printer, and a scanner are located within the preset range relative to the position of the terminal device 16. In this case, the MFP, the PC, the printer, and the scanner are determined as candidate cooperation devices. The device identification information of each candidate cooperation device is transmitted from the server 14 to the terminal device 16 and displayed on the UI unit 46 of the terminal device 16. As the device identification information, images of the candidate cooperation devices may be displayed, or character strings such as device IDs may be displayed. The user specifies the interoperable target devices among the candidate cooperation devices displayed on the UI unit 46. The device identification information of the target devices specified by the user is transmitted from the terminal device 16 to the server 14. In the server 14, one or more collaboration features are determined on the basis of the device identification information of the target devices. Information about the one or more collaboration features is displayed on the UI unit 46 of the terminal device 16. The processing for determining the candidate cooperation devices and the processing for determining the collaboration features may be performed by the terminal device 16.
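The candidate selection step above, finding devices within a preset range of the terminal, can be sketched with a simple distance filter. The positions, the planar distance metric, and all names are assumptions for illustration; real GPS coordinates would call for geodesic distance.

```python
import math

# Hypothetical registered device positions (arbitrary planar units).
DEVICE_POSITIONS = {
    "MFP": (0.0, 0.0),
    "PC": (3.0, 4.0),        # distance 5 from the origin
    "Printer": (50.0, 0.0),  # far away
}

def candidates(terminal_pos, max_distance):
    """Return devices within max_distance of the terminal's position."""
    tx, ty = terminal_pos
    return [dev for dev, (x, y) in DEVICE_POSITIONS.items()
            if math.hypot(x - tx, y - ty) <= max_distance]
```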
If a captured device is not identified even with an AR technology or the like, the device image representing that captured device need not be displayed on the device display screen. The visibility of the identified devices can thereby be increased. For example, if there are identified devices and unidentified devices, and if images of both kinds of devices are captured by the camera 42, the device images representing the unidentified devices are not displayed. The device images representing the identified devices are thus displayed while being distinguished from the device images representing the unidentified devices, so that the visibility of the identified devices can be increased. Alternatively, the device images representing the identified devices may be displayed in a highlighted manner. For example, a device image representing an identified device may be displayed in a specific color, may be displayed with a highlighted edge, may be displayed in an enlarged form, may be displayed three-dimensionally, or may be displayed while blinking. The visibility of the identified devices can thereby be increased.
Hereinafter, processing for executing a function of a device will be described, taking processing for executing a collaboration feature as an example. In this case, a connection request is transmitted from the terminal device 16 to the target devices that cooperate with each other, and connections between the terminal device 16 and the target devices are established. This connection processing will be described below with reference to Fig. 13. Fig. 13 is a sequence diagram showing the processing.
First, the user provides, by using the terminal device 16, an instruction to start an application (program) for executing a device function. In response to the instruction, the controller 48 of the terminal device 16 starts the application (S01). The application may be stored in the memory 44 of the terminal device 16 in advance, or may be downloaded from the server 14 or the like.
Subsequently, the application identifies the target devices that are to cooperate with each other, based on the marker-based AR technology, the markerless AR technology, or the position-information AR technology (S02). Of course, the target devices may be identified by using a technology other than the AR technologies. In the case of applying the marker-based AR technology or the markerless AR technology, the user photographs the target devices by using the camera 42 of the terminal device 16. For example, in the case of using the image forming apparatus 10 (MFP (B)) and the projector 12 (projector (C)) as the target devices, the user photographs the image forming apparatus 10 and the projector 12 with the camera 42. Accordingly, the device identification information of the image forming apparatus 10 and the projector 12 is obtained, and the image forming apparatus 10 and the projector 12 are identified as the target devices. In the case of applying the position-information AR technology, the position information of the image forming apparatus 10 and the projector 12 is obtained, the device identification information of the image forming apparatus 10 and the projector 12 is determined based on the position information, and the image forming apparatus 10 and the projector 12 are identified.
For example, if the user provides an instruction to display collaboration functions, collaboration functions that use the functions of the plural identified devices are identified. Information about the identified collaboration functions is displayed on the UI unit 46 of the terminal device 16 (S03). The process of identifying the collaboration functions may be performed by the server 14 or the terminal device 16.
Subsequently, after the user specifies, by using the terminal device 16, a target collaboration function to be executed, the terminal device 16 transmits information indicating a connection request to the target devices that execute the collaboration function (for example, the image forming apparatus 10 and the projector 12) (S04). For example, if address information indicating the addresses of the target devices that are to cooperate with each other is stored in the server 14, the terminal device 16 obtains the address information from the server 14. If the address information is included in the device identification information, the terminal device 16 may obtain the address information from the device identification information of the target devices. Alternatively, the address information of the target devices may be stored in the terminal device 16. Of course, the terminal device 16 may obtain the address information of the target devices by using another method. By using the address information of the target devices (for example, the image forming apparatus 10 and the projector 12), the terminal device 16 transmits the information indicating the connection request to the target devices (for example, the image forming apparatus 10 and the projector 12).
The image forming apparatus 10 and the projector 12 that have received the information indicating the connection request permit or do not permit connection to the terminal device 16 (S05). For example, if the image forming apparatus 10 and the projector 12 are devices that are not permitted to make a connection, or if the number of devices requesting connection exceeds an upper limit, connection is not permitted. If connection from the terminal device 16 is permitted, an operation of changing setting information unique to the image forming apparatus 10 and the projector 12 may be prohibited so that the setting information is not changed by the terminal device 16. For example, a change of the color parameters of the image forming apparatus 10 or of the set time for shifting to a power saving mode may be prohibited. Accordingly, the security of the target devices that cooperate with each other can be increased. Alternatively, in the case of causing devices to cooperate with each other, a change of setting information may be limited compared with the case where each device is used alone without cooperating with another device. For example, fewer setting items are permitted to be changed than in the case where a device is used alone. Alternatively, viewing the personal information of other users (for example, operation histories) may be prohibited. Accordingly, the security of the users' personal information can be increased.
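The permit-or-deny decision of step S05, including the locking of device-specific settings on a permitted connection, can be sketched as follows. The dictionary keys and the particular locked settings are illustrative assumptions, not the patent's actual data model.

```python
def decide_connection(device, requester_id):
    """Sketch of the S05 decision: permit or deny a connection request.

    On permit, device-specific setting items are marked as locked so
    that they cannot be changed through the connecting terminal.
    """
    if not device["connections_allowed"]:
        return False  # this device does not accept connections at all
    if device["active_connections"] >= device["max_connections"]:
        return False  # upper limit of connecting devices exceeded
    # Protect device-specific settings from remote change (illustrative).
    device["locked_settings"] = {"color_parameters", "power_save_timer"}
    device["active_connections"] += 1
    return True

mfp = {"connections_allowed": True, "active_connections": 0, "max_connections": 2}
print(decide_connection(mfp, "terminal-16"))  # → True
```

A real device would also check authentication and apply the weaker "restricted change" policy described above (allowing some setting changes but fewer than in stand-alone use) rather than locking items outright.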
Information indicating the result of permitting or not permitting connection is transmitted from the image forming apparatus 10 and the projector 12 to the terminal device 16 (S06). If connection to the image forming apparatus 10 and the projector 12 is permitted, communication is established between the terminal device 16 and each of the image forming apparatus 10 and the projector 12.
Subsequently, the user provides an instruction to execute the collaboration function by using the terminal device 16 (S07). In response to the instruction, execution instruction information indicating an instruction to execute the collaboration function is transmitted from the terminal device 16 to the image forming apparatus 10 and the projector 12 (S08). The execution instruction information transmitted to the image forming apparatus 10 includes information indicating the process to be executed in the image forming apparatus 10 (for example, job information), and the execution instruction information transmitted to the projector 12 includes information indicating the process to be executed in the projector 12 (for example, job information).
In response to the execution instruction information, the image forming apparatus 10 and the projector 12 each execute their function in accordance with the execution instruction information (S09). For example, if the collaboration function includes a process of transmitting/receiving data between the image forming apparatus 10 and the projector 12, as in the function of transmitting scan data from the image forming apparatus 10 (MFP (B)) to the projector 12 (projector (C)) and projecting the data by the projector 12, communication is established between the image forming apparatus 10 and the projector 12. In this case, for example, the execution instruction information transmitted to the image forming apparatus 10 includes the address information of the projector 12, and the execution instruction information transmitted to the projector 12 includes the address information of the image forming apparatus 10. By using these pieces of address information, communication is established between the image forming apparatus 10 and the projector 12.
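The construction of the per-device execution instruction information for a data-transfer collaboration function, with each device receiving its own job plus the peer's address, can be sketched as follows. The field names (`job`, `peer_address`) are assumptions made for the example.

```python
def build_execution_instructions(collab, devices):
    """Build per-device execution instruction info (S08) for a two-device
    collaboration that transfers data between the devices: each device
    gets its own job and the address of its peer."""
    a, b = devices
    return {
        a["id"]: {"job": collab["jobs"][a["id"]], "peer_address": b["address"]},
        b["id"]: {"job": collab["jobs"][b["id"]], "peer_address": a["address"]},
    }

mfp = {"id": "MFP(B)", "address": "192.0.2.10"}
projector = {"id": "Projector(C)", "address": "192.0.2.20"}
collab = {"jobs": {"MFP(B)": "scan", "Projector(C)": "project"}}
instructions = build_execution_instructions(collab, (mfp, projector))
print(instructions["MFP(B)"]["peer_address"])  # → 192.0.2.20
```

The cross-addressing is what lets the two devices open a direct channel for the scan data without routing it through the terminal device.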
After the execution of the collaboration function is completed, information indicating that the execution of the collaboration function is completed is transmitted from the image forming apparatus 10 and the projector 12 to the terminal device 16 (S10). The information indicating that the execution of the collaboration function is completed is displayed on the UI unit 46 of the terminal device 16 (S11). If the information indicating that the execution of the collaboration function is completed is not displayed even after a predetermined period has elapsed from the time point when the execution instruction was provided, the controller 48 of the terminal device 16 may cause the UI unit 46 to display information indicating an error, and may again transmit the execution instruction information or the information indicating the connection request to the image forming apparatus 10 and the projector 12.
Subsequently, the user determines whether or not to cancel the cooperation state of the image forming apparatus 10 and the projector 12 (S12), and a process is executed in accordance with the determination result (S13). In the case of cancelling the cooperation state, the user provides a cancellation instruction by using the terminal device 16. Accordingly, the communication between the terminal device 16 and each of the image forming apparatus 10 and the projector 12 stops. In addition, the communication between the image forming apparatus 10 and the projector 12 stops. In the case of not cancelling the cooperation state, an execution instruction may be provided continuously.
Furthermore, the number of target devices that are to cooperate with each other may be increased. For example, the device identification information of a third device may be obtained, and a collaboration function to be executed through cooperation among the three devices including the image forming apparatus 10 and the projector 12 may be determined. Information indicating that the image forming apparatus 10 and the projector 12 have already been identified as the target devices that are to cooperate with each other is stored in the server 14 or the terminal device 16.
The device identification information indicating the target devices that cooperate with each other and the collaboration function information indicating the executed collaboration function may be stored in the terminal device 16 or the server 14. For example, user account information (user identification information) of the user who uses the terminal device 16 may be obtained, and history information indicating a correspondence among the user account information, the device identification information indicating the target devices that cooperate with each other, and the collaboration function information indicating the executed collaboration function may be created and stored in the terminal device 16 or the server 14. The history information may be created by the terminal device 16 or the server 14. With reference to the history information, the collaboration function that has been executed and the devices used for the collaboration function are determined.
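One entry of the history information described above can be sketched as a record tying the user account to the cooperating devices and the executed collaboration function. The schema, including the timestamp field, is an assumption for illustration; the patent specifies only the correspondence, not a storage format.

```python
from datetime import datetime, timezone

def make_history_record(user_account, device_ids, collaboration_function):
    """Sketch of one history-information entry linking a user account,
    the cooperating target devices, and the executed collaboration
    function."""
    return {
        "user": user_account,
        "devices": list(device_ids),
        "collaboration_function": collaboration_function,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

record = make_history_record("user-001", ["MFP(B)", "Projector(C)"], "scan-and-project")
print(record["devices"])  # → ['MFP(B)', 'Projector(C)']
```

A store of such records supports both lookups described in the text: which collaboration function a user executed, and which users have used a given device.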
As the history information, the target devices that cooperate with each other (for example, the image forming apparatus 10 and the projector 12) may store the user account information of the users who have requested connection and terminal identification information indicating the terminal devices 16 that have requested connection. With reference to this history information, the users who have used a device are determined. The users are determined by using the history information, for example, in the case of determining the user who was using a device when the device was broken, or in the case of performing a charging process for consumables or the like. The history information may be stored in the server 14 or the terminal device 16, or may be stored in another apparatus.
For example, the user account information is stored in the memory 44 of the terminal device 16 in advance. The controller 48 of the terminal device 16, serving as an example of a user identification unit, reads the user account information of the user from the memory 44 and identifies the user who uses the terminal device 16. If the user account information of plural users is stored in the memory 44, the user specifies his/her user account information by using the terminal device 16. Accordingly, the user account information of the user is read and the user is identified. Alternatively, the controller 48 of the terminal device 16 may identify the user by reading the user account information of the user who has logged in to the terminal device 16. Alternatively, if only one piece of user account information is stored in the same terminal device 16, the controller 48 of the terminal device 16 may identify the user by reading that piece of user account information. If a user account has not been set and user account information has not been created, initial setting is performed to create the user account information.
Usage histories of collaboration functions may be managed for each user, and information indicating the collaboration functions previously used by the user represented by the read user account information may be displayed on the UI unit 46 of the terminal device 16. The information indicating the usage histories may be stored in the terminal device 16 or the server 14. In addition, information indicating a collaboration function that is used at a predetermined frequency or more may be displayed. With such a shortcut function provided, user operations regarding collaboration functions can be simplified.
In the case of executing a single-device function, information indicating an instruction to execute the single-device function is transmitted from the terminal device 16 to the device that executes the single-device function. The device executes the single-device function in accordance with the instruction.
In the examples described above, a collaboration function is executed by plural devices. However, depending on the combination of devices, a collaboration function is not always executable. Furthermore, depending on the combination of plural functions (for example, a combination of functions implemented by software, or a combination of a function implemented by software and a function of a hardware device), a collaboration function is not always executable. This will be described in detail below.
Fig. 14A shows an example of a combination of devices that are not capable of executing a collaboration function. For example, assume that the MFP (B) and a hair dryer (D) have been identified as devices. As shown in Fig. 14A, the device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and the device images 70 and 76 related to the identified devices (the MFP (B) and the hair dryer (D)) are displayed on the device display screen 68. If there is no collaboration function that is executable by the MFP (B) and the hair dryer (D), and if the MFP (B) and the hair dryer (D) are designated as the target devices that are to cooperate with each other, information about a collaboration function is not displayed, and a message frame 78 is displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 14B. The message frame 78 displays a message indicating that a collaboration function cannot be executed by the MFP (B) and the hair dryer (D).
The above process will be described in further detail. When the MFP (B) and the hair dryer (D) are identified and designated as the target devices that are to cooperate with each other, the determination unit 38 of the server 14 determines (identifies) a collaboration function that uses the MFP (B) and the hair dryer (D) by referring to the collaboration function management information 34 described above (for example, the collaboration function management table shown in Fig. 7). If a collaboration function that uses the MFP (B) and the hair dryer (D) is registered in the collaboration function management table, the determination unit 38 determines the collaboration function. On the other hand, if a collaboration function that uses the MFP (B) and the hair dryer (D) is not registered in the collaboration function management table, the determination unit 38 determines that there is no collaboration function that uses the MFP (B) and the hair dryer (D). In this case, the controller 36 of the server 14 outputs a message indicating that the combination of the MFP (B) and the hair dryer (D) is not capable of executing a collaboration function. This message is displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 14B.
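The table lookup performed by the determination unit 38 can be sketched as follows. Keying the table on an unordered device pair and the particular function names are assumptions made for the example; Fig. 7's actual table layout is not reproduced here.

```python
# Hypothetical collaboration function management table: an unordered
# device pair maps to the collaboration functions it can execute.
COLLABORATION_TABLE = {
    frozenset({"MFP(B)", "Projector(C)"}): ["scan-and-project"],
    frozenset({"PC(A)", "MFP(B)"}): ["scan-and-save"],
}

def determine_collaboration_functions(device_a, device_b):
    """Return the registered collaboration functions for the pair, or
    None when the combination cannot execute any (the Fig. 14B
    message case)."""
    return COLLABORATION_TABLE.get(frozenset({device_a, device_b}))

print(determine_collaboration_functions("MFP(B)", "Projector(C)"))  # → ['scan-and-project']
print(determine_collaboration_functions("MFP(B)", "HairDryer(D)"))  # → None
```

A `None` result is what triggers the "cannot execute a collaboration function" message; a non-empty list is what populates the collaboration function display in step S03.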
Even if there is no available collaboration function as in the above case, a collaboration function may become available in accordance with the operation state of a device, the environment in which a device is installed (surrounding environment), or a change (update) of the functions of a device. In the above example, if condensation occurs in the environment where the MFP (B) is installed, the hair dryer (D) is used to remove or prevent the condensation. In this case, the collaboration function that uses the MFP (B) and the hair dryer (D) is available, and thus information indicating the collaboration function is displayed on the UI unit 46 of the terminal device 16. For example, the controller 36 of the server 14 monitors the operation state of each device, the environment in which each device is installed (surrounding environment), the update state of the functions of each device, and so forth, and determines the availability of collaboration functions based on the monitoring result. In the case of the combination of the MFP (B) and the hair dryer (D), if the surrounding environment of the MFP (B) satisfies a specific condition (for example, if condensation occurs in the surrounding environment of the MFP (B)), the controller 36 determines that the collaboration function is available and determines (identifies) the collaboration function that uses the hair dryer (D). The same applies to the operation state of a device; that is, if an identified or designated group of devices is in a specific operation state, the controller 36 determines that a collaboration function that uses the group of devices is available. The same also applies to the case where the functions of a device are updated and a collaboration function is made available by the updated functions.
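The environment-dependent availability described here can be sketched independently as a table of conditional registrations, where each entry becomes available only when the monitored condition holds. The pairing, condition name, and function name are illustrative assumptions.

```python
# Hypothetical conditional registrations: (device pair, required
# environmental condition) -> collaboration function that becomes
# available only while the condition is observed.
CONDITIONAL_FUNCTIONS = {
    (frozenset({"MFP(B)", "HairDryer(D)"}), "condensation"): "dry-surroundings",
}

def determine_with_monitoring(device_a, device_b, observed_conditions):
    """Return collaboration functions for the pair whose required
    condition is among the currently observed conditions."""
    pair = frozenset({device_a, device_b})
    return [
        func for (p, cond), func in CONDITIONAL_FUNCTIONS.items()
        if p == pair and cond in observed_conditions
    ]

print(determine_with_monitoring("MFP(B)", "HairDryer(D)", {"condensation"}))
# → ['dry-surroundings']
```

Under this sketch, the monitoring loop of the controller 36 simply re-evaluates the lookup whenever the observed conditions change, so the displayed collaboration functions track the surrounding environment.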
In addition, there are cases where a collaboration function is not executable by a combination of plural functions implemented by plural software units, and there are cases where a collaboration function is executable by a combination of a function implemented by a software unit and a hardware device.
Guidance Process
In the exemplary embodiment, for example, when an image related to a device is specified, guidance is presented, the guidance indicating a hardware device or a function implemented by software that is capable of executing a collaboration function together with the device. The same applies to a collaboration function that uses software. For example, when an image related to a function implemented by software is specified, guidance is presented, the guidance indicating a hardware device or a function implemented by software that is capable of executing a collaboration function together with the function. Hereinafter, examples of the guidance process according to the exemplary embodiment will be described.
Example 1
A guidance process according to Example 1 will be described with reference to Fig. 15. Fig. 15 shows an example of the device display screen according to Example 1. For example, assume that the MFP (B), the projector (C), and the hair dryer (D) have been identified as devices. The device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and the device images 70, 72, and 76 related to the identified devices (the MFP (B), the projector (C), and the hair dryer (D)) are displayed on the device display screen 68.
In this case, for example, assume that the user selects the MFP (B) and specifies the device image 70 related to the MFP (B). The MFP (B) corresponds to a first device, and the device image 70 related to the MFP (B) corresponds to a first image related to the first device. In response to the user specifying the MFP (B) as the first device, the determination unit 38 of the server 14 determines a second device that is capable of executing a collaboration function together with the MFP (B) serving as the first device, by referring to the collaboration function management information 34 (for example, the collaboration function management table shown in Fig. 7). For example, assume that the combination of the MFP (B) and the projector (C) is capable of executing a collaboration function and that the combination of the MFP (B) and the hair dryer (D) is not capable of executing a collaboration function. That is, assume that a collaboration function that uses the MFP (B) and the projector (C) is registered in the collaboration function management table and that a collaboration function that uses the MFP (B) and the hair dryer (D) is not registered in the collaboration function management table. In this case, the projector (C) is determined to be the second device, and the controller 36 of the server 14 performs control to present guidance indicating the projector (C) as the second device. Specifically, under the control of the controller 36, the device identification information indicating the projector (C) is transmitted from the server 14 to the terminal device 16. The controller 48 of the terminal device 16 presents the guidance indicating the projector (C) as the second device. For example, as shown in Fig. 15, the controller 48 of the terminal device 16 causes an arrow 80 indicating the projector (C) as a cooperation partner device to be displayed on the device display screen 68. The arrow 80 is an image that links the device image 70 (the first image related to the MFP (B) serving as the first device) to the device image 72 (the second image related to the projector (C) serving as the second device). Of course, the controller 48 may present the guidance indicating the projector (C) as the second device by using a method other than an arrow. For example, the controller 48 may present the guidance indicating the second device by outputting sound, may cause a mark superimposed on the second image (for example, the device image 72) related to the second device to be displayed on the device display screen 68, may cause the second image related to the second device to be displayed on the device display screen 68 such that the second image is distinguished from the other images, or may cause a character string indicating the cooperation partner to be displayed on the device display screen 68.
In the above case where the user specifies the MFP (B) as the first device, the projector (C), which is a second device capable of executing a collaboration function together with the MFP (B), is proposed as a candidate cooperation partner device. Accordingly, compared with the case where such a candidate is not recommended, the user convenience of specifying the devices necessary for a collaboration function can be increased.
If the user specifies a function image (an image corresponding to the first image) related to a first function implemented by software, a guidance process similar to the above-described guidance process is also executed. That is, if the user specifies a function image as the first image, guidance is presented, the guidance indicating a second function that is capable of executing a collaboration function together with the first function related to the function image. For example, guidance indicating a second image related to the second function may be presented, or guidance indicating the second function may be presented by using sound or a character string. The second function may be a function implemented by software or a function of a hardware device. Of course, if the user specifies a device image related to a first device, guidance may be presented, the guidance indicating a second function that is implemented by software and that is capable of executing a collaboration function together with the first device.
Since the hair dryer (D) is a device that is not capable of executing a collaboration function together with the MFP (B), guidance indicating the hair dryer (D) as a cooperation partner is not presented.
Depending on the environment in which the MFP (B) serving as the first device is installed (surrounding environment), the operation state of the MFP (B) (for example, the color adjustment amount, the amount of paper, whether the MFP is in use, or the end time of a process), a change (update) of the functions of the MFP (B), a change (update) of the functions of the hair dryer (D), or the like, guidance indicating the hair dryer (D) as the second device may be presented. For example, if condensation occurs in the surrounding environment of the MFP (B) serving as the first device, the determination unit 38 of the server 14 determines the hair dryer (D) to be the second device necessary for removing the condensation. In this case, as in the above case of the projector (C), guidance indicating the hair dryer (D) as a cooperation partner is presented. For example, an arrow that links the device image 70 (the first image related to the MFP (B) serving as the first device) to the device image 76 (the second image related to the hair dryer (D) serving as the second device) is displayed, or guidance indicating the hair dryer (D) is presented by using sound.
A device or object related to an image displayed on the device display screen 68 is not necessarily identified. For example, there may be a case where the determination unit 38 of the server 14 does not identify the hair dryer (D). A device or object that has not been identified is excluded from the candidate cooperation partners (second devices). An image related to an unidentified device or object may or may not be displayed on the device display screen 68.
Example 2
A guidance process according to Example 2 will be described with reference to Fig. 16. Fig. 16 shows an example of the device display screen according to Example 2. In Example 2, plural devices correspond to the second device (cooperation partner device). For example, assume that the MFP (B), the projector (C), the hair dryer (D), and the camera (E) have been identified as devices and that a foliage plant (F) has been identified as a foliage plant. Of course, an object that is not a device is not necessarily identified.
The device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and the device images 70, 72, 76, and 82 related to the identified devices (the MFP (B), the projector (C), the hair dryer (D), and the camera (E)) and the image 84 related to the foliage plant (F) are displayed on the device display screen 68.
In this case, for example, assume that the user selects the MFP (B) as the first device and specifies the device image 70 related to the MFP (B) as the first image. Also assume that the projector (C) and the camera (E) are determined to be second devices that are capable of executing a collaboration function together with the MFP (B) serving as the first device. In this case, guidance indicating the projector (C) and the camera (E) as the second devices is presented. The pieces of guidance are not necessarily presented at the same time. In the example shown in Fig. 16, as in Example 1, the arrow 80 that links the device image 70 related to the MFP (B) to the device image 72 related to the projector (C) is displayed as guidance. In addition, an arrow 86 that links the device image 70 related to the MFP (B) to the device image 82 related to the camera (E) is displayed as guidance.
A priority may be associated with each collaboration function. In the collaboration function management information 34 (for example, the collaboration function management table shown in Fig. 7), information indicating a priority is associated with each collaboration function. In the case of presenting guidance indicating plural second devices, information indicating the priority of each collaboration function is transmitted from the server 14 to the terminal device 16, and the priorities are displayed on the device display screen 68. For example, if the priority of a first collaboration function that uses the MFP (B) and the projector (C) is higher than that of a second collaboration function that uses the MFP (B) and the camera (E), the controller 48 of the terminal device 16 causes information indicating that the priority of the projector (C) used in the first collaboration function is higher than that of the camera (E) used in the second collaboration function to be displayed on the device display screen 68. The controller 48 may cause character strings indicating the priorities to be displayed on the device display screen 68, may cause the arrows 80 and 86 to be displayed in different colors on the device display screen 68, may cause the device images 72 and 82 to be displayed in different display forms on the device display screen 68, or may cause the arrow 80 for the device image 72 of higher priority to be displayed on the device display screen 68 while the arrow 86 for the device image 82 of lower priority is not displayed on the device display screen 68.
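The priority-based ordering of candidate cooperation partners can be sketched as follows. The priority values and the list-of-pairs input format are assumptions for the example; in the described system the priorities would come from the collaboration function management information 34.

```python
def order_partner_guidance(candidates):
    """Order candidate second devices by the priority of their
    collaboration function with the first device, highest priority
    first, so that guidance can be rendered in that order."""
    return [dev for dev, _prio in sorted(candidates, key=lambda c: -c[1])]

# Hypothetical priorities: projector collaboration outranks camera collaboration.
candidates = [("Camera(E)", 1), ("Projector(C)", 2)]
print(order_partner_guidance(candidates))  # → ['Projector(C)', 'Camera(E)']
```

The same ordering can drive any of the display variants described above: which arrow is drawn first, which color each arrow gets, or which lower-priority arrows are suppressed entirely.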
Alternatively, instead of displaying arrows, the controller 48 of the terminal device 16 may display character strings indicating the second devices in a specific region of the device display screen 68. For example, the character strings are displayed in a region where no device images are displayed. This prevents the situation where arrows make it difficult to see the information displayed on the screen.
In the above case where the user specifies the MFP (B) as the first device, the projector (C) and the camera (E), which are second devices capable of executing a collaboration function together with the MFP (B), are proposed as candidate cooperation partner devices.
If a candidate cooperation partner device (second device) is not displayed on the device display screen 68, position information indicating the position where the second device is installed, or information indicating guidance to the position where the second device is installed, is displayed on the UI unit 46 of the terminal device 16. For example, the controller 36 of the server 14 obtains the position information of the second device by using a GPS function or the like, and creates, based on the obtained position information and the position information of the terminal device 16, information indicating guidance to the position of the second device relative to the position of the terminal device 16. The information indicating the guidance is transmitted from the server 14 to the terminal device 16 and is displayed on the UI unit 46 of the terminal device 16.
As in Example 1, the process according to Example 2 is applicable to the case of using function images related to functions. For example, if the user specifies a function image as the first image, guidance may be presented, the guidance indicating plural second functions that are capable of executing a collaboration function together with the first function related to the function image. Of course, if the user specifies a device image related to a first device, guidance may be presented, the guidance indicating plural functions that are capable of executing a collaboration function together with the first device.
Example 3
A guidance process according to Example 3 will be described with reference to Figs. 17 to 20. Figs. 17 to 20 each show an example of the device display screen according to Example 3. In Example 3, after the user specifies a first device and then specifies a second device, control is performed to present guidance indicating a third device that is capable of executing a collaboration function together with the first and second devices. The device proposed as the third device may vary in accordance with the order in which the first and second devices are specified.
For example, assume that the PC (A), the MFP (B), the projector (C), and the camera (E) have been identified as devices, and that the device images 70, 72, 82, and 88 related to the identified devices (the PC (A), the MFP (B), the projector (C), and the camera (E)) and the image 84 are displayed on the device display screen 68, as shown in Fig. 17.
In this case, for example, if the user specifies the device image 70 related to the MFP (B), the determination unit 38 of the server 14 identifies the MFP (B) as the first device, and identifies the PC (A), the projector (C), and the camera (E), which are capable of executing a collaboration function together with the MFP (B), as second devices (candidate cooperation partner devices). As shown in Fig. 17, for example, the arrows 80, 86, and 90 are displayed as guidance indicating the second devices. The arrow 90 is an image that links the device image 70 related to the MFP (B) to the device image 88 related to the PC (A).
Subsequently, assume that the user selects the projector (C) as a cooperation partner device from among the group of second devices and specifies the device image 72 related to the projector (C). In this case, the determination unit 38 of the server 14 determines, by referring to the collaboration function management information 34, a third device that is capable of executing a collaboration function together with the MFP (B) serving as the first device and the projector (C) serving as the second device. In the collaboration function management table shown in Fig. 7, collaboration functions executable through cooperation between two devices are registered. However, collaboration functions executable through cooperation among three or more devices may of course be registered. For example, assume that the PC (A) is determined to be the third device. In this case, guidance indicating the PC (A) as the third device is presented, as shown in Fig. 18. For example, an arrow 92 that links the device image 72 related to the projector (C) serving as the second device to the device image 88 related to the PC (A) serving as the third device is displayed as guidance.
For example, the user may specify the projector (C) as the second device by specifying the device image 72 related to the projector (C), by performing an operation of linking the device image 70 related to the MFP (B) and the device image 72 related to the projector (C) to each other, by superimposing the device image 72 on the device image 70, or by placing a pointer on the device image 70 and then moving the pointer to the device image 72.
The order in which the devices (device images) are specified corresponds to the order in which the functions of the devices are used or the order in which data moves between the devices. The operation of specifying the devices (for example, the operation of linking the images or superimposing one image on another) serves as an operation of specifying the order in which the functions are used or the order in which the data moves. In the example shown in Fig. 18, the MFP (B) serving as the first device is used first, and the projector (C) serving as the second device is used second. Guidance is presented that indicates a third device that is capable of executing a collaboration feature together with the first and second devices and that is used third in the collaboration feature. That is, guidance is presented that indicates a third device that is used third in a collaboration feature in which the MFP (B) is used first and the projector (C) is used second. In the example shown in Fig. 18, the PC (A) is the third device. In Example 3, collaboration features that are executable through cooperation among multiple devices are registered in the collaboration feature management information 34, and the order in which the devices are used is also registered. The determination unit 38 of the server 14 determines the third device by referring to the collaboration feature management information 34.
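The order-dependent determination of the third device described above can be sketched as a lookup keyed on the ordered pair of already-specified devices. The following is a minimal illustrative sketch; the table contents, function name, and device labels are assumptions standing in for the collaboration feature management information 34, not the actual implementation.

```python
# Hypothetical stand-in for the collaboration feature management
# information 34: for each ordered pair (first device, second device),
# the devices that can be used third in a collaboration feature.
COLLABORATION_TABLE = {
    ("MFP(B)", "projector(C)"): ["PC(A)"],      # Fig. 18 scenario
    ("projector(C)", "MFP(B)"): ["camera(E)"],  # Fig. 20 scenario
}

def third_device_candidates(first: str, second: str) -> list:
    """Return candidate third devices for the given specification order."""
    return COLLABORATION_TABLE.get((first, second), [])

# The recommended third device changes with the order of specification:
print(third_device_candidates("MFP(B)", "projector(C)"))  # ['PC(A)']
print(third_device_candidates("projector(C)", "MFP(B)"))  # ['camera(E)']
```

Because the key is the ordered tuple rather than the unordered set of devices, swapping the specification order yields a different recommendation, mirroring the difference between Figs. 18 and 20.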
Fig. 19 shows another example. As in Fig. 17, the device images 70, 72, 82, and 88 and the image 84 are displayed on the device display screen 68.
In this case, for example, if the user specifies the device image 72 related to the projector (C), the projector (C) is identified as the first device, and the determination unit 38 of the server 14 identifies the PC (A), the MFP (B), and the camera (E), which are capable of executing a collaboration feature together with the projector (C), as second devices (candidate cooperative partner devices). As shown in Fig. 19, for example, arrows 94, 96, and 98 are displayed as guidance indicating the second devices. The arrow 94 is an image that links the device image 72 related to the projector (C) serving as the first device and the device image 70 related to the MFP (B) serving as a second device to each other. The arrow 96 is an image that links the device image 72 and the device image 82 related to the camera (E) serving as a second device to each other. The arrow 98 is an image that links the device image 72 and the device image 88 related to the PC (A) serving as a second device to each other.
Then, it is assumed that the user selects the MFP (B) as the cooperative partner device from the group of second devices and specifies the device image 70 related to the MFP (B). In this case, the determination unit 38 of the server 14 refers to the collaboration feature management information 34 to determine a third device that is capable of executing a collaboration feature together with the projector (C) serving as the first device and the MFP (B) serving as the second device. For example, it is assumed that the camera (E) is determined as the third device. In this case, guidance indicating the camera (E) as the third device is presented, as shown in Fig. 20. For example, an arrow 100 that links the device image 70 related to the MFP (B) serving as the second device and the device image 82 related to the camera (E) serving as the third device to each other is displayed as the guidance. In this case, guidance is presented that indicates, as the third device, the device (for example, the camera (E)) that is used third in a collaboration feature in which the projector (C) is used first and the MFP (B) is used second.
For example, the user may specify the MFP (B) as the second device by specifying the device image 70 related to the MFP (B), by performing an operation of linking the device image 72 related to the projector (C) and the device image 70 related to the MFP (B) to each other, by superimposing the device image 72 on the device image 70, or by placing a pointer on the device image 72 and then moving the pointer to the device image 70.
In the manner described above, guidance is presented that indicates a third device capable of executing a collaboration feature together with the first and second devices. The device presented (recommended) as the third device changes according to the order in which the first and second devices are specified. The order in which the devices are specified corresponds to the order in which the functions of the devices are used or the order in which data moves between the devices. The operation of specifying the devices serves as an operation of specifying the order in which the functions are used or the order in which the data moves. The device used next in the collaboration feature, or the device serving as the destination of the data, changes according to the order in which the devices are specified. Therefore, in Example 3, the guidance indicating the third device to be used in the collaboration feature is presented in accordance with this change.
As in Example 1, the processing according to Example 3 is also applicable to the case in which function images related to functions are used. For example, if the user specifies a function image related to a first function, guidance is presented that indicates a function image related to a second function capable of executing a collaboration feature together with the first function. If the user specifies the function image related to the second function, guidance may be presented that indicates a function image related to a third function capable of executing a collaboration feature together with the first and second functions. In this case, the function presented as the third function changes according to the order in which the first and second functions are specified. Also in Example 3, a collaboration feature may use both functions of hardware devices and functions implemented by software.
Example 4
A guidance process according to Example 4 will be described with reference to Figs. 21 and 22, each of which shows an example of the device display screen according to Example 4. In Example 4, if a device that cannot execute a collaboration feature together with the first device is specified, guidance is presented that indicates a second device (cooperative partner device) capable of executing a collaboration feature together with the first device. For example, it is assumed that an MFP (B), a projector (C), and a hair dryer (D) are identified as devices.
As shown in Fig. 21, the device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and device images 70, 72, and 76 related to the identified devices (the MFP (B), the projector (C), and the hair dryer (D)) are displayed on the device display screen 68.
In this case, for example, if the user selects the MFP (B) as the first device and specifies the device image 70 related to the MFP (B) as a first image, the determination unit 38 of the server 14 identifies the MFP (B) as the first device. For example, it is assumed that the projector (C) corresponds to a second device capable of executing a collaboration feature together with the MFP (B) serving as the first device, and that the hair dryer (D) corresponds to a device that cannot execute a collaboration feature together with the MFP (B). In this case, for example, it is assumed that the user specifies the hair dryer (D), which cannot execute a collaboration feature together with the MFP (B), as the cooperative partner device by specifying the device image 76 related to the hair dryer (D), by performing an operation of linking the device image 70 related to the MFP (B) and the device image 76 related to the hair dryer (D) to each other, by superimposing the device image 70 on the device image 76, or by placing a pointer on the device image 70 and then moving the pointer to the device image 76. In the example shown in Fig. 21, the operation of linking the device image 70 and the device image 76 to each other is performed by the user, as indicated by an arrow 102.
When the user specifies the hair dryer (D), which cannot execute a collaboration feature together with the MFP (B) serving as the first device, as the cooperative partner device, the controller 36 of the server 14 receives this specification and performs control so as to present guidance indicating the projector (C) as a second device capable of executing a collaboration feature together with the MFP (B). Accordingly, guidance indicating the projector (C) as the second device is presented. For example, as shown in Fig. 22, the controller 48 of the terminal device 16 causes an arrow 104 indicating the projector (C) as the cooperative partner device to be displayed on the device display screen 68. For example, the arrow 104 is an image that links the device image 70 related to the MFP (B) serving as the first device and the device image 72 related to the projector (C) serving as the second device to each other. Of course, the guidance may be presented by using sound or by displaying a character string.
As described above, in Example 4, if a device that cannot execute a collaboration feature together with the first device is specified, control is performed so as to present guidance indicating a second device that is capable of executing a collaboration feature together with the first device. If an arrow or the like serving as guidance were displayed every time each device is specified, the arrows and the like might clutter the screen. Such a situation can be avoided in Example 4.
As in Example 1, the processing according to Example 4 is also applicable to the case in which function images related to functions are used. For example, if a function that cannot execute a collaboration feature together with a first function is specified, guidance is presented that indicates a second function capable of executing a collaboration feature together with the first function.
Example 5
Example 5 will be described with reference to Figs. 23 to 25, each of which shows an example of a screen according to Example 5. In Example 5, if a device specified by the user as the cooperative partner device cannot execute a collaboration feature together with the first device because the device is damaged or in use, guidance is presented that indicates another device capable of executing a collaboration feature together with the first device. In this case, guidance indicating a device of the same type as the device specified by the user as the cooperative partner device (for example, a device having a function of the same type) may be preferentially presented. The controller 36 of the server 14 obtains, from each device, information representing the operation state of the device (for example, whether the device is executing processing, is damaged, or is under maintenance) and manages the operation state of each device.
For example, it is assumed that an MFP (B) and projectors (C) and (F) are identified as devices. As shown in Fig. 23, the device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and device images 70, 72, and 106 related to the identified devices (the MFP (B) and the projectors (C) and (F)) are displayed on the device display screen 68.
In this case, for example, if the user selects the MFP (B) as the first device and specifies the device image 70 related to the MFP (B) as a first image, the determination unit 38 of the server 14 identifies the MFP (B) as the first device. For example, it is assumed that the projectors (C) and (F) correspond to second devices capable of executing a collaboration feature together with the MFP (B) serving as the first device. For example, it is assumed that the user specifies the projector (F) as the cooperative partner device by specifying the device image 106 related to the projector (F), by performing an operation of linking the device image 70 related to the MFP (B) and the device image 106 related to the projector (F) to each other, by superimposing the device image 70 on the device image 106, or by placing a pointer on the device image 70 and then moving the pointer to the device image 106. In the example shown in Fig. 23, the operation of linking the device image 70 and the device image 106 to each other is performed by the user, as indicated by an arrow 108.
When the user specifies the projector (F) as the cooperative partner device, the controller 36 of the server 14 receives this specification and checks the operation state of the projector (F). For example, if the projector (F) is damaged or in use, the controller 36 of the server 14 performs control so as to present guidance indicating a device other than the projector (F), that is, another device capable of executing a collaboration feature together with the first device. The controller 36 may preferentially present guidance indicating a device of the same type as the projector (F) (for example, a device having a function of the same type as that of the projector (F)). For example, if the projector (C) is a device of the same type as the projector (F), guidance indicating the projector (C) as the second device is preferentially presented. In this case, as shown in Fig. 24, for example, the controller 48 of the terminal device 16 causes an arrow 110 indicating the projector (C) as the cooperative partner device to be displayed on the device display screen 68. For example, the arrow 110 is an image that links the device image 70 related to the MFP (B) serving as the first device and the device image 72 related to the projector (C) serving as the second device to each other.
If the user specifies a damaged device or a device in use as the cooperative partner device, a screen 112 shown in Fig. 25 may be displayed on the UI unit 46 of the terminal device 16 under the control of the controller 36 of the server 14, displaying a message indicating the reason why cooperation is not possible.
If a damaged device becomes available after being repaired, or if a device that has been executing processing completes the processing and is no longer executing processing, the controller 36 of the server 14 identifies the device as a device capable of executing a collaboration feature together with the first device.
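The fallback behavior of Example 5 — falling back to an available device of the same type when the specified partner is damaged or in use, and treating a repaired device as usable again — can be sketched as follows. The device names, type labels, and state strings are illustrative assumptions, not the patent's actual data model.

```python
# Assumed operation states gathered by the controller 36 of the
# server 14 from each device (names and states are illustrative).
devices = {
    "projector(F)": {"type": "projector", "state": "damaged"},
    "projector(C)": {"type": "projector", "state": "available"},
    "MFP(B)":       {"type": "MFP",       "state": "available"},
}

def pick_partner(requested: str, devices: dict):
    """Return the requested device if usable; otherwise prefer an
    available device of the same type; otherwise None (no cooperation)."""
    info = devices[requested]
    if info["state"] == "available":
        return requested
    same_type = [name for name, d in devices.items()
                 if name != requested
                 and d["type"] == info["type"]
                 and d["state"] == "available"]
    return same_type[0] if same_type else None

print(pick_partner("projector(F)", devices))  # projector(C) is suggested

# Once projector(F) is repaired, it is identified as usable again:
devices["projector(F)"]["state"] = "available"
print(pick_partner("projector(F)", devices))  # projector(F)
```

Preferring a same-type device mirrors the guidance rule above: the substitute is likely to provide the function the user expected from the originally specified device.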
According to Example 5, guidance indicating a device that is not damaged and not in use is presented, which can increase convenience for the user. In addition, guidance indicating a device of the same type as the device specified by the user is presented; therefore, the presented guidance indicates a device that the user expects to use.
Example 6
A guidance process according to Example 6 will be described with reference to Fig. 26, which shows an example of a device selection screen. In Example 6, a candidate list showing information about one or more second devices capable of executing a collaboration feature together with the first device is displayed on the UI unit 46 of the terminal device 16.
For example, it is assumed that an MFP (B) and a hair dryer (D) are identified as devices, and that the device image 70 related to the MFP (B) and the device image 76 related to the hair dryer (D) are displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 14A. In this case, for example, if the user selects the MFP (B) as the first device and specifies the device image 70 related to the MFP (B), the determination unit 38 of the server 14 identifies the MFP (B) as the first device. For example, it is assumed that the hair dryer (D) is a device that cannot execute a collaboration feature together with the MFP (B). In this case, for example, it is assumed that the user specifies the hair dryer (D) as the cooperative partner device by specifying the device image 76 related to the hair dryer (D), by performing an operation of linking the device image 70 related to the MFP (B) and the device image 76 related to the hair dryer (D) to each other, by superimposing the device image 70 on the device image 76, or by placing a pointer on the device image 70 and then moving the pointer to the device image 76.
When the user specifies the hair dryer (D), which cannot execute a collaboration feature together with the MFP (B) serving as the first device, as the cooperative partner device, the controller 36 of the server 14 receives this specification and, as control for presenting guidance indicating one or more second devices capable of executing a collaboration feature together with the MFP (B), performs control so as to display a candidate list showing information about the one or more second devices. Accordingly, as shown in Fig. 26, a device selection screen 114 is displayed on the UI unit 46 of the terminal device 16, and the candidate list is displayed on the device selection screen 114. The message frame 78 shown in Fig. 14B may be displayed on the UI unit 46 of the terminal device 16 before the device selection screen 114 is displayed.
As shown in Fig. 26, the candidate list includes the name of each device capable of executing a collaboration feature together with the MFP (B), a device image related to each device, examples of collaboration features (for example, the names of collaboration features), and so on. Of course, the candidate list may include at least one of these pieces of information. Each device image may be an image representing the appearance of the actual device (an image related to the device in a one-to-one relationship), or may be an image that schematically depicts the device (for example, an icon). For example, an image representing the appearance of the actual device is an image generated by photographing the appearance of the device, and is an image representing the device itself. An image that schematically depicts a device corresponds to an image representing the type of the device. As examples of collaboration features, the candidate list includes the name of a collaboration feature or the names of multiple collaboration features. In the case where the names of multiple collaboration features are displayed, the name of each collaboration feature may be displayed in a display order corresponding to the order in which the multiple interoperable destination devices are specified. For example, if a combination of an MFP and a projector is capable of executing multiple collaboration features (for example, collaboration features A and B), the display order of the multiple collaboration features included in the candidate list may differ between the case where the MFP is specified as the first device and the case where the projector is specified as the first device. For example, if the MFP is specified as the first device, the name of each collaboration feature may be displayed in the order of collaboration feature A and then collaboration feature B. If the projector is specified as the first device, the name of each collaboration feature may be displayed in the order of collaboration feature B and then collaboration feature A.
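The order-dependent display of collaboration feature names described above can be sketched, under assumed names, as another ordered-pair lookup; the table contents (collaboration features A and B from the example) are illustrative assumptions.

```python
# Hypothetical registration of display orders: for each ordered device
# pair, the collaboration feature names in the order they are listed.
FEATURE_DISPLAY_ORDER = {
    ("MFP", "projector"): ["collaboration feature A", "collaboration feature B"],
    ("projector", "MFP"): ["collaboration feature B", "collaboration feature A"],
}

def feature_names(first: str, second: str) -> list:
    """Return collaboration feature names in display order."""
    return FEATURE_DISPLAY_ORDER.get((first, second), [])

# Same device pair, different first device, different display order:
print(feature_names("MFP", "projector"))
print(feature_names("projector", "MFP"))
```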
The order in which the second devices are arranged in the candidate list may be determined based on, for example, the past usage record of each second device. For example, the controller 36 of the server 14 manages the usage record of each device by obtaining information representing the usage record from each device. For example, the controller 36 displays the second devices in the candidate list in descending order of frequency of use (descending order of the number of times of use). The past usage record may be the usage record of the user who specified the first image (for example, the user who is using the terminal device 16 or the user who has logged in to the server 14), or may be a usage record that includes the usage of other users.
As another example, the controller 36 may display the second devices in the candidate list in descending order of the number of executable collaboration features. For example, if the number of collaboration features executable by a projector and the MFP (B) is 3 and the number of collaboration features executable by a PC and the MFP (B) is 2, the projector is displayed above the PC in the candidate list.
If the function of each device is updated and the collaboration feature management information 34 is updated accordingly, the controller 36 updates the display of the second devices in the candidate list in accordance with the update. For example, if a device that could not execute a collaboration feature together with the MFP before the update becomes capable of executing a collaboration feature together with the MFP after the update, the controller 36 displays the device in the candidate list as a second device. In addition, the usage record of each device is updated over time, and the controller 36 updates the display order of the second devices in the candidate list based on the updated usage records.
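The two ordering rules above (descending frequency of use, or descending number of executable collaboration features) and the re-sorting after a usage-record update can be sketched as follows; the candidate names and counts are invented for illustration.

```python
# Invented candidate data: each entry carries a past-usage count and
# the number of collaboration features executable with the first device.
candidates = [
    {"name": "PC",        "use_count": 5, "num_features": 2},
    {"name": "projector", "use_count": 2, "num_features": 3},
    {"name": "camera",    "use_count": 9, "num_features": 1},
]

def by_usage(cands):
    """Candidate names in descending order of number of times used."""
    return [c["name"] for c in sorted(cands, key=lambda c: -c["use_count"])]

def by_feature_count(cands):
    """Candidate names in descending order of executable collaboration features."""
    return [c["name"] for c in sorted(cands, key=lambda c: -c["num_features"])]

print(by_usage(candidates))          # ['camera', 'PC', 'projector']
print(by_feature_count(candidates))  # ['projector', 'PC', 'camera']

# Usage records are updated over time; the display order follows suit.
candidates[0]["use_count"] = 20
print(by_usage(candidates))          # ['PC', 'camera', 'projector']
```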
According to the operation states or surrounding environments of the first and second devices, the controller 36 of the server 14 may update the display order of the second devices included in the candidate list, or may update the second devices in the candidate list.
A device that is capable of executing a collaboration feature together with the MFP but is damaged or in use may not be displayed in the candidate list. Also in this case, if the device has been repaired or has become available, the device is displayed in the candidate list.
If the user specifies a device name or a device image included in the candidate list, the controller 48 of the terminal device 16 may, under the control of the controller 36 of the server 14, cause the UI unit 46 of the terminal device 16 to display a list of one or more collaboration features executable by the MFP serving as the first device and the specified device. For example, if a projector in the candidate list is specified, the controller 48 of the terminal device 16 causes the UI unit 46 to display a list of one or more collaboration features executable by the MFP and the projector. If the user specifies a collaboration feature included in the list, control is performed so as to execute the specified collaboration feature.
Each second device displayed in the candidate list may be a device included in a device group registered in the server 14 in advance, a device included in a device group identified by using the AR technologies or the like, a device included in a device group displayed on the UI unit 46 of the terminal device 16, or a device included in a device group displayed in a specific area within the screen of the UI unit 46. For example, if the user operates a device image related to a device, the device image is displayed in the specific area. The first device may also be a device included in these device groups. The same applies to Examples 1 to 5 described above and to the following examples.
As in Example 1, the processing according to Example 6 is also applicable to the case in which function images related to functions are used. For example, if a device or function that cannot execute a collaboration feature together with a first function is specified, a candidate list may be displayed that shows devices (candidate second devices) or functions (candidate second functions) capable of executing a collaboration feature together with the first function. For example, the candidate list includes the names of the second functions, function images related to the second functions, examples of collaboration features, and so on. Alternatively, a candidate list may be displayed if a function that cannot execute a collaboration feature together with the first device is specified.
According to Example 6, candidate second devices or candidate second functions are displayed as a candidate list, which can be convenient for the user.
Example 7
Example 7 will be described with reference to Figs. 26 to 32. In Example 7, as in Example 6, a candidate list is displayed on the UI unit 46 of the terminal device 16.
As in Example 6, it is assumed that the user specifies the MFP (B) as the first device and specifies a device that cannot execute a collaboration feature together with the MFP (B). The controller 36 of the server 14 receives this specification and, as control for presenting guidance indicating one or more second devices capable of executing a collaboration feature together with the MFP (B), performs control so as to display a candidate list showing information about the one or more second devices. Accordingly, the device selection screen 114 is displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 26.
For example, if the user specifies a device name, the controller 48 of the terminal device 16 causes, under the control of the controller 36 of the server 14, the UI unit 46 of the terminal device 16 to display a list of devices of the same type as the specified device (a list of second devices). For example, if the user specifies a projector, a device selection screen 116 is displayed on the UI unit 46 of the terminal device 16, as shown in Fig. 27. A list of projectors serving as second devices is displayed on the device selection screen 116. For example, if the determination unit 38 of the server 14 determines (identifies) projectors aaa, bbb, and ccc as projectors corresponding to second devices, a list of these projectors is displayed. The user selects the projector to be used as the second device from the list.
In addition, a message asking the user whether to add a destination device for cooperation is displayed on the device selection screen 116. For example, if the user specifies a second device and then gives an instruction to add a destination device for cooperation (for example, if the user selects "Yes" in Fig. 27), a device selection screen 118 is displayed on the UI unit 46 of the terminal device 16 as shown in Fig. 28, and a candidate list is displayed on the device selection screen 118, showing information about one or more third devices capable of executing a collaboration feature together with the first and second devices. This candidate list has the same configuration as the candidate list showing information about the one or more second devices. The display order of the third devices may differ from the display order in the candidate list of second devices. If the user specifies a third device, a list of third devices is displayed, as on the device selection screen 116 (see Fig. 27) displayed in response to the specification of a second device. The same applies to the case of adding a fourth device, a fifth device, and so on.
If an image that schematically depicts a device (for example, an icon) is included in the candidate list as a device image and the user specifies that device image, a list of second devices (for example, projectors) related to the device image is displayed, as shown in Fig. 27. That is, such a device image represents devices of that type generically rather than a specific device. Therefore, if the device image is specified, a list of second devices related to the device image is displayed. On the other hand, an image representing the appearance of an actual device (an image related to the device in a one-to-one relationship) is an image representing the device itself. Therefore, if such a device image is included in the candidate list and is specified by the user, the user has specified the device itself. In this case, the list of second devices shown in Fig. 27 is not displayed, and only the message asking the user whether to add a destination device for cooperation may be displayed.
If no destination device for cooperation is added (for example, if the user specifies "No" in Fig. 27), the controller 48 of the terminal device 16 causes, under the control of the controller 36 of the server 14, the UI unit 46 of the terminal device 16 to display a function selection screen 120 shown in Fig. 29. A list of collaboration features executable by the multiple devices specified by the user is displayed on the function selection screen 120. For example, if the MFP (B) is specified as the first device and the projector aaa is specified as the second device, a list of collaboration features executable by the MFP (B) and the projector aaa is displayed. If the user specifies a collaboration feature from the list and gives an instruction to execute the collaboration feature, the collaboration feature is executed by the MFP (B) and the projector aaa.
On the other hand, in the case of executing a collaboration feature not included in the list of collaboration features, the user requests execution of the collaboration feature. For example, as shown in Fig. 30, a screen 122 for making the request is displayed on the UI unit 46 of the terminal device 16, and the user inputs the name of the collaboration feature to be executed on the screen 122. Information representing the request is transmitted from the terminal device 16 to the server 14. In response to receipt of the request, the controller 36 of the server 14 determines whether the collaboration feature related to the request is executable by the specified devices (for example, the first and second devices). If the collaboration feature related to the request is not executable, the controller 48 of the terminal device 16 causes, under the control of the controller 36 of the server 14, the UI unit 46 of the terminal device 16 to display a message indicating that the collaboration feature related to the request is not executable, as shown in Fig. 31. If the collaboration feature related to the request is executable, the controller 48 of the terminal device 16 causes, under the control of the controller 36 of the server 14, the UI unit 46 of the terminal device 16 to display a message indicating that the collaboration feature related to the request will be executed, as shown in Fig. 32. If the user gives an instruction for execution, the collaboration feature is executed. In addition, the collaboration feature requested by the user may be registered. For example, as shown in Fig. 32, the controller 48 of the terminal device 16 causes, under the control of the controller 36 of the server 14, the UI unit 46 of the terminal device 16 to display a message asking the user whether to register the requested collaboration feature as a candidate collaboration feature from now on. If the user selects "Register", information about the requested collaboration feature is registered in the collaboration feature management information 34 and is thereafter displayed while being included in the list of collaboration features. If the user selects "Do not register", the information about the requested collaboration feature is not registered and is not included in the list of collaboration features. For example, if the requested collaboration feature corresponds to exceptional processing, if other collaboration features are used more frequently, or if the user wants to prevent the situation in which the number of collaboration features included in the list increases and the list becomes complicated, the requested collaboration feature is not registered.
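The request-and-register flow described above can be sketched as below. The executable-feature table, feature names, and device labels are illustrative assumptions, and `registered` is a stand-in for entries added to the collaboration feature management information 34 at the user's request.

```python
# Assumed table of collaboration features executable by a device pair.
EXECUTABLE = {
    ("MFP(B)", "projector aaa"): {"print display screen"},
}

# Stand-in for requested collaboration features the user chose to
# register as future candidates.
registered = []

def request_feature(name, devices, register=False):
    """Return True if the requested collaboration feature is executable
    by the devices; optionally register it as a future candidate."""
    if name not in EXECUTABLE.get(devices, set()):
        return False  # corresponds to the "not executable" message (Fig. 31)
    if register and name not in registered:
        registered.append(name)  # corresponds to selecting "Register" (Fig. 32)
    return True

print(request_feature("scan and store", ("MFP(B)", "projector aaa")))  # False
print(request_feature("print display screen", ("MFP(B)", "projector aaa"),
                      register=True))  # True
print(registered)  # ['print display screen']
```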
As in Example 1, the processing according to Example 7 is also applicable to the case in which function images related to functions are used.
According to Example 7, candidate second devices or candidate second functions are displayed as a candidate list, which can be convenient for the user. In addition, a destination device for cooperation can easily be added by using the candidate list.
Example 8
Example 8 will be described with reference to Figs. 33 to 35. In Example 8, as in Examples 6 and 7, a candidate list is displayed on the UI unit 46 of the terminal device 16.
As in Examples 6 and 7, it is assumed that the user specifies the MFP (B) as the first device and specifies a device that cannot execute a collaboration feature together with the MFP (B). In this case, as shown in Fig. 26, the device selection screen 114 is displayed on the UI unit 46 of the terminal device 16.
If the user specifies a collaboration feature (for example, "print displayed screen") on the device selection screen 114, the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, causes the UI unit 46 of the terminal device 16 to display the device selection screen 124 shown in Fig. 33. On the device selection screen 124, a list of the devices needed to execute the collaboration feature specified by the user (for example, "print displayed screen"), that is, the devices capable of executing the collaboration feature, is displayed (a list of second devices). If the user specifies a device (a second device) in the list and gives an instruction to execute the collaboration feature, the collaboration feature specified by the user is executed by the first and second devices specified by the user.
If the user gives an instruction to select a device not included in the device list displayed on the device selection screen 124 (for example, selects "Yes" in response to the inquiry "Do you want to select another device?"), the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, causes the UI unit 46 of the terminal device 16 to display the device selection screen 126 shown in Fig. 34. On the device selection screen 126, a candidate list of other devices capable of executing the collaboration feature specified by the user is displayed. If the user specifies a device (a second device) on the device selection screen 126 and gives an instruction to execute the collaboration feature, the collaboration feature specified by the user is executed by the first and second devices specified by the user.
If the user gives an instruction to select a device not included in the device list displayed on the device selection screen 126 (for example, selects "Yes" in response to the inquiry "Do you want to select another device?"), the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, causes the UI unit 46 of the terminal device 16 to display the screen 128 shown in Fig. 35, which is used to input information about the target device of collaboration. The user inputs the information about the target device of collaboration (for example, the name or type of the device) on the screen 128. The information about the target device input by the user is transmitted from the terminal device 16 to the server 14. In response to receiving the information, the controller 36 of the server 14 determines whether the collaboration feature specified by the user can be executed by the first device and the target device specified by the user. If the collaboration feature is not executable, the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, causes the UI unit 46 of the terminal device 16 to display a screen for setting another device as the target device. If the collaboration feature is executable and an instruction to execute the collaboration feature is given, the collaboration feature specified by the user is executed by the first and second devices (target devices that collaborate with each other) specified by the user.
As in example 1, the processing according to example 8 is applicable to the case where function images related to functions are used.
According to example 8, when a collaboration feature is specified in the candidate list, the devices needed to execute the collaboration feature are displayed; therefore, the convenience for the user in selecting a device can be increased.
The exemplary embodiments are applicable to an environment in which multiple users use multiple devices. For example, even if a user interface such as a touch screen is removed from a device, the terminal device 16 serves as the user interface. In another case, for example, if a user temporarily uses a device while traveling, a user interface suitable for the user is realized by the terminal device 16, that is, a user interface that displays one or more functions of the device specified by the user and one or more collaboration features that use the device.
Hereinafter, processing related to examples 1 to 8 will be described.
Processing of switching the display of information about collaboration features
In the exemplary embodiment, the display of information about collaboration features may be switched according to the order in which device images related to devices are linked to each other. In this case, if the device designated as the collaboration partner cannot execute a collaboration feature together with the first device, then, as in the above-described examples 1 to 8, control is performed to present guidance indicating a second device capable of executing a collaboration feature together with the first device. On the other hand, if the user specifies a second device capable of executing a collaboration feature together with the first device, the display of information about collaboration features is switched according to the order in which the device images are linked to each other. Hereinafter, this processing will be described in detail with reference to Figs. 36 to 38B.
Fig. 36 shows a collaboration feature management table as another example of the collaboration feature management information 34. In this collaboration feature management table, for example, information indicating a combination of device IDs, information indicating the names (types) of the target devices that collaborate with each other, information indicating one or more collaboration features (collaboration feature information), information indicating a link order, and information indicating a priority order are associated with one another. The link order corresponds to the order in which the device images related to the devices are linked to each other. The priority order is the priority with which the information about the collaboration features is displayed. For example, the device with device ID "A" is a PC, and the device with device ID "B" is an MFP. Collaboration between PC (A) and MFP (B) realizes, for example, a scan-and-transfer function and a print function as collaboration features. The scan-and-transfer function is a function of transmitting image data generated by scanning with MFP (B) to PC (A). The print function is a function of transmitting data stored in PC (A) (for example, image data or document data) to MFP (B) and printing the data with MFP (B). For example, if linking is performed from MFP (B) to PC (A), that is, from the device image related to MFP (B) to the device image related to PC (A), the priority of the scan-and-transfer function is "1" and the priority of the print function is "2". In this case, the information about the scan-and-transfer function is displayed with priority over the information about the print function. On the other hand, if linking is performed from PC (A) to MFP (B), that is, from the device image related to PC (A) to the device image related to MFP (B), the priority of the print function is "1" and the priority of the scan-and-transfer function is "2". In this case, the information about the print function is displayed with priority over the information about the scan-and-transfer function.
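The dependence of the display priority on the link order can be sketched as a small lookup table. This is an illustrative sketch only: the table contents mirror the PC (A) / MFP (B) example above, but the data structure and the function name `candidate_features` are assumptions, not identifiers from the patent.

```python
# Minimal model of the Fig. 36 collaboration feature management table:
# the same device pair offers the same features, but the display priority
# differs depending on the direction of linking (B -> A vs. A -> B).
COLLABORATION_TABLE = {
    # (link source, link destination): [(feature, priority), ...]
    ("MFP(B)", "PC(A)"): [("scan and transfer", 1), ("print", 2)],
    ("PC(A)", "MFP(B)"): [("print", 1), ("scan and transfer", 2)],
}

def candidate_features(link_order):
    """Return feature names sorted by the priority tied to this link order."""
    rows = COLLABORATION_TABLE[link_order]
    return [feature for feature, _ in sorted(rows, key=lambda row: row[1])]

print(candidate_features(("MFP(B)", "PC(A)")))  # scan-and-transfer shown first
print(candidate_features(("PC(A)", "MFP(B)")))  # print shown first
```

Keying the table on an ordered pair rather than an unordered set is what lets the same device combination yield two different display orders.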
Figs. 37A to 38B each show an example of a screen displayed on the UI unit 46 of the terminal device 16. For example, assume that MFP (B) and PC (A) have been identified. As shown in Fig. 37A, a device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and a device image 70 related to MFP (B) and a device image 88 related to PC (A) are displayed on the device display screen 68. In this state, the user links the device images representing the target devices to each other using an indicator (for example, the user's finger, a pen, or a stylus). The controller 48 of the terminal device 16 detects a touch of the indicator on the device display screen 68 and detects movement of the indicator on the device display screen 68. For example, as indicated by an arrow 130, the user touches the device image 70 on the device display screen 68 with the indicator and moves the indicator to the device image 88 on the device display screen 68, thereby linking the device image 70 to the device image 88. Accordingly, MFP (B) related to the device image 70 and PC (A) related to the device image 88 are designated as target devices that collaborate with each other, and the link order is also specified. The order in which the device images are linked corresponds to the link order. MFP (B) corresponds to the first device, and PC (A) corresponds to the second device. In the example shown in Fig. 37A, linking is performed from the device image 70 to the device image 88, that is, from MFP (B) to PC (A). Information indicating the link order of the devices is transmitted from the terminal device 16 to the server 14. The controller 48 of the terminal device 16 may cause an image indicating the trajectory of the movement performed by the user to be displayed on the device display screen 68. After the devices are linked to each other, the controller 48 of the terminal device 16 may replace the trajectory with a predetermined straight line or the like and may cause the straight line to be displayed on the device display screen 68.
When the target devices that collaborate with each other (for example, MFP (B) and PC (A)) are specified in the manner described above, the determination unit 38 of the server 14 determines, in the collaboration feature management table shown in Fig. 36, the collaboration features associated with the combination of PC (A) and MFP (B). Accordingly, the collaboration features executed through collaboration between PC (A) and MFP (B) are determined. When the user specifies the link order of the devices, the determination unit 38 determines, in the collaboration feature management table, the priorities associated with that link order. Specifically, with reference to Fig. 36, since PC (A) and MFP (B) are designated as the target devices that collaborate with each other, the collaboration features executed by these devices are the scan-and-transfer function and the print function. In addition, since linking is performed from MFP (B) to PC (A) (B → A), the priority of the scan-and-transfer function is "1" and the priority of the print function is "2".
The information about the determined collaboration features and the information about the determined priorities are transmitted from the server 14 to the terminal device 16. The controller 48 of the terminal device 16 causes the UI unit 46 to display the information about the collaboration features in accordance with the priorities, as information about candidate collaboration features.
For example, as shown in Fig. 37B, the controller 48 of the terminal device 16 causes the UI unit 46 to display a collaboration feature display screen 132 and to display information about the candidate collaboration features on the collaboration feature display screen 132. Since the priority of the scan-and-transfer function is "1" and the priority of the print function is "2", the information about the scan-and-transfer function is displayed with priority over (for example, above) the information about the print function. For example, as the information about the scan-and-transfer function, an explanation of the scan-and-transfer function, "Transfer the data scanned by MFP (B) to PC (A)", is displayed. In addition, as the information about the print function, an explanation of the print function, "Print the data in PC (A)", is displayed.
If the user specifies a collaboration feature and gives an execution instruction, the specified collaboration feature is executed. For example, if the user presses a "YES" button, the collaboration feature related to the "YES" button is executed. In addition, a "BACK" button is displayed on the collaboration feature display screen 132. If the user presses the "BACK" button, the process of linking the devices stops.
The process of determining the collaboration features and the process of determining the priorities may be executed by the terminal device 16.
Instead of moving the indicator between device images, the target devices that collaborate with each other and their link order may be specified by drawing circles around the device images. For example, the order of the drawing operations corresponds to the link order. Alternatively, the target devices that collaborate with each other and their link order may be specified according to voice instructions given by the user.
Figs. 38A and 38B show an example of another operation. For example, as shown in Fig. 38A, the user touches the device image 88 on the device display screen 68 with the indicator and moves the indicator to the device image 70 in the direction indicated by an arrow 134, thereby linking the device image 88 to the device image 70. Accordingly, PC (A) related to the device image 88 and MFP (B) related to the device image 70 are designated as the target devices that collaborate with each other, and the link order is also specified. In this example, linking is performed from the device image 88 to the device image 70, that is, from PC (A) to MFP (B). With reference to the collaboration feature management table shown in Fig. 36, the priority of the print function is "1" and the priority of the scan-and-transfer function is "2". In this case, as shown in Fig. 38B, the information about the print function is displayed on a collaboration feature display screen 136 with priority over (for example, above) the information about the scan-and-transfer function.
As described above, the device images related to the devices are linked to each other, whereby a collaboration feature that uses the functions of those devices is determined. The display order of the information about the collaboration features changes in accordance with the order in which the images are linked to each other, that is, the order in which the devices are linked. The link order of the devices can also be regarded as the order in which the functions of the respective devices are used, or the order in which data moves between the target devices that collaborate with each other. The operation of linking the devices (the operation of linking the images) can thus also be regarded as an operation of specifying the order in which the functions are used or the order in which data moves. Therefore, as a result of changing the display order of the information about the collaboration features in accordance with the link order, the information about the collaboration feature that the user is expected to use is displayed with priority; in other words, the information about the collaboration feature that is more likely to be used by the user is displayed with priority. For example, if linking is performed from MFP (B) to PC (A), it is expected that the user will use a collaboration feature of "using the function of MFP (B) first and then transmitting data from MFP (B) to PC (A)". On the other hand, if linking is performed from PC (A) to MFP (B), it is expected that the user will use a collaboration feature of "using the function of PC (A) first and then transmitting data from PC (A) to MFP (B)". Accordingly, changing the display order of the information about the collaboration features in accordance with the link order of the images causes the information about the collaboration feature that is more likely to be used by the user to be displayed with priority. In addition, the order in which the functions are used or the order in which data moves can be specified without any special operation other than the operation of linking the device images, and the information about the collaboration feature that the user is expected to use is displayed.
The above-described display switching process is applicable to the case where function images related to functions are used. For example, the display of the information about collaboration features is switched according to the order in which a function image related to a first function and a function image related to a second function are specified.
The above-described display switching process is also applicable to the information about the collaboration features displayed in the candidate list according to example 6 and so forth (for example, see Fig. 26). That is, if MFP (B) is designated as the first device, PC (A) is displayed in the candidate list as a candidate second device (a candidate collaboration partner), and the information about multiple collaboration features, serving as information about the collaboration features related to PC (A) (information about the collaboration features executable by MFP (B) and PC (A)), is displayed in the order shown in Fig. 37B. On the other hand, if PC (A) is designated as the first device, MFP (B) is displayed in the candidate list as a candidate second device (a candidate collaboration partner), and the information about multiple collaboration features, serving as information about the collaboration features related to MFP (B), is displayed in the order shown in Fig. 38B.
Collaboration process using partial images
The functions of the devices assigned to a collaboration feature may change according to positions in the device images related to the devices. When the user specifies a specific position in a device image, information about a collaboration feature that uses the function corresponding to that specific position is displayed with priority. Hereinafter, this process will be described in detail.
Fig. 39 shows an example of a device function management table. The data of the device function management table is stored in the server 14 as the device function management information 32. In the device function management table, for example, a device ID, information indicating the name (for example, the type) of a device, information indicating a position in a device image, information indicating a function corresponding to the position (function information), and an image ID are associated with one another. A position in a device image is a specific position (specific part) in a device image related to a device, for example, a specific position in a device image schematically representing the device or a specific position in a device image captured by a camera. A different function is associated with each specific position in the device image.
Figs. 40A and 40B each show an example of a screen displayed on the UI unit 46 of the terminal device 16. For example, assume that MFP (B) and PC (A) have been identified. As shown in Fig. 40A, the device display screen 68 is displayed on the UI unit 46 of the terminal device 16, and the device images 70 and 88 are displayed on the device display screen 68. For example, in the device image 70, the specific position corresponding to the body portion of MFP (B) (a partial image 70a) is assigned the print function. In the device image 70, the specific position corresponding to the document cover, document glass, and automatic document feeder of MFP (B) (a partial image 70b) is assigned the scan function. In the device image 70, the specific position corresponding to the post-processing apparatus (a partial image 70c) is assigned the stapling function. The stapling function is a function of stapling output sheets. In the device image 88, the specific position corresponding to the body portion of PC (A) (a partial image 88a) is assigned the data storage function. In the device image 88, the specific position corresponding to the display unit of PC (A) (a partial image 88b) is assigned the screen display function. The data storage function is a function of storing data received from another apparatus in PC (A). The screen display function is a function of displaying data received from another apparatus on PC (A).
The controller 48 of the terminal device 16 may cause the names of the functions assigned to the specific positions in the device images (for example, print, scan, and so forth) to be displayed on the device display screen 68. Accordingly, information clearly indicating the correspondence between the specific positions and the functions is provided to the user. Of course, the names of the functions need not be displayed.
When the user specifies a position in a device image to which a function is assigned, the function assigned to the specified position is designated as a target function of collaboration. Using the indicator, the user links the specific positions (partial images) assigned functions in the device images representing the target devices that collaborate with each other. For example, as indicated by an arrow 138, the user touches the partial image 70b on the device display screen 68 with the indicator and moves the indicator to the partial image 88b, thereby linking the partial image 70b to the partial image 88b. Accordingly, MFP (B) related to the device image 70 including the partial image 70b and PC (A) related to the device image 88 including the partial image 88b are designated as the target devices that collaborate with each other, and the scan function assigned to the partial image 70b and the screen display function assigned to the partial image 88b are designated. In addition, the link order may be specified by the linking operation. In this case, the order in which the partial images are linked corresponds to the link order. In the example shown in Fig. 40A, linking is performed from the partial image 70b to the partial image 88b, that is, from MFP (B) to PC (A). The scan function and the screen display function are designated as the functions used for the collaboration feature. Information indicating the link order of the devices and information indicating the specific positions specified by the user in the device images are transmitted from the terminal device 16 to the server 14.
When the target devices that collaborate with each other (for example, PC (A) and MFP (B)) are identified, the determination unit 38 of the server 14 determines, in the collaboration feature management table shown in Fig. 7, the collaboration features realized through collaboration between PC (A) and MFP (B). In addition, the determination unit 38 determines, with reference to the device function management table shown in Fig. 39, the functions assigned to the specific positions specified by the user in the device images. Furthermore, among the collaboration features realized through collaboration between PC (A) and MFP (B), the determination unit 38 assigns a higher priority to the collaboration features that use the functions assigned to the positions specified by the user, and assigns a lower priority to the collaboration features that do not use those functions.
The information about the collaboration features determined in the manner described above and the information indicating the priorities are transmitted from the server 14 to the terminal device 16. The controller 48 of the terminal device 16 causes the UI unit 46 to display the information about the collaboration features in accordance with the priorities, as information about candidate collaboration features.
For example, as shown in Fig. 40B, the controller 48 of the terminal device 16 causes the UI unit 46 to display a collaboration feature display screen 140 and to display information about the candidate collaboration features on the collaboration feature display screen 140. Since the user specified the scan function and the screen display function in this order, the information about the scan-transfer-and-display function, which is a collaboration feature executed through collaboration between the scan function and the screen display function, is displayed with priority over (for example, above) the information about the other collaboration features. For example, the information about the scan-transfer-and-display function is displayed with priority over the information about the scan-transfer-and-store function, which is a collaboration feature executed through collaboration between the scan function and the data storage function. The scan-transfer-and-display function is a function of transferring data generated by scanning with MFP (B) to PC (A) and displaying the data on the screen of PC (A). The scan-transfer-and-store function is a function of transferring data generated by scanning with MFP (B) to PC (A) and storing the data in PC (A). In the example shown in Fig. 40B, an explanation of each collaboration feature is displayed as the information about that collaboration feature.
According to the collaboration process using partial images, in a case where each of the target devices that collaborate with each other has multiple functions, the functions are individually specified, and the information about the collaboration features that use the specified functions is displayed with priority. Therefore, the collaboration feature that the user is expected to use is displayed with priority.
A collaboration feature may be a function that uses a combination of parts of devices, a function that uses a combination of a whole device and a part of a device, or a function that uses a combination of whole devices.
The collaboration process using partial images is applicable to the case where function images related to functions are used. For example, different functions are assigned to positions in a function image, and a collaboration feature that uses the functions assigned to the positions specified by the user is determined.
The above-described examples 1 to 8 are also applicable to the collaboration process using partial images. For example, if the user specifies a partial image included in a first image related to a first device, control may be performed to present guidance indicating the whole or a part of a second device capable of executing a collaboration feature together with the function assigned to the part related to that partial image. For another example, if the user specifies the entire first image related to the first device, control may be performed to present guidance indicating a part of a second device capable of executing a collaboration feature together with the first device. To present the guidance, the whole or the part of the second device may be displayed while being included in the candidate list described in example 6. If the user specifies the whole or a part of the first device, guidance may be presented indicating the whole or a part of a second device capable of executing a collaboration feature together with the whole or the part of the first device. If the user specifies the whole or a part of a second device that cannot execute a collaboration feature together with the whole or the part of the first device specified by the user, guidance may be presented indicating the whole or a part of a second device that can execute a collaboration feature together with the whole or the part of the first device. Hereinafter, these processes will be described in detail.
For example, when the first image (a whole image) related to the first device is specified, the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, presents guidance indicating one or more functions of a second device, the functions being capable of executing a collaboration feature together with the first device. More specifically, the controller 48 of the terminal device 16 presents guidance indicating one or more partial images related to one or more parts included in the second device (one or more partial images in a second image related to the second device), the one or more parts being assigned functions capable of executing a collaboration feature together with the first device. For example, as in examples 1 to 8, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display an image depicting an arrow serving as the guidance, presents the guidance using sound, or causes the UI unit 46 to display a character string serving as the guidance. With reference to Fig. 40A, if the user specifies the device image 70 related to MFP (B) as the first image, guidance is presented indicating a partial image related to a part of PC (A) that is assigned a function capable of executing a collaboration feature together with MFP (B). For example, if the screen display function of PC (A) is a function capable of executing a collaboration feature together with MFP (B), guidance indicating the partial image 88b related to the screen display function is presented. For example, an arrow linking the device image 70 related to MFP (B) and the partial image 88b to each other is displayed, guidance indicating the screen display function is presented using sound, a character string representing the screen display function is displayed, or the partial image 88b is displayed so as to be distinguishable from the other partial images. The guidance may be presented when the user specifies the device image 70 as the first image, or when the user specifies a part of PC (A) that is assigned a function that cannot execute a collaboration feature together with MFP (B).
For another example, if a partial image included in the first image related to the first device is specified, the controller 48 of the terminal device 16 may present, under the control of the controller 36 of the server 14, guidance indicating a second device (a whole device) capable of executing a collaboration feature together with the function assigned to the part related to that partial image (a function of the first device). With reference to Fig. 40A, for example, if the user specifies the partial image 70a related to MFP (B), guidance is presented indicating a second image related to a second device capable of executing a collaboration feature together with the print function assigned to the part related to the partial image 70a. For example, if PC (A) is a device capable of executing a collaboration feature together with MFP (B), guidance indicating the device image 88 related to PC (A) is presented. For example, an arrow linking the partial image 70a and the device image 88 to each other is displayed, guidance indicating PC (A) is presented using sound, a character string representing PC (A) is displayed, or the device image 88 is displayed so as to be distinguishable from the other device images. The guidance may be presented when the user specifies a partial image included in the device image 70, or when the user specifies a device image related to a device that cannot execute a collaboration feature together with the function related to the partial image.
For another example, if a partial image included in the first image related to the first device (a first partial image) is specified, the controller 48 of the terminal device 16, under the control of the controller 36 of the server 14, presents guidance indicating one or more functions of a second device capable of executing a collaboration feature together with the function assigned to the part related to the first partial image (a function of the first device). More specifically, the controller 48 of the terminal device 16 presents guidance indicating one or more partial images related to one or more parts included in the second device (one or more partial images included in the second image related to the second device are referred to as second partial images), that is, one or more second partial images related to one or more parts that are assigned one or more functions capable of executing a collaboration feature together with the function assigned to the part related to the first partial image. With reference to Fig. 40A, for example, if the user specifies the partial image 70a related to MFP (B), guidance is presented indicating a second partial image related to a part of PC (A) that is assigned a function capable of executing a collaboration feature together with the print function assigned to the part related to the partial image 70a. For example, if the screen display function of PC (A) is a function capable of executing a collaboration feature together with the print function, guidance indicating the partial image 88b related to the screen display function is presented. For example, an arrow linking the partial image 70a and the partial image 88b to each other is displayed, guidance indicating the screen display function is presented using sound, a character string representing the screen display function is displayed, or the partial image 88b is displayed so as to be distinguishable from the other partial images. The guidance may be presented when the user specifies a partial image included in the device image 70, or when the user specifies a partial image related to a function that cannot execute a collaboration feature together with the function related to the first partial image.
Three or more partial images may be specified, and accordingly the user may specify three or more functions. For example, if the user specifies two partial images (two functions: a first function and a second function), guidance may be presented indicating a part to which there is assigned a function (a third function) capable of executing a collaboration feature together with these two functions. In this case, the determination unit 38 of the server 14 may change the third function for which guidance is presented, in accordance with the order in which the partial image related to the first function and the partial image related to the second function are specified.
Multiple functions of the same device may be specified as target functions that are to cooperate with each other. For example, the user may specify the screen display function and the data storage function of the PC (A) as target functions, or may specify the scan function of the MFP (B) and the screen display function and the data storage function of the PC (A) as target functions. Also in this case, guidance is presented indicating a part to which there is assigned a function (for example, a second function) capable of executing a collaboration feature together with the function specified first (for example, a first function). The determination unit 38 of the server 14 may determine a collaboration feature that uses the individual functions in accordance with the order in which they are specified, and may assign a higher priority to that collaboration feature.
Another example of cooperation processing using partial images
Hereinafter, another example of cooperation processing using partial images will be described with reference to Figures 41 and 42.
Figure 41 shows an example of a device function management table. The data of the device function management table is stored in the server 14 as the device function management information 32. In the device function management table, for example, a device ID, information representing the name of a device (for example, the type of the device), information representing the name of a part of the device (for example, the type of the part), a part ID serving as part identification information for identifying the part, information representing a function assigned to the part (a function of the part), and a partial image ID for identifying a partial image related to the part are associated with one another. A partial image is an image representing the appearance of a part of a device, obtained through shooting by a camera. Of course, a partial image that schematically represents a part of a device may be associated with that part. For example, different functions are assigned to the individual parts of a device.
Specifically, the screen display function is assigned to the display unit of the PC (A), and information representing the screen display function is associated with the partial image ID of the partial image related to the display unit. The screen display function is a function of displaying information on the PC (A). The data storage function is assigned to the body portion of the PC (A), and information representing the data storage function is associated with the partial image ID of the partial image related to the body portion. The data storage function is a function of storing data in the PC (A).
The print function is assigned to the body portion of the MFP (B), and information representing the print function is associated with the partial image ID of the partial image related to the body portion. The scan function is assigned to the reading portion of the MFP (B) (for example, the portion corresponding to the document cover, the document glass, and the automatic document feeder of the MFP (B)), and information representing the scan function is associated with the partial image ID of the partial image related to the reading portion. The stapling function is assigned to the post-processing device of the MFP (B), and information representing the stapling function is associated with the partial image ID of the partial image related to the post-processing device. The stapling function is a function of stapling output sheets.
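The associations described above can be sketched as a small lookup table. The concrete part IDs ("a1", "b1", and so forth) and the row layout are illustrative assumptions, since the document does not define concrete ID values; only the device/part/function pairings follow Figure 41.

```python
# A minimal sketch of the device function management table of Figure 41.
# Part IDs and device IDs are illustrative assumptions.
DEVICE_FUNCTION_TABLE = [
    {"device_id": "A", "device": "PC (A)", "part": "display unit",
     "part_id": "a1", "function": "screen display", "partial_image_id": "88b"},
    {"device_id": "A", "device": "PC (A)", "part": "body portion",
     "part_id": "a2", "function": "data storage", "partial_image_id": "88a"},
    {"device_id": "B", "device": "MFP (B)", "part": "body portion",
     "part_id": "b1", "function": "print", "partial_image_id": "70a"},
    {"device_id": "B", "device": "MFP (B)", "part": "reading portion",
     "part_id": "b2", "function": "scan", "partial_image_id": "70b"},
    {"device_id": "B", "device": "MFP (B)", "part": "post-processing device",
     "part_id": "b3", "function": "stapling", "partial_image_id": "70c"},
]

def function_of_part(part_id):
    """Look up the function assigned to the part identified by part_id,
    as the determination unit 38 does in the device function management table."""
    for row in DEVICE_FUNCTION_TABLE:
        if row["part_id"] == part_id:
            return row["function"]
    return None
```

In marker-based identification the part ID comes directly from the decoded marker; in markerless identification the appearance image data would first be matched to a row, after which the same lookup applies.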
The function assigned to a part of a device is determined (identified) by using, for example, the markerless AR technology. For example, if a part of a device is shot by a camera (for example, the camera 42 of the terminal device 16), appearance image data representing the part is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines (identifies) the function associated with the appearance image data in the device function management table. Accordingly, the function assigned to the shot part is determined (identified). For example, if the body portion of the MFP (B) is shot by the camera 42, appearance image data representing the body portion of the MFP (B) is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines the print function associated with the appearance image data in the device function management table. Accordingly, it is determined that the function assigned to the body portion of the MFP (B) is the print function.
Of course, the function assigned to a part of a device may be determined (identified) by using the marker-based AR technology. For example, each part of a device is provided with a marker for identifying the part, such as a two-dimensional barcode obtained by coding part identification information (for example, a part ID). If the marker on a part is shot by a camera and the marker-based AR technology is applied, the part identification information (for example, the part ID) of the part is obtained. The marker-based AR technology may be applied by either the terminal device 16 or the server 14. After the part identification information is obtained in this way, the determination unit 38 of the server 14 determines (identifies) the function associated with the part identification information (for example, the part ID) in the device function management table.
Figure 42 shows an example of a collaboration feature management table. The data of the collaboration feature management table is stored in the server 14 as the collaboration feature management information 34. The collaboration feature management table is information representing collaboration features, each of which uses the functions of multiple parts. In the collaboration feature management table, for example, information representing a combination of parts of devices, information representing a combination of part IDs, and information representing a collaboration feature that uses the functions of the multiple parts included in the combination are associated with one another. Of course, in the collaboration feature management table, information representing a combination of a part of a device and an entire device may be associated with information representing a collaboration feature that uses the function of the part of the device and the function of the entire device.
Specifically, a print function as a collaboration feature is assigned to the combination of the display unit of the PC (A) and the body portion of the MFP (B), and information representing the print function as a collaboration feature is associated with information representing the combination of the part ID of the display unit of the PC (A) and the part ID of the body portion of the MFP (B). The print function as a collaboration feature is, for example, a function of transmitting data stored in the PC (A) to the MFP (B) and printing the data by the MFP (B).
A print function as a collaboration feature is also assigned to the combination of the body portion of the MFP (B) and the body portion of the projector (C), and information representing the print function as a collaboration feature is associated with information representing the combination of the part ID of the body portion of the MFP (B) and the part ID of the body portion of the projector (C). This print function as a collaboration feature is, for example, a function of transmitting data projected by the projector (C) to the MFP (B) and printing the data by the MFP (B).
A scan-and-project function as a collaboration feature is assigned to the combination of the reading portion of the MFP (B) and the body portion of the projector (C), and information representing the scan-and-project function as a collaboration feature is associated with information representing the combination of the part ID of the reading portion of the MFP (B) and the part ID of the body portion of the projector (C). The scan-and-project function as a collaboration feature is, for example, a function of transmitting data generated through scanning by the MFP (B) to the projector (C) and projecting the data by the projector (C).
A collaboration feature may be a function that uses the functions of multiple parts included in the same device, or may be a function that uses the functions of parts included in multiple different devices. A collaboration feature may also be a function that uses the functions of three or more parts.
For example, after multiple parts of devices (for example, multiple parts of multiple different devices, or multiple parts of the same device) are determined (identified) by using the marker-based AR technology or the markerless AR technology, the determination unit 38 of the server 14 determines (identifies) the collaboration feature associated with the combination of the identified parts in the collaboration feature management table. Accordingly, the collaboration feature that uses the functions of the multiple identified (for example, shot) parts is determined (identified). For example, if the body portion of the MFP (B) and the body portion of the projector (C) are shot by the camera 42 of the terminal device 16 and both are identified, the determination unit 38 of the server 14 determines, in the collaboration feature management table, the print function and so forth as collaboration features associated with the combination of the body portion of the MFP (B) and the body portion of the projector (C).
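The combination lookup performed by the determination unit 38 can be sketched as follows. Keying the table on an unordered set of part IDs reflects that the combination, not the identification order, selects the collaboration feature here; the part-ID strings and feature names are illustrative assumptions.

```python
# A minimal sketch of the collaboration feature management table of Figure 42
# and the combination lookup performed by the determination unit 38.
COLLABORATION_TABLE = {
    frozenset({"pc_a_display", "mfp_b_body"}):
        ["print (data stored in PC (A) printed by MFP (B))"],
    frozenset({"mfp_b_body", "projector_c_body"}):
        ["print (data projected by projector (C) printed by MFP (B))"],
    frozenset({"mfp_b_reading", "projector_c_body"}):
        ["scan and project"],
}

def collaboration_features(identified_part_ids):
    """Determine the collaboration features associated with the combination
    of identified parts."""
    return COLLABORATION_TABLE.get(frozenset(identified_part_ids), [])
```

A combination that has no entry yields an empty list, which corresponds to the case where guidance toward a usable partner part would be presented instead.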
The above-described examples 1 to 8 are also applicable to this cooperation processing. For example, if a first part of a device is determined (identified), the controller 48 of the terminal device 16 presents, under the control of the controller 36 of the server 14, guidance indicating a second part of a device that is capable of executing a collaboration feature together with the function assigned to the first part.
Specifying cooperation target devices by superimposing device images
Cooperation target devices may be specified by superimposing multiple device images on one another. Hereinafter, this processing will be described with reference to Figures 43A, 43B, 43C, 44A, and 44B, each of which shows an example of a screen displayed on the UI unit 46 of the terminal device 16.
For example, it is assumed that MFP (B) and PC (A) is identified.As shown in Figure 43 A, device shows that picture 68 is displayed on terminal and sets On standby 16 UI units 46, and device image related with the device of identification 70 and 88 is displayed on device and shows picture 68 On.In this state, user will be related with first device using indicant (for example, the finger of user, pen or writing pencil) Device image is superimposed upon with cooperative partner device (second device) on related device image.For example, as shown in Figure 43 B, user Using operation object specified device image 70 and device image 70 is superimposed upon on device image 88, as indicated by arrow 142. For example, user is superposed on one another by device image by executing drag-and-drop operation.Specifically, it user's towing gear image 70 and is filling It sets and decontrols it at the position that image 70 is superimposed on device image 88.The drag-and-drop operation is technology according to prior art.Separately Selection of land can want device image superposed on one another according to the phonetic order that user is provided come specified.For example, being provided according to user Phonetic order, device image 70 and 88 can be designated as destination apparatus image and can be superposed on one another.
As a result of the device images 70 and 88 being superimposed on each other, the MFP (B) related to the device image 70 and the PC (A) related to the device image 88 are specified as cooperation target devices. For example, the MFP (B) specified first corresponds to the first device. If the PC (A) specified second is a device that is not capable of executing a collaboration feature together with the MFP (B), guidance indicating a device that is capable of executing a collaboration feature together with the MFP (B) is presented, as in the above-described examples 1 to 8.
The controller 48 of the terminal device 16 may cause the device image that is being dragged to be displayed on the UI unit 46 in an identifiable manner. For example, the device image that is being dragged may be displayed semi-transparently or in a specific color.
If the device image 70 is superimposed on the device image 88 and if the PC (A) is capable of executing a collaboration feature together with the MFP (B), a confirmation screen 144 is displayed on the UI unit 46 of the terminal device 16, as shown in Figure 43C. The confirmation screen 144 is a screen for confirming whether to cause the specified devices to cooperate with each other. If the user provides a cooperation instruction on the confirmation screen 144 (for example, if the user presses a "YES" button), information about collaboration features is displayed on the UI unit 46 of the terminal device 16.
For example, as shown in Figure 44A, the controller 48 of the terminal device 16 causes the UI unit 46 to display a collaboration feature display screen 146, and causes information about candidate collaboration features to be displayed on the collaboration feature display screen 146. By causing the PC (A) and the MFP (B) to cooperate with each other, for example, a scan-and-transfer function and a print function are implemented. Accordingly, information about the scan-and-transfer function and information about the print function are displayed on the collaboration feature display screen 146.
If the user specifies a collaboration feature and provides an execution instruction, a connection request is transmitted from the terminal device 16 to the cooperation target devices. As shown in Figure 44B, a standby screen 148 is displayed on the UI unit 46 of the terminal device 16 while the connection request is being processed. When the connection between the terminal device 16 and the target devices is successfully established, the specified collaboration feature is executed.
As described above, the device images related to the devices are superimposed on one another, and thereby a collaboration feature that uses the functions of the devices is determined. Accordingly, the functions can be caused to cooperate with one another without any special operation other than an image operation, and the cooperation between the functions is executed with a simple operation. Also in this case, guidance indicating a second device that is capable of executing a collaboration feature together with the first device can be presented, and thus user convenience in using collaboration features can be increased compared with a case where no guidance is presented.
A collaboration feature may also be determined by superimposing a partial image on a device image or on another partial image. This processing will be described with reference to Figures 45A and 45B, each of which shows an example of a screen displayed on the UI unit 46 of the terminal device 16.
As in the above-described cooperation processing using partial images, the function of a device varies in accordance with the position within the device image related to the device. By superimposing a partial image included in a device image on a partial image included in the same or a different device image, a collaboration feature that uses the functions related to the two partial images is determined. Hereinafter, this processing will be described in detail.
For example, it is assumed that MFP (B) and PC (A) is identified.As shown in Figure 45 A, device shows that picture 68 is displayed on terminal and sets On standby 16 UI units 46, and device image 70 and 88 is displayed on device and shows on picture 68.For example, parts of images 70a, Each in 70b, 70c, 88a and 88b is displayed as the image that can be discretely moved with another part image.
If the user specifies a partial image and superimposes it on another partial image, a collaboration feature that uses the functions related to the two partial images is determined, and information about the collaboration feature is displayed on the UI unit 46 of the terminal device 16. This determination processing may be executed by the determination unit 38 of the server 14 or by the terminal device 16.
For example, as indicated by an arrow 150 in Figure 45B, if the user drags the partial image 70b by using the operation object and drops it on the partial image 88b, the MFP (B) related to the device image 70 including the partial image 70b and the PC (A) related to the device image 88 including the partial image 88b are specified as cooperation target devices, and the scan function assigned to the partial image 70b and the screen display function assigned to the partial image 88b are also specified as target functions that are to cooperate with each other.
In the server 14, the functions assigned to the individual partial images are managed. For example, identification information for identifying partial images, function information representing the functions associated with the partial images, and collaboration feature information representing collaboration features executed through cooperation between functions are stored in the server 14 in association with one another. If a partial image is selected on the device display screen 68 and is superimposed on another partial image, identification information representing the partial images superimposed on each other is transmitted from the terminal device 16 to the server 14. In the example shown in Figure 45B, identification information representing the partial images 70b and 88b is transmitted from the terminal device 16 to the server 14. The determination unit 38 of the server 14 determines the functions assigned to the partial images 70b and 88b based on the identification information, and determines a collaboration feature that uses those functions. Information about the collaboration feature is transmitted from the server 14 to the terminal device 16 and is displayed on the terminal device 16.
As described above, in the case where each of the cooperation target devices has multiple functions, a function is selected in each of the target devices, and information about the collaboration feature that uses the specified functions is preferentially displayed. Accordingly, the collaboration feature that the user is expected to use is preferentially displayed.
The priority with which collaboration features are displayed may be changed in accordance with the order in which the partial images are superimposed on one another. In this case, information about the collaboration feature that uses the functions related to the superimposed partial images is preferentially displayed.
In the case where partial images are superimposed on one another, the partial image specified first corresponds to the first image, and the function related to that partial image corresponds to the first function. The partial image specified second (the partial image on which the first image is superimposed) corresponds to the second image, and the function related to that partial image corresponds to the second function. If the function specified second is not capable of executing a collaboration feature together with the function specified first, guidance indicating a function that is capable of executing a collaboration feature together with the function specified first is presented, as in the above-described examples 1 to 8. In the above example, if the screen display function related to the partial image 88b specified second is not capable of executing a collaboration feature together with the scan function related to the partial image 70b specified first, guidance indicating a function that is capable of executing a collaboration feature together with the scan function specified first is presented. In this case, guidance indicating an entire device (for example, a device image itself) capable of executing a collaboration feature together with the function specified first (for example, the scan function) may be presented, or guidance indicating a part of a device (for example, a partial image) to which there is assigned a function capable of executing a collaboration feature together with the function specified first may be presented.
Processing of switching between display of single-device functions and display of collaboration features
In the exemplary embodiment, control may be performed to switch between display of functions that are executable by a single device alone (hereinafter referred to as "single-device functions") and display of collaboration features.
For example, if only one device is identified within a predetermined identification period, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device (for example, the image forming apparatus 10) as single-device function information. The length of the identification period may be changed by the user. A device may be identified by applying the AR technologies or another technique. The identification processing may be executed by the server 14 or the terminal device 16. The start point of the identification period may be, for example, the time point at which this device is identified or a time point specified by the user (for example, the time point at which the identification processing starts).
For example, if no other device is identified within the identification period starting from the time point at which this device is identified, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information. In this case, this device is treated as the device identified within the identification period. The information about the device may be information transmitted from the server 14 to the terminal device 16, or information stored in the terminal device 16 in advance.
For another example, if only one device is identified within the identification period starting from a time point specified by the user, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information.
For another example, if an instruction to display single-device functions is provided after one device is identified, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information. The controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display, at all times or in the case where one device is shot (or one device is identified), a button image for providing an instruction to display single-device functions. If the user presses the button image, the controller 48 causes the UI unit 46 to display information about one or more functions of this device.
When the identification period elapses, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display a confirmation screen. The confirmation screen is, for example, a screen used by the user to provide an instruction to extend the identification period. If the user provides an instruction to extend the identification period through the confirmation screen and if no other device is shot within the extended period, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of the identified device.
The display control of single-device function information will be described further. For example, it is assumed that a device is identified by using the marker-based AR technology or the markerless AR technology. For example, if only one device is shot within a predetermined shooting period, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information. The start point of the shooting period may be the time point at which this device is shot or a time point specified by the user (for example, the time point at which shooting starts). The length of the shooting period may be changed by the user. After this device is shot, it is identified by using the marker-based AR technology or the markerless AR technology. The identification processing may be executed by the server 14 or the terminal device 16.
For example, if no other device is shot within the shooting period starting from the time point at which this device is shot, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information. In this case, this device is treated as the device shot within the shooting period.
For another example, if only one device is shot within the shooting period starting from a time point specified by the user, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information.
For another example, if an instruction to display single-device functions is provided after one device is shot, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display information about one or more functions of this device as single-device function information.
For example, if only one device is shot within the shooting period (for example, if no second device is shot within the shooting period starting from the time point at which the first device is shot, or if only one device is shot within the shooting period starting from a start point specified by the user), the controller 48 of the terminal device 16 transmits the image data generated through the shooting to the server 14. The shooting period may be measured by the controller 48 or a timer. The determination unit 38 of the server 14 determines (identifies) the device based on the image data and determines one or more functions of the device. Information about the one or more functions is transmitted from the server 14 to the terminal device 16 and is displayed on the UI unit 46 of the terminal device 16. Of course, the server 14 may manage the time instead of the terminal device 16, and may transmit information about one or more functions of the identified device to the terminal device 16.
When the shooting period elapses, the controller 48 of the terminal device 16 may cause the UI unit 46 of the terminal device 16 to display a confirmation screen. The confirmation screen is, for example, a screen used by the user to provide an instruction to extend the shooting period. If the user provides an instruction to extend the shooting period through the confirmation screen and if no other device is shot within the extended period, the controller 48 of the terminal device 16 transmits the image data obtained through the shooting to the server 14 and causes the UI unit 46 of the terminal device 16 to display one or more functions of this device. The length of the extended period may be changed by the user.
For another example, if an instruction to display single-device functions is provided after one device is shot, the controller 48 of the terminal device 16 may transmit the image data generated through the shooting to the server 14, and accordingly may receive, from the server 14, information about one or more functions of the shot device.
For another example, every time a device is shot and image data is generated, the controller 48 of the terminal device 16 may transmit the image data to the server 14, and accordingly may receive, from the server 14, information about one or more functions of the shot device. In this case, if only one device is shot within the shooting period, the controller 48 of the terminal device 16 causes the UI unit 46 of the terminal device 16 to display the information about this device as single-device function information.
On the other hand, if multiple devices are identified within the identification period, the controller 36 of the server 14 executes a collaboration feature mode. In the collaboration feature mode, guidance indicating a second device that is capable of executing a collaboration feature together with the first device is presented, as in the above-described examples 1 to 8.
For example, if a second device is identified within the identification period starting from the time point at which the first device is identified, the collaboration feature mode is executed. In this case, the first device is also treated as a device identified within the identification period. Furthermore, if the second device is identified within the identification period starting from the time point at which the first device is identified, the controller 36 of the server 14 may set a new identification period starting from the time point at which the second device is identified. The same applies thereafter; that is, if a third device is identified within the new identification period, another new identification period is set.
For another example, if multiple devices are shot within the identification period starting from a time point specified by the user, the collaboration feature mode may be executed.
For another example, if an instruction to display collaboration features is provided after multiple devices are identified, the collaboration feature mode may be executed. The controller 48 of the terminal device 16 causes the UI unit 46 to display, at all times or in the case where multiple devices are shot (or multiple devices are identified), a button image for providing an instruction to display one or more collaboration features. If the user presses the button image, the collaboration feature mode is executed.
For another example, if a second device is identified within a period during which no instruction to execute a function of the first device is provided after the first device is identified, the collaboration feature mode may be executed.
It will be described with the execution of collaboration feature pattern.For example, it is assumed that utilizing AR technologies based on label or unmarked AR technologies identify multiple devices.For example, if shooting multiple devices within the predetermined shooting period, collaboration feature pattern is executed. For example, if shooting second device within the shooting period that the time being taken from first device lights, collaboration feature is executed Pattern.In this case, first device is also treated as the device shot in section when shooting.In addition, if from first Second device is shot in the shooting period that the time that device is taken lights, then the controller 36 of server 14 can be set from second The new shooting period that the time point that device is taken starts.This is equally applicable to following sessions, that is, if within the newly shooting period 3rd device is shot, then sets another new shooting period.
For another example, it if shooting multiple devices within the shooting period since the time point specified by user, can perform Collaboration feature pattern.
For another example, if providing the instruction for showing collaboration feature after shooting multiple devices, it can perform collaboration feature mould Formula.
For another example, if during the period for the instruction for not having to provide the function of executing first device after shooting first device Second device is shot, then can perform collaboration feature pattern.
As described above, in switching between the display of single-device functions and the display of collaboration features, if one device is identified (for example, photographed), information about one or more functions of that device is displayed, and if plural devices are identified (for example, photographed), the collaboration feature mode is executed. Accordingly, information about the functions executable with the identified (for example, photographed) device or devices is provided to the user, which can be convenient for the user.
Since single-device functions or collaboration features become available merely by identifying the devices with the AR technology, each function can be used through a simple operation compared with the case where the user manually sets the functions to be executed, and the time and effort of the user can be reduced.
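The switching behavior summarized above can be sketched as a simple dispatch: one identified device yields its single-device function list, while two or more devices yield collaboration features looked up by the set of identified devices. The function tables below stand in for the device-function management table and are invented for illustration.

```python
def functions_to_display(identified_devices, single_functions, collaboration_features):
    """Return the function information to present: single-device functions
    for one identified device, collaboration features for two or more
    (a sketch; the tables passed in are illustrative placeholders)."""
    if len(identified_devices) == 1:
        return single_functions.get(identified_devices[0], [])
    # Collaboration feature mode: look up by the set of identified devices.
    return collaboration_features.get(frozenset(identified_devices), [])

# Hypothetical tables standing in for the device-function management table.
SINGLE = {"MFP": ["print", "scan", "copy"], "projector": ["project"]}
COLLAB = {frozenset({"MFP", "projector"}): ["scan-and-project"]}
```

For instance, identifying only the MFP yields its print, scan, and copy functions, while identifying both the MFP and the projector yields the scan-and-project collaboration feature.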
In the exemplary embodiment, a device image related to an identified device, and device images superimposed on one another, may be displayed three-dimensionally so as to be distinguished from the background image. That is, these images may be displayed as three-dimensional images. For example, the background image is displayed two-dimensionally while the device images are displayed three-dimensionally, so that the visibility of the device images can be increased. In addition, the color of a device image specified by the user may be changed, or the specified device image may blink, so that the specified device image can be distinguished from the other device images.
According to the exemplary embodiment, a collaboration feature that uses the functions of target devices made to cooperate with each other is determined by applying the AR technology, and information about the collaboration feature is displayed. Therefore, even if the user cannot tell from the appearance of the devices which collaboration features the target devices are able to execute, the information about the collaboration features is provided to the user. In addition, by making plural devices cooperate with one another, functions that cannot be executed by a single device become available, which can be convenient. Furthermore, the collaboration features become available merely by identifying the target devices with the AR technology. Therefore, compared with the case where the user manually sets the collaboration features to be executed, the collaboration features become available through a simple operation, and the time and effort of the user can be reduced.
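The guidance recited in the claims below (indicating a second device capable of executing a collaboration feature together with a specified first device) can be viewed as an inverse lookup over a collaboration-feature table: given the first device, list every other device that shares at least one collaboration feature with it. The table contents here are illustrative, not from the disclosure.

```python
def candidate_partners(first_device, collab_table):
    """List the devices able to execute a collaboration feature together
    with first_device (an illustrative sketch of the guidance lookup)."""
    partners = set()
    for devices in collab_table:
        if first_device in devices:
            partners.update(d for d in devices if d != first_device)
    return sorted(partners)

# Hypothetical collaboration-feature table keyed by device sets.
COLLAB_TABLE = {
    frozenset({"MFP", "projector"}): ["scan-and-project"],
    frozenset({"MFP", "PC"}): ["scan-and-store"],
}
```

Such a lookup would directly populate the candidate list of second devices described in the claims.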
For example, each of the image forming apparatus 10, the server 14, and the terminal device 16 is implemented through cooperation between hardware and software resources. Specifically, each of the image forming apparatus 10, the server 14, and the terminal device 16 includes one or more processors (not illustrated), such as a central processing unit (CPU). The one or more processors read and execute a program stored in a storage device (not illustrated), thereby implementing the functions of the units of the image forming apparatus 10, the server 14, and the terminal device 16. The program is stored in the storage device via a recording medium such as a compact disc (CD) or a digital versatile disc (DVD), or via a communication path such as a network. Alternatively, the units of the image forming apparatus 10, the server 14, and the terminal device 16 may be implemented by hardware resources such as processors, electronic circuits, or application-specific integrated circuits (ASICs); a device such as a memory may be used in the implementation. Alternatively, the units of the image forming apparatus 10, the server 14, and the terminal device 16 may be implemented by digital signal processors (DSPs) or field-programmable gate arrays (FPGAs).
The foregoing description of the exemplary embodiments of the present invention has been provided for the purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise forms disclosed. Obviously, many modifications and variations will be apparent to practitioners skilled in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical applications, thereby enabling others skilled in the art to understand the invention for various embodiments and with the various modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the following claims and their equivalents.

Claims (22)

1. An information processing apparatus comprising:
a controller that, if a first image related to a first device required for executing a collaboration feature is specified, performs control to present guidance indicating a second device capable of executing the collaboration feature together with the first device.
2. The information processing apparatus according to claim 1, wherein, if an image related to a device incapable of executing the collaboration feature together with the first device is further specified, the controller performs control to present the guidance.
3. The information processing apparatus according to claim 2, wherein, if an operation of linking the first image and the image related to the device incapable of executing the collaboration feature together with the first device to each other is performed, the controller performs control to present the guidance.
4. The information processing apparatus according to claim 2, wherein, if the first image and the image related to the device incapable of executing the collaboration feature together with the first device are superimposed on each other, the controller performs control to present the guidance.
5. The information processing apparatus according to any one of claims 1 to 4, wherein, if a partial image included in the first image is specified, the controller performs control to present guidance indicating a second device capable of executing the collaboration feature together with a function corresponding to the partial image.
6. The information processing apparatus according to any one of claims 1 to 5, wherein, as the control of presenting the guidance, the controller performs control to display a candidate list showing information about one or more second devices capable of executing the collaboration feature.
7. The information processing apparatus according to claim 6, wherein, if a second device is specified from among the one or more second devices on the candidate list, the controller performs control to display information about a collaboration feature that uses the specified second device.
8. The information processing apparatus according to any one of claims 1 to 6, wherein the controller performs control to display the collaboration feature while changing the collaboration feature in accordance with an order in which the first device and the second device are specified.
9. The information processing apparatus according to any one of claims 1 to 8, wherein, if the first device and the second device are specified, the controller further performs control to present guidance indicating a third device capable of executing a collaboration feature together with the first device and the second device.
10. The information processing apparatus according to claim 9, wherein the controller performs control to present the guidance while changing the third device in accordance with the order in which the first device and the second device are specified.
11. An information processing apparatus comprising:
a controller that, if a first image related to a first function required for executing a collaboration feature is specified, performs control to present guidance indicating a second function capable of executing the collaboration feature together with the first function.
12. The information processing apparatus according to claim 11, wherein, if an image related to a function incapable of executing the collaboration feature together with the first function is further specified, the controller performs control to present the guidance.
13. The information processing apparatus according to claim 12, wherein, if an operation of linking the first image and the image related to the function incapable of executing the collaboration feature together with the first function to each other is performed, the controller performs control to present the guidance.
14. The information processing apparatus according to claim 12, wherein, if the first image and the image related to the function incapable of executing the collaboration feature together with the first function are superimposed on each other, the controller performs control to present the guidance.
15. The information processing apparatus according to any one of claims 11 to 14, wherein, as the control of presenting the guidance, the controller performs control to display a candidate list showing information about one or more second functions capable of executing the collaboration feature.
16. The information processing apparatus according to claim 15, wherein an order in which the one or more second functions are arranged in the candidate list is determined on the basis of past usage records of the one or more second functions.
17. The information processing apparatus according to any one of claims 11 to 16, wherein the controller performs control to display the collaboration feature while changing the collaboration feature in accordance with an order in which the first function and the second function are specified.
18. The information processing apparatus according to any one of claims 11 to 17, wherein, if the first function and the second function are specified, the controller further performs control to present guidance indicating a third function capable of executing a collaboration feature together with the first function and the second function.
19. The information processing apparatus according to claim 18, wherein the controller performs control to present the guidance while changing the third function in accordance with the order in which the first function and the second function are specified.
20. The information processing apparatus according to any one of claims 11 to 19, wherein the first function and the second function are included in a group of functions registered in advance, a group of functions of one or more identified devices, a group of functions displayed on a display, or a group of functions displayed in a specific region of a display screen.
21. An information processing method comprising:
performing control, if a first image related to a first device required for executing a collaboration feature is specified, to present guidance indicating a second device capable of executing the collaboration feature together with the first device.
22. An information processing method comprising:
performing control, if a first image related to a first function required for executing a collaboration feature is specified, to present guidance indicating a second function capable of executing the collaboration feature together with the first function.
CN201710938467.4A 2017-01-11 2017-09-30 Information processing apparatus, information processing method, and computer program Active CN108307084B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017-002491 2017-01-11
JP2017002491A JP2018112860A (en) 2017-01-11 2017-01-11 Information processing device and program

Publications (2)

Publication Number Publication Date
CN108307084A (en) 2018-07-20
CN108307084B (en) 2022-06-24

Family

ID=62869993

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710938467.4A Active CN108307084B (en) 2017-01-11 2017-09-30 Information processing apparatus, information processing method, and computer program

Country Status (2)

Country Link
JP (1) JP2018112860A (en)
CN (1) CN108307084B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7271157B2 (en) * 2018-12-13 2023-05-11 シャープ株式会社 DISPLAY DEVICE, PROGRAM AND DISPLAY METHOD OF DISPLAY DEVICE
JP7509643B2 (en) 2020-10-01 2024-07-02 グローリー株式会社 Automated Trading System

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011182183A (en) * 2010-03-01 2011-09-15 Panasonic Corp Device information display apparatus, television, equipment information display method, program, and recording medium
JP2011253370A (en) * 2010-06-02 2011-12-15 Sony Corp Information processing device, information processing method and program
JP2012213144A (en) * 2011-03-18 2012-11-01 Ricoh Co Ltd Information processor, information processing system, device cooperation method and program
WO2013061517A1 (en) * 2011-10-27 2013-05-02 パナソニック株式会社 Apparatus for executing device coordination service, method for executing device coordination service, and program for executing device coordination service
CN104375948A (en) * 2013-08-14 2015-02-25 佳能株式会社 Image forming apparatus and control method thereof
US20150277816A1 (en) * 2014-03-28 2015-10-01 Brother Kogyo Kabushiki Kaisha Image processing apparatus, communication system, and relay device
JP6024848B1 (en) * 2016-05-06 2016-11-16 富士ゼロックス株式会社 Information processing apparatus and program
CN106161834A (en) * 2015-05-11 2016-11-23 富士施乐株式会社 Information processing system, information processor and information processing method
JP6052458B1 (en) * 2016-06-29 2016-12-27 富士ゼロックス株式会社 Information processing apparatus and program

Also Published As

Publication number Publication date
JP2018112860A (en) 2018-07-19
CN108307084B (en) 2022-06-24

Similar Documents

Publication Publication Date Title
CN107346220B (en) Information processing apparatus, information processing method, and computer program
JP6052459B1 (en) Information processing apparatus and program
CN107346221A (en) Message processing device and information processing method
CN107346204A (en) Message processing device and information processing method
JP6179653B1 (en) Information processing apparatus and program
JP6052458B1 (en) Information processing apparatus and program
CN109309770A (en) Information processing unit and the computer-readable medium for storing program
US10440208B2 (en) Information processing apparatus with cooperative function identification
CN107391061B (en) Information processing apparatus and information processing method
CN109309769A (en) Information processing unit and the computer-readable medium for storing program
JP6146528B1 (en) Information processing apparatus and program
JP6160761B1 (en) Information processing apparatus and program
CN107346218B (en) Information processing apparatus, information processing method, and computer program
CN108307084A (en) Information processing equipment and information processing method
CN109413293B (en) Information processing apparatus and computer-readable medium storing program
JP6327387B2 (en) Information processing apparatus and program
JP6443498B2 (en) Information processing apparatus and program
CN109309773A (en) Information processing unit and the computer-readable medium for storing program
JP2019067414A (en) Information processing apparatus and program
JP2018129097A (en) Information processing apparatus and program
JP6958680B2 (en) Information processing equipment and programs
JP6743928B2 (en) Information processing device and program
JP6455551B2 (en) Information processing apparatus and program
JP6443497B2 (en) Information processing apparatus and program
JP6624242B2 (en) Information processing device and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: Tokyo, Japan
Applicant after: Fuji film business innovation Co.,Ltd.
Address before: Tokyo, Japan
Applicant before: Fuji Xerox Co.,Ltd.
GR01 Patent grant