CN107111466B - Method for generating worksheet by using BYOD service and mobile device for performing the same - Google Patents


Info

Publication number
CN107111466B
CN107111466B (application CN201580069939.6A)
Authority
CN
China
Prior art keywords
mobile device
imaging apparatus
image forming
forming apparatus
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201580069939.6A
Other languages
Chinese (zh)
Other versions
CN107111466A (en)
Inventor
柳明汉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hewlett Packard Development Co LP
Original Assignee
Hewlett Packard Development Co LP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett Packard Development Co LP filed Critical Hewlett Packard Development Co LP
Priority claimed from PCT/KR2015/014095 (WO2016105083A1)
Publication of CN107111466A
Application granted
Publication of CN107111466B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201Dedicated interfaces to print systems
    • G06F3/1202Dedicated interfaces to print systems specifically adapted to achieve a particular effect
    • G06F3/1203Improving or facilitating administration, e.g. print management
    • G06F3/1204Improving or facilitating administration, e.g. print management resulting in reduced user or operator actions, e.g. presetting, automatic actions, using hardware token storing data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201Dedicated interfaces to print systems
    • G06F3/1202Dedicated interfaces to print systems specifically adapted to achieve a particular effect
    • G06F3/1203Improving or facilitating administration, e.g. print management
    • G06F3/1205Improving or facilitating administration, e.g. print management resulting in increased flexibility in print job configuration, e.g. job settings, print requirements, job tickets
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201Dedicated interfaces to print systems
    • G06F3/1223Dedicated interfaces to print systems specifically adapted to use a particular technique
    • G06F3/1275Print workflow management, e.g. defining or changing a workflow, cross publishing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/12Digital output to print unit, e.g. line printer, chain printer
    • G06F3/1201Dedicated interfaces to print systems
    • G06F3/1278Dedicated interfaces to print systems specifically adapted to adopt a particular infrastructure
    • G06F3/1292Mobile client, e.g. wireless printing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Facsimiles In General (AREA)
  • Studio Devices (AREA)

Abstract

There is provided a method of generating a worksheet defining an order in which jobs are executed, the method comprising: setting any one of an image forming apparatus and a mobile device as an input source for receiving a job target; setting a conversion method for converting the job target received through the input source; setting any one of the image forming apparatus and the mobile device as a transmission destination for transmitting the job target converted according to the conversion method; and storing a worksheet defining an order of executing the jobs according to the input source, the conversion method, and the transmission destination.
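The three settings in the abstract (input source, conversion method, and transmission destination) fully determine the order in which the jobs run. As a rough illustration only, a worksheet could be modeled as a small record whose execution order follows from those settings; all field and device names below are hypothetical, not taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical sketch of the worksheet record described in the abstract.
@dataclass
class Worksheet:
    input_source: str   # device that receives the job target
    conversion: str     # how the received job target is converted
    destination: str    # device to which the converted job target is sent

    def job_order(self):
        # The execution order follows directly from the three settings.
        return [("receive", self.input_source),
                ("convert", self.conversion),
                ("send", self.destination)]

ws = Worksheet("scanner@MFP-1", "photo-editor-app", "printer@MFP-2")
print(ws.job_order())
```

The record form makes the claimed ordering explicit: the job target is always received, then converted, then transmitted.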

Description

Method for generating worksheet by using BYOD service and mobile device for performing the same
Technical Field
One or more exemplary embodiments relate to mobile device communication, and more particularly, to a method of establishing a connection between a mobile device and an imaging apparatus, and an imaging apparatus and a mobile device for performing the method.
Background
Recently, as the use of personal devices at work has increased, bring your own device (BYOD) services have emerged. A BYOD service allows users to use their personal devices while conducting business. Thus, a company's information, devices, and systems may be accessed from personal devices such as, for example, laptop computers, smartphones, tablet personal computers (PCs), and the like. For example, a user, such as an employee, may conduct business by accessing the company's system from a personal laptop computer rather than from a company desktop computer issued for business use.
When a BYOD work environment is created, employees need not carry separate devices for business use and personal use, so productivity can be improved and company expenses for purchasing devices can be reduced.
Disclosure of Invention
Technical problem
One or more exemplary embodiments include a method of generating a worksheet defining an order in which jobs are performed by using a bring your own device (BYOD) service, and a mobile device for performing the method.
Additional aspects will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the exemplary embodiments provided.
Technical scheme
According to one or more exemplary embodiments, a method of generating a worksheet defining an order of executing jobs includes: setting any one of an image forming apparatus and a mobile device as an input source for receiving a job target; setting a conversion method for converting the job target received through the input source; setting any one of the image forming apparatus and the mobile device as a transmission destination for transmitting the converted job target; and storing a worksheet defining an order of executing the jobs according to the input source, the conversion method, and the transmission destination.
Setting any one of the imaging apparatus and the mobile device as the input source may include: selecting any one of the imaging apparatus and the mobile device; and selecting a function of the selected one of the imaging apparatus and the mobile device to operate as the input source.
Setting any one of the imaging apparatus and the mobile device as the input source may further include: setting options related to performing the selected function.
Setting the conversion method may include: setting a conversion method for converting the job target by using an application installed in the mobile device that stores the worksheet.
Setting any one of the imaging apparatus and the mobile device as the transmission destination may include: selecting any one of the imaging apparatus and the mobile device; and selecting a function to be executed by the selected one of the imaging apparatus and the mobile device after the job target is transmitted to the selected one.
Setting any one of the imaging apparatus and the mobile device as a transmission destination may further include: setting options related to performing the selected function.
When an imaging apparatus is set as the input source, setting any one of the imaging apparatus and the mobile device as the transmission destination may include: setting an image forming apparatus different from the image forming apparatus set as the input source as the transmission destination.
A name of the worksheet may be automatically generated so as to distinguish the details and the order of the jobs defined in the worksheet.
When the worksheet is stored in the mobile device, the mobile device may perform pairing with the image forming apparatus set as the input source or the transmission destination.
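The claimed method steps (select a device and function as the input source, set a conversion application, select a device and function as the transmission destination, then store the worksheet under an automatically generated name) can be sketched as a simple builder. This is a hypothetical illustration, not the patent's implementation; the class name, option keys, and naming scheme are all invented:

```python
# Illustrative builder following the claimed steps; all names are hypothetical.
class WorksheetBuilder:
    def __init__(self):
        self.steps = {}

    def set_input_source(self, device, function, **options):
        self.steps["input"] = (device, function, options)
        return self

    def set_conversion(self, app):
        self.steps["convert"] = app
        return self

    def set_destination(self, device, function, **options):
        self.steps["send"] = (device, function, options)
        return self

    def store(self, storage):
        # Per the claims, a name may be auto-generated so that the details
        # and order of the jobs are distinguishable.
        name = "_".join([self.steps["input"][1], self.steps["convert"],
                         self.steps["send"][1]])
        storage[name] = self.steps
        return name

store = {}
name = (WorksheetBuilder()
        .set_input_source("MFP-1", "scan", color=True)
        .set_conversion("image-editor")
        .set_destination("MFP-2", "print", copies=2)
        .store(store))
print(name)  # scan_image-editor_print
```

Each step mirrors one claimed setting, and the stored record preserves the execution order: receive via the input source, convert, then transmit.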
According to one or more exemplary embodiments, a mobile device includes: an input unit that receives a user input for setting an input source, a conversion method, and a transmission destination for forming a worksheet; a controller that generates a worksheet defining an order of executing jobs according to the input source, the conversion method, and the transmission destination set according to the user input received through the input unit; and a storage unit that stores the generated worksheet, wherein the input source is used to receive a job target, the conversion method is used to convert the job target received by the input source, the transmission destination is used to transmit the converted job target, and any one of the image forming apparatus and the mobile device is set as each of the input source and the transmission destination.
When the user input received through the input unit includes information on selecting any one of the imaging apparatus and the mobile device as the input source and on a function of the selected one to operate as the input source, the controller may generate a worksheet including the selected device and the selected function.
When the user input received through the input unit further includes information on setting an option related to executing the selected function, the controller may generate a worksheet further including the option.
When the user input received through the input unit contains information on converting the job target by using the application installed in the mobile device, the controller may generate a worksheet defining the application as a conversion method.
When the user input received through the input unit contains information on selecting any one of the image forming apparatus and the mobile device as the transmission destination and on a function to be executed by the selected one after the job target is transmitted to it, the controller may generate a worksheet including the selected device and the selected function.
When the user input received through the input unit further includes information on setting an option related to performing the selected function, the controller may generate a worksheet further including the option.
When an image forming apparatus is set as the input source, an image forming apparatus different from the one set as the input source may be set as the transmission destination.
The controller may automatically generate a name of the worksheet so as to distinguish the details and the order of the jobs defined in the worksheet.
The mobile device may further include a communication unit that communicates with the image forming apparatus, wherein the controller may pair with the image forming apparatus set as the input source or the transmission destination through the communication unit when the worksheet is generated.
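As a rough structural sketch of the claimed device, the settings collected by the input unit flow into a controller that generates and stores the worksheet and pairs with any image forming apparatus the worksheet references. All class and attribute names below are hypothetical illustrations, not the patent's design:

```python
# Minimal structural sketch of the claimed mobile device: a controller builds
# the worksheet from user input, a storage unit keeps it, and pairing is
# performed with each referenced image forming apparatus. Names are invented.
class MobileDevice:
    def __init__(self):
        self.storage = []     # storage unit for generated worksheets
        self.paired = set()   # apparatuses paired via the communication unit

    def controller_generate(self, user_input):
        ws = {"input": user_input["input"],
              "convert": user_input["convert"],
              "send": user_input["send"]}
        self.storage.append(ws)
        # Pair with each image forming apparatus set as input source or
        # transmission destination; the mobile device itself needs no pairing.
        for role in ("input", "send"):
            device = ws[role][0]
            if device != "mobile":
                self.paired.add(device)
        return ws

md = MobileDevice()
md.controller_generate({"input": ("MFP-1", "scan"),
                        "convert": "editor-app",
                        "send": ("mobile", "save")})
print(md.paired)  # only MFP-1 is paired
```

Note how pairing happens at generation time, matching the claim that the controller pairs through the communication unit when the worksheet is generated.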
Drawings
These and/or other aspects will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a diagram of a bring your own device (BYOD) environment in accordance with an illustrative embodiment;
FIG. 2A is a block diagram of hardware components of an imaging device according to an exemplary embodiment;
FIG. 2B is a block diagram of hardware components of an imaging device according to another exemplary embodiment;
FIG. 3A is a block diagram of hardware components of a mobile device, according to an example embodiment;
FIG. 3B is a block diagram of hardware components of a mobile device, according to another example embodiment;
fig. 4 is a diagram describing communication between a mobile device and an imaging apparatus in a BYOD environment according to an exemplary embodiment;
fig. 5 is a diagram describing an operation of performing Unified Protocol (UP) communication according to an exemplary embodiment;
fig. 6 illustrates a User Interface (UI) screen of a mobile device displayed while the mobile device is connected to an imaging apparatus to perform a BYOD service according to an exemplary embodiment;
fig. 7 is a flowchart of a discovery process of discovering an imaging apparatus to which a mobile device is connected to perform a BYOD service according to an exemplary embodiment;
fig. 8 is a diagram of a BYOD connection environment in which a mobile device discovers an imaging apparatus to perform BYOD service according to an exemplary embodiment;
fig. 9 illustrates a UI of a mobile device showing a result of discovering an imaging apparatus to perform a BYOD service according to an exemplary embodiment;
fig. 10 is a diagram depicting the installation of a BYOD application in accordance with an exemplary embodiment;
fig. 11 is a diagram describing information exchange performed during a pairing process between an image forming apparatus for performing a BYOD service and a mobile device according to an exemplary embodiment;
fig. 12 is a detailed flowchart of a pairing process between an image forming apparatus for performing a BYOD service and a mobile device according to an exemplary embodiment;
fig. 13A is a diagram describing authorization information provided from an imaging apparatus to a mobile device according to an exemplary embodiment;
fig. 13B is a diagram describing credential information (token and secret) provided from an imaging apparatus to a mobile device according to an exemplary embodiment;
fig. 14 is a diagram of a pairing process between an image forming apparatus and a mobile device for performing a BYOD service according to an exemplary embodiment;
fig. 15 is a diagram describing that an imaging apparatus transmits authorization for Personal Identification Number (PIN) code assignment to a mobile device after a PIN code is authenticated according to an exemplary embodiment;
FIG. 16A is a flowchart of a pairing process performed by a mobile device, according to an example embodiment;
fig. 16B is a flowchart of a pairing process performed by an image forming apparatus according to an exemplary embodiment;
fig. 17 is a diagram describing an event registration process between an imaging apparatus for performing a BYOD service and a mobile device according to an exemplary embodiment;
fig. 18 is a diagram for describing a method of transmitting an event generated by an imaging apparatus to a mobile device according to an exemplary embodiment;
fig. 19A is a flowchart of a method of establishing a connection with an imaging apparatus by a mobile device according to an exemplary embodiment;
fig. 19B is a flowchart of a method of establishing a connection with a mobile device by an imaging apparatus according to an exemplary embodiment;
fig. 20 is a diagram of an environment of a mobile device that generates, manages, and executes a worksheet using BYOD services, according to an example embodiment;
FIG. 21 is a diagram of a process of generating a worksheet for a multifunction printer (MFP) to print photographs captured by a mobile device, according to an exemplary embodiment;
fig. 22 is a diagram of a process of executing a worksheet for printing photographs captured by a mobile device by an imaging apparatus, according to an exemplary embodiment;
FIG. 23 is a flowchart of a process of generating a worksheet for printing of photos captured by a mobile device by an imaging apparatus, according to an exemplary embodiment;
FIG. 24 is a flowchart of a process of executing a worksheet for printing photographs captured by a mobile device by an imaging apparatus, according to an exemplary embodiment;
fig. 25 is a diagram of a process of generating a worksheet for editing an image scanned by an imaging apparatus by a mobile device and sending the edited image via email, according to an exemplary embodiment;
fig. 26 is a diagram of a process of executing a worksheet for editing an image scanned by an imaging apparatus by a mobile device and sending the edited image via email, according to an exemplary embodiment;
fig. 27 is a flowchart of a process of generating a worksheet of editing an image scanned by an imaging apparatus by a mobile device and sending the edited image via email, according to an exemplary embodiment;
fig. 28 is a diagram of a process of executing a worksheet for editing an image scanned by an imaging apparatus by a mobile device and sending the edited image via email, according to an exemplary embodiment;
fig. 29 is a diagram of a process of generating a worksheet for editing an image scanned by an image forming apparatus by a mobile device and printing the edited image by another image forming apparatus according to an exemplary embodiment;
fig. 30 is a diagram of a process of executing a worksheet for editing an image scanned by an image forming apparatus by a mobile device and printing the edited image by another image forming apparatus, according to an exemplary embodiment;
fig. 31 is a flowchart of a process of generating a worksheet for editing an image scanned by an image forming apparatus by a mobile device and printing the edited image by another image forming apparatus according to an exemplary embodiment;
fig. 32 is a diagram of a process of executing a worksheet for editing an image scanned by an image forming apparatus by a mobile device and printing the edited image by another image forming apparatus, according to an exemplary embodiment;
fig. 33 is a diagram of detailed processing of a mobile device and an imaging apparatus that perform pairing when generating a worksheet, according to an exemplary embodiment;
FIG. 34 is a flowchart of a method of generating a worksheet in accordance with an exemplary embodiment;
fig. 35 to 37 are diagrams describing a method of reserving a job by using a BYOD service according to an exemplary embodiment;
FIG. 38 is a diagram of a structure of a mobile device processing a worksheet in accordance with an illustrative embodiment;
FIG. 39 is a flowchart of a method of processing a workflow in accordance with an illustrative embodiment;
fig. 40 is a diagram of an operation of processing a workflow in which a scanning function of a first imaging apparatus and an editing function of a mobile device are combined, according to an exemplary embodiment;
fig. 41 is a diagram describing connection of a first imaging apparatus for processing a workflow and a mobile device when the mobile device selects the workflow;
fig. 42 is a diagram of a process of receiving a result of performing a scan function of the first imaging apparatus after the mobile device performs the scan function;
fig. 43 is a diagram for describing an editing function performed by using a resource of a mobile device;
fig. 44 is a diagram describing a manipulation interface with respect to drawing when an editing function of the mobile device is executed;
fig. 45 is a diagram describing a manipulation interface with respect to an additional image when an editing function of the mobile device is performed;
fig. 46 is a diagram of a process of processing a workflow in which the scanning function of the first image forming apparatus, and the editing function and the document transmission function of the mobile device are combined, according to an exemplary embodiment;
fig. 47 is a diagram of a process of executing a document transmission function of a mobile device with respect to an edited document obtained by editing a scanned document;
fig. 48 is a diagram of a process of processing a workflow in which the scanning function of the first imaging apparatus and the editing function and the sharing function of the mobile device are combined, according to an exemplary embodiment;
fig. 49 is a diagram of a process of executing a sharing function of a mobile device with respect to an edited document obtained by editing a scanned document;
fig. 50 is a diagram of a process of processing a workflow in which a scanning function of a first image forming apparatus, an editing function of a mobile device, and a document transmission function of the first image forming apparatus are combined, according to an exemplary embodiment;
fig. 51 is a diagram of a process of executing the document transmission function of the first image forming apparatus with respect to an edited document obtained by editing a scanned document;
FIG. 52 is a flowchart of a method of processing a workflow according to another exemplary embodiment;
fig. 53 is a diagram of a process of processing a workflow in which a scan function of a first image forming apparatus, an edit function of a mobile device, and a print function of a second image forming apparatus are combined, according to an exemplary embodiment;
fig. 54 is a diagram describing connection of first and second imaging apparatuses for processing a workflow with a mobile device when the mobile device selects the workflow;
FIG. 55 is a diagram of a process of executing the print function of the second image forming apparatus with respect to an edited document obtained by editing a scanned document;
FIGS. 56A to 56C illustrate UIs for generating worksheets by combining applications, in accordance with an illustrative embodiment;
fig. 57 is a diagram describing an embodiment of generating a worksheet for controlling power supplied to the imaging apparatus according to the position of the mobile device;
fig. 58 is a diagram describing an embodiment of generating a worksheet for changing settings of the imaging apparatus based on a temperature measured by the mobile device;
FIG. 59 is a diagram depicting an embodiment of generating a worksheet for downloading travel information, and then printing or transmitting/storing the downloaded travel information based on the location of the mobile device;
FIG. 60 is a diagram depicting an embodiment of generating a worksheet for filtering received mail and then printing or sending/storing the filtered mail in accordance with certain criteria;
FIG. 61 is a diagram depicting an embodiment of generating a worksheet for printing or sending/storing translations of text extracted from scanned images of documents;
FIG. 62 is a diagram for describing an embodiment of generating a worksheet for transmitting/storing a file obtained by combining a voice with a scanned image of a document;
FIG. 63 is a diagram describing an embodiment of generating a worksheet for automatically generating a filename of a scanned image of a document by identifying and searching for images included in the scanned image;
fig. 64 is a diagram of an environment that provides a secure printing solution in a BYOD environment, according to an example embodiment;
fig. 65A to 65C are diagrams describing a process of installing a print driver and a secure print application and executing printing in a Mobile Device Management (MDM) environment according to an exemplary embodiment;
FIG. 66 is a diagram depicting an embodiment of generating and executing a worksheet for scanning a document to obtain a scanned image, generating a filename using information obtained via Optical Character Recognition (OCR) of a particular region of the scanned image, and then storing the scanned image in a Server Message Block (SMB) server;
FIG. 67 is a diagram depicting an embodiment of generating and executing a worksheet for scanning a document to obtain a scanned image, generating a filename using information obtained via OCR of a particular region of the scanned image, and then storing the scanned image by a cloud document management application provided by a third party;
FIG. 68 is a diagram depicting an embodiment of generating and executing a worksheet for capturing an image of a document, generating a filename using information obtained via OCR of a particular region of the image, and then storing the image by a cloud document management application provided by a third party;
FIG. 69 is a diagram depicting an embodiment of generating and executing a worksheet for scanning business cards to obtain a scanned image, obtaining an email address from the scanned image, and then sending a file to the email address; and
fig. 70 is a diagram describing an embodiment of generating and executing a worksheet for scanning business cards to obtain a scanned image, obtaining email addresses from the scanned image, and then updating the email addresses to an address book.
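Several of the embodiments listed above (FIGS. 66 to 68) generate a filename from text recognized in a particular region of a scanned image. A minimal sketch of that idea, with the OCR step assumed and replaced by a plain string, might look like the following; the function name and sanitization rules are invented for illustration:

```python
import re

# Illustrative filename generation from OCR'd text (the OCR step itself is
# assumed; a recognized string stands in for it here).
def filename_from_ocr_text(ocr_text, extension="pdf"):
    # Keep only filesystem-safe characters, then collapse whitespace.
    safe = re.sub(r"[^\w\s-]", "", ocr_text).strip()
    safe = re.sub(r"\s+", "_", safe)
    return f"{safe}.{extension}"

print(filename_from_ocr_text("Invoice No. 2015-12"))  # Invoice_No_2015-12.pdf
```

The same sanitized text could equally feed the email-address extraction of FIGS. 69 and 70, with an address-shaped pattern in place of the filename rule.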
Detailed Description
According to one or more exemplary embodiments, a method of generating a worksheet defining an order of executing jobs includes: setting any one of an image forming apparatus and a mobile device as an input source for receiving a job target; setting a conversion method for converting the job target received through the input source; setting any one of the image forming apparatus and the mobile device as a transmission destination for transmitting the converted job target; and storing a worksheet defining an order of executing the jobs according to the input source, the conversion method, and the transmission destination.
According to one or more exemplary embodiments, a mobile device includes: an input unit that receives a user input for setting an input source, a conversion method, and a transmission destination for forming a worksheet; a controller that generates a worksheet defining an order of executing jobs according to the input source, the conversion method, and the transmission destination set according to the user input received through the input unit; and a storage unit that stores the generated worksheet, wherein the input source is used to receive a job target, the conversion method is used to convert the job target received by the input source, the transmission destination is used to transmit the converted job target, and any one of the image forming apparatus and the mobile device is set as each of the input source and the transmission destination.
Modes for carrying out the invention
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout. In this regard, the present exemplary embodiments may have different forms and should not be construed as being limited to the descriptions set forth herein. Accordingly, the exemplary embodiments are described below, by referring to the drawings, merely to explain aspects of the present description.
It will be further understood that the terms "comprises" and/or "comprising," when used herein, specify the presence of stated features or components, but do not preclude the presence or addition of one or more other features or components.
It will be understood that, although the terms "first," "second," etc. may be used herein to describe various components, these components should not be limited by these terms. These terms are used only to distinguish one component from another.
Expressions such as "at least one of," when preceding a list of elements, modify the entire list of elements rather than the individual elements of the list.
One or more exemplary embodiments will now be described in detail with reference to the accompanying drawings.
Fig. 1 is a diagram of a bring your own device (BYOD) environment in accordance with an exemplary embodiment. In a BYOD environment according to an exemplary embodiment, a user may manipulate various functions of the imaging apparatus 200 via a BYOD service by using one or more BYOD devices, for example, the mobile devices 100 and 105. In other words, one or more BYOD devices that control the imaging apparatus 200 may be connected to the imaging apparatus 200 simultaneously. The BYOD service may be a service in which a personal device accesses the functions of the imaging apparatus 200 so that the resources of the imaging apparatus 200 are shared with the personal device. The BYOD environment may be a network system for using the BYOD service.
In fig. 1, the mobile device 100 is illustrated as a tablet device and the mobile device 105 is illustrated as a smartphone, but the types of the mobile devices 100 and 105 are not limited thereto. In other words, the mobile device 100 or 105 may be any of a variety of devices that include a display screen, such as a tablet device, a smartphone, a laptop computer, a Personal Digital Assistant (PDA), and a wearable device (watch or glasses).
A BYOD application needs to be installed in the mobile device 100 or 105 in order to use the BYOD service. The BYOD application may also be referred to as a BYOD portal application. When the BYOD application is installed in the mobile device 100 or 105, the mobile device 100 or 105 may transmit control commands to the imaging apparatus 200 to control the operation of the imaging apparatus 200. Here, the BYOD application may control the imaging apparatus 200 via an Application Programming Interface (API). The mobile device 100 or 105 may be wirelessly connected to the imaging apparatus 200 through an Access Point (AP) or Wi-Fi Direct.
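The text states only that the BYOD application controls the imaging apparatus through an API over an access point or Wi-Fi Direct; it does not specify the protocol. Purely for illustration, such a control command could be assembled as an HTTP request. The endpoint path, host, and JSON shape below are invented assumptions, not the patent's API:

```python
import json

# Hedged sketch: building a control command for the imaging apparatus.
# The "/byod/api/..." endpoint is hypothetical.
def build_control_command(host, function, options):
    return {
        "method": "POST",
        "url": f"http://{host}/byod/api/{function}",  # hypothetical endpoint
        "body": json.dumps(options),
    }

cmd = build_control_command("192.168.0.10", "print", {"copies": 2, "duplex": True})
print(cmd["url"])
```

In a real deployment the request would then be sent over the Wi-Fi connection with any HTTP client; the point of the sketch is only that each apparatus function is addressable as an API call carrying job options.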
As shown, the imaging apparatus 200 includes a manipulator 210, and a user may also manipulate the imaging apparatus 200 via the manipulator 210. The manipulator 210 may include a display panel displaying a Graphical User Interface (GUI) and input keys receiving user input.
The manipulator 210 of the imaging apparatus 200 and the mobile device 100 or 105 may have independent User Interface (UI) content. In other words, the mobile device 100 or 105 may display the UI content shown on the manipulator 210, or may display separate UI content for manipulating the imaging apparatus 200 that differs from what the manipulator 210 displays. That is, UI content for performing various functions of the image forming apparatus 200, such as copying, printing, and scanning, may be provided independently to the BYOD application installed in the mobile device 100 or 105.
The user may perform some or all of the manipulations available on the manipulator 210 by using the mobile device 100 or 105. Therefore, according to the current embodiment, the user can manipulate the image forming apparatus 200 by using the mobile device 100 or 105, and can conveniently print a file stored in the mobile device 100 or 105, or perform an image forming job (e.g., scan to email or scan to cloud) by using an address book stored in the mobile device 100 or 105, and thus user convenience can be improved. Further, for example, the process of converting image data into print data, such as Printer Command Language (PCL), a Page Description Language (PDL), or PostScript (PS) data, may be performed via a resource of the image forming apparatus 200 having relatively high processing performance, instead of a resource of the mobile device 100 or 105 having relatively low processing performance. Thus, the print data can be processed at a higher speed than in a general mobile printing application.
As such, the mobile device 100 or 105 may be connected to the imaging apparatus 200 so as to perform a BYOD service to control the imaging apparatus 200. In the current embodiment, the mobile device 100 or 105 is connected to the image forming apparatus 200 via various connection methods such as, for example, pairing. A connection method of the BYOD service will now be described in detail with reference to the related drawings.
Fig. 2A is a block diagram of hardware components of the imaging device 200 of fig. 1, according to an example embodiment. Referring to fig. 2A, the image forming apparatus 200 may include the manipulator 210, the main controller 220, the communication unit 230, the printer 240, the scanner 250, and the fax unit 260 of fig. 1. However, it will be apparent to those of ordinary skill in the art that the imaging device 200 may also include general hardware components other than those shown in fig. 2A.
The manipulator 210 is a hardware component used by the user to manipulate or control the imaging apparatus 200. The manipulator 210 may include a display panel (not shown) for displaying a GUI screen and input keys for receiving user input. The manipulator 210 provides a GUI screen to the user and transmits a manipulation command received from the user to the main controller 220 through the GUI screen.
The main controller 220 is a hardware component that controls the operation of some or all of the components included in the image forming apparatus 200, and may be implemented as a processor. The main controller 220 may communicate with the manipulator 210 to transmit and receive commands required to manipulate and control the image forming apparatus 200 to and from the manipulator 210. Further, the main controller 220 may communicate with the mobile device 100 or 105 connected to the image forming apparatus 200 for the BYOD service to transmit and receive commands required to operate and control the image forming apparatus to and from the mobile device 100 or 105 of fig. 1.
The communication unit 230 is a hardware component for communicating with the mobile device 100 or 105 (of fig. 1) that provides the BYOD service as described above with reference to fig. 1. The communication unit 230 may connect to the mobile device 100 or 105 via the AP or directly by using Wi-Fi direct.
The printer 240 performs a printing operation according to the control of the main controller 220, the scanner 250 performs a scanning operation according to the control of the main controller 220, and the fax unit 260 performs a fax operation according to the control of the main controller 220.
Fig. 2B is a block diagram of hardware components of the imaging device 200 of fig. 1 according to another exemplary embodiment. Referring to fig. 2B, the image forming apparatus 200 may include the main controller 220 and the communication unit 230 of fig. 2A. In other words, the imaging apparatus 200 of fig. 2B includes only some of the components of the imaging apparatus 200 of fig. 2A for convenience of explanation, but the components of the imaging apparatus 200 of fig. 2B are not limited thereto.
The communication unit 230 transmits, to the mobile device 100, temporary credential information issued when a pairing request is received from the mobile device 100, and receives a Personal Identification Number (PIN) code encrypted by the mobile device 100.
The main controller 220 decrypts the PIN code by using the temporary credential information to determine whether the PIN code is valid.
When it is determined that the PIN code is valid, the main controller 220 issues permanent credential information and controls the communication unit 230 so that the permanent credential information is returned to the mobile device 100.
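The credential exchange above can be sketched as follows. This is a minimal illustration in Python; the actual cipher used to encrypt the PIN code and the format of the credential information are not specified here, so an HMAC over the PIN computed with the temporary secret stands in for the encryption step, and the token/secret values are random placeholders.

```python
import hashlib
import hmac
import secrets


def issue_temporary_credential():
    # Imaging apparatus side: issue a temporary token/secret pair
    # in response to a pairing request (formats are assumptions).
    return {"token": secrets.token_hex(8), "secret": secrets.token_hex(16)}


def protect_pin(pin, temp_secret):
    # Mobile device side: protect the user-entered PIN by using the
    # temporary credential information (HMAC here; the real scheme
    # is not specified by this description).
    return hmac.new(temp_secret.encode(), pin.encode(), hashlib.sha256).hexdigest()


def verify_and_issue_permanent(protected_pin, expected_pin, temp_secret):
    # Imaging apparatus side: recompute the MAC with the same temporary
    # secret; only a valid PIN yields permanent credential information.
    expected = hmac.new(temp_secret.encode(), expected_pin.encode(),
                        hashlib.sha256).hexdigest()
    if hmac.compare_digest(protected_pin, expected):
        return {"token": secrets.token_hex(8), "secret": secrets.token_hex(16)}
    return None


temp = issue_temporary_credential()
protected = protect_pin("1234", temp["secret"])
permanent = verify_and_issue_permanent(protected, "1234", temp["secret"])
```

In this sketch, a wrong PIN simply yields no permanent credential, which models the imaging apparatus refusing the pairing.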
Fig. 3A is a block diagram of the hardware components of the mobile device 100 of fig. 1, according to an example embodiment. Referring to fig. 3A, the mobile device 100 may include a controller 110, a display unit 120, an input unit 130, a communication unit 140, and a storage unit 150. In addition, the communication unit 140 may include a mobile communication module 142 and a wireless communication module 144. However, mobile device 100 may include general-purpose hardware components different from or in addition to those shown in FIG. 3A. Meanwhile, in fig. 3A, components of the mobile device 100 of fig. 1 will be described, but the details of fig. 3A may also be applied to the mobile device 105 of fig. 1. In other words, mobile device 105 of fig. 1 may include the components of mobile device 100 shown in fig. 3A and other additional components not shown. In the illustrated embodiment, the storage unit 150 may store a BYOD application (not shown). In some embodiments, the BYOD application may be stored in other components external to controller 110, or alternatively stored internally as part of controller 110. Further, although the storage unit 150 is illustrated as being external to the controller 110, the storage unit 150 may be a storage unit embedded in the controller 110.
The controller 110 is a hardware component implemented in at least one processor and may control the overall operation of the components in the mobile device 100. For example, the controller 110 may execute a BYOD application stored in the storage unit 150 to control a BYOD service with respect to the image forming apparatus 200. In addition, the controller 110 may control the mobile device 100 such that a connection with the imaging apparatus 200 for the BYOD service is established. Further, the controller 110 may control the functions and operations of the mobile device 100 described below according to one or more exemplary embodiments. The controller 110 may be implemented as a processor module such as a Central Processing Unit (CPU), an application processor, or a Graphics Processing Unit (GPU).
The display unit 120 displays and outputs information processed by the mobile device 100. For example, the display unit 120 may display a GUI screen for controlling the image forming apparatus 200 according to the BYOD service, or display information about an event (e.g., a print completion event or a power low event) generated in the image forming apparatus 200. Further, the display unit 120 may display information (e.g., a discovery result or a PIN code input screen) of the mobile device 100 connected to the imaging apparatus 200 for the BYOD service. The display unit 120 may be of any type, such as a Liquid Crystal Display (LCD), a thin film transistor-liquid crystal display (TFT-LCD), an Organic Light Emitting Diode (OLED), a flexible display, a three-dimensional (3D) display, or an electrophoretic display.
The input unit 130 allows a user to input information or instructions to control the mobile device 100. For example, although not shown, the input unit 130 may include a keyboard, a dome switch, a touch pad (a contact capacitance type, a pressure resistance film type, an infrared ray detection type, or a piezoelectric effect type), a jog wheel, or a jog switch, but is not limited thereto. The input unit 130 may receive a user input for selecting any one of various contents or options displayed on the display unit 120. For example, the input unit 130 may receive information on a PIN code for authenticating the mobile device 100 to the imaging apparatus 200 connected for the BYOD service from the user.
In the mobile device 100, the display unit 120 and the input unit 130 may be integrated in the form of a touch screen widely used in a smart phone or a tablet device.
The communication unit 140 performs communication between the mobile device 100 and the image forming apparatus 200, and may include a mobile communication module 142 and a wireless communication module 144.
The mobile communication module 142 transmits and receives mobile communication signals to and from a base station (not shown), an external device (not shown) or a server (not shown) on a mobile communication network (not shown). Here, examples of the mobile communication signal include various types of wireless data such as a voice call signal, an image call signal, a text/multimedia message signal, and a content data signal received through a third generation (3G) or fourth generation (4G) mobile communication network.
The wireless communication module 144 may include a bluetooth module (not shown), a Bluetooth Low Energy (BLE) module (not shown), a Near Field Communication (NFC) module (not shown), a Wireless Local Area Network (WLAN) (Wi-Fi) module (not shown), a Zigbee module (not shown), an infrared data association (IrDA) module (not shown), a Wi-Fi direct (WFD) module (not shown), or an Ultra Wideband (UWB) module (not shown), but is not limited thereto. The wireless communication module 144 is also capable of communicating with the imaging device 200 via a local wireless connection or a wireless network connection.
The storage unit 150 may store programs for processing and control of the controller 110, or may store various types of data (e.g., applications such as the BYOD application) and various types of content (e.g., documents, pictures, and images). The storage unit 150 may include any one of various types of storage media, such as flash memory, a Hard Disk Drive (HDD), card memory (e.g., a Secure Digital (SD) card), Random Access Memory (RAM), Static Random Access Memory (SRAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), Programmable Read Only Memory (PROM), magnetic memory, a magnetic disk, and an optical disk. Further, the storage unit 150 may operate as a web storage unit.
Fig. 3B is a block diagram of hardware components of the mobile device 100 of fig. 1, according to another example embodiment. Referring to fig. 3B, the mobile device 100 may include the controller 110 of fig. 3A and the communication unit 140 of fig. 3A. In other words, the mobile device 100 of fig. 3B includes some components of the mobile device 100 of fig. 3A for ease of illustration, but the components of the mobile device 100 of fig. 3B are not limited thereto.
The communication unit 140 receives temporary credential information issued by the image forming apparatus 200 when transmitting a pairing request to the image forming apparatus 200.
The controller 110 encrypts the PIN code input by the user by using the temporary credential information.
The communication unit 140 transmits the PIN code to the imaging apparatus 200, and receives permanent credential information from the imaging apparatus 200 when the imaging apparatus 200 determines that the PIN code is valid.
Functions and operations of the components of the imaging apparatus 200 and the mobile device 100 will now be described in detail with reference to figs. 2A to 3B.
Fig. 4 is a diagram describing communication between the mobile devices 100 and 105 and the imaging apparatus 200 in a BYOD environment according to an exemplary embodiment.
Referring to fig. 4, the imaging apparatus 200 is connected to the mobile devices 100 and 105, but the number of mobile devices is not limited to two. As shown in fig. 4, the mobile devices 100 and 105 for the BYOD service may be simultaneously connected to the imaging apparatus 200. In some embodiments, the number of BYOD devices that can be simultaneously connected to the imaging apparatus 200 may be limited. For example, the maximum number of BYOD devices that can be simultaneously connected to the imaging apparatus 200 may be set in advance based on a product specification of the imaging apparatus 200, such as the memory size, and when the maximum number has been reached, a further BYOD device attempting to connect to the imaging apparatus 200 may not be allowed to connect.
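The connection limit described above can be sketched as follows; a minimal illustration in which the maximum value and the device identifiers are assumptions, since the actual limit would come from the product specification.

```python
class ConnectionRegistry:
    """Tracks simultaneously connected BYOD devices against a preset maximum."""

    def __init__(self, max_devices):
        # The maximum would be set in advance from the product
        # specification (e.g., based on memory size).
        self.max_devices = max_devices
        self.connected = set()

    def connect(self, device_id):
        # Refuse the connection once the maximum has been reached.
        if len(self.connected) >= self.max_devices:
            return False
        self.connected.add(device_id)
        return True


registry = ConnectionRegistry(max_devices=2)
results = [registry.connect(d) for d in ("tablet-100", "phone-105", "phone-extra")]
```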
In the BYOD environment, the mobile devices 100 and 105 and the imaging apparatus 200 may perform communication by using a Unified Protocol (UP). In detail, the main controller 220 of the image forming apparatus 200 may perform UP communication with the mobile devices 100 and 105. Further, the main controller 220 may perform UP communication with the manipulator 210. The UP is a web service API, that is, a protocol for accessing, generating, deleting, and updating resources identified by a Uniform Resource Locator (URL) by using the hypertext transfer protocol (HTTP).
The mobile devices 100 and 105 may control the operation of the image forming apparatus 200 by transmitting an UP command to the main controller 220. The main controller 220 controls the printer 240, the scanner 250, and the fax unit 260 to perform operations corresponding to the UP command received from the manipulator 210 or the mobile devices 100 and 105.
Meanwhile, when an event is generated, the main controller 220 broadcasts the event to the manipulator 210 and the mobile devices 100 and 105. The manipulator 210 and the mobile devices 100 and 105 may each determine whether an event needs to be processed and perform an operation when the event needs to be processed and ignore the event when the event does not need to be processed.
To perform the UP communication, the main controller 220 may operate as a server. In other words, the main controller 220 may include the UP web server 221. Here, it may be assumed that the manipulator 210 and the mobile devices 100 and 105 are clients. The client can request resources from the UP web server 221 and the UP web server 221 responds to the request. The UP web server 221 and the client may use HTTP as a communication protocol. Thus, any device can connect to the UP web server 221 as long as the device uses HTTP, and can communicate with the UP web server 221 as long as the device uses the determined protocol, regardless of different platforms.
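A minimal sketch of such an HTTP resource server and client follows. The resource path `/ws/v1/printer/status` and the JSON body are illustrative assumptions, since the actual UP resource layout is not given here; the point is only that any HTTP-capable client can request a URL-identified resource from the server.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical resource tree exposed by the UP web server.
RESOURCES = {"/ws/v1/printer/status": {"state": "idle", "toner": 80}}


class UPHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Serve the resource addressed by the request URL, if it exists.
        body = RESOURCES.get(self.path)
        if body is None:
            self.send_error(404)
            return
        data = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the demo quiet


# Bind to an ephemeral port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), UPHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: a plain HTTP GET retrieves the resource.
url = f"http://127.0.0.1:{server.server_port}/ws/v1/printer/status"
with urllib.request.urlopen(url) as resp:
    status = json.loads(resp.read())
server.shutdown()
```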
Fig. 5 is a diagram for describing an operation of performing UP communication according to an exemplary embodiment. The manipulator 210 and the main controller 220 of fig. 5 may be components included in the image forming apparatus 200 of fig. 2A or 2B.
Referring to fig. 5, when the manipulator 210 transmits a connection request and a job request to the UP web server 221 of the main controller 220 by using HTTP in operation 501, the UP web server 221 registers the manipulator 210 in operation 502. In other words, the UP web server 221 generates a session by using access information included in the HTTP request received from the manipulator 210.
Similarly, when the mobile device 100 transmits a connection request and a job request to the UP web server 221 by using HTTP in operation 503, the UP web server registers the mobile device 100 in operation 504. In other words, the UP web server 221 generates a session by using access information included in an HTTP request received from the mobile device 100.
When an event is generated in the image forming apparatus 200 in operation 505, the UP web server 221 transmits responses to the manipulator 210 and the mobile device 100 in operations 506 and 507, respectively. According to an exemplary embodiment, the UP web server 221 may not immediately transmit a response when receiving a request from the manipulator 210 or the mobile device 100, but may transmit a response after generating an event, and this method may be referred to as a long polling method.
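The long polling method can be illustrated as follows: the response to a pending request is deferred until an event is generated. This is a simplified in-process simulation using a queue, not the actual UP wire protocol.

```python
import queue
import threading


class UPEventChannel:
    """Long-poll style channel: a request blocks until an event exists."""

    def __init__(self):
        self._events = queue.Queue()

    def poll(self, timeout=5.0):
        # Client request: instead of answering immediately, the server
        # holds the response until an event is generated (or times out).
        try:
            return self._events.get(timeout=timeout)
        except queue.Empty:
            return None

    def publish(self, event):
        # Imaging-apparatus side: generating an event releases the
        # pending response.
        self._events.put(event)


channel = UPEventChannel()
# Simulate the imaging apparatus generating an event shortly after
# the client's request arrives.
threading.Timer(0.1, channel.publish, args=({"event": "print-complete"},)).start()
result = channel.poll()
```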
Fig. 6 illustrates a UI screen of the mobile device 100 displayed while the mobile device 100 is connected to the imaging apparatus 200 to perform a BYOD service according to an exemplary embodiment. The UI screen of fig. 6 is arbitrarily shown for convenience of description, and thus the arrangement and composition of the UI screen may vary.
The mobile device 100 may be a BYOD device, and the process of the mobile device 100 connected to the imaging apparatus 200 may be roughly divided into three processes, i.e., a discovery process 650, a pairing process 660, and an event registration process 670. Here, the communication between the mobile device 100 of fig. 1 and the image forming apparatus 200 of fig. 1 may be performed based on an open standard for authorization (OAuth).
First, the mobile device 100 executes or starts the BYOD application 601 in the execution process 680 on the wallpaper UI 610. The BYOD application 601 is stored in the storage unit 150 of fig. 3A, and can be downloaded and installed into the mobile device 100 from commonly known online application markets such as Samsung Apps of Samsung, Google Play of Google, or AppStore of Apple. Alternatively, the BYOD application 601 may be a basic application installed in the mobile device 100 during manufacturing. The BYOD application 601 may provide various functions for the BYOD service, such as connecting to the imaging apparatus 200, displaying a GUI screen for controlling the imaging apparatus 200 of fig. 1, and generating commands for controlling the imaging apparatus 200 of fig. 1.
When the BYOD application 601 is executed in the execution process 680, the display unit 120 of fig. 3A displays the UI 620 that enables the discovery process 650 to be executed. The discovery process 650 searches for imaging devices that may be connected to the mobile device 100, such as, for example, the imaging devices of fig. 1. In the illustrated embodiment, three imaging devices are found and listed on the UI 620, namely "SCX-5737" 602, "CLX-8650", and "SCX-6401n". Information about a model name, an Internet Protocol (IP) address, and a location of a found imaging device may also be displayed on the UI 620. The user can select the imaging device "SCX-5737" 602 to be controlled (or connected) via the BYOD service from among the discovered imaging devices. Meanwhile, the list of imaging devices displayed on the UI 620 may include only imaging devices that support the BYOD service, or alternatively, imaging devices may be found regardless of whether the BYOD service is supported.
When the imaging device "SCX-5737" 602 to be controlled (or connected) is selected through the BYOD service, the pairing process 660 is performed between the imaging device "SCX-5737" 602 and the mobile device 100. Pairing process 660 is a series of operations performed by imaging apparatus "SCX-5737" 602 to determine whether mobile device 100 is reliable as a BYOD device and to initiate a connection session for the BYOD device.
The mobile device 100 prompts and receives user input regarding the PIN code via prompt 625 of UI 630 for the pairing process 660. The PIN code may be used to authenticate the mobile device 100 to the imaging apparatus "SCX-5737". In other words, to pair the mobile device 100 with the imaging apparatus "SCX-5737" 602 as a BYOD device, the PIN code may be authenticated. After the imaging device "SCX-5737" 602 authenticates the PIN code, the imaging device "SCX-5737" 602 may issue authentication information, such as credential information including a token and a secret (secret key or password), and transmit the credential information to the mobile device 100. The mobile device 100 may store the credential information (token and secret) in the storage unit 150 and transmit the credential information when transmitting the BYOD command to the imaging apparatus "SCX-5737" 602 to inform the imaging apparatus "SCX-5737" 602 that the BYOD command is a valid command. In other words, the imaging apparatus "SCX-5737" 602 may allow access to the BYOD service only to the mobile device 100 having the credential information in order to prevent another mobile device having an unauthenticated PIN code from using the BYOD service. The pairing process using the credential information will be described in detail below with reference to fig. 10 to 16B.
The event registration process 670 begins after the imaging apparatus "SCX-5737" 602 authenticates the PIN code and the imaging apparatus "SCX-5737" 602 pairs with the mobile device 100.
After establishing the BYOD connection, the event registration process 670 selects, from among the various events that the imaging apparatus "SCX-5737" 602 can generate, such as a print completion event, a paper low event, a paper jam event, and a scan completion event, the events that the mobile device 100 is to receive. Referring to the UI 640 for the event registration process 670, the mobile device 100 may receive an event generation notification about the "print completion" and "scan completion" events selected from among the various events, and may not receive event generation notifications from the imaging apparatus "SCX-5737" 602 about the other events that are not selected.
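The selective notification behavior can be sketched as follows. The event names follow the examples above; the registration interface itself is an assumption, and in the simplest reading each connected device simply ignores broadcast events it did not register for.

```python
class ByodClient:
    """A connected device (BYOD device or manipulator) with registered events."""

    def __init__(self, name, registered_events):
        self.name = name
        self.registered = set(registered_events)
        self.received = []

    def notify(self, event):
        # Process only events selected during event registration;
        # ignore everything else.
        if event in self.registered:
            self.received.append(event)


def broadcast(clients, event):
    # Main-controller side: every connected client is notified of
    # each generated event.
    for client in clients:
        client.notify(event)


mobile = ByodClient("mobile-100", {"print-complete", "scan-complete"})
panel = ByodClient("manipulator-210", {"paper-jam"})
for ev in ("print-complete", "paper-jam", "paper-low"):
    broadcast([mobile, panel], ev)
```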
In fig. 6, in order for the mobile device 100 to connect to the imaging apparatus 200 as a BYOD device, a BYOD application 601 is executed, and a discovery process 650, a pairing process 660, and an event registration process 670 are executed. The discovery process 650, pairing process 660, and event registration process 670 will now be described in more detail.
Fig. 7 is a flowchart of a discovery process 700 (similar to the discovery process 650 of fig. 6) of discovering the imaging apparatus 200 of fig. 1 to which the mobile device 100 of fig. 1 is to connect in order to perform a BYOD service, according to an example embodiment. Referring to fig. 7, the method may be performed by components of the mobile device 100 of fig. 3A or 3B.
In operation 701, the controller 110 of the mobile device 100 generates a discovery request for searching for an image forming apparatus connectable via the BYOD service, and the communication unit 140 (of fig. 3A and 3B, for example) of the mobile device 100 broadcasts the discovery request. Here, the communication unit 140 may broadcast the discovery request through a mobile communication module 142 (e.g., of fig. 3A) (e.g., 3G or 4G) or a wireless communication module 144 (e.g., of fig. 3A) (e.g., Wi-Fi, WFD, bluetooth, or NFC).
In operation 702, the controller 110 of the mobile device 100 determines whether an imaging apparatus is found based on the result of the broadcasting. If no imaging device is found, the discovery process 700 ends. If one or more imaging devices have been found, operation 703 is performed.
In operation 703, the communication unit 140 of the mobile device 100 receives Identification (ID) information about the discovered imaging apparatus from the discovered imaging apparatus. Here, the ID information may include information on: a model name of the imaging apparatus, whether the BYOD service is supported, a connection method of the imaging apparatus, an Internet Protocol (IP) address of the imaging apparatus, a location of the imaging apparatus, a description of the imaging apparatus, or whether the imaging apparatus is registered as a favorite.
In operation 704, the controller 110 of the mobile device 100 determines whether an imaging apparatus supporting the BYOD service exists among the discovered imaging apparatuses based on the received ID information. If no imaging apparatus supports the BYOD service, the discovery process ends. If there is an image forming apparatus supporting the BYOD service, operation 705 is performed.
In operation 705, the display unit 120 (e.g., of fig. 3A) of the mobile device 100 displays a list of imaging apparatuses supporting the BYOD service.
After receiving the ID information, the mobile device 100 may transmit a pairing request to the image forming apparatus 200 to perform the pairing process after the discovery process 700.
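Operations 701 to 705 can be sketched as follows; a minimal illustration in which the ID-information field names (`model`, `ip`, `byod`) are assumptions, since only the categories of ID information are described here.

```python
# Hypothetical discovery responses collected after broadcasting
# a discovery request (field names are illustrative only).
found = [
    {"model": "SCX-8230", "ip": "192.168.0.10", "byod": True},
    {"model": "SL-M4370 LX", "ip": "192.168.0.11", "byod": False},
    {"model": "CLX-8650", "ip": "192.168.0.12", "byod": True},
]


def byod_capable(devices):
    # Operations 704-705: keep only the imaging apparatuses whose
    # ID information reports BYOD support, for display in the list.
    return [d for d in devices if d["byod"]]


listed = byod_capable(found)
```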
Fig. 8 is a diagram of a BYOD connection environment 800 in which the mobile device 100 of fig. 1 discovers an imaging apparatus to perform a BYOD service according to an exemplary embodiment.
Referring to fig. 8, the communication unit 140 (of fig. 3A, for example) of the mobile device 100 may discover an imaging apparatus via WFD, NFC, Wi-Fi, bluetooth, and 3G or 4G mobile communication. In detail, the mobile device 100 may find "SCX-8230" 201 as a proximity imaging device connectable via WFD 804 by activating a WFD module (not shown), may find "SL-C473" 202 as a proximity imaging device connectable via NFC 808 by activating an NFC module (not shown), and may find "SCX-5737" 205 as a proximity imaging device connectable via bluetooth 812 by activating a bluetooth module (not shown).
Further, the mobile device 100 can discover "CLX-8650" 203 as an imaging apparatus connected to the AP 801 via a network by wirelessly connecting to the AP 801 via Wi-Fi 816. When the mobile device 100 exists within the same network environment as the imaging apparatus, for example, "CLX-8650" 203, that is, when the mobile device 100 is connected to the "CLX-8650" 203 through the AP 801, the mobile device 100 can discover the "CLX-8650" 203 by using a method such as universal plug and play (UPnP), Bonjour, Simple Network Management Protocol (SNMP), or multicast domain name system (mDNS). However, "CLX-8650" may be discovered in the same manner even if the mobile device 100 and "CLX-8650" 203 are not in the same network environment.
Further, the mobile device 100 may connect to the external server 802 through 3G or 4G mobile communication, and discover the "SL-M4370 LX" 204 as a remote imaging apparatus (e.g., in another region or in a foreign country) connected to the external server 802 (e.g., a web server, a cloud server, or a mobile carrier server) via a network. In other words, the mobile device 100 can discover a close-range or long-range imaging apparatus by using various communication methods.
Fig. 9 illustrates a UI 901 of the mobile device 100 of fig. 1, which illustrates a result of discovering an imaging apparatus for performing a BYOD service according to an exemplary embodiment.
In FIG. 9, it is assumed that the five imaging devices of FIG. 8, namely, "SCX-8230" 201, "SL-C473" 202, "CLX-8650" 203, "SL-M4370 LX" 204, and "SCX-5737" 205, are found by the mobile device 100.
In the UI 901, ID information of the imaging apparatus including information on a model name, whether the BYOD service is supported, a connection method, an IP address, a location, a description, and whether the imaging apparatus is registered as a favorite may also be displayed. The user may select one of the imaging devices displayed on the UI 901, for example, "SCX-8230," to connect the mobile device 100 to "SCX-8230" as a BYOD device.
Meanwhile, in fig. 9, imaging devices ("SCX-5737" and "SL-M4370 LX") that do not support the BYOD service are also displayed on the UI 901. However, in operations 704 and 705 of fig. 7, only imaging devices ("SCX-8230", "SL-C473", and "CLX-8650") that support the BYOD service are displayed. Exemplary embodiments are not limited thereto, and the discovery result may include imaging devices ("SCX-5737" and "SL-M4370 LX") that do not support the BYOD service, or may include only imaging devices ("SCX-8230", "SL-C473", and "CLX-8650") that support the BYOD service.
Fig. 10 is a diagram describing an installation process of a BYOD application according to an exemplary embodiment.
In a general pairing technique, it is necessary to install a specific application in two devices to be paired, and to pair the devices by the specific application. In other words, when a specific application is installed in a device, the specific application provides a permission to control the device and stores application information in a system of the device, so that the specific application is driven even after the device is turned on or off. In other words, when a specific application is installed, the specific application has authorization to control the device.
However, in the BYOD environment described in one or more exemplary embodiments, the BYOD application 1001 downloaded from the external server 1000 is installed in the mobile device 100 regardless of the imaging apparatus 200, and no corresponding application needs to be installed on the imaging apparatus 200. Thus, the pairing for the BYOD service described in one or more exemplary embodiments may be established by using only the BYOD application 1001 installed in the mobile device 100. In other words, in the BYOD environment described in one or more exemplary embodiments, the imaging apparatus 200 may be controlled under the authority of the BYOD application 1001 installed only in the mobile device 100.
In order to control the imaging apparatuses 200 to be paired by using the BYOD application 1001 installed only in the mobile device 100, the BYOD application 1001 needs to be identified, and authorization of the BYOD application 1001 needs to be set. Here, the BYOD application 1001 may be identified using credential information (token and secret) issued by the imaging apparatus 200, and an event registration process similar to the event registration process 670 of fig. 6 may be required to set the authorization of the BYOD application 1001.
After the pairing is completed, the image forming apparatus 200 may determine that various types of commands (e.g., API calls) transmitted from the BYOD application 1001 installed in the mobile device 100 are reliable until the pairing is disconnected.
Fig. 11 is a diagram describing information exchange performed during a pairing process between the imaging apparatus 200 for performing the BYOD service and the mobile device 100 according to an exemplary embodiment.
Referring to fig. 11, the input unit 130 (of fig. 3A, for example) of the mobile device 100 receives information on a PIN code from a user, as described above with reference to fig. 6. When the mobile device 100 transmits the PIN code to the imaging apparatus 200, the imaging apparatus 200 issues and returns (or transmits) credential information (token and secret) to the mobile device 100 in response to the PIN code.
The credential information issued by the imaging apparatus 200 is used to identify the mobile device 100 (BYOD application installed in the mobile device 100). The mobile device 100 may transmit a command for controlling the image forming apparatus 200 to the image forming apparatus 200 together with the credential information to notify the image forming apparatus 200 that the command is valid, and the image forming apparatus 200 may authenticate the mobile device 100 by determining that the command is valid using the credential information.
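One way to realize this validity check is to have the BYOD application attach the issued token and a signature computed with the secret to each command, as sketched below. The header names and the HMAC scheme are assumptions; the description only states that the credential information accompanies the command and is used to determine that the command is valid.

```python
import hashlib
import hmac


def sign_command(method, path, token, secret):
    # BYOD-application side: attach the issued token and an HMAC over
    # the request so the imaging apparatus can identify the sender.
    mac = hmac.new(secret.encode(), f"{method} {path}".encode(),
                   hashlib.sha256).hexdigest()
    return {"X-BYOD-Token": token, "X-BYOD-Signature": mac}


def command_is_valid(method, path, headers, issued_credentials):
    # Imaging-apparatus side: look up the secret for the presented token
    # and verify the signature; unknown tokens or bad MACs are rejected.
    secret = issued_credentials.get(headers.get("X-BYOD-Token"))
    if secret is None:
        return False
    expected = hmac.new(secret.encode(), f"{method} {path}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(headers.get("X-BYOD-Signature", ""), expected)


issued = {"tok-1": "s3cret"}  # credentials previously issued during pairing
headers = sign_command("POST", "/ws/v1/print", "tok-1", "s3cret")
```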
Meanwhile, in one or more exemplary embodiments, terms such as token and secret are used, but it is obvious to those of ordinary skill in the art that these terms may be changed to other terms such as key as long as they are used to identify the mobile device 100 (BYOD application installed in the mobile device 100).
Fig. 12 is a detailed flowchart of a pairing process between the imaging apparatus 200 and the mobile device 100 for performing the BYOD service according to an exemplary embodiment.
Referring to fig. 12, operations 1211 to 1215 are an unsigned process 1210 of exchanging information in an unencrypted (unsigned) state, and operations 1221 to 1226 are a signed process 1220 of exchanging information in an encrypted (signed) state. The pairing process of fig. 12 may be performed by the components of the imaging apparatus 200 and the mobile device 100 described above with reference to fig. 2A to 3B. The unsigned process 1210 may be performed when the BYOD application of the mobile device 100 requests an unsigned API for pairing from the imaging apparatus 200 by using Secure Sockets Layer (SSL)/Transport Layer Security (TLS), and the imaging apparatus 200 allows the request.
In operation 1211, the mobile device 100 (controller 110) performs a pairing process of the BYOD application (similar to the pairing process 660 of fig. 6). Here, it is assumed that the BYOD application executed in the mobile device 100 has performed the discovery process similar to the discovery process 650 and selected the imaging apparatus 200, as described above with reference to fig. 6 to 9.
In operation 1212, the image forming apparatus 200 transmits the authorization information provided by the image forming apparatus 200 to the mobile device 100 through the communication unit 230 (of fig. 2A). Here, the authorization information includes the information and/or functions of the imaging apparatus 200 to which the mobile device 100 is allowed access, and may contain information on: an option related to a print function, an option related to a scan function, an option related to a fax function, an option related to a copy function, a notification related to a status change event, a notification related to an error event, and a notification related to a job processing event.
In operation 1213, the mobile device 100 transmits a pairing request to the image forming apparatus 200, from the communication unit 140 (e.g., of fig. 3A) to the communication unit 230.
In operation 1214, the image forming apparatus 200 issues temporary credential information (a temporary token and a temporary secret) via the main controller 220 of fig. 4, and the image forming apparatus 200 returns (or transmits) the temporary credential information to the mobile device 100 via the communication unit 140 through the communication unit 230.
In operation 1215, the mobile device 100 receives information about a PIN code from a user via the input unit 130 (of fig. 3A, for example). In other words, the user inputs the PIN code through the input unit 130. The PIN code may be used to determine that the mobile device 100 is valid as a BYOD device with respect to the imaging apparatus 200. The PIN code may be issued by a manager who manages the BYOD environment or by the image forming apparatus 200, for example, via the main controller 220.
In operation 1221, the controller 110 (e.g., of fig. 3A) of the mobile device 100 encrypts the PIN code by using the temporary credential information. Here, the mobile device 100 (controller 110) may encrypt the PIN code with the temporary credential information according to any well-known method, such as a hash algorithm or a key encryption algorithm.
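The embodiment leaves the concrete algorithm open (a hash algorithm or a key encryption algorithm may be used). As one hedged illustration only, the sketch below derives a keystream from the temporary token and secret and XORs it with the PIN, so that the imaging apparatus, which holds the same temporary credentials, can reverse the operation in operation 1223. All function names are hypothetical, and a real implementation would use a vetted cipher such as AES rather than this toy XOR scheme.

```python
import hashlib
from itertools import cycle, islice

def _keystream(temp_token: str, temp_secret: str, length: int) -> bytes:
    # Derive illustrative key bytes from the temporary credentials.
    seed = hashlib.sha256(f"{temp_token}:{temp_secret}".encode()).digest()
    return bytes(islice(cycle(seed), length))

def encrypt_pin(pin: str, temp_token: str, temp_secret: str) -> str:
    # Operation 1221: the mobile device encrypts the user-entered PIN.
    ks = _keystream(temp_token, temp_secret, len(pin))
    return bytes(p ^ k for p, k in zip(pin.encode(), ks)).hex()

def decrypt_pin(cipher_hex: str, temp_token: str, temp_secret: str) -> str:
    # Operation 1223: the imaging apparatus reverses the operation
    # using the same temporary credentials.
    data = bytes.fromhex(cipher_hex)
    ks = _keystream(temp_token, temp_secret, len(data))
    return bytes(c ^ k for c, k in zip(data, ks)).decode()

# Round trip with the example values from fig. 14.
cipher = encrypt_pin("CD123456AB", "jj12ejefjf933", "cd93rwhskf")
assert decrypt_pin(cipher, "jj12ejefjf933", "cd93rwhskf") == "CD123456AB"
```

Because the keystream depends on both the temporary token and the temporary secret, only a device holding the same temporary credentials can recover the PIN, which matches the role the temporary credential information plays in operations 1221 to 1223.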
In operation 1222, the mobile device 100 transmits the encrypted PIN code to the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 through the communication unit 140.
If the main controller 220 or the communication unit 230 of the image forming apparatus 200 does not receive the encrypted PIN code within a certain period of time after transmitting the temporary credential information in operation 1214, the main controller 220 of the image forming apparatus 200 may cancel the pairing process and discard the temporary credential information.
In operation 1223, the image forming apparatus 200 decrypts the encrypted PIN code by using the temporary credential information through the main controller 220 (of fig. 2A, for example).
In operation 1224, the image forming apparatus 200 authenticates the decrypted PIN code through the main controller 220 (of fig. 2A, for example). In other words, the image forming apparatus 200 determines whether the decrypted PIN code is a valid PIN code allocated to the mobile device 100 through the main controller 220 (of fig. 2A, for example), so that the mobile device 100 operates as a BYOD device with respect to the image forming apparatus 200.
When it is determined that the decrypted PIN code is valid, the image forming apparatus 200 issues permanent credential information (a permanent token and a permanent secret) through the main controller 220 (of fig. 2A, for example), and transmits the permanent credential information to the mobile device 100 from the communication unit 230 (of fig. 2A, for example) to the communication unit 140 (of fig. 3A, for example) in operation 1225. The permanent credential information may be information for assigning the mobile device 100 authorization to control the functions of the image forming apparatus 200. The term "permanent" may be interpreted to mean that the permanent credential information is valid until the imaging apparatus 200 or the mobile device 100 discards it. Meanwhile, since the decryption of the encrypted PIN code is completed, the image forming apparatus 200 discards the temporary credential information through the main controller 220 (of fig. 2A, for example). In other words, the temporary credential information is used only during the encryption and decryption of the PIN code as part of the pairing process.
The temporary token and the permanent token may have different values, and the temporary secret and the permanent secret may also have different values.
Although not shown in fig. 12, in operation 1225 the image forming apparatus 200 may also determine the authorization assigned to the PIN code through the main controller 220 (of fig. 2A, for example), as shown in fig. 15, and may transmit information about that authorization (e.g., the information 1501 of fig. 15) to the mobile device 100 from the communication unit 230 (e.g., of fig. 2A) to the communication unit 140 (e.g., of fig. 3A).
In operation 1226, the image forming apparatus 200, through the main controller 220 and the communication unit 230 (of fig. 2A, for example), and the mobile device 100, through the controller 110 and the communication unit 140 (of fig. 3A, for example), start a pairing session, thereby completing the pairing process. In other words, the mobile device 100 may now be trusted as a BYOD device for controlling the imaging apparatus 200.
Fig. 13A is a diagram describing authorization information provided from the imaging apparatus 200 to the mobile device 100 according to an exemplary embodiment.
Referring to fig. 13A, in operation 1212 of fig. 12, authorization information may be provided from the imaging apparatus 200 to the mobile device 100. The authorization information includes information about functions of the imaging apparatus 200 to which the mobile device 100 is allowed to access, and may include a list of authorizations to be provided by the imaging apparatus 200, such as "saalocaluaccess", "saaajobrequest", "saaajobcontrol", "saaaservicemode", "saaaupdate", "saaareadcardreader", "saaamanageapplicationinstance", "saaaobtainthwisetysessision", "saaaobtainsafe creatives", and "saaaobtainusecondensionals".
Fig. 13B is a diagram describing credential information (token and secret) provided from the imaging apparatus 200 (of fig. 1, for example) to the mobile device 100 (of fig. 1, for example) according to an exemplary embodiment.
The token ("jjjhhmht6kngt545") and secret ("aalljnzxy678687jasd") shown in fig. 13B may be a temporary token and a temporary secret, or a permanent token and a permanent secret, respectively.
Fig. 14 is a diagram of a pairing process (similar to pairing process 660 discussed above, e.g., with respect to fig. 6) between a mobile device 100 (similar to mobile device 100 of fig. 1) for performing a BYOD service and an imaging apparatus 200 (similar to imaging apparatus 200 of fig. 1), according to an example embodiment.
Referring to fig. 14, during the pairing process, the imaging apparatus 200 transmits temporary credential information 1401 (a temporary token ("jj12ejefjf933") and a temporary secret ("cd93rwhskf")) to the mobile device 100.
Upon receiving the PIN code ("CD123456AB") from the user, the mobile device 100 encrypts the PIN code by using the temporary credential information 1401 to obtain an encrypted PIN code 1402. The encrypted PIN code 1402 is then transmitted to the image forming apparatus 200.
The imaging apparatus 200 decrypts the encrypted PIN code 1402 by using the temporary credential information 1401 to obtain a decrypted PIN code 1403. The imaging apparatus 200 verifies the validity of the decrypted PIN code 1403 to determine whether the mobile device 100 has transmitted a valid PIN code to operate as a BYOD device.
When the decrypted PIN code 1403 is valid, the imaging apparatus 200 sends permanent credential information 1404 (a permanent token ("f49hesfi39e") and a permanent secret ("sdf9q0qjf03")) to the mobile device 100. The mobile device 100 stores the permanent credential information 1404 in the storage unit 150. In this way, the pairing process is completed, and the mobile device 100 may be considered trusted as a BYOD device for controlling the imaging apparatus 200. The permanent credential information 1404 stored in the storage unit 150 may persist in the mobile device 100 for BYOD services until the user erases it; in some embodiments, the user may intentionally erase the permanent credential information 1404 with an erase process (not shown).
After the pairing process, when the mobile device 100, as a BYOD device, is to transmit a specific command to the imaging apparatus 200, the mobile device 100 may transmit the permanent credential information 1404 together with the specific command to notify the imaging apparatus 200 that the command is a valid command transmitted by a BYOD device. Conversely, when a device transmits a command without the permanent credential information (the permanent token and permanent secret), the image forming apparatus 200 may determine that the command is invalid and ignore or discard the command.
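The embodiment only states that the permanent credential information is transmitted "together with" the command. The following sketch (hypothetical names, assuming an HMAC-style signature keyed by the permanent secret) shows one plausible way the imaging apparatus could accept commands from paired devices and discard unsigned or forged ones:

```python
import hashlib
import hmac

def sign_command(command: str, perm_token: str, perm_secret: str) -> dict:
    # The BYOD device sends the command, its permanent token, and a
    # signature keyed by the permanent secret.
    sig = hmac.new(perm_secret.encode(),
                   f"{perm_token}:{command}".encode(),
                   hashlib.sha256).hexdigest()
    return {"command": command, "token": perm_token, "signature": sig}

def accept_command(request: dict, issued: dict) -> bool:
    # `issued` maps permanent tokens to permanent secrets held by the
    # imaging apparatus. An unknown token means the command is discarded.
    secret = issued.get(request.get("token"))
    if secret is None:
        return False
    expected = hmac.new(secret.encode(),
                        f"{request['token']}:{request['command']}".encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, request.get("signature", ""))

issued = {"f49hesfi39e": "sdf9q0qjf03"}   # example values from fig. 14
ok = sign_command("print", "f49hesfi39e", "sdf9q0qjf03")
assert accept_command(ok, issued)
assert not accept_command({"command": "print", "token": "unknown"}, issued)
```

The signature binds the command to the permanent token and secret, so a device without the permanent credential information cannot produce a request the apparatus will treat as valid, matching the behavior described above.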
Fig. 15 is a diagram describing that the imaging apparatus 200 transmits the authorization assigned to the PIN code to the mobile device 100 after authenticating the PIN code according to an exemplary embodiment.
Fig. 15 shows a management table 1500 for assigning available authorizations and expiration dates according to the type of PIN code. Assume that the PIN code entered on user A's mobile device 100 during the pairing process (similar to pairing process 660 of fig. 6 and the pairing process described above) is "CD123456AB". When the PIN code obtained by decrypting the encrypted PIN code received from the mobile device 100 is "CD123456AB", the imaging apparatus 200 ("MFP A") may transmit information 1501 on the authorization ("∞") and the expiration date ("∞") assigned to that PIN code to the mobile device 100. Accordingly, the mobile device 100, as a BYOD device assigned the PIN code "CD123456AB", can use the imaging apparatus 200 without any restriction on authorization or expiration date. In other words, when the PIN code is determined to be valid, the imaging apparatus 200 may transmit to the mobile device 100 the authorization information, assigned to the PIN code, for accessing the imaging apparatus 200.
However, when the PIN code is "SF8FW93KS" and the user of the mobile device 100 is not user A, the mobile device 100 is not operable as a BYOD device. Further, even if the user of the mobile device 100 is user B, when the PIN code input through the mobile device 100 is "SF8FW93KS", the mobile device 100 can use only the print and scan functions of the image forming apparatus 200, and only until December 31, 2015.
The management table 1500 may be a table that assigns, according to the value of the PIN code, authorization information such as individual (different) authorizations and expiration dates as BYOD device setting information, set by a manager of the BYOD environment or of the image forming apparatus. Accordingly, the administrator of the imaging apparatus 200 or of the BYOD environment can configure the management table 1500 to adjust the authorization of each BYOD device.
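The management table 1500 can be modeled as a simple mapping from PIN code to allowed functions and expiration date. The sketch below is a hypothetical rendering of the two rows discussed above (the unrestricted "∞" entries are modeled as None; the set of function names is assumed, not taken from the embodiment):

```python
from datetime import date

ALL_FUNCTIONS = frozenset({"print", "scan", "copy", "fax"})

# Hypothetical rows of management table 1500: PIN -> (functions, expiry).
# None stands for the unrestricted "∞" entries of fig. 15.
MANAGEMENT_TABLE = {
    "CD123456AB": (None, None),                            # user A
    "SF8FW93KS": ({"print", "scan"}, date(2015, 12, 31)),  # user B
}

def lookup_authorization(pin: str, today: date) -> frozenset:
    entry = MANAGEMENT_TABLE.get(pin)
    if entry is None:
        return frozenset()        # unknown PIN: not a valid BYOD device
    functions, expires = entry
    if expires is not None and today > expires:
        return frozenset()        # authorization has expired
    return ALL_FUNCTIONS if functions is None else frozenset(functions)

assert lookup_authorization("CD123456AB", date(2020, 1, 1)) == ALL_FUNCTIONS
assert lookup_authorization("SF8FW93KS", date(2015, 6, 1)) == {"print", "scan"}
assert lookup_authorization("SF8FW93KS", date(2016, 1, 1)) == frozenset()
```

Returning an empty set for both unknown and expired PINs mirrors the behavior described above: such a device simply cannot operate as a BYOD device.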
Fig. 16A is a flowchart of pairing processing (similar to pairing processing 660 of fig. 6 and the pairing processing described above) performed by mobile device 100 according to an example embodiment.
In operation 1601, the communication unit 140 of the mobile device 100 (e.g., of fig. 1) transmits a pairing request to the communication unit 230 (e.g., of fig. 2A) of the image forming apparatus 200 (e.g., of fig. 1).
In operation 1602, the communication unit 140 of the mobile device 100 receives temporary credential information (a temporary token and a temporary secret) issued by the main controller 220 of the image forming apparatus 200 (e.g., of fig. 2A).
In operation 1603, the input unit 130 (of fig. 3A, for example) of the mobile device 100 receives information about the PIN code from the user. In other words, the user inputs the PIN code through the input unit 130 of the mobile device 100.
In operation 1604, the controller 110 of the mobile device 100 encrypts the PIN code by using the temporary credential information.
In operation 1605, the communication unit 140 of the mobile device 100 transmits the encrypted PIN code to the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200.
After the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 authenticates the PIN code, the communication unit 140 (of fig. 3A, for example) of the mobile device 100 receives permanent credential information (permanent token and permanent secret) together with authorization information (information 1501 assigned by the management table 1500 of fig. 15) assigned to the PIN code in operation 1606.
In operation 1607, the controller 110 of the mobile device 100 generates a control command of the image forming apparatus 200, and the communication unit 140 of the mobile device 100 transmits the control command to the communication unit 230 of the image forming apparatus 200 together with the permanent credential information. As such, the controller 110 (e.g., of fig. 3A) of the mobile device 100 may perform a BYOD service with respect to the imaging apparatus 200.
Fig. 16B is a flowchart of pairing processing performed by the image forming apparatus 200 according to an exemplary embodiment.
In operation 1611, the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 receives a pairing request from the communication unit 140 of the mobile device 100.
In operation 1612, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 issues temporary credential information (temporary token and temporary secret), and the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 transmits the temporary credential information to the communication unit 140 of the mobile device 100.
In operation 1613, the communication unit 230 (of fig. 2A, for example) of the imaging apparatus 200 receives the PIN code encrypted by the controller 110 (of fig. 3A, for example) of the mobile device 100 from the communication unit 140 (of fig. 3A, for example) of the mobile device 100.
In operation 1614, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 decrypts the encrypted PIN code by using the temporary credential information.
In operation 1615, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 determines whether the decrypted PIN code is valid. When it is determined that the decrypted PIN code is invalid, the pairing process ends. When it is determined that the decrypted PIN code is valid, operation 1616 is performed.
In operation 1616, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 discards the temporary credential information and issues permanent credential information (permanent token and permanent secret) for the mobile device 100. The communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 transmits the permanent credential information to the communication unit 140 of the mobile device 100.
In operation 1617, the communication unit 230 of the imaging apparatus 200 transmits the authorization information (the information 1501 assigned by the management table 1500 of fig. 15) assigned to the PIN code to the communication unit 140 of the mobile device 100.
In operation 1618, when the communication unit 230 of the image forming apparatus 200 receives the control command and the permanent credential information from the communication unit 140 (of fig. 3A, for example) of the mobile device 100, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 controls the image forming apparatus 200 to perform an image forming job according to the control command. In this way, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 may perform the BYOD service by the mobile device 100.
Various embodiments of authentication methods required when pairing is performed will now be described.
First, the authentication method may use an ID card. In detail, during authentication in the pairing process, a guide screen describing the authentication method may be displayed on the mobile device. For example, the guide screen may instruct the user to bring an ID card into contact with a card reader included in an image forming apparatus (e.g., the image forming apparatus 200 of fig. 14). When the user, following the guide screen, moves to the imaging apparatus and touches his/her ID card to the card reader, the imaging apparatus reads the authentication information (e.g., a user ID and a password) stored in advance in the ID card, and, when the authentication information is determined to be valid, transmits a token and credential information for pairing to the mobile device. The mobile device then pairs with the imaging apparatus using the received token and credential information.
Second, the authentication method may use fingerprint recognition. A fingerprint recognizer for authentication may be connected to an imaging apparatus (e.g., the imaging apparatus 200 of fig. 14) via a Universal Serial Bus (USB), or the mobile device may include the fingerprint recognizer. When the fingerprint recognizer is connected to the image forming apparatus, authentication is performed in a manner similar to the authentication method using an ID card. In other words, when a guide screen guiding the user to place a finger on the fingerprint recognizer connected to the imaging apparatus is displayed, the user may place a finger on the fingerprint recognizer according to the guide screen; when the recognized fingerprint is determined to be valid, the imaging apparatus transmits the information required for pairing to the mobile device, and the mobile device pairs with the imaging apparatus using the received information. Meanwhile, when the mobile device includes a fingerprint recognizer, the authentication method may be performed as follows. When the user performs fingerprint recognition using the mobile device, the mobile device converts the recognized fingerprint into a form of digital information and transmits the digital information to the imaging apparatus. The imaging apparatus may store digital information about the fingerprints of allowed users in advance. Accordingly, the image forming apparatus compares the received digital information with the pre-stored digital information, and when they match, transmits a token and credential information for pairing to the mobile device. The mobile device pairs with the imaging apparatus using the received token and credential information.
Third, the authentication method may use voice recognition. The user's voice is recognized using a microphone (not shown) included in a mobile device (e.g., the mobile device 100), and the mobile device converts the recognized voice into a form of digital information and transmits the digital information to the imaging apparatus. The imaging apparatus stores digital information about the voices of allowed users in advance. Accordingly, the image forming apparatus compares the received digital information with the pre-stored digital information, and when they match, transmits a token and credential information for pairing to the mobile device. The mobile device pairs with the imaging apparatus using the received token and credential information. Meanwhile, the voices of some or all users may be recorded in advance, and an authorization matching each user's voice may be assigned.
Fourth, the authentication method may use NFC for communication. The user does not directly enter the PIN. When the mobile device is located within a certain distance from the imaging apparatus, the mobile device automatically sends the PIN to the imaging apparatus through the NFC interface. For example, when the mobile device is within a certain distance from the NFC tag, the NFC module contained in the mobile device may read device information stored in the NFC tag attached to the imaging apparatus by detecting the NFC tag and analyze the device information to transmit the pre-stored PIN to the imaging apparatus. The user may pre-store a PIN to be transmitted during NFC tag connection (tagging) in the mobile device. Further, when an application for authentication is executed in the mobile device, a PIN stored in advance may be converted and stored to be transmitted via NFC. When the PIN received from the mobile device matches a pre-stored PIN, the imaging apparatus transmits a token and credential information for pairing to the mobile device. The mobile device pairs with the imaging apparatus using the received token and the received credential information.
Fig. 17 is a diagram describing an event registration process (similar to the event registration process 670 of fig. 6 and the above-described event registration process) between the mobile devices 101 to 103 (similar to the mobile device 100 of fig. 1) for performing the BYOD service and the imaging apparatus 200 (similar to the imaging apparatus 200 of fig. 1) according to an exemplary embodiment.
Referring to fig. 17, various types of mobile devices 101 to 103 can each be connected to (paired with) the imaging apparatus 200 as BYOD devices. Here, the mobile devices 101 to 103 may each correspond to the mobile device 100 or 105 described above.
When the mobile device 101 is paired with the image forming apparatus 200, as a BYOD device, the mobile device 101 can register a desired event type so as to receive only a notification about generation of a specific event ("job processing").
The mobile device 101 may transmit a registration request containing a list of the events, among those that can be generated in the imaging apparatus 200, that the mobile device 101 wishes to receive.
When an event regarding processing of an image forming job, for example completion of printing or completion of scanning, is generated in the image forming apparatus 200, the communication unit 140 of the mobile device 101 may receive a notification about the registered event from the communication unit 230 of the image forming apparatus 200. In other words, when an event included in the registered event list is generated in the imaging apparatus 200, the imaging apparatus 200 may transmit a notification about that event to the mobile device 101. Conversely, if the mobile device 101 has not registered an event type and an event regarding an error is generated in the imaging apparatus 200, a notification about that error event may not be provided to the mobile device 101.
Similarly, when the mobile device 102 registers events regarding errors ("device error"), the communication unit 140 (of fig. 3A, for example) of the mobile device 102 may receive only events regarding errors generated in the image forming apparatus 200 (e.g., a paper low event, a paper jam event, or a toner low event) from the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200.
Meanwhile, the mobile device 103 may register some or all events ("all events") that may be generated in the imaging apparatus 200, and thus may be provided notifications about those events through the display unit 120 of the mobile device 103, just as through the manipulator 210 (of fig. 2A, for example) of the imaging apparatus 200.
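The three registrations in fig. 17 amount to a per-device subscription filter kept by the imaging apparatus. A minimal sketch of that filtering, with hypothetical device identifiers and event-type names:

```python
class EventRegistry:
    """Tracks which event types each paired BYOD device has registered."""

    def __init__(self):
        self._subscriptions = {}   # device id -> set of registered event types

    def register(self, device_id: str, event_types: set) -> None:
        self._subscriptions.setdefault(device_id, set()).update(event_types)

    def recipients(self, event_type: str) -> set:
        # Devices to notify when an event is generated: those that
        # registered this type, or registered "all" events.
        return {dev for dev, types in self._subscriptions.items()
                if event_type in types or "all" in types}

registry = EventRegistry()
registry.register("mobile-101", {"job-processing"})   # fig. 17, device 101
registry.register("mobile-102", {"device-error"})     # fig. 17, device 102
registry.register("mobile-103", {"all"})              # fig. 17, device 103

assert registry.recipients("device-error") == {"mobile-102", "mobile-103"}
assert registry.recipients("job-processing") == {"mobile-101", "mobile-103"}
```

A device that never registered an event type (or registered a different one) simply does not appear among the recipients, which matches the behavior described for the mobile device 101 and error events.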
Fig. 18 is a diagram describing a method of transmitting an event generated by the imaging apparatus 200 to the mobile device 100 according to an exemplary embodiment.
Referring to fig. 18, when an event registered by the mobile device 100, for example event A, is generated in the imaging apparatus 200, the communication unit 230 of the imaging apparatus 200 may push the generation of event A to the communication unit 140 of the mobile device 100. In other words, whenever an event A registered by the mobile device 100 is generated in the imaging apparatus 200, the imaging apparatus 200 notifies the mobile device 100 of the event generation by using a data push method. According to the data push method, the event notification message 1801 ("print 100 pages complete!") is provided immediately through the display unit 120 of the mobile device 100, and thus a user using the image forming apparatus 200 through the mobile device 100 can immediately determine whether the job he/she requested has been processed or whether an error has been generated in the image forming apparatus 200. Further, notifications about the generation of event A may be pushed to other mobile devices that registered event A, as well as to the mobile device 100.
The communication unit 230 of the image forming apparatus 200 may use the WebSocket protocol as a protocol for transmitting event information. The WebSocket protocol is capable of real-time bi-directional communication and may be used together with any of a variety of protocols, such as the Transmission Control Protocol/Internet Protocol (TCP/IP), HTTP, and the User Datagram Protocol (UDP). Here, in order to communicate via the WebSocket protocol, a socket port needs to be agreed between the communication unit 230 (of fig. 2A, for example) of the imaging apparatus 200 and the communication unit 140 (of fig. 3A, for example) of the mobile device 100. In the current embodiment, the open port 80 may be used as the socket port of the WebSocket protocol, but the socket port is not limited thereto.
In fig. 18, the event is transmitted between the communication unit 230 of the imaging apparatus 200 and the communication unit 140 of the mobile device 100 via a data push method, but the method of transmitting the event is not limited thereto, and any of various methods, such as a polling method and a data long polling method, may be used.
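Transport aside (WebSocket, polling, or long polling), the data push semantics described above reduce to the apparatus delivering each event to its subscribers immediately, without the device asking. The following is a minimal in-process stand-in with hypothetical names; a real implementation would carry these messages over a WebSocket session on the agreed socket port:

```python
class PushChannel:
    """In-process stand-in for the event session between the
    communication units of the imaging apparatus and a mobile device."""

    def __init__(self):
        self._listeners = []

    def subscribe(self, callback) -> None:
        # The mobile device registers a handler for pushed events.
        self._listeners.append(callback)

    def push(self, message: str) -> None:
        # Data push: deliver immediately to every subscriber,
        # with no polling on the device side.
        for callback in self._listeners:
            callback(message)

received = []
channel = PushChannel()
channel.subscribe(received.append)            # mobile device handler
channel.push("print 100 pages complete!")     # event A generated in the MFP
assert received == ["print 100 pages complete!"]
```

Under a polling method, by contrast, the device would have to ask the apparatus at intervals whether event A had been generated; the push model is what lets the notification message 1801 appear on the display unit 120 immediately.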
Fig. 19A is a flowchart of a method of establishing a connection with the mobile device 100 (of fig. 1, for example) by the imaging apparatus 200 (of fig. 1, for example) according to an exemplary embodiment. Referring to fig. 19A, the method includes operations performed by the imaging apparatus 200 in time series, and thus the details regarding the imaging apparatus 200 described above may be applied to the method of fig. 19A even if not explicitly mentioned.
In operation 1901, the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 transmits temporary credential information issued when a pairing request is received from the mobile device 100 to the mobile device 100.
In operation 1902, the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 receives a PIN code encrypted by the mobile device 100.
In operation 1903, the main controller 220 (of fig. 2A, for example) of the image forming apparatus 200 decrypts the encrypted PIN code by using the temporary credential information to determine whether the decrypted PIN code is valid.
In operation 1904, when the main controller 220 (of fig. 2A, for example) determines that the PIN code is valid, the communication unit 230 (of fig. 2A, for example) of the image forming apparatus 200 returns (or transmits) permanent credential information to the mobile device 100.
Fig. 19B is a flowchart of a method of establishing a connection with the imaging apparatus 200 (of fig. 1, for example) by the mobile device 100 (of fig. 1, for example) according to an exemplary embodiment. Referring to fig. 19B, the method includes operations performed by the mobile device 100 in time series, and thus the details described above with respect to the mobile device 100 may be applied to the method of fig. 19B even if not explicitly mentioned.
In operation 1911, the communication unit 140 of the mobile device 100 receives temporary credential information issued by the image forming apparatus 200 when the communication unit 140 transmits a pairing request to the image forming apparatus 200.
In operation 1912, the controller 110 (of fig. 3A, for example) of the mobile device 100 encrypts the PIN code input by the user by using the temporary credential information.
In operation 1913, the communication unit 140 (of fig. 3A, for example) of the mobile device 100 transmits the encrypted PIN code to the imaging apparatus 200.
When the imaging apparatus 200 determines that the PIN code is valid, the communication unit 140 (of fig. 3A, for example) of the mobile device 100 receives permanent credential information from the imaging apparatus 200 in operation 1914.
Hereinafter, a method by which a mobile device sets a worksheet, which defines a sequence in which jobs are executed by using functions of an imaging apparatus and the mobile device via a BYOD service, and executes jobs by using the set worksheet, will be described.
Fig. 20 is a diagram of an environment of a mobile device 100 (similar to the mobile device 100 of fig. 1) for generating, managing, and executing a worksheet using a BYOD service according to an exemplary embodiment.
Referring to fig. 20, a user can generate and manage a worksheet by using the mobile device 100 in a BYOD environment according to an exemplary embodiment. Here, the worksheet defines a sequence of jobs to be executed by using the functions of the imaging apparatuses 200A and 200B (each similar to the imaging apparatus 200 of fig. 1) and the mobile device 100 (similar to the mobile device 100 of fig. 1).
Here, the mobile device 100 is a portable electronic device supporting wireless communication, such as a smartphone or a tablet PC, and the image forming apparatuses 200A and 200B are each an apparatus, such as a scanner, a printer, a facsimile machine, or a multifunction printer (MFP), that supports image forming jobs such as scanning, printing, and faxing.
The user can execute an application that is installed in the mobile device 100 and supports worksheets using the BYOD service, and generate a worksheet by combining, in a desired order, jobs that use the functions of the imaging apparatus 200A, the imaging apparatus 200B, and the mobile device 100. The method of generating the worksheet will be described in detail below.
According to an exemplary embodiment, a worksheet using the BYOD service is generated and managed by the mobile device 100, and when the worksheet is executed, the mobile device 100 manages the execution of jobs according to the order defined in the worksheet. In other words, while executing the worksheet, when it is the turn of a job that uses the functions of the image forming apparatus 200A or 200B according to the defined sequence, the mobile device 100 transmits a command to execute that job to the image forming apparatus 200A or 200B, and when it is the turn of a job that uses the functions of the mobile device 100, the job is executed by using an application installed in the mobile device 100 or a hardware component of the mobile device 100.
As shown in fig. 20, the worksheet may be generated by combining the functions of the image forming apparatuses 200A and 200B and the mobile device 100 in various ways. For example, the worksheet may be set such that an image obtained when the image forming apparatus 200A performs scanning is transmitted to the mobile device 100, and the mobile device 100 transmits the received image via email, to a server such as a File Transfer Protocol (FTP) server or a Server Message Block (SMB) server, or to the image forming apparatus 200B.
Alternatively, the worksheet may be started when the image forming apparatus 200A receives a facsimile document, and the mobile device 100 may provide an editing function for the received document or image at an intermediate stage. The worksheet may have any of a variety of other forms, and various exemplary embodiments of the worksheet will be described in detail later.
According to an exemplary embodiment, the three elements that form a worksheet using the BYOD service are input, conversion, and transmission. In other words, a job is executed via the worksheet when a job target is "input", the job target is "converted" according to a method set in advance, and the converted target is "transmitted" to a transmission destination.
Therefore, according to an exemplary embodiment, the process of setting a worksheet using the BYOD service mainly includes three operations: first, an input source is set; next, a conversion method is set; and finally, a transmission destination is set.
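The three-part structure described above can be sketched as a simple data model. This Python sketch is an assumption for illustration; the class name `Worksheet`, its field names, and the naming convention are not part of the disclosed embodiment.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Worksheet:
    """A worksheet: an ordered input -> conversion -> transmission pipeline."""
    input_source: str                 # e.g. "camera", or a scanner's device name
    destination: str                  # e.g. a printer's device name, or "email"
    conversion: Optional[str] = None  # e.g. "edit"; None means no conversion step
    options: dict = field(default_factory=dict)  # scan/print options stored with it

    @property
    def name(self) -> str:
        # Name the worksheet so the details and order of its jobs are distinguishable.
        stages = [self.input_source]
        if self.conversion:
            stages.append(self.conversion)
        stages.append(self.destination)
        return " -> ".join(stages)

# The worksheet of fig. 21: camera input, no conversion, print destination.
wf = Worksheet(input_source="camera", destination="print", options={"copies": 1})
```

Storing the options alongside the three stages mirrors how the document keeps print and scan options inside the worksheet itself.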
Various exemplary embodiments of setting and executing a worksheet using a BYOD service will now be described with reference to fig. 21 to 32.
Fig. 21 is a diagram of a process of generating a worksheet for printing, by an MFP, photographs captured by the mobile device 100 (similar to the mobile device 100 of fig. 1), according to an exemplary embodiment.
Referring to fig. 21, first, a camera application installed in the mobile device 100 is selected as an input source for receiving a job target. In other words, a photograph captured by a camera included in the mobile device 100 is received as a job target.
Then, the method of converting the job target is set; in the present embodiment, however, the job target is set not to be converted. In other words, the mobile device 100 transmits the photograph to the transmission destination without conversion.
Finally, the transmission destination to which the job target is transmitted is set. In the present embodiment, a photograph captured by the mobile device 100 is to be printed, and thus the transmission destination may be an image forming apparatus, such as a printer or an MFP, that performs printing. To set the transmission destination, the mobile device 100 discovers image forming apparatuses and displays a list 2110 of the discovered apparatuses, as illustrated in fig. 21. The list 2110 may show the model names and locations of the discovered apparatuses. The user can select a desired apparatus from the list 2110 displayed on the screen of the mobile device 100 and set it as the transmission destination by assigning the attribute "print" to it.
Here, the user can set and store print options in the worksheet. The mobile device 100 may obtain the capabilities of the image forming apparatus during the discovery process and display the settable options on the screen based on the obtained capabilities. The user may set the options displayed on the screen of the mobile device 100 to desired values.
When an image forming apparatus is selected, the mobile device 100 attempts to pair with it. When the worksheet is generated, the mobile device 100 is paired with the image forming apparatus set as the transmission destination so that it can communicate with that apparatus. Then, when an event is generated, that is, when the worksheet is executed and a photograph is captured by the mobile device 100, the mobile device 100 transmits the photograph to the image forming apparatus and requests it to print the photograph.
Alternatively, the mobile device 100 may not pair with the image forming apparatus selected as the transmission destination when the worksheet is generated, but may instead pair with it when the worksheet is executed and the job is performed. The same applies when an image forming apparatus is selected as an input source.
Discovery and pairing of image forming apparatuses have been described above. The pairing process performed when setting a worksheet will be described again later with reference to fig. 33.
In this way, when an input source, a conversion method, and a transmission destination are set, the worksheet 2120 is generated and stored. The name of the worksheet 2120 may be determined so that the details and order of the jobs defined in the worksheet 2120 are distinguishable, such as "camera → print". The user can later select and execute the worksheet 2120 so that the jobs defined in it are executed in order.
Fig. 22 is a diagram of performing a process of printing a worksheet of photographs captured by a mobile device 100 (similar to the mobile device 100 of fig. 1) by an image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1), according to an exemplary embodiment.
Referring to fig. 22, when the user selects and executes the "camera → print" worksheet 2120 from the worksheet list displayed on the screen of the mobile device 100, the mobile device 100 may automatically execute the camera function or may display a message guiding the user to execute the camera function.
When the camera function is executed, the user may capture a photograph by using the mobile device 100. When a photograph is captured, the mobile device 100 transmits it to the image forming apparatus 200 set as the transmission destination in the worksheet 2120 and requests the image forming apparatus 200 to print it. Here, the mobile device 100 may also transmit the print options previously set and stored in the worksheet 2120 to the image forming apparatus 200; as described above with reference to fig. 21, these print options may be set in advance by the user when the worksheet 2120 is generated.
The image forming apparatus 200 prints the photograph according to the print options, thereby completing the execution of the worksheet 2120.
Fig. 23 is a flowchart of a process of generating a worksheet for printing, by an image forming apparatus, photos captured by a mobile device, according to an exemplary embodiment.
Referring to fig. 23, in operation 2301, a user selects a camera application of a mobile device (similar to the mobile device 100 of fig. 1) as an input source. In other words, a photograph captured by using a camera included in the mobile device 100 is received as a job target.
In operation 2302, when the user wants to set an imaging apparatus as a transmission destination, the mobile device 100 finds an imaging apparatus (similar to the imaging apparatus 200 of fig. 1) and displays a list of found imaging apparatuses on a screen.
In operation 2303, the user selects one of the discovered image forming apparatuses as the transmission destination and assigns "print" as its attribute. Here, the user can set and store print options in the worksheet. When the user selects the transmission destination, the mobile device 100 attempts to pair with the image forming apparatus (for example, the image forming apparatus 200 of fig. 1) selected as the transmission destination. The pairing process has been described in detail above.
In operation 2304, the mobile device 100 generates and stores a worksheet according to the set input source and the set transmission destination. Here, the name of the worksheet may be determined so that the details and order of the jobs defined in the worksheet are distinguishable.
Fig. 24 is a flowchart of a process of executing a worksheet for printing, by an imaging apparatus 200 (similar to the imaging apparatus 200 of fig. 1), a photograph captured by a mobile device 100 (similar to the mobile device 100 of fig. 1), according to an exemplary embodiment.
Referring to fig. 24, in operation 2401, a worksheet stored in the mobile device 100 is executed. Here, the worksheet defines the order of the jobs so that a photograph captured by the mobile device 100 is printed by the image forming apparatus 200. When the worksheet is executed, the mobile device 100 may automatically execute the camera function or display a message guiding the user to execute the camera function.
In operation 2402, a user may capture a photo by using the mobile device 100.
In operation 2403, the mobile device 100 transmits the photograph to the image forming apparatus 200 together with the preset print options. Here, the image forming apparatus 200 is the device set as the transmission destination by the worksheet executed in operation 2401. As described above with reference to fig. 21, the print options may be set in advance by the user when the image forming apparatus 200 is set as the transmission destination. The image forming apparatus 200 may have been connected to the mobile device 100 via a pairing process when the worksheet was generated, or may be paired after the worksheet is executed in operation 2401.
In operation 2404, the image forming apparatus 200 prints the photograph according to the preset print options, thereby completing the execution of the worksheet.
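The execution flow of fig. 24 can be sketched as follows. The stub class and function names below are assumptions made for illustration, not part of the disclosed embodiment.

```python
# A minimal sketch of operations 2401-2404 of fig. 24; the stub class and the
# function names are assumptions made for illustration only.
class StubPrinter:
    """Stands in for the image forming apparatus set as the transmission destination."""
    def __init__(self):
        self.jobs = []

    def print_photo(self, image, options):
        # Operation 2404: print according to the preset print options.
        self.jobs.append((image, options))
        return "printed"

def run_camera_print_worksheet(capture, printer, print_options):
    photo = capture()                                 # operation 2402: capture a photo
    return printer.print_photo(photo, print_options)  # operation 2403: transmit + request

printer = StubPrinter()
status = run_camera_print_worksheet(lambda: "photo.jpg", printer,
                                    {"copies": 2, "color": True})
```

Passing the capture step in as a callable keeps the sketch close to the document's model: the worksheet only triggers the camera function, while the print options travel with the job target.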
Fig. 25 is a diagram of a process of generating a worksheet 2520 in which an image scanned by an image forming apparatus (similar to the image forming apparatus 200 of fig. 1) is edited by the mobile device 100 (similar to the mobile device 100 of fig. 1) and the edited image is transmitted via email, according to an exemplary embodiment.
Referring to fig. 25, first, the user selects an image forming apparatus as the input source for receiving a job target, i.e., a scanned image. In detail, the mobile device 100 discovers image forming apparatuses and displays a list 2510 of the discovered apparatuses on the screen, as shown in fig. 25. The user selects an apparatus from the list 2510 displayed on the screen of the mobile device 100 and sets it as the input source by assigning the attribute "scan" to it.
At this point, the user may set and store the scan options in the worksheet 2520. The mobile device 100 may obtain capabilities of the imaging apparatus during the discovery process and display settable scanning options on the screen based on the obtained capabilities. The user can set the settable scan option to a desired value.
When an image forming apparatus is selected, the mobile device 100 attempts to pair with it. When the worksheet 2520 is generated, the mobile device 100 is paired with the image forming apparatus set as the input source so that it can communicate with that apparatus. Then, when an event is generated, that is, when the worksheet 2520 is executed and the image forming apparatus performs scanning, the mobile device 100 receives the scanned image from the image forming apparatus set as the input source.
Alternatively, the mobile device 100 may not pair with the image forming apparatus selected as the input source when the worksheet 2520 is generated, but may instead pair with it when the worksheet 2520 is executed and the job is performed.
Discovery and pairing of image forming apparatuses have been described above. The pairing process performed when setting a worksheet will be described again later with reference to fig. 33.
Then, the user sets a conversion method for the scanned image. In the present embodiment, the user sets an image editor application that provides an editing function as the conversion method. In other words, while executing the worksheet 2520, the user can edit the scanned image received from the image forming apparatus by using the image editor application installed in the mobile device 100.
Finally, a transmission destination to which the edited scanned image is to be transmitted is set. In the present embodiment, the user sets an email application installed in the mobile device 100 as the transmission destination. In other words, the edited scanned image is sent by email via the email application of the mobile device 100. Here, the user can set and store in advance the email address to which the edited scanned image is to be transmitted.
In this way, when the input source, conversion method, and transmission destination are set, the worksheet 2520 is generated and stored. The name of the worksheet 2520 may be determined so that the details and order of the jobs defined by the worksheet 2520 are distinguishable, such as "scan → edit → email". The user may later select and execute the worksheet 2520, causing the jobs defined by it to be executed in order.
Fig. 26 is a diagram of a process of executing the worksheet 2520, in which an image scanned by the image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1) is edited by the mobile device 100 (similar to the mobile device 100 of fig. 1) and the edited image is transmitted via email, according to an exemplary embodiment.
Referring to fig. 26, when the user selects and executes the "scan → edit → email" worksheet 2520 from the worksheet list displayed on the screen of the mobile device 100, the mobile device 100 may notify the user that the image forming apparatus 200 is set as the input source and display a message guiding the user to perform scanning.
When the user performs scanning in the image forming apparatus 200 set as the input source based on the message, the image forming apparatus 200 transmits the scanned image to the mobile device 100. Here, the mobile device 100 may have transmitted in advance a command requesting the image forming apparatus 200 to transmit the scanned image to the mobile device 100, and accordingly, the image forming apparatus 200 transmits the scanned image to the mobile device 100.
When a scanned image is received from the image forming apparatus 200, the mobile device 100 automatically executes the "image editor" application 2504 according to the worksheet 2520. While the "image editor" application is executed, the user may edit the scanned image in the mobile device 100.
After the user edits the scanned image, the mobile device 100 executes an "email" application 2506 and transmits the edited scanned image to the email address stored in the worksheet 2520. Here, as described above with reference to fig. 25, the email address may be preset by the user when the worksheet 2520 is generated.
Fig. 27 is a flowchart of generating a worksheet for editing an image scanned by an imaging apparatus (similar to imaging apparatus 200 of fig. 1) by a mobile device (similar to mobile device 100 of fig. 1) and sending the edited image via email, according to an exemplary embodiment.
Referring to fig. 27, when a user selects an imaging apparatus (similar to the imaging apparatus 200 of fig. 1) as an input source, a mobile device (similar to the mobile device 100 of fig. 1) discovers the imaging apparatus and displays a list of the discovered imaging apparatuses on a screen in operation 2701.
In operation 2702, the user selects one of the found image forming apparatuses as an input source and assigns an attribute of the selected image forming apparatus as "scan". At this point, the user may set and store the scan options in the worksheet. When the user selects an imaging apparatus, the mobile device 100 attempts to pair with the selected imaging apparatus (similar to the imaging apparatus 200 of fig. 1). The pairing process (similar to pairing process 660 of fig. 6) has been described in detail above.
In operation 2703, the user selects an image editor application installed in a mobile device (similar to the mobile device 100 of fig. 1) as a conversion method. In other words, the user may edit the scanned image through the image editor application while executing the worksheet.
In operation 2704, the user selects an email application installed in the mobile device as the transmission destination. In other words, the edited scanned image is sent by email via the email application. Here, the email address to which the edited scanned image is to be transmitted may be set and stored in advance by the user.
In operation 2705, the mobile device generates and stores a worksheet according to the input source, the conversion method, and the transmission destination. Here, the name of the worksheet may be determined so that the details and order of the jobs defined in the worksheet are distinguishable.
Fig. 28 is a diagram of a process of executing a worksheet in which an image scanned by the image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1) is edited by the mobile device 100 (similar to the mobile device 100 of fig. 1) and the edited image is transmitted via email, according to an exemplary embodiment.
Referring to fig. 28, a worksheet stored in the mobile device 100 is executed in operation 2801. Here, the worksheet defines the order of jobs so that an image scanned by the image forming apparatus 200 is edited by the mobile device 100 and the edited image is transmitted via email.
In operation 2802, the mobile device 100 transmits a scan request to the image forming apparatus 200 set as the input source. Here, the mobile device 100 may display on the screen a message guiding the user to perform scanning via the image forming apparatus 200. In addition, the mobile device 100 may send the scan options stored in the worksheet along with the scan request. As described above with reference to fig. 25, the scan options may be set in advance by the user while setting the image forming apparatus 200 as the input source when the worksheet is generated.
In operation 2803, the imaging device 200 performs a scan to obtain a scan image. Here, the image forming apparatus 200 may perform scanning according to the scanning option received together with the scanning request in operation 2802.
In operation 2804, the image forming apparatus 200 transmits the scanned image to the mobile device 100. Here, the image forming apparatus 200 may have been connected to the mobile device 100 via the pairing process (similar to the pairing process 660 of fig. 6, described above) when the worksheet was generated, or may be paired after the worksheet is executed in operation 2801.
Upon receiving the scanned image from the imaging apparatus 200, the mobile device 100 executes an image editor application as defined in the worksheet and edits the scanned image according to user input in operation 2805.
After editing the scanned image, the mobile device 100 executes an email application and transmits the edited scanned image via email in operation 2806, thereby completing the execution of the worksheet. Here, as described above with reference to fig. 25, the email address to which the edited scanned image is to be transmitted may be set in advance by the user when the worksheet is generated.
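The scan → edit → email flow above (operations 2801 to 2806) can be sketched end to end. Every name below is an assumption made for illustration, not part of the disclosed embodiment.

```python
# Hedged sketch of operations 2801-2806 of fig. 28; every name below is an
# assumption made for illustration, not part of the disclosed embodiment.
class StubScanner:
    def scan(self, options):
        # Operations 2802-2803: scan according to the received scan options.
        return f"scan@{options.get('dpi', 300)}dpi"

def run_scan_edit_email_worksheet(scanner, scan_options, edit, send_email, address):
    image = scanner.scan(scan_options)   # operation 2804: receive the scanned image
    edited = edit(image)                 # operation 2805: image editor application
    return send_email(address, edited)   # operation 2806: email application

outbox = []
def send_email(address, attachment):
    outbox.append((address, attachment))
    return "sent"

result = run_scan_edit_email_worksheet(
    StubScanner(), {"dpi": 600},
    edit=lambda img: img + " (edited)",
    send_email=send_email,
    address="user@example.com")  # the address is preset when the worksheet is generated
```

Note how the scan options and the destination address are supplied up front, matching the document's point that they are stored in the worksheet rather than entered at execution time.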
Fig. 29 is a diagram of a process of generating a worksheet 2930 in which an image scanned by an image forming apparatus (similar to image forming apparatus 200 of fig. 1) is edited by mobile device 100 (similar to mobile device 100 of fig. 1), and the edited image is printed by another image forming apparatus, according to an exemplary embodiment.
Referring to fig. 29, first, the user selects an imaging apparatus as an input source for receiving a job target, i.e., a scanned image. In detail, the mobile device 100 may find an imaging apparatus and display a list 2910 of the found imaging apparatuses on a screen, as shown in fig. 29. The user selects an imaging apparatus from the list 2910, and sets the imaging apparatus as an input source by assigning the attribute of the imaging apparatus to "scan".
At this point, the user may set and store scan options in the worksheet 2930. The mobile device 100 may obtain the capabilities of the imaging apparatus during the discovery process and display scan options that are settable based on the capabilities on the screen. The user may set the displayed scan options to desired values.
When an image forming apparatus is selected, the mobile device 100 attempts to pair with it. The mobile device 100 may pair with the image forming apparatus set as the input source when the worksheet 2930 is generated so that it can communicate with that apparatus, and when an event is generated, that is, when the worksheet 2930 is executed and the image forming apparatus performs scanning, the mobile device 100 receives the scanned image from the image forming apparatus set as the input source.
Alternatively, the mobile device 100 may not pair with the image forming apparatus set as the input source when the worksheet 2930 is generated, but may instead pair with it when the worksheet 2930 is executed and the job is performed.
Discovery and pairing of image forming apparatuses have been described above. The pairing process performed when setting a worksheet will be described again later with reference to fig. 33.
Then, the user sets a conversion method for the scanned image. In the present embodiment, the user sets the "image editor" application 2902, which provides an editing function, as the conversion method. In other words, when the worksheet 2930 is executed, the user can edit the scanned image received from the image forming apparatus by using the "image editor" application 2902 installed in the mobile device 100.
Finally, the transmission destination to which the edited scanned image is to be transmitted is set. In the present embodiment, an image forming apparatus that prints the edited scanned image is set as the transmission destination. To set the transmission destination, the mobile device 100 discovers image forming apparatuses and displays a list 2920 of the discovered apparatuses on the screen, as shown in fig. 29. The list 2920 may display the model names and locations of the discovered apparatuses. The user can select an image forming apparatus from the list 2920 and set it as the transmission destination by assigning the attribute "print" to it.
In the present embodiment, the image forming apparatus set as the transmission destination is different from the image forming apparatus set as the input source. Referring to fig. 29, an image forming apparatus located in the "Development Group" is set as the input source, and an image forming apparatus located in the "OA Room" is set as the transmission destination. The current embodiment may be useful when one image forming apparatus supports the scan function but does not support the color print function, while another supports the color print function but does not support the scan function.
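The capability-mismatch scenario above suggests choosing each device by the function it must perform. The device names and capability strings in this sketch are assumptions made for illustration.

```python
# Sketch of choosing devices by capability, as in the scenario above; the device
# names and capability strings are assumptions made for illustration.
devices = {
    "MFP (Development Group)": {"scan", "print"},   # scans, mono print only
    "Printer (OA Room)": {"print", "color_print"},  # color print, no scanner
}

def pick_device(devices, required):
    """Return the first discovered device that supports the required function."""
    for name, capabilities in devices.items():
        if required in capabilities:
            return name
    return None

input_source = pick_device(devices, "scan")        # scanning-capable device
destination = pick_device(devices, "color_print")  # color-printing device
```

In practice the user makes this choice from the displayed list, but the mobile device could use the capabilities obtained during discovery to filter the list in exactly this way.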
Here, the user can set and store print options in the worksheet 2930. The method of setting print options when setting an image forming apparatus as the transmission destination has been described above with reference to fig. 21.
Meanwhile, when the image forming apparatus is set as the transmission destination, the mobile device 100 attempts to pair with it. The mobile device 100 pairs with the image forming apparatus set as the transmission destination when the worksheet 2930 is generated so that it can communicate with that apparatus at any time, and requests the image forming apparatus to print the edited scanned image by transmitting the image when an event is generated, that is, when the worksheet 2930 is executed and the mobile device 100 completes editing of the scanned image.
Alternatively, the mobile device 100 may not pair with the image forming apparatus set as the transmission destination when the worksheet 2930 is generated, but may instead pair with it when the worksheet 2930 is executed and the job is performed.
Discovery and pairing of image forming apparatuses have been described above. The pairing process performed when setting a worksheet will be described again later with reference to fig. 33.
When the input source, the conversion method, and the transmission destination are set as described above, the worksheet 2930 is generated and stored. The name of the worksheet 2930 may be determined so that the details and order of the jobs defined in the worksheet 2930 are distinguishable, for example, "scan → edit → print".
Fig. 30 is a diagram of a process of executing a worksheet in which an image scanned by the image forming apparatus 200A (similar to the image forming apparatus 200 of fig. 1) is edited by the mobile device 100 (similar to the mobile device 100 of fig. 1) and the edited image is printed by the image forming apparatus 200B (similar to the image forming apparatus 200 of fig. 1), according to an exemplary embodiment.
Referring to fig. 30, when the user selects and executes the "scan → edit → print" worksheet 2930 from the worksheet list displayed on the screen of the mobile device 100, the mobile device 100 may notify the user that the image forming apparatus 200A is set as the input source and display a message guiding the user to perform scanning.
When the user performs scanning by using the image forming apparatus 200A according to the message, the image forming apparatus 200A transmits the scanned image to the mobile device 100. Here, the mobile device 100 may transmit in advance to the image forming apparatus 200A a command requesting that the scanned image be transmitted to the mobile device 100 after scanning, and accordingly, the image forming apparatus 200A may transmit the scanned image to the mobile device 100 after scanning.
When the scanned image is received from the imaging apparatus 200A, the mobile device 100 automatically executes an "image editor" application as defined in the worksheet 2930. When executing the "image editor" application, the user may edit the scanned image by using the mobile device 100.
After the user edits the scanned image, the mobile device 100 transmits the edited scanned image to the image forming apparatus 200B set as the transmission destination in the worksheet 2930, requesting the image forming apparatus 200B to print it. At this time, the mobile device 100 may also transmit the print options set in advance and stored in the worksheet 2930 to the image forming apparatus 200B; as described above with reference to fig. 29, the print options may be set in advance by the user when the worksheet 2930 is generated.
The image forming apparatus 200B prints the edited scanned image according to the print options, thereby completing the execution of the worksheet 2930.
Fig. 31 is a flowchart of a process of generating a worksheet in which an image scanned by one image forming apparatus (similar to the image forming apparatus 200 of fig. 1) is edited by a mobile device (similar to the mobile device 100 of fig. 1), and the edited image is printed by another image forming apparatus (similar to the image forming apparatus 200 of fig. 1), according to an exemplary embodiment.
Referring to fig. 31, when the user selects an imaging device as an input source, the mobile device finds an imaging device and displays a list of found imaging devices on a screen in operation 3101.
In operation 3102, the user selects and sets one of the found image forming apparatuses as an input source, and assigns "scan" as an attribute. Here, the user can set and store the scan options in the worksheet. When the user sets the imaging apparatus as an input source, the mobile device attempts to pair with the imaging apparatus set as the input source. The pairing process has been described in detail above.
In operation 3103, the user selects an "image editor" application installed in the mobile device as the conversion method. In other words, the user may edit the scanned image through the "image editor" application of the mobile device while executing the worksheet.
In operation 3104, the user may select and set one of the image forming apparatuses found in operation 3101 as a transmission destination, and assign "print" as an attribute. Here, the image forming apparatus set as the transmission destination may be different from the image forming apparatus set as the input source in operation 3102. Also, the user can set and store print options in the worksheet. When setting a transmission destination, the mobile device attempts pairing with the image forming apparatus set as the transmission destination. The pairing process has been described in detail above.
In operation 3105, the mobile device generates and stores a worksheet according to the input source, the conversion method, and the transmission destination. Here, the name of the worksheet may be determined so that the details and order of the jobs defined in the worksheet are distinguishable.
Fig. 32 is a diagram of a process of executing a worksheet in which an image scanned by the imaging apparatus 200A (similar to the imaging apparatus 200 of fig. 1) is edited by the mobile device 100 (similar to the mobile device 100 of fig. 1), and the edited image is printed by the imaging apparatus 200B (similar to the imaging apparatus 200 of fig. 1), according to an exemplary embodiment.
Referring to fig. 32, a worksheet stored in the mobile device 100 is executed in operation 3201. Here, the worksheet defines the order in which the jobs are executed, so that an image scanned by the image forming apparatus 200A is edited by the mobile device 100 and the edited image is printed by the image forming apparatus 200B.
In operation 3202, the mobile device 100 transmits a scan request to the image forming apparatus 200A set as the input source. Here, the mobile device 100 may display on the screen a message guiding the user to perform scanning by using the image forming apparatus 200A. In addition, the mobile device 100 may send the scan options stored in the worksheet along with the scan request. As described above with reference to fig. 29, the scan options may be set in advance by the user while setting the image forming apparatus 200A as the input source when the worksheet is generated.
In operation 3203, the imaging device 200A obtains a scan image by performing a scan. Here, the image forming apparatus 200A may perform scanning according to the scanning option received together with the scanning request in operation 3202.
In operation 3204, the image forming apparatus 200A transmits the scanned image to the mobile device 100. Here, the image forming apparatus 200A may have been connected to the mobile device 100 via the pairing process when the worksheet was generated, or may be paired after the worksheet is executed in operation 3201.
Upon receiving the scanned image from the imaging apparatus 200A, the mobile device 100 executes an "image editor" application as defined in the worksheet and edits the scanned image according to user input in operation 3205.
After editing the scanned image, the mobile device 100 transmits the edited scanned image to the image forming apparatus 200B set as the transmission destination in operation 3206. Here, the mobile device 100 may transmit the preset print options to the image forming apparatus 200B together with the edited scanned image. As described above with reference to fig. 29, the print options may be set in advance by the user while setting the image forming apparatus 200B as the transmission destination when the worksheet is generated. The image forming apparatus 200B may have been connected to the mobile device 100 via the pairing process when the worksheet was generated, or may be paired after the worksheet is executed in operation 3201.
In operation 3207, the image forming apparatus 200B prints the edited scanned image according to the print options, thereby completing the execution of the worksheet.
As described above, when an image forming apparatus is selected as an input source or a transmission destination, the mobile device may be connected to the image forming apparatus via a pairing process. The pairing process performed when generating a worksheet will now be described in detail with reference to fig. 33.
Fig. 33 is a diagram of a detailed process in which the mobile device 100 (similar to the mobile device 100 of fig. 1) and the image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1) perform pairing when generating a worksheet, according to an exemplary embodiment.
In operation 3301, when the user selects an image forming apparatus as an input source or a transmission destination, the mobile device 100 discovers the image forming apparatuses, and the user selects one of the discovered image forming apparatuses, for example, the image forming apparatus 200.
When the image forming apparatus 200 is selected, the image forming apparatus 200 transmits a PIN input request to the mobile device 100 in operation 3302.
In operation 3303, the mobile device 100 displays a screen for receiving a PIN and receives the PIN from the user.
In operation 3304, the mobile device 100 transmits a PIN to the image forming apparatus 200 to request registration.
In operation 3305, the image forming apparatus 200 determines whether the received PIN matches a pre-stored PIN. When the two PINs do not match, operation 3302 is repeated so that another PIN may be input; when they match, credential information, such as a token and a secret, is transmitted to the mobile device 100 in operation 3306. The mobile device 100 may access the imaging apparatus 200 at a later time by using the token and the secret received in operation 3306.
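The PIN check and credential issuance of operations 3302 to 3306 can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the class and method names, and the token/secret lengths, are assumptions.

```python
import secrets

# Minimal sketch of device registration (operations 3302-3306): the imaging
# apparatus verifies the submitted PIN against its pre-stored PIN and, on a
# match, issues credential information (a token and a secret) that the mobile
# device can present on later accesses.
class ImagingApparatus:
    def __init__(self, stored_pin):
        self._stored_pin = stored_pin
        self._credentials = {}  # token -> secret, one entry per paired device

    def register(self, pin):
        """Return (token, secret) on a PIN match; None means 'request another PIN'."""
        if pin != self._stored_pin:
            return None  # operation 3302 is repeated for another PIN
        token, secret_key = secrets.token_hex(8), secrets.token_hex(16)
        self._credentials[token] = secret_key
        return token, secret_key

    def authorize(self, token, secret_key):
        """A later access presents the credentials received in operation 3306."""
        return self._credentials.get(token) == secret_key
```

A device whose PIN is rejected simply retries registration; a device holding valid credentials can skip the PIN exchange on subsequent connections.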
Operations 3301 to 3306 correspond to the device registration operation 3310, and the attribute setting operation 3320 is performed after the device registration operation 3310. The attribute setting operation 3320 includes operations 3307 to 3309.
In operation 3307, the mobile device 100 requests the capability of the image forming apparatus 200, and in operation 3308, the image forming apparatus 200 transmits the capability to the mobile device 100. Here, the capability may contain information about the functions executable by the image forming apparatus 200, the state of the image forming apparatus 200, and the options settable in the image forming apparatus 200.
In operation 3309, the mobile device 100 may display selectable functions and settable options on the screen and select the functions and set the options based on the user input. For example, when the image forming apparatus 200 is used as an input source for obtaining a scanned image, the user may assign "scan" as an attribute and set a scan option. Alternatively, when the image forming apparatus 200 is used as a transmission destination of printing, the user may assign "print" as an attribute and set a print option.
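The capability exchange and attribute setting of operations 3307 to 3309 might look like the following sketch. The payload shape, with keys such as "functions" and "options", is an assumption for illustration; the text above states only that the capability describes executable functions, the apparatus state, and settable options.

```python
# Illustrative sketch of the attribute-setting operation (operations 3307-3309).
def get_capability():
    """Stands in for the imaging apparatus's reply in operation 3308."""
    return {
        "functions": ["scan", "print", "copy"],
        "state": "idle",
        "options": {"scan": ["color_mode", "duplex"],
                    "print": ["paper_size", "copies"]},
    }

def set_attribute(capability, function, options):
    """Assign a function as the attribute and accept only its settable options."""
    if function not in capability["functions"]:
        raise ValueError("function not supported by the imaging apparatus")
    settable = set(capability["options"].get(function, []))
    unknown = set(options) - settable
    if unknown:
        raise ValueError(f"options not settable: {unknown}")
    return {"attribute": function, "options": options}
```

For example, assigning "scan" as the attribute with a color mode and duplex setting mirrors the input-source case described above, while assigning "print" mirrors the transmission-destination case.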
Meanwhile, as described above, the mobile device 100 and the imaging apparatus 200 may be paired at the time of executing the workform, rather than at the time of generating the workform.
FIG. 34 is a flowchart of a method of generating a workform, according to an exemplary embodiment.
Referring to fig. 34, a user executes an application supporting a workform using a BYOD service by using a mobile device (similar to the mobile device 100 of fig. 1), and sets any one of an imaging apparatus (similar to the imaging apparatus 200 of fig. 1) and the mobile device as an input source in operation 3401. In other words, the user selects any one of the image forming apparatus and the mobile device as an input source for receiving a job target, and sets a function for obtaining the job target.
In operation 3402, the user sets, by using the mobile device, a conversion method that uses a function of the mobile device. In other words, a method of converting the job target received from the input source is set.
In operation 3403, the user sets any one of the imaging apparatus and the mobile device as a transmission destination. In other words, any one of the image forming apparatus and the mobile device is selected as the destination of the job target converted by the conversion method set in operation 3402, and information for transmission, such as an email address, is set.
In operation 3404, the mobile device stores a workform defining the input source, conversion method, and transmission destination set above.
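The workform stored in operation 3404 can be modelled as a simple record of the three settings made in operations 3401 to 3403. All field names here are illustrative assumptions:

```python
from dataclasses import dataclass, field

# A minimal sketch of the workform of operation 3404: it ties together the
# input source (operation 3401), the conversion method (operation 3402), and
# the transmission destination (operation 3403).
@dataclass
class Workform:
    input_source: str          # "imaging_apparatus" or "mobile_device"
    input_function: str        # e.g. "scan", the function that obtains the job target
    conversion: str            # e.g. "image_editor", the conversion method
    destination: str           # "imaging_apparatus" or "mobile_device"
    destination_info: dict = field(default_factory=dict)  # e.g. an email address or print options

def store_workform(store, name, workform):
    """Persist the workform under a user-visible name."""
    store[name] = workform
    return store

store = {}
wf = Workform("imaging_apparatus", "scan", "image_editor",
              "imaging_apparatus", {"print_option": "duplex"})
store_workform(store, "edit after scan", wf)
```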
Meanwhile, when the user requests the image forming apparatus to execute a job while the image forming apparatus is executing another job, the user may have to wait until the other job is completed. Accordingly, one or more exemplary embodiments provide a method of reserving a job by using the BYOD service.
Fig. 35 to 37 are diagrams for describing a method of reserving a job by using a BYOD service according to an exemplary embodiment.
Referring to fig. 35, in operation 3501, a mobile device 100 (similar to the mobile device 100 of fig. 1) receives a request from a user to check job standby information. In other words, since the image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1) is executing the current job, the user may request the mobile device 100 to check whether the user has to wait for a new job.
In operation 3502, the mobile device 100 requests the image forming apparatus 200 for the job standby information, and in operation 3503, the image forming apparatus 200 transmits the job standby information to the mobile device 100. At this time, when the image forming apparatus 200 is executing a current job requested by another user, the image forming apparatus 200 may transmit job standby information including current state information of the image forming apparatus 200 and the number of users who have requested job reservations to the mobile device 100.
Upon receiving the job reservation request from the user in operation 3504, the mobile device 100 transmits the job reservation request to the image forming apparatus 200 in operation 3505.
Upon receiving the job reservation request, the image forming apparatus 200 transmits a waiting number to the mobile device 100 in operation 3506.
When the image forming apparatus 200 completes the current job of another user in operation 3507, the image forming apparatus 200 transmits a job completion notification to the mobile device 100 in operation 3508. Accordingly, the screen of the mobile device 100 displays a notification of the completion of the current job of the other user.
The user who reserved the job is given priority for a certain period of time from the point of time when the mobile device 100 receives the job completion notification. Therefore, during this period, the image forming apparatus 200 stands by even when a job request is received from a user other than the user who reserved the job. However, if the user who reserved the job does not send a job request within the certain period of time, the priority disappears.
When the mobile device 100 receives a job request from the user within a certain period of time from the time point at which the job completion notification is received in operation 3509, the mobile device 100 transmits a request to execute a job to the image forming apparatus 200 in operation 3510, and the image forming apparatus 200 executes the job in operation 3511.
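The reservation priority described in operations 3504 to 3511 can be sketched as a queue that, after the completion notification, accepts a job only from the reserving user until the priority window lapses. The window length and the injected clock values are illustrative assumptions:

```python
# Sketch of the job-reservation priority window: after the job completion
# notification (operation 3508), only the reserving user's job request is
# accepted for a certain period of time; once that period lapses without a
# request, the priority disappears and any user may submit a job.
class ReservationQueue:
    def __init__(self, priority_window=60.0):
        self.priority_window = priority_window  # seconds; assumed value
        self.reserved_user = None
        self.notified_at = None

    def notify_completion(self, user, now):
        """Operation 3508: the current job is done; the reserving user gets priority."""
        self.reserved_user, self.notified_at = user, now

    def accept_job(self, user, now):
        """Operation 3510: within the window, accept only the reserving user."""
        if self.notified_at is None:
            return True  # no outstanding reservation
        if now - self.notified_at <= self.priority_window:
            return user == self.reserved_user  # apparatus stands by for others
        self.reserved_user = self.notified_at = None  # priority disappears
        return True
```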
Fig. 36 and 37 show in detail a process of executing a method of reserving a job by using the BYOD service.
Referring to fig. 36, when an image forming apparatus 200 (similar to the image forming apparatus 200 of fig. 1) is performing a job of another user, the user may check the job standby information through a mobile device 100 (similar to the mobile device 100 of fig. 1). Here, as shown in fig. 36, the job standby information may include information that the image forming apparatus 200 is executing a copy job and that there are two jobs reserved by other users.
When the user selects "reserve" from the screen displayed on the mobile device 100, the mobile device 100 transmits a reservation request to the image forming apparatus 200, and in response, the image forming apparatus 200 transmits a waiting list to the mobile device 100.
Referring to fig. 37, when the jobs of the other users are completed, the image forming apparatus 200 transmits a notification to the mobile device 100. A notification that the imaging apparatus 200 is currently available, such as "scanner is now available", may be displayed on the screen of the mobile device 100.
When the notification is transmitted to the mobile device 100, the image forming apparatus 200 is locked for a certain period of time after the point of time at which the notification is transmitted, and assigns a priority to the user. If the user requests the image forming apparatus 200 to execute a job within the certain period of time, for example by NFC-tagging the mobile device 100 to the image forming apparatus 200, the image forming apparatus 200 is unlocked and executes the job.
As described above, in the BYOD environment, the user can manipulate the imaging apparatus 200 by using the mobile device 100 via the BYOD service. At this time, image forming apparatuses of various product types may be connected to the mobile device 100. The imaging apparatuses connected to the mobile device 100 may have different UIs based on product type and model type. However, as described above, the mobile device 100 and the imaging apparatus perform UP communication to support the BYOD service regardless of the product type and model type of the imaging apparatus. Accordingly, the user can control any imaging apparatus performing UP communication in the same manner by using the UI provided by the BYOD application installed in the mobile device 100, regardless of the different UIs of the imaging apparatuses.
Further, the functions of the imaging apparatus connected to the mobile device 100 can be extended, by using the resources of the mobile device 100, with functions that the imaging apparatus does not support but the mobile device 100 does, and thus a workflow that cannot be handled by the imaging apparatus alone can be handled. Hereinafter, a method of processing a workflow that combines functions supported by the imaging apparatus 200 and functions supported by the mobile device 100, and the mobile device 100 performing the method, will be described in detail.
Fig. 38 is a diagram of the structure of a mobile device 100 (e.g., of fig. 1) processing a workflow according to an example embodiment. It will be apparent to those of ordinary skill in the art that the mobile device 100 may include general components in addition to those shown in FIG. 38.
Referring to fig. 38, the mobile device 100 may include a controller 110 (similar to the controller 110 of fig. 3A), a display unit 120 (similar to the display unit 120 of fig. 3A), an input unit 130 (similar to the input unit 130 of fig. 3A), a communication unit 140 (similar to the communication unit 140 of fig. 3A), and a storage unit 150 (similar to the storage unit 150 of fig. 3A).
Although not shown, the controller 110 may include at least one of a Random Access Memory (RAM), a Read Only Memory (ROM), a Central Processing Unit (CPU), and a Graphic Processing Unit (GPU). The RAM, ROM, CPU and GPU may be connected to each other via a data bus.
The CPU can access the storage unit 150 and perform booting by using an Operating System (OS) stored in the storage unit 150. In addition, the CPU can perform various operations by using various programs, various contents, and various data stored in the storage unit 150.
The ROM may store a command set for system booting. For example, when power is supplied to the mobile device 100 upon input of a turn-on command, the CPU may copy the OS stored in the storage unit 150 into the RAM according to a command stored in the ROM and execute the OS to boot the system of the mobile device 100. When booting is completed, the CPU may copy various programs stored in the storage unit 150 into the RAM and perform various operations by executing the programs copied into the RAM. When booting is completed, the GPU may display a UI screen on an area of the display unit 120. In detail, the GPU may generate a screen including various objects, such as contents, icons, and menus. According to the layout of the screen, the GPU may calculate attribute values, such as coordinate values, shapes, sizes, and colors, of the objects displayed on the screen. The GPU may then generate the screen in one of various layouts including the objects, based on the calculated attribute values. The screen generated by the GPU may be provided to the display unit 120 and displayed on an area of the display unit 120.
The controller 110 may display a portion of the content stored in the storage unit 150 on the display unit 120. Alternatively, the controller 110 may perform a control operation corresponding to a user manipulation input to the input unit 130.
The input unit 130 may receive various commands from a user. The input unit 130 may include at least one of a keypad (not shown), a touch panel (not shown), and a pen recognition panel (not shown).
The keypad may include various types of keys, such as mechanical buttons and wheels, formed on various areas of the mobile device 100, such as a front area, side areas, and a rear area of the mobile device 100.
The touch panel may detect a touch input of a user and output a touch event value corresponding to a touch signal. When the touch panel is combined with the display panel to form a touch screen, the touch screen may be implemented as any of various types of touch sensors such as an electrostatic type or a piezoelectric type. Touch events generated on a touch screen are typically generated by a human finger, but may alternatively be generated by a conductive object that applies a change in capacitance.
The pen recognition panel may detect a proximity input or a touch input of the stylus pen when the user uses the stylus pen, and output a pen proximity event or a pen touch event.
The communication unit 140 may communicate with any type of external device according to any of various communication methods. Further, as described above, the communication unit 140 may communicate with various product types or various models of image forming apparatuses by using UP, so that the image forming apparatuses are controlled by using one BYOD application installed in the mobile device 100.
The storage unit 150 may store various types of programs and data required to operate the mobile device 100. In detail, the storage unit 150 may store a control program required for the controller 110 to control the mobile device 100, and data generated when the mobile device 100 operates. For example, the storage unit 150 may store information about an imaging device connected to the mobile device 100, data received from the imaging device, workflows predefined in the mobile device 100 by a user, and data about various UP commands corresponding to user inputs. Further, the storage unit 150 may store a BYOD application and a function application corresponding to functions included in the workflow. For example, the storage unit 150 may store a functional application for executing a function supported by an imaging device connected to the mobile device 100 and various types of applications for executing the function of the mobile device 100.
When the user requests the BYOD service through the input unit 130 of the mobile device 100, the controller 110 may load the BYOD application from the storage unit 150 and execute it. The controller 110 executing the BYOD application may retrieve a workflow predefined by the user from the storage unit 150, display the workflow on the display unit 120, and enable the user to select the workflow through the input unit 130. At least one function supported by the imaging apparatus 200 and at least one function supported by the mobile device 100 are combined in a workflow. The imaging apparatus 200 may not support the functions of the mobile device 100 included in the workflow. The communication unit 140 may connect the imaging apparatus 200 for processing the workflow and the mobile device 100 under the control of the controller 110. The controller 110 may execute the functions included in the workflow based on the order in which they are processed. Here, the controller 110 may execute a function only after the function preceding it in the order is completed.
For example, when a user requesting the BYOD service selects, on the mobile device 100, a workflow that combines the scanning function of the image forming apparatus 200 and the editing function of the mobile device 100 in the stated order, the controller 110 executing the BYOD application may execute the workflow as predetermined by the user, as shown in fig. 38. Accordingly, in order to connect the imaging apparatus 200 having the scanning function to the mobile device 100, the communication unit 140 may perform the above-described discovery process, pairing process, and event registration process. When the image forming apparatus 200 for executing the workflow is connected via UP communication and performs the scanning function, the controller 110 may automatically perform the editing function based on the order stored in the workflow. The controller 110 controls the image forming apparatus 200 connected to the mobile device 100 so as to perform not only functions supported by the image forming apparatus 200, such as the scanning function, but also functions of the mobile device 100, such as the editing function, by using the resources of the mobile device 100, thereby processing the workflow requested by the user.
FIG. 39 is a flowchart of a method of processing a workflow according to an example embodiment. Details regarding the mobile device 100 (e.g., of fig. 1) handling the above workflow may be applied to the method of fig. 39, even if not explicitly mentioned.
In operation 3910, the mobile device 100 may receive an input selecting a workflow in which a first function supported by the imaging apparatus 200 (e.g., of fig. 1) and a function of the mobile device 100 are combined. In other words, a user of the mobile device 100 may select a workflow by using the mobile device 100. At this time, the mobile device 100 executes the BYOD application according to the user's request for the BYOD service, and receives the input selecting the workflow upon execution of the BYOD application. The function of the mobile device 100 included in the workflow may not be supported by the imaging apparatus 200.
In operation 3920, the mobile device 100 may be connected to the imaging apparatus 200 for processing a workflow. The mobile device 100 may connect to the imaging apparatus 200 by performing the discovery process, the pairing process, and the event registration process described above.
In operation 3930, the mobile device 100 executes the first function and the function of the mobile device 100 based on the order in which they are processed. In other words, the mobile device 100 may execute a later function only after the function preceding it in the order is completed.
For example, when the order of processing the first function precedes the order of processing the functions, the mobile device may receive a result of executing the first function in response to a command to execute the first function from the image forming apparatus 200 and then execute the function according to the received result. At this time, in order to receive the result of executing the first function, the mobile device 100 may transmit a command to the imaging apparatus 200 based on the capability information about the first function provided from the imaging apparatus 200. In addition, when the function is executed based on the result, the mobile device 100 may execute the function in conjunction with an application executable in the mobile device 100.
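The ordered execution of operation 3930, where each function starts only after the one before it completes and its result is handed to the next, can be sketched as a simple pipeline. The scan and edit steps below are illustrative stand-ins, not the patent's implementation:

```python
# Minimal sketch of sequential workflow processing: the functions combined in
# a workflow run in their processing order, each receiving the previous
# function's result (e.g. the scanned document is fed to the edit step).
def run_workflow(steps, job_input=None):
    """steps: ordered list of callables; each receives the prior step's result."""
    result = job_input
    for step in steps:
        result = step(result)  # a later function runs only after the earlier one completes
    return result

# Illustrative stand-ins: a scan step (performed by the imaging apparatus) and
# an edit step (performed with the mobile device's own resources).
scan = lambda _: "scanned-document"
edit = lambda doc: f"edited({doc})"
```

Running `run_workflow([scan, edit])` mirrors the "edit after scan" case: the edit function receives the result of the scan function and produces the edited document.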
Hereinafter, a method of processing a workflow combining at least one function supported by the image forming apparatus 200 and at least one function supported by the mobile device, and the mobile device 100 performing the method will be described in detail with reference to the workflow.
Fig. 40 is a diagram of operations of a workflow of processing to combine a scanning function of the first imaging apparatus 200-1 (similar to the imaging apparatus 200 of fig. 1) and an editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1), according to an exemplary embodiment. At least one imaging apparatus for processing a workflow may be connected to the mobile device 100, and for convenience of description, it is assumed that the first imaging apparatus 200-1 is connected to the mobile device 100.
When, in the order defined by the workflow, the scan function of the first imaging apparatus 200-1 is processed before the edit function of the mobile device 100, the workflow may be processed as follows.
In operation 4005, the mobile device 100 may execute a BYOD application according to the BYOD service request and receive an input from a user selecting a workflow that combines the scanning function of the first imaging apparatus 200-1 and the editing function of the mobile device 100.
In operation 4010, the mobile device 100 may be connected to a first imaging apparatus 200-1 for processing a workflow. In order to be connected to the first imaging apparatus 200-1 capable of performing the scanning function, the mobile device 100 may perform the discovery process, the pairing process, and the event registration process described above.
Fig. 41 is a diagram describing connecting the first imaging apparatus 200-1 (similar to the imaging apparatus 200 of fig. 1) and the mobile device 100 (similar to the mobile device 100 of fig. 1) for processing a workflow when the mobile device 100 selects the workflow.
When a user requests a BYOD service, the mobile device 100 executing the BYOD application may display the workflow in a predefined form. For example, as shown in fig. 41, a workflow of "edit after scan" may be displayed. When there are multiple predefined workflows, a list of predefined workflows may be displayed.
The user may view a list of predefined workflows displayed on the mobile device 100 and select one of the workflows. As shown in fig. 41, the user may touch a workflow displayed on the mobile device 100 to select the workflow.
To process the workflow selected by the user, the mobile device 100 executing the BYOD application may be connected to the first imaging apparatus 200-1, which performs a function included in the workflow. In order to connect to the first imaging apparatus 200-1 capable of performing the scan function, the mobile device 100 may perform the above-described discovery process, pairing process, and event registration process. If the mobile device 100 and the first imaging apparatus 200-1 are connected for the first time, a registration process of registering each other is performed; otherwise, the mobile device 100 and the first imaging apparatus 200-1 may be connected to each other without a separate registration process. Then, the mobile device 100 may collect information about the first imaging apparatus 200-1 to prepare to process the workflow.
Referring back to fig. 40, the mobile device 100 may receive, in operation 4015, a command to execute the scan function of the first image forming apparatus 200-1, which is performed first based on the order of the jobs included in the workflow. In other words, the mobile device 100 executing the BYOD application may receive the command by executing a scan application installed in the mobile device 100, so that the scan function is performed in the first imaging apparatus 200-1 under the control of the mobile device 100.
In operation 4020, the mobile device 100 may transmit the command to perform the scan function to the first image forming apparatus 200-1. When the command is received through the scan application installed in the mobile device 100, a UP command corresponding to it may be transmitted to the first image forming apparatus 200-1 according to the UP communication method, so that the first image forming apparatus 200-1 supporting the scan function is controlled by the mobile device 100 executing the BYOD application.
In operation 4025, the first imaging apparatus 200-1 may perform a scan function. The first imaging apparatus 200-1 may check the UP command received from the mobile device 100 and perform a function corresponding to the UP command.
In operation 4030, the mobile device 100 may receive a result of performing the scan function. In other words, the mobile device 100 may receive the scanned document obtained by the first image forming apparatus 200-1 according to the UP communication method.
Fig. 42 is a diagram of a process of receiving a result of performing a scan function of the first imaging apparatus 200-1 (similar to the imaging apparatus 200 of fig. 1) after the mobile device 100 (similar to the mobile device 100 of fig. 1) performs the scan function.
To process the workflow of "edit after scan" selected by the user, the mobile device 100 executing the BYOD application may first perform the scan function of the first image forming apparatus 200-1 based on the order of the jobs included in the workflow.
Referring to fig. 42, the mobile device 100 executing the BYOD application may execute a scan application installed in the mobile device 100 in order to control the first imaging apparatus 200-1 supporting the scan function. Based on the capability information on the scan function provided from the first imaging apparatus 200-1, the mobile device 100 may display a UI screen for receiving a command to execute the scan function as an execution screen of the scan application. For example, the execution screen of the scan application may be configured by reflecting capability information currently provided by the first image forming apparatus 200-1, such as the original size, the original orientation, the duplex scanning, and the color mode. The mobile device 100 may receive a command to perform a scanning function from a user checking an execution screen of a scanning application.
When a command to perform the scan function is input to the mobile device 100, a UP command containing the scan options set by the user, a storage location for the scanned document, and a file name may be transmitted to the first image forming apparatus 200-1, which performs the scan function.
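The UP command of this step might be serialized as in the sketch below. The key names and the JSON encoding are assumptions for illustration; the text states only that the command carries the scan options, a storage location for the scanned document, and a file name:

```python
import json

# Hypothetical shape of the UP scan command: the scan options set by the user,
# the storage location for the scanned document, and the file name.
def build_scan_command(options, storage_location, file_name):
    return json.dumps({
        "command": "scan",
        "options": options,              # e.g. original size, duplex, color mode
        "storage_location": storage_location,
        "file_name": file_name,
    })

cmd = build_scan_command({"color_mode": "color", "duplex": False},
                         "/scans", "report.pdf")
```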
The first image forming apparatus 200-1 may perform the scan function according to the UP command. While performing the scan function, the first imaging apparatus 200-1 may transmit the state of the scan to the mobile device 100 according to the WebSocket method. For example, when the first imaging apparatus 200-1 performs the scan function, the mobile device 100 may display a pop-up screen indicating that scanning is being performed. When there are several pages to be scanned by the first imaging apparatus 200-1, the mobile device 100 may display a pop-up screen asking the user whether to scan the next page.
When the execution of the scanning function is completed in the first imaging apparatus 200-1, the mobile device 100 may receive the result of the execution of the scanning function. While receiving the scanned document from the first image forming apparatus 200-1, the mobile device 100 may display a pop-up screen indicating that the scanned document is being received.
Upon receiving the scanned document from the first imaging apparatus 200-1, the mobile device 100 may determine that the scan function included in the "edit after scan" workflow is completed, and execute an editing application installed in the mobile device 100 to perform the editing function. The execution screen of the editing application displayed on the mobile device 100 may automatically display the scanned document received from the first imaging apparatus 200-1, thereby preparing to execute the editing function.
Referring back to fig. 40, in operation 4035, the mobile device 100 may receive a command to perform an editing function on the scanned document. In other words, the mobile device 100 executing the BYOD application may receive the command by executing an editing application installed in the mobile device 100. The first imaging apparatus 200-1 may not support the editing function of the mobile device 100. When receiving the scanned document from the first image forming apparatus 200-1, the mobile device 100 may determine that the scan function included in the workflow is completed, and receive, from the user, a command to perform the editing function that follows the scan function.
In operation 4040, the mobile device 100 may perform an editing function on the scanned document received from the first imaging apparatus 200-1 according to a command to perform the editing function. Accordingly, the mobile device 100 can generate an edited document obtained by performing an editing function on the scanned document.
Fig. 43 is a diagram describing an editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1) performed by using resources of the mobile device 100.
As shown in fig. 43, the user can edit the scanned document by using various editing tools displayed together with the scanned document on the execution screen of the editing application of the mobile device 100. The application interworking with the editing tool included in the execution screen of the editing application may be an application embedded in the mobile device 100 or an application installed in the mobile device 100 by the user. For example, a camera tool included in an execution screen of an editing application may interwork with a camera application, a drawing tool may interwork with a drawing pad application, and an image attachment tool may interwork with an album application.
Fig. 44 is a diagram describing a manipulation interface with respect to drawing when an editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1) is executed.
When a drawing tool is selected from the execution screen of the editing application of fig. 43, a manipulation interface for drawing may be displayed, as shown in fig. 44. The user may draw lines or shapes on the image of the scanned document by using the manipulation interface. The manipulation interface may display a pop-up screen for selecting the color of a line and a pop-up screen for selecting the width of a line. To finish drawing, the "X" button on the manipulation interface can be touched to close the drawing tool.
Fig. 45 is a diagram describing a manipulation interface 4502 with respect to an additional image when an editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1) is performed.
When an image attachment tool is selected from the execution screen of the editing application, an album application that interworks with the image attachment tool may be executed. The user may select one of the images stored in the album application. When the image 4504 is selected, the manipulation interface 4502 with respect to an additional image may be displayed, as shown in fig. 45. The user can attach the selected image to the scanned document 4506 by using the manipulation interface 4502. The manipulation interface 4502 may be displayed with the selected image, at the edges of the selected image. For example, as shown in fig. 45, the position of the image may be adjusted by long-pressing the position adjustment button at the lower right of the image, and the image may be inserted at the adjusted position by pressing the OK button at the upper left of the image. Alternatively, the image may be removed by pressing the removal button at the upper right of the image. After the image is attached, the scanned document with the attached image may be stored.
The size or position of the document to be edited can be adjusted by using an image enlargement/reduction/movement tool included in the execution screen of the editing application.
As a result, the workflow of "edit after scan" cannot be processed by the first imaging apparatus 200-1 alone, but can be processed by using the editing application and the various applications, corresponding to resources of the mobile device 100, that interwork with the editing tools.
Fig. 46 is a diagram of a process of processing a workflow combining a scanning function of the first image forming apparatus 200-1 (similar to the image forming apparatus 200 of fig. 1) and an editing function and a document transmission function of the mobile device 100 (similar to the mobile device 100 of fig. 1), according to an exemplary embodiment. Comparing the workflows of fig. 40 and 46, the workflow of fig. 46 also includes the document sending function of the mobile device 100. This workflow of fig. 46 may be named "edit and send after scan".
The mobile device 100 may support an editing function and a document transmission function that are not supported by the first imaging apparatus 200-1. When the scanning function of the first image forming apparatus 200-1 is performed first and the document transmission function of the mobile device 100 is performed last, the workflow may be processed as follows.
Since operations 4605 to 4640 of fig. 46 correspond to operations 4005 to 4040 of fig. 40, details thereof are not provided again, and processing from operation 4645 is described.
In operation 4645, the mobile device 100 may receive a command to perform the document transmission function of the mobile device 100, which is performed last based on the order of the jobs included in the workflow. In other words, the mobile device 100 executing the BYOD application may receive the command to perform the document transmission function by executing a document transmission application installed in the mobile device 100. The mobile device 100 may receive a command to perform the document transmission function on an edited document obtained by editing the scanned document using the editing function of the mobile device 100.
In operation 4650, the mobile device 100 may perform a document transmission function by transmitting the edited document to an external device. In other words, the mobile device 100 may perform a document sending function on the edited document.
Fig. 47 is a diagram of a process of executing a document transmission function of the mobile device 100 (similar to the mobile device 100 of fig. 1) with respect to an edited document obtained by editing a scanned document.
When document editing is finished, the mobile device 100 executing the BYOD application may display a screen for selecting a document to be transmitted in order to process the workflow of "edit and send after scan" selected by the user. The user can check the edited document on this screen. When the user selects the edited document on the mobile device 100 and inputs a command to perform the document transmission function, various applications installed in the mobile device 100 may be displayed. As shown in fig. 47, the user may select an email application in order to send the edited document. The mobile device 100 can transmit the edited document by using the email application and display the document transmission state.
As a result, the workflow of "post-scan editing and transmission" cannot be processed by the first image forming apparatus 200-1 alone, but can be processed by using an editing application and a document transmission application corresponding to resources of the mobile device 100.
Fig. 48 is a diagram of a process of processing a workflow that combines the scanning function of the first imaging apparatus 200-1 (similar to the imaging apparatus 200 of fig. 1) and the editing function and the sharing function of the mobile device 100 (similar to the mobile device 100 of fig. 1), according to an exemplary embodiment. Comparing the workflow of fig. 48 with the workflow of fig. 40, the workflow of fig. 48 also includes the sharing function of the mobile device 100. Such a workflow may be named "edit and share after scan".
The mobile device 100 may support an editing function and a sharing function that are not supported by the first imaging apparatus 200-1. When the scan function of the first imaging apparatus 200-1 is performed first and the share function of the mobile device 100 is performed last, the workflow may be processed as follows.
Since operations 4805 to 4840 of fig. 48 correspond to operations 4005 to 4040 of fig. 40, details thereof will not be provided, and the processing from operation 4845 will be described.
In operation 4845, the mobile device 100 may receive a command to execute a shared function of the mobile device 100 that is last executed based on the order of processing functions included in the workflow. In other words, the mobile device 100 executing the BYOD application may receive a command to execute the sharing function by executing the sharing application installed in the mobile device 100, thereby executing the sharing function. The mobile device 100 may receive a command to perform a sharing function on an edited document obtained by editing the scanned document using an editing function of the mobile device 100.
In operation 4850, the mobile device 100 may perform a sharing function to share an edited document obtained by editing a scanned document using the editing function. In other words, the mobile device 100 may perform a sharing function on the edited document.
Fig. 49 is a diagram of a process of executing a sharing function of the mobile device 100 (similar to the mobile device 100 of fig. 1) with respect to an edited document obtained by editing a scanned document.
When document editing is finished, the mobile device 100 executing the BYOD application may display a screen for selecting a document to be shared in order to process the workflow of "edit and share after scan" selected by the user. The user can check the edited document on this screen. When the user selects the edited document on the mobile device 100 and inputs a command to perform the sharing function, various sharing applications installed in the mobile device 100 may be displayed. As shown in fig. 49, the user may select the "S Note" application 4902 for sending and sharing documents in order to share the edited document. Alternatively, the user may select the "S Note" application 4902 via a shortcut function that separately displays the most recently used sharing application. The mobile device 100 may share the edited document with another mobile device (not shown) by using the sharing application.
As a result, the workflow of "post-scan editing and sharing" cannot be processed by the first imaging apparatus 200-1 alone, but can be processed by using an editing application and a sharing application corresponding to resources of the mobile device 100.
Fig. 50 is a diagram of a process of processing a workflow combining the scanning function of the first image forming apparatus 200-1 (similar to the image forming apparatus 200 of fig. 1), the editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1), and the document transmission function of the first image forming apparatus 200-1, according to an exemplary embodiment. Comparing the workflow of fig. 50 with the workflow of fig. 40, the workflow of fig. 50 further includes the document transmission function of the first image forming apparatus 200-1. Further, comparing the workflow of fig. 50 with the workflow of fig. 46, the entity performing the document transmission function is not the mobile device 100 but the first image forming apparatus 200-1. The workflow of fig. 50 may be named "edit and proxy-send after scan".
The mobile device 100 may support editing functions that are not supported by the first imaging apparatus 200-1. When the scanning function of the first image forming apparatus 200-1 is performed first and the document sending function of the first image forming apparatus 200-1 is performed last, the workflow may be processed as follows.
Since operations 5005 to 5040 of fig. 50 correspond to operations 4005 to 4040 of fig. 40, details thereof are not provided again, and thus processing from operation 5045 will be described.
In operation 5045, the mobile device 100 may receive a command to perform the document transmission function of the first imaging apparatus 200-1, which is performed last based on the order of the functions included in the workflow. In other words, the mobile device 100 executing the BYOD application may receive the command by executing a document transmission application installed in the mobile device 100, and the document transmission function is then performed by the first image forming apparatus 200-1, which supports that function, under the control of the mobile device 100. The mobile device 100 may receive a command to perform the document transmission function of the first image forming apparatus 200-1 on an edited document obtained by editing the scanned document using the editing function of the mobile device 100.
In operation 5050, the mobile device 100 may transmit a command to perform a document transmission function to the first image forming apparatus 200-1. When a command to perform the document transmission function is received by executing the document transmission application installed in the mobile device 100, an UP command corresponding to the command to perform the document transmission function may be transmitted to the first image forming apparatus 200-1 according to the UP communication method, so that the mobile device 100 executing the BYOD application controls the first image forming apparatus 200-1 supporting the document transmission function.
In operation 5055, the first image forming apparatus 200-1 may perform a document sending function on the edited document. The first imaging apparatus 200-1 may check the UP command received from the mobile device 100 and perform a function of the first imaging apparatus 200-1 corresponding to the UP command.
In operation 5060, the mobile device 100 may receive a result of performing the document sending function. For example, the mobile device 100 may receive a state of performing a document transmission function on an edited document from the first imaging apparatus 200-1.
Fig. 51 is a diagram of a process of performing a document transmission function of the first image forming apparatus 200-1 (similar to the image forming apparatus 200 of fig. 1) with respect to an edited document obtained by editing a scanned document.
To process the workflow of "edit and proxy-send after scan" selected by the user, the mobile device 100 (similar to the mobile device 100 of fig. 1) executing the BYOD application may display a screen for selecting a document to be transmitted when document editing is finished. The user can check the edited document on this screen. When the user selects the edited document on the mobile device 100 and inputs a command to perform the document transmission function of the first image forming apparatus 200-1, the mobile device 100 executing the BYOD application may transmit an UP command corresponding to that command to the first image forming apparatus 200-1 according to the UP communication method. The first image forming apparatus 200-1 may receive the destination of the edited document and the UP command together with the edited document from the mobile device 100, and transmit the edited document to the destination. According to such a workflow, when it is difficult for the mobile device 100 itself to transmit the edited document, the document may be transmitted by the first image forming apparatus 200-1, which supports the document transmission function.
As a result, the workflow of "post-scan editing and agent transmission" cannot be separately executed by the first imaging apparatus 200-1, but can be processed by using an editing application corresponding to the resource of the mobile device 100.
Fig. 52 is a flowchart of a method of processing a workflow according to another exemplary embodiment. Details regarding the mobile device 100 (similar to the mobile device 100 of fig. 1) for handling the above workflow may be applied to the method of fig. 52, even if not explicitly mentioned. At least one imaging apparatus (similar to the imaging apparatus 200 of fig. 1) for processing a workflow may be connected to the mobile device 100, and it is assumed that a first imaging apparatus 200-1 and a second imaging apparatus 200-2 (similar to the imaging apparatus 200 of fig. 1) are connected to the mobile device 100 for convenience of description. The first imaging apparatus 200-1 may perform a first function included in the workflow, and the second imaging apparatus 200-2 may perform a second function included in the workflow.
In operation 5210, the mobile device 100 may receive an input selecting a workflow combining a first function supported by the first imaging apparatus 200-1, a function of the mobile device 100, and a second function supported by the second imaging apparatus 200-2. In other words, the user of the mobile device 100 may select a workflow combining the first function supported by the first imaging apparatus 200-1, the function of the mobile device 100, and the second function supported by the second imaging apparatus 200-2. At this time, the mobile device 100 may execute the BYOD application according to the user's BYOD service request and receive the input selecting the workflow according to the execution of the BYOD application. The function of the mobile device 100 included in the workflow may not be supported by the first imaging apparatus 200-1 and the second imaging apparatus 200-2.
In operation 5220, the mobile device 100 may be connected to the first and second imaging apparatuses 200-1 and 200-2 for processing a workflow. The mobile device 100 may be connected with the first and second imaging apparatuses 200-1 and 200-2 by performing the discovery process (similar to the discovery process 650 of fig. 6), the pairing process (similar to the pairing process 660 of fig. 6), and the event registration process (similar to the event registration process 670 of fig. 6) as described above.
The mobile device 100 may perform functions included in the workflow based on an order of processing functions defined in the workflow, and when a first function supported by the first imaging apparatus 200-1 is first performed and a second function supported by the second imaging apparatus 200-2 is finally performed, the workflow may be processed as follows.
In operation 5230, the mobile device 100 may receive a result of performing the first function from the first imaging apparatus 200-1 in response to a command to perform the first function. To receive the result of performing the first function, the mobile device 100 may transmit a command to perform the first function to the first imaging apparatus 200-1 based on the capability information about the first function provided by the first imaging apparatus 200-1.
In operation 5240, the mobile device 100 may perform a function of the mobile device 100 with respect to a result of performing the first function. Here, when the mobile device 100 performs the function of the mobile device 100 with respect to the result of performing the first function, the function of the mobile device 100 may be performed through interworking with an application executable in the mobile device 100.
In operation 5250, the mobile device 100 may receive a result of performing the second function from the second imaging apparatus 200-2 in response to a command to perform the second function with respect to the result of performing the function of the mobile device 100.
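Operations 5230 to 5250 can be sketched as a small driver on the mobile device that runs the jobs in the order defined by the workflow, feeding each step's result into the next. The step functions below merely stand in for the first function, the function of the mobile device, and the second function; all names are illustrative assumptions.

```python
def process_workflow(steps, initial=None):
    """Run workflow steps in order, piping each result into the next step."""
    result = initial
    for name, func in steps:
        result = func(result)  # e.g. scan -> edit -> print
    return result


steps = [
    ("first function (apparatus 200-1)",  lambda _: "scanned document"),
    ("mobile device function",            lambda doc: f"edited {doc}"),
    ("second function (apparatus 200-2)", lambda doc: f"printed {doc}"),
]
print(process_workflow(steps))  # printed edited scanned document
```

This ordering mirrors the case described above in which the first function is performed first and the second function is performed last.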
Hereinafter, a method of processing a workflow combining at least one function supported by the first imaging apparatus 200-1, at least one function supported by the mobile device 100, and at least one function supported by the second imaging apparatus 200-2, and a mobile device 100 that performs the method, will now be described with respect to an example of the workflow.
Fig. 53 is a diagram of a process of processing a workflow combining a scanning function of the first image forming apparatus 200-1 (similar to the image forming apparatus 200 of fig. 1), an editing function of the mobile device 100 (similar to the mobile device 100 of fig. 1), and a printing function of the second image forming apparatus 200-2 (similar to the image forming apparatus 200 of fig. 1), according to an exemplary embodiment. Such a workflow may be named "edit and print after scan".
The mobile device 100 may support an editing function that is not supported by the first imaging apparatus 200-1, and the second imaging apparatus 200-2 may support a printing function that is not supported by the first imaging apparatus 200-1. When the scan function supported by the first image forming apparatus 200-1 is performed first and the print function supported by the second image forming apparatus 200-2 is performed last, the workflow may be processed as follows.
In operation 5305, the mobile device 100 executes a BYOD application according to the BYOD service request and receives an input from the user to select a workflow in which a scanning function of the first image forming apparatus 200-1, an editing function of the mobile device 100, and a printing function of the second image forming apparatus 200-2 are combined.
In operation 5310, the mobile device 100 may be connected to a first imaging apparatus 200-1 for processing a workflow and to a second imaging apparatus 200-2. To connect to the first image forming apparatus 200-1 capable of performing the scan function and the second image forming apparatus 200-2 capable of performing the print function, the mobile device 100 may perform the discovery process (similar to the discovery process 650 of fig. 6), the pairing process (similar to the pairing process 660 of fig. 6), and the event registration process (similar to the event registration process 670 of fig. 6) described above.
Fig. 54 is a diagram describing connection of a first imaging apparatus 200-1 (similar to the imaging apparatus 200 of fig. 1) and a second imaging apparatus 200-2 (similar to the imaging apparatus 200 of fig. 1) for processing a workflow to the mobile device 100 (similar to the mobile device 100 of fig. 1) when the mobile device 100 selects the workflow.
A predefined form of workflow may be displayed on the mobile device 100 executing the BYOD application in accordance with the user's BYOD service request. For example, as shown in fig. 54, a list of workflows such as "edit and print after scan" and "edit and store after scan" may be displayed.
The user may review the list of workflows displayed on the mobile device 100 and select a desired workflow. As shown in fig. 54, the user may select the workflow of "edit and print after scan" on the mobile device 100.
To process a workflow selected by a user, the mobile device 100 executing the BYOD application may be connected to a first image forming apparatus 200-1 performing a scan function and a second image forming apparatus 200-2 performing a print function, which are included in the workflow. To connect to the first imaging apparatus 200-1 and the second imaging apparatus 200-2, the mobile device 100 may perform the discovery process (similar to the discovery process 650 of fig. 6), the pairing process (similar to the pairing process 660 of fig. 6), and the event registration process (similar to the event registration process 670 of fig. 6) described above. The mobile device 100 may collect information about the first imaging apparatus 200-1 and the second imaging apparatus 200-2 to prepare a processing workflow.
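The connection setup above, in which the mobile device runs the discovery, pairing, and event registration processes of fig. 6 against each imaging apparatus in the workflow, can be sketched as follows. The class and method names are illustrative assumptions, not an actual BYOD API.

```python
class Connection:
    """Illustrative connection from the mobile device to one imaging apparatus."""

    def __init__(self, device_name):
        self.device_name = device_name
        self.phases = []  # completed setup phases, in order

    def discover(self):
        self.phases.append("discovery")  # find the apparatus on the network
        return self

    def pair(self):
        self.phases.append("pairing")  # establish a trusted session
        return self

    def register_events(self):
        # Register so the mobile device is notified of job status events.
        self.phases.append("event registration")
        return self


def connect_all(names):
    """Run all three setup phases against every apparatus in the workflow."""
    return [Connection(n).discover().pair().register_events() for n in names]


conns = connect_all(["200-1 (scan)", "200-2 (print)"])
print(conns[0].phases)  # ['discovery', 'pairing', 'event registration']
```

Collecting the resulting connection objects corresponds to the mobile device gathering information about the first and second imaging apparatuses to prepare for processing the workflow.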
Referring back to fig. 53, the mobile device 100 may receive a command to perform the scan function of the first imaging apparatus 200-1, which is performed first, based on the order of the processing functions included in the workflow in operation 5315. In other words, the mobile device 100 executing the BYOD application may receive a command to perform a scan function by executing the scan application installed in the mobile device 100, so that the scan function is performed by the first image forming apparatus 200-1 by controlling the first image forming apparatus 200-1 performing the scan function.
In operation 5320, the mobile device 100 may transmit a command to perform a scan function to the first imaging apparatus 200-1. When a command to perform the scan function is received by executing the scan application installed in the mobile device 100, an UP command corresponding to the command to perform the scan function may be transmitted to the first image forming apparatus 200-1 according to the UP communication method, so that the first image forming apparatus 200-1 supporting the scan function is controlled by the mobile device 100 executing the BYOD application.
In operation 5325, the first imaging apparatus 200-1 may perform a scan function. The first image forming apparatus 200-1 may check the UP command received from the mobile device 100 and perform a scan function corresponding to the UP command.
In operation 5330, the mobile device 100 may receive a result of performing the scan function. In other words, the mobile device 100 may receive the scanned document obtained by the first image forming apparatus 200-1 according to the UP communication method.
In operation 5335, the mobile device 100 may receive a command for the mobile device 100 to perform an editing function on the scanned document. In other words, the mobile device 100 executing the BYOD application may receive the command to perform the editing function by executing an editing application installed in the mobile device 100. The first imaging apparatus 200-1 may not support the editing function of the mobile device 100. When receiving the scanned document from the first image forming apparatus 200-1, the mobile device 100 may determine that execution of the scanning function included in the workflow is complete, and thus receive a command from the user to perform the editing function to be executed next.
In operation 5340, the mobile device 100 may perform an editing function on the scanned document received from the first imaging apparatus 200-1. Accordingly, the mobile device 100 can generate an edited document obtained by performing an editing function on the scanned document.
In operation 5345, the mobile device 100 may receive a command to execute a print function of the second image forming apparatus 200-2, which is executed last based on an order of processing functions included in the workflow. In other words, the mobile device 100 executing the BYOD application may receive a command to execute a print function by executing a print application installed in the mobile device 100, so that the print function is executed by the second image forming apparatus 200-2 by controlling the second image forming apparatus 200-2 supporting the print function. The mobile device 100 may receive a command to execute a printing function of the second image forming apparatus 200-2 with respect to an edited document obtained by editing a scanned document using an editing function of the mobile device 100.
In operation 5350, the mobile device 100 may transmit a command to perform a printing function to the second image forming apparatus 200-2. When a command to execute the print function is received by executing the print application installed in the mobile device 100, an UP command corresponding to the command to execute the print function may be transmitted to the second image forming apparatus 200-2 according to the UP communication method, so that the second image forming apparatus 200-2 supporting the print function is controlled by the mobile device 100 executing the BYOD application.
In operation 5355, the second image forming apparatus 200-2 may perform a printing function on the edited document. The first image forming apparatus 200-1 and the mobile device 100 may not support the printing function supported by the second image forming apparatus 200-2. The second image forming apparatus 200-2 may check the UP command received from the mobile device 100 to perform a printing function corresponding to the UP command.
In operation 5360, the mobile device 100 may receive a result of performing the print function. For example, the mobile device 100 may receive a state of performing a printing function on an edited document from the second imaging apparatus 200-2.
Fig. 55 is a diagram of a process of executing a print function of the second image forming apparatus 200-2 (similar to the image forming apparatus 200 of fig. 1) with respect to an edited document obtained by editing a scanned document.
To handle the workflow of "edit and print after scan" selected by the user, the mobile device 100 (similar to the mobile device 100 of fig. 1) executing the BYOD application may display a screen selecting a document to be printed after the end of file editing. The user can check the edited document from the screen selecting the document to be printed. The user may select an edited document from the mobile device 100, execute a print application for controlling the second image forming apparatus 200-2, and input a command to perform a print function of the second image forming apparatus 200-2. Accordingly, the mobile device 100 executing the BYOD application may transmit an UP command corresponding to a command to execute a print function to the second image forming apparatus 200-2 according to the UP communication method. The second imaging apparatus 200-2 may perform a printing function on the edited document received from the mobile device 100. Accordingly, the mobile device 100 may receive a state of executing the printing function performed by the second image forming apparatus 200-2.
As a result, the workflow of "post-scan editing and printing" cannot be separately executed by the first imaging apparatus 200-1 or the second imaging apparatus 200-2, but can be processed by using an editing application corresponding to the resource of the mobile device 100.
The process of generating a worksheet defining a workflow has been described above with reference to figs. 20 to 34, and the process of executing jobs according to a workflow defined by a worksheet has been described above with reference to figs. 38 to 55.
In the above-described embodiments, however, a worksheet is generated by combining basic functions of the imaging apparatus and the mobile device (for example, scanning and printing in the imaging apparatus, and capturing, editing, and sending an image in the mobile device).
However, a worksheet may be generated not only by combining functions or applications typically included in the imaging apparatus and the mobile device (e.g., out-of-box (OOB) applications, hereinafter referred to as "basic applications"), but also by combining an application provided by a third party (hereinafter referred to as a "third party application") with the basic applications, or by combining only third party applications.
The third party application is an application prepared by a third party according to various needs and allows the imaging apparatus or the mobile device to perform functions that are not typically supported by the imaging apparatus or the mobile device. Third party applications are typically not installed in the imaging apparatus or mobile device, but may be downloaded from an online store (e.g., Google store or Appstore) if the user so desires. Meanwhile, various third party applications may be used to generate worksheets, regardless of the type of worksheet.
In the exemplary embodiments described below, the worksheet is generated using a third party application, such as an Optical Character Reader (OCR) application, an email client application, a translation application, an application that provides travel information, or an application that performs image recognition and searching. However, the illustrative embodiments are not so limited and any type of third party application suitable for generating a worksheet may be used.
Fig. 56A illustrates a UI screen 5600a for generating a worksheet by combining applications according to an exemplary embodiment. The UI screen 5600a of fig. 56A may be displayed on a display panel of an imaging apparatus or a screen of a mobile device. Referring to fig. 56A, the UI screen 5600a displays lists 5610a and 5620a of applications available for generating a worksheet. The user can select an application having a function to be included in the worksheet from the lists 5610a and 5620a and then drag and drop the selected application to the worksheet generation area 5630a to generate a worksheet including the selected application.
Here, the lists 5610a and 5620a include not only basic applications of the imaging apparatus and the mobile device but also various third party applications. Since the third party application is installed in the imaging apparatus or the mobile device, when the UI screen 5600a is displayed on the screen of the mobile device, the user may be able to check the basic application and the third party application installed in the imaging apparatus through the mobile device. On the other hand, when the UI screen 5600a is displayed on the display panel of the imaging apparatus, the user may be able to check the basic application and the third party application installed in the mobile device through the imaging apparatus.
In fig. 56A, a worksheet is generated by combining a scan application and a box application, which are basic applications of the imaging apparatus, an email application, which is a basic application of the mobile device, and an OCR application, which is a third party application. Here, the OCR application may be installed in either the imaging apparatus or the mobile device.
When the worksheet generated in fig. 56A is executed, the image forming apparatus scans a document to obtain a scanned image, the OCR application installed in either the image forming apparatus or the mobile device generates a document using text information extracted by reading the scanned image, the mobile device sends the generated document via email, and the image forming apparatus stores the generated document in an assigned folder via the box application.
In other words, by including the third party application in the worksheet, a worksheet including OCR functions unsupported by the basic application of the imaging apparatus or the mobile device can be executed.
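The four-application chain of fig. 56A can be sketched as a simple pipeline: scan, OCR, email, and box storage. The stand-in implementations below are illustrative only; they model the data handed from one application to the next rather than any real OCR or email API.

```python
def scan():
    """Scan application (imaging apparatus): produce a scanned image."""
    return {"type": "image", "data": "scanned page"}


def ocr(scanned):
    """OCR application (either device): extract text and build a document."""
    return {"type": "text", "data": f"text of {scanned['data']}"}


def send_email(document, to):
    """Email application (mobile device): send the generated document."""
    return f"emailed {document['data']} to {to}"


def store_in_box(document, folder):
    """Box application (imaging apparatus): store the document in a folder."""
    return f"stored {document['data']} in {folder}"


image = scan()
doc = ocr(image)
results = [send_email(doc, "user@example.com"), store_in_box(doc, "/inbox")]
print(results[1])  # stored text of scanned page in /inbox
```

Swapping the `ocr` step for a different third party application changes the worksheet without affecting the surrounding basic applications.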
The UI screens 5600B and 5600C, the lists 5610B, 5620B, 5610C, and 5620C, and the worksheet generation areas 5630B and 5630C of figs. 56B and 56C operate in a manner similar to that described with reference to fig. 56A.
Fig. 56B is a diagram describing an embodiment of generating a worksheet in which, when a URL at which content is stored is transmitted to the image forming apparatus, the image forming apparatus accesses the URL to download the content and then prints the content. In other words, in fig. 56B, the worksheet is generated by combining a URL sending application and a print application.
When the worksheet generated in fig. 56B is executed, the mobile device transmits the URL at which the content is stored to the image forming apparatus, and the image forming apparatus downloads and prints the content by using the received URL. When the user finds content he/she wants to print while browsing the web on the mobile device, the user executes the worksheet defined in fig. 56B. When the worksheet is executed, the URL sending application is automatically executed and transmits the URL at which the content is stored to the image forming apparatus. Alternatively, when the user executes the URL sending application while executing the worksheet, the user may directly input the URL at which the content is stored. In this way, since the mobile device transmits only the URL to the image forming apparatus, rather than downloading the content and transmitting it to the image forming apparatus directly, the network load during transmission can be reduced.
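The "send the URL, not the content" design of fig. 56B can be sketched as below: the mobile device hands over only the URL, and the apparatus resolves it itself, so the content crosses the network once instead of twice. The `fetch` function is a stand-in for an HTTP download; a real apparatus would issue an HTTP GET instead of a dictionary lookup.

```python
def fetch(url, store):
    """Stand-in for downloading `url`; `store` models the remote server."""
    return store[url]


def print_from_url(url, store):
    # The image forming apparatus downloads the content itself, using only
    # the URL received from the mobile device.
    content = fetch(url, store)
    return f"printed: {content}"


web = {"https://example.com/doc": "quarterly report"}
print(print_from_url("https://example.com/doc", web))  # printed: quarterly report
```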
Fig. 56C is a diagram describing an embodiment of generating a work table in which the image forming apparatus downloads and prints contents by accessing a received URI when a Uniform Resource Identifier (URI) of a directory where contents are stored is transmitted to the image forming apparatus through a box application. In other words, in fig. 56C, the work sheet is generated by combining the box application and the print application.
Here, the user may set the worksheet to be periodically executed. When the job table is set to be periodically executed, the image forming apparatus may download and print the contents stored in the directory corresponding to the URI each time the job table is executed. In other words, the user can update the contents only to the directory corresponding to the URI and automatically print the updated contents according to the set period.
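The periodic execution described above can be sketched as a small scheduling check. This is a minimal illustrative helper, not from the source; the function name and the policy of counting whole elapsed periods are assumptions.

```python
from datetime import datetime, timedelta

def due_runs(last_run, period, now):
    """Number of whole periods elapsed since last_run.

    Each due run would trigger one download-and-print of the content
    stored in the directory corresponding to the URI.
    """
    if now <= last_run:
        return 0
    # timedelta / timedelta yields a float ratio in Python 3.
    return int((now - last_run) / period)
```

For example, with a 24-hour period and 50 hours elapsed, two runs are due; the scheduler would execute the worksheet twice (or once, catching up, depending on the policy chosen).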
Hereinafter, embodiments of generating worksheets by combining various basic applications and third party applications will be described with reference to figs. 57 to 63.
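The embodiments that follow all share one structure: an input application produces data, zero or more processing applications transform it, and one or more output applications consume it. A minimal sketch of that composition is below; the class name and callable-based API are hypothetical stand-ins, since the patent does not define a concrete programming interface.

```python
class Worksheet:
    """Chains one input application, zero or more processing
    applications, and one or more output applications, in the
    defined order (illustrative sketch only)."""

    def __init__(self, input_app, processing_apps=(), output_apps=()):
        self.input_app = input_app                 # e.g. scan, camera, GPS
        self.processing_apps = list(processing_apps)  # e.g. OCR, translation
        self.output_apps = list(output_apps)       # e.g. print, email, FTP

    def execute(self):
        data = self.input_app()
        for process in self.processing_apps:
            data = process(data)
        # Fan the processed data out to every output application.
        return [output(data) for output in self.output_apps]
```

A scan → OCR → print-and-email worksheet would then be `Worksheet(scan, [ocr], [print_app, email_app]).execute()`, with each name standing in for the corresponding installed application.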
Fig. 57 is a diagram describing an embodiment of generating a worksheet for controlling power supplied to the imaging apparatus according to the location of the mobile device.
Referring to fig. 57, a worksheet is generated by combining an application (e.g., a "Global Positioning System (GPS)" application) that determines location information of the device and an application (e.g., a "power" application) that controls power of the imaging apparatus. According to the generated worksheet, the location of the mobile device is determined by the "GPS" application installed in the mobile device, and when the mobile device transmits the location information to the imaging apparatus, the "power" application installed in the imaging apparatus controls the power supplied to the imaging apparatus according to the received location information. For example, when the mobile device is far from the imaging apparatus, the imaging apparatus may enter a sleep mode or may be turned off.
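The distance-based power decision might look like the sketch below. The thresholds and the planar distance calculation are purely illustrative; the source only says power is controlled "according to the received location information".

```python
import math

def power_mode(mobile_xy, printer_xy, sleep_m=50.0, off_m=500.0):
    """Pick a power state for the imaging apparatus from the mobile
    device's distance to it (coordinates in meters, illustrative)."""
    dx = mobile_xy[0] - printer_xy[0]
    dy = mobile_xy[1] - printer_xy[1]
    distance = math.hypot(dx, dy)
    if distance >= off_m:      # user is far away: turn off
        return "off"
    if distance >= sleep_m:    # user is nearby but not close: sleep
        return "sleep"
    return "ready"             # user is close: keep ready to print
```

Real GPS coordinates would need a geodesic distance (e.g. haversine) rather than this planar approximation.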
Fig. 58 is a diagram describing an embodiment of generating a worksheet for changing the settings of the imaging apparatus according to the temperature measured by the mobile device.
Referring to fig. 58, a worksheet is generated by combining an application (e.g., a "temperature" application) that measures temperature by using a sensor of the mobile device and an application (e.g., a "setting" application) that controls settings of the imaging apparatus. According to the generated worksheet, the temperature is measured by the "temperature" application installed in the mobile device, and when the mobile device transmits the measured temperature to the imaging apparatus, the "setting" application installed in the imaging apparatus changes the settings of the imaging apparatus according to the received temperature. For example, various options and settings of the imaging apparatus may be changed to obtain the best print quality at the measured temperature. Here, an application that measures humidity may be used instead of the "temperature" application, or an application that measures both temperature and humidity may be used.
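A temperature-and-humidity-driven settings change could be sketched as below. The setting names, thresholds, and offsets are invented for illustration; actual print-quality tuning is device-specific and is not detailed in the source.

```python
def print_settings_for(temperature_c, humidity_pct=None):
    """Derive illustrative print-quality adjustments from the
    temperature (and optionally humidity) reported by the mobile device."""
    settings = {"fuser_temp_offset": 0, "transfer_voltage_offset": 0}
    if temperature_c < 10:
        settings["fuser_temp_offset"] = 5    # cold room: run fuser hotter
    elif temperature_c > 30:
        settings["fuser_temp_offset"] = -5   # hot room: run fuser cooler
    if humidity_pct is not None and humidity_pct > 70:
        settings["transfer_voltage_offset"] = 2  # humid paper needs more bias
    return settings
```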
Fig. 59 is a diagram describing an embodiment of generating a worksheet for downloading travel information corresponding to the location of the mobile device and then printing or transmitting/storing the downloaded travel information. The UI screen 5900 for generating the worksheet of fig. 59 differs from the screens of figs. 56 to 58; the UI screen for generating a worksheet may vary.
Referring to fig. 59, in the UI screen 5900, a worksheet is generated by selecting the location determination application 5901 as an input application, the travel information application 5902 as a processing application, and the email application 5904, the FTP server storage application 5905, and the print application 5906 as output applications.
When the worksheet of fig. 59 is executed, the location determination application 5901 installed in the mobile device determines the location of the mobile device. At this time, a GPS sensor included in the mobile device may be used, or the user may directly input the location information. Then, the travel information application 5902 installed in the mobile device receives travel information (e.g., information about lodging and restaurants) corresponding to the location information from the internet or from a server provided by a third party. Upon receipt of the travel information, the mobile device executes the email application 5904 to send the travel information to a pre-assigned email address and executes the FTP server storage application 5905 to store the travel information in a pre-assigned FTP server. Further, the mobile device transmits the travel information to the image forming apparatus, and the image forming apparatus executes the print application 5906 to print the travel information.
Accordingly, the user can access an image forming apparatus provided, for example, at a hotel at a travel destination to execute the worksheet, thereby conveniently printing travel information.
Fig. 60 is a diagram describing an embodiment of generating a worksheet for filtering received emails according to a specific criterion and then printing or transmitting/storing the filtered emails.
Referring to fig. 60, in the UI screen 6000, a worksheet is generated by selecting the email client application 6001 as an input application, the filtering application 6002 as a processing application, and the email application 6004, the FTP server storage application 6005, and the printing application 6006 as output applications.
When the worksheet of fig. 60 is executed, the email client application 6001 installed in the mobile device checks and manages emails received by the user. Then, the filtering application 6002 installed in the mobile device filters the received emails according to a condition set in advance. For example, the filtering application 6002 may extract only emails whose header or content includes a particular phrase. The emails may also be configured to be filtered according to various other criteria. The mobile device may execute the email application 6004 and the FTP server storage application 6005 to send the extracted emails via email or store them in the FTP server. Here, the mobile device may transmit or store at least one of the body of an extracted email and a file attached to the extracted email. Further, the mobile device may transmit at least one of the body of the extracted email and a file attached to the extracted email to the imaging apparatus, and the imaging apparatus may execute the print application 6006 to print the at least one of the body and the file.
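The phrase-based filtering step can be sketched as below. The dictionary shape of an email (with `subject` and `body` keys) is an assumption made for illustration; the source does not specify how the filtering application represents messages.

```python
def filter_emails(emails, phrase):
    """Keep only emails whose subject or body contains the given
    phrase, case-insensitively (illustrative filter condition)."""
    phrase = phrase.lower()
    return [mail for mail in emails
            if phrase in mail.get("subject", "").lower()
            or phrase in mail.get("body", "").lower()]
```

Other criteria mentioned in the text (sender, attachments, and so on) would be additional predicates combined with this one.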
Fig. 61 is a diagram describing an embodiment of generating a worksheet for printing or transmitting/storing a translation of text extracted from a scanned image of a document.
Referring to fig. 61, in the UI screen 6100, a worksheet is generated by selecting the scanning application 6101 as an input application, the OCR application 6102 and the translation application 6103 as processing applications, and the email application 6104, the FTP server storage application 6105, and the print application 6106 as output applications.
When the worksheet of fig. 61 is executed and the user places a document on the scanner of the imaging apparatus, the imaging apparatus scans the document via the scanning application 6101 to obtain a scanned image and transmits the scanned image to the mobile device. The OCR application 6102 installed in the mobile device extracts text from the scanned image. The translation application 6103 then translates the extracted text into a pre-specified language. When the translation is complete, the mobile device executes the email application 6104 and the FTP server storage application 6105 to send the translated text via email or store the translated text in the FTP server. Further, the mobile device can send the translated text to the imaging apparatus, and the imaging apparatus can execute the print application 6106 to print the translated text.
Fig. 62 is a diagram describing an embodiment of generating a worksheet for transmitting/storing a file obtained by combining a voice with a scanned image of a document.
Referring to fig. 62, in the UI screen 6200, a worksheet is generated by selecting the scanning application 6201 as an input application, selecting the voice recording application 6202 as a processing application, and selecting the email application 6204 and the FTP server storage application 6205 as output applications.
When the worksheet of fig. 62 is executed and the user places a document on the scanner of the image forming apparatus, the image forming apparatus scans the document through the scanning application 6201 to obtain a scanned image and transmits the scanned image to the mobile device. The mobile device executes the voice recording application 6202 to store a file in which the scanned image and a voice message are combined. In other words, when the user records a voice message while viewing a preview of a specific page of the scanned image, a file is stored in which the voice message is attached to that specific page. When the recording of the voice message and the generation of the file are complete, the mobile device may execute the email application 6204 and the FTP server storage application 6205 to send the file via email or store the file in the FTP server.
Fig. 63 is a diagram describing an embodiment of generating a worksheet for automatically generating a file name for a scanned image of a document by recognizing and searching for an image included in the scanned image.
Referring to fig. 63, in the UI screen 6300, a worksheet is generated by selecting the scan application 6301 as an input application, the image recognition/search application 6302 and the automatic naming application 6303 as processing applications, and the email application 6304 and the FTP server storage application 6305 as output applications.
When the worksheet of fig. 63 is executed and the user places a document on the scanner of the image forming apparatus, the image forming apparatus scans the document via the scan application 6301 to obtain a scanned image and transmits the scanned image to the mobile device. The mobile device recognizes at least one image included in the scanned image by executing the image recognition/search application 6302 and searches for the recognized image on the internet. For example, when the scanned image includes a car image, the image recognition/search application 6302 may recognize the car image in the scanned image and search for the car image on the internet, thereby determining that a "car" is included in the scanned image. Then, the mobile device may execute the automatic naming application 6303 to automatically generate a file name based on the result of recognizing/searching for the recognized image, or may provide a plurality of file name candidates to the user and receive a user input selecting one of the plurality of file name candidates. When the file name is determined, the mobile device may store the scanned image under the determined file name and execute the email application 6304 or the FTP server storage application 6305 to transmit the scanned image via email or store the scanned image in the FTP server.
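The candidate-generation step of the automatic naming application might look like the sketch below. The `(label, score)` pair shape and the date suffix are assumptions for illustration; the source only says candidates are derived from the recognition/search result.

```python
from datetime import date

def filename_candidates(labels, ext=".pdf"):
    """Build file-name candidates from image-recognition labels,
    most confident first. `labels` is a list of (name, score) pairs."""
    ranked = sorted(labels, key=lambda pair: pair[1], reverse=True)
    today = date.today().isoformat()  # disambiguate repeated scans by date
    return [f"{name}_{today}{ext}" for name, _score in ranked]
```

The user would then pick one candidate, or the first candidate would be used automatically.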
Worksheets may also be generated according to various other scenarios.
For example, a worksheet may be generated such that regional OCR is performed on a scanned image to recognize text in a specific region of the scanned image, a document is generated by inputting the recognized text into a preset form, and the generated document is then transmitted/stored or printed. Such a worksheet may be used to extract information such as the hospital name, expense items, and costs from a receipt issued by a hospital and to automatically enter the extracted information into a hospital expense form.
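Mapping regional-OCR output into a preset form, as in the hospital-receipt example, can be sketched as below. The region names and form field names are hypothetical; the source does not specify how regions are keyed.

```python
def fill_form(region_texts, form_fields):
    """Fill a preset form from regional-OCR results.

    region_texts: dict mapping an OCR region name to its recognized text.
    form_fields:  dict mapping a form field name to the OCR region
                  that field should be read from.
    """
    return {field: region_texts.get(region, "")  # blank if region missing
            for field, region in form_fields.items()}
```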
Alternatively, when a text message received by the mobile device includes the URL of a file, the file may be downloaded by accessing the URL and may then be transmitted/stored or printed. Here, a worksheet may be generated such that filtering is performed to extract text messages based on the sender, the content, or the file extension, and the file is downloaded only for the extracted text messages.
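The extension-based part of that filtering can be sketched as below. The URL pattern and the set of "printable" extensions are assumptions for illustration only.

```python
import re
from urllib.parse import urlparse

PRINTABLE_EXTS = {".pdf", ".doc", ".docx", ".jpg", ".png"}

def printable_urls(message_text, allowed_exts=PRINTABLE_EXTS):
    """Extract URLs from a text-message body and keep only those whose
    path ends in an extension the worksheet is allowed to download."""
    urls = re.findall(r"https?://\S+", message_text)
    keep = []
    for url in urls:
        path = urlparse(url).path.lower()
        if any(path.endswith(ext) for ext in allowed_exts):
            keep.append(url)
    return keep
```

Sender- and content-based filtering would be additional predicates applied to the message before URL extraction.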
Alternatively, consider the case where the mobile device is connected to the image forming apparatus while within a certain distance of it, and the user connects a USB memory to the image forming apparatus and requests the image forming apparatus to print a file stored in the USB memory, but the file is not in a directly printable format. A worksheet may be generated such that the image forming apparatus transmits the file to the mobile device; the mobile device renders the file, converts it into print data, and transmits the print data to the image forming apparatus; and the image forming apparatus prints the print data.
Hereinafter, a method of providing a secure printing solution in a BYOD environment will be described. The exemplary embodiments described below relate to a Mobile Device Management (MDM) or Mobile Application Management (MAM) environment.
Fig. 64 is a diagram of an environment that provides a secure printing solution in a BYOD environment, according to an exemplary embodiment. Referring to fig. 64, the PC 6410, the mobile device 6420, and the imaging apparatus 6430 are under MDM by the MDM server 6440. A print driver is installed in the PC 6410, and the mobile device 6420 is registered in the print driver. Further, a secure printing application is installed in each of the mobile device 6420 and the image forming apparatus 6430.
In the MDM environment, when the PC 6410 performs printing, the print driver generates print data and transmits the print data to the mobile device 6420. The mobile device 6420 stores print data, and the user can check the print data and request to print the print data by executing a secure print application in the mobile device 6420. Upon receiving the print request, the secure print application in the mobile device 6420 may request the secure print application in the image forming apparatus 6430 to print the print data by transmitting the print data.
Here, secure printing applications installed in the mobile device 6420 and the imaging apparatus 6430 typically operate only when the mobile device 6420 and the imaging apparatus 6430 are in the MDM environment. Therefore, when the mobile device 6420 and the imaging apparatus 6430 are not in the MDM environment, the user cannot check or print the print data even when the user executes a secure print application in the mobile device 6420.
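The MDM-gated behavior described above can be sketched as a small state machine: stored print data is only listable and releasable while both endpoints are under MDM. The class and method names are hypothetical; the patent does not define the secure print application's API.

```python
class SecurePrintApp:
    """Illustrative sketch of a secure print application whose
    operations are permitted only inside the MDM environment."""

    def __init__(self, under_mdm):
        self.under_mdm = under_mdm
        self._jobs = []

    def store(self, print_data):
        # Print data from the PC's print driver is stored unconditionally.
        self._jobs.append(print_data)

    def list_jobs(self):
        # Outside MDM, the user cannot even check the stored print data.
        if not self.under_mdm:
            raise PermissionError("not in MDM environment")
        return list(self._jobs)

    def release_to(self, printer_app):
        # Both the mobile device's and the printer's apps must be under MDM.
        if not (self.under_mdm and printer_app.under_mdm):
            raise PermissionError("not in MDM environment")
        return [printer_app.print_job(job) for job in self.list_jobs()]

    def print_job(self, data):
        return f"printed:{data}"
```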
A process of installing a print driver and a secure print application and executing printing will now be described with reference to fig. 65A to 65C.
Referring to fig. 65A, a secure print application is installed in each of a mobile device 6420 and an image forming apparatus 6430. The print driver is installed in the PC 6410, and the mobile device 6420 is registered in the print driver while the print driver is installed.
Referring to fig. 65B, in the MDM environment, when the PC 6410 receives a print request using the mobile device 6420 registered in the print driver as an output device, the print driver generates print data and transmits the print data to the mobile device 6420. The mobile device 6420 stores print data.
Referring to fig. 65C, in the MDM environment, when a user executes a secure print application in the mobile device 6420, the user can check print data received from the PC 6410 and request printing of the print data. Here, printing of the print data may be requested in any of various manners. For example, when the mobile device 6420 is connected to the imaging apparatus 6430 by an NFC tag, or by a camera of the mobile device 6420 scanning a Quick Response (QR) code containing ID information of the imaging apparatus 6430, printing of print data may be requested. Printing of the print data may be requested via other methods.
When the user requests printing of the print data, the secure print application of the mobile device 6420 requests the image forming apparatus 6430 to print the print data by transmitting the print data to the secure print application of the image forming apparatus 6430. The image forming apparatus 6430 prints print data.
Other exemplary embodiments of generating and executing a worksheet will now be described with reference to figs. 66 to 70.
Fig. 66 is a diagram describing an embodiment of generating and executing a worksheet for scanning a document to obtain a scanned image, generating a file name by using information obtained via OCR on a specific area of the scanned image, and then storing the scanned image in a Server Message Block (SMB) server.
Referring to fig. 66, a first screen 6600a is a UI screen for generating a worksheet, and a second screen 6600b is a UI screen displayed when the worksheet is executed. In the first screen 6600a, a worksheet is generated by selecting the scan application 6601 as an input application, the regional OCR application 6602 as a process application, and the SMB application 6603 as an output application. The first screen 6600a and the second screen 6600b may be displayed on either one of the screen of the mobile device and the display panel of the imaging apparatus, or may be simultaneously displayed on both the screen of the mobile device and the display panel of the imaging apparatus.
When the worksheet generated according to the first screen 6600a of fig. 66 is executed, the processing is started when the second screen 6600b is displayed. First, the image forming apparatus obtains a scanned image by scanning a document. Then, the regional OCR application 6602 performs OCR on a specific region of the scanned image. Here, the specific area where the OCR is performed may be an area of the document previously designated by the user, or may be an area where text having at least a specific size exists. Further, the specific region may be set differently. When the OCR is completed, the SMB application 6603 automatically generates a file name by using the text obtained as a result of the OCR, and stores the scanned image in the SMB server under the generated file name.
Accordingly, by generating a file name using a text obtained by performing OCR on a specific area of a scanned image, a file name for recognizing the scanned image can be automatically generated without a user specifying the file name.
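Turning the regional-OCR text into a usable file name requires some cleanup (OCR output can contain punctuation and stray whitespace). The sanitization rules below are illustrative assumptions, not from the source.

```python
import re

def filename_from_ocr(text, ext=".pdf", max_len=64):
    """Derive a safe file name from regional-OCR text (illustrative rules:
    drop punctuation, collapse whitespace to underscores, cap the length)."""
    name = re.sub(r"[^\w\- ]", "", text).strip()  # keep word chars, -, space
    name = re.sub(r"\s+", "_", name)
    if not name:
        name = "scan"  # fall back when OCR found nothing usable
    return name[:max_len] + ext
```

The SMB application would then store the scanned image under this name; a real implementation would also need to handle name collisions on the server.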
Meanwhile, the above describes an exemplary embodiment in which a document is scanned and a file name is generated by using text obtained by performing OCR on a specific region of the document, but alternatively, a target other than the document may be scanned and the file name may be generated by using information obtained by performing image recognition on a specific region of the target.
The regional OCR application 6602 and the SMB application 6603 may be installed in any of the mobile device and the imaging apparatus. In other words, both the regional OCR application 6602 and the SMB application 6603 may be installed in the mobile device or the imaging apparatus. Alternatively, the regional OCR application 6602 may be installed in the mobile device and the SMB application 6603 may be installed in the imaging apparatus, or the regional OCR application 6602 may be installed in the imaging apparatus and the SMB application 6603 may be installed in the mobile device.
Meanwhile, the progress status of the worksheet may be displayed on the second screen 6600b. The second screen 6600b shown in fig. 66 may display that scanning is currently being performed and which process is to be performed next.
Fig. 67 is a diagram describing an embodiment of generating and executing a worksheet for scanning a document to obtain a scanned image, generating a file name by using information obtained via OCR on a specific area of the scanned image, and then storing the scanned image by a cloud document management application provided by a third party.
Referring to fig. 67, a first screen 6700a is a UI screen for generating a worksheet, and a second screen 6700b is a UI screen displayed when the worksheet is executed. In the first screen 6700a, a worksheet is generated by selecting the scan application 6701 as an input application, the regional OCR application 6702 as a processing application, and the cloud note application 6703 as an output application. The first screen 6700a and the second screen 6700b may be displayed on any one of the screen of the mobile device and the display panel of the imaging apparatus, or may be simultaneously displayed on both the screen of the mobile device and the display panel of the imaging apparatus.
When the worksheet generated according to the first screen 6700a of fig. 67 is executed, the processing is started when the second screen 6700b is displayed. First, the image forming apparatus obtains a scanned image by scanning a document. Then, the regional OCR application 6702 performs OCR on a specific region of the scanned image. Here, the specific region on which the OCR is performed may be a region of the document previously designated by the user, or may be a region in which text having at least a specific size exists. Further, the specific region may be set differently. When the OCR is completed, the cloud note application 6703 automatically generates a file name by using the text obtained as a result of the OCR and stores the scanned image in the cloud server under the generated file name. Here, the cloud note application 6703 is an application provided by a third party and may perform a function of storing and managing documents or images in a cloud server.
Accordingly, by generating a file name using a text obtained by performing OCR on a specific area of a scanned image, a file name for recognizing the scanned image can be automatically generated without a user specifying the file name.
Meanwhile, the above describes an exemplary embodiment in which a document is scanned and a file name is generated by using text obtained by performing OCR on a specific region of the document, but alternatively, a target other than the document may be scanned and the file name may be generated by using information obtained by performing image recognition on a specific region of the target.
The regional OCR application 6702 and the cloud note application 6703 may be installed in any one of the mobile device and the imaging apparatus. In other words, the regional OCR application 6702 and the cloud note application 6703 may both be installed in the mobile device or the imaging apparatus. Alternatively, the regional OCR application 6702 may be installed in the mobile device and the cloud note application 6703 may be installed in the imaging apparatus, or the regional OCR application 6702 may be installed in the imaging apparatus and the cloud note application 6703 may be installed in the mobile device.
Meanwhile, the progress status of the worksheet may be displayed on the second screen 6700b. The second screen 6700b of fig. 67 may display that regional OCR is currently being performed after the scanning is completed, and which process is to be performed next.
Fig. 68 is a diagram describing an embodiment of generating and executing a worksheet for capturing an image of a document, generating a file name by using information obtained via OCR on a specific area of the image, and then storing the image by using a cloud document management application provided by a third party.
Referring to fig. 68, a first screen 6800a is a UI screen for generating a worksheet, and a second screen 6800b is a UI screen displayed when the worksheet is executed. In the first screen 6800a, a worksheet is generated by selecting the camera application 6801 as an input application, the regional OCR application 6802 as a processing application, and the cloud note application 6803 as an output application. The first screen 6800a and the second screen 6800b may be displayed on any one of the screen of the mobile device and the display panel of the imaging apparatus, or may be simultaneously displayed on both the screen of the mobile device and the display panel of the imaging apparatus.
When the worksheet generated according to the first screen 6800a of fig. 68 is executed, the processing is started when the second screen 6800b is displayed. First, the mobile device captures an image of a document by using a camera included in the mobile device. Then, the regional OCR application 6802 performs OCR on a specific region of the image. Here, the specific region on which the OCR is performed may be a region of the document previously designated by the user, or may be a region in which text having at least a specific size exists. Further, the specific region may be set differently. When the OCR is completed, the cloud note application 6803 automatically generates a file name by using the text obtained as a result of the OCR and stores the image in the cloud server under the generated file name. Here, the cloud note application 6803 is an application provided by a third party and may perform a function of storing and managing documents or images in a cloud server.
Meanwhile, the above describes an exemplary embodiment in which an image of a document is captured and a file name is generated by using text obtained by performing OCR on a specific region of the image, but alternatively, a target other than a document may be captured and the file name may be generated by using information obtained by performing image recognition on a specific region of the target.
The regional OCR application 6802 and the cloud note application 6803 may be installed in any of the mobile device and the imaging apparatus. In other words, both the regional OCR application 6802 and the cloud note application 6803 may be installed in the mobile device or the imaging apparatus. Alternatively, the regional OCR application 6802 may be installed in the mobile device and the cloud note application 6803 may be installed in the imaging apparatus, or the regional OCR application 6802 may be installed in the imaging apparatus and the cloud note application 6803 may be installed in the mobile device.
Meanwhile, the progress status of the worksheet may be displayed on the second screen 6800b. The second screen 6800b of fig. 68 may display that regional OCR is currently being performed after the image of the document is captured, and which process is to be performed next.
Fig. 69 is a diagram describing an embodiment of generating and executing a worksheet for scanning a business card to obtain a scanned image, obtaining an email address from the scanned image, and then sending a file to the email address.
Referring to fig. 69, a first screen 6900a is a UI screen for generating a worksheet, and a second screen 6900b is a UI screen displayed when the worksheet is executed. In the first screen 6900a, a worksheet is generated by selecting the scan application 6901 as an input application, the card recognition application 6902 as a processing application, and the email application 6903 as an output application. The first screen 6900a and the second screen 6900b may be displayed on any one of the screen of the mobile device and the display panel of the imaging apparatus, or may be simultaneously displayed on both the screen of the mobile device and the display panel of the imaging apparatus.
When the worksheet generated according to the first screen 6900a of fig. 69 is executed, the processing is started when the second screen 6900b is displayed. First, the image forming apparatus scans a business card to obtain a scanned image. Then, the card recognition application 6902 recognizes and obtains an email address from the scanned image. Here, the email address may instead be obtained from the scanned image by using a regional OCR application rather than the card recognition application 6902. When the email address is obtained, the email application 6903 sends a file to the email address. Here, the file may be designated in advance by the user. For example, the transmission of an event coupon may be pre-designated, or the transmission of business documents, such as meeting documents, may be pre-designated.
Meanwhile, the above describes an exemplary embodiment in which a business card is scanned to obtain a scanned image, an email address is obtained from the scanned image, and a file is sent to the email address. Alternatively, the business card may be photographed and the email address obtained from the captured image of the business card, or a destination other than an email address (such as a telephone number) may be obtained from the scanned or captured image of the business card, and the file may be sent to that destination.
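Extracting the email address and telephone number from the recognized card text can be sketched with simple patterns. The regular expressions below are rough illustrations; production card recognition would need far more robust parsing.

```python
import re

def parse_business_card(text):
    """Pull an email address and a phone number out of OCR'd business
    card text (illustrative patterns, not exhaustive)."""
    email = re.search(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)
    phone = re.search(r"\+?\d[\d\- ]{7,}\d", text)
    return {"email": email.group(0) if email else None,
            "phone": phone.group(0) if phone else None}
```

The email application would then use the `email` field as the destination, or the `phone` field when sending to a destination other than an email address.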
The card recognition application 6902 and the email application 6903 may be installed in any one of the mobile device and the image forming apparatus. In other words, both the card recognition application 6902 and the email application 6903 may be installed in the mobile device or the image forming apparatus. Alternatively, the card recognition application 6902 may be installed in the mobile device and the email application 6903 may be installed in the image forming apparatus, or the card recognition application 6902 may be installed in the image forming apparatus and the email application 6903 may be installed in the mobile device.
Meanwhile, the progress status of the worksheet may be displayed on the second screen 6900b. The second screen 6900b shown in fig. 69 may display that card recognition is currently being performed after the business card is scanned, and which process is to be performed next.
Fig. 70 is a diagram describing an embodiment of generating and executing a worksheet for scanning business cards to obtain a scanned image, obtaining email addresses from the scanned image, and then updating the email addresses to an address book.
Referring to fig. 70, a first screen 7000a is a UI screen for generating a worksheet, and a second screen 7000b is a UI screen displayed when the worksheet is executed. In the first screen 7000a, a worksheet is generated by selecting the scanning application 7001 as an input application, the image segmentation application 7002 as a processing application, and the card recognition application 7003 as an output application. The first screen 7000a and the second screen 7000b may be displayed on either one of the screen of the mobile device and the display panel of the imaging apparatus, or may be simultaneously displayed on both the screen of the mobile device and the display panel of the imaging apparatus.
When the worksheet generated according to the first screen 7000a of fig. 70 is executed, the processing is started when the second screen 7000b is displayed. First, the image forming apparatus scans a business card to obtain a scanned image. Then, the image segmentation application 7002 segments the scanned image into regions. For example, the scanned image may be divided into a region displaying a name, a region displaying a telephone number, and a region displaying an email address. After the scanned image is segmented, the card recognition application 7003 may obtain an email address from the scanned image and add the email address to an address book. In other words, the card recognition application 7003 may store the email address in the address book so that the email address corresponds to the name included in the scanned image. Alternatively, the card recognition application 7003 may obtain a telephone number instead of an email address and add the telephone number to the address book.
Meanwhile, the above describes an exemplary embodiment in which a business card is scanned to obtain a scanned image, an email address is obtained from the scanned image, and an address book is updated. Alternatively, the business card may be photographed and the email address obtained from the captured image of the business card, or information other than an email address, such as a telephone number, may be obtained from the scanned or captured image of the business card to update the address book.
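The address-book update step, keying recognized contact fields to the name on the card, can be sketched as below. The dictionary-based address book is an illustrative assumption; the source does not specify its storage format.

```python
def update_address_book(address_book, name, email=None, phone=None):
    """Merge a recognized business-card entry into an address book keyed
    by name, so the email/phone correspond to the name on the card."""
    entry = address_book.setdefault(name, {})
    if email:
        entry["email"] = email
    if phone:
        entry["phone"] = phone
    return address_book
```

Repeated updates for the same name merge fields rather than overwriting the whole entry, which matches scanning the same card twice with different information recognized each time.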
The image segmentation application 7002 and the card recognition application 7003 may each be installed in either the mobile device or the image forming apparatus. In other words, both applications may be installed in the mobile device, or both may be installed in the image forming apparatus. Alternatively, the image segmentation application 7002 may be installed in the mobile device and the card recognition application 7003 in the image forming apparatus, or vice versa.
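Since each application may reside on either device, executing a worksheet step amounts to dispatching the step to whichever device hosts that application. A minimal sketch of such a dispatch follows; the table, the device labels, and run_step are assumptions for illustration, not part of the disclosed embodiment.

```python
# Hypothetical dispatch table: maps an application name to the device
# on which it is installed. The disclosure allows any of the four
# combinations for applications 7002 and 7003.
install_location = {
    "image_segmentation": "mobile",
    "card_recognition": "image_forming",
}

def run_step(app_name, payload):
    # Route the step to the device hosting the application; here the
    # "devices" are just labels standing in for remote invocations
    # over the paired BYOD connection.
    device = install_location[app_name]
    return f"{app_name} on {device}", payload

status, _ = run_step("image_segmentation", {"image": "scan.jpg"})
print(status)  # image_segmentation on mobile
```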
Meanwhile, the progress status of the worksheet may be displayed on the second screen 7000b. For example, the second screen 7000b shown in FIG. 70 may display that scanning is currently being performed and which process is to be performed next.
As described above, according to one or more exemplary embodiments, a worksheet defining an order in which jobs are performed using a BYOD service may be generated and stored, and the worksheet may be executed later so that the jobs are performed in the defined order, thereby increasing user convenience.
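Under this reading, the worksheet itself can be pictured as nothing more than a stored, ordered list of job descriptors that is replayed later. The structure below is a sketch of that idea only; the field names and JSON serialization are assumptions, not the patent's actual storage format.

```python
import json

# Hypothetical serialized worksheet: an input source, a processing
# (editor) application, and a transmission destination, in order.
workform = {
    "name": "business_card_to_address_book",
    "jobs": [
        {"role": "input", "device": "image_forming", "app": "scan"},
        {"role": "process", "device": "mobile", "app": "image_segmentation"},
        {"role": "output", "device": "mobile", "app": "card_recognition"},
    ],
}

# Storing the worksheet is just persisting the descriptor; executing
# it later replays the jobs in the defined order.
stored = json.dumps(workform)
order = [job["app"] for job in json.loads(stored)["jobs"]]
print(order)  # ['scan', 'image_segmentation', 'card_recognition']
```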
One or more of the exemplary embodiments described above can be written as computer programs and can be implemented in general-purpose digital computers that execute the programs using a computer-readable recording medium. Examples of the computer-readable recording medium include magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.), optical recording media (e.g., CD-ROMs or DVDs), and so on.
It should be understood that the exemplary embodiments described herein should be considered in a descriptive sense only and not for purposes of limitation. Descriptions of features or aspects in each exemplary embodiment are generally considered to be applicable to other similar features or aspects in other exemplary embodiments. While one or more exemplary embodiments have been described with reference to the accompanying drawings, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the scope defined by the appended claims.

Claims (15)

1. A method of generating a worksheet defining an order in which jobs are executed, the method comprising:
selecting, for a worksheet to be generated, any one of an image forming apparatus and a mobile device as an input source for receiving a job target during execution of the worksheet;
selecting, for the worksheet to be generated, an editor application to be automatically executed in response to the job target received via the input source during execution of the worksheet, wherein the editor application is for editing the job target received via the input source during execution of the worksheet;
selecting, for the worksheet to be generated, any one of the image forming apparatus and the mobile device as a transmission destination to which the job target edited by the editor application is transmitted during execution of the worksheet;
generating the worksheet defining an order of executing jobs according to the input source, the editor application, and the transmission destination; and
storing the worksheet.
2. The method of claim 1, wherein selecting any one of the image forming apparatus and the mobile device as the input source comprises:
selecting any one of the image forming apparatus and the mobile device; and
selecting a function of the selected one of the image forming apparatus and the mobile device.
3. The method of claim 2, wherein selecting any one of the image forming apparatus and the mobile device as the input source further comprises: setting an option related to performing the selected function.
4. The method of claim 1, wherein the editor application is an image editor application, and the image editor application is installed in the mobile device storing the worksheet.
5. The method according to claim 1, wherein selecting any one of the image forming apparatus and the mobile device as the transmission destination comprises:
selecting any one of the image forming apparatus and the mobile device; and
selecting a function to be executed by the selected one of the image forming apparatus and the mobile device after the job target is transmitted to the selected one.
6. The method of claim 5, wherein selecting any one of the image forming apparatus and the mobile device as the transmission destination further comprises: setting an option related to performing the selected function.
7. The method of claim 1, wherein, when the worksheet is stored in the mobile device, the mobile device performs pairing with the image forming apparatus selected as the input source or the transmission destination.
8. A non-transitory computer-readable recording medium having recorded thereon a program for executing the method of claim 1.
9. A mobile device, comprising:
an input unit that receives a user input for:
selecting, for a worksheet to be generated, an input source for receiving a job target during execution of the worksheet,
selecting, for the worksheet to be generated, an editor application to be automatically executed in response to the job target received through the input source during execution of the worksheet, wherein the editor application is for editing the job target received through the input source during execution of the worksheet, and
selecting, for the worksheet to be generated, a transmission destination to which the job target edited by the editor application is transmitted during execution of the worksheet;
a controller that generates a worksheet defining an order of executing jobs according to the input source, the editor application, and the transmission destination selected according to the user input received through the input unit; and
a storage unit that stores the generated worksheet,
wherein any one of an image forming apparatus and the mobile device is selected as each of the input source and the transmission destination.
10. The mobile device according to claim 9, wherein, when the user input received through the input unit contains information on selecting any one of the image forming apparatus and the mobile device as the input source and selecting a function of the selected one to be performed as the input source, the controller generates a worksheet including the selected one of the image forming apparatus and the mobile device and the selected function.
11. The mobile device of claim 10, wherein, when the user input received through the input unit further contains information on setting an option related to performing the selected function, the controller generates a worksheet further including the option.
12. The mobile device of claim 9, wherein the editor application is an image editor application and the image editor application is installed in the mobile device.
13. The mobile device according to claim 9, wherein, when the user input received through the input unit contains information on selecting any one of the image forming apparatus and the mobile device as the transmission destination and selecting a function to be executed by the selected one after the job target is transmitted to the selected one, the controller generates a worksheet including the selected one of the image forming apparatus and the mobile device and the selected function.
14. The mobile device of claim 13, wherein, when the user input received through the input unit further contains information on setting an option related to performing the selected function, the controller generates a worksheet further including the option.
15. The mobile device of claim 9, further comprising a communication unit that communicates with the image forming apparatus,
wherein, when the worksheet is generated, the controller performs pairing, through the communication unit, with the image forming apparatus selected as the input source or the transmission destination.
CN201580069939.6A 2014-12-22 2015-12-22 Method for generating worksheet by using BYOD service and mobile device for performing the same Active CN107111466B (en)

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
KR20140186372 2014-12-22
KR10-2014-0186372 2014-12-22
KR10-2015-0120542 2015-08-26
KR1020150120542A KR20160076421A (en) 2014-12-22 2015-08-26 Method for generating workform using byod service and mobile device for performing the same
PCT/KR2015/014095 WO2016105083A1 (en) 2014-12-22 2015-12-22 Method of generating workform by using byod service and mobile device for performing the method

Publications (2)

Publication Number Publication Date
CN107111466A CN107111466A (en) 2017-08-29
CN107111466B true CN107111466B (en) 2021-01-08

Family

ID=56352862

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580069939.6A Active CN107111466B (en) 2014-12-22 2015-12-22 Method for generating worksheet by using BYOD service and mobile device for performing the same

Country Status (2)

Country Link
KR (1) KR20160076421A (en)
CN (1) CN107111466B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107783737A (en) * 2017-10-31 2018-03-09 江苏神州信源系统工程有限公司 A kind of cross-platform Method of printing and system
US11106405B2 (en) 2019-03-05 2021-08-31 Toshiba Tec Kabushiki Kaisha Printer and printer search system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777073A (en) * 2010-02-01 2010-07-14 浪潮集团山东通用软件有限公司 Data conversion method based on XML form

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101571348B1 (en) * 2009-09-01 2015-12-04 삼성전자주식회사 Host device workform performing device method for generating workform and method for performing workform
US9007613B2 (en) * 2011-09-23 2015-04-14 Sharp Laboratories Of America, Inc. Secure mobile printing method and system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777073A (en) * 2010-02-01 2010-07-14 浪潮集团山东通用软件有限公司 Data conversion method based on XML form

Also Published As

Publication number Publication date
KR20160076421A (en) 2016-06-30
CN107111466A (en) 2017-08-29

Similar Documents

Publication Publication Date Title
EP3038322B1 (en) Method of establishing connection between mobile device and image forming apparatus, and image forming apparatus for performing the method
US10110767B2 (en) Method of generating workform by using BYOD service and mobile device for performing the method
US10048915B2 (en) Method of processing workflow in which a function of an image forming apparatus and a function of a mobile device are combined and mobile device for performing the method
EP3065436A1 (en) Non-transitory computer-readable information recording medium, information processing apparatus, and communications method
JP6103000B2 (en) Information processing apparatus, program, and image processing system
US20200412910A1 (en) Shared terminal transmits print data indicating user identification information to printer after authentication request of user terminal device is confirmed by server
EP3293665B1 (en) Shared terminal, communication system, display control method, and carrier medium
JP6992293B2 (en) Shared terminals, communication systems, image transmission methods, and programs
US12131085B2 (en) System and method for transmitting electronic data associated with a user identified based on source identification information
JP6919432B2 (en) Shared terminals, communication systems, communication methods, and programs
CN107111466B (en) Method for generating worksheet by using BYOD service and mobile device for performing the same
JP2017108338A (en) Information processing device, information processing device control method, mobile terminal, mobile terminal control method, and program
JP2021060974A (en) Program, information processing system, information processing method, and information processing apparatus
CN107111718B (en) Method for establishing connection between mobile equipment and imaging device, imaging device and mobile equipment
JP7081195B2 (en) Communication terminals, communication systems, communication methods, and programs
US10416939B2 (en) Communication terminal, communication system, communication control method, and non-transitory computer-readable medium
JP7056285B2 (en) Shared terminals, communication systems, communication methods, and programs
JP6822341B2 (en) Shared terminals, communication systems, image transmission methods, and programs
JP6761207B2 (en) Shared terminals, communication systems, communication methods, and programs
JP7017167B2 (en) Shared terminals, communication systems, image transmission methods, and programs
JP2017130217A (en) Program, information processing device, communication system, and communication method
KR20170058342A (en) Method for establishing connection between image forming apparatus and mobile device, image forming apparatus and mobile device for performing the same
KR20170037919A (en) Method for processing workflow and mobile device for performing the same
KR20160076305A (en) Method for establishing connection between image forming apparatus and mobile device, image forming apparatus and mobile device for performing the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Gyeonggi Do, South Korea

Applicant after: HP printer Korea Co., Ltd.

Address before: Gyeonggi Do, South Korea

Applicant before: S-Printing Solution Co., Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20191120

Address after: Texas, USA

Applicant after: Hewlett-Packard Development Company, L.P.

Address before: Gyeonggi-do, South Korea

Applicant before: HP printer Korea Co., Ltd.

GR01 Patent grant