US10791230B2 - Image forming apparatus, instruction acceptance method, and computer readable program - Google Patents

Image forming apparatus, instruction acceptance method, and computer readable program

Info

Publication number
US10791230B2
US10791230B2 · Application US16/506,130 (US201916506130A)
Authority
US
United States
Prior art keywords
forming apparatus
image forming
instruction
voice
equipment
Prior art date
Legal status
Active
Application number
US16/506,130
Other versions
US20200028979A1 (en)
Inventor
Tomohiro Yamaguchi
Current Assignee
Konica Minolta Inc
Original Assignee
Konica Minolta Inc
Priority date
Filing date
Publication date
Application filed by Konica Minolta Inc
Assigned to Konica Minolta, Inc. (assignment of assignors interest; assignor: YAMAGUCHI, TOMOHIRO)
Publication of US20200028979A1
Application granted
Publication of US10791230B2
Legal status: Active

Classifications

    • H: ELECTRICITY
      • H04: ELECTRIC COMMUNICATION TECHNIQUE
        • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N1/00: Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
            • H04N1/0035: User-machine interface; Control console
              • H04N1/00352: Input means
                • H04N1/00392: Other manual input means, e.g. digitisers or writing tablets
                • H04N1/00403: Voice input means, e.g. voice commands
    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/12: Digital output to print unit, e.g. line printer, chain printer
              • G06F3/1201: Dedicated interfaces to print systems
                • G06F3/1202: Dedicated interfaces to print systems specifically adapted to achieve a particular effect
                  • G06F3/1203: Improving or facilitating administration, e.g. print management
                  • G06F3/121: Facilitating exception or error detection and recovery, e.g. fault, media or consumables depleted
                • G06F3/1223: Dedicated interfaces to print systems specifically adapted to use a particular technique
                  • G06F3/1237: Print job management
                    • G06F3/1238: Secure printing, e.g. user identification, user rights for device usage, unallowed content, blanking portions or fields of a page, releasing held jobs
                    • G06F3/1239: Restricting the usage of resources, e.g. usage or user levels, credit limit, consumables, special fonts
                    • G06F3/1253: Configuration of print job parameters, e.g. using UI at the client
                      • G06F3/1256: User feedback, e.g. print preview, test print, proofing, pre-flight checks
                      • G06F3/1258: Configuration of print job parameters by updating job settings at the printer
                    • G06F3/1267: Job repository, e.g. non-scheduled jobs, delay printing
                    • G06F3/1268: Job submission, e.g. submitting print job order or request not the print data itself
                      • G06F3/1271: Job submission at the printing node, e.g. creating a job from data stored locally or remotely
                    • G06F3/1274: Deleting of print job

Definitions

  • The present invention relates to the technology of an image forming apparatus to be used together with equipment, such as a measurement device that measures the living body of a user.
  • Such an image forming apparatus is instructed by a user's touch on an input device, such as a touch panel. It can also be instructed by a user's voice spoken into an input device, such as a microphone.
  • Enabling instruction by voice in addition to instruction by touch improves convenience for the user.
  • JP 2013-41379 A, JP 2004-234529 A, JP 2013-508808 A, JP 2007-79852 A, and JP 2005-115773 A each disclose a device to which a user can provide an instruction by touch or an instruction by voice.
  • Mobile electronic equipment described in JP 2013-41379 A includes: a casing; a capacitive touch panel that displays an image and detects a touch operation as an input signal; a microphone that detects a sound as a voice signal; and a controller that, in a voice operation mode, processes the voice signal detected by the microphone as the input signal of a touch operation performed on the touch panel.
  • The mobile electronic equipment then migrates to the voice operation mode.
  • A kiosk terminal described in JP 2004-234529 A includes: a touch panel superimposed on a display, which senses input by a touch of an operator; and a voice input device that senses the operator's voice and converts it into character data.
  • After an input screen is displayed, the touch panel senses a touch on an input field, and a Japanese syllabary input screen is displayed.
  • The operator touches characters displayed on the Japanese syllabary input screen to input data.
  • The kiosk terminal also accepts voice input; that is, the kiosk terminal switches to voice input.
  • A calculation device described in JP 2013-508808 A acquires, in response to a user's touch on a touch input area, the positional coordinates of the touch input area, and further acquires a voice signal from a voice sensor.
  • The impact strength of the user's touch is determined on the basis of the voice signal.
  • The calculation device then performs an action associated with the determined impact strength.
  • A data processing device described in JP 2007-79852 A operates in a voice input mode, in which processing is performed on the basis of voice input through a microphone. When the device determines that an input voice has been registered in a voice-input prohibition information list, it migrates to an operator input mode and prompts the user to perform input with a numeric keypad; otherwise, it prompts the user to perform input by voice.
  • That is, for information to be kept secret, the data processing device prompts the user to input with the numeric keypad in order to prevent another person from overhearing it, and, for information that need not be kept secret, prompts the user to perform a simplified input by voice.
  • An L mode facsimile described in JP 2005-115773 A includes: a voice recognizer; a button operator; an operation time database; a CPU; a RAM; a ROM; a display; and a voice synthesizer.
  • The CPU reads the average operation time for each input mode (a voice input mode, a button input mode, or a combination of both) from the operation time database, and displays an input mode selection screen on the display.
  • When the user selects the combined operation of voice input and button input on the input mode selection screen, the CPU displays a screen prompting the user to perform voice input.
  • The CPU displays, on the screen, the result of the user's utterance recognized by the voice recognizer.
  • The CPU then displays a screen prompting the user to perform button input, and displays the result of the user's button input on the screen.
  • An object of the present invention is to provide an image forming apparatus in which a function of accepting an instruction by voice is more efficient than ever before.
  • To achieve at least one of the above-mentioned objects, an image forming apparatus reflecting one aspect of the present invention is used together with equipment that a user uses with both hands.
  • The image forming apparatus comprises an acceptor that does not accept, by voice, an instruction for processing to be performed by the image forming apparatus before the user holds the equipment, but accepts the instruction by voice while the equipment is being used with both hands.
  • FIG. 1 is an illustration of an exemplary external appearance of an image forming apparatus
  • FIG. 2 is an illustration of the hardware configuration of the image forming apparatus
  • FIG. 3 is an illustration of an exemplary measurement device held with both hands of a user
  • FIG. 4 is an illustration of an exemplary functional configuration of the image forming apparatus
  • FIG. 5 is an illustration of an exemplary home screen
  • FIG. 6 is an illustration of an exemplary copy operation screen
  • FIG. 7 is an illustration of exemplary job data
  • FIG. 8 is an illustration of an exemplary explanatory screen
  • FIG. 9 is an illustration of exemplary personal data
  • FIG. 10 is an illustration of an exemplary measurement-in-process screen
  • FIG. 11 is an illustration of an exemplary measured-result screen
  • FIG. 12 is an illustration of an exemplary view screen
  • FIG. 13 is a flowchart of an exemplary flow of entire processing of the image forming apparatus
  • FIG. 14 is a flowchart of an exemplary flow of measurement-start-occasion processing
  • FIG. 15 is a flowchart of an exemplary flow of voice-input-based processing
  • FIG. 16 is a flowchart of an exemplary flow of view screen processing
  • FIG. 1 is an illustration of an exemplary external appearance of an image forming apparatus 1.
  • FIG. 2 is an illustration of the hardware configuration of the image forming apparatus 1.
  • FIG. 3 is an illustration of an exemplary measurement device 10 r held with both hands of a user.
  • FIG. 4 is an illustration of an exemplary functional configuration of the image forming apparatus 1.
  • The image forming apparatus 1 illustrated in FIG. 1 integrates multiple functions, such as copying, PC printing, cloud printing, faxing, scanning, and boxing. Such an apparatus is generally referred to as a "multi function peripheral (MFP)".
  • The PC print function prints an image on a sheet on the basis of image data received from a terminal device in the same local area network (LAN) as the image forming apparatus 1.
  • The PC print function is also referred to as "network printing" or "network print".
  • The cloud print function prints an image on a sheet on the basis of image data received from an external terminal device through a server on the Internet.
  • The box function gives each user a storage area referred to as a "box" or a "personal box", in which the user can save and manage, for example, image data. Providing a box per group enables the members of each group to share data.
  • A box corresponds to a "folder" or a "directory" in a personal computer.
  • The image forming apparatus 1 includes, for example, a central processing unit (CPU) 10 a, a random access memory (RAM) 10 b, a read only memory (ROM) 10 c, an auxiliary storage device 10 d, a touch panel display 10 e, an operation key panel 10 f, a network interface card (NIC) 10 g, a wireless LAN communication unit 10 h, a modem 10 i, a scan unit 10 j, a print unit 10 k, a finisher 10 m, a voice input unit 10 n, and a measurement device 10 r.
  • The CPU 10 a is the main CPU of the image forming apparatus 1.
  • The RAM 10 b is the main memory of the image forming apparatus 1.
  • The touch panel display 10 e displays, for example, a screen indicating a message to the user, a screen into which the user inputs a command or information, or a screen indicating a result of processing performed by the CPU 10 a. Furthermore, the touch panel display 10 e transmits a signal indicating the touched position to the CPU 10 a.
  • The operation key panel 10 f, a so-called hardware keyboard, includes, for example, a numeric keypad, a start key, a stop key, and function keys.
  • The NIC 10 g communicates with other devices in accordance with a protocol such as transmission control protocol/internet protocol (TCP/IP).
  • The wireless LAN communication unit 10 h communicates with other devices on the basis of the Institute of Electrical and Electronics Engineers (IEEE) 802.11 wireless LAN standard.
  • The modem 10 i exchanges document data with a facsimile in accordance with a protocol such as G3.
  • The scan unit 10 j reads an image on an original (sheet) set on an auto document feeder (ADF) or a platen glass, and generates image data.
  • The print unit 10 k prints, onto a sheet, not only the image read by the scan unit 10 j but also images in image data received from external devices through the NIC 10 g.
  • The finisher 10 m performs, as necessary, postprocessing on printed matter produced by the print unit 10 k.
  • Examples of the postprocessing include stapling, punching, and folding.
  • The voice input unit 10 n, which includes, for example, a sound board and a microphone, collects sound and generates voice data 6 A. In particular, when the user instructs the image forming apparatus 1 by so-called voice input, the voice input unit 10 n generates the voice data 6 A representing the voice uttered by the user.
  • The measurement device 10 r measures, for example, the blood pressure and the pulse of the user. As illustrated in FIG. 3, the user covers four terminals with the thumbs and forefingers of both hands; in this manner, the measurement device 10 r is held with both hands.
  • The measurement device 10 r irradiates the thumbs and forefingers of both hands with light, and detects the light reflected from the fingers at predetermined time intervals (e.g., every second). This allows a pulse wave to be acquired. While the measurement device 10 r is held, the blood pressure and the pulse of the user, for example, are measured on the basis of the acquired pulse wave. The measurement device 10 r generates measurement data 6 B indicating the measured result (namely, measured values) and transmits the measurement data 6 B to the image forming apparatus 1.
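The measurement described above, sampling reflected light at fixed intervals and deriving a pulse from the resulting pulse wave, can be sketched as follows. This is purely illustrative; the patent does not disclose the actual algorithm, and the peak-counting approach, function name, and sample values are all assumptions.

```python
# Hypothetical sketch: estimating a pulse rate from a pulse wave sampled
# at fixed intervals. The patent only says the pulse is derived from the
# acquired pulse wave; the method below is an assumed illustration.

def pulse_rate_bpm(samples, interval_s=1.0):
    """Estimate beats per minute from reflected-light samples taken at
    fixed intervals, by counting local maxima (crude peak detection)."""
    peaks = [
        i for i in range(1, len(samples) - 1)
        if samples[i - 1] < samples[i] > samples[i + 1]
    ]
    if len(peaks) < 2:
        return None  # not enough data to estimate a rate
    # Average spacing between successive peaks, in seconds.
    spans = [(b - a) * interval_s for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(spans) / len(spans))
```

With a peak every two one-second samples, the estimate is 30 beats per minute; real pulse-wave processing would of course sample far faster and filter noise.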
  • The measurement device 10 r is connected to the image forming apparatus 1 by wire or wirelessly.
  • The measurement device 10 r is detachably disposed, for example, on a side face of the image forming apparatus 1. Note that a sensor detects when the measurement device 10 r is removed from the side face.
  • The ROM 10 c or the auxiliary storage device 10 d stores applications for achieving functions such as copying.
  • The ROM 10 c or the auxiliary storage device 10 d also stores a measurement-occasion processing program 10 P (refer to FIG. 4).
  • The measurement-occasion processing program 10 P enables the image forming apparatus 1 to accept an instruction by voice input while the measurement device 10 r is measuring the blood pressure of the user.
  • The measurement-occasion processing program 10 P also allows output of the measured result. The details will be described later.
  • The measurement-occasion processing program 10 P causes a login processor 101, a voice input OFF setter 102, a screen display unit 103, a job executor 104, a job data storage 105, an explanatory necessity discriminator 106, a personal data storage 107, a voice input ON setter 108, a measurement-occasion processor 109, and a voice processor 110 of FIG. 4 to be implemented in the image forming apparatus 1.
  • FIG. 5 is an illustration of an exemplary home screen 5 A.
  • FIG. 6 is an illustration of an exemplary copy operation screen 51 B.
  • FIG. 7 is an illustration of exemplary job data 6 C.
  • FIG. 8 is an illustration of an exemplary explanatory screen 5 C.
  • FIG. 9 is an illustration of exemplary personal data 6 D.
  • FIG. 10 is an illustration of an exemplary measurement-in-process screen 51 D.
  • FIG. 11 is an illustration of an exemplary measured-result screen 52 D.
  • FIG. 12 is an illustration of an exemplary view screen 5 F.
  • The operation of the login processor 101, the voice input OFF setter 102, the screen display unit 103, the job executor 104, the job data storage 105, the explanatory necessity discriminator 106, the personal data storage 107, the voice input ON setter 108, the measurement-occasion processor 109, and the voice processor 110 of FIG. 4 will be described below with reference to FIGS. 5 to 12, using an exemplary case in which the image forming apparatus 1 performs a copy job and receives the measurement data 6 B from the measurement device 10 r.
  • Suppose the user wants the image forming apparatus 1 to copy the image of an original. The user first requests login to the image forming apparatus 1 with his or her user name and password. Then, the following processing is performed.
  • The login processor 101 of the image forming apparatus 1 discriminates whether the user is an authorized user, and, if so, permits the user to log in to the image forming apparatus 1.
  • When the function of the voice input unit 10 n is active (namely, on), the voice input OFF setter 102 makes the function inactive (namely, off) in order to prevent voice input from being performed.
  • The screen display unit 103 causes the touch panel display 10 e to display a screen corresponding to each event as appropriate, as described sequentially below.
  • First, the screen display unit 103 causes the touch panel display 10 e to display the home screen 5 A as in FIG. 5.
  • The home screen 5 A allows the user to select, from a plurality of operation screens 5 B, the operation screen 5 B to be displayed on the touch panel display 10 e and operated in order to cause the image forming apparatus 1 to perform a job.
  • A plurality of icons, each bearing a job name, is disposed on the home screen 5 A.
  • The user presses the icon 7 A corresponding to the screen for the copy job, to instruct the image forming apparatus 1 to display the copy operation screen 51 B as in FIG. 6 on the touch panel display 10 e.
  • The screen display unit 103 then causes the touch panel display 10 e to display the copy operation screen 51 B.
  • After instructing display of the copy operation screen 51 B on the touch panel display 10 e, the user sets the original on the ADF. Inputting conditions for the copy job (e.g., the number of print copies and scaling) sets the job, and the user then instructs the job to start. The job executor 104 then performs the following processing.
  • The job executor 104 controls each constituent of the image forming apparatus 1 such that the job is performed.
  • The job executor 104 causes, for example, the scan unit 10 j and the print unit 10 k to perform the job.
  • The job executor 104 generates job data 6 C indicating the job code identifying the job, the job type indicating the type of the job, and the user code of the user who provided the instruction, and stores the job data 6 C into the job data storage 105 for each job code, as in FIG. 7.
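The job data 6 C record keyed by job code (FIG. 7) can be illustrated with a small sketch. The field names and functions below are assumptions for illustration; the patent specifies only that each record carries a job code, a job type, and a user code.

```python
# Illustrative sketch of the job data record of FIG. 7: each entry is
# keyed by a job code and carries the job type and the user code of the
# instructing user. Field and function names are assumed.

job_data_storage = {}

def store_job(job_code, job_type, user_code):
    """Store one job data record under its job code."""
    job_data_storage[job_code] = {"job_type": job_type,
                                  "user_code": user_code}

def jobs_for_user(user_code):
    """Look up every job code a given user has instructed."""
    return [code for code, rec in job_data_storage.items()
            if rec["user_code"] == user_code]
```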
  • The image of the original read by the scan unit 10 j is stored in, for example, the RAM 10 b.
  • After causing the image forming apparatus 1 to start the copy job, the user stands by in front of the image forming apparatus 1 while the job is performed. Suppose the user decides to measure his or her own blood pressure during this standby time. The user then detaches the measurement device 10 r from the side face of the image forming apparatus 1, and the following processing is performed.
  • The explanatory necessity discriminator 106 discriminates whether the explanatory screen 5 C as in FIG. 8 is to be displayed on the touch panel display 10 e, on the basis of the personal data 6 D stored in the personal data storage 107, as follows.
  • The explanatory screen 5 C provides the user with a description of the method of operating the image forming apparatus 1 by voice. Note that the explanatory screen 5 C may also describe the measurement method with the measurement device 10 r.
  • The personal data storage 107 stores, for each measurement date and time, personal data 6 D including the measurement date and time of a past blood pressure measurement of the user with the image forming apparatus 1, the subject code of the user (namely, the subject), and the measurement data 6 B of the user, as in FIG. 9.
  • The explanatory necessity discriminator 106 searches the personal data storage 107 for personal data 6 D whose subject code is identical to the user code of the user currently logged in to the image forming apparatus 1. If no personal data 6 D is found, or if fewer than a predetermined number of pieces are found, the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is to be displayed. If at least the predetermined number of pieces is found, it discriminates that the explanatory screen 5 C is not to be displayed.
  • The predetermined number can be set arbitrarily by an administrator.
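The discrimination above, showing the explanatory screen 5 C when fewer than a predetermined number of past measurements exist for the logged-in user, can be sketched as follows. The default threshold and all names are hypothetical, since the patent leaves the number to the administrator.

```python
# Sketch of the explanatory-screen decision: show the explanation when
# the logged-in user has fewer than a configurable number of past
# measurement records. Threshold default and field names are assumed.

def needs_explanation(personal_data, user_code, threshold=3):
    """personal_data: list of records, each with a 'subject_code' field.
    Returns True when the explanatory screen should be displayed."""
    past = [rec for rec in personal_data
            if rec["subject_code"] == user_code]
    return len(past) < threshold
```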
  • In the former case, the image forming apparatus 1 discriminates that the user is inexperienced in operating the image forming apparatus 1 while measuring blood pressure, namely, a beginner. The image forming apparatus 1 therefore shows the user how to operate it while the measurement device 10 r is held with both hands (namely, by voice).
  • Suppose the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is to be displayed.
  • The screen display unit 103 then causes the touch panel display 10 e to display the explanatory screen 5 C.
  • The voice input ON setter 108 turns on the function of the voice input unit 10 n; that is, it makes the function active such that voice input is allowed. Similarly, when the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is not to be displayed, the voice input ON setter 108 turns on the function of the voice input unit 10 n.
  • When the user presses the "end" icon 7 C on the explanatory screen 5 C, the measurement device 10 r starts the processing of measurement. Similarly, when the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is not to be displayed, the measurement device 10 r starts the processing of measurement.
  • The measurement device 10 r cannot acquire the pulse wave of the user unless the user holds the measurement device 10 r correctly; in that case, the processing of measurement cannot start.
  • The screen display unit 103 may cause the touch panel display 10 e to continuously display an error screen until the measurement device 10 r can start the processing of measurement.
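The voice-input gating that runs through this flow, turned off at login by the voice input OFF setter 102 and turned on by the voice input ON setter 108 once the measurement device 10 r is held with both hands, can be sketched as a small state holder. The class and method names are assumptions for illustration, not elements of the patent.

```python
# Hypothetical sketch of the voice-input gating: voice instructions are
# rejected before the equipment is held and accepted while it is in use.

class VoiceGate:
    def __init__(self):
        self.voice_enabled = False

    def on_login(self):
        self.voice_enabled = False   # role of the voice input OFF setter

    def on_device_held(self):
        self.voice_enabled = True    # role of the voice input ON setter

    def accept(self, instruction):
        """Accept a voice instruction only while voice input is active;
        return None (no instruction accepted) otherwise."""
        if not self.voice_enabled:
            return None
        return instruction
```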
  • The measurement device 10 r generates the measurement data 6 B for each measurement, and transmits the measurement data 6 B to the image forming apparatus 1 successively.
  • Upon reception of the first measurement data 6 B (hereinafter, "measurement data 61 B") in the current processing of measurement (hereinafter, "the measurement processing for this time"), the measurement-occasion processor 109 performs the following processing.
  • The measurement-occasion processor 109 causes the personal data storage 107 to store, as personal data 6 D, the measurement data 61 B in association with the date and time of its reception as the measurement date and time, and with the user code of the user currently logged in to the image forming apparatus 1 as the subject code (refer to FIG. 9). In addition, the measurement-occasion processor 109 starts measuring the elapsed time.
  • Thereafter, every time the measurement data 6 B is received in the measurement processing for this time, the measurement-occasion processor 109 causes the personal data storage 107 to store the measurement data 6 B. The storing is performed such that data already stored in the personal data storage 107 is not overwritten.
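The append-only storage of measurement data 6 B with its reception date and time and subject code (FIG. 9) can be sketched as follows. The record layout is an assumption inferred from the description, not the patent's own data format.

```python
# Sketch of append-only storage of measurement records: each received
# record is stored with its reception date and time and the subject
# code, so earlier records are never overwritten. Layout is assumed.

import datetime

personal_data_storage = []

def store_measurement(subject_code, measurement, now=None):
    record = {
        "measured_at": now or datetime.datetime.now(),
        "subject_code": subject_code,
        "measurement": measurement,
    }
    personal_data_storage.append(record)  # append, never overwrite
    return record
```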
  • The measurement-occasion processor 109 then discriminates that the measurement processing for this time has been completed.
  • After the measurement device 10 r starts the processing of measurement, the screen display unit 103 generates the measurement-in-process screen 51 D, which indicates the content of the measurement still in progress as in FIG. 10, as follows.
  • The screen display unit 103 reads the personal data 6 D of the measurement processing for this time from the pieces of personal data 6 D stored in the personal data storage 107.
  • The screen display unit 103 requests, from the measurement-occasion processor 109, the remaining time t until completion of the measurement processing for this time.
  • The measurement-occasion processor 109 calculates the difference between the required time and the elapsed time as the remaining time t, and transmits the remaining time t to the screen display unit 103.
  • The screen display unit 103 generates the measurement-in-process screen 51 D on the basis of the measurement data 6 B of the read personal data 6 D and the received remaining time t.
  • The screen display unit 103 may reread the personal data 6 D and request and receive a new remaining time t at predetermined time intervals (e.g., every two or three seconds) to generate a new measurement-in-process screen 51 D to be displayed on the touch panel display 10 e.
  • This arrangement causes the measurement-in-process screen 51 D to be updated at the predetermined time intervals.
  • the screen display unit 103 causes the touch panel display 10 e to display the generated measurement-in-process screen 51 D.
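The remaining-time calculation and the periodic screen refresh described above might look like the following sketch. The function names are hypothetical, and the two-second refresh interval is only the example value mentioned in the text.

```python
def remaining_time(required_sec: float, elapsed_sec: float) -> float:
    """Remaining time t = required time - elapsed time, floored at zero
    so the measurement-in-process screen never shows a negative countdown."""
    return max(0.0, required_sec - elapsed_sec)

def refresh_schedule(required_sec: float, interval_sec: float = 2.0):
    """Yield the remaining time t at each predetermined refresh tick
    (e.g., every two or three seconds), i.e., each time the
    measurement-in-process screen 51D would be regenerated."""
    elapsed = 0.0
    while elapsed < required_sec:
        yield remaining_time(required_sec, elapsed)
        elapsed += interval_sec
    yield 0.0
```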
  • When the measurement-occasion processor 109 discriminates that the measurement processing for this time has been completed, the screen display unit 103 generates the measured-result screen 52 D indicating a measured result as in FIG. 11 , on the basis of all the measurement data 6 B of the personal data 6 D of the measurement processing for this time, and causes the touch panel display 10 e to display the measured-result screen 52 D.
  • After verifying the measured result through the measured-result screen 52 D, the user provides, by voice, the image forming apparatus 1 with an instruction for display of the home screen 5 A onto the touch panel display 10 e .
  • the user provides, by voice, the image forming apparatus 1 with an instruction for interruption of the measurement processing for this time (namely, cancellation). Then, the following processing is performed.
  • the voice input unit 10 n (refer to FIG. 2 ) generates the voice data 6 A, on the basis of the voice of the user.
  • the voice processor 110 acquires the voice data 6 A generated by the voice input unit 10 n , and converts the voice data 6 A into a character code, for example, with an input method editor (IME) for sound.
  • the voice processor 110 identifies the content of the instruction from the user, on the basis of the character code. That is, the occurred event is identified.
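The identification of the instruction from the recognized text can be illustrated as a simple lookup. The phrase-to-event table below is a hypothetical example; the patent states only that the voice data 6A is converted into a character code (e.g., with an IME for sound) and that the occurred event is identified from that character code.

```python
# hypothetical mapping from recognized phrases to events
COMMAND_TABLE = {
    "display home screen": "SHOW_HOME_SCREEN",
    "cancel measurement": "INTERRUPT_MEASUREMENT",
    "suspend job": "SUSPEND_JOB",
    "show job list": "SHOW_JOB_LIST",
}

def identify_event(character_code: str):
    """Normalize the recognized text and look up the corresponding event;
    return None when no instruction matches."""
    return COMMAND_TABLE.get(character_code.strip().lower())
```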
  • the screen display unit 103 causes the touch panel display 10 e to display the home screen 5 A (refer to FIG. 5 ). Note that the image of the original stored in the RAM 10 b is deleted.
  • the provision of the instruction for interruption of the measurement processing for this time causes the measurement device 10 r to interrupt the processing of measurement.
  • the user can provide, by voice, an instruction for redisplay of the operation screen 5 B displayed on the touch panel display 10 e before removal of the measurement device 10 r from the side face of the image forming apparatus 1 . Due to provision of the instruction, the screen display unit 103 causes the touch panel display 10 e to display the operation screen 5 B.
  • the user provides, by voice, the image forming apparatus 1 with an instruction for redisplay of the measurement-in-process screen 51 D onto the touch panel display 10 e .
  • the screen display unit 103 reads the personal data 6 D and requests and receives the new remaining time t, generates the measurement-in-process screen 51 D, on the basis of the read personal data 6 D and the received remaining time t, and then causes the touch panel display 10 e to display the measurement-in-process screen 51 D.
  • When the function of the voice input unit 10 n is on, the user provides, by voice, the image forming apparatus 1 with an instruction for interruption (namely, cancellation) or suspension of the job currently being performed by the image forming apparatus 1 . Then, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job is suspended, for example.
  • the job executor 104 discriminates whether the user who has provided the instruction for performance of the job, is identical to the user who has provided the instruction for interruption or suspension of the job. Specifically, the job executor 104 makes discrimination, on the basis of the user code for the job currently being performed and the subject code of the personal data 6 D of the measurement processing for this time. Then, in a case where both of the users are identical, each constituent of the image forming apparatus 1 is controlled so as to suspend the job.
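The identity check described above, in which a job is suspended only by the user who started it, might be sketched as below. The `Job` type and function name are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Job:
    user_code: str        # user who provided the instruction for performance of the job
    state: str = "running"

def handle_suspend_instruction(job: Job, subject_code: str) -> bool:
    """Suspend the job only when the subject who voiced the interruption or
    suspension instruction is identical to the user who started the job."""
    if job.user_code == subject_code:
        job.state = "suspended"
        return True
    return False
```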
  • When the function of the voice input unit 10 n is on, the user provides, by voice, an instruction for redisplay of the explanatory screen 5 C onto the touch panel display 10 e , so that the screen display unit 103 causes the touch panel display 10 e to display the explanatory screen 5 C.
  • When the function of the voice input unit 10 n is on, the user provides, by voice, an instruction for display of a list of jobs performed or being performed by the image forming apparatus 1 . Then, on the basis of the job data 6 C stored in the job data storage 105 , the screen display unit 103 causes the touch panel display 10 e to display a job list screen 5 E indicating a list of jobs performed or being performed by the image forming apparatus 1 .
  • When the function of the voice input unit 10 n is on, the user provides, by voice, the image forming apparatus 1 with an instruction for verification of an image in the job of printing currently being performed by the image forming apparatus 1 (hereinafter, referred to as a “print image 7 F”). Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5 F as in FIG. 12 for verification of the print image 7 F.
  • the user provides, by voice, the image forming apparatus 1 with an instruction for change of a print image 71 F that is the print image 7 F currently being displayed on the view screen 5 F, to the image of the next original of the original of the print image 71 F (namely, the next print image 7 F).
  • the user provides an instruction for change to the image of the previous original of the original of the print image 71 F (namely, the previous print image 7 F).
  • the screen display unit 103 causes the touch panel display 10 e to display the view screen 5 F including the print image 7 F changed in accordance with the instruction of the user.
  • While the view screen 5 F is being displayed on the touch panel display 10 e , the user considers changing the direction of printing of the print image 7 F because the direction of typing is not identical to the orientation of a sheet. Then, the user provides, by voice, the image forming apparatus 1 with an instruction for suspension of the job of printing currently being performed by the image forming apparatus 1 .
  • the job executor 104 controls each constituent of the image forming apparatus 1 such that the job of printing is suspended.
  • the user provides, by voice, the image forming apparatus 1 with an instruction for rotation of the print image 7 F by a predetermined angle in a predetermined direction (e.g., by 90° clockwise). That is, the user provides, by voice, the image forming apparatus 1 with an instruction for change of the direction of printing of the print image 7 F.
  • the screen display unit 103 causes the touch panel display 10 e to display the view screen 5 F including the print image 7 F rotated in accordance with the instruction of the user.
  • the user provides, by voice, the image forming apparatus 1 with an instruction for determination of the degree of rotation of the print image 7 F (namely, the degree of change of the direction of printing).
  • the job executor 104 controls each constituent of the image forming apparatus 1 such that the job of printing is performed to the rotated print image 7 F from the beginning.
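The rotation instruction above can be modeled as accumulating a print-direction angle. This helper and its name are hypothetical; the 90° clockwise step is the example given in the text.

```python
def rotate_clockwise(angle_deg: int, step_deg: int = 90) -> int:
    """Apply one voiced rotation instruction: turn the print image by a
    predetermined angle clockwise, wrapping back to 0 after a full turn."""
    return (angle_deg + step_deg) % 360
```

Each repeated instruction would call this again on the current angle, and the final angle is fixed when the user provides the determination instruction.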
  • While the view screen 5 F is being displayed on the touch panel display 10 e , the user provides an instruction for adjustment of the print image 7 F, for example, in density. Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5 F including the print image 7 F adjusted in density in accordance with the instruction of the user.
  • When the function of the voice input unit 10 n is on, the user provides an instruction for performance of a job of reprinting an image in the job of printing recently performed by the image forming apparatus 1 (hereinafter, referred to as a “reprint job”). Then, the job executor 104 searches the RAM 10 b for the image that is the target of the reprint job, and causes the reprint job to be performed on the found image.
  • the user can provide, by voice, an instruction for start of the job.
  • the job executor 104 discriminates whether preparation for performance of the job has been completed, on the basis of, for example, whether the original has been set to the ADF. Then, in a case where the preparation has been completed, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job is performed.
  • FIG. 13 is a flowchart of an exemplary flow of entire processing of the image forming apparatus 1 .
  • FIG. 14 is a flowchart of an exemplary flow of measurement-start-occasion processing.
  • FIG. 15 is a flowchart of an exemplary flow of voice-input-based processing.
  • FIG. 16 is a flowchart of an exemplary flow of view screen processing.
  • the image forming apparatus 1 performs the processing in the order illustrated in FIG. 13 , on the basis of the measurement-occasion processing program 10 P.
  • the image forming apparatus 1 performs login processing in response to a request for login from the user (# 601 of FIG. 13 ), switches off the function of voice input, if the function of voice input is on, after permission for login (# 602 ), and displays the home screen 5 A (# 603 ).
  • the image forming apparatus 1 displays the selected operation screen 5 B (# 605 ).
  • the image forming apparatus 1 sets the job in accordance with the conditions (# 607 ).
  • the image forming apparatus 1 starts the job (# 609 ).
  • an image acquired while the job is being performed (e.g., the image of the original read by the scan unit 10 j ) is stored in the RAM 10 b.
  • the image forming apparatus 1 performs the measurement-start-occasion processing as in FIG. 14 (# 612 ).
  • the image forming apparatus 1 discriminates whether the user who intends to measure blood pressure is a beginner (# 631 ).
  • the image forming apparatus 1 displays the explanatory screen 5 C (# 633 ). In a case where discriminating that the user is not a beginner (No at # 632 ) or in a case where the explanatory screen 5 C is closed (Yes at # 634 ), the image forming apparatus 1 starts the measurement processing (# 635 ), displays the measurement-in-process screen 51 D (# 636 ), and turns on the function of voice input (# 637 ).
  • While the measurement device 10 r is continuously generating the measurement data 6 B, namely, while the measurement device 10 r is performing the measurement processing for this time (Yes at # 610 , Yes at # 611 , No at # 613 , and No at # 618 ), when the user performs voice input (Yes at # 614 ), the image forming apparatus 1 performs the voice-input-based processing as in FIG. 15 (# 615 ).
  • the image forming apparatus 1 starts the job and stores an image in the job into, for example, the RAM 10 b (# 653 ).
  • the image forming apparatus 1 discriminates whether the user who has provided the instruction for performance of the job and the subject are identical (# 655 ). In a case where the user and the subject are identical (Yes at # 656 ), the image forming apparatus 1 suspends the job (# 657 ).
  • the image forming apparatus 1 displays the explanatory screen 5 C (# 659 ).
  • the image forming apparatus 1 displays the measurement-in-process screen 51 D (# 661 ).
  • the image forming apparatus 1 displays the job list screen 5 E (# 663 ).
  • the image forming apparatus 1 displays the view screen 5 F (# 665 ). While the view screen 5 F is being displayed (Yes at # 666 ), the image forming apparatus 1 performs the view screen processing as in FIG. 16 (# 667 ).
  • the image forming apparatus 1 displays the view screen 5 F including the next print image 7 F (# 682 ).
  • the image forming apparatus 1 displays the view screen 5 F including the previous print image 7 F (# 684 ).
  • the image forming apparatus 1 suspends the job of printing (# 686 ), and then displays the view screen 5 F including the print image 7 F rotated by a specified angle and performs the job of printing from the beginning (# 687 ).
  • the image forming apparatus 1 displays the view screen 5 F including the print image 7 F adjusted in accordance with the instruction (# 689 ).
  • the image forming apparatus 1 performs the reprint job (# 669 ).
  • the image forming apparatus 1 displays the measured-result screen 52 D (# 616 ), and deletes the image in the job stored in the RAM 10 b (# 617 ). Then, the image forming apparatus 1 turns off the function of voice input (# 602 ).
  • the image forming apparatus 1 interrupts the measurement processing for this time and deletes the image in the job stored in the RAM 10 b (# 619 ). Note that, in this case, a message to the effect that the measurement processing for this time has been interrupted may be displayed on the touch panel display 10 e . Then, the image forming apparatus 1 turns off the function of voice input (# 602 ).
  • Until the image forming apparatus 1 performs logout processing in response to a request for logout from the user (Yes at # 620 ), the image forming apparatus 1 appropriately repeats steps # 602 to # 619 , # 631 to # 637 , # 651 to # 669 , and # 681 to # 689 described above.
  • the image forming apparatus 1 can be provided in which the function of accepting an instruction by voice is more efficient than ever before.
  • the timing at which the voice input ON setter 108 turns on the function of the voice input unit 10 n is when the user presses the “end” icon 7 C on the explanatory screen 5 C.
  • the timing may be any of the following timings.
  • the timing may be when the measurement device 10 r acquires the pulse wave of the user (namely, when the processing of measurement starts).
  • the timing may be when the explanatory necessity discriminator 106 discriminates whether the explanatory screen 5 C is to be displayed on the touch panel display 10 e .
  • the timing may be when the sensor detects that the measurement device 10 r has been detached from the side face of the image forming apparatus 1 .
  • the timing at which the voice input OFF setter 102 turns off the function of the voice input unit 10 n is when an instruction is provided for display of the home screen 5 A while the measured-result screen 52 D is being displayed on the touch panel display 10 e or when an instruction is provided for interruption of the measurement processing for this time, after permission for login by the login processor 101 .
  • the timing may be any of the following timings.
  • the timing may be when the measurement device 10 r finishes transmitting all the measurement data 6 B (namely, when completing the processing of measurement).
  • the timing may be when the sensor detects that the measurement device 10 r has returned to the original position (namely, to the side face of the image forming apparatus 1 ).
  • the timing may be when the measurement device 10 r is disabled from acquiring the pulse wave of the user while the processing of measurement is being performed.
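The ON/OFF timings listed above amount to a simple gate on the voice input function, enabled only while the measurement device is in use. The sketch below is illustrative; the event names paraphrase the triggers enumerated in the text.

```python
class VoiceInputGate:
    """Sketch of the voice input ON setter 108 / OFF setter 102 behavior:
    voice instructions are accepted only between a start trigger (e.g., the
    "end" icon pressed, device detached, or measurement started) and an end
    trigger (e.g., measurement completed or device returned)."""
    def __init__(self):
        self.enabled = False  # function of the voice input unit 10n starts off

    def on_measurement_started(self):
        self.enabled = True

    def on_measurement_finished(self):
        self.enabled = False

    def accepts_voice_instruction(self) -> bool:
        return self.enabled
```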
  • the condition under which the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is to be displayed is that the personal data 6 D having the subject code identical to the user code of the user currently logged in to the image forming apparatus 1 (hereinafter, referred to as “identical data”) is stored in the personal data storage 107 in not less than the predetermined number of pieces.
  • Alternatively, the condition (namely, the condition of discriminating that the explanatory screen 5 C is to be displayed) may be that the interval between the measurement date and time of the identical data most recently stored in the personal data storage 107 and the time of start of the processing of measurement by the user currently logged in (e.g., the time of detachment of the measurement device 10 r from the image forming apparatus 1 ) (hereinafter, referred to as a “measurement interval time”) is a predetermined time or more (e.g., 500 hours or more).
  • the explanatory necessity discriminator 106 may discriminate that the explanatory screen 5 C is to be displayed as long as the measurement interval time is the predetermined time or more, even when the identical data is stored in the personal data storage 107 in not less than the predetermined number of pieces.
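The discrimination just described, display the explanatory screen for users with few prior measurements or after a long gap, might be expressed as follows. The function name and parameters are assumptions; the 500-hour figure is the example threshold from the text.

```python
def should_display_explanatory_screen(identical_data_count: int,
                                      predetermined_count: int,
                                      hours_since_last_measurement: float,
                                      predetermined_hours: float = 500.0) -> bool:
    """Display the explanatory screen 5C when the user is effectively a
    beginner: too few stored pieces of identical data, or a measurement
    interval time at or above the predetermined time."""
    if identical_data_count < predetermined_count:
        return True
    # even an experienced user sees the screen again after a long gap
    return hours_since_last_measurement >= predetermined_hours
```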
  • the condition under which the measurement device 10 r starts the processing of measurement is that the user presses the “end” icon 7 C on the explanatory screen 5 C or that the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is not to be displayed.
  • Alternatively, the condition (namely, the condition of starting the processing of measurement) may be that the user provides, by voice, the image forming apparatus 1 with an instruction for start of the processing of measurement after the user presses the “end” icon 7 C on the explanatory screen 5 C or after the explanatory necessity discriminator 106 discriminates that the explanatory screen 5 C is not to be displayed.
  • the measurement-occasion processor 109 calculates the remaining time t. However, the measurement device 10 r may calculate the remaining time t. According to the embodiment, the condition under which the measurement-occasion processor 109 discriminates that the processing of measurement has been completed is that the elapsed time measured by the measurement-occasion processor 109 is more than the required time. However, the condition may be that the measurement device 10 r notifies the measurement-occasion processor 109 that the processing of measurement has been completed and then the measurement-occasion processor 109 receives the notification.
  • the measurement device 10 r stores the required time.
  • the measurement device 10 r starts the processing of measurement and, simultaneously, starts measuring the elapsed time. Every generation of the measurement data 6 B, the remaining time t is calculated on the basis of the elapsed time and the required time. The calculated remaining time t is transmitted to the measurement-occasion processor 109 together with the measurement data 6 B.
  • the measurement device 10 r completes the processing of measurement.
  • the measurement device 10 r notifies the measurement-occasion processor 109 that the processing of measurement has been completed, simultaneously with transmission of the last generated measurement data 6 B.
  • the measurement-occasion processor 109 discriminates that the processing of measurement has been completed, on the basis of reception of the last generated measurement data 6 B and reception of the notification that the processing of measurement has been completed.
  • the job executor 104 may cause suspension of the job being performed, in accordance with an instruction from the user.
  • the job executor 104 may discriminate whether the user who has provided the instruction for performance of the job is identical to the user who has provided the instruction for change of the print image 7 F. Then, in a case where both of the users are identical, as described above, the job executor 104 is required at least to control each constituent of the image forming apparatus 1 such that the job is suspended. After that, the screen display unit 103 is required at least to cause the touch panel display 10 e to display the view screen 5 F including the print image 7 F changed in accordance with the instruction of the user.
  • the entire configuration of the image forming apparatus 1 , the configuration of each constituent of the image forming apparatus 1 , the content of processing, the order of processing, and the configuration of data can be appropriately changed without departing from the spirit of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Accessory Devices And Overall Control Thereof (AREA)
  • User Interface Of Digital Computer (AREA)
  • Facsimiles In General (AREA)

Abstract

An image forming apparatus to be used together with equipment to be used by a user with both hands includes: an acceptor that does not accept, by voice, an instruction for processing to be performed by the image forming apparatus before the user holds the equipment, but accepts, by the voice, the instruction while the equipment is being used with both of the hands.

Description

The entire disclosure of Japanese Patent Application No. 2018-133951, filed on Jul. 17, 2018, is incorporated herein by reference in its entirety.
BACKGROUND Technological Field
The present invention relates to the technology of an image forming apparatus to be used together with equipment, such as a measurement device that measures the living body of a user.
Description of the Related Art
Conventionally, image forming apparatuses referred to as “multi function peripherals (MFPs)”, have been widespread.
Typically, such an image forming apparatus is instructed by a touch of a user to an input device, such as a touch panel. Furthermore, such an image forming apparatus is instructed by input of a voice of a user into an input device, such as a microphone. This arrangement enables an instruction by voice in addition to an instruction by touch, resulting in improvement of the convenience of a user. JP 2013-41379 A, JP 2004-234529 A, JP 2013-508808 A, JP 2007-79852 A, and JP 2005-115773 A each disclose a device to which a user can provide an instruction by touch or an instruction by voice.
Mobile electronic equipment described in JP 2013-41379 A includes: a casing; a capacitive touch panel that displays an image and detects a touch operation as an input signal; a microphone that detects a sound as a voice signal; and a controller that processes the voice signal detected by the microphone as the input signal of the touch operation performed to the touch panel, in a voice operation mode. When discriminating that the touch operation is abnormal due to moisture adhering to the touch panel or when discriminating that detected humidity is a predetermined level or more, the mobile electronic equipment migrates to the voice operation mode.
A kiosk terminal described in JP 2004-234529 A includes: a touch panel disposed in superimposition on a display, the touch panel being to sense input by a touch of an operator; and a voice input device that senses a voice of the operator and converts the voice into character data. When the touch panel senses a touch to an input field after display of an input screen, a Japanese syllabary input screen is displayed. The operator touches characters displayed on the Japanese syllabary input screen, to input data. In a case where no input has been made through the touch panel for five seconds after display of the input screen or after input through the touch panel, the kiosk terminal accepts voice input. That is, the kiosk terminal switches to voice input.
A calculation device described in JP 2013-508808 A acquires, in response to a touch of a user to a touch input area, the positional coordinates of the touch input area, and further acquires a voice signal from a voice sensor. The impact strength of the touch of the user is determined on the basis of the voice signal. The calculation device performs an action associated with the determined impact strength.
According to JP 2007-79852 A, a data processing device in a voice input mode in which processing is performed on the basis of a voice input through a microphone, migrates, in a case where determining that an input voice has been registered in a voice-input prohibition information list, to an operator input mode to prompt a user to perform an input with a numeric keypad, otherwise prompts the user to perform an input with a voice. Thus, for information requiring retaining as a secret, the data processing device prompts the user to perform an input with the numeric keypad in order to prevent another person from listening to the information, and, for information requiring no retaining as a secret, prompts the user to perform a simplified input through a voice.
An L mode facsimile described in JP 2005-115773 A includes: a voice recognizer; a button operator; an operation time database; a CPU; a RAM; a ROM; a display; and a voice synthesizer. On the basis of a task selection of a user, the CPU reads the average operation time in each input mode (a voice input mode, a button input mode, or the voice input mode and the button input mode) from the operation time database, and displays an input mode selection screen on the display. Selection of a combined operation of voice input and button input by the user on the input mode selection screen, causes the CPU to display a screen prompting the user to perform voice input. Then, the CPU displays a result of utterance of the user recognized in voice by the voice recognizer, onto the screen. Next, the CPU displays a screen prompting the user to perform button input, and displays a result of the button input of the user onto the screen.
However, even when a function of accepting an instruction by voice is provided, the function goes unused if the user provides instructions only by touch. Moreover, there is a possibility that acceptance of a voice of a user who is not using an image forming apparatus, as an instruction, causes the image forming apparatus to perform unnecessary processing.
SUMMARY
An object of the present invention is to provide an image forming apparatus in which a function of accepting an instruction by voice is more efficient than ever before.
To achieve the abovementioned object, according to an aspect of the present invention, there is provided an image forming apparatus to be used together with equipment to be used by a user with both hands, and the image forming apparatus reflecting one aspect of the present invention comprises: an acceptor that does not accept, by voice, an instruction for processing to be performed by the image forming apparatus before the user holds the equipment but accepts, by the voice, the instruction while the equipment is being used with both of the hands.
BRIEF DESCRIPTION OF THE DRAWINGS
The advantages and features provided by one or more embodiments of the invention will become more fully understood from the detailed description given hereinbelow and the appended drawings which are given by way of illustration only, and thus are not intended as a definition of the limits of the present invention:
FIG. 1 is an illustration of an exemplary external appearance of an image forming apparatus;
FIG. 2 is an illustration of the hardware configuration of the image forming apparatus;
FIG. 3 is an illustration of an exemplary measurement device held with both hands of a user;
FIG. 4 is an illustration of an exemplary functional configuration of the image forming apparatus;
FIG. 5 is an illustration of an exemplary home screen;
FIG. 6 is an illustration of an exemplary copy operation screen;
FIG. 7 is an illustration of exemplary job data;
FIG. 8 is an illustration of an exemplary explanatory screen;
FIG. 9 is an illustration of exemplary personal data;
FIG. 10 is an illustration of an exemplary measurement-in-process screen;
FIG. 11 is an illustration of an exemplary measured-result screen;
FIG. 12 is an illustration of an exemplary view screen;
FIG. 13 is a flowchart of an exemplary flow of entire processing of the image forming apparatus;
FIG. 14 is a flowchart of an exemplary flow of measurement-start-occasion processing;
FIG. 15 is a flowchart of an exemplary flow of voice-input-based processing; and
FIG. 16 is a flowchart of an exemplary flow of view screen processing.
DETAILED DESCRIPTION OF EMBODIMENTS
Hereinafter, one or more embodiments of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the disclosed embodiments.
FIG. 1 is an illustration of an exemplary external appearance of an image forming apparatus 1. FIG. 2 is an illustration of the hardware configuration of the image forming apparatus 1. FIG. 3 is an illustration of an exemplary measurement device 10 r held with both hands of a user. FIG. 4 is an illustration of an exemplary functional configuration of the image forming apparatus 1.
The image forming apparatus 1 illustrated in FIG. 1 integrates functions such as copying, PC printing, cloud printing, faxing, scanning, and boxing. Generally, the image forming apparatus 1 is also referred to as a “multi function peripheral (MFP)”.
The PC print function allows printing an image on a sheet on the basis of image data received from a terminal device in the same local area network (LAN) as the image forming apparatus 1. The PC print function is also referred to as “network printing” or “network print”.
The cloud print function allows printing an image on a sheet on the basis of image data received from an external terminal device through a server on the Internet.
The box function allows each user given a storage area referred to as a “box” or a “personal box”, to save and manage, for example, image data in the storage area. Provision of a box per group enables the members of each group to share data. The box corresponds to a “folder” or a “directory” in a personal computer.
As illustrated in FIG. 2, the image forming apparatus 1 includes, for example, a central processing unit (CPU) 10 a, a random access memory (RAM) 10 b, a read only memory (ROM) 10 c, an auxiliary storage device 10 d, a touch panel display 10 e, an operation key panel 10 f, a network interface card (NIC) 10 g, a wireless LAN communication unit 10 h, a modem 10 i, a scan unit 10 j, a print unit 10 k, a finisher 10 m, a voice input unit 10 n, and a measurement device 10 r.
The CPU 10 a is the main CPU of the image forming apparatus 1. The RAM 10 b is the main memory of the image forming apparatus 1.
The touch panel display 10 e displays, for example, a screen indicating a message to the user, a screen into which the user inputs a command or information, or a screen indicating a result of processing performed by the CPU 10 a. Furthermore, the touch panel display 10 e transmits a signal indicating the touched position, to the CPU 10 a.
The operation key panel 10 f, which is a so-called hardware keyboard, includes, for example, a numeric keypad, a start key, a stop key, and function keys.
The NIC 10 g communicates with a different device in accordance with a protocol, such as transmission control protocol/internet protocol (TCP/IP).
The wireless LAN communication unit 10 h communicates with a different device in accordance with the Institute of Electrical and Electronics Engineers (IEEE) 802.11 wireless LAN standard.
The modem 10 i exchanges document data with a facsimile in accordance with a protocol, such as G3.
The scan unit 10 j reads an image on an original (sheet) set on an auto document feeder (ADF) or a platen glass, and generates image data.
The print unit 10 k prints, onto a sheet, not only the image read by the scan unit 10 j but also an image in image data received from an external device through the NIC 10 g.
The finisher 10 m performs postprocessing, as necessary, on printed matter produced by the print unit 10 k. Examples of the postprocessing include stapling, punching, and folding.
The voice input unit 10 n, which includes, for example, a sound board and a microphone, collects sound and generates voice data 6A. Particularly, in a case where the user instructs the image forming apparatus 1 by so-called voice input, the voice input unit 10 n generates the voice data 6A indicating a voice uttered by the user (namely, a voice of the user).
The measurement device 10 r measures, for example, the blood pressure and the pulse of the user. As illustrated in FIG. 3, the user covers four terminals with the thumbs and the forefingers of both hands. In this manner, the measurement device 10 r is held with both hands.
For example, the measurement device 10 r irradiates the thumbs and the forefingers of both hands with light, and detects the light reflected from the fingers at predetermined time intervals (e.g., every one second). This arrangement allows acquisition of a pulse wave. While the measurement device 10 r itself is being held, for example, the blood pressure and the pulse of the user are measured on the basis of the acquired pulse wave. The measurement device 10 r generates measurement data 6B indicating a measured result (namely, measured values) and then transmits the measurement data 6B to the image forming apparatus 1.
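The sampling of reflected light at predetermined time intervals can be sketched in Python as follows. This is only an illustrative sketch under the assumption of a sensor callback; the function and constant names are hypothetical and do not appear in the embodiment.

```python
SAMPLE_INTERVAL_S = 1.0  # predetermined time interval (e.g., every one second)

def acquire_pulse_wave(read_reflected_light, duration_s):
    """Sample the light reflected from the fingers at fixed intervals to
    build a pulse wave (a list of sensor readings)."""
    samples = []
    elapsed = 0.0
    while elapsed < duration_s:
        samples.append(read_reflected_light())  # one reflected-light reading
        elapsed += SAMPLE_INTERVAL_S            # a real device would wait here
    return samples
```

Quantities such as the blood pressure and the pulse would then be derived from the acquired pulse wave by signal processing that the embodiment does not specify.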
The measurement device 10 r is connected to the image forming apparatus 1 by wire or wireless. The measurement device 10 r is disposed detachably, for example, on a side face of the image forming apparatus 1. Note that removal of the measurement device 10 r from the side face causes a sensor to detect the removal.
The ROM 10 c or the auxiliary storage device 10 d stores an application for achieving a function, such as the copying. The ROM 10 c or the auxiliary storage device 10 d stores a measurement-occasion processing program 10P (refer to FIG. 4).
The measurement-occasion processing program 10P enables the image forming apparatus 1 to accept an instruction by voice input of the user while the measurement device 10 r is measuring the blood pressure of the user. The measurement-occasion processing program 10P allows output of the measured result. The detail thereof will be described later.
The measurement-occasion processing program 10P causes a login processor 101, a voice input OFF setter 102, a screen display unit 103, a job executor 104, a job data storage 105, an explanatory necessity discriminator 106, a personal data storage 107, a voice input ON setter 108, a measurement-occasion processor 109, and a voice processor 110 of FIG. 4 to be achieved in the image forming apparatus 1.
FIG. 5 is an illustration of an exemplary home screen 5A. FIG. 6 is an illustration of an exemplary copy operation screen 51B. FIG. 7 is an illustration of exemplary job data 6C. FIG. 8 is an illustration of an exemplary explanatory screen 5C. FIG. 9 is an illustration of exemplary personal data 6D. FIG. 10 is an illustration of an exemplary measurement-in-process screen 51D. FIG. 11 is an illustration of an exemplary measured-result screen 52D. FIG. 12 is an illustration of an exemplary view screen 5F.
The operation of the login processor 101, the voice input OFF setter 102, the screen display unit 103, the job executor 104, the job data storage 105, the explanatory necessity discriminator 106, the personal data storage 107, the voice input ON setter 108, the measurement-occasion processor 109, and the voice processor 110 of FIG. 4, will be described below with reference to FIGS. 5 to 12 with an exemplary case where the image forming apparatus 1 performs a job of copying and receives the measurement data 6B from the measurement device 10 r.
Suppose that the user intends to cause the image forming apparatus 1 to copy the image of an original. The user makes a request for login to the image forming apparatus 1 with the user name and the password thereof. Then, the following processing is performed.
In response to acceptance of the request for login, the login processor 101 of the image forming apparatus 1 discriminates whether the user is an authorized user, and permits the user to log in to the image forming apparatus 1 in a case where the user is an authorized user.
After permission for login, the voice input OFF setter 102 makes, when the function of the voice input unit 10 n is active (namely, on), the function inactive (namely, off) in order to prevent voice input from being performed.
At every occurrence of an event, the screen display unit 103 causes the touch panel display 10 e to display a screen corresponding to the event, as described sequentially below.
After permission for login, the screen display unit 103 causes the touch panel display 10 e to display the home screen 5A as in FIG. 5. The home screen 5A allows the user to select, from a plurality of operation screens 5B, the operation screen 5B to be displayed on the touch panel display 10 e, each operation screen 5B being operated by the user in order to cause the image forming apparatus 1 to perform a job. A plurality of icons each having a job name is disposed on the home screen 5A.
From the plurality of icons, the user presses the icon 7A corresponding to the screen to be operated for performance of the job of copying, thereby providing the image forming apparatus 1 with an instruction for display of the copy operation screen 51B as in FIG. 6 onto the touch panel display 10 e.
Then, the screen display unit 103 causes the touch panel display 10 e to display the copy operation screen 51B.
After providing the instruction for display of the copy operation screen 51B onto the touch panel display 10 e, the user sets the original on the ADF. The user sets the job by inputting conditions for the job of copying (e.g., the number of print copies and scaling), and then provides an instruction for start of the job. Then, the job executor 104 performs the following processing.
The job executor 104 controls each constituent of the image forming apparatus 1 such that the job is performed. Here, because of the job of copying, the job executor 104 causes, for example, the scan unit 10 j and the print unit 10 k to perform the job.
Furthermore, the job executor 104 generates the job data 6C indicating the job code identifying the job, the job type indicating the type of the job, and the user code of the user who has provided the instruction, and stores the job data 6C into the job data storage 105 for each job code as in FIG. 7.
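The job data 6C and its storage keyed by job code, as in FIG. 7, can be sketched in Python as follows. The class and field names are hypothetical; the embodiment does not specify a data structure.

```python
from dataclasses import dataclass

@dataclass
class JobData:
    job_code: str   # identifies the job
    job_type: str   # type of the job, e.g. "copy" or "print"
    user_code: str  # user who provided the instruction

class JobDataStorage:
    def __init__(self):
        self._records = {}  # one record per job code, as in FIG. 7

    def store(self, job):
        self._records[job.job_code] = job

    def find(self, job_code):
        return self._records.get(job_code)
```

Keying the records by job code allows later steps, such as the identical-user check before suspending a job, to look up who provided the instruction for a given job.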
Note that the image of the original read by the scan unit 10 j is stored in, for example, the RAM 10 b.
After causing the image forming apparatus 1 to start the job of copying, the user is on standby in front of the image forming apparatus 1 during performance of the job. Here, the user considers measuring his or her own blood pressure during the standby time. Then, the user detaches the measurement device 10 r from the side face of the image forming apparatus 1. Then, the following processing is performed.
When the sensor detects that the measurement device 10 r has been detached from the side face of the image forming apparatus 1, the explanatory necessity discriminator 106 discriminates whether the explanatory screen 5C as in FIG. 8 is to be displayed on the touch panel display 10 e, on the basis of the personal data 6D stored in the personal data storage 107, as below. The explanatory screen 5C provides the user with the description of the method of operating the image forming apparatus 1 by voice. Note that the explanatory screen 5C may provide the description of the measurement method with the measurement device 10 r.
The personal data storage 107 stores the personal data 6D including the measurement date and time indicating the date and time at which the blood pressure of the user was measured in the past with the image forming apparatus 1, the subject code of the user (namely, a subject), and the measurement data 6B of the user, for each measurement date and time, as in FIG. 9.
Note that processing of generating the personal data 6D and storing the personal data 6D into the personal data storage 107, will be described later.
The explanatory necessity discriminator 106 searches the personal data storage 107 for the personal data 6D having the subject code identical to the user code of the user currently logged in to the image forming apparatus 1. In a case where no such personal data 6D has been found, or in a case where fewer pieces of the personal data 6D than a predetermined number have been found, the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is to be displayed. In a case where not fewer pieces of the personal data 6D than the predetermined number have been found, the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is not to be displayed. The predetermined number can be set arbitrarily by an administrator.
That is, in a case where the image forming apparatus 1 cannot find, for the user who intends to measure blood pressure from now, pieces of the personal data 6D not fewer than the predetermined number, the image forming apparatus 1 discriminates that the user is inexperienced in operating the image forming apparatus 1 while measuring blood pressure, namely, a beginner. Then, the image forming apparatus 1 indicates to the user the method of operating the image forming apparatus 1 itself with the measurement device 10 r held with both hands (namely, by voice).
For example, in a case where the predetermined number is three or more, the user code of the user currently logged in to the image forming apparatus 1 is "U004", and the personal data storage 107 stores the personal data 6D as in FIG. 9, the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is to be displayed.
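The discrimination described above can be sketched in Python as follows. This is a minimal illustration; the function name, the record layout, and the default threshold are hypothetical.

```python
PREDETERMINED_NUMBER = 3  # can be set arbitrarily by an administrator

def explanatory_screen_needed(personal_data, user_code,
                              threshold=PREDETERMINED_NUMBER):
    """Discriminate that the explanatory screen 5C is to be displayed when
    fewer than `threshold` past records exist for this subject code."""
    matches = [d for d in personal_data if d["subject_code"] == user_code]
    return len(matches) < threshold
```

With the data of the example above, a user with two past records (fewer than three) would be treated as a beginner, while a user with three or more records would not.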
When the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is to be displayed, the screen display unit 103 causes the touch panel display 10 e to display the explanatory screen 5C.
After appropriately verifying the method of operating the image forming apparatus 1 through the explanatory screen 5C, the user presses an “end” icon 7C on the explanatory screen 5C. Then, the following processing is performed.
When the user presses the "end" icon 7C on the explanatory screen 5C, the voice input ON setter 108 (refer to FIG. 4) turns on the function of the voice input unit 10 n. That is, the voice input ON setter 108 makes the function active such that voice input is allowed. Similarly, when the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is not to be displayed, the voice input ON setter 108 turns on the function of the voice input unit 10 n.
When the user presses the “end” icon 7C on the explanatory screen 5C, the measurement device 10 r starts processing of measurement. Similarly, when the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is not to be displayed, the measurement device 10 r starts the processing of measurement.
Here, for example, in a case where the user does not hold the measurement device 10 r correctly, the measurement device 10 r cannot acquire the pulse wave of the user, and as a result the processing of measurement cannot start. In this case, the screen display unit 103 may cause the touch panel display 10 e to continuously display a screen indicating an error, until the measurement device 10 r is allowed to start the processing of measurement.
The measurement device 10 r generates the measurement data 6B every measurement, and transmits the measurement data 6B to the image forming apparatus 1, successively.
After reception of the first measurement data 6B (hereinafter, referred to as “measurement data 61B”) in the processing of measurement for this time (hereinafter, referred to as “measurement processing for this time”), the measurement-occasion processor 109 performs the following processing.
After reception of the measurement data 61B, the measurement-occasion processor 109 causes the personal data storage 107 to store the measurement data 61B as the personal data 6D, in association with the date and time of the reception of the measurement data 61B as the measurement date and time and the user code of the user currently logged in to the image forming apparatus 1 as the subject code (refer to FIG. 9). In addition, the measurement-occasion processor 109 starts measuring the elapsed time.
After reception of the measurement data 61B, the measurement-occasion processor 109 causes the personal data storage 107 to store the measurement data 6B at every reception of the measurement data 6B in the measurement processing for this time. In this case, the storing is performed such that the data already stored in the personal data storage 107 is not overwritten.
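The append-only storing described above can be sketched in Python as follows; one record is added per received measurement, and existing records are never overwritten. The field names mirror the personal data 6D of FIG. 9 but are hypothetical.

```python
def store_measurement(personal_data, subject_code, measurement, received_at):
    """Append one record per received measurement; records already stored
    in the personal data storage are never overwritten."""
    personal_data.append({
        "measurement_date_time": received_at,  # date and time of reception
        "subject_code": subject_code,          # user code of the logged-in user
        "measurement": measurement,            # e.g., blood pressure and pulse
    })
```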
Note that, in a case where the elapsed time exceeds the required time from the start to the completion of the processing of measurement, stored in advance, for example, in the auxiliary storage device 10 d, the measurement-occasion processor 109 discriminates that the measurement processing for this time has been completed.
After the measurement device 10 r starts the processing of measurement, the screen display unit 103 performs processing of generating the measurement-in-process screen 51D indicating the content of measurement still in progress as in FIG. 10, as below.
That is, the screen display unit 103 reads the personal data 6D of the measurement processing for this time, from the pieces of personal data 6D stored in the personal data storage 107.
The screen display unit 103 requests the remaining time t until completion of the measurement processing for this time, from the measurement-occasion processor 109. In response to the request, the measurement-occasion processor 109 calculates the difference between the required time and the elapsed time as the remaining time t, and transmits the remaining time t to the screen display unit 103.
The screen display unit 103 generates the measurement-in-process screen 51D, on the basis of the measurement data 6B of the read personal data 6D and the received remaining time t.
Note that the screen display unit 103 may reread the personal data 6D and request and receive the new remaining time t at predetermined time intervals (e.g., every two or three seconds) to generate the new measurement-in-process screen 51D to be displayed on the touch panel display 10 e. This arrangement causes the measurement-in-process screen 51D to be updated at the predetermined time intervals.
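The remaining-time calculation and the completion discrimination described above amount to simple arithmetic on the required time and the elapsed time. A sketch in Python, with hypothetical names:

```python
def remaining_time(elapsed_s, required_s):
    """Remaining time t until completion: the difference between the
    required time and the elapsed time (clamped at zero)."""
    return max(required_s - elapsed_s, 0.0)

def measurement_completed(elapsed_s, required_s):
    """The measurement processing for this time is regarded as completed
    once the elapsed time exceeds the required time stored in advance."""
    return elapsed_s > required_s
```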
The screen display unit 103 causes the touch panel display 10 e to display the generated measurement-in-process screen 51D.
When the measurement-occasion processor 109 discriminates that the measurement processing for this time has been completed, the screen display unit 103 generates the measured-result screen 52D indicating a measured result as in FIG. 11, on the basis of all the measurement data 6B of the personal data 6D of the measurement processing for this time, and causes the touch panel display 10 e to display the measured-result screen 52D.
After verifying the measured result through the measured-result screen 52D, the user provides, by voice, the image forming apparatus 1 with an instruction for display of the home screen 5A onto the touch panel display 10 e. Alternatively, before completion of the measurement processing for this time, the user provides, by voice, the image forming apparatus 1 with an instruction for interruption of the measurement processing for this time (namely, cancellation). Then, the following processing is performed.
At every input of a voice from the user, the voice input unit 10 n (refer to FIG. 2) generates the voice data 6A on the basis of the voice of the user.
The voice processor 110 (refer to FIG. 4) acquires the voice data 6A generated by the voice input unit 10 n, and converts the voice data 6A into a character code, for example, with an input method editor (IME) for sound. The voice processor 110 identifies the content of the instruction from the user, on the basis of the character code. That is, the occurred event is identified.
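Once the voice data has been converted into a character string, identifying the instruction can be as simple as a lookup. The following Python sketch assumes the speech-to-text conversion (the IME for sound) is performed elsewhere; the command phrases and event names are hypothetical.

```python
# Hypothetical mapping from recognized text to instruction events.
COMMANDS = {
    "home": "SHOW_HOME_SCREEN",
    "cancel measurement": "INTERRUPT_MEASUREMENT",
    "job list": "SHOW_JOB_LIST",
}

def identify_instruction(recognized_text):
    """Identify the content of the instruction (the occurred event)
    from the converted character string."""
    return COMMANDS.get(recognized_text.strip().lower(), "UNKNOWN")
```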
When the instruction for display of the home screen 5A is provided while the measured-result screen 52D is being displayed on the touch panel display 10 e, or when the instruction for interruption of the measurement processing for this time is provided, the voice input OFF setter 102 turns off the function of the voice input unit 10 n. The screen display unit 103 causes the touch panel display 10 e to display the home screen 5A (refer to FIG. 5). Note that the image of the original stored in the RAM 10 b is deleted.
The provision of the instruction for interruption of the measurement processing for this time, causes the measurement device 10 r to interrupt the processing of measurement.
Note that, while the measurement-in-process screen 51D is being displayed, the user can provide, by voice, an instruction for redisplay of the operation screen 5B displayed on the touch panel display 10 e before removal of the measurement device 10 r from the side face of the image forming apparatus 1. Due to provision of the instruction, the screen display unit 103 causes the touch panel display 10 e to display the operation screen 5B.
After that, the user provides, by voice, the image forming apparatus 1 with an instruction for redisplay of the measurement-in-process screen 51D onto the touch panel display 10 e. Then, similarly to the above, the screen display unit 103 reads the personal data 6D and requests and receives the new remaining time t, generates the measurement-in-process screen 51D, on the basis of the read personal data 6D and the received remaining time t, and then causes the touch panel display 10 e to display the measurement-in-process screen 51D.
Here, when the function of the voice input unit 10 n is on, the user provides, by voice, the image forming apparatus 1 with an instruction for interruption (namely, cancellation) or suspension of the job currently being performed by the image forming apparatus 1. Then, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job is suspended, for example.
In this case, the job executor 104 discriminates whether the user who has provided the instruction for performance of the job, is identical to the user who has provided the instruction for interruption or suspension of the job. Specifically, the job executor 104 makes discrimination, on the basis of the user code for the job currently being performed and the subject code of the personal data 6D of the measurement processing for this time. Then, in a case where both of the users are identical, each constituent of the image forming apparatus 1 is controlled so as to suspend the job.
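The identical-user check described above compares the user code stored for the job currently being performed with the subject code of the measurement processing for this time. A one-line Python sketch, with hypothetical names:

```python
def may_suspend_job(job_user_code, subject_code):
    """Allow interruption or suspension only when the user who provided the
    instruction for performance of the job is identical to the subject of
    the measurement processing for this time."""
    return job_user_code == subject_code
```

This guard prevents one user from interrupting, by voice, a job started by a different user.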
Alternatively, when the function of the voice input unit 10 n is on, the user provides, by voice, an instruction for redisplay of the explanatory screen 5C onto the touch panel display 10 e, so that the screen display unit 103 causes the touch panel display 10 e to display the explanatory screen 5C.
Alternatively, when the function of the voice input unit 10 n is on, the user provides, by voice, an instruction for display of a list of jobs performed or being performed by the image forming apparatus 1. Then, on the basis of the job data 6C stored in the job data storage 105, the screen display unit 103 causes the touch panel display 10 e to display a job list screen 5E indicating a list of jobs performed or being performed by the image forming apparatus 1.
Alternatively, when the function of the voice input unit 10 n is on, the user provides, by voice, the image forming apparatus 1 with an instruction for verification of an image in the job of printing currently being performed by the image forming apparatus 1 (hereinafter, referred to as a “print image 7F”). Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5F as in FIG. 12 for verification of the print image 7F.
In a case where a plurality of originals is included in the job of printing, while the view screen 5F is being displayed on the touch panel display 10 e, the user provides, by voice, the image forming apparatus 1 with an instruction for change of a print image 71F that is the print image 7F currently being displayed on the view screen 5F, to the image of the next original of the original of the print image 71F (namely, the next print image 7F). Alternatively, the user provides an instruction for change to the image of the previous original of the original of the print image 71F (namely, the previous print image 7F). Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5F including the print image 7F changed in accordance with the instruction of the user.
While the view screen 5F is being displayed on the touch panel display 10 e, the user considers changing the direction of printing of the print image 7F because the direction of typing is not identical to the orientation of a sheet. Then, the user provides, by voice, the image forming apparatus 1 with an instruction for suspension of the job of printing currently being performed by the image forming apparatus 1.
Similarly to the above, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job of printing is suspended.
Subsequently, the user provides, by voice, the image forming apparatus 1 with an instruction for rotation of the print image 7F by a predetermined angle in a predetermined direction (e.g., by 90° clockwise). That is, the user provides, by voice, the image forming apparatus 1 with an instruction for change of the direction of printing of the print image 7F. Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5F including the print image 7F rotated in accordance with the instruction of the user.
Subsequently, the user provides, by voice, the image forming apparatus 1 with an instruction for determination of the degree of rotation of the print image 7F (namely, the degree of change of the direction of printing). Then, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job of printing is performed to the rotated print image 7F from the beginning.
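The 90° clockwise rotation mentioned above can be sketched in Python on a pixel matrix (a list of rows); the function name is hypothetical, and a real implementation would operate on the raster data of the print image.

```python
def rotate_90_clockwise(image):
    """Rotate a pixel matrix by 90 degrees clockwise, as when the direction
    of printing of the print image is changed on the view screen."""
    # Reversing the row order and transposing yields a clockwise rotation.
    return [list(row) for row in zip(*image[::-1])]
```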
While the view screen 5F is being displayed on the touch panel display 10 e, the user provides an instruction for adjustment of the print image 7F, for example, in density. Then, the screen display unit 103 causes the touch panel display 10 e to display the view screen 5F including the print image 7F adjusted in density in accordance with the instruction of the user.
Alternatively, when the function of the voice input unit 10 n is on, the user provides an instruction for performance of a job of reprinting an image in the job of printing recently performed by the image forming apparatus 1 (hereinafter, referred to as a “reprint job”). Then, the job executor 104 searches the RAM 10 b for the image that is the target of the reprint job, and causes the reprint job to be performed to a found image.
Note that, unless the image forming apparatus 1 has started the job when the function of the voice input unit 10 n is on, the user can provide, by voice, an instruction for start of the job.
In this case, the job executor 104 discriminates whether preparation for performance of the job has been completed, on the basis of, for example, whether the original has been set to the ADF. Then, in a case where the preparation has been completed, the job executor 104 controls each constituent of the image forming apparatus 1 such that the job is performed.
FIG. 13 is a flowchart of an exemplary flow of entire processing of the image forming apparatus 1. FIG. 14 is a flowchart of an exemplary flow of measurement-start-occasion processing. FIG. 15 is a flowchart of an exemplary flow of voice-input-based processing. FIG. 16 is a flowchart of an exemplary flow of view screen processing.
Next, the flow of entire processing in the image forming apparatus 1 will be described with reference to the flowcharts of FIGS. 13 to 16.
The image forming apparatus 1 performs the processing in the order illustrated in FIG. 13, on the basis of the measurement-occasion processing program 10P.
The image forming apparatus 1 performs login processing in response to a request for login from the user (#601 of FIG. 13), switches off the function of voice input, if the function of voice input is on, after permission for login (#602), and displays the home screen 5A (#603).
In a case where the user has selected the operation screen 5B that the user desires to display (Yes at #604), the image forming apparatus 1 displays the selected operation screen 5B (#605).
In a case where the user has input conditions for the job (Yes at #606), the image forming apparatus 1 sets the job in accordance with the conditions (#607).
In a case where the user has provided an instruction for start of the job (Yes at #608), the image forming apparatus 1 starts the job (#609). In this case, an image acquired while the job is being performed (e.g., the image of the original read by the scan unit 10 j), is stored in the RAM 10 b.
In a case where detecting that the measurement device 10 r has been detached (Yes at #610 and No at #611), the image forming apparatus 1 performs the measurement-start-occasion processing as in FIG. 14 (#612).
In the measurement-start-occasion processing, the image forming apparatus 1 discriminates whether the user who intends to measure blood pressure is a beginner (#631).
In a case where the image forming apparatus 1 discriminates that the user is a beginner (Yes at #632), the image forming apparatus 1 displays the explanatory screen 5C (#633). In a case where the image forming apparatus 1 discriminates that the user is not a beginner (No at #632) or in a case where the explanatory screen 5C is closed (Yes at #634), the image forming apparatus 1 starts the measurement processing (#635), displays the measurement-in-process screen 51D (#636), and turns on the function of voice input (#637).
While the measurement device 10 r is continuously generating the measurement data 6B, namely, while the measurement device 10 r is performing the measurement processing for this time (Yes at #610, Yes at #611, No at #613, and No at #618), when the user performs voice input (Yes at #614), the image forming apparatus 1 performs the voice-input-based processing as in FIG. 15 (#615).
In the voice-input-based processing, in a case where the voice input includes an instruction for start of the job (Yes at #651) and in a case where preparation for performance of the job has been completed (Yes at #652), the image forming apparatus 1 starts the job and stores an image in the job into, for example, the RAM 10 b (#653).
In a case where the voice input includes an instruction for interruption or suspension of the job (Yes at #654), the image forming apparatus 1 discriminates whether the user who has provided the instruction for performance of the job and the subject are identical (#655). In a case where the user and the subject are identical (Yes at #656), the image forming apparatus 1 suspends the job (#657).
In a case where the voice input includes an instruction for display of the explanatory screen 5C (Yes at #658), the image forming apparatus 1 displays the explanatory screen 5C (#659).
In a case where the voice input includes an instruction for display of the measurement-in-process screen 51D (Yes at #660), the image forming apparatus 1 displays the measurement-in-process screen 51D (#661).
In a case where the voice input includes an instruction for display of the job list screen 5E (Yes at #662), the image forming apparatus 1 displays the job list screen 5E (#663).
In a case where the voice input includes an instruction for display of the view screen 5F (Yes at #664), the image forming apparatus 1 displays the view screen 5F (#665). While the view screen 5F is being displayed (Yes at #666), the image forming apparatus 1 performs the view screen processing as in FIG. 16 (#667).
In the view screen processing, in a case where the user has provided, by voice, an instruction for change of the print image 7F on the view screen 5F (namely, the print image 71F) to the next print image 7F (Yes at #681), the image forming apparatus 1 displays the view screen 5F including the next print image 7F (#682).
In a case where the user has provided, by voice, an instruction for change of the print image 7F on the view screen 5F (namely, the print image 71F) to the previous print image 7F (Yes at #683), the image forming apparatus 1 displays the view screen 5F including the previous print image 7F (#684).
In a case where the user has provided, by voice, an instruction for suspension of the job of printing of the print image 7F and rotation of the print image 7F (Yes at #685), the image forming apparatus 1 suspends the job of printing (#686), and then displays the view screen 5F including the print image 7F rotated by a specified angle and performs the job of printing from the beginning (#687).
In a case where the user has provided, by voice, an instruction for adjustment of the print image 7F (Yes at #688), the image forming apparatus 1 displays the view screen 5F including the print image 7F adjusted in accordance with the instruction (#689).
Referring back to FIG. 15, in a case where the user has provided, by voice, an instruction for performance of the reprint job (Yes at #668), the image forming apparatus 1 performs the reprint job (#669).
Referring back to FIG. 13, in a case where the measurement processing for this time has been completed (Yes at #613), the image forming apparatus 1 displays the measured-result screen 52D (#616), and deletes the image in the job stored in the RAM 10 b (#617). Then, the image forming apparatus 1 turns off the function of voice input (#602).
In a case where the user has provided an instruction for interruption of the measurement processing for this time (Yes at #618), the image forming apparatus 1 interrupts the measurement processing for this time and deletes the image in the job stored in the RAM 10 b (#619). Note that, in this case, a message to the effect that the measurement processing for this time has been interrupted may be displayed on the touch panel display 10 e. Then, the image forming apparatus 1 turns off the function of voice input (#602).
Until the image forming apparatus 1 performs logout processing in response to a request for logout from the user (Yes at #620), the image forming apparatus 1 appropriately repeats steps #602 to #619, #631 to #637, #651 to #669, and #681 to #689 described above.
According to the present embodiment, the image forming apparatus 1 can be provided in which the function of accepting an instruction by voice is more efficient than ever before.
According to the present embodiment, the timing at which the voice input ON setter 108 turns on the function of the voice input unit 10 n is when the user presses the "end" icon 7C on the explanatory screen 5C. However, the timing may be any of the following timings.
That is, the timing may be when the measurement device 10 r acquires the pulse wave of the user (namely, when the processing of measurement starts). Alternatively, the timing may be when the explanatory necessity discriminator 106 discriminates whether the explanatory screen 5C is to be displayed on the touch panel display 10 e, or when the sensor detects that the measurement device 10 r has been detached from the side face of the image forming apparatus 1.
According to the present embodiment, the timing at which the voice input OFF setter 102 turns off the function of the voice input unit 10 n is when, after the login processor 101 permits login, an instruction is provided for display of the home screen 5A while the measured-result screen 52D is being displayed on the touch panel display 10 e, or when an instruction is provided for interruption of the measurement processing for this time. However, the timing may be any of the following timings.
That is, the timing may be when the measurement device 10 r finishes transmitting all the measurement data 6B (namely, when the processing of measurement is completed). Alternatively, the timing may be when the sensor detects that the measurement device 10 r has returned to its original position (namely, to the side face of the image forming apparatus 1), or when the measurement device 10 r becomes unable to acquire the pulse wave of the user while the processing of measurement is being performed.
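Treating the ON timings and OFF timings described above as events, the state of voice input can be sketched as a simple toggle. The event names and the class below are illustrative assumptions, not identifiers from the embodiment:

```python
# Sketch of the voice input ON setter 108 / OFF setter 102 behavior as a
# flag toggled by events. The event names are hypothetical labels for the
# timings described in the text.

ON_EVENTS = {
    "end_icon_pressed",          # "end" icon 7C pressed on explanatory screen 5C
    "pulse_wave_acquired",       # measurement device 10r acquires the pulse wave
    "necessity_discriminated",   # discriminator 106 decides about screen 5C
    "device_detached",           # device 10r detached from the apparatus side face
}

OFF_EVENTS = {
    "home_screen_requested",     # home screen 5A requested from screen 52D
    "measurement_interrupted",   # user interrupts the measurement processing
    "all_data_transmitted",      # device 10r finishes sending measurement data 6B
    "device_returned",           # device 10r returned to its original position
    "pulse_wave_lost",           # device 10r becomes unable to acquire the pulse wave
}

class VoiceInputState:
    """Hypothetical toggle for the function of the voice input unit 10n."""

    def __init__(self):
        self.enabled = False  # voice instructions are not accepted initially

    def handle(self, event):
        # Any one of the listed timings may serve as the ON or OFF timing.
        if event in ON_EVENTS:
            self.enabled = True
        elif event in OFF_EVENTS:
            self.enabled = False
        return self.enabled
```

Whichever concrete timing is adopted, only membership in the two event sets changes; the toggle itself stays the same.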
According to the present embodiment, the condition under which the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is to be displayed is that at least a predetermined number of pieces of the personal data 6D having a subject code identical to the user code of the user currently logged in to the image forming apparatus 1 (hereinafter referred to as "identical data") are stored in the personal data storage 107.
However, the condition for discriminating that the explanatory screen 5C is to be displayed may instead be that the interval between the measurement date and time of the identical data most recently stored in the personal data storage 107 and the time at which the user currently logged in starts the processing of measurement (e.g., the time of detachment of the measurement device 10 r from the image forming apparatus 1) (hereinafter referred to as the "measurement interval time") is a predetermined time or more (e.g., 500 hours or more).
Furthermore, the explanatory necessity discriminator 106 may discriminate that the explanatory screen 5C is to be displayed whenever the measurement interval time is the predetermined time or more, even when at least the predetermined number of pieces of the identical data are stored in the personal data storage 107.
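Combining the count-based condition with the interval-based variation, the discrimination can be sketched as a single predicate. The function name and the predetermined number are assumptions for illustration; only the 500-hour figure comes from the text, and the exact combination of conditions in a real implementation would follow the discriminator 106:

```python
from datetime import datetime, timedelta

MIN_PIECES = 5                       # "predetermined number" (assumed value)
MIN_INTERVAL = timedelta(hours=500)  # "predetermined time" (500 hours, per the text)

def explanatory_screen_needed(stored_timestamps, measurement_start):
    """Hypothetical sketch of the discrimination by discriminator 106.

    stored_timestamps: measurement dates and times of the identical data
    stored in the personal data storage 107 for the logged-in user.
    measurement_start: time of start of the processing of measurement.
    """
    # Count-based condition: at least the predetermined number of pieces
    # of identical data are stored.
    enough_pieces = len(stored_timestamps) >= MIN_PIECES
    # Interval-based condition: the measurement interval time (since the
    # most recently stored identical data) is the predetermined time or more.
    if stored_timestamps:
        stale = measurement_start - max(stored_timestamps) >= MIN_INTERVAL
    else:
        stale = False
    # The interval condition applies even when enough pieces are stored.
    return enough_pieces or stale
```

The interval test is evaluated against the most recently stored identical data only, matching the "recently stored" qualification in the text.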
According to the present embodiment, the condition for the measurement device 10 r to start the processing of measurement is that the user presses the "end" icon 7C on the explanatory screen 5C or that the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is not to be displayed.
However, the condition for starting the processing of measurement may instead be that the user provides the image forming apparatus 1, by voice, with an instruction for starting the processing of measurement after pressing the "end" icon 7C on the explanatory screen 5C or after the explanatory necessity discriminator 106 discriminates that the explanatory screen 5C is not to be displayed.
According to the present embodiment, the measurement-occasion processor 109 calculates the remaining time t; however, the measurement device 10 r may calculate the remaining time t instead. Likewise, according to the present embodiment, the condition under which the measurement-occasion processor 109 discriminates that the processing of measurement has been completed is that the elapsed time measured by the measurement-occasion processor 109 exceeds the required time. However, the condition may instead be that the measurement-occasion processor 109 receives, from the measurement device 10 r, a notification that the processing of measurement has been completed.
In these cases, the measurement device 10 r stores the required time and starts measuring the elapsed time simultaneously with the processing of measurement. Every time the measurement data 6B is generated, the measurement device 10 r calculates the remaining time t on the basis of the elapsed time and the required time, and transmits the calculated remaining time t to the measurement-occasion processor 109 together with the measurement data 6B.
When the elapsed time exceeds the required time, the measurement device 10 r completes the processing of measurement and notifies the measurement-occasion processor 109 of the completion simultaneously with transmission of the last generated measurement data 6B. The measurement-occasion processor 109 discriminates that the processing of measurement has been completed on the basis of reception of the last generated measurement data 6B and of the notification.
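The device-side bookkeeping described above reduces to simple arithmetic on the elapsed time. The class below is a hypothetical sketch, with `time.monotonic` standing in for the device's internal clock; the names are not from the embodiment:

```python
import time

class MeasurementDevice:
    """Illustrative sketch of the device-side calculation of the
    remaining time t described above (variant where device 10r,
    rather than the measurement-occasion processor 109, calculates it)."""

    def __init__(self, required_time):
        self.required_time = required_time  # required time, stored on the device
        self.start = None

    def start_measurement(self):
        # Elapsed-time measurement starts simultaneously with the
        # processing of measurement.
        self.start = time.monotonic()

    def remaining_time(self):
        # Every time measurement data 6B is generated, the remaining time t
        # is calculated from the elapsed time and the required time.
        elapsed = time.monotonic() - self.start
        return max(self.required_time - elapsed, 0.0)

    def is_completed(self):
        # The processing of measurement completes once the elapsed time
        # exceeds the required time.
        return time.monotonic() - self.start > self.required_time
```

In this variant, the device would transmit `remaining_time()` alongside each piece of measurement data 6B, and the completion notification alongside the last piece.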
According to the present embodiment, when the print image 7F on the view screen 5F is changed as described above (namely, changed to the image of the next original or of the previous original) or is adjusted, the job executor 104 may suspend the job being performed, in accordance with an instruction from the user.
In this case, the job executor 104 may discriminate whether the user who provided the instruction for performance of the job is identical to the user who provided the instruction for change of the print image 7F. In a case where both users are identical, as described above, it suffices for the job executor 104 to control each constituent of the image forming apparatus 1 such that the job is suspended, and for the screen display unit 103 then to cause the touch panel display 10 e to display the view screen 5F including the print image 7F changed in accordance with the instruction of the user.
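The identity check described above amounts to a simple guard before suspension. The function and field names below are hypothetical, chosen only to illustrate the check:

```python
def maybe_suspend_for_change(job, change_requester):
    """Sketch of the guard in the job executor 104: suspend the running
    job for a print-image change only when the user requesting the change
    is identical to the user who provided the instruction for performance
    of the job. Names are illustrative assumptions."""
    if change_requester == job["owner"]:
        job["state"] = "suspended"
        return True   # the screen display unit may now show the changed image 7F
    return False      # a different user's request does not disturb the job
```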
In addition, for example, the entire configuration of the image forming apparatus 1, the configuration of each constituent of the image forming apparatus 1, the content of processing, the order of processing, and the configuration of data can be appropriately changed without departing from the spirit of the present invention.
Although embodiments of the present invention have been described and illustrated in detail, the disclosed embodiments are made for purposes of illustration and example only and not limitation. The scope of the present invention should be interpreted by the terms of the appended claims.

Claims (12)

What is claimed is:
1. An image forming apparatus to be used together with equipment to be used by a user with both hands, the image forming apparatus comprising:
a voice input unit configured to collect sound and generate voice data;
a hardware processor configured to turn on and off voice input processing;
wherein the image forming apparatus does not accept, by voice, an instruction for processing to be performed by the image forming apparatus before the user holds the equipment, and the image forming apparatus turns on the voice input processing in response to the equipment being used with both of the hands.
2. The image forming apparatus according to claim 1, wherein the hardware processor is further configured to:
receive, from the equipment, information regarding a living body of the user acquired by the equipment while the equipment is being used with both of the hands; and
output the information received.
3. The image forming apparatus according to claim 1,
wherein, while the image forming apparatus is performing a job given from the user as the processing, with the equipment used with both of the hands, the hardware processor accepts, by the voice, a first instruction for interruption or suspension of the job as the instruction.
4. The image forming apparatus according to claim 3,
wherein the job includes a print job in which the image forming apparatus prints an image, and
while the image forming apparatus is performing the print job as the processing, with the equipment used with both of the hands, the hardware processor accepts, by the voice, a second instruction for suspension of the print job as the instruction while the image is being displayed, and then accepts, by the voice, a third instruction for rotation of the image as the instruction.
5. The image forming apparatus according to claim 4,
wherein, while the equipment is being used with both of the hands, the hardware processor accepts, by the voice, as the instruction, a fourth instruction for start of a reprint job of printing the image printed or interrupted in printing by the image forming apparatus in the print job, as the processing newly.
6. The image forming apparatus according to claim 5,
wherein, while the equipment is being used with both of the hands and while the image printed by the image forming apparatus in the print job is being displayed, the hardware processor accepts, by the voice, a fifth instruction for display of a different image printed by the image forming apparatus instead of the image or adjustment and display of the image as the instruction.
7. The image forming apparatus according to claim 6,
wherein, after completion of preparation for performance of the processing while the equipment is being used with both of the hands, the hardware processor accepts, by the voice, a sixth instruction for start of the processing as the instruction.
8. The image forming apparatus according to claim 1, wherein the hardware processor is further configured to:
discriminate whether an operation method of the image forming apparatus by the voice is to be displayed, based on usage history indicating usage of the equipment by the user; and
the image forming apparatus further comprises a display that displays the operation method in a case where it is discriminated that the operation method is to be displayed.
9. The image forming apparatus according to claim 1,
wherein the hardware processor disables the instruction from being accepted by the voice before the use of the equipment with both of the hands starts, enables the instruction to be accepted by the voice when the use of the equipment with both of the hands starts, and disables the instruction from being accepted by the voice when the use of the equipment with both of the hands finishes.
10. An image forming apparatus to be used together with equipment that measures a living body of a user holding the equipment with both hands, the image forming apparatus comprising:
a voice input unit configured to collect sound and generate voice data;
a hardware processor configured to turn on and off voice input processing;
wherein the image forming apparatus does not accept, by voice, an instruction for processing to be performed by the image forming apparatus before the user holds the equipment, and the image forming apparatus accepts, by the voice, the instruction while the equipment is measuring the living body.
11. An instruction acceptance method comprising:
disabling, before a user holds equipment to be used by the user with both hands, an image forming apparatus from accepting, by voice, an instruction for processing to be performed by the image forming apparatus; and
enabling the image forming apparatus to accept, by voice input processing, the instruction in response to the equipment being used with both of the hands.
12. A non-transitory recording medium storing a computer readable program for controlling an image forming apparatus to be used together with equipment to be used by a user with both hands, the computer readable program causing the image forming apparatus to perform:
disabling an instruction for processing to be performed by the image forming apparatus, from being accepted by voice before the user holds the equipment; and
enabling the instruction to be accepted by voice input processing in response to the equipment being used with both of the hands.
US16/506,130 2018-07-17 2019-07-09 Image forming apparatus, instruction acceptance method, and computer readable program Active US10791230B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018133951A JP7139743B2 (en) 2018-07-17 2018-07-17 IMAGE FORMING APPARATUS, INSTRUCTION RECEIVING METHOD, AND COMPUTER PROGRAM
JP2018-133951 2018-07-17

Publications (2)

Publication Number Publication Date
US20200028979A1 US20200028979A1 (en) 2020-01-23
US10791230B2 true US10791230B2 (en) 2020-09-29

Family

ID=69163302

Family Applications (1)

Application Number Title Priority Date Filing Date
US16/506,130 Active US10791230B2 (en) 2018-07-17 2019-07-09 Image forming apparatus, instruction acceptance method, and computer readable program

Country Status (2)

Country Link
US (1) US10791230B2 (en)
JP (1) JP7139743B2 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7224863B2 (en) * 2018-11-09 2023-02-20 キヤノン株式会社 Relay server and control method
JP7251194B2 (en) * 2019-02-15 2023-04-04 セイコーエプソン株式会社 PRINTING SYSTEM, PRINTING METHOD, INFORMATION PROCESSING APPARATUS, AND INFORMATION PROGRAM
JP7415350B2 (en) * 2019-07-08 2024-01-17 コニカミノルタ株式会社 Voice operation system, control device, and control program

Citations (8)

Publication number Priority date Publication date Assignee Title
JP2004234529A (en) 2003-01-31 2004-08-19 Cross Culture Ltd Kiosk terminal
JP2005115773A (en) 2003-10-09 2005-04-28 Canon Inc Method and device for selection of input mode, method and device for switching of input mode, input mode selection/switching method, electronic apparatus, program, and storage medium
JP2007079852A (en) 2005-09-13 2007-03-29 Canon Inc Data processor, data processing method and computer program
US20110065428A1 (en) * 2009-09-16 2011-03-17 At&T Intellectual Property I, L.P Systems and methods for selecting an output modality in a mobile device
JP2013041379A (en) 2011-08-12 2013-02-28 Kyocera Corp Portable electronic device, input processing method, and input processing program
JP2013508808A (en) 2009-10-14 2013-03-07 株式会社ソニー・コンピュータエンタテインメント Touch interface with microphone for determining the impact strength of a touch
US20140070002A1 (en) * 2012-09-07 2014-03-13 Viscount Systems Inc. System and method for printer operation based on user proximity
US20180288248A1 (en) * 2017-03-29 2018-10-04 Xyzprinting, Inc. Voice control system for 3d printer and voice control method for using the same

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
JP2000209378A (en) 1999-01-20 2000-07-28 Ricoh Co Ltd Image forming device
JP2006067066A (en) 2004-08-25 2006-03-09 Fuji Xerox Co Ltd Image processor
JP6171511B2 (en) 2013-04-09 2017-08-02 コニカミノルタ株式会社 Control device, image forming apparatus, portable terminal device, control method, and control program
JP2015191448A (en) 2014-03-28 2015-11-02 パナソニックIpマネジメント株式会社 Terminal apparatus and voice operation control method in terminal apparatus
JP6729058B2 (en) 2016-06-24 2020-07-22 コニカミノルタ株式会社 Image forming apparatus and job execution control program
JP6683062B2 (en) 2016-08-23 2020-04-15 コニカミノルタ株式会社 Image processing apparatus, stress measuring method and stress measuring program in the apparatus

Patent Citations (9)

Publication number Priority date Publication date Assignee Title
JP2004234529A (en) 2003-01-31 2004-08-19 Cross Culture Ltd Kiosk terminal
JP2005115773A (en) 2003-10-09 2005-04-28 Canon Inc Method and device for selection of input mode, method and device for switching of input mode, input mode selection/switching method, electronic apparatus, program, and storage medium
JP2007079852A (en) 2005-09-13 2007-03-29 Canon Inc Data processor, data processing method and computer program
US20110065428A1 (en) * 2009-09-16 2011-03-17 At&T Intellectual Property I, L.P Systems and methods for selecting an output modality in a mobile device
JP2013508808A (en) 2009-10-14 2013-03-07 株式会社ソニー・コンピュータエンタテインメント Touch interface with microphone for determining the impact strength of a touch
US8411050B2 (en) 2009-10-14 2013-04-02 Sony Computer Entertainment America Touch interface having microphone to determine touch impact strength
JP2013041379A (en) 2011-08-12 2013-02-28 Kyocera Corp Portable electronic device, input processing method, and input processing program
US20140070002A1 (en) * 2012-09-07 2014-03-13 Viscount Systems Inc. System and method for printer operation based on user proximity
US20180288248A1 (en) * 2017-03-29 2018-10-04 Xyzprinting, Inc. Voice control system for 3d printer and voice control method for using the same

Also Published As

Publication number Publication date
US20200028979A1 (en) 2020-01-23
JP2020012927A (en) 2020-01-23
JP7139743B2 (en) 2022-09-21

Similar Documents

Publication Publication Date Title
US10791230B2 (en) Image forming apparatus, instruction acceptance method, and computer readable program
US8477936B2 (en) Management system including display apparatus and data management apparatus for displaying data on the display apparatus, and data acquisition method
JP4635761B2 (en) Image forming apparatus and method for controlling image forming apparatus
JP5317590B2 (en) Job processing apparatus, control method therefor, storage medium, and program
US20140300920A1 (en) Image forming apparatus capable of displaying initial screen based on past setting information, method of controlling the image forming apparatus, and storage medium
US10200561B2 (en) Communication apparatus and method for controlling the same for checking a transmission destination
US10972632B2 (en) Information processing apparatus with voice print authentication and program
CN110096160B (en) Image processing apparatus, control method for image processing apparatus, and computer readable medium
JP2007156745A (en) Processor, job execution device, processor control method, and computer program
JP7206881B2 (en) Information processing device and program
JP2017209815A (en) Image forming system, image processing device to be used for this, control method thereof, and program
US20240086125A1 (en) Printing apparatus, method, and printing system for preventing undesired cancellation of receipt of print job
US20240045631A1 (en) Printing apparatus and printing system for preventing undesired cancellation of printing
JP6215043B2 (en) Information display system and electronic device
US9646233B2 (en) Image forming apparatus and non-transitory computer readable recording medium for improved email printing
CN111953857A (en) Device for measuring the position of a moving object
JP5170415B2 (en) Information processing apparatus and information processing program
JP2015055929A (en) Image forming apparatus and operation control program
US8185957B2 (en) Peripheral device
CN111246042A (en) Data processing system and control method of data processing system
US11625474B2 (en) Electronic apparatus with two login types
JP7124549B2 (en) Information processing device, information processing system, and information processing program
JP2019217686A (en) Image-related processing apparatus, failure notification method, and computer program
JP6527566B2 (en) Image forming apparatus and information display system
US11861249B1 (en) File transfer system that transfers file among plurality of image forming apparatuses

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONICA MINOLTA, INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMAGUCHI, TOMOHIRO;REEL/FRAME:049700/0095

Effective date: 20190604

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STCF Information on status: patent grant

Free format text: PATENTED CASE

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 4