CN111064852B - Image processing apparatus and method - Google Patents

Image processing apparatus and method

Info

Publication number
CN111064852B
Authority
CN
China
Prior art keywords
operator
document
image
reading
line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811208072.XA
Other languages
Chinese (zh)
Other versions
CN111064852A (en)
Inventor
西田知世
河田祐一
山崎英树
斋藤玲子
板东义文
冈本健资
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujifilm Business Innovation Corp
Original Assignee
Fujifilm Business Innovation Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujifilm Business Innovation Corp
Priority to CN201811208072.XA
Publication of CN111064852A
Application granted
Publication of CN111064852B
Legal status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 1/00 Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
    • H04N 1/0035 User-machine interface; Control console
    • H04N 1/00352 Input means
    • H04N 1/00405 Output means
    • H04N 1/00408 Display of information to the user, e.g. menus
    • H04N 1/0044 Display of information to the user, e.g. menus for image preview or review, e.g. to help the user position a sheet
    • H04N 1/00795 Reading arrangements
    • H04N 1/00798 Circuits or arrangements for the control thereof, e.g. using a programmed control device or according to a measured quantity
    • H04N 1/00814 Circuits or arrangements for the control thereof, according to a detected condition or state of the reading apparatus, e.g. temperature
    • H04N 2201/00 Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
    • H04N 2201/0077 Types of the still picture apparatus
    • H04N 2201/0094 Multifunctional device, i.e. a device capable of all of reading, reproducing, copying, facsimile transception, file transception

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Facsimiles In General (AREA)
  • User Interface Of Digital Computer (AREA)
  • Facsimile Scanning Arrangements (AREA)

Abstract

The invention provides an image processing apparatus that, when reading an image of an original, suppresses reading being performed while the original is out of position, compared with a configuration in which the operator performs the reading-related input by hand. The image processing apparatus includes: an image reading unit that reads an image of an original according to an instruction of an operator; a reading operator information acquisition unit (32) that acquires information about the operator who has the document read; a reading operation information acquisition unit (31) that acquires information about an operation performed by the operator on the image processing apparatus in order to read the original; and a display control unit (34) that, when the acquired information satisfies a predetermined condition, displays a screen for receiving a line-of-sight-based input relating to reading of the document from the operator.

Description

Image processing apparatus and method
Technical Field
The present invention relates to an image processing apparatus.
Background
For example, patent document 1 discloses a document reading apparatus in which, when the start key is operated while the open/close sensor detects that the cover is open, the control section measures the time t for which the start key is operated; when the operation time t exceeds 3 seconds, the control section waits for the operation time t to elapse after the start key operation ends, and then causes the reading apparatus to start document reading.
Further, for example, patent document 2 discloses an image forming apparatus that reproduces, on a sheet, an image of an original placed on a platen glass, the image forming apparatus including: a 1st start unit, provided on the operation panel, for starting a copy operation; a 2nd start unit that detects that the platen glass is pressed with a force equal to or greater than a predetermined amount and starts a copy operation based on the pressing detection signal; and a selection unit that enables at least one of the 1st start unit and the 2nd start unit.
Patent document 1: japanese patent laid-open No. 2004-233557
Patent document 2: japanese patent laid-open No. 10-268708
The image processing apparatus may read an image of a document. For example, when an operator sets a document on a document table with both hands and then performs an operation to start reading, the operator has to release the document in order to perform that operation, so the reading may be performed with the document shifted out of position.
Disclosure of Invention
The present invention aims, when reading an image of an original, to suppress reading being performed while the original is out of position, compared with a configuration in which the operator performs the reading-related input manually.
[1] An image processing apparatus, comprising: an image reading unit that reads an image of an original according to an instruction of an operator; an acquisition unit that acquires information about an operator who reads a document or information about an operation performed by the operator on the image processing apparatus in order to read the document; and a display unit that displays, when the information acquired by the acquisition unit satisfies a predetermined condition, a screen for receiving a line-of-sight-based input relating to reading of the document from the operator.
[2] The image processing apparatus according to item [1], wherein the acquiring means acquires information on an operation performed by an operator on a document table on which a document is placed or on a document pressing member provided to be openable and closable for pressing the document placed on the document table, and the display means performs the display when the information acquired by the acquiring means indicates that a predetermined operation is performed on the document table or the document pressing member.
[3] The image processing apparatus according to [2], wherein the acquiring means acquires the magnitude of the pressing force applied to the document table, and the display means performs the display when, with the document pressing member open, the magnitude of the pressing force is equal to or greater than a predetermined threshold value.
[4] The image processing apparatus according to [2], wherein the acquiring means acquires information indicating opening and closing of the document pressing member, and the display means performs the display when the document pressing member has been open for longer than a predetermined time.
[5] The image processing apparatus according to [1], wherein the acquiring means acquires information indicating the state of the operator's hands, and the display means performs the display when the information acquired by the acquiring means indicates that the operator's hands are in a predetermined state.
[6] The image processing apparatus according to [5], further comprising a photographing unit that photographs the document table on which a document is placed, wherein the acquiring means acquires an image photographed by the photographing unit as information indicating the state of the operator's hands, and the display means performs the display when the image acquired by the acquiring means indicates that both of the operator's hands are on the document table.
[7] The image processing apparatus according to [1], wherein the acquiring means acquires image reading conditions set by the operator, and the display means performs the display when the image reading conditions acquired by the acquiring means indicate that a document composed of a plurality of sheets and having a bound portion is to be read.
[8] The image processing apparatus according to [7], wherein the display means performs the display when a document is placed on the document table after the setting.
[9] An image processing apparatus, comprising: a display unit that displays, when a document is set to be read, a screen for receiving a line-of-sight-based input relating to reading of the document from the operator.
[10] An image processing apparatus, comprising: an acquisition unit that acquires information of a specific image reading mode selected by an operator and different from a normal image reading mode; a display unit that displays an image for determining a line of sight from an operator in a case where the specific image reading mode is selected; and a reception unit that receives a line-of-sight-based input relating to reading of the document by determining a line of sight of an operator with respect to the image displayed by the display unit.
Effects of the invention
According to [1], in the case of reading an image of an original, it is possible to suppress reading in a state where the position of the original is shifted, as compared with a configuration in which an operator manually performs an input related to reading.
According to [2], compared with a configuration in which information on the operations performed by the operator on the document table and the document pressing member is not used, the likelihood of suppressing reading while the document is out of position can be increased.
According to [3], the possibility of suppressing the reading in a state where the position of the original is shifted can be more reliably improved.
According to [4], the possibility of suppressing the reading in a state where the position of the original is shifted can be more reliably improved.
According to [5], the possibility of suppressing the reading in the state where the position of the original is shifted can be increased as compared with a configuration in which information indicating the state of the hand of the operator is not used.
According to [6], the possibility of suppressing the reading in a state where the position of the original is shifted can be more reliably improved.
According to [7], in the case of reading a document composed of a plurality of sheets and having a bound portion, it is possible to suppress reading being performed while the document is out of position, compared with a configuration in which the operator performs the reading-related input manually.
According to [8], erroneous reading can be suppressed, compared with a configuration in which the display for receiving a line-of-sight-based input relating to reading of the document is performed regardless of whether a document has been set.
According to [9], in the case of reading an image of an original, it is possible to suppress the reading in a state where the position of the original is shifted, compared with a configuration in which an operator manually performs an input related to the reading.
According to [10], in the case of reading an image of an original, it is possible to suppress the reading in a state where the position of the original is shifted, compared with a configuration in which an operator manually performs an input related to the reading.
Drawings
Fig. 1 is a perspective view of an image processing apparatus according to the present embodiment.
Fig. 2 is a block diagram showing an example of a hardware configuration of the image processing apparatus according to the present embodiment.
Fig. 3 is a block diagram showing an example of the functional configuration of the control unit.
Fig. 4 is a diagram for explaining an example of the structure of the line-of-sight detection sensor.
Fig. 5 (a) and (b) are diagrams for explaining an example of the structure of the line-of-sight detection sensor.
Fig. 6 is a flowchart showing an example of a processing procedure for displaying the read input image and receiving an input based on the line of sight of the operator.
Fig. 7 is a flowchart showing the processing sequence of embodiment 1.
Fig. 8 (a) to (c) are diagrams showing an example of the screen displayed in embodiment 1.
Fig. 9 is a flowchart showing the processing sequence of embodiment 2.
Fig. 10 (a) to (c) are diagrams showing an example of the screen displayed in embodiment 2.
Fig. 11 is a flowchart showing the processing sequence of embodiment 3.
Fig. 12 (a) and (b) are diagrams showing an example of the screen displayed in embodiment 3.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Appearance of image processing apparatus
First, the external appearance of the image processing apparatus 10 of the present embodiment will be described. Fig. 1 is a perspective view of an image processing apparatus 10 according to the present embodiment. The image processing apparatus 10 of the present embodiment is, for example, a so-called multifunction peripheral having various image processing functions such as an image reading function (scanning function), a printing function (printing function), a copying function (copying function), and a facsimile function.
The image processing apparatus 10 has a scanner 11, a printer 12, and a User Interface (UI) 13. The scanner 11 is a device for reading an image formed on an original document, and the printer 12 is a device for forming an image on a recording material. The user interface 13 is a device for receiving an operation (instruction) from an operator when the operator operates the image processing apparatus 10, and displaying various information to the operator.
The scanner 11 of the present embodiment is disposed above the printer 12. Here, the scanner 11 has: a document pressing member 14 provided so as to be openable and closable in the up-down direction as indicated by an arrow in the figure, and a document table 15 on which a document is placed. The scanner 11 is configured so that an operator can place an original on the original table 15 by opening the original pressing member 14 upward. Further, after the operator places the original on the original platen 15, the original pressing member 14 is returned to the original position (closed downward), whereby the original pressing member 14 can be pressed against the original placed on the original platen 15.
Further, the document pressing member 14 includes: a document storage section 16 for storing a document, and a document discharge section 17 for discharging the document conveyed from the document storage section 16. The scanner 11 reads an image of an original document placed on the document table 15, and reads an image of an original document transported from the original document storage section 16 to the original document discharge section 17.
The user interface 13 is mounted on the scanner 11. It is arranged on the front side of the image processing apparatus 10 (scanner 11), where the operator stands when operating the apparatus. The user interface 13 faces upward so that an operator standing in front of the image processing apparatus 10 can operate it while looking down at it.
Here, the user interface 13 has a touch panel 130 and an operation button group 131. The touch panel 130 has a function of displaying various information to an operator and receiving an input from the operator. The operation button group 131 has a function of receiving an input from an operator.
The image processing apparatus 10 includes a line-of-sight detection sensor 18, a 1 st camera 19, and a 2 nd camera 20. The line-of-sight detection sensor 18 is mounted on the left side of the user interface 13, facing upward. The 1 st camera 19 is mounted on the upper side of the user interface 13. The 2 nd camera 20 is mounted on the front left side of the printer 12.
The line-of-sight detection sensor 18 has a function of detecting the line of sight of an operator existing around the image processing apparatus 10. More specifically, the line-of-sight detection sensor 18 detects the line of sight of the operator toward the touch panel 130 of the user interface 13. In addition, although the line-of-sight detection sensor 18 is mounted on the left side of the user interface 13 in the example shown in fig. 1, the configuration is not limited thereto. The line-of-sight detection sensor 18 may be mounted at any portion as long as it is a position capable of detecting the line of sight of the operator toward the touch panel 130, for example, the line-of-sight detection sensor 18 is mounted at a position within a predetermined range from the touch panel 130.
The 1 st camera 19 is constituted by a so-called video camera, and has a function of capturing an image. Further, the 1 st camera 19 is disposed at a position where the document table 15 and the user interface 13 can be photographed, and photographs images of the document table 15 and the user interface 13.
The 2 nd camera 20 is also constituted by a so-called video camera, and has a function of capturing an image. Further, the 2 nd camera 20 is disposed at a position where an operator present around the image processing apparatus 10 can be photographed, and photographs an image of the operator present around the image processing apparatus 10.
The 1 st camera 19 and the 2 nd camera 20 capture at least one of a still image and a moving image.
Hardware structure of image processing apparatus
Next, a hardware configuration of the image processing apparatus 10 according to the present embodiment will be described. Fig. 2 is a block diagram showing an example of a hardware configuration of the image processing apparatus 10 according to the present embodiment.
As shown in the figure, the image processing apparatus 10 of the present embodiment includes: the control section 21, the communication section 22, the operation section 23, the display section 24, the storage section 25, the image reading section 26, the image forming section 27, the photographing section 28, the line-of-sight detection section 29, and the authentication section 30.
The control unit 21 controls the operations of the respective parts of the image processing apparatus 10. The control unit 21 includes a CPU (Central Processing Unit) 21a, a RAM (Random Access Memory) 21b, and a ROM (Read Only Memory) 21c.
The CPU 21a realizes the functions of the image processing apparatus 10 by loading the various programs stored in the ROM 21c and the like into the RAM 21b and executing them. The RAM 21b is a memory (storage section) used as a working memory of the CPU 21a. The ROM 21c is a memory (storage section) that stores the various programs and the like executed by the CPU 21a.
The communication unit 22 is an interface for communication connected to a communication line not shown. The communication unit 22 communicates with a client device and other image processing devices (both not shown) via a communication line.
The operation unit 23 inputs information corresponding to an operation by the operator to the control unit 21. In this example, the operation section 23 is implemented by a touch panel 130 provided on the user interface 13, and an operation button group 131.
The display unit 24 displays various information to the operator. In this example, the display section 24 is implemented by a touch panel 130 provided on the user interface 13.
The storage unit 25 is a storage unit that stores various data. The storage unit 25 is, for example, a hard disk. The storage unit 25 stores various programs, data, and the like used by the control unit 21.
The image reading section 26, which is an example of the image reading unit, reads an image of an original according to an instruction of an operator, and generates image data representing the read image. In this example, the image reading section 26 is implemented by the scanner 11.
The image forming section 27 forms an image corresponding to the image data on a sheet-like recording material such as paper. In this example, the image forming section 27 is implemented by the printer 12. The image forming unit 27 may form an image by an electrophotographic method, or may form an image by another method.
The imaging unit 28, which is an example of an imaging unit, images an imaging target. In this example, the photographing section 28 is implemented by the 1 st camera 19 and the 2 nd camera 20.
The line-of-sight detecting unit 29 has a function of detecting the line of sight of the operator existing around the image processing apparatus 10. In this example, the line-of-sight detection section 29 is implemented by the line-of-sight detection sensor 18.
When an operator wants to operate the image processing apparatus 10, the authentication unit 30 authenticates the operator. For example, the operator is identified and authenticated by holding his or her own IC (Integrated Circuit) card, such as an employee ID card, near an IC card reader (not shown) of the image processing apparatus 10. In this case, the authentication section 30 is realized by the card reader. Alternatively, authentication may be performed using the face image of the operator captured by the 2 nd camera 20: the operator is identified and authenticated by comparing the captured face image with face images registered in advance. In this case, the authentication unit 30 is realized by a processing device that identifies the operator by comparing the face image captured by the 2 nd camera 20 with the pre-registered face images.
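The face-matching flow described above can be sketched as follows. This is an illustrative assumption rather than the patent's implementation: face images are abstracted as feature vectors, and identification picks the registered operator whose vector is most similar to the captured one, subject to a match threshold. The function names and the threshold value are hypothetical.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two face feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def authenticate(captured_vec, registered, threshold=0.9):
    """Identify the operator whose registered face vector best matches
    the vector extracted from the 2nd camera's image.

    `registered` maps operator ID -> feature vector. Returns the operator
    ID, or None if no candidate clears the threshold (authentication fails).
    """
    best_id, best_score = None, threshold
    for operator_id, vec in registered.items():
        score = cosine_similarity(captured_vec, vec)
        if score >= best_score:
            best_id, best_score = operator_id, score
    return best_id
```

In practice the feature vectors would come from a face-embedding model; the thresholded nearest-match structure is the part that corresponds to "comparing the captured face image with face images registered in advance".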
In the image processing apparatus 10, under the control of the control unit 21, the scanning function is realized by the image reading unit 26, the printing function is realized by the image forming unit 27, the copying function is realized by the image reading unit 26 and the image forming unit 27, and the facsimile function is realized by the image reading unit 26, the image forming unit 27, and the communication unit 22.
Functional structure of control part
Next, the functional configuration of the control unit 21 will be described. Fig. 3 is a block diagram showing an example of the functional configuration of the control unit 21. The control unit 21 includes: the reading operation information acquisition unit 31, the reading operator information acquisition unit 32, the condition determination unit 33, the display control unit 34, the operation input reception unit 35, and the line-of-sight input reception unit 36.
The reading operation information acquiring unit 31 as an example of the acquiring means acquires information (hereinafter referred to as reading operation information) related to an operation performed by an operator on the image processing apparatus 10 for reading a document. The reading operation information is information indicating, for example, an image of the document table 15 and an image of the user interface 13 obtained by the imaging of the 1 st camera 19, and a magnitude of the pressing force with which the document table 15 is pressed. Details regarding the reading action information will be described later.
The reading operator information acquiring unit 32 as an example of the acquiring means acquires information of an operator who reads a document in the image processing apparatus 10 (hereinafter referred to as reading operator information). The read operator information is, for example, an image of the operator obtained by photographing with the 2 nd camera 20, and information of the operator specified by the authentication section 30. Details regarding the reading of the operator information will be described below.
The condition determination unit 33 determines whether or not the read operation information or the read operator information satisfies a predetermined condition.
When the condition determination unit 33 determines that the reading operation information or the reading operator information satisfies a predetermined condition, the display control unit 34, which is an example of the display means, displays a screen for receiving an input of a line of sight related to reading of the document from the operator. Here, the display control unit 34 displays an image (hereinafter referred to as a read input image) for receiving a line-of-sight-based input related to reading of the document from the operator on the touch panel 130. The read input image is used as an example of an image for specifying a line of sight from an operator.
Incidentally, for example, when reading a document composed of a plurality of sheets and having a bound portion (hereinafter referred to as a "bound document"), the operator may hold the document placed on the document table 15 down with both hands during reading to keep its position from shifting. In the present embodiment, conditions under which the operator's hands are assumed to be occupied in this way are predetermined as triggers for enabling input based on the operator's line of sight, and when such a condition is determined to be satisfied, the display control unit 34 displays the read input image.
In the following, such a predetermined condition is referred to as a "line-of-sight input condition". Details of the line-of-sight input conditions will be described below.
The operation input receiving unit 35 receives an input based on an operation by an operator. Here, the operation input receiving unit 35 detects a contact operation performed by the operator on the user interface 13, and receives an input from the operator. For example, the operation input receiving unit 35 detects that an icon for a copy function is pressed on the menu screen, and receives a selection of the copy function.
The sight line input reception unit 36, as an example of the reception means, receives input based on the operator's line of sight. It periodically (for example, every 100 milliseconds) acquires, from the line-of-sight detection sensor 18, information on the position toward which the operator's line of sight is directed (hereinafter referred to as line-of-sight position information), and receives input based on the acquired line-of-sight position information. More specifically, the line-of-sight input reception unit 36 uses the acquired line-of-sight position information to determine whether the operator's line of sight is directed at the read input image, and when it determines that the operator is looking at the read input image, it receives an input corresponding to the position at which the operator is looking.
For example, when it is determined that the operator's line of sight is directed to the "start" button of the copy function in the read input image, the line of sight input reception unit 36 receives an input to execute the copy function. Then, the copy function is performed. That is, image reading by the image reading section 26 and image forming by the image forming section 27 are performed. For example, when it is determined that the operator is looking at the "return to setting" button in the read input image, the line-of-sight input reception unit 36 receives an input of a setting screen for displaying the copy function. Then, a setting screen of the copy function is displayed. That is, the display control unit 34 displays a setting screen of the copy function.
Incidentally, the line-of-sight input reception unit 36 uses a rectangular coordinate system on the touch panel 130 (see fig. 1): the upper left corner of the touch panel 130 is defined as the origin O1 (0, 0), the horizontal direction of the touch panel 130 as the X coordinate, and the vertical direction as the Y coordinate. The line-of-sight input reception unit 36 obtains, as line-of-sight position information, the X and Y coordinates of the position toward which the operator's line of sight is directed. Based on these coordinates and the X and Y coordinates of the region in which the read input image is displayed, the line-of-sight input reception unit 36 determines whether the operator's line of sight is directed at the read input image and, if so, at which position within it.
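The hit test just described, checking whether the gaze coordinates reported by the sensor fall inside the on-screen region of the read input image and which button they land on, can be sketched as follows. This is a minimal illustration; the class, field, and button names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Rect:
    # Region of a displayed element in touch-panel coordinates,
    # with the origin O1 (0, 0) at the upper-left corner of the panel.
    x: int       # X coordinate of the region's upper-left corner
    y: int       # Y coordinate of the region's upper-left corner
    width: int
    height: int

    def contains(self, gx: int, gy: int) -> bool:
        """Return True if the gaze point (gx, gy) falls inside this region."""
        return (self.x <= gx < self.x + self.width
                and self.y <= gy < self.y + self.height)

def hit_test(gaze_xy, buttons):
    """Map a gaze position to the name of the button it falls on, if any.

    `buttons` maps a button name to its Rect within the read input image.
    Returns None when the gaze is not on any button of the read input image.
    """
    gx, gy = gaze_xy
    for name, rect in buttons.items():
        if rect.contains(gx, gy):
            return name
    return None

# Example layout: a "start" and a "return to setting" button on the panel.
buttons = {"start": Rect(500, 300, 200, 100),
           "return_to_setting": Rect(100, 300, 200, 100)}
```

In the scheme of the embodiment, such a hit test would run on each line-of-sight position sample (for example, every 100 milliseconds); a real implementation would typically also require the gaze to dwell on a button for some time before treating it as an input.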
In the present embodiment, even when the read input image is displayed, the input by the touch operation of the operator can be performed. That is, even when the read input image is displayed, the operation input receiving unit 35 receives an input based on an operation by the operator.
Each of the functional units constituting the control unit 21 shown in fig. 3 is realized by cooperation of software and hardware resources. Specifically, when the image processing apparatus 10 is implemented by the hardware configuration shown in fig. 2, the functional units such as the reading operation information acquisition unit 31, the reading operator information acquisition unit 32, the condition determination unit 33, the display control unit 34, the operation input reception unit 35, and the line-of-sight input reception unit 36 are realized by the CPU 21a reading the OS program and the application programs stored in the ROM 21c into the RAM 21b and executing them.
Structure of sight line detection sensor
Next, the structure of the line-of-sight detection sensor 18 (see fig. 1) will be described. Fig. 4 and 5 (a) and (b) are diagrams for explaining an example of the configuration of the line-of-sight detection sensor 18.
As shown in fig. 4, the line-of-sight detection sensor 18 has a light source 181 that irradiates infrared light in a dot shape to the eyeball 101 of the operator, and the infrared reflected light from the eyeball 101 is incident into an optical lens group 183 via a minute aperture stop provided on an eyepiece 182. The optical lens group 183 forms an image of the incident infrared reflection light in a dot shape on the image pickup surface of the CCD184, and the CCD184 converts a virtual image (purkinje image) based on the cornea reflection formed on the image pickup surface into an electric signal and outputs the electric signal.
As shown in fig. 5 (a) and (b), the virtual image 103 is formed when the infrared light irradiated from the light source 181 (see fig. 4) is reflected by the cornea, and the relative positional relationship between the center of the pupil 102 and the virtual image 103 changes in proportion to the rotation angle of the eyeball 101. In the present embodiment, image processing is performed on the electric signal representing the virtual image output from the CCD184, and the line of sight of the operator (the direction of the operator's line of sight and the position toward which it is directed) is detected based on the result of the image processing.
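The proportional relationship just described can be expressed in simplified form. The gain constant and pixel coordinates below are illustrative assumptions; a real sensor would calibrate this mapping per operator.

```python
def estimate_rotation(pupil_center, purkinje_image, gain=0.05):
    """Estimate the eyeball rotation (horizontal, vertical, in arbitrary
    angle units) from the offset between the pupil center and the
    corneal-reflection (Purkinje) virtual image, assuming the offset is
    proportional to the rotation angle as described above."""
    dx = pupil_center[0] - purkinje_image[0]
    dy = pupil_center[1] - purkinje_image[1]
    return (gain * dx, gain * dy)

print(estimate_rotation((320, 240), (320, 240)))  # (0.0, 0.0): no offset, eye looking straight ahead
print(estimate_rotation((330, 240), (320, 240)))  # nonzero horizontal component: eyeball rotated
```

Mapping the estimated rotation to an (X, Y) position on the touch panel would be a second calibrated step, omitted here.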
Detection of the operator's line of sight may be performed by other known methods and is not limited to the methods shown in fig. 4 and 5.
The line-of-sight detection sensor 18 may be, for example, an eye tracker manufactured by Tobii Technology.
< Description of reading operation information, reading operator information, and line-of-sight input conditions >
Next, the reading operation information, the reading operator information, and the line-of-sight input condition will be described in detail. As described above, the sight line input condition is predetermined as a condition in which it is assumed that both hands of the operator are occupied, and the condition determination unit 33 determines whether or not the read operation information or the read operator information satisfies the sight line input condition. Here, a case where the reading operation information satisfies the line-of-sight input condition and a case where the reading operator information satisfies the line-of-sight input condition will be described separately.
First, a case where the reading operation information satisfies the line-of-sight input condition will be described.
The line-of-sight input condition in this case defines the action of the operator in a manner assuming that both hands of the operator are occupied. The reading operation information used for the determination is information related to the operation performed by the operator on the document table 15, the document pressing member 14, the user interface 13, and other components of the image processing apparatus 10. The condition determination unit 33 determines whether or not the operator has performed an operation defined by the line-of-sight input condition, based on the read operation information.
More specifically, for example, a condition that "both hands of the operator are located on the document table 15" is specified as the line-of-sight input condition. The reading operation information in this case is information on an operation performed on the document table 15 by the operator, and is, for example, an image of the document table 15 obtained by capturing with the 1 st camera 19. The condition determination unit 33 determines whether or not both hands of the operator are on the document table 15 in the image of the document table 15. For example, when the image of the document table 15 includes image information of both hands of the operator, the condition determination unit 33 determines that the reading operation information satisfies the line-of-sight input condition.
Here, when the operator presses the original document against the original document table 15 with both hands, it is considered that the original document table 15 is pressed with a force equal to or greater than a predetermined magnitude. Therefore, for example, a condition of "the document platen 15 is pressed by a force equal to or greater than a predetermined threshold value in a state where the document pressing member 14 is opened" may be defined as the line-of-sight input condition. The reading operation information in this case is information indicating the magnitude of the pressing force against the document table 15 in the state where the document pressing member 14 is opened. The condition determination unit 33 determines whether or not the pressing force against the document table 15 is equal to or greater than a predetermined threshold in the reading operation information.
Further, the magnitude of the pressing force when the operator presses an original against the document table 15 with both hands is considered to differ from the pressing force applied when the document pressing member 14 is closed and presses the original. Therefore, the pressing force with the document pressing member 14 closed may be measured in advance, and the line-of-sight input condition may be regarded as satisfied when the document table 15 is pressed with a force different from the measured value. Alternatively, since the pressing force when the operator presses an original with both hands is considered to fall within a predetermined range, the condition may be regarded as satisfied when the document table 15 is pressed with a force within that range. In this way, the line-of-sight input condition can be understood as a condition based on the operator's action on the document table 15 or on the magnitude of the pressing force against the document table 15.
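The pressing-force variants above can be combined into a single check. This is an illustrative sketch only: the threshold, tolerance, and range values are assumptions, and a real apparatus would obtain the force from a pressure sensor in the document table.

```python
def force_condition_met(force, cover_open,
                        threshold=None,
                        closed_cover_force=None, tolerance=1.0,
                        force_range=None):
    """Return True when the measured pressing force on the document
    table satisfies any of the line-of-sight input conditions described
    above; all of them require the document pressing member to be open."""
    if not cover_open:
        return False
    # Variant 1: force at or above a predetermined threshold.
    if threshold is not None and force >= threshold:
        return True
    # Variant 2: force differs from the pre-measured closed-cover force.
    if closed_cover_force is not None and abs(force - closed_cover_force) > tolerance:
        return True
    # Variant 3: force within a predetermined range.
    if force_range is not None and force_range[0] <= force <= force_range[1]:
        return True
    return False

print(force_condition_met(12.0, True, threshold=10.0))         # True: above threshold
print(force_condition_met(12.0, False, threshold=10.0))        # False: cover closed
print(force_condition_met(4.0, True, closed_cover_force=8.0))  # True: differs from measured force
print(force_condition_met(5.0, True, force_range=(3.0, 7.0)))  # True: within expected range
```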
In addition, in a case where the operator does not operate the user interface 13, there is a possibility that the operator is pressing the original against the original table 15 with both hands. Therefore, a condition that "the operator's hand (finger) is not on the user interface 13 for a predetermined time" may be defined as the line-of-sight input condition. The read operation information in this case is information on an operation performed on the user interface 13 by the operator, and is, for example, an image of the user interface 13 obtained by capturing with the 1 st camera 19. The condition determination unit 33 determines whether or not the hand of the operator is not on the user interface 13 in the image of the user interface 13. For example, when the image of the user interface 13 does not include image information of the hand of the operator for a predetermined period of time, the condition determination unit 33 determines that the reading operation information satisfies the line-of-sight input condition.
The line-of-sight input condition may be defined as "a state in which the document pressing member 14 is opened". That is, the "hands (fingers) of the operator are not on the user interface 13 for a predetermined time in the state where the document pressing member 14 is opened" may be used as the line-of-sight input condition. Thus, the line-of-sight input condition can be understood as a condition based on an action performed by the operator with respect to the user interface 13.
In addition, in general, reading of the original is performed in a state where the original pressing member 14 is closed. Therefore, when the document pressing member 14 is opened upward, the operator may press the document against the document table 15 with both hands. Therefore, for example, a condition that "the original pressing member 14 is opened for more than a predetermined time" may be defined as the line-of-sight input condition. The reading operation information in this case is information related to an operation performed by the operator on the document pressing member 14, and is, for example, information indicating whether the document pressing member 14 is opened or closed. The condition determination unit 33 determines whether or not the time for which the document pressing member 14 is opened exceeds a predetermined time in the reading operation information. In this way, the line-of-sight input condition can be understood as a condition based on the operation performed by the operator on the document pressing member 14.
Further, when the operator selects the setting for reading a page-spread document (hereinafter referred to as the book setting), the operator is expected to subsequently press the page-spread document against the document table 15 with both hands. Therefore, for example, "the book setting is selected" may be defined as the line-of-sight input condition. The reading operation information in this case is information on an operation performed by the operator on the user interface 13, in other words, information indicating the image reading condition (image reading mode) set by the operator. When the reading operation information indicates that the book setting is selected, the condition determination unit 33 determines that the reading operation information satisfies the line-of-sight input condition.
The present invention is not limited to the book setting; the line-of-sight input condition may be regarded as satisfied whenever the operator selects a specific image reading mode. The specific image reading mode differs from the normal image reading modes (for example, the copy function and the scanner function with standard settings) and is set in advance by the operator or the like.
In addition, the line-of-sight input condition may be regarded as satisfied when an original is set on the document table 15 after a specific image reading mode such as the book setting is selected. The reading operation information in this case is information about an operation performed by the operator on the document table 15, namely information indicating whether an original is placed on the document table 15. After the book setting (specific image reading mode) is selected, the condition determination unit 33 determines from the reading operation information whether an original is placed on the document table 15. When an original is placed on the document table 15, the document table 15 detects this, and the reading operation information acquisition unit 31 acquires the detection result as reading operation information.
Incidentally, when a specific image reading mode such as the book setting is selected, a line-of-sight input condition used in the normal image reading mode, such as "both hands of the operator are located on the document table 15", may also be used. That is, in the specific image reading mode, a condition such as "an original is set on the document table 15" may be used in place of, or in addition to, the line-of-sight input conditions of the normal image reading mode.
Next, a case where the read operator information satisfies the line-of-sight input condition will be described.
The sight line input condition in this case specifies the state of the operator assuming that both hands of the operator are occupied. The condition determination unit 33 determines whether or not the read operator information indicates the state of the operator specified by the line-of-sight input condition.
For example, when the operator is holding something, reading of an original may be performed while both hands are occupied, such as when the operator holds an object with both hands, or holds luggage with one hand while pressing the original with the other. Therefore, for example, a condition that "the operator is holding something" is defined as a line-of-sight input condition. The reading operator information in this case is an image of the operator captured by the 2nd camera 20. The condition determination unit 33 determines whether or not the operator is holding something in the image of the operator. The portion holding the object may also be specified; for example, line-of-sight input conditions such as "the operator is holding something with both hands" and "the operator is holding something with one hand" may be defined.
Similarly, if the hands (or arms) of the operator are injured, both hands may not be used when reading the document. Therefore, for example, a condition that "the hand (or arm) of the operator is injured" may be defined as the line-of-sight input condition. The read operator information in this case is an image of the operator obtained by photographing with the 2 nd camera 20. The condition determination unit 33 determines whether or not the hand of the operator is injured in the image of the operator. The injured portion may be defined, for example, by defining a sight line input condition such as "both hands of the operator are injured" or "one hand of the operator is injured".
In addition, for example, in the case where the operator is a handicapped person, both hands may not be usable when reading an original. Therefore, for example, a condition such as "the operator is a handicapped person" may be defined as the line-of-sight input condition. The reading operator information in this case is an image of the operator captured by the 2nd camera 20. The condition determination unit 33 determines whether the operator is a handicapped person in the image of the operator. The disabled portion may also be specified; for example, a line-of-sight input condition such as "the operator's hands are disabled" may be defined.
Here, information of an injured person and information of a handicapped person among operators who have a possibility of operating the image processing apparatus 10 may be registered in advance. Then, the condition determination unit 33 may compare the information of the operator specified by the authentication unit 30 with the information registered in advance, and determine whether the operator of the image processing apparatus 10 is injured, is a handicapped person, or the like. The read operator information in this case is information of the operator specified by the authentication section 30.
In addition, in the case where the condition determination section 33 determines whether or not the line-of-sight input condition is satisfied from the images captured by the 1 st camera 19 and the 2 nd camera 20, an existing image analysis technique may be used. For example, in the case of determining whether or not both hands of the operator are on the document table 15, images in the case where both hands are on the document table 15 are registered in advance. Next, the condition determination unit 33 compares the captured image of the document table 15 with the image registered in advance, and determines whether or not both hands of the operator are positioned on the document table 15. In addition, for example, in the case of determining whether or not the hand of the operator is injured, an image in the case where the hand is injured is registered in advance. Then, the condition determination unit 33 compares the captured image of the operator with the image registered in advance, thereby determining whether or not the hand of the operator is injured.
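A naive version of the pre-registered-image comparison could look like the following. A real system would use a proper image-analysis library; the mean-absolute-difference metric, the tiny sample images, and the threshold are illustrative assumptions only.

```python
def matches_registered(captured, registered, max_mean_diff=10.0):
    """Compare a captured grayscale image (given as a list of pixel
    rows) against a pre-registered reference image by computing the
    mean absolute pixel difference; a small difference counts as a match."""
    total, count = 0, 0
    for cap_row, reg_row in zip(captured, registered):
        for cap_px, reg_px in zip(cap_row, reg_row):
            total += abs(cap_px - reg_px)
            count += 1
    return total / count <= max_mean_diff

# Tiny 2x3 "images": near-identical pixels match, very different ones do not.
reference = [[10, 10, 10], [200, 200, 200]]
print(matches_registered([[12, 9, 10], [201, 198, 200]], reference))  # True
print(matches_registered([[90, 90, 90], [90, 90, 90]], reference))    # False
```

In practice the registered image would show, for example, both hands on the document table, and a match would mean the line-of-sight input condition is satisfied.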
Among the above-described line-of-sight input conditions, conditions such as "both hands of the operator are on the document table 15", "the operator is holding something", "the operator's hands are injured", "the operator's hands are disabled", and "the operator's hands are not on the user interface 13 for a predetermined time" relate to the state of the operator's hands. The reading operation information and the reading operator information in these cases can be understood as information indicating the state of the operator's hands. The condition determination unit 33 determines whether or not the reading operation information or the reading operator information indicates that the operator's hands are in the predetermined state.
The condition determination unit 33 may determine any one of the above-described line-of-sight input conditions, or may determine a plurality of the above-described line-of-sight input conditions. In the case where the condition determination unit 33 is to determine a plurality of conditions, the display control unit 34 may display the read input image when any one of the conditions is satisfied, or may display the read input image when 2 or more of the conditions are satisfied.
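The "any one" versus "two or more" policies just mentioned amount to counting satisfied conditions. A minimal sketch (function and variable names assumed):

```python
def show_read_input_image(condition_results, required=1):
    """condition_results holds one boolean per line-of-sight input
    condition evaluated by the condition determination unit; the read
    input image is displayed when at least `required` of them hold."""
    return sum(condition_results) >= required

# e.g. hands on table, cover open for a long time, book setting selected
checks = [True, False, True]
print(show_read_input_image(checks))              # True: at least one satisfied
print(show_read_input_image(checks, required=2))  # True: two are satisfied
print(show_read_input_image(checks, required=3))  # False
```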
Note that although the line-of-sight input conditions are defined on the assumption that both hands of the operator are occupied, in practice the read input image is displayed whenever a line-of-sight input condition is satisfied, even if the operator's hands are not actually occupied.
< Procedure of processing for displaying the read input image and accepting input >
Next, the procedure of processing for displaying the read input image and receiving an input based on the operator's line of sight will be described. Fig. 6 is a flowchart showing an example of this processing procedure.
First, the operation input receiving unit 35 determines whether or not the operator has selected a function of reading a document (for example, a copy function, a scanner function, etc.) (step 101). If the determination in step 101 is negative (no), the present processing flow ends. On the other hand, when the affirmative determination (yes) is made in step 101, a setting screen for receiving the setting of the function selected by the operator (the function of executing document reading) is displayed on the touch panel 130. Then, the operator operates on the setting screen, and the operation input receiving unit 35 receives a setting concerning the selected function (step 102).
After the operator's settings have been received, the condition determination unit 33 determines whether or not the reading operation information or the reading operator information satisfies the line-of-sight input condition (step 103). In other words, the condition determination unit 33 determines whether or not the line-of-sight input condition is satisfied based on the reading operator information acquired by the reading operator information acquisition unit 32 or the reading operation information acquired by the reading operation information acquisition unit 31.
If the affirmative determination (yes) is made in step 103, the display control unit 34 displays the read input image on the touch panel 130 (step 104). Next, the line-of-sight input receiving unit 36 determines whether or not the reading execution of the original document is received, based on the line of sight of the operator with respect to the read input image (step 105). In the case where the affirmative determination (yes) is made in step 105, reading of the original is performed (step 106). Then, the present processing flow is ended. On the other hand, if the determination in step 105 is negative (no), the routine proceeds to step 102.
If the negative determination (NO) is made in step 103, the operation input reception unit 35 determines whether or not the reading execution of the document has been received by the operation of the operator (step 107). In the case where the determination in step 107 is affirmative (yes), the process proceeds to step 106, whereby reading of the original is performed. On the other hand, if the determination in step 107 is negative (no), the routine proceeds to step 102.
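Steps 101 to 107 of fig. 6 can be condensed into a small decision function. The boolean parameters stand in for the interactive determinations; the names and return values are illustrative, not part of the patent.

```python
def fig6_flow(function_selected, condition_met, gaze_start, touch_start):
    """One pass through the fig. 6 flowchart. Returns the outcome:
    'end' (step 101 no), 'read' (step 106), or 'settings' (back to step 102)."""
    if not function_selected:          # step 101
        return "end"
    # step 102: the setting screen accepts the operator's settings here.
    if condition_met:                  # step 103: line-of-sight input condition
        # step 104: the read input image is displayed.
        if gaze_start:                 # step 105: gaze accepts reading execution
            return "read"              # step 106
        return "settings"
    if touch_start:                    # step 107: touch operation accepts reading
        return "read"                  # step 106
    return "settings"

print(fig6_flow(True, True, True, False))   # read (gaze input path)
print(fig6_flow(True, False, False, True))  # read (touch input path)
print(fig6_flow(True, True, False, False))  # settings
```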
Further, since the reading operation information is information about the operator's actions, it changes over time. Therefore, the determination in step 103 may use the reading operation information acquired at the time the determination is made. The reading operator information, on the other hand, is information about the operator himself or herself and does not change over time. It can therefore be acquired before the determination in step 103, for example at the time of the processing in steps 101 and 102.
In step 104, the display control unit 34 may display the read input image when the operator's line of sight is directed to the touch panel 130. That is, the following means may be adopted: when it is determined that the line-of-sight input condition is satisfied and it is determined that the line of sight of the operator is directed toward the touch panel 130, the display control section 34 displays the read input image.
In the above example, the condition determination unit 33 performs the determination of step 103 after the processing of step 102, but the present invention is not limited to this configuration. For example, it is considered that when reading an original with a standard setting (default setting), the operator does not input the setting in step 102. Therefore, for example, the processing of step 101 may be followed by the processing of step 103 and subsequent steps. For example, the processing of step 103 and subsequent steps may be performed while the setting of the operator is received in step 102.
Specific examples of processing for displaying the read input image and receiving an input
Next, a process of displaying a read input image and receiving an input based on the line of sight of the operator will be described with specific examples (embodiments 1 to 3).
Example 1
First, example 1 will be described. In embodiment 1, after the operator selects the copy function, the original platen 15 is pressed with both hands, whereby the read input image is displayed. That is, in this example, the line-of-sight input condition is a condition that "both hands of the operator are on the document table 15".
Fig. 7 is a flowchart showing the processing sequence of embodiment 1. Fig. 8 (a) to (c) are diagrams showing an example of the screen displayed in example 1. The process of example 1 will be described with reference to fig. 7 and 8.
First, on the touch panel 130, a menu screen 44 shown in fig. 8 (a) is displayed. In the menu screen 44, various icons showing functions executable in the image processing apparatus 10 are displayed. Here, the operation input receiving unit 35 determines whether or not the operator has selected a function of executing reading of the document (step 201). If the determination in step 201 is negative (no), the present processing flow ends. On the other hand, when the operator selects the icon 41 of the copy function shown in fig. 8 a, the operation input receiving unit 35 determines that the copy function is selected, and determines that the determination is affirmative in step 201 (yes).
When it is determined in step 201 that the copy function is selected (yes in step 201), the display control unit 34 displays the setting screen 45 shown in fig. 8 (b) as a setting screen for accepting the setting of the copy function. By the operator operating on the setting screen 45, the operation input receiving unit 35 receives settings concerning the copy function (step 202). Here, for example, the size of the paper sheet to be subjected to image formation, the number of output sheets, and the like are set. Next, the condition determination unit 33 determines whether or not both hands of the operator are on the document table 15 in the reading operation information acquired by the reading operation information acquisition unit 31 (step 203).
If the affirmative determination (yes) is made in step 203, the display control unit 34 displays the read input image 46 shown in fig. 8 (c) (step 204). The read input image 46 includes a "start" button 46A and a "return setting" button 46B. Here, the visual line input receiving unit 36 determines whether or not the operator's visual line is directed to the "start" button 46A (step 205).
In the case where the determination in step 205 is affirmative (yes), a copy function is executed (step 206). Then, the present processing flow is ended. On the other hand, if the determination in step 205 is negative (no), the routine proceeds to step 202. The negative determination at step 205 is, for example, a case where the operator's line of sight is directed to the "return setting" button 46B, or a case where the operator's line of sight is not directed to the "start" button 46A or the "return setting" button 46B for a certain period of time.
If the determination in step 203 is negative (no), the operation input receiving unit 35 determines whether or not the "start" button for executing the copy function has been pressed by the operation of the operator (step 207). The "start" button is a button different from the "start" button 46A for reading the input image 46, and is a button for receiving an input by a touch operation of the operator. If the determination in step 207 is affirmative (yes), the routine proceeds to step 206. On the other hand, if the determination in step 207 is negative (no), the routine proceeds to step 202.
Example 2
Next, example 2 will be described. In example 2, after the operator selects the scanning function, the read input image is displayed when the document table 15 is pressed with a force equal to or greater than a predetermined magnitude. That is, in this example, the line-of-sight input condition is that "the document table 15 is pressed with a force equal to or greater than a predetermined threshold value in a state where the document pressing member 14 is opened".
Fig. 9 is a flowchart showing the processing sequence of embodiment 2. Fig. 10 (a) to (c) are diagrams showing an example of the screen displayed in example 2. The process of example 2 will be described with reference to fig. 9 and 10.
First, on the touch panel 130, a menu screen 44 shown in fig. 8 (a) is displayed. Here, the operation input receiving unit 35 determines whether or not the operator has selected a function of executing reading of the document (step 301). If the determination in step 301 is negative (no), the present processing flow ends. On the other hand, when the operator selects the icon 42 of the scanning function shown in fig. 8 (a), the operation input receiving unit 35 determines that the scanning function is selected, and determines that the operation is affirmative in step 301 (yes).
When it is determined in step 301 that the scanning function is selected (yes in step 301), the display control unit 34 displays the setting screen 47 shown in fig. 10 (a) as a setting screen for accepting the settings of the scanning function. By the operator operating on the setting screen 47, the operation input receiving unit 35 receives settings concerning the scanning function (step 302). Here, for example, a color setting (color or black and white) and a setting of the transmission destination of the image data generated by the scanning function are received. Next, the condition determination unit 33 determines whether or not, in the reading operation information acquired by the reading operation information acquisition unit 31, the pressing force against the document table 15 is equal to or greater than a predetermined threshold value in a state where the document pressing member 14 is opened (step 303).
If the affirmative determination (yes) is made in step 303, the display control unit 34 displays the 1 st read input image 48 shown in fig. 10 (b) (step 304). The 1 st read input image 48 includes a "start" button 48A and a "return setting" button 48B. Here, the visual line input receiving unit 36 determines whether or not the operator's visual line is directed to the "start" button 48A (step 305).
In the case of an affirmative determination (yes) in step 305, the scanning function is executed (step 306). On the other hand, in the case of a negative determination (no) in step 305, the flow proceeds to step 302. The negative determination in step 305 is made, for example, when the operator's line of sight is directed to the "return setting" button 48B, or when the operator's line of sight is not directed to either the "start" button 48A or the "return setting" button 48B for a certain period of time.
After the scanning function is executed in step 306, the display control unit 34 displays the 2nd read input image 49 (step 307). The 2nd read input image 49 displays a "continue scan" button 49A and a "scan end/setting change" button 49B. Here, the line-of-sight input receiving unit 36 determines whether or not the operator's line of sight is directed to the "continue scan" button 49A (step 308). To continue scanning, the operator may simply place the next original (for example, the next page) on the document table 15.
In the case of an affirmative determination (yes) in step 308, the flow proceeds to step 306, and the scanning function is executed again. On the other hand, if the negative determination (no) is made in step 308, the setting screen 47 shown in fig. 10 (a) is displayed. The negative determination in step 308 is made, for example, when the operator's line of sight is directed to the "scan end/setting change" button 49B, or when the operator's line of sight is not directed to either the "continue scan" button 49A or the "scan end/setting change" button 49B for a certain period of time.
After the setting screen 47 is displayed, the operation input receiving unit 35 determines whether or not the operator has selected the end of the scanning function (step 309). If the determination in step 309 is affirmative (yes), the present processing flow is ended. On the other hand, in the case of a negative determination (no) in step 309, the flow proceeds to step 302.
If the negative determination (NO) is made in step 303, the operation input receiving unit 35 determines whether or not the "start" button for executing the scanning function has been pressed by the operation of the operator (step 310). The "start" button is a button different from the "start" button 48A of the 1 st read input image 48, and is a button for receiving an input by a touch operation of the operator. If the determination in step 310 is negative (no), the routine proceeds to step 302. On the other hand, in the case where the affirmative determination (yes) is made in step 310, the scanning function is executed (step 311).
Next, the operation input receiving unit 35 determines whether or not the "continue scan" button has been pressed by the operation of the operator (step 312). The "continue scan" button is also a button different from the "continue scan" button 49A of the 2 nd read input image 49, and is a button for receiving an input by a touch operation of the operator. In the case of an affirmative determination (yes) in step 312, the flow proceeds to step 302. On the other hand, if the negative determination (no) is made in step 312, the present processing flow ends.
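The continue-scan loop of steps 306 to 308 behaves like the sketch below, where each boolean indicates whether, after a scan, the operator's gaze selects "continue scan". This is an illustrative reduction of the flowchart, not the apparatus's actual control code.

```python
def scan_session(continue_decisions):
    """Execute one scan (step 306), then keep scanning while the
    operator's gaze is on the 'continue scan' button (step 308).
    Returns the number of originals scanned."""
    scanned = 0
    for wants_more in continue_decisions:
        scanned += 1            # step 306: scanning function executed
        if not wants_more:      # step 308: negative -> back to the setting screen
            break
    return scanned

print(scan_session([True, True, False]))  # 3: two continues, then stop
print(scan_session([False]))              # 1: a single scan
```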
Example 3
Next, example 3 will be described. In embodiment 3, after the operator selects the book setting, the original is placed on the original table 15, and thereby the read input image is displayed. That is, in this example, the condition of "document set on document table 15" is defined as a line-of-sight input condition in the case where the book setting is selected. Further, as an example of the book setting, a book copy function, which is a function for copying a page-spread document, may be selected.
Fig. 11 is a flowchart showing the processing sequence of embodiment 3. Fig. 12 (a) and (b) are diagrams showing an example of the screen displayed in example 3. The process of example 3 will be described with reference to fig. 11 and 12.
First, on the touch panel 130, a menu screen 44 shown in fig. 8 (a) is displayed. Here, the operation input receiving unit 35 determines whether or not the operator has selected a function of executing document reading (step 401). If the determination in step 401 is negative (no), the present processing flow ends. On the other hand, when the operator selects the icon 43 of the book copy function shown in fig. 8 a, the operation input receiving unit 35 determines that the book copy function is selected, and the determination in step 401 is affirmative (yes).
When it is determined in step 401 that the book copy function is selected (yes in step 401), the display control unit 34 displays the setting screen 50 shown in fig. 12 (a) as a setting screen for accepting the settings of the book copy function. By the operator operating on the setting screen 50, the operation input receiving unit 35 receives settings concerning the book copy function (step 402). On the setting screen 50, for example, a setting item such as "book two-sided" is displayed. "Book two-sided" is a setting item specifying whether a page-spread document is copied two-sided so that its page arrangement is preserved, and is a setting item unique to the book copy function that is not displayed on the setting screen 45 of the copy function shown in fig. 8 (b).
Next, the condition determination unit 33 determines whether or not a document is placed on the document table 15 (step 403). If an affirmative determination (YES) is made in step 403, the display control unit 34 displays the read input image 51 shown in Fig. 12(b) (step 404). The read input image 51 includes a "start" button 51A and a "return setting" button 51B. Here, the line-of-sight input receiving unit 36 determines whether or not the operator's line of sight is directed to the "start" button 51A (step 405).
If an affirmative determination (YES) is made in step 405, the book copy function is executed (step 406), and the processing flow ends. On the other hand, if the determination in step 405 is negative (NO), the process returns to step 402. A negative determination is made in step 405, for example, when the operator's line of sight is directed to the "return setting" button 51B, or when the operator's line of sight is not directed to either the "start" button 51A or the "return setting" button 51B for a certain period of time.
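The step-405 determination, in which a gaze must rest on a button or leave both buttons for a certain period, can be sketched as a dwell-time check. The `gaze_target` callback, the dwell and timeout durations, and the return values here are illustrative assumptions; the patent does not specify them.

```python
import time

DWELL_SECONDS = 1.0    # assumed dwell time; the text only says "a certain period"
TIMEOUT_SECONDS = 5.0  # assumed timeout for the gaze leaving both buttons

def decide_gaze_input(gaze_target, now=time.monotonic):
    """Poll gaze_target() until the operator dwells on a button or times out.

    gaze_target() is a hypothetical callback returning "start",
    "return_setting", or None for each gaze sample.
    """
    dwelt_on = None     # button currently being looked at
    dwell_start = None  # when the gaze settled on it
    deadline = now() + TIMEOUT_SECONDS
    while now() < deadline:
        target = gaze_target()
        if target != dwelt_on:
            # Gaze moved: restart the dwell timer on the new target.
            dwelt_on, dwell_start = target, now()
        elif target is not None and now() - dwell_start >= DWELL_SECONDS:
            return target  # sustained gaze -> accept this button
    return "timeout"       # corresponds to the NO branch of step 405
```

A "start" result would proceed to step 406, while "return_setting" or "timeout" returns the flow to step 402.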
If a negative determination (NO) is made in step 403, the operation input receiving unit 35 determines whether or not the "start" button for executing the book copy function has been pressed by the operator (step 407). This "start" button is different from the "start" button 51A in the read input image 51, and receives input based on a touch operation by the operator. If the determination in step 407 is affirmative (YES), the process proceeds to step 406. On the other hand, if the determination in step 407 is negative (NO), the process returns to step 402.
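The overall Fig. 11 sequence (steps 401 to 407) can be summarized in the following sketch. The `ui`, `sensor`, and `scanner` interfaces and their method names are hypothetical stand-ins for the units described in the text, not APIs taken from the patent.

```python
def book_copy_flow(ui, sensor, scanner):
    """Sketch of the Fig. 11 processing sequence (steps 401-407)."""
    if not ui.function_selected("book_copy"):          # step 401
        return                                         # NO -> flow ends
    while True:
        settings = ui.receive_settings()               # step 402
        if sensor.document_on_platen():                # step 403
            ui.show_read_input_image()                 # step 404
            gaze = ui.wait_gaze()                      # step 405
            if gaze == "start":
                scanner.execute_book_copy(settings)    # step 406
                return
            # gaze on "return setting" or timeout -> back to step 402
        else:
            if ui.start_button_pressed():              # step 407 (touch input)
                scanner.execute_book_copy(settings)    # step 406
                return
            # NO in step 407 -> back to step 402
```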
As described above, the image processing apparatus 10 according to the present embodiment displays the read input image and receives line-of-sight input from the operator for starting reading when the reading operation information or the reading operator information satisfies a predetermined condition. When reading a document, the operator may need to correct, for example, a shift in the position or angle of the document placed on the document table 15. Moreover, both hands of the operator may be occupied: for example, the operator may be pressing the document with both hands in order to read a page-spread document, may be holding something, or may have an injured hand. In such cases, if the operator takes a hand off the document in order to start the reading operation, reading may be performed with the document shifted out of position.
In contrast, with the image processing apparatus 10 according to the present embodiment, the read input image is displayed when the predetermined condition is satisfied, so the operator can start reading without taking a hand off the document. Therefore, compared with a configuration in which the operator performs the reading-related input manually, reading the document in a shifted position can be suppressed. Further, since the image processing apparatus 10 according to the present embodiment starts reading based on the operator's line of sight, reading can be performed at the timing desired by the operator more easily than in a configuration in which, for example, reading starts after a predetermined time has elapsed since the document was placed on the document table 15.
In the above example, a condition under which both hands of the operator are assumed to be occupied is used as the line-of-sight input condition, but the present invention is not limited to this configuration. In the present embodiment, any condition under which the operator's operation is assumed to be hindered may be set as the line-of-sight input condition, regardless of whether both hands of the operator are occupied. In other words, a condition under which operability is lower than when both hands of the operator are free may be used as the line-of-sight input condition. One such case is, for example, when one hand of the operator is occupied.
For example, when the operator presses a document with one hand while the other hand is free, it may be easier, depending on the arrangement position of the touch panel 130, for the operator to start reading by line-of-sight input than by operating the touch panel 130 with the free hand. Therefore, by setting in advance a condition under which one hand of the operator is assumed to be occupied as the line-of-sight input condition, reading of the document can be started by the operator's line of sight in such cases as well.
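The line-of-sight input conditions discussed above (both hands on the document table, the book setting with a document placed, and the optional one-hand-occupied extension) can be combined as a single predicate. The exact rule, including the `one_hand_policy` flag, is an assumption for illustration; the patent leaves the precise combination open.

```python
def line_of_sight_input_enabled(hands_on_platen, book_setting_selected,
                                document_on_platen, one_hand_policy=True):
    """Decide whether to display the read input image (hedged sketch).

    hands_on_platen: number of the operator's hands detected on the
    document table (0, 1, or 2).
    """
    if hands_on_platen >= 2:
        # Both hands occupied, e.g. pressing a page-spread document.
        return True
    if book_setting_selected and document_on_platen:
        # Embodiment 3: book setting selected and a document is placed.
        return True
    if one_hand_policy and hands_on_platen == 1:
        # Extension discussed in the text: one hand occupied.
        return True
    return False
```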
The program for realizing the embodiments of the present invention may, of course, be provided via a communication unit, and may also be provided by being stored in a recording medium such as a CD-ROM.
In addition, although various embodiments and modifications (examples) have been described above, these embodiments and modifications (examples) may of course be combined.
The present disclosure is not limited to any of the embodiments described above, and can be implemented in various ways within a scope not departing from the gist of the present disclosure.

Claims (7)

1. An image processing apparatus, comprising:
an image reading unit that reads an image of an original according to an instruction of an operator;
an acquisition unit that acquires information on an operator who reads a document, or information on an operation performed by the operator on a document table on which the document is placed or on a document pressing member provided so as to be openable and closable for pressing the document placed on the document table, in order to read the document; and
a display unit that, when the information acquired by the acquisition unit indicates that both hands of the operator are positioned on the document table, performs display for receiving from the operator a line-of-sight input for starting reading of the document.
2. The image processing apparatus according to claim 1, wherein,
the acquisition unit acquires the magnitude of a pressing force applied to the document table,
the display unit performs the display when the magnitude of the pressing force is equal to or greater than a predetermined threshold value in a state where the document pressing member is open.
3. The image processing apparatus according to claim 1, wherein,
the acquisition unit acquires information indicating opening and closing of the document pressing member,
the display unit performs the display when the document pressing member has been open for longer than a predetermined time.
4. The image processing apparatus according to claim 1, wherein,
the acquisition unit acquires information indicating a state of a hand of the operator,
the display unit performs the display when the information acquired by the acquisition unit indicates that the state of the hand of the operator is a predetermined state.
5. The image processing apparatus according to claim 4, wherein,
the image processing apparatus further comprises a photographing unit that photographs the document table on which the document is placed,
the acquisition unit acquires an image captured by the photographing unit as the information indicating the state of the hand of the operator.
6. The image processing apparatus according to claim 1, wherein,
the acquisition unit acquires image reading conditions set by an operator,
the display unit performs the display when the image reading condition acquired by the acquisition unit indicates that a setting has been made to read a document that is composed of a plurality of sheets and has a portion where the pages are joined.
7. The image processing apparatus according to claim 6, wherein,
the display unit performs the display when a document is placed on the document table after the setting has been made.
CN201811208072.XA 2018-10-17 2018-10-17 Image processing apparatus and method Active CN111064852B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811208072.XA CN111064852B (en) 2018-10-17 2018-10-17 Image processing apparatus and method


Publications (2)

Publication Number Publication Date
CN111064852A CN111064852A (en) 2020-04-24
CN111064852B true CN111064852B (en) 2024-04-05

Family

ID=70296904

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811208072.XA Active CN111064852B (en) 2018-10-17 2018-10-17 Image processing apparatus and method

Country Status (1)

Country Link
CN (1) CN111064852B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103813044A (en) * 2012-11-06 2014-05-21 柯尼卡美能达株式会社 Guidance information display device
JP2015028733A (en) * 2013-07-30 2015-02-12 コニカミノルタ株式会社 Operation device and image processing apparatus
CN105765513A (en) * 2013-11-01 2016-07-13 索尼公司 Information processing device, information processing method, and program



Similar Documents

Publication Publication Date Title
JP5787099B2 (en) Guidance information display device
US10642555B2 (en) Image processing system, method, and non-transitory computer readable medium
US10965837B2 (en) Authentication device and authentication method
US9380179B2 (en) AR display device in which an image is overlapped with a reality space, AR display control device, print condition setting system, print system, print setting display method, and non-transitory computer-readable recording medium
US10205853B2 (en) Authentication apparatus, image forming apparatus, authentication method, and image forming method
US20160269578A1 (en) Head mounted display apparatus and method for connecting head mounted display apparatus to external device
US10104258B2 (en) Information processing apparatus and image processing apparatus including user gaze based shifting from a first state to a second state having a smaller electric power consumption
JP6974032B2 (en) Image display device, image forming device, control program and control method
JP6809022B2 (en) Image display device, image forming device, and program
JP2013045125A (en) Input display device, image forming device, imaging device, and program
US20080170044A1 (en) Image Printing Apparatus and Method for Processing an Image
CN109788155B (en) Image reading apparatus and method
CN111064852B (en) Image processing apparatus and method
JP2021193780A (en) Image reader, image reading method and image reading program
JP6946853B2 (en) Image processing device
US9473670B2 (en) Peripheral with image processing function
JP2014219712A (en) Operation display device
JP6245690B2 (en) Image forming apparatus
US10116809B2 (en) Image processing apparatus, control method, and computer-readable storage medium, which obtains calibration image information with which to correct image data
US20180091677A1 (en) Image display apparatus, image forming apparatus, and non-transitory computer readable medium
US10289367B2 (en) Image forming apparatus
EP3282388A1 (en) Authentication apparatus for carrying out authentication based on captured image, authentication method and server
US20230297179A1 (en) Information processing apparatus, non-transitory computer readable medium storing program, and information processing method
JP6777039B2 (en) Image reader and image forming device
JP6780603B2 (en) Image reader and image forming device

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: Tokyo, Japan

Applicant after: Fujifilm Business Innovation Corp.

Address before: Tokyo, Japan

Applicant before: Fuji Xerox Co.,Ltd.

SE01 Entry into force of request for substantive examination
GR01 Patent grant