WO2024047801A1 - Image processing device and communication system - Google Patents


Info

Publication number
WO2024047801A1
Authority
WO
WIPO (PCT)
Prior art keywords
image processing
processing device
user
biometric information
authentication
Prior art date
Application number
PCT/JP2022/032789
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Oka
Hirofumi Suzuki
Shigeki Takaya
Masatoshi Go
Original Assignee
Kyocera Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kyocera Corporation
Priority to PCT/JP2022/032789 (WO2024047801A1)
Priority to JP2023550614A (JP7408027B1)
Publication of WO2024047801A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J29/00Details of, or accessories for, typewriters or selective printing mechanisms not otherwise provided for
    • B41J29/38Drives, motors, controls or automatic cut-off devices for the entire printing mechanism
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B41PRINTING; LINING MACHINES; TYPEWRITERS; STAMPS
    • B41JTYPEWRITERS; SELECTIVE PRINTING MECHANISMS, i.e. MECHANISMS PRINTING OTHERWISE THAN FROM A FORME; CORRECTION OF TYPOGRAPHICAL ERRORS
    • B41J29/00Details of, or accessories for, typewriters or selective printing mechanisms not otherwise provided for
    • B41J29/42Scales and indicators, e.g. for determining side margins
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N1/00Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof

Definitions

  • the present disclosure relates to an image processing device having at least one of a printer and a scanner, and a communication system including the image processing device.
  • There is known an image processing device (or a communication system including the image processing device) that performs biometric authentication and, according to the authentication result, restricts the functions available to the user or removes such restrictions (for example, Patent Document 1 listed below). Patent Document 1 generally discloses the following two types of communication systems.
  • an image processing device acquires biometric information of a user, transmits it to a server, and causes the server to register the biometric information as verification biometric information.
  • the image processing device allows the user to input biometric information and transmits the input biometric information to the server.
  • the server performs authentication by comparing the received biometric information with the above-mentioned verification biometric information, and returns the authentication result to the image processing device.
  • an image processing device acquires biometric information of a user and registers the biometric information as verification biometric information in a table held by the image processing device.
  • the image processing device allows the user to input biometric information, and performs authentication by comparing the input biometric information with the verification biometric information registered in the above table.
  • the image processing apparatus periodically acquires update information of tables held by other image processing apparatuses and reflects the information in its own table.
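  • The server-based type described first can be sketched roughly as follows. This is a minimal illustration of the registration and authentication exchange only; the class and method names (`AuthServer`, `Mfp`, `enroll`, `login`) are hypothetical, since the document specifies no concrete interface, and simple equality stands in for real biometric matching.

```python
class AuthServer:
    """Holds verification biometric information for each user."""

    def __init__(self):
        self.verification = {}  # user -> verification biometric information

    def register(self, user, biometric):
        self.verification[user] = biometric

    def authenticate(self, user, biometric):
        # The server compares the received biometric information with the
        # registered verification biometric information.
        return self.verification.get(user) == biometric


class Mfp:
    """An image processing device that delegates authentication to the server."""

    def __init__(self, server):
        self.server = server

    def enroll(self, user, biometric):
        # Acquire the user's biometric information and have the server
        # register it as verification biometric information.
        self.server.register(user, biometric)

    def login(self, user, biometric):
        # Transmit the input biometric information; the server returns
        # the authentication result.
        return self.server.authenticate(user, biometric)


server = AuthServer()
mfp = Mfp(server)
mfp.enroll("userB", b"fingerprint-template")
print(mfp.login("userB", b"fingerprint-template"))  # True
print(mfp.login("userB", b"other-template"))        # False
```

The second type keeps the table on each device instead; a sketch of that local variant appears further below with the management table DT0.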
  • An image processing device includes an image processing section, an input section, a memory, and a control section.
  • the image processing section includes at least one of a printer and a scanner.
  • the input unit receives the user's biometric information.
  • the memory stores first biometric information for each user.
  • the control unit controls the removal of restrictions on functions related to the image processing section based on an authentication result obtained by comparing the second biometric information input to the input section with the first biometric information stored in the memory.
  • the image processing device may also control the removal of restrictions on functions related to the image processing section based on an authentication result obtained by comparing the second biometric information of a first user with the first biometric information of the first user imported from another image processing device in response to input of information specifying the first user.
  • the image processing device may also export the second biometric information of the first user to another image processing device, receive from the other image processing device an authentication result based on the second biometric information of the first user, and control the removal of restrictions on functions related to the image processing unit based on the received authentication result.
  • a communication system includes the image processing device and the other image processing device.
  • FIG. 1 is a schematic diagram for explaining an overview of an embodiment.
  • FIG. 2 is a schematic diagram for explaining the configuration of an image processing device and a network according to an embodiment.
  • FIG. 3 is a schematic diagram showing a hardware configuration related to a signal processing system of an image processing device according to an embodiment.
  • FIG. 4 is a flowchart illustrating an example of a procedure of processing executed by the image processing apparatus according to the first embodiment.
  • FIGS. 5A, 5B, and 5C are schematic diagrams showing examples of images displayed on the screen of the image processing device according to the first embodiment.
  • FIGS. 6A and 6B are schematic diagrams showing other examples of images displayed on the screen of the image processing apparatus according to the first embodiment.
  • FIG. 7 is a flowchart showing details of some procedures in FIG. 4.
  • FIG. 9 is a flowchart illustrating an example of a procedure of processing executed by the image processing apparatus according to the second embodiment.
  • FIG. 10 is a flowchart showing details of some procedures in FIG. 9.
  • FIG. 11 is a functional block diagram for explaining a communication system according to a third embodiment.
  • FIGS. 12A and 12B are schematic diagrams for explaining a specific example of releasing restrictions based on authentication results.
  • FIG. 13 is a flowchart for explaining the release of restrictions related to VPN connection.
  • The term "biological information" sometimes refers to the information itself about characteristics that actually appear on a person (from another point of view, information that does not depend on the detection method), sometimes to the raw information obtained by detecting the above-mentioned characteristics, sometimes to feature information extracted from the raw information, and in other cases to information processed from the raw information or the feature information according to the purpose of use. Examples of the processed information include information obtained by encrypting feature amounts.
  • The term "authentication" sometimes refers to the act of confirming the legitimacy of an object, and sometimes to the fact or state that the legitimacy has been confirmed through such an act.
  • The fact that the legitimacy has been confirmed is sometimes expressed as successful authentication, and the fact that the legitimacy cannot be confirmed as failed authentication.
  • the "authentication state” refers to a state where authenticity has been confirmed, or a state where it is regarded as such.
  • network sometimes refers to a communication network, and sometimes refers to a combination of a communication network and devices connected to the communication network. The same holds true for the lower-level concept of network. Examples of the sub-concept terms of network are the Internet, public network, private network, LAN (Local Area Network), and VPN (Virtual Private Network).
  • VPN sometimes refers to a technology that virtually extends a private network to a public network, and sometimes refers to a network using this technology.
  • VPN may be appropriately used to refer to technical matters related to VPN.
  • A connection established for communication using a VPN is sometimes referred to as a VPN connection, and establishing such a connection is sometimes referred to as connecting to a VPN.
  • connection can refer to a connection established through authentication (for example, a three-way handshake) (a connection in a narrow sense), or a connection that simply means that communication is possible (a connection in a broad sense).
  • Note that a state in which devices are electrically (from another point of view, physically) connected to each other by cables, but in which any communication is prohibited in terms of software (from another point of view, logically), is not regarded as an established connection.
  • FIG. 1 is a schematic diagram for explaining an overview of a communication system 1 according to an embodiment.
  • the communication system 1 includes a plurality of (three illustrated in FIG. 1) image processing devices 3MA, 3MB, and 3MC that are communicably connected to each other via the network 10.
  • The image processing devices 3MA to 3MC (and 3A to 3C in FIG. 2) may be referred to as the image processing device 3 (the reference numeral used in FIG. 3, etc.) without distinguishing them from each other.
  • Image processing device 3 includes at least one of a printer and a scanner.
  • Each image processing device 3 authenticates the user, and based on the authentication result, limits the functions available to the user or cancels the limits. For example, if the authentication is successful, the image processing device 3 allows the user to use a predetermined function (for example, printing) (removes the restriction on the function). Conversely, if the authentication fails, the image processing device 3 does not allow the user to use the predetermined function (restricts the function).
  • A term such as "control of removal of restrictions on functions" may be used for a concept that includes both restricting functions and removing restrictions on functions.
  • a term such as “control of cancellation of authentication state” may be used for a concept that includes both maintaining an authentication state in which authentication has been successful and canceling the authentication state.
  • the image processing devices 3MA, 3MB, and 3MC each have management tables DT0A, DT0B, and DT0C to realize authentication.
  • The management tables DT0A, DT0B, and DT0C may be referred to as the management table DT0 without distinguishing them from each other.
  • the management table DT0 holds various information in association with each user.
  • "ID”, "password”, "biometric information 1", “biometric information 2", and “functional restriction” are illustrated as various pieces of information.
  • the combination of ID (identification) and password is sometimes referred to as account information.
  • The management table DT0 may be able to store two or more types of biometric information for each user; in FIG. 1, two types, specifically fingerprints and faces, are illustrated. In the description of the embodiments, for convenience, explanations may be given that ignore the fact that two or more types of biometric information can be stored (descriptions based on the assumption that only one type of biometric information is stored). "Functional restriction" is, for example, information that directly or indirectly indicates a function whose restriction is lifted (from another perspective, a function which is restricted).
  • When the image processing device 3 is used by a user, it requests the user to input biometric information. Then, when biometric information (second biometric information) is input, the image processing device 3 determines whether biometric information (first biometric information) matching the input biometric information is registered in its own management table DT0. If it is registered, the authentication is deemed successful and the restriction on the function is removed.
  • the restriction cancellation at this time may or may not be different between users.
  • information related to "functional limitations" associated with biometric information (first biometric information) that matches the input biometric information may be referred to.
  • the management table DT0 does not need to have information regarding "functional limitations" for each user.
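  • The management table DT0 and the local authentication flow described in the bullets above can be sketched as follows. The record layout and the equality-based matcher are illustrative assumptions (a real device would score template similarity against a threshold), and the names `UserRecord`, `authenticate`, and `unlocked_functions` are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    # One row of the management table DT0: ID, password,
    # biometric information 1 and 2, and the functional restriction.
    user_id: str
    password: str
    biometric1: bytes          # e.g. fingerprint template
    biometric2: bytes          # e.g. face template
    unlocked_functions: set = field(default_factory=set)

def matches(first, second):
    # Placeholder matcher; a real device would score template similarity.
    return first == second

class ImageProcessingDevice:
    def __init__(self):
        self.table = {}        # management table DT0: user_id -> UserRecord

    def register(self, record):
        self.table[record.user_id] = record

    def authenticate(self, second_biometric):
        # Compare the input (second) biometric information with the
        # first biometric information registered for each user.
        for record in self.table.values():
            if matches(record.biometric1, second_biometric):
                return record.unlocked_functions  # restriction removed
        return set()                              # authentication failed

device = ImageProcessingDevice()
device.register(UserRecord("userB", "pw", b"fp-B", b"face-B", {"print", "scan"}))
print(sorted(device.authenticate(b"fp-B")))   # ['print', 'scan']
print(sorted(device.authenticate(b"fp-X")))   # []
```

Returning an empty set for an unmatched input mirrors the behavior described above: on authentication failure, the predetermined functions simply remain restricted.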
  • the management tables DT0A, DT0B, and DT0C are different from each other.
  • That is, the management tables DT0A, DT0B, and DT0C hold information such as account information and biometric information for mutually different users (from another point of view, accounts).
  • For example, when user B, who is registered in the management table DT0B, uses the image processing device 3MB, the image processing device 3MB succeeds in authentication by referring to its own management table DT0B.
  • user B can use the predetermined function for which the restriction has been lifted.
  • the image processing device 3MA authenticates the user B by performing one of the following two methods, for example. This allows user B to use predetermined functions on image processing device 3MA.
  • In the first method, the image processing device 3MA imports user B's biometric information (first biometric information) from the image processing device 3MB. Then, the image processing device 3MA performs authentication by comparing the biometric information of user B that it has itself detected (second biometric information) with the imported biometric information of user B.
  • In the second method, the image processing device 3MA exports user B's biometric information (second biometric information) detected by the image processing device 3MA to the image processing device 3MB.
  • the image processing device 3MB then performs authentication by comparing the exported biometric information of the user B with the biometric information (first biometric information) of the user B that it has stored. Thereafter, the image processing device 3MB notifies the image processing device 3MA of the authentication result.
  • As a result, user B can have the restriction on the predetermined function lifted not only in the image processing device 3MB in which his or her biometric information (first biometric information) is registered, but also in the image processing device 3MA (in other words, in other image processing devices 3).
  • user B's convenience is improved.
  • biometric information may be imported (exported) as needed, so the burden on the network 10 is reduced compared to a mode in which biometric information is always exported to a server.
  • the above operation of the image processing device 3MA may or may not be possible in an image processing device 3 other than the image processing device 3MA.
  • the above operation of the image processing device 3MB may or may not be possible in an image processing device 3 other than the image processing device 3MB.
  • the image processing device 3 may be configured to be able to execute only one of the first method and the second method, or may be configured to be able to selectively execute both.
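  • The first method (importing the registered first biometric information and matching locally) and the second method (exporting the detected second biometric information and receiving the authentication result from the other device) can be contrasted in a short sketch. All names here are hypothetical, and simple equality again stands in for biometric matching.

```python
class Device:
    def __init__(self, name, registered=None):
        self.name = name
        self.registered = registered or {}   # user -> first biometric information

    # --- first method: device 3MA imports and matches locally ---
    def export_registered(self, user):
        # Called on 3MB: hand over the stored first biometric information.
        return self.registered.get(user)

    def authenticate_by_import(self, user, detected, peer):
        first = peer.export_registered(user)             # import from 3MB
        return first is not None and first == detected   # match locally

    # --- second method: device 3MA exports, 3MB authenticates ---
    def authenticate_remote(self, user, detected):
        # Called on 3MB: compare against its own management table.
        return self.registered.get(user) == detected

    def authenticate_by_export(self, user, detected, peer):
        return peer.authenticate_remote(user, detected)  # result returned by 3MB

dev_3MB = Device("3MB", {"userB": b"fp-B"})
dev_3MA = Device("3MA")

detected = b"fp-B"   # second biometric information detected at 3MA
print(dev_3MA.authenticate_by_import("userB", detected, dev_3MB))  # True
print(dev_3MA.authenticate_by_export("userB", detected, dev_3MB))  # True
```

Note the trade-off the two methods imply: the first moves registered templates between devices, while the second moves only the freshly detected sample and returns a result.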
  • Contents:
    1. Communication system 1 in general (FIG. 2)
    1.1. Information used by communication system 1
    1.1.1. Biological information
    1.1.2. Account information
    1.1.3. Authentication information
    1.2. Overall configuration of communication system 1
    1.3. Overview of each communication device
    1.4. Connection mode of communication equipment
    2. Configuration of image processing device 3 (FIG. 3)
    2.1. Overall configuration of image processing device 3
    2.2. Printer
    2.3. Scanner
    2.4. UI (User Interface) section
    2.4.1. Operation unit
    2.4.2. Display section
    2.5. Input section into which biological information is input
    2.6. Communication unit
    2.7. Control unit
    2.8. Connector
    2.9. Others
    3. Information saved
    3.1. Management table DT0
    3.2. Preservation of biological information
    4.
  • a mode in which all the image processing devices 3 are capable of both the operation of the image processing device 3MA and the operation of the image processing device 3MB may be taken as an example.
  • Reference symbols may be attached to the image processing devices 3MA and 3MB and to user B's data.
  • the user registered in the image processing device 3MA may be referred to as user A (for convenience, the data of user A is given a reference numeral).
  • the biometric information used by the image processing device 3 for authentication may be of various types, and may be information used in known biometric authentication, for example.
  • The biometric information may be information about the user's physical characteristics or information about the user's behavioral characteristics. Specific examples of physical characteristics include fingerprints, palm shape, retina (patterns of blood vessels, etc.), iris (distribution of shading values, etc.), face, blood vessels (patterns in specific parts such as fingers), ear shape, voice (such as voiceprints), and body odor.
  • Examples of behavioral characteristics include handwriting.
  • the account information includes, for example, information for identifying a user (hereinafter sometimes abbreviated as "ID"). Additionally, the account information may include a password. In the description of the embodiments, a mode in which account information includes an ID and a password may be taken as an example unless otherwise specified. However, as long as there is no contradiction, the word account information may be replaced with the word ID (without a password) or the word ID and password.
  • Information for indicating the validity of a user may be referred to as authentication information.
  • Biometric information and account information are types of authentication information.
  • other authentication information may be used in addition to or in place of biometric information and account information.
  • The term authentication information may refer to authentication information other than the biometric information stored in the management table DT0.
  • Authentication information includes biometric information and account information, as well as static keys, common keys, private keys (or public keys), and electronic certificates.
  • The authentication information may be the information itself that is transmitted from a communication device requesting authentication (for example, the image processing device 3MA) to a communication device performing authentication (for example, the image processing device 3MB), or it may be used to generate the information to be sent to the communication device performing authentication.
  • the former includes account information, static keys, electronic certificates, information obtained from security tokens, and biometric information.
  • the latter includes a common key and a private key (or public key). Both the former and the latter may be used as authentication information.
  • the term "information based on authentication information” is sometimes used as a general concept of authentication information as the transmitted information itself and information generated based on the authentication information and transmitted.
  • In other words, the authentication information stored by the communication device requesting authentication and the authentication information stored by the communication device performing authentication do not have to be the same information. Further, the authentication information stored in the communication device requesting authentication may be appropriately processed and then sent to the communication device performing authentication. Challenge-response authentication may be performed as one such mode.
  • In the description of the embodiments, however, the authentication information before processing and the authentication information after processing are expressed as the same thing, unless otherwise specified or unless there is a contradiction. In other words, differences in the format and/or accuracy of the information are ignored, and if the information content indicating the validity of the user is the same, the information is expressed as the same information. Therefore, for example, even when it is said that the authentication information stored in a first communication device matches the authentication information stored in a second communication device, the authentication information may in reality be processed in the course of transmission to the second communication device, so that the authentication information stored in the first communication device and the authentication information stored in the second communication device may differ in format, etc.
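  • As one example of transmitting information generated from stored authentication information rather than the authentication information itself, the challenge-response authentication mentioned above can be sketched as follows. The use of HMAC-SHA-256 over a random challenge is an assumption for illustration; the document does not specify any particular scheme.

```python
import hashlib
import hmac
import os

SHARED_SECRET = b"account-password-or-key"   # held by both devices (assumed)

def make_challenge():
    # The authenticating device (e.g. 3MB) issues a one-time random challenge.
    return os.urandom(16)

def respond(challenge, secret):
    # The requesting device (e.g. 3MA) sends a value derived from its stored
    # authentication information, never the authentication information itself.
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def verify(challenge, response, secret):
    # The authenticating device recomputes the expected response and compares
    # in constant time.
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = make_challenge()
response = respond(challenge, SHARED_SECRET)
print(verify(challenge, response, SHARED_SECRET))    # True
print(verify(challenge, response, b"wrong-secret"))  # False
```

This illustrates why the information stored on the two devices need not match in format: only the derived response travels over the network, and a fresh challenge makes each response single-use.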
  • In the description of the embodiments, account information will mainly be taken as an example of the authentication information. Further, unless otherwise specified, explanations may be given assuming that the authentication information is account information.
  • the communication system 1 includes at least two image processing devices 3.
  • the two image processing devices 3 may have various configurations, and the configuration of the network 10 connecting the two image processing devices 3 may also be arbitrary.
  • FIG. 2 is a diagram for explaining an example of the configuration of each image processing device 3 and the network 10.
  • In FIG. 2, three image processing devices 3A to 3C are illustrated.
  • the three image processing devices 3A to 3C have different positions relative to the public network 11 and the private networks 13A and 13B, for example.
  • Each of the two image processing devices 3MA and 3MB included in the communication system 1 may be, for example, any one of the image processing devices 3A to 3C shown in FIG. 2.
  • two image processing devices 3C may be provided, and these two image processing devices 3C may be image processing devices 3MA and 3MB.
  • the network 10 may be the private network 13A.
  • The image processing devices 3A and 3C may serve as the image processing devices 3MA and 3MB.
  • network 10 may include private network 13A and public network 11. The configuration of the image processing devices 3MA and 3MB in relation to the network will be supplemented later in Section 1.4.
  • FIG. 2 may be taken as a diagram showing the communication system 1. Below, for convenience, the configuration shown in FIG. 2 may be referred to as a communication system 1.
  • the communication system 1 may include communication devices other than the image processing device 3 as appropriate.
  • servers 5 and 7 and terminals 9A, 9B and 9C are illustrated.
  • the terminals 9A to 9C may be referred to as the terminal 9 (representatively, the terminal 9A is given the reference numeral) without distinguishing them.
  • the communication system 1 may be defined only by at least two image processing devices 3 (3MA and 3MB). Furthermore, the communication system 1 may be defined to include communication devices (for example, servers 5 and 7 and terminal 9) other than the image processing device 3MB that can communicate with the image processing device 3MA. Furthermore, the communication system 1 may be defined to include a private network (13A or 13B) including two image processing devices 3MA and 3MB. However, in any case, the communication system 1 may be defined without the public network 11.
  • the image processing device 3 includes at least one of a printer and a scanner, as described above. The following description will mainly take as an example a mode in which the image processing device 3 includes both a printer and a scanner.
  • the image processing device 3 may or may not be a multi-function product/printer/peripheral (MFP). Note that in the drawings, the image processing device 3 may be referred to as "MFP" for convenience.
  • the image processing device 3 may be capable of executing one or more of printing, scanning, copying, FAX transmission, and FAX reception (although these are not necessarily separable concepts), for example.
  • the method of operating the image processing device 3 is arbitrary.
  • the image processing device 3A may be installed in a store such as a convenience store and used by an unspecified number of users.
  • the image processing device 3B may be installed in a private home and used by a specific and small number of users (for example, one person).
  • the image processing device 3C may be installed in a company and used by a specific number of users.
  • the server 5 may, for example, contribute to the authentication of the user who uses the image processing device 3, as described in the third embodiment (Section 6). Further, for example, the server 5 may function as a VPN server (Section 7.4). Further, for example, the server 5 may perform ECM (Enterprise Content Management) regarding the private network 13A. Further, unlike the example of FIG. 1, the server 5 may hold information regarding the restriction of functions for each user to assist in controlling the restriction of functions in the image processing device 3.
  • the server 7 may provide various services.
  • server 7 may be a file server, a mail server and/or a web server.
  • the file server may store, for example, data of an image printed by the image processing device 3 or data scanned by the image processing device 3.
  • the mail server may deliver mail printed by the image processing device 3 or mail containing an image scanned by the image processing device 3.
  • the web server may execute web services performed through communication with the image processing device 3.
  • In FIG. 2, each of the servers 5 and 7 is represented by one computer. However, one server may be realized by a plurality of distributed computers. A plurality of computers making up one server may be directly connected, included in one LAN, or included in mutually different LANs. Note that the servers 5 and 7 may be configured by one computer. Moreover, the servers 5 and 7 may be regarded as one server, regardless of whether they are configured by one computer or not.
  • the terminal 9 may be of any appropriate type.
  • terminals 9A and 9B are depicted as laptop-type PCs (personal computers).
  • Terminal 9C is depicted as a smartphone.
  • the terminal 9 may be, for example, a desktop PC or a tablet PC.
  • the terminal 9 can be operated in any manner.
  • The terminal 9 may be one that is used by one or more specific users, such as a terminal owned by a company or a terminal owned by an individual, or one that is used by an unspecified number of users, such as a terminal at an Internet cafe.
  • the public network 11 is a network open to the outside (for example, an unspecified number of communication devices). The specific aspect may be determined as appropriate.
  • the public network 11 may include the Internet, a closed network provided by a telecommunications carrier, and/or a public telephone network.
  • the private networks 13A and 13B are networks that are not disclosed to the outside.
  • Private network 13A and/or 13B may be, for example, a LAN.
  • the LAN may be, for example, a network within the same building. Examples of the LAN include those using Ethernet (registered trademark) and Wi-Fi (registered trademark). Further, the private network 13A and/or 13B may be an intranet.
  • Transmission and/or reception of signals by the communication device may be performed via a wire or wirelessly. Further, the communication device (for example, the image processing device 3) may communicate with the public network 11 without being included in the private network, or may be included in the private network. A communication device (for example, the image processing device 3) included in the private network may communicate only within the private network, or may communicate with the public network 11 via the private network.
  • multiple communication devices may be connected to each other in various ways.
  • In the example of FIG. 2, the connection modes are as follows.
  • the image processing device 3A has not constructed a private network.
  • the image processing device 3A is capable of communicating with the public network 11 without going through a private network by including a router or the like (not shown) or by being connected to a router or the like.
  • the image processing device 3A may be able to communicate with a terminal 9 (not shown in FIG. 1) that is directly connected to the image processing device 3A by wire. Further, the image processing device 3A may be capable of short-range wireless communication with a terminal 9 (not shown in FIG. 2) placed near the image processing device 3A.
  • the image processing device 3B and the terminal 9B are connected to each other by a private network 13B. More specifically, both are connected via the router 15 (its hub). The image processing device 3B and the terminal 9B can communicate with the public network 11 via the router 15 and the like.
  • the image processing device 3C, server 5, server 7, and terminal 9A are connected to each other by a private network 13A.
  • the image processing device 3C, the server 7, and the terminal 9A can communicate with the public network 11 via the server 5, for example.
  • the server 5 may include a router or the like, or a router (not shown) or the like may be provided between the server 5 and the public network 11.
  • the terminal 9C communicates wirelessly with the public telephone network. Furthermore, the terminal 9C communicates with the public network 11 including the public telephone network.
  • the two image processing devices 3MA and 3MB that perform authentication according to the embodiment may be, for example, two image processing devices 3C included in the same private network 13A. Further, the image processing devices 3MA and 3MB may be two image processing devices 3 included in the same VPN. In this case, each of the image processing devices 3MA and 3MB may be any of the image processing devices 3A to 3C. Further, the image processing devices 3MA and 3MB may be two image processing devices 3 connected via a public network 11 (not a VPN). Also in this case, each of the image processing devices 3MA and 3MB may be any of the image processing devices 3A to 3C.
  • when the two image processing devices 3MA and 3MB are included in the same VPN, for example, authentication using information other than the above biometric information may be performed before the biometric authentication according to the embodiment (before importing or exporting biometric information), and the image processing devices 3MA and 3MB may be connected to the VPN if that authentication is successful. For the authentication for the VPN connection, authentication information for individual users or authentication information assigned to the image processing device 3 may be used. Separately from the above, when the image processing apparatuses 3MA and 3MB connected via the public network 11 are connected via a VPN established by the biometric authentication according to the embodiment, or by another authentication performed after the biometric authentication, the devices 3MA and 3MB may be regarded as being connected via the public network 11 (not a VPN).
  • the connection mode of the communication equipment and the operating method of the communication equipment (from another perspective, its social positioning) are arbitrary.
  • the image processing device 3A that is not included in a private network may be installed in a store and used by an unspecified number of users as described above, or it may be installed in a company or the like and used by specific users.
  • the image processing device 3B included in the private network 13B may be installed in a private home and used by a specific and small number of users as described above, or it may be installed in an Internet cafe and used by an unspecified number of users.
  • the image processing apparatuses 3MA and 3MB may be described assuming a situation where they are located in different departments of the same company.
  • the image processing devices 3MA and 3MB may be included in the same private network 13A or the same VPN, or may be connected via the public network 11 (not a VPN).
  • FIG. 3 is a schematic diagram showing the hardware configuration of the signal processing system of the image processing device 3.
  • the image processing device 3 includes, for example, the following components.
  • a housing 17 (FIG. 2) that constitutes the outer shape of the image processing device 3.
  • a printer 19 that performs printing.
  • a scanner 21 (image scanner) that performs scanning.
  • a UI unit 23 that accepts user operations and/or presents information to the user.
  • a detection unit 25 that detects biometric information of the user.
  • a communication unit 27 (FIG. 3) that performs communication.
  • a control section 29 (FIG. 3) that controls each section (19, 21, 23, 25, and 27).
  • a connector 37 (FIG. 3) for connecting an appropriate device to the image processing apparatus 3.
  • the printer 19 and/or the scanner 21 may be referred to as an image processing section 31 (reference numeral is shown in FIG. 3).
  • the housing 17 may be considered as part of the printer 19 or the scanner 21.
  • the control unit 29 is conceptually one control unit that controls all operations (including printing and scanning, for example) of the image processing device 3 (in terms of hardware, it is distributed over multiple units).
  • the objects (19, 21, 23, 25, and 27) controlled by the control section 29 may be conceptualized as only the mechanical portions that do not include the control section, or as including part of the control section.
  • the components other than the housing 17 (19, 21, 23, 25, 27, and 29; hereinafter in Section 2.1, the word "component" refers to such components other than the housing 17) are provided in the housing 17.
  • in other words, it can be said that the housing 17 holds or supports the plurality of components, or is mechanically connected or coupled to the plurality of components.
  • since the plurality of components are provided in the housing 17, it can be said that they are provided integrally with each other. Note that, as understood from the above description, when a component is said to be provided in the housing 17, the housing 17 may be regarded as a part of that component.
  • the components and the housing 17 are fixed to each other (of course, excluding movable parts). Furthermore, the components are also fixed to each other. Furthermore, unless the image processing device 3 is disassembled by, for example, removing screws, the components and the housing 17 cannot be separated from each other and placed in different locations, and the components likewise cannot be separated from each other and placed in different locations. However, unlike the above example, when it is said that the image processing device 3 has a component, the component may be detachable from the housing 17. In FIG. 3, while the detection section 25 provided in the housing 17 is shown, a detection section 25A, which is a different example from the detection section 25 and which can be attached to and detached from the connector 37, is indicated by a dotted line.
  • the specific positional relationship between the components and the housing 17 is arbitrary.
  • the component may be housed within the casing 17, provided integrally with the wall of the casing 17, protruding from the wall of the casing 17, or mounted on the casing 17.
  • the orientation and/or position of a component relative to the housing 17 may be variable.
  • the printer 19, scanner 21, communication section 27, and control section 29 may be considered to be housed in the housing 17.
  • the UI section 23 and the detection section 25 may be considered to be integrally provided on the wall surface of the housing 17.
  • the size and shape of the image processing device 3 are arbitrary.
  • the image processing device 3 may have a size (mass) that can be carried by one person, such as a home multifunction device or printer (see the illustration of the image processing device 3B), or
  • it may have a size (mass) that cannot be carried by one person, such as a multifunction device or printer (see the illustrations of the image processing apparatuses 3A and 3C).
  • the image processing device 3 may differ significantly in concept from a general multifunction peripheral or printer placed in a company (office) or a private home.
  • the printer 19 may print on roll paper.
  • the image processing device 3 may include a robot, and may apply coating to a vehicle body or the like using an inkjet head.
  • the image processing device 3 may be of a size that can be held in one hand, and the image processing device 3 itself may scan a medium to perform printing and/or scanning.
  • the printer 19 is configured, for example, to print on sheets of paper arranged within the housing 17 or on a tray protruding from the housing 17 to the outside, and to discharge the printed sheets.
  • the specific configuration of the printer 19 may be various configurations, for example, it may be similar to a known configuration.
  • the printer 19 may be an inkjet printer that prints by ejecting ink, a thermal printer that prints by heating thermal paper or an ink ribbon, or an electrophotographic printer (for example, a laser printer) that transfers toner adhering to a photosensitive body irradiated with light.
  • the inkjet printer may be a piezo type that applies pressure to the ink using a piezoelectric body, or a thermal type that applies pressure to the ink using bubbles generated in the heated ink.
  • the printer 19 may be a line printer in which the head has a length spanning the width of the sheet (the direction that intersects the conveying direction of the sheet), or a serial printer in which the head moves in the width direction of the sheet.
  • the printer 19 may be a color printer or a monochrome printer.
  • the printer 19 may be capable of forming any image, or may be capable of printing only characters.
  • the scanner 21 images an original placed on the platen glass, which is exposed from the top surface of the housing 17 (hidden by the lid in FIG. 2), using a plurality of imaging elements (not shown) that move along the underside of the platen glass, and thereby scans the original.
  • the scanner 21 may have various configurations, for example, may be similar to known configurations.
  • the configuration of the UI section 23 is arbitrary.
  • the UI unit 23 includes an operation unit 33 (reference numeral shown in FIG. 3) that receives user operations, and a display unit 35 (reference numeral shown in FIG. 3) that visually presents information to the user.
  • the UI section 23 may not be provided, or only one of the operation section 33 and the display section 35 may be provided.
  • the UI unit 23 may include an audio unit that presents information to the user by sound.
  • the UI unit 23 may be defined to include the connector 37, unlike the description of the embodiment. This is because connecting a device to the connector 37 may be a type of inputting an instruction to the image processing device 3.
  • the configuration of the operation section 33 is arbitrary.
  • the operation unit 33 accepts, for example, a user's touch operation.
  • Such an operation section 33 may include, for example, a touch panel and/or one or more buttons.
  • a touch panel (reference numeral omitted) is illustrated as at least a part of the operation unit 33 of the image processing devices 3A and 3C.
  • a button 33a is illustrated as at least a part of the operation unit 33 of the image processing device 3B.
  • the button 33a may be a push button, a touch button, or another button.
  • the touch button may be a capacitive touch button or another touch button.
  • the image processing devices 3A and 3C may have buttons, and the image processing device 3B may have a touch panel.
  • the operation unit 33 may accept other types of operations such as voice operations.
  • the operation unit 33 may be used for various purposes. Typically, the operation unit 33 is used to instruct the image processing device 3 to execute processing related to the image processing unit 31. For example, by operating the operation unit 33, printing, scanning, and copying are performed, and settings related to these operations (for example, settings for paper selection, magnification, density, and/or color, etc.) are performed. In addition, for example, by operating the operation unit 33, access to data, transmission and reception of data, and input of authentication information may be performed.
  • the configuration of the display section 35 is arbitrary.
  • the display unit 35 may include at least one of a display capable of displaying an arbitrary image, a display capable of displaying only arbitrary characters, a display capable of displaying only specific characters and/or specific graphics, and an indicator light.
  • the image here is a concept that includes characters. Examples of displays that display arbitrary images or arbitrary characters include liquid crystal displays or organic EL (Electro Luminescence) displays that have a relatively large number of regularly arranged pixels. Furthermore, examples of displays that display specific characters and/or specific graphics include liquid crystal displays with a limited number and/or shape of pixels, or segment displays such as a 7-segment display. Segmented displays may take various forms, including liquid crystal displays. Examples of the indicator light include those including LEDs (Light Emitting Diodes). An appropriate number of indicator lights may be provided. In addition, in the following description, for convenience, expressions may be given on the premise that the display unit 35 can display any image.
  • the image processing device 3 includes the detection unit 25 that detects biometric information. However, the image processing device 3 does not need to include the detection unit 25. For example, with a secure connection established by short-range wireless communication between the image processing device 3 and the terminal 9, biometric information may be detected by the terminal 9 and the detected biometric information may be transmitted to the image processing device 3. Thereby, the biometric information may be input to the communication unit 27 communicating with the terminal 9. As a superordinate concept covering such a communication unit 27 and the detection unit 25, the term "input unit," into which biometric information is input, is sometimes used.
  • the input unit may be a unit into which biometric information is directly input (detection unit 25) or a unit into which biometric information is indirectly input (communication unit 27).
  • the description of the embodiment basically takes as an example a mode in which the input section is the detection section 25.
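The "input unit" superordinate concept above (direct input via the detection unit 25, indirect input via the communication unit 27) might be sketched as follows. All class and method names here are hypothetical illustrations, not taken from the embodiment, and the sensor/receive operations are stubbed out.

```python
from abc import ABC, abstractmethod


class InputUnit(ABC):
    """Superordinate concept: any unit into which biometric information is input."""

    @abstractmethod
    def acquire_biometric(self) -> bytes: ...


class DetectionUnit(InputUnit):
    """Biometric information is input directly (e.g. a fingerprint sensor)."""

    def acquire_biometric(self) -> bytes:
        return b"raw-sensor-data"  # placeholder for an actual sensor read


class CommunicationUnit(InputUnit):
    """Biometric information is input indirectly, received from a terminal 9
    over a secure short-range wireless connection."""

    def acquire_biometric(self) -> bytes:
        return b"data-received-from-terminal"  # placeholder for a receive


def authenticate_via(unit: InputUnit) -> bytes:
    # The authentication logic does not care which concrete input unit is used.
    return unit.acquire_biometric()
```

The point of the abstraction is that downstream authentication code is indifferent to whether the biometric information arrived from a local sensor or from the terminal 9.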
  • the configuration of the detection unit 25 is, for example, as follows.
  • the configuration of the detection section 25 may also be various.
  • various detection units 25 may be used for the same type of biological information.
  • the basic configuration of the detection unit 25 may be the same as a known one.
  • the detection unit 25 may acquire an image related to biological information.
  • examples of biological information obtained by acquiring images include fingerprints, palm shapes, retinas, irises, faces, blood vessels, and ear shapes.
  • a typical example of the detection unit 25 that acquires an image is an optical type.
  • the optical detection unit 25 includes an image sensor that detects light.
  • the light to be detected by the image sensor (in other words, the wavelength range) may be visible light or non-visible light (for example, infrared light).
  • the detection unit 25 may or may not have an illumination unit that irradiates the living body with light in the wavelength range detected by the image sensor.
  • the image may be a binary image, a grayscale image or a color image.
  • the detection unit 25 that acquires images may be of an ultrasonic type.
  • the ultrasonic detection unit 25 includes an ultrasonic element that transmits and receives ultrasonic waves.
  • the detection unit 25 including an ultrasonic element can acquire an image of the surface and/or internal shape of a living body. More specifically, the detection unit 25 transmits ultrasonic waves toward the living body and receives the reflected waves. An image that reflects the distance from the ultrasonic element (i.e., the shape of the living body) is acquired based on the time from transmission to reception.
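The time-of-flight relationship just described (distance inferred from the interval between transmission and reception) can be sketched as below. The speed-of-sound constant and the function names are illustrative assumptions, not values from the embodiment.

```python
# Assumed speed of sound in soft tissue, ~1540 m/s = 1.54 mm/us (illustrative).
SPEED_OF_SOUND_MM_PER_US = 1.54


def depth_from_time_of_flight(round_trip_us: float) -> float:
    """Distance from the ultrasonic element to the reflecting surface, in mm.

    The wave travels to the surface and back, hence the division by 2.
    """
    return SPEED_OF_SOUND_MM_PER_US * round_trip_us / 2.0


def depth_map(round_trip_times_us: list[list[float]]) -> list[list[float]]:
    """Per-element depths; ridges (closer to the element) give smaller values."""
    return [[depth_from_time_of_flight(t) for t in row]
            for row in round_trip_times_us]
```

Scanning the elements over the body (mechanically or electronically, as described below) then yields a two-dimensional depth image reflecting the shape of the living body.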
  • the detection unit 25 that acquires the image may be of a capacitance type.
  • the capacitive detection unit 25 includes a panel with which a living body comes into contact, and a plurality of electrodes arranged behind the panel and along the panel.
  • when a part of a living body (for example, a finger) comes into contact with the panel, the electric charge generated in each electrode differs between the positions where the living body is in contact (the positions of convex parts on the body surface) and the positions where the living body is not in contact (the positions of concave parts on the body surface), and an image of the body surface is acquired based on this difference.
  • the detection unit 25 that acquires images may acquire a two-dimensional image by sequentially acquiring line-shaped images in the transverse direction of the lines (that is, by scanning), or may acquire a two-dimensional image substantially at one time without such scanning. Scanning may be realized by the operation of the detection unit 25 or by moving the living body relative to the detection unit 25.
  • the former includes, for example, a mode in which a carriage containing an image sensor or an ultrasonic device moves.
  • the plurality of ultrasound elements can also perform electronic scanning without mechanical movement.
  • An example of the detection unit 25 other than the configuration that acquires images is one that includes a microphone that acquires audio. Thereby, voice (for example, voiceprint) information as biometric information is acquired. Further, for example, the other detection unit 25 may be a touch panel that accepts writing with a touch pen. As a result, handwriting information as biometric information is acquired.
  • the detection unit 25 may be used for purposes other than acquiring biological information. From another perspective, the detection unit 25 may be realized by a component provided in the image processing device 3 for a purpose other than acquiring biological information. Alternatively, the detection unit 25 may be structurally inseparably combined with other components.
  • the detection unit 25 that acquires an image may be realized by the scanner 21, unlike the illustrated example. That is, when it is said that an image processing device has a scanner and a detection section, the two may be the same component. The same applies when other components are shared with the detection unit 25 (not limited to the one that acquires images).
  • the detection unit 25 may also be used as a button so that when a finger is placed on a button included in the operation unit 33, a fingerprint is detected.
  • An example of such a button and detection section 25 is the capacitive detection section 25 described above.
  • in this case, the button operation may be detected by the sensor including the plurality of electrodes described above. Further, for example, the reception of handwriting may be realized by a touch panel included in the operation unit 33.
  • the detection surface on which the finger is placed may be subjected to antiviral treatment.
  • the detection surface is constituted by a plate-shaped member, and the material of this plate-shaped member may include a component that produces an antiviral effect.
  • the detection surface may be constituted by a film covering the above-mentioned plate-shaped member, etc., and the film may contain a component that produces an antiviral effect.
  • Components that produce antiviral effects include, for example, monovalent copper compounds and silver.
  • the type of virus to be targeted is arbitrary.
  • the antiviral property of the detection surface may be such that the antiviral activity value is 2.0 or more in a test according to ISO (International Organization for Standardization) 21702, for example.
  • the sensing surface may produce an antibacterial effect in addition to, or instead of, an antiviral effect.
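As a rough illustration of the antiviral activity value mentioned above, ISO 21702 expresses it as a base-10 logarithmic reduction of the infectious virus titer on the treated surface relative to the untreated one. The sketch below simplifies the standard's procedure (which averages titers after a fixed contact time, typically 24 h) down to that final formula.

```python
import math


def antiviral_activity_value(untreated_titer: float, treated_titer: float) -> float:
    """Antiviral activity value R = log10(untreated) - log10(treated).

    Simplified from the ISO 21702 calculation; titers are infectious
    virus counts (e.g. PFU per test piece) after the contact time.
    """
    return math.log10(untreated_titer) - math.log10(treated_titer)


# An activity value of 2.0 or more corresponds to a reduction of 99% or more
# in infectious virus on the treated surface.
```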
  • the position, orientation, etc. of the detection unit 25 are arbitrary.
  • the detection unit 25 may be fixed to the housing 17, or may be connected to the housing 17 so that its position and/or orientation can be changed. Alternatively, it may be detachable from the housing 17.
  • the detection unit 25 (more precisely, the part directly involved in reading biometric information; for example, the detection surface on which a finger is placed when detecting a fingerprint; the same applies hereinafter in this paragraph) may be located adjacent to the UI section 23, or may be located away from the UI section 23.
  • the communication unit 27 is, for example, a part of an interface for the image processing device 3 to communicate with other communication devices that is not included in the control unit 29.
  • the communication unit 27 may include only hardware components, or may include a portion realized by software in addition to the hardware components. In the latter case, the communication section 27 may not be clearly distinguishable from the control section 29.
  • the communication section 27 may have a connector or a port to which a cable is connected.
  • a port here is a concept that includes software elements in addition to a connector.
  • the communication unit 27 includes, for example, an RF (Radio Frequency) circuit that converts a baseband signal into a high-frequency signal, and an antenna that converts the high-frequency signal into a wireless signal.
  • the communication unit 27 may include, for example, an amplifier and/or a filter.
  • the control unit 29 has, for example, a similar configuration to a computer. Specifically, for example, the control unit 29 includes a CPU (Central Processing Unit) 39, a ROM (Read Only Memory) 41, a RAM (Random Access Memory) 43, and an auxiliary storage device 45. The control unit 29 is constructed by the CPU 39 executing programs stored in the ROM 41 and/or the auxiliary storage device 45. In addition to the portion constructed as described above, the control section 29 may include a logic circuit configured to perform only certain operations.
  • the connector 37 is for connecting peripheral equipment to the image processing device 3, for example.
  • the connector 37 may be of various standards, for example, USB (Universal Serial Bus).
  • the detection unit 25A according to another example is illustrated as a peripheral device connected to the connector 37, as described above.
  • Other peripheral devices connected to the connector 37 include a USB memory and a card reader.
  • the components described above are connected to each other by a bus 47 (FIG. 3).
  • all the components are schematically connected to one bus 47.
  • in an actual device, multiple buses may be connected in any suitable manner.
  • an address bus, a data bus and a control bus may be provided.
  • a crossbar switch and/or a link bus may be applied.
  • FIG. 3 is just a schematic diagram. Therefore, for example, in reality, a plurality of various devices (for example, CPUs) may be distributed and provided.
  • the illustrated CPU 39 may be a concept including a CPU included in the printer 19 or the scanner 21.
  • An interface (not shown) may be interposed between the bus 47 and various devices (for example, the printer 19 or the scanner 21).
  • FIG. 3 has been described as showing the configuration of the image processing device 3.
  • FIG. 3 can be used as a block diagram showing the configuration of the servers 5 and 7 and the terminal 9 as appropriate.
  • the explanation of the components shown in FIG. 3 may also be applied to the components of the servers 5 and 7 and the terminal 9, as long as there is no contradiction.
  • the block diagram showing the configuration of the servers 5 and 7 and the terminal 9 may be obtained by omitting the printer 19 and scanner 21 from FIG. 3.
  • the block diagram showing the configuration of the servers 5 and 7 may be obtained by omitting the detection unit 25, the operation unit 33, and/or the display unit 35 from FIG. 3.
  • the management table DT0 is stored in a nonvolatile memory (for example, the auxiliary storage device 45).
  • the management table DT0 holds account information, biometric information, etc. for a plurality of users.
  • the management table DT0 may be able to hold account information, biometric information, etc. only for one user. Even in such a mode, for convenience, it may be expressed that account information and biometric information are stored in association with each other "for each user.”
  • a mode in which the management table DT0 can store information of a plurality of users will be basically taken as an example, and the explanation will be based on such a mode unless otherwise specified.
  • an example of a usage mode in which the management table DT0 stores information for only one user is a mode in which the image processing device 3 is for home use (see the illustration of the image processing device 3B in FIG. 3). It may be assumed that this image processing device 3 is basically used by only one user. Then, when another user uses it exceptionally, the other user's biometric information (first biometric information) may be imported from another image processing device 3, or the other user's biometric information (second biometric information) may be exported to another image processing device 3.
  • the management table DT0 may be able to store two or more pieces of biometric information in association with one piece of account information.
  • a mode in which one piece of biometric information is associated with one piece of account information may be taken as an example, unless otherwise specified.
  • the two or more pieces of biometric information may be different biometric information of one user, for example.
  • examples of such biometric information include fingerprints of different fingers, or fingerprints of the same finger acquired at different times. In the former case, for example, when authentication with one finger fails due to injury, aging, etc., authentication can be performed with another finger. In the latter case, the probability that authentication will fail when biometric information changes due to aging or the like is reduced.
  • the two or more pieces of biometric information may be two or more types of biometric information, as illustrated in FIG.
  • as the two or more types of biometric information, two or more of the various types of biometric information (see Section 1.1.1) may be selected as appropriate. If two or more types of biometric information can be registered, then, for example, if authentication fails with one type of biometric information for some reason other than fraud, it is possible to authenticate with other biometric information, as described above. This improves user convenience.
  • fingerprints from different fingers can also be considered as different types of biometric information.
  • for fingerprints of different fingers, however, the configuration of the detection unit 25 and the method of processing the detected raw information are the same. Therefore, in the description of the embodiment, fingerprints of different fingers are treated as the same type of biometric information.
  • different types of biological information differ in at least one of the configuration of the detection unit 25 and the method of processing the detected information, for example.
  • Two or more types of biological information may be used selectively, for example, as described above. That is, authentication may be performed for only one type of biometric information of two or more types. In this aspect, it is not necessary that all the biometric information of two or more types of biometric information be registered. However, all and/or at least two types of two or more types of biometric information may be required to be registered. Furthermore, all and/or at least two types of two or more types of biometric information may be required to be input at the time of authentication. In this case, security is improved.
  • two or more pieces of biometric information associated with one account information may belong to different people. That is, a "user” is not limited to a “person” and may be a concept that includes an "account” (from another perspective, a user group).
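The management table DT0 described above (account information stored in association with one or more pieces of biometric information per user, possibly of more than one type) might be modeled as in the following sketch. The class, field, and method names are hypothetical illustrations, not from the embodiment.

```python
from dataclasses import dataclass, field


@dataclass
class BiometricRecord:
    kind: str        # e.g. "fingerprint", "voiceprint" (illustrative types)
    template: bytes  # registered (first) biometric information


@dataclass
class ManagementTableDT0:
    # account information -> list of registered biometric records
    entries: dict = field(default_factory=dict)

    def register(self, account_id: str, record: BiometricRecord) -> None:
        self.entries.setdefault(account_id, []).append(record)

    def templates_for(self, account_id: str) -> list:
        return self.entries.get(account_id, [])


dt0 = ManagementTableDT0()
dt0.register("userA", BiometricRecord("fingerprint", b"index-finger"))
# A second finger registered as a fallback, as described above:
dt0.register("userA", BiometricRecord("fingerprint", b"middle-finger"))
```

Allowing a list per account directly supports the fallback behavior described above: if matching against one template fails for a reason other than fraud, the remaining templates can still be tried.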
  • biometric information (second biometric information) detected by the detection unit 25 during use may be stored in a volatile memory (eg, RAM 43), for example.
  • the second biological information may be stored in a nonvolatile memory (for example, the auxiliary storage device 45). This second biometric information may be deleted at an appropriate time (for example, upon completion of authentication).
  • the biometric information (first biometric information) that the image processing device 3MA imports from the image processing device 3MB during use may be stored, for example, in the volatile memory (eg, RAM 43) of the image processing device 3MA.
  • the first biological information may be stored in the nonvolatile memory (for example, the auxiliary storage device 45) of the image processing device 3MA. This first biometric information may be deleted at an appropriate time (for example, upon completion of authentication).
  • the biometric information (second biometric information) that the image processing device 3MA exports to the image processing device 3MB during use may be stored, for example, in the volatile memory (eg, RAM 43) of the image processing device 3MB.
  • the second biological information may be stored in the nonvolatile memory (for example, the auxiliary storage device 45) of the image processing device 3MB. This second biometric information may be deleted at an appropriate time (for example, upon completion of authentication).
  • the deletion of biometric information may involve overwriting the storage area where the biometric information was stored with other information, or initializing the storage area. That is, deleting the biometric information may make it unrecoverable. Alternatively, deleting the biometric information may mean deleting the address information of the storage area where the biometric information is stored. That is, the biometric information may be made inaccessible through normal processing in the image processing device 3, while leaving room for recovery of the biometric information by a specialized company.
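The two deletion strategies just described (overwriting the storage area versus deleting only its address information) can be contrasted in a minimal sketch; the data structures are illustrative, not from the embodiment.

```python
def erase_by_overwrite(storage: bytearray) -> None:
    """Overwrite the storage area so the biometric info is unrecoverable."""
    for i in range(len(storage)):
        storage[i] = 0


def erase_by_dropping_address(index: dict, key: str) -> None:
    """Delete only the address (index entry) of the storage area.

    The data itself may remain recoverable by a specialized company, but it
    becomes inaccessible through normal processing on the device.
    """
    index.pop(key, None)
```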
  • the first embodiment is a mode in which the image processing device 3MA executes the first method of importing the first biometric information (registered biometric information) of the user B from the image processing device 3MB. Specifically, for example, it is as follows.
  • FIG. 4 is a flowchart illustrating an example of an outline of a procedure related to authentication executed by the image processing device 3MA (from another perspective, the control unit 29). Note that the various flowcharts, including FIG. 4, conceptually illustrate operational procedures for ease of understanding; they do not necessarily match the actual procedures and may sometimes lack accuracy.
  • in the process of FIG. 4, a user who attempts to use the image processing device 3 is authenticated, and depending on the authentication result, restrictions on a predetermined function are canceled (step ST6 or ST13) or maintained (step ST7 or ST14).
  • Steps ST2 to ST7 exemplify the processing procedure when user A registered in the image processing apparatus 3MA uses the image processing apparatus 3MA.
  • Steps ST8 to ST14 exemplify the processing procedure when user B who is registered in an image processing device 3 (3MB in this case) other than the image processing device 3MA uses the image processing device 3MA.
  • the process in FIG. 4 may be started at any appropriate time.
  • the process in FIG. 4 may be started when the image processing device 3MA is powered on or transitioned from sleep mode to startup mode.
  • the process in FIG. 4 may be initiated when the user attempts to perform a predetermined function (for example, printing) whose use is restricted. Note that in the description of the embodiments, the former may be taken as an example unless otherwise specified.
  • in step ST1, the image processing device 3MA displays, on the display unit 35, an image that prompts the user to input information (for example, by an operation on the operation unit 33) specifying the user's own department (and/or the image processing device 3 in which the user is registered). Further, the image processing device 3MA accepts the above input.
  • in step ST2, the image processing apparatus 3MA determines, based on the information input in step ST1, whether the user attempting to use the image processing apparatus 3MA is a user registered in another department (from another perspective, in another image processing apparatus 3). Then, the image processing device 3MA proceeds to step ST3 when the determination is negative, and proceeds to step ST8 when the determination is affirmative.
  • in step ST3, the image processing device 3MA displays on the display unit 35 an image that prompts the user to input biometric information.
  • In step ST4, the image processing device 3MA uses the detection unit 25 to detect the user's biometric information.
  • In step ST5, the image processing device 3MA performs biometric authentication based on the biometric information (second biometric information) detected in step ST4. Specifically, the image processing device 3MA determines whether biometric information (first biometric information) that matches the second biometric information is registered in its own management table DT0. Then, the image processing device 3MA proceeds to step ST6 when the determination is affirmative, and to step ST7 when the determination is negative.
  • In step ST6, the image processing device 3MA releases the restriction on the predetermined function.
  • In step ST7, the image processing device 3MA maintains the restriction on the predetermined function.
  • The functions restricted in step ST7 may be some or all of the various functions of the image processing device 3MA. The latter is a mode in which use of the image processing device 3MA itself is prohibited. In FIG. 4, assuming the former, step ST7 is labeled "restricted operation".
  • In step ST8, the image processing device 3MA accesses the image processing device 3MB of the department specified by the user in step ST1. From another perspective, the image processing device 3MA requests the image processing device 3MB to export the biometric information (first biometric information) of user B registered in the image processing device 3MB. Then, in step ST9, the image processing device 3MA receives (imports) the first biometric information of user B.
  • Steps ST10 to ST14 are similar to steps ST3 to ST7.
  • However, as the first biometric information (registered biometric information) to be compared with the detected second biometric information, the information imported in step ST9 is used.
  • In FIG. 4, for convenience, illustration of a step corresponding to step ST3 before step ST10 is omitted.
  • On the other hand, step ST11, which has no counterpart shown before step ST5, is illustrated; this step specifies that authentication (comparison) is performed by the image processing device 3MA.
  • Only one type of function may require authentication, or two or more types may. One authentication may permit repeated execution of one type of function, or execution of two or more types of functions. Alternatively, authentication may be requested each time a function is executed, for each type of function, or again when a function with a high security level is executed.
  • the biometric information acquired during use may be deleted from the image processing device 3MA immediately after being compared with the registered biometric information.
  • the biometric information acquired during use may be stored in the image processing device 3MA until an appropriate time thereafter (for example, when the authentication state is canceled) and used as appropriate.
  • the biometric information acquired during use may be used to update the registered biometric information.
  • In the above description, step ST6 and step ST13 are the same. However, there may be differences between the two. For example, a restriction on a specific function that is lifted in step ST6 may remain in place in step ST13. Similarly, although step ST7 and step ST14 have been described as being the same, there may be a difference between them. For example, in step ST14, use of a specific function that is not restricted in step ST7 may be restricted.
  • Since the network 10 connecting the image processing devices 3MA and 3MB may have various configurations, the communication in steps ST8 and ST9 may be performed in a manner appropriate to that configuration.
  • the biometric information of user B may be imported (step ST9) within the private network 13A or via the public network 11.
  • a VPN connection may or may not be utilized.
  • the connection for importing the biometric information of user B may be established at step ST8 or ST9, or may be established before that.
  • The image processing device 3MA need not specify the image processing device 3MB; it may instead request all communication devices within the LAN to export user B's first biometric information. The corresponding communication device (the image processing device 3MB in which user B's biometric information is registered) may then respond to the request.
  • The user may be required to input biometric information before inputting information identifying his or her own department. It may then be determined whether the input biometric information is registered in the management table DT0 of the image processing device 3MA (from another point of view, authentication may be performed). Depending on the success or failure of this authentication, it may be determined whether the user belongs to the own department or to another department. If authentication fails, the user may be treated as a user of another department, asked (or not asked) to input information identifying his or her own department, and a request to import user B's biometric information may be made to another image processing device 3MB.
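As a non-limiting illustration of the FIG. 4 flow, the branch on the user's department, the biometric comparison, and the release or maintenance of function restrictions can be sketched as follows. All names (`own_table`, `detect_biometric`, `import_first_biometric`) are hypothetical stand-ins, not part of the disclosed implementation, and real biometric matching would be a fuzzy comparison rather than equality.

```python
# Hedged sketch of FIG. 4 (steps ST1 to ST14). The data structures and
# callbacks are assumptions made only for illustration.

def authenticate_user(own_table, is_other_department, detect_biometric,
                      import_first_biometric):
    """Return True to release restrictions (ST6/ST13), False to maintain them (ST7/ST14)."""
    if is_other_department:
        # Steps ST8/ST9: request and import user B's first biometric info from 3MB.
        registered = [import_first_biometric()]
    else:
        # Steps ST3 to ST5: compare against 3MA's own management table DT0.
        registered = own_table
    # Steps ST4/ST10: detect the user's biometric info (second biometric info).
    second_info = detect_biometric()
    # Steps ST5/ST12: comparison (real matching would be fuzzy, not equality).
    return second_info in registered
```

For instance, a user A whose registered fingerprint matches the detected one would obtain `True`, while a user B from another department is checked against the imported information only.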
  • Image IG1 shown in FIG. 5A is displayed in step ST1, for example.
  • Image IG1 prompts the user to input an ID and password (i.e., account information) by displaying "Please enter your ID and password" at the top.
  • For example, the user selects the input fields associated with "ID" and "password" by tapping or the like, and can then input characters (a broad concept including numbers and symbols) by operating a mechanical switch or a software keyboard.
  • Image IG1 also includes buttons labeled "Other department user" and "Execute" at the bottom.
  • the user can proceed to the next operation (next screen) by performing a predetermined operation (for example, tap) on these buttons.
  • If the "other department user" button is selected, an affirmative determination is made in step ST2, and the process proceeds to step ST8. On the other hand, if the "execute" button is selected, a negative determination is made in step ST2, and the process proceeds to step ST3. Note that in a mode where an image (screen) is displayed before image IG1, a button for returning to that screen may be displayed on image IG1. Similarly, other screens to be described later may also have a button (not shown) for returning to the previous screen and/or a button (not shown) for returning to a specific screen.
  • Image IG3 shown in FIG. 5B is displayed, for example, when the "other department user” button is selected in image IG1. Further, the image IG3 is displayed, for example, at an appropriate time after an affirmative determination is made in step ST2 and before step ST8.
  • In image IG3, the notation "other department user" in the upper row indicates that a screen for users of other departments is being displayed.
  • input fields for ID and password are displayed.
  • a "own department MFP selection” button is displayed at the bottom. The user can proceed to the next operation (next screen) by performing a predetermined operation (eg, tap) on the "own department MFP selection" button.
  • Image IG5 shown in FIG. 5C is displayed, for example, when the "select own department MFP" button is operated in image IG3. Further, image IG5 is displayed at an appropriate time, for example, after an affirmative determination is made in step ST2 and before step ST8.
  • In image IG5, the notations "Other department user MFP selection" and "Please select your own department's MFP" in the upper row indicate that a screen is displayed on which users of other departments select the department to which they belong and/or the image processing device 3 in which they are registered. Further, image IG5 shows a list of selectable image processing devices 3.
  • the user can select the image processing device 3 in which the user is registered by performing an operation (for example, tapping) to select one of the image processing devices 3 displayed in the list.
  • From another perspective, this selection is the input of information specifying the image processing device 3 in which the user is registered.
  • the user can proceed to the next operation (next screen) by performing a predetermined operation (eg, tap) on the "OK" button at the bottom.
  • the image processing device 3MA requests the user's biometric information from the image processing device 3 (3MB) selected in the image IG5.
  • the image processing device 3MA transmits the account information input in the image IG1 or IG3 to the image processing device 3MB.
  • the information transmitted here may be only the ID, or may be a combination of the ID and password.
  • The image processing device 3MB exports, to the image processing device 3MA, the biometric information associated with the account information that matches the account information included in the received request.
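The 3MB-side handling of such an export request might be sketched as below. The dictionary-based management table and the function name are assumptions made only for illustration; a real device would of course apply stricter checks and protect the data in transit.

```python
# Sketch of the 3MB side of the export request (cf. FIG. 7, steps ST43 to ST45).
# The management-table layout (a dict keyed by account ID) is an assumption.

def handle_export_request(management_table, account_id, password=None):
    """Return the registered first biometric info on success, or None when
    account authentication fails (cf. step ST44)."""
    entry = management_table.get(account_id)
    if entry is None:
        return None                      # no matching account information
    # The request may carry only the ID, or the ID and password (see above).
    if password is not None and entry["password"] != password:
        return None
    return entry["biometric"]            # exported to 3MA (cf. step ST45)
```

A matching ID and password yields the stored first biometric information; any mismatch yields the failure notification instead.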
  • Image IG7 shown in FIG. 6A is displayed, for example, when the "OK" button is operated in image IG5. Further, the image IG7 is displayed, for example, at an appropriate time after an affirmative determination is made in step ST2 (for example, after step ST9) and before step ST10.
  • In image IG7, in addition to an image similar to the upper half of image IG3, the image processing device 3 selected in image IG5 (in the illustrated example, MFP "B01" of "B department, section 1") is shown in the "MFP with registered biometric information" column. Additionally, the message "Please let me read your fingerprint" prompts the user to input biometric information. Note that, as understood from the above, a mode in which a fingerprint is used as the biometric information is assumed here.
  • biometric information may be other than fingerprints.
  • For example, the user can have his or her fingerprint read by placing a finger on the detection unit 25. The finger used here is naturally the one whose fingerprint was read at registration. Note that in step ST3, an image similar to image IG7 may be displayed.
  • Image IG9 shown in FIG. 6B is displayed, for example, when biometric information is detected by the detection unit 25 (step ST10) and authentication succeeds (an affirmative determination is made in step ST12) while image IG7 is being displayed.
  • In image IG9, in addition to an image similar to the upper half of image IG7, the notation "authentication complete" indicates that authentication has succeeded.
  • By performing a predetermined operation (for example, tapping), the user can proceed to a screen for using the functions of the image processing device 3MA (a so-called menu screen and/or home screen), or go back.
  • When authentication fails, the previous screen or a specific screen (for example, image IG1) may be displayed.
  • options such as re-entering the biometric information (retrying the same authentication method) and returning to a specific screen may be displayed along with a display indicating that the authentication has failed.
  • the above options may include switching to other authentication methods (including authentication using other biometric information).
  • As described above, the first biometric information is used based on the user's self-report. Further, as described above, some users may appear in both the management table DT0 of the image processing device 3MA and the management table DT0 of the image processing device 3MB. Therefore, a situation may arise in which user B's first biometric information is imported from the image processing device 3MB to the image processing device 3MA even though it is already registered in the image processing device 3MA; this does not pose a problem.
  • Even if user B declares that he or she is a user of another department, the image processing device 3MA may, before importing, refer to its own management table DT0 and check whether user B's first biometric information is present.
  • inputting account information in image IG1 may or may not be a prerequisite for moving to the next screen (FIG. 5B, etc.).
  • account information may be inputtable on the next screen (for example, image IG3).
  • FIG. 5B may be skipped and the process may proceed to FIG. 5C.
  • If the user is user A registered in the image processing device 3MA, input of account information may or may not be required. Regarding the latter, if biometric authentication is performed without account information being input, the image processing device 3MA can refer to its own management table DT0 and identify the account information from the biometric information. Further, if biometric information and information regarding functional limitations are directly linked as in the management table DT0 illustrated in FIG. 1, there is no need to identify user A's account information.
  • In image IG1, the information prompting input of account information may be removed, and only a display prompting the user to declare whether or not he or she belongs to another department may be shown. If the user declares that he or she is a user of the own department (user A), an image prompting input of biometric information may be displayed instead of an image prompting input of account information. Further, if the user declares that he or she is a user of another department (user B), an image prompting input of account information (for example, an image similar to image IG3 in FIG. 5B) may be displayed.
  • Alternatively, an image for selecting an authentication method may be displayed at an appropriate time (before step ST1 and/or before image IG1). Also, as mentioned in the explanation of FIG. 4, an image prompting input of biometric information may be displayed first, and then, if necessary (if the user is a user of another department), the user may be prompted to input account information and information specifying the image processing device 3 in which the user is registered.
  • FIG. 7 is a flowchart illustrating an example of details of a part of the procedure described in FIG. 4. More specifically, FIG. 7 shows details of the procedure (steps ST8 to ST14) when the user using the image processing device 3MA is the user B registered in the image processing device 3MB. Note that FIG. 7 may be interpreted as showing other processing similar to the processing in FIG. 4.
  • In FIG. 7, the account information of user B is transmitted from the image processing device 3MA to the image processing device 3MB, and only if authentication using this account information succeeds is the image processing device 3MA permitted to import user B's biometric information (first biometric information) from the image processing device 3MB. Further, the imported first biometric information is deleted when a predetermined condition is met. These operations reduce the possibility of unintended leakage of biometric information. Specifically, this is as follows.
  • the procedure of "MFP-A” indicates the procedure executed by the image processing apparatus 3MA (its control unit 29).
  • the "MFP-B” procedure indicates a procedure executed by the image processing device 3MB (its control unit 29).
  • In step ST41, the image processing device 3MA acquires user B's account information. It is assumed here that the account information includes a password (usable for authentication).
  • In step ST42, the image processing device 3MA transmits the account information acquired in step ST41 to the image processing device 3MB. Note that this transmission of account information may be, for example, the request for biometric information in step ST8, or a part of it.
  • The acquisition of account information (step ST41), the identification of the image processing device 3MB in which user B is registered (image IG5), and the detection of user B's biometric information (second biometric information) (step ST10) may be performed in any order.
  • In FIG. 7, the identification of the image processing device 3MB is performed at an appropriate time before the account information is transmitted (step ST42), and its illustration is omitted.
  • Further, in FIG. 7, the detection of biometric information is performed after the import of user B's biometric information (first biometric information).
  • In step ST43, the image processing device 3MB determines whether account information matching the account information received in step ST42 is registered in its own management table DT0. That is, the image processing device 3MB performs authentication based on account information. If authentication succeeds, the image processing device 3MB exports the biometric information stored in association with the matching account information (i.e., user B's first biometric information) to the image processing device 3MA (step ST45). On the other hand, if authentication fails, the image processing device 3MB transmits a notification indicating that authentication has failed to the image processing device 3MA (step ST44).
  • Although not distinguished in step ST45, in actual processing a notification indicating successful authentication may first be transmitted from the image processing device 3MB to the image processing device 3MA, and the biometric information may then be transmitted from the former to the latter.
  • In step ST46, the image processing device 3MA determines, based on the notification from the image processing device 3MB (step ST44 or ST45), whether authentication in the image processing device 3MB has succeeded. When the image processing device 3MA makes an affirmative determination (in other words, when user B's biometric information could be imported), the process proceeds to step ST47; when it makes a negative determination, the process proceeds to step ST50. Steps ST47 to ST50 are the same as steps ST10 to ST14 in FIG. 4 (however, step ST11 is not shown).
  • In step ST51, the image processing device 3MA determines whether a predetermined erasure condition is satisfied. In the case of an affirmative determination, the image processing device 3MA erases user B's first biometric information imported in step ST45 (step ST52); in the case of a negative determination, it repeats step ST51 (stands by).
  • In FIG. 7, step ST50 includes the case where the determination in step ST46 is negative (the case where the import is not performed). In practice, if the import has not been performed, steps ST51 and ST52 need not be performed. Further, unlike the illustrated example, the first biometric information may be deleted when authentication is completed (immediately after step ST48).
  • the erasing conditions in step ST51 may be set as appropriate.
  • the erasing condition may include that a predetermined time has elapsed.
  • the timing at which the predetermined time starts is arbitrary.
  • The time measurement may start, for example, when the import of user B's first biometric information is completed, when the authentication of user B is completed, when execution of a function whose restriction is lifted based on the result of user B's authentication starts or completes, when the authentication state is canceled, or when user B performs the last operation on the UI unit 23.
  • the entity that sets the predetermined time and/or the timing start point may be any of the manufacturer of the image processing device 3MA, the administrator of the image processing device 3MA, and user B (the user whose biometric information is to be deleted).
  • The specific length of the predetermined time is arbitrary; for example, it may be 1 second or more, 10 seconds or more, 30 seconds or more, 1 minute or more, 10 minutes or more, 30 minutes or more, 1 hour or more, 1 day or more, 1 week or more, or 1 month or more, or it may be less than 1 month, less than 1 week, less than 1 day, less than 1 hour, less than 30 minutes, less than 10 minutes, less than 1 minute, less than 30 seconds, less than 10 seconds, or less than 1 second.
  • the above lower limit and upper limit may be arbitrarily combined so as not to contradict each other.
  • the deletion condition may include completion of execution of a function whose restrictions have been lifted based on the authentication result.
  • For example, in a mode where authentication is required for each execution of a function (for example, printing, scanning, copying, or sending or receiving data), user B's first biometric information may be deleted when execution of the function is completed (for example, not counting interruptions due to abnormalities or the like).
  • The deletion condition may include that user B (or another user) has performed an operation (in other words, a predetermined operation) on the UI unit 23 for deleting the imported first biometric information of user B.
  • The deletion of user B's biometric information imported by the image processing device 3MA upon the passage of a predetermined time or the completion of execution of a function can be understood as automatic deletion performed when automatic deletion conditions are met. Conversely, deletion by the predetermined operation in the previous paragraph is not automatic but intentional on the part of the user.
  • Note that the mode in which the erasure condition includes the passage of a predetermined time is intended to automatically erase user B's first biometric information; unless otherwise specified, the time point at which the predetermined time starts to be counted does not include deletion by the operation described in the previous paragraph. In other words, when the predetermined operation described in the previous paragraph is performed, the first biometric information is basically deleted immediately.
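The erasure conditions discussed above can be collected into a single predicate, sketched below. The class name, field names, and choice of timer start point are assumptions made for illustration; as stated, the predetermined time and its start point are configurable by the manufacturer, the administrator, or user B.

```python
# Hedged sketch of the erasure check (cf. step ST51). Which events count
# and the length of the predetermined time are configurable, as described
# in the text; the attribute names here are assumptions.

class ImportedBiometric:
    def __init__(self, info, timer_start, timeout_seconds):
        self.info = info
        self.timer_start = timer_start        # e.g. completion of the import
        self.timeout = timeout_seconds        # the predetermined time
        self.function_completed = False       # execution of the released function done
        self.manual_delete_requested = False  # predetermined operation on the UI unit 23

    def should_erase(self, now):
        if self.manual_delete_requested:      # intentional deletion: immediate
            return True
        if self.function_completed:           # completion-based automatic deletion
            return True
        # time-based automatic deletion
        return now - self.timer_start >= self.timeout
```

The first two branches model the completion-based and operation-based conditions; the last models the elapsed-time condition.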
  • the procedure shown in FIG. 7 may be modified as appropriate.
  • In FIG. 7, account information is used as the authentication information indicating the validity of user B (including information based on the authentication information; the same applies hereinafter).
  • the authentication information may be various types of authentication information different from the biometric information to be imported, and is not limited to account information.
  • various kinds of authentication information illustrated in Section 1.1.3 may be used.
  • the imported first biometric information is deleted in step ST52.
  • In addition to or instead of this, the second biometric information of user B detected in step ST47 (and/or the second biometric information of user A detected in step ST4) may be deleted.
  • the first biometric information of user B that the image processing device 3MA imported from the image processing device 3MB may or may not be reused before being deleted in step ST52. Reuse reduces the load on the network 10 due to import, for example.
  • An example of a reused mode will be shown below.
  • FIG. 8 is a flowchart showing an example of a part of the procedure described in FIG. 7 in detail. More specifically, FIG. 8 shows details of the procedure (steps ST49 to ST52) after successful authentication. Note that when the authentication fails (when the process proceeds to step ST50 instead of step ST49), for example, steps ST55 and ST56, which will be described later, are not performed. Note that FIG. 8 may be interpreted as showing other processing similar to the processing in FIG. 7.
  • Step ST49 is as described in the explanation of FIG.
  • After step ST49, the image processing device 3MA determines whether re-authentication conditions are satisfied (step ST55). Note that the first determination of the re-authentication conditions may be performed after step ST48 and before step ST49. When the image processing device 3MA makes an affirmative determination, the process proceeds to step ST56; when it makes a negative determination, it skips step ST56 and proceeds to step ST51.
  • In step ST56, similarly to steps ST47 to ST50, the image processing device 3MA displays a prompt to re-input biometric information, causes the detection unit 25 to re-detect the biometric information (second biometric information), and performs authentication by comparing the re-detected second biometric information with the first biometric information.
  • However, the first biometric information is not newly imported; the information imported in step ST45 and held by the image processing device 3MA is used.
  • the image processing device 3 maintains the release of the function restriction (maintains the authentication state) or restricts the function (cancels the authentication state) according to the authentication result.
  • When the authentication state is canceled, for example, execution of the restricted function is prohibited after the currently executed function (task) is completed (or without waiting for its completion).
  • Steps ST51 and ST52 are as described in the explanation of FIG. 7. However, if a negative determination is made in step ST51, the image processing device 3 does not repeat step ST51 but returns, for example, to step ST55. As a result, re-input of biometric information is requested repeatedly, for example.
  • If the re-authentication in step ST56 fails, the process may be as described above, or a different process may be performed.
  • the image processing device 3 may prompt a retry of authentication and repeat step ST56 instead of proceeding to step ST51.
  • the image processing device 3 may proceed to step ST51 and repeat step ST51 without returning to step ST55 if a negative determination is made in step ST51.
  • the first biometric information may be deleted without performing re-authentication to reuse the first biometric information.
  • the re-authentication conditions may be set as appropriate.
  • For example, the re-authentication condition may include that a predetermined period of time has elapsed since the detection unit 25 most recently detected user B's biometric information (from another point of view, since the most recent past authentication of user B). In this case, for example, the probability that someone other than user B will illegally use user B's authority is reduced.
  • the detection (authentication) of the most recent past biometric information here is, for example, the detection in step ST47 (first authentication) or the detection in step ST56 (re-authentication).
  • the predetermined time may be set by, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, or an individual user (here, user B).
  • the re-authentication condition may include that an operation instructing the execution of a predetermined function has been performed on the UI unit 23.
  • the above-mentioned predetermined function is, for example, a function related to at least one of the image processing section 31 and the communication section 27, and may be included in one or more functions whose restrictions have been lifted due to successful authentication.
  • all functions may be selected from the one or more functions described above, or some functions (for example, a function requiring relatively high security) may be selected.
  • The predetermined function may be selected by, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, or an individual user (here, user B).
  • the imported first biometric information may be reused before being deleted in a manner other than the above.
  • For example, the image processing device 3MA may hold user B's account information input in step ST41 in association with user B's first biometric information. Then, when account information matching the held account information is re-entered, the biometric authentication in steps ST47 and ST48 may be performed using the previously imported first biometric information stored in association with that account information (that is, without importing it again).
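The reuse keyed on account information described above amounts to a small import cache. The sketch below is illustrative only; the class and method names are assumptions, and the deletion in step ST52 corresponds to evicting the cached entry.

```python
# Sketch of reusing imported first biometric info keyed by account information,
# so that a matching re-entry avoids a second import. Names are assumptions.

class ImportCache:
    def __init__(self, import_fn):
        self._cache = {}             # account ID -> held first biometric info
        self._import = import_fn     # request/import from image processing device 3MB

    def first_biometric(self, account_id):
        if account_id not in self._cache:      # not held: import (cf. steps ST42-ST45)
            self._cache[account_id] = self._import(account_id)
        return self._cache[account_id]         # held: reuse without importing

    def erase(self, account_id):               # cf. step ST52
        self._cache.pop(account_id, None)
```

Reuse in this way reduces, for example, the load that repeated imports would place on the network 10.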
  • Steps ST55 and ST56 may also be performed after step ST6. That is, the request for re-authentication may also be made to the user A registered in the image processing device 3MA.
  • In this case, the first biometric information to be compared with the detected second biometric information is the information that the image processing device 3MA holds in its own management table DT0.
  • That is, the first biometric information of user A registered in the management table DT0, unlike the first biometric information of user B, is neither imported nor automatically deleted.
  • re-authentication may be performed by importing the first biometric information of user B.
  • Furthermore, re-authentication differs from the above explanation in that the second biometric information detected in step ST4 or ST10 (ST47) may be used in place of the first biometric information (registered in the image processing device 3MA or imported from the image processing device 3MB).
  • The second biometric information detected in step ST4 or ST10 (ST47) is more likely than the first biometric information to reflect the user's current physical condition, so the probability that re-authentication fails due to physical condition or the like is reduced.
  • The second embodiment is a mode in which the image processing device 3MA executes the second method, in which the second biometric information (detected biometric information) of user B is exported to the image processing device 3MB. Specifically, for example, this is as follows.
  • FIG. 9 is a flowchart illustrating an example of an outline of a procedure related to authentication executed by the image processing device 3MA (control unit 29 from another perspective). This figure corresponds to FIG. 4 of the first embodiment. Steps ST1 to ST7 and ST12 to ST14 are similar to those in FIG. 4. Similar to steps ST8 to ST14 in FIG. 4, steps ST61 to ST64 and ST12 to ST14 are processes performed when the user using the image processing apparatus 3MA is a user in another department (here, user B).
  • In step ST61, the image processing device 3MA detects user B's biometric information (second biometric information) in the same manner as in step ST4.
  • In step ST62, the image processing device 3MA accesses the image processing device 3MB and requests, for example, authentication of user B.
  • step ST63 the image processing device 3MA exports the second biometric information of the user B detected in step ST61 to the image processing device 3MB.
  • In step ST64, the image processing device 3MB determines whether biometric information (first biometric information) matching the second biometric information exported in step ST63 exists in its own management table DT0. That is, the image processing device 3MB performs biometric authentication. The authentication result is then notified to the image processing device 3MA.
  • From the perspective of the image processing device 3MA, step ST64 can be regarded as a step of receiving the authentication result from the image processing device 3MB.
  • Steps ST12 to ST14 are as described in the explanation of FIG. 4. However, unlike in FIG. 4, the determination in step ST12 is based on a notification from the image processing device 3MB, and is not based on self-authentication.
  • the second biometric information exported from the image processing device 3MA may be deleted from the image processing device 3MB immediately after the comparison with the registered biometric information is performed.
  • the exported second biometric information may be stored in the image processing device 3MB until an appropriate time thereafter (for example, when the authentication state is canceled) and used as appropriate.
  • the exported second biometric information may be used to update the registered first biometric information.
  • the description of the communication in steps ST8 and ST9 in FIG. 4 may be used for the communication in steps ST62 and ST63 as appropriate.
  • the export may be within the private network 13A or via the public network 11, for example. In the latter case, a VPN connection may or may not be utilized. Further, the connection for export may be established at step ST62 or ST63, or may be established before that.
  • The image processing device 3MA may, without specifying the image processing device 3MB, inquire of all communication devices within the LAN about the presence or absence of user B's first biometric information.
  • the corresponding communication device (the image processing device 3MB that has registered the first biometric information of user B) may respond affirmatively to the inquiry.
  • The image processing device 3MA may then export the detected second biometric information of user B to the source of the positive reply.
  • the procedure shown in FIG. 9 may be modified as appropriate.
  • the order of steps ST61 and ST62 may be reversed.
  • When executing the process in FIG. 9, an appropriate image may be displayed on the display section 35.
  • the screens (FIGS. 5A to 6B) illustrated in the description of the first embodiment (section 4.2) may be used in the second embodiment.
  • the screens shown in FIGS. 5A to 6B may be used as they are in the second embodiment.
  • the user does not need to be able to distinguish between the first embodiment and the second embodiment.
  • the various explanations regarding FIGS. 5A to 6B may be appropriately incorporated into the second embodiment by replacing specific terms (for example, symbols indicating steps), as long as there is no contradiction.
  • FIG. 10 is a flowchart illustrating an example of details of a part of the procedure described in FIG. 9. More specifically, FIG. 10 shows details of the procedure (steps ST61 to ST64 and ST12 to ST14) when the user using the image processing device 3MA is user B, who is registered in the image processing device 3MB. Note that FIG. 10 may be interpreted as showing other processing similar to the processing of FIG. 9.
  • In FIG. 10, the account information of user B is sent from the image processing device 3MA to the image processing device 3MB, and only if authentication using this account information succeeds is the export of user B's biometric information (second biometric information) from the image processing device 3MA to the image processing device 3MB permitted. Furthermore, the exported second biometric information and the detected second biometric information are deleted when predetermined conditions are satisfied.
  • Steps ST41 to ST44 are similar to those in FIG. 7, and the explanation of FIG. 7 may be used. Also, similar to FIG. 7, acquisition of account information (step ST41), identification of the image processing device 3MB in which user B is registered (image IG5), and detection of user B's biometric information (second biometric information) (step ST61) may be performed in any order as long as no contradiction occurs.
  • In FIG. 10, the illustration of the identification of the image processing device 3MB is omitted, as in FIG. 7. Furthermore, it is assumed that the detection of biometric information is performed at an appropriate time before the export of the detected biometric information (step ST73), and its illustration is also omitted.
  • When the authentication based on the account information received in step ST42 succeeds in step ST43, the image processing device 3MB transmits a notification indicating that the authentication was successful to the image processing device 3MA (step ST71).
  • In step ST72, the image processing device 3MA determines whether or not the authentication based on the account information has succeeded, based on the notification from the image processing device 3MB (steps ST44 and ST71).
  • Step ST73 is the same as step ST63: the image processing device 3MA exports the biometric information (second biometric information) of user B that it has detected to the image processing device 3MB.
  • Step ST78 is the same as step ST14: the image processing device 3MA does not release the restriction on the function.
  • Steps ST74 and ST75 are the same as step ST64. That is, the image processing device 3MB determines whether the second biometric information of user B exported from the image processing device 3MA matches the first biometric information of user B that it holds (biometric authentication, step ST74), and transmits the authentication result to the image processing device 3MA (step ST75).
  • Steps ST76, ST77, and ST78 are the same as steps ST12, ST13, and ST14, and their explanation will be omitted.
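  • The two-stage procedure of FIG. 10 (account authentication first, export of biometric information only on success) might be sketched as follows; a simplified illustration with hypothetical names, in which dictionaries stand in for the tables held by the image processing device 3MB.

```python
# Hypothetical sketch of the FIG. 10 procedure: export of user B's
# second biometric information is permitted only after authentication
# using account information succeeds (steps ST42-ST44, ST71-ST75).

def figure10_flow(device_3mb_accounts, device_3mb_biometrics,
                  account, detected_second_biometric):
    # ST42/ST43: 3MB authenticates the account information sent by 3MA.
    # ST71/ST72: the result is returned to 3MA.
    if device_3mb_accounts.get(account["id"]) != account["password"]:
        return "restricted"          # ST78: restriction not released
    # ST73: account auth succeeded, so 3MA exports the second biometric info.
    # ST74/ST75: 3MB performs biometric authentication and replies.
    if device_3mb_biometrics.get(account["id"]) == detected_second_biometric:
        return "released"            # ST77 (ST13): restriction released
    return "restricted"              # ST78 (ST14)

accounts = {"userB": "pw123"}        # stand-in for 3MB's account records
biometrics = {"userB": "template-B"} # stand-in for 3MB's first biometric info
print(figure10_flow(accounts, biometrics,
                    {"id": "userB", "password": "pw123"}, "template-B"))
# -> released
```

The point of the ordering is visible in the code: the biometric comparison is never reached unless the account check passes, mirroring how the export itself is gated in the described procedure.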
  • After the export (step ST73) is completed (in the illustrated example, after step ST77), the image processing device 3MA determines whether a predetermined deletion condition is satisfied (step ST79). In the case of a positive determination, the image processing device 3MA erases the second biometric information of user B detected in step ST61 (step ST80); in the case of a negative determination, it repeats step ST79 (stands by).
  • the second biometric information may be deleted when the export is completed (immediately after the export).
  • In the illustrated example, when biometric authentication is successful (when an affirmative determination is made in step ST76), it is determined whether the erasure condition is satisfied.
  • When biometric authentication fails, for example, the second biometric information of user B may be deleted immediately, or it may be determined whether the deletion conditions are met, in the same way as when biometric authentication is successful.
  • After step ST74, the image processing device 3MB determines whether a predetermined deletion condition is satisfied (step ST81). In the case of a positive determination, the image processing device 3MB erases the second biometric information of user B exported from the image processing device 3MA in step ST73 (step ST82); in the case of a negative determination, it repeats step ST81 (stands by). Note that, unlike the illustrated example, the second biometric information may be deleted when biometric authentication is completed (immediately after biometric authentication).
  • In the image processing device 3MB as well, when biometric authentication fails, the second biometric information of user B may be deleted immediately, or it may be determined whether the deletion conditions are satisfied, in the same way as when biometric authentication is successful.
  • The deletion conditions in steps ST79 and ST81 may be set as appropriate.
  • the erasing condition may include that a predetermined time has elapsed.
  • In the image processing device 3MA, the timing start point may be, for example, the time when the export of the second biometric information of user B (step ST73) is completed, the time when the biometric authentication result is received from the image processing device 3MB (step ST75), the time of start or completion of execution of a function whose restriction has been lifted based on the result of user B's authentication, the time when the authentication state is canceled, or the time when user B's last operation is performed on the UI unit 23.
  • In the image processing device 3MB, the timing start point may be the time when biometric authentication (step ST74) is completed, or the time when the biometric authentication result is transmitted to the image processing device 3MA (step ST75).
  • The entity that sets the predetermined time and/or the timing start point may be the manufacturer of the image processing device 3MA or 3MB, the administrator of the image processing device 3MA or 3MB, or user B (the user whose biometric information is to be deleted).
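  • A time-based deletion condition of the kind described above (erasure once a predetermined time has elapsed from a configurable timing start point) could look roughly like this; the class and its fields are hypothetical, and an injectable clock value stands in for the device's timer.

```python
# Hypothetical sketch of a time-based deletion condition for steps
# ST79/ST81: the second biometric information is erased once a
# predetermined time has elapsed from the timing start point.
import time

class BiometricCache:
    def __init__(self, retention_seconds):
        self.retention_seconds = retention_seconds  # the "predetermined time"
        self.data = None
        self.start_time = None

    def store(self, second_biometric, now=None):
        self.data = second_biometric
        # The timing start point (e.g., completion of the export, or receipt
        # of the authentication result) is recorded here.
        self.start_time = now if now is not None else time.monotonic()

    def check_deletion(self, now=None):
        """ST79/ST81: erase the cached info when the deletion condition holds."""
        if self.data is None:
            return False
        now = now if now is not None else time.monotonic()
        if now - self.start_time >= self.retention_seconds:
            self.data = None          # ST80/ST82: erasure
            return True
        return False

cache = BiometricCache(retention_seconds=60)
cache.store("template-B", now=0)
print(cache.check_deletion(now=30))   # False: condition not yet satisfied
print(cache.check_deletion(now=60))   # True: info erased
print(cache.data)                     # None
```

`time.monotonic()` is used rather than wall-clock time so that system clock adjustments cannot shorten or extend the retention period.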
  • the deletion condition in step ST79 may include completion of execution of a function whose restrictions have been canceled based on the authentication result.
  • When execution of such a function (for example, printing, scanning, copying, or sending or receiving data) is completed (for example, not including interruptions due to abnormalities, etc.), user B's second biometric information may be deleted.
  • The deletion condition in step ST79 may also be defined as the point in time at which a predetermined operation (in other words, an operation determined in advance) was performed.
  • the deletion condition in step ST81 may be that the image processing device 3MB has received a request to delete the second biometric information of the user B from the image processing device 3MA.
  • For the conditions under which this request is transmitted, the explanation of the deletion conditions in step ST79 may be used.
  • the conditions for transmitting the above-mentioned request and the deletion conditions in step ST79 may be the same or different.
  • the procedure shown in FIG. 10 may be modified as appropriate.
  • The authentication information indicating the validity of user B used in step ST43 (and information based on the authentication information; the same applies hereinafter) is not limited to account information, and may be various kinds of authentication information different from the biometric information to be exported.
  • the second biometric information (detected) of user B that is deleted in step ST80 or ST82 may or may not be reused.
  • the procedure shown in FIG. 8 may be used to reuse the second biometric information deleted in step ST80 or ST82.
  • steps ST49, ST51, and ST52 in FIG. 8 may be replaced with steps ST77, ST79, and ST80, respectively.
  • the restriction release in step ST49 of FIG. 8 may be replaced with the authentication in step ST74, and steps ST51 and ST52 may be replaced with steps ST81 and ST82, respectively.
  • However, steps ST55 and ST56 in FIG. 8 may differ from those of the first embodiment, for example as follows.
  • the image processing device 3MA exports the second biometric information to the image processing device 3MB and requests re-authentication in step ST56.
  • re-authentication by the image processing device 3MB can be realized, for example, without requiring the user B to re-enter biometric information.
  • The re-authentication conditions in step ST55 when performing the operation related to reusing the second biometric information deleted in step ST80 as described above are arbitrary.
  • For example, when establishing a connection from the image processing device 3MA to the image processing device 3MB, authentication by the image processing device 3MB may be required.
  • the re-authentication condition may include the need to establish the connection described above.
  • The establishment of a connection may be a purpose in itself, instructed by user B, or may be necessary in conjunction with the use of some function (for example, sending or receiving data).
  • The connection here may be, for example, a VPN connection.
  • re-authentication condition may be that a connection requiring authentication is unintentionally interrupted.
  • The re-authentication condition in step ST55 may be that the image processing device 3MA requests the image processing device 3MB to authenticate user B.
  • In this case, instead of comparing the second biometric information of user B newly exported from the image processing device 3MA with the first biometric information of user B registered in its own management table DT0, the image processing device 3MB may compare the newly exported second biometric information with the second biometric information previously acquired and not yet deleted in step ST82. Here, there is a high probability that the previously acquired second biometric information better matches the current physical condition of user B than the registered first biometric information, and as a result, the accuracy of authentication is expected to improve.
  • When the second biometric information erased in step ST82 is to be reused as described above, it is assumed that authentication succeeded in step ST74 and the validity of the second biometric information has been confirmed.
  • The request for authentication from the image processing device 3MA may occur, for example, in a situation where user B is requesting new authentication from the image processing device 3MA to remove functional restrictions, or in a situation where, after functional restrictions have been removed, authentication is required to establish a connection (for example, VPN) from the image processing device 3MA to the image processing device 3MB.
  • The second biometric information newly exported from the image processing device 3MA may be newly detected information, or may be reused information from before its deletion in step ST80.
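  • The re-authentication variant described above, in which the previously exported second biometric information is preferred over the registered first biometric information, might be sketched as follows; the function and its arguments are hypothetical, and equality again stands in for biometric matching.

```python
# Hypothetical sketch of re-authentication on the image processing
# device 3MB: the previously exported second biometric information
# (held until step ST82) is used as the reference when available,
# falling back to the registered first biometric information otherwise.

def reauthenticate(cached_second_biometric, registered_first_biometric,
                   newly_exported_second_biometric):
    # Prefer the cached second biometric info: it was detected recently,
    # so it likely reflects user B's current physical condition better.
    reference = (cached_second_biometric
                 if cached_second_biometric is not None
                 else registered_first_biometric)
    return reference == newly_exported_second_biometric

# Cached copy still present (before deletion in step ST82):
print(reauthenticate("recent-template", "old-template", "recent-template"))
# Cached copy already erased, so fall back to the registered info:
print(reauthenticate(None, "old-template", "old-template"))
```

Both calls print `True`; the first exercises the cached-reference path, the second the fallback to the registered first biometric information.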
  • FIG. 11 is a block diagram showing the configuration of a communication system 1 according to the third embodiment.
  • The image processing device 3 has, for example, the configuration described with reference to FIGS. 1 to 3; in FIG. 11, the detection unit 25, control unit 29, image processing unit 31, and auxiliary storage device 45 are extracted and shown.
  • the auxiliary storage device 45 has the management table DT0, a part of which is shown as the comparison table DT1 in FIG.
  • the comparison table DT1 holds account information (ID and password) and one or more pieces of biometric information in association with each other for each user.
  • the server 5 includes, for example, a verification unit 5a and a nonvolatile memory 5b.
  • the verification unit 5a is constructed, for example, similarly to the control unit 29 of the image processing device 3, by the CPU executing a program stored in the ROM and/or the auxiliary storage device.
  • the nonvolatile memory 5b is constituted by, for example, an auxiliary storage device, and stores the verification table DT2.
  • the verification table DT2 holds account information (ID and password) for each user.
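  • The relationship between the comparison table DT1 and the verification table DT2 might be sketched as follows; the dictionary layouts and field names are hypothetical stand-ins for the tables described above.

```python
# Hypothetical sketch of the comparison table DT1 (held by the image
# processing device 3) and the verification table DT2 (held by the
# server 5), as described above.

# DT1: per-user account information plus one or more pieces of
# biometric information, held in association with each other.
comparison_table_DT1 = {
    "userA": {
        "password": "pwA",
        "biometrics": ["fingerprint-A", "face-A"],
    },
}

# DT2: per-user account information only (no biometric information).
verification_table_DT2 = {
    "userA": "pwA",
}

def lookup_account_for_biometric(dt1, detected_biometric):
    """Find the account information associated with matching biometric info."""
    for user_id, record in dt1.items():
        if detected_biometric in record["biometrics"]:
            return {"id": user_id, "password": record["password"]}
    return None

print(lookup_account_for_biometric(comparison_table_DT1, "face-A"))
# -> {'id': 'userA', 'password': 'pwA'}
```

The asymmetry between the two tables reflects the division of roles: biometric matching stays on the device, while the server only ever sees account information.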
  • the image processing device 3 (3MA) performs biometric authentication of the user similarly to the first and second embodiments.
  • When biometric authentication is successful (if an affirmative determination is made in step ST5 or ST12 in FIG. 4 or FIG. 9), the image processing device 3 does not immediately release the restriction on the function, but requests authentication from the server 5.
  • Specifically, if an affirmative determination is made in step ST5, the image processing device 3MA transmits to the server 5 the account information associated, in the comparison table DT1 held by the image processing device 3MA, with the biometric information of user A who has been successfully authenticated. Further, if an affirmative determination is made in step ST12, the image processing device 3MA transmits to the server 5 the account information of user B input in step ST41 of FIG. 7 or FIG. 9.
  • the server 5 that has received the account information determines whether account information that matches the received account information is registered in the verification table DT2. Authentication is thereby performed. That is, if account information that matches the received account information is registered, authentication is successful; otherwise, authentication is unsuccessful. Then, the server 5 transmits the authentication result (success or failure of authentication) to the image processing apparatus 3 (3MA) that is the source of the account information.
  • When the server 5 notifies that authentication was successful, the image processing device 3MA releases the restriction on the function (step ST6 or ST13). If not, the image processing device 3MA restricts the function (step ST7 or ST14). Note that if biometric authentication fails in step ST5 or step ST12, the account information is not transmitted to the server 5 and the functions are restricted (step ST7 or ST14).
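  • The two-stage authentication of the third embodiment (biometric authentication on the device, then account verification by the server 5) can be sketched end to end as follows; all names are hypothetical, and dictionaries stand in for the tables DT1 and DT2.

```python
# Hypothetical end-to-end sketch of the third embodiment: biometric
# authentication on the image processing device, followed by
# account-information verification on the server 5.

def third_embodiment_auth(dt1, dt2, detected_biometric):
    # Stage 1 (step ST5/ST12): biometric authentication on the device.
    account = None
    for user_id, record in dt1.items():
        if detected_biometric in record["biometrics"]:
            account = (user_id, record["password"])
            break
    if account is None:
        return "restricted"   # biometric auth failed: nothing sent to server 5
    # Stage 2: the associated account information is sent to the server 5,
    # which checks it against the verification table DT2.
    user_id, password = account
    if dt2.get(user_id) == password:
        return "released"     # step ST6/ST13: restriction lifted
    return "restricted"       # step ST7/ST14

dt1 = {"userA": {"password": "pwA", "biometrics": ["fp-A"]}}
dt2 = {"userA": "pwA"}
print(third_embodiment_auth(dt1, dt2, "fp-A"))   # released
print(third_embodiment_auth(dt1, dt2, "fp-X"))   # restricted
```

Note how a biometric failure short-circuits the flow before any account information reaches the server, matching the description above.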
  • the third embodiment may be modified as appropriate.
  • the authentication information (and information based on the authentication information; the same applies hereinafter) used for authentication by the server 5 is not limited to account information.
  • the authentication information may be various kinds of authentication information different from the biometric information to be imported or exported.
  • various kinds of authentication information illustrated in Section 1.1.3 may be used.
  • the authentication information may be biometric information to be imported or exported.
  • Information input from the UI unit 23 may not be used; alternatively, the information may be the first biometric information of user B.
  • the information does not have to be input as information for identifying user B (for example, account information) for importing information.
  • When the image processing device 3MA imports the first biometric information of user B from the image processing device 3MB, it may also import the authentication information stored in association with this first biometric information, and send the imported authentication information to the server 5.
  • Removal of functional restrictions based on the authentication result may be performed in various ways. An example is shown below.
  • the function whose restriction is controlled to be lifted based on the authentication result may be, for example, a function related to at least one of the image processing section 31 (printer 19 and/or scanner 21) and the communication section 27.
  • Examples of restricted functions include the following: One or more of the following functions may be appropriately selected and set as a restriction target. Note that the plurality of functions listed below may overlap with each other or may be inseparable from one another.
  • printing by the printer 19 can be cited as a function to be restricted.
  • Printing may be restricted for each subdivided function.
  • For example, printing may be subdivided into printing based on scanning by the scanner 21, printing based on data received by the communication unit 27, and printing based on data stored in the image processing device 3 (auxiliary storage device 45) or in a device connected to the connector 37 (for example, non-volatile memory).
  • the printing restrictions based on the data received by the communication unit 27 may be further subdivided according to the transmission source communication device (for example, another image processing device 3, the server 5 or 7, or the terminal 9). Note that such printing restrictions may be substantially implemented by restricting communication destinations. Furthermore, the printing restrictions based on the data received by the communication unit 27 may be further subdivided according to the mode of communication (normal data communication, email reception, or FAX reception).
  • the printing restrictions based on the data stored in the image processing device 3 may be further subdivided according to the type of box (folder or directory in another expression) in which the data is stored. Note that such printing restrictions may be substantially implemented by restricting access to a box in which highly confidential files (document files and/or image files) are expected to be stored.
  • the printing restrictions based on the data stored in the memory connected to the connector 37 may be further subdivided depending on the type or individual of the connected device. Note that such printing restrictions may be substantially realized by restricting the devices that can be connected to the connector 37 (so-called device control).
  • Scanning by the scanner 21 is another example of a function subject to restriction. Similar to printing, scanning may be restricted for each subdivided function. For example, scanning may be subdivided into scanning for copying (printing), scanning for transmitting data (for example, image data), and scanning for storing data in the image processing device 3 (auxiliary storage device 45) or in a device connected to the connector 37.
  • Scanning for data transmission may be further subdivided depending on the destination communication device (for example, another image processing device 3, server 5 or 7, or terminal 9). Note that such scanning restrictions may be substantially implemented by restricting destinations. Furthermore, scanning for data transmission may be further subdivided depending on the mode of communication (normal data communication, email transmission, or FAX transmission).
  • Scans for storage in the image processing device 3 may be further subdivided according to the type of storage destination box. Note that such scanning restrictions may be substantially implemented by restricting access to a box in which highly confidential files are expected to be stored.
  • Scans for storage in the device connected to the connector 37 may be further subdivided depending on the type or individual of the connected device. Note that such scanning limitations may be substantially implemented by limiting the devices that can be connected to the connector 37.
  • the function to be restricted does not have to be a major function such as printing or scanning.
  • the restricted function may be a function that performs settings related to major functions, such as setting the size of the margins of printed paper.
  • such a function may be regarded as a function of printing with arbitrary margin settings, and may even be regarded as a type of main function.
  • the function to be restricted may be a function used by the administrator of the image processing device 3.
  • For example, the image processing device 3 may accept settings that uniformly (regardless of users' authentication results) prohibit some of the above-mentioned main functions or prohibit connection of a predetermined device to the image processing device 3. Then, the restriction on making such settings may be canceled only for a specific user (the administrator of the image processing device 3).
  • the image processing device 3 has various functions.
  • the functions to be restricted may be all or some of the various functions excluding the authentication function. From another point of view, a user who fails authentication via biometric authentication may be substantially prevented from using the image processing device 3, or may be able to use some functions.
  • The manner in which functional restrictions are lifted when authentication is successful may be common to all users, or may be set individually for each user. In the former case, from another perspective, there may be only two types of users: users for whom functional restrictions are not lifted because they are not authenticated, and users for whom functional restrictions are lifted after authentication; there may be no difference in the usable functions among the users whose restrictions are lifted.
  • The image processing device 3 may also, without performing biometric authentication, perform authentication by comparing input account information with account information registered in itself, another image processing device 3, and/or the server 5, and control the release of restrictions on functions accordingly.
  • In this case, for example, when authentication using only account information is successful, the restriction on a first number of functions may be lifted, and when authentication using biometric information is successful, the restriction on a second number of functions greater than the first number may be lifted.
  • The release of restrictions on functions when authentication is successful can be set individually for each user. Assume that a user who is not authenticated can use neither the first function nor the second function. At this time, among authenticated users there may be two or more of the following types: users who can use only the first function, users who can use only the second function, users who can use both the first function and the second function, and users who, although authenticated, have their functions restricted in the same way as unauthenticated users.
  • The authentication state (the state in which functional restrictions are removed) is canceled at some point. Canceling the authentication state can be rephrased as returning to a non-authenticated state (the state before functional restrictions were released). Canceling the authentication state may involve terminating functions that require authentication (e.g., the VPN connection described below) and/or invalidating (e.g., erasing from memory) information acquired on the premise of authentication (e.g., k described later). Accordingly, cancellation of the authentication state may be recognized by the termination of these operations and/or the invalidation of such information. Furthermore, for example, in a mode in which a flag is set to indicate that authentication has been successful, cancellation may be recognized by the action of lowering the flag. In this case, it is not necessary to terminate operations based on the authentication or to invalidate information obtained based on the authentication.
  • the authentication state may be canceled using various events as triggers. Examples of such events include the following: The user has performed a predetermined operation on the operation unit 33. Processing related to a function that requires authentication (for example, a function to download and print predetermined image data) has been completed. A predetermined time has elapsed since a predetermined time (for example, the time when the last operation on the operation unit 33 was performed). The human sensor detects that the user has left the image processing device 3.
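  • Cancellation of the authentication state triggered by events such as those listed above might be sketched as follows; the event names and the flag-based representation are hypothetical illustrations.

```python
# Hypothetical sketch of authentication-state cancellation: any of the
# example trigger events listed above returns the device to the
# non-authenticated state by lowering a flag.

CANCEL_EVENTS = {
    "logout_operation",        # predetermined operation on the operation unit 33
    "authenticated_job_done",  # processing requiring authentication completed
    "idle_timeout",            # predetermined time since the last operation
    "user_left",               # human sensor detects the user has left
}

class AuthState:
    def __init__(self):
        self.authenticated = False   # flag indicating successful authentication

    def on_event(self, event):
        if self.authenticated and event in CANCEL_EVENTS:
            self.authenticated = False   # "lower the flag": cancel the state
            return True
        return False

state = AuthState()
state.authenticated = True
print(state.on_event("page_printed"))   # False: not a cancellation trigger
print(state.on_event("idle_timeout"))   # True: authentication state canceled
print(state.authenticated)              # False
```

As the description notes, lowering the flag alone may suffice; terminating connections or erasing cached information would be additional actions layered on top of this state change.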
  • FIG. 12A is a schematic diagram showing the authority table DT3 used for controlling the release of function restrictions.
  • The authority table DT3 holds IDs and authority information D3 in association with each other. The authority information D3 is information that specifies, for each function, whether its restriction can be canceled; printing and scanning are illustrated in FIG. 12A.
  • the authority table DT3 may be held by the image processing device 3, for example.
  • the image processing device 3 may have an authority table DT3 as a part of its own management table DT0 (FIG. 1).
  • the authority level (described later) is illustrated as the "functional restriction" of the management table DT0, but the information held in the "functional restriction” is different from the example of FIG. 1, and is the authority information D3. It's good.
  • the image processing device 3 holds user authority information D3 registered in its own management table DT0.
  • When user A is successfully authenticated, the image processing device 3MA refers to the authority information D3 linked to the ID of user A and controls the release of restrictions on functions.
  • When user B is successfully authenticated, the image processing device 3MA imports from the image processing device 3MB the authority information D3 linked to the ID of user B, and controls the release of restrictions on functions. This import may occur, for example, at any appropriate time before or after successful authentication. For example, in the first embodiment it may be performed in step ST45 or ST49 (ST14) in FIG. 7, and in the second embodiment it may be performed in step ST75 or ST77 (ST14) in FIG. 10.
  • the management table DT0 may hold information on authority levels linked to IDs.
  • the image processing device 3 has a table that links authority levels with information on restrictions for each function (see authority information D3). Then, the image processing device 3MA identifies the authority level of the user A who has been successfully authenticated by referring to its own management table DT0.
  • the image processing device 3MA imports the authority level information of the user B who has been successfully authenticated from the image processing device 3MB.
  • the timing of import is arbitrary, similar to the timing of importing authority information D3 above.
  • the image processing device 3MA refers to the authority information D3 linked to the specified authority level and controls the release of the restriction on the function. Note that this aspect can be regarded as an aspect in which the authority table DT3 is divided into a table that associates IDs and authority levels, and a table that associates authority levels and authority information D3.
  • the authority table DT3 may be stored in the server 5 in addition to or instead of the image processing device 3. Then, the image processing device 3MA may acquire the authority information D3 from the server 5 when the user who has been successfully authenticated is the user B (or regardless of whether it is the user A or the user B).
  • In a mode in which the authority table DT3 is divided into a table that associates IDs with authority levels and a table that associates authority levels with authority information D3, both tables may be stored in the server 5, for example, in addition to or instead of the image processing device 3. The operation in this case is the same as above. Further, only the table linking authority levels and authority information D3 may be stored in the image processing device 3. Then, if the successfully authenticated user is user B (or in both cases of users A and B), the image processing device 3MA acquires the authority level from the server 5 and, referring to the authority information D3 linked to the acquired authority level, controls the release of restrictions on functions.
  • In other words, the table that holds IDs and function restriction information in association with each other may be one table that holds both pieces of information, or two tables divided on the basis of the authority level.
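  • The two-table variant of the authority table DT3 (IDs to authority levels, and authority levels to authority information D3) might be sketched as follows; the level values and per-function flags are hypothetical.

```python
# Hypothetical sketch of the authority table DT3 split into two tables:
# one associating IDs with authority levels, and one associating
# authority levels with authority information D3 (per-function
# restriction-release flags, as illustrated in FIG. 12A).

id_to_level = {"userA": 2, "userB": 1}

level_to_authority_D3 = {
    1: {"print": True,  "scan": False},
    2: {"print": True,  "scan": True},
}

def authority_info(user_id):
    """Resolve a user's authority information D3 via the authority level."""
    level = id_to_level.get(user_id)
    if level is None:
        return {}          # unknown user: no restrictions lifted
    return level_to_authority_D3[level]

print(authority_info("userA"))   # {'print': True, 'scan': True}
print(authority_info("userB"))   # {'print': True, 'scan': False}
```

Splitting the table this way means the per-function flags are defined once per level, and either table can be relocated to the server 5 independently, as described above.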
  • the functions for which restrictions are lifted may differ depending on the type of authentication information and/or authentication method.
  • the content (or authority level) of the authority information D3 may differ depending on the type of authentication information and/or authentication method, for example.
  • When the image processing device 3MA can specify the type of authentication information and/or the authentication method, unlike the above, the image processing device 3MB and/or the server 5 may provide the same authority information D3 (or authority level) regardless of that type, and the image processing device 3MA may refrain from lifting the restriction on a specific function among the functions whose restrictions would be lifted according to the authority information D3.
  • the image processing device 3MB and/or the server 5 may modify the authority information D3 according to the type of authentication information and/or authentication method.
  • the processing procedure is as follows.
  • the image processing device 3 determines whether the current user has the authority to execute the predetermined function.
  • the image processing device 3 refers to the authority information that it owns or has previously acquired from the outside (another image processing device 3 or the server 5).
  • if the image processing device 3 determines that the user has the authority, it executes the predetermined function; otherwise, it does not execute the predetermined function.
  • the display unit 35 may display that the user is not authorized (or that the user has not been authenticated).
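  • the determination procedure above can be sketched as follows. This is a minimal illustration; the function name and callback-based interface are assumptions, not part of the specification:

```python
# Hypothetical sketch of the authority check: execute the predetermined
# function only if the current user holds the authority for it; otherwise
# show a message instead. All names are illustrative assumptions.
def try_execute(user_authority, function_name, execute, show_message):
    """Run the function if authorized; otherwise display a notice."""
    if user_authority.get(function_name, False):
        execute(function_name)
        return True
    show_message("You are not authorized to use this function.")
    return False
```

The `execute` and `show_message` callbacks stand in for the image processing unit 31 and the display unit 35, respectively.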
  • the operation of lifting a restriction on a function can take various forms. When the image processing device 3 executes a predetermined function permitted by the user's authority information, it can be considered that the restriction on that function has been lifted. Further, when the image processing device 3 becomes capable of executing a predetermined function according to the user's authority information, a predetermined flag is usually set internally in the image processing device 3; the setting of this flag may also be considered an example of an operation in which a restriction on a function is lifted. Further, the operation of acquiring authority information from the outside (another image processing device 3 or the server 5) may also be considered an example of the operation of lifting a restriction on functions. Furthermore, the operation of determining whether the user has the authority for a specified function whose execution has been instructed, and executing that function if the user has the authority, can also be taken as an example of an operation that lifts a restriction on a function.
  • menu screen related to functional restrictions: when controlling the release of functional restrictions based on the authentication result, settings for the menu screen displayed on the display unit 35 may also be made. This setting may be performed for each user. Specifically, it is as follows.
  • the menu screen is, for example, a screen (image) that includes one or more options on the GUI.
  • when one of the options is selected, a process corresponding to that option is executed.
  • when the operation unit 33 and the display unit 35 are configured as a touch panel, pressing one of the options displayed on the display unit 35 with a finger or a touch pen executes the corresponding process.
  • the processes corresponding to the options shown on the menu screen of the image processing device 3 may be various processes.
  • the options may correspond to processes that initiate operations related to major functions such as printing, scanning, copying, FAX transmission, and FAX reception (although these are not necessarily separable concepts).
  • the options may also be processes for making settings related to the above operations. Such settings include, for example, paper size selection, print magnification, and print density.
  • as described above, the main functions may be subdivided as appropriate and authorities set for the subdivided functions; that description of subdivision may also be applied to the subdivision of options as appropriate.
  • the menu screen for each user may, for example, reflect the preferences of each user and/or the authority of each user.
  • the former includes, for example, adjusting the position, size, color, shape, etc. of a specific option within the screen 35a to suit the user's preference.
  • examples of the latter include a screen in which options for a given function are displayed differently depending on whether the user has authority for that function. More specifically, examples include a screen in which options have different colors depending on the presence or absence of authority, and a screen in which only options for which the user has authority are displayed (options for which the user does not have authority are hidden).
  • controlling the display of the menu screen in this case may be regarded as an example of controlling the release of functional restrictions.
  • the menu screen settings for each user based on the authentication results may be set to only two types: a menu screen for users who have been successfully authenticated, and a menu screen for other users. Furthermore, for example, it may be possible to set different menu screens to different users who have successfully authenticated. The menu screen may not be displayed to users whose authentication is not successful. In an embodiment in which multiple types of authentication methods can be selected, the menu screens may or may not be different depending on the authentication method.
  • the image processing device 3 may be capable of displaying a main menu screen that is initially displayed and one or more submenu screens that are displayed by selecting an option on the main menu screen.
  • the menu screen set for each user may be the main menu screen, at least one of the one or more submenu screens, or both. Further, the menu screen settings for each user may determine whether a submenu screen can be displayed at all, or how many of a plurality of submenu screens can be displayed.
  • menu screen settings described above may be realized in various more specific ways. An example is shown below.
  • FIG. 12B is a schematic diagram showing the configuration of the menu table DT7 used for setting the menu screen.
  • the menu table DT7 stores an ID and menu information D7 that specifies the mode of the menu screen (in other words, the settings of the menu screen) in association with each other.
  • in FIG. 12B, the menu information D7 is represented by a schematic diagram of the screen 35a of the display unit 35 (touch panel). For example, a plurality of buttons IGF are displayed on the screen 35a, and selectively operating (for example, tapping) one of the buttons IGF executes processing related to the function corresponding to that button.
  • the image processing device 3 controls the menu screen displayed on the display unit 35 by referring to the menu information D7 associated with the user who has been successfully authenticated.
  • the description of the authority table DT3 may be appropriately used in the menu table DT7 (menu information D7).
  • the menu table DT7 may be held by the image processing device 3 or by the server 5.
  • Menu table DT7 may or may not be integrated with management table DT0 (and/or comparison table DT1).
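  • as a hedged sketch, the per-user menu control using the menu table DT7 might look like the following. The dictionary representation, option names, and the default menu are illustrative assumptions, not part of the specification:

```python
# Hypothetical sketch of the menu table DT7: each ID maps to menu
# information D7 describing which options (buttons IGF) to show for
# that user. Contents and names are illustrative assumptions.
MENU_TABLE_DT7 = {
    "userA": {"options": ["print", "scan", "copy", "fax"]},
    "userB": {"options": ["print", "copy"]},  # only authorized options shown
}
DEFAULT_MENU = {"options": ["print"]}  # fallback for users without settings

def menu_for(authenticated_id):
    """Return the menu screen settings for the successfully authenticated user."""
    return MENU_TABLE_DT7.get(authenticated_id, DEFAULT_MENU)
```

The display unit 35 would then render only the options listed in the returned menu information, which is one way the display of the menu screen can reflect each user's authority.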
  • the function whose restrictions are lifted based on the authentication result may be a VPN connection. Specifically, it is as follows.
  • a VPN, for example, virtually extends a private network into the public network 11.
  • a VPN logically divides a physically single network including the public network 11. Thereby, for example, communication via the public network 11 is performed in a secure environment.
  • Such virtual expansion or logical division is achieved, for example, by authentication, tunneling, and encryption.
  • communication using a VPN may be performed through authentication and tunneling without being encrypted.
  • tunneling can also be considered a type of encryption.
  • authentication methods include, for example, those using account information (ID and password), those using static keys, those using a common (shared) key, those using a combination of a private key and a public key, those using electronic signatures, those using electronic certificates, those using security tokens, and those combining two or more of the above (for example, multi-factor authentication).
  • in tunneling, an operation is performed so that two points that are physically or logically separated via a network can be treated as if they were the same point.
  • Tunneling is achieved, for example, by encapsulation.
  • in encapsulation, for example, an entire packet is embedded during communication in the payload of another protocol, the payload of another layer, or the payload of the same layer.
  • Tunneling may be performed at any appropriate layer, for example at layer 3 (network layer) or layer 2 (data link layer).
  • Encryption converts information sent and received into a format that cannot be read by third parties. Encryption may be performed only on the payload, or on both the header and the payload. In another aspect, encryption may be performed at any appropriate layer, eg, at the network layer, transport layer, and/or session layer. An appropriate encryption method may be used. For example, encryption methods include those that use a common key and those that use a combination of a private key and a public key.
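  • as a conceptual illustration of the encapsulation used in tunneling (not a model of any real VPN protocol such as L2TP or IPsec; the header strings are arbitrary stand-ins), the entire inner packet can be embedded as the payload of an outer packet:

```python
# Toy illustration of encapsulation: the whole inner packet (header plus
# payload) becomes the payload of an outer packet. Header contents are
# arbitrary stand-ins, not any real protocol format.
def encapsulate(inner_packet: bytes, outer_header: bytes) -> bytes:
    # the entire inner packet is carried as the outer packet's payload
    return outer_header + inner_packet

def decapsulate(outer_packet: bytes, outer_header_len: int) -> bytes:
    # strip the outer header to recover the original inner packet
    return outer_packet[outer_header_len:]

inner = b"IP-HDR|payload-data"
tunneled = encapsulate(inner, b"TUN-HDR|")
```

At the far end of the tunnel, decapsulation recovers the inner packet unchanged, which is why the two tunnel endpoints can be treated as if they were the same point.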
  • the type of VPN may be selected as appropriate.
  • the VPN of the communication system 1 may be a remote access type VPN and/or a LAN type (intersite) VPN.
  • in a remote access type VPN, for example, VPN client software is installed on a communication device such as the image processing device 3, and the communication device directly establishes a VPN connection to the server 5 acting as a VPN server.
  • in a LAN type VPN, for example, a VPN gateway connects LANs (sites) to each other via the VPN.
  • in the following, the operation of the image processing device 3 that functions as a client of a remote access VPN is taken as an example.
  • the public network 11 may take various forms. From the viewpoint of VPN types, they are as follows.
  • the VPN may be an Internet VPN in which the public network 11 includes the Internet.
  • the VPN may be an IP (Internet Protocol) VPN, an entry VPN, or a wide-area Ethernet, in which the public network 11 includes a closed network provided by a communications carrier or the like.
  • the protocol for the VPN may be a known one, a new one, or one uniquely defined by the administrator of the server 5.
  • Known protocols for remote access VPNs include, for example, a combination of L2TP (Layer 2 Tunneling Protocol) and IPsec (Security Architecture for Internet Protocol), and PPTP (Point to Point Tunneling Protocol).
  • FIG. 13 is a flowchart illustrating a specific example of the above operation.
  • the image processing device 3 is a remote access VPN client that communicates with the server 5 as a VPN server (for example, the image processing device 3A or 3B in FIG. 1).
  • the data processing device 49 is a device that communicates with the image processing device 3 via a VPN (from another perspective, the server 5 as a VPN server). Examples of the data processing device 49 include another image processing device 3, the server 7, and the terminal 9.
  • although the data processing device 49 may be the server 5, FIG. 13 shows an example in which the two are separate.
  • the data processing device 49 that is not the server 5 may be included in the private network 13A including the server 5 (3C, 7 or 9A), or may not be included (3A, 3B or 9B). In FIG. 13, the latter is taken as an example.
  • the process shown in FIG. 13 is started, for example, when the VPN connection start condition is satisfied in the image processing device 3.
  • the start condition may be, for example, that a predetermined operation instructing a VPN connection is performed on the operation unit 33. Further, the start condition may be that a task that requires a VPN connection (for example, an operation of downloading and printing image data from the data processing device 49) is performed on the operation unit 33. When such a task is instructed, the start condition may be satisfied when the user is asked whether or not to make a VPN connection, and as a result, a predetermined operation instructing the VPN connection is performed. Further, the start condition may be that a predetermined signal is input from an external communication device (for example, the terminal 9).
  • although step ST5 for user A is illustrated in FIG. 13, the step shown as step ST5 may be step ST12 for user B.
  • authentication by the server 5 of the third embodiment may be performed subsequently.
  • the image processing device 3 requests authentication for the VPN connection from the server 5.
  • the image processing device 3 transmits the account information of the user who has been successfully authenticated to the server 5 (step ST29). Then, the server 5 determines whether the received account information is registered in the table it holds (see verification table DT2 in FIG. 11). Authentication is thereby performed. Then, the server 5 transmits the authentication result (success or failure of authentication) to the image processing device 3 that is the source of the account information (step ST30).
  • in step ST29, similarly to the transmission of account information in FIG. 11, if an affirmative determination is made in step ST5, the image processing device 3MA transmits to the server 5 the account information that is associated, in the comparison table DT1 held by the image processing device 3MA, with the biometric information of user A, who has been successfully authenticated. Further, if an affirmative determination is made in step ST12, the image processing device 3MA transmits to the server 5 the account information of user B input in step ST41 of FIG. 7 or FIG. 10.
  • the authentication in steps ST29 and ST30 may be performed by the server 5 after the biometric authentication is successful and before the function restriction is lifted.
  • authentication may be performed based on authentication information other than account information, or authentication information imported from the image processing device 3MB may be used.
  • the process may be started from step ST29.
  • the VPN connection may be automatically established when the authentication described in the first to third embodiments is successful.
  • successful authentication may be a condition for starting a VPN connection.
  • a VPN connection may be established between the image processing device 3 and the server 5 before the authentication described in the first to third embodiments is performed.
  • the authentication information held by the image processing device 3 is transmitted to the server 5.
  • Authentication related to the VPN connection may also be performed. Even in this case, by requiring the authentication described in the first to third embodiments to execute functions using a VPN connection, the probability that a third party will illegally use the VPN connection is reduced. be done.
  • FIG. 13 exemplifies the operation of downloading image data from the data processing device 49 and printing it. Specifically, it is as follows.
  • in step ST31, the image processing device 3 transmits a signal requesting download of image data to the server 5 via the VPN.
  • the image data here may be general image data or image data as a print job.
  • in step ST32, the server 5 transmits (transfers) a signal requesting the image data to the destination (here, the data processing device 49) specified by the information included in the received signal.
  • when the data processing device 49 is a communication device outside the private network 13A that includes the server 5, the transmission may be performed via a VPN (as in the illustrated example).
  • in this case, the data processing device 49 is connected to the server 5 via the VPN in advance, before step ST32.
  • the data processing device 49 is a communication device included in the private network 13A, normal communication within the private network 13A may be performed.
  • in step ST33, the data processing device 49 transmits the requested image data to the server 5.
  • as in step ST32, if the data processing device 49 is located outside the private network 13A, a VPN may be used (as in the illustrated example); if the data processing device 49 is located inside the private network 13A, normal communication within the private network 13A may take place.
  • in step ST34, the server 5 transmits (transfers) the received image data to the image processing device 3. Transmission at this time is performed via the VPN.
  • in step ST35, the image processing device 3 executes printing based on the received image data.
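  • the sequence of steps ST31 to ST35 can be sketched as message passing among the three parties. The classes below are illustrative assumptions only, and omit authentication, tunneling, and encryption:

```python
# Hypothetical sketch of the download-and-print sequence: the image
# processing device asks the VPN server to fetch image data from the
# data processing device, then prints it. All names are assumptions.
class DataProcessingDevice:
    def __init__(self, images):
        self.images = images
    def fetch(self, name):                 # step ST33: return requested data
        return self.images[name]

class VpnServer:
    def __init__(self, data_device):
        self.data_device = data_device
    def request_download(self, name):      # steps ST32/ST34: relay the request
        return self.data_device.fetch(name)

class ImageProcessingDevice:
    def __init__(self, vpn_server):
        self.vpn_server = vpn_server
        self.printed = []
    def download_and_print(self, name):    # steps ST31/ST35
        data = self.vpn_server.request_download(name)
        self.printed.append(data)          # stand-in for actual printing
        return data
```

In the actual system, each hop marked as a VPN step would traverse the tunnel established beforehand rather than a direct method call.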
  • the VPN server to which the image processing device 3 makes a VPN connection may or may not be selectable by the user using the image processing device 3.
  • the image processing device 3 may be able to select a connection destination only from two or more VPN servers that make up one VPN, or may be able to select a connection destination from two or more VPN servers that make up two or more different VPNs. It may also be possible to select a connection destination.
  • a VPN connection may be disconnected when appropriate disconnection conditions are met.
  • the disconnection condition may be that a predetermined operation instructing disconnection is performed on the operation unit 33.
  • the disconnection condition may be that the task is completed.
  • the disconnection condition may be that the authentication state has been canceled. Note that examples of conditions for canceling the authentication state have already been described.
  • in the above, the operation in which the image processing device 3 receives image data from the data processing device 49 and performs printing was taken as an example, but various other operations using the VPN are possible.
  • information (eg, image data) acquired by the scanner 21 may be transmitted to the data processing device 49 via the VPN.
  • as described above, in one embodiment, the image processing device 3 includes the image processing section 31, the input section (detection section 25), the memory (auxiliary storage device 45), and the control section 29.
  • the image processing section 31 includes at least one of a printer 19 and a scanner 21.
  • the detection unit 25 receives input of the user's biometric information.
  • the auxiliary storage device 45 stores first biometric information for each user.
  • the control unit 29 controls the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares the second biometric information input to the detection unit 25 with the first biometric information stored in the auxiliary storage device 45 (steps ST1 to ST7).
  • the image processing device 3MA inputs the first biometric information of the user B imported from the other image processing device 3MB in response to the input of information identifying the first user (user B) (for example, the input of account information in step ST41). It is also possible to control the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares with the second biometric information of user B (steps ST8 to ST14).
  • the effects described in the overview of the embodiment are achieved.
  • convenience for user B can be improved, storage capacity of image processing device 3MA can be saved, and/or load on network 10 can be reduced.
  • since the image processing device 3MA imports the first biometric information in response to information identifying user B (for example, an ID) being input to the image processing device 3MA, it can request from another image processing device 3 (for example, 3MB) only the first biometric information needed at that time (that is, the first biometric information of user B). From another perspective, the image processing device 3MA can delete the first biometric information of user B after using it.
  • the information identifying the user B is not limited to the ID (or account information), and may be other authentication information (excluding the second biometric information), for example.
  • the image processing device 3 may further include a UI section 23 into which account information is input.
  • Account information may include an ID and password.
  • the image processing device 3MA transfers the account information of user B input to the UI unit 23 to the image processing device 3MB (step ST42 in FIG. 7).
  • the image processing device 3MB can perform authentication based on the account information sent from the image processing device 3MA, and export the first biometric information of user B only when the authentication is successful.
  • the probability of unintended leakage of user B's first biometric information is reduced.
  • since account information is used, the procedure for user B is simpler than when other authentication information is used.
  • when erasure conditions including the passage of a predetermined time are satisfied, the image processing device 3MA may delete the first biometric information of the first user (user B) imported from another image processing device 3MB (step ST52 in FIG. 7).
  • the imported biometric information is automatically deleted, reducing the possibility that the biometric information will be unintentionally leaked. Furthermore, by appropriately setting the predetermined time, it is possible to balance improved convenience by reusing imported biometric information and improved security by erasing imported biometric information.
  • when erasure conditions including completion of execution of the function related to the image processing unit 31 whose restriction was lifted based on the authentication result are satisfied, the image processing device 3MA according to the first embodiment (or the third embodiment) may delete the first biometric information of the first user (user B) imported from another image processing device 3MB (step ST52 in FIG. 7).
  • the imported biometric information is deleted every time the execution of the function for which the restriction has been lifted is completed, which improves security.
  • the image processing device 3 may further include a UI section 23 that receives user operations.
  • the image processing device 3MA may delete the first biometric information of the first user (user B) imported from another image processing device 3MB when deletion conditions including a predetermined operation performed on the UI unit 23 are satisfied (step ST52 in FIG. 7).
  • user B can delete the imported biometric information himself.
  • regardless of the presence or absence and/or settings of other conditions included in the deletion conditions (for example, deletion after a predetermined time has passed), user B can reduce the possibility of unintended leakage of his or her biometric information.
  • the image processing device 3MA automatically deletes the first biometric information of the user B imported from another image processing device 3MB when a predetermined automatic deletion condition is met. (without accepting an operation for erasing the first biometric information) (step ST52 in FIG. 7).
  • after the image processing device 3MA compares the second biometric information of user B with the first biometric information of user B imported from the image processing device 3MB and authentication succeeds (after the affirmative determination in step ST48), and before the automatic deletion condition is satisfied (before the affirmative determination in step ST51), re-authentication may be performed by comparing the second biometric information of user B re-entered into the input unit (detection unit 25) with the first biometric information of user B imported from the image processing device 3MB (step ST56 in FIG. 8).
  • the frequency with which user B's first biometric information is imported is reduced, and the burden on the network 10 is reduced.
  • the imported first biometric information is automatically deleted, the possibility of unintended leakage of the first biometric information is reduced.
  • the image processing device 3MA may retain the second biometric information until a predetermined deletion condition is satisfied, perform re-authentication by comparing newly detected biometric information with the retained second biometric information, and control the release of restrictions on functions according to the result of the re-authentication (step ST56 in FIG. 8).
  • in this case, the second biometric information corresponding to the user's current physical condition is used instead of the first biometric information, so the probability that re-authentication fails due to the user's physical condition or the like is reduced. Furthermore, since the second biometric information is eventually erased, the storage capacity of the image processing device 3MA is saved.
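  • the retention and re-authentication behavior described above can be sketched as follows. This is an illustrative assumption; real biometric matching compares templates with a similarity score rather than equality, and all names are hypothetical:

```python
import time

# Hypothetical sketch: the device caches the second biometric information
# until a deletion deadline and re-authenticates against the cached sample
# instead of re-importing first biometric information over the network.
class BiometricCache:
    def __init__(self, retention_seconds, clock=time.monotonic):
        self.retention = retention_seconds
        self.clock = clock
        self.sample = None
        self.expires = 0.0

    def store(self, second_biometric):
        """Retain the second biometric information after a successful authentication."""
        self.sample = second_biometric
        self.expires = self.clock() + self.retention

    def reauthenticate(self, new_sample):
        """Compare newly detected biometric information with the retained sample."""
        if self.sample is None or self.clock() >= self.expires:
            self.sample = None      # deletion condition met: erase the cache
            return False            # full authentication is required again
        return new_sample == self.sample  # stand-in for real template matching
```

Passing an injectable `clock` makes the deletion condition testable; once the retention period elapses, the cached sample is erased and re-authentication fails until a full authentication is performed again.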
  • as described above, in another embodiment, the image processing device 3 includes the image processing section 31, the input section (detection section 25), the memory (auxiliary storage device 45), and the control section 29.
  • the image processing section 31 includes at least one of a printer 19 and a scanner 21.
  • the detection unit 25 receives input of the user's biometric information.
  • the auxiliary storage device 45 stores first biometric information for each user.
  • the control unit 29 controls the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares the second biometric information input to the detection unit 25 with the first biometric information stored in the auxiliary storage device 45 (steps ST1 to ST7).
  • the image processing device 3MA can also export the second biometric information of the first user (user B) to another image processing device 3MB (step ST63 in FIG. 9), and control the release of restrictions on functions related to the image processing unit 31 based on the authentication result received from the image processing device 3MB (step ST13).
  • the image processing device 3 may further include a UI section 23 into which account information is input.
  • Account information may include an ID and password.
  • the image processing device 3MA may transmit the account information of the first user (user B) input to the UI unit 23 to the other image processing device 3MB (step ST42 in FIG. 10), and, when notified that authentication based on the account information has succeeded (if an affirmative determination is made in step ST72), may export the second biometric information of user B to the image processing device 3MB (step ST73).
  • the probability that the image processing device 3MA exports the second biometric information of the user B to the wrong destination is reduced. That is, the probability that the second biological information will be unintentionally leaked is reduced. Furthermore, since account information is used, the procedure for user B is simpler than in the case where other authentication information is used.
  • the image processing device 3MA may retain the second biometric information of the first user (user B) until a predetermined deletion condition is satisfied (step ST79), and, if a predetermined re-authentication condition is satisfied before then, may export the second biometric information of user B again to the image processing device 3MB (step ST56 in FIG. 8).
  • the image processing device 3 may be capable of registering a plurality of types of first biometric information for each user in the memory (auxiliary storage device 45).
  • the image processing device 3MA may further include a UI unit 23 that receives an input by the user of information specifying another image processing device 3MB (import source or export destination) (see FIG. 5C).
  • the image processing device 3MA does not need to inquire of the plurality of other image processing devices 3 whether they hold the first biometric information of the user B or not.
  • the burden on the communication system 1 is reduced.
  • the communication system 1 includes a large number of image processing devices 3, the burden on the image processing devices 3 and the burden on the network 10 is reduced.
  • the image processing device 3MA may further include a display unit 35 that displays a plurality of candidates for the other image processing device 3MB (import source or export destination) (see FIG. 5C).
  • the burden on user B is reduced.
  • when the image processing device 3MB can be selected by operating one of the plurality of candidates, or a button (software key or hardware key) associated with the candidates, the burden on the user is further reduced compared to a mode in which the address of the image processing device 3MB is input.
  • the memory may store first biometric information and authentication information (for example, account information) in association with each other for each user.
  • the image processing device 3MA may transmit, to the external authentication device (server 5), information based on the authentication information stored in association with the first biometric information that matches the second biometric information, receive the authentication result based on that information from the server 5, and control the release of restrictions on functions related to the image processing unit 31 based on the received authentication result.
  • the above is the operation when the user is user A.
  • when the user is user B, the authentication information input by user B, or the authentication information imported from the image processing device 3MB, may be used instead of the authentication information stored in the image processing device 3MA to perform operations similar to those described above.
  • since the restriction on the function is lifted through two-step authentication, namely biometric authentication (local authentication) and authentication at the server 5, security is improved.
  • when user A uses the image processing device 3MA, user A can have functional restrictions lifted without inputting account information, which improves convenience.
  • the memory may store first biometric information and authentication information (for example, account information) in association with each other for each user.
  • the image processing device 3MA may transmit to the server 5 information based on the authentication information stored in association with the first biometric information that matches the second biometric information (step ST29 in FIG. 13).
  • the above is the operation when the user is user A.
  • when the user is user B, the authentication information input by user B, or the authentication information imported from the image processing device 3MB, may be used instead of the authentication information stored in the image processing device 3MA to perform operations similar to those described above.
  • the VPN connection will undergo two-step authentication, similar to the third embodiment.
  • security is improved.
  • when user A uses the image processing device 3MA, user A can make a VPN connection without inputting account information, which improves convenience.
  • the communication system 1 includes an image processing device 3MA as described above and another image processing device 3MB.
  • the communication system 1 may include the image processing device 3MA of the first embodiment (or the third embodiment) and another image processing device 3MB.
  • the image processing device 3MA may further include a UI unit 23 into which account information is input, and may transmit the account information of the first user (user B) input to the UI unit 23 to the image processing device 3MB.
  • Account information may include an ID and password.
  • the image processing device 3MB may store the first biometric information in association with account information for each user, and may export to the image processing device 3MA the first biometric information stored in association with account information that matches the account information received from the image processing device 3MA (step ST45 in FIG. 7).
  • the image processing device 3MA may perform authentication by comparing the first biometric information of the user B exported from the image processing device 3MB and the second biometric information of the user B (step ST48).
  • in this case, the image processing device 3MB performs authentication based on the account information sent from the image processing device 3MA, and exports the first biometric information of user B only when the authentication is successful.
  • the probability of unintended leakage of user B's first biometric information is reduced.
  • since account information is used, the procedure for user B is simpler than when other authentication information is used.
  • the communication system 1 may include an image processing device 3MA according to the second embodiment (or the third embodiment) and another image processing device 3MB.
  • the image processing device 3MA may further include a UI unit 23 into which account information is input, and may transmit the account information of the first user (user B) input to the UI unit 23 to the image processing device 3MB (step ST42 in FIG. 10).
  • Account information may include an ID and password.
  • the image processing device 3MB may store the first biometric information and account information for each user in association with each other, and, if account information that matches the account information received from the image processing device 3MA is stored, may notify the image processing device 3MA that the authentication has succeeded (step ST71 in FIG. 10).
  • the image processing device 3MA may export the second biometric information of the user B to the image processing device 3MB (step ST73).
  • the image processing device 3MB may perform authentication by comparing the second biometric information of the user B exported from the image processing device 3MA with the first biometric information of the user B (step ST74), and may transmit the result to the image processing device 3MA (step ST75).
  • the probability that the image processing device 3MA exports the second biometric information of the user B to the wrong destination is reduced. That is, the probability that the second biometric information will be unintentionally leaked is reduced. Furthermore, since account information is used, the procedure for user B is simpler than when other authentication information is used.
  • the other image processing device 3MB may delete the second biometric information exported from the image processing device 3MA when deletion conditions, including that a predetermined time has elapsed, are met (step ST82 in FIG. 10).
  • the probability of unintended leakage of the exported second biometric information is reduced. Furthermore, by appropriately setting the above predetermined time, it is possible to balance improved convenience from reusing the second biometric information against improved security from erasing it.
  • the other image processing device 3MB may delete the exported second biometric information of the first user (user B) when deletion conditions are met that include receiving, from the image processing device 3MA, a request to erase the second biometric information of the first user (user B) (step ST82 in FIG. 10).
  • the probability of unintended leakage of the exported second biometric information is reduced.
  • the second biometric information held by the image processing device 3MB is deleted depending on the status of the image processing device 3MA.
  • the user B can control the deletion of the second biometric information held by the image processing device 3MB, even though the user B is far from the image processing device 3MB. This can improve convenience and/or security, for example.
  • the other image processing device 3MB may hold the second biometric information of the first user (user B) until a predetermined deletion condition is met; if the second biometric information of user B is exported again from the image processing device 3MA before the deletion condition is met, the image processing device 3MB may perform authentication by comparing the re-exported second biometric information of user B with the held second biometric information, and may transmit the result of the authentication to the image processing device 3MA (see step ST56 in FIG. 5).
  • the exported second biometric information can be used in place of the registered first biometric information to perform re-authentication. This reduces the probability that re-authentication will fail due to the user's physical condition or the like.
  • the image processing device 3MA is an example of an image processing device.
  • the image processing device 3MB is an example of another image processing device.
  • the detection unit 25 is an example of an input unit.
  • the auxiliary storage device 45 is an example of memory.
  • the server 5 is an example of an external authentication device.
  • Account information is an example of authentication information.
  • the image processing device need not be a multifunction device including a printer and a scanner; it may be a device that has only a printing function (i.e., a printer in the narrow sense) or only a scanner function (i.e., a scanner in the narrow sense).
  • the multifunction peripheral may be regarded as a printer (in a broad sense) or a scanner (in a broad sense).
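The caching and deletion behavior summarized in the points above — the other image processing device 3MB holding exported second biometric information until a predetermined time elapses or an erase request arrives from 3MA, and reusing the held copy for re-authentication until then — can be sketched as follows. This is an illustrative sketch only: the class and method names are hypothetical, not from the publication, and byte-string equality stands in for real biometric template matching.

```python
class ExportedBiometricCache:
    """Sketch of how device 3MB might hold second biometric information
    exported from device 3MA until a deletion condition is met."""

    def __init__(self, ttl_seconds: float):
        self._ttl = ttl_seconds          # the "predetermined time"
        self._entries = {}               # user_id -> (biometric, stored_at)

    def store(self, user_id: str, biometric: bytes, now: float):
        # Keep the exported second biometric information with a timestamp.
        self._entries[user_id] = (biometric, now)

    def _expire(self, now: float):
        # Deletion condition 1: the predetermined time has elapsed
        # (corresponds to step ST82 in FIG. 10).
        expired = [u for u, (_, t) in self._entries.items()
                   if now - t >= self._ttl]
        for user_id in expired:
            del self._entries[user_id]

    def erase(self, user_id: str):
        # Deletion condition 2: an erase request received from device 3MA.
        self._entries.pop(user_id, None)

    def reauthenticate(self, user_id: str, biometric: bytes, now: float) -> bool:
        # Until deletion, the held second biometric information can be
        # compared against a newly exported copy in place of the
        # registered first biometric information (re-authentication).
        self._expire(now)
        entry = self._entries.get(user_id)
        return entry is not None and entry[0] == biometric
```

Passing `now` explicitly rather than reading the clock inside the class keeps the expiry logic deterministic and testable; a real device would use its own timer.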

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Facsimiles In General (AREA)

Abstract

In an image processing device according to the present invention, a control unit controls cancelling of functional limitations relating to an image processing unit on the basis of the result of authentication in which second biometric information input to an input unit is compared with first biometric information saved in a memory. The image processing device can also control cancelling of functional limitations relating to the image processing unit on the basis of the result of authentication in which first biometric information of a first user, imported from another image processing device in response to an input of information identifying the first user, is compared with second biometric information of the first user. Alternatively or in addition, the image processing device can also export second biometric information of a first user to another image processing device, receive the result of authentication based on the second biometric information of the first user from the other image processing device, and control cancelling of functional limitations relating to the image processing unit on the basis of the received authentication result.

Description

Image processing device and communication system
 The present disclosure relates to an image processing device having at least one of a printer and a scanner, and to a communication system including the image processing device.
 There is known an image processing device (or a communication system including the image processing device) that performs biometric authentication and, according to the authentication result, restricts the functions available to the user or releases those restrictions (for example, Patent Document 1 below). Patent Document 1 generally discloses the following two types of communication systems.
 In one communication system of Patent Document 1, an image processing device acquires biometric information of a user, transmits it to a server, and causes the server to register the biometric information as verification biometric information. When a user attempts to use the image processing device, the image processing device has the user input biometric information and transmits the input biometric information to the server. The server performs authentication by comparing the received biometric information with the above verification biometric information, and returns the authentication result to the image processing device.
 In another communication system disclosed in Patent Document 1, an image processing device acquires biometric information of a user and registers it as verification biometric information in a table held by the image processing device itself. When a user attempts to use the image processing device, the image processing device has the user input biometric information, and performs authentication by comparing the input biometric information with the verification biometric information registered in the table. The image processing device periodically acquires update information of the tables held by other image processing devices and reflects it in its own table.
Japanese Patent Application Publication No. 2008-21222
 An image processing device according to one aspect of the present disclosure includes an image processing unit, an input unit, a memory, and a control unit. The image processing unit includes at least one of a printer and a scanner. The input unit receives biometric information of a user. The memory stores first biometric information for each user. The control unit controls the release of restrictions on functions related to the image processing unit based on the result of authentication that compares second biometric information input to the input unit with the first biometric information stored in the memory.
 In one example, the image processing device can also control the release of restrictions on functions related to the image processing unit based on the result of authentication that compares first biometric information of a first user, imported from another image processing device in response to input of information identifying the first user, with second biometric information of the first user.
 In one example, the image processing device can also export second biometric information of a first user to another image processing device, receive from the other image processing device the result of authentication based on the second biometric information of the first user, and control the release of restrictions on functions related to the image processing unit based on the received authentication result.
 A communication system according to one aspect of the present disclosure includes the image processing device and the other image processing device.
FIG. 1 is a schematic diagram for explaining an overview of an embodiment.
FIG. 2 is a schematic diagram for explaining the configuration of an image processing device and a network according to the embodiment.
FIG. 3 is a schematic diagram showing a hardware configuration of a signal processing system of the image processing device according to the embodiment.
FIG. 4 is a flowchart illustrating an example of a procedure of processing executed by the image processing device according to a first embodiment.
FIGS. 5A, 5B, and 5C are schematic diagrams showing examples of images displayed on the screen of the image processing device according to the first embodiment.
FIGS. 6A and 6B are schematic diagrams showing other examples of images displayed on the screen of the image processing device according to the first embodiment.
FIG. 7 is a flowchart showing details of some procedures in FIG. 4.
FIG. 8 is a flowchart showing details of some procedures in FIG. 7.
FIG. 9 is a flowchart illustrating an example of a procedure of processing executed by the image processing device according to a second embodiment.
FIG. 10 is a flowchart showing details of some procedures in FIG. 9.
FIG. 11 is a functional block diagram for explaining a communication system according to a third embodiment.
FIGS. 12A and 12B are schematic diagrams for explaining a specific example of releasing restrictions based on authentication results.
FIG. 13 is a flowchart for explaining the release of restrictions related to a VPN connection.
 An image processing device according to the present embodiments will be described below with reference to the drawings. For aspects described later, basically only the differences from aspects described earlier will be noted. Matters not specifically mentioned may be the same as in the aspects described earlier, or may be inferred from them.
 Note that, as described below, some terms are generally ambiguous. In the description of the embodiments as well, the meanings of terms should be interpreted as appropriate in light of the context.
 The term "biometric information" may refer to the information itself on features actually appearing on a person (from another point of view, information independent of the detection method), to the raw information obtained by detecting those features, to feature-quantity information extracted from the raw information, or to information obtained by processing the raw information or the feature-quantity information according to the purpose of use. An example of processed information is information obtained by encrypting feature quantities.
 The term "authentication" sometimes refers to the act of confirming the legitimacy of a subject, and sometimes to the fact that legitimacy has been, or is being, confirmed through such an act. In this regard, confirmation of legitimacy is sometimes expressed as successful authentication, and failure to confirm legitimacy as failed authentication. In the description of the embodiments, the "authentication state" refers to a state in which legitimacy has been confirmed, or a state regarded as such.
 The term "network" sometimes refers to a communication network, and sometimes to a combination of a communication network and the devices connected to it. The same applies to terms subordinate to "network", such as the Internet, public network, private network, LAN (Local Area Network), and VPN (Virtual Private Network).
 The term "VPN" sometimes refers to a technology that virtually extends a private network over a public network, and sometimes to a network using this technology. The word VPN may also be attached, as appropriate, to technical matters related to VPNs. For example, a connection established for communication using a VPN may be called a VPN connection, and making such a connection may be called making a VPN connection.
 The term "connection" sometimes refers to a connection established through authentication (for example, a three-way handshake) (a connection in the narrow sense), and sometimes to a connection that simply means communication is possible (a connection in the broad sense). Examples of connections that do not fall under the former but are included in the latter are the following: a connection in which communication prior to establishing the connection (for example, a broadcast and replies to it) is possible but establishing the connection is prohibited; and devices that are electrically (from another point of view, physically) connected to each other by cables but between which all communication is prohibited by software (from another point of view, logically).
(Overview of the Embodiment)
 FIG. 1 is a schematic diagram for explaining an overview of a communication system 1 according to an embodiment.
 The communication system 1 includes a plurality of (three are illustrated in FIG. 1) image processing devices 3MA, 3MB, and 3MC that are communicably connected to each other via a network 10. Hereinafter, the image processing devices 3MA to 3MC (and 3A to 3C in FIG. 2) may be referred to as image processing devices 3 (reference numeral shown in FIG. 3 and elsewhere) without distinguishing them from each other. The image processing device 3 includes at least one of a printer and a scanner.
 Each image processing device 3 authenticates the user and, based on the authentication result, restricts the functions available to the user or releases those restrictions. For example, if authentication succeeds, the image processing device 3 permits the user to use a predetermined function (for example, printing), that is, releases the restriction on the function. Conversely, if authentication fails, the image processing device 3 does not permit the user to use the predetermined function, that is, restricts the function.
 Note that a term such as "control of the release of restrictions on functions" may be used for a concept that includes both restricting functions and releasing the restrictions on functions. Similarly, a term such as "control of cancellation of the authentication state" may be used for a concept that includes both maintaining an authentication state in which authentication has succeeded and canceling the authentication state.
 The image processing devices 3MA, 3MB, and 3MC have management tables DT0A, DT0B, and DT0C, respectively, to realize authentication. Hereinafter, these may be referred to as management tables DT0 without distinguishing them from each other. The management table DT0 holds, for example, various pieces of information in association with each user. In FIG. 1, "ID", "password", "biometric information 1", "biometric information 2", and "functional restriction" are illustrated as the various pieces of information.
 The combination of an ID (identification) and a password may be referred to as account information. The management table DT0 may be able to store two or more types of biometric information for each user; in FIG. 1, as noted above, two types, "biometric information 1" and "biometric information 2" (specifically, fingerprint and face), are illustrated. In the description of the embodiments, for convenience, explanations may ignore the fact that two or more types of biometric information can be stored (that is, they may assume that only one type of biometric information is stored). "Functional restriction" is, for example, information that directly or indirectly indicates a function whose restriction is released (from another point of view, a function that is restricted).
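As an illustration, the per-user structure of the management table DT0 described above (account information, two kinds of registered biometric information, and a functional-restriction entry) might be modeled as follows. This is a sketch only; the class and field names (`UserRecord`, `fingerprint`, `face`, `allowed_functions`, etc.) are hypothetical, not from the publication.

```python
from dataclasses import dataclass, field

@dataclass
class UserRecord:
    # Account information: an ID and a password.
    user_id: str
    password: str
    # Two kinds of registered (first) biometric information per user,
    # e.g. a fingerprint template and a face template.
    fingerprint: bytes = b""
    face: bytes = b""
    # Functions whose restriction is released for this user
    # (the "functional restriction" column).
    allowed_functions: set = field(default_factory=set)

class ManagementTable:
    """Per-device management table DT0, keyed by user ID."""

    def __init__(self):
        self._records = {}

    def register(self, record: UserRecord):
        self._records[record.user_id] = record

    def find_by_id(self, user_id: str):
        # Returns the record, or None if the user is not registered
        # on this device.
        return self._records.get(user_id)
```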
 When a user uses the image processing device 3, the image processing device 3 requests the user to input biometric information. Then, when biometric information (second biometric information) is input, the image processing device 3 determines whether biometric information (first biometric information) matching the input biometric information is registered in its own management table DT0. If it is registered, the authentication is deemed successful and the restriction on the function is released.
 Note that the release of restrictions at this time may or may not differ between users. In the former case, for example, the "functional restriction" information associated with the biometric information (first biometric information) matching the input biometric information may be referred to. In the latter case, the management table DT0 need not hold "functional restriction" information for each user.
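A minimal sketch of the authentication and per-user restriction-release decision described in the two paragraphs above, assuming the table is simply a list of per-user records. All names are hypothetical, and byte-string equality stands in for real biometric matching, which would compute a similarity score between templates and apply a threshold.

```python
def matches(first: bytes, second: bytes) -> bool:
    # Placeholder comparison: real biometric matching scores template
    # similarity against a threshold; exact equality stands in here.
    return bool(first) and first == second

def authenticate(table: list, second_biometric: bytes):
    """Compare the input (second) biometric information against the
    registered (first) biometric information of every user in the table.
    Returns the set of functions whose restriction is released on
    success, or None if authentication fails."""
    for record in table:
        if matches(record["first_biometric"], second_biometric):
            # Per-user release: refer to the "functional restriction"
            # entry associated with the matching record.
            return record["allowed_functions"]
    return None  # authentication failed: functions stay restricted

# A toy management table with two registered users.
dt0 = [
    {"first_biometric": b"fp-user-a", "allowed_functions": {"print", "scan"}},
    {"first_biometric": b"fp-user-x", "allowed_functions": {"print"}},
]
```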
 In the management tables DT0A, DT0B, and DT0C, for example, at least some of the users differ from one another. For example, in the example of FIG. 1, as can be understood from the fact that the numbers illustrated in the "ID" column are all different, the management tables DT0A, DT0B, and DT0C hold information such as account information and biometric information for mutually different users (from another point of view, accounts).
 Therefore, for example, when a user registered in the image processing device 3MB (management table DT0B) (referred to as user B) uses the image processing device 3MB, authentication succeeds as described above because the image processing device 3MB refers to its own management table DT0B. As a result, user B can use the predetermined function whose restriction has been released.
 On the other hand, for example, when user B attempts to use the image processing device 3MA, authentication fails because user B is not registered in the management table DT0A. As a result, for example, user B cannot use the predetermined function on the image processing device 3MA.
 Therefore, the image processing device 3MA authenticates user B by performing, for example, one of the following two methods. This allows user B to use the predetermined function on the image processing device 3MA.
 In the first method, the image processing device 3MA imports user B's biometric information (first biometric information) from the image processing device 3MB. The image processing device 3MA then performs authentication by comparing the biometric information of user B that it has itself detected (second biometric information) with the imported biometric information of user B.
 In the second method, the image processing device 3MA exports the biometric information of user B that it has itself detected (second biometric information) to the image processing device 3MB. The image processing device 3MB then performs authentication by comparing the exported biometric information of user B with the biometric information of user B that it stores (first biometric information). Thereafter, the image processing device 3MB notifies the image processing device 3MA of the authentication result.
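The two methods can be sketched from the viewpoint of the image processing device 3MA as follows. This is an illustrative sketch under the assumption of a simple remote interface: `RemoteDevice` and its methods are hypothetical stand-ins for the image processing device 3MB, and byte-string equality stands in for real biometric matching.

```python
class RemoteDevice:
    """Minimal stand-in for image processing device 3MB, which holds
    user B's registered (first) biometric information."""

    def __init__(self, registered: dict):
        self._registered = registered  # user_id -> first biometric info

    def export_first_biometric(self, user_id: str):
        # Used by the first method: hand the registered template to 3MA.
        return self._registered.get(user_id)

    def authenticate(self, user_id: str, second_biometric: bytes) -> bool:
        # Used by the second method: compare on this (3MB) side and
        # return only the authentication result.
        first = self._registered.get(user_id)
        return first is not None and first == second_biometric

def method1_import(device_3mb: RemoteDevice, user_id: str,
                   second_biometric: bytes) -> bool:
    # First method: import B's first biometric information from 3MB,
    # then compare locally on 3MA.
    first = device_3mb.export_first_biometric(user_id)
    return first is not None and first == second_biometric

def method2_export(device_3mb: RemoteDevice, user_id: str,
                   second_biometric: bytes) -> bool:
    # Second method: export the biometric information detected on 3MA
    # to 3MB; 3MB compares and notifies 3MA of the result.
    return device_3mb.authenticate(user_id, second_biometric)
```

The design difference the two methods illustrate: in the first, registered templates leave the holding device; in the second, only the freshly captured sample and a boolean result cross the network.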
 Through authentication performed as described above, user B can, as already noted, release the restriction on the predetermined function not only on the image processing device 3MB in which user B's biometric information (first biometric information) is registered but also on the image processing device 3MA (in other words, on another image processing device 3). As a result, for example, user B's convenience improves. Also, for example, since each image processing device 3 need not hold biometric information for all users, the storage capacity of each image processing device 3 is saved. Furthermore, since biometric information only needs to be imported (exported) as necessary, the burden on the network 10 is reduced compared to a mode in which biometric information is always exported to a server.
 The above operation of the image processing device 3MA may or may not be possible in image processing devices 3 other than the image processing device 3MA. Similarly, the above operation of the image processing device 3MB may or may not be possible in image processing devices 3 other than the image processing device 3MB. Also, the image processing device 3 may be configured, for example, to be able to execute only one of the first method and the second method, or to be able to execute both selectively.
 The above is an overview of the embodiments. Below, the description proceeds roughly in the following order. In the following, the first embodiment corresponds to the first method described above, and the second embodiment corresponds to the second method. The third embodiment is a partially modified version of the first or second embodiment.
 1. Communication system 1 in general (FIG. 2)
  1.1. Information used by the communication system 1
   1.1.1. Biometric information
   1.1.2. Account information
   1.1.3. Authentication information
  1.2. Overall configuration of the communication system 1
  1.3. Overview of each communication device
  1.4. Connection modes of the communication devices
 2. Configuration of the image processing device 3 (FIG. 3)
  2.1. Overall configuration of the image processing device 3
  2.2. Printer
  2.3. Scanner
  2.4. UI (User Interface) unit
   2.4.1. Operation unit
   2.4.2. Display unit
  2.5. Input unit (detection unit) into which biometric information is input
  2.6. Communication unit
  2.7. Control unit
  2.8. Connector
  2.9. Others
 3. Stored information
  3.1. Management table DT0
  3.2. Storage of biometric information
 4. First embodiment
  4.1. Outline of operations in the first embodiment (FIG. 4)
  4.2. Examples of screens in the first embodiment (FIGS. 5A to 6B)
  4.3. Details of some operations in the first embodiment
   4.3.1. Example of restrictions on imported biometric information (FIG. 7)
   4.3.2. Example of reuse of imported biometric information (FIG. 8)
 5. Second embodiment
  5.1. Outline of operations in the second embodiment (FIG. 9)
  5.2. Examples of screens in the second embodiment
  5.3. Details of some operations in the second embodiment
   5.3.1. Example of restrictions on exported biometric information (FIG. 10)
   5.3.2. Example of reuse of exported biometric information
 6. Third embodiment (FIG. 11)
 7. Operations related to the release of functional restrictions
  7.1. Operations related to the release of functional restrictions in general
   7.1.1. Functions whose restriction release is controlled
   7.1.2. Modes of restricting functions
   7.1.3. Cancellation of the authentication state
  7.2. Specific examples of releasing functional restrictions (FIGS. 1 and 12A)
  7.3. Menu screen related to functional restrictions (FIG. 12B)
  7.4. Release of restrictions related to VPN (FIG. 13)
 8. Summary of the embodiments
 Note that, in the description of the embodiments, unless otherwise specified, a mode in which every image processing device 3 is capable of both the operation of the image processing device 3MA and the operation of the image processing device 3MB may be taken as an example. Also, for convenience, as in the explanation above, a situation is assumed in which user B, who is registered in the image processing device 3MB, uses the image processing device 3MA, and the reference signs of the image processing devices 3MA and 3MB and of user B may be used without further notice. A user registered in the image processing device 3MA may be referred to as user A (for convenience, user A's data is given reference signs).
(1. General communication system)
(1.1. Information used by the communication system)
(1.1.1. Biometric information)
The biometric information used by the image processing device 3 for authentication may be of various kinds, for example, information used in known biometric authentication. The biometric information may be information about the user's physical characteristics or about the user's behavioral characteristics. Specific examples of physical characteristics include fingerprints, palm shapes, retinas (their blood-vessel patterns, etc.), irises (their distributions of gray values, etc.), faces, blood vessels (patterns at specific sites such as fingers), ear shapes, voice (voiceprints, etc.), and body odor. Examples of behavioral characteristics include handwriting.
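Unlike an ID or password, biometric information is generally compared by a similarity test against a threshold rather than by exact equality, since two captures from the same person are close but rarely identical. The following is a purely illustrative sketch of such threshold matching, not the matching method of the embodiments; the feature vectors and the threshold value are hypothetical, and feature extraction from a fingerprint or face image is omitted.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity between two feature vectors (e.g. extracted from
    fingerprint or face images; extraction itself is not shown)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def biometric_match(enrolled: list[float], presented: list[float],
                    threshold: float = 0.95) -> bool:
    """Accept when similarity meets the threshold, rather than
    requiring an exact match as with a password."""
    return cosine_similarity(enrolled, presented) >= threshold
```

A second capture of the enrolled user yields a vector close to the enrolled template and passes the threshold, while a different user's vector does not.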
(1.1.2. Account information)
The account information includes, for example, information for identifying a user (hereinafter sometimes abbreviated as "ID"). The account information may also include a password. In the description of the embodiments, unless otherwise specified, the account information is assumed to include an ID and a password. However, as long as no contradiction arises, the term "account information" may be replaced with "ID" (without an accompanying password) or with "ID and password".
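As a minimal illustrative sketch of account information combining an ID and a password (all names are hypothetical and this is not the registration scheme of the embodiments), the stored record might hold the ID together with a salted password hash rather than the plaintext password:

```python
import hashlib
import hmac
import os

def make_account(user_id: str, password: str) -> dict:
    """Register account information: an ID plus a salted password hash
    (the plaintext password itself is not retained)."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"id": user_id, "salt": salt, "hash": digest}

def check_account(account: dict, user_id: str, password: str) -> bool:
    """Verify a presented ID and password against the stored record."""
    if account["id"] != user_id:
        return False
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                 account["salt"], 100_000)
    return hmac.compare_digest(digest, account["hash"])
```

Authentication then succeeds only when both the ID and the password presented by the user match the registered account.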
(1.1.3. Authentication information)
In the description of the embodiments, information for demonstrating the legitimacy of a user may be referred to as authentication information. Biometric information and account information are kinds of authentication information. In the communication system 1 according to the embodiments, other authentication information may be used in addition to, or instead of, biometric information and account information. Note that in the description of the third embodiment, the term "authentication information" may refer to authentication information other than the biometric information stored in the management table DT0.
In addition to biometric information and account information, examples of authentication information include static keys, shared keys, private keys (or public keys), and digital certificates. As these examples suggest, authentication information may be the very information transmitted from the communication device requesting authentication (for example, the image processing device 3MA) to the communication device performing authentication (for example, the image processing device 3MB), or it may be information used when generating the information transmitted to the device performing authentication. Examples of the former include account information, static keys, digital certificates, information obtained from security tokens, and biometric information. Examples of the latter include shared keys and private keys (or public keys). Both the former and the latter may be used together as authentication information.
Note that the term "information based on authentication information" is sometimes used as a superordinate concept covering both authentication information transmitted as-is and information generated from authentication information and then transmitted.
As the explanation so far suggests, the authentication information stored by the communication device requesting authentication and the authentication information stored by the communication device performing authentication need not be the same. Moreover, the authentication information stored by the requesting device may be suitably processed before being transmitted to the authenticating device. As one such mode, challenge-response authentication may be performed.
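As a purely illustrative sketch of challenge-response authentication over a pre-shared key (the key, function names, and parameters are hypothetical, and this is not the specific protocol of the embodiments), the requesting device derives a response from a random challenge so that the key itself is never transmitted:

```python
import hashlib
import hmac
import secrets

SHARED_KEY = b"pre-registered-secret"  # known to both devices in advance

def make_challenge() -> bytes:
    """Authenticating device (e.g. 3MB): issue a random nonce."""
    return secrets.token_bytes(16)

def respond(challenge: bytes, key: bytes = SHARED_KEY) -> bytes:
    """Requesting device (e.g. 3MA): process the stored authentication
    information into a response; the key itself is never sent."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes = SHARED_KEY) -> bool:
    """Authenticating device: recompute the expected response and
    compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

This illustrates the point made above: the information actually transmitted (the response) differs in form from the authentication information stored on either side, yet the same information content demonstrates the user's legitimacy.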
Note that in the description of the embodiments, unless otherwise specified and as long as no contradiction arises, authentication information before processing and authentication information after processing are expressed as being the same. In other words, differences in the format and/or precision of the information are ignored, and if the information content demonstrating the user's legitimacy is the same, the information is expressed as being the same. Therefore, for example, when the authentication information stored in a first communication device is said to match the authentication information stored in a second communication device, the authentication information may in fact be processed in the course of transmission from the first device to the second, and the two stored pieces of authentication information may differ in format or the like.
In the description of the embodiments, account information is mainly taken as an example of authentication information. Moreover, unless otherwise specified, explanations may assume that the authentication information is account information.
(1.2. Overall configuration of communication system)
As understood from the description of FIG. 1, the communication system 1 includes at least two image processing devices 3. The two image processing devices 3 may have various configurations, and the configuration of the network 10 connecting them is also arbitrary.
FIG. 2 is a diagram for explaining an example of the configuration of each image processing device 3 and the network 10.
FIG. 2 illustrates three image processing devices 3A to 3C. The three image processing devices 3A to 3C differ from one another, for example, in their positioning relative to the public network 11 and the private networks 13A and 13B. Each of the two image processing devices 3MA and 3MB included in the communication system 1 may take the form of any of the image processing devices 3A to 3C shown in FIG. 2, for example.
For example, two image processing devices 3C may be provided, and these two devices may serve as the image processing devices 3MA and 3MB. In other words, the network 10 may be the private network 13A. Alternatively, for example, the image processing devices 3A and 3C may serve as the image processing devices 3MA and 3MB. In other words, the network 10 may include the private network 13A and the public network 11. The configuration of the image processing devices 3MA and 3MB in relation to the networks is supplemented later in Section 1.4.
Note that, since the image processing devices 3A and 3C may be the image processing devices 3MA and 3MB, FIG. 2 may also be regarded as a diagram showing the communication system 1. Below, for convenience, the configuration shown in FIG. 2 may be referred to as the communication system 1.
The communication system 1 may include communication devices other than the image processing devices 3 as appropriate. FIG. 1 illustrates servers 5 and 7 and terminals 9A, 9B, and 9C. Hereinafter, the terminals 9A to 9C may be referred to without distinction as terminals 9 (with the terminal 9A representatively given the reference sign).
Note that, as described above, the communication system 1 may be defined solely by at least two image processing devices 3 (3MA and 3MB). The communication system 1 may also be defined to include communication devices other than the image processing device 3MB that can communicate with the image processing device 3MA (for example, the servers 5 and 7 and the terminals 9). Furthermore, the communication system 1 may be defined to include a private network (13A or 13B) containing the two image processing devices 3MA and 3MB. In any case, however, the communication system 1 may be defined so as to exclude the public network 11.
(1.3. Overview of each communication device)
The image processing device 3 includes at least one of a printer and a scanner, as described above. The following description mainly takes as an example a mode in which the image processing device 3 includes both a printer and a scanner. The image processing device 3 may or may not be a multi-function product/printer/peripheral (MFP). In the drawings, the image processing device 3 may be denoted "MFP" for convenience. The image processing device 3 may be capable of executing, for example, one or more of printing, scanning, copying, FAX transmission, and FAX reception (although these are not necessarily separable concepts).
The method of operating the image processing devices 3 (from another perspective, their social positioning) is arbitrary. For example, the image processing device 3A may be installed in a store such as a convenience store and used by an unspecified large number of users. The image processing device 3B may be installed in a private home and used by a specific small number of users (for example, one person). The image processing device 3C may be installed in a company and used by a specific plurality of users.
The server 5 may, for example, contribute to the authentication of users of the image processing devices 3, as described in the third embodiment (Section 6). The server 5 may also function as a VPN server (Section 7.4). The server 5 may also perform ECM (Enterprise Content Management) for the private network 13A. Furthermore, unlike the example of FIG. 1, the server 5 may hold information on per-user functional restrictions and assist in controlling the restriction of functions in the image processing devices 3.
The server 7 may provide various services. For example, the server 7 may be a file server, a mail server, and/or a web server. Focusing on operations related to the image processing device 3, the file server may store, for example, data of images printed by the image processing device 3 or data scanned by the image processing device 3. The mail server may deliver mail to be printed by the image processing device 3 or mail containing images scanned by the image processing device 3. The web server may execute web services provided through communication with the image processing device 3.
In FIG. 2, each of the servers 5 and 7 is represented by a single computer. However, one server may be realized by a plurality of computers arranged in a distributed manner. The plurality of computers constituting one server may be directly connected, included in one LAN, or included in mutually different LANs. Note that the servers 5 and 7 may be constituted by a single computer. The servers 5 and 7 may also be regarded as a single server, regardless of whether they are constituted by one computer.
The terminals 9 may be of any appropriate type. In FIG. 2, the terminals 9A and 9B are depicted as laptop PCs (personal computers), and the terminal 9C as a smartphone. The terminals 9 may also be, for example, desktop PCs or tablet PCs. How the terminals 9 are operated is arbitrary. For example, a terminal 9 may be used by one or more specific users, like a company-owned or personally owned terminal, or by an unspecified large number of users, like a terminal in an Internet cafe.
(1.4. Connection mode of communication equipment)
The public network 11 is a network open to the outside (for example, to an unspecified large number of communication devices). Its specific form may be chosen as appropriate. For example, the public network 11 may include the Internet, a closed network provided by a telecommunications carrier or the like, and/or a public telephone network.
The private networks 13A and 13B are networks not open to the outside. The private network 13A and/or 13B may be, for example, a LAN. A LAN may be, for example, a network within a single building. Examples of LANs include those using Ethernet (registered trademark) and Wi-Fi (registered trademark). The private network 13A and/or 13B may also be an intranet.
Transmission and/or reception of signals by a communication device (for example, an image processing device 3) may be performed via wire or wirelessly. A communication device may communicate with the public network 11 without belonging to a private network, or may belong to a private network. A communication device belonging to a private network may communicate only within the private network, or may communicate with the public network 11 via the private network.
As described above, the plurality of communication devices may be connected to one another in various modes. In the example of FIG. 2, the connections are as follows.
The image processing device 3A does not form part of a private network. By including a router or the like (not shown), or by being connected to a router or the like, the image processing device 3A can communicate with the public network 11 without going through a private network. The image processing device 3A may be able to communicate with a terminal 9 (not shown in FIG. 1) directly connected to it by wire. The image processing device 3A may also be capable of short-range wireless communication with a terminal 9 (not shown in FIG. 2) placed in its vicinity.
The image processing device 3B and the terminal 9B are connected to each other by the private network 13B. More specifically, the two are connected via the router 15 (its hub). The image processing device 3B and the terminal 9B can communicate with the public network 11 via the router 15 and the like.
The image processing device 3C, the server 5, the server 7, and the terminal 9A are connected to one another by the private network 13A. The image processing device 3C, the server 7, and the terminal 9A can communicate with the public network 11 via, for example, the server 5. The server 5 may include a router or the like, or a router or the like (not shown) may be provided between the server 5 and the public network 11.
The terminal 9C, for example, communicates wirelessly with a public telephone network, and thereby communicates with the public network 11, which includes the public telephone network.
The two image processing devices 3MA and 3MB that perform authentication according to the embodiments may be, for example, two image processing devices 3C included in the same private network 13A. The image processing devices 3MA and 3MB may also be two image processing devices 3 included in the same VPN; in this case, each of them may be any of the image processing devices 3A to 3C. The image processing devices 3MA and 3MB may also be two image processing devices 3 connected via the public network 11 (not a VPN); in this case as well, each of them may be any of the image processing devices 3A to 3C.
Note that in a mode in which the two image processing devices 3MA and 3MB are included in the same VPN, the VPN connection between them may be established, for example, before the biometric authentication according to the embodiments (before the import or export of biometric information), upon success of authentication using authentication information separate from the biometric information. The authentication for the VPN connection may use authentication information of individual users or authentication information assigned to the image processing devices 3. Unlike the above, a mode in which the image processing devices 3MA and 3MB connected via the public network 11 are VPN-connected by the biometric authentication according to the embodiments, or by another authentication performed after that biometric authentication, may be regarded as a mode in which the devices 3MA and 3MB are connected via the public network 11 (not a VPN).
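The ordering described above, in which a separate (non-biometric) credential check gates the VPN tunnel and the biometric exchange takes place only over the established tunnel, might be sketched as follows. Every name here is hypothetical and merely illustrates the ordering; real VPN establishment involves key exchange, not a boolean flag.

```python
class VpnGateway:
    """Toy model of a gateway that opens a tunnel only after a prior,
    non-biometric authentication succeeds (hypothetical sketch)."""

    def __init__(self, registered_credentials: set[tuple[str, str]]):
        self.registered = registered_credentials
        self.tunnel_open = False

    def connect(self, device_id: str, secret: str) -> bool:
        # Authentication information assigned to the device (or user)
        # is checked before any biometric information is exchanged.
        if (device_id, secret) in self.registered:
            self.tunnel_open = True
        return self.tunnel_open

    def send_biometric_template(self, template: bytes) -> bool:
        """Import/export of biometric information is permitted only
        over an established tunnel."""
        if not self.tunnel_open:
            raise PermissionError("VPN tunnel not established")
        return True
```

In this sketch, an attempt to send a biometric template before `connect` succeeds is simply refused, which mirrors the constraint that the biometric exchange presupposes the VPN connection.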
As understood from the description so far, when the image processing devices 3MA and 3MB communicate with each other (for example, to import and export biometric information), other communication devices such as the server 5 may or may not be interposed. In the description of the embodiments, for convenience, the image processing devices 3MA and 3MB may be described as communicating directly with each other even when other communication devices such as the server 5 are interposed.
The relationship between the connection mode of a communication device and its method of operation (from another perspective, its social positioning) is arbitrary. For example, the image processing device 3A, which does not belong to a private network, may be installed in a store and used by an unspecified large number of users as described above, or, contrary to that description, may be installed in a company and used by specific users. Likewise, the image processing device 3B, which belongs to the private network 13B, may be installed in a private home and used by a specific small number of users as described above, or, contrary to that description, may be installed in an Internet cafe and used by an unspecified large number of users.
In the description of the embodiments (for example, the description of FIGS. 4 to 6B), however, for convenience, the image processing devices 3MA and 3MB may be described assuming a situation in which they are located in different departments of the same company. In this case, the image processing devices 3MA and 3MB may, for example, be included in the same private network 13A or in the same VPN, or may be connected via the public network 11 (not a VPN).
(2. Configuration of image processing device)
(2.1. Overall configuration of image processing device)
FIG. 3 is a schematic diagram showing the hardware configuration of the signal processing system of the image processing device 3.
As shown in FIGS. 2 and 3, the image processing device 3 includes, for example, the following components: a housing 17 (FIG. 2) forming the outer shape of the image processing device 3; a printer 19 that performs printing; a scanner 21 (image scanner) that performs scanning; a UI unit 23 that accepts user operations and/or presents information to the user; a detection unit 25 that detects the user's biometric information; a communication unit 27 (FIG. 3) that performs communication; a control unit 29 (FIG. 3) that controls the other units (19, 21, 23, 25, and 27); and a connector 37 (FIG. 3) for connecting appropriate devices to the image processing device 3. Hereinafter, the printer 19 and/or the scanner 21 may be referred to as the image processing unit 31 (reference sign shown in FIG. 3).
As in the illustrated example, some or all of the above components may be shared with one another (or may be regarded as such). For example, the housing 17 may be regarded as part of the printer 19 or the scanner 21. In the description of this embodiment, the control unit 29 is conceptually a single control unit that controls all operations of the image processing device 3 (including, for example, printing and scanning), although in hardware it may be distributed over multiple units. In this case, the objects controlled by the control unit 29 (19, 21, 23, 25, and 27) may be conceived of only as mechanical portions excluding any control unit, or as including a control unit (a part of the control unit 29).
The components other than the housing 17 (19, 21, 23, 25, 27, and 29; hereinafter in this Section 2.1, the term "component" refers to these components other than the housing 17) are provided in the housing 17. In other words, or from another perspective, the housing 17 can be said to hold or support the plurality of components, or to be mechanically connected or coupled to them. Moreover, by being provided in the housing 17, the plurality of components can be said to be provided integrally with one another. Note that, as understood from the above description, when a component is said to be provided in the housing 17, the housing 17 may be regarded as part of that component.
When the image processing device 3 is said to have components, typically, for example, the components and the housing 17 are fixed to one another (excluding movable parts, of course), and consequently the components are also fixed to one another. In addition, unless the image processing device 3 is disassembled, for example by removing screws, the components and the housing 17 cannot be separated from one another and placed in different locations; consequently, neither can the components be separated from one another and placed in different locations. However, unlike the above example, when the image processing device 3 is said to have a component, the component may be attachable to and detachable from the housing 17. In FIG. 3, the detection unit 25 provided in the housing 17 is shown, while a detection unit 25A according to a different example, which is attached to and detached from the connector 37, is shown by a dotted line.
The specific positional relationship when a component is said to be provided in the housing 17 is arbitrary. For example, a component may be housed within the housing 17, provided integrally with a wall surface of the housing 17, protrude from a wall surface of the housing 17, or be variable in orientation and/or position relative to the housing 17. In the illustrated example, the printer 19, the scanner 21, the communication unit 27, and the control unit 29 may be regarded as housed in the housing 17, while the UI unit 23 and the detection unit 25 may be regarded as provided integrally with wall surfaces of the housing 17.
The size and shape of the image processing device 3 (from another perspective, the housing 17) are arbitrary. For example, the image processing device 3 may have a size (mass) that one person can carry, like a home multifunction device or printer (see the illustration of the image processing device 3B), or a size (mass) that one person cannot carry, like a business multifunction device or printer (see the illustrations of the image processing devices 3A and 3C).
Unlike the illustrated examples, the image processing device 3 may differ greatly in concept from a general multifunction device or printer placed in a company (office) or private home. For example, the printer 19 may print on roll paper. The image processing device 3 may include a robot and apply paint to a vehicle body or the like using an inkjet head. The image processing device 3 may be sized to be held in one hand, with the device itself being scanned over a medium to perform printing and/or scanning.
(2.2. Printer)
 The printer 19 is configured, for example, to print on cut sheets placed inside the housing 17 or on a tray protruding outward from the housing 17, and to discharge the printed sheets. The specific configuration of the printer 19 may take various forms and may, for example, be similar to a known configuration.
 For example, the printer 19 may be an inkjet printer that prints by ejecting ink, a thermal printer that prints by heating thermal paper or an ink ribbon, or an electrophotographic printer (for example, a laser printer) that transfers toner adhering to a photoreceptor that has been irradiated with light. The inkjet printer may be a piezo type that applies pressure to the ink with a piezoelectric body, or a thermal type that applies pressure to the ink with bubbles generated in heated ink.
 Further, for example, the printer 19 may be a line printer whose head has a length spanning the width of the sheet (the direction intersecting the sheet conveyance direction), or a serial printer whose head moves in the width direction of the sheet. The printer 19 may be a color printer or a monochrome printer. The printer 19 may be capable of forming arbitrary images, or capable of printing only characters.
(2.3. Scanner)
 The scanner 21 scans a document placed on a platen glass exposed on the top surface of the housing 17 (hidden by the lid in FIG. 2), for example by imaging the document with a plurality of imaging elements (not shown) that move along the platen glass beneath it. The scanner 21 may also take various configurations and may, for example, be similar to a known configuration.
(2.4. UI unit)
 The configuration of the UI unit 23 is arbitrary. For example, the UI unit 23 includes an operation unit 33 (reference numeral shown in FIG. 3) that receives user operations, and a display unit 35 (reference numeral shown in FIG. 3) that visually presents information to the user. Note that the UI unit 23 may be omitted, or may have only one of the operation unit 33 and the display unit 35. The UI unit 23 may also include an audio unit that presents information to the user by sound. Unlike in the description of the embodiments, the UI unit 23 may be defined to include the connector 37, since connecting a device to the connector 37 can be a form of inputting instructions to the image processing device 3.
(2.4.1. Operation unit)
 The configuration of the operation unit 33 is arbitrary. The operation unit 33 accepts, for example, operations made by user contact. Such an operation unit 33 may include, for example, a touch panel and/or one or more buttons. FIG. 2 illustrates a touch panel (reference numeral omitted) as at least part of the operation unit 33 of the image processing devices 3A and 3C, and a button 33a as at least part of the operation unit 33 of the image processing device 3B. The button 33a may be a push button, a touch button, or some other button. The touch button may be a capacitive touch button or another type of touch button. Of course, the image processing devices 3A and 3C may have buttons, and the image processing device 3B may have a touch panel. The operation unit 33 may also accept other types of operation, such as voice operation.
 The operation unit 33 may be used for various purposes. Typically, the operation unit 33 is used to instruct the image processing device 3 to execute processing related to the image processing unit 31. For example, operations on the operation unit 33 trigger printing, scanning, and copying, or configure settings for these operations (for example, paper selection, magnification, density, and/or color). In addition, operations on the operation unit 33 may be used, for example, to access data, to transmit and receive data, and to input authentication information.
(2.4.2. Display unit)
 The configuration of the display unit 35 is arbitrary. For example, the display unit 35 may include at least one of a display capable of showing arbitrary images, a display capable of showing only arbitrary characters, a display capable of showing only specific characters and/or specific graphics, and an indicator light. Here, "image" is a concept that includes characters. Examples of displays that show arbitrary images or arbitrary characters include liquid crystal displays and organic EL (Electro Luminescence) displays having a relatively large number of regularly arrayed pixels. Examples of displays that show only specific characters and/or specific graphics include liquid crystal displays with a limited number and/or shape of pixels, and segment displays such as 7-segment displays. A segment display may take various forms, including a liquid crystal display. The indicator light may include, for example, an LED (Light Emitting Diode), and an appropriate number of indicator lights may be provided. In the following description, for convenience, it may be assumed that the display unit 35 can display arbitrary images.
(2.5. Input unit (detection unit) into which biometric information is input)
 As described above, the image processing device 3 includes the detection unit 25 that detects biometric information. However, the image processing device 3 need not include the detection unit 25. For example, a secure connection may be established by short-range wireless communication between the image processing device 3 and the terminal 9. In this state, the terminal 9 may detect biometric information and transmit the detected biometric information to the image processing device 3. The biometric information is thereby input to the communication unit 27 communicating with the terminal 9. As a superordinate concept covering such a communication unit 27 and the detection unit 25, the term "input unit" into which biometric information is input is sometimes used. The input unit may be one into which biometric information is input directly (the detection unit 25) or indirectly (the communication unit 27).
 The description of the embodiments basically takes as an example a mode in which the input unit is the detection unit 25. The configuration and so on of the detection unit 25 are, for example, as follows.
 As mentioned above, various types of biometric information may be used for authentication, and the configuration of the detection unit 25 may accordingly vary. Various detection units 25 may also be used for the same type of biometric information. The basic configuration of the detection unit 25 may be the same as a known one.
 For example, the detection unit 25 may acquire an image related to biometric information. Biometric information obtained by acquiring an image includes, for example, fingerprints, palm shapes, retinas, irises, faces, blood vessels, and ear shapes. A typical example of an image-acquiring detection unit 25 is an optical one. An optical detection unit 25 includes an imaging element that detects light. The light detected by the imaging element (in other words, its wavelength range) may be visible light or light other than visible light (for example, infrared light). The detection unit 25 may or may not have an illumination unit that irradiates the living body with light in the wavelength range detected by the imaging element. The image may be a binary image, a grayscale image, or a color image.
 The image-acquiring detection unit 25 may also be of an ultrasonic type. An ultrasonic detection unit 25 includes an ultrasonic element that transmits and receives ultrasonic waves. As understood from medical ultrasonic diagnostic apparatuses, a detection unit 25 including an ultrasonic element can acquire an image of the surface shape and/or internal shape of a living body. More specifically, the detection unit 25 transmits ultrasonic waves toward the living body and receives their reflected waves. An image reflecting the distance from the ultrasonic element (that is, the shape of the living body) is acquired based on the time from transmission to reception.
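The time-of-flight relationship described here can be illustrated with a short calculation: each echo's round-trip time is converted to a one-way distance, and a grid of such distances forms a depth map reflecting the body's shape. The speed of sound and the timing values below are illustrative, not taken from the specification.

```python
SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # typical speed of sound in soft tissue (illustrative)


def echo_time_to_distance(round_trip_s: float) -> float:
    """Distance from the ultrasonic element to the reflecting surface.

    The wave travels to the surface and back, so the one-way distance
    is speed * time / 2.
    """
    return SPEED_OF_SOUND_TISSUE_M_S * round_trip_s / 2.0


def depth_map(round_trip_times):
    """Convert a 2D grid of echo round-trip times into a grid of distances."""
    return [[echo_time_to_distance(t) for t in row] for row in round_trip_times]
```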
 The image-acquiring detection unit 25 may also be of a capacitive type. A capacitive detection unit 25 includes a panel that the living body contacts and a plurality of electrodes arranged behind and along the panel. When a part of the living body (for example, a finger) is placed on the panel, the charge generated at electrodes at contacted positions (positions of ridges on the body surface) differs from the charge generated at electrodes at non-contacted positions (positions of valleys on the body surface). Based on this difference, an image of the unevenness of the body surface (for example, a fingerprint) is acquired.
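The charge difference just described can be sketched as a simple per-electrode classification: readings above a threshold are treated as ridge contact, readings below as valleys, yielding a binary fingerprint image. The readings and threshold are invented for illustration.

```python
def capacitance_to_binary_image(readings, threshold):
    """Classify each electrode reading as ridge (contact, 1) or
    valley (no contact, 0) based on the charge difference."""
    return [[1 if value >= threshold else 0 for value in row]
            for row in readings]
```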
 The image-acquiring detection unit 25 may acquire a two-dimensional image by sequentially acquiring line-shaped images, stepping in the short direction of the lines (that is, by scanning), or may acquire a two-dimensional image substantially in one operation without such scanning. Scanning may be realized by operation of the detection unit 25, or by the living body being moved relative to the detection unit 25. An example of the former is a mode in which a carriage containing the imaging element or ultrasonic element moves. A plurality of ultrasonic elements can also perform electronic scanning without mechanical movement.
 Detection units 25 other than those that acquire images include, for example, one including a microphone that acquires audio, by which voice information (for example, a voiceprint) is acquired as biometric information. As another example, the detection unit 25 may be a touch panel that accepts writing with a stylus, by which handwriting information is acquired as biometric information.
 The detection unit 25 may be used for purposes other than acquiring biometric information. From another perspective, the detection unit 25 may be realized by a component provided in the image processing device 3 for a purpose other than acquiring biometric information. Alternatively, the detection unit 25 may be structurally and inseparably combined with another component.
 For example, unlike the illustrated example, an image-acquiring detection unit 25 may be realized by the scanner 21. That is, when an image processing device is said to have a scanner and a detection unit, the two may be the same component. The same applies when some other component is shared with the detection unit 25 (not limited to one that acquires images).
 Also, for example, the detection unit 25 may be combined with a button included in the operation unit 33 so that a fingerprint is detected when a finger is placed on the button. An example of such a combined button and detection unit 25 is the capacitive detection unit 25 described above, in which the button operation is detected by the sensor including the plurality of electrodes described above. Further, for example, acceptance of handwriting may be realized by a touch panel included in the operation unit 33.
 When the detection unit 25 reads a fingerprint, the detection surface on which the finger is placed may be given an antiviral treatment. For example, the detection surface may be constituted by a plate-shaped member whose material contains a component producing an antiviral effect, or by a film covering such a plate-shaped member, the film containing a component producing an antiviral effect. Components producing an antiviral effect include, for example, monovalent copper compounds and silver. The type of target virus is arbitrary. The antiviral property of the detection surface may be, for example, such that the antiviral activity value is 2.0 or more in a test according to ISO (International Organization for Standardization) 21702. The detection surface may produce an antibacterial effect in addition to, or instead of, an antiviral effect.
 The position, orientation, and the like of the detection unit 25 are arbitrary. For example, as understood from the description in Section 2.1 above, the detection unit 25 may be fixed to the housing 17, may be connected so that its position and/or orientation can be changed, or may be detachable from the housing 17. Further, for example, the detection unit 25 (more precisely, the part directly involved in reading biometric information, such as the detection surface on which a finger is placed when detecting a fingerprint; the same applies hereinafter in this paragraph) may be arranged adjacent to the UI unit 23 or away from the UI unit 23.
(2.6. Communication unit)
 The communication unit 27 is, for example, the part of the interface by which the image processing device 3 communicates with other communication devices that is not included in the control unit 29. The communication unit 27 may include only hardware components, or may include, in addition to hardware components, a portion realized by software. In the latter case, the communication unit 27 may not be clearly distinguishable from the control unit 29.
 Specifically, for example, when the image processing device 3 is connected to the outside by wire, the communication unit 27 may have a connector or port to which a cable is connected. A port here is a concept that includes software elements in addition to a connector. When the image processing device 3 is connected to the outside wirelessly (for example, by radio waves), the communication unit 27 may have an RF (Radio Frequency) circuit that converts a baseband signal into a high-frequency signal, and an antenna that converts the high-frequency signal into a radio signal. In both the wired and wireless cases, the communication unit 27 may include, for example, an amplifier and/or a filter.
(2.7. Control unit)
 The control unit 29 has, for example, a configuration similar to that of a computer. Specifically, for example, the control unit 29 has a CPU (Central Processing Unit) 39, a ROM (Read Only Memory) 41, a RAM (Random Access Memory) 43, and an auxiliary storage device 45. The control unit 29 is constructed by the CPU 39 executing programs stored in the ROM 41 and/or the auxiliary storage device 45. In addition to the portion constructed in this way, the control unit 29 may include a logic circuit configured to perform only fixed operations.
(2.8. Connector)
 The connector 37 is, for example, for connecting peripheral devices to the image processing device 3. The connector 37 may conform to various standards, one example being USB (Universal Serial Bus). In FIG. 3, as described above, the detection unit 25A according to another example is illustrated as a peripheral device connected to the connector 37. Other peripheral devices that may be connected to the connector 37 include a USB memory and a card reader.
(2.9. Others)
 The various components described above (19, 21, 25, 27, 33, 35, 37, 39, 41, 43, and 45) are connected, for example, by a bus 47 (FIG. 3). In FIG. 3, all the components are schematically connected to a single bus 47. In an actual product, a plurality of buses may be connected in an appropriate form; for example, an address bus, a data bus, and a control bus may be provided. A crossbar switch and/or a link bus may also be applied.
 FIG. 3 is merely a schematic diagram. Accordingly, for example, a plurality of various devices (for example, CPUs) may in practice be provided in a distributed manner. The illustrated CPU 39 may be a concept that includes a CPU contained in the printer 19 or the scanner 21. An interface (not shown) may be interposed between the bus 47 and various devices (for example, the printer 19 or the scanner 21).
 The block diagram of FIG. 3 has been described as showing the configuration of the image processing device 3. However, FIG. 3 can also be used, as appropriate, as a block diagram showing the configurations of the servers 5 and 7 and the terminal 9, and the description of the components shown in FIG. 3 may likewise be applied to the components of the servers 5 and 7 and the terminal 9 as long as no contradiction arises. For example, a block diagram showing the configuration of the servers 5 and 7 and the terminal 9 may be FIG. 3 with the printer 19 and the scanner 21 omitted. A block diagram showing the configuration of the servers 5 and 7 may further omit the detection unit 25, the operation unit 33, and/or the display unit 35 from FIG. 3.
(3. Stored information)
(3.1. Management table)
 The management table DT0 is stored in nonvolatile memory (for example, the auxiliary storage device 45). The management table DT0 holds account information, biometric information, and the like for a plurality of users. However, the management table DT0 may be capable of holding account information, biometric information, and the like for only one user. Even in such a mode, for convenience, it may be said that account information and biometric information are stored in association with each other "for each user." The description of the embodiments basically takes as an example a mode in which the management table DT0 can store information of a plurality of users, and may proceed on that premise without further notice.
 One usage in which the management table DT0 stores information of only one user is, for example, a mode in which the image processing device 3 is for home use (see the illustration of the image processing device 3B in FIG. 3). This image processing device 3 may be assumed to be used basically by only one user. Then, when another user exceptionally uses it, that user's biometric information (first biometric information) may be imported from another image processing device 3, or that user's biometric information (second biometric information) may be exported to another image processing device 3.
 The management table DT0 may be capable of storing two or more pieces of biometric information in association with one piece of account information. However, in the description of the embodiments, for convenience and unless otherwise noted, a mode in which one piece of biometric information is associated with one piece of account information may be taken as an example.
 The two or more pieces of biometric information may be, for example, mutually different pieces of biometric information of one user, such as fingerprints of different fingers, or fingerprints of the same finger acquired at different times. In the former case, for example, when authentication with one finger fails due to injury, aging, or the like, authentication can be performed with another finger. In the latter case, the probability that authentication fails when biometric information changes due to aging or the like is reduced.
 The two or more pieces of biometric information may also be two or more types of biometric information, as illustrated in FIG. 1. As the two or more types, two or more of the various kinds of biometric information (see Section 1.1.1) may be selected as appropriate. When two or more types of biometric information can be registered, then, as above, if authentication with one type fails for some reason other than fraud, authentication can be performed with another type, improving user convenience.
 Note that fingerprints of different fingers could also be regarded as different types of biometric information. However, for fingerprints of different fingers, the configuration of the detection unit 25 and the method of processing the detected raw information are the same. In the description of the embodiments, therefore, fingerprints of different fingers are treated as the same type of biometric information. In other words, biometric information of different types differs, for example, in at least one of the configuration of the detection unit 25 and the method of processing the detected information.
 As described above, two or more types of biometric information may, for example, be used selectively; that is, authentication may be performed for only one of the two or more types. In this mode, not all of the two or more types need be registered. However, registration of all and/or at least two of the types may be required. Furthermore, input of all and/or at least two of the types may be required at the time of authentication, in which case security is improved.
 Whether registration of two or more types (or two or more pieces) of biometric information is optional or mandatory, and/or whether two or more types (or pieces) are used selectively or required at authentication, may be common to all users, or may differ by user and/or by authority (the functions whose restrictions are lifted). For example, biometric information may be used selectively for general users, while two or more types (or pieces) may be required of administrators.
 Further, two or more pieces of biometric information associated with one piece of account information may belong to different persons. That is, a "user" is not limited to a "person" and may be a concept including an "account" (from another perspective, a user group).
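As a concrete sketch, the management table DT0 described in this section can be modeled as a mapping from account information to one or more biometric templates, with a per-account policy for how many presented items must match (for example, a stricter policy for administrators). All names, the byte-equality matching, and the policy encoding are hypothetical, for illustration only; a real device would use similarity scoring, not equality.

```python
from dataclasses import dataclass, field


@dataclass
class BiometricTemplate:
    kind: str    # e.g. "fingerprint", "voiceprint"
    data: bytes  # registered template


@dataclass
class AccountEntry:
    account_id: str
    templates: list = field(default_factory=list)  # one or more templates
    required_matches: int = 1  # e.g. 2 for administrators


class ManagementTable:
    """Sketch of DT0: account information associated, per user, with
    one or more pieces of biometric information."""

    def __init__(self):
        self._entries = {}

    def register(self, account_id, template, required_matches=1):
        entry = self._entries.setdefault(
            account_id,
            AccountEntry(account_id, required_matches=required_matches))
        entry.templates.append(template)

    def authenticate(self, account_id, presented) -> bool:
        """presented: list of BiometricTemplate captured at the device.
        Succeeds when at least `required_matches` presented items match
        registered templates."""
        entry = self._entries.get(account_id)
        if entry is None:
            return False
        registered = {(t.kind, t.data) for t in entry.templates}
        matches = sum(1 for p in presented if (p.kind, p.data) in registered)
        return matches >= entry.required_matches
```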
(3.2. Storage of biometric information)
 In the image processing device 3MA, biometric information detected by the detection unit 25 during use (second biometric information) may be stored, for example, in volatile memory (for example, the RAM 43). However, the second biometric information may also be stored in nonvolatile memory (for example, the auxiliary storage device 45). This second biometric information may be erased at an appropriate time (for example, upon completion of authentication).
 Biometric information that the image processing device 3MA imports from the image processing device 3MB during use (first biometric information) may be stored, for example, in the volatile memory (for example, the RAM 43) of the image processing device 3MA. However, the first biometric information may also be stored in the nonvolatile memory (for example, the auxiliary storage device 45) of the image processing device 3MA. This first biometric information may be erased at an appropriate time (for example, upon completion of authentication).
 Biometric information that the image processing device 3MA exports to the image processing device 3MB during use (second biometric information) may be stored, for example, in the volatile memory (for example, the RAM 43) of the image processing device 3MB. However, the second biometric information may also be stored in the nonvolatile memory (for example, the auxiliary storage device 45) of the image processing device 3MB. This second biometric information may be erased at an appropriate time (for example, upon completion of authentication).
The deletion of biometric information may involve overwriting the storage area where the biometric information was stored with other information, or initializing that storage area. That is, the deletion may render the biometric information unrecoverable. Alternatively, the deletion may involve erasing only the information on the address of the storage area where the biometric information is stored. In that case, the biometric information becomes inaccessible through normal processing in the image processing device 3, while room is left for recovery of the biometric information by a specialized service provider.
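The two deletion approaches described above can be contrasted in a minimal Python sketch. This is only an illustration of the concepts, not the patent's implementation; the names `storage`, `address_table`, and the record layout are hypothetical, with a byte buffer standing in for the memory and a dictionary standing in for the address information.

```python
def erase_by_overwrite(storage: bytearray, start: int, length: int) -> None:
    """Render the record unrecoverable by overwriting the storage
    area that held it with other information (zero bytes here)."""
    storage[start:start + length] = bytes(length)

def erase_by_dropping_address(address_table: dict, record_id: str) -> None:
    """Erase only the address information of the storage area.
    The bytes themselves remain: normal processing can no longer
    reach the record, but specialized recovery may still succeed."""
    del address_table[record_id]

# A 16-byte storage area holding a biometric record at offset 4, length 8.
storage = bytearray(b"....FEATURES....")
address_table = {"userB": (4, 8)}

erase_by_dropping_address(address_table, "userB")
assert b"FEATURES" in storage       # data still physically present

erase_by_overwrite(storage, 4, 8)
assert b"FEATURES" not in storage   # data destroyed, unrecoverable
```

The first function corresponds to deletion that makes the information unrecoverable; the second corresponds to deletion of only the address information.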
(4. First embodiment)
As described above, the first embodiment is a mode in which the image processing device 3MA executes the first method, importing the first biometric information (registered biometric information) of user B from the image processing device 3MB. A specific example is as follows.
(4.1. Outline of operation in the first embodiment)
FIG. 4 is a flowchart illustrating an example of the outline of the authentication procedure executed by the image processing device 3MA (from another perspective, by the control unit 29). Note that the various flowcharts, including FIG. 4, conceptually illustrate the operational procedures for ease of understanding; they do not necessarily match the actual procedures and may at times lack accuracy.
Through the process in FIG. 4, a user who attempts to use the image processing device 3 is authenticated, and depending on the authentication result, the restriction on a predetermined function is released (step ST6 or ST13) or maintained (step ST7 or ST14). Steps ST2 to ST7 illustrate the processing procedure for the case where user A, who is registered in the image processing device 3MA, uses the image processing device 3MA. Steps ST8 to ST14 illustrate the processing procedure for the case where user B, who is registered in an image processing device 3 other than the image processing device 3MA (here, 3MB), uses the image processing device 3MA.
The process in FIG. 4 may be started at an appropriate time. For example, it may be started when the image processing device 3MA is powered on or transitions from a sleep mode to an active mode. Alternatively, the process in FIG. 4 may be started when the user attempts to execute a predetermined function whose use is restricted (for example, printing). In the description of the embodiments, the former case is taken as an example unless otherwise specified.
In step ST1, the image processing device 3MA displays, on the display unit 35, an image prompting the user to input information specifying the user's own department (and/or the image processing device 3 in which the user is registered), for example by operating the operation unit 33. The image processing device 3MA then accepts that input.
In step ST2, based on the information input in step ST1, the image processing device 3MA determines whether the user attempting to use the image processing device 3MA is a user of another department (from another perspective, a user registered in another image processing device 3). The image processing device 3MA proceeds to step ST3 if the determination is negative, and to step ST8 if it is affirmative.
In step ST3, the image processing device 3MA displays, on the display unit 35, an image prompting the user to input biometric information. In step ST4, the image processing device 3MA detects the user's biometric information with the detection unit 25.
In step ST5, the image processing device 3MA performs biometric authentication based on the biometric information (second biometric information) detected in step ST4. Specifically, the image processing device 3MA determines whether biometric information (first biometric information) matching the second biometric information is registered in its own management table DT0. The image processing device 3MA proceeds to step ST6 if the determination is affirmative, and to step ST7 if it is negative.
In step ST6, the image processing device 3MA releases the restriction on the predetermined functions. In step ST7, on the other hand, the image processing device 3MA maintains the restriction on the predetermined functions. The functions restricted in step ST7 may be some or all of the various functions of the image processing device 3MA; the latter, in other words, is a mode in which use of the image processing device 3MA itself is prohibited. In FIG. 4, assuming the former, step ST7 is labeled "restricted operation."
In step ST8, the image processing device 3MA accesses the image processing device 3MB of the department specified by the user in step ST1. From another perspective, the image processing device 3MA requests the image processing device 3MB to export the biometric information (first biometric information) of user B registered in the image processing device 3MB. Then, in step ST9, the image processing device 3MA receives (imports) the first biometric information of user B.
Steps ST10 to ST14 are similar to steps ST3 to ST7. However, as the first biometric information (the registered biometric information compared with the detected second biometric information), the information imported in step ST9 is used instead of information registered in the management table DT0 of the image processing device 3MA. In FIG. 4, for convenience, the step corresponding to step ST3 is not illustrated before step ST10. Also illustrated is step ST11 (not shown before step ST5), which makes explicit that the authentication (comparison) is performed by the image processing device 3MA.
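The branch structure of FIG. 4 can be summarized in a minimal sketch, assuming in-memory tables and an exact-match comparison of biometric features (real biometric matching would use similarity scoring rather than equality; the names `authenticate`, `dt0_a`, and `imported_b` are hypothetical and not prescribed by this description):

```python
from typing import Dict, Optional

def authenticate(local_table: Dict[str, bytes], detected: bytes,
                 imported: Optional[bytes] = None) -> bool:
    """Compare the detected (second) biometric information against
    registered (first) biometric information. `imported` stands for
    information obtained from the other device in step ST9."""
    if imported is None:
        # ST3-ST5: own-department user; match against own table DT0.
        return detected in local_table.values()
    # ST10-ST12: other-department user; match against the import.
    return detected == imported

# ST6/ST13 release the restriction on success; ST7/ST14 maintain it.
dt0_a = {"userA": b"fp-A"}            # management table of device 3MA
imported_b = b"fp-B"                  # imported from device 3MB

assert authenticate(dt0_a, b"fp-A")                       # ST5 ok  -> ST6
assert not authenticate(dt0_a, b"fp-B")                   # ST5 fail -> ST7
assert authenticate(dt0_a, b"fp-B", imported=imported_b)  # ST12 ok -> ST13
```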
In the image processing device 3MA, only one function may require authentication, or two or more functions may. Also, a single authentication may allow one function to be executed repeatedly, or two or more functions to be executed. However, authentication may instead be requested each time a function is executed, requested once per type of function, or requested again when a function with a high security level is executed.
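The idea that one authentication may cover several functions while a high-security function requires re-authentication can be sketched as a simple policy check. The function names and security levels below are hypothetical examples, not part of this description:

```python
# Hypothetical security level per function; a higher number is stricter.
FUNCTION_LEVELS = {"copy": 1, "print": 1, "scan_to_email": 2}

def needs_reauth(function: str, granted_level: int) -> bool:
    """Return True when the requested function's security level
    exceeds the level granted by the current authentication,
    i.e. authentication must be requested again."""
    return FUNCTION_LEVELS[function] > granted_level

assert not needs_reauth("print", 1)      # covered by one authentication
assert not needs_reauth("copy", 1)       # a second function, same grant
assert needs_reauth("scan_to_email", 1)  # high security: authenticate again
```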
The biometric information acquired during use (the biometric information in step ST4 or ST10) may be deleted from the image processing device 3MA immediately after it is compared with the registered biometric information. However, the biometric information acquired during use may also be stored in the image processing device 3MA until an appropriate later time (for example, when the authenticated state is released) and used as appropriate. The biometric information acquired during use may also be used to update the registered biometric information.
Steps ST6 and ST13 have been described as identical, but they may differ. For example, in step ST13, the restriction on a specific function that is released in step ST6 need not be released. Similarly, steps ST7 and ST14 have been described as identical, but they too may differ. For example, in step ST14, use of a specific function that is not restricted in step ST7 may be restricted.
As understood from the earlier description that the network 10 connecting the image processing devices 3MA and 3MB may take various configurations, the communication in steps ST8 and ST9 may be performed in any appropriate manner.
For example, the import of user B's biometric information (step ST9) may take place within the private network 13A or via the public network 11. In the latter case, a VPN connection may or may not be used. The connection for importing user B's biometric information may be established at the stage of step ST8 or ST9, or may have been established beforehand.
Also, for example, in a mode in which the image processing devices 3MA and 3MB are included in the same LAN (or in other modes), unlike the description above, the image processing device 3MA may request the export of user B's first biometric information from all communication devices within the LAN without specifying the image processing device 3MB. The applicable communication device (the image processing device 3MB, in which user B's biometric information is registered) may then respond to the request.
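The broadcast-style variant just described can be sketched as follows, with each device's management table modeled as a dictionary; only the device holding the requested user answers, and all names here are hypothetical:

```python
from typing import Dict, List, Optional

def request_biometric_from_lan(devices: List[Dict[str, bytes]],
                               user_id: str) -> Optional[bytes]:
    """Without specifying device 3MB, ask every communication device
    on the LAN for the user's first biometric information; the device
    in which the user is registered responds to the request."""
    for device_table in devices:
        if user_id in device_table:
            return device_table[user_id]
    return None  # no device on the LAN has this user registered

lan = [{"userA": b"fp-A"},   # table of device 3MA
       {"userB": b"fp-B"}]   # table of device 3MB

assert request_biometric_from_lan(lan, "userB") == b"fp-B"
assert request_biometric_from_lan(lan, "userC") is None
```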
The procedure shown in FIG. 4 may be modified as appropriate.
For example, the user may be requested to input biometric information before being asked to input information specifying his or her own department. It may then be determined whether the input biometric information is registered in the management table DT0 of the image processing device 3MA (from another perspective, authentication may be performed). Depending on the success or failure of this authentication, it may be determined whether the user belongs to the own department or to another department. Then, if the authentication fails, the user of the other department may be asked (or not asked) to input information specifying his or her own department, and the import of user B's biometric information from the other image processing device 3MB may be requested.
(4.2. Example screens in the first embodiment)
When the process in FIG. 4 is executed, appropriate images may be displayed on the display unit 35. Examples are shown below. Here, a mode in which the display unit 35 constitutes a touch panel is assumed. FIGS. 5A to 6C are schematic diagrams showing examples of images displayed on the screen 35a of the display unit 35.
The image IG1 shown in FIG. 5A is displayed, for example, in step ST1. The image IG1 prompts the user to input an ID and a password (that is, account information) with the message "Please enter your ID and password" at the top. The user can, for example, select the input fields associated with "ID" and "Password" by an operation such as a tap, and then input characters (a broad concept including numerals and symbols) by operating a mechanical switch or a software keyboard.
The image IG1 also includes buttons labeled "Other-department user" and "Execute" at the bottom. By performing a predetermined operation (for example, a tap) on one of these buttons, the user can proceed to the next operation (next screen). Selecting one of these buttons also inputs information as to whether the user belongs to the department in which the image processing device 3MA is located (in other words, is registered in the image processing device 3MA) or belongs to another department.
If the "Other-department user" button is selected, an affirmative determination is made in step ST2, and the process proceeds to step ST8. If the "Execute" button is selected, a negative determination is made in step ST2, and the process proceeds to step ST3. In a mode in which an image (screen) is displayed before the image IG1, a button for returning to that screen may be displayed in the image IG1. Likewise, each of the other screens described later may include a button (not shown) for returning to the previous screen and/or a button (not shown) for returning to a specific screen.
The image IG3 shown in FIG. 5B is displayed, for example, when the "Other-department user" button is selected in the image IG1. The image IG3 is displayed, for example, at an appropriate time after the affirmative determination in step ST2 and before step ST8. In the image IG3, the label "Other-department user" at the top indicates that a screen for users of other departments is being displayed. As in the image IG1, input fields for the ID and password are displayed. At the bottom of the image IG3, a "Select own-department MFP" button is displayed. By performing a predetermined operation (for example, a tap) on this button, the user can proceed to the next operation (next screen).
The image IG5 shown in FIG. 5C is displayed, for example, when the "Select own-department MFP" button is operated in the image IG3. The image IG5 is displayed, for example, at an appropriate time after the affirmative determination in step ST2 and before step ST8. In the image IG5, the labels "Other-department user: MFP selection" and "Please select the MFP of your own department" at the top indicate that a screen is being displayed on which a user of another department selects the department to which the user belongs and/or the image processing device 3 in which the user is registered. The image IG5 also shows a list of selectable image processing devices 3. By an operation (for example, a tap) selecting one of the listed image processing devices 3, the user can select the image processing device 3 in which the user is registered. In more general terms, this selection is the input of information specifying the image processing device 3 in which the user is registered. The user can then proceed to the next operation (next screen) by performing a predetermined operation (for example, a tap) on the "OK" button at the bottom.
In step ST8, for example, the image processing device 3MA requests the user's biometric information from the image processing device 3 (3MB) selected in the image IG5. At this time, the image processing device 3MA transmits the account information input in the image IG1 or IG3 to the image processing device 3MB. The information transmitted here may be only the ID, or a combination of the ID and password. The image processing device 3MB exports to the image processing device 3MA the biometric information associated with account information matching the account information included in the received request.
The image IG7 shown in FIG. 6A is displayed, for example, when the "OK" button is operated in the image IG5. The image IG7 is displayed, for example, at an appropriate time after the affirmative determination in step ST2 (for example, after step ST9) and before step ST10. In addition to an image similar to the upper half of the image IG3, the image IG7 shows, in the field "MFP with registered biometric information," the image processing device 3 selected in the image IG5 (in the illustrated example, the MFP "B01" of "Department B, Section 1"). The message "Please have your fingerprint read" prompts the input of biometric information. As understood from the above, a mode in which a fingerprint is used as the biometric information is assumed here; of course, the biometric information may be other than a fingerprint. The user can have the fingerprint read by placing a finger on the detection unit 25; this is, naturally, the finger whose fingerprint was read at the time of registration. In step ST3, an image similar to the image IG7 may be displayed.
The image IG9 shown in FIG. 6B is displayed, for example, when, while the image IG7 is displayed, biometric information is detected by the detection unit 25 (step ST10) and the authentication succeeds (an affirmative determination is made in step ST12). In addition to an image similar to the upper half of the image IG7, the image IG9 indicates that the authentication succeeded with the label "Authentication complete." By performing a predetermined operation (for example, a tap) on the "Return to menu" button, the user can proceed (or return) to a screen for using the functions of the image processing device 3MA (a so-called menu screen and/or home screen).
Although not specifically illustrated, if the authentication fails (including the case where no biometric information could be detected within a predetermined time), for example, the previous screen or a specific screen (for example, the image IG1) may be displayed after a message indicating the failure is shown. Alternatively, together with the message indicating the failure, options such as re-entering the biometric information (retrying the same authentication method) and returning to a specific screen may be presented. In a mode in which the image processing device 3MA supports plural types of authentication methods, the options may include switching to another authentication method (including authentication using other biometric information).
In the illustrated example, which of the first biometric information registered in the image processing devices 3MA and 3MB is used is determined based on the user's self-report. As described above, the management table DT0 of the image processing device 3MA and the management table DT0 of the image processing device 3MB may overlap for some users. Accordingly, for example, a situation may arise in which user B's first biometric information is imported from the image processing device 3MB to the image processing device 3MA even though user B's first biometric information is registered in the image processing device 3MA. Also, even when user B has declared himself or herself a user of another department, the image processing device 3MA may check, before importing, whether it holds user B's first biometric information by referring to its own management table DT0.
The above screen examples may be put into practice in concrete ways or modified as appropriate.
For example, the input of the account information in the image IG1 (FIG. 5A) may or may not be a prerequisite for moving to the next screen (FIG. 5B, etc.). In the latter case, for example, the account information may be input on a subsequent screen (for example, the image IG3). In the former case, when "Other-department user" is selected in the image IG1, FIG. 5B may be skipped and the process may move to FIG. 5C.
Regarding the image IG1, when the user is user A registered in the image processing device 3MA, the input of the account information may or may not be required. Concerning the latter, even if no account information is input, as long as biometric authentication is performed, the image processing device 3MA can identify the account information from the biometric information by referring to its own management table DT0. Moreover, if the biometric information is directly linked to the information on function restrictions, as in the management table DT0 illustrated in FIG. 1, there is no need to identify user A's account information at all.
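The reverse lookup mentioned here, identifying account information from biometric information via the management table DT0, can be sketched as follows. Exact byte equality stands in for real similarity-based biometric matching, and the table contents are hypothetical:

```python
from typing import Dict, Optional

def account_from_biometric(dt0: Dict[str, bytes],
                           detected: bytes) -> Optional[str]:
    """Identify the account information from the detected biometric
    information by referring to the management table DT0."""
    for account, registered in dt0.items():
        if registered == detected:
            return account
    return None  # no registered biometric information matches

dt0 = {"userA": b"fp-A", "userC": b"fp-C"}

assert account_from_biometric(dt0, b"fp-A") == "userA"
assert account_from_biometric(dt0, b"fp-X") is None
```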
In a mode in which user A's account information is unnecessary as described above, unlike the example in FIG. 5A, the information prompting the input of the account information may be omitted, and only a display asking the user to declare whether he or she is a user of another department may be presented. Then, when the user declares himself or herself a user of the own department (user A), an image prompting the input of biometric information may be displayed without displaying an image prompting the input of account information. When the user declares himself or herself a user of another department (user B), an image prompting the input of account information (for example, an image similar to the image IG3 in FIG. 5B) may be displayed.
In a mode in which the account information is required when the user is user A registered in the image processing device 3MA, for example, authentication based on the account information may be performed in addition to the biometric authentication. In this case, security is improved.
Also, in a mode in which the image processing device 3MA supports plural types of authentication methods, an image for selecting an authentication method may be displayed at an appropriate time (before step ST1 and/or before the image IG1). As also mentioned in the description of FIG. 4, an image prompting the input of biometric information may be displayed first, and then, as necessary (when the user is a user of another department), the user may be prompted to input account information or information specifying the image processing device 3 in which the user is registered.
The input, by a user of another department, of the information specifying the image processing device 3 in which the user is registered may be a direct input of information specifying the image processing device 3 (for example, an address) rather than a selection from a list (FIG. 5C).
(4.3. Details of some operations in the first embodiment)
(4.3.1. Example of restrictions on imported biometric information)
FIG. 7 is a flowchart showing an example of the details of part of the procedure described with reference to FIG. 4. More specifically, FIG. 7 shows the details of the procedure (steps ST8 to ST14) for the case where the user using the image processing device 3MA is user B, who is registered in the image processing device 3MB. FIG. 7 may also be regarded as showing another process similar to the process in FIG. 4.
In the process of FIG. 7, user B's account information is transmitted from the image processing device 3MA to the image processing device 3MB, and only when authentication based on this account information succeeds is the image processing device 3MA permitted to import user B's biometric information (first biometric information) from the image processing device 3MB. The imported first biometric information is deleted when a predetermined condition is satisfied. These operations reduce the possibility of unintended leakage of biometric information. Specifically, the process is as follows.
In FIG. 7, the "MFP-A" column shows the procedure executed by the image processing device 3MA (its control unit 29), and the "MFP-B" column shows the procedure executed by the image processing device 3MB (its control unit 29).
In step ST41, the image processing device 3MA acquires user B's account information. The account information here is assumed to include a password (that is, to be usable for authentication). In step ST42, the image processing device 3MA transmits the account information acquired in step ST41 to the image processing device 3MB. This transmission of the account information may be, for example, the request for biometric information in step ST8, or part of that request.
As understood from the description so far, the acquisition of the account information (step ST41), the identification of the image processing device 3MB in which user B is registered (image IG5), and the detection of user B's biometric information (second biometric information) (step ST10) may be performed in any order. Here, the identification of the image processing device 3MB is assumed to be performed at an appropriate time before the transmission of the account information (step ST42), and its illustration is omitted. As in FIG. 4, the example taken here is a mode in which the biometric information is detected after the import of user B's biometric information (first biometric information).
In step ST43, the image processing device 3MB determines whether account information matching the account information received in step ST42 is registered in its own management table DT0. That is, the image processing device 3MB performs authentication based on the account information. If the authentication succeeds, the image processing device 3MB exports, to the image processing device 3MA, the biometric information stored in association with the matching account information (that is, user B's first biometric information) (step ST45). If the authentication fails, the image processing device 3MB transmits a notification indicating the failure to the image processing device 3MA (step ST44).
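The handling of steps ST43 to ST45 on the image processing device 3MB side can be sketched as follows. This is a minimal illustration only: the table layout, the plaintext password comparison, and names such as `handle_export_request` are hypothetical simplifications (a real device would at minimum compare hashed credentials):

```python
from typing import Dict, Optional, Tuple

def handle_export_request(dt0_b: Dict[str, dict],
                          user_id: str,
                          password: str) -> Tuple[str, Optional[bytes]]:
    """ST43: authenticate the received account information against the
    device's own management table DT0. ST45: only on success, export
    the first biometric information stored in association with the
    matching account. ST44: otherwise, notify the failure."""
    entry = dt0_b.get(user_id)
    if entry is None or entry["password"] != password:
        return ("auth_failed", None)       # ST44: failure notification
    return ("ok", entry["biometric"])      # ST45: export to device 3MA

dt0_b = {"userB": {"password": "pw", "biometric": b"fp-B"}}

assert handle_export_request(dt0_b, "userB", "pw") == ("ok", b"fp-B")
assert handle_export_request(dt0_b, "userB", "bad") == ("auth_failed", None)
```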
 Note that, as already described, the flowcharts illustrated in the embodiments are conceptual. Accordingly, in step ST45, for example, the actual processing may be such that a notification indicating successful authentication is first transmitted from the image processing device 3MB to the image processing device 3MA, and the biometric information is then transmitted from the former to the latter.
 In step ST46, the image processing device 3MA determines, based on the notification from the image processing device 3MB (step ST44 or ST45), whether the authentication in the image processing device 3MB has succeeded. If the determination is affirmative (in other words, if the biometric information of user B has been successfully imported), the image processing device 3MA proceeds to step ST47; if negative, it proceeds to step ST50. Steps ST47 to ST50 are the same as steps ST10 to ST14 in FIG. 4 (illustration of step ST11 is omitted).
 In step ST51, the image processing device 3MA determines whether a predetermined erasure condition is satisfied. If the determination is affirmative, the image processing device 3MA erases the first biometric information of user B imported in step ST45 (step ST52); if negative, it repeats step ST51 (stands by).
 Note that in FIG. 7, for convenience, the route from step ST50 to step ST51 includes the case of a negative determination in step ST46 (the case where no import has been performed). In practice, when no import has been performed, the processing of steps ST51 and ST52 need not be performed. Further, unlike the illustrated example, the first biometric information may be erased when the authentication is completed (immediately after step ST48).
 The erasure condition in step ST51 may be set as appropriate.
 For example, the erasure condition may include the elapse of a predetermined time. The starting point of the predetermined time is arbitrary. For example, the starting point may be the time at which the import of the first biometric information of user B is completed, the time at which the authentication of user B is completed, the time at which the execution of a function whose restriction has been lifted based on the result of the authentication of user B starts or completes, the time at which the authenticated state is canceled, or the time at which user B performs the last operation on the UI unit 23.
 The entity that sets the predetermined time and/or its starting point may be any of the manufacturer of the image processing device 3MA, the administrator of the image processing device 3MA, and user B (the user whose biometric information is to be erased). The specific length of the predetermined time is arbitrary: for example, it may be 1 second or more, 10 seconds or more, 30 seconds or more, 1 minute or more, 10 minutes or more, 30 minutes or more, 1 hour or more, 1 day or more, 1 week or more, or 1 month or more, and may be 1 month or less, 1 week or less, 1 day or less, 1 hour or less, 30 minutes or less, 10 minutes or less, 1 minute or less, 30 seconds or less, 10 seconds or less, or 1 second or less; any of these lower limits may be combined with any of these upper limits as long as no contradiction arises.
 Further, for example, the erasure condition may include the completion of the execution of a function whose restriction has been lifted based on the result of the authentication. For example, in a mode in which authentication is required for each single execution of a function (for example, printing, scanning, copying, or transmission or reception of data), the first biometric information of user B may be erased when the execution of that function is completed (not including, for example, an interruption due to an abnormality or the like).
 Further, for example, the erasure condition may include the performance, on the UI unit 23, of an operation by user B (or another user) for erasing the imported first biometric information of user B (in other words, a predetermined operation).
 When the first biometric information of user B imported by the image processing device 3MA is erased upon the elapse of the predetermined time or the completion of the execution of the function, this can be regarded as automatic erasure of the biometric information when an automatic erasure condition is satisfied. Conversely, erasure by the predetermined operation described in the preceding paragraph is not automatic but intentional on the part of the user. Note that the mode in which the erasure condition includes the elapse of a predetermined time is a mode intended to erase the first biometric information of user B automatically; unless otherwise specified, the possible starting points of the predetermined time do not include the time of the operation for erasing the first biometric information described in the preceding paragraph. In other words, when the predetermined operation described in the preceding paragraph is performed, the first biometric information is basically erased immediately.
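 The automatic and intentional erasure conditions described above could be combined into a single check, sketched roughly as follows in Python (the function name and parameters are hypothetical, introduced only for illustration):

```python
# Hypothetical sketch of the erasure condition of step ST51: erasure is
# automatic after a timeout or after the unlocked function completes, and
# immediate when the user performs the predetermined erasing operation.

def erasure_condition_met(now, timer_start, timeout_s,
                          function_completed, manual_erase_requested):
    if manual_erase_requested:      # intentional erasure: applies at once
        return True
    if function_completed:          # automatic: function execution finished
        return True
    # Automatic: the predetermined time has elapsed since the configured
    # starting point (import completed, authentication completed, last UI
    # operation, and so on).
    return (now - timer_start) >= timeout_s
```

A caller would poll this check in the loop of step ST51 and perform the erasure of step ST52 as soon as it returns true.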
 The procedure shown in FIG. 7 may be modified as appropriate. For example, in steps ST41 to ST43, account information is used as the authentication information indicating the legitimacy of user B (and information based on the authentication information; the same applies hereinafter). However, the authentication information may be any of various kinds of authentication information different from the biometric information to be imported, and is not limited to account information. For example, the various kinds of authentication information exemplified in Section 1.1.3 may be used.
 Further, in FIG. 7, the imported first biometric information is erased in step ST52. In addition to, or instead of, the imported first biometric information, the second biometric information of user B detected in step ST47 (and/or the second biometric information of user A detected in step ST4) may be erased.
(4.3.2. Example of reusing imported biometric information)
 The first biometric information of user B imported by the image processing device 3MA from the image processing device 3MB may or may not be reused before being erased in step ST52. Reuse reduces, for example, the load that imports place on the network 10. An example of a mode involving reuse is described below.
 FIG. 8 is a flowchart showing an example of the details of part of the procedure described with reference to FIG. 7. More specifically, FIG. 8 shows the details of the procedure after successful authentication (steps ST49 to ST52). Note that when the authentication fails (when the processing proceeds to step ST50 instead of step ST49), for example, steps ST55 and ST56 described later are not performed. Note also that FIG. 8 may be regarded as showing another process similar to the process of FIG. 7.
 Step ST49 is as described with reference to FIG. 7. After the authentication (step ST48) succeeds, the image processing device 3MA determines whether a re-authentication condition is satisfied (step ST55). Note that the first determination of the re-authentication condition may be performed after step ST48 and before step ST49. If the determination is affirmative, the image processing device 3MA proceeds to step ST56; if negative, it skips step ST56 and proceeds to step ST51.
 In step ST56, as in steps ST47 to ST50, the image processing device 3MA displays a prompt for re-input of the biometric information, re-detects the biometric information (the second biometric information) with the detection unit 25, and performs authentication by comparing the second biometric information with the first biometric information. However, the first biometric information is not newly imported; the information imported in step ST45 and held by the image processing device 3MA is used. Then, depending on the authentication result, the image processing device 3 maintains the lifting of the restriction on the functions (maintains the authenticated state) or restricts the functions (cancels the authenticated state). When the authenticated state is canceled, for example, execution of the restricted functions is prohibited after the function (task) being executed is completed (or without waiting for its completion).
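 The reuse in step ST56 can be sketched as follows. This is a hypothetical Python illustration only: the class and attribute names are assumptions, and biometric matching is reduced to simple equality, whereas a real matcher would compare templates with a similarity score.

```python
# Hypothetical sketch of step ST56: re-authentication compares a newly
# detected second biometric with the first biometric already imported in
# step ST45 (held locally; no new import over the network).

class AuthSession:
    def __init__(self, held_first_biometric):
        self.held_first = held_first_biometric  # imported in ST45
        self.authenticated = True               # restriction lifted (ST49)

    def reauthenticate(self, redetected_second):
        # Maintain the authenticated state on a match; cancel it otherwise.
        self.authenticated = (redetected_second == self.held_first)
        return self.authenticated
```

Because `held_first` is reused until the erasure condition of step ST51 is met, repeated re-authentications do not add import traffic to the network 10.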
 Thereafter, the image processing device 3 proceeds to step ST51. Steps ST51 and ST52 are as described with reference to FIG. 7. However, when a negative determination is made in step ST51, the image processing device 3 does not repeat step ST51 but returns, for example, to step ST55. As a result, for example, re-input of the biometric information is requested repeatedly.
 Note that when the re-authentication in step ST56 fails, the processing may be as described above, or processing different from the above may be performed. For example, instead of proceeding to step ST51, the image processing device 3 may prompt a retry of the authentication and repeat step ST56. Alternatively, the image processing device 3 may proceed to step ST51 and, on a negative determination in step ST51, repeat step ST51 without returning to step ST55. In other words, the first biometric information may be erased without re-authentication that reuses the first biometric information being performed.
 The re-authentication condition may be set as appropriate. For example, the re-authentication condition may include the elapse of a predetermined time since the most recent past detection of the biometric information of user B by the detection unit 25 (from another viewpoint, the most recent past authentication of user B). In this case, for example, the probability that a person other than user B fraudulently uses the authority of user B is reduced. The most recent past detection (authentication) here is, for example, the detection in step ST47 (the initial authentication) or the detection in step ST56 (re-authentication). The predetermined time may be set by any of, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, and the individual user (here, user B).
 Further, for example, the re-authentication condition may include the performance, on the UI unit 23, of an operation instructing the execution of a predetermined function. In this case as well, for example, the probability that a person other than user B fraudulently uses the authority of user B is reduced. The predetermined function is, for example, a function related to at least one of the image processing unit 31 and the communication unit 27, and may be included in the one or more functions whose restriction has been lifted by the successful authentication. As the predetermined function, all of the one or more functions may be selected, or only some of them (for example, functions requiring relatively high security) may be selected. The predetermined function may be selected by any of, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, and the individual user (here, user B).
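 The two re-authentication triggers just described (elapsed time and a request for a predetermined function) might be combined as follows; this is a hypothetical Python sketch with illustrative names only:

```python
# Hypothetical sketch of the re-authentication condition (step ST55):
# re-authentication is required when a predetermined time has elapsed since
# the most recent detection (ST47 or ST56), or when a predetermined
# (e.g. relatively high-security) function is requested on the UI unit 23.

def reauth_required(now, last_detection, timeout_s,
                    requested_function, protected_functions):
    if (now - last_detection) >= timeout_s:           # elapsed-time trigger
        return True
    return requested_function in protected_functions  # function trigger
```

The set of protected functions and the timeout are both configuration choices; as noted above, they may be fixed by the manufacturer, the administrator, or the individual user.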
 The imported first biometric information may also be reused before erasure in modes other than the above. For example, before erasing the first biometric information of user B, the image processing device 3MA may hold the account information of user B entered in step ST41 in association with the first biometric information of user B. When, after user B has once finished using the image processing device 3MA, step ST41 is executed again and the previous first biometric information has not yet been erased, the biometric authentication of steps ST47 and ST48 may be performed using the previous first biometric information stored in association with account information that matches the re-entered account information (that is, without performing an import).
 Steps ST55 and ST56 may also be performed after step ST6. That is, the request for re-authentication may also be made to user A, who is registered in the image processing device 3MA. In this case, however, the first biometric information compared with the detected second biometric information is the information that the image processing device 3MA holds in its own management table DT0. Unlike the first biometric information of user B, the first biometric information of user A registered in the management table DT0 is neither imported nor erased automatically.
 Unlike in the above description, the re-authentication may be performed by importing the first biometric information of user B.
 Further, unlike in the above description, the re-authentication may use, in place of the first biometric information (the information registered in the image processing device 3MA or imported from the image processing device 3MB), the second biometric information detected in step ST4 or ST10 (ST47). In this case, since the second biometric information detected in step ST4 or ST10 (ST47) is more likely than the first biometric information to reflect the user's current physical condition and the like, the probability that the re-authentication fails due to physical condition or the like is reduced.
(5. Second embodiment)
 As already described, the second embodiment is a mode in which the image processing device 3MA executes the second method, that is, exports the second biometric information (detected biometric information) of user B to the image processing device 3MB. Specifically, this is, for example, as follows.
(5.1. Outline of operation in the second embodiment)
 FIG. 9 is a flowchart showing an example of the outline of the procedure related to authentication executed by the image processing device 3MA (from another viewpoint, the control unit 29). This figure corresponds to FIG. 4 of the first embodiment. Steps ST1 to ST7 and ST12 to ST14 are the same as those in FIG. 4. Like steps ST8 to ST14 in FIG. 4, steps ST61 to ST64 and ST12 to ST14 are processing for the case where the user using the image processing device 3MA is a user of another department (here, user B).
 In step ST61, the image processing device 3MA detects the biometric information of user B (the second biometric information), as in step ST4. In step ST62, the image processing device 3MA accesses the image processing device 3MB and, for example, requests authentication of user B. In step ST63, the image processing device 3MA exports the second biometric information of user B detected in step ST61 to the image processing device 3MB.
 In step ST64, the image processing device 3MB determines whether biometric information (first biometric information) matching the second biometric information exported in step ST63 exists in its own management table DT0. That is, the image processing device 3MB performs biometric authentication. The image processing device 3MB then notifies the image processing device 3MA of the authentication result. Note that, as processing of the image processing device 3MA, step ST64 can be regarded as a step of receiving the authentication result from the image processing device 3MB.
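 The matching performed on device 3MB in step ST64 can be sketched as follows (a hypothetical Python illustration; as before, DT0 is modeled as a dictionary and matching is reduced to equality):

```python
# Hypothetical sketch of steps ST63-ST64 (second embodiment): device 3MA
# exports the detected second biometric; device 3MB matches it against its
# own management table DT0 and notifies 3MA of the result.

def verify_exported_biometric(dt0_of_3mb, exported_second):
    matched = any(entry["first_biometric"] == exported_second
                  for entry in dt0_of_3mb.values())
    return "success" if matched else "failure"  # notification sent to 3MA
```

In contrast to the first embodiment, the registered template never leaves device 3MB; only the freshly detected sample and the verdict cross the network.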
 Steps ST12 to ST14 are as described with reference to FIG. 4. However, unlike in FIG. 4, the determination in step ST12 is based on the notification from the image processing device 3MB, not on authentication performed by the image processing device 3MA itself.
 In the image processing device 3MB, the second biometric information exported from the image processing device 3MA (step ST63) may be erased from the image processing device 3MB immediately after the comparison with the registered biometric information is performed. However, the exported second biometric information may instead be stored in the image processing device 3MB until an appropriate later time (for example, the time at which the authenticated state is canceled) and used as appropriate. The exported second biometric information may also be used to update the registered first biometric information.
 The description of the communication in steps ST8 and ST9 of FIG. 4 may be applied as appropriate to the communication in steps ST62 and ST63. To be clear, the export may be, for example, within the private network 13A or via the public network 11. In the latter case, a VPN connection may or may not be used. Further, the connection for the export may be established at the stage of step ST62 or ST63, or may have been established earlier.
 Further, for example, in a mode in which the image processing devices 3MA and 3MB are included in the same LAN (or in another mode), unlike in the above description, the image processing device 3MA may inquire of all communication devices in the LAN whether they hold the first biometric information of user B, without identifying the image processing device 3MB in advance. The relevant communication device (the image processing device 3MB in which the first biometric information of user B is registered) may reply affirmatively to the inquiry. The image processing device 3MA may then export the detected second biometric information of user B to the sender of the affirmative reply.
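 The LAN-wide inquiry can be sketched as a simple discovery loop. This hypothetical Python sketch abstracts the network exchange into a dictionary lookup; the function and variable names are assumptions:

```python
# Hypothetical sketch of the LAN-wide inquiry described above: device 3MA
# asks every device on the LAN whether it holds first biometric information
# for user B, then exports the detected second biometric to the first
# device that replies affirmatively.

def find_registered_device(lan_devices, user_id):
    # lan_devices maps a device name to the set of user IDs it registers.
    for device_name, registered_users in lan_devices.items():
        if user_id in registered_users:  # affirmative reply
            return device_name
    return None                          # no device holds the template
```

In a real system the inquiry would be a broadcast or multicast message and the replies asynchronous, but the selection logic is the same: export only to a device that has affirmed it holds the user's registration.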
 The procedure shown in FIG. 9 may be modified as appropriate. For example, the order of steps ST61 and ST62 may be reversed.
(5.2. Example of screens in the second embodiment)
 When the processing of FIG. 9 is executed, appropriate images may be displayed on the display unit 35. The screens exemplified in the description of the first embodiment (Section 4.2, FIGS. 5A to 6B) may be used in the second embodiment. For example, the screens shown in FIGS. 5A to 6B may be used as they are in the second embodiment. In other words, the user need not be able to distinguish between the first embodiment and the second embodiment. The various descriptions relating to FIGS. 5A to 6B may also be applied as appropriate to the second embodiment, with specific terms and the like (for example, symbols indicating steps) replaced, as long as no contradiction or the like arises.
(5.3. Details of some operations in the second embodiment)
(5.3.1. Example of restrictions on exported biometric information)
 FIG. 10 is a flowchart showing an example of the details of part of the procedure described with reference to FIG. 9. More specifically, FIG. 10 shows the details of the procedure (steps ST61 to ST64 and ST12 to ST14) for the case where the user using the image processing device 3MA is user B, who is registered in the image processing device 3MB. Note that FIG. 10 may be regarded as showing another process similar to the process of FIG. 9.
 In the processing of FIG. 10, the account information of user B is transmitted from the image processing device 3MA to the image processing device 3MB, and only when authentication based on this account information succeeds is the image processing device 3MA permitted to export the biometric information of user B (the second biometric information) to the image processing device 3MB. Further, the second biometric information detected by the image processing device 3MA and the second biometric information exported to the image processing device 3MB are each erased when a predetermined condition is satisfied. These operations reduce the possibility of unintended leakage of biometric information. Specifically, this is as follows.
 Steps ST41 to ST44 are the same as those in FIG. 7, and the description of FIG. 7 may be applied to them. As in FIG. 7, the acquisition of the account information (step ST41), the identification of the image processing device 3MB in which user B is registered (image IG5), and the detection of the biometric information of user B (the second biometric information) (step ST61) may be performed in any order as long as no contradiction or the like arises. As in FIG. 7, illustration of the identification of the image processing device 3MB is omitted. It is also assumed that the detection of the biometric information is performed at an appropriate time before the export of the detected biometric information (step ST73), and its illustration is omitted.
 When, in step ST43, the authentication based on the account information received in step ST42 succeeds, the image processing device 3MB transmits a notification indicating the successful authentication to the image processing device 3MA (step ST71). In step ST72, the image processing device 3MA determines, based on the notification from the image processing device 3MB (step ST44 or ST71), whether the authentication based on the account information has succeeded.
 If the determination is affirmative, the image processing device 3MA proceeds to step ST73; if negative, it proceeds to step ST78. Step ST73 is the same as step ST63: the image processing device 3MA exports the biometric information of user B that it has detected (the second biometric information) to the image processing device 3MB. Step ST78 is the same as step ST14: the image processing device 3MA does not lift the restriction on the functions.
 Steps ST74 and ST75 are the same as step ST64. That is, the image processing device 3MB performs authentication by determining whether the second biometric information of user B exported from the image processing device 3MA matches the first biometric information of user B that it holds (step ST74), and transmits the authentication result to the image processing device 3MA (step ST75).
 Steps ST76, ST77, and ST78 are the same as steps ST12, ST13, and ST14, and their description is omitted.
 After the export (step ST73) is completed (in the illustrated example, after step ST77), the image processing device 3MA determines whether a predetermined erasure condition is satisfied (step ST79). If the determination is affirmative, the image processing device 3MA erases the second biometric information of user B detected in step ST61 (step ST80); if negative, it repeats step ST79 (stands by). Note that, unlike the illustrated example, the second biometric information may be erased when the export is completed (immediately after the export). In the illustrated example, the determination of whether the erasure condition is satisfied is made when the biometric authentication succeeds (when an affirmative determination is made in step ST76). When the biometric authentication fails, for example, the second biometric information of user B may be erased immediately, or the determination of whether the erasure condition is satisfied may be made as in the case of successful biometric authentication.
 After the biometric authentication (step ST74) succeeds (in the illustrated example, after step ST75), the image processing device 3MB determines whether a predetermined erasure condition is satisfied (step ST81). If the determination is affirmative, the image processing device 3MB erases the second biometric information of user B exported from the image processing device 3MA in step ST73 (step ST82); if negative, it repeats step ST81 (stands by). Note that, unlike the illustrated example, the second biometric information may be erased when the biometric authentication is completed (immediately after the biometric authentication). When the biometric authentication fails, for example, the second biometric information of user B may be erased immediately, or the determination of whether the erasure condition is satisfied may be made as in the case of successful biometric authentication.
 The erasure conditions in steps ST79 and ST81 may be set as appropriate.
 For example, the erasure condition may include the elapse of a predetermined time. For example, regarding the erasure condition in step ST79, the starting point of the time measurement may be the time at which the export of the second biometric information of user B (step ST73) is completed, the time at which the result of the biometric authentication is received from the image processing device 3MB (step ST75), the time at which execution of a function whose restriction has been lifted based on the result of the authentication of user B starts or is completed, the time at which the authenticated state is canceled, or the time at which user B last operates the UI unit 23. Also, for example, regarding the erasure condition in step ST81, the starting point of the time measurement may be the time at which the biometric authentication (step ST74) is completed or the time at which the result of the biometric authentication is transmitted to the image processing device 3MA (step ST75). The entity that sets the predetermined time and/or the starting point of the time measurement may be any of the manufacturer of the image processing device 3MA or 3MB, the administrator of the image processing device 3MA or 3MB, and user B (the user whose biometric information is to be erased).
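A time-based erasure condition of the kind just described can be sketched as follows, assuming the chosen starting point (export completed, result received, etc.) is recorded explicitly. The class and method names are illustrative assumptions.

```python
import time

# Illustrative sketch of a time-based erasure condition: a starting point
# is recorded, and the condition is satisfied once a predetermined time
# has elapsed from that point.

class ErasureTimer:
    def __init__(self, timeout_seconds):
        self.timeout_seconds = timeout_seconds
        self.start = None

    def mark_start(self, now=None):
        # Called at the chosen starting point, e.g. when the export of
        # step ST73 completes or when the result of step ST75 arrives.
        self.start = time.monotonic() if now is None else now

    def condition_met(self, now=None):
        # Returns True once the predetermined time has elapsed.
        if self.start is None:
            return False
        current = time.monotonic() if now is None else now
        return (current - self.start) >= self.timeout_seconds
```

Passing `now` explicitly keeps the sketch testable; a device implementation would rely on the monotonic clock alone.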
 Also, for example, the erasure condition in step ST79 may include the completion of execution of a function whose restriction has been lifted based on the result of the authentication. For example, in a mode in which authentication is required for each execution of a function (for example, printing, scanning, copying, or transmission or reception of data), the second biometric information of user B may be erased when the execution of the function is completed (which, for example, does not include interruption due to an abnormality or the like).
 Also, for example, the erasure condition in step ST79 may be that an operation for erasing the second biometric information of user B (in other words, a predetermined operation) is performed on the UI unit 23 of the image processing device 3MA by user B (or another user).
 Also, for example, the erasure condition in step ST81 may be that the image processing device 3MB has received, from the image processing device 3MA, a request to erase the second biometric information of user B. As for the condition under which the image processing device 3MA transmits the above request to the image processing device 3MB, the description of the erasure condition in step ST79 may be applied. The condition for transmitting the above request and the erasure condition in step ST79 may be the same as or different from each other.
 The procedure shown in FIG. 10 may be modified as appropriate. For example, as also described with reference to FIG. 7, the authentication information indicating the validity of user B used in step ST43 (and information based on the authentication information; the same applies hereinafter) may be any of various kinds of authentication information different from the biometric information to be exported, and is not limited to account information.
(5.3.2. Example of reuse of exported biometric information)
 The second biometric information of user B (the detected information) that is erased in step ST80 or ST82 may or may not be reused.
 When it is reused, for example, the procedure shown in FIG. 8 may be applied to the reuse of the second biometric information erased in step ST80 or ST82. In the application to step ST80, for example, steps ST49, ST51, and ST52 in FIG. 8 may be replaced with steps ST77, ST79, and ST80, respectively. In the application to step ST82, for example, the lifting of the restriction in step ST49 in FIG. 8 may be replaced with the authentication in step ST74, and steps ST51 and ST52 may be replaced with steps ST81 and ST82, respectively.
 The specific contents of steps ST55 and ST56 in FIG. 8 may differ from those in the first embodiment. For example, they are as follows.
 For example, in the reuse of the second biometric information erased in step ST80, the image processing device 3MA exports, in step ST56, the above second biometric information to the image processing device 3MB and requests re-authentication. Through such an operation, for example, re-authentication by the image processing device 3MB can be realized without requiring user B to re-input biometric information. Note that, when the second biometric information is exported for re-authentication, success of the authentication using the account information (step ST43) may or may not be a prerequisite, as in the export for the first authentication (step ST73).
 The re-authentication condition (step ST55) in the case of performing the operation related to the reuse of the second biometric information erased in step ST80 as described above is arbitrary.
 For example, after the first authentication is performed and the restriction on functions is lifted (after step ST77), authentication by the image processing device 3MB may be required when the image processing device 3MA requests establishment of a connection (for example, a VPN connection) to the image processing device 3MB. In such a mode, the re-authentication condition may include the fact that establishment of the above connection has become necessary. Note that the establishment of the connection may be an end in itself and be instructed by user B, or may be required incidentally to the use of some function (for example, transmission or reception of data).
 Also, for example, in a case where a connection (for example, a VPN connection) from the image processing device 3MA to the image processing device 3MB is established as a result of the success of the authentication in step ST74, or where a connection is established by re-authentication as described above, the connection may be disconnected due to some abnormality (for example, a communication failure). In this case, re-authentication becomes necessary. Accordingly, the re-authentication condition may be that a connection requiring authentication has been unintentionally disconnected.
 Also, in the reuse of the second biometric information erased in step ST82, for example, the re-authentication condition in step ST55 may be that the image processing device 3MA has requested the image processing device 3MB to authenticate user B. Then, in the re-authentication in step ST56, instead of comparing the second biometric information of user B newly exported from the image processing device 3MA with the first biometric information of user B registered in its own management table DT0, the image processing device 3MB may compare the newly exported second biometric information of user B with the second biometric information acquired previously, before being erased in step ST82. In this case, the previously acquired second biometric information is more likely than the registered first biometric information to reflect the current physical condition and the like of user B, and consequently an improvement in the accuracy of the authentication can be expected.
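The preference described above, in which a recently acquired template is used as the comparison reference when available, can be sketched as follows. The matcher `match_score` is a toy stand-in for a real biometric comparison, and all names are illustrative assumptions.

```python
# Hedged sketch of the comparison in the re-authentication of step ST56:
# second biometric information acquired earlier (and not yet erased in
# step ST82) is preferred over the first biometric information registered
# in the management table DT0.

def match_score(a, b):
    # Toy similarity: fraction of positions at which the templates agree.
    return sum(x == y for x, y in zip(a, b)) / max(len(a), len(b))


def reauthenticate(new_template, registered_first, cached_second=None,
                   threshold=0.8):
    # Use the cached recent template as the reference when it exists.
    reference = cached_second if cached_second is not None else registered_first
    return match_score(new_template, reference) >= threshold
```

A real matcher would operate on feature vectors rather than strings; the point of the sketch is only the choice of reference template.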
 In the case where the second biometric information erased in step ST82 is reused as described above, it is a prerequisite that the authentication in step ST74 has succeeded and the validity of the second biometric information has been confirmed. The request for authentication from the image processing device 3MA may be made, for example, in a situation where user B newly requests authentication from the image processing device 3MA to lift the restriction on functions, or in a situation where, after the restriction on functions has been lifted, authentication is required to establish a connection (for example, a VPN) from the image processing device 3MA to the image processing device 3MB. Also, the second biometric information newly exported from the image processing device 3MA may be newly detected information, or may be information reused before being erased in step ST80.
(6. Third embodiment)
 In the first and second embodiments, when the biometric authentication succeeds, the restriction on functions is lifted. However, authentication information may be transmitted to the server 5 on the condition that the biometric authentication has succeeded, and the restriction on functions may be lifted on the condition that authentication based on that authentication information has succeeded. In this case, two-step authentication is required, namely the authentication (biometric authentication) in the image processing device 3 and the authentication in the server 5, so that security is improved. Note that such a mode can also be regarded as a mode in which the restriction on functions is lifted based on the biometric authentication, even though further authentication is performed after the biometric authentication.
 FIG. 11 is a block diagram showing the configuration of a communication system 1 according to the third embodiment.
 The image processing device 3 has, for example, the configuration described with reference to FIGS. 1 to 3; in FIG. 11, the detection unit 25, the control unit 29, the image processing unit 31, and the auxiliary storage device 45 are extracted and shown. As described above, the auxiliary storage device 45 holds the management table DT0, a part of which is shown in FIG. 11 as the comparison table DT1. The comparison table DT1 holds, for each user, account information (an ID and a password) and one or more pieces of biometric information in association with each other.
 The server 5 includes, for example, a verification unit 5a and a nonvolatile memory 5b. The verification unit 5a is constructed, for example, by a CPU executing a program stored in a ROM and/or an auxiliary storage device, similarly to the control unit 29 of the image processing device 3. The nonvolatile memory 5b is constituted by, for example, an auxiliary storage device and stores a verification table DT2. The verification table DT2 holds account information (an ID and a password) for each user.
 The image processing device 3 (3MA) performs biometric authentication of a user, as in the first and second embodiments. When the biometric authentication succeeds (when an affirmative determination is made in step ST5 or ST12 in FIG. 4 or FIG. 9), the image processing device 3 does not immediately lift the restriction on functions but requests authentication from the server 5.
 Specifically, for example, when an affirmative determination is made in step ST5, the image processing device 3MA transmits to the server 5 the account information that is associated, in the comparison table DT1 held by the image processing device 3MA itself, with the biometric information of user A whose authentication has succeeded. Also, when an affirmative determination is made in step ST12, the image processing device 3MA transmits to the server 5 the account information of user B input in step ST41 of FIG. 7 or FIG. 9.
 Upon receiving the account information, the server 5 determines whether account information matching the received account information is registered in the verification table DT2. Authentication is thereby performed. That is, if account information matching the received account information is registered, the authentication succeeds; otherwise, the authentication fails. The server 5 then transmits the authentication result (success or failure of the authentication) to the image processing device 3 (3MA) that transmitted the account information.
 Upon receiving the authentication result, the image processing device 3MA lifts the restriction on functions if the authentication result indicates that the authentication has succeeded (step ST6 or ST13). Otherwise, the image processing device 3MA restricts the functions (step ST7 or ST14). Note that, when the biometric authentication fails in step ST5 or ST12, the functions are of course restricted without the account information being transmitted to the server 5 (step ST7 or ST14).
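The two-step flow of the third embodiment can be sketched as follows, under the assumption that the server-side check is a simple lookup against the verification table DT2. Names such as `VerificationServer` and `two_step_authenticate` are illustrative and not part of the embodiment.

```python
# Minimal sketch of the two-step authentication of the third embodiment:
# biometric authentication in the image processing device 3, followed by
# account-information verification in the server 5.

class VerificationServer:
    """Stands in for server 5 with its verification table DT2."""

    def __init__(self, verification_table):
        self.verification_table = verification_table  # {user_id: password}

    def verify(self, user_id, password):
        return self.verification_table.get(user_id) == password


def two_step_authenticate(biometric_ok, account, server):
    # Step 1: biometric authentication in the image processing device.
    if not biometric_ok:
        return False  # account information is never sent to the server
    # Step 2: authentication in the server based on the account information.
    user_id, password = account
    return server.verify(user_id, password)
```

Because the restriction on functions is lifted only when this function returns `True`, both stages must succeed, which reflects the security improvement described above.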
 The third embodiment may be modified as appropriate. For example, the authentication information used for the authentication by the server 5 (and information based on the authentication information; the same applies hereinafter) is not limited to account information. The authentication information may be any of various kinds of authentication information different from the biometric information to be imported or exported. For example, the various kinds of authentication information exemplified in Section 1.1.3 may be used. That said, the authentication information may also be the biometric information to be imported or exported.
 In relation to the above, when the user using the image processing device 3MA is user B, the authentication information need not be information input from the UI unit 23 (for example, account information), and need not be information (for example, account information) input as information identifying user B for the import of the first biometric information of user B. For example, when the image processing device 3MA imports the first biometric information of user B from the image processing device 3MB, it may also import the authentication information stored in association with the first biometric information, and transmit the imported authentication information to the server 5.
(7. Operations related to the lifting of functional restrictions)
 The lifting of restrictions on functions based on an authentication result may be performed in various modes. Examples are given below.
(7.1. General operations related to the lifting of functional restrictions)
(7.1.1. Types of functions whose restriction lifting is controlled)
 A function whose restriction lifting is controlled based on an authentication result may be, for example, a function related to at least one of the image processing unit 31 (the printer 19 and/or the scanner 21) and the communication unit 27. Examples of functions to be restricted include the following. One or more of the functions listed below may be selected as appropriate and made subject to restriction. Note that the functions listed below may overlap with one another or may be inseparable from one another.
 First, printing by the printer 19 can be cited as a function to be restricted. Printing may be restricted for each subdivided function. For example, printing may be subdivided into printing based on scanning by the scanner 21, printing based on data received by the communication unit 27, and printing based on data stored in the image processing device 3 (the auxiliary storage device 45) or in a device (for example, a nonvolatile memory) connected to the connector 37.
 The restriction on printing based on data received by the communication unit 27 may be further subdivided according to the communication device of the transmission source (for example, another image processing device 3, the server 5 or 7, or the terminal 9). Note that such a printing restriction may be substantially realized by restricting the communication destinations. Also, the restriction on printing based on data received by the communication unit 27 may be further subdivided according to the mode of communication (normal data communication, mail reception, or FAX reception).
 The restriction on printing based on data stored in the image processing device 3 may be further subdivided according to the type of box (in other words, folder or directory) in which the data is stored. Note that such a printing restriction may be substantially realized by restricting access to a box in which highly confidential files (document files and/or image files) are expected to be stored.
 The restriction on printing based on data stored in a memory connected to the connector 37 may be further subdivided according to the type or individual identity of the connected device. Note that such a printing restriction may be substantially realized by restricting the devices that can be connected to the connector 37 (so-called device control).
 Scanning by the scanner 21 can also be cited as a function to be restricted. As with printing, scanning may be restricted for each subdivided function. For example, scanning may be subdivided into scanning for copying (printing), scanning for transmission of data (for example, image data), and scanning for saving data in the image processing device 3 (the auxiliary storage device 45) or in a device connected to the connector 37.
 Scanning for data transmission may be further subdivided according to the communication device of the transmission destination (for example, another image processing device 3, the server 5 or 7, or the terminal 9). Note that such a scanning restriction may be substantially realized by restricting the transmission destinations. Also, scanning for data transmission may be further subdivided according to the mode of communication (normal data communication, mail transmission, or FAX transmission).
 Scanning for saving in the image processing device 3 may be further subdivided according to the type of destination box. Note that such a scanning restriction may be substantially realized by restricting access to a box in which highly confidential files are expected to be stored.
 Scanning for saving in a device connected to the connector 37 may be further subdivided according to the type or individual identity of the connected device. Note that such a scanning restriction may be substantially realized by restricting the devices that can be connected to the connector 37.
 The function to be restricted need not be a main function such as printing or scanning. For example, the function to be restricted may be a function for making settings related to a main function, such as setting the size of the margins of the paper to be printed. However, such a function may be regarded as a function of performing printing with arbitrarily set margins and, in turn, may be regarded as a kind of main function.
 The function to be restricted may also be a function used by the administrator of the image processing device 3. For example, the image processing device 3 may be able to accept a setting that uniformly (regardless of the authentication result of a user) prohibits some of the main functions described above or prohibits connection of a predetermined device to the image processing device 3. The restriction on such settings may then be lifted for a specific user (the administrator of the image processing device 3).
(7.1.2. Modes of restriction of functions)
 As described above, the image processing device 3 has various functions. The functions to be restricted may be all or some of the various functions excluding the functions for authentication. From another point of view, a user who fails the authentication involving biometric authentication may be substantially prevented from using the image processing device 3, or may be allowed to use some functions.
 The mode of lifting the restriction on functions when the authentication succeeds may be common to all users, or may be settable individually for each user. To put the former in another way, there may be only two kinds of users: users who are not authenticated and for whom the restriction on functions is not lifted, and users who are authenticated and for whom the restriction on functions is lifted. There need then be no difference in the usable functions among the users for whom the restriction on functions is lifted.
 Also, in a mode in which plural types of authentication methods are possible, the functions whose restriction is lifted for an authenticated user may or may not differ depending on the authentication method. An example of the former is as follows. For example, in any of the first to third embodiments, the image processing device 3 may be able to perform authentication, without performing biometric authentication, by comparing input account information with account information registered in the image processing device 3 itself, in another image processing device 3, and/or in the server 5, and to control the lifting of the restriction on functions accordingly. In such a mode, focusing on the same user or on all users, when the authentication using account information succeeds, the restriction on a first number of functions may be lifted, and when the authentication using biometric information succeeds, the restriction on a second number of functions, greater than the first number and including the first number of functions, may be lifted.
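Under the assumption that the functions unlocked by each authentication method are modeled as sets, the relation just described (account authentication unlocks a subset of what biometric authentication unlocks) can be sketched as follows. The function names in the sets are invented for illustration.

```python
# Sketch of per-method restriction lifting: the functions unlocked by
# account authentication form a subset of those unlocked by biometric
# authentication.

ACCOUNT_FUNCTIONS = {"copy", "scan_to_box"}  # the first number of functions
BIOMETRIC_FUNCTIONS = ACCOUNT_FUNCTIONS | {"print", "fax_send"}  # the second


def unlocked_functions(auth_method):
    # Returns the set of functions whose restriction is lifted for the
    # given (successful) authentication method.
    if auth_method == "biometric":
        return BIOMETRIC_FUNCTIONS
    if auth_method == "account":
        return ACCOUNT_FUNCTIONS
    return set()  # no authentication: no restriction is lifted
```

The subset relation makes the stronger authentication method strictly more capable, which matches the "includes the first number of functions" wording above.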
 An example in which the lifting of the restriction on functions upon successful authentication can be set individually for each user is as follows. Assume that a user who is not authenticated can use neither a first function nor a second function. In this case, among authenticated users, there may be two or more kinds of users from among a user who can use only the first function, a user who can use only the second function, a user who can use both the first function and the second function, and a user whose functions are restricted in the same manner as a non-authenticated user even when authenticated.
(7.1.3. Cancellation of the authenticated state)
 The authenticated state (the state in which the restriction on functions has been lifted) is, of course, eventually canceled. Cancellation of the authenticated state can be rephrased, for example, as returning to the non-authenticated state (the state before the restriction on functions was lifted). Cancellation of the authenticated state may be accompanied by termination of a function premised on the authentication (for example, a VPN connection described later) and/or invalidation (for example, erasure from a memory) of information acquired on the premise of the authentication (for example, k described later). Accordingly, cancellation of the authenticated state may be recognized by the termination of such operations and/or the invalidation of such information. Also, for example, in a mode in which a flag is set when the authentication succeeds, cancellation may be recognized by the operation of clearing the flag. In this case, the termination of operations premised on the authentication and/or the invalidation of information acquired on the premise of the authentication need not necessarily accompany the cancellation.
 Cancellation of the authenticated state may be triggered by various events. Examples of such events include the following: the user performing a predetermined operation on the operation unit 33; completion of processing related to a function requiring authentication (for example, a function of downloading and printing predetermined image data); elapse of a predetermined time from a predetermined point (for example, the point at which the last operation on the operation unit 33 was performed); and detection by a human presence sensor that the user has moved away from the image processing device 3.
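The flag-based mode of cancellation described above can be sketched as follows, assuming the trigger events are represented by symbolic names. The event names are illustrative assumptions only.

```python
# Illustrative sketch: the authenticated state is a flag that any of the
# trigger events listed above may clear.

CANCEL_TRIGGERS = {
    "user_logout_operation",  # predetermined operation on operation unit 33
    "job_completed",          # processing requiring authentication finished
    "idle_timeout",           # predetermined time elapsed since last operation
    "user_left",              # human presence sensor detects departure
}


class AuthState:
    def __init__(self):
        self.authenticated = False

    def on_event(self, event):
        # Clearing the flag returns the device to the pre-authentication
        # state; unrelated events leave the state untouched.
        if self.authenticated and event in CANCEL_TRIGGERS:
            self.authenticated = False
```

As noted above, a real device may additionally terminate connections or erase cached information when the flag is cleared.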
(7.2. Specific examples related to the lifting of restrictions on functions)
 Various specific configurations are possible for realizing the control of the lifting of restrictions on functions. The following mainly shows specific examples of modes in which, upon successful authentication, the functions whose restriction is lifted differ among users.
 FIG. 12A is a schematic diagram showing an authority table DT3 used for controlling the lifting of restrictions on functions.
 The authority table DT3 holds IDs and authority information D3 in association with each other. The authority information D3 is information specifying, for each function, whether the restriction can be lifted; in FIG. 12A, printing and scanning are illustrated as such functions.
The authority table DT3 may be held, for example, by the image processing device 3. The image processing device 3 may have the authority table DT3 as part of its own management table DT0 (FIG. 1). In FIG. 1, an authority level (described later) is illustrated under "function restriction" in the management table DT0; however, unlike the example of FIG. 1, the information held under "function restriction" may be the authority information D3. The image processing device 3 holds the authority information D3 of the users registered in its own management table DT0.
When the user who has succeeded in authentication is user A, who is registered in its own management table DT0, the image processing device 3MA refers to the authority information D3 linked to the ID of user A and controls the release of functional restrictions. When the user who has succeeded in authentication is user B, who is registered in the management table DT0 of the image processing device 3MB, the image processing device 3MA imports the authority information D3 linked to the ID of user B from the image processing device 3MB and controls the release of functional restrictions. This import may be performed, for example, at an appropriate time before or after the authentication succeeds. For example, in the first embodiment it may be performed in ST45 or ST49 (ST14) of FIG. 7, and in the second embodiment it may be performed in step ST75 or ST77 (ST14) of FIG. 10.
Unlike the above, as illustrated in FIG. 1, the management table DT0 may hold authority-level information linked to IDs. In this case, for example, the image processing device 3 has a table that links authority levels to per-function restriction information (cf. the authority information D3). The image processing device 3MA then identifies the authority level of the successfully authenticated user A by referring to its own management table DT0, or imports the authority-level information of the successfully authenticated user B from the image processing device 3MB. The timing of this import is arbitrary, as with the import of the authority information D3 described above. The image processing device 3MA then refers to the authority information D3 linked to the identified authority level and controls the release of functional restrictions. This mode can be regarded as one in which the authority table DT3 is divided into a table linking IDs to authority levels and a table linking authority levels to the authority information D3.
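The divided form of the table can be sketched as a two-step lookup: ID to authority level, then authority level to authority information D3. The level names and permission values below are illustrative assumptions.

```python
# Table 1: ID -> authority level (cf. management table DT0 of FIG. 1).
ID_TO_LEVEL = {"userA": "administrator", "userB": "general"}

# Table 2: authority level -> authority information D3.
LEVEL_TO_D3 = {
    "administrator": {"print": True, "scan": True},
    "general": {"print": True, "scan": False},
}

def permissions_for(user_id):
    # Two-step lookup: ID -> authority level -> authority information D3.
    level = ID_TO_LEVEL.get(user_id)
    return LEVEL_TO_D3.get(level, {})
```

Splitting the table this way means that per-function permissions only need to be maintained once per authority level, rather than once per user.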
The authority table DT3 may be stored in the server 5 in addition to, or instead of, the image processing device 3. In that case, the image processing device 3MA may acquire the authority information D3 from the server 5 when the successfully authenticated user is user B (or in either case, whether the user is user A or user B).
In a mode in which the authority table DT3 is divided into a table linking IDs to authority levels and a table linking authority levels to the authority information D3, both tables may, for example, be stored in the server 5 in addition to, or instead of, the image processing device 3; the operation in that case is the same as above. Alternatively, the table linking authority levels to the authority information D3 may be stored in the image processing device 3. In that case, when the successfully authenticated user is user B (or in either case, whether the user is user A or user B), the image processing device 3MA acquires the authority level from the server 5, refers to the authority information D3 linked to the acquired authority level, and controls the release of functional restrictions.
As understood from the description so far, the various tables may be divided or integrated as appropriate. From another perspective, the actual configuration of a table may differ from its conceptual configuration. For example, a table that holds IDs in association with function-restriction information may be a single table holding both pieces of information, or two tables divided via the authority level.
As described above, the functions whose restrictions are released may differ depending on the type of authentication information and/or authentication method. In this case, for example, the content of the authority information D3 (or the authority level) may differ depending on the type of authentication information and/or authentication method. Alternatively, since the image processing device 3MA can identify the type of authentication information and/or authentication method, it may acquire the same authority information D3 (or authority level) regardless of that type and then refrain from releasing the restriction on specific functions among those permitted by the authority information D3. As a further alternative, the image processing device 3MB and/or the server 5 may modify the authority information D3 according to the type of authentication information and/or authentication method.
The processing procedure is, for example, as follows. When execution of a predetermined function is instructed by an operation on the UI unit 23 or the like, the image processing device 3 (control unit 29) determines whether the current user has the authority to execute that function. In this determination, as described above, the image processing device 3 refers to authority information that it holds itself or has acquired in advance from outside (another image processing device 3 or the server 5). When the image processing device 3 determines that the user has the authority, it executes the predetermined function; otherwise, it does not. In the latter case, the display unit 35 may indicate that the user lacks the authority (or has not been authenticated).
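The procedure above can be sketched as a single decision: check the authority information and either execute the function or report the lack of authority. The function name, callbacks, and message text are illustrative assumptions.

```python
def handle_instruction(function, authority_info, execute, show_message):
    # Check the authority information held locally or imported in advance.
    if authority_info.get(function, False):
        execute(function)          # user has authority: execute the function
        return True
    # Otherwise, do not execute; optionally display the reason on the display unit.
    show_message("not authorized (or not authenticated)")
    return False

executed, messages = [], []
ok = handle_instruction("print", {"print": True}, executed.append, messages.append)
denied = handle_instruction("scan", {"print": True}, executed.append, messages.append)
```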
In the operation described above, when successful authentication of the user places the image processing device 3 in a state in which it can execute a predetermined function according to the user's authority information, this may be regarded as an example of an operation that releases a functional restriction. When the image processing device 3 enters such a state, a predetermined flag is usually set internally; the setting of this flag may also be regarded as an example of an operation that releases a functional restriction. The operation of acquiring authority information from outside (another image processing device 3 or the server 5) may likewise be regarded as such an example. Furthermore, the operation of determining whether the user has the authority for a function whose execution has been instructed and executing that function when the user has the authority may also be regarded as an example of releasing a functional restriction.
(7.3. Menu screen related to functional restrictions)
When the release of functional restrictions is controlled based on the authentication result, the menu screen displayed on the display unit 35 may be configured at the same time. This configuration may be performed for each user. Specifically, it is as follows.
The menu screen is, for example, a screen (image) including one or more options in a GUI. When an option is selected with a pointing device, the process corresponding to that option is executed. For example, in a mode in which the operation unit 33 and the display unit 35 are constituted by a touch panel, when one of the one or more options displayed on the display unit 35 is pressed with a finger or a stylus, the corresponding process is executed.
The processes corresponding to the options shown on the menu screen of the image processing device 3 may be of various kinds. For example, an option may be a process that executes an operation related to a major function such as printing, scanning, copying, FAX transmission, or FAX reception (although these are not necessarily separable concepts). Additionally or alternatively, an option may be a process for configuring settings related to such operations, for example selecting the paper size, setting the print magnification, or setting the print density. As stated earlier, the major functions may be subdivided as appropriate when authorities are set; that description of subdivision may be applied as appropriate to the subdivision of options.
The menu screen for each user may, for example, reflect the user's preferences and/or the user's authority. An example of the former is adjusting the position, size, color, shape, and so on of a particular option within the screen 35a to suit the user's preference. An example of the latter is a screen in which the display mode of the option for a given function differs depending on whether the user has the authority for that function. More specifically, examples include a screen in which options are colored differently depending on the presence or absence of authority, and a screen on which only the options for which the user has authority are displayed (options without authority are not displayed).
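The two authority-reflecting display modes just mentioned can be sketched as follows: options the user lacks authority for are either shown in a different style (e.g., a different color) or hidden altogether. Option names, styles, and the permission map are illustrative assumptions.

```python
def build_menu(options, permissions, hide_unauthorized):
    menu = []
    for name in options:
        if permissions.get(name, False):
            menu.append((name, "normal"))
        elif not hide_unauthorized:
            menu.append((name, "grayed"))  # shown, but visibly without authority
    return menu

options = ["print", "scan", "copy"]
permissions = {"print": True, "scan": False, "copy": False}
styled = build_menu(options, permissions, hide_unauthorized=False)  # different styles
hidden = build_menu(options, permissions, hide_unauthorized=True)   # unauthorized options hidden
```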
In a mode in which a menu screen showing only authorized options is displayed, the user can instruct only the processes corresponding to those options. Controlling the display of the menu screen in this case may therefore be regarded as an example of controlling the release of functional restrictions.
The per-user menu screen configuration based on the authentication result may consist of only two variants: a menu screen for successfully authenticated users and a menu screen for all other users. Alternatively, for example, different menu screens may be configurable for different successfully authenticated users. No menu screen may be displayed to a user whose authentication does not succeed. In a mode in which multiple authentication methods are selectable, the menu screen may or may not differ depending on the authentication method.
The image processing device 3 may be capable of displaying a main menu screen that is displayed first and one or more submenu screens that are displayed by selecting options on the main menu screen. In this case, the menu screen configured for each user may be the main menu screen, at least one of the one or more submenu screens, or both. The per-user menu configuration may also determine whether submenu screens can be displayed, or how many of the plural submenu screens can be displayed.
The menu screen configuration described above may be realized in various more specific ways. An example follows.
FIG. 12B is a schematic diagram showing the configuration of a menu table DT7 used for configuring the menu screen.
The menu table DT7 stores IDs in association with menu information D7 that specifies the mode of the menu screen (in other words, the menu screen configuration). In FIG. 12B, the menu information D7 is represented by a schematic diagram of the screen 35a of the display unit 35 (touch panel). For example, a plurality of buttons IGF are displayed on the screen 35a, and when one of the buttons is selectively operated (for example, tapped), the process related to the function corresponding to that button is performed.
The image processing device 3 refers to the menu information D7 associated with the successfully authenticated user and controls the menu screen displayed on the display unit 35. The description of the authority table DT3 (authority information D3) may be applied as appropriate to the menu table DT7 (menu information D7). For example, the menu table DT7 may be held by the image processing device 3 or by the server 5, and it may or may not be integrated with the management table DT0 (and/or the comparison table DT1).
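The DT7 lookup described above can be sketched as a simple mapping from the authenticated ID to menu information, here simplified to a list of button labels. The layouts and the no-menu fallback for unauthenticated users are illustrative assumptions.

```python
# Menu table DT7: ID -> menu information D7 (simplified to button labels).
DT7 = {
    "userA": {"buttons": ["print", "scan", "copy"]},
    "userB": {"buttons": ["print"]},
}
NO_MENU = {"buttons": []}  # e.g., no menu screen when authentication did not succeed

def menu_for(authenticated_id):
    return DT7.get(authenticated_id, NO_MENU)
```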
(7.4. Releasing restrictions related to VPN)
The function whose restriction is released based on the authentication result may be a VPN connection. Specifically, this is as follows.
A VPN, for example, virtually extends a private network onto the public network 11. From another perspective, a VPN logically divides a physically single network that includes the public network 11. As a result, for example, communication via the public network 11 takes place in a secure environment.
Such virtual extension or logical division is realized, for example, by authentication, tunneling, and encryption. However, communication using a VPN may involve authentication and tunneling without encryption. Tunneling can also be regarded as a kind of encryption.
In authentication, the validity of the party with which the connection is to be established is confirmed. Authentication methods include, for example, those using account information (ID and password), a static key, a common (shared) key, a combination of a private key and a public key, an electronic signature, an electronic certificate, or a security token, as well as combinations of two or more of the above (for example, multi-factor authentication).
In tunneling, two points that are physically or logically separated across a network are treated as if they were the same point. Tunneling is realized, for example, by encapsulation, in which an entire packet is embedded, at transmission, in the payload of another protocol, of another layer, or of the same layer. Tunneling may be performed at any appropriate layer, for example at layer 3 (the network layer) or layer 2 (the data link layer).
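A toy sketch of encapsulation as just described: the entire inner packet (header and payload) becomes the payload of an outer packet and is recovered at the far end of the tunnel. The byte layouts are illustrative assumptions, not any real protocol format.

```python
def encapsulate(inner_packet: bytes, outer_header: bytes) -> bytes:
    # The whole inner packet, header included, rides as the outer payload.
    return outer_header + inner_packet

def decapsulate(outer_packet: bytes, outer_header_len: int) -> bytes:
    # Strip the outer header to recover the inner packet unchanged.
    return outer_packet[outer_header_len:]

inner = b"INNER_HDR" + b"payload"
outer = encapsulate(inner, b"TUNNEL_HDR")
recovered = decapsulate(outer, len(b"TUNNEL_HDR"))
```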
In encryption, transmitted and received information is converted into a form that cannot be deciphered by third parties. Encryption may be applied only to the payload, or to both the header and the payload. From another perspective, encryption may be performed at any appropriate layer, for example the network layer, the transport layer, and/or the session layer. Any appropriate encryption scheme may be used, for example one using a common key, or one using a combination of a private key and a public key.
Any appropriate type of VPN may be used. For example, a remote-access VPN and/or a LAN-type (site-to-site) VPN may be applied to the VPN of the communication system 1. In a remote-access VPN, for example, VPN client software is installed on a communication device such as the image processing device 3, and the communication device directly establishes a VPN connection to the server 5 serving as the VPN server. In a LAN-type VPN, for example, VPN gateways connect LANs (sites) to each other via the VPN.
In the embodiments, however, the operation of the image processing device 3 functioning as a client of a remote-access VPN is taken as an example.
As mentioned above, the public network 11 may take various forms. From the viewpoint of VPN types, the following applies: the VPN may be an Internet VPN in which the public network 11 includes the Internet, or it may be an IP (Internet Protocol) VPN, an entry VPN, or a wide-area Ethernet in which the public network 11 includes a closed network provided by a telecommunications carrier or the like.
The protocol for the VPN may be a known one, a new one, or one defined independently by the administrator of the server 5. Known protocols for remote-access VPNs include, for example, the combination of L2TP (Layer 2 Tunneling Protocol) and IPsec (Security Architecture for Internet Protocol), as well as PPTP (Point-to-Point Tunneling Protocol).
FIG. 13 is a flowchart illustrating a specific example of the operation described above.
In FIG. 13, as already mentioned, the image processing device 3 is a remote-access VPN client that communicates with the server 5 serving as the VPN server (for example, the image processing device 3A or 3B in FIG. 1). The data processing device 49 is a device that communicates with the image processing device 3 via the VPN (from another perspective, via the server 5 as the VPN server). Examples of the data processing device 49 include another image processing device 3, the server 7, and the terminal 9. The data processing device 49 may be the server 5 itself, but FIG. 13 takes as an example a mode in which the two are separate. A data processing device 49 other than the server 5 may be included in the private network 13A containing the server 5 (3C, 7, or 9A) or not included in it (3A, 3B, or 9B); FIG. 13 takes the latter as an example.
The process shown in FIG. 13 is started, for example, when a VPN connection start condition is satisfied in the image processing device 3. The start condition may be, for example, that a predetermined operation instructing a VPN connection has been performed on the operation unit 33. The start condition may also be that execution of a task requiring a VPN connection (for example, an operation of downloading image data from the data processing device 49 and printing it) has been instructed via the operation unit 33. When such a task is instructed, the user may be asked whether to establish a VPN connection, and the start condition may be satisfied when a predetermined operation instructing the VPN connection is then performed. The start condition may also be that a predetermined signal has been input from an external communication device (for example, the terminal 9).
When the VPN connection start condition is satisfied, the authentication process described in any of the first to third embodiments is executed; that is, the process of FIG. 4, FIG. 9, or FIG. 11 is executed. FIG. 13 illustrates step ST5 for user A, but the step shown as ST5 may instead be step ST12 for user B, or the subsequent authentication by the server 5 of the third embodiment.
When the authentication succeeds, the image processing device 3 requests the server 5 to perform authentication for the VPN connection. In the illustrated example, the image processing device 3 transmits the account information of the successfully authenticated user to the server 5 (step ST29). The server 5 then determines whether the received account information is registered in a table that it holds (see the verification table DT2 of FIG. 11); authentication is thereby performed. The server 5 then transmits the authentication result (success or failure) to the image processing device 3 that transmitted the account information (step ST30).
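The ST29/ST30 exchange just described can be sketched as follows: the device sends the authenticated user's account information, and the server checks it against the table it holds and returns success or failure. The table contents and the form of the credential are illustrative assumptions.

```python
# Server-side registration table (cf. the verification table of FIG. 11).
REGISTERED_ACCOUNTS = {"userA": "secretA", "userB": "secretB"}

def vpn_authenticate(account_id, password):
    # ST29: receive account information; ST30: return the authentication result.
    return REGISTERED_ACCOUNTS.get(account_id) == password

granted = vpn_authenticate("userA", "secretA")
denied = vpn_authenticate("intruder", "guess")
```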
In step ST29, as with the transmission of account information in FIG. 11, when an affirmative determination is made in step ST5, the image processing device 3MA transmits to the server 5 the account information associated, in the comparison table DT1 that it holds, with the biometric information of the successfully authenticated user A. When an affirmative determination is made in step ST12, the image processing device 3MA transmits to the server 5 the account information of user B input in step ST41 of FIG. 7 or FIG. 10. In the third embodiment, the authentication of steps ST29 and ST30 may be the authentication performed by the server 5 after the biometric authentication succeeds and before the functional restrictions are released.
In steps ST29 and ST30, as with the authentication by the server 5 in FIG. 11, authentication may be performed based on authentication information other than account information, or authentication information imported from the image processing device 3MB may be used.
Unlike the above description, the process may be started from step ST29 when the VPN connection start condition is satisfied in a state where the authentication described in the first to third embodiments has succeeded and the functional restrictions have been released. Alternatively, the VPN connection may be established automatically when the authentication described in the first to third embodiments succeeds; in other words, successful authentication may itself serve as the VPN connection start condition.
Alternatively, the VPN connection between the image processing device 3 and the server 5 may be established before the authentication described in the first to third embodiments is performed. From another perspective, instead of transmitting authentication information input by an individual user to the server 5 for the authentication related to the VPN connection, authentication information held by the image processing device 3 may be transmitted to the server 5 for that authentication. Even in this case, requiring the authentication described in the first to third embodiments for the execution of functions that use the VPN connection reduces the probability that a third party will use the VPN connection improperly.
When the authentication succeeds and the VPN connection is established, the image processing device 3 performs communication using the VPN. FIG. 13 illustrates an operation of downloading image data from the data processing device 49 and printing it. Specifically, this is as follows.
In step ST31, the image processing device 3 transmits a signal requesting download of image data to the server 5 via the VPN. The image data here may be general image data or image data constituting a print job.
In step ST32, the server 5 transmits (forwards) a signal requesting the image data to the destination specified by information contained in the received signal (here, the data processing device 49). When the data processing device 49 is a communication device outside the private network 13A containing the server 5, this transmission may be performed via the VPN (as in the illustrated example); in this case, the data processing device 49 has been VPN-connected to the server 5 in advance of step ST32. When the data processing device 49 is a communication device included in the private network 13A, ordinary communication within the private network 13A may be performed.
In step ST33, the data processing device 49 transmits the requested image data to the server 5. As in step ST32, the VPN may be used when the data processing device 49 is located outside the private network 13A (as in the illustrated example), and ordinary communication within the private network 13A may be performed when it is located inside.
 ステップST34では、サーバ5は、受信した画像データを画像処理装置3へ送信(転送)する。このときの送信は、VPNを介してなされる。 In step ST34, the server 5 transmits (transfers) the received image data to the image processing device 3. Transmission at this time is performed via VPN.
 ステップST35では、画像処理装置3は、受信した画像データに基づく印刷を実行する。 In step ST35, the image processing device 3 executes printing based on the received image data.
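The relay sequence of steps ST31 to ST35 can be outlined in code. The following is a minimal illustrative sketch only, not the patented implementation: the class and method names are assumptions, and the VPN tunnel itself is abstracted away so that only the request/transfer ordering of FIG. 13 is modeled.

```python
# Sketch of the ST31-ST35 relay flow (all names are illustrative assumptions).

class DataProcessor:
    """Stands in for data processing device 49, which holds the image data."""
    def __init__(self, images):
        self.images = images                        # image id -> image data

    def fetch(self, image_id):
        return self.images[image_id]                # ST33: return requested data


class Server:
    """Stands in for server 5, which relays requests between the devices."""
    def __init__(self, destinations):
        self.destinations = destinations            # destination name -> device

    def relay(self, destination, image_id):
        # ST32: forward the request; ST34: return the data to the requester.
        return self.destinations[destination].fetch(image_id)


class ImageProcessor:
    """Stands in for image processing device 3."""
    def __init__(self, server):
        self.server = server
        self.printed = []

    def download_and_print(self, destination, image_id):
        data = self.server.relay(destination, image_id)   # ST31/ST34 (via VPN)
        self.printed.append(data)                         # ST35: print the data
        return data
```

In this sketch the server never interprets the image data; it only forwards the request to the destination identified in the received signal and transfers the response back, mirroring the transfer role described above.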
 画像処理装置3がVPN接続を行うVPNサーバは、画像処理装置3を使用しているユーザが選択可能であってもよいし、選択不可能であってもよい。前者の場合、画像処理装置3は、1つのVPNを構成する2以上のVPNサーバからのみ接続先を選択可能であってもよいし、互いに異なる2以上のVPNを構成する2以上のVPNサーバから接続先を選択可能であってもよい。 The VPN server to which the image processing device 3 makes a VPN connection may or may not be selectable by the user using the image processing device 3. In the former case, the image processing device 3 may be able to select the connection destination only from two or more VPN servers that make up one VPN, or may be able to select the connection destination from two or more VPN servers that make up two or more mutually different VPNs.
 VPN接続は、適宜な切断条件が満たされたときに切断されてよい。例えば、切断条件は、切断を指示する所定の操作が操作部33に対してなされたこととされてよい。VPN接続が、VPN接続を必要とするタスクの実行の指示に基づいてなされる態様においては、切断条件は、上記タスクが終了したこととされてよい。また、例えば、切断条件は、認証状態が解除されたこととされてよい。なお、認証状態が解除される条件の例について既に述べた。 A VPN connection may be disconnected when an appropriate disconnection condition is satisfied. For example, the disconnection condition may be that a predetermined operation instructing disconnection has been performed on the operation unit 33. In a mode in which the VPN connection is made based on an instruction to execute a task that requires the VPN connection, the disconnection condition may be that the task has been completed. Further, for example, the disconnection condition may be that the authentication state has been canceled. Note that examples of conditions under which the authentication state is canceled have already been described.
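The disconnection conditions listed above can be combined as a simple predicate. The sketch below assumes the three example conditions named in the text (explicit operation, task completion, cancellation of the authentication state); the function and parameter names are illustrative assumptions.

```python
# Sketch of the disconnection-condition check (names are assumptions).

def should_disconnect(disconnect_requested, task_finished, authenticated):
    """Return True if any disconnection condition is satisfied:
    - a predetermined disconnect operation was made on the operation unit,
    - the task that required the VPN has finished, or
    - the authentication state has been canceled."""
    return disconnect_requested or task_finished or not authenticated
```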
 上記の説明では、画像処理装置3がデータ処理装置49から画像データを受信して印刷を行う動作を例に取った。ただし、VPNを利用する動作は、これ以外にも種々可能である。例えば、スキャナ21によって取得した情報(例えば画像データ)がVPNを介してデータ処理装置49へ送信されてよい。 In the above description, the operation in which the image processing device 3 receives image data from the data processing device 49 and performs printing is taken as an example. However, various other operations using the VPN are possible. For example, information (eg, image data) acquired by the scanner 21 may be transmitted to the data processing device 49 via the VPN.
(8.実施形態のまとめ)
 以上のとおり、第1実施形態(または第3実施形態)に係る画像処理装置3は、画像処理部31と、入力部(検出部25)と、メモリ(補助記憶装置45)と、制御部29と、を有している。画像処理部31は、プリンタ19およびスキャナ21の少なくとも一方を含む。検出部25は、ユーザの生体情報が入力される。補助記憶装置45は、ユーザ毎の第1の生体情報を保存する。制御部29は、検出部25に入力された第2の生体情報と、補助記憶装置45に保存されている第1の生体情報とを比較する認証の結果に基づき画像処理部31に関連する機能の制限解除を制御する(ステップST1~ST7)。画像処理装置3MAは、第1ユーザ(ユーザB)を特定する情報の入力(例えばステップST41のアカウント情報の入力)に応じて他の画像処理装置3MBからインポートしたユーザBの第1の生体情報とユーザBの第2の生体情報とを比較する認証の結果に基づき画像処理部31に関連する機能の制限解除を制御することもできる(ステップST8~ST14)。
(8. Summary of embodiments)
As described above, the image processing device 3 according to the first embodiment (or the third embodiment) includes the image processing section 31, an input section (detection section 25), a memory (auxiliary storage device 45), and the control section 29. The image processing section 31 includes at least one of a printer 19 and a scanner 21. The detection unit 25 receives input of the user's biometric information. The auxiliary storage device 45 stores first biometric information for each user. The control unit 29 controls the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares the second biometric information input to the detection unit 25 with the first biometric information stored in the auxiliary storage device 45 (steps ST1 to ST7). The image processing device 3MA can also control the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares the first biometric information of user B, imported from another image processing device 3MB in response to the input of information identifying the first user (user B) (for example, the input of account information in step ST41), with the second biometric information of user B (steps ST8 to ST14).
 従って、例えば、実施形態の概要の説明で述べた効果が奏される。例えば、ユーザBの利便性の向上、画像処理装置3MAの記憶容量の節約、および/またはネットワーク10の負担軽減が図られる。また、画像処理装置3は、ユーザBを特定する情報(例えばID)が画像処理装置3MAに入力されることに応じて第1の生体情報をインポートするから、そのとき必要な第1の生体情報(すなわちユーザBの第1の生体情報)だけを他の画像処理装置3(例えば3MB)に対して要求できる。別の観点では、画像処理装置3は、ユーザBの第1の生体情報を使用した後、当該情報を消去できる。従って、例えば、定期的に他の画像処理装置の管理用テーブルDT0に基づいて自己の管理用テーブルを更新する画像処理装置に比較して、記憶容量を節約できる。なお、上記のユーザBを特定する情報は、ID(またはアカウント情報)に限定されず、例えば、他の認証用情報(第2の生体情報を除く)であっても構わない。 Therefore, for example, the effects described in the overview of the embodiments are achieved. For example, convenience for user B can be improved, the storage capacity of the image processing device 3MA can be saved, and/or the load on the network 10 can be reduced. Furthermore, since the image processing device 3 imports the first biometric information in response to information identifying user B (for example, an ID) being input to the image processing device 3MA, it can request only the first biometric information necessary at that time (that is, the first biometric information of user B) from another image processing device 3 (for example, 3MB). From another perspective, the image processing device 3 can erase the first biometric information of user B after using it. Therefore, compared to, for example, an image processing device that periodically updates its own management table based on the management table DT0 of another image processing device, storage capacity can be saved. Note that the information identifying user B is not limited to an ID (or account information), and may be, for example, other authentication information (excluding the second biometric information).
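The import-on-demand behavior described above, including erasure after use, can be sketched as follows. This is an illustrative model under assumed names, with the biometric comparison simplified to an equality check; it is not the matching algorithm of the embodiment.

```python
# Sketch of importing only the needed first biometric information and erasing
# it after use (all names are illustrative assumptions).

class RemoteDevice:
    """Stands in for image processing device 3MB, which holds templates."""
    def __init__(self, templates):
        self.templates = templates              # user id -> first biometric info

    def export_template(self, user_id):
        return self.templates.get(user_id)      # export only this user's entry


class LocalDevice:
    """Stands in for image processing device 3MA."""
    def __init__(self, remote):
        self.remote = remote
        self.imported = {}

    def authenticate(self, user_id, second_biometric):
        # Import only the template needed now, not the whole management table.
        template = self.remote.export_template(user_id)
        if template is None:
            return False
        self.imported[user_id] = template
        ok = (template == second_biometric)     # simplified matching rule
        del self.imported[user_id]              # erase after use, saving storage
        return ok
```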
 第1実施形態(または第3実施形態)に係る画像処理装置3は、アカウント情報が入力されるUI部23をさらに有していてよい。アカウント情報は、IDとパスワードとを含んでよい。画像処理装置3MAは、他の画像処理装置3MBに前記第1ユーザ(ユーザB)の第1の生体情報を要求する際に、UI部23に入力されたユーザBのアカウント情報を画像処理装置3MBに送信してよい(図7のステップST42)。 The image processing device 3 according to the first embodiment (or the third embodiment) may further include a UI section 23 into which account information is input. The account information may include an ID and a password. When requesting the first biometric information of the first user (user B) from the other image processing device 3MB, the image processing device 3MA may transmit the account information of user B input to the UI section 23 to the image processing device 3MB (step ST42 in FIG. 7).
 この場合、例えば、画像処理装置3MBは、画像処理装置3MAから送信されたアカウント情報に基づく認証を行って、認証に成功したときだけ、ユーザBの第1の生体情報をエクスポートすることができる。その結果、例えば、ユーザBの第1の生体情報の意図されていない流出が生じる蓋然性が低減される。また、アカウント情報を用いることから、他の認証用情報を用いる態様に比較して、ユーザBの手続きが簡単である。 In this case, for example, the image processing device 3MB can perform authentication based on the account information transmitted from the image processing device 3MA, and can export the first biometric information of user B only when the authentication succeeds. As a result, for example, the probability of unintended leakage of user B's first biometric information is reduced. Furthermore, since account information is used, the procedure for user B is simpler than in a mode using other authentication information.
 第1実施形態(または第3実施形態)に係る画像処理装置3MAは、所定時間が経過したことを含む消去条件が満たされたときに、他の画像処理装置3MBからインポートした第1ユーザ(ユーザB)の第1の生体情報を消去してよい(図7のステップST52)。 The image processing device 3MA according to the first embodiment (or the third embodiment) may erase the first biometric information of the first user (user B) imported from the other image processing device 3MB when an erasure condition including the elapse of a predetermined time is satisfied (step ST52 in FIG. 7).
 この場合、例えば、インポートされた生体情報が自動的に消去され、生体情報の意図されていない流出が生じる蓋然性が低減される。また、所定時間を適宜に設定することによって、インポートされた生体情報の再利用による利便性の向上と、インポートされた生体情報の消去によるセキュリティ向上とをバランスさせることができる。 In this case, for example, the imported biometric information is automatically deleted, reducing the possibility that the biometric information will be unintentionally leaked. Furthermore, by appropriately setting the predetermined time, it is possible to balance improved convenience by reusing imported biometric information and improved security by erasing imported biometric information.
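The time-based erasure condition can be modeled with a small cache that discards an imported template once the retention period has elapsed. The class name and retention parameter below are assumptions for illustration; a real device would also cover the other erasure conditions described in the text.

```python
# Sketch of time-based automatic erasure of imported templates
# (names and the retention policy are illustrative assumptions).

class ImportedTemplateCache:
    def __init__(self, retention_seconds):
        self.retention = retention_seconds
        self.entries = {}                       # user id -> (template, stored_at)

    def store(self, user_id, template, now):
        self.entries[user_id] = (template, now)

    def get(self, user_id, now):
        entry = self.entries.get(user_id)
        if entry is None:
            return None
        template, stored_at = entry
        if now - stored_at >= self.retention:   # erasure condition satisfied
            del self.entries[user_id]           # automatic erasure
            return None
        return template
```

Choosing the retention period trades off the convenience of reusing the imported template against the security benefit of erasing it, as noted above.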
 第1実施形態(または第3実施形態)に係る画像処理装置3MAは、認証の結果に基づき制限が解除された画像処理部31に関連する機能の実行が完了したことを含む消去条件が満たされたときに、他の画像処理装置3MBからインポートした第1ユーザ(ユーザB)の第1の生体情報を消去してよい(図7のステップST52)。 The image processing device 3MA according to the first embodiment (or the third embodiment) may erase the first biometric information of the first user (user B) imported from the other image processing device 3MB when an erasure condition including completion of execution of a function related to the image processing unit 31 whose restriction has been released based on the authentication result is satisfied (step ST52 in FIG. 7).
 この場合、例えば、制限が解除された機能の実行が完了するごとにインポートされた生体情報が消去されることから、セキュリティが向上する。 In this case, for example, the imported biometric information is deleted every time the execution of the function for which the restriction has been lifted is completed, which improves security.
 第1実施形態(または第3実施形態)に係る画像処理装置3は、ユーザの操作を受け付けるUI部23をさらに有していてよい。画像処理装置3MAは、UI部23に所定の操作がなされたことを含む消去条件が満たされたときに、他の画像処理装置3MBからインポートした第1ユーザ(ユーザB)の第1の生体情報を消去してよい(図7のステップST52)。 The image processing device 3 according to the first embodiment (or the third embodiment) may further include a UI section 23 that receives user operations. The image processing device 3MA may erase the first biometric information of the first user (user B) imported from the other image processing device 3MB when an erasure condition including a predetermined operation performed on the UI section 23 is satisfied (step ST52 in FIG. 7).
 この場合、例えば、ユーザBは、インポートされた生体情報を自ら消去することができる。これにより、例えば、ユーザBは、消去条件に含まれる他の条件(例えば所定時間の経過による消去)の有無および/または設定によらずに、自己の生体情報の意図されていない流出の蓋然性を低減することができる。 In this case, for example, user B can erase the imported biometric information himself or herself. As a result, for example, user B can reduce the probability of unintended leakage of his or her own biometric information, regardless of the presence or absence and/or the settings of other conditions included in the erasure conditions (for example, erasure after the elapse of a predetermined time).
 第1実施形態(または第3実施形態)に係る画像処理装置3MAは、所定の自動消去条件が満たされたときに他の画像処理装置3MBからインポートしたユーザBの第1の生体情報を自動的に(第1の生体情報を消去するための操作を受け付けずに)消去してよい(図7のステップST52)。画像処理装置3MAは、ユーザBの第2の生体情報と、画像処理装置3MBからインポートしたユーザBの第1の生体情報とを比較して認証に成功した後(ステップST48の肯定判定の後)、かつ自動消去条件が満たされる前(ステップST51の肯定判定の前)に、入力部(検出部25)に再入力されたユーザBの第2の生体情報と、画像処理装置3MBからインポートしたユーザBの第1の生体情報とを比較して再認証を行ってよい(図8のステップST56)。 The image processing device 3MA according to the first embodiment (or the third embodiment) may automatically erase the first biometric information of user B imported from the other image processing device 3MB (without accepting an operation for erasing the first biometric information) when a predetermined automatic erasure condition is satisfied (step ST52 in FIG. 7). After the image processing device 3MA succeeds in authentication by comparing the second biometric information of user B with the first biometric information of user B imported from the image processing device 3MB (after the affirmative determination in step ST48), and before the automatic erasure condition is satisfied (before the affirmative determination in step ST51), it may perform re-authentication by comparing the second biometric information of user B re-input to the input unit (detection unit 25) with the first biometric information of user B imported from the image processing device 3MB (step ST56 in FIG. 8).
 この場合、例えば、図8の説明で述べたように、ユーザBの第1の生体情報がインポートされる頻度が低減され、ネットワーク10の負担が軽減される。一方で、インポートされた第1の生体情報は、自動的に消去されるから、第1の生体情報の意図されていない流出が生じる蓋然性が低減される。 In this case, for example, as described in the explanation of FIG. 8, the frequency with which user B's first biometric information is imported is reduced, and the burden on the network 10 is reduced. On the other hand, since the imported first biometric information is automatically deleted, the possibility of unintended leakage of the first biometric information is reduced.
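The interplay between re-authentication and automatic erasure can be sketched as follows. The function name, the cache layout, and the fixed retention value are all illustrative assumptions; the point shown is only that no new import is needed while the template survives, and that a fresh import becomes necessary once the automatic erasure condition is met.

```python
# Sketch of re-authentication against a still-retained imported template
# (names, cache layout, and retention value are illustrative assumptions).

RETENTION = 600  # assumed retention period in seconds

def reauthenticate(cache, user_id, second_biometric, now):
    """cache maps user id -> (imported first biometric info, stored_at).
    While the entry survives, re-authentication needs no new import over
    the network; once erased, a new import is required."""
    entry = cache.get(user_id)
    if entry is None:
        return "import-required"                # template already erased
    template, stored_at = entry
    if now - stored_at >= RETENTION:            # automatic erasure condition
        del cache[user_id]
        return "import-required"
    return "success" if template == second_biometric else "failure"
```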
 第1実施形態(または第3実施形態)に係る画像処理装置3MAは、所定の消去条件が満たされるまで第2の生体情報を保持し、新たに検出された第2の生体情報と、保持している第2の生体情報とを比較する再認証を行い、再認証の結果に応じて機能の制限解除を制御してよい(図8のステップST56)。 The image processing device 3MA according to the first embodiment (or the third embodiment) may retain the second biometric information until a predetermined erasure condition is satisfied, perform re-authentication by comparing newly detected second biometric information with the retained second biometric information, and control the release of restrictions on functions according to the result of the re-authentication (step ST56 in FIG. 8).
 この場合、図8の説明で述べたように、現在のユーザの体調等に応じた第2の生体情報が第1の生体情報の代わりに用いられるから、体調等に応じて再認証が失敗する蓋然性が低減される。また、第2の生体情報は、いずれ消去されるから、画像処理装置3MAの記憶容量が節約される。 In this case, as described in the explanation of FIG. 8, the second biometric information, which reflects the user's current physical condition and the like, is used instead of the first biometric information, so the probability that re-authentication fails due to physical condition and the like is reduced. Furthermore, since the second biometric information is eventually erased, the storage capacity of the image processing device 3MA is saved.
 第2実施形態(または第3実施形態)に係る画像処理装置3は、画像処理部31と、入力部(検出部25)と、メモリ(補助記憶装置45)と、制御部29と、を有している。画像処理部31は、プリンタ19およびスキャナ21の少なくとも一方を含む。検出部25は、ユーザの生体情報が入力される。補助記憶装置45は、ユーザ毎の第1の生体情報を保存する。制御部29は、検出部25に入力された第2の生体情報と、補助記憶装置45に保存されている第1の生体情報とを比較する認証の結果に基づき画像処理部31に関連する機能の制限解除を制御する(ステップST1~ST7)。画像処理装置3MAは、他の画像処理装置3MBに第1ユーザ(ユーザB)の第2の生体情報をエクスポートし(図9のステップST63)、画像処理装置3MBから受信した認証の結果に基づき画像処理部31に関連する機能の制限解除を制御することもできる(ステップST13)。 The image processing device 3 according to the second embodiment (or the third embodiment) includes an image processing section 31, an input section (detection section 25), a memory (auxiliary storage device 45), and a control section 29. The image processing section 31 includes at least one of a printer 19 and a scanner 21. The detection unit 25 receives input of the user's biometric information. The auxiliary storage device 45 stores first biometric information for each user. The control unit 29 controls the release of restrictions on functions related to the image processing unit 31 based on the result of authentication that compares the second biometric information input to the detection unit 25 with the first biometric information stored in the auxiliary storage device 45 (steps ST1 to ST7). The image processing device 3MA can also export the second biometric information of the first user (user B) to another image processing device 3MB (step ST63 in FIG. 9) and control the release of restrictions on functions related to the image processing unit 31 based on the authentication result received from the image processing device 3MB (step ST13).
 従って、第2実施形態においても、実施形態の概要で述べた効果が奏される。例えば、ユーザBの利便性の向上、画像処理装置3MAの記憶容量の節約、および/またはネットワーク10の負担軽減が図られる。 Therefore, the effects described in the overview of the embodiment are also achieved in the second embodiment. For example, convenience for user B can be improved, storage capacity of image processing device 3MA can be saved, and/or load on network 10 can be reduced.
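In this second-embodiment flow, the comparison happens on the device that holds the registered template rather than on the requesting device. The sketch below models only that division of roles, under assumed names and a simplified equality match; it is not the embodiment's matching algorithm.

```python
# Sketch of the second-embodiment flow: the local device exports the newly
# input second biometric information, and the remote device that holds the
# registered first biometric information performs the comparison
# (all names are illustrative assumptions).

class TemplateHolder:
    """Stands in for image processing device 3MB."""
    def __init__(self, templates):
        self.templates = templates              # user id -> first biometric info

    def verify(self, user_id, second_biometric):
        # Authentication is performed here, where the template is stored.
        return self.templates.get(user_id) == second_biometric


class Requester:
    """Stands in for image processing device 3MA."""
    def __init__(self, holder):
        self.holder = holder
        self.unlocked = set()

    def try_unlock(self, user_id, second_biometric):
        # Export the second biometric information and use the returned result.
        if self.holder.verify(user_id, second_biometric):
            self.unlocked.add(user_id)          # release function restriction
            return True
        return False
```

Because the registered template never leaves the holder, the requesting device needs no copy of it, which matches the storage-saving effect described above.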
 第2実施形態に係る画像処理装置3は、アカウント情報が入力されるUI部23をさらに有していてよい。アカウント情報は、IDとパスワードとを含んでよい。画像処理装置3MAは、UI部23に入力された第1ユーザ(ユーザB)のアカウント情報を他の画像処理装置3MBに送信してよく(図10のステップST42)、画像処理装置3MBからユーザBのアカウント情報に基づく認証の成功が通知された場合に(ステップST72で肯定判定がなされた場合に)、画像処理装置3MBへユーザBの第2の生体情報をエクスポートしてよい(ステップST73)。 The image processing device 3 according to the second embodiment may further include a UI section 23 into which account information is input. The account information may include an ID and a password. The image processing device 3MA may transmit the account information of the first user (user B) input to the UI section 23 to the other image processing device 3MB (step ST42 in FIG. 10), and may export the second biometric information of user B to the image processing device 3MB when the image processing device 3MB notifies it of successful authentication based on the account information of user B (when an affirmative determination is made in step ST72) (step ST73).
 この場合、例えば、画像処理装置3MAが誤った送信先にユーザBの第2の生体情報をエクスポートする蓋然性が低減される。すなわち、第2の生体情報の意図されていない流出が生じる蓋然性が低減される。また、アカウント情報を用いることから、他の認証用情報を用いる態様に比較して、ユーザBの手続きが簡単である。 In this case, for example, the probability that the image processing device 3MA exports the second biometric information of the user B to the wrong destination is reduced. That is, the probability that the second biological information will be unintentionally leaked is reduced. Furthermore, since account information is used, the procedure for user B is simpler than in the case where other authentication information is used.
 第2実施形態に係る画像処理装置3MAは、所定の消去条件(図10のステップST79)が満たされるまで第1ユーザ(ユーザB)の第2の生体情報を保持してよく、消去条件が満たされる前に所定の再認証条件が満たされたときに、ユーザBの第2の生体情報を画像処理装置3MBに再度エクスポートしてよい(図8のステップST56)。 The image processing device 3MA according to the second embodiment may retain the second biometric information of the first user (user B) until a predetermined erasure condition (step ST79 in FIG. 10) is satisfied, and may export the second biometric information of user B to the image processing device 3MB again when a predetermined re-authentication condition is satisfied before the erasure condition is satisfied (step ST56 in FIG. 8).
 この場合、例えば、5.3.2節で述べたように、画像処理装置3MBによる再認証が必要になったときに、ユーザBに生体情報の再入力を要求する必要性が低減され、ユーザBの利便性が向上する。なお、セキュリティの観点からは、上記の説明とは異なり、再認証の必要性が生じるたびに、ユーザBに生体情報の再入力が要求されてよい。 In this case, for example, as described in Section 5.3.2, the need to request user B to re-input biometric information when re-authentication by the image processing device 3MB becomes necessary is reduced, improving the convenience of user B. Note that, from a security standpoint, unlike the above description, user B may be requested to re-input biometric information each time re-authentication becomes necessary.
 第1~第3実施形態に係る画像処理装置3は、個々のユーザに対して複数種類の第1の生体情報をメモリ(補助記憶装置45)に登録可能であってよい。 The image processing device 3 according to the first to third embodiments may be capable of registering a plurality of types of first biometric information for each user in the memory (auxiliary storage device 45).
 この場合、3.1節で述べたように、例えば、生体情報による認証に失敗したときに他の生体情報によって認証することができ利便性が向上する。および/または、ユーザまたは機能に応じて、2種以上の生体情報を要求することによってセキュリティを高くすることができる。 In this case, as described in Section 3.1, for example, when authentication using one type of biometric information fails, authentication can be performed using another type, improving convenience. And/or, security can be increased by requiring two or more types of biometric information depending on the user or the function.
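Fallback across several registered biometric types for one user can be expressed compactly. The type keys ("finger", "face") and the equality-based match below are assumptions for illustration only.

```python
# Sketch of fallback across multiple registered biometric types for one user
# (type names and the matching rule are illustrative assumptions).

def authenticate_any(registered, presented):
    """registered / presented map a biometric type (e.g. 'finger', 'face')
    to template data; succeed if any presented type matches its registered
    counterpart, so a failure with one type can fall back to another."""
    return any(registered.get(kind) == data
               for kind, data in presented.items())
```

A stricter policy for sensitive functions could instead require that all of a chosen set of types match, which corresponds to the security-raising variant mentioned above.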
 第1~第3実施形態に係る画像処理装置3MAは、ユーザによる他の画像処理装置3MB(インポート元またはエクスポート先)を特定する情報の入力を受け付けるUI部23をさらに有していてよい(図5C参照)。 The image processing device 3MA according to the first to third embodiments may further include a UI section 23 that receives input by the user of information specifying the other image processing device 3MB (the import source or export destination) (see FIG. 5C).
 この場合、画像処理装置3MAは、複数の他の画像処理装置3に対して、ユーザBの第1の生体情報を保持しているか否か問い合わせなくてよい。その結果、例えば、通信システム1の負担が軽減される。特に、通信システム1が含む画像処理装置3の数が多いときに画像処理装置3の負担およびネットワーク10の負担が軽減される。 In this case, the image processing device 3MA does not need to inquire of the plurality of other image processing devices 3 whether they hold the first biometric information of user B. As a result, for example, the burden on the communication system 1 is reduced. In particular, when the communication system 1 includes a large number of image processing devices 3, the burden on the image processing devices 3 and on the network 10 is reduced.
 第1~第3実施形態に係る画像処理装置3MAは、他の画像処理装置3MB(インポート元またはエクスポート先)の複数の候補を表示する表示部35をさらに有していてよい(図5C参照)。 The image processing device 3MA according to the first to third embodiments may further include a display unit 35 that displays a plurality of candidates for other image processing devices 3MB (import source or export destination) (see FIG. 5C). .
 この場合、例えば、ユーザBは、複数の候補から画像処理装置3MBを選択すればよいから、ユーザBの負担が軽減される。また、図5Cのように、複数の候補または複数の候補と対応付けられているボタン(ソフトウェアキーまたはハードウェアキー)に対する操作によって画像処理装置3MBを選択可能な態様では、画像処理装置3MBのアドレスを入力する態様に比較して、さらにユーザの負担が軽減される。 In this case, for example, user B only has to select the image processing device 3MB from a plurality of candidates, so the burden on user B is reduced. Furthermore, as shown in FIG. 5C, in a mode in which the image processing device 3MB can be selected by operating the plurality of candidates or buttons (software keys or hardware keys) associated with the plurality of candidates, the burden on the user is further reduced compared to a mode in which the address of the image processing device 3MB is input.
 第3実施形態(図11)において、メモリ(補助記憶装置45)は、ユーザ毎に第1の生体情報と認証用情報(例えばアカウント情報)とを関連付けて保存していてよい。画像処理装置3MAは、第2の生体情報と一致する第1の生体情報に関連付けて保存されている認証用情報に基づく情報を外部認証装置(サーバ5)に送信してよく、送信した情報に基づく認証結果をサーバ5から受信してよく、受信した認証結果に基づいて、画像処理部31に関連する機能の制限解除を制御してよい。 In the third embodiment (FIG. 11), the memory (auxiliary storage device 45) may store, for each user, first biometric information in association with authentication information (for example, account information). The image processing device 3MA may transmit information based on the authentication information stored in association with the first biometric information that matches the second biometric information to the external authentication device (server 5), may receive an authentication result based on the transmitted information from the server 5, and may control the release of restrictions on functions related to the image processing unit 31 based on the received authentication result.
 なお、上記は、ユーザがユーザAの場合の動作である。既述のとおり、ユーザがユーザBの場合においては、画像処理装置3MAに保存されている認証用情報に代えて、ユーザBが入力した認証用情報、または画像処理装置3MBからインポートされた認証用情報が用いられ、上記と同様の動作が行われてよい。 Note that the above is the operation when the user is user A. As described above, when the user is user B, the authentication information input by user B, or the authentication information imported from the image processing device 3MB, is used instead of the authentication information stored in the image processing device 3MA, and operations similar to those described above may be performed.
 上記のような動作が行われる場合、例えば、第3実施形態の説明で述べたように、生体認証(ローカル認証)と、サーバ5における認証との2段階の認証によって、機能の制限が解除されるから、セキュリティが向上する。また、別の観点では、ユーザAが画像処理装置3MAを利用する場合、ユーザAは、アカウント情報を入力せずに機能制限を解除できるから、利便性が向上する。 When the above operation is performed, for example, as described in the explanation of the third embodiment, restrictions on functions are released through two-step authentication consisting of biometric authentication (local authentication) and authentication at the server 5, which improves security. From another perspective, when user A uses the image processing device 3MA, user A can release the functional restrictions without inputting account information, which improves convenience.
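The two-step release of restrictions can be outlined as a local biometric match followed by an account check at the external server. The class and function names are assumptions, the matches are simplified to equality checks, and a real deployment would of course not compare plaintext passwords; only the two-stage ordering is modeled.

```python
# Sketch of two-step restriction release: local biometric authentication,
# then account authentication at the external server
# (all names and the matching rules are illustrative assumptions).

class AuthServer:
    """Stands in for server 5, the external authentication device."""
    def __init__(self, accounts):
        self.accounts = accounts                # user id -> password

    def check(self, user_id, password):
        return self.accounts.get(user_id) == password


def release_restrictions(templates, credentials, server, user_id, second_biometric):
    """templates: user id -> first biometric info stored in the memory;
    credentials: user id -> authentication info stored in association
    with that template. Both steps must succeed."""
    if templates.get(user_id) != second_biometric:
        return False                            # step 1: local biometric auth
    password = credentials.get(user_id)
    return server.check(user_id, password)      # step 2: server-side auth
```

Note that the user never types the account information here: it is looked up from the entry associated with the matching template, which corresponds to the convenience effect described above.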
 第1~第3実施形態において、メモリ(補助記憶装置45)は、ユーザ毎に第1の生体情報と認証用情報(例えばアカウント情報)とを関連付けて保存していてよい。画像処理装置3MAは、外部認証装置(サーバ5)とのVPN接続のための認証を行うときに、第2の生体情報と一致する第1の生体情報に関連付けて保存されている認証用情報に基づく情報をサーバ5へ送信してよい(図13のステップST29)。 In the first to third embodiments, the memory (auxiliary storage device 45) may store, for each user, first biometric information in association with authentication information (for example, account information). When performing authentication for a VPN connection with the external authentication device (server 5), the image processing device 3MA may transmit information based on the authentication information stored in association with the first biometric information that matches the second biometric information to the server 5 (step ST29 in FIG. 13).
 なお、上記は、ユーザがユーザAの場合の動作である。既述のとおり、ユーザがユーザBの場合においては、画像処理装置3MAに保存されている認証用情報に代えて、ユーザBが入力した認証用情報、または画像処理装置3MBからインポートされた認証用情報が用いられ、上記と同様の動作が行われてよい。 Note that the above is the operation when the user is user A. As described above, when the user is user B, the authentication information input by user B, or the authentication information imported from the image processing device 3MB, is used instead of the authentication information stored in the image processing device 3MA, and operations similar to those described above may be performed.
 この場合、例えば、VPN接続は、第3実施形態と同様に、2段階の認証を経ることになる。その結果、セキュリティが向上する。また、別の観点では、ユーザAが画像処理装置3MAを利用する場合、ユーザAは、アカウント情報を入力せずにVPN接続を行うことができるから、利便性が向上する。 In this case, for example, the VPN connection will undergo two-step authentication, similar to the third embodiment. As a result, security is improved. In addition, from another point of view, when user A uses the image processing device 3MA, user A can make a VPN connection without inputting account information, which improves convenience.
 第1~第3実施形態に係る通信システム1は、上記のような画像処理装置3MAと、他の画像処理装置3MBとを有している。 The communication system 1 according to the first to third embodiments includes an image processing device 3MA as described above and another image processing device 3MB.
 これにより、上述した種々の効果が奏される。 As a result, the various effects described above are achieved.
 第1実施形態(または第3実施形態)に係る通信システム1は、第1実施形態(または第3実施形態)に係る画像処理装置3MAと、他の画像処理装置3MBと、を有していてよい。画像処理装置3MAは、アカウント情報が入力されるUI部23をさらに有していてよく、UI部23に入力された第1ユーザ(ユーザB)のアカウント情報を画像処理装置3MBに送信してよい(図7のステップST42)。アカウント情報は、IDとパスワードとを含んでよい。画像処理装置3MBは、ユーザ毎に第1の生体情報とアカウント情報とを関連付けて保存していてよく、画像処理装置3MAから受信したアカウント情報と一致するアカウント情報に関連付けて保存されている第1の生体情報を画像処理装置3MAにエクスポートしてよい(図7のステップST45)。画像処理装置3MAは、画像処理装置3MBからエクスポートされたユーザBの第1の生体情報とユーザBの第2の生体情報とを比較して認証を行ってよい(ステップST48)。 The communication system 1 according to the first embodiment (or the third embodiment) may include the image processing device 3MA according to the first embodiment (or the third embodiment) and another image processing device 3MB. The image processing device 3MA may further include a UI section 23 into which account information is input, and may transmit the account information of the first user (user B) input to the UI section 23 to the image processing device 3MB (step ST42 in FIG. 7). The account information may include an ID and a password. The image processing device 3MB may store, for each user, first biometric information in association with account information, and may export to the image processing device 3MA the first biometric information stored in association with the account information that matches the account information received from the image processing device 3MA (step ST45 in FIG. 7). The image processing device 3MA may perform authentication by comparing the first biometric information of user B exported from the image processing device 3MB with the second biometric information of user B (step ST48).
 この場合、画像処理装置3MBは、画像処理装置3MAから送信されたアカウント情報に基づく認証を行って、認証に成功したときだけ、ユーザBの第1の生体情報をエクスポートする。その結果、例えば、ユーザBの第1の生体情報の意図されていない流出が生じる蓋然性が低減される。また、アカウント情報を用いることから、他の認証用情報を用いる態様に比較して、ユーザBの手続きが簡単である。 In this case, the image processing device 3MB performs authentication based on the account information transmitted from the image processing device 3MA, and exports the first biometric information of user B only when the authentication succeeds. As a result, for example, the probability of unintended leakage of user B's first biometric information is reduced. Furthermore, since account information is used, the procedure for user B is simpler than in a mode using other authentication information.
 第2実施形態(または第3実施形態)に係る通信システム1は、第2実施形態(または第3実施形態)に係る画像処理装置3MAと、他の画像処理装置3MBと、を有していてよい。画像処理装置3MAは、アカウント情報が入力されるUI部23をさらに有していてよく、UI部23に入力された第1ユーザ(ユーザB)のアカウント情報を画像処理装置3MBに送信してよい(図10のステップST42)。アカウント情報は、IDとパスワードとを含んでよい。画像処理装置3MBは、ユーザ毎に第1の生体情報とアカウント情報とを関連付けて保存していてよく、画像処理装置3MAから受信したアカウント情報と一致するアカウント情報が保存されている場合に、認証の成功を画像処理装置3MAに通知してよい(図10のステップST71)。画像処理装置3MAは、認証の成功が通知された場合に、画像処理装置3MBへユーザBの第2の生体情報をエクスポートしてよい(ステップST73)。画像処理装置3MBは、画像処理装置3MAからエクスポートされたユーザBの第2の生体情報と、ユーザBの第1の生体情報とを比較して認証を行ってよく(ステップST74)、該認証の結果を画像処理装置3MAに送信してよい(ステップST75)。 The communication system 1 according to the second embodiment (or the third embodiment) may include the image processing device 3MA according to the second embodiment (or the third embodiment) and another image processing device 3MB. The image processing device 3MA may further include a UI section 23 into which account information is input, and may transmit the account information of the first user (user B) input to the UI section 23 to the image processing device 3MB (step ST42 in FIG. 10). The account information may include an ID and a password. The image processing device 3MB may store, for each user, first biometric information in association with account information, and may notify the image processing device 3MA of successful authentication when account information matching the account information received from the image processing device 3MA is stored (step ST71 in FIG. 10). When notified of successful authentication, the image processing device 3MA may export the second biometric information of user B to the image processing device 3MB (step ST73). The image processing device 3MB may perform authentication by comparing the second biometric information of user B exported from the image processing device 3MA with the first biometric information of user B (step ST74), and may transmit the result of the authentication to the image processing device 3MA (step ST75).
 この場合、例えば、画像処理装置3MAが誤った送信先にユーザBの第2の生体情報をエクスポートする蓋然性が低減される。すなわち、第2の生体情報の意図されていない流出が生じる蓋然性が低減される。また、アカウント情報を用いることから、他の認証用情報を用いる態様に比較して、ユーザBの手続きが簡単である。 In this case, for example, the probability that the image processing device 3MA exports the second biometric information of the user B to the wrong destination is reduced. That is, the probability that the second biological information will be unintentionally leaked is reduced. Furthermore, since account information is used, the procedure for user B is simpler than in the case where other authentication information is used.
 第2実施形態(または第3実施形態)において、他の画像処理装置3MBは、所定時間が経過したことを含む消去条件が満たされたときに、画像処理装置3MAからエクスポートされた第2の生体情報を消去してよい(図10のステップST82)。 In the second embodiment (or the third embodiment), the other image processing device 3MB may erase the second biometric information exported from the image processing device 3MA when an erasure condition including the elapse of a predetermined time is satisfied (step ST82 in FIG. 10).
 この場合、例えば、エクスポートされた第2の生体情報の意図されていない流出が生じる蓋然性が低減される。また、上記の所定時間を適宜に設定することによって、第2の生体情報の再利用による利便性の向上と、第2の生体情報の消去によるセキュリティ向上とをバランスさせることができる。 In this case, for example, the probability of unintended leakage of the exported second biological information is reduced. Furthermore, by appropriately setting the above predetermined time, it is possible to balance improved convenience by reusing the second biometric information and improved security by erasing the second biometric information.
 第2実施形態(または第3実施形態)において、他の画像処理装置3MBは、画像処理装置3MAから第1ユーザ(ユーザB)の第2の生体情報を消去する要請を受信したことを含む消去条件が満たされたときに、画像処理装置3MAからエクスポートされた第2の生体情報を消去してよい(図10のステップST82)。 In the second embodiment (or the third embodiment), the other image processing device 3MB may erase the second biometric information exported from the image processing device 3MA when an erasure condition including reception of a request from the image processing device 3MA to erase the second biometric information of the first user (user B) is satisfied (step ST82 in FIG. 10).
 この場合、例えば、エクスポートされた第2の生体情報の意図されていない流出が生じる蓋然性が低減される。また、画像処理装置3MAの状況に応じて画像処理装置3MBが保持している第2の生体情報が消去される。その結果、例えば、ユーザBは、画像処理装置3MBから離れているにも関わらず、画像処理装置3MBが保持している第2の生体情報の消去を制御することができる。これにより、例えば、利便性および/またはセキュリティを向上させることができる。 In this case, for example, the probability of unintended leakage of the exported second biological information is reduced. Further, the second biometric information held by the image processing device 3MB is deleted depending on the status of the image processing device 3MA. As a result, for example, the user B can control the deletion of the second biometric information held by the image processing device 3MB, even though the user B is far from the image processing device 3MB. This can improve convenience and/or security, for example.
 第2実施形態(または第3実施形態)において、他の画像処理装置3MBは、所定の消去条件が満たされるまで第1ユーザ(ユーザB)の第2の生体情報を保持してよく、消去条件が満たされる前に、画像処理装置3MAからユーザBの第2の生体情報が再びエクスポートされたときに、再びエクスポートされたユーザBの第2の生体情報と、保持しているユーザBの第2の生体情報とを比較して認証を行ってよく、該認証の結果を画像処理装置3MAに送信してよい(図5のステップST56を参照)。 In the second embodiment (or the third embodiment), the other image processing device 3MB may retain the second biometric information of the first user (user B) until a predetermined erasure condition is satisfied; when the second biometric information of user B is exported again from the image processing device 3MA before the erasure condition is satisfied, the image processing device 3MB may perform authentication by comparing the newly exported second biometric information of user B with the retained second biometric information of user B, and may transmit the result of the authentication to the image processing device 3MA (see step ST56 in FIG. 5).
 In this case, for example, as described in Section 5.3.2, re-authentication can be performed using the exported second biometric information in place of the registered first biometric information. This reduces the probability that re-authentication will fail due to the user's physical condition or the like.
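The re-authentication comparison can be sketched as follows. This is an illustrative assumption, not the disclosed implementation: the `similarity` metric is a placeholder, and a real device would use a proper biometric matcher with a tuned threshold.

```python
def similarity(a, b):
    # Placeholder metric: fraction of matching bytes between two
    # templates. Stands in for a real biometric matching algorithm.
    if not a or not b or len(a) != len(b):
        return 0.0
    return sum(x == y for x, y in zip(a, b)) / len(a)


def reauthenticate(held_template, re_exported_template, threshold=0.9):
    """Compare the re-exported second biometric information against the
    second biometric information already held; succeed only if the
    similarity clears the threshold."""
    return similarity(held_template, re_exported_template) >= threshold
```

Because both templates are recent captures of the same user's second biometric information, they are likely to match even when the user's current physical condition would cause a mismatch against the older registered first biometric information.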
 Note that in the above embodiments, the image processing device 3MA is an example of an image processing device, and the image processing device 3MB is an example of another image processing device. The detection unit 25 is an example of an input unit. The auxiliary storage device 45 is an example of a memory. The server 5 is an example of an external authentication device. The account information is an example of authentication information.
 The technology according to the present disclosure is not limited to the above embodiments and may be implemented in various modes.
 For example, the image processing device need not be a multifunction peripheral including both a printer and a scanner; it may be a device having only a printing function (that is, a printer in the narrow sense) or only a scanning function (that is, a scanner in the narrow sense). Note that a multifunction peripheral may be regarded as a printer (in the broad sense) or a scanner (in the broad sense).
 From the present disclosure, it is possible to extract concepts that do not require the import of the first biometric information (first embodiment) or the export of the second biometric information (second embodiment). For example, a configuration in which multiple types of biometric authentication can be registered may be extracted independently of the import and export of biometric information.
 1... communication system; 3, 3A, 3B, 3C, 3MA, and 3MB... image processing device; 5... server (external authentication device); 19... printer; 21... scanner; 23... UI unit (input unit); 25... detection unit (input unit); 27... communication unit; 29... control unit; 31... image processing unit; 33... operation unit (input unit); 37... connector (input unit); 45... auxiliary storage device (memory).

Claims (19)

  1.  An image processing device comprising:
      an image processing unit including at least one of a printer and a scanner;
      an input unit into which biometric information of a user is input;
      a memory that stores first biometric information for each user; and
      a control unit that performs:
      control of release of a restriction on a function related to the image processing unit, based on a result of authentication that compares second biometric information input to the input unit with the first biometric information stored in the memory; and
      control of release of a restriction on a function related to the image processing unit, based on a result of authentication that compares first biometric information of a first user, imported from another image processing device in response to input of information identifying the first user, with second biometric information of the first user.
  2.  The image processing device according to claim 1, further comprising a UI unit into which account information is input, wherein
      the account information includes an ID and a password, and
      when requesting the first biometric information of the first user from the other image processing device, the image processing device transmits the account information of the first user input to the UI unit to the other image processing device.
  3.  The image processing device according to claim 1 or 2, wherein the first biometric information of the first user imported from the other image processing device is erased when an erasure condition is satisfied, the condition including that a predetermined time has elapsed.
  4.  The image processing device according to any one of claims 1 to 3, wherein the first biometric information of the first user imported from the other image processing device is erased when an erasure condition is satisfied, the condition including completion of execution of the function related to the image processing unit whose restriction was released based on the result of the authentication.
  5.  The image processing device according to any one of claims 1 to 4, further comprising a UI unit that accepts user operations, wherein the first biometric information of the first user imported from the other image processing device is erased when an erasure condition is satisfied, the condition including that a predetermined operation has been performed on the UI unit.
  6.  The image processing device according to any one of claims 1 to 5, wherein
      the first biometric information of the first user imported from the other image processing device is automatically erased when a predetermined automatic erasure condition is satisfied, and
      after authentication succeeds by comparing the second biometric information of the first user with the first biometric information of the first user imported from the other image processing device, and before the automatic erasure condition is satisfied, re-authentication can be performed by comparing second biometric information of the first user re-input to the input unit with the first biometric information of the first user imported from the other image processing device.
  7.  The image processing device according to any one of claims 1 to 6, wherein second biometric information is held until a predetermined erasure condition is satisfied, re-authentication is performed by comparing newly detected second biometric information with the held second biometric information, and release of a restriction on a function is controlled according to a result of the re-authentication.
  8.  An image processing device comprising:
      an image processing unit including at least one of a printer and a scanner;
      an input unit into which biometric information of a user is input;
      a memory that stores first biometric information for each user; and
      a control unit that performs:
      control of release of a restriction on a function related to the image processing unit, based on a result of authentication that compares second biometric information input to the input unit with the first biometric information stored in the memory; and
      control of release of a restriction on a function related to the image processing unit, based on a result, received from another image processing device, of authentication based on exported second biometric information of a first user.
  9.  The image processing device according to claim 8, further comprising a UI unit into which account information is input, wherein
      the account information includes an ID and a password, and
      the image processing device transmits the account information of the first user input to the UI unit to the other image processing device, and exports the second biometric information of the first user to the other image processing device when the other image processing device gives notice of successful authentication based on the account information of the first user.
  10.  The image processing device according to claim 8 or 9, wherein the second biometric information of the first user is held until a predetermined erasure condition is satisfied, and when a predetermined re-authentication condition is satisfied before the erasure condition is satisfied, the held second biometric information of the first user is exported again to the other image processing device.
  11.  The image processing device according to any one of claims 1 to 10, wherein a plurality of types of first biometric information can be registered in the memory for each individual user.
  12.  The image processing device according to any one of claims 1 to 11, further comprising a UI unit that accepts input, by a user, of information identifying the other image processing device.
  13.  The image processing device according to any one of claims 1 to 12, further comprising a display unit that displays a plurality of candidates for the other image processing device.
  14.  The image processing device according to any one of claims 1 to 13, wherein
      the memory stores, for each user, first biometric information and authentication information in association with each other, and
      the image processing device transmits, to an external authentication device, information based on the authentication information stored in association with first biometric information that matches second biometric information, receives from the external authentication device an authentication result based on the transmitted information, and controls release of a restriction on a function related to the image processing unit based on the received authentication result.
  15.  The image processing device according to any one of claims 1 to 14, wherein
      the memory stores, for each user, first biometric information and authentication information in association with each other, and
      when performing authentication for a VPN connection with an external authentication device, the image processing device transmits, to the external authentication device, information based on the authentication information stored in association with first biometric information that matches second biometric information.
  16.  A communication system comprising:
      the image processing device according to any one of claims 1 to 15; and
      the other image processing device.
  17.  A communication system comprising:
      the image processing device according to any one of claims 1 to 7; and
      the other image processing device, wherein
      the image processing device further comprises a UI unit into which account information is input, and transmits the account information of the first user input to the UI unit to the other image processing device,
      the account information includes an ID and a password,
      the other image processing device stores, for each user, first biometric information and account information in association with each other, and exports to the image processing device the first biometric information stored in association with account information that matches the account information received from the image processing device, and
      the image processing device performs authentication by comparing the first biometric information of the first user exported from the other image processing device with the second biometric information of the first user.
  18.  A communication system comprising:
      the image processing device according to any one of claims 7 to 9; and
      the other image processing device, wherein
      the image processing device further comprises a UI unit into which account information is input, and transmits the account information of the first user input to the UI unit to the other image processing device,
      the account information includes an ID and a password,
      the other image processing device stores, for each user, first biometric information and account information in association with each other, and notifies the image processing device of successful authentication when account information matching the account information received from the image processing device is stored,
      the image processing device exports the second biometric information of the first user to the other image processing device when notified of the successful authentication, and
      the other image processing device performs authentication by comparing the second biometric information of the first user exported from the image processing device with the first biometric information of the first user, and transmits a result of the authentication to the image processing device.
  19.  A communication system comprising:
      the image processing device according to any one of claims 7 to 9; and
      the other image processing device, wherein
      the other image processing device holds the second biometric information of the first user until a predetermined erasure condition is satisfied, and, when the second biometric information of the first user is exported again from the image processing device before the erasure condition is satisfied, performs authentication by comparing the re-exported second biometric information of the first user with the second biometric information of the first user that it holds, and transmits a result of the authentication to the image processing device.
PCT/JP2022/032789 2022-08-31 2022-08-31 Image processing device and communication system WO2024047801A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2022/032789 WO2024047801A1 (en) 2022-08-31 2022-08-31 Image processing device and communication system
JP2023550614A JP7408027B1 (en) 2022-08-31 2022-08-31 Image processing device and communication system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032789 WO2024047801A1 (en) 2022-08-31 2022-08-31 Image processing device and communication system

Publications (1)

Publication Number Publication Date
WO2024047801A1 true WO2024047801A1 (en) 2024-03-07

Family

ID=89377138

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032789 WO2024047801A1 (en) 2022-08-31 2022-08-31 Image processing device and communication system

Country Status (2)

Country Link
JP (1) JP7408027B1 (en)
WO (1) WO2024047801A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008021071A (en) * 2006-07-12 2008-01-31 Fujitsu Ltd Personal identification apparatus and personal identification method
JP2009271566A (en) * 2008-04-30 2009-11-19 Seiko Epson Corp Information display device, communication system and program
JP2012064006A (en) * 2010-09-16 2012-03-29 Fuji Xerox Co Ltd Information processor and program
JP2021056607A (en) * 2019-09-27 2021-04-08 コニカミノルタ株式会社 User authentication system, biological information server, image forming apparatus, and program thereof

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007011609A (en) 2005-06-29 2007-01-18 Sharp Corp User authentication system, image processing system, authentication device, and relay device
JP2008021222A (en) 2006-07-14 2008-01-31 Murata Mach Ltd Image formation system, image forming apparatus and user authentication method


Also Published As

Publication number Publication date
JP7408027B1 (en) 2024-01-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22957399

Country of ref document: EP

Kind code of ref document: A1