WO2024047800A1 - Image processing device and communication system - Google Patents

Image processing device and communication system

Info

Publication number
WO2024047800A1
Authority
WO
WIPO (PCT)
Prior art keywords
authentication
image processing
processing device
information
user
Prior art date
Application number
PCT/JP2022/032788
Other languages
English (en)
Japanese (ja)
Inventor
浩史 岡
博文 鈴木
茂樹 高谷
Original Assignee
京セラ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 京セラ株式会社 filed Critical 京セラ株式会社
Priority to PCT/JP2022/032788 priority Critical patent/WO2024047800A1/fr
Publication of WO2024047800A1 publication Critical patent/WO2024047800A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints

Definitions

  • the present disclosure relates to an image processing device having at least one of a printer and a scanner, and a communication system including the image processing device.
  • An image processing device that performs biometric authentication is known (for example, Patent Document 1 below).
  • the image processing device described in Patent Document 1 reads a fingerprint using a document reading device.
  • a USB (Universal Serial Bus) memory in which fingerprint data for verification is stored is connected to the image processing device.
  • the image processing device performs authentication by comparing the read fingerprint with the fingerprint stored in the USB memory. If the authentication is successful, the image processing device allows the user to copy or fax.
  • An image processing device includes an image processing section, a detection section, a first memory, a communication section, and a control section.
  • the image processing section includes at least one of a printer and a scanner.
  • the detection unit detects biometric information of the user.
  • the first memory stores first biometric information and authentication information in association with each user.
  • The communication unit transmits, to an external authentication device, information based on the authentication information stored in the first memory in association with the first biometric information that matches the second biometric information of the first user detected by the detection unit.
  • the control unit controls the release of restrictions on functions related to at least one of the image processing unit and the communication unit based on the authentication result received from the external authentication device.
  • a communication system includes the image processing device and the external authentication device.
  • the external authentication device has a second memory that stores authentication information.
  • When authentication using the information based on the authentication information received from the image processing device and the authentication information stored in the second memory is successful, the external authentication device transmits an authentication result indicating that the authentication was successful to the image processing device.
  • FIG. 1 is a schematic diagram showing an example of a communication system according to an embodiment.
  • FIG. 2 is a schematic diagram showing a hardware configuration related to a signal processing system of an image processing device included in the communication system of FIG. 1.
  • FIG. 3 is a functional block diagram showing the configuration of the communication system in FIG. 1.
  • FIG. 4 is a flowchart showing an overview of the operation of the communication system in FIG. 1.
  • FIG. 5 is a flowchart illustrating an example of a procedure for authentication.
  • A flowchart illustrating an example of a procedure for monitoring a user using face authentication.
  • FIG. 9 is a flowchart showing a continuation of FIG. 8.
  • A flowchart illustrating an example of a procedure for requesting a user to re-enter biometric information.
  • A schematic diagram for explaining a situation in which a user is monitored by a human sensor.
  • A flowchart illustrating an example of a procedure for canceling an authentication state by a human sensor or an operation.
  • A block diagram for explaining a specific example of lifting restrictions based on an authentication result.
  • A flowchart illustrating an example of a procedure for releasing restrictions based on an authentication result.
  • A block diagram for explaining changes in the menu screen due to restriction cancellation.
  • A flowchart for explaining the release of restrictions related to VPN connection.
  • A cross-sectional view showing a specific example of a detection unit that detects biological information using ultrasonic waves.
  • A schematic diagram for explaining an operation mode of a detection unit that detects biological information.
  • A flowchart illustrating an example of a processing procedure related to switching the operation mode of a detection unit that detects biological information.
  • The term "biological information" sometimes refers to the characteristics themselves that actually appear on a person (from another point of view, information that does not depend on the detection method), sometimes refers to the raw information obtained by detecting those characteristics, sometimes refers to feature information extracted from the raw information, and in other cases refers to information processed from the raw information or feature information according to the purpose of use. Examples of the processed information include information obtained by encrypting feature amounts.
  • authentication sometimes refers to the act of confirming the legitimacy of an object, and sometimes refers to the fact that the legitimacy has been confirmed or has been confirmed through such an act.
  • the fact that the validity has been confirmed is sometimes expressed as a successful authentication, and the fact that the legitimacy cannot be confirmed is sometimes expressed as a failure in authentication.
  • the "authentication state” refers to a state where authenticity has been confirmed, or a state where it is regarded as such.
  • The term "network" sometimes refers to a communication network, and sometimes refers to a combination of a communication network and the devices connected to it. The same holds true for subordinate concepts of network. Examples of such subordinate terms are the Internet, a public network, a private network, a LAN (Local Area Network), and a VPN (Virtual Private Network).
  • VPN sometimes refers to a technology that virtually extends a private network to a public network, and sometimes refers to a network using this technology.
  • VPN may be appropriately used to refer to technical matters related to VPN.
  • A connection established for communication using a VPN is sometimes referred to as a VPN connection, and establishing such a connection is sometimes referred to as making a VPN connection.
  • connection can refer to a connection established through authentication (for example, a three-way handshake) (a connection in a narrow sense), or a connection that simply means that communication is possible (a connection in a broad sense).
  • For example, devices that are electrically (physically, from another point of view) connected to each other by cables, but between which any communication is prohibited in terms of software (logically, from another point of view), are not connected in the narrow sense.
  • FIG. 1 is a schematic diagram showing the configuration of a communication system 1 according to an embodiment.
  • the communication system 1 includes a plurality of communication devices that are communicably connected to each other via a network.
  • the plurality of communication devices include one or more image processing devices.
  • three image processing devices 3A, 3B, and 3C are illustrated.
  • the image processing devices 3A to 3C may be referred to as an image processing device 3 (numerals in FIG. 2, etc.) without distinguishing them.
  • Image processing device 3 includes at least one of a printer and a scanner.
  • the plurality of communication devices includes a server 5 (an example of an external authentication device) that authenticates a user who uses the image processing device 3.
  • the image processing device 3 transmits information related to the user to the server 5 when the user attempts to use the image processing device 3 (or a predetermined function of the image processing device 3).
  • the server 5 performs authentication based on the received information. If the authentication by the server 5 is successful, for example, the image processing device 3 allows the user to use a predetermined function (for example, printing) (removes the restriction on the function). Conversely, if the authentication fails, the image processing device 3 does not allow the user to use the predetermined function (restricts the function).
  • Note that the term "control for removing restrictions on functions" is sometimes used as a concept that includes both restricting functions and removing restrictions on functions.
  • a term such as “control of cancellation of authentication state” is sometimes used as a concept that includes both maintaining an authentication state in which authentication has been successful and canceling the authentication state.
  • FIG. 3 is a diagram for explaining the above authentication in more detail, and is also a block diagram showing the configuration of the communication system 1 from a functional perspective.
  • The image processing device 3 stores in advance a comparison table DT1 in which biometric information D2 and account information D1 (an example of authentication information) are associated for each user. Then, if biometric information D2 (first biometric information) that matches the biometric information (second biometric information) input by the user during use is registered in the comparison table DT1, the image processing device 3 transmits the account information D1 associated with the matching biometric information D2 to the server 5. Note that, as is clear from common technical knowledge, "matching" of biometric information means matching with an accuracy that does not pose a practical problem in biometric authentication (from another point of view, the accuracy required of the image processing device 3 and/or the communication system 1), and does not mean a complete match.
  • The server 5 stores in advance a verification table DT2 that includes account information D1 for each user. Then, if account information D1 that matches the received account information D1 is registered in the verification table DT2, the server 5 transmits an authentication result D3 indicating that the authentication was successful to the image processing device 3 that is the transmission source. Conversely, if account information that matches the received account information D1 is not registered in the verification table DT2, the server 5 transmits, for example, an authentication result indicating that authentication has failed to the image processing device 3 that is the transmission source. Upon receiving the authentication result D3 indicating that the authentication was successful, the image processing device 3 releases the restriction on the function as described above.
  • In the above configuration, the user does not need to input account information every time the device is used, improving convenience. Furthermore, since both biometric information verification in the image processing device 3 (local authentication) and account information verification in the server 5 (server authentication) are performed, security is improved compared to a mode in which only one of them is performed.
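  • As a rough sketch of this two-step flow (local biometric matching followed by server-side account verification), consider the following Python illustration. All names (match_score, comparison_table_dt1, etc.) and the similarity threshold are assumptions for illustration and are not part of the disclosure.

      # Hypothetical sketch of the two-step authentication described above.
      MATCH_THRESHOLD = 0.95  # "matching" need not be a complete match

      # Comparison table DT1: first biometric information -> account information D1
      comparison_table_dt1 = {
          "fingerprint_template_user_a": {"id": "user_a", "password": "secret"},
      }

      def match_score(template, detected):
          # Stand-in for a real biometric matcher (fingerprint, face, etc.).
          return 1.0 if template == detected else 0.0

      def local_authentication(detected_biometric):
          # Local step: find first biometric information in DT1 that matches
          # the detected (second) biometric information.
          for template, account_info in comparison_table_dt1.items():
              if match_score(template, detected_biometric) >= MATCH_THRESHOLD:
                  return account_info  # to be sent to the server 5
          return None

      def server_authentication(account_info, verification_table_dt2):
          # Server step: check the received account information against DT2.
          return account_info is not None and account_info["id"] in verification_table_dt2

      # Restrictions are lifted only if both the local and server steps succeed.
      account = local_authentication("fingerprint_template_user_a")
      is_authenticated = server_authentication(account, {"user_a"})

    Failure at either step leaves the functions restricted, mirroring the behavior described for the image processing device 3 and the server 5.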
  • 1. Communication system 1 in general (FIG. 1)
    1.1. Information used by communication system 1
    1.1.1. Biological information
    1.1.2. Authentication information (account information)
    1.1.3. Facial feature data
    1.2. Overall configuration of communication system 1
    1.3. Overview of each communication device
    1.4. Connection mode of communication equipment
    2. Configuration of image processing device 3 (FIG. 2)
    2.1. Overall configuration of image processing device 3
    2.2. Printer
    2.3. Scanner
    2.4. UI (User Interface) section
    2.4.1. Operation unit
    2.4.2. Display section
    2.5. Detection unit that detects biological information
    2.5.1. Configuration of detection unit
    2.5.2. Position and orientation of detection unit, etc.
    2.6. Communication unit
    2.7. Imaging unit
    2.8.
  • the biometric information used by the image processing device 3 for authentication may be of various types, and may be information used in known biometric authentication, for example.
  • The biometric information may be information about the user's physical characteristics or information about the user's behavioral characteristics. Specific examples of physical characteristics include fingerprints, palm shapes, retinas (patterns of blood vessels, etc.), irises (distribution of shading values, etc.), faces, blood vessels (patterns of specific parts such as fingers), ear shapes, sound (such as voiceprints), and body odor.
  • Examples of behavioral characteristics include handwriting.
  • the authentication information that the image processing device 3 (comparison table DT1) stores in association with the biometric information D2 and transmits to the server 5 may be various types of information as long as it can show the validity of the user.
  • Examples of the authentication information include account information D1, a static key, a common key, a private key (or public key), an electronic certificate, and biometric information (biometric information to be compared with the biometric information detected by the detection unit 25; different from the biometric information D2).
  • The authentication information may be the information itself sent to the server 5, or may be used when generating the information sent to the server 5.
  • the former includes account information, static keys, electronic certificates, information obtained from security tokens, and biometric information.
  • the latter includes a common key and a private key (or public key). Both the former and the latter may be used as authentication information.
  • The term "information based on authentication information" is sometimes used as a superordinate concept covering both the authentication information itself sent to the server 5 and information generated based on the authentication information and sent to the server 5.
  • the authentication information stored in the image processing device 3 and the authentication information stored in the server 5 do not have to be the same information. Further, the authentication information stored in the image processing device 3 may be appropriately processed and transmitted to the server 5. Challenge-response authentication may be performed as one such mode.
  • In the description of the embodiments, the authentication information before processing and the authentication information after processing are expressed as the same thing, unless otherwise specified or unless there is a contradiction. In other words, differences in the format and/or precision of the information are ignored, and if the information content indicating the validity of the user is the same, it is expressed as the same information. Therefore, for example, even when the authentication information stored in the image processing device 3 and the authentication information stored in the server 5 are said to match, in reality the authentication information may be processed in the course of transmission from the image processing device 3 to the server 5, and the authentication information stored in the image processing device 3 and the authentication information stored in the server 5 may differ in format or the like.
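  • As one hedged illustration of such processing, a challenge-response exchange might look like the following sketch, in which the stored authentication information itself is never transmitted; only a keyed digest over a server-issued challenge is sent. The choice of HMAC-SHA-256 is an assumption for illustration; the disclosure only states that challenge-response authentication may be performed as one such mode.

      # Illustrative challenge-response sketch (assumed construction).
      import hashlib
      import hmac
      import os

      shared_secret = b"authentication-information"  # e.g., a common key held by both sides

      # Server 5 side: issue a random challenge.
      challenge = os.urandom(16)

      # Image processing device 3 side: send a response derived from the
      # authentication information, not the authentication information itself.
      response = hmac.new(shared_secret, challenge, hashlib.sha256).digest()

      # Server 5 side: recompute the expected response and compare.
      expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
      authentication_succeeded = hmac.compare_digest(response, expected)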
  • account information D1 will be mainly taken as an example of authentication information. Further, unless otherwise specified, explanations may be given on the assumption that the authentication information is the account information D1.
  • The account information D1 (FIG. 3) includes, for example, information for identifying a user (hereinafter sometimes abbreviated as "ID"). Further, the account information D1 may include a password. In the description of the embodiments, a mode in which the account information D1 includes an ID and a password may be taken as an example unless otherwise specified. However, as long as there is no contradiction, the term "account information D1" may be replaced with "ID" (without password) or with "ID and password."
  • the image processing device 3 may control maintenance and cancellation of the authentication state by using the user's facial feature data D4 (FIG. 3).
  • the facial feature data may be of various types, for example, and may be similar to known data used for face authentication.
  • a known matching method for face authentication may be used as a matching method for facial feature data.
  • the facial feature data includes, for example, information such as the relative positions of feature points related to the eyes, nose, mouth, and the like.
  • the facial feature data may be for the entire face, or may be for a part of the face (for example, a part not hidden by a mask).
  • Although the facial feature data D4 is not used as biometric information for biometric authentication in the description of the embodiment, it may be used for biometric authentication.
  • Facial feature data is a type of biometric information, and explanations of terms regarding biometric information may be used for facial feature data.
  • the communication system 1 includes the image processing device 3 and the server 5, as described above.
  • the communication system 1 may include other communication equipment as appropriate.
  • a server 7 different from the server 5 and terminals 9A, 9B, and 9C are illustrated.
  • the terminals 9A to 9C may be referred to as the terminal 9 (representatively, the terminal 9A is given the reference numeral) without distinguishing them.
  • In FIG. 1, a public network 11 and private networks 13A and 13B are illustrated.
  • the communication system 1 may be defined only by the server 5 and the image processing device 3 that is authenticated by the server 5.
  • the communication system 1 may be defined to include other communication devices (server 7 and terminal 9) that can communicate with the server 5 and/or the image processing device 3 that is authenticated by the server 5.
  • the communication system 1 may be defined to include private networks (13A and 13B) in addition to the communication devices (5, 3, 7, and 9) as described above.
  • the communication system 1 may be defined without the public network 11.
  • One example of the server 5 is a dedicated server, and another example is a cloud server.
  • the image processing device 3 includes at least one of a printer and a scanner, as described above. The following description will mainly take as an example a mode in which the image processing device 3 includes both a printer and a scanner.
  • the image processing device 3 may or may not be a multi-function product/printer/peripheral (MFP).
  • the image processing device 3 may be capable of executing one or more of printing, scanning, copying, FAX transmission, and FAX reception (although these are not necessarily separable concepts), for example.
  • the method of operating the image processing device 3 is arbitrary.
  • the image processing device 3A may be installed in a store such as a convenience store and used by an unspecified number of users.
  • the image processing device 3B may be installed in a private home and used by a specific and small number of users (for example, one person).
  • the image processing device 3C may be installed in a company and used by a specific number of users.
  • the configuration and operation of the image processing device 3 described in the embodiment may be applied to any of the one or more image processing devices 3 included in the communication system 1.
  • the embodiments may be described using one of the image processing devices 3A to 3C as an example.
  • the explanation given for any of the image processing apparatuses 3A to 3C may be applied to other image processing apparatuses as long as no contradiction occurs.
  • the server 5 may also authenticate users who use other communication devices (for example, the terminal 9). Furthermore, the server 5 may process services other than authentication. For example, the server 5 may perform ECM (Enterprise Content Management) or function as a VPN server.
  • the server 7 may provide various services.
  • server 7 may be a file server, a mail server and/or a web server.
  • the file server may store, for example, data of an image printed by the image processing device 3 or data scanned by the image processing device 3.
  • the mail server may deliver mail printed by the image processing device 3 or mail containing an image scanned by the image processing device 3.
  • the web server may execute web services performed through communication with the image processing device 3.
  • In FIG. 1, each of the servers 5 and 7 is represented by one computer. However, one server may be realized by a plurality of distributed computers. A plurality of computers making up one server may be directly connected, included in one LAN, or included in mutually different LANs. Note that the servers 5 and 7 may be configured by one computer, and they may be regarded as one server regardless of whether they are configured by one computer or not.
  • the terminal 9 may be of any appropriate type.
  • terminals 9A and 9B are depicted as laptop-type PCs (personal computers).
  • Terminal 9C is depicted as a smartphone.
  • the terminal 9 may be, for example, a desktop PC or a tablet PC.
  • the terminal 9 can be operated in any manner.
  • The terminal 9 may be used by one or more specific users, such as a terminal owned by a company or a terminal owned by an individual, or may be used by an unspecified number of users, such as a terminal at an Internet cafe.
  • the public network 11 is a network that is open to the outside (for example, an unspecified number of communication devices). The specific aspect thereof may be determined as appropriate.
  • the public network 11 may include the Internet, a closed network provided by a telecommunications carrier, and/or a public telephone network.
  • the private networks 13A and 13B are networks that are not disclosed to the outside.
  • Private network 13A and/or 13B may be, for example, a LAN.
  • the LAN may be, for example, a network within the same building. Examples of the LAN include those using Ethernet (registered trademark) and Wi-Fi (registered trademark). Further, the private network 13A and/or 13B may be an intranet.
  • Transmission and/or reception of signals by the communication device may be performed via a wire or wirelessly. Further, the communication device (for example, the image processing device 3) may communicate with the public network 11 without being included in the private network, or may be included in the private network. A communication device (for example, the image processing device 3) included in the private network may communicate only within the private network, or may communicate with the public network 11 via the private network.
  • multiple communication devices may be connected to each other in various ways.
  • In the example of FIG. 1, the connections are as follows.
  • the image processing device 3A has not constructed a private network.
  • the image processing device 3A is capable of communicating with the public network 11 without going through a private network by including a router or the like (not shown) or by being connected to a router or the like.
  • the image processing device 3A may be able to communicate with a terminal 9 (not shown in FIG. 1) that is directly connected to the image processing device 3A by wire. Further, the image processing device 3A may be capable of short-range wireless communication with a terminal 9 (not shown in FIG. 1) placed near the image processing device 3A.
  • the image processing device 3B and the terminal 9B are connected to each other by a private network 13B. More specifically, both are connected via the router 15 (its hub). The image processing device 3B and the terminal 9B can communicate with the public network 11 via the router 15 and the like.
  • the image processing device 3C, server 5, server 7, and terminal 9A are connected to each other by a private network 13A.
  • the image processing device 3C, the server 7, and the terminal 9A can communicate with the public network 11 via the server 5, for example.
  • the server 5 may include a router or the like, or a router (not shown) or the like may be provided between the server 5 and the public network 11.
  • the terminal 9C communicates wirelessly with the public telephone network. Furthermore, the terminal 9C communicates with the public network 11 including the public telephone network.
  • the server 5 authenticates the user who uses the image processing device 3.
  • This authentication by the server 5 is performed, for example, on image processing apparatuses (3A and 3B in FIG. 1) connected to the server 5 via the public network 11.
  • authentication by the server 5 is performed not only on image processing apparatuses connected via the public network 11 but also on image processing apparatuses (3C in FIG. 1) included in the private network 13A including the server 5.
  • the server 5 may authenticate only image processing apparatuses included in the private network 13A that includes the server 5.
  • The combination of the connection mode of a communication device and the operating method of the communication device (from another perspective, its social positioning) is arbitrary.
  • For example, the image processing device 3A that is not included in a private network may be installed in a store and used by an unspecified number of users as described above, or may be installed in a company or the like and used by specific users.
  • The image processing device 3B included in the private network 13B may be installed in a private home and used by a specific and small number of users as described above, or may be installed in an Internet cafe or the like and used by an unspecified number of users.
  • FIG. 2 is a schematic diagram showing a hardware configuration related to a signal processing system of the image processing device 3.
  • the image processing device 3 includes, for example, the following components.
  • A housing 17 (FIG. 1) that constitutes the outer shape of the image processing device 3.
  • A printer 19 that performs printing.
  • A scanner 21 (image scanner) that performs scanning.
  • A UI unit 23 that accepts user operations and/or presents information to the user.
  • A detection unit 25 that detects biometric information of the user.
  • A communication unit 27 (FIG. 2) that performs communication.
  • An imaging unit 28 that images the surroundings of the image processing device 3 (at least a partial range thereof).
  • A processing section 29 (FIG. 2) that controls each section (19, 21, 23, 25, 27, and 28).
  • A connector 37 (FIG. 2) for connecting an appropriate device to the image processing device 3.
  • the printer 19 and/or the scanner 21 may be referred to as an image processing section 31 (the reference numeral is shown in FIG. 2).
  • the housing 17 may be considered as part of the printer 19 or the scanner 21.
  • the processing unit 29 is conceptually one processing unit that controls all operations (including printing and scanning, for example) of the image processing device 3 (in terms of hardware, it is distributed over multiple units).
  • The objects (19, 21, 23, 25, 27, and 28) controlled by the processing section 29 may be conceptualized only as mechanical parts that do not include a processing section, or may be conceptualized as including a part of the processing section 29.
  • In the following description, the word "component" refers to the components other than the housing 17 (19, 21, 23, 25, 27, 28, and 29).
  • It can be said that the housing 17 holds or supports the plurality of components, or is mechanically connected or coupled to the plurality of components.
  • Since the plurality of components are provided in the housing 17, it can be said that they are provided integrally with each other. Note that, as understood from the above description, when it is said that a component is provided in the housing 17, the housing 17 may be regarded as a part of the component.
  • For example, the components and the housing 17 are fixed to each other (excluding, of course, movable parts), and the components are also fixed to each other. Unless the image processing device 3 is disassembled by, for example, removing screws, the components and the housing 17 cannot be separated from each other and placed in different locations, and likewise the components cannot be separated from each other and placed in different locations. However, unlike the above example, when it is said that the image processing device 3 has a component, the component may be detachable from the housing 17. In FIG. 2, a detection section 25A that can be attached to and detached from the connector 37 is shown by a dotted line as an example other than the detection section 25 fixed to the housing 17.
  • the specific positional relationship is arbitrary.
  • For example, the component may be housed within the housing 17, provided integrally with the wall of the housing 17, protruding from the wall of the housing 17, or mounted on the housing 17.
  • The orientation and/or position of a component relative to the housing 17 may be variable.
  • the printer 19, scanner 21, communication section 27, and processing section 29 may be considered to be housed in the housing 17.
  • the UI section 23 and the detection section 25 may be considered to be integrally provided on the wall surface of the housing 17.
  • the imaging unit 28 may be considered to protrude from the wall of the housing 17 (fixed to the wall of the housing 17).
  • the size and shape of the image processing device 3 are arbitrary.
  • For example, the image processing device 3 may have a size (mass) that can be carried by one person, such as a home multifunction device or printer (see the illustration of the image processing device 3B), or a size (mass) that cannot be carried by one person (see the illustrations of the image processing devices 3A and 3C).
  • The image processing device 3 may also differ significantly in concept from a general multifunction peripheral or printer placed in a company (office) or a private home.
  • the printer 19 may print on roll paper.
  • the image processing device 3 may include a robot, and may apply coating to a vehicle body or the like using an inkjet head.
  • the image processing device 3 may be of a size that can be held in one hand, and the image processing device 3 itself may scan a medium to perform printing and/or scanning.
  • the printer 19 is configured, for example, to print on sheets of paper arranged within the housing 17 or on a tray protruding from the housing 17 to the outside, and to discharge the printed sheets.
  • the specific configuration of the printer 19 may be various configurations, for example, it may be similar to a known configuration.
  • For example, the printer 19 may be an inkjet printer that prints by ejecting ink, a thermal printer that prints by heating thermal paper or an ink ribbon, or an electrophotographic printer (for example, a laser printer) that transfers toner adhering to a photosensitive body irradiated with light.
  • the inkjet printer may be a piezo type that applies pressure to the ink using a piezoelectric body, or a thermal type that applies pressure to the ink using bubbles generated in the heated ink.
  • The printer 19 may be a line printer in which the head has a length spanning the width of the sheet (a direction that intersects the conveying direction of the sheet), or a serial printer in which the head moves in the width direction of the sheet.
  • the printer 19 may be a color printer or a monochrome printer.
  • the printer 19 may be capable of forming any image, or may be capable of printing only characters.
  • For example, the scanner 21 images and scans an original placed on the original glass (hidden in FIG. 1) exposed from the top surface of the housing 17, using a plurality of image pickup elements (not shown) that move along the underside of the original glass.
  • the scanner 21 may have various configurations, for example, may be similar to known configurations.
  • the configuration of the UI section 23 is arbitrary.
  • the UI section 23 includes an operation section 33 (reference numeral shown in FIG. 2) that receives user operations, and a display section 35 (reference numeral shown in FIG. 2) that visually presents information to the user.
  • the UI section 23 may not be provided, or only one of the operation section 33 and the display section 35 may be provided.
  • the UI unit 23 may include an audio unit that presents information to the user by sound.
  • the UI section 23 may be defined to include the connector 37, unlike the description of the embodiment. This is because connecting a device to the connector 37 may be a type of inputting an instruction to the image processing device 3.
  • the configuration of the operation section 33 is arbitrary.
  • the operation unit 33 accepts, for example, a user's touch operation.
  • Such an operation section 33 may include, for example, a touch panel and/or one or more buttons.
  • A touch panel (reference numeral omitted) is illustrated as at least a part of the operation unit 33 of the image processing devices 3A and 3C.
  • a button 33a is illustrated as at least a part of the operation unit 33 of the image processing device 3B.
  • the button 33a may be a push button, a touch button, or another button.
  • the touch button may be a capacitive touch button or another touch button.
  • the image processing devices 3A and 3C may have buttons, and the image processing device 3B may have a touch panel.
  • the operation unit 33 may accept other types of operations such as voice operations.
  • the operation unit 33 may be used for various purposes. Typically, the operation unit 33 is used to instruct the image processing device 3 to execute processing related to the image processing unit 31. For example, by operating the operation unit 33, printing, scanning, and copying are performed, and settings related to these operations (for example, settings for paper selection, magnification, density, and/or color, etc.) are performed. In addition, for example, by operating the operation unit 33, access to data, transmission and reception of data, and input of authentication information may be performed.
  • the configuration of the display section 35 is arbitrary.
  • The display unit 35 may include at least one of a display capable of displaying an arbitrary image, a display capable of displaying only arbitrary characters, a display capable of displaying only specific characters and/or specific graphics, and an indicator light.
  • the image here is a concept that includes characters. Examples of displays that display arbitrary images or arbitrary characters include liquid crystal displays or organic EL (Electro Luminescence) displays that have a relatively large number of regularly arranged pixels. Furthermore, examples of displays that display specific characters and/or specific graphics include liquid crystal displays with a limited number and/or shape of pixels, or segment displays such as a 7-segment display. Segmented displays may take various forms, including liquid crystal displays. Examples of the indicator light include those including LEDs (Light Emitting Diodes). An appropriate number of indicator lights may be provided. In addition, in the following description, for convenience, expressions may be given on the premise that the display unit 35 can display any image.
  • 2.5. Detection unit that detects biological information
    2.5.1. Configuration of detection unit
  • various types of biometric information may be used for authentication. Therefore, the configuration of the detection section 25 may also be various. Furthermore, various detection units 25 may be used for the same type of biological information.
  • the basic configuration of the detection unit 25 may be the same as a known one.
  • the detection unit 25 may acquire an image related to biological information.
  • Examples of biological information obtained by acquiring images include fingerprints, palm shapes, retinas, irises, faces, blood vessels, and ear shapes.
  • a typical example of the detection unit 25 that acquires an image is an optical type.
  • the optical detection unit 25 includes an image sensor that detects light.
  • the light to be detected by the image sensor (in other words, the wavelength range) may be visible light or non-visible light (for example, infrared light).
  • the detection unit 25 may or may not have an illumination unit that irradiates the living body with light in the wavelength range detected by the image sensor.
  • the image may be a binary image, a grayscale image or a color image.
  • the detection unit 25 that acquires images may be of an ultrasonic type.
  • the ultrasonic detection unit 25 includes an ultrasonic element that transmits and receives ultrasonic waves.
  • The detection unit 25 including an ultrasonic element can acquire an image of the surface and/or internal shape of a living body. More specifically, the detection unit 25 transmits ultrasonic waves toward the living body and receives the reflected waves. An image that reflects the distance from the ultrasonic element (i.e., the shape of the living body) is acquired based on the time from transmission to reception.
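  • Although the disclosure does not state the conversion explicitly, the standard pulse-echo relation presumably underlies this step. As a sketch, letting v be the propagation speed of the ultrasonic wave in the medium and t the time from transmission to reception, the distance d to the reflecting surface is

      d = \frac{v \cdot t}{2}

    where the factor of 2 accounts for the wave traveling to the surface and back.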
  • the detection unit 25 that acquires images may be of a capacitive type.
  • the capacitive detection unit 25 includes a panel with which a living body comes into contact, and a plurality of electrodes arranged behind the panel and along the panel.
  • When a part of a living body (for example, a finger) comes into contact with the panel, an image is acquired based on the difference between the electric charge generated in the electrodes at the positions where the body is in contact (the positions of convex parts on the body surface) and the electric charge at the positions where the body is not in contact (the positions of concave parts on the body surface).
  • the detection unit 25 that acquires images may acquire a two-dimensional image by sequentially acquiring line-shaped images in the transverse direction of the line-shaped images (that is, scanning), A two-dimensional image may be acquired substantially in one time without performing such scanning. Scanning may be realized by the operation of the detection unit 25 or by moving the living body relative to the detection unit 25.
  • the former includes, for example, a mode in which a carriage including an image sensor or an ultrasonic device moves.
  • the plurality of ultrasound elements can also perform electronic scanning without mechanical movement.
  • An example of the detection unit 25 other than the configuration that acquires images is one that includes a microphone that acquires audio. Thereby, voice (for example, voiceprint) information as biometric information is acquired. Further, for example, the other detection unit 25 may be a touch panel that accepts writing with a touch pen. Thereby, handwriting information as biometric information is acquired.
  • the detection unit 25 may be used for purposes other than acquiring biological information. From another perspective, the detection unit 25 may be realized by a component provided in the image processing device 3 for a purpose other than acquiring biological information. Alternatively, the detection unit 25 may be structurally inseparably combined with other components.
  • the detection unit 25 that acquires an image may be realized by the scanner 21 (or the imaging unit 28), unlike the illustrated example. That is, when it is said that the image processing device has a scanner (or the imaging section 28) and a detection section, the two may be the same component. The same applies when other components are shared with the detection unit 25 (not limited to the one that acquires images).
  • the detection unit 25 may also be used as a button so that when a finger is placed on a button included in the operation unit 33, a fingerprint is detected.
  • An example of such a button and detection section 25 is the capacitive detection section 25 described above.
  • the button operation is detected by the sensor including the plurality of electrodes described above. Further, for example, the reception of handwriting may be realized by a touch panel included in the operation unit 33.
  • the position, orientation, etc. of the detection unit 25 are arbitrary.
  • the detection unit 25 may be fixed to the housing 17, or may be connected to the housing 17 so that its position and/or orientation can be changed. Alternatively, it may be detachable from the housing 17.
  • For example, the detection unit 25 (more precisely, a part directly involved in reading biometric information; for example, a detection surface on which a finger is placed when detecting a fingerprint; the same applies hereinafter in this paragraph) may be arranged adjacent to the UI section 23.
  • For example, the UI unit 23 and the detection unit 25 may be located together on the upper side of the housing 17, on the front side (the same side) of the document reading surface of the scanner 21 (the top surface of the glass plate) and the lid covering that surface.
  • the UI section 23 and the detection section 25 may be located on the same panel (as in the illustrated example), or may not be located on the same panel.
  • The panel may be fixed relative to the housing 17 (its main body portion) or may be movable. From another perspective, the UI unit 23 and the detection unit 25 may be positioned so that the user's standing position and/or face position when operating the UI unit 23 and when causing the detection unit 25 to read biometric information are approximately the same.
  • The detection surface on which the finger is placed may be subjected to antiviral treatment.
  • the detection surface is constituted by a plate-shaped member, and the material of this plate-shaped member may include a component that produces an antiviral effect.
  • the detection surface may be constituted by a film covering the above-mentioned plate-shaped member, etc., and the film may contain a component that produces an antiviral effect.
  • Components that produce antiviral effects include, for example, monovalent copper compounds and silver.
  • the type of virus to be targeted is arbitrary.
  • the antiviral property of the detection surface 25a may be such that the antiviral activity value is 2.0 or more in a test according to ISO (International Organization for Standardization) 21702, for example.
  • the detection surface 25a may have an antibacterial effect in addition to or instead of an antiviral effect.
  • the communication unit 27 is, for example, a portion of an interface for the image processing device 3 to communicate with the outside (for example, the public network 11) that is not included in the processing unit 29.
  • the communication unit 27 may include only hardware components, or may include a portion realized by software in addition to the hardware components. In the latter case, the communication section 27 may not be clearly distinguishable from the processing section 29.
  • the communication section 27 may have a connector or a port to which a cable is connected.
  • a port here is a concept that includes software elements in addition to a connector.
  • When performing wireless communication, the communication unit 27 includes, for example, an RF (Radio Frequency) circuit that converts a baseband signal into a high-frequency signal, and an antenna that converts the high-frequency signal into a wireless signal.
  • the communication unit 27 may include, for example, an amplifier and/or a filter.
  • the imaging unit 28 images the surroundings of the image processing device 3 as described above.
  • the image captured by the imaging unit 28 is used, for example, to understand the situation of people around the image processing device 3.
  • the situation of the person identified from the image is used, for example, to control the authentication state, as detailed in Section 5.1.
  • For example, when the image processing device 3 detects, based on the image captured by the imaging unit 28 while in the authenticated state, that the user who was in the authenticated state has left the image processing device 3, the image processing device 3 cancels the authenticated state. This improves security.
  • the imaging unit 28 may set an appropriate position (space) as an imaging range with respect to the image processing device 3 depending on its specific usage mode.
  • Examples of the imaging range include the following: the area around the UI section 23; a range that includes that surrounding area (the same range as the surrounding area, or a wider range); the range in which the face of the user operating the UI unit 23 is expected to appear; and a relatively wide range above and below the position where the user operating the UI unit 23 stands (or sits) (a position where there is a high probability of standing or sitting). Note that if the user's physique affects the range in which the face is captured, an appropriate user, such as an adult or a person of elementary school age or older, may be assumed.
  • the imaging range includes the surrounding area of the UI section 23, the UI section 23 (at least a portion thereof) itself may or may not be included in the imaging range.
  • the size of the surrounding area, etc. may be set as appropriate in light of the purpose of understanding the situation of a person using the image processing device 3 (for example, a person operating the UI unit 23).
  • the surrounding area is a range within a radius of 50 cm or within a radius of 1 m centered on the UI section 23 (its center), or a part of the range.
  • the term "UI section 23" may be replaced with the term "detection section 25" or "UI section 23 and detection section 25.”
  • The imaging range of the imaging unit 28 may remain unchanged in its positional relationship with the image processing device 3. More specifically, for example, the imaging unit 28 may be fixedly provided on the housing 17 of the image processing device 3 so that its position and orientation cannot be changed. Further, for example, the imaging unit 28 may be configured such that the imaging range cannot be changed by a zoom lens or the like (or the processing unit 29 may be configured such that control to change the imaging range is not performed). By doing so, for example, the status of the authenticated user and/or other users can be grasped with a certain degree of accuracy, and security is improved.
  • the imaging range may be variable.
  • In this case, for example, a configuration is adopted in which changing the imaging range merely reduces convenience and does not reduce security. For example, in a mode in which the authentication state is canceled when the face of the user who is in the authenticated state is not detected, the user sets the imaging range appropriately so that the user's own face is imaged, and it is difficult for other users to change the imaging range for improper purposes.
  • the imaging unit 28 may be coupled to the housing 17 so as to be able to change its position and/or orientation. In this case, the position and/or orientation may be changed manually or by a motor or the like.
  • the imaging unit 28 may be connected to the processing unit 29 by a cord and can be separated from the housing 17. The imaging unit 28 may be able to change its position and/or orientation depending on how it is placed on the housing 17 or its surroundings.
  • the imaging unit 28 may repeatedly (in other words, continuously) capture images over a period in which a predetermined condition is satisfied (for example, a period in which the authentication state is maintained).
  • the imaging interval (period) at this time may be set as appropriate depending on the specific usage mode of the imaging unit 28.
  • the imaging interval may be less than 1 second or more than 1 second.
  • The imaging by the imaging unit 28 may be such that it can be interpreted as acquiring a moving image, such that it can be regarded as repeatedly acquiring still images, or such that no such distinction can be made.
  • the imaging interval may be changeable.
  • the imaging unit 28 may not perform imaging periodically, but may perform imaging in response to a predetermined trigger (for example, an instruction to execute a function whose restriction has been released).
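  • A minimal sketch of this repeated or trigger-driven capture, assuming hypothetical capture(), handle_frame(), and is_authenticated() helpers and a 1-second default interval (the disclosure allows shorter, longer, and changeable intervals):

      # Hypothetical monitoring loop; all helper names are stand-ins.
      import time

      IMAGING_INTERVAL_S = 1.0  # may be less than or more than 1 second; may be changeable

      def monitor_while_authenticated(capture, handle_frame, is_authenticated):
          # Repeatedly capture images over the period in which the
          # authentication state (the predetermined condition) is maintained.
          while is_authenticated():
              frame = capture()    # repeated still images; may amount to a moving image
              handle_frame(frame)  # e.g., hand the frame to the monitoring unit 29b
              time.sleep(IMAGING_INTERVAL_S)

      def capture_on_trigger(capture, handle_frame):
          # Alternative: capture once in response to a predetermined trigger,
          # such as an instruction to execute a function whose restriction
          # has been released.
          handle_frame(capture())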
  • the imaging unit 28 is configured to include an imaging element, generates a two-dimensional image signal (data from another perspective), and outputs it to the processing unit 29. More specific configurations may include various configurations. For example, the imaging unit 28 may use pan focus or auto focus. The imaging unit 28 may or may not have a mechanical shutter.
  • the image sensor may be a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
  • the light to be detected by the image sensor (in other words, the wavelength range) may be visible light or non-visible light (for example, infrared light).
  • the image may be a binary image, a grayscale image or a color image.
  • the processing unit 29 has, for example, a configuration similar to that of a computer. Specifically, for example, the processing unit 29 includes a CPU (Central Processing Unit) 39, a ROM (Read Only Memory) 41, a RAM (Random Access Memory) 43, and an auxiliary storage device 45. The processing unit 29 is constructed by the CPU 39 executing programs stored in the ROM 41 and/or the auxiliary storage device 45. In addition to the portion constructed as described above, the processing section 29 may include a logic circuit configured to perform only certain operations.
  • the connector 37 is for connecting peripheral equipment to the image processing device 3, for example.
  • the connector 37 may be of various standards, for example, USB.
  • the detection unit 25A according to another example is illustrated as a peripheral device connected to the connector 37, as described above.
  • peripheral devices connected to the connector 37 include a USB memory, a card reader, and the imaging unit 28 (different type from the example shown in FIG. 2).
  • The above components are connected to each other by a bus 47 (FIG. 2). In FIG. 2, all the components are schematically connected to one bus 47.
  • multiple buses may be connected in any suitable manner.
  • an address bus, a data bus and a control bus may be provided.
  • a crossbar switch and/or a link bus may be applied.
  • FIG. 2 is just a schematic diagram. Therefore, for example, in reality, a plurality of various devices (for example, CPUs) may be distributed and provided.
  • the illustrated CPU 39 may be a concept including a CPU included in the printer 19 or the scanner 21.
  • An interface (not shown) may be interposed between the bus 47 and various devices (for example, the printer 19 or the scanner 21).
  • FIG. 2 has been described as showing the configuration of the image processing device 3.
  • FIG. 2 can be used as a block diagram showing the configuration of the servers 5 and 7 and the terminal 9 as appropriate.
  • the explanation of the components shown in FIG. 2 may also be applied to the components of the servers 5 and 7 and the terminal 9, as long as there is no contradiction.
  • the block diagram showing the configuration of the servers 5 and 7 and the terminal 9 may be obtained by omitting the printer 19 and scanner 21 from FIG. 2.
  • the block diagram showing the configuration of the servers 5 and 7 may be obtained by omitting the detection section 25, the imaging section 28, the operation section 33, and/or the display section 35 from FIG. 2.
  • the server 5 may perform authentication in response to requests from a plurality of image processing devices 3.
  • In the image processing device 3 (for example, a home-use device), the comparison table DT1 may be able to hold account information D1 and biometric information D2 for only one user. Even in such a mode, for convenience, it may be expressed that the account information D1 and the biometric information D2 are stored in association with each other "for each user."
  • the processing unit 29 is constructed, for example, by the CPU 39 executing a program.
  • the processing section 29 of the image processing device 3 includes a comparison section 29a, a monitoring section 29b, and a control section 29c.
  • a verification unit 5a is constructed in the server 5, for example.
  • The comparison unit 29a compares the biometric information (second biometric information) detected by the detection unit 25 during use with the biometric information D2 (first biometric information) recorded in the comparison table DT1. Then, the comparison unit 29a (from another perspective, the communication unit 27) transmits the account information D1 associated with the first biometric information that matches the second biometric information to the server 5.
  • The monitoring unit 29b monitors the situation of people around the image processing device 3 based on, for example, images captured by the imaging unit 28. For example, the monitoring unit 29b determines, based on images repeatedly captured by the imaging unit 28 in the authenticated state, whether the user who is in the authenticated state is present within a predetermined range based on the image processing device 3 (from another perspective, the imaging unit 28).
  • The control unit 29c controls the authentication state, for example, based on the authentication result D3 received from the server 5 and the monitoring result of the monitoring unit 29b. For example, if the authentication result D3 is positive (indicating that the authentication was successful), restrictions on a predetermined function related to image processing are lifted (from another point of view, the image processing device 3 is placed in an authenticated state). Further, for example, when the monitoring unit 29b determines that the authenticated user is no longer present, the authenticated state is canceled (from another point of view, the predetermined function is restricted).
  • the verification unit 5a performs authentication based on, for example, whether the account information D1 received from the image processing device 3 is registered in the verification table DT2. Then, the verification unit 5a transmits the authentication result D3 indicating the success or failure of authentication to the image processing device 3 as the transmission source.
  • the comparison table DT1 is stored, for example, in a nonvolatile memory (for example, the auxiliary storage device 45) that the image processing device 3 has.
  • the verification table DT2 is stored, for example, in a nonvolatile memory 5b (for example, an auxiliary storage device) of the server 5. Note that in a mode in which the server 5 performs authentication in response to requests from a plurality of image processing devices 3, one verification table DT2 may be provided for each image processing device 3, or a single verification table DT2 may hold account information D1 corresponding to a plurality of image processing devices 3.
  • the comparison table DT1 may be able to store two or more biometric information D2 in association with one account information D1.
  • the two or more biometric information D2 may be different biometric information of one user, for example. Examples of such biometric information include fingerprints of different fingers or fingerprints of the same finger acquired at different times. In the former case, for example, when authentication with one finger fails due to injury, aging, etc., authentication can be performed with another finger. In the latter case, the probability that authentication will fail when biometric information changes due to aging or the like is reduced.
  • two or more pieces of biometric information D2 associated with one account information D1 may belong to different people. That is, a "user” is not limited to a “person” and may be a concept that includes an "account” (from another perspective, a user group). However, in the description of the embodiment, for convenience and without particular notice, a mode in which one piece of biometric information D2 is associated with one piece of account information D1 may be taken as an example.
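  • as a non-limiting illustration (not part of the disclosed embodiment), the following minimal Python sketch shows one possible in-memory shape for a comparison table DT1 in which one piece of account information D1 is associated with two or more pieces of biometric information D2; the dictionary layout, the find_account helper, and equality-based matching are assumptions for illustration only, since real biometric matching is a similarity test.

```python
# Hypothetical sketch of a comparison table DT1 with several biometric
# templates (D2) per account (D1), e.g., fingerprints of different fingers.
dt1: dict[str, list[bytes]] = {
    "user-001": [b"template-index-finger", b"template-middle-finger"],
}

def find_account(detected: bytes) -> str | None:
    """Return the account whose registered templates match the detected one.
    Byte equality stands in for real similarity-based biometric matching."""
    for account_info, templates in dt1.items():
        if any(detected == t for t in templates):
            return account_info
    return None  # no matching first biometric information

# If one finger cannot be used (injury, aging, etc.), another registered
# template for the same account can still produce a match:
assert find_account(b"template-middle-finger") == "user-001"
```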
  • the monitoring unit 29b may perform monitoring using, for example, facial feature data detected from the image captured by the imaging unit 28.
  • the monitoring unit 29b may store facial feature data D4 specified from images captured at a predetermined time in the memory of the image processing device 3.
  • the above-mentioned predetermined time is, for example, approximately the same time as the time when biometric information is detected during use.
  • the approximately same period may be, for example, a period with a difference of 2 seconds or less before and after the biological information detection time, or a period of 1 second or less before and after the detection time.
  • the monitoring unit 29b may then compare the facial feature data D4 specified from the subsequently captured image with the facial feature data D4 stored in the memory.
  • the memory in which the facial feature data D4 is stored may be, for example, a volatile memory (eg, RAM 43). However, the memory may be a nonvolatile memory (for example, the auxiliary storage device 45).
  • FIG. 4 is a flowchart illustrating an overview of operations related to authentication in the image processing device 3 (from another perspective, the processing unit 29) and the server 5 (from another perspective, the verification unit 5a). Note that the various flowcharts, including FIG. 4, are conceptual illustrations of operational procedures to facilitate understanding; they do not necessarily correspond to actual procedures and may lack accuracy.
  • Steps ST1 to ST5 indicate procedures in advance preparation for authentication (initial registration).
  • Steps ST6 to ST11 show a procedure (procedure during use) in which the image processing device 3 requests authentication from the server 5 and executes an action according to the authentication result. Specifically, it is as follows.
  • the initial registration process including steps ST1 to ST5 is started, for example, by a predetermined operation on the operation unit 33 of the image processing device 3.
  • the operations here include not only operations on specific mechanical switches but also operations combined with a GUI (Graphical User Interface). The same applies to operations mentioned in other processes unless otherwise specified or unless a contradiction occurs.
  • in step ST1, the processing unit 29 of the image processing device 3 receives input of the user's account information D1 via the operation unit 33. Furthermore, in step ST2, the processing unit 29 controls the detection unit 25 to detect the user's biometric information D2. Note that the order of steps ST1 and ST2 may be reversed.
  • in step ST3, the processing unit 29 associates the acquired account information D1 and biometric information D2 with each other and adds them to the comparison table DT1.
  • in step ST4, the processing unit 29 transmits the account information D1 to the server 5 via the communication unit 27.
  • in step ST5, the server 5 adds the received account information D1 to the verification table DT2. Registration is completed through the above operations.
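  • a minimal sketch of this registration flow (steps ST1 to ST5) follows, reusing the dt1 dictionary from the sketch above; the helper names are hypothetical, and a direct function call stands in for the transmission from the image processing device 3 to the server 5.

```python
# Hypothetical sketch of initial registration (steps ST1-ST5). The arguments
# stand in for ST1 (input via the operation unit 33) and ST2 (detection by
# the detection unit 25); network transport is reduced to a function call.
dt2: set[str] = set()  # server-side verification table DT2: registered D1

def server_register(account_info: str) -> None:
    dt2.add(account_info)  # ST5: the server 5 adds D1 to DT2

def initial_registration(account_info: str, biometric: bytes) -> None:
    dt1.setdefault(account_info, []).append(biometric)  # ST3: link D2 to D1 in DT1
    server_register(account_info)  # ST4: transmit D1 via the communication unit 27

initial_registration("user-002", b"template-thumb")
assert "user-002" in dt2
```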
  • the initial registration shown in FIG. 4 can be said to be an operation of linking biometric information D2 to account information D1 to which biometric information D2 is not linked.
  • the account information D1 may be registered before the initial registration or may be unregistered.
  • Additional registration is an operation of further linking another biometric information D2 to the account information D1 to which the biometric information D2 is linked.
  • Replacement registration is an operation of replacing biometric information D2 linked to account information D1 with other biometric information D2. This can reduce the probability that authentication will fail due to changes in biometric information D2 due to physical condition or aging, for example.
  • in the replacement registration, in addition to or in place of the biometric information D2, it may be possible to change the account information D1 (for example, change the password).
  • in the registration procedure, steps may be taken to reduce the probability that a third party unrelated to the communication system 1 fraudulently obtains an account, and/or that a third party unrelated to an existing account illegally links biometric information to that account.
  • as such a procedure, for example, various known procedures may be applied.
  • the terminal 9 may be used for such a procedure.
  • the administrator of the image processing device 3 and/or the server 5 may intervene in the registration procedure.
  • the account information D1 and biometric information D2 for registration may be acquired by a terminal other than the image processing device 3 and added to the comparison table DT1 via the communication unit 27 or the connector 37.
  • the processing during use including steps ST6 to ST11 is started, for example, by a predetermined operation on the operation unit 33 of the image processing device 3.
  • in step ST6, the processing unit 29 of the image processing device 3 controls the detection unit 25 to detect the user's biometric information D2.
  • in step ST7, the processing unit 29 (comparison unit 29a) determines whether the biometric information D2 (second biometric information) detected in step ST6 is registered in the comparison table DT1 updated in step ST3.
  • if the determination is affirmative, the processing unit 29 transmits the associated account information D1 to the server 5 (step ST8).
  • if the determination is negative, the processing unit 29 proceeds to step ST11 without executing step ST8, and keeps the function restricted (does not release the restriction on the function).
  • the restriction on the function may be released when a predetermined condition is met.
  • in step ST9, the server 5 (verification unit 5a) determines whether the account information D1 received in step ST8 is registered in the verification table DT2 updated in step ST5 (that is, determines the success or failure of authentication). Then, the authentication result including information indicating the success or failure of authentication is transmitted to the image processing device 3 as the transmission source (step ST10).
  • in step ST11, the processing unit 29 of the image processing device 3 controls the release of restrictions on the functions of at least one of the image processing unit 31 and the communication unit 27 based on the authentication result. For example, if the authentication result is positive, the restriction on the predetermined function is released, and if the authentication result is negative, the restriction on the predetermined function is maintained.
  • only one type of function may require authentication, or two or more types may require it. Further, one authentication may permit repeated execution of one type of function, or execution of two or more types of functions. Alternatively, authentication may be requested each time a function is executed, for each type of function, or again when a function with a high security level is executed.
  • the biometric information acquired during use may be deleted from the image processing device 3 immediately after being compared with the registered biometric information (step ST7).
  • the biometric information acquired during use may be stored in the image processing device 3 until an appropriate time thereafter (for example, when the authentication state is canceled) and used as appropriate.
  • the biometric information acquired during use may be used to update the registered biometric information.
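  • a hedged sketch of the use-time flow described above (steps ST6 to ST11) follows, building on the dt1/dt2 sketches above; the function names and the direct call standing in for the server round trip are illustrative assumptions.

```python
# Hypothetical sketch of steps ST6-ST11: local comparison against DT1,
# transmission of D1, server-side verification against DT2, and control of
# the functional restriction according to the authentication result D3.
def server_verify(account_info: str) -> bool:
    return account_info in dt2  # ST9: is the received D1 registered in DT2?

def authenticate_on_device(detected: bytes) -> bool:
    account_info = find_account(detected)  # ST7: compare second biometric info with DT1
    if account_info is None:
        return False  # ST11: functions stay restricted; ST8 is not executed
    return server_verify(account_info)  # ST8-ST10: send D1, receive result D3

# ST11: release the predetermined functions only on a positive result.
restriction_released = authenticate_on_device(b"template-thumb")  # registered earlier
assert restriction_released
```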
  • FIG. 5 is a flowchart showing a specific example of steps ST6, ST7, and ST11 in FIG. 4. However, FIG. 5 may be regarded as a flowchart showing the procedure of another aspect similar to FIG. 4.
  • the process shown in FIG. 5 may be started at an appropriate time (from another point of view, in response to an appropriate trigger).
  • the process shown in FIG. 5 may be started when the user starts using the image processing device 3.
  • the process shown in FIG. 5 may be started when the image processing device 3 is powered on or when the image processing device 3 returns from standby mode.
  • the process shown in FIG. 5 may be started when the user attempts to use a function that requires authentication. In other words, when a user attempts to use a function that does not require authentication, the process shown in FIG. 5 does not need to be performed. Note that, for convenience, hereinafter, the process shown in FIG. 5 may be explained on the premise that the process shown in FIG. 5 is started when the user starts using the image processing device 3.
  • in step ST51, the image processing device 3 displays on the display unit 35 a screen that prompts the user to input biometric information.
  • in step ST52, the image processing device 3 determines whether biometric information has been input. The image processing device 3 proceeds to step ST53 when the determination is positive, and proceeds to step ST56 when the determination is negative.
  • in step ST53, the image processing device 3 determines whether biometric information that matches the biometric information input in step ST52 is registered in the comparison table DT1. The image processing device 3 proceeds to step ST54 when the determination is positive, and proceeds to step ST56 when the determination is negative.
  • in step ST54, the image processing device 3 transmits to the server 5 the account information associated, in the comparison table DT1, with the biometric information that matches the input biometric information. Note that steps ST51 to ST54 correspond to steps ST6 to ST8 in FIG. 4.
  • the image processing device 3 receives the authentication result from the server 5 (step ST10 in FIG. 4). Then, the image processing device 3 controls the release of the restriction on the function according to the authentication result (step ST55). That is, the image processing device 3 releases the functional restriction when the authentication result is positive, and maintains the functional restriction when the authentication result is negative.
  • in step ST56, the image processing device 3 determines whether account information has been input by operating the operation unit 33 or the like. The image processing device 3 proceeds to step ST57 when the determination is positive, and returns to step ST52 when the determination is negative. When returning to step ST52, the user may be notified by the display unit 35 or the like that the authentication has failed.
  • the authentication information associated with the biometric information D2 is not limited to account information.
  • Authentication information other than account information may be input to the image processing device 3 as appropriate.
  • the authentication information may be input by connecting a USB memory storing authentication information (for example, a static key or an electronic certificate) to the connector 37.
  • in the case of returning to step ST52, the state of functional restriction is not changed. Therefore, for example, in a mode in which the process shown in FIG. 5 is performed when the user starts using the image processing device 3, the user cannot use the image processing device 3 (from another perspective, all functions remain restricted). Further, for example, in a mode in which the process shown in FIG. 5 is started when the user attempts to use a function that requires authentication, the user cannot use that function. In this mode, the image processing device 3 may, instead of returning to step ST52, return to a step of selectively accepting the use of various functions, and instead of the screen prompting input of biometric information (step ST51), a screen that accepts instructions for using various functions may be displayed.
  • in step ST57, the account information input in step ST56 is transmitted to the server 5.
  • the information transmitted in steps ST54 and ST57 may or may not include information indicating whether the account information belongs to a biometrically authenticated user.
  • the image processing device 3 receives the authentication result from the server 5 (step ST10 in FIG. 4).
  • the image processing device 3 controls the release of the restriction on the function according to the authentication result (step ST58). That is, the image processing device 3 cancels the restriction on the function when the authentication result is positive, and maintains the restriction on the function when the authentication result is negative.
  • in steps ST55 and ST58, the release of functional restrictions is controlled according to the received authentication result.
  • the control for releasing restrictions on functions in steps ST55 and ST58 may be the same or different. That is, depending on the presence or absence of biometric authentication, the control for releasing restrictions on functions may be the same or different. Examples of the latter include the following (see also Section 6.1.2):
  • for example, there is a mode in which the one or more functions whose restrictions are released in step ST58 are a subset of the plurality of functions whose restrictions are released in step ST55. That is, in step ST55, restrictions on more functions may be lifted than in step ST58.
  • the description here assumes, for example, that the control for releasing restrictions on functions is the same for all users.
  • alternatively, a mode is assumed in which the control for releasing restrictions on functions differs for each user (see Section 6.2, etc.), focusing on a specific user or on one or more users having a specific authority among multiple types of authority.
  • in such a mode, when biometric authentication succeeds and the restriction is released in step ST55, a first function may be restricted or released depending on the user (according to the user's authority).
  • on the other hand, when the restriction is released in step ST58 without biometric authentication, the first function may remain restricted for any user (regardless of the user's authority).
  • the image processing device 3 may perform restriction release control depending on the presence or absence of biometric authentication, which it knows directly.
  • the transmission information transmitted in steps ST54 and ST57 may include information indicating the presence or absence of biometric authentication in addition to account information.
  • the server 5 may select a function whose restrictions are lifted depending on the presence or absence of biometric authentication, and may include information indicating the selection result in the authentication result.
  • the image processing device 3 may perform restriction release control according to the selection result included in the authentication result (see also Section 6.2).
  • the account information input in step ST56 may or may not be account information stored in the comparison table DT1.
  • for example, the image processing device 3 may determine whether the account information input in step ST56 is registered in the comparison table DT1 (that is, perform local authentication based on the account information), and transmit the account information to the server 5 only when it is registered. In this case, authentication by both the image processing device 3 and the server 5 is required, which increases security.
  • conversely, if such local authentication is not performed, it is sufficient that the account information is registered in the server 5, so convenience is improved for users who are not registered in the image processing device 3.
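  • the following sketch contrasts these two variants; require_local is a hypothetical switch, and the helpers are reused from the sketches above.

```python
# Hypothetical sketch of account-based authentication (steps ST56-ST58) with
# an optional local pre-check against DT1 before contacting the server 5.
def authenticate_by_account(account_info: str, require_local: bool) -> bool:
    if require_local and account_info not in dt1:
        return False  # local authentication failed; nothing is sent to the server
    return server_verify(account_info)  # ST57: server-side authentication

# require_local=True:  both the device and the server must know the account
#                      (higher security).
# require_local=False: registration in the server 5 alone suffices (higher
#                      convenience for users not registered in the device).
assert authenticate_by_account("user-002", require_local=True)
```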
  • FIG. 6 is a schematic diagram showing an example of an image displayed on the screen 35a of the display unit 35 when prompting the user for authentication.
  • Image IG1 in FIG. 6 is displayed, for example, when the user starts using the image processing device 3 or when the user attempts to use a function that requires authentication.
  • Image IG1 prompts the user to select an authentication method by displaying "Please select an authentication method.”
  • the user can select an authentication method by, for example, performing a predetermined operation (for example, tapping) on a button labeled "ID, password input” or "authentication using biometric information.”
  • when "ID, password input" is selected, image IG3 is displayed on the screen 35a.
  • when "authentication using biometric information" is selected, image IG5 is displayed on the screen 35a.
  • Image IG3 prompts the user to input the ID and password by displaying "Please enter your ID and password." For example, the user selects the input fields associated with "ID" and "password" by tapping, and then inputs characters (a broad concept including numbers and symbols) by operating a mechanical switch or a software keyboard. Thereafter, when a predetermined operation (for example, a tap) is performed on the "execute" button, authentication using the input ID and password is started. When a predetermined operation (for example, a tap) is performed on the "back" button, image IG1 is redisplayed. Note that if account information is not input for a predetermined period, image IG1 may also be redisplayed.
  • Image IG5 prompts the user to input biometric information by displaying "Please let me read your fingerprint.”
  • biometric information may be other than fingerprints.
  • the user can have his or her fingerprint read by placing his or her finger on the detection unit 25. The finger at this time is the finger whose fingerprint was read during registration (step ST2).
  • when biometric information is detected (affirmative determination in step ST52), authentication is started.
  • when a predetermined operation (for example, a tap) is performed on the "back" button, image IG1 is redisplayed. Note that if no biometric information is detected for a predetermined period, image IG1 may also be redisplayed.
  • displaying the screen in step ST51 may be regarded as the start of controlling the display of images IG1, IG3, and IG5 shown in FIG. 6. Then, steps ST52, ST53, and ST56 may be performed in parallel with the display control (including display switching) of the images IG1, IG3, and IG5 described above.
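  • the display switching among images IG1, IG3, and IG5 can be pictured as the small state machine sketched below; the event names are assumptions, and the transitions follow the description above.

```python
# Hypothetical sketch of the screen flow among images IG1, IG3, and IG5.
def next_screen(current: str, event: str) -> str:
    transitions = {
        ("IG1", "select_id_password"): "IG3",  # "ID, password input" selected
        ("IG1", "select_biometric"):   "IG5",  # "authentication using biometric information"
        ("IG3", "back"):               "IG1",  # "back" button
        ("IG3", "timeout"):            "IG1",  # no account input for a predetermined period
        ("IG5", "back"):               "IG1",
        ("IG5", "timeout"):            "IG1",  # no biometric detected for a predetermined period
    }
    return transitions.get((current, event), current)  # otherwise stay on the screen

assert next_screen("IG1", "select_biometric") == "IG5"
assert next_screen("IG5", "timeout") == "IG1"
```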
  • steps ST51 to ST55 and steps ST56 to ST58 need not be combined with each other; they may be performed separately.
  • in FIG. 5, account information is input on the premise that biometric information has not been detected or that biometric authentication has failed; however, account information may be input without such a premise. Specifically, this is as follows.
  • image IG1 may be displayed before step ST51. Then, when "authentication using biometric information" is selected, image IG5 may be displayed in step ST51. Then, the determination in step ST52 may be performed. When the determination in step ST52 is negative, the image processing device 3 may repeat step ST52 without proceeding to step ST56. Then, when the negative determination is repeated over a predetermined period, a message that the time has run out may be displayed on the display unit 35 (or not displayed), and the image IG1 may be redisplayed.
  • when an affirmative determination is made in step ST52 and a negative determination is made in step ST53 in the above flow, for example, the image processing device 3 may, without proceeding to step ST56, display on the display unit 35 a message indicating that biometric authentication has failed, and then return to the step of displaying image IG1 (before step ST51). Alternatively, after displaying on the display unit 35 that biometric authentication has failed, the image processing device 3 may display on the display unit 35 an image similar to image IG5 that prompts re-reading of the fingerprint, and then return to step ST52.
  • when "ID, password input" is selected in image IG1, image IG3 may be displayed and step ST56 may be performed.
  • thereafter, when the determination in step ST56 is negative, the image processing device 3 may repeat step ST56 without passing through step ST52. Then, when the negative determination is repeated over a predetermined period, a message that the time has run out may be displayed on the display unit 35 (or not displayed), and image IG1 may be redisplayed.
  • alternatively, image IG5 may be displayed in step ST51 without first displaying image IG1. If a negative determination is made in step ST52 because biometric information cannot be detected for a predetermined period, or if a negative determination is made in step ST53, then after or together with a display to that effect, image IG3 or an image similar to image IG3 may be displayed, and step ST56 may be performed. Thereafter, when the determination in step ST56 is negative, the image processing device 3 may repeat step ST56 without passing through step ST52. Then, when the negative determination is repeated over a predetermined period, a message that the time is up may be displayed on the display unit 35 (or not displayed), and the process may return to step ST51, where image IG5 is displayed again.
  • FIG. 5 (and FIG. 6) may be interpreted in this way: steps ST51 to ST55 may be regarded as showing the procedure when biometric authentication succeeds (affirmative determinations in steps ST52 and ST53), and steps ST56 to ST58 may be regarded as showing the procedure when biometric authentication is not performed or fails.
  • FIG. 7 is a schematic diagram showing an example of an image (image IG10) displayed on the screen 35a of the display unit 35 when biometric authentication fails.
  • Image IG10 shows options (buttons) that the user can select. There are three options: "Re-read fingerprint," "Authenticate with ID and password," and "Authenticate with card."
  • when "Re-read fingerprint" is selected, the image processing device 3 displays image IG5 on the screen 35a (step ST51) and proceeds to step ST52.
  • when "Authenticate with ID and password" is selected, the image processing device 3 displays image IG3 on the screen 35a and proceeds to step ST56.
  • when "Authenticate with card" is selected, the image processing device 3 displays on the screen 35a an image prompting the user to have an authentication card read by a card reader (not shown) included in the image processing device 3. The image processing device 3 then executes processing for performing authentication using the authentication card instead of step ST56.
  • image IG5 is an image that prompts the user to input (re-enter) biometric information; however, image IG10 itself, which includes the option "Re-read fingerprint," can also be regarded as an image that prompts the user to input (re-enter) biometric information.
  • the information communicated to the user when biometric authentication fails may include, for example, a prompt to retry biometric authentication and/or an inquiry as to whether to switch to authentication other than biometric authentication.
  • another authentication method may include connecting a USB memory in which information necessary for authentication is recorded. Further, another authentication may be biometric authentication using biometric information other than a fingerprint.
  • in the illustrated example, options for each type of authentication are shown directly as the options for performing authentication other than fingerprint authentication. However, the options for each type of authentication may instead be displayed after the user first selects to perform another authentication. Although not particularly shown, an option to give up authentication may also be displayed.
  • in the above, when biometric authentication (local authentication) fails, an image such as image IG10, which asks whether to retry the same authentication and/or perform another authentication, is displayed.
  • a similar display may be made when the authentication result received from the server 5 is negative.
  • canceling the authentication state can be rephrased as returning to a state where authentication is not performed.
  • Canceling the authentication state may involve terminating functions that require authentication (for example, the VPN connection described below) and/or invalidating (for example, erasing from memory) information acquired on the premise of authentication (for example, the authority information described later). Therefore, cancellation of the authentication state may be recognized by the termination of these operations and/or the invalidation of such information.
  • alternatively, in a mode in which a flag indicating that authentication has been successful is set, cancellation of the authentication state may be recognized by the action of clearing that flag. In this case, it is not necessary to terminate operations based on the authentication and/or invalidate information acquired based on the authentication.
  • the authentication state may be canceled using various events as triggers. Examples of such events include the following: the user has performed a predetermined operation on the operation unit 33; a process (sometimes called a "task") for which the image processing device 3 requested detection of biometric information when the user attempted to use a function requiring authentication (for example, a function to download and print predetermined image data) has been completed; or a predetermined time has elapsed since a predetermined point in time (for example, the time of the last operation on the operation unit 33).
  • for the predetermined operation for canceling the authentication state, detection of biometric information by the detection unit 25 may be required (of course, it need not be required).
  • in that case, a display (for example, an image) prompting such detection may be presented.
  • the trigger for canceling the authentication state may be that biometric information of a user who has already been successfully authenticated is detected again.
  • in this way, the probability that the authentication state will be canceled unintentionally is reduced. More specifically, for example, the probability of cancellation due to an erroneous operation on the operation unit 33 is reduced.
  • the image processing device 3 may perform operations for a relatively long time. Such operations include, for example, scanning multiple pages, printing multiple pages, transmitting large amounts of data, and/or receiving large amounts of data. When the user leaves the image processing device 3 during execution of such an operation, the probability that the authentication state will be canceled by a third party or the like is reduced.
  • the image processing device 3 may control maintenance and cancellation of the authentication state based on the image captured by the imaging unit 28 during the authentication state.
  • the authenticated state may be canceled when conditions including the absence of the authenticated user are satisfied.
  • FIGS. 8 and 9 are flowcharts showing an example of a procedure of processing executed by the image processing device 3 (processing unit 29).
  • as indicated by the symbol "A", the process in FIG. 8 is performed after step ST55 in FIG. 5 and is executed when the authentication result from the server 5 is positive (that is, when the authentication state is started through biometric authentication).
  • the process in FIG. 9 is executed when an affirmative determination is made in step ST87 in FIG. 8, as indicated by the common "B" in FIGS. 8 and 9.
  • in step ST81, the image processing device 3 uses the imaging unit 28 to image the imaging range. As described above, this imaging range is set so that, for example, the face of the user using the image processing device 3 can be imaged. Further, step ST81 is performed immediately after authentication via biometric authentication succeeds (from another point of view, approximately at the same time as the authentication state is started), so basically the authenticated user is imaged.
  • in step ST82, the image processing device 3 extracts facial feature data D4 (FIG. 3) from the image captured in step ST81 and stores it in the memory (for example, the RAM 43).
  • in step ST83, the image processing device 3 uses the imaging unit 28 to image the imaging range.
  • in step ST84, the image processing device 3 determines whether the facial feature data D4 obtained from the image captured in step ST83 matches the facial feature data D4 saved in step ST82. The image processing device 3 proceeds to step ST85 when the determination is negative, and proceeds to step ST87 when the determination is affirmative.
  • in this way, steps ST83 and ST84 are repeated.
  • the period at this time is arbitrary. For example, the period may be less than 1 second or more than 1 second.
  • when the facial feature data D4 of the authenticated user is no longer detected, the authentication state may be canceled (step ST86).
  • in the illustrated example, however, the authentication state is not canceled immediately when the user's facial feature data D4 ceases to be detected; it is canceled only when the data remains undetected for a predetermined time. This eliminates the inconvenience that the authentication state is canceled even when the user turns away from the imaging unit 28 for a short time for some reason (for example, paper feeding work).
  • in step ST85, the image processing device 3 determines whether the time counted by a predetermined timer has reached a predetermined time. This timer is in its initial state before step ST82, and is reset to the initial state when an affirmative determination is made in step ST84. The timer starts counting when a negative determination is made in step ST84 while the timer is in the initial state, and continues counting when a negative determination is made in step ST84 while the timer is not in the initial state. The image processing device 3 returns to step ST83 when a negative determination is made in step ST85, and proceeds to step ST86 when an affirmative determination is made. In step ST86, the image processing device 3 cancels the authentication state.
  • the predetermined time may be set by, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, or an individual user.
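  • a minimal sketch of this monitoring loop (steps ST83 to ST86) follows; capture_features is a hypothetical stand-in for the imaging unit 28 plus feature extraction, and the timer logic mirrors step ST85.

```python
import time

# Hypothetical sketch of the loop ST83-ST86: the saved facial feature data D4
# is compared against repeatedly captured frames; the authentication state is
# canceled only after D4 stays undetected for a predetermined grace time.
def monitor(saved_d4: bytes, capture_features, grace_seconds: float) -> None:
    absent_since = None  # the timer in its initial state
    while True:
        features_in_frame = capture_features()  # ST83: image the imaging range
        if saved_d4 in features_in_frame:       # ST84: matching face present?
            absent_since = None                 # reset the timer to the initial state
        elif absent_since is None:
            absent_since = time.monotonic()     # start counting
        elif time.monotonic() - absent_since >= grace_seconds:  # ST85
            print("authentication state canceled")  # ST86
            return
        time.sleep(0.5)  # the repetition period is arbitrary (under or over 1 s)
```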
  • in step ST87, the image processing device 3 determines whether a predetermined process (for example, a process related to at least one of the image processing unit 31 and the communication unit 27) has been instructed by a user different from the authenticated user. In other words, it determines whether an interrupt process (interrupt operation) has been performed.
  • for example, the image processing device 3 specifies the positional relationship between the image processing device 3 and one or more imaged persons based on the image captured in step ST83, identifies the operator of the UI unit 23 based on that positional relationship, and determines whether the identified operator is the person determined in step ST84 to have facial feature data matching the stored data (facial feature data D4). In identifying the operator, for example, the person closest to the image processing device 3, the UI unit 23, and/or the imaging unit 28 may be identified as the operator.
  • two or more pieces of facial feature data D4 may be detected from the image captured in step ST83. In that case, in step ST84, it may be determined whether facial feature data matching the data stored in step ST82 exists among them (regardless of the number of detected pieces of facial feature data D4).
  • when the determination in step ST87 is negative (when it is determined that no interruption has been made), the image processing device 3 returns to step ST83, and the authentication state is maintained. When the determination is affirmative, the process proceeds to step ST88.
  • in step ST88, the image processing device 3 cancels the authentication state.
  • the cancellation here is not a complete cancellation, but a temporary one that allows the user to return to the authenticated state later without going through authentication.
  • for example, three flag states indicating the authentication status may be prepared: authenticated, canceled, and temporarily canceled; here, the flag corresponding to temporary cancellation of the authentication state is set.
  • the image processing device 3 may limit the functions in the same way as when the flag for canceling the authentication state is set.
  • information necessary for restoring the authentication state may be stored as appropriate. For example, the facial feature data D4 acquired in step ST82 may remain saved.
  • information identifying the user who was in the authenticated state and/or information identifying the functions whose restrictions were released for that user may remain saved.
  • in step ST89, the image processing device 3 executes the interrupt process that caused the affirmative determination in step ST87.
  • in step ST90, the image processing device 3 determines whether the interrupt process has ended. The image processing device 3 continues the interrupt process when the determination is negative, and proceeds to step ST91 when the determination is affirmative.
  • in steps ST91 and ST92, processing similar to that in steps ST83 and ST84 is performed.
  • when the determination in step ST92 is affirmative, the image processing device 3 restores the authentication state (step ST93).
  • specifically, the image processing device 3 may set the above-mentioned flag to the authenticated state and release the restriction on the predetermined function, as before step ST88.
  • the image processing device 3 then returns to step ST83 (or ST87) again.
  • when the determination in step ST92 is negative, the image processing device 3 completely cancels the authentication state (step ST94).
  • step ST94 may be the same process as step ST86 in FIG. 8 (it may be interpreted as proceeding to step ST86 after a negative determination in step ST92).
  • in this case, the image processing device 3 returns to step ST51 (step ST6).
  • when the process returns to step ST83 via step ST93 and an affirmative determination is made in step ST84, if the other user is already no longer regarded as the operator, a negative determination is made in step ST87 and the authentication state is maintained. If the other user is still regarded as the operator (for example, when the other user performs the next interrupt process), an affirmative determination is made again in step ST87 and the authentication state is canceled again.
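  • the temporary cancellation and restoration just described (steps ST88 to ST94) can be sketched as below; the three-state flag follows the mode mentioned above, and the callables are hypothetical stand-ins.

```python
# Hypothetical sketch of interrupt handling in FIG. 9 using a three-state flag.
AUTH, TEMP_CANCELED, CANCELED = "authenticated", "temporarily-canceled", "canceled"

def handle_interrupt(run_interrupt_task, authenticated_user_returned) -> str:
    state = TEMP_CANCELED              # ST88: temporary cancellation; D4 etc. stay saved
    run_interrupt_task()               # ST89/ST90: execute the other user's process
    if authenticated_user_returned():  # ST91/ST92: matching facial feature data again?
        state = AUTH                   # ST93: restore the state without re-authentication
    else:
        state = CANCELED               # ST94: complete cancellation (as in step ST86)
    return state

# Usage: the original user is still present after the interruption.
assert handle_interrupt(lambda: None, lambda: True) == AUTH
```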
  • in the monitoring based on captured images, the facial feature data D4 need not be used.
  • for example, by tracking persons across the captured images, the presence or absence and the position of the authenticated user (the person imaged when authentication was performed) may be determined, or an interruption by another user (a person imaged after authentication) may be identified.
  • in this case, in step ST84, it may be determined not whether facial feature data matching that of step ST82 is detected, but whether the contour of the person regarded as being in the authenticated state can be tracked (for example, whether the contour is within the imaging range). When tracking becomes impossible, the timer may start measuring time. After the timer starts, it may be determined in step ST84 whether facial feature data matching that of step ST82 is detected. Then, when an affirmative determination is made in step ST84 and the timer is reset to the initial state, step ST84 may again be switched to determining whether the contour can be tracked.
  • in the description above, the imaged person (more specifically, the facial feature data D4 in the illustrated example) is unconditionally regarded as the authenticated user.
  • however, a person in a predetermined positional relationship with the image processing device 3 may be specified (the above explanation regarding identification of the operator may be referred to), and this person may be identified as the authenticated user.
  • in this case, the contour of this person may be tracked, and/or the facial feature data D4 of this person may be stored in step ST82.
  • Step ST85 may be omitted, and the authentication state may be canceled without waiting for the predetermined time to elapse. Further, steps ST88 to ST93 may be omitted, and the authentication state may be completely canceled at the stage when an interruption is detected. Further, if, while step ST85 is repeated and the elapse of the predetermined time is awaited, the presence of a person different from the authenticated user is detected, the authentication state may be canceled without waiting for the predetermined time to elapse.
  • in FIGS. 5, 8, and 9, only users who have passed authentication including biometric authentication are monitored by the imaging unit 28. As a result, for example, monitoring is performed only when many functions are unrestricted (from another point of view, when a high security level is required), which reduces the load on the image processing device 3 and efficiently increases security.
  • however, the processes in FIGS. 8 and 9 may also be applied to users who have been brought into an authenticated state by other authentication methods. For example, the processes in FIGS. 8 and 9 may be executed not only after the restriction is released in step ST55 but also after the restriction is released in step ST58 (when account information is input and authentication succeeds).
  • in the above description, control of the authentication state according to the situation of persons in captured images is performed based on the presence or absence and/or the position of the authenticated user and/or other users (or the faces of these users).
  • however, control may also be performed according to a user's motion or posture.
  • for example, the authentication state may be canceled when a specific action (for example, an arm-waving action) by the authenticated user is detected.
  • the image processing device 3 may request re-input of biometric information while in an authenticated state after biometric authentication, and maintain the authenticated state when biometric authentication (local authentication) based on the re-input biometric information succeeds. This improves security, for example. An example of the procedure is described below.
  • FIG. 10 is a flowchart illustrating an example of a procedure of processing executed by the image processing device 3 (processing unit 29) to control the authentication state. For example, as indicated by the symbol "A", this process is performed after step ST55 in FIG. 5 and, like the process in FIG. 8, is executed when the authentication result from the server 5 is positive (that is, when the authentication state is started via biometric authentication).
  • in step ST101, the image processing device 3 determines whether a condition (re-input condition) that requires re-input of biometric information to maintain the authentication state is satisfied. When the determination is negative, the image processing device 3 repeats step ST101 (stands by), and the authentication state is maintained. When the determination is affirmative, the process proceeds to step ST102.
  • the re-input conditions may be set as appropriate.
  • the re-input condition may include that a predetermined period of time has passed since the detection of the most recent biological information by the detection unit 25.
  • the most recent detection of biometric information here is, for example, the first detection in step ST52 (detection for starting the authentication state) or the detection in step ST103 (detection for maintaining the authentication state) described later.
  • the predetermined time may be set by, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, or an individual user.
  • the re-input condition may include that an operation instructing the execution of a predetermined function has been performed on the UI unit 23.
  • the predetermined function is, for example, a function related to at least one of the image processing unit 31 and the communication unit 27, and may be included in the one or more functions whose restrictions are released when the authentication result D3 from the server 5 is positive. All of those one or more functions may be selected as the predetermined functions, or only some functions (for example, functions requiring higher security) may be selected.
  • the predetermined function may be selected by, for example, the manufacturer of the image processing device 3, the administrator of the image processing device 3, or an individual user.
  • in step ST102, the image processing device 3 displays on the display unit 35 an image that prompts the user to re-input the biometric information.
  • This image may be, for example, image IG5 in FIG. 6 or something similar thereto.
  • in step ST103, the image processing device 3 determines whether biometric information has been re-input within a predetermined time after execution of step ST102. The image processing device 3 proceeds to step ST104 when the determination is positive, and proceeds to step ST105 when the determination is negative.
  • in step ST104, the image processing device 3 determines whether the re-input biometric information matches the biometric information (first biometric information) in the comparison table DT1 that was determined in step ST53 to match the previously input biometric information (second biometric information). The image processing device 3 returns to step ST101 when the determination is positive, and proceeds to step ST105 when the determination is negative. When returning to step ST101, the authentication state is maintained.
  • in step ST105, the image processing device 3 cancels the authentication state. Note that step ST105 is the same process as step ST86 in FIG. 8.
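  • a sketch of this re-input flow (steps ST101 to ST105) follows, assuming the re-input condition is the elapse of a predetermined time since the most recent biometric detection; detect_reinput is a hypothetical callable that returns the re-read template, or None when nothing is read within the predetermined time.

```python
import time

# Hypothetical sketch of FIG. 10: the authentication state is maintained only
# while periodically re-input biometric information keeps matching the first
# biometric information matched at login; the loop exits only via ST105.
def maintain_with_reinput(matched_template: bytes, detect_reinput,
                          reinput_interval: float) -> bool:
    last_detection = time.monotonic()
    while True:
        if time.monotonic() - last_detection < reinput_interval:
            time.sleep(0.1)            # ST101: re-input condition not met; stand by
            continue
        reinput = detect_reinput()     # ST102/ST103: prompt and wait for re-input
        if reinput is None or reinput != matched_template:  # ST103/ST104 negative
            return False               # ST105: cancel the authentication state
        last_detection = time.monotonic()  # match: back to ST101, state maintained
```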
  • the image processing device 3 may cancel the authentication state when the human sensor detects that the user has left the image processing device 3, as described below.
  • a means other than the imaging unit 28 will be taken as an example of the human sensor.
  • the human sensor may be the imaging unit 28.
  • FIG. 11 is a schematic diagram for explaining an example of canceling the authentication state when the user leaves the image processing device 3.
  • while the user U1 is located around the image processing device 3, the authentication is not canceled.
  • the cancellation of authentication involves disconnection of the VPN connection.
  • the fact that the VPN connection is enabled between the image processing device 3 and the server 5 indicates that the authentication has not been canceled.
  • when the user U1 moves away from the image processing device 3, the authentication is canceled because the user U1 has left. As a result, the VPN connection is disconnected, and the image processing device 3 is simply connected to the public network 11.
  • the image processing device 3 includes a human sensor 51, and based on the detection result of the human sensor 51, it is detected that the user U1 has left the image processing device 3.
  • the human sensor 51 may have various configurations.
  • the object directly detected by the human sensor 51 may be various types, for example, infrared rays, ultrasonic waves, and/or visible light.
  • the human sensor 51 that detects infrared rays detects, for example, infrared rays (heat from another point of view) emitted from a person or the like.
  • the human sensor 51 that detects ultrasonic waves for example, transmits ultrasonic waves in a predetermined direction or range and detects the reflected waves.
  • the human sensor 51 that detects visible light detects visible light reflected from a person or the like or visible light that is not blocked by a person or the like.
  • the human sensor 51 may be a device that detects a person within a predetermined distance on a straight line extending from the human sensor 51 (it is not necessary to be able to distinguish between people and other objects; the same applies hereinafter), or a device that detects a person within a cone-shaped area extending from the human sensor 51. Furthermore, the human sensor 51 may detect the presence of a person itself and/or detect the movement of a person. The human sensor 51 may detect a person based on a difference between a physical quantity of the person (for example, the amount of heat) and a physical quantity of the surroundings, or may detect a person not based on such a difference.
  • the range in which a person is detected by the human sensor 51 may be set appropriately for the image processing device 3. As already mentioned, this range may be, for example, a linear range or a cone-shaped range, and its width may be set as appropriate. In the illustrated example, the detection range is set on the side of the image processing device 3 where the UI unit 23 (operation unit 33 and/or display unit 35) and/or the detection unit 25 are located in plan view.
  • the trigger for canceling authentication may be that a predetermined period of time has elapsed since the last operation on the operation unit 33 was performed. This may be considered as a type of determination result that the user U1 has left the image processing device 3.
  • FIG. 12 is a flowchart illustrating an example of a procedure of processing executed by the image processing device 3 (processing unit 29) in order to realize the above-described operation of canceling authentication.
  • as indicated by the symbol "A", this process is performed after step ST55 in FIG. 5 and is executed when the authentication result from the server 5 is positive (that is, when the authentication state is started via biometric authentication).
  • the process in FIG. 12 may be executed also when the authentication state is started by another authentication method.
  • the process in FIG. 12 may be executed not only after the restriction is removed in step ST55 but also after the restriction is removed in step ST58.
  • in step ST41, the image processing device 3 determines whether a person is detected by the human sensor 51. The image processing device 3 proceeds to step ST42 when the determination is positive, and proceeds to step ST43 when the determination is negative.
  • in step ST42, the image processing device 3 determines whether a predetermined reset button has been long-pressed.
  • This operation is an example of an operation for instructing cancellation of authentication.
  • the reset button may be a single button 33a (see FIG. 1) or a button on a touch panel (not shown).
  • the image processing device 3 proceeds to step ST43 when the determination is positive, and proceeds to step ST44 when the determination is negative.
  • in step ST43, the image processing device 3 sets a release flag. That is, when an event that triggers cancellation of authentication occurs (when a negative determination is made in step ST41 or an affirmative determination is made in step ST42), the release flag is set. The image processing device 3 then skips steps ST44 and ST45 and proceeds to step ST46.
  • in step ST44, the image processing device 3 determines whether execution of a task such as printing (processing that uses functions related to the image processing unit 31 or the like) has been requested. The image processing device 3 proceeds to step ST45 when the determination is positive, and skips step ST45 and proceeds to step ST46 when the determination is negative.
  • in step ST45, the image processing device 3 starts the requested task (the processing unit 29 instructs each section to perform operations related to the task).
  • in step ST46, the image processing device 3 determines whether the release flag is set. The image processing device 3 returns to step ST41 when the determination is negative, and proceeds to step ST47 when the determination is affirmative. When returning to step ST41, the authentication state is maintained.
  • in step ST47, the image processing device 3 determines whether a task is being executed.
  • the image processing device 3 stands by (repeats step ST47) when the determination is positive, and proceeds to step ST48 when the determination is negative.
  • in step ST48, the image processing device 3 cancels the authentication state. This process may be similar to step ST86 in FIG. 8.
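  • the loop of FIG. 12 (steps ST41 to ST48) is sketched below; the sensor, button, and task callables are hypothetical stand-ins, and the release flag and task-wait behavior follow the steps above.

```python
# Hypothetical sketch of FIG. 12: the release flag is set when the human
# sensor 51 no longer detects a person (ST41) or when the reset button is
# long-pressed (ST42); cancellation waits for a running task to finish (ST47).
def auth_session(person_detected, reset_long_pressed,
                 task_requested, start_task, task_running) -> None:
    release_flag = False
    while not release_flag:                  # ST46: maintained while flag is clear
        if not person_detected():            # ST41: user has left the device
            release_flag = True              # ST43
        elif reset_long_pressed():           # ST42: explicit cancel instruction
            release_flag = True              # ST43
        elif task_requested():               # ST44: e.g., printing requested
            start_task()                     # ST45
    while task_running():                    # ST47: wait for the task to finish
        pass
    print("authentication state canceled")   # ST48 (similar to step ST86)
```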
  • Removal of functional restrictions based on the authentication result may be performed in various ways. An example is shown below.
  • the function whose restriction is controlled to be lifted based on the authentication result may be, for example, a function related to at least one of the image processing section 31 (printer 19 and/or scanner 21) and the communication section 27.
  • Examples of restricted functions include the following. One or more of the functions listed below may be appropriately selected and set as restriction targets. Note that the functions listed below may overlap with one another or may be inseparable from one another.
  • printing by the printer 19 can be cited as a function to be restricted.
  • Printing may be restricted for each subdivided function.
  • for example, printing may be subdivided into printing based on scanning by the scanner 21, printing based on data received by the communication unit 27, and printing based on data stored in the image processing device 3 (auxiliary storage device 45) or in a device (for example, a nonvolatile memory) connected to the connector 37.
  • the printing based on data received by the communication unit 27 may be further subdivided according to the transmission source communication device (for example, another image processing device 3, the server 5 or 7, or the terminal 9). Note that such printing restrictions may be substantially implemented by restricting communication destinations. Furthermore, the printing based on data received by the communication unit 27 may be further subdivided according to the mode of communication (normal data communication, email reception, or FAX reception).
  • the printing restrictions based on the data stored in the image processing device 3 may be further subdivided according to the type of box (folder or directory in another expression) in which the data is stored. Note that such printing restrictions may be substantially implemented by restricting access to a box in which highly confidential files (document files and/or image files) are expected to be stored.
  • the printing restrictions based on the data stored in the memory connected to the connector 37 may be further subdivided depending on the type or individual of the connected device. Note that such printing restrictions may be substantially realized by restricting the devices that can be connected to the connector 37 (so-called device control).
  • scanning by the scanner 21 is another example of a function subject to restriction. Similar to printing, scanning may be restricted for each subdivided function. For example, scanning may be subdivided into scanning for copying (printing), scanning for data transmission (for example, of image data), and scanning for storage in the image processing device 3 (auxiliary storage device 45) or in a device connected to the connector 37.
  • Scanning for data transmission may be further subdivided according to the destination communication device (for example, another image processing device 3, the server 5 or 7, or the terminal 9). Note that such scanning restrictions may be substantially implemented by restricting destinations. Furthermore, scanning for data transmission may be further subdivided according to the mode of communication (normal data communication, email transmission, or FAX transmission).
  • Scans for storage in the image processing device 3 may be further subdivided according to the type of storage destination box. Note that such scanning restrictions may be substantially implemented by restricting access to a box in which highly confidential files are expected to be stored.
  • Scans for storage in the device connected to the connector 37 may be further subdivided depending on the type or individual of the connected device. Note that such scanning limitations may be substantially implemented by limiting the devices that can be connected to the connector 37.
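  • one way to picture such subdivided restriction targets is a set of hierarchical permission keys, as sketched below; the key scheme and names are assumptions for illustration only.

```python
# Hypothetical sketch: subdivided functions as hierarchical permission keys
# (printing subdivided by data source, scanning by destination), checked
# against the set of permissions released for the current session.
released: set[str] = {
    "print.from_scan",       # copying
    "scan.to_copy",
    "scan.to_send.email",    # scan-to-email released...
    # "print.from_fax" absent: printing of received FAX data stays restricted
}

def is_allowed(function_key: str) -> bool:
    return function_key in released

assert is_allowed("scan.to_send.email")
assert not is_allowed("print.from_fax")
```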
  • the function to be restricted does not have to be a major function such as printing or scanning.
  • the restricted function may be a function that performs settings related to major functions, such as setting the size of the margins of printed paper.
  • such a function may be regarded as a function of printing with arbitrary margin settings, and may even be regarded as a type of main function.
  • the function to be restricted may be a function used by the administrator of the image processing device 3.
  • for example, the image processing device 3 may accept a setting that uniformly (regardless of users' authentication results) prohibits some of the above-mentioned main functions or prohibits connection of a predetermined device to the image processing device 3. Such setting functions may then be released only for a specific user (the administrator of the image processing device 3).
  • the image processing device 3 has various functions.
  • the functions to be restricted may be all or some of the various functions excluding the authentication function.
  • for example, a user who fails biometric authentication may be substantially prevented from using the image processing device 3, or may still be allowed to use some functions.
  • similarly, a user who fails authentication by inputting account information may be substantially prevented from using the image processing device 3, or may still be allowed to use some functions.
  • For a user who has not been authenticated at all, restrictions on a first number of functions may be lifted.
  • From another perspective, the first number of functions may simply not be restricted.
  • For an authenticated user, restrictions on a second number of functions, which include the first number of functions and are greater in number than the first number, may be lifted, for example.
  • Alternatively, restrictions on a third number of functions, which include the second number of functions and are greater in number than the second number, may be lifted.
  • The first, second, and third numbers are arbitrary.
  • The first number may be 0, or may be 1 or more.
  • The difference in restrictions for each user may be, for example, that when authentication succeeds by input of account information, only the second number of functions become usable, whereas when biometric authentication succeeds, the third number of functions become usable.
  • The specific functions included in each number of functions are also arbitrary.
  • functions that are not included in the second number of functions but are included in the third number of functions may be, for example, any one or more of the following.
  • Transmission of image data read by the scanner 21 to the outside (for example, to the server 7, the terminal 9, or another image processing device 3; the same applies hereinafter).
  • a function that may be included in the first number of functions may be, for example, copying (a function of printing an image of a document read by the scanner 21 using the printer 19).
  • the functions not included in the first number of functions but included in the second number of functions may be, for example, any one or more of the following. Printing by printer 19 based on external data. Accessing boxes that are not configured to store sensitive files.
  • The manner in which functional restrictions are lifted when biometric authentication (or other authentication, such as input of account information) succeeds may be the same for all users, or may be settable individually for each user. In the former case, assuming that there is no authentication method other than biometric authentication (for example, no authentication method that requires inputting account information), there may be, from a different perspective, only two types of users: users who are not authenticated and whose functional restrictions are not lifted, and users who are authenticated and whose functional restrictions are lifted; there may be no difference in the usable functions among the users whose restrictions are lifted. Considering the case where there are authentication methods with different security levels, such as authentication by inputting account information, there may be as many types of users as there are security levels (or fewer).
  • In the case of individual settings, the authenticated users may include users who can use only a first function, users who can use only a second function, users who can use both the first function and the second function, and the like.
  • The restriction release operation described above may be realized in various more specific ways. An example is shown below.
  • FIG. 13 is a block diagram showing an example of the configuration of a signal processing system of the communication system 1 that realizes the above operation.
  • the comparison unit 29a of the image processing device 3 transmits the account information D1 to the server 5 as described above.
  • the server 5 has an authority table DT3 that links IDs and authority information D5. Then, the server 5 refers to the authority table DT3 and extracts the authority information D5 linked to the ID included in the received account information D1. Then, the server 5 transmits the extracted authority information D5 to the image processing device 3.
  • the processing unit 29 of the image processing device 3 cancels the restriction on the function based on the received authority information D5.
  • the transmission of the authority information D5 is based on the premise that the received account information D1 is registered in the verification table DT2 (FIG. 3) (that is, that the authentication was successful). Therefore, the transmission of the authority information D5 may be treated or interpreted as a transmission of the authentication result D3.
  • the authority information D5 may be stored in the authority table DT3 in association with the ID by the administrator of the server 5, for example, before the above operation is executed (before step ST6 in FIG. 4).
  • Information specifying the type of authentication method may be transmitted from the image processing device 3 to the server 5 together with the account information D1.
  • In the authority table DT3 of the server 5, the contents of the authority information D5 may differ depending on the type of authentication method. In that case, the server 5 may extract the authority information D5 according to the received information on the type of authentication method and transmit it to the image processing device 3.
  • Alternatively, the contents of the authority information D5 may not differ depending on the type of authentication method; instead, after extracting the authority information D5, the server 5 may modify its contents as necessary depending on the type of authentication method (for example, such that the restriction on a specific function is not lifted when biometric authentication has not been performed) and transmit the modified authority information D5. Also, unlike the above, the same authority information D5 may be transmitted regardless of the type of authentication method, and the image processing device 3 may choose not to release the restriction on a specific function among the functions whose restrictions are canceled by the authority information D5. A minimal sketch of such a lookup is shown below.
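  • The following is a hedged Python sketch of how the authority table DT3 and the per-method extraction of the authority information D5 could look; the IDs, function names, and method labels are assumptions made for illustration, not values from the embodiment.

```python
# Hypothetical authority table DT3: ID -> {authentication method ->
# functions whose restrictions are lifted} (illustrative data only).
AUTHORITY_TABLE_DT3 = {
    "user01": {
        "account":   {"copy"},
        "biometric": {"copy", "print_external", "scan_to_send"},
    },
}

def extract_authority_info_d5(user_id, auth_method):
    """Return the authority information D5 for the given ID, adjusted
    to the received authentication-method type (empty if unknown)."""
    return AUTHORITY_TABLE_DT3.get(user_id, {}).get(auth_method, set())

# The server 5 would transmit the result to the image processing device 3.
print(extract_authority_info_d5("user01", "biometric"))
```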
  • the verification table DT2 and the authority table DT3 may be integrated.
  • The same applies to other tables (for example, the user information table DT5 described later, and the menu table DT7 described later with reference to FIG. 15).
  • the illustrated table may be divided as appropriate.
  • The illustrated authority table DT3 conceptually shows a mode in which IDs and information on restrictions for each function are directly linked.
  • Instead, a table that associates an ID with one of a predetermined number of authority levels, and a table that associates each of the predetermined number of authority levels with information on restrictions for each function, may be stored in the server 5.
  • part of the operations of the server 5 described above may be executed by the image processing device 3 (processing unit 29).
  • the image processing device 3 may have the authority table DT3.
  • In that case, when notified of successful authentication, the processing unit 29 may refer to its own authority table DT3, extract the authority information D5 associated with the ID corresponding to the biometric information that matches the input biometric information (or with the ID input by the user), and release the restriction on the function.
  • the image processing device 3 may have both of the two divided tables, or may have only the latter table.
  • In the latter case, unlike the illustrated example, the server 5 transmits the authority level information to the image processing device 3 as the authority information D5.
  • the processing unit 29 may refer to its own table, extract information on the presence or absence of restrictions for each function linked to the received authority level, and release the restrictions on the functions.
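  • As a hedged sketch, the divided tables described above could be modeled as follows (all data is illustrative; the level values and function names are assumptions).

```python
# ID -> authority level, and authority level -> functions whose
# restrictions are lifted (illustrative data only).
ID_TO_LEVEL = {"user01": 2, "user02": 1}
LEVEL_TO_FUNCTIONS = {
    1: {"copy"},
    2: {"copy", "print_external", "scan_to_send"},
}

def lifted_functions(user_id):
    """Resolve lifted functions via the authority level; the level
    itself could be transmitted as the authority information D5."""
    return LEVEL_TO_FUNCTIONS.get(ID_TO_LEVEL.get(user_id), set())

print(lifted_functions("user02"))   # {'copy'}
```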
  • the image processing device 3 (processing unit 29) that has received or extracted the authority information may display the authority information on the display unit 35.
  • authority information is shown on the screen 35a of the display unit 35.
  • the image processing device 3 may display the user information on the screen 35a along with the authority information.
  • the user information includes, for example, a user name. Further, the user information may include other information such as the user's affiliation. If the user name is registered by the user, it may be initially registered, for example, along with the initial registration of account information D1. In addition, replacement registration may be performed in the same way as a password.
  • the server 5 has a user information table DT5 that links IDs and user names.
  • The user name may be stored (registered) in the user information table DT5 in association with the ID by the user and/or by the administrator of the server 5.
  • the server 5 extracts the user name corresponding to the ID received from the image processing device 3 from the user information table DT5, and sends it to the image processing device 3, in the same way as the extraction and transmission of the authority information from the authority table DT3 described above. Send.
  • the image processing device 3 displays the received user name on the screen 35a.
  • the user information table DT5 may be held by the image processing device 3.
  • When the processing unit 29 is notified of successful authentication from the server 5, it may refer to its own user information table DT5, extract the user information associated with the ID corresponding to the biometric information that matches the input biometric information (or with the ID input by the user), and display the extracted user information on the display unit 35.
  • the user information table DT5 may be integrated with the verification table DT2 and/or the authority table DT3.
  • In the above description, the user name is defined separately from the ID, but the ID may be used as the user name and displayed on the screen 35a.
  • FIG. 14 is a diagram illustrating an example of a procedure of processing executed by the image processing device 3 (processing unit 29) to limit and release the function.
  • the process in FIG. 14 may be started as appropriate.
  • Here, a mode in which the process is started when the image processing device 3 enters the startup mode by operation of its power switch will be described as an example.
  • the authentication described with reference to FIG. 4 (step ST6 and subsequent steps) may be executed in parallel to the process of FIG. 14 at an appropriate time while the process of FIG. 14 is being executed.
  • In step ST21, the image processing device 3 determines whether execution of a task such as printing has been requested by an operation on the operation unit 33 or by communication via the communication unit 27. The image processing device 3 stands by when the determination is negative (from another perspective, step ST21 is repeated at a predetermined period), and proceeds to step ST22 when the determination is affirmative. Note that, for convenience of explanation, the tasks referred to here are limited to those whose execution is subject to restriction and release of restriction.
  • In step ST22, the image processing device 3 determines whether the user has the authority to execute the requested task. When the determination is affirmative, the process proceeds to step ST23, and when the determination is negative, the process proceeds to step ST24.
  • If the authority information has not been specified at the time of step ST22, or if the authority information has been invalidated due to cancellation of the authentication, it may be determined that there is no authority.
  • Examples of cases where authority information is not specified include cases where authentication processing has not been performed and cases where authentication has failed.
  • In step ST23, the image processing device 3 controls the printer 19 and/or the scanner 21 to execute the requested task (for example, printing). Note that steps ST21 and ST23 are the same as steps ST44 and ST45 in FIG. 12.
  • In step ST24, the image processing device 3 notifies the user that execution of the requested task (function) is restricted.
  • This notification may be made visually or acoustically, for example.
  • The visual notification may be one that displays a predetermined image and/or text, or one that sets a predetermined indicator light in a predetermined state (on, blinking, or off); a combination of these may also be used.
  • the acoustic notification may be one that outputs a predetermined voice and/or warning sound (buzzer sound or melody). The same may apply to notifications in other steps.
  • In step ST25, the image processing device 3 determines whether a predetermined termination condition is satisfied. In the case of a negative determination, the image processing device 3 returns to step ST21, and in the case of an affirmative determination, it ends the process shown in FIG. 14.
  • the termination condition may be, for example, the same as the condition for terminating the startup of the image processing device 3 or the condition for transitioning to standby mode.
  • Between steps ST21 and ST22, the processing unit 29 may determine whether or not the requested task is one whose execution is subject to restriction and release; when the determination is affirmative, the process proceeds to step ST22, and when the determination is negative, the process may proceed to step ST23.
  • Further, the image processing device 3 may determine, between step ST21 and step ST22, whether or not authority information that is valid at that time (not invalidated) has been specified; when the determination is affirmative, the process proceeds to step ST22, and when the determination is negative, a notification may be performed. In this notification, the image processing device 3 may display on the display unit 35 a display requesting the user to perform authentication (input of biometric information). Thereafter, the image processing device 3 may proceed to step ST25, or may wait until biometric information can be detected (for example, until a finger is placed on the detection unit 25 that detects a fingerprint). In the latter case, when the biometric information becomes detectable, the image processing device 3 may perform the authentication (from step ST6 onward), specify the authority information, and proceed to step ST22. However, if biometric information cannot be detected even after a predetermined period of time has elapsed, or if a predetermined cancel operation has been performed, the image processing device 3 may proceed to step ST25.
  • the process in FIG. 14 may be started on the condition that authentication and authorization information have been specified.
  • the termination condition of step ST25 may be that the authentication has been canceled and the authority information has become invalid.
  • In these cases, a display requesting the user to authenticate (for example, the image IG1) may be displayed on the display unit 35.
  • The operation of specifying the authority information of the authenticated user described with reference to FIG. 13 is, from another perspective, an operation of storing the received or extracted authority information so that it can be referenced in step ST22.
  • This operation may be considered as an example of an operation for lifting the restriction on the function based on the authentication result when the stored authority information includes information indicating that the user has authority for at least one function.
  • the affirmative determination in step ST22 and the task instruction in step ST23 may also be taken as an example of the operation of canceling the restriction on the function based on the authentication result.
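  • The FIG. 14 procedure described above can be sketched as follows; the stub device and task names are assumptions made for illustration only, not part of the embodiment.

```python
# Hedged sketch of steps ST21-ST25 in FIG. 14.
from dataclasses import dataclass, field

@dataclass
class StubDevice:
    """Minimal stand-in for the image processing device 3."""
    authority: set = field(default_factory=lambda: {"copy"})
    requests: list = field(default_factory=lambda: ["copy", "fax"])

    def poll_task_request(self):        # ST21: operation or communication
        return self.requests.pop(0) if self.requests else None

    def has_authority(self, task):      # ST22: check the authority information
        return task in self.authority

    def execute(self, task):            # ST23: drive the printer 19 / scanner 21
        print(f"executing {task}")

    def notify_restricted(self, task):  # ST24: visual and/or acoustic notification
        print(f"{task} is restricted")

    def terminated(self):               # ST25: e.g., transition to standby mode
        return not self.requests

def run(device):
    while True:
        task = device.poll_task_request()   # ST21 (repeated while negative)
        if task is None:
            continue
        if device.has_authority(task):      # ST22
            device.execute(task)            # ST23
        else:
            device.notify_restricted(task)  # ST24
        if device.terminated():             # ST25
            break

run(StubDevice())   # -> executing copy / fax is restricted
```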
  • settings for the menu screen displayed on the display unit 35 may also be performed. This setting may be performed for each user. Specifically, it is as follows.
  • the menu screen is, for example, a screen (image) that includes one or more options on the GUI.
  • When an option is selected, a process corresponding to the option is executed.
  • For example, when the operation unit 33 and the display unit 35 are configured as a touch panel, pressing one of the one or more options displayed on the display unit 35 with a finger or a touch pen causes the corresponding process to be executed.
  • the processes corresponding to the options shown on the menu screen of the image processing device 3 may be various processes.
  • the options may be processes that cause operations related to major functions such as printing, scanning, copying, FAX transmission, and FAX reception (although these are not necessarily separable concepts).
  • the option may be a process for making settings related to the above operation. Such settings include, for example, paper size selection, print magnification settings, and print density.
  • As described above, the main functions may be subdivided as appropriate when authorities are set; that description of subdivision may also be applied as appropriate to the subdivision of options.
  • the menu screen for each user may, for example, reflect the preferences of each user and/or the authority of each user.
  • the former includes, for example, adjusting the position, size, color, shape, etc. of a specific option within the screen 35a to suit the user's preference.
  • Examples of the latter include, for example, a screen in which options for a given function are displayed in different ways depending on whether the user has authority for that function. More specifically, examples include a screen in which options have different colors depending on the presence or absence of authority, and a screen in which only options for which the user has authority are displayed (options for which he does not have authority are not displayed).
  • controlling the display of the menu screen in this case may be regarded as an example of controlling the release of functional restrictions.
  • For example, there may be only two types of menu screen settings: the menu screen for users who have succeeded in authentication via biometric authentication, and the menu screen for other users. Alternatively, for example, different menu screens may be set for different users who have succeeded in authentication via biometric authentication.
  • the menu screen may not be displayed for users whose authentication via biometric authentication is not successful. For users who have succeeded in authentication other than biometric authentication, the same menu screen may be uniformly used, or different menu screens may be displayed, similarly to users who have succeeded in authentication via biometric authentication.
  • the image processing device 3 may be capable of displaying a main menu screen that is initially displayed and one or more submenu screens that are displayed by selecting an option on the main menu screen.
  • The menu screen set for each user may be the main menu screen, at least one of the one or more submenu screens, or both of the above two types. Further, depending on the menu screen settings for each user, whether or not a submenu screen can be displayed may be set, or the number of submenu screens that can be displayed among a plurality of submenu screens may be set.
  • The menu screen settings described above may be realized in various more specific ways. An example is shown below.
  • FIG. 15 is a block diagram showing the configuration of the signal processing system of the communication system 1 that implements the above settings.
  • the server 5 has, for example, a menu table DT7.
  • the menu table DT7 stores the ID and menu information D7 that specifies the mode of the menu screen (in other words, the settings of the menu screen) in association with each other.
  • the server 5 refers to the menu table DT7 and extracts the menu information D7 linked to the ID received from the image processing device 3. Then, the server 5 transmits the extracted menu information D7 to the image processing device 3.
  • the image processing device 3 displays a menu screen based on the received menu information D7 on the screen 35a of the display unit 35.
  • the transmission of the menu information D7 is based on the premise that, for example, the received account information D1 is registered in the verification table DT2 (FIG. 3) (that the authentication is successful). Therefore, the transmission of the menu information D7 may be handled or interpreted as the transmission of the authentication result D3.
  • the menu information D7 may be stored in the menu table DT7 in association with the ID, for example, before the above operation is executed (before step ST6 in FIG. 4).
  • the menu information D7 may be set by the user and/or the administrator of the server 5. For example, if the user's preferences are reflected in at least part of the menu screen settings for each user, the part may be set by the user. Furthermore, if the presence or absence of authority is reflected in at least part of the menu screen settings for each user, the part may be set by the administrator of the server 5. Note that the settings by the user may be based on user authentication to prevent unauthorized settings from being made by a third party.
  • As with the authority information, the image processing device 3 may transmit information specifying the type of authentication method to the server 5 together with the account information D1.
  • the server 5 may extract menu information D7 according to the received authentication method type information and transmit it to the image processing device 3.
  • Alternatively, the contents of the menu information D7 may not differ depending on the type of authentication method; instead, after extracting the menu information D7, the server 5 may modify its contents as necessary according to the type of authentication method and transmit the modified menu information D7.
  • Also, the same menu information D7 may be transmitted regardless of the type of authentication method, and the image processing device 3 may choose not to apply a specific part of the settings specified by the menu information D7.
  • the menu table DT7 may be integrated with at least one of the other tables. Contrary to the above, menu table DT7 may be divided as appropriate.
  • the menu table DT7 shown in FIG. 15 conceptually shows a mode in which an ID and information set for each of a plurality of setting items regarding the menu screen are directly linked.
  • Alternatively, tables obtained by dividing the menu table DT7 may be used. For example, a table that associates an ID with one of a predetermined number of menu screen types, and a table that associates each of the predetermined number of menu screen types with information set for each of a plurality of setting items related to the menu screen, may be stored in the server 5.
  • the menu table DT7 may be held by the image processing device 3.
  • When the image processing device 3 is notified of successful authentication from the server 5, it may refer to its own menu table DT7, extract the menu information D7 associated with the ID corresponding to the biometric information that matches the input biometric information (or with the ID input by the user), and display a menu screen based on the extracted menu information D7 on the display unit 35.
  • the image processing device 3 may have both of the two divided tables, or may have only the latter table.
  • In the latter case, unlike the illustrated example, the server 5 transmits the menu screen type information to the image processing device 3 as the menu information D7. The processing unit 29 may then refer to its own table, extract the information for each setting item linked to the received menu information D7, and display a menu screen based on the extracted information.
  • Alternatively, the menu information D7 may not be transmitted from the server 5 to the image processing device 3. In that case, the image processing device 3 may have, for example, a table linking the authority information D5 and the menu information D7, and may refer to that table to set the menu screen according to the authority information D5.
  • the authority information D5 may not be transmitted and only the menu information D7 may be transmitted.
  • the menu information D7 in this case can be regarded as a type of authority information.
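  • As a hedged sketch, a per-user menu screen could be built from the menu information D7 and the authority information D5 as follows; the data structures are assumptions for illustration, since the embodiment does not specify a format.

```python
# Hypothetical menu table DT7 and per-user menu construction.
MENU_TABLE_DT7 = {
    "user01": {"layout": "large_icons", "options": ["copy", "scan", "fax"]},
}
DEFAULT_MENU = {"layout": "default", "options": ["copy"]}

def build_menu(user_id, authority_d5):
    info = MENU_TABLE_DT7.get(user_id, DEFAULT_MENU)
    # One of the modes described above: display only the options for
    # which the user has authority (they could instead be greyed out).
    visible = [opt for opt in info["options"] if opt in authority_d5]
    return {"layout": info["layout"], "options": visible}

print(build_menu("user01", {"copy", "scan"}))
# {'layout': 'large_icons', 'options': ['copy', 'scan']}
```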
  • the function whose restrictions are lifted based on the authentication result may be a VPN connection. Specifically, it is as follows.
  • A VPN, for example, virtually extends a private network onto the public network 11.
  • a VPN logically divides a physically single network including the public network 11. Thereby, for example, communication via the public network 11 is performed in a secure environment.
  • Such virtual expansion or logical division is achieved, for example, by authentication, tunneling, and encryption.
  • communication using a VPN may be performed through authentication and tunneling without being encrypted.
  • tunneling can also be considered a type of encryption.
  • Authentication methods include, for example, those that use account information (ID and password), those that use static keys, those that use a common key (shared key), those that use a combination of a private key and public key, and those that use electronic signatures. Examples include those that use electronic certificates, those that use security tokens, and those that combine two or more of the above (for example, multi-factor authentication).
  • As the authentication for a VPN connection from the image processing device 3, at least the authentication involving biometric authentication (local authentication) (the authentication described with reference to steps ST6 to ST10 in FIG. 4) is executed.
  • In tunneling, an operation is performed to treat two points that are physically or logically separated via a network as if they were the same point.
  • Tunneling is achieved, for example, by encapsulation.
  • In encapsulation, for example, an entire packet is embedded in the payload of another protocol, the payload of another layer, or the payload of the same layer during communication.
  • Tunneling may be performed at any appropriate layer, for example at layer 3 (network layer) or layer 2 (data link layer).
  • Encryption converts information sent and received into a format that cannot be read by third parties. Encryption may be performed only on the payload, or on both the header and the payload. In another aspect, encryption may be performed at any appropriate layer, eg, at the network layer, transport layer, and/or session layer. An appropriate encryption method may be used. For example, encryption methods include those that use a common key and those that use a combination of a private key and a public key.
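  • The following is a purely conceptual sketch of the encapsulation described above (an entire packet embedded in the payload of another protocol); no real VPN protocol is implemented, and the framing format is an assumption for illustration.

```python
import json

def encapsulate(inner_packet: bytes, outer_header: dict) -> bytes:
    """Embed an entire packet in the payload of an outer 'protocol'
    whose header is a length-prefixed JSON blob (illustrative only)."""
    header = json.dumps(outer_header).encode()
    return len(header).to_bytes(2, "big") + header + inner_packet

def decapsulate(outer_packet: bytes):
    """Recover the outer header and the original inner packet."""
    hlen = int.from_bytes(outer_packet[:2], "big")
    header = json.loads(outer_packet[2:2 + hlen])
    return header, outer_packet[2 + hlen:]

pkt = encapsulate(b"original IP packet", {"proto": "tunnel", "dst": "vpn-gw"})
print(decapsulate(pkt))
```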
  • the type of VPN may be selected as appropriate.
  • the VPN of the communication system 1 may be a remote access type VPN and/or a LAN type (intersite) VPN.
  • In a remote access type VPN, for example, VPN client software is installed on a communication device such as the image processing device 3, and the communication device directly establishes a VPN connection to the server 5 serving as a VPN server.
  • In a LAN type VPN, for example, VPN gateways connect LANs (sites) to each other via the VPN.
  • the operation of the image processing device 3 that functions as a client of a remote access VPN will be taken as an example.
  • the public network 11 may take various forms. From the viewpoint of VPN types, they are as follows.
  • the VPN may be an Internet VPN in which the public network 11 includes the Internet.
  • Alternatively, the VPN may be an IP (Internet Protocol)-VPN, an entry VPN, or a wide-area Ethernet in which the public network 11 includes a closed network provided by a communications carrier or the like.
  • the protocol for the VPN may be a known one, a new one, or one uniquely defined by the administrator of the server 5.
  • Known protocols for remote access VPNs include, for example, a combination of L2TP (Layer 2 Tunneling Protocol) and IPsec (Security Architecture for Internet Protocol), and PPTP (Point to Point Tunneling Protocol).
  • FIG. 16 is a flowchart illustrating a specific example of the above operation.
  • the image processing device 3 is a remote access VPN client that communicates with the server 5 as a VPN server (for example, the image processing device 3A or 3B in FIG. 1).
  • the data processing device 49 is a device that communicates with the image processing device 3 via a VPN (from another perspective, the server 5 as a VPN server). Examples of the data processing device 49 include another image processing device 3, the server 7, and the terminal 9.
  • the data processing device 49 may be the server 5, but FIG. 16 shows an example in which the two are separate.
  • The data processing device 49 that is not the server 5 may be included in the private network 13A including the server 5 (3C, 7, or 9A), or may not be included (3A, 3B, or 9B). In FIG. 16, the latter is taken as an example.
  • the process shown in FIG. 16 is started, for example, when the VPN connection start condition is satisfied in the image processing device 3.
  • the start condition may be, for example, that a predetermined operation instructing a VPN connection is performed on the operation unit 33. Further, the start condition may be that a task that requires a VPN connection (for example, an operation of downloading and printing image data from the data processing device 49) is performed on the operation unit 33. When such a task is instructed, the start condition may be satisfied when the user is asked whether or not to make a VPN connection, and as a result, a predetermined operation instructing the VPN connection is performed. Further, the start condition may be that a predetermined signal is input from an external communication device (for example, the terminal 9).
  • steps ST6 to ST10 in FIG. 4 are executed. That is, processing for authentication is executed. In FIG. 16, only step ST10 is shown. If authentication is successful, a VPN connection is established.
  • the image processing device 3 transmits a signal requesting a VPN connection to the server 5.
  • The server 5, which has received the above signal, transmits a signal requesting transmission of the account information D1 to the image processing device 3.
  • the image processing device 3 that has received the signal causes the display unit 35 to display a display requesting the user to detect biological information (for example, image IG5). After that, steps ST6 to ST10 are executed.
  • Alternatively, when the start condition is satisfied, the image processing device 3 causes the display unit 35 to display a message requesting the user to allow detection of biometric information. Next, the image processing device 3 executes steps ST6 (detection of biometric information) and ST7 (biometric authentication). Next, the image processing device 3 transmits data requesting the VPN connection and the account information D1 (step ST8). Note that the two may be transmitted separately or together. After that, steps ST9 and ST10 are performed.
  • Transmission of account information D1 is executed, for example, only when biometric authentication (step ST7) is successful. That is, VPN connection is permitted only through biometric authentication. However, the VPN connection may be permitted by other authentication methods (for example, an authentication method using the input account information D1).
  • the VPN connection may be automatically established when the authentication in steps ST6 to ST10 is successful, instead of determining the start condition prior to authentication.
  • successful authentication may be a condition for starting a VPN connection.
  • the timing or conditions for authentication may be set as appropriate.
  • the authentication state may be started prior to determining the start condition, and the VPN connection start condition may be the re-input condition described with reference to FIG. 10. Then, when an affirmative determination is made in step ST104 (when re-authentication is successful), a VPN connection may be established.
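  • A hedged sketch of the client-side flow in FIG. 16 follows: the account information D1 is transmitted only when local biometric authentication (step ST7) succeeds, so the VPN connection is permitted only through biometric authentication. The table contents and helper names are assumptions for illustration.

```python
# Illustrative comparison table DT1 (image processing device 3) and
# verification table DT2 (server 5).
COMPARISON_TABLE_DT1 = {"fingerprint-A": {"id": "user01", "pw": "secret"}}
VERIFICATION_TABLE_DT2 = {"user01": "secret"}

def local_biometric_auth(detected):
    """ST6/ST7: match detected biometric information against DT1 and
    return the associated account information D1 (None on failure)."""
    return COMPARISON_TABLE_DT1.get(detected)

def server_auth(account_d1):
    """ST9: server authentication against the verification table DT2."""
    return VERIFICATION_TABLE_DT2.get(account_d1["id"]) == account_d1["pw"]

def connect_vpn(detected_biometric):
    account_d1 = local_biometric_auth(detected_biometric)
    if account_d1 is None:
        return False                 # no D1 transmission without ST7 success
    if server_auth(account_d1):      # ST8-ST10
        return True                  # the tunnel would be established here
    return False

print(connect_vpn("fingerprint-A"))  # True: VPN connection permitted
```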
  • FIG. 16 exemplifies the operation of downloading image data from the data processing device 49 and printing it. Specifically, it is as follows.
  • In step ST31, the image processing device 3 transmits a signal requesting download of image data to the server 5 via the VPN.
  • The image data here may be general image data or image data constituting a print job.
  • In step ST32, the server 5 transmits (transfers) a signal requesting the image data to the destination (here, the data processing device 49) specified by the information included in the received signal.
  • When the data processing device 49 is a communication device external to the private network 13A including the server 5, the transmission may be performed via a VPN (as in the illustrated example).
  • In this case, the data processing device 49 is connected to the server 5 via the VPN in advance, before step ST32.
  • When the data processing device 49 is a communication device included in the private network 13A, normal communication within the private network 13A may be performed.
  • In step ST33, the data processing device 49 transmits the requested image data to the server 5.
  • At this time as well, if the data processing device 49 is located outside the private network 13A, a VPN may be used (as in the illustrated example), and if the data processing device 49 is located inside the private network 13A, normal communication within the private network 13A may take place.
  • In step ST34, the server 5 transmits (transfers) the received image data to the image processing device 3. Transmission at this time is performed via the VPN.
  • In step ST35, the image processing device 3 executes printing based on the received image data.
  • the VPN server to which the image processing device 3 makes a VPN connection may or may not be selectable by the user using the image processing device 3.
  • The image processing device 3 may be able to select a connection destination only from among two or more VPN servers that make up one VPN, or may be able to select a connection destination from among two or more VPN servers that make up two or more mutually different VPNs.
  • the image processing device 3 may cause the display unit 35 to display a display (for example, an image) inquiring the user about the server 5 to connect to.
  • This display may, for example, present information on one or more connection destination candidates, or may prompt input of connection destination information.
  • the connection destination information that is presented and/or input is, for example, a host name or an IP address (or a name given to a VPN).
  • the connection destination information may be any name and/or figure stored in the auxiliary storage device 45 in advance by the administrator of the image processing device 3 in association with a host name or a fixed IP address.
  • the image processing device 3 may receive an operation on the operation unit 33 to select a connection destination from a plurality of candidates, an operation to input connection destination information by key input, etc. Furthermore, when the VPN connection is established, the processing unit 29 may cause the display unit 35 to display information indicating the connection destination with which the VPN connection has been established.
  • a VPN connection may be disconnected when appropriate disconnection conditions are met.
  • For example, the disconnection condition may be that a predetermined operation instructing disconnection is performed on the operation unit 33.
  • the disconnection condition may be that the task is completed.
  • the disconnection condition may be that the authentication state has been canceled. Note that examples of conditions for canceling the authentication state have already been described.
  • the image processing device 3 may indicate this on the display unit 35 while the VPN is being connected. For example, an image indicating that the VPN is being connected may be displayed, or a specific indicator light may be in a specific state (for example, lit or blinking). Further, in the above description, it was mentioned that the connection destination of the VPN may be displayed, but the display of the connection destination may be taken as an example of a display indicating that the connection to the VPN is being made.
  • the operation in which the image processing device 3 receives image data from the data processing device 49 and performs printing is taken as an example.
  • various other operations using the VPN are possible.
  • information (eg, image data) acquired by the scanner 21 may be transmitted to the data processing device 49 via the VPN.
  • an ultrasonic detection unit 25 that detects a fingerprint will be described as an example of the detection unit. Further, the detection unit 25 that can be switched between standby mode and startup mode will be described using the detection unit 25 that reads a fingerprint as an example.
  • FIG. 17 is a schematic cross-sectional view showing an example of the configuration of the ultrasonic detection unit 25 that detects fingerprints. However, for convenience of illustrating arrows indicating ultrasonic waves, hatching indicating a cross section is omitted.
  • Fingerprints are, from another perspective, irregularities on the body surface.
  • the detection unit 25 detects this unevenness.
  • a finger F1 is placed on the detection surface 25a of the detection unit 25.
  • FIG. 17 shows an enlarged view of the detection surface 25a and a portion of its vicinity.
  • the unevenness on the lower surface of the finger F1 indicates convex portions and concave portions that constitute a fingerprint.
  • the detection unit 25 includes, for example, a plurality of ultrasonic elements 25b arranged along a detection surface 25a.
  • the plurality of ultrasonic elements 25b are covered with a medium layer 25c.
  • the surface of the medium layer 25c constitutes a detection surface 25a.
  • the difference between the acoustic impedance of the material of the medium layer 25c and the acoustic impedance of the body surface (skin) is smaller than the difference between the acoustic impedance of the material of the medium layer 25c and the acoustic impedance of air.
  • the acoustic impedance of the material of the medium layer 25c and the acoustic impedance of the body surface are approximately equal.
  • the plurality of ultrasonic elements 25b transmit ultrasonic waves toward the detection surface 25a.
  • At a position where the finger F1 is not in contact with the detection surface 25a, the acoustic impedance of the detection surface 25a (medium layer 25c) and the acoustic impedance of the air differ (the difference is relatively large), so the ultrasonic waves are reflected (the intensity of the reflected waves is relatively strong). The reflected wave is received by the ultrasonic element 25b.
  • On the other hand, at a position where the finger F1 is in contact, the acoustic impedance of the detection surface 25a and the acoustic impedance of the finger F1 are equivalent (the difference is relatively small), so the ultrasonic waves are hardly reflected (the intensity of the reflected wave is relatively weak).
  • Accordingly, by receiving the reflected wave reflected at the detection surface 25a (because the intensity of the reflected wave is strong), the ultrasonic element 25b can detect that its own detection area corresponds to a concave portion of the fingerprint. Conversely, when the ultrasonic element 25b cannot receive the reflected wave reflected at the detection surface 25a (because the intensity of the reflected wave is weak), it can detect that its own detection area corresponds to a convex portion of the fingerprint.
  • the configuration illustrated in FIG. 17 is capable of detecting biological information (for example, the shape of the palm) shown by unevenness on the body surface in addition to fingerprints.
  • the plurality of ultrasonic elements 25b may be arranged one-dimensionally or two-dimensionally so that the number in the left-right direction of FIG. 17 is greater than the number in the direction penetrating the page of FIG. 17. Then, the plurality of ultrasonic elements 25b may be mechanically moved in the penetrating direction of the paper to perform scanning and obtain a two-dimensional image. Alternatively, the plurality of ultrasonic elements 25b may be two-dimensionally arranged in the left-right direction of FIG. 17 and in the direction through the paper of FIG. 17, and a two-dimensional image may be acquired by electronic scanning. Alternatively, the plurality of ultrasonic elements 25b may be two-dimensionally arranged in an area equivalent to the range in which a fingerprint is read to obtain a two-dimensional image.
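  • The classification of each element's detection area can be sketched as follows, following the reflection principle above: a strong reflected wave means air above the detection surface 25a (a concave portion of the fingerprint), and a weak one means skin in contact (a convex portion). The threshold is an illustrative value.

```python
REFLECTION_THRESHOLD = 0.5   # assumed normalized intensity threshold

def classify_elements(reflected_intensities):
    """Map each ultrasonic element's reflected intensity (0..1) to
    'concave' (strong reflection: air) or 'convex' (weak: skin)."""
    return ["concave" if r > REFLECTION_THRESHOLD else "convex"
            for r in reflected_intensities]

print(classify_elements([0.9, 0.2, 0.8, 0.1]))
# ['concave', 'convex', 'concave', 'convex']
```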
  • FIG. 18 is a schematic diagram for explaining switching between the startup mode and standby mode using the detection unit 25.
  • As the detection unit 25, one that reads the fingerprint of the finger F1 is illustrated.
  • the description here may be applied to the detection unit 25 that detects other biological information as long as there is no contradiction.
  • the detection unit 25 may be configured to be switchable between a startup mode (lower row in FIG. 18) and a standby mode (upper row in FIG. 18).
  • the startup mode is, for example, a state in which biological information can be detected, and includes an operating state before the start of detection and an operating state during detection.
  • the standby mode is a state in which power consumption is lower than that in the startup mode (for example, the operating state before the start of detection). Standby mode may also be expressed in other ways, such as sleep mode.
  • the startup mode and standby mode can also be described as follows.
  • the detection section 25 includes a first drive section 26a, a second drive section 26b, and a power supply control section 26e.
  • In the startup mode, as indicated by the arrows in the lower part of FIG. 18, power is supplied from the power supply control section 26e to both the first drive section 26a and the second drive section 26b.
  • In the standby mode, as indicated by the arrow in the upper part of FIG. 18, power is supplied from the power supply control section 26e only to the first drive section 26a.
  • the first drive unit 26a, the second drive unit 26b, and the power supply control unit 26e may be hardware elements, or may be elements that are a combination of hardware and software.
  • the first drive unit 26a may be a nonvolatile memory.
  • the second drive unit 26b may be a CPU, or may be a part (for example, a clock) of a plurality of functional units constructed by the CPU executing a program in the startup mode.
  • startup mode and standby mode can also be described as follows.
  • In the standby mode, the detection unit 25 shifts from the standby mode to the startup mode and then starts detecting biometric information.
  • In the startup mode, no such shift occurs before the biometric information is detected.
  • Accordingly, the time required to start (or complete) detection of biometric information when attempting to input biometric information in the standby mode is longer than the time required to start (or complete) detection when attempting to input biometric information in the startup mode.
  • the user may be notified whether the current mode of the detection unit 25 is the startup mode or the standby mode.
  • the specific mode of notification is arbitrary.
  • the detection unit 25 may have an indicator light adjacent to the detection surface 25a.
  • For example, the startup mode may be indicated by turning on the indicator light, and the standby mode may be indicated by turning off the indicator light.
  • the startup mode and standby mode may be indicated by characters or graphics.
  • FIG. 19 is a flowchart illustrating an example of a predetermined procedure for controlling switching of the operating state (mode) of the detection unit 25. This process is started, for example, when the image processing device 3 is powered on.
  • In step ST71, the processing unit 29 executes a process of activating the detection unit 25.
  • the CPU executes a program stored in the ROM and/or the auxiliary storage device, so that an element (for example, an image sensor or an ultrasonic element 25b) in the detection unit 25 is activated.
  • In addition, a control section of the detection unit 25 that is directly involved in its control is constructed.
  • the detection unit 25 enters the activation mode described with reference to the lower part of FIG. 18 .
  • The CPU, ROM, auxiliary storage device, and control section of the detection unit 25 described above may be regarded as part of the CPU 39, ROM 41, auxiliary storage device 45, and processing unit 29 mentioned earlier.
  • The division of roles between the CPU, ROM, auxiliary storage device, and control section inside the detection unit 25 and the CPU 39, ROM 41, auxiliary storage device 45, and processing unit 29 may be set as appropriate, and the distinction between the two does not necessarily have to be clear. In the following, for convenience, the processing unit 29 will be treated as the subject of the processing.
  • In step ST72, the processing unit 29 determines whether the standby condition is satisfied. If the determination is affirmative, the processing unit 29 proceeds to step ST73 to put the detection unit 25 into the standby mode. If the determination is negative, the processing unit 29 skips steps ST73 to ST75 and proceeds to step ST76 in order to maintain the startup mode.
  • the standby conditions may be set as appropriate.
  • For example, the standby condition may be that the image processing device 3 has not been used for a predetermined period, that the detection unit 25 has not been used for a predetermined period, and/or that the user has performed a predetermined operation on the operation unit 33.
  • In step ST73, the processing unit 29 puts the detection unit 25 into the standby mode.
  • In step ST74, the processing unit 29 determines whether a predetermined condition for canceling the standby mode is satisfied. When the determination is affirmative, the process proceeds to step ST75 and the standby mode is canceled; when the determination is negative, the standby mode continues (step ST74 is repeated).
  • the conditions for canceling standby mode in step ST74 may be set as appropriate.
  • FIG. 19 exemplifies a mode in which it is determined whether or not a finger is placed on the detection unit 25.
  • the standby mode may be canceled, for example, when a predetermined operation is performed on the operation unit 33 and/or when a situation requiring biometric authentication occurs.
  • In step ST76, the processing unit 29 determines whether the conditions for terminating the process shown in FIG. 19 are satisfied.
  • the condition may be, for example, the same as the condition for terminating the startup of the image processing device 3, or may be that a predetermined operation has been performed on the operation unit 33. Then, in the case of a positive determination, the processing unit 29 finishes the process of FIG. 19 after executing a process (not shown) for terminating the activation of the detection unit 25, and in the case of a negative determination, returns to step ST72.
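  • The FIG. 19 procedure (steps ST71 to ST76) can be sketched as follows, with the standby condition, finger detection, and end condition supplied as simple callables; the names and scheduling are assumptions for illustration.

```python
def mode_control(standby_condition, finger_detected, end_condition):
    mode = "startup"                       # ST71: activate the detection unit 25
    while True:
        if mode == "startup" and standby_condition():   # ST72
            mode = "standby"                            # ST73
        while mode == "standby":
            if finger_detected():                       # ST74
                mode = "startup"                        # ST75: cancel standby
        if end_condition():                             # ST76
            return mode

# Example run: enter standby once, wake on the second finger poll, then end.
events = iter([True])
fingers = iter([False, True])
print(mode_control(lambda: next(events, False),
                   lambda: next(fingers, True),
                   lambda: True))          # -> startup
```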
  • the finger detection in step ST74 may be implemented as appropriate.
  • For example, the finger detection may be realized by driving only at least one element (for example, an imaging element or an ultrasonic element) among the elements that detect biometric information.
  • finger detection may be realized by only some of the ultrasonic elements 25b of the plurality of ultrasonic elements 25b shown in FIG. 17 transmitting and receiving ultrasonic waves.
  • an element that performs scanning may acquire information without scanning, thereby detecting a finger.
  • finger detection may be realized by a sensor provided separately from an element that detects biological information.
  • the activation of the detection unit 25 in step ST71 may be started and completed at an appropriate time in relation to various operations of the image processing device 3.
  • a preliminary operation may be performed before the image processing unit 31 executes printing and/or scanning for the purpose of improving image quality and/or speeding up printing.
  • the activation of the detection unit 25 may be completed, for example, before the preliminary operation is completed. Thereby, for example, at least a part of the preliminary operation is executed while authentication is being performed, and the entire process can be performed efficiently.
  • From another perspective, the detection unit 25 may be activated in parallel with the preliminary operation or before the preliminary operation.
  • preliminary operations include the following: If the printer 19 is an inkjet type printer, nozzle cleaning may be performed to clean the surface on which the nozzles that eject ink are formed before printing. This nozzle cleaning is an example of a preliminary operation. Further, the printer 19 and/or the scanner 21 may be preheated before printing or scanning in order to make the image quality immediately after printing or scanning similar to the image quality after that. Such preheating is an example of a preliminary operation.
  • As described above, the image processing device 3 includes the image processing unit 31, the detection unit 25, the first memory (for example, the auxiliary storage device 45), the communication unit 27, and the control unit 29c.
  • Image processing section 31 includes at least one of printer 19 and scanner 21 .
  • the detection unit 25 detects the user's biometric information D2.
  • the auxiliary storage device 45 (comparison table DT1) stores first biometric information (biometric information D2) and authentication information (for example, account information D1) in association with each user.
  • The communication unit 27 transmits, to the external authentication device (server 5), information based on the authentication information stored in the auxiliary storage device 45 in association with the first biometric information that matches the second biometric information (biometric information D2) of the first user detected by the detection unit 25.
  • the control unit 29c controls the release of restrictions on functions related to at least one of the image processing unit 31 and the communication unit 27 based on the authentication result D3 received from the server 5.
  • the communication system 1 includes the image processing device 3 as described above and an external authentication device (server 5).
  • the server 5 includes a second memory (nonvolatile memory 5b that stores verification table DT2) that stores authentication information (for example, account information D1).
  • When the authentication using the information based on the authentication information (for example, account information D1) received from the image processing device 3 (the authentication information itself, or information generated based on it) and the authentication information stored in the nonvolatile memory 5b (verification table DT2) is successful, the server 5 transmits an authentication result D3 indicating that the authentication was successful to the image processing device 3.
  • Therefore, for example, the need to input the account information D1 each time authentication is performed is reduced, and user convenience is improved. Further, security is improved because two-step authentication is performed: biometric authentication (local authentication) in the image processing device 3 and authentication (server authentication) in the server 5.
  • the image processing device 3 may include a UI section 23 into which account information D1 as authentication information is input.
  • The control unit 29c may also be capable of controlling the release of restrictions on functions related to at least one of the image processing unit 31 and the communication unit 27 based on the authentication result received from the server 5 by transmitting the account information D1 input to the UI unit 23 (input in step ST56) to the external authentication device (server 5) (step ST58).
  • Biometric information may change over time or depending on the user's physical condition.
  • In that case, the result of the biometric authentication may be erroneous.
  • Even then, if authentication by input of the account information D1 is permitted, for example when the user wants to use a predetermined function in a hurry, the effort of re-registering the biometric information can be saved. In other words, user convenience is improved.
  • When the image processing device 3 succeeds in authentication by transmitting the account information D1 stored in the first memory (auxiliary storage device 45) in association with the first biometric information (comparison table DT1), the restriction on a first function may be lifted according to the authority of the user who has succeeded in the authentication. On the other hand, when authentication succeeds by transmitting the account information D1 input to the UI unit 23 (input in step ST56), the first function may remain restricted regardless of the authority of the user who has succeeded in the authentication.
  • functions with a relatively low security level may be allowed to be used even by users who have not registered biometric information. Further, for users who should have registered biometric information but have failed biometric authentication, it is possible to provide the convenience of postponing the above-mentioned re-registration.
  • the security level can be maintained for functions that require a high security level. In other words, it is possible to achieve both convenience and security.
  • In both cases, the information sent from the image processing device 3 to the server 5 includes the account information D1 rather than being completely different information, so the configurations of the image processing device 3 and the server 5 can be simplified.
  • the image processing device 3 may further include an imaging unit 28 that images at least a part of the area around the image processing device 3.
  • the image processing device 3 may control the authentication state according to the situation of the person, which is identified based on the image captured by the imaging unit 28 in the authentication state where the authentication is successful.
  • In this case, for example, the probability that a third party will illegally use the functions of the image processing device 3 while the authentication state is maintained is reduced. In other words, security is improved.
  • the user can cancel the authentication state without performing an operation on the operation unit 33 to cancel the authentication state. In other words, convenience is improved.
  • the image processing device 3 may control the authentication state using the facial feature data D4 detected from the image captured by the imaging unit 28.
  • In this case, for example, the probability of misidentifying another person as the user in the authentication state is lowered, and security is improved.
  • For example, if tracking is based only on the outline of a person, once the authenticated user disappears from the imaging range, it is difficult to identify with high accuracy that an imaged person is the authenticated user, even if the authenticated user returns to the imaging range.
  • With facial feature data, it is possible to identify with high accuracy that the imaged person is the authenticated user. As a result, for example, when the authenticated user disappears from the imaging range, the need to immediately cancel the authentication state is reduced, and user convenience is improved.
  • The image processing device 3 may cancel the authentication state when cancellation conditions are satisfied that include the condition that the person corresponding to the facial feature data D4 of the first user in the authentication state (the user who has succeeded in biometric authentication) has not been imaged by the imaging unit 28 for a predetermined period of time (steps ST84 to ST86 in FIG. 8).
  • the inconvenience that the authentication state is canceled even when the user leaves the imaging range for a short time for some reason is eliminated.
  • monitoring is performed by the imaging unit 28, which reduces the monitoring load. At the same time, the security level can be increased.
  • The image processing device 3 may cancel the authentication state when cancellation conditions are satisfied that include the condition that the operator of the UI unit 23, identified based on images repeatedly captured by the imaging unit 28, has become a user different from the first user in the authentication state (the user who has succeeded in biometric authentication) (step ST88 in FIG. 9), and may restore the authentication state when restoration conditions are satisfied that include the condition that the operator has become the first user again (step ST93).
  • the authentication state is automatically and temporarily canceled. This improves user convenience while maintaining security.
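A sketch of the suspend-and-restore behavior, with hypothetical names; the operator is assumed to be identified from the repeatedly captured images by some separate means:

    class SuspendableSession:
        """Temporarily suspends the authenticated state while someone other
        than the first user operates the UI, and restores it afterwards."""

        def __init__(self, first_user_id: str):
            self.first_user_id = first_user_id
            self.suspended = False

        def on_operator_identified(self, operator_id: str) -> None:
            # Different operator -> suspend (cf. ST88);
            # first user again -> restore (cf. ST93).
            self.suspended = operator_id != self.first_user_id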
  • The biometric information detected by the detection unit 25 may be biometric information different from the facial feature data D4 (for example, a fingerprint).
  • The detection unit 25 is therefore not limited to a configuration capable of imaging a face; any suitable configuration can be selected.
  • As the detection unit 25, one with relatively high biometric authentication accuracy may be selected, or the owner of the image processing device 3 (for example, a company) may adopt a detection unit 25 of the same type used in its other devices.
  • Authentication itself is performed using the detection unit 25, which already reduces the probability that a third party improperly uses the image processing device 3; the monitoring by the imaging unit 28 can therefore be regarded as supplementary. Accordingly, the facial feature data D4 acquired by the imaging unit 28 may have relatively low authentication accuracy, which allows, for example, costs related to the imaging unit 28 to be reduced.
  • The image processing device 3 may further include a display section 35.
  • When a re-input condition is satisfied (step ST101 in FIG. 10) while the first user (a user who succeeded in biometric authentication) is in the authenticated state, the image processing device 3 may cause the display section 35 to display a prompt asking the user to have the detection section 25 re-detect the biometric information.
  • The image processing device 3 may maintain the authenticated state when maintenance conditions are satisfied, the conditions including that the re-detected biometric information matches the first biometric information (that in the comparison table DT1).
  • The image processing device 3 may cancel the authenticated state when cancellation conditions are satisfied, the conditions including that the re-detected biometric information does not match the first biometric information.
  • The re-input condition may include that a predetermined time has passed since the most recent detection of biometric information by the detection unit 25.
  • If the predetermined time is set short, the probability of detecting that the user has been replaced by another person increases, and security is improved.
  • If the predetermined time is set long, the first user who uses the image processing device 3 for a long time is less likely to be asked to re-enter biometric information frequently, and user convenience is improved. That is, the balance between improved security and improved convenience can easily be adjusted.
  • The re-input condition may include that the UI unit 23 has been operated to instruct execution of a predetermined function among the one or more functions whose restrictions were lifted by the successful authentication. (A sketch of such a re-input check follows this item.)
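A sketch combining the two re-input conditions above; the five-minute figure and the set of designated functions are assumptions for illustration only:

    import time

    REINPUT_AFTER_S = 300.0                # assumed "predetermined time"
    SENSITIVE_FUNCTIONS = {"vpn_send"}     # assumed designated functions

    def reinput_required(last_detection: float, requested: str | None = None) -> bool:
        """True when re-detection of biometric information should be requested.

        last_detection is the time.monotonic() stamp of the most recent
        detection by the detection unit; requested is a function the user
        has just instructed via the UI unit, if any."""
        if time.monotonic() - last_detection > REINPUT_AFTER_S:
            return True                    # time-based re-input condition
        return requested in SENSITIVE_FUNCTIONS   # function-based condition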
  • The control unit 29c may enable at least one of transmission and reception of image data via the VPN connection established by the successful authentication (see FIG. 16).
  • The VPN connection is made through two-step authentication, namely biometric authentication (local authentication) and server authentication, so the security of the VPN is improved, and data can be sent and received with high security. (A sketch of this gating follows this item.)
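A sketch of gating image-data transfer on the two-step authentication result; establish_vpn and send stand in for whatever tunnel and transfer mechanisms the device actually uses, and the close() method is likewise an assumption:

    def transfer_over_vpn(auth_ok: bool, establish_vpn, send, image_data) -> None:
        """Send image data only after both local (biometric) and server
        authentication have succeeded."""
        if not auth_ok:
            raise PermissionError("two-step authentication has not succeeded")
        tunnel = establish_vpn()    # VPN connection enabled by successful auth
        try:
            send(tunnel, image_data)
        finally:
            tunnel.close()          # assumes the tunnel object offers close()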
  • The image processing device 3 may further include a human sensor 51.
  • When the human sensor 51 detects that the user U1 has left, the image processing device 3 may cancel the authentication of the user U1 after the operation of the image processing unit 31 is completed (after a negative determination in step ST47 in FIG. 12; step ST48).
  • Since the departure of the user U1 from the image processing device 3 triggers cancellation of the authentication, the probability that a third party uses the image processing device 3 as the user U1 is reduced.
  • For example, the probability that a third party uses a function for which it has no authority, or a VPN connection it is not permitted to use, is reduced.
  • Because the authentication is canceled only after the operation of the image processing unit 31 is completed, printing and/or scanning continues even if the user U1 walks away from the image processing device 3 during a long job. This improves user convenience. (A sketch of this deferred cancellation follows this item.)
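A sketch of this deferred cancellation; the names and the polling interval are illustrative:

    import time

    def on_user_left(job_is_running, cancel_authentication, poll_s: float = 1.0) -> None:
        """Call when the human sensor reports that the user has left.

        The authenticated state is released only once the current print/scan
        job has finished, so a long job is never interrupted."""
        while job_is_running():       # corresponds to the check in step ST47
            time.sleep(poll_s)        # job still running: keep authentication
        cancel_authentication()       # step ST48: release after completion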
  • The image processing device 3 may further include a reset button (for example, the button 33a in FIG. 1).
  • The image processing device 3 may cancel the user's authentication in response to a long press of the button 33a (including keeping a finger in contact with a touch button for a long time). In this case, as with cancellation by the human sensor 51, the cancellation may be performed after the operation of the image processing section 31 is completed; that is, a long press of the button 33a may serve as one of the one or more triggers for cancellation.
  • The image processing device 3 may further include a display section 35.
  • When authentication using the second biometric information (the biometric information input at the time of use) fails, the image processing device 3 may cause the display unit 35 to display at least one of a message prompting the user to have the detection unit 25 re-detect the biometric information and a message asking whether to switch to another authentication method that does not use biometric information (see FIG. 7).
  • Biometric information may change over time or depending on the user's physical condition.
  • As a result, the authentication result may be erroneous even for a legitimate user.
  • By being prompted to re-detect and/or asked whether to use another authentication method, the user can readily move on to the next action.
  • User convenience is thereby improved.
  • By allowing another authentication method, a situation in which processing based on authentication cannot be performed at all, even though re-detection of biometric information keeps failing, is avoided. In addition, a user in a hurry can perform processing based on authentication without first re-registering biometric information. From this point of view as well, user convenience is improved. (A sketch of this fallback flow follows this item.)
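A sketch of the fallback flow, assuming a retry limit of two and a password login as the alternative method (both are assumptions, not requirements of this disclosure):

    def handle_failed_biometric(redetect, alternative_login, max_retries: int = 2) -> bool:
        """redetect shows the re-detection prompt and returns True on a match;
        alternative_login runs a non-biometric authentication."""
        for _ in range(max_retries):
            if redetect():
                return True           # re-detected biometrics matched after all
        # Biometric information can drift with age or physical condition,
        # so offer an authentication method that does not rely on it.
        return alternative_login()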
  • The biometric information may be a fingerprint.
  • The standby mode may be canceled in response to the detection unit 25 detecting the fingerprint.
  • The user's action of inputting biometric information then also serves as the action of canceling the standby mode, improving convenience. Further, the standby mode can be canceled by reusing part of the function for detecting biometric information, so the number of components can be reduced.
  • The image processing device 3 may further include a display section 35.
  • The menu screen of the display unit 35 may be set for each user based on the authentication result D3 (FIG. 15).
  • Authentication using the account information D1 is performed at the server 5. Therefore, for example, the users to whom different menu screens are provided can be selected through management of the verification table DT2 in the server 5. Further, in an embodiment in which the server 5 holds the menu table DT7, the menu screen settings can be managed centrally. As a result, convenience is improved. (A sketch of such per-user menu selection follows this item.)
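A sketch of per-user menu selection based on a server-held menu table; the table layout and function names are assumptions modeled loosely on DT7:

    # Menu table as it might be centrally managed on the server side.
    MENU_TABLE: dict[str, list[str]] = {
        "user-a": ["copy", "scan", "fax", "vpn_send"],
        "user-b": ["copy", "scan"],
    }
    GUEST_MENU = ["copy"]             # low-security functions only

    def menu_for(authenticated_user: str | None) -> list[str]:
        """Return the menu entries to render on the display unit,
        given the user identified by the authentication result."""
        if authenticated_user is None:
            return GUEST_MENU         # no successful authentication
        return MENU_TABLE.get(authenticated_user, GUEST_MENU)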
  • The image processing device 3 may further include a display section 35, and user information and authority information may be displayed on the display section 35 based on the authentication result D3 (FIG. 13).
  • This allows the user to easily grasp his or her own authority.
  • The probability that a user instructs the image processing device 3 to perform an operation such as printing without having the authority, only to realize afterwards that the authority is lacking, is reduced. In other words, user convenience is improved.
  • The detection unit 25 may detect irregularities on the user's body surface by transmitting and receiving ultrasonic waves.
  • The influence of natural and/or artificial lighting around the image processing device 3 on the input of biometric information is thereby reduced. That is, the influence of the surrounding environment on authentication is reduced, and the accuracy of authentication is improved. As a result, for example, the image processing device 3 can be placed in a dark location, which can also reduce the influence of illumination on scanning and the like.
  • The server 5 is an example of an external authentication device.
  • The auxiliary storage device 45 is an example of the first memory of the image processing device 3.
  • The nonvolatile memory 5b that stores the verification table DT2 of the server 5 is an example of a second memory.
  • The account information D1 is an example of authentication information.
  • The image processing device need not be a multifunction device including both a printer and a scanner; it may have only a printing function (i.e., be a printer in the narrow sense) or only a scanning function (i.e., be a scanner in the narrow sense).
  • Conversely, the multifunction peripheral may be regarded as a printer (in the broad sense) or a scanner (in the broad sense).
  • The external authentication device is not limited to a configuration that conforms to the concept of a general server; it may be, for example, another image processing device. It is also possible, however, to regard that other image processing device as a server.
  • The image processing device basically stores authentication information and biometric information in association with each other, but, if necessary, authentication information and/or biometric information may be acquired from another communication device (e.g., a server, a terminal, or another image processing device).
  • In the present disclosure, biometric authentication may be referred to as local authentication.
  • Authentication at the external authentication device (the server 5) may be referred to as server authentication.

Abstract

This image processing device comprises an image processing unit, a detection unit, a first memory, a communication unit, and a control unit. The image processing unit includes a printer and/or a scanner. The detection unit detects biometric information about a user. The first memory stores first biometric information and authentication information in association with each user. The communication unit transmits, to an external authentication device, information based on the authentication information stored in the first memory in association with the first biometric information that matches second biometric information about a first user detected by the detection unit. The control unit controls the lifting of restrictions imposed on a function related to the image processing unit and/or the communication unit on the basis of an authentication result received from the external authentication device.
PCT/JP2022/032788 2022-08-31 2022-08-31 Image processing device and communication system WO2024047800A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032788 WO2024047800A1 (fr) 2022-08-31 2022-08-31 Image processing device and communication system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2022/032788 WO2024047800A1 (fr) 2022-08-31 2022-08-31 Image processing device and communication system

Publications (1)

Publication Number Publication Date
WO2024047800A1 (fr)

Family

ID=90098952

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/032788 WO2024047800A1 (fr) 2022-08-31 2022-08-31 Image processing device and communication system

Country Status (1)

Country Link
WO (1) WO2024047800A1 (fr)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002152446A * 2000-11-09 2002-05-24 Ricoh Co Ltd Multifunction peripheral system, menu display method therefor, and recording medium
JP2002183093A * 2000-12-12 2002-06-28 Canon Inc Control device, control method for control device, and storage medium
JP2004077990A * 2002-08-21 2004-03-11 Canon Inc Image forming apparatus
JP2007293813A * 2006-03-28 2007-11-08 Canon Inc Image forming apparatus, control method therefor, system, program, and storage medium
JP2011054120A * 2009-09-04 2011-03-17 Konica Minolta Business Technologies Inc Image processing apparatus, image processing system, and user authentication method
JP2013025057A * 2011-07-21 2013-02-04 Fuji Xerox Co Ltd Image forming apparatus and program
JP2014186602A * 2013-03-25 2014-10-02 Konica Minolta Inc Authentication system, information processing apparatus, authentication method, and program
JP2015045916A * 2013-08-27 2015-03-12 Sharp Corp Authentication device and image forming apparatus
JP2016525751A * 2013-07-15 2016-08-25 Qualcomm Inc Method and integrated circuit for operating a sensor array
JP2019145152A * 2013-06-18 2019-08-29 ARM IP Ltd Trusted device
JP2021136664A * 2020-02-28 2021-09-13 Canon Inc Device, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22957398

Country of ref document: EP

Kind code of ref document: A1