WO2015098253A1 - Electronic device - Google Patents

Electronic device

Info

Publication number
WO2015098253A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
user
information
input
electronic device
Prior art date
Application number
PCT/JP2014/077616
Other languages
English (en)
Japanese (ja)
Inventor
志賀啓
内山洋治
伊三木一皇
有馬由桂
降矢雄飛
関口政一
關口直樹
村越雄
Original Assignee
Nikon Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation
Publication of WO2015098253A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44 - Program or device authentication
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 - Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 - Constructional details or arrangements
    • G06F1/1613 - Constructional details or arrangements for portable computers
    • G06F1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/32 - User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 - Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 - Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31 - User authentication
    • G06F21/40 - User authentication by quorum, i.e. whereby two or more security principals are required
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present invention relates to an electronic device.
  • In Patent Document 1, sufficient consideration is not given to improving the security of the wearable terminal itself or to improving the security of other devices that use the wearable terminal.
  • the present invention has been made in view of the above-described problems, and an object thereof is to provide an electronic device capable of improving security.
  • the electronic device of the present invention includes a first input unit that inputs an authentication result of a first part of the user, a second input unit that inputs an authentication result of a second part of the user, and a generation unit that generates setting information related to the user's operation based on the input information of the first and second input units.
  • a transmission unit that transmits the setting information to an external device may be provided. Further, the transmission unit may transmit the setting information by proximity communication or human body communication via the user.
  • the electronic apparatus of the present invention may further include a setting unit that performs settings related to the user's operation using the setting information.
  • the setting unit may set a restriction on the user's operation based on the input results of the first and second input units.
  • the first input unit may input an authentication result related to the user's eyes.
  • the second input unit may input an authentication result related to the user's hand.
  • the electronic device of the present invention includes an input unit that inputs an authentication result of a part of the user's body, and a transmission unit that transmits input information of the input unit to an external device through human body communication.
  • a generation unit that generates setting information related to the user's operation based on the authentication result input by the input unit may be provided.
  • the electronic apparatus includes a storage unit that stores information, an imaging unit that performs imaging, a determination unit that determines whether the imaging unit has captured information related to information stored in the storage unit, and a notification unit that notifies the user according to the determination result of the determination unit.
  • information captured using the imaging unit may be stored in the storage unit.
  • the storage unit stores text data
  • the determination unit may perform the determination based on text data generated from imaging data captured by the imaging unit and text data stored in the storage unit.
  • a transmission unit that transmits the determination result of the determination unit to an external device may be provided.
  • the transmission unit may transmit the information by proximity communication or human body communication via the user.
  • the electronic apparatus includes an imaging unit that captures an image containing character information, a determination unit that determines whether the character information captured by the imaging unit relates to communication with an external device, and a notification unit that notifies the user of the determination result of the determination unit.
  • the electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eyes, and a generation unit that generates setting information related to the user's operation based on the input information of the input unit.
  • the generation unit may limit the setting information based on an authentication result input by the input unit.
  • the electronic device of the present invention includes an input unit that inputs an authentication result related to the user's eyes, and a transmission unit that transmits input information of the input unit to an external device through human body communication.
  • the transmission unit may transmit information related to use restriction of the external device by the user.
  • the electronic device of the present invention has an effect that security can be improved.
  • FIG. 8B is a diagram showing an example of a user authentication table stored in the storage unit of the wristwatch-type device. FIG. 9 is a diagram showing the configuration of the electronic device system according to the second embodiment. FIG. 10 is a flowchart showing an example of the processing of the eyeglass-type device according to the second embodiment.
  • FIG. 11A and FIG. 11B are diagrams for explaining the processing in step S76 of FIG. 10.
  • FIG. 1 is a block diagram showing the configuration of an electronic device system 100 according to the first embodiment.
  • the eyeglass-type device 10 includes an imaging unit 11, a display unit 12, an operation unit 13, a microphone 14, a storage unit 15, a communication unit 16, a retina information acquisition unit 18, an authentication unit 19, a control unit 17, and the like.
  • the eyeglass-type device 10 is shown in a perspective view.
  • the eyeglass-type device 10 includes an eyeglass-type frame 110. The components of the eyeglass-type device 10 that are illustrated in FIG. 1 but not in FIG. 3 are provided inside the frame 110 or in a part of the frame 110.
  • the imaging unit 11 includes a lens, an imaging device, an image processing unit, and the like, and captures still images and moving images. As shown in FIG. 3, the imaging unit 11 is provided near the end of the frame 110 (near the user's right eye). For this reason, when the user wears the glasses-type device 10 (the state shown in FIG. 2), it is possible to capture an image in the direction in which the user is facing (looking at).
  • the operation unit 13 is a touch pad provided on the frame 110, detects the movement of the user's finger, receives an operation from the user, and transmits the received operation information to the control unit 17.
  • the details of the imaging unit 11, the display unit 12, and the operation unit 13 are also disclosed in, for example, US Published Patent No. 2013/0044042.
  • the storage unit 15 is a non-volatile semiconductor memory such as a flash memory, and stores, for example, image data captured by the imaging unit 11, data used for authentication by the authentication unit 19 (see the user authentication table in FIG. 8A), display data to be displayed on the display unit 12, and various programs.
  • the communication unit 16 performs human body communication with other devices.
  • the communication unit 16 includes an electrode unit 16a that is provided on the frame 110 and can contact a user, and a human body communication unit 16b that performs human body communication using the electrode unit 16a.
  • Human body communication includes a current method, in which a weak current is passed through the human body and modulated to transmit information, and an electric field method, in which information is transmitted by modulating an electric field induced on the surface of the human body. In this embodiment, either the current method or the electric field method can be used. Note that human body communication is possible not only when the user is in direct contact with the electrode portion 16a of the eyeglass-type device 10 but also when the electrode portion 16a and the user are not in direct contact.
  • the retina information acquisition unit 18 includes an infrared irradiation unit and an infrared light receiving unit.
  • the retinal information acquisition unit 18 detects (scans) the path of blood vessels on the retina by irradiating the eyeball with infrared rays from the infrared irradiation unit and receiving the infrared rays reflected by the eyeballs with the infrared light reception unit. Then, the retinal information acquisition unit 18 outputs the detection result to the control unit 17.
  • the retinal information acquisition unit 18 is provided on the back side (face side) of the imaging unit 11 in FIG. 2 as an example.
  • the control unit 17 comprehensively controls the entire eyeglass-type device 10.
  • the control unit 17 includes a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like.
  • the control unit 17 performs control of the imaging unit 11 and the retinal information acquisition unit 18, control of the authentication unit 19 and the display unit 12, and the like.
  • the wristwatch type device 30 includes a display unit 31, an operation unit 32, a vein information acquisition unit 33, a storage unit 34, an authentication unit 35, a communication unit 36, a control unit 37, and the like.
  • the display unit 31 includes an organic EL (Electro-Luminescence) display, a liquid crystal display, and the like, and displays various types of information under instructions from the control unit 37.
  • the operation unit 32 includes a touch panel, buttons, and the like, receives user operations, and transmits them to the control unit 37.
  • the vein information acquisition unit 33 is provided at a position facing the user's wrist when the user wears the wristwatch-type device 30 (the state shown in FIG. 2).
  • the vein information acquisition unit 33 includes an infrared irradiation unit and an infrared light reception unit.
  • the vein information acquisition unit 33 detects (scans) the shape of the vein by irradiating the wrist with infrared rays from the infrared irradiation unit and receiving infrared rays reflected by the veins on the wrist with the infrared light reception unit.
  • the vein information acquisition unit 33 outputs the detection result to the control unit 37.
  • the storage unit 34 is a non-volatile semiconductor memory such as a flash memory, for example, data used for authentication of the authentication unit 35 (see the user authentication table in FIG. 8B), display data displayed on the display unit 31, And various programs are stored.
  • the communication unit 36 performs human body communication with other devices. Note that the communication unit 36 may perform near field communication with other devices. In the present embodiment, the communication unit 36 communicates with the communication unit 16 of the eyeglass-type device 10 and the first communication unit 24 of the PC 20 using human body communication.
  • the communication unit 36 includes an electrode unit 24a that can be in contact with the user's arm, and a human body communication unit 24b that performs human body communication via the electrode unit 24a.
  • the communication unit 36 performs human body communication with the communication unit 16 of the glasses-type device 10 described above.
  • the PC 20 includes a display unit 21, an operation unit 22, a storage unit 23, a first communication unit 24, a second communication unit 25, a control unit 26, and the like.
  • the PC 20 may be a desktop PC as shown in FIG. 3, or may be a notebook PC, a tablet terminal, or a smartphone.
  • the second communication unit 25 communicates with other devices wirelessly or by wire.
  • the second communication unit 25 communicates with a web server (not shown) using the Internet or the like.
  • the control unit 26 includes a CPU, a RAM, a ROM, and the like, and controls the entire PC 20. In the present embodiment, the control unit 26 controls information communication with the eyeglass-type device 10 and the wristwatch-type device 30. In addition, the control unit 26 exchanges information with a Web server or the like to display a Web site on the display unit 21 or perform various information processing.
  • the pairing is a state in which cooperative processing between a plurality of devices can be executed. Further, when the power supply of one device is off, the power supply of the device that is turned off may be turned on by establishing human body communication or proximity communication with the other device. In the present embodiment, it is assumed that human body communication can be performed between paired devices.
  • FIG. 4 is a flowchart illustrating an example of authentication instruction transmission processing by the PC 20.
  • the process of FIG. 4 is a process executed while the PC 20 is powered on.
  • the control unit 26 determines whether or not use restriction information is necessary.
  • the use restriction information is necessary, for example, when making a transfer on a bank (net banking) site, when making a credit card payment, or when viewing a document file for which confidentiality is set.
  • In such a case, the determination in step S10 is affirmed and the process proceeds to step S12. Otherwise, step S10 is repeated at predetermined time intervals.
  • When the process proceeds to step S12, the control unit 26 transmits an authentication instruction to the paired eyeglass-type device 10 and wristwatch-type device 30. After the process of step S12 is performed, the process returns to step S10.
  • FIG. 5 is a flowchart illustrating an example of processing performed by the glasses-type device 10.
  • the process of FIG. 5 is a process executed while the glasses-type device 10 is powered on.
  • the control unit 17 stands by until an authentication instruction is received from the PC 20.
  • the process proceeds to step S22 at the timing when the process of step S12 of FIG. 4 described above is performed.
  • In step S22, the control unit 17 instructs the retinal information acquisition unit 18 to acquire the user's retinal information.
  • In step S24, the control unit 17 instructs the authentication unit 19 to execute user authentication of the eyeglass-type device 10 using the acquired retinal information.
  • the authentication unit 19 performs user authentication with reference to the user authentication table (FIG. 8A) stored in the storage unit 15.
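The patent does not specify how the authentication unit matches acquired biometric information against the user authentication table; the following is a minimal sketch under assumed conventions (retinal patterns as bit strings, a similarity threshold, and the names `user_auth_table`, `match_score`, and `authenticate` are all illustrative, not taken from the patent).

```python
AUTH_THRESHOLD = 0.90  # assumed similarity threshold

# Hypothetical stand-in for the user authentication table of FIG. 8A:
# registered user ID -> stored retinal template (here, a bit string).
user_auth_table = {
    "user01": "1011001110",
}

def match_score(acquired: str, template: str) -> float:
    """Fraction of positions at which the two patterns agree."""
    hits = sum(a == b for a, b in zip(acquired, template))
    return hits / max(len(template), 1)

def authenticate(acquired: str) -> bool:
    """Return True if the acquired pattern matches any registered template."""
    return any(match_score(acquired, tpl) >= AUTH_THRESHOLD
               for tpl in user_auth_table.values())
```

The same scheme would apply to the vein-based authentication of the wristwatch-type device, with a different table (FIG. 8B).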
  • In step S26, the control unit 17 determines whether or not an authentication result has been input from the wristwatch-type device 30.
  • The determination in step S26 is negative when the user is not wearing the wristwatch-type device 30. If the determination is negative, the process proceeds to step S28; if it is affirmative, the process proceeds to step S30. When the process proceeds to step S28, the control unit 17 determines that user authentication by the wristwatch-type device 30 has not been performed.
  • In step S30, the control unit 17 determines whether or not the user authentication result of the eyeglass-type device 10 matches the user authentication result of the wristwatch-type device 30.
  • If the determination in step S30 is affirmative, the process proceeds to step S32, and the control unit 17 generates use restriction information (none).
  • The use restriction information (none) is information indicating that the user can make transfers and the like on the net banking site without an amount restriction, that there is no restriction on the amount used in credit card payments, or that the user can use a document file for which confidentiality is set without limitation.
  • In step S40, the control unit 17 transmits the use restriction information (none) to the PC 20 via the communication unit 16, and ends the process of FIG. 5.
  • If the determination in step S30 is negative, that is, if the user authentication results do not match, the process proceeds to step S34, where the control unit 17 determines whether user authentication failed in only one of the eyeglass-type device 10 and the wristwatch-type device 30.
  • If the determination in step S34 is affirmative, that is, if user authentication succeeded in either the eyeglass-type device 10 or the wristwatch-type device 30, the process proceeds to step S36, and the control unit 17 generates use restriction information (partial restriction).
  • Use restriction information (partial restriction) is information indicating that the user can make transfers and the like on the net banking site under a certain amount restriction, that a restriction is placed on the amount that can be used in credit card payments compared to the unrestricted case, or that the user can use a document file for which confidentiality is set under a predetermined restriction.
  • After step S36, the control unit 17 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 16 in step S40, and ends the entire process of FIG. 5.
  • If the determination in step S34 is negative, that is, if user authentication failed in both the eyeglass-type device 10 and the wristwatch-type device 30, the process proceeds to step S38, and the control unit 17 generates use restriction information (all restrictions).
  • Use restriction information (all restrictions) is information indicating that the user cannot make transfers on the net banking site, that credit card payments cannot be made at all, and that the user cannot use document files for which confidentiality is set.
  • After step S38, the control unit 17 transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 16 in step S40, and ends the entire process of FIG. 5.
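The branching of steps S30 through S38 can be summarized as a small decision function; the function name and the string return values are illustrative stand-ins for the three kinds of use restriction information.

```python
def use_restriction(glasses_auth_ok: bool, watch_auth_ok: bool) -> str:
    """Sketch of steps S30-S38 of FIG. 5 (names are illustrative)."""
    if glasses_auth_ok and watch_auth_ok:   # S30 affirmative -> S32
        return "none"
    if glasses_auth_ok or watch_auth_ok:    # S34 affirmative -> S36
        return "partial"
    return "all"                            # S34 negative -> S38
```

Treating a missing watch result (step S28) as a failed watch authentication reproduces the same three-way outcome.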
  • FIG. 6 is a flowchart illustrating an example of processing by the wristwatch type device 30.
  • the process of FIG. 6 is a process executed while the wristwatch type device 30 is powered on.
  • the control unit 37 stands by until an authentication instruction is received from the PC 20.
  • the control unit 37 proceeds to step S52.
  • In step S52, the control unit 37 instructs the vein information acquisition unit 33 to acquire the user's vein information.
  • In step S54, the control unit 37 instructs the authentication unit 35 to execute user authentication using the acquired vein information.
  • the authentication unit 35 performs user authentication with reference to the user authentication table (FIG. 8B) stored in the storage unit 34.
  • In step S56, the control unit 37 transmits the authentication result of step S54 to the eyeglass-type device 10 via the communication unit 36.
  • The control unit 37 transmits information indicating authentication success or authentication failure to the eyeglass-type device 10 as the authentication result.
  • In step S58, the control unit 37 determines whether or not the authentication result has been transmitted to the eyeglass-type device 10. If the determination here is affirmative, the entire process of FIG. 6 ends. This is because, when the authentication result has been transmitted, the processing that uses it (S30 to S40 in FIG. 5) is performed in the eyeglass-type device 10 as described above.
  • If the determination in step S58 is negative, that is, if the authentication result could not be transmitted because human body communication with the eyeglass-type device 10 is not established, the control unit 37 proceeds to step S60.
  • In step S60, the control unit 37 determines whether or not the user authentication in step S54 was successful. If the determination in step S60 is affirmative, the process proceeds to step S62, and the control unit 37 generates use restriction information (partial restriction). After step S62, the control unit 37 transmits the use restriction information (partial restriction) to the PC 20 via the communication unit 36 in step S66, and ends the entire process of FIG. 6.
  • The process proceeds to step S64 when user authentication in the eyeglass-type device 10 has not been performed and user authentication in the wristwatch-type device 30 has failed. For this reason, in step S64, the control unit 37 generates use restriction information (all restrictions). The control unit 37 then transmits the use restriction information (all restrictions) to the PC 20 via the communication unit 36 in step S66, and ends the entire process of FIG. 6.
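The wristwatch-side fallback of steps S58 through S66 can be sketched in the same style; the function name, boolean inputs, and string/None return values are illustrative assumptions.

```python
def watch_fallback_restriction(sent_to_glasses: bool, watch_auth_ok: bool):
    """Sketch of steps S58-S66 of FIG. 6. Returns None when the
    eyeglass-type device received the result and decides instead."""
    if sent_to_glasses:       # S58 affirmative: glasses-side handles S30-S40
        return None
    if watch_auth_ok:         # S60 affirmative -> S62: partial restriction
        return "partial"
    return "all"              # S60 negative -> S64: all restrictions
```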
  • FIG. 7 is a flowchart illustrating an example of usage restriction information utilization processing by the PC 20.
  • the control unit 26 stands by until receiving use restriction information. That is, the control unit 26 proceeds to step S16 at the timing when step S40 of FIG. 5 is executed in the eyeglass-type device 10 or when step S66 of FIG. 6 is executed in the wristwatch-type device 30.
  • The control unit 26 executes a process based on the use restriction information. For example, when use restriction information (none) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (none) to the server that manages the net banking site. In this case, the server removes the restriction on the user's usage amount based on the use restriction information (none). As a result, the user can make transfers and the like on the net banking site without an amount limit. Similarly, when use restriction information (partial restriction) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (partial restriction) to the server that manages the net banking site.
  • In this case, the server sets a limit on the user's usage amount based on the use restriction information (partial restriction). As a result, the user can make transfers and the like on the net banking site under a predetermined amount limit. Further, when use restriction information (all restrictions) is received while the user is accessing the net banking site, the control unit 26 transmits the use restriction information (all restrictions) to the server that manages the net banking site. In this case, the server prohibits the user's use based on the use restriction information (all restrictions). As a result, the user cannot make transfers and the like on the net banking site. Note that the same processing is performed at the time of credit card settlement.
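The server-side amount check just described might look like the following; the concrete limit amount, the function name, and its parameters are illustrative assumptions, since the patent only specifies the three restriction states.

```python
PARTIAL_LIMIT = 50_000  # assumed per-transfer ceiling under partial restriction

def transfer_allowed(restriction: str, amount: int) -> bool:
    """Decide whether a transfer of `amount` may proceed on the server."""
    if restriction == "none":
        return True                     # no amount limit
    if restriction == "partial":
        return amount <= PARTIAL_LIMIT  # predetermined amount limit
    return False                        # "all": transfers prohibited
```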
  • When use restriction information (none) is received while the user is accessing a document file for which confidentiality is set, the control unit 26 permits the user to access the document file.
  • When use restriction information (partial restriction) is received, the control unit 26 determines whether the confidentiality level set in the document file is high, and permits the user to access the document file only when the level is not high.
  • When use restriction information (all restrictions) is received, the control unit 26 prohibits the user from accessing the document file for which confidentiality is set.
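The document-file handling above amounts to a small policy check; the integer confidentiality levels and the threshold are illustrative assumptions (the patent only distinguishes high and low confidentiality).

```python
def may_open_document(restriction: str, confidentiality: int,
                      low_threshold: int = 1) -> bool:
    """Sketch of the PC-side document access decision (names assumed)."""
    if restriction == "none":
        return True                            # unrestricted access
    if restriction == "partial":
        return confidentiality <= low_threshold  # low-confidentiality only
    return False                               # "all": access prohibited
```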
  • As described above, in the present embodiment, the control unit 17 of the eyeglass-type device 10 generates setting information (use restriction information) related to the user's operation based on the user authentication result using the user's retina and the user authentication result using the user's vein (S32, S36, S38). Accordingly, appropriate use restriction information can be generated based on the two user authentication results, and the security of the apparatus can be improved by using it.
  • In the present embodiment, since the communication unit 16 of the eyeglass-type device 10 transmits the use restriction information to the PC 20 by human body communication or proximity communication, the control unit 17 can acquire the user authentication information of a user who is in the vicinity of the PC 20, that is, a user who is likely to actually use the PC 20.
  • The control unit 37 of the wristwatch-type device 30 acquires an authentication result of a part of the user's body (an arm vein) and transmits it to the eyeglass-type device 10 by human body communication via the communication unit 36. Thereby, user authentication results can be exchanged between worn devices.
  • In the above description, the eyeglass-type device 10 and the wristwatch-type device 30 generate use restriction information for setting user operations on the PC 20 based on two user authentication results, but the present invention is not limited to this.
  • the eyeglass-type device 10 or the wristwatch-type device 30 may generate use restriction information for performing user operation settings in the eyeglass-type device 10 or the wristwatch-type device 30. Thereby, the security in the glasses-type device 10 and the wristwatch-type device 30 can be improved.
  • the user authentication in the eyeglass-type device 10 is performed using the retina information.
  • the present invention is not limited to this, and iris information may be used in the authentication related to the user's eyes.
  • iris authentication is disclosed, for example, in Japanese Patent Application Laid-Open No. 2013-148961 (Patent No. 5360931).
  • user authentication in the wristwatch-type device 30 is performed using vein information.
  • fingerprint information may be used in authentication related to a user's hand.
  • the user authentication method is not limited to eyes and hands, and authentication using other parts of the user may be employed.
  • the user authentication process using the eyeglass-type device 10 and the wristwatch-type device 30 may be executed when the user views the display unit 21 of the PC 20.
  • When the control unit 17 recognizes from the captured image that the screen of the display unit 21 has been captured by the imaging unit 11, the control unit 17 starts the process of FIG. 5.
  • In this case, the control unit 17 may notify the control unit 37 of the wristwatch-type device 30.
  • The wristwatch-type device 30 then performs the process of FIG. 6 at the notified timing.
  • The control unit 17 may determine whether or not the screen of the display unit 21 has been captured by the imaging unit 11 based on whether the captured image includes a characteristic portion of the display unit 21 or a characteristic portion of the image displayed on the screen of the display unit 21.
  • In the above embodiment, the wristwatch-type device 30 transmits the user authentication result to the eyeglass-type device 10 (S56), but the present invention is not limited to this. The eyeglass-type device 10 may transmit a user authentication result to the wristwatch-type device 30.
  • In the above description, the use restriction information generated based on the user authentication result of the eyeglass-type device 10 and that of the wristwatch-type device 30 is transmitted to the PC 20, but the present invention is not limited to this.
  • the user authentication result of the eyeglass-type device 10 and the user authentication result of the wristwatch-type device 30 may be transmitted to the PC 20.
  • the control unit 26 of the PC 20 may generate usage restriction information based on both user authentication results.
  • In the above embodiment, the eyeglass-type device 10 and the wristwatch-type device 30 are adopted as devices worn by a person, but the present invention is not limited to this; other wearable devices such as a contact lens type terminal may be used.
  • In the above description, the device that receives the use restriction information is the PC 20, but the present invention is not limited to this; other devices such as an imaging device may be used.
  • FIG. 9 is a block diagram showing the configuration of an electronic device system 100 ′ according to the second embodiment.
  • The electronic device system 100′ differs from the electronic device system 100 of the first embodiment in that the eyeglass-type device 10 includes an image recognition unit 101 and an OCR (Optical Character Recognition) unit 102. Note that at least one of the image recognition unit 101 and the OCR unit 102 may be provided in the PC 20 (which may be a tablet terminal or a smartphone), and the eyeglass-type device 10 may receive the result by human body communication or proximity communication.
  • The image recognition unit 101 extracts feature points from image data captured by the imaging unit 11 of the eyeglass-type device 10 using, for example, a feature detection algorithm such as SURF (Speeded-Up Robust Features) or SIFT (Scale-Invariant Feature Transform), and compares the image with the images stored in the storage unit 15.
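As an illustration of the comparison step above, the sketch below scores two images by the fraction of extracted feature points they share. It is a deliberately simplified stand-in: real SURF/SIFT descriptors are high-dimensional vectors matched by distance (for example via OpenCV), not hashable labels, and the feature names used here are invented for the example.

```python
def feature_similarity(features_a, features_b):
    """Similarity (%) between two images, given their extracted feature sets.

    Stand-in for descriptor matching: the fraction of features shared
    between the two images (Jaccard index), expressed as a percentage.
    """
    if not features_a and not features_b:
        return 100.0
    shared = len(set(features_a) & set(features_b))
    total = len(set(features_a) | set(features_b))
    return 100.0 * shared / total

# Two screenshots sharing most of their feature points score highly.
page_now = {"logo", "login-box", "footer", "menu"}
page_past = {"logo", "login-box", "footer", "banner"}
print(feature_similarity(page_now, page_past))  # 60.0
```

The similarity value produced here plays the role of the per-browsing-image score computed in step S76 below.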
  • The OCR unit 102 recognizes characters included in the image data captured by the imaging unit 11 of the eyeglass-type device 10 and converts them into text data.
  • For example, a URL included in an image of a website captured by the imaging unit 11 is recognized and converted into text data.
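A minimal sketch of this URL-recognition step: once the OCR unit has produced text, a URL can be pulled out with a regular expression. The regex and the sample OCR output are assumptions for illustration, not part of the original device.

```python
import re

# Assumed pattern: anything that looks like an http(s) URL in OCR output.
URL_PATTERN = re.compile(r"https?://[^\s\"'<>]+")

def extract_urls(ocr_text):
    """Return all URL-like strings found in OCR-recognized text."""
    return URL_PATTERN.findall(ocr_text)

ocr_output = "Internet bank - Sign in  https://bank.example.com/login  Help"
print(extract_urls(ocr_output))  # ['https://bank.example.com/login']
```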
  • The process of FIG. 10 is executed when the user instructs execution of the phishing countermeasure process through the operation unit 13.
  • In step S70, the control unit 17 acquires a captured image from the imaging unit 11. Note that the imaging unit 11 captures still images at predetermined time intervals.
  • In step S72, the control unit 17 instructs the image recognition unit 101 to determine, based on the captured image from the imaging unit 11, whether the user has viewed a browser on the display unit 21.
  • For example, the image recognition unit 101 refers to the captured image and determines whether the mark attached to the upper left corner of the window (the "*" mark in FIG. 11A) is a browser-specific mark. If the mark is unique to the browser, it determines that the user has viewed the browser.
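The corner-mark test of step S72 can be sketched as a direct patch comparison: extract the top-left patch of the captured screen image and check it against registered browser marks. The 0/1 pixel grids and the 2x2 patch size are invented for the example; a real image recognition unit 101 would match against actual icon bitmaps.

```python
def corner_mark(image, size=2):
    """Extract the top-left size x size patch (image = nested list of pixels)."""
    return tuple(tuple(row[:size]) for row in image[:size])

def is_browser_window(image, known_marks):
    """True if the window's top-left mark matches a registered browser mark."""
    return corner_mark(image) in known_marks

# The asterisk-like mark of FIG. 11A stands in as a 2x2 patch of 0/1 "pixels".
browser_mark = ((1, 0), (0, 1))
known = {browser_mark}
screenshot = [[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 0]]
print(is_browser_window(screenshot, known))  # True
```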
  • If the determination in step S72 is negative, the process returns to step S70; if it is positive, the process proceeds to step S74.
  • In step S74, the control unit 17 temporarily stores the captured image.
  • In step S76, the control unit 17 instructs the image recognition unit 101 to compare the captured image with the past browsing images stored in the storage unit 15. For example, suppose the image captured by the imaging unit 11 this time is the image illustrated in FIG. 11A. In this case, the image recognition unit 101 compares the image shown in FIG. 11A with the past browsing images stored in the storage unit 15 and calculates the similarity with each browsing image.
  • In step S78, the control unit 17 determines whether there is a browsing image whose similarity is equal to or greater than a threshold value.
  • As the threshold value, a numerical value such as 80% to 90% can be adopted, for example. If the determination in step S78 is negative, that is, if there is no browsing image with a similarity equal to or greater than the threshold, the process proceeds to step S80, and the control unit 17 stores the temporarily stored captured image in the storage unit 15 as a past browsing image. Thereafter, the process returns to step S70.
  • On the other hand, suppose the image captured this time, shown in FIG. 11A, has a similarity equal to or greater than the threshold with a past browsing image such as that shown in FIG. 11B (an image captured when the website of "Internet bank" was browsed).
  • In this case, the determination in step S78 is positive, and the control unit 17 proceeds to step S82.
  • In step S82, the control unit 17 uses the OCR unit 102 to compare the URL included in the captured image with the URL of the browsing image. It is assumed that the control unit 17 recognizes where in the browser the URL of the Web site is written. Accordingly, the control unit 17 extracts the range in which the URL is described from both the captured image and the browsing image and causes the OCR unit 102 to convert each into text. The control unit 17 then compares the URLs converted into text. Note that the control unit 17 may instead compare the images of the URL portions using the image recognition unit 101.
  • In step S84, the control unit 17 determines, from the result of comparing the text URLs, whether the two URLs are the same.
  • If the URLs differ, the determination in step S84 is denied and the process proceeds to step S86, because the Web site included in the captured image may be a Web site aimed at phishing.
  • If the determination in step S84 is affirmative, that is, if the compared URLs are the same, the process returns to step S70 without passing through step S86.
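The branching of steps S78 through S86 can be condensed into one decision function. The 85% threshold is an assumption chosen from the 80% to 90% range mentioned above, and the return labels are illustrative names for the three outcomes (store as a past browsing image, no action, warn).

```python
def phishing_check(similarity, same_url, threshold=85.0):
    """Decide the next action in the FIG. 10 flow.

    similarity: best similarity (%) against stored browsing images (step S78)
    same_url:   whether the OCR-extracted URLs matched (step S84)
    """
    if similarity < threshold:
        return "store"   # S80: keep the capture as a new past browsing image
    if same_url:
        return "ok"      # S84 affirmative: same site, nothing to do
    return "warn"        # S86: looks like a known site but the URL differs

print(phishing_check(40.0, same_url=False))  # store
print(phishing_check(92.0, same_url=True))   # ok
print(phishing_check(92.0, same_url=False))  # warn
```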
  • As described above, in the second embodiment, the control unit 17 compares the URL included in the captured image captured by the imaging unit 11 with the URL included in a past browsing image stored in the storage unit 15, determines whether the imaging unit 11 has captured information related to that past browsing image (that is, whether the Web site viewed by the user is the same as the Web site included in the past browsing image), and warns the user according to the determination result. This makes it possible to prevent the user from accessing a fake Web site (such as a phishing scam site) that at first glance resembles a Web site browsed in the past.
  • In the second embodiment, the control unit 17 stores captured images captured by the imaging unit 11 in the storage unit 15 as appropriate (S80). This makes it possible to add captured images to the storage unit 15 for use as past browsing images.
  • In the second embodiment, the control unit 17 transmits to the PC 20 the determination result as to whether the Web site viewed by the user is the same as the Web site included in the past browsing image, and causes the PC 20 to warn the user. This makes it possible to display the warning in an easy-to-understand manner, at an appropriate position, for the user of the PC 20.
  • the URL may be converted into text at the timing (S80) when the past browsing image is stored in the storage unit 15, and stored in the storage unit 15 in association with the past browsing image.
  • the control unit 17 of the glasses-type device 10 performs image comparison and URL comparison.
  • the comparison of images and URLs may be performed by the PC 20, the wristwatch type device 30, or a server (cloud server) connected to the PC 20.
  • the control unit 17 of the glasses-type device 10 may transmit the image captured by the imaging unit 11 to the PC 20, the wristwatch-type device 30, or the server as needed.
  • In the second embodiment described above, the control unit 17 of the eyeglass-type device 10 compares the URL included in the captured image captured by the imaging unit 11 with the URL included in the past browsing image stored in the storage unit 15, and thereby determines whether the imaging unit 11 has captured information related to the past browsing image.
  • However, the method is not limited to this. The control unit 17 of the eyeglass-type device 10 may communicate with a predetermined Web server or the like via the Internet, look up the URL included in the captured image captured by the imaging unit 11, determine whether it is the URL of a fake (spoofed) Web site, and output (notify) the determination result to the user. This eliminates the need to store the URLs included in past browsing images in the storage unit 15, which simplifies the process and reduces the amount of data stored in the storage unit 15.
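A sketch of this variation's fake-site lookup, with a local set standing in for the remote Web server query; the hostnames listed are hypothetical examples, not real reported sites.

```python
from urllib.parse import urlsplit

# Hypothetical data: in the variation described above, this lookup would be a
# query to a remote Web server rather than a local set.
KNOWN_FAKE_HOSTS = {"bank-example.com.evil.test", "examp1e-bank.test"}

def is_fake_site(url):
    """True if the URL's host is reported as a fake (spoofed) site."""
    return urlsplit(url).hostname in KNOWN_FAKE_HOSTS

print(is_fake_site("https://examp1e-bank.test/login"))  # True
print(is_fake_site("https://bank.example.com/login"))   # False
```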
  • Note that the functions of the eyeglass-type device 10 and the wristwatch-type device 30 may be divided differently. For example, the display function may be given to a contact lens type terminal provided with a display unit, while other functions are given to the eyeglass-type device or the wristwatch-type device.
  • The above processing functions can be realized by a computer. In that case, a program describing the processing contents of the functions that the processing device (CPU) should have is provided.
  • the program describing the processing contents can be recorded on a computer-readable recording medium (except for a carrier wave).
  • When the program is distributed, it is sold, for example, in the form of a portable recording medium, such as a DVD (Digital Versatile Disc) or a CD-ROM (Compact Disc Read Only Memory), on which the program is recorded. It is also possible to store the program in a storage device of a server computer and transfer the program from the server computer to another computer via a network.
  • the computer that executes the program stores, for example, the program recorded on the portable recording medium or the program transferred from the server computer in its own storage device. Then, the computer reads the program from its own storage device and executes processing according to the program. The computer can also read the program directly from the portable recording medium and execute processing according to the program. Further, each time the program is transferred from the server computer, the computer can sequentially execute processing according to the received program.


Abstract

The invention concerns an electronic device provided with: a first input unit (18), to which an authentication result concerning a first part of a user is input; a second input unit (16), to which an authentication result concerning a second part of the user is input; and a generation unit (17) that, from the information input from the first and second input units, generates setting information related to an operation by the user.
PCT/JP2014/077616 2013-12-26 2014-10-16 Dispositif électronique WO2015098253A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2013-269755 2013-12-26
JP2013269755 2013-12-26

Publications (1)

Publication Number Publication Date
WO2015098253A1 true WO2015098253A1 (fr) 2015-07-02

Family

ID=53478132

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2014/077616 WO2015098253A1 (fr) 2013-12-26 2014-10-16 Dispositif électronique

Country Status (1)

Country Link
WO (1) WO2015098253A1 (fr)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002312324A (ja) * 2001-04-13 2002-10-25 Sony Corp リストバンド型認証デバイス、認証システム、情報出力デバイス
JP2003058509A (ja) * 2001-08-15 2003-02-28 Sony Corp 認証処理システム、認証処理方法、および認証デバイス、並びにコンピュータ・プログラム
JP2003060635A (ja) * 2001-08-13 2003-02-28 Sony Corp 個人認証システム、個人認証方法、および認証デバイス、並びにコンピュータ・プログラム
JP2005528662A (ja) * 2001-08-28 2005-09-22 ヒューレット・パッカード・カンパニー 生体認証使用者確認を使用する使用者装着可能な無線トランザクション装置
JP2007179206A (ja) * 2005-12-27 2007-07-12 Fujitsu Fip Corp 情報通信システム、及び、ツールバー提供サーバ、情報提供サーバ、不正サイト検出方法、並びに、ツールバープログラム
JP2008067218A (ja) * 2006-09-08 2008-03-21 Sony Corp 撮像表示装置、撮像表示方法
JP2008198028A (ja) * 2007-02-14 2008-08-28 Sony Corp ウェアラブル装置、認証方法、およびプログラム
JP2010516007A (ja) * 2007-01-16 2010-05-13 インターナショナル・ビジネス・マシーンズ・コーポレーション コンピュータ不正行為を検出するための方法及び装置


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020021811A1 (fr) * 2018-07-25 2020-01-30 日本電信電話株式会社 Dispositif d'analyse, procédé d'analyse et programme d'analyse
JPWO2020021811A1 (ja) * 2018-07-25 2021-02-15 日本電信電話株式会社 解析装置、解析方法及び解析プログラム
WO2021204947A1 (fr) * 2020-04-08 2021-10-14 Heba Bevan Systèmes de détection d'infections et de maladies
GB2611919A (en) * 2020-04-08 2023-04-19 Bevan Heba Infection and disease sensing systems


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14875011

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

NENP Non-entry into the national phase

Ref country code: JP

122 Ep: pct application non-entry in european phase

Ref document number: 14875011

Country of ref document: EP

Kind code of ref document: A1