WO2016117061A1 - Terminal portatif et système de traitement d'informations utilisant ce terminal - Google Patents

Terminal portatif et système de traitement d'informations utilisant ce terminal

Info

Publication number
WO2016117061A1
Authority
WO
WIPO (PCT)
Prior art keywords
wearable terminal
authentication
image
user
remote access
Prior art date
Application number
PCT/JP2015/051605
Other languages
English (en)
Japanese (ja)
Inventor
牧人 宇川
Original Assignee
株式会社野村総合研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社野村総合研究所 filed Critical 株式会社野村総合研究所
Priority to PCT/JP2015/051605 priority Critical patent/WO2016117061A1/fr
Publication of WO2016117061A1 publication Critical patent/WO2016117061A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82Protecting input, output or interconnection devices
    • G06F21/84Protecting input, output or interconnection devices output devices, e.g. displays or monitors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/86Secure or tamper-resistant housings

Definitions

  • the present invention relates to an information processing system that performs remote access by a wearable terminal.
  • wearable terminals that can be worn and carried around have been proposed and started to be put into practical use.
  • As a typical example, there is a device called a head-mounted display that is worn on the head like glasses and displays an image near the eye.
  • Patent Document 1 proposes a system that performs user authentication by reading the retina and iris of the user's eyeball using a head-mounted display. According to Patent Document 1, the burden on the user is reduced and information integrity is improved without relying on the user's morality.
  • However, Patent Document 1 does not take into account that the user may maliciously disassemble or modify the head-mounted display. For example, if the device that reads the retina and iris for authentication and the device that outputs the light rays of the image to be displayed are physically separated, the display image can be taken outside without it being detected that the head-mounted display has been removed from the user's head. If the image can be taken out, leakage of information held as electronic data may occur, for example through photographing with a digital camera.
  • An object of one aspect of the present invention is to provide a technique for improving information integrity in an information processing system capable of remote access.
  • A wearable terminal according to one aspect of the present invention is a wearable terminal worn on a user's head, and includes: a reading unit that acquires a biological information image obtained by photographing the user's eye; a projection unit that is configured inseparably from the reading unit and projects a display image along an optical path that overlaps the optical path through which the reading unit acquires the biological information image; a communication unit that communicates with a remote access server; and a processing unit that performs authentication of the user based on the biological information image and accesses the remote access server via the communication unit when a positive authentication result is obtained.
  • Because the reading unit and the projection unit are inseparable and the optical path through which the reading unit acquires the biological information image overlaps the optical path through which the projection unit projects the display image, it is difficult for not only a third party but also the user to extract the display image shown after biometric authentication by maliciously disassembling or modifying the wearable terminal. Data leakage during remote access is therefore suppressed, and information integrity is improved.
  • FIG. 1 is a perspective view of a wearable terminal 10 according to a first embodiment.
  • FIG. 2 is a block diagram of the wearable terminal 10 according to the first embodiment.
  • FIG. 3 is a diagram for explaining the mounting form of the reading unit 11 and the projection unit 12 of the wearable terminal 10.
  • FIG. 4 is a block diagram of an information processing system according to the first embodiment.
  • FIG. 5 is a table showing an example of information held by the wearable terminal 10 according to the first embodiment.
  • FIG. 6 is a table showing an example of user registration data held by an authentication server 21.
  • FIG. 7 is a sequence diagram illustrating a remote access operation of the information processing system according to the first embodiment.
  • FIG. 8 is a flowchart illustrating an operation at the start of remote access of the wearable terminal 10 according to the first embodiment.
  • FIG. 9 is a flowchart illustrating an operation during remote access of the wearable terminal 10 according to the first embodiment.
  • FIG. 10 is a block diagram of the wearable terminal 10 according to a modification of the first embodiment.
  • FIG. 11 is a block diagram of an information processing system according to a second embodiment.
  • FIG. 12 is a table showing an example of information held by the wearable terminal 10 according to the second embodiment.
  • FIG. 13 is a sequence diagram illustrating a remote access operation of the information processing system according to the second embodiment.
  • FIG. 14 is a flowchart illustrating an operation at the start of remote access of the wearable terminal 10 according to the second embodiment.
  • FIG. 15 is a flowchart illustrating an operation during remote access of the wearable terminal 10 according to the second embodiment.
  • FIG. 16 is a block diagram of the wearable terminal 10 according to a third embodiment.
  • FIG. 17 is a flowchart illustrating an operation during remote access of the wearable terminal 10 according to the third embodiment.
  • FIG. 18 is a block diagram of the wearable terminal 10 according to a fourth embodiment.
  • FIG. 1 is a perspective view of the wearable terminal according to the first embodiment.
  • the wearable terminal 10 is a glasses-type computer worn on the user's head.
  • the display screen of the computer is projected onto a translucent eyeglass lens, and the user can see the display screen and the surroundings at the same time.
  • An external keyboard 18, for example, can be connected to the wearable terminal 10 and used as an input device.
  • FIG. 2 is a block diagram of the wearable terminal 10 according to the first embodiment.
  • FIG. 3 is a diagram for explaining a mounting form of the reading unit 11 and the projection unit 12 of the wearable terminal 10.
  • FIG. 4 is a block diagram of the information processing system according to the first embodiment.
  • the wearable terminal 10 includes a reading unit 11, a projection unit 12, a processing unit 13, a communication unit 14, and an optical system 15.
  • the reading unit 11 acquires a biological information image obtained by photographing the user's eyes and transmits it to the processing unit 13.
  • the biological information image is an iris image as an example.
  • The projection unit 12 is configured inseparably from the reading unit 11 and projects a display image along an optical path that overlaps the optical path through which the reading unit 11 acquires the biological information image. As shown in FIG. 3, the reading unit 11 and the projection unit 12 are made inseparable by a sealing unit 16 formed by resin filling. If the projection unit 12 is forcibly separated from the reading unit 11, or if the optical path of the reading unit 11 and the optical path of the projection unit 12 are separated, the wearable terminal 10 itself is destroyed and becomes inoperable.
  • the wearable terminal 10 is connected to the authentication server 21 and the remote access server 22 via the network 23.
  • a plurality of wearable terminals 10 can be connected to the same authentication server 21 and remote access server 22.
  • the communication unit 14 of the wearable terminal 10 communicates with the authentication server 21 when executing authentication, and communicates with the remote access server 22 when performing remote access.
  • the communication method used by the communication unit 14 is not particularly limited, but an example is a wireless LAN based on Wi-Fi.
  • The processing unit 13 receives the biological information image from the reading unit 11, performs user authentication in conjunction with the authentication server 21 via the communication unit 14, and, if a positive authentication result is obtained, accesses the remote access server 22 via the communication unit 14.
  • In the first embodiment, authentication is performed by the wearable terminal 10 in conjunction with the authentication server 21, but the wearable terminal 10 may also perform the authentication alone.
  • As described above, the reading unit 11 and the projection unit 12 are inseparable, and the optical path through which the reading unit 11 acquires the biological information image overlaps, in hardware, the optical path through which the projection unit 12 projects the display image. Since the wearable terminal 10 therefore has tamper resistance, it is difficult to take out the display image by disassembling or modifying the wearable terminal 10 without destroying it, and neither a third party nor the user can maliciously extract the display image shown after authentication based on biometric information. Consequently, information integrity against fraud by the user is improved, and data leakage during remote access is suppressed.
  • In addition, because the reading unit 11 and the projection unit 12 are sealed by resin filling, it is difficult to take the display image outside by disassembling or deforming the wearable terminal 10, so suppression of data outflow can be realized easily.
  • The optical system 15 includes a half mirror 17 and forms the optical path along which the iris image travels from the eye 19 to the reading unit 11 and along which the display image travels from the projection unit 12 to the eye 19.
  • The optical system 15 also superimposes the light from surrounding objects and the light of the display image from the projection unit 12 so that both reach the user's eye 19. Since the user can see the background as well as the display image on the virtual display, the user can, for example, operate while looking at both the hand operating the keyboard 18 and the virtual display.
  • FIG. 5 is a table showing an example of information held by the wearable terminal 10 according to the first embodiment.
  • the wearable terminal 10 holds individual terminal IDs (terminal identification information).
  • FIG. 6 is a table showing an example of user registration data held by the authentication server 21.
  • the authentication server 21 associates the terminal ID of each wearable terminal 10 with the iris data of the user who uses the wearable terminal 10 and holds it as user registration data.
  • the iris data is a histogram representing the characteristics of the user's iris generated from the iris image, and is data that can identify individual users.
  • the iris data may be encrypted data.
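As a minimal illustrative sketch only (the document does not specify the feature extraction or the matching rule), the Python below stands in a plain intensity histogram for the iris data of FIGS. 5 and 6 and a simple cosine-similarity threshold for the match; the dictionary layout, the histogram choice, the threshold, and all names are hypothetical.

```python
import numpy as np

def iris_data_from_image(iris_image: np.ndarray, bins: int = 64) -> np.ndarray:
    """Reduce a grayscale iris image to a normalized intensity histogram."""
    hist, _ = np.histogram(iris_image, bins=bins, range=(0, 255), density=True)
    return hist

def iris_data_matches(candidate: np.ndarray, registered: np.ndarray,
                      threshold: float = 0.9) -> bool:
    """Hypothetical similarity test: cosine similarity above a fixed threshold."""
    denom = np.linalg.norm(candidate) * np.linalg.norm(registered) + 1e-12
    return float(np.dot(candidate, registered) / denom) >= threshold

# User registration data held by the authentication server 21 (cf. FIG. 6):
# terminal ID -> iris data of the user registered for that terminal.
rng = np.random.default_rng(0)
user_registration_data = {
    "TERMINAL-0001": iris_data_from_image(rng.integers(0, 256, size=(64, 64))),
}
```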
  • FIG. 7 is a sequence diagram illustrating a remote access operation of the information processing system according to the first embodiment.
  • the wearable terminal 10 reads an iris image from the user's eye 19 (step 101).
  • the wearable terminal 10 generates iris data from the read iris image and transmits it to the authentication server 21 together with its own terminal ID (step 102).
  • Upon receiving the terminal ID and the iris data, the authentication server 21 performs iris authentication of the user by checking whether data matching the combination of the received terminal ID and iris data exists in the user registration data held in advance (step 103). Here, it is assumed that such matching data exists in the user registration data, so a positive authentication result is obtained in iris authentication.
  • the authentication server 21 transmits the authentication result to the wearable terminal 10 (step 104). Further, since a positive authentication result is obtained, the authentication server 21 enables the wearable terminal 10 to communicate with the remote access server 22 by redirection (step 105).
  • the wearable terminal 10 that can communicate with the remote access server 22 transmits an authentication request for normal authentication to the remote access server 22 (step 106).
  • the remote access server 22 that has received the authentication request performs normal user authentication (step 107).
  • the remote access server 22 transmits the authentication result to the wearable terminal 10 and permits remote access (step 108).
  • the wearable terminal 10 for which remote access is permitted then securely accesses the remote access server 22 (step 109).
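The terminal-side half of this sequence can be pictured as follows. This is a rough, hypothetical sketch only; the document defines no concrete interfaces, so the callables for reading the iris, contacting the authentication server, and logging in to the remote access server are placeholders.

```python
from typing import Callable

def start_remote_access(read_iris_image: Callable[[], bytes],
                        make_iris_data: Callable[[bytes], bytes],
                        terminal_id: str,
                        auth_server_authenticate: Callable[[str, bytes], bool],
                        remote_access_login: Callable[[], bool]) -> bool:
    """Terminal-side flow mirroring steps 101-109 of FIG. 7 (illustrative only)."""
    iris_image = read_iris_image()            # step 101: read the iris image
    iris_data = make_iris_data(iris_image)    # step 102: derive iris data from it
    # steps 102-104: send (terminal ID, iris data) to the authentication server
    # and receive the authentication result.
    if not auth_server_authenticate(terminal_id, iris_data):
        return False                          # negative biometric result
    # steps 105-108: after redirection to the remote access server, perform the
    # server's normal authentication; step 109: secure remote access follows.
    return remote_access_login()
```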
  • As described above, the wearable terminal 10 transmits the iris data (authentication data) based on the iris image (biometric information) to the authentication server 21, which stores in advance user registration data for authenticating users by their biometric information, and receives from the authentication server 21 the result of authentication using the iris data and the user registration data. For this reason, it is not necessary to store the user's personal information in each wearable terminal 10.
  • Furthermore, the authentication server 21 stores the iris data and the terminal ID of the wearable terminal 10 in association with each other as user registration data, and the wearable terminal 10 adds the terminal ID of its own device to the iris data based on the iris image and transmits them together.
  • The authentication server 21 performs authentication by comparing the stored iris data and the corresponding terminal ID with the received iris data and the terminal ID added to it. Since authentication is performed with the user and the wearable terminal 10 as a pair, a legitimate user cannot perform remote access without using the wearable terminal 10 associated with that user, so higher security can be maintained.
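A minimal sketch of this pairing check on the authentication server 21 side: the received terminal ID selects the registered iris data, and only a match of the full pair yields a positive result. The registration table and the matching function are the hypothetical placeholders from the earlier sketch, not an interface defined by the document.

```python
def authenticate_pair(terminal_id, received_iris_data,
                      user_registration_data, iris_data_matches) -> bool:
    """Authentication succeeds only if the received (terminal ID, iris data)
    pair matches an entry of the stored user registration data."""
    registered = user_registration_data.get(terminal_id)  # iris data bound to this terminal
    if registered is None:
        return False      # unknown terminal ID: negative authentication result
    return iris_data_matches(received_iris_data, registered)
```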
  • FIG. 8 is a flowchart showing the operation at the start of remote access of the wearable terminal 10 according to the first embodiment. Each process in FIG. 8 is mainly executed by the processing unit 13 or is controlled by the processing unit 13.
  • When the user requests remote access (step 201), the wearable terminal 10 reads the user's iris image with the reading unit 11 (step 202).
  • the wearable terminal 10 that has read the iris image generates iris data from the iris image, transmits the iris data and the terminal ID of the own device to the authentication server 21 as a pair, and issues an authentication request (step 203).
  • The wearable terminal 10 receives the authentication result from the authentication server 21 and determines whether it is positive or negative (step 204). If the authentication result is negative, the wearable terminal 10 displays that authentication has failed (step 206) and returns to step 201. If the authentication result is positive, the wearable terminal 10 performs remote access to the remote access server 22 (step 205).
  • FIG. 9 is a flowchart showing an operation during remote access of the wearable terminal 10 according to the first embodiment. Each process of FIG. 9 is mainly executed by the processing unit 13 or is controlled by the processing unit 13.
  • Wearable terminal 10 periodically reads an iris image at period T (steps 301 to 302). Then, the wearable terminal 10 generates iris data from the read iris image, transmits the iris data and the terminal ID of its own device to the authentication server 21, and makes an authentication request (step 303).
  • The wearable terminal 10 receives the authentication result from the authentication server 21 and determines whether it is positive or negative (step 304); if the result is positive, it returns to step 301. If the authentication result is negative, the wearable terminal 10, for example, displays on the screen that authentication has failed (step 305) and stops remote access to the remote access server 22 (step 306). At this point, since the authentication server 21 has also obtained the negative authentication result, access by the wearable terminal 10 to the remote access server 22 is prohibited on the server side as well. Remote access can be stopped, for example, by prohibiting the wearable terminal 10 from accessing the remote access server 22 and stopping the projection of the display screen.
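The FIG. 9 loop can be pictured as below: a hypothetical sketch in which the terminal re-runs iris authentication every period T and, on a negative result, reports the failure and halts both remote access and the projection of the display screen. The callables are stand-ins; the document does not prescribe any API.

```python
import time

def periodic_reauthentication(period_t_seconds: float,
                              authenticate_once,   # () -> bool: steps 301-304
                              show_auth_failed,    # () -> None: step 305
                              stop_remote_access,  # () -> None: step 306
                              stop_projection,     # () -> None: stop display output
                              session_active):     # () -> bool: remote access ongoing
    """Re-authenticate every period T and stop remote access on a negative result."""
    while session_active():
        time.sleep(period_t_seconds)
        if not authenticate_once():
            show_auth_failed()
            stop_remote_access()
            stop_projection()
            break
```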
  • In the first embodiment, user authentication is performed using an iris image, but the present invention is not limited to this; for example, user authentication may be performed using a retina image.
  • Likewise, in the first embodiment the reading unit 11 and the projection unit 12 are made inseparable by resin filling to ensure tamper resistance, but the present invention is not limited to this; the reading unit 11 and the projection unit 12 may instead be made inseparable by configuring them as a one-chip integrated circuit to ensure tamper resistance.
  • FIG. 10 is a block diagram of the wearable terminal 10 according to a modification of the first embodiment. Referring to FIG. 10, the reading unit 11 and the projection unit 12 are integrally configured by a one-chip integrated circuit. Thereby, the reading unit 11 and the projection unit 12 become inseparable, and the tamper resistance of the wearable terminal 10 is ensured.
  • In the first embodiment, the wearable terminal 10 and the user are associated with each other for authentication, and the user cannot access the remote access server 22 without using the specific wearable terminal 10 associated with him or her, but the present invention is not limited to this.
  • the wearable terminal 10 may hold the iris data and authenticate alone without using the authentication server 21. As a result, the authentication server 21 is not required, and the cost can be reduced.
  • the authentication server 21 and the remote access server 22 are arranged as separate devices, but the present invention is not limited to this.
  • the authentication server 21 and the remote access server 22 may be mounted on the same computer and configured integrally.
  • In the first embodiment, the projection unit 12 performs no special processing while the reading unit 11 is reading an iris image, but other configurations are possible.
  • For example, while the reading unit 11 reads the iris image, the projection light of the display image from the projection unit 12 may be reduced below its normal level. This suppresses reflection of the display image from the eye 19 and makes the iris image easier to read.
  • Alternatively, a separate reading auxiliary light source that emits auxiliary light suitable for reading the iris image may be provided. When the reading unit 11 reads the iris image, the auxiliary light source emits light, and the iris image is read from the auxiliary light reflected by the eye 19.
  • In the first embodiment, there is no particular restriction on the location from which the wearable terminal 10 may remotely access the remote access server 22, but the present invention is not limited to this.
  • For example, the wearable terminal 10 may be equipped with GPS so that it can acquire the position information of its own device, and the wearable terminal 10 may be allowed to request authentication from the authentication server 21 only on the condition that its own device is located in a predetermined accessible area.
  • This enables, for example, a mode of use in which remote access is possible from home but prohibited in other places, such as public places outside the home.
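One way to picture this location condition, as an illustrative sketch only: the terminal checks its own GPS fix against a predefined accessible area before issuing an authentication request. The circular-area model, the haversine check, and all names are assumptions for illustration, not taken from the document.

```python
import math

def within_accessible_area(lat: float, lon: float,
                           area_lat: float, area_lon: float,
                           radius_m: float) -> bool:
    """True if the GPS fix lies inside a circular accessible area (haversine distance)."""
    r_earth = 6371000.0
    p1, p2 = math.radians(lat), math.radians(area_lat)
    dp = math.radians(area_lat - lat)
    dl = math.radians(area_lon - lon)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r_earth * math.asin(math.sqrt(a)) <= radius_m

def may_request_authentication(gps_fix, home_area) -> bool:
    """Gate the authentication request on the terminal's own position."""
    lat, lon = gps_fix                         # position of the terminal itself
    area_lat, area_lon, radius_m = home_area   # predetermined accessible area
    return within_accessible_area(lat, lon, area_lat, area_lon, radius_m)
```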
  • Further, the remote access server 22 or the wearable terminal 10 may always record the remote access display image. This makes it easy to trace fraud, and fraud can be deterred in advance.
  • the wearable terminal 10 may be connectable with a remote control unit (not shown) for operating the wearable terminal 10.
  • the remote control unit is a pointing device such as a mouse. Thereby, the operability of the wearable terminal 10 is improved.
  • the control unit may incorporate a battery that supplies power to the wearable terminal 10.
  • the authentication server 21 holds authentication data (a pair of iris data and terminal ID) and performs authentication using the authentication data, but the present invention is not limited thereto.
  • the second embodiment shows an example in which the wearable terminal 10 holds iris data and performs authentication alone with a system configuration in which no authentication server exists. This eliminates the need for an authentication server, and can reduce the cost of the system.
  • The configuration of the wearable terminal 10 of the second embodiment is the same as that of the first embodiment shown in FIG. 2.
  • the wearable terminal 10 of the second embodiment is different from that of the first embodiment in the data held and the processing executed by the processing unit 13.
  • FIG. 11 is a block diagram of the information processing system according to the second embodiment.
  • the wearable terminal 10 remotely accesses the remote access server 22 via the network 23.
  • FIG. 12 is a table showing an example of information held by the wearable terminal 10 according to the second embodiment.
  • the wearable terminal 10 according to the second embodiment holds a terminal ID and iris data.
  • the terminal ID is identification information for identifying an individual terminal
  • the wearable terminal 10 holds the terminal ID of its own device in a fixed manner.
  • The iris data is data generated from an iris image of a user registered as a user who can perform remote access with the wearable terminal 10, and is set so that it can be erased or changed only under conditions that ensure sufficient security.
  • FIG. 13 is a sequence diagram illustrating a remote access operation of the information processing system according to the second embodiment.
  • the wearable terminal 10 reads an iris image from the user's eye 19 (step 401).
  • the wearable terminal 10 generates iris data from the read iris image, and compares the generated iris data with iris data registered in advance in its own device, thereby performing user iris authentication (step 402).
  • Here, it is assumed that the generated iris data matches the iris data held in advance, so a positive authentication result is obtained in iris authentication.
  • The wearable terminal 10 then communicates with the remote access server 22 and transmits an authentication request for normal authentication to the remote access server 22 (step 403).
  • the remote access server 22 that has received the authentication request performs normal user authentication (step 404).
  • the remote access server 22 transmits the authentication result to the wearable terminal 10 and permits remote access (step 405).
  • the wearable terminal 10 to which remote access is permitted then securely accesses the remote access server 22 (step 406).
  • FIG. 14 is a flowchart showing an operation at the start of remote access of the wearable terminal 10 according to the second embodiment. Each process of FIG. 14 is mainly executed by the processing unit 13 or is controlled by the processing unit 13.
  • When the user requests remote access (step 501), the wearable terminal 10 reads the user's iris image with the reading unit 11 (step 502). Having read the iris image, the wearable terminal 10 generates iris data from it and authenticates the user by collating the generated data with the iris data it holds in advance as authentication data (step 503).
  • The wearable terminal 10 then determines whether the authentication result is positive or negative (step 504).
  • If the authentication result is negative, the wearable terminal 10 displays that authentication has failed (step 506) and returns to step 501. If the authentication result is positive, the wearable terminal 10 performs remote access to the remote access server 22 (step 505).
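A minimal sketch of this standalone check, under the same hypothetical placeholders as in the earlier sketches: the freshly generated iris data is collated against iris data held in the terminal itself, with no authentication server involved, and remote access starts only on a match.

```python
def standalone_start_remote_access(read_iris_image, make_iris_data,
                                   stored_iris_data, iris_data_matches,
                                   start_remote_access, show_auth_failed) -> bool:
    """Standalone flow of FIG. 14: authentication happens entirely on the terminal."""
    iris_image = read_iris_image()                      # step 502: read the iris image
    candidate = make_iris_data(iris_image)              # step 503: generate iris data
    if iris_data_matches(candidate, stored_iris_data):  # step 504: local collation
        start_remote_access()                           # step 505: begin remote access
        return True
    show_auth_failed()                                  # step 506: report the failure
    return False
```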
  • FIG. 15 is a flowchart showing an operation during remote access of the wearable terminal 10 according to the second embodiment. Each process of FIG. 15 is mainly executed by the processing unit 13 or is executed by the control of the processing unit 13.
  • Wearable terminal 10 periodically reads an iris image at period T (steps 601 to 602).
  • the wearable terminal 10 generates iris data from the read iris image, and collates the iris data with iris data that is authentication data held in advance (step 603).
  • The wearable terminal 10 determines whether the iris data match, that is, whether the authentication result is positive or negative (step 604). If the authentication result is positive, the wearable terminal 10 returns to step 601. If the authentication result is negative, the wearable terminal 10, for example, displays on the screen that authentication has failed (step 605) and stops remote access to the remote access server 22 (step 606). Remote access can be stopped, for example, by prohibiting the wearable terminal 10 from accessing the remote access server 22 and stopping the projection of the display screen.
  • In the first and second embodiments, the information processing system repeats authentication using biometric information every predetermined time, but the present invention is not limited to this.
  • In a third embodiment, authentication based on biometric information is not repeated every predetermined time; instead, removal of the wearable terminal 10 from the user's head is detected, and authentication is performed again when removal is detected.
  • Once a positive user authentication result has been obtained, unless the wearable terminal 10 is removed from the user's head, the probability that a legitimate user authorized to access the remote access server 22 is still using the wearable terminal 10 is sufficiently high, so there is little need to repeat authentication. Therefore, authentication is performed again when the wearable terminal 10 detects that it has been removed. As a result, the number of times authentication is executed can be reduced.
  • the wearable terminal 10 measures posture change with a gyro and detects removal based on the measurement result. Further, the wearable terminal 10 fixes a virtual display to be displayed to the user in the real space based on the posture change measured by the same gyro. Thereby, the virtual display can realize the same usability as the actual display for the user.
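The removal-triggered re-authentication can be sketched as below. The integration of gyro readings and the fixed angle threshold are simplifying assumptions chosen purely for illustration; the document only states that posture change measured by the gyro is used to detect removal.

```python
def removal_detected(angular_rates_dps, dt_s: float,
                     threshold_deg: float = 120.0) -> bool:
    """Treat the terminal as removed if the accumulated angular change over a
    sampling window exceeds a threshold implausible while the terminal is worn."""
    accumulated = 0.0
    for rate in angular_rates_dps:      # per-sample angular speed in degrees/second
        accumulated += abs(rate) * dt_s
    return accumulated >= threshold_deg

def on_gyro_window(angular_rates_dps, dt_s,
                   authenticate_once, stop_remote_access) -> None:
    """Re-authenticate only when removal is detected (cf. FIG. 17, steps 701-706)."""
    if removal_detected(angular_rates_dps, dt_s) and not authenticate_once():
        stop_remote_access()
```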
  • FIG. 16 is a block diagram of the wearable terminal 10 according to the third embodiment.
  • The wearable terminal 10 according to the third embodiment has a configuration in which a gyro sensor 31 is added to the wearable terminal 10 of the first embodiment shown in FIG. 2.
  • The configuration of the information processing system of the third embodiment is the same as that of the first embodiment shown in FIG. 4.
  • the gyro sensor 31 is a sensor that detects a change in the attitude of the wearable terminal 10 itself.
  • the gyro sensor 31 outputs a signal indicating the attitude of the wearable terminal 10 or a change in attitude.
  • the processing unit 13 calculates a change in posture of the wearable terminal 10 based on a signal from the gyro sensor 31, and displays a display image on a virtual display fixed in the real space based on the change in posture. Since the appearance of the virtual display changes naturally when the user moves his / her head, the virtual display can realize the same usability as an actual display.
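As a hypothetical sketch of anchoring the virtual display in real space: the rendered display window is shifted opposite to the measured head rotation so that it appears fixed relative to the surroundings. The angle-to-pixel conversion and the pixels_per_degree value are assumptions for illustration, not the document's method.

```python
def virtual_display_offset(yaw_deg: float, pitch_deg: float,
                           pixels_per_degree: float = 12.0) -> tuple:
    """Pixel offset at which to draw the virtual display so that it appears
    anchored in real space while the head rotates."""
    dx = -yaw_deg * pixels_per_degree    # head turns right -> display shifts left
    dy = pitch_deg * pixels_per_degree   # head tilts up    -> display shifts down
    return int(round(dx)), int(round(dy))
```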
  • The operation of the information processing system according to the third embodiment at the start of remote access is the same as that of the first embodiment shown in FIG. 7, and the operation of the wearable terminal 10 at the start of remote access is the same as that of the first embodiment shown in FIG. 8.
  • FIG. 17 is a flowchart illustrating an operation during remote access of the wearable terminal 10 according to the third embodiment. Each process of FIG. 17 is mainly executed by the processing unit 13 or is executed under the control of the processing unit 13.
  • the wearable terminal 10 monitors the removal from the user's head using the gyro sensor 31 (step 701), and reads the iris image when the removal is detected (step 702).
  • the wearable terminal 10 generates iris data from the read iris image, transmits the iris data and the terminal ID of the own device to the authentication server 21, and makes an authentication request (step 703).
  • The wearable terminal 10 receives the authentication result from the authentication server 21 and determines whether it is positive or negative (step 704); if the result is positive, it returns to step 701. If the authentication result is negative, the wearable terminal 10, for example, displays on the screen that authentication has failed (step 705) and stops remote access to the remote access server 22 (step 706). At this point, since the authentication server 21 has also obtained the negative authentication result, remote access by the wearable terminal 10 to the remote access server 22 is prohibited on the server side as well. Remote access can be stopped, for example, by prohibiting the wearable terminal 10 from accessing the remote access server 22 and stopping the projection of the display screen.
  • An attitude change of wearable terminal 10 can be detected by an acceleration sensor, an eye tracker, a physical switch, or a combination thereof, and removal can be determined based on the detection result.
  • In the third embodiment, the wearable terminal 10 measures its posture change with the gyro sensor 31 and, based on the measurement result, fixes the virtual display shown to the user in the real space, but the present invention is not limited to this.
  • In a fourth embodiment, a camera that monitors the external environment photographs the surroundings, for example the user operating the keyboard, and the captured image is combined with the display image from the remote access server 22 so that the display image is fixed in the real space; the combined image is then shown on the virtual display. In this way, the composite image shown on the virtual display can provide the same usability as an actual display.
  • FIG. 18 is a block diagram of the wearable terminal 10 according to the fourth embodiment.
  • the wearable terminal 10 according to the present embodiment has a configuration in which a camera 32 is added to that of the third embodiment shown in FIG. The camera 32 captures a surrounding image and transmits the image data to the processing unit 13.
  • the processing unit 13 combines the captured image of the camera 32 and the display image from the remote access server 22 so that the display image is fixed in the real space, and displays the combined image on a virtual display.
  • the processing unit 13 records an image photographed by the camera 32 when the wearable terminal 10 is accessing the remote access server 22.
  • Since images that can be used for fraud tracking are stored, it is easy to trace fraud when it occurs, and fraud can be deterred in advance.
  • the camera 32 images the front of the user's head.
  • The processing unit 13 calculates the change in posture of the device based on the signal from the gyro sensor 31, displays the display image on a virtual display fixed in the real space based on that change in posture, and projects from the projection unit 12 an image obtained by combining the virtual display with the image captured by the camera 32.
  • Since the appearance of the virtual display changes naturally when the user moves his or her head, the virtual display can provide the same usability as an actual display. Moreover, because the camera 32 is shared for this purpose and images usable for tracing fraud can be accumulated, fraud is easy to trace when it occurs and can be deterred in advance.
  • The processing unit 13 also records the image obtained by combining the virtual display with the front image captured by the camera 32. Since the recorded image combines the image in front of the head captured by the camera 32, that is, the direction of the user's field of view, with the display image, it is possible, for example, to record keyboard operations together with the screen displayed by them, which provides high traceability of unauthorized activity.
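A minimal sketch of this recording, assuming a hypothetical storage interface: while remote access is active, each composited frame (camera image plus virtual display) is stored with a timestamp so that operations and the displayed screen can be traced later. The frame grabber and writer are placeholders, not interfaces defined by the document.

```python
import time

def record_session_frames(session_active, grab_composited_frame, frame_writer):
    """Store timestamped composited frames for later tracing of remote access."""
    while session_active():
        frame = grab_composited_frame()            # image already combined by the processing unit
        frame_writer.append((time.time(), frame))  # e.g. a list or any object with append()
```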
  • Except that the wearable terminal 10 combines the image captured by the camera 32 with the display image from the remote access server 22 for display and records the captured image of the camera 32, the basic configuration and operation of the information processing system and of the wearable terminal 10 in the fourth embodiment are the same as those in the first embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Human Computer Interaction (AREA)
  • Collating Specific Patterns (AREA)

Abstract

The invention provides a technique for improving information integrity in an information processing system that allows remote access. The wearable terminal described here, which is worn on a user's head, comprises: a reading unit that acquires a biological information image obtained by photographing the user's eye; a projection unit that is inseparably fixed to the reading unit and projects a display image along an optical path that overlaps the optical path along which the reading unit acquires the biological information image; a communication unit that communicates with a remote access server; and a processing unit that performs user authentication based on the biological information image and, when a positive authentication result is obtained, accesses the remote access server via the communication unit.
PCT/JP2015/051605 2015-01-22 2015-01-22 Terminal portatif et système de traitement d'informations utilisant ce terminal WO2016117061A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/051605 WO2016117061A1 (fr) 2015-01-22 2015-01-22 Terminal portatif et système de traitement d'informations utilisant ce terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/051605 WO2016117061A1 (fr) 2015-01-22 2015-01-22 Terminal portatif et système de traitement d'informations utilisant ce terminal

Publications (1)

Publication Number Publication Date
WO2016117061A1 true WO2016117061A1 (fr) 2016-07-28

Family

ID=56416627

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/051605 WO2016117061A1 (fr) 2015-01-22 2015-01-22 Terminal portatif et système de traitement d'informations utilisant ce terminal

Country Status (1)

Country Link
WO (1) WO2016117061A1 (fr)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3840171B2 (ja) * 2001-10-26 2006-11-01 キヤノン株式会社 画像表示装置及びその方法並びに記憶媒体
JP2008241822A (ja) * 2007-03-26 2008-10-09 Mitsubishi Electric Corp 画像表示装置
JP5414946B2 (ja) * 2011-06-16 2014-02-12 パナソニック株式会社 ヘッドマウントディスプレイおよびその位置ずれ調整方法
JP2013175929A (ja) * 2012-02-24 2013-09-05 Nikon Corp 情報出力装置、及び情報出力方法
JP2013210588A (ja) * 2012-03-30 2013-10-10 Brother Ind Ltd ヘッドマウントディスプレイ
JP2014164359A (ja) * 2013-02-21 2014-09-08 Nec Networks & System Integration Corp 認証システム
JP2015005972A (ja) * 2013-05-22 2015-01-08 株式会社テレパシーホールディングス 撮影画像のプライバシー保護機能を有するウェアラブルデバイス及びその制御方法並びに画像共有システム

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020503578A (ja) * 2016-10-13 2020-01-30 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実を使用したユーザ識別認証
JP2020504348A (ja) * 2016-10-13 2020-02-06 アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited 仮想現実に基づくサービス制御およびユーザ識別認証
US11227038B2 (en) 2016-10-13 2022-01-18 Advanced New Technologies Co., Ltd. User identity authentication using virtual reality
JP2018097437A (ja) * 2016-12-08 2018-06-21 株式会社テレパシージャパン ウェアラブル情報表示端末及びこれを備えるシステム
JPWO2021245886A1 (fr) * 2020-06-04 2021-12-09
WO2021245886A1 (fr) * 2020-06-04 2021-12-09 三菱電機株式会社 Système de gestion de circulation dans un bâtiment
JP7294538B2 (ja) 2020-06-04 2023-06-20 三菱電機株式会社 建物の交通管理システム
JP2022122890A (ja) * 2020-09-17 2022-08-23 日本電気株式会社 表示制御システム、表示制御方法、及びコンピュータプログラム
JP7400876B2 (ja) 2020-09-17 2023-12-19 日本電気株式会社 表示制御システム、表示制御方法、及びコンピュータプログラム

Similar Documents

Publication Publication Date Title
US11157606B2 (en) Facial recognition authentication system including path parameters
US10341113B2 (en) Password management
CN108293187B (zh) 利用可穿戴装置注册用户的方法和系统
US9600688B2 (en) Protecting display of potentially sensitive information
KR102173699B1 (ko) 안구 신호들의 인식 및 지속적인 생체 인증을 위한 시스템과 방법들
US9747500B2 (en) Wearable retina/iris scan authentication system
US20170324726A1 (en) Digital authentication using augmented reality
WO2016117061A1 (fr) Terminal portatif et système de traitement d'informations utilisant ce terminal
US20130069787A1 (en) Locking Mechanism Based on Unnatural Movement of Head-Mounted Display
BR112018007449B1 (pt) Dispositivo de computação, método implementado por computador e dispositivo de memória legível por computador
Shrestha et al. An offensive and defensive exposition of wearable computing
JP2007003745A (ja) 画像表示装置及び画像表示システム
CN106462226A (zh) 操作用户设备的显示器
US20230177128A1 (en) Authentication and calibration via gaze tracking
US20230073410A1 (en) Facial recognition and/or authentication system with monitored and/or controlled camera cycling
WO2023164268A1 (fr) Dispositifs, procédés et interfaces graphiques utilisateurs pour autoriser une opération sécurisée
JP2022189048A (ja) 認証システム、認証装置、認証方法、及びプログラム
WO2015093221A1 (fr) Dispositif électronique et programme
KR102537147B1 (ko) 인증된 증강 현실 콘텐츠의 제공 시스템 및 방법
US20230273985A1 (en) Devices, methods, and graphical user interfaces for authorizing a secure operation
KR20140067494A (ko) 지문을 이용한 보안 강화 방법, 전자 기기 및 기록 매체
JP2022025553A (ja) コンピュータプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15878756

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15878756

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP