WO2016035759A1 - Humanoid robot - Google Patents
Humanoid robot
- Publication number
- WO2016035759A1 (PCT/JP2015/074739, JP2015074739W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- humanoid robot
- data set
- hand
- vein pattern
- vein
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/08—Gripping heads and other end effectors having finger members
- B25J15/12—Gripping heads and other end effectors having finger members with flexible finger members
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
- A61B5/4887—Locating particular structures in or on the body
- A61B5/489—Blood vessels
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/085—Force or torque sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J13/00—Controls for manipulators
- B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
- B25J13/088—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J15/00—Gripping heads and other end effectors
- B25J15/0009—Gripping heads and other end effectors comprising multi-articulated fingers, e.g. resembling a human hand
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J19/00—Accessories fitted to manipulators, e.g. for monitoring, for viewing; Safety devices combined with or specially adapted for use in connection with manipulators
- B25J19/02—Sensing devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/0006—Exoskeletons, i.e. resembling a human figure
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1612—Programme controls characterised by the hand, wrist, grip control
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
- B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
- B25J9/00—Programme-controlled manipulators
- B25J9/16—Programme controls
- B25J9/1694—Programme controls characterised by use of sensors other than normal servo-feedback from position, speed or acceleration sensors, perception control, multi-sensor controlled systems, sensor fusion
Definitions
- The present invention relates to a humanoid robot that can identify an individual person through natural movement.
- Such humanoid robots can usually walk on two legs or move on wheels, and can move to a required position autonomously or on human instruction. In the future they are expected to work in the field on their own.
- Patent Document 1 discloses a humanoid robot that performs predetermined work once the operator is authenticated as an authorized worker.
- In that system, however, the humanoid robot itself does not perform the authentication; the operator completes an authentication procedure on a separate device. This is unnatural for a humanoid robot that is expected to communicate with humans.
- Using a separate device is also inconvenient for home use.
- For humanoid robots intended to communicate with humans, that is, robots whose purpose is to comfort and entertain people through their movement, such an authentication procedure is not very smart, and the robot's inherent charm would be halved.
- An object of the present invention is therefore to provide a humanoid robot capable of identifying a person who comes into contact with it through its own natural motion.
- To this end, a humanoid robot according to the present invention comprises: a right hand having a plurality of finger portions with joints and drive mechanisms capable of shaking hands with a human hand;
- a near-infrared light emitting device and a near-infrared sensor provided in the right hand, positioned so that, while the robot shakes hands with a human, near-infrared light emitted from the light emitting device illuminates the inside of the human hand and the vein pattern is detected by the near-infrared sensor;
- an information recording device that records the vein pattern detected by the near-infrared sensor; and an information processing device that compares the recorded vein pattern with another vein pattern detected by the near-infrared sensor and calculates the degree of similarity between them.
- Preferably, the information processing device controls the movement of the robot's right hand to shake hands with a human, acquires a vein pattern as a matching data set using the near-infrared light emitting device and the near-infrared sensor,
- calculates the similarity between this vein pattern and a vein pattern previously recorded in the information recording device as a registered data set, and identifies the person based on the similarity.
- Preferably, the vein pattern recorded as the registered data set is that of the humanoid robot's owner, and the owner is authenticated based on the similarity between this vein pattern and the matching data set.
- Preferably, the registered data set is acquired repeatedly while the position of the robot's right hand is changed.
- Preferably, after the registered data set is acquired, the robot releases the human hand, shakes hands again, and acquires the registered data set once more; both acquisitions are then used as registered data sets for personal identification by comparison with a matching data set.
- Preferably, the result of the personal identification is reflected in the operation of the humanoid robot.
- Preferably, the vein pattern is that of a human finger.
- With the humanoid robot according to the present invention, theft of the robot can be deterred by identifying the person it touches through a motion as natural as a handshake, and the identification result can also be reflected in the robot's behavior. A humanoid robot with a high capacity for communication with humans is thereby realized.
- FIG. 1 is a diagram illustrating the basic configuration of a humanoid robot according to the present invention.
- FIG. 2 is a block diagram showing an information processing apparatus that controls the operation of the humanoid robot shown in FIG.
- FIG. 3 is a diagram illustrating a hand structure of the humanoid robot according to the first embodiment of the present invention.
- FIG. 4 is a flowchart showing a procedure for registering the vein data set of the owner by shaking hands with a person in the humanoid robot according to the first embodiment of the present invention.
- FIG. 5 is a flowchart showing a procedure for authenticating an owner by shaking hands with a person in the humanoid robot according to the first embodiment of the present invention.
- FIG. 6 is a diagram illustrating a hand structure of the humanoid robot according to the second embodiment of the present invention.
- FIG. 7 is a diagram illustrating how a vein is photographed while the humanoid robot according to the second embodiment of the present invention shakes hands with a person.
- FIG. 8 is a flowchart showing a procedure for identifying a person with the humanoid robot according to the third embodiment of the present invention.
- FIG. 9 is a diagram illustrating a hand structure of a humanoid robot according to a fifth embodiment of the present invention.
- FIG. 1 is a diagram showing a basic configuration of a humanoid robot.
- The humanoid robot 100 includes a left leg 22L, a right leg 22R, a torso 30, a left arm 42L, a right arm 42R, and a head 50.
- The left arm 42L and the right arm 42R on either side of the torso 30 each include a plurality of joints, their drive mechanisms, torque sensors, joint position sensors that detect the joint positions, and acceleration sensors, and can move like a human arm.
- The head 50 is provided with left and right eyes 52L and 52R each having an image sensor, left and right ears 54L and 54R each having a microphone, a nose 56 having an olfactory sensor, and a mouth 58 having a speaker.
- The left leg 22L and the right leg 22R at the bottom of the torso 30 likewise include a plurality of joints, their drive mechanisms, torque sensors, joint position sensors, and acceleration sensors, making bipedal walking possible. The features of the embodiments described in detail below remain applicable even if the legs have a simpler structure.
- For example, the left and right legs may be merged into a single base provided with a plurality of wheels that are rotationally driven to move the robot.
- FIG. 2 is a block diagram showing the main part of the information processing apparatus 10.
- In it, a CPU 11, a ROM 12, a RAM 13, a timer 14, and an I/O control circuit 15 are interconnected by a bus 16.
- The CPU 11 starts by executing a program stored in the ROM 12 and, while referring to interrupt signals from the timer 14, records and reads data in the RAM 13 and issues commands to the I/O control circuit 15,
- thereby controlling the operation of the humanoid robot 100.
- The I/O control circuit 15 directly controls external devices such as the drive motor control circuit 17 by outputting control signals according to commands from the CPU 11, realizing the actual motion prescribed by the program.
- The drive motor control circuit 17 controls the supply of electric power to the motors that move the joints of the humanoid robot 100.
- External devices controlled by the I/O control circuit 15 include the drive motor control circuit 17, a nonvolatile memory such as an EEPROM, a driving-force transmission mechanism, a GPS receiver, a wireless LAN module, a temperature sensor, a humidity sensor, a position sensor, and a battery, as well as the near-infrared LEDs and near-infrared sensor described later.
- The information processing device thus controls the drive mechanisms that move the left leg 22L, right leg 22R, torso 30, left arm 42L, right arm 42R, and head 50 of the humanoid robot 100. Since such control procedures are generally known, their details are omitted except where specifically described below.
- The humanoid robot according to the first embodiment is described below with reference to FIGS. 3 to 5.
- The basic structure of this humanoid robot is as shown in FIG. 1.
- The first embodiment is characterized in that the authentication means is provided in the right hand 60R at the end of the right arm 42R.
- FIG. 3 illustrates the hand 60R of the humanoid robot according to the first embodiment.
- The hand 60R includes a palm portion 62, a thumb portion 64t, an index finger portion 64i, a middle finger portion 64m, a ring finger portion 64r, and a little finger portion 64f.
- Each finger portion has joints at three locations, as in a human finger, together with drive mechanisms that drive these joints independently.
- Each joint is equipped with a joint position sensor, a torque sensor, and an acceleration sensor; by driving each joint while referring to the sensor outputs, the hand can move like a human hand. In particular, it can perform a firm handshake that envelops the other person's hand.
- The essential requirement for the hand 60R is that it can shake hands with a human. There is therefore no problem in integrating the index finger portion 64i, middle finger portion 64m, ring finger portion 64r, and little finger portion 64f into a single composite finger portion.
- In that case, the composite finger portion has the joints needed for a handshake and can be driven independently of the thumb portion 64t.
- A near-infrared sensor 74 is provided at the center of the palm portion 62 of the hand 60R, and a pair of near-infrared LEDs 72a and 72b are provided inside the palm portion 62 on either side of the sensor (above and below it in the figure). Near-infrared light is emitted from the LEDs 72a and 72b while the robot shakes hands with a person. This light passes through the near-infrared window 76 and illuminates the inside of each finger of the person shaking hands: the index finger 65i, the middle finger 65m, the ring finger 65r, and the little finger 65f, especially the middle finger 65m and the ring finger 65r. The reflected light then enters the near-infrared sensor 74, where it is detected and acquired as vein image data.
- Vein authentication is performed using this captured image: the image data is analyzed, the image areas of the index finger 65i, middle finger 65m, ring finger 65r, and little finger 65f are located, and a vein pattern is acquired from each image area, as sketched below.
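- As one illustration of this extraction step, a minimal image-processing sketch follows. It is not the patent's own algorithm; the function name, the thresholding parameters, and the use of OpenCV are all assumptions made for the example. The idea is simply that veins absorb near-infrared light and therefore appear as locally dark ridges.

```python
import cv2
import numpy as np

def extract_vein_pattern(nir_image: np.ndarray) -> np.ndarray:
    """Hypothetical sketch: isolate the dark vein ridges in one finger's
    near-infrared image region (8-bit grayscale)."""
    # Suppress sensor noise while keeping ridge-scale structure.
    blurred = cv2.GaussianBlur(nir_image, (5, 5), 0)
    # Veins are locally darker than surrounding tissue; adaptive
    # thresholding tolerates uneven illumination through the window 76.
    veins = cv2.adaptiveThreshold(
        blurred, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
        cv2.THRESH_BINARY_INV, blockSize=21, C=8)
    return veins  # binary mask used as the vein data set
```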
- The left arm 42L and the right arm 42R are completely symmetric, and the hand 60L at the end of the left arm 42L has the same structure as the hand 60R.
- Alternatively, the left-hand authentication means (the near-infrared LEDs 72a and 72b and the near-infrared sensor 74) may be omitted so that authentication is always performed with the right hand only. In that case, if a left-handed person offers the left hand, the robot may say, for example, "I'm sorry, I will shake your right hand to identify you."
- After shipment, at the first activation, the humanoid robot 100 makes a general greeting such as "Nice to meet you" and then says "Please tell me my owner's name." The owner answers, for example, "It's George." The humanoid robot 100 recognizes this by voice recognition and replies, "George, I will register you as my owner." The owner then shakes hands with the humanoid robot 100, and the registration operation starts. The process is described with reference to the flowchart of FIG. 4.
- In step S401, the humanoid robot 100 starts the handshake operation by holding out the hand 60R.
- The handshake operation transitions the hand into a state in which the person's hand is gripped with constant pressure. In step S402, the joint position sensors check whether the thumb portion 64t is in the positional relationship with the palm portion 62 and the other finger portions that corresponds to a handshake,
- and the torque sensors check whether each finger portion generates a certain level of torque; in this way it is determined whether the handshake state has been reached. If it has not, the robot waits a fixed time (for example, 100 milliseconds) in step S403 and then rechecks in step S402, repeating until the handshake state is confirmed.
- Once it is determined in step S402 that the transition to the handshake state is complete, the process proceeds to step S404. A polling loop of this kind might look like the sketch below.
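- The following is a minimal sketch of steps S401 to S403, assuming a hypothetical `hand` interface that exposes the joint position and torque readings; the torque threshold and the method names are illustrative, not taken from the patent.

```python
import time

HANDSHAKE_TORQUE = 0.3  # assumed grip threshold [N·m]; the patent gives no value
POLL_INTERVAL = 0.1     # the 100 ms wait of step S403

def wait_for_handshake(hand):
    """Poll the joint position and torque sensors until the handshake
    state described in steps S402/S403 is confirmed."""
    hand.extend()  # S401: hold out the hand 60R
    while True:
        in_position = hand.joints_match_handshake_pose()  # joint position sensors
        gripping = all(t > HANDSHAKE_TORQUE for t in hand.finger_torques())
        if in_position and gripping:
            return                 # S402: handshake state confirmed
        time.sleep(POLL_INTERVAL)  # S403: wait, then recheck
```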
- In step S404, near-infrared light is emitted from the near-infrared LEDs 72a and 72b, and a captured image is acquired by the near-infrared sensor 74.
- In this embodiment, three captured images are acquired at 100-millisecond intervals.
- The processing is multitasked, consisting of a photographing thread and data registration threads.
- The photographing thread is generated first.
- Each time captured image data is acquired by the near-infrared sensor 74 in step S404,
- one data registration thread using that captured image data is generated.
- A data registration thread performs the processing from step S410 to step S416.
- The photographing thread performs the processing from step S401 to step S408 in parallel with the data registration threads.
- In step S405, the photographing thread determines whether the predetermined number of captured images (here, three) has been acquired. If not, it waits 100 milliseconds in step S406 and then acquires another captured image in step S404.
- If it is determined in step S405 that the predetermined number of captured images (here, three) has been acquired, the humanoid robot 100 changes the position of the hand 60R or shifts its grip to change the handshake position, then acquires captured images again; this is repeated a predetermined number of times.
- In step S407, it is determined whether image acquisition has been repeated the predetermined number of times. If it has not, step S408 changes the handshake position (re-gripping and then holding still), and further captured images are acquired.
- Once it is determined in step S407 that image acquisition has been repeated the predetermined number of times, the photographing thread ends without acquiring further images (step S409).
- This predetermined number is, for example, 3 to 5, so that 9 to 15 captured images are acquired in total. The overall thread structure is sketched below.
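- A minimal sketch of this thread structure follows, assuming hypothetical `sensor`, `hand`, and `database` interfaces and reusing the `extract_vein_pattern` sketch above; the counts mirror the example values in the text.

```python
import threading
import time

NUM_POSES = 3        # handshake positions (3 to 5 in the text)
IMAGES_PER_POSE = 3  # captured images per position
INTERVAL = 0.1       # 100 ms between captures (step S406)

def photographing_thread(sensor, hand, database):
    """Sketch of steps S404-S409: capture images, spawning one data
    registration thread per image."""
    for _ in range(NUM_POSES):
        for _ in range(IMAGES_PER_POSE):
            image = sensor.capture()                       # S404
            threading.Thread(target=data_registration_thread,
                             args=(image, database)).start()
            time.sleep(INTERVAL)                           # S406
        hand.shift_grip()                                  # S408: change handshake position

def data_registration_thread(image, database):
    """Sketch of steps S410-S411: extract the vein data set and store it
    as an owner registration data set."""
    vein_data = extract_vein_pattern(image)                # S410
    database.save_owner_record(vein_data)                  # S411
```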
- A data registration thread, generated in step S404 each time one captured image is acquired, extracts a vein data set from the captured image in step S410 and saves it to the information recording device as an owner registration data set in step S411.
- In step S412, it is determined whether the photographing thread has ended. If it has not, the data registration thread simply ends here (step S413). If it is determined in step S412 that the photographing thread has ended, the whole process is repeated once more to acquire a further vein data set; this increases the registration data and improves authentication accuracy.
- In step S414, it is determined whether this reacquisition has already been performed. If it has (reacquisition is done only once), the entire process ends here. If it has not, in step S415 the finger portions of the hand 60R are opened, the handshake state is released, and the robot says "Please shake hands again." In step S416, a new photographing thread is generated and the data registration thread ends. The above processing is then repeated from step S401 with the regenerated photographing thread.
- The registration data acquired by the above processing is recorded in a vein database stored in nonvolatile memory such as an EEPROM, keyed by the name "George" and associated with the current location obtained by GPS and the registration date.
- If there are multiple owners, the same processing is repeated. For example, when the registration of one owner is completed, the humanoid robot 100 says "Please tell me the name of another owner"; if the answer is "There is no other owner" or "That is all," the process ends.
- The owner authentication process is now described with reference to the flowchart of FIG. 5. In step S501, the humanoid robot 100 starts the handshake operation by holding out the hand 60R.
- The handshake operation transitions the hand into a state in which the person's hand is gripped with constant pressure.
- In step S502, the joint position sensors check whether the thumb portion 64t is in the positional relationship with the palm portion 62 and the other finger portions that corresponds to a handshake,
- and the torque sensors check whether a certain level of torque is generated in each finger portion; in this way it is determined whether the handshake state has been reached.
- If it has not, the robot waits a fixed time (for example, 100 milliseconds) in step S503 and then rechecks in step S502, repeating until the transition to the handshake state is confirmed.
- Once it is determined in step S502 that the transition to the handshake state is complete, the process proceeds to step S504.
- In step S504, near-infrared light is emitted from the near-infrared LEDs 72a and 72b, and a captured image is acquired by the near-infrared sensor 74.
- In this embodiment, three captured images are acquired at 100-millisecond intervals.
- The processing is multitasked, consisting of a photographing thread and evaluation threads.
- The process is first started as a photographing thread.
- Each time captured image data is acquired by the near-infrared sensor 74 in step S504,
- one evaluation thread using that captured image data is generated.
- An evaluation thread performs the processing from step S510 to step S520.
- The photographing thread performs the processing from step S501 to step S508 in parallel with the evaluation threads.
- In step S505, the photographing thread determines whether the predetermined number of captured images (here, three) has been acquired. If not, it waits 100 milliseconds in step S506 and then acquires another captured image in step S504.
- If it is determined in step S505 that the predetermined number of captured images has been acquired, the humanoid robot 100 changes the position of the hand 60R or shifts its grip and re-grips (step S508), then acquires captured images again.
- The number of acquisition rounds, including these repetitions, is about 2 to 3, so 6 to 9 captured images are acquired in total. Once it is determined in step S507 that image acquisition has been repeated the predetermined number of times, the photographing thread ends without acquiring further images (step S509).
- In step S510, a vein data set is extracted from the captured image and stored as a matching data set. The processing up to this point parallels the acquisition of the registration data set shown in FIG. 4.
- In step S511, one of the data sets registered for the owner is retrieved from the vein database.
- In step S512, the registered data set is compared with the matching data set stored in step S510, and a score indicating their degree of similarity is calculated.
- In step S513, it is determined whether the calculated score exceeds a predetermined threshold. If it does, the person shaking hands is authenticated as the owner in step S514, and the process ends with a successful authentication result.
- At that time, the matching data set that passed authentication is added to the vein database as a new registered data set. If the number of registered data sets in the vein database then exceeds a predetermined number (for example, 32 sets), the registered data set with the smallest correlation to the others is deleted.
- If the score in step S513 is not above the threshold, it is determined whether there is a next registered data set. If there is, the process returns to step S511 and the next registered data set is evaluated in the same way; a sketch of this loop follows.
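- A minimal sketch of the S511-S514 loop, assuming a hypothetical `database` interface and a hypothetical `similarity` scoring function (the threshold value is likewise assumed, since the patent only calls it "a predetermined threshold"):

```python
AUTH_THRESHOLD = 0.85  # assumed value
MAX_RECORDS = 32       # example count from the text

def authenticate_owner(matching_set, database):
    """Score the matching data set against each registered owner data set
    (steps S511-S514)."""
    for registered_set in database.owner_records():       # S511
        score = similarity(registered_set, matching_set)  # S512: hypothetical scorer
        if score > AUTH_THRESHOLD:                        # S513
            database.add_owner_record(matching_set)       # grow the database
            if database.count() > MAX_RECORDS:
                database.drop_least_correlated()          # prune, as described above
            return True                                   # S514: owner authenticated
    return False  # no registered data set matched
```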
- In step S515, it is determined whether the photographing thread has ended. If it has not, the evaluation thread ends in step S517.
- If it is determined in step S516 that the photographing thread has ended, authentication has so far been unsuccessful. The positioning of the handshake may, however, have been inappropriate, and authentication may succeed if the handshake is redone; a retry is therefore performed, but only once.
- In step S518, it is determined whether a retry has already been performed. If it has (the retry is done only once), the process ends with an unsuccessful authentication result. In this case the humanoid robot 100 may explain, for example, "I cannot confirm my owner, so I will operate in restricted mode."
- In step S519, the finger portions of the hand 60R are opened, the handshake state is released, and the robot then says "Please shake hands again."
- In step S520, a new photographing thread is generated and the evaluation thread ends. The above processing is then repeated from step S501 with the regenerated photographing thread.
- Note that the next evaluation thread may start before the previous one has ended, in which case the evaluation threads run in parallel. If there is little merit in processing multiple evaluation threads in parallel (for example, when the information processing device runs on a single processor), the next evaluation thread may instead be kept waiting until the preceding one has finished. Also, in the above process the retry is performed only once, but it may be repeated several times.
- When the owner is identified as described above, the robot operates in the owner mode with no operational restrictions; when the owner is not identified, it operates in a restricted mode. For example, when asked about the owner's personal life log, it answers in owner mode but otherwise says "I cannot answer because that is personal information." Even when the owner is identified, the robot may operate in restricted mode if the image sensors of the left and right eyes 52L and 52R detect that other people are present at the same time.
- The first embodiment shown in FIG. 3 performs so-called finger vein authentication, in which the veins of the fingers of the person shaking hands are examined.
- Since vein authentication is performed on two fingers, the middle finger 65m and the ring finger 65r, higher accuracy can be expected.
- Since the near-infrared light is emitted parallel to the palm, the focal length of the optical system can be made long, and image data with little distortion can be obtained.
- FIG. 6 illustrates the hand 80R of the humanoid robot according to the second embodiment. Apart from the parts related to vein authentication, the operation of the hand 80R is the same as that of the hand 60R shown in FIG. 3, so the same reference numerals are used and the description is not repeated.
- In the second embodiment, a plurality of near-infrared LEDs 82b are provided in a row near the base of the thumb portion 64t of the hand 80R, and near-infrared sensors 84m and 84r are provided near the bases of the middle finger portion 64m and the ring finger portion 64r, respectively. A plurality of near-infrared LEDs 82t are also provided inside the tip of the thumb portion 64t, and a light-shielding cover 88 reduces the external light incident on the near-infrared sensors 84m and 84r.
- The near-infrared LEDs 82b and 82t emit near-infrared light while the robot shakes hands with a person (see FIG. 7).
- This light illuminates the inside of the palm HM of the person shaking hands; the reflected light passes through the near-infrared windows 86m and 86r, enters the near-infrared sensors 84m and 84r, and is acquired as image data. Palm vein authentication is performed using this captured image.
- The registration and authentication processes of the humanoid robot 100 are basically the same as those of the first embodiment shown in FIGS. 4 and 5. However, since the two near-infrared sensors 84m and 84r each acquire captured images, twice as many images are obtained as in the first embodiment; the matching data set is thus doubled, and higher accuracy can be expected.
- In the third embodiment, not only the robot's owner but also persons other than the owner are identified.
- The hardware of this humanoid robot is the same as that of the first or second embodiment, and it is configured to perform a firm handshake that wraps around the other person's hand.
- When the humanoid robot recognizes through the image sensors of the left and right eyes 52L and 52R that it has met a person, it gives a greeting such as "Hello, I am Robo the humanoid robot." A person who knows that this robot identifies people by handshake and wishes to be identified can say, for example, "Robo, shake hands with me" to start the identification process.
- The identification process is described with reference to the flowchart of FIG. 8. In step S801, the humanoid robot starts the handshake operation by offering the hand 60R.
- The handshake operation transitions the hand into a state in which the person's hand is gripped with constant pressure. As in steps S502 and S503, the transition to the handshake state is checked in step S802 and, if it has not yet occurred, rechecked after a fixed wait (for example, 100 milliseconds) in step S803.
- Once it is determined in step S802 that the transition to the handshake state is complete, the process proceeds to step S804.
- In step S804, near-infrared light is emitted from the near-infrared LEDs 72b and 72t, and captured images are acquired by the near-infrared sensors 74m and 74r.
- In this embodiment, three captured images are acquired at 100-millisecond intervals for each near-infrared sensor.
- The processing is multitasked, consisting of a photographing thread and evaluation threads.
- The process is first started as a photographing thread.
- Each time captured image data is acquired by the near-infrared sensors 74m and 74r in step S804,
- one evaluation thread using that captured image data is generated.
- An evaluation thread performs the processing from step S810 to step S820.
- The photographing thread performs the processing from step S801 to step S808 in parallel with the evaluation threads.
- In step S805, the photographing thread determines whether the predetermined number of captured images has been acquired. If not, it waits 100 milliseconds in step S806 and then acquires another captured image in step S804.
- The predetermined number here is, for example, three when the first embodiment's hand is used, and three per near-infrared sensor when the second embodiment's hand is used.
- If it is determined in step S805 that the predetermined number of captured images has been acquired, the humanoid robot changes the position of the hand 60R or shifts the grip of the fingers, as in the registration process of FIG. 4.
- The process of re-gripping and holding still (step S808) and acquiring further captured images is then repeated.
- The number of acquisition rounds, including these repetitions, is about 2 to 3. Accordingly, 6 to 9 captured images are acquired with the first embodiment's hand, and 6 to 9 per near-infrared sensor, hence 12 to 18 in total, with the second embodiment's hand. Once it is determined in step S807 that image acquisition has been repeated the predetermined number of times, the photographing thread ends without acquiring further images (step S809).
- In step S810, a vein data set is extracted from the captured image and stored as a matching data set. The processing up to this point is the same as the acquisition of the matching data set shown in FIGS. 4 and 5.
- In step S811, one of the registered data sets is retrieved from the vein database.
- Since the owner matters most, the owner's vein data sets are retrieved first.
- In step S812, the registered data set is compared with the matching data set stored in step S810, and a score indicating their degree of similarity is calculated.
- In step S813, it is determined whether the calculated score exceeds a predetermined threshold. If it does, the person shaking hands is identified in step S814, and the process ends with a successful identification result. In that case the humanoid robot addresses the identified individual by name, saying, for example, "You are George." An operation using the personal identification result may then be performed. As in step S514, the vein database is also updated with this matching data set.
- If the score in step S813 is not above the threshold, it is determined whether there is a next registered data set. If there is, the process returns to step S811 and the next registered data set is evaluated in the same way.
- In step S815, it is determined whether the photographing thread has ended. If it has not, the evaluation thread ends in step S817.
- If it is determined in step S816 that the photographing thread has ended, authentication has so far been unsuccessful. A retry similar to that of the embodiment shown in FIG. 5 is performed, but only when the owner could not be confirmed yet the likelihood that it is the owner is high. Specifically, although no score calculated in step S813 exceeded the predetermined threshold (the first threshold), a retry is attempted if the scores against the owner's registered data sets are comparatively high, for example if any of them exceeds a second threshold that is lower than the first. The retry is performed only once.
- In step S818, it is therefore determined whether to retry: a retry is performed when this pass is not itself a retry and the score against the owner's registered data sets exceeded the second threshold. This two-threshold decision is sketched below.
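- A minimal sketch of the step S818 decision, with assumed threshold values (the patent only states that the second threshold is lower than the first):

```python
FIRST_THRESHOLD = 0.85   # assumed: the normal authentication threshold (S813)
SECOND_THRESHOLD = 0.70  # assumed: lower retry threshold

def should_retry(owner_scores, already_retried):
    """Retry the handshake once when no owner score cleared the first
    threshold but at least one cleared the second (step S818)."""
    if already_retried:
        return False  # the retry is performed only once
    return max(owner_scores, default=0.0) > SECOND_THRESHOLD
```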
- In step S819, the finger portions of the hand 60R are opened, the handshake state is released, and the robot then says "Please shake hands again."
- In step S820, a new photographing thread is generated and the evaluation thread ends. The above processing is then repeated from step S801 with the regenerated photographing thread.
- If no retry is warranted, identification is unsuccessful. In that case the humanoid robot asks, "May I have your name?" If the partner answers, for example, "It's Bob," then in step S821 the acquired matching data sets are registered in the database in association with the name Bob.
- The fourth embodiment performs vein authentication and also measures the heart rate.
- The measured heart rate serves as one piece of reference information for identity verification, or is reported to the person by voice. For example, when a vein authentication decision is borderline, a substantially matching heart rate raises the likelihood that the person is genuine, and authentication succeeds. The robot can also serve as a substitute for a pulse meter.
- The basic operation of this embodiment, apart from the heart rate measurement, is the same as that of the first or second embodiment, so its description is not repeated.
- A vein lies near the surface of the skin (at a depth of about 1.5 mm to 2.0 mm). By setting the focal position of the near-infrared sensor's optical system at that depth, the light passing through the veins is collected efficiently and the vein pattern is captured.
- An artery lies deeper than the veins but still absorbs near-infrared light.
- A vein shows almost no pulsation, whereas an artery periodically changes in size with the heartbeat. This periodic fluctuation is therefore reflected in the near-infrared signal level incident on the near-infrared sensor.
- The heart rate can thus be measured by computing the average signal level of the sensor output for each capture and extracting a certain frequency range with a band-pass filter, implemented for example as a digital filter.
- For vein authentication, imaging is performed three times at 100-millisecond intervals; in this embodiment, imaging instead continues every 100 milliseconds throughout the handshake. The average signal level of each image is computed from all the images, and a Fourier transform identifies the period corresponding to the heart rate.
- Since sampling is performed at 100-millisecond intervals, heart rates of up to 300 beats per minute can be measured, which is sufficient because the maximum human heart rate is about 200. Increasing the number of samples and removing high frequencies would improve accuracy, but such high-frequency components can normally be ignored. Frequency components of about 30 cycles per minute or less may arise from body movement or heart rate drift; however, the influence of body movement on the near-infrared signal level is considered small, and heart rate drift can be ignored at the required accuracy given the limited measurement window.
- Hence no band-pass filter is used in this embodiment. The near-infrared signal level incident on the sensor is spatially averaged, the average signal levels obtained at 100-millisecond intervals are Fourier transformed directly, and the largest frequency component in the result identifies the corresponding heart rate. Specifically, shaking hands for about 7 seconds yields 64 samples, which are processed by a fast Fourier transform (FFT).
- Alternatively, 256 samples may be collected after announcing in advance, for example, "I will measure your heart rate." In this case a more accurate heart rate can be measured. A sketch of the computation follows.
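- The following is a minimal sketch of this measurement, assuming the frames arrive as 2-D NumPy arrays; the function name is illustrative.

```python
import numpy as np

SAMPLE_INTERVAL = 0.1  # 100 ms between captures, as in the text

def heart_rate_from_frames(frames):
    """Spatially average each near-infrared frame, FFT the resulting time
    series, and convert the dominant frequency to beats per minute.
    `frames` is a sequence of 2-D arrays (e.g. 64 frames ~ 7 s)."""
    levels = np.array([f.mean() for f in frames])  # spatial average per frame
    levels -= levels.mean()                        # remove the DC component
    spectrum = np.abs(np.fft.rfft(levels))
    freqs = np.fft.rfftfreq(len(levels), d=SAMPLE_INTERVAL)  # in Hz
    dominant_hz = freqs[np.argmax(spectrum)]       # strongest pulsation
    return dominant_hz * 60.0                      # beats per minute
```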
- For the vein authentication itself, the three captured images with the greatest light-dark contrast, that is, those best in focus, are selected and used.
- In the fifth embodiment, sensors for acquiring biological information are further provided in the hand of the humanoid robot of any of the first to fourth embodiments.
- Specifically, force sensors 91a and 91b for measuring the gripping force during a handshake and a temperature sensor 93 for measuring body temperature are provided.
- The force sensors 91a and 91b are, for example, piezoelectric or strain-gauge force sensors, placed where the grip force is reliably received, for example at two positions at the base of the thumb portion 64t toward the index finger portion 64i. The change over time in the magnitude of the gripping force from the start of the handshake operation until it stabilizes (the grip strength pattern) is recorded from each force sensor's output.
- Providing two sensors absorbs the fluctuation caused by slight changes in handshake position. Moreover, correlating the temporal grip-strength changes at the two positions with each other captures part of the person's handshake motion, obtained as a correlation value, as sketched below.
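- A minimal sketch of that correlation value, assuming the two sensors yield equal-length force time series as NumPy arrays:

```python
import numpy as np

def grip_correlation(pattern_a, pattern_b):
    """Normalized zero-lag cross-correlation of the grip strength patterns
    from force sensors 91a and 91b: close to 1.0 when both sensors saw the
    same squeeze profile, near 0 when the profiles are unrelated."""
    a = pattern_a - np.mean(pattern_a)
    b = pattern_b - np.mean(pattern_b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
```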
- The temperature sensor 93 is, for example, a thermistor-based sensor provided near the center of the palm portion 62; it measures the temperature of the skin of the hand during the handshake. The measured value is not the body temperature itself but gives an indication of the person's skin temperature during the handshake.
- When a registration data set is acquired, the grip strength pattern and the skin temperature measured at that time are recorded as that person's biological information data.
- If, when a vein authentication decision is borderline, the grip strength pattern and the skin temperature substantially match the past recorded data, authentication succeeds. These biological data thus serve as auxiliary information for identity verification.
- The biological information data are stored cumulatively as reference data; once they exceed a certain number, those deviating most from the average are deleted. The recorded biological information data are therefore optimized as authentication is repeated.
- In the sixth embodiment, vein authentication is performed and information on the shape of the hand is also acquired. That is, in each of the above embodiments, the outputs of the joint position sensors are recorded in the state where a constant torque is detected by the torque sensors of each finger portion of the humanoid robot, namely the thumb portion 64t, index finger portion 64i, middle finger portion 64m, ring finger portion 64r, and little finger portion 64f. These joint position patterns are considered to reflect the shape of the hand of the person shaking hands.
- When a registration data set is acquired, the joint position pattern at that time is recorded as data indicating the shape of the person's hand.
- If, when a vein authentication decision is borderline, the detected joint position pattern substantially matches the past recorded data, authentication succeeds. The joint position patterns thus also serve as auxiliary information for identity verification.
- In the seventh embodiment, vein authentication is combined with other types of authentication technology.
- First, a formant of the registrant (the owner)
- is registered for voiceprint authentication before, after, or in parallel with the vein data registration process.
- A formant is a pattern of local maxima in the frequency spectrum of speech. This voiceprint pattern is related to the shape of the vocal tract and is difficult to imitate.
- At the same time, the features of intonation and accent, utterance level, and speaking speed are registered.
- Specifically, the person's voice is recorded while they speak freely, the voice is analyzed, and the formants are calculated.
- Formants can therefore be acquired very naturally.
- Authenticating with a voiceprint involves recording the voice and matching it against the registered voiceprint pattern before, after, or in parallel with the vein authentication process. The matching result is reflected in the vein authentication score, for example as sketched below.
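- The patent does not specify how the voiceprint result is folded into the vein score; a simple weighted combination is one possibility, with both weights assumed for illustration:

```python
VEIN_WEIGHT = 0.8   # assumed weighting of the vein authentication score
VOICE_WEIGHT = 0.2  # assumed weighting of the voiceprint matching result

def fused_score(vein_score, voice_score):
    """Fold the voiceprint matching result into the vein authentication
    score; both inputs are similarity scores in [0, 1]."""
    return VEIN_WEIGHT * vein_score + VOICE_WEIGHT * voice_score
```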
- This humanoid robot always performs voice recognition during communication. Normally there is no need to store the recognition results, but it is preferable to store them all in a database, separated by conversation partner. For example, during a conversation the robot performs morphological analysis on the transcribed conversation content and compares it with the contents of the database. Since the accuracy of this may not be very high, the result can be used as one element of communication, for example by asking "Are you Mr. So-and-so?"
- Similarly, face authentication data are registered before, after, or in parallel with the vein data registration process.
- Specifically, face images are collected while the voiceprint formant is being acquired and the person is speaking freely.
- Features are extracted from these face images and registered: specifically, the relative positions and sizes of the facial parts and the shapes of the eyes, nose, cheekbones, and chin.
- Voiceprint authentication and face authentication are generally less accurate than vein authentication, and authentication based on conversation content appears to be lower still. However, when the accuracy of vein authentication drops for some reason, they complement it. Moreover, since voiceprint, face, and conversation-content authentication use information sources entirely different from vein authentication, combining them with vein authentication is considered very effective.
- Vein authentication is highly accurate but takes time, owing to processing such as database lookups. When performing authentication, therefore, one option is to authenticate first by voiceprint, face, or conversation content (or a combination of them), and perform vein authentication only when those methods give no clear result.
- Conversely, vein authentication may be performed at the outset, with subsequent authentications performed by any one or a combination of voiceprint, face, and conversation-content authentication.
- In each of the above embodiments, when the score exceeds the threshold in step S513 or step S813, the individual is successfully identified and the process ends. The matching data set is then added to the vein database as a new registered data set, and when the registered data sets exceed a predetermined number (for example, 32 sets), the one least correlated with the others is deleted.
- As a variation, the score may instead be calculated against all of the registered data sets associated with the individual.
- In that case, the registered data set having the lowest score is deleted, and the matching data set is added as a new registered data set in its place. A sketch of this variation follows.
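- A minimal sketch of the variation, again assuming hypothetical `database` and `similarity` interfaces:

```python
def update_records(database, person, matching_set, max_records=32):
    """Score the new matching data set against every registered data set
    for `person` and replace the lowest-scoring record when full."""
    records = database.records_for(person)
    scores = [similarity(r, matching_set) for r in records]  # hypothetical scorer
    if len(records) >= max_records:
        weakest = records[scores.index(min(scores))]
        database.delete(weakest)         # drop the lowest-scoring record
    database.add(person, matching_set)   # register the new data set
```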
- As described above, with the humanoid robot according to the present invention, theft of the robot can be deterred by identifying the person it touches through a motion as natural as a handshake, and the identification result can be reflected in the robot's behavior. The value of the humanoid robot itself is thereby increased.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Robotics (AREA)
- Mechanical Engineering (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Physics & Mathematics (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Biomedical Technology (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Heart & Thoracic Surgery (AREA)
- Surgery (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Human Computer Interaction (AREA)
- Vascular Medicine (AREA)
- Orthopedic Medicine & Surgery (AREA)
- Manipulator (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
Description
14 Timer
15 I/O control circuit
16 Bus
17 Drive motor control circuit
22L Left leg
22R Right leg
30 Torso
42L Left arm
42R Right arm
50 Head
52L, 52R Eyes
54L, 54R Ears
56 Nose
58 Mouth
60L Left hand
60R Right hand
62 Palm portion
64f Little finger portion
64i Index finger portion
64r Ring finger portion
64t Thumb portion
64m Middle finger portion
65f Little finger
65i Index finger
65m Middle finger
65r Ring finger
72b, 72t Near-infrared LEDs
74m, 74r Near-infrared sensors
76m, 76r Near-infrared windows
80R Hand
84m, 84r Near-infrared sensors
86m, 86r Near-infrared windows
88 Light-shielding cover
91a, 91b Force sensors
100 Humanoid robot
Claims (7)
- A humanoid robot comprising: a right hand having a plurality of finger portions provided with joints and drive mechanisms such that it can shake hands with a human hand;
a near-infrared light emitting device and a near-infrared sensor provided in the right hand, positioned so that, in a state of shaking hands with a human hand, near-infrared light emitted from the near-infrared light emitting device illuminates the inside of the human hand and is detected by the near-infrared sensor as a vein pattern;
an information recording device that records the vein pattern detected by the near-infrared sensor; and
an information processing device that compares the vein pattern recorded in the information recording device with another vein pattern detected by the near-infrared sensor and calculates the degree of similarity between them.
- The humanoid robot according to claim 1, wherein the information processing device controls the movement of the right hand of the humanoid robot to shake hands with a human hand, acquires a vein pattern as a matching data set using the near-infrared light emitting device and the near-infrared sensor, calculates the degree of similarity between this vein pattern and a vein pattern recorded in advance in the information recording device as a registered data set, and performs personal identification of the human based on this similarity.
- The humanoid robot according to claim 2, wherein the vein pattern recorded as the registered data set is a vein pattern of the owner of the humanoid robot, and the owner is authenticated based on the degree of similarity between this vein pattern and the matching data set.
- The humanoid robot according to claim 2, wherein the registered data set is acquired repeatedly while the position of the right hand of the humanoid robot is changed.
- The humanoid robot according to claim 3, wherein, after the registered data set is acquired, the right hand of the humanoid robot is once released from the human hand being shaken, the handshake is performed again, the registered data set is acquired once more, and both are used as registered data sets for personal identification by comparison with a matching data set.
- The humanoid robot according to claim 5, wherein the result of the personal identification is reflected in the operation of the humanoid robot.
- The humanoid robot according to claim 6, wherein the vein pattern is a vein pattern of a human finger.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2016546641A JP6295399B2 (ja) | 2014-09-03 | 2015-08-31 | 人型ロボット |
US15/508,443 US10195748B2 (en) | 2014-09-03 | 2015-08-31 | Humanoid robot |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-179226 | 2014-09-03 | ||
JP2014179226A JP2016052697A (ja) | 2014-09-03 | 2014-09-03 | 人型ロボット |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016035759A1 true WO2016035759A1 (ja) | 2016-03-10 |
Family
ID=55439820
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/074739 WO2016035759A1 (ja) | 2014-09-03 | 2015-08-31 | 人型ロボット |
Country Status (3)
Country | Link |
---|---|
US (1) | US10195748B2 (ja) |
JP (2) | JP2016052697A (ja) |
WO (1) | WO2016035759A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2018082124A (ja) * | 2016-11-18 | 2018-05-24 | 国立研究開発法人理化学研究所 | 磁気素子、スキルミオンメモリ、スキルミオンメモリ搭載中央演算処理lsi、データ記録装置、データ処理装置およびデータ通信装置 |
WO2019142664A1 (ja) * | 2018-01-16 | 2019-07-25 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
JP2019524465A (ja) * | 2016-08-17 | 2019-09-05 | ユニバーシティ・オブ・ハートフォードシャー・ハイヤーエデュケーション・コーポレーションUniversity Of Hertfordshire Higher Education Corporation | ロボット |
JP2020503728A (ja) * | 2017-05-31 | 2020-01-30 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | ブロックチェーンデータ処理方法および装置 |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10708301B2 (en) * | 2016-03-24 | 2020-07-07 | Always Organised Ltd. | Method of, and apparatus for, secure online electronic communication |
US10682774B2 (en) | 2017-12-12 | 2020-06-16 | X Development Llc | Sensorized robotic gripping device |
US10792809B2 (en) * | 2017-12-12 | 2020-10-06 | X Development Llc | Robot grip detection using non-contact sensors |
JP6554202B1 (ja) * | 2018-04-02 | 2019-07-31 | 日本航空電子工業株式会社 | センサモジュール及びロボットシステム |
JP7326707B2 (ja) * | 2018-06-21 | 2023-08-16 | カシオ計算機株式会社 | ロボット、ロボットの制御方法及びプログラム |
EP3653348A1 (en) * | 2018-11-19 | 2020-05-20 | Tata Consultancy Services Limited | System and method for intelligent 3d imaging guided robotic gripper |
KR20240141029A (ko) * | 2023-03-16 | 2024-09-25 | 성균관대학교산학협력단 | Ai 로봇 메디컬 핸드 장치 및 이를 이용한 자가 검사 방법 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001236585A (ja) * | 2000-02-21 | 2001-08-31 | Sony Corp | 移動ロボット及び移動ロボットのための盗難防止方法 |
JP2003281653A (ja) * | 2002-03-26 | 2003-10-03 | Victor Co Of Japan Ltd | 自律行動ロボット |
JP2006235772A (ja) * | 2005-02-23 | 2006-09-07 | Tamotsu Yokoyama | データ収集解析表示システム |
JP2006255430A (ja) * | 2006-04-24 | 2006-09-28 | Fujitsu Ltd | 個人認識装置 |
JP2011081756A (ja) * | 2009-10-08 | 2011-04-21 | Inspire Corp | 静脈認証システム・・私の指パスの一日 |
JP2011240468A (ja) * | 2010-05-21 | 2011-12-01 | Toyota Motor Corp | ロボットの接触種類判別システム |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU7017396A (en) * | 1995-09-08 | 1997-03-27 | Ross-Hime Designs, Inc. | Robotic manipulator |
JP4303602B2 (ja) * | 2004-01-09 | 2009-07-29 | 本田技研工業株式会社 | 顔面像取得システム |
JP4384021B2 (ja) * | 2004-12-14 | 2009-12-16 | 本田技研工業株式会社 | 脚式ロボットの制御装置 |
US8585620B2 (en) * | 2006-09-19 | 2013-11-19 | Myomo, Inc. | Powered orthotic device and method of using same |
JP4918004B2 (ja) * | 2006-11-24 | 2012-04-18 | パナソニック株式会社 | 多指ロボットハンド |
JP2008262272A (ja) * | 2007-04-10 | 2008-10-30 | Toyota Central R&D Labs Inc | 肌色モデル生成装置及びプログラム |
US8052185B2 (en) * | 2009-04-09 | 2011-11-08 | Disney Enterprises, Inc. | Robot hand with humanoid fingers |
KR101896473B1 (ko) * | 2012-01-04 | 2018-10-24 | 삼성전자주식회사 | 로봇 핸드의 제어 방법 |
US9814604B2 (en) * | 2012-08-12 | 2017-11-14 | 5Th Element Limited | Gripping device |
JP5549724B2 (ja) | 2012-11-12 | 2014-07-16 | 株式会社安川電機 | ロボットシステム |
-
2014
- 2014-09-03 JP JP2014179226A patent/JP2016052697A/ja active Pending
-
2015
- 2015-08-31 WO PCT/JP2015/074739 patent/WO2016035759A1/ja active Application Filing
- 2015-08-31 JP JP2016546641A patent/JP6295399B2/ja active Active
- 2015-08-31 US US15/508,443 patent/US10195748B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001236585A (ja) * | 2000-02-21 | 2001-08-31 | Sony Corp | 移動ロボット及び移動ロボットのための盗難防止方法 |
JP2003281653A (ja) * | 2002-03-26 | 2003-10-03 | Victor Co Of Japan Ltd | 自律行動ロボット |
JP2006235772A (ja) * | 2005-02-23 | 2006-09-07 | Tamotsu Yokoyama | データ収集解析表示システム |
JP2006255430A (ja) * | 2006-04-24 | 2006-09-28 | Fujitsu Ltd | 個人認識装置 |
JP2011081756A (ja) * | 2009-10-08 | 2011-04-21 | Inspire Corp | 静脈認証システム・・私の指パスの一日 |
JP2011240468A (ja) * | 2010-05-21 | 2011-12-01 | Toyota Motor Corp | ロボットの接触種類判別システム |
Non-Patent Citations (2)
Title |
---|
HENZE, B.; ET AL.: "Control applications of TORO - A Torque controlled humanoid robot", PROCEEDINGS OF 2014 14TH IEEE -RAS INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS, November 2014 (2014-11-01), pages 841 - 841, XP032736047, DOI: doi:10.1109/HUMANOIDS.2014.7041461 * |
SHIOMI, M.; ET AL.: "Interactive Humanoid Robots for a Science Museum", IEEE INTELLIGENT SYSTEMS, vol. 22, no. Issue:2, March 2007 (2007-03-01), pages 25 - 32, XP011175407, DOI: doi:10.1109/MIS.2007.37 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019524465A (ja) * | 2016-08-17 | 2019-09-05 | ユニバーシティ・オブ・ハートフォードシャー・ハイヤーエデュケーション・コーポレーションUniversity Of Hertfordshire Higher Education Corporation | ロボット |
JP2018082124A (ja) * | 2016-11-18 | 2018-05-24 | 国立研究開発法人理化学研究所 | 磁気素子、スキルミオンメモリ、スキルミオンメモリ搭載中央演算処理lsi、データ記録装置、データ処理装置およびデータ通信装置 |
US10658426B2 (en) | 2016-11-18 | 2020-05-19 | Riken | Magnetic element, skyrmion memory, skyrmion memory-mounted central processing LSI, data recording apparatus, data processing apparatus, and data communication apparatus |
JP2020503728A (ja) * | 2017-05-31 | 2020-01-30 | アリババ・グループ・ホールディング・リミテッドAlibaba Group Holding Limited | ブロックチェーンデータ処理方法および装置 |
WO2019142664A1 (ja) * | 2018-01-16 | 2019-07-25 | ソニー株式会社 | 情報処理装置、情報処理方法、及び、プログラム |
Also Published As
Publication number | Publication date |
---|---|
JP6295399B2 (ja) | 2018-03-20 |
JPWO2016035759A1 (ja) | 2017-07-13 |
US10195748B2 (en) | 2019-02-05 |
JP2016052697A (ja) | 2016-04-14 |
US20170282380A1 (en) | 2017-10-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6295399B2 (ja) | 人型ロボット | |
JP4294853B2 (ja) | 操作指示装置 | |
US10262123B2 (en) | Multimodal biometric authentication system and method with photoplethysmography (PPG) bulk absorption biometric | |
JP6841167B2 (ja) | コミュニケーション装置、コミュニケーションロボットおよびコミュニケーション制御プログラム | |
US9907103B2 (en) | Mobile terminal, wearable device, and equipment pairing method | |
US20210259557A1 (en) | Doorway system that utilizes wearable-based health state verifications | |
US20220032482A1 (en) | Information processing device and storage medium | |
US10836041B2 (en) | More endearing robot, robot control method, and non-transitory recording medium | |
JP2006247780A (ja) | コミュニケーションロボット | |
JP2019164352A (ja) | 人間型ロボットとユーザーの間におけるマルチモード会話を実行する方法、前記方法を実装するコンピュータプログラム及び人間型ロボット | |
US20100328033A1 (en) | Biometric authentication device, biometric authentication method, and storage medium | |
WO2017212901A1 (ja) | 車載装置 | |
JP7205148B2 (ja) | ロボット、制御方法、及び、プログラム | |
JP2006123136A (ja) | コミュニケーションロボット | |
US20170193207A1 (en) | Multimodal biometric authentication system and method with photoplethysmography (ppg) bulk absorption biometric | |
CN111108463A (zh) | 信息处理装置、信息处理方法和程序 | |
JP6380668B2 (ja) | 把持型心電測定装置 | |
KR20200137919A (ko) | 피부 케어 장치를 제어하는 전자 장치와 이의 동작 방법 | |
JP2005131748A (ja) | 関係検知システム | |
JP2020042404A (ja) | 撮像装置、構造物、情報処理装置、および生体認証システム | |
JP2019175432A (ja) | 対話制御装置、対話システム、対話制御方法及びプログラム | |
US20230095810A1 (en) | User Authentication Using Biometric and Motion-Related Data of a User Using a Set of Sensors | |
CN110625607A (zh) | 机器人、机器人的控制方法及存储介质 | |
JP2019208651A (ja) | 視力検査処理プログラムおよび視力検査システム | |
EP3839943A1 (en) | Instruction validation system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15838162 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016546641 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15508443 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15838162 Country of ref document: EP Kind code of ref document: A1 |