JP2013242843A - Information processing device, electronic apparatus, and program - Google Patents


Info

Publication number
JP2013242843A
Authority
JP
Japan
Prior art keywords
object
collation
gesture
coordinates
detection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2012229326A
Other languages
Japanese (ja)
Other versions
JP5364194B2 (en)
Inventor
Takashi Mochizuki
隆 望月
Original Assignee
Bank Of Tokyo-Mitsubishi Ufj Ltd
株式会社三菱東京Ufj銀行
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2012101258
Application filed by Bank Of Tokyo-Mitsubishi Ufj Ltd (株式会社三菱東京Ufj銀行)
Priority to JP2012229326A
Publication of JP2013242843A
Application granted
Publication of JP5364194B2
Application status is Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To accurately recognize an input instruction given to a device by using an object that moves with predetermined characteristics.
SOLUTION: An information processing device comprises: detection means that detects an object existing in a predetermined detection range and outputs detection data corresponding to the detected object; coordinate calculation means that calculates the coordinates of at least one part of the object on the basis of the detection data; and determination means that collates a collation pattern, which defines a trajectory set for the part of the object whose coordinates are calculated and an allowable range around that trajectory, with an area determined on the basis of the temporal change of the calculated coordinates, and determines an instruction to the device on the basis of the collation result.

Description

  The present invention relates to a technique for setting a recognition space for recognizing an object.

  When inputting an instruction to a computer, the user uses an input interface such as a keyboard or a mouse. In recent years, techniques related to natural interfaces (sometimes referred to as natural user interfaces) have been developed to make the input of user instructions easier. For example, as described in Patent Document 1, a technique for inputting a user instruction by a gesture has been developed. Further, as described in Patent Document 2, a technique has been developed that improves accuracy by combining the user's line of sight with a gesture when a user instruction is input.

US Patent Application Publication No. 2011/0193939
US Patent Application Publication No. 2011/0029918

  Because each body part that performs a gesture moves with its own characteristics, the method required to detect a gesture accurately differs depending on the part performing the gesture to be detected. Further improvement in the detection accuracy of each gesture is therefore desired.

  An object of the present invention is to accurately recognize an input instruction given to an apparatus by using an object that moves with predetermined characteristics.

  According to an embodiment of the present invention, there is provided an information processing apparatus including: a detection unit that detects an object existing in a predetermined detection range and outputs detection data corresponding to the detected object; a coordinate calculation unit that calculates the coordinates of at least one part of the object on the basis of the detection data; and a determination unit that collates a collation pattern, which defines a trajectory set for the part of the object whose coordinates are calculated and an allowable range around that trajectory, with an area determined on the basis of the temporal change of the calculated coordinates, and determines an instruction to the apparatus on the basis of the collation result.

  According to another embodiment of the present invention, there is provided a program that causes a computer to function as: coordinate calculation means that calculates the coordinates of at least one part of an object on the basis of detection data output from detection means that detects an object existing in a predetermined detection range and outputs detection data corresponding to the detected object; and determination means that collates a collation pattern, which defines a trajectory set for the part of the object whose coordinates are calculated and an allowable range around that trajectory, with an area determined on the basis of the temporal change of the calculated coordinates, and determines an instruction to the apparatus on the basis of the collation result.

  ADVANTAGE OF THE INVENTION: According to the present invention, an input instruction given to an apparatus by using an object that moves with predetermined characteristics can be recognized accurately.

FIG. 1 is a schematic diagram showing the configuration of an electronic device 1 according to a first embodiment of the present invention.
FIG. 2 is a block diagram showing the functional configuration of an information processing apparatus 10 according to the first embodiment.
FIG. 3 is a block diagram showing the configuration of a recognition space setting function 100 according to the first embodiment.
FIG. 4 is a diagram explaining an example of a method for setting a reference point C in a scanning unit 130 according to the first embodiment.
FIG. 5 is a flowchart showing a process of setting a recognition space in a recognition space setting unit 140 according to the first embodiment.
FIG. 6 is a diagram explaining an example of a method for specifying the scanning-line shape in the recognition space setting unit 140 according to the first embodiment.
FIG. 7 is a diagram explaining an example of a method for specifying a feature position in the recognition space setting unit 140 according to the first embodiment.
FIG. 8 is a diagram explaining an example of a method for adjusting the specific shape in the recognition space setting unit 140 according to the first embodiment.
FIG. 9 is a diagram explaining an example of a method for setting the recognition space in the recognition space setting unit 140 according to the first embodiment.
FIG. 10 is a block diagram showing the configuration of a gesture input function 200 according to the first embodiment.
FIG. 11 is a diagram explaining the positions of the coordinates calculated by a behavior measurement unit 210 according to the first embodiment.
FIG. 12 is a diagram explaining a collation table used by a determination unit 220 according to the first embodiment.
FIG. 13 is a diagram explaining examples of gestures defined in the collation table according to the first embodiment.
FIG. 14 is a flowchart showing processing in the determination unit 220 according to the first embodiment.
FIG. 15 is a block diagram showing the configuration of a line-of-sight authentication function 300 according to the first embodiment.
FIG. 16 is a diagram explaining an example of a confirmation screen displayed on a display device 14 according to the first embodiment.
FIG. 17 is a flowchart showing processing in a determination unit 330 according to the first embodiment.
FIG. 18 is a diagram explaining the positions of the coordinates calculated by the behavior measurement unit 210 according to a second embodiment of the present invention and a method of using an expansion/contraction device 3000.
FIG. 19 is a diagram explaining a collation table used by the determination unit 220 according to the second embodiment.
FIG. 20 is a diagram explaining examples of gestures defined in the collation table according to the second embodiment.
FIG. 21 is a diagram explaining the positions of the coordinates calculated by the behavior measurement unit 210 according to a third embodiment of the present invention and a method of using an expansion/contraction device 3010.
FIG. 22 is a diagram explaining a method of using the expansion/contraction device 3010 according to the third embodiment that differs from FIG. 21.
FIG. 23 is a diagram explaining a collation table used by the determination unit 220 according to the third embodiment.
FIG. 24 is a diagram explaining a method of using a light-emitting device 3050 according to a fourth embodiment of the present invention.
FIG. 25 is a diagram explaining a collation table used by the determination unit 220 according to the fourth embodiment.
FIG. 26 is a diagram explaining a method of using a light-emitting device 3060 according to the fourth embodiment.
FIG. 27 is a diagram explaining a collation table used by the determination unit 220 according to a fifth embodiment of the present invention.
FIG. 28 is a diagram explaining a registration area determined based on the positions of the coordinates calculated by the behavior measurement unit 210 according to the fifth embodiment.
FIG. 29 is a diagram explaining details of the registration area according to the fifth embodiment.
FIG. 30 is a diagram explaining an example of a collation pattern according to the fifth embodiment.
FIG. 31 is a diagram explaining a comparison between the registration area and a detection area according to the fifth embodiment.
FIG. 32 is a diagram explaining collation between the detection area and the collation pattern according to the fifth embodiment.
FIG. 33 is a diagram explaining an example in which the detection area and the collation pattern according to the fifth embodiment do not match.
FIG. 34 is a diagram explaining an example in which the detection area and the collation pattern according to the fifth embodiment match.
FIG. 35 is a diagram explaining another example of the collation pattern according to the fifth embodiment.

  Hereinafter, an electronic apparatus according to an embodiment of the present invention will be described in detail with reference to the drawings. The embodiments shown below are examples of embodiments of the present invention, and the present invention is not limited to these embodiments.

<First Embodiment>
An electronic apparatus according to a first embodiment of the present invention will be described in detail with reference to the drawings.

[Overall structure]
FIG. 1 is a schematic diagram showing the configuration of the electronic device 1 according to the first embodiment of the present invention. In this example, the electronic device 1 is a personal computer. The electronic device 1 is an apparatus having a natural interface that can accept instruction input from the user not only through the keyboard 13a or the mouse 13b but also through the user's gestures (mainly hand gestures) or line of sight. The electronic device 1 may be any device that accepts input of a user instruction and executes processing according to the instruction, such as a smartphone, a mobile phone, a television, a game machine, or a security device.

  The electronic device 1 includes an information processing apparatus 10, an operation unit 13 (in this example, the keyboard 13a and the mouse 13b), a display device 14, an object detection device 20, and a line-of-sight measurement device 30. Each of these components is connected to the information processing apparatus 10 by wire or wirelessly. Some of the components may be configured as an integral apparatus, for example the object detection device 20 and the line-of-sight measurement device 30, or the entire configuration may be an integral apparatus.

  The information processing apparatus 10 includes a CPU (Central Processing Unit) 11 and a memory 12. The CPU 11 implements various functions such as a recognition space setting function 100, a gesture input function 200, and a line-of-sight authentication function 300 (see FIG. 2) described later by executing a program stored in the memory 12. Each component of the electronic device 1 is controlled by various functions realized by the CPU 11.

  The operation unit 13 receives an operation by the user and outputs a signal corresponding to the received operation to the CPU 11. The display mode of the screen of the display device 14 is controlled by the CPU 11.

  The object detection device 20 detects an object by a pattern projection method, measures the shape of the object in three dimensions, and outputs a measurement signal indicating the measurement result. The object detection device 20 includes a detection sensor 21 that detects an object existing in a predetermined detection range. In this example, the detection sensor 21 is a depth sensor having a light-emitting element that irradiates infrared light in a predetermined pattern (a dot pattern, a mesh pattern, or the like) and an infrared image sensor that receives the infrared light whose pattern has been distorted by reflection from the object and outputs a light-reception signal. The object detection device 20 generates the measurement signal from the light-reception signal output by the infrared image sensor. The measurement signal therefore reflects the shape of the object as viewed from the detection sensor 21.

  In this example, the pattern projection method is used to detect an object in the object detection device 20, but a TOF (Time of Flight) method such as that used in a laser interferometer or the like may be used, or a method of analyzing image data obtained by shooting with an image sensor for visible light may be used. A plurality of methods may also be used in combination. Whichever method is used, the object detection device 20 outputs a measurement signal representing, in three-dimensional coordinates, the result of measuring the shape of an object existing in the detection range.
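Although the patent does not give an implementation, the idea that the measurement signal expresses the detected shape in three-dimensional coordinates can be illustrated with the following minimal Python sketch, which converts a depth image into 3-D points using a simple pinhole-camera model; the intrinsic parameters (fx, fy, cx, cy) and the synthetic depth map are hypothetical values chosen only for illustration, not characteristics of the actual detection sensor 21.

```python
import numpy as np

def depth_to_points(depth_m, fx=580.0, fy=580.0, cx=320.0, cy=240.0):
    """Convert a depth image (metres per pixel) into an N x 3 array of 3-D
    coordinates in the sensor frame, using a pinhole-camera model.
    Pixels with depth 0 (no reflection received) are discarded."""
    h, w = depth_m.shape
    us, vs = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    valid = z > 0
    x = (us - cx) * z / fx
    y = (vs - cy) * z / fy
    return np.stack([x[valid], y[valid], z[valid]], axis=1)

# Example: a synthetic 480x640 depth map with an "object" 0.8 m away.
depth = np.zeros((480, 640))
depth[200:280, 300:380] = 0.8
points = depth_to_points(depth)
print(points.shape)  # (6400, 3)
```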

  In this example, as shown in FIG. 1, the object detection device 20 is installed on the left side of the display device 14 as viewed from the user using the electronic device 1, and the detection sensor 21 is directed in the direction of arrow A1. Infrared light is therefore irradiated from the user's left-hand side toward the right-hand side, and the space between the user using the electronic device 1 and the display device 14, together with its surroundings, becomes the detection range. The position of the object detection device 20 and the orientation of the detection sensor 21 are thus suitable for detecting the user's right hand as an object. This arrangement of the object detection device 20 is only an example; it is not limited to this arrangement and may be determined according to the object to be detected.

  The line-of-sight measurement device 30 is a device that performs imaging for measuring the user's line-of-sight direction. It includes a line-of-sight sensor 31 comprising a light-emitting element that irradiates infrared light toward the user's eye to obtain a Purkinje image, and an image sensor that captures an area including the user's eye. An imaging signal indicating the imaging result of the image sensor is output to the information processing apparatus 10, where it is used to measure the user's line-of-sight direction. In this example, the line-of-sight measurement device 30 is attached to the upper part of the display device 14 as shown in FIG. 1.

  In this example, a known method of measuring the line-of-sight direction by the corneal reflection method using a Purkinje image is used, but other known methods such as the sclera tracker method, the EOG (electro-oculography) method, or the search coil method may be used instead. When another method is used, the line-of-sight measurement device 30 may acquire information on the user's line of sight with a configuration corresponding to the method used for measuring the line-of-sight direction.

[Functional configuration of information processing apparatus 10]
FIG. 2 is a block diagram showing a functional configuration of the information processing apparatus 10 according to the first embodiment of the present invention. The information processing apparatus 10 includes a recognition space setting function 100, a gesture input function 200, a line-of-sight authentication determination function 300, a detection unit 500, and an execution unit 700.

  The detection unit 500 outputs detection data corresponding to the detected object based on the measurement signal output from the object detection device 20. The detection data may be any data as long as the shape (for example, contour) of the object detected in the detection range can be recognized from the measurement signal. The detection data may include data indicating an outline of the shape of the object (for example, the axis of the object). Note that the detection unit 500 may be included in the object detection device 20.

  The recognition space setting function 100 has a function of recognizing the position of one end of an object existing in the above-described detection range based on detection data from the detection unit 500 and setting a recognition space around it. In this example, the object is a user's hand and arm, and one end of the object means a hand (including a finger). In the following description, it is assumed that the object is a user's hand and arm. The recognition space is set around the hand in the detection range, and is set to recognize the gesture by paying attention to the movement of the hand.

  The gesture input function 200 has a function of recognizing the movement or shape of the hand (particularly the fingers) in the recognition space as a gesture based on detection data from the detection unit 500, and determining an instruction corresponding to the gesture as an instruction input to the apparatus. In this example, it also has a function of determining an instruction input to the apparatus based on an operation on the operation unit 13.

  The line-of-sight authentication determination function 300 has a function of measuring the user's line-of-sight direction based on the imaging signal from the line-of-sight measurement device 30 and determining whether the line of sight is directed at a predetermined area displayed on the display device 14. Details of the recognition space setting function 100, the gesture input function 200, and the line-of-sight authentication determination function 300 will be described later.

  The execution unit 700 controls the operation of the electronic device 1 by executing processing based on instructions determined by the processing of the gesture input function 200 and the line-of-sight authentication determination function 300. At this time, depending on the result of the process in the gesture input function 200 and the line-of-sight authentication determination function 300, the process based on the instruction may not be executed or the process content may be changed.

[Recognition space setting function 100]
FIG. 3 is a block diagram showing the configuration of the recognition space setting function 100 according to the first embodiment of the present invention. The recognition space setting function 100 is realized using the functions of a region specifying unit 110, a reference point setting unit 120, a scanning unit 130, and a recognition space setting unit 140. In this example, the processing by the recognition space setting function 100 described in detail below is executed when the user inputs a start instruction to the operation unit 13. This processing may instead be executed at regular intervals, according to a predetermined schedule, or when a predetermined object such as a hand is detected in the detection range.

  Based on the detection data output from the detection unit 500, the region specifying unit 110 specifies the hand and arm region (hereinafter referred to as the object region) from the contour of the object detected in the detection range. In this example, the axis L along the direction in which the arm extends (see FIG. 4) is also specified.

  Based on the specified object region, the reference point setting unit 120 sets a reference point C, which serves as the scanning start point described later, as shown in FIG. 4.

  FIG. 4 is a diagram explaining an example of a method for setting the reference point C in the scanning unit 130 according to the first embodiment of the present invention. The center point P shown in FIG. 4 indicates the center position of the entire detection range. The reference point setting unit 120 extends a line from the center point P in the vertical direction and sets its intersection with the axis L as the reference point C. The reference point C may instead be the point on the axis L closest to the center point P, or may be set by other methods, including methods that do not use the axis L or the center point P. It is desirable to set the reference point C within the object region (the hand 1000 and the arm 2000), and more desirable to set it within the region obtained by removing the fingers from the object region.
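As a minimal sketch of this geometric step (an illustration only, not the patent's implementation; representing the axis L as a 2-D point-and-direction pair in the detection plane is an assumption), the reference point C can be computed as the intersection of the vertical line through the center point P with the axis L, falling back to the point on L closest to P when L is itself vertical.

```python
def reference_point(p, l_point, l_dir, eps=1e-9):
    """Reference point C: intersection of the vertical line through the
    center point p with the arm axis L (given by a point and a direction).
    If L is itself vertical, fall back to the point on L closest to p."""
    px, py = p
    ax, ay = l_point
    dx, dy = l_dir
    if abs(dx) > eps:                      # L is not vertical
        t = (px - ax) / dx                 # solve ax + t*dx == px
        return (px, ay + t * dy)
    return (ax, py)                        # L vertical: closest point to p

print(reference_point(p=(0.0, 0.0), l_point=(1.0, -2.0), l_dir=(1.0, 2.0)))
# -> (0.0, -4.0)
```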

  Returning to FIG. 3, the description will be continued. The scanning unit 130 sets a scanning line SL surrounding the reference point C and scans the detection range by enlarging the scanning line SL outward from the reference point C (see FIG. 6). The shape of the scanning line SL is circular in this example, but may be any shape that surrounds the reference point C, such as an ellipse or a rectangle.

  The recognition space setting unit 140 sets a recognition space based on the shape of the scanning line SL at the point where the portion in which the scanning line SL and the object region overlap becomes a single continuous segment constituting only part of the scanning line SL (hereinafter referred to as the specific condition). A method for setting the recognition space is described below with reference to FIGS. 5 to 9.

  FIG. 5 is a flowchart showing the process of setting a recognition space in the recognition space setting unit 140 according to the first embodiment of the present invention. First, the recognition space setting unit 140 determines whether the portion where the scanning line SL and the object region (the hand 1000 and the arm 2000) overlap (hereinafter referred to as the overlapping portion) satisfies the specific condition (step S110). If the specific condition is not satisfied (step S110; No), this determination is repeated; it is therefore executed at regular intervals while the scanning unit 130 enlarges the scanning line SL, until it is determined that the specific condition is satisfied (step S110; Yes). When the overlapping portion satisfies the specific condition (step S110; Yes), the recognition space setting unit 140 specifies the shape of the scanning line SL that satisfies the specific condition (step S120). Hereinafter, the specified shape of the scanning line SL is referred to as the specific shape. The processing in steps S110 and S120 is described concretely with reference to FIG. 6.

  FIG. 6 is a diagram explaining an example of the method for specifying the scanning-line shape in the recognition space setting unit 140 according to the first embodiment of the present invention. The scanning lines SL(1), SL(2), and SL(3) show the scanning line SL as it is enlarged from the reference point C. In FIG. 6, the scanning lines SL(1), SL(2), and SL(3) are indicated by broken lines, and the overlapping portions by thick solid lines. When the scanning line has been enlarged to SL(3), it overlaps only the arm 2000 portion of the object region. That is, the overlapping portion LO(3) satisfies the specific condition, and the recognition space setting unit 140 specifies the shape of the scanning line SL(3) as the specific shape.

  The determination of whether the specific condition is satisfied is not limited to this method; other methods may be used. For example, the number of intersections between the scanning line SL and the contour of the object region may be counted, and the specific condition may be judged to be satisfied indirectly, for example when the number of intersections becomes two. In this way, the method of determining whether the specific condition is satisfied includes not only directly examining the state of the overlapping portion but also indirect determination methods.
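The expansion of the scanning line and the check of the specific condition might be sketched as follows, assuming (purely for illustration, not as the patent's implementation) that the object region is available as a binary image mask and that the circular scanning line is sampled at discrete angles; the condition is treated as satisfied when the samples lying inside the region form exactly one contiguous arc that covers only part of the circle.

```python
import numpy as np

def specific_condition_holds(mask, c, r, samples=720):
    """True when the part of a circular scanning line (center c, radius r)
    overlapping the object region is exactly one contiguous arc and only a
    part of the circle.  mask is a 2-D boolean object-region image."""
    h, w = mask.shape
    ang = np.linspace(0.0, 2 * np.pi, samples, endpoint=False)
    xs = np.clip((c[0] + r * np.cos(ang)).astype(int), 0, w - 1)
    ys = np.clip((c[1] + r * np.sin(ang)).astype(int), 0, h - 1)
    inside = mask[ys, xs]
    if not inside.any() or inside.all():
        return False
    # Count outside->inside transitions while walking around the circle;
    # a single contiguous arc produces exactly one such transition.
    transitions = np.count_nonzero(inside & ~np.roll(inside, 1))
    return transitions == 1

def find_specific_shape(mask, c, r0=5.0, step=2.0, r_max=500.0):
    """Enlarge the scanning line from the reference point until the specific
    condition is first satisfied; return that radius (or None)."""
    r = r0
    while r <= r_max:
        if specific_condition_holds(mask, c, r):
            return r
        r += step
    return None
```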

  Returning to FIG. 5, the recognition space setting unit 140 subsequently specifies a feature position (step S130). In this example, the feature position is the position of the wrist between the hand 1000 and the arm 2000. An example of the method for specifying the feature position is described with reference to FIG. 7.

  FIG. 7 is a diagram explaining an example of the method for specifying the feature position in the recognition space setting unit 140 according to the first embodiment of the present invention. The recognition space setting unit 140 translates the specific shape (scanning line) SL(3) along the axis L; in this example it is moved in the order SL(3), SL(4), SL(5). With this parallel movement, the overlapping portion LO changes in the order LO(3), LO(4), LO(5), and its length changes. The recognition space setting unit 140 calculates the change in the length of the overlapping portion LO and specifies the position where the length changes abruptly (for example, where the amount of change exceeds a predetermined threshold) as the feature position. In the example shown in FIG. 7, the position corresponding to the overlapping portion LO(4) is specified as the feature position.

  As described above, when the overlapping portion LO is recognized by the two intersection points between the scanning line SL and the contour of the object region, the distance between the two intersection points may be used as the length of the overlapping portion LO.
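Building on the same assumptions as the previous sketch (a binary object mask and a circular scanning line sampled at discrete angles), the following sketch translates the specific shape along the arm axis and reports the position where the overlap length, approximated here by the chord between the two boundary intersections, changes abruptly; the step sizes and threshold are hypothetical values.

```python
import numpy as np

def overlap_chord_length(mask, c, r, samples=720):
    """Approximate the overlap between the circle (center c, radius r) and
    the object region by the chord between the first and last circle sample
    inside the region (cf. the two-intersection case above).  Assumes the
    overlapping arc does not wrap across angle 0."""
    h, w = mask.shape
    ang = np.linspace(0.0, 2 * np.pi, samples, endpoint=False)
    xs = np.clip((c[0] + r * np.cos(ang)).astype(int), 0, w - 1)
    ys = np.clip((c[1] + r * np.sin(ang)).astype(int), 0, h - 1)
    inside = np.nonzero(mask[ys, xs])[0]
    if len(inside) < 2:
        return 0.0
    i, j = inside[0], inside[-1]
    return float(np.hypot(xs[j] - xs[i], ys[j] - ys[i]))

def find_feature_position(mask, c, r, axis_dir, steps=40, step_len=4.0,
                          threshold=10.0):
    """Translate the specific shape along the arm axis and return the first
    center at which the overlap length changes abruptly (the wrist)."""
    d = np.asarray(axis_dir, float)
    d /= np.linalg.norm(d)
    prev = overlap_chord_length(mask, c, r)
    pos = np.asarray(c, float)
    for _ in range(steps):
        pos = pos + step_len * d
        cur = overlap_chord_length(mask, pos, r)
        if abs(cur - prev) > threshold:
            return tuple(pos)
        prev = cur
    return None
```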

  Returning to FIG. 5, when the feature position has been specified, the recognition space setting unit 140 subsequently adjusts the specific shape based on the feature position within a range that satisfies the specific condition (step S140). An example of the method for adjusting the specific shape is described with reference to FIG. 8.

  FIG. 8 is a diagram explaining an example of the method for adjusting the specific shape in the recognition space setting unit 140 according to the first embodiment of the present invention. In this example, of the intersections between the specific shape (scanning line) SL(4) and the axis L, the intersection D1 on the feature position side is fixed, and the diameter R of the specific shape is adjusted. At this time, the specific shape is adjusted so that the diameter R becomes minimal within the range that satisfies the specific condition (a predetermined margin may be provided). In the example shown in FIG. 8, the specific shape SL(4) is adjusted to a specific shape SL(6) whose diameter R connects the intersections D1 and D2 with the axis L.

  To adjust the diameter R to its minimum, for example, the following method may be used. First, the intersection D1 is fixed, and the specific shape is reduced until the specific condition is no longer satisfied (for example, until the specific shape overlaps the finger portion of the hand 1000). The size of the specific shape is then set to the size immediately before the specific condition ceased to be satisfied (for example, the largest size at which the specific shape does not overlap the finger portion of the hand 1000). This adjustment method is only an example; another method may be used as long as the adjustment is performed based on the feature position within a range that satisfies the specific condition.

  Returning to FIG. 5, the recognition space setting unit 140 subsequently sets a recognition space based on the adjusted specific shape (step S150). An example of the method for setting the recognition space is described with reference to FIG. 9.

  FIG. 9 is a diagram explaining an example of the method for setting the recognition space in the recognition space setting unit 140 according to the first embodiment of the present invention. In this example, the recognition space is set as a sphere RA whose center is the intersection D1 and whose radius is the diameter R of the adjusted specific shape SL(6). To allow a margin, the radius may be R × (1 + α) (where α is a positive value representing an allowable error ratio) instead of R. The recognition space need not be a sphere and may have various shapes such as an ellipsoid, a rectangular parallelepiped, or a hemisphere. In any case, it may be determined based on the adjusted specific shape SL(6) (for example, its diameter), and it desirably includes the range over which the one end of the object (the hand 1000) can move while the feature position (the wrist) remains fixed.

  Here, when a gesture is made by moving the hand 1000, the hand 1000 moves around the wrist. By setting the center of the recognition space near the wrist, which is the feature position, the behavior of the hand 1000 can be measured no matter how the hand 1000 moves, and erroneous recognition caused by surrounding movements other than the hand 1000 can be reduced. The recognition space set in this way is used by the gesture input function 200 (behavior measurement unit 210).

  There is a limit to how far the hand 1000 can bend toward the arm. The center of the sphere RA of the recognition space is therefore not limited to D1 and may be a point moved from D1 toward the fingertip side (the D2 side). This center may be the midpoint between the two intersections of the adjusted specific shape SL(6) and the contour of the object region. When the center is not D1, the radius may be the distance from this center to D2.
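One possible way to represent the resulting recognition space, assuming it is kept as the sphere RA described above (the coordinate values and the margin α = 0.1 below are hypothetical), is a small structure holding its center and radius together with a membership test that can later be used to restrict the detection data to the space.

```python
from dataclasses import dataclass
import math

@dataclass
class RecognitionSpace:
    """Spherical recognition space: centered near the wrist (intersection D1,
    or a point shifted toward the fingertips), with a radius derived from the
    adjusted specific shape plus an allowance margin."""
    center: tuple          # (x, y, z)
    radius: float

    def contains(self, p):
        return math.dist(self.center, p) <= self.radius

def make_recognition_space(d1, diameter_r, alpha=0.1):
    # Radius R x (1 + alpha), where alpha is the allowable error ratio.
    return RecognitionSpace(center=d1, radius=diameter_r * (1.0 + alpha))

space = make_recognition_space(d1=(0.10, -0.05, 0.60), diameter_r=0.18)
print(space.contains((0.12, 0.00, 0.65)))   # True: point near the hand
print(space.contains((0.60, 0.30, 1.20)))   # False: outside the sphere
```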

[Gesture input function 200]
FIG. 10 is a block diagram showing a configuration of the gesture input function 200 according to the first embodiment of the present invention. The gesture input function 200 is realized using the functions of the behavior measurement unit 210 and the determination unit 220.

  The behavior measurement unit 210 measures the behavior of at least a part of the object in the recognition space based on the detection data output from the detection unit 500. In this example, the behavior measurement unit 210 functions as a coordinate calculation unit that calculates the coordinates of at least a part of the object (in this example, the tips of the five fingers). The fingers whose coordinates are calculated are all five fingers in this example, but may be only some of the fingers or a single finger.

  At this time, the behavior measurement unit 210 may calculate the coordinates by extracting, from the information included in the detection data, only the information within the recognition space. This reduces the information processing load and improves the resolution of the coordinate calculation. To further improve accuracy, the optical system of the object detection device 20 (lens magnification, etc.) may be adjusted so that the detection range is narrowed to the vicinity of the recognition space, or the density of the pattern irradiated by the detection sensor 21 (dot pattern density, etc.) may be increased in the vicinity of the recognition space.

  FIG. 11 is a diagram explaining the positions of the coordinates calculated by the behavior measurement unit 210 according to the first embodiment of the present invention. The behavior measurement unit 210 calculates the coordinate A of the tip of the thumb 1001 of the hand 1000, the coordinate B of the tip of the index finger 1002, the coordinate C of the tip of the middle finger 1003, the coordinate D of the tip of the ring finger 1004, and the coordinate E of the tip of the little finger 1005. The behavior measurement unit 210 continues to calculate the coordinates at predetermined intervals, following the change in the position of each finger. The origin of the coordinates may be a specific position in the detection range, a specific position in the recognition space (such as the center of the sphere RA), or any position on the hand 1000 (for example, the tip of any finger). For example, when the coordinate of the tip of the thumb is used as the origin, the coordinates of the other fingers are calculated as relative coordinates based on the thumb.
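A minimal sketch of how the calculated coordinates A to E might be held, and how they could be re-expressed relative to the thumb tip when the thumb is used as the origin, is shown below; the numeric values are hypothetical sample coordinates, not measured data.

```python
import numpy as np

# Calculated coordinates of the five fingertips (hypothetical sample values,
# metres, in the detection-range frame): A = thumb ... E = little finger.
tips = {
    "A": np.array([0.02, 0.00, 0.60]),   # thumb 1001
    "B": np.array([0.05, 0.08, 0.58]),   # index finger 1002
    "C": np.array([0.07, 0.09, 0.58]),   # middle finger 1003
    "D": np.array([0.09, 0.08, 0.59]),   # ring finger 1004
    "E": np.array([0.11, 0.06, 0.60]),   # little finger 1005
}

def relative_to_thumb(tips):
    """Express coordinates B-E relative to the thumb tip A used as origin."""
    origin = tips["A"]
    return {k: v - origin for k, v in tips.items() if k != "A"}

for name, rel in relative_to_thumb(tips).items():
    print(name, rel)
```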

  Returning to FIG. 10, the determination unit 220 determines the user's instruction to the apparatus based on the coordinates calculated by the behavior measurement unit 210 (hereinafter referred to as the calculated coordinates). In this example, the determination unit 220 determines the user's instruction to the apparatus based on the calculated coordinates and a collation table in which gesture types and instruction types are associated with each other. The instructions include, for example, instructions to input a decision such as OK or NG, and instructions to perform specific processing (calling a menu screen, executing a program, switching pages, and so on). As described later, the determination unit 220 may change the instruction to be determined, or may not determine an instruction at all, depending on the determination result of the line-of-sight authentication determination function 300 and the like. The determination unit 220 may also determine an instruction based on an operation on the operation unit 13.

  FIG. 12 is a diagram explaining the collation table used by the determination unit 220 according to the first embodiment of the present invention. As shown in FIG. 12, the collation table associates gesture types with instruction types. In the example shown in FIG. 12, gesture A is associated with instruction a, gesture B with instruction b, and the change from gesture A to gesture B with instruction c.

  FIG. 13 is a diagram explaining examples of the gestures defined in the collation table according to the first embodiment of the present invention. In this example, gesture A is the gesture shown in FIG. 13(a), and its definition specifies the relative relationship between the calculated coordinates of the fingers as shown in the figure. That is, gesture A is defined as a shape in which the coordinates A to E are arranged vertically.

  Gesture B, on the other hand, is the gesture shown in FIG. 13(b), and its definition likewise specifies the relative relationship between the calculated coordinates of the fingers as shown in the figure. That is, gesture B is defined as a shape in which the coordinates B and C are separated upward and arranged side by side, while the coordinates A, D, and E are gathered below the coordinates B and C. The change from gesture A to gesture B refers to the operation of changing from the gesture shown in FIG. 13(a) to the gesture shown in FIG. 13(b).
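The collation table of FIG. 12 and the coordinate-based gesture definitions of FIG. 13 could be sketched roughly as follows; the threshold values and the classification rules are hypothetical simplifications, since the patent does not specify numeric criteria.

```python
# Collation table: gesture type (or transition) -> instruction (cf. FIG. 12).
COLLATION_TABLE = {
    "A": "instruction a",
    "B": "instruction b",
    ("A", "B"): "instruction c",   # change from gesture A to gesture B
}

def classify_gesture(tips):
    """Toy classifier for the two gestures of FIG. 13 (thresholds are
    hypothetical).  tips maps 'A'..'E' to (x, y, z) arrays; y points upward."""
    xs = {k: v[0] for k, v in tips.items()}
    ys = {k: v[1] for k, v in tips.items()}
    # Gesture A: the five tips lie roughly on one vertical line.
    if max(xs.values()) - min(xs.values()) < 0.03:
        return "A"
    # Gesture B: index (B) and middle (C) tips clearly above the thumb (A),
    # ring (D) and little (E) tips, which are gathered below them.
    if all(ys[f] > ys[g] + 0.03 for f in ("B", "C") for g in ("A", "D", "E")):
        return "B"
    return None                             # no registered gesture matched

def determine_instruction(prev_gesture, gesture):
    """Look up the instruction, preferring a registered transition."""
    if prev_gesture and (prev_gesture, gesture) in COLLATION_TABLE:
        return COLLATION_TABLE[(prev_gesture, gesture)]
    return COLLATION_TABLE.get(gesture)
```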

  Although a right-hand gesture is shown and described here, left-hand gestures may also be defined. Which hand's gesture applies may be determined from the positional relationship between the feature position specified by the recognition space setting unit 140 described above and the one end of the object (for example, which is relatively on the left side).

  FIG. 14 is a flowchart showing the processing in the determination unit 220 according to the first embodiment of the present invention. The determination unit 220 calculates the change speed of the calculated coordinate of each finger from the change in its position, and sets the speed of the fastest-moving finger as the change speed Vf. The change speed Vf need only be determined based on the change speeds of the calculated coordinates of the fingers; it may instead be the average of those change speeds, or the change speed of the calculated coordinate of a specific finger.

  The determination unit 220 determines whether the change speed Vf is equal to or higher than a first speed V1 (30 cm/second in this example) (step S210). If the change speed Vf is less than the first speed V1 (step S210; No), this determination is repeated. When the change speed Vf becomes equal to or higher than the first speed V1 (step S210; Yes), the determination unit 220 determines whether a first time T1 has elapsed with the change speed Vf still equal to or higher than a second speed V2 (step S221); if it has not elapsed (step S221; No), it determines whether the change speed Vf has decelerated to less than the second speed V2 (step S222). The second speed V2 is a speed slower than the first speed V1 (0.5 cm/second in this example); specifically, it is desirable for the second speed V2 to be a speed that can be regarded as a stationary state.

  If the first time T1 (0.5 seconds in this example) elapses (step S221; Yes) without the change speed Vf decelerating to less than the second speed V2 (step S222; No), the process returns to step S210.

  If the change speed Vf decelerates to less than the second speed V2 (step S222; Yes) before the first time T1 elapses (step S221; No), the determination unit 220 determines whether a second time T2 (1 second in this example) has elapsed with the change speed Vf remaining less than the second speed V2 (step S223). If the second time T2 has not elapsed, the process returns to step S222.

  When the second time T2 has elapsed with the change speed Vf less than the second speed V2 (step S223; Yes), the calculated coordinates of each finger (first coordinates) are stored (step S230). Thereafter, the determination unit 220 determines whether the change speed Vf is equal to or higher than the first speed V1 (step S240). If a third time T3 elapses (step S250; Yes) with the speed remaining less than the first speed V1 (step S240; No), the determination unit 220 collates the first coordinates stored in step S230 with the gesture types in the collation table and determines the type of instruction corresponding to the first coordinates (step S280). The process then returns to step S210.

  On the other hand, if the change speed Vf becomes equal to or higher than the first speed V1 (step S240; Yes) before the third time T3 (1 second in this example) elapses (step S250; No), the process proceeds to step S261. Steps S261 to S263 are the same as steps S221 to S223, and their description is therefore omitted.

  When the second time T2 has elapsed with the change speed Vf less than the second speed V2 (step S263; Yes), the calculated coordinates of each finger (second coordinates) are stored (step S270). The determination unit 220 then collates the change from the first coordinates stored in step S230 to the second coordinates stored in step S270 with the gesture types in the collation table and determines the type of instruction corresponding to this change (step S280). The process then returns to step S210. The execution unit 700 executes processing according to the instruction determined in this way. When the instruction determined in step S280 is an instruction to end gesture input, the process may end without returning to step S210. The instruction to end gesture input may be given by a user operation on the operation unit 13, or may be issued when no hand is detected in the detection range for a certain period.

  By performing the processing in the flow described above, the determination unit 220 recognizes a movement as a gesture only when the user's fingers move at a certain speed or higher and then remain almost stationary for a certain time. This is based on the inventor's observation that, when a user intends to input a gesture, the user moves the fingers quickly and then keeps them still for a while in the state corresponding to the intended gesture, whereas the fingers move slowly until the intended gesture starts.

For example, assume that spreading the hand open is used as a gesture. In general, the user often slowly closes the hand and then suddenly spreads it open. The flow is, for example, as shown in (1) to (3) below; a sketch of the corresponding timing logic in code follows the list.
(1) Until the user intends to input a gesture, the hand (fingers) moves slowly, so it is not subject to gesture recognition.
(2) As preparation for starting the gesture, the hand (fingers) moves slowly until the hand is closed, or there is little rest time after the hand is closed, so it is not subject to gesture recognition.
(3) The movement of spreading the hand open, which should be recognized as the gesture, involves fast movement of the hand (fingers) followed by a stationary period after the hand is spread, so it becomes the target of gesture recognition.
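The timing logic of FIG. 14 can be sketched as a small state machine, shown below with the example thresholds from the text (V1 = 30 cm/s, V2 = 0.5 cm/s, T1 = 0.5 s, T2 = 1 s, T3 = 1 s); the state names and the simplified handling of the T1 timeout are assumptions made for this sketch and do not reproduce the flowchart exactly.

```python
class GestureTimingFilter:
    """Sketch of the timing logic of FIG. 14: a hand shape counts as a
    gesture only after the fingers move faster than V1 and then stay almost
    still (slower than V2) for T2 seconds; a second such event within T3
    seconds yields a gesture change, otherwise a single gesture is reported."""

    V1, V2 = 0.30, 0.005        # m/s (30 cm/s and 0.5 cm/s in the text)
    T1, T2, T3 = 0.5, 1.0, 1.0  # seconds

    def __init__(self):
        self.state, self.t_mark, self.first = "IDLE", 0.0, None

    def update(self, t, vf, coords):
        """Feed one sample (time [s], change speed Vf [m/s], coordinates).
        Returns ('single', c1), ('change', c1, c2) or None."""
        s = self.state
        if s in ("IDLE", "ARMED") and vf >= self.V1:
            self.state = "FAST1" if s == "IDLE" else "FAST2"
            self.t_mark = t
        elif s == "ARMED" and t - self.t_mark >= self.T3:
            self.state = "IDLE"                       # no second fast movement
            return ("single", self.first)
        elif s in ("FAST1", "FAST2"):
            if vf < self.V2:                          # came (almost) to rest
                self.state, self.t_mark = s.replace("FAST", "STILL"), t
            elif t - self.t_mark >= self.T1:          # never settled in time
                self.state = "IDLE"
        elif s in ("STILL1", "STILL2"):
            if vf >= self.V2:                         # moved again before T2
                self.state, self.t_mark = s.replace("STILL", "FAST"), t
            elif t - self.t_mark >= self.T2:          # settled for T2 seconds
                if s == "STILL1":
                    self.first, self.state, self.t_mark = coords, "ARMED", t
                else:
                    self.state = "IDLE"
                    return ("change", self.first, coords)
        return None
```

Feeding the filter with periodic samples of the change speed Vf and the current calculated coordinates yields either a single stored hand shape or a pair representing a gesture change, which would then be collated against the collation table as in step S280.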

[Gaze authentication determination function 300]
FIG. 15 is a block diagram showing a configuration of the line-of-sight authentication function 300 according to the first embodiment of the present invention. The line-of-sight authentication determination function 300 is realized using the functions of the line-of-sight measurement unit 310, the display control unit 320, and the determination unit 330.

  The line-of-sight measurement unit 310 measures the user's line-of-sight direction using the imaging signal output from the line-of-sight measurement device 30 and calculates which part of the display device 14 the user's line of sight is directed to. For the measurement of the line-of-sight direction, as described above, a known method for measuring the line-of-sight direction by a corneal reflection method using a Purkinje image is used, but other known methods may be used.

  The display control unit 320 has a function of controlling display on the display device 14 and causes the display device 14 to display a confirmation screen including an authentication area.

  FIG. 16 is a diagram illustrating an example of a confirmation screen displayed on the display device 14 according to the first embodiment of the present invention. The confirmation screen is provided with at least one authentication area. In the example shown in FIG. 16, authentication areas W1, W2, and W3 are provided. Further, VP indicates a position on the display device 14 where the user's line of sight obtained based on the calculation result of the line-of-sight measurement unit 310 is directed (hereinafter referred to as line-of-sight position VP). The line-of-sight position VP may be displayed on the display device 14 or may not be displayed.

  The confirmation screen is a screen displayed on the display device 14 as content to be confirmed by the user, such as a screen required for transaction confirmation at a bank or a screen showing software usage terms. Items that the user should check, particularly important items, are displayed in preset authentication areas. As shown in FIG. 16, when there are a plurality of authentication areas, the order in which the user should view them (the authentication order) is determined. This order may be determined in advance in association with each authentication area, or may be determined from the positional relationship of the authentication areas on the display device 14 (for example, the higher an area is on the screen, the earlier its order).

  In this example, the authentication order is first, second, and third for the authentication areas W1, W2, and W3, respectively. In the example shown in FIG. 16, two lines of horizontally written text are displayed in the authentication area W1, two lines of horizontally written text with fewer characters than W1 are displayed in the authentication area W2, and one line of horizontally written text is displayed in the authentication area W3. The authentication area may also be a single area covering the entire screen of the display device 14.

  Returning to FIG. 15, the description will be continued. The determination unit 330 determines whether the user has confirmed the confirmation screen based on the relationship between the line-of-sight position VP and the authentication area.

  FIG. 17 is a flowchart showing the processing in the determination unit 330 according to the first embodiment of the present invention. First, the determination unit 330 repeatedly determines whether the line-of-sight position VP is within the first authentication area (the authentication area W1 in the example of FIG. 16; a predetermined margin may be allowed) (step S310; No). When it detects that the line-of-sight position VP is within the first authentication area (step S310; Yes), the determination unit 330 determines whether there is an authentication area next in the authentication order (step S320).

  If there is a next authentication area in the authentication order (step S320; Yes), the determination unit 330 repeatedly determines whether the line-of-sight position VP is within that next authentication area (step S330; No). If it cannot be detected that the line-of-sight position VP is within the next authentication area even after a predetermined time has elapsed, the series of processes for visually checking the authentication areas is assumed to have been interrupted; the preceding processing may be reset and the process started again from step S310.

  When it is detected that the line-of-sight position VP is within the next authentication area (step S330; Yes), the process returns to step S320. When there is no further authentication area in the authentication order (step S320; No), the determination unit 330 determines that the confirmation screen has been confirmed by the user (step S340) and ends the process. Hereinafter, this determination is referred to as the user confirmation determination.
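A minimal sketch of this ordered check, assuming rectangular authentication areas given in display coordinates (the layout values below are hypothetical), could look like the following.

```python
from dataclasses import dataclass

@dataclass
class AuthArea:
    """Axis-aligned authentication area on the confirmation screen,
    in display coordinates (pixels)."""
    x: float
    y: float
    width: float
    height: float

    def contains(self, vp, margin=0.0):
        vx, vy = vp
        return (self.x - margin <= vx <= self.x + self.width + margin and
                self.y - margin <= vy <= self.y + self.height + margin)

class GazeConfirmation:
    """Sketch of the determination of FIG. 17: the user confirmation
    determination is made only after the gaze position VP has visited every
    authentication area in the prescribed authentication order."""

    def __init__(self, areas, margin=5.0):
        self.areas = areas          # in authentication order (W1, W2, W3, ...)
        self.margin = margin
        self.next_index = 0

    def update(self, vp):
        """Feed one gaze position; returns True once the screen is confirmed."""
        if self.next_index < len(self.areas) and \
                self.areas[self.next_index].contains(vp, self.margin):
            self.next_index += 1
        return self.next_index >= len(self.areas)

# Hypothetical layout of W1-W3 on a 1920x1080 confirmation screen.
checker = GazeConfirmation([AuthArea(100, 100, 800, 120),
                            AuthArea(100, 300, 500, 120),
                            AuthArea(100, 500, 800, 60)])
for vp in [(150, 150), (200, 350), (400, 520)]:
    confirmed = checker.update(vp)
print(confirmed)   # True: all areas visited in order
```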

  Here, the determination unit 330 may determine that the line-of-sight position VP exists in an authentication area on the condition that the line-of-sight position VP merely passes through part of that area while moving, or it may apply stricter conditions. Examples of conditions that can be used for this determination are given in (a) to (c) below.

(a) The line-of-sight position VP remains continuously within the authentication area for a predetermined time. The predetermined time may be set for each authentication area according to the amount of text displayed in that area.
(b) The line-of-sight position VP is stationary within the authentication area for a predetermined time. The predetermined time may be set for each authentication area according to the amount of text displayed in that area.
(c) The line-of-sight position VP is detected to have moved along the direction of the character string displayed in the authentication area (deviation within a predetermined range may be allowed); that is, the line-of-sight position VP moves roughly along the horizontal direction for horizontal writing and roughly along the vertical direction for vertical writing. When the character string spans a plurality of lines, the determination may further require that such movement of the line-of-sight position VP be detected within the authentication area at least as many times as the number of lines. The amount of movement along the direction of the character string may also be determined according to the length of the character string.

  Returning to FIG. 15, the description will be continued. The determination result of the determination unit 330 described above is reflected in the processing of the determination unit 220 and the execution unit 700. For example, when reflected in the processing of the execution unit 700, the execution unit 700 may execute a process determined in accordance with the confirmation screen (such as transitioning to the next confirmation screen) only when the user confirmation determination has been made. The process based on the instruction determined by the determination unit 220 may also be withheld until the user confirmation determination has been made.

  When reflected in the processing of the determination unit 220, the determination unit 220 may determine an instruction to the apparatus on the condition that the user confirmation determination has been made. In a state where the user confirmation determination has not been made, input of the instruction by the user is thereby rejected. The type of instruction to be determined may also be changed according to whether the user confirmation determination has been made; for example, it may be changed to an instruction to transition to an error screen, to display a pop-up indicating an error, or to display which authentication area the user has not yet viewed.

  In this way, it is determined whether the user has visually checked the authentication areas on the confirmation screen, and processing reflecting the determination result is executed. This prevents the user from proceeding by performing an operation indicating confirmation without actually reading the content to be confirmed, such as the displayed text. When there are a plurality of authentication areas requiring confirmation, the user often needs to view them in a predetermined order; if the user views them in a different order, it can be considered that the line of sight merely passed over the authentication areas and that the content was not actually confirmed. In such a case, the user confirmation determination can be prevented, so the user cannot proceed without confirming the content.

[Another processing example in determination unit 330]
In the processing described above, the determination unit 330 makes the user confirmation determination when the line-of-sight position VP has visited the authentication areas in a predetermined order; instead, it may make the user confirmation determination only while the line-of-sight position VP is within an authentication area. That is, the determination unit 330 makes the user confirmation determination while it detects that the user's line of sight is directed at the authentication area (the user is viewing the authentication area).

  In such a case, the determination unit 220 may determine an instruction to the apparatus only while the user confirmation determination is being made (while the user is viewing the authentication area), and may refrain from determining instructions otherwise. As a result, the user can input an instruction to the apparatus by a gesture or by operating the operation unit 13 only while viewing the authentication area. For some gestures or operations on the operation unit 13, an instruction to the apparatus may be determined regardless of the user confirmation determination.

  The user may also be allowed to input an instruction to the apparatus for a predetermined time after viewing the authentication area. In this case, the determination unit 330 may continue the user confirmation determination for a certain time after the line-of-sight position VP leaves the authentication area, in addition to while it is within the authentication area. In this way, after viewing the authentication area, the user can input an instruction to the apparatus even while looking at the operation unit 13 to operate it.

  When there are a plurality of authentication areas on one screen, the determination unit 220 may determine an instruction to the apparatus based on the gesture input or the operation on the operation unit 13 performed while the user is viewing each authentication area and the user confirmation determination is being made for it. Here, the operation unit 13, which receives operation input from the user, and the behavior measurement unit 210, which receives gesture input, can both be regarded as input reception units that receive input of instructions from the user.

  An instruction to the apparatus may also be determined only when the relationship between the viewed authentication area and the input gesture or operation on the operation unit 13 matches a predetermined correspondence. For example, an instruction to the apparatus may be determined when the user performs gesture A while viewing the authentication area W1, gesture B while viewing the authentication area W2, and gesture A while viewing the authentication area W3.

  As described above, because the determination unit 220 determines an instruction to the apparatus only when the user's line of sight is directed at the authentication area, the user can be prevented from inputting an instruction to the apparatus while looking away.

<Second Embodiment>
In the second embodiment of the present invention, the processing of the gesture input function 200 differs from that of the first embodiment. In the second embodiment, gestures are input with an expansion/contraction device, whose length can be adjusted, attached to a finger of the user's hand 1000.

  FIG. 18 is a diagram explaining the positions of the coordinates calculated by the behavior measurement unit 210 according to the second embodiment of the present invention and the method of using the expansion/contraction device 3000. The expansion/contraction device 3000 is attached to a fingertip of the hand 1000; in the example shown in FIG. 18, it is attached to the index finger. The expansion/contraction device 3000 has an expansion/contraction part 3100, and the length of the expansion/contraction part 3100 is changed by a predetermined operation on the expansion/contraction device 3000.

  For example, when the length of the expansion/contraction part 3100 is changed by extending it in multiple stages, various operations can be applied, such as pushing the expansion/contraction part 3100 in the contracting direction, operating a switch, or applying vibration by shaking the hand 1000. When the device is configured to extend and contract in response to vibration, the length of the expansion/contraction part 3100 may change depending on the intensity of the vibration, or it may extend or contract by one stage each time a vibration is applied. In this example, the length of the expansion/contraction part 3100 changes in five stages.

  FIG. 18(a) shows the state in which the expansion/contraction part 3100 is fully contracted, and FIG. 18(b) shows the state in which it is fully extended. Because the expansion/contraction device 3000 is attached to the index finger, when the behavior measurement unit 210 recognizes the tip of the protruding part and calculates its coordinates, it treats the tip of the expansion/contraction part 3100 as the tip of the index finger and calculates it as the coordinate B. Therefore, even when the user makes the same gesture, the coordinate B differs between the fully contracted state of FIG. 18(a) and the fully extended state of FIG. 18(b), while the coordinates A, C, D, and E remain the same.

  FIG. 19 is a diagram illustrating a collation table used in the determination unit 220 according to the second embodiment of the present invention. The collation table in the second embodiment is substantially the same as the collation table in the first embodiment shown in FIG. 12, but differs in part. For example, in the second embodiment, gesture types B1 to B5 are set for gesture B according to the expansion/contraction stage of the expansion/contraction part 3100 (five stages in this example), and different instructions b1 to b5 are associated with them.
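  A minimal sketch of such a table follows, assuming a plain dictionary keyed by gesture type. The gesture types B1 to B5 and instructions b1 to b5 follow FIG. 19, while the helper gesture_type and the way the expansion stage is estimated are illustrative assumptions.

```python
# Sketch of a collation table in which the same user gesture maps to different
# gesture types B1..B5 according to the expansion stage of part 3100.

COLLATION_TABLE = {
    "A":  "a",          # gestures without the device keep a single entry
    "B1": "b1", "B2": "b2", "B3": "b3", "B4": "b4", "B5": "b5",
}

def gesture_type(base_gesture, expansion_stage):
    """Combine the recognized base gesture with the detected stage (1-5)."""
    return f"{base_gesture}{expansion_stage}" if base_gesture == "B" else base_gesture

stage = 3  # e.g. estimated from the length of coordinate B relative to the hand
print(COLLATION_TABLE[gesture_type("B", stage)])  # -> "b3"
```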

  FIG. 20 is a diagram for explaining examples of gestures defined in the collation table according to the second embodiment of the present invention. In this example, FIG. 20(a) shows gesture B1 (the expansion/contraction part 3100 most contracted) and FIG. 20(b) shows gesture B5 (the expansion/contraction part 3100 most extended). To the user, both are the same gesture. Therefore, as shown in FIG. 20, coordinates A, C, D, and E are the same for all of these gestures, but coordinate B of the index finger to which the expansion/contraction device 3000 is attached differs for each gesture.

  In this way, even a gesture that is the same to the user can be recognized as a number of different gestures corresponding to the expansion/contraction stages of the expansion/contraction part 3100. With the five-stage expansion/contraction shown in this example, the user can give five different instructions with one and the same gesture. Furthermore, if an expansion/contraction device 3000 capable of five-stage expansion/contraction is attached to each finger, the determination unit 220 can determine 25 types of instructions (five fingers × five stages) even when the user makes the same gesture.

<Third Embodiment>
In the third embodiment of the present invention, instead of attaching the expansion/contraction device 3000 used in the second embodiment to a finger whose coordinates are calculated, an expansion/contraction device 3010 attached to a portion that is not a coordinate calculation target will be described.

  FIG. 21 is a diagram for explaining the positions of the coordinates calculated by the behavior measurement unit 210 according to the third embodiment of the present invention and the method of using the expansion/contraction device 3010. As shown in FIG. 21, the expansion/contraction device 3010 having the expansion/contraction part 3110 is supported by a wristband 3200 via a support member 3300. When the wristband 3200 is attached to the user's wrist, the expansion/contraction part 3110 is located at the palm, as shown in FIG. 21.

  In this example, the expansion/contraction part 3110 extends outward from the palm (downward in FIG. 21) from the most contracted state (FIG. 21(a)); FIG. 21(b) shows the most extended state. As shown in FIG. 21, the behavior measurement unit 210 calculates the coordinates of the tip of each finger and, in this example, additionally calculates the coordinate Z of the tip of the expansion/contraction part 3110 of the expansion/contraction device 3010. The coordinate Z is preferably calculated relative to a specific position in the hand 1000 (for example, the above-described intersection D1), so that the same coordinate is obtained regardless of the movement of the hand 1000 as a whole. Note that the behavior measurement unit 210 may calculate the coordinate Z using an algorithm that detects a specific shape, such as the shape of the expansion/contraction device 3010, separately from the algorithm that calculates the coordinates of the fingertips.
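  The idea of making the coordinate Z independent of the overall movement of the hand 1000 can be sketched as follows. This is only an assumption-laden illustration (it handles translation of the hand, not rotation), and the function name relative_coordinate_z is hypothetical.

```python
import numpy as np

# Sketch: express the tip of the expansion/contraction part 3110 relative to a
# fixed reference point in the hand (e.g. the intersection D1), so that
# coordinate Z stays the same however the whole hand 1000 is translated.

def relative_coordinate_z(tip_world, reference_world):
    """Both arguments are 3-element positions in the sensor's world coordinates."""
    return np.asarray(tip_world) - np.asarray(reference_world)

z1 = relative_coordinate_z([0.10, 0.32, 0.55], [0.10, 0.30, 0.50])
z2 = relative_coordinate_z([0.40, 0.62, 0.85], [0.40, 0.60, 0.80])  # hand moved
print(np.allclose(z1, z2))  # True: Z is unchanged by the hand's translation
```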

  In addition, the expansion/contraction device 3010 may be attached so that the expansion/contraction part 3110 extends not only as shown in FIG. 21 but also in another direction.

  FIG. 22 is a diagram for explaining a method of using the expansion/contraction device 3010 according to the third embodiment of the present invention that differs from FIG. 21. For example, as shown in FIG. 22, the expansion/contraction part 3110 may extend in a direction perpendicular to the palm. In this case, the coordinate Z may be calculated as a value indicating the distance from the palm.

  FIG. 23 is a diagram illustrating a collation table used in the determination unit 220 according to the third embodiment of the present invention. The collation table in the third embodiment associates the coordinate Z in addition to the contents of the collation table in the first embodiment shown in FIG. 12. The values Z1, Z2, ... of the coordinate Z defined in the collation table correspond to the expansion/contraction stages of the expansion/contraction part 3110. That is, in the collation table in this example, each instruction is associated with a combination of the gesture coordinates A to E and the coordinate Z.

  In the collation table of FIG. 23, even for the same gesture A (the same coordinates A to E), the instructions differ depending on the expansion/contraction stage (coordinate Z) of the expansion/contraction part 3110. Therefore, the determination unit 220 can determine as many types of instructions as there are expansion/contraction stages of the expansion/contraction part 3110, even when the user makes the same gesture. Note that a plurality of collation tables may be used, as in the first embodiment, with a value of the coordinate Z associated with each of them. In this case, the determination unit 220 may switch between the collation tables according to the value of the coordinate Z. For example, when gesture input is used in banking operations or the like, each of the plurality of collation tables may be associated with a type of work. In this way, the type of work can be switched by expanding or contracting the expansion/contraction part 3110. In conjunction with this, the display on the display device 14 may be switched to one corresponding to the newly selected type of work.
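  A minimal sketch of switching collation tables by the expansion stage derived from the coordinate Z is shown below. The table contents, the instruction names, and the stage thresholds in stage_from_z are assumptions for illustration, not values from the patent.

```python
# Sketch: choose the collation table, and hence the type of banking work,
# from the expansion stage derived from the measured length of coordinate Z.

WORK_TABLES = {
    1: {"A": "deposit_inquiry", "B": "deposit_confirm"},     # hypothetical
    2: {"A": "transfer_inquiry", "B": "transfer_confirm"},   # hypothetical
}

def stage_from_z(z_length, step=0.02):
    """Map the measured length of Z (in metres) to an expansion stage 1..5."""
    return min(5, max(1, int(z_length / step) + 1))

def determine(gesture, z_length):
    table = WORK_TABLES.get(stage_from_z(z_length), {})
    return table.get(gesture)

print(determine("A", 0.015))  # stage 1 -> "deposit_inquiry"
print(determine("A", 0.025))  # stage 2 -> "transfer_inquiry"
```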

<Fourth embodiment>
In the fourth embodiment of the present invention, a case will be described in which a light emitting device provided with a light emitting element is attached to the hand 1000 instead of the expansion/contraction device 3010, which has an expanding and contracting part, used in the third embodiment. In this case, the object detection device 20 uses an image sensor capable of recognizing the light emitted from the light emitting element, and the behavior measurement unit 210 identifies the color of the light emitted from the light emitting element based on the output from the image sensor.

  FIG. 24 is a diagram illustrating a method of using the light emitting device 3050 according to the fourth embodiment of the present invention. In the fourth embodiment, the light emitting device 3050 is attached in place of the expansion/contraction device 3010 of the third embodiment. The light emitting device 3050 is provided with a light emitting unit 3150. The light emitting unit 3150 includes a light emitting element such as an LED and emits light to the surroundings. In this example, the light emitting unit 3150 switches between red, green, and blue according to a user operation. As the user operation, various operations can be applied, such as operating a switch or applying vibration by shaking the hand 1000, as with the expansion/contraction devices 3000 and 3010. Note that the light emitting unit 3150 may emit intermediate colors by combining these colors, or may emit light other than visible light. In the case of light other than visible light, the object detection device 20 may use an image sensor capable of detecting that light.

  FIG. 25 is a diagram illustrating a collation table used in the determination unit 220 according to the fourth embodiment of the present invention. Emission colors are associated in addition to the contents of the collation table in the first embodiment shown in FIG. 12. The emission colors red, green, ... defined in the collation table correspond to the emission colors of the light emitting unit 3150. That is, in the collation table in this example, each instruction is associated with a combination of the gesture coordinates A to E and an emission color. In the collation table of FIG. 25, even for the same gesture A (the same coordinates A to E), the instructions differ depending on the emission color of the light emitting unit 3150. As described in the third embodiment, a plurality of collation tables may be used, with an emission color associated with each of them.

  FIG. 26 is a diagram illustrating a method of using the light emitting device 3060 according to the fourth embodiment of the present invention. In the above example, the light emitting device 3050 is attached to the hand 1000 using the wristband 3200, but as shown in FIG. 26, a ring-shaped light emitting device 3060 that can be attached to a finger may be used instead. The light emitting device 3060 includes a light emitting unit 3160 that emits light around the circumference of the finger. In this case, light emitting devices 3060 may be provided on a plurality of fingers, and the emission color may be defined by the combination of their colors.

  Note that the light emitting units 3150 and 3160 may vary the temporal change of their emission intensity (the emission pattern). In this way, the behavior measurement unit 210 can easily distinguish the light emitted from the light emitting units 3150 and 3160 from other light, reducing misrecognition of the emission color. Moreover, if the emission pattern can be changed, different instructions can be given with the same gesture because the emission patterns differ. When variation by emission pattern is used, the light emitted from the light emitting units 3150 and 3160 may be monochromatic. As described above, in the fourth embodiment, the behavior measurement unit 210 also functions as a light emission state measurement unit that measures the light emission state of the light emitting units 3150 and 3160, and different instructions can be given with the same gesture because the light emission states differ.
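  One way such an emission pattern could be told apart from ambient light is sketched below. The binarization threshold, the four-sample pattern encoding, and the pattern names are assumptions made purely for illustration.

```python
# Sketch: distinguish the light of the light emitting units 3150/3160 from
# ambient light by its temporal intensity pattern; an unknown pattern is
# treated as not coming from an emitter.

KNOWN_PATTERNS = {
    (1, 0, 1, 0): "pattern_fast_blink",   # hypothetical encodings
    (1, 1, 0, 0): "pattern_slow_blink",
}

def classify_pattern(intensity_samples, threshold=0.5):
    """Binarize per-frame intensities and look the resulting pattern up."""
    key = tuple(1 if s >= threshold else 0 for s in intensity_samples)
    return KNOWN_PATTERNS.get(key)  # None -> treat as ambient light

print(classify_pattern([0.9, 0.1, 0.8, 0.2]))  # -> "pattern_fast_blink"
print(classify_pattern([0.4, 0.4, 0.4, 0.4]))  # -> None (not an emitter)
```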

<Fifth Embodiment>
In the fifth embodiment, an example of a collation method is described for the case in which the determination unit 220 collates a change in gesture, for example a change of the user's gesture from gesture A to gesture B.

  FIG. 27 is a diagram illustrating a collation table used in the determination unit 220 according to the fifth embodiment of the present invention. In this example, the type of gesture is defined as a change from one specific gesture to another, such as gesture A → B (hereinafter referred to as a collation pattern), and an instruction to the apparatus is associated with it. A collation pattern defines the trajectory of each finger in an ideal gesture change and an allowable range around that trajectory. First, an example of registering the trajectory and the allowable range of a collation pattern will be described. The process for registering a collation pattern described below is executed by the CPU 11; the CPU 11 therefore also functions as a registration unit that executes the process for registering a collation pattern.

  FIG. 28 is a diagram illustrating registration areas determined based on the coordinate positions calculated by the behavior measurement unit 210 according to the fifth embodiment of the present invention. First, the CPU 11 sets registration areas SA, SB, ... SE based on the coordinate positions of the tips of the fingers of the hand 1000 calculated by the behavior measurement unit 210.

  FIG. 29 is a diagram illustrating the details of a registration area according to the fifth embodiment of the present invention. In this example, the registration area SB set at the tip of the index finger is described as an example. The registration area SB is defined by a sphere (it does not necessarily have to be a sphere) whose center LCB is the coordinate B of the tip of the index finger calculated by the behavior measurement unit 210 and whose radius is the allowable length AR. The allowable length AR is determined in advance; it may be determined for each finger individually or in common for all fingers.
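  A minimal sketch of such a spherical area follows. The class name SphericalArea and its fields are illustrative assumptions; the same structure would serve for the detection area described later by substituting the detection length CR for AR.

```python
from dataclasses import dataclass
import numpy as np

# Sketch of the registration area SB: a sphere whose centre LCB is the
# calculated fingertip coordinate B and whose radius is the allowable length AR.

@dataclass
class SphericalArea:
    center: np.ndarray   # LCB, the fingertip coordinate
    radius: float        # allowable length AR (registration) or CR (detection)

    def contains(self, point):
        return np.linalg.norm(np.asarray(point) - self.center) <= self.radius

registration_sb = SphericalArea(center=np.array([0.1, 0.2, 0.3]), radius=0.03)
print(registration_sb.contains([0.11, 0.21, 0.31]))  # True: within AR of LCB
```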

  FIG. 30 is a diagram illustrating an example of a collation pattern according to the fifth embodiment of the present invention. A collation pattern is defined by a trajectory TC and an allowable length AR for a change from a specific gesture to another gesture (for example, gesture A to gesture B). FIG. 30 schematically shows the trajectory TC and the allowable length AR of the collation pattern determined for the index finger, among the collation patterns defined in the collation table for each finger. Thus, for each collation pattern registered in the collation table, a pattern such as the one shown in FIG. 30 is determined for each finger. Note that collation patterns need not be registered for all fingers; a collation pattern corresponding to only one finger may be registered.

  The start point TCs is the start position of the trajectory TC and represents the position of the center LCB at the time of the first gesture. The end point TCe is the end position of the trajectory TC and represents the position of the center LCB after the change to the other gesture. The trajectory TC represents the path of the center LCB from the start point TCs to the end point TCe as the gesture changes. Note that the start point TCs and the end point TCe may be at almost the same place, for example in a change such as gesture A → B → A, in which the gesture changes once and then returns to the original.

  As shown in FIG. 30, the collation pattern is a tube-shaped space formed by moving the registration area, a region within a certain range of the fingertip, along the gesture change. This space is the allowable range. The distance from each position on the trajectory TC to the outer edge AE of the allowable range corresponds to the allowable length AR. For example, if the trajectory TC is a straight line, the outer edge AE has the shape of the side surface of a cylinder with radius AR.
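  A point lies inside this tube exactly when its distance to the trajectory TC is at most AR. The following sketch computes that distance for a trajectory approximated as a polyline; the helper names are assumptions, not terms from the patent.

```python
import numpy as np

# Sketch of the tube-shaped allowable range: a point is inside the collation
# pattern if its distance to the trajectory TC (here a polyline of recorded
# LCB positions) does not exceed the allowable length AR.

def point_to_segment_distance(p, a, b):
    p, a, b = map(np.asarray, (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

def distance_to_trajectory(point, trajectory):
    return min(point_to_segment_distance(point, trajectory[i], trajectory[i + 1])
               for i in range(len(trajectory) - 1))

TC = [np.array([0.0, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]), np.array([0.1, 0.1, 0.0])]
AR = 0.03
print(distance_to_trajectory([0.05, 0.02, 0.0], TC) <= AR)  # True: inside the tube
```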

  In this example, the collation patterns in the collation table are registered using actual gestures by the user as described above. However, the present invention is not limited to such registration; the patterns may be registered by other methods, for example by inputting each parameter as a numerical value. When registration uses actual gestures by a user, a different collation table may be used for each user. In this case, the electronic device 1 may be provided with a means for identifying the user who operates it, and the determination unit 220 may use the collation table corresponding to the identified user.

  Next, an example will be described in which the determination unit 220 collates a gesture input by a user with the collation patterns using the collation table. When performing collation, the determination unit 220 uses a detection region that is narrower than the registration region, instead of the registration region corresponding to the tip of each finger described above.

  FIG. 31 is a diagram for explaining a comparison between a registration area and a detection area according to the fifth embodiment of the present invention. In this example, the detection area CB set at the tip of the index finger is described as an example. The detection area CB is defined by a sphere whose center LCB is the coordinate B of the tip of the index finger calculated by the behavior measurement unit 210 and whose radius is the detection length CR. The detection length CR is determined in advance as a length shorter than the allowable length AR, desirably about half of it. The detection length CR may be determined for each finger individually or in common for all fingers. The determination unit 220 performs collation with the collation patterns using the detection region determined in this way.

  FIG. 32 is a diagram illustrating the collation between a detection region and a collation pattern according to the fifth embodiment of the present invention. For each finger, the determination unit 220 compares the collation pattern registered in the collation table with the temporal change of the detection area. Since the detection area is determined based on the coordinates calculated for the tip of each finger, the detection area changes over time as the calculated coordinates change over time. The collation is performed as follows. In FIG. 32, the detection area CB at the tip of the index finger is described as an example; the same applies to FIGS. 33 and 34 described later.

  First, a collation pattern satisfying the following two conditions is searched for in the collation table: the region through which the detection area CB has passed, as the detection area CB changes over time, includes the entire trajectory TC (from the start point TCs to the end point TCe); and the detection area CB stays inside the allowable range, that is, within the outer edge AE determined from the allowable length AR. In the following description, a collation pattern that satisfies these conditions is referred to as a collation pattern that matches the detection area.
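  Under the assumption that the trajectory TC is densely sampled as points and the detection area CB is a sphere of radius CR centred on each coordinate calculated over time, the two conditions can be sketched as follows; the function name and the example values are illustrative only.

```python
import numpy as np

# Sketch of the two matching conditions between the detection area CB
# (a moving sphere of radius CR) and a collation pattern (trajectory TC
# sampled as points, allowable length AR).

def matches_collation_pattern(detection_centers, CR, trajectory_points, AR):
    traj = [np.asarray(p) for p in trajectory_points]
    dets = [np.asarray(c) for c in detection_centers]
    # Condition 1: every point of the trajectory was covered by the detection
    # area CB at some time ("the detection area passed through all of TC").
    covered = all(any(np.linalg.norm(tp - c) <= CR for c in dets) for tp in traj)
    # Condition 2: the detection area always stayed within the allowable range,
    # i.e. the whole sphere of radius CR remained within AR of the trajectory.
    inside = all(min(np.linalg.norm(c - tp) for tp in traj) + CR <= AR
                 for c in dets)
    return covered and inside

TC = [np.array([0.01 * i, 0.0, 0.0]) for i in range(11)]        # straight trajectory
centers = [np.array([0.01 * i, 0.005, 0.0]) for i in range(11)]  # measured motion
print(matches_collation_pattern(centers, CR=0.015, trajectory_points=TC, AR=0.03))
```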

  FIG. 33 is a diagram for explaining an example in which the detection region and the collation pattern do not match according to the fifth embodiment of the present invention. As shown in FIG. 33, when the region through which the detection area CB has passed does not include the entire trajectory TC, or when a part of the detection area CB goes outside the allowable range (outside the outer edge AE), the detection area CB does not match the collation pattern. In FIG. 33, the portion of the trajectory TC through which the detection region CB has passed is indicated by a broken line as the passed portion TCD; the same applies to FIG. 34.

  FIG. 34 is a diagram illustrating an example in which the detection region and the collation pattern match according to the fifth embodiment of the present invention. As shown in FIG. 34, when the detection area CB passes along the trajectory TC from the start point TCs to the end point TCe while remaining within the allowable range (inside the outer edge AE), that is, when the entire trajectory TC becomes the passed portion TCD, the detection area CB matches the collation pattern. When the detection areas for all fingers match the collation patterns corresponding to the respective fingers, the determination unit 220 determines the instruction associated with the collation pattern as the instruction to the apparatus.

  It should be noted that the detection areas for all fingers do not have to match the collation pattern; for example, only some predetermined fingers may need to match. Further, the match between the detection area and the collation pattern is not necessarily limited to the case where both of the above conditions are strictly satisfied. For example, even if there is a portion of the trajectory TC through which the detection region did not pass, it may be regarded as having passed if the length of that portion is equal to or less than a predetermined threshold. Likewise, if the detection area remained within the allowable range for at least a predetermined proportion of the time, it may be regarded as having stayed within the allowable range. In this way, the determination unit 220 may perform the collation with a margin.
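  A sketch of this collation with a margin, under the simplifying assumption that both the "passed" check and the "inside the allowable range" check are evaluated per sample; the gap threshold (expressed in samples rather than length) and the required ratio are illustrative values.

```python
# Sketch: tolerate short gaps in the passed trajectory and brief excursions
# outside the allowable range.

def matches_with_margin(passed_flags, inside_flags, gap_threshold=2,
                        required_ratio=0.9):
    """passed_flags[i]: trajectory sample i was covered by the detection area;
    inside_flags[t]: the detection area was inside the allowable range at time t."""
    longest_gap, gap = 0, 0
    for flag in passed_flags:
        gap = 0 if flag else gap + 1
        longest_gap = max(longest_gap, gap)
    ratio_inside = sum(inside_flags) / len(inside_flags)
    return longest_gap <= gap_threshold and ratio_inside >= required_ratio

print(matches_with_margin([True, True, False, True], [True] * 9 + [False]))  # True
```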

  In the collation pattern described above, the allowable length AR is the same at every position on the trajectory TC, but it may instead be set to different lengths depending on the position.

  FIG. 35 is a diagram for explaining another example of the collation pattern according to the fifth embodiment of the present invention. In the example shown in FIG. 35, at positions near the middle between the start point TCs and the end point TCe, the distance from the trajectory TC to the outer edge AE is the allowable length AR1, whereas at positions close to the start point TCs or the end point TCe, the distance from the trajectory TC to the outer edge AE is an allowable length AR2 longer than AR1. A collation pattern whose allowable length AR differs depending on the position on the trajectory TC may be registered in the collation table in this way. A collation pattern in which the trajectory TC does not pass through the center of the allowable range may also be registered in the collation table.
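  A position-dependent allowable length of this kind could, for example, be defined as below; the linear blend near the ends of the trajectory and all numeric values are assumptions for illustration.

```python
# Sketch: allowable length that is AR2 near the start/end of the trajectory
# and AR1 near its centre.

def allowable_length_at(s, AR1=0.02, AR2=0.04, edge=0.2):
    """s: normalized position along the trajectory, 0.0 (start) to 1.0 (end)."""
    distance_from_edge = min(s, 1.0 - s)
    if distance_from_edge >= edge:
        return AR1
    # linear blend from AR2 at the very edge to AR1 at the edge boundary
    return AR2 + (AR1 - AR2) * (distance_from_edge / edge)

print(allowable_length_at(0.0), allowable_length_at(0.5))  # 0.04 0.02
```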

  Further, the allowable length AR may differ depending on whether the user confirmation determination has been made by the line-of-sight authentication determination function 300. That is, the allowable length AR may be changed according to the determination result of the determination unit 330; for example, when the user confirmation determination has been made, the allowable length AR may be made longer than when it has not. Conversely, the detection length CR may be changed according to the determination result of the determination unit 330; for example, when the user confirmation determination has been made, the detection length CR may be made shorter than when it has not. In this way, when the user is visually recognizing the authentication area, the allowable length AR becomes longer, so the detection area and the collation pattern match more easily. That is, the margin for recognizing the user's gesture increases.
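  A minimal sketch of adjusting the lengths according to the user confirmation determination; the scale factor is an assumed value and the function name is hypothetical.

```python
# Sketch: widen the margin when the line-of-sight authentication has made a
# user confirmation determination.

def adjusted_lengths(AR, CR, user_confirmed, ar_scale=1.5):
    """Return (allowable length, detection length) to use for collation."""
    if user_confirmed:
        # lengthen AR so that the detection area matches the pattern more easily
        # (the text notes that shortening CR instead is another option)
        return AR * ar_scale, CR
    return AR, CR

print(adjusted_lengths(0.03, 0.015, user_confirmed=True))   # (0.045, 0.015)
print(adjusted_lengths(0.03, 0.015, user_confirmed=False))  # (0.03, 0.015)
```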

<Modification>
Although embodiments and examples of the present invention have been described above, the present invention can also be implemented in various other aspects, as follows.

[Modification 1]
The line-of-sight measurement unit 310 described above measures the direction of the user's line of sight, but it may further measure the user's point of gaze (a concept that includes not only the direction but also the distance) by measuring the line-of-sight direction of each of the user's eyes. In this case, the determination unit 330 may additionally require, as a condition for determining that the line-of-sight position VP is in the authentication area, that the point of gaze lies on the screen plane of the display device 14 (a predetermined margin may be allowed). For example, the intersection of the line of sight of the left eye and the line of sight of the right eye is obtained in space and treated as the gazing point, and it is determined whether this intersection lies on the screen plane of the display device 14. When the obtained intersection does not lie on the screen plane, for example when it is behind the screen as viewed from the user (or, when a predetermined margin is allowed, shifted by more than a predetermined distance behind the screen), the determination unit 330 need not make the user confirmation determination even if the line-of-sight direction is in the authentication area.
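Because two measured gaze rays rarely intersect exactly, one common approximation is to take the midpoint of the shortest segment between them as the gazing point and then test its distance to the screen plane. The sketch below does this; all names, the margin, and the example coordinates are assumptions rather than the patent's method.

```python
import numpy as np

# Sketch: approximate the "intersection" of the two lines of sight as the
# midpoint of the shortest segment between them, then check whether it lies
# near the screen plane z = plane_z.

def gaze_point(o_left, d_left, o_right, d_right):
    """o_*: eye positions, d_*: gaze directions (3-element sequences)."""
    o1, d1, o2, d2 = map(lambda v: np.asarray(v, float),
                         (o_left, d_left, o_right, d_right))
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # the two gaze directions are parallel
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    return ((o1 + t1 * d1) + (o2 + t2 * d2)) / 2.0

def on_screen_plane(point, plane_z, margin=0.02):
    return point is not None and abs(point[2] - plane_z) <= margin

p = gaze_point([-0.03, 0, 0], [0.05, 0, 1], [0.03, 0, 0], [-0.05, 0, 1])
print(p, on_screen_plane(p, plane_z=0.6))   # gazing point near (0, 0, 0.6) -> True
```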

[Modification 2]
The recognition space setting function 100, the gesture input function 200, and the line-of-sight authentication determination function 300 described above have been described as being implemented together in the information processing apparatus 10, but they may also function independently of one another.

  For example, the recognition space set by the recognition space setting function 100 is not only used for gesture input; it can also be used for other purposes, such as when the range of image recognition is to be limited to a specific region. In the gesture input function 200, the behavior of the object may be measured in the entire detection range, or in a predetermined part of the detection range, even when no recognition space has been set. For example, the behavior of the entire arm may be measured and an instruction (such as a screen scroll instruction corresponding to a swing of the arm) may be determined based on this behavior.

[Modification 3]
In the fourth embodiment, the determination unit 220 determines different types of instructions according to the emission color of the light emitting unit 3150 even when the user makes the same gesture; different types of instructions may instead be determined according to the key input on the keyboard 13a. For example, a collation table is provided for each of certain keys of the keyboard 13a. Then, from the time the first key is input on the user's keyboard 13a until the next key is input, the determination unit 220 determines the instruction to the apparatus using the collation table corresponding to that first key. Note that the input of the key may be combined with the input, to the operation unit 13, of the instruction to start processing using the recognition space setting function 100.
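A minimal sketch of this key-dependent table switching; the key names, instruction names, and the Determiner class are assumptions made for illustration.

```python
# Sketch: switch the active collation table according to the most recently
# pressed key on the keyboard 13a; the table stays active until the next key.

KEY_TABLES = {
    "F1": {"A": "open_account_screen"},
    "F2": {"A": "open_transfer_screen"},
}

class Determiner:
    def __init__(self):
        self.active_table = {}

    def on_key(self, key):
        self.active_table = KEY_TABLES.get(key, {})

    def on_gesture(self, gesture):
        return self.active_table.get(gesture)

d = Determiner()
d.on_key("F2")
print(d.on_gesture("A"))  # -> "open_transfer_screen"
```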

DESCRIPTION OF SYMBOLS 1 ... Electronic device, 10 ... Information processing apparatus, 11 ... CPU, 12 ... Memory, 13 ... Operation unit, 13a ... Keyboard, 13b ... Mouse, 14 ... Display device, 20 ... Object detection device, 21 ... Detection sensor, 30 ... Line-of-sight measurement device, 31 ... Line-of-sight sensor, 100 ... Recognition space setting function, 110 ... Region specifying unit, 120 ... Reference point setting unit, 130 ... Scanning unit, 140 ... Recognition space setting unit, 200 ... Gesture input function, 210 ... Behavior measurement unit, 220 ... Determination unit, 300 ... Line-of-sight authentication determination function, 310 ... Line-of-sight measurement unit, 320 ... Display control unit, 330 ... Determination unit, 500 ... Detection unit, 700 ... Execution unit, 1000 ... Hand, 2000 ... Arm, 3000, 3010 ... Expansion/contraction device, 3050, 3060 ... Light emitting device, 3100, 3110 ... Expansion/contraction part, 3150, 3160 ... Light emitting unit, 3200 ... Wristband, 3300 ... Support member

According to an embodiment of the present invention, there is provided an information processing apparatus comprising: detection means that detects an object existing in a predetermined detection range in a three-dimensional space and outputs detection data corresponding to the detected object; coordinate calculation means that calculates, based on the detection data, coordinates in the three-dimensional space of at least one part of the object; and determination means that collates a collation pattern, which defines a trajectory in the three-dimensional space set corresponding to the part of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on the temporal change of the calculated coordinates, and determines an instruction to the apparatus based on the collation result.

According to an embodiment of the present invention, there is provided a program that causes a computer to function as: coordinate calculation means that calculates, based on detection data output from detection means that detects an object existing in a predetermined detection range in a three-dimensional space and outputs detection data corresponding to the detected object, coordinates in the three-dimensional space of at least one part of the object; and determination means that collates a collation pattern, which defines a trajectory in the three-dimensional space set corresponding to the part of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on the temporal change of the calculated coordinates, and determines an instruction to the apparatus based on the collation result.

According to an embodiment of the present invention, there is provided an information processing apparatus comprising: detection means that detects an object existing in a predetermined detection range in a three-dimensional space and outputs detection data corresponding to the detected object; coordinate calculation means that calculates, based on the detection data, coordinates in the three-dimensional space of at least one part of the object; and determination means that collates a collation pattern, which defines a trajectory in the three-dimensional space set corresponding to the part of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on the temporal change of the coordinates after the calculated movement speed of the coordinates has changed to a predetermined speed or more, and determines an instruction to the apparatus based on the collation result.

According to an embodiment of the present invention, there is provided a program that causes a computer to function as: coordinate calculation means that calculates, based on detection data output from detection means that detects an object existing in a predetermined detection range in a three-dimensional space and outputs detection data corresponding to the detected object, coordinates in the three-dimensional space of at least one part of the object; and determination means that collates a collation pattern, which defines a trajectory in the three-dimensional space set corresponding to the part of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on the temporal change of the coordinates after the calculated movement speed of the coordinates has changed to a predetermined speed or more, and determines an instruction to the apparatus based on the collation result.

Claims (6)

  1. An information processing apparatus comprising:
    detection means for detecting an object existing in a predetermined detection range and outputting detection data corresponding to the detected object;
    coordinate calculation means for calculating coordinates of at least one location of the object based on the detection data; and
    determination means for collating a collation pattern, which defines a trajectory set corresponding to the location of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on a temporal change of the calculated coordinates, and determining an instruction to the apparatus based on the collation result.
  2.   The information processing apparatus according to claim 1, wherein the determination means uses a collation table representing the correspondence between collation patterns and instructions to the apparatus, specifies the collation pattern satisfying the condition that the area determined based on the temporal change of the calculated coordinates includes the trajectory of that collation pattern and is included in its allowable range, and determines the instruction to the apparatus corresponding to the specified collation pattern.
  3. The information processing apparatus according to claim 2, further comprising registration means for registering the collation pattern in the collation table based on the calculated coordinates,
    wherein the determination means collates the collation pattern registered in the collation table with an area determined based on the temporal change of coordinates calculated after registration of the collation pattern.
  4. The information processing apparatus according to claim 1, further comprising recognition space setting means for setting a part of the detection range indicated by the detection data as an authentication space,
    wherein the coordinate calculation means calculates the coordinates of at least one location of the object in the authentication space, and
    the trajectory is included in the authentication space.
  5. An electronic apparatus comprising:
    the information processing apparatus according to any one of claims 1 to 4; and
    execution means for executing processing based on the determined instruction.
  6. A program for causing a computer to function as:
    coordinate calculation means for calculating coordinates of at least one location of an object based on detection data output from detection means that detects the object existing in a predetermined detection range and outputs detection data corresponding to the detected object; and
    determination means for collating a collation pattern, which defines a trajectory set corresponding to the location of the object whose coordinates are calculated and an allowable range from the trajectory, with an area determined based on a temporal change of the calculated coordinates, and determining an instruction to the apparatus based on the collation result.
JP2012229326A 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program Active JP5364194B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2012101258 2012-04-26
JP2012101258 2012-04-26
JP2012229326A JP5364194B2 (en) 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2012229326A JP5364194B2 (en) 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2012114530 Division 2012-05-18

Publications (2)

Publication Number Publication Date
JP2013242843A true JP2013242843A (en) 2013-12-05
JP5364194B2 JP5364194B2 (en) 2013-12-11

Family

ID=47890549

Family Applications (10)

Application Number Title Priority Date Filing Date
JP2012114529A Active JP5174978B1 (en) 2012-04-26 2012-05-18 Information processing apparatus, electronic device, and program
JP2012114530A Active JP5148004B1 (en) 2012-04-26 2012-05-18 Information processing apparatus, electronic device, and program
JP2012128125A Active JP5174979B1 (en) 2012-04-26 2012-06-05 Information processing apparatus, electronic device, and program
JP2012229326A Active JP5364194B2 (en) 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program
JP2012229205A Active JP5232930B1 (en) 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program
JP2012233272A Active JP5315450B1 (en) 2012-04-26 2012-10-22 Information processing apparatus, electronic device, and program
JP2012233271A Active JP5314795B1 (en) 2012-04-26 2012-10-22 Information processing apparatus, electronic device, and program
JP2013018351A Active JP5444482B2 (en) 2012-04-26 2013-02-01 Information processing apparatus, electronic device, and program
JP2013141551A Active JP5993354B2 (en) 2012-04-26 2013-07-05 Information processing device
JP2016004171A Granted JP2016048588A (en) 2012-04-26 2016-01-13 Information processing apparatus

Family Applications Before (3)

Application Number Title Priority Date Filing Date
JP2012114529A Active JP5174978B1 (en) 2012-04-26 2012-05-18 Information processing apparatus, electronic device, and program
JP2012114530A Active JP5148004B1 (en) 2012-04-26 2012-05-18 Information processing apparatus, electronic device, and program
JP2012128125A Active JP5174979B1 (en) 2012-04-26 2012-06-05 Information processing apparatus, electronic device, and program

Family Applications After (6)

Application Number Title Priority Date Filing Date
JP2012229205A Active JP5232930B1 (en) 2012-04-26 2012-10-16 Information processing apparatus, electronic device, and program
JP2012233272A Active JP5315450B1 (en) 2012-04-26 2012-10-22 Information processing apparatus, electronic device, and program
JP2012233271A Active JP5314795B1 (en) 2012-04-26 2012-10-22 Information processing apparatus, electronic device, and program
JP2013018351A Active JP5444482B2 (en) 2012-04-26 2013-02-01 Information processing apparatus, electronic device, and program
JP2013141551A Active JP5993354B2 (en) 2012-04-26 2013-07-05 Information processing device
JP2016004171A Granted JP2016048588A (en) 2012-04-26 2016-01-13 Information processing apparatus

Country Status (1)

Country Link
JP (10) JP5174978B1 (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014128749A1 (en) 2013-02-19 2014-08-28 株式会社ブリリアントサービス Shape recognition device, shape recognition program, and shape recognition method
JP2014219712A (en) * 2013-05-01 2014-11-20 コニカミノルタ株式会社 Operation display device
JP6011579B2 (en) 2014-05-21 2016-10-19 株式会社デンソー Gesture input device
JP6494926B2 (en) * 2014-05-28 2019-04-03 京セラ株式会社 Mobile terminal, gesture control program, and gesture control method
JP6376886B2 (en) * 2014-08-05 2018-08-22 アルパイン株式会社 Input system and input method
EP3106343A1 (en) * 2015-06-19 2016-12-21 Continental Automotive GmbH Gesture based user input device for car electronics
JP6540809B2 (en) * 2015-08-06 2019-07-10 株式会社ニコン Electronic control device and electronic control program
US10168769B2 (en) 2015-09-28 2019-01-01 Nec Corporation Input apparatus, input method, and program
WO2017145423A1 (en) * 2016-02-25 2017-08-31 日本電気株式会社 Information processing system, information processing device, control method, and program
JP6538961B2 (en) 2016-03-04 2019-07-03 株式会社ソニー・インタラクティブエンタテインメント Control device
JP6444345B2 (en) * 2016-08-23 2018-12-26 株式会社コロプラ Method and apparatus for supporting input in virtual space, and program for causing computer to execute the method
JPWO2018198272A1 (en) * 2017-04-27 2019-11-07 株式会社ソニー・インタラクティブエンタテインメント Control device, information processing system, control method, and program

Family Cites Families (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07105371A (en) * 1993-10-04 1995-04-21 Hitachi Ltd Hand print pattern recognition method
JPH0981309A (en) * 1995-09-13 1997-03-28 Toshiba Corp Input device
JP3487494B2 (en) * 1998-04-17 2004-01-19 日本電信電話株式会社 Menu selection method and device
JP3444200B2 (en) * 1998-07-07 2003-09-08 ソニー株式会社 Image data processing apparatus and method, and providing medium
JP3792907B2 (en) * 1998-08-06 2006-07-05 株式会社竹中工務店 Hand pointing device
JP2000098871A (en) * 1998-09-28 2000-04-07 Sony Corp Virtual image stereoscopic compositing device, virtual image stereoscopic compositing method, game device and recording medium
JP2000242394A (en) * 1999-02-24 2000-09-08 Nec Corp Virtual keyboard system
JP2001216069A (en) * 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
JP2003044203A (en) * 2001-08-02 2003-02-14 Canon I-Tech Inc Information processor
JP2003131785A (en) * 2001-10-22 2003-05-09 Toshiba Corp Interface device, operation control method and program product
JP4286556B2 (en) * 2003-02-24 2009-07-01 株式会社東芝 Image display device
JP4053903B2 (en) * 2003-03-07 2008-02-27 日本電信電話株式会社 Pointing method, apparatus, and program
JP4153818B2 (en) * 2003-03-31 2008-09-24 本田技研工業株式会社 Gesture recognition device, gesture recognition method, and gesture recognition program
JP2004299025A (en) * 2003-04-01 2004-10-28 Honda Motor Co Ltd Mobile robot control device, mobile robot control method and mobile robot control program
JP2005301693A (en) * 2004-04-12 2005-10-27 Japan Science & Technology Agency Animation editing system
JP2005332229A (en) * 2004-05-20 2005-12-02 Nippon Telegr & Teleph Corp <Ntt> Attitude detector, attitude detection method, and program for the method
JP2006276928A (en) * 2005-03-28 2006-10-12 Canon Inc Piezo-electric motion detection device
JP5681633B2 (en) * 2008-10-27 2015-03-11 株式会社ソニー・コンピュータエンタテインメント Control device for communicating visual information
JP2009042796A (en) * 2005-11-25 2009-02-26 Panasonic Corp Gesture input device and method
JP2008186120A (en) * 2007-01-29 2008-08-14 Seiko Epson Corp Processor, processing method and program for executing processing according to user's instruction
JP2007164814A (en) * 2007-02-09 2007-06-28 Toshiba Corp Interface device
KR20100072198A (en) * 2007-08-19 2010-06-30 링보우 리미티드 Finger-worn device and related methods of use
JP4318056B1 (en) * 2008-06-03 2009-08-19 島根県 Image recognition apparatus and operation determination method
JP2010134629A (en) * 2008-12-03 2010-06-17 Sony Corp Information processing apparatus and method
JP5263833B2 (en) * 2009-05-18 2013-08-14 国立大学法人 奈良先端科学技術大学院大学 Ring-type interface, interface device, and interface method used for wearable computer
JP5282661B2 (en) * 2009-05-26 2013-09-04 ソニー株式会社 Information processing apparatus, information processing method, and program
US20100303297A1 (en) * 2009-05-30 2010-12-02 Anton Mikhailov Color calibration for object tracking
JP5201096B2 (en) * 2009-07-17 2013-06-05 大日本印刷株式会社 Interactive operation device
JP2011095985A (en) * 2009-10-29 2011-05-12 Nikon Corp Image display apparatus
JP2011100396A (en) * 2009-11-09 2011-05-19 Hitachi Ltd Video display device
JP2011118523A (en) * 2009-12-01 2011-06-16 Sekisui House Ltd Drawing display device
JP4900741B2 (en) * 2010-01-29 2012-03-21 島根県 Image recognition apparatus, operation determination method, and program
JP5569062B2 (en) * 2010-03-15 2014-08-13 オムロン株式会社 Gesture recognition device, method for controlling gesture recognition device, and control program
JP2011215968A (en) * 2010-03-31 2011-10-27 Namco Bandai Games Inc Program, information storage medium and object recognition system
WO2011142317A1 (en) * 2010-05-11 2011-11-17 日本システムウエア株式会社 Gesture recognition device, method, program, and computer-readable medium upon which program is stored
JP2012073658A (en) * 2010-09-01 2012-04-12 Shinsedai Kk Computer system
JP2012073659A (en) * 2010-09-01 2012-04-12 Shinsedai Kk Operation determination device, fingertip detection device, operation determination method, fingertip detection method, operation determination program, and fingertip detection program
JP5264844B2 (en) * 2010-09-06 2013-08-14 日本電信電話株式会社 Gesture recognition apparatus and method
JP5625643B2 (en) * 2010-09-07 2014-11-19 ソニー株式会社 Information processing apparatus and information processing method
JP5167523B2 (en) * 2010-09-22 2013-03-21 島根県 Operation input device, operation determination method, and program
JP4918171B1 (en) * 2011-07-21 2012-04-18 パナソニック株式会社 Image processing apparatus and document reading system having the same

Also Published As

Publication number Publication date
JP2013242652A (en) 2013-12-05
JP2016048588A (en) 2016-04-07
JP5364194B2 (en) 2013-12-11
JP2013242844A (en) 2013-12-05
JP5315450B1 (en) 2013-10-16
JP2013242889A (en) 2013-12-05
JP5148004B1 (en) 2013-02-20
JP5232930B1 (en) 2013-07-10
JP2013242834A (en) 2013-12-05
JP5174979B1 (en) 2013-04-03
JP5314795B1 (en) 2013-10-16
JP2013242845A (en) 2013-12-05
JP2013242842A (en) 2013-12-05
JP2013242651A (en) 2013-12-05
JP2014032646A (en) 2014-02-20
JP5444482B2 (en) 2014-03-19
JP5993354B2 (en) 2016-09-14
JP5174978B1 (en) 2013-04-03

Similar Documents

Publication Publication Date Title
ES2731560T3 (en) Look interaction with delayed deformation
US10031578B2 (en) Gaze detection in a 3D mapping environment
JP5755712B2 (en) Improved detection of wave engagement gestures
JP6030430B2 (en) Control device, vehicle and portable terminal
US10001838B2 (en) Feature tracking for device input
US20180181208A1 (en) Gesture Recognition Devices And Methods
KR101757080B1 (en) Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand
US9552071B2 (en) Information processing apparatus, information processing method and computer program
US20170024007A1 (en) External user interface for head worn computing
US8768006B2 (en) Hand gesture recognition
US9762792B2 (en) Adjusting motion capture based on the distance between tracked objects
CA2811868C (en) Operation input apparatus, operation input method, and program
US9846486B2 (en) Systems and methods of direct pointing detection for interaction with a digital device
US10139966B2 (en) External user interface for head worn computing
JP5624530B2 (en) Command issuing device, method and program
JP4965653B2 (en) Virtual controller for visual display
US20160025974A1 (en) External user interface for head worn computing
US20160062118A1 (en) External user interface for head worn computing
JP5949319B2 (en) Gaze detection apparatus and gaze detection method
US8589824B2 (en) Gesture recognition interface system
US20160026239A1 (en) External user interface for head worn computing
US8837780B2 (en) Gesture based human interfaces
US8867791B2 (en) Gesture recognition method and interactive system using the same
US9377859B2 (en) Enhanced detection of circular engagement gesture
US10019843B2 (en) Controlling a near eye display

Legal Events

Date Code Title Description
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20130820

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20130906

R150 Certificate of patent or registration of utility model

Ref document number: 5364194

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150


R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313113