WO2017130407A1 - Information generation device, information generation method, information generation program, and recording medium - Google Patents

Information generation device, information generation method, information generation program, and recording medium Download PDF

Info

Publication number
WO2017130407A1
Authority
WO
WIPO (PCT)
Prior art keywords
inclination
information
vine
face
straight line
Prior art date
Application number
PCT/JP2016/052786
Other languages
English (en)
Japanese (ja)
Inventor
良司 野口
Original Assignee
パイオニア株式会社
一般財団法人 日本自動車研究所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by パイオニア株式会社, 一般財団法人 日本自動車研究所 filed Critical パイオニア株式会社
Priority to PCT/JP2016/052786 priority Critical patent/WO2017130407A1/fr
Priority to JP2017563651A priority patent/JP6697006B2/ja
Publication of WO2017130407A1 publication Critical patent/WO2017130407A1/fr

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/60 - Analysis of geometric attributes

Definitions

  • The present application relates to the technical field of information generation apparatuses and the like that create direction information indicating the direction in which a face is facing, based on a captured image of the head.
  • The device of Patent Literature 1 acquires the position and size of a reflected image of light on spectacles worn by the subject, and estimates the position of the subject's face based on the position and size of the reflected image.
  • It is an object of the present invention to provide an information generation device and the like that can create direction information indicating the direction in which a face is facing, based on an image captured by a camera, even when the face is turned sideways with respect to the camera.
  • The invention according to claim 1 comprises detection means for detecting the inclination of a characteristic portion of a wearing object in an image obtained by photographing, from the side, a head wearing the wearing object, and generation means for generating face direction information indicating the direction in which the face is facing, based on the inclination of the characteristic portion detected by the detection means.
  • The invention according to claim 6 is an information generation method performed by an information generation apparatus, and includes a detection step of detecting the inclination of a characteristic portion of a wearing object in an image obtained by photographing, from the side, a head wearing the wearing object, and a generation step of generating face direction information indicating the direction in which the face is facing, based on the inclination of the characteristic portion detected in the detection step.
  • Another aspect is an information generation program that causes a computer to function as detection means for detecting the inclination of a characteristic portion of a wearing object in an image obtained by photographing, from the side, a head wearing the wearing object, and as generation means for generating face direction information indicating the direction in which the face is facing from the inclination of the characteristic portion detected by the detection means.
  • Yet another aspect is a computer-readable recording medium having recorded thereon an information generation program that causes a computer to function as detection means for detecting the inclination of a characteristic portion of a wearing object in an image obtained by photographing, from the side, a head wearing the wearing object, and as generation means for generating face direction information indicating the direction in which the face is facing from the detected inclination of the characteristic portion.
  • FIG. 1 is a block diagram of an information generation device 1.
  • FIG. 2 is an example of a block diagram of the face orientation determination device D.
  • FIG. 3 is a functional block diagram illustrating a configuration example for realizing the functions of the determination device D.
  • FIG. 4 is a flowchart showing an operation example when the determination device D determines the direction of a face. FIG. 5 is a flowchart showing an example of the spectacles wearing determination process performed by the determination device D.
  • FIG. 6(A) is an image of the vine portion of the pattern A glasses, FIG. 6(B) is an image of the vine portion of the pattern B glasses, and FIG. 6(C) is an image of the vine portion of the pattern C glasses.
  • Also included are a flowchart illustrating an example of the eyeglass orientation determination process performed by the determination device D, a flowchart of the first pitch angle determination process performed by the determination device D, a diagram showing the relationship between the pitch angle of the pattern A glasses and the inclination of the vine portion straight line, and a flowchart of the vine portion straight line extraction process performed by the determination device D.
  • Also included are a flowchart of the second pitch angle determination process performed by the determination device D, a diagram showing the relationship between the pitch angle of the pattern B glasses and the inclination of the vine portion straight line, a flowchart of the yaw angle determination process performed by the determination device D, a diagram showing the relationship between the direction of the glasses and the inclination of the vine portion straight line, and a flowchart illustrating an example of the face orientation determination process performed by the determination device D.
  • FIG. 17(A) is an example figure showing a vine portion whose shape is a special pattern.
  • the information generating apparatus 1 includes a detecting unit 111A, a generating unit 111B, a storage control unit 111C, and a storage unit 111D. Note that the storage means may be provided outside the information generating apparatus 1.
  • Detecting means 111A detects the inclination of the characteristic part of the wearing object in an image obtained by photographing the head wearing the wearing object from the side.
  • Examples of the wearing object include glasses and a mask.
  • The characteristic portion is a part of the wearing object located on the side of the head.
  • For example, the edge of the spectacles or of the mask can be used. More specifically, it is the "vine" (temple arm) in the case of glasses, and the "string" hooked over the ear in the case of a mask.
  • the generating unit 111B generates face direction information indicating the direction in which the face is facing based on the inclination of the feature detected by the detecting unit 111A.
  • the direction in which the face is directed is indicated by, for example, a pitch angle (vertical angle) and a yaw angle (lateral angle).
  • The generation unit 111B generates wearing object direction information indicating the direction in which the wearing object is facing from the inclination of the characteristic portion, and generates the face direction information by correcting the wearing object direction information based on difference information indicating the difference between the reference direction and the direction the face is facing when the wearing object faces the reference direction. As a result, correct face direction information can be generated even when the direction in which the wearing object is facing does not match the direction in which the face is facing.
  • With the information generation device 1, even when the face is turned sideways with respect to the camera, face direction information indicating the direction in which the face is facing can be created based on the image of the head taken by the camera.
  • The storage control unit 111C may store, in the storage unit 111D, inclination information indicating the inclination of the characteristic portion and the generated face direction information in association with each other.
  • When inclination information indicating an inclination whose difference from the inclination of the characteristic portion detected by the detection unit 111A is equal to or less than a threshold value is stored in the storage unit 111D, the generation unit 111B acquires the face direction information stored in association with that inclination information.
  • the threshold value is a value for determining a width in which the inclination of the characteristic portion detected by the detection unit 111A and the inclination indicated by the inclination information stored in the storage unit 111D can be regarded as substantially the same inclination.
  • the generation unit 111B does not generate new face direction information when the newly detected inclination of the feature is almost the same as the inclination of the feature when the face direction information was generated in the past. Therefore, it is possible to reduce the processing load related to the process for generating the face direction information from the inclination of the feature part.
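  • As a concrete illustration of this reuse, the following is a minimal sketch of such a cache, assuming a simple in-memory store and a configurable threshold; the names (InclinationCache, threshold_deg) are assumptions for illustration and do not appear in the original disclosure.

```python
# Minimal sketch of reusing stored face direction information when a newly
# detected inclination is close to a previously stored one (names are illustrative).
class InclinationCache:
    def __init__(self, threshold_deg=1.0):
        self.threshold_deg = threshold_deg
        self.entries = []  # list of (inclination_deg, face_direction) tuples

    def store(self, inclination_deg, face_direction):
        """Associate an inclination with the face direction generated from it."""
        self.entries.append((inclination_deg, face_direction))

    def lookup(self, inclination_deg):
        """Return stored face direction info if a stored inclination is within the threshold."""
        for stored_inclination, face_direction in self.entries:
            if abs(stored_inclination - inclination_deg) <= self.threshold_deg:
                return face_direction
        return None  # no close match: face direction must be generated anew
```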
  • The detection unit 111A detects a line indicating the edge of the characteristic portion. When the detection unit 111A detects two such lines, the generation unit 111B may further generate type information indicating the type of the wearing object, based on at least one of the inclinations of the two lines and the distance between the two lines.
  • the storage control unit 111C further associates the type information with the tilt information and the face direction information and stores the type information in the storage unit 111D.
  • When the generation unit 111B newly generates type information, it may acquire, from the storage unit 111D, the face direction information associated with type information indicating the same type as the newly generated type information.
  • In this way, the inclination of the characteristic portion and the face direction information can be stored in association with each other for each type. Therefore, when the wearing object is of the same type as the one used to generate face direction information in the past, and the newly detected inclination of the characteristic portion is almost the same as the inclination at the time that face direction information was generated, there is no need to newly generate face direction information, and the processing load of generating face direction information from the inclination of the characteristic portion can be reduced.
  • The storage control unit 111C stores the face direction information generated by the generation unit 111B in the storage unit 111D. When generating new face direction information, if there are a plurality of candidates for the direction indicated by the face direction information, the generation unit 111B may generate the face direction information based on at least one piece of face direction information stored within a predetermined period. This prevents, even if for some reason there are a plurality of candidates for the direction indicated by the face direction information, the generation of face direction information indicating a direction far from the direction indicated by the latest face direction information.
  • The embodiment described below is an embodiment in which the present invention is applied to a face direction determination device D (hereinafter sometimes called the "determination device D").
  • The determination device D detects the edge of the vine of the glasses from an image obtained by photographing the head of a subject wearing glasses, and calculates the inclination of the edge, thereby determining the direction of the glasses and the direction of the face.
  • the determination device D can determine the orientation of the face based on the inclination of the vine edge of the glasses.
  • When the edge of the eyeglass vine is detected from the image (that is, when it is determined that eyeglasses are being worn), the direction of the glasses and the face with respect to the camera is determined in the vertical direction (sometimes referred to as the pitch direction) and in the horizontal direction (sometimes referred to as the yaw direction).
  • the determination device D is roughly configured to include a control unit 111, a storage unit 112, a communication unit 113, a display unit 114, and an operation unit 115.
  • the storage unit 112 is configured by a hard disk drive, for example, and stores various programs including an OS (Operating System) and a face orientation determination program for determining the face orientation. In addition, the storage unit 112 stores an image captured by the camera C and various data used for the face orientation determination program.
  • the I / F unit 113 controls data exchange with the camera C.
  • the camera C is installed in front of the subject's face and photographs the subject's head.
  • the display unit 114 is configured by, for example, a liquid crystal display or the like, and displays an image or the like taken by the camera C.
  • the operation unit 115 includes, for example, a keyboard, a mouse, and the like, and receives an operation instruction from an operator and outputs the instruction content to the control unit 111 as an instruction signal.
  • the control unit 111 includes a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), and the like.
  • the CPU implements various functions by reading and executing various programs including the face orientation determination program stored in the ROM or the storage unit 112.
  • FIG. 3 is a functional block diagram illustrating a configuration example for realizing the function of the determination device D.
  • the control unit 111 of the determination apparatus D executes the face orientation determination program, thereby causing the image acquisition unit 11, the spectacle vine region determination unit 12, the spectacle wear determination unit 13, the edge detection unit 14, the vine straight line detection unit 15, and the spectacles. It operates as a direction determination unit 16, a linear inclination calculation unit 17, a pitch direction determination unit 18, a yaw direction determination unit 19, a face direction determination unit 21, and a direction information generation unit 22.
  • the image acquisition unit 11 acquires an image in which the camera C images the head (face).
  • the glasses vine region determination unit 12 determines whether the face is facing sideways (for example, 30 degrees or more in the horizontal direction) from the image, and determines the region next to the eye as the vine region where the glasses vine exists.
  • The edge detection unit 14 detects edges (lines representing the boundary of the vine portion in the image) in that region, and the vine portion straight line detection unit 15 detects the vine portion straight line from the detected edges.
  • the glasses wearing determination unit 13 determines whether glasses are worn depending on whether a vine straight line is detected.
  • The linear inclination calculation unit 17 calculates the inclination of the vine portion straight line, the pitch direction determination unit 18 determines the pitch-direction angle of the glasses from that inclination, and the yaw direction determination unit 19 determines the yaw-direction angle of the glasses.
  • As the slope of the vine portion straight line, either the slopes of both the upper edge straight line (the line indicating the upper boundary of the vine portion) and the lower edge straight line (the line indicating the lower boundary of the vine portion), or the slope of only one of the upper edge straight line and the lower edge straight line, can be used.
  • The face direction determination unit 21 determines the face direction based on the determined spectacle direction and the face orientation reference information (an example of the difference information), and the direction information generation unit 22 generates direction information indicating the determined face direction.
  • FIG. 4 is a flowchart illustrating an example of the determination process of the determination device D.
  • First, the control unit 111, as the image acquisition unit 11, acquires an image in which the camera C has captured the subject's head (step S11).
  • Next, the control unit 111, as the eyeglass vine region determination unit 12, determines whether or not the face shown in the image acquired in the process of step S11 is facing sideways (step S12). Specifically, facial features (eyes, nose, mouth, ears, etc.) are extracted by a conventionally known method, and the orientation of the face is determined based on their positional relationship. In this embodiment, when the yaw angle is 30 degrees or more, the face is determined to be facing sideways.
  • When it is determined in step S12 that the face is not facing sideways (step S12: NO), the control unit 111 returns to the process of step S11. When it is determined that the face is facing sideways (step S12: YES), the control unit 111 determines a vine region in the image (step S13). Specifically, the x-coordinate and y-coordinate ranges of the vine region are determined based on the positions of the eyes, eyebrows, ears, nose, facial contour features, and the like.
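  • As an illustration of how such a region could be derived from facial feature positions, the following is a minimal sketch assuming that eye and ear landmark coordinates are already available from a face landmark detector; the function name, landmark choice, and margin value are assumptions, not part of the original disclosure.

```python
# Illustrative sketch: derive a rectangular vine region between the outer eye
# corner and the ear from landmark coordinates; the landmark choice and the
# vertical margin are assumptions, not values from the original disclosure.
def vine_region(eye_corner, ear_top, image_shape, vertical_margin=0.15):
    """Return (x0, y0, x1, y1) bounding the area where the glasses vine is expected.

    eye_corner, ear_top: (x, y) pixel coordinates of the outer eye corner and the
    top of the ear on the side of the face turned toward the camera.
    """
    height = image_shape[0]
    x0 = min(eye_corner[0], ear_top[0])
    x1 = max(eye_corner[0], ear_top[0])
    band = int(vertical_margin * height)
    y_center = (eye_corner[1] + ear_top[1]) // 2
    y0 = max(0, y_center - band)
    y1 = min(height, y_center + band)
    return x0, y0, x1, y1
```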
  • Next, the control unit 111, as the spectacle wearing determination unit 13, performs the spectacle wearing determination process (step S14).
  • The spectacle wearing determination process will be described with reference to the flowchart of FIG. 5.
  • First, the control unit 111, as the edge detection unit 14, cuts out the image of the vine region determined in the process of step S13 (step S31).
  • Next, the control unit 111 performs edge detection on the image of the vine region (step S32).
  • For the edge detection, a general edge detection method such as the Canny method can be used.
  • Next, the control unit 111, as the vine portion straight line detection unit 15, performs the vine portion straight line detection process (step S33).
  • In this process, general straight line detection such as the Hough transform is performed on the binarized image resulting from the edge detection.
  • In other words, a general method such as the Hough transform can be used as the method for detecting the straight line representing the vine portion (the vine portion straight line).
  • Note that a straight line other than the one representing the vine portion may also be detected at this stage.
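  • The following is a minimal sketch of steps S31 to S33 using OpenCV, assuming the vine region has already been determined; the parameter values (Canny thresholds, Hough vote threshold, minimum line length) are illustrative assumptions and not values from the original disclosure.

```python
import cv2
import numpy as np

# Sketch of steps S31-S33: crop the vine region, run Canny edge detection,
# and detect candidate straight lines with the probabilistic Hough transform.
def detect_vine_lines(image, region):
    x0, y0, x1, y1 = region
    roi = image[y0:y1, x0:x1]                              # step S31: cut out the vine region
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)                       # step S32: edge detection
    lines = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                            threshold=30, minLineLength=40, maxLineGap=5)  # step S33
    if lines is None:
        return []
    # Each entry is (x1, y1, x2, y2); note that lines other than the vine edges
    # (e.g. along the hairline) may also be returned and must be filtered later.
    return [tuple(l[0]) for l in lines]
```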
  • Next, the control unit 111 determines whether or not a vine portion straight line has been detected (step S34). When it is determined that a vine portion straight line has been detected (step S34: YES), the control unit 111 determines that glasses are being worn (step S35) and ends the process of this flowchart. When it is determined that no vine portion straight line has been detected (step S34: NO), the control unit 111 determines that glasses are not being worn (step S36) and ends the process of this flowchart.
  • In the process of step S34, the YES/NO determination may be made based on the number of detected straight lines. For example, (1) YES may be determined when at least one straight line is detected, (2) YES may be determined when exactly two straight lines are detected, or (3) YES may be determined when two or more straight lines are detected. This takes into account that a straight line other than the one representing the vine portion (for example, a straight line along the hairline) may be detected, and that when glasses are worn, two straight lines having close inclinations are typically detected.
  • Alternatively, among a plurality of captured images, if the number of images in which a vine portion straight line can be detected is greater than or equal to a predetermined number, it may be determined that the subject is wearing glasses, and if the number is less than the predetermined number, it may be determined that the subject is not wearing glasses. In this way, even if a vine portion straight line is occasionally detected erroneously by the image processing, robust vine portion straight line detection can be performed without being affected by such noise.
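  • A minimal sketch of this multi-image criterion follows; the frame count and threshold are illustrative assumptions.

```python
# Sketch of the multi-image criterion: decide that glasses are worn only when a
# vine straight line was detected in at least `min_detections` of the recent frames.
def glasses_worn(detection_flags, min_detections=7):
    """detection_flags: list of booleans, one per recent frame (e.g. the last 10 frames)."""
    return sum(detection_flags) >= min_detections
```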
  • Returning to FIG. 4, the control unit 111 determines whether or not glasses are worn, based on the processing result of step S14 (step S15).
  • When glasses are not worn (step S15: NO), the process returns to step S11. When glasses are worn (step S15: YES), the glasses direction determination process described later with reference to FIGS. 6 to 15 is performed (step S16).
  • Next, the control unit 111 performs the face orientation determination process described later with reference to FIG. 16 (step S17).
  • Next, the control unit 111 determines whether or not to continue the determination process (step S18). When it is determined that the determination process is to be continued (step S18: YES), the control unit 111 proceeds to the process of step S11; when it is determined that the determination process is not to be continued (step S18: NO), the process of this flowchart ends.
  • In this embodiment, an example of the determination process is shown for three patterns of eyeglasses having different inclinations of the upper edge straight line and the lower edge straight line.
  • FIG. 6A shows the vine portion of the pattern A glasses, FIG. 6B shows the vine portion of the pattern B glasses, and FIG. 6C shows the vine portion of the pattern C glasses.
  • In the pattern A glasses, the upper edge straight line 200U and the lower edge straight line 200L are substantially parallel, and the interval between them is about 3 mm.
  • Since the inclination of the upper edge straight line 200U and the inclination of the lower edge straight line 200L are almost the same, the pitch angle of the glasses (face) can be determined even when only one of the upper edge straight line 200U and the lower edge straight line 200L can be detected.
  • In the pattern B glasses, the width of the vine portion is large (about 10 mm) where the vine portion is close to the lens portion, and the width becomes thin (about 2 mm) where the vine portion is far from the lens portion.
  • In the pattern B glasses, the inclination of the upper edge straight line 200U and the inclination of the lower edge straight line 200L are different. Therefore, when only one of the upper edge straight line 200U and the lower edge straight line 200L is detected, in order to determine the pitch angle of the glasses (face) from that edge straight line, it is necessary to determine whether the detected edge straight line is the upper edge straight line 200U or the lower edge straight line 200L, and to know its inclination with respect to a reference angle. For example, if the reference is defined such that the inclination of the upper edge straight line 200U is 10 degrees and the inclination of the lower edge straight line 200L is 0 degrees, and it is known whether the single detected edge straight line is the upper edge straight line 200U or the lower edge straight line 200L, then the pitch angle of the glasses (face) can be determined from that single edge straight line, even when wearing pattern B glasses whose upper and lower edge inclinations differ.
  • In the pattern C glasses, the width of the vine portion is thicker (about 7 mm) where the vine portion is close to the lens portion and thinner (about 4 mm) where it is far from the lens portion; however, it does not become as thin as in pattern B.
  • First, the control unit 111, as the eyeglass orientation determination unit 16, determines whether or not the number of vine portion straight lines detected in the process of step S33 is one (step S51).
  • When the number of detected vine portion straight lines is one (step S51: YES), the control unit 111 performs the first pitch angle determination process described later with reference to FIG. 8 (step S52), and proceeds to the process of step S56.
  • When the number is not one (step S51: NO), the control unit 111 determines whether the number of vine portion straight lines detected in the process of step S33 is two (step S53).
  • When it is determined that the number of detected vine portion straight lines is not two (that is, three or more) (step S53: NO), the control unit 111 performs the vine portion straight line extraction process described later with reference to FIG. 11 (step S54), then performs the second pitch angle determination process described later with reference to FIG. 12 (step S55), and proceeds to the process of step S56. When it is determined that the number of detected vine portion straight lines is two (step S53: YES), the control unit 111 performs the second pitch angle determination process (step S55) and proceeds to the process of step S56.
  • control unit 111 performs a yaw angle determination process to be described later with reference to FIG. 15 (step S56).
  • Next, the control unit 111 generates spectacle direction information indicating the pitch angle of the glasses determined in the first pitch angle determination process of step S52 or in the second pitch angle determination process of step S55, and the yaw angle of the glasses determined in the yaw angle determination process of step S56 (step S57), and ends the spectacle direction determination process.
  • The first pitch angle determination process is a process of determining the pitch-direction angle in which the glasses are facing, from the inclination, when there is one vine portion straight line.
  • First, the control unit 111, as the spectacle direction determination unit 16, determines whether or not the type of the spectacles being worn is type A (step S71). Specifically, for example, the user or the person performing the face orientation determination inputs in advance which type of glasses is worn, and the determination is made based on the input result.
  • When it is determined that the type is not type A (step S71: NO), the control unit 111 determines whether it is known whether the detected vine portion straight line is the upper edge or the lower edge, and whether its inclination with respect to the reference angle has been determined (step S72).
  • In principle, when two vine portion straight lines have been detected, the control unit 111 determines "NO" in step S72. However, the designer of the determination device D may wish to reduce the processing load by determining the pitch angle with only the upper edge or the lower edge (that is, one vine portion straight line) instead of two. In that case, the designer sets in advance whether the upper edge or the lower edge is to be used.
  • With this setting, the control unit 111 treats it as confirmed whether the vine portion straight line is the upper edge or the lower edge. Furthermore, depending on whether table information indicating the inclination corresponding to the reference angle is stored in advance in the storage unit 112 for the upper edge or the lower edge (the edge set in advance by the designer) of the type of glasses being processed, the control unit 111 determines whether or not the inclination with respect to the reference angle has been determined. That is, when two vine portion straight lines are detected and the designer wants to determine the pitch angle with only one vine portion straight line to reduce the processing load, the designer selects the upper edge or the lower edge in advance and prepares table information indicating the inclination corresponding to the reference angle for the selected edge; this allows the control unit 111 to determine "YES" in the process of step S72 and to determine the pitch angle from a single vine portion straight line.
  • If the control unit 111 determines "NO" (step S72: NO), the first pitch angle determination process ends. On the other hand, if the control unit 111 determines "YES" (step S72: YES), the control unit 111 proceeds to the process of step S73.
  • When it is determined in the process of step S71 that the type is A (step S71: YES), the control unit 111 proceeds to the process of step S73.
  • Next, the control unit 111, as the linear inclination calculation unit 17, calculates the inclination of the vine portion straight line detected in the process of step S33 (step S73).
  • Next, the control unit 111, as the pitch direction determination unit 18, performs the pitch angle determination based on the inclination of the vine portion straight line (step S74), and ends the first pitch angle determination process.
  • FIG. 9 shows the direction of the glasses of pattern A and the inclination of the upper edge straight line and the lower edge straight line of the vine.
  • In the pattern A glasses, the inclination of the upper edge straight line 200U and the inclination of the lower edge straight line 200L are substantially the same, so, as described above, the pitch angle can be determined from the inclination of either the upper edge straight line 200U or the lower edge straight line 200L.
  • By appropriately setting threshold values for the inclination of the vine portion straight line, the control unit 111 can determine the pitch angle of the glasses from that inclination.
  • the vine part straight line extraction process is a process of extracting two vine part straight lines when there are three or more vine part straight lines.
  • the pitch angle is determined by the second pitch angle determination process based on the two vine straight lines.
  • First, the control unit 111, as the eyeglass orientation determination unit 16, determines whether or not there is vine portion straight line information stored in the past (step S91).
  • The vine portion straight line information is information recorded when two vine portion straight lines were properly detected (step S33) or extracted (step S54), and indicates the inclination, length, interval, positional relationship, and the like of the two straight lines.
  • As the vine portion straight line information, information stored in the past is used, or information about the straight lines currently being processed is used (the inclination of the straight lines is referred to in the process of step S94, the length of the straight lines in the process of step S97, and the interval between the straight lines in the process of step S98).
  • When it is determined that there is no vine portion straight line information stored in the past (step S91: NO), the control unit 111 proceeds to the process of step S94. On the other hand, when it is determined that there is vine portion straight line information stored in the past (step S91: YES), the control unit 111 then determines whether or not to use the vine portion straight line information (step S92).
  • When it is determined that the vine portion straight line information is not to be used (step S92: NO), the control unit 111 proceeds to the process of step S94. When it is determined that the vine portion straight line information is to be used (step S92: YES), the control unit 111 extracts two vine portion straight lines using the vine portion straight line information stored in the past (step S93) and ends the vine portion straight line extraction process. Specifically, the two vine portion straight lines most similar to the information indicated by the stored vine portion straight line information are extracted. For example, a pair of straight lines whose difference in inclination is close to the difference in inclination between the two vine portion straight lines indicated by the stored vine portion straight line information may be selected, or a straight line of a pair whose inclination difference greatly exceeds or falls significantly below that difference may be excluded.
  • Next, among the three or more vine portion straight lines detected in the process of step S33, the control unit 111 excludes any straight line whose inclination differs significantly from that of the other straight lines (step S94). For example, a straight line whose inclination is much larger (steeper) than that of the actual vine portion straight lines is excluded in this way.
  • Next, the control unit 111 determines whether or not two appropriate vine portion straight lines have been extracted (step S95). Specifically, in this process it is determined whether two vine portion straight lines can be extracted and whether the two vine portion straight lines are appropriate in terms of length, inclination, interval, and the like. When it is determined that two appropriate vine portion straight lines have been extracted (step S95: YES), the control unit 111 ends the vine portion straight line extraction process. On the other hand, if it is determined that two appropriate vine portion straight lines have not been extracted (step S95: NO), the control unit 111 next excludes straight lines whose length differs significantly from that of the other vine portion straight lines (step S96).
  • Next, the control unit 111 determines whether or not two appropriate vine portion straight lines have been extracted, in the same manner as in the process of step S95 (step S97). When it is determined that two appropriate vine portion straight lines have been extracted (step S97: YES), the control unit 111 ends the vine portion straight line extraction process. On the other hand, if it is determined that two appropriate vine portion straight lines have not been extracted (step S97: NO), the control unit 111 next excludes straight lines that create an inappropriate interval between the straight lines (step S98). For example, among the plurality of intervals formed by the straight lines, a straight line that creates an interval much larger than the others is removed.
  • Next, the control unit 111 determines whether or not two appropriate vine portion straight lines have been extracted, in the same manner as in step S95 (step S99). When it is determined that two appropriate vine portion straight lines have been extracted (step S99: YES), the control unit 111 ends the vine portion straight line extraction process. On the other hand, if it is determined that two appropriate vine portion straight lines cannot be extracted (step S99: NO), the control unit 111 proceeds to the process of step S11 in FIG. 4.
  • In the vine portion straight line extraction process above, an example of processing for extracting two appropriate vine portion straight lines (the upper edge straight line 200U and the lower edge straight line 200L) has been described; however, a method other than the one described here may be used as long as two appropriate vine portion straight lines are extracted.
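  • The following is a minimal sketch of this kind of successive outlier filtering (steps S94, S96, S98), assuming each candidate line has already been reduced to its slope, length, and vertical midpoint; the data layout and tolerance values are assumptions for illustration only.

```python
import statistics

# Sketch of the vine straight line extraction: starting from three or more candidate
# lines, successively discard outliers by slope, then by length, then by spacing,
# until two plausible edge lines remain. Thresholds are illustrative.
def extract_two_vine_lines(lines, slope_tol=10.0, length_tol=0.5):
    """lines: list of dicts with 'slope' (degrees), 'length' (pixels), 'y_mid' (pixels)."""
    def drop_outliers(candidates, key, tol, relative=False):
        med = statistics.median(c[key] for c in candidates)
        limit = tol * med if relative else tol
        return [c for c in candidates if abs(c[key] - med) <= limit]

    candidates = drop_outliers(lines, 'slope', slope_tol)                   # slope outliers (cf. step S94)
    if len(candidates) > 2:
        candidates = drop_outliers(candidates, 'length', length_tol, True)  # length outliers (cf. step S96)
    if len(candidates) > 2:
        # Spacing check (cf. step S98): keep the adjacent pair with the smallest
        # vertical gap, a simple proxy for excluding lines that create large intervals.
        candidates.sort(key=lambda c: c['y_mid'])
        pairs = list(zip(candidates, candidates[1:]))
        best = min(pairs, key=lambda p: abs(p[0]['y_mid'] - p[1]['y_mid']))
        candidates = list(best)
    return candidates if len(candidates) == 2 else None
```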
  • the second pitch angle determination process is a process of determining an angle in the pitch direction of the glasses from the inclination when there are two vine straight lines.
  • First, the control unit 111, as the pitch direction determination unit 18, determines the spectacle pattern from the two vine portion straight lines and generates pattern information indicating the spectacle pattern (step S111).
  • The spectacle pattern is determined based on at least one of the inclinations of the two vine portion straight lines and the interval between the two vine portion straight lines. For example, when the slope E_U of the upper edge straight line and the slope E_L of the lower edge straight line are approximately equal (the difference in inclination is less than 3 degrees), the glasses are determined to be pattern A. When the difference between E_U and E_L is 3 degrees or more and less than 8 degrees, the glasses are determined to be pattern B. When the difference between E_U and E_L is 8 degrees or more and less than 15 degrees, the glasses are determined to be pattern C. The basis for this will be described below.
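  • A minimal sketch of this slope-difference classification follows; the function name is an assumption, and the 3/8/15-degree boundaries are the example values given above.

```python
# Sketch of the pattern determination in step S111, using the slope-difference
# thresholds described above (3, 8 and 15 degrees).
def classify_pattern(e_u, e_l):
    """e_u, e_l: slopes (degrees) of the upper and lower edge straight lines."""
    diff = abs(e_u - e_l)
    if diff < 3:
        return 'A'      # upper and lower edges roughly parallel
    if diff < 8:
        return 'B'      # strongly tapered vine
    if diff < 15:
        return 'C'      # moderately tapered vine
    return None         # outside the expected range; treat as undetermined
```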
  • FIG. 12 is a diagram illustrating an example of the relationship between the pattern B eyeglass direction and the inclination of the upper edge straight line 200U and the lower edge straight line 200L.
  • As shown in FIG. 12, in the pattern B glasses, the difference between the inclination E_U of the vine portion upper edge straight line 200U and the inclination E_L of the vine portion lower edge straight line 200L varies with the direction of the glasses.
  • FIG. 13 is a diagram illustrating an example of the relationship between the direction of the glasses of the pattern C and the inclination of the vine portion upper edge straight line 200U and the lower edge straight line 200L.
  • As shown in FIG. 13, in the pattern C glasses, the difference between the inclination E_U of the vine portion upper edge straight line 200U and the inclination E_L of the vine portion lower edge straight line 200L likewise varies with the direction of the glasses.
  • When the difference between the inclination E_U of the vine portion upper edge straight line 200U and the inclination E_L of the vine portion lower edge straight line 200L is about 7 to 8 degrees, the spectacle pattern can be determined using the pitch angle information (for example, when the difference is about 7 degrees and the pitch angle is 20 degrees, the glasses are of pattern B, and when the difference is about 7 degrees and the pitch angle is 10 degrees, the glasses are of pattern C).
  • the process of determining the spectacle pattern may be performed using a plurality of images instead of using only one image. In such a case, the determination accuracy is improved.
  • When the pattern of the glasses is determined based on the interval between the two vine portion straight lines, if the interval is constant, the glasses are determined to be pattern A; if the interval at one end differs from that at the other end and the difference is greater than or equal to a threshold value, the glasses are determined to be pattern B; and if the interval at one end differs from that at the other end and the difference is less than the threshold value, the glasses are determined to be pattern C.
  • Next, the control unit 111 determines whether there is vine portion straight line inclination information stored in the past (step S112).
  • Here, "the past" can be set as appropriate; for example, it may be one second before, or five frames before when shooting at 10 fps. The vine portion straight line inclination information may be stored for every image or periodically (every several frames).
  • the vine part straight line inclination information is information indicating the inclination of two vine part straight lines included in the vine part straight line information.
  • When it is determined that there is vine portion straight line inclination information stored in the past (step S112: YES), the control unit 111 determines whether or not the inclination of the past vine portion straight lines differs from the inclination of the current vine portion straight lines (step S113). Specifically, the difference in inclination between the two past vine portion straight lines and the difference in inclination between the two current vine portion straight lines are compared to determine whether the glasses are of a different type.
  • When it is determined that the inclination of the past vine portion straight lines differs from the inclination of the current vine portion straight lines (step S113: YES), the control unit 111 determines that the glasses have been changed (that glasses of a type different from those worn in the past are now being worn) (step S114), stores the current vine portion straight line inclination information in the storage unit 112 (step S115), and proceeds to the process of step S116. On the other hand, if the control unit 111 determines that the inclination of the past vine portion straight lines does not differ from the inclination of the current vine portion straight lines (step S113: NO), the process proceeds to step S116.
  • Next, the control unit 111, as the pitch direction determination unit 18, performs the pitch angle determination based on the inclinations of the two vine portion straight lines (step S116).
  • Here, the pitch angle determination based on the inclinations of the two vine portion straight lines will be described specifically.
  • Consider the relationship between the pitch angle of the pattern B glasses and the inclinations of the vine upper edge straight line and lower edge straight line. With the slope of the vine upper edge straight line denoted E_U and the slope of the vine lower edge straight line denoted E_L, the control unit 111 determines the pitch angle, for example, as follows: "E_U ≥ 37 [degrees]" and "E_L ≥ 28 [degrees]" → 20 degrees or more upward; "28.5 [degrees] ≤ E_U < 37 [degrees]" and "15 [degrees] ≤ E_L < 28 [degrees]" → 10 degrees upward; "10 [degrees] ≤ E_U < 28.5 [degrees]" and "0 [degrees] ≤ E_L < 15 [degrees]" → 0 degrees (level); "0 [degrees] ≤ E_U < 10 [degrees]" and "−15 [degrees] ≤ E_L < 0 [degrees]" → downward.
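  • A minimal sketch of this range-based lookup for the pattern B thresholds follows; the table encodes the example bounds above, and the label of the last row is an assumption because the corresponding line of the original text is truncated.

```python
# Sketch of the pattern B pitch determination in step S116, encoding the (E_U, E_L)
# ranges listed above as a small lookup table ordered from steepest to shallowest.
PATTERN_B_PITCH_TABLE = [
    # (lower bound on E_U, lower bound on E_L, pitch label)
    (37.0,  28.0, 'up 20 degrees or more'),
    (28.5,  15.0, 'up 10 degrees'),
    (10.0,   0.0, 'level (0 degrees)'),
    ( 0.0, -15.0, 'down'),   # label inferred: the original text is truncated here
]

def pattern_b_pitch(e_u, e_l):
    """Return the pitch label of the first row whose lower bounds both hold."""
    for min_u, min_l, label in PATTERN_B_PITCH_TABLE:
        if e_u >= min_u and e_l >= min_l:
            return label
    return None  # slopes below every listed range: pitch not determined
```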
  • The control unit 111 applies a similar determination to the pattern C glasses.
  • The pitch angle of the glasses can also be determined by performing the same determination as in step S74 on the slope of one of the edge straight lines.
  • The inclination information of the vine edges for each pitch angle may be stored and used thereafter for setting the threshold values of the conditional expressions.
  • Next, the control unit 111 causes the storage unit 112 to store the pitch angle determined in the process of step S116 in association with the inclinations of the two vine portion straight lines (step S117), and ends the second pitch angle determination process.
  • The yaw angle determination process is a process of determining the yaw-direction angle (yaw angle) in which the glasses are facing, from the inclination of the vine portion straight line.
  • First, the control unit 111, as the yaw direction determination unit 19, determines whether or not the pitch angle determined in the process of step S74 or step S116 is a pitch angle at which the yaw angle can be determined (step S131). For example, as shown in FIG. 15, when the pitch is 0 degrees (level), the inclination of the vine portion straight line changes little even if the yaw angle changes, which makes the yaw angle difficult to determine. Based on this, the control unit 111 determines that the yaw angle cannot be determined when the slope of the vine portion straight line is nearly horizontal.
  • When the control unit 111 determines that the yaw angle cannot be determined (step S131: NO), it ends the yaw angle determination process. When it determines that the yaw angle can be determined (step S131: YES), it determines whether or not the yaw angle determination is to be performed using two vine portion straight lines (step S132).
  • For example, when the pitch angle has been determined in the first pitch angle determination process of step S52, or when, in the second pitch angle determination process, the pitch angle has been determined from only either the upper edge straight line 200U or the lower edge straight line 200L for the pattern A glasses, the yaw angle determination is performed with one vine portion straight line.
  • When the control unit 111 determines that the yaw angle determination is not to be performed with two vine portion straight lines (that is, that the yaw angle is determined with one vine portion straight line) (step S132: NO), it determines the yaw angle based on the inclination of the one vine portion straight line (step S133), and ends the yaw angle determination process.
  • When the pitch angle of the glasses is 0 degrees (level), the control unit 111 can determine the yaw angle of the glasses, for example, as follows: E1 ≥ 10 [degrees] → yaw angle of the glasses 30 degrees left; E1 < 10 [degrees] → yaw angle of the glasses 40 degrees or more left (where E1 is the slope of the single vine portion straight line).
  • When E1 < 10 [degrees], it is only known that the yaw angle is 40 degrees or more to the left; however, at pitch angles other than 0 degrees, where there is a difference in inclination, it is possible to discriminate between 40 degrees left and 50 degrees left.
  • When the pitch angle of the glasses is 10 degrees upward, the control unit 111 can determine the yaw angle, for example, as follows: E1 ≥ 25 [degrees] → yaw angle of the glasses 30 degrees left; 20 [degrees] ≤ E1 < 25 [degrees] → yaw angle of the glasses 40 degrees left; E1 < 20 [degrees] → yaw angle of the glasses 50 degrees left.
  • On the other hand, when the control unit 111 determines that the yaw angle determination is to be performed with two vine portion straight lines (step S132: YES), it determines the yaw angle based on the inclinations of the two vine portion straight lines (step S134), and ends the yaw angle determination process.
  • For example, when the pitch angle of the glasses is 0 degrees (level), the control unit 111 can determine the yaw angle of the glasses as follows: "E_U ≥ 25 [degrees]" and "E_L > 13 [degrees]" → yaw angle of the glasses 30 degrees left; "18 [degrees] ≤ E_U < 25 [degrees]" and "9 [degrees] ≤ E_L < 13 [degrees]" → yaw angle of the glasses 40 degrees left; "E_U < 18 [degrees]" and "E_L < 9 [degrees]" → yaw angle of the glasses 50 degrees left.
  • When the pitch angle of the glasses is 10 degrees upward, the control unit 111 can determine the yaw angle, for example, as follows: "E_U ≥ 34 [degrees]" and "E_L > 24 [degrees]" → yaw angle of the glasses 30 degrees left; "30 [degrees] ≤ E_U < 34 [degrees]" and "22 [degrees] ≤ E_L < 24 [degrees]" → yaw angle of the glasses 40 degrees left; "E_U < 30 [degrees]" and "E_L < 22 [degrees]" → yaw angle of the glasses 50 degrees left.
  • When the pitch angle of the glasses is 20 degrees upward, the control unit 111 can determine the yaw angle, for example, as follows: "E_U ≥ 50 [degrees]" and "E_L > 40 [degrees]" → yaw angle of the glasses 30 degrees left; "42 [degrees] ≤ E_U < 50 [degrees]" and "33 [degrees] ≤ E_L < 40 [degrees]" → yaw angle of the glasses 40 degrees left; and so on.
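  • A minimal sketch of the two-line yaw lookup follows, encoding only the 0-degree (level) pitch case from the thresholds above; corresponding functions for the 10-degree and 20-degree upward cases would be written the same way. Function and variable names are assumptions.

```python
# Sketch of the two-line yaw determination in step S134 for a glasses pitch of
# 0 degrees, using the example thresholds given in the text.
def yaw_at_level_pitch(e_u, e_l):
    """e_u, e_l: slopes (degrees) of the upper and lower edge straight lines."""
    if e_u >= 25 and e_l > 13:
        return 'left 30 degrees'
    if 18 <= e_u < 25 and 9 <= e_l < 13:
        return 'left 40 degrees'
    if e_u < 18 and e_l < 9:
        return 'left 50 degrees'
    return None  # slopes fall between bands; leave this frame undetermined
```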
  • Finally, the face orientation determination process will be described using the flowchart of FIG. 16. Since the yaw angle of the face coincides with the yaw angle of the glasses, the yaw angle of the glasses is adopted as the yaw angle of the face. In the face orientation determination process, the pitch angle of the face is determined based on the pitch angle of the glasses.
  • First, the control unit 111, as the face orientation determination unit 21, refers to the face orientation reference information and determines the face pitch angle from the pitch angle of the glasses determined in the process of step S74 or step S116 (step S151).
  • The pitch angle of the glasses does not necessarily match the pitch angle of the face, depending on the position of the nose and the way the glasses are worn. Even when the slope of the vine portion straight line is almost horizontal, the face is facing slightly upward for some individuals and slightly downward for others. Therefore, reference information for obtaining the pitch angle of the face when the pitch angle of the glasses is 0 degrees (level) is required. In this embodiment, this reference information is referred to as the face orientation reference information.
  • For example, if the face is actually facing 2 degrees upward when the pitch angle of the glasses is 0 degrees, face orientation reference information of "+2 degrees" is generated.
  • In that case, when the pitch angle of the glasses is determined to be 10 degrees upward, the control unit 111 determines that the face pitch angle is 12 degrees upward. Note that by periodically correcting the face orientation reference information, the face pitch angle can be determined correctly even if the glasses become misaligned.
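  • A minimal sketch of this correction step (step S151) follows, using the "+2 degrees" example above; the function name is an assumption.

```python
# Sketch of step S151: correct the glasses pitch with the per-wearer face
# orientation reference information ("+2 degrees" in the example above).
def face_pitch(glasses_pitch_deg, face_reference_deg):
    """Face pitch = glasses pitch corrected by the per-wearer reference offset."""
    return glasses_pitch_deg + face_reference_deg

# Example from the text: glasses pitched 10 degrees up with a "+2 degrees"
# reference gives a face pitch of 12 degrees up.
assert face_pitch(10.0, 2.0) == 12.0
```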
  • control unit 111 as the direction information generation unit 22 generates face direction information indicating the face direction (pitch angle and yaw angle) (step S152), and ends the face direction determination process.
  • As described above, the control unit 111 (an example of the "detection unit" and "generation unit") detects the inclination of the vine portion of the glasses (an example of the "characteristic portion") in an image obtained by photographing, from the side, a head wearing spectacles (an example of the "wearing object"), and generates face direction information indicating the direction in which the face is facing from the detected inclination of the vine portion.
  • Therefore, with the determination device D of the present embodiment, even when the face is turned sideways with respect to the camera C, face direction information indicating the direction in which the face is facing can be created based on the image of the head taken by the camera C.
  • The control unit 111 generates eyeglass direction information (an example of the "wearing object direction information") indicating the direction in which the eyeglasses are facing from the inclination of the vine portion, and generates the face direction information by correcting the eyeglass direction information based on the face orientation reference information (an example of the "difference information") indicating the direction the face is facing when the eyeglasses face the reference direction. Accordingly, correct face direction information can be generated even when the direction in which the glasses are facing does not match the direction in which the face is facing.
  • In the above embodiment, the control unit 111 determines the face direction based on the inclination of the vine portion straight line.
  • However, the length of the vine portion straight line may be added as a determination element for determining the face direction.
  • This is based on the fact that the vine portion straight line in the image becomes longer as the side of the head on which that vine portion is located faces the camera more directly (for example, when the face is turned to the left, the side on which the right vine portion is located faces the camera).
  • Specifically, the control unit 111 may store the yaw angle of the face (glasses) and the length of the vine portion straight line in the image in association with each other in the storage unit 112, and may determine the final yaw angle of the face (glasses) from the yaw angle corresponding to the detected length of the vine portion straight line together with the yaw angle determined from the inclination. For example, the average value of the two yaw angles can be used as the final yaw angle.
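  • A minimal sketch of this combination rule follows; the function names and the nearest-length lookup are assumptions for illustration.

```python
# Sketch: combine the slope-based yaw estimate with a length-based estimate
# looked up from previously stored (vine line length, yaw angle) pairs.
def yaw_from_length(length_px, stored_pairs):
    """stored_pairs: list of (length_px, yaw_deg); return the yaw of the closest stored length."""
    closest = min(stored_pairs, key=lambda p: abs(p[0] - length_px))
    return closest[1]

def final_yaw(yaw_from_slope_deg, length_px, stored_pairs):
    """Average the slope-based and length-based estimates, as suggested in the text."""
    return (yaw_from_slope_deg + yaw_from_length(length_px, stored_pairs)) / 2.0
```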
  • When the control unit 111 (an example of the "storage control unit 111C") generates the face direction information, it may store the inclination information indicating the inclination of the vine portion straight line and the face direction information in association with each other in the storage unit 112 (an example of the "storage unit"). At this time, if inclination information indicating an inclination whose difference from the newly detected inclination of the vine portion straight line is equal to or less than a threshold value is stored in the storage unit 112, the control unit 111 may acquire the face direction information stored in association with that inclination information.
  • In this case, when the inclination of the newly detected vine portion straight line is almost the same as the inclination of the vine portion straight line at the time the face direction information was generated in the past (that is, when the difference is equal to or less than the threshold value), there is no need to newly generate face direction information, so the processing load of generating face direction information from the inclination of the vine portion straight line can be reduced.
  • The control unit 111 detects the vine portion straight line (an example of "a line indicating the edge of the characteristic portion"), and when two vine portion straight lines are detected, generates pattern information (an example of the "type information") indicating the spectacle pattern based on at least one of the inclinations of the two vine portion straight lines and the interval between them. The control unit 111 (an example of the "storage control unit") may further store the pattern information in the storage unit 112 in association with the inclination information and the face direction information, and when it newly generates pattern information, it may acquire from the storage unit 112 the face direction information associated with pattern information indicating the same type as the newly generated pattern information.
  • In this way, since each type can be identified by at least one of the slopes of the two lines and the interval between them, the slope of the vine portion straight line and the face direction information can be stored in association with each other for each type. Therefore, when the same type of spectacles as when face direction information was generated in the past is worn and the slope of the newly detected vine portion straight line is almost the same as the slope of the vine portion straight line at the time that face direction information was generated, there is no need to newly generate face direction information, and the processing load of generating face direction information from the inclination of the vine portion straight line can be reduced.
  • The control unit 111 stores the generated face direction information in the storage unit 112 (an example of the "storage unit"), and when generating new face direction information, may generate face direction information indicating the candidate direction closest to the direction indicated by at least one piece of face direction information stored within a predetermined period.
  • The latest face direction information may be used as the at least one piece of face direction information stored within the predetermined period.
  • For example, when the slope of the detected vine portion straight line is substantially equal to a threshold value and it must be decided which of the face orientations (angles) separated by that threshold applies, the control unit 111 adopts, as the determination result, the angle closer to the direction (angle) indicated by the latest face direction information stored in the storage unit 112. Thereby, even when there are a plurality of candidates for the direction indicated by the face direction information, it is possible to prevent the generation of face direction information indicating a direction far from the direction indicated by the latest face direction information.
  • In the above embodiment, the direction of the glasses (face) is determined based on the inclination of the straight line indicating the edge of the vine portion of the glasses; however, the direction of a mask (face) may instead be determined based on the inclination of the straight line indicating the edge of the string of the mask that is hooked over the ear.
  • In the above embodiment, the face direction information indicating the direction in which the face is facing is generated by correcting the eyeglass direction information using the face orientation reference information for obtaining the face pitch angle when the pitch angle of the glasses is 0 degrees; however, the direction (pitch angle and yaw angle) in which the face is facing may instead be obtained directly from the inclination of the vine portion edge straight lines.
  • In this case, it is assumed that threshold values are set for the inclination of the vine portion edge straight lines so that the face direction (pitch angle and yaw angle) can be identified directly from those inclinations. For example, a threshold is set so that when the inclination of the vine upper edge straight line is 15 degrees or more and less than 35 degrees and the inclination of the vine lower edge straight line is 7 degrees or more and less than 20 degrees, the face orientation is determined to be a pitch angle of 10 degrees and a yaw angle of 30 degrees. This is done for each pattern of glasses.
  • The control unit 111 then determines the direction in which the face is facing by threshold determination on the inclinations, and generates the face direction information. Thereby, face direction information can be generated directly from the inclination of the vine portion edge straight lines.
  • the control unit 111 identifies the orientation of the driver's face based on the image photographed by the camera C.
  • In this case, the direction of the face when the driver is facing the front (the direction of travel of the car), that is, the face direction that is statistically dominant when the driver's head is photographed continuously, is set as the reference direction (for example, a pitch angle of 10 degrees and a yaw angle of 20 degrees; these angles are obtained from the elevation angle at which the camera C is installed, and so on), and the face direction is expressed by the pitch angle and yaw angle relative to that reference direction.
  • The control unit 111 then performs threshold determination on the slope of the vine portion edge straight lines, thereby identifying the direction in which the face is facing and generating face direction information.
  • Even for some glasses whose vine has a special shape, the upper edge straight line and the lower edge straight line can be detected, and therefore the same processing as in the above embodiment is possible.
  • The vine indicated by reference numeral 303 in FIG. 17A also has no clear edge straight line, but the same processing as in the above embodiment can be performed by detecting a plurality of horizontal straight lines at the center of the vine as edge straight lines. Even in the case of special glasses such as those indicated by reference numerals 302 and 304, the same processing as in the above embodiment can be performed by detecting the upper edge straight line and the lower edge straight line from the vine portion. For example, in the case of the glasses indicated by reference numeral 302 in FIG. 17B, if edge detection is performed in the vine region and straight lines in contact with these edges are obtained, these straight lines become the upper edge straight line 200U and the lower edge straight line 200L of the glasses vine, and the same processing is possible.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The present invention aims to provide an information generation device and the like capable of creating direction information indicating the direction in which a face is facing, based on an image of the face taken by a camera, even if the face is turned sideways with respect to the camera. To this end, the inclination of a characteristic portion of a wearing object worn on the head is detected from an image of the head taken from the side, and face direction information indicating the direction in which the face is facing is generated from the detected inclination of the characteristic portion.
PCT/JP2016/052786 2016-01-29 2016-01-29 Dispositif de génération d'informations, procédé de génération d'informations, programme de génération d'informations, et support d'enregistrement WO2017130407A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2016/052786 WO2017130407A1 (fr) 2016-01-29 2016-01-29 Dispositif de génération d'informations, procédé de génération d'informations, programme de génération d'informations, et support d'enregistrement
JP2017563651A JP6697006B2 (ja) 2016-01-29 2016-01-29 情報生成装置、情報生成方法、情報生成プログラム及び記録媒体

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/052786 WO2017130407A1 (fr) 2016-01-29 2016-01-29 Dispositif de génération d'informations, procédé de génération d'informations, programme de génération d'informations, et support d'enregistrement

Publications (1)

Publication Number Publication Date
WO2017130407A1 true WO2017130407A1 (fr) 2017-08-03

Family

ID=59397631

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/052786 WO2017130407A1 (fr) 2016-01-29 2016-01-29 Dispositif de génération d'informations, procédé de génération d'informations, programme de génération d'informations, et support d'enregistrement

Country Status (2)

Country Link
JP (1) JP6697006B2 (fr)
WO (1) WO2017130407A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019239694A1 (fr) * 2018-06-15 2019-12-19 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et système de traitement d'informations
CN113228615A (zh) * 2018-12-28 2021-08-06 索尼集团公司 信息处理装置、信息处理方法和信息处理程序


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4445454B2 (ja) * 2005-10-20 2010-04-07 アイシン精機株式会社 顔中心位置検出装置、顔中心位置検出方法、及び、プログラム

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006209442A (ja) * 2005-01-27 2006-08-10 Nissan Motor Co Ltd 眼位置検出装置
JP2009517745A (ja) * 2005-11-30 2009-04-30 シーイング・マシーンズ・プロプライエタリー・リミテッド 視覚的に頭と目を追跡するシステムにおける眼鏡の視覚的追跡
JP2008269182A (ja) * 2007-04-18 2008-11-06 Fujitsu Ltd 画像処理方法、画像処理装置、画像処理システム及びコンピュータプログラム
WO2009091029A1 (fr) * 2008-01-16 2009-07-23 Asahi Kasei Kabushiki Kaisha Dispositif, procédé et programme d'estimation de posture de visage

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019239694A1 (fr) * 2018-06-15 2019-12-19 ソニー株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations, programme, et système de traitement d'informations
US11417088B2 (en) 2018-06-15 2022-08-16 Sony Corporation Information processing device, information processing method, program, and information processing system
CN113228615A (zh) * 2018-12-28 2021-08-06 索尼集团公司 信息处理装置、信息处理方法和信息处理程序
CN113228615B (zh) * 2018-12-28 2023-11-07 索尼集团公司 信息处理装置、信息处理方法和计算机可读记录介质

Also Published As

Publication number Publication date
JP6697006B2 (ja) 2020-05-20
JPWO2017130407A1 (ja) 2018-11-15

Similar Documents

Publication Publication Date Title
KR102056333B1 (ko) 안경 렌즈 에지의 표시를 설정하기 위한 방법 및 장치 및 컴퓨터 프로그램
KR101169533B1 (ko) 얼굴 자세 추정 장치, 얼굴 자세 추정 방법 및 얼굴 자세 추정 프로그램을 기록한 컴퓨터 판독 가능한 기록 매체
JP4830650B2 (ja) 追跡装置
JP4445454B2 (ja) 顔中心位置検出装置、顔中心位置検出方法、及び、プログラム
JP5024067B2 (ja) 顔認証システム、方法及びプログラム
WO2017013913A1 (fr) Dispositif de détection du regard, terminal de lunetterie, procédé de détection du regard et programme
WO2017036160A1 (fr) Procédé de suppression des lunettes pour reconnaissance faciale
JP6582604B2 (ja) 瞳孔検出プログラム、瞳孔検出方法、瞳孔検出装置および視線検出システム
JP6550642B2 (ja) 皺検出装置および皺検出方法
JP4912206B2 (ja) 画像処理方法、画像処理装置、画像処理システム及びコンピュータプログラム
JP3926507B2 (ja) 目位置及び顔位置検出装置
CN105740778B (zh) 一种改进的三维人脸活体检测方法及其装置
TWI694809B (zh) 檢測眼球運動的方法、其程式、該程式的記憶媒體以及檢測眼球運動的裝置
KR102460665B1 (ko) 응시 거리를 결정하는 방법 및 디바이스
JP2007280250A (ja) 顔認証システム
JP6822482B2 (ja) 視線推定装置、視線推定方法及びプログラム記録媒体
JP2000137792A (ja) 眼部検出装置
JP2007072627A (ja) サングラス検出装置及び顔中心位置検出装置
JP6697006B2 (ja) 情報生成装置、情報生成方法、情報生成プログラム及び記録媒体
JP2009064395A (ja) ポインティングデバイス、操作者の注視位置とカーソルの位置との誤差の補正をコンピュータに実行させるためのプログラムおよびそのプログラムを記録したコンピュータ読み取り可能な記録媒体
US20060269128A1 (en) Image correction method and apparatus
JP4533849B2 (ja) 画像処理装置及び画像処理プログラム
JP2017151565A (ja) 検出システム、検出方法、および検出プログラム
JP4696571B2 (ja) 眼位置検出装置
WO2017159215A1 (fr) Dispositif de traitement d'informations et procédé de traitement d'informations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16887993

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2017563651

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16887993

Country of ref document: EP

Kind code of ref document: A1