WO2011074198A1 - User interface device and input method - Google Patents
User interface device and input method
- Publication number
- WO2011074198A1 (PCT/JP2010/007026)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- interest
- degree
- user interface
- strength
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
- G06T2207/10021—Stereoscopic video; Stereoscopic image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
Definitions
- the present invention relates to a user interface device and an input method for estimating a user's interest level in order to perform input processing based on the user's interest level for a plurality of objects displayed on a screen.
- The current general information system adopts an interaction form in which the system presents information in response to the user's "explicit instructions" (for example, the user inputs characters with a keyboard, presses buttons on a remote control, or specifies an object with a pointing device).
- However, smooth communication between a person and such a system may be hindered by troublesome operations, the difficulty of those operations, and the burden of expressing one's intention explicitly.
- In Non-Patent Document 1, a system has been proposed in which a user viewing video content is photographed, the degree of interest is estimated from the user's facial expression, a tag of "Neutral", "Positive", or "Negative" is attached to the video content, and this information is used to provide useful hints when recommending programs.
- Patent Document 1 shows an image reproduction system that sequentially switches and reproduces a plurality of target images, in which the display time of each target image is determined dynamically based on peripheral sounds (such as viewer cheers) or viewer movements (such as changes in facial expression) during playback. In these techniques, basically, the degree of interest in a single content item displayed on the screen is determined.
- A gaze can be cited as a representative physical reaction for estimating a user's interest, concern, or intention with respect to each of a plurality of contents displayed on the screen.
- Vision is the dominant sense through which humans acquire information, while the central vision and the effective visual field are limited in size. Therefore, in order to acquire information from a target, it is necessary to move the gazing point to the target, and as a result, the line of sight concentrates on targets of interest. It can therefore be said that the line of sight readily reflects a person's interest, concern, or intention.
- Patent Document 2 discloses an apparatus that determines an object having a long gaze residence time as a user's desired object.
- This device displays a plurality of images as options on the screen, detects the user's line of sight toward the images with a line-of-sight angle detector, measures the dwell time of the line of sight on each image, and selects the image intended by the user based on the measured dwell times.
- The present invention solves the above-described conventional problems, and it is an object of the present invention to provide a user interface device and an input method that can estimate the user's degree of interest with high accuracy in order to perform input processing based on the user's degree of interest with respect to a plurality of objects displayed on the screen.
- A user interface device according to the present invention is a user interface device that estimates the degree of interest of a user in order to perform input processing based on the degree of interest of the user with respect to a plurality of objects displayed on a screen, and includes: a gaze direction detection unit that detects the gaze direction of the user; a gaze dwell time calculation unit that calculates, for each of the plurality of objects, the dwell time of the gaze direction on the object as a gaze dwell time; an attracting strength calculation unit that calculates an attracting strength for each object; and an interest level estimation unit that estimates the degree of interest of the user for each of the plurality of objects such that the degree of interest increases as the gaze dwell time increases and increases as the attracting strength decreases.
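- As an illustration only, the following minimal sketch (Python; not part of the patent) summarizes the relationships described above. All identifiers such as `ObjectOnScreen` and `estimate_interest` are made up for the example, and the subtraction-based correction with the clamp at zero anticipates Formula 8 described later.

```python
# Minimal sketch of the estimation flow: longer gaze dwell time -> higher interest,
# higher attracting strength -> lower interest. Names and numbers are illustrative.

from dataclasses import dataclass

@dataclass
class ObjectOnScreen:
    object_id: int
    gaze_dwell_time: float      # seconds, from the gaze dwell time calculation unit
    attracting_strength: float  # from the attracting strength calculation unit

def estimate_interest(obj: ObjectOnScreen, ga: float = 1.0, k: float = 1.0) -> float:
    """Reduce the dwell time according to the attracting strength, so that time spent
    looking at an object merely because it is eye-catching does not inflate the result."""
    corrected_dwell = max(0.0, obj.gaze_dwell_time - ga * obj.attracting_strength)
    return k * corrected_dwell

objects = [ObjectOnScreen(0, 3.2, 1.5), ObjectOnScreen(1, 2.8, 0.2)]
interest = {o.object_id: estimate_interest(o) for o in objects}
most_interesting = max(interest, key=interest.get)  # object 1, despite its shorter raw dwell time
```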
- With this configuration, it is possible to estimate the degree of interest of the user based on the attracting strength of each object in addition to the gaze dwell time of each object. If the attracting strength of an object is high, the user turns his or her gaze toward it even if he or she is not interested in it. That is, the gaze dwell time varies not only according to the degree of interest of the user with respect to the object but also according to the attracting strength of the object. Therefore, by estimating the degree of interest based on the attracting strength, it is possible to suppress the influence that the portion of the gaze dwell time unrelated to the user's degree of interest has on the estimation. That is, it is possible to estimate the degree of interest of the user with respect to the object with high accuracy. For example, power consumption can be reduced by performing display control such as displaying only objects in which the user has a high degree of interest, based on the degree of interest estimated in this way.
- the attraction strength calculation unit calculates the attraction strength for each object based on physical characteristics of the image of the object.
- the attraction strength calculation unit may calculate the attraction strength so that the attraction strength increases as the complexity increases, based on the complexity of the image of the object.
- Further, the attracting strength calculation unit may calculate the attracting strength based on the degree of heterogeneity of the image of the object with respect to the images around the object, so that the attracting strength increases as the degree of heterogeneity increases.
- the attractive strength can be calculated based on the physical characteristics of the object.
- Here, the attracting strength depends on the physical characteristics of the object (for example, the complexity or heterogeneity of the image of the object). For example, if the complexity of the image of the object is high, the line of sight is directed toward the object for a long time in order to understand its contents. Therefore, by calculating the attracting strength based on the physical characteristics of the object, the attracting strength can be calculated accurately, and the degree of interest of the user can be estimated with high accuracy.
- the attraction strength calculating unit calculates the attraction strength so that the attraction strength increases for each object as the psychological distance between the object and the user is shorter.
- the attraction strength calculation unit may calculate the psychological distance for each object so that the psychological distance becomes shorter as the number of times the user has viewed the object in the past increases.
- Further, for each object, the attracting strength calculation unit may calculate the psychological distance, based on whether or not the user is the creator of the object, so that the psychological distance is shorter when the user is the creator of the object than when the user is not the creator.
- Similarly, for each object, the attracting strength calculation unit may calculate the psychological distance, based on whether or not the user is a subject displayed in the object, so that the psychological distance is shorter when the user is a subject than when the user is not.
- the attractiveness intensity can be calculated based on the psychological distance between the object and the user.
- Here, the attracting strength depends on the psychological distance between the object and the user (which reflects, for example, the number of times the user has viewed the object, whether the user is the creator of the object, and whether the user is displayed in the object). Therefore, by calculating the attracting strength based on the psychological distance between the object and the user, the attracting strength can be calculated accurately, and the degree of interest of the user can be estimated with high accuracy.
- Further, the attracting strength calculation unit may calculate the attracting strength for each object based on the physical positional relationship between the object and the user. In this case, for example, the attracting strength calculation unit may calculate the attracting strength for each object, based on the physical distance between the object and the user, so that the attracting strength increases as the physical distance decreases. In addition, for example, the attracting strength calculation unit may calculate the attracting strength for each object, based on the angle between the line connecting the object and the user and the normal of the screen, so that the attracting strength increases as the angle decreases.
- the attractiveness intensity can be calculated based on the physical positional relationship between the object and the user.
- the attractive strength depends on the physical positional relationship between the object and the user (for example, the physical distance between the object and the user, or the direction of the object with respect to the user). For example, if the physical distance between the object and the user is short, the time for directing the line of sight to the object becomes long. Therefore, by calculating the attracting strength based on the physical positional relationship between the object and the user, the attracting strength can be accurately calculated, and the degree of interest of the user can be estimated with high accuracy.
- Further, the interest level estimation unit calculates a corrected gaze dwell time by correcting the gaze dwell time so that it becomes shorter as the attracting strength becomes larger, and estimates the degree of interest so that the degree of interest increases as the calculated corrected gaze dwell time increases.
- the degree-of-interest estimation unit may calculate the corrected gaze dwell time by subtracting a time corresponding to the attraction intensity from the gaze dwell time.
- the interest level estimation unit may calculate the corrected gaze dwell time by dividing the gaze dwell time using a value corresponding to the attracting strength.
- Preferably, the user interface device further includes a screen control unit that performs display control of the screen according to the estimated degree of interest.
- For example, the screen control unit may display, on the screen, information about the object having the highest estimated degree of interest among the plurality of objects. Further, for example, the screen control unit may change the display mode of the object having the highest estimated degree of interest among the plurality of objects, or of the objects other than that object.
- According to this configuration, display control of the screen can be performed according to the estimated degree of interest. For example, the display brightness of the objects other than the object having the highest estimated degree of interest among the plurality of objects can be reduced, so that the power consumption for displaying the plurality of objects can be reduced.
- the user interface device may be configured as an integrated circuit.
- the present invention can be realized not only as such a user interface device, but also as an input method in which the operation of the characteristic components included in such a user interface device is a step.
- the present invention can also be realized as a program that causes a computer to execute each step included in the input method.
- Such a program can be distributed via a computer-readable non-transitory recording medium such as a CD-ROM (Compact Disc Read-Only Memory) or via a transmission medium such as the Internet.
- the user's interest level can be estimated with high accuracy in order to perform input processing based on the user's interest level for a plurality of objects displayed on the screen.
- FIG. 1 is a block diagram showing a functional configuration of a user interface system according to an embodiment of the present invention.
- FIG. 2A is a diagram for explaining an arrangement example of cameras in the embodiment of the present invention.
- FIG. 2B is a diagram for explaining an installation example of the camera according to the embodiment of the present invention.
- FIG. 2C is a diagram for explaining an installation example of the camera according to the embodiment of the present invention.
- FIG. 3 is a flowchart showing an input method according to the embodiment of the present invention.
- FIG. 4 is a diagram for explaining the gaze direction detection processing in the embodiment of the present invention.
- FIG. 5 is a flowchart showing the flow of gaze direction detection processing in the embodiment of the present invention.
- FIG. 6 is a diagram for explaining the process of detecting the face orientation according to the embodiment of the present invention.
- FIG. 7 is a diagram for explaining the calculation of the line-of-sight direction reference plane in the embodiment of the present invention.
- FIG. 8 is a diagram for explaining detection of the center of the black eye in the embodiment of the present invention.
- FIG. 9 is a diagram for explaining the detection of the center of the black eye in the embodiment of the present invention.
- FIG. 10 is a diagram showing a display example of an object in the embodiment of the present invention.
- FIG. 11A is a diagram for describing an example of a method of correcting the line-of-sight dwell time according to the embodiment of the present invention.
- FIG. 11B is a diagram for describing an example of a method of correcting the line-of-sight dwell time according to the embodiment of the present invention.
- the user can easily turn his gaze toward an object with a high attracting strength. For example, when a user sees an object, the user looks at the main part of the object to understand what is drawn. At this time, if the main part of the object includes information that takes time to understand what is drawn, the gaze dwell time becomes long.
- the gaze dwell time includes not only the time when the user is gazing because the degree of interest in the object is high, but also the time when the gaze is directed because the attracting strength of the object is high. Therefore, the user interface device 10 according to the present embodiment estimates the degree of interest of the user using not only the line-of-sight dwell time but also the attracting strength of each object.
- FIG. 1 is a block diagram showing a functional configuration of a user interface system 100 according to the embodiment of the present invention.
- the user interface system 100 includes a user interface device 10, a display device 20, and a camera 30.
- the user interface device 10 estimates the degree of interest of the user in order to perform input processing based on the degree of interest of the user with respect to the plurality of objects displayed on the screen 26.
- the object is information displayed on a part or all of the screen 26.
- the object includes content such as a photograph, video, or text.
- the object includes an icon, a menu, a button, or the like that is used as a GUI (Graphical User Interface).
- the interest level is a value indicating the degree of interest of the user with respect to the contents of the object displayed on the screen 26. That is, the degree of interest is a value indicating how much the user is interested in the content of the object.
- the user interface device 10 includes a gaze direction detection unit 11, a gaze dwell time calculation unit 12, an attractiveness calculation unit 13, and an interest level estimation unit 14.
- the gaze direction detection unit 11 detects the gaze direction of the user. In the present embodiment, the gaze direction detection unit 11 detects the user's gaze direction from the image information generated by the camera 30.
- the line-of-sight direction is the direction of a straight line connecting the point that the user is gazing with the user's eyes. That is, the line-of-sight direction is a direction of a straight line connecting the gazing point on the user's screen 26 and the user's eyes.
- the line-of-sight residence time calculation unit 12 calculates the line-of-sight residence time for each of the plurality of objects displayed on the screen 26 as the line-of-sight residence time.
- the residence time is the time during which the line-of-sight direction stays within a certain range.
- the dwell time is the length of time that the gazing point on the screen 26 obtained from the line-of-sight direction is continuously present in the display area of the object, for example.
- the attraction strength calculator 13 calculates the attraction strength for each object.
- the attractive strength is a value indicating the strength of attractiveness. That is, the attracting strength is a value indicating the degree of ease of visual attention.
- the interest level estimation unit 14 estimates the user's interest level for each of the plurality of objects so that the interest level increases as the gaze residence time increases and the interest level increases as the attracting strength decreases.
- the degree-of-interest estimation unit 14 calculates a corrected line-of-sight dwell time in which the line-of-sight dwell time is corrected so as to decrease as the attracting strength increases. Then, the degree-of-interest estimation unit 14 estimates the degree of interest so that the degree of interest increases as the calculated corrected gaze residence time increases. Further, the interest level estimation unit 14 outputs the estimated interest level of the user to the display device 20.
- the display device 20 displays a plurality of objects on the screen 26.
- the display device 20 includes a screen control unit 25 and a screen 26.
- the screen control unit 25 performs display control of the screen 26 according to the estimated interest level. Specifically, the screen control unit 25 displays, for example, information on an object having the highest estimated interest level on the screen 26 among a plurality of objects. For example, when the object is movie content, the screen control unit 25 displays the story, director, or cast of the movie content as information about the object.
- the screen control unit 25 may change the display mode of the object having the highest estimated degree of interest among the plurality of objects. Specifically, the screen control unit 25 may change the display mode of the object, for example, by increasing the display area of the object, increasing the display brightness, or increasing the sharpness.
- the screen control unit 25 may change the display mode of the objects excluding the object having the highest estimated degree of interest among the plurality of objects. For example, the screen control unit 25 may reduce the display brightness of objects other than the object having the highest estimated degree of interest among the plurality of objects.
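- As an illustration of the display control just described, the sketch below keeps full brightness only for the object with the highest estimated degree of interest and dims the others. The `control_screen` function and its dictionaries are hypothetical, not an interface defined by the patent.

```python
# Hypothetical display-control sketch: emphasize the object with the highest estimated
# degree of interest and dim the others to reduce power consumption.

def control_screen(interest_by_object: dict[int, float],
                   brightness_by_object: dict[int, float],
                   dim_factor: float = 0.3) -> dict[int, float]:
    """Return a new display brightness per object (identifiers are illustrative)."""
    top = max(interest_by_object, key=interest_by_object.get)
    return {
        obj_id: brightness if obj_id == top else brightness * dim_factor
        for obj_id, brightness in brightness_by_object.items()
    }

# Example: object 2 has the highest estimated interest, so only it keeps full brightness.
print(control_screen({1: 0.4, 2: 1.8, 3: 0.9}, {1: 1.0, 2: 1.0, 3: 1.0}))
```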
- As a result, the display device 20 can display information suitable for the user on the screen 26 without an explicit instruction from the user. Therefore, the user interface system 100 can improve user convenience. Further, the display device 20 can reduce the display brightness of objects according to the degree of interest of the user, which also reduces the power consumption for displaying the plurality of objects.
- the screen 26 is, for example, a liquid crystal display panel or a plasma display panel. A plurality of objects are displayed on the screen 26 by the screen control unit 25.
- the camera 30 shoots the user and generates image information.
- the camera 30 includes an image sensor such as a CCD (Charge Coupled Device) or a CMOS (Complementary Metal Oxide Semiconductor).
- the camera 30 captures an image of a user existing in front of the screen 26 of the display device 20.
- the camera 30 is installed around the display device 20. Specifically, the camera 30 is installed at a position where the user existing in front of the screen 26 can be photographed. More specifically, the camera 30 is installed at a position shown in FIGS. 2A to 2C, for example.
- FIGS. 2A to 2C are diagrams for explaining an installation example of the camera 30 according to the embodiment of the present invention.
- the camera 30 is attached to the upper end portion of the screen 26 of the display device 20.
- the camera 30 may be installed in the vicinity of the screen 26 without being attached to the display device 20.
- the camera 30 may be installed on the wall surface or ceiling of a room in which the display device 20 is installed. That is, the camera 30 may be installed at an appropriate position in the space where the display device 20 is installed.
- FIG. 3 is a flowchart showing an input method according to the embodiment of the present invention.
- the gaze direction detection unit 11 detects the gaze direction of the user (S101). Thereafter, the line-of-sight residence time calculation unit 12 calculates the line-of-sight residence time for each of the plurality of objects displayed on the screen (S102).
- the attracting strength calculation unit 13 calculates the attracting strength for each object (S103).
- the degree-of-interest estimation unit 14 estimates the degree of interest of the user with respect to each of the plurality of objects so that the degree of interest increases as the gaze residence time increases and the degree of interest increases as the attracting strength decreases. (S104).
- FIG. 4 is a diagram for explaining the gaze direction detection processing in the embodiment of the present invention.
- The line-of-sight direction is calculated based on a combination of the orientation of the user's face (hereinafter simply referred to as "face orientation") and the direction of the black eye portion in the user's eyes (hereinafter simply referred to as "black-eye direction").
- FIG. 5 is a flowchart showing a flow of gaze direction detection processing in the embodiment of the present invention.
- the line-of-sight direction detection unit 11 acquires an image in which the camera 30 captures a user existing in front of the screen 26 (S501). Subsequently, the line-of-sight direction detection unit 11 detects a face area from the acquired image (S502). Next, the line-of-sight direction detection unit 11 applies the area of the face part feature point corresponding to each reference face direction to the detected face area, and cuts out the area image of each face part feature point (S503).
- the line-of-sight direction detection unit 11 calculates the degree of correlation between the clipped region image and the template image stored in advance (S504). Subsequently, the line-of-sight direction detection unit 11 obtains a weighted sum obtained by weighting and adding the angle indicated by each reference face direction in accordance with the calculated ratio of correlation degrees, and the user corresponding to the detected face area. The face orientation is detected (S505).
- FIG. 6 is a diagram for explaining the process of detecting the face orientation according to the embodiment of the present invention.
- First, the line-of-sight direction detection unit 11 reads the facial part feature point areas from the facial part area DB that stores the facial part feature point areas corresponding to each reference face direction. Subsequently, as shown in FIG. 6B, the line-of-sight direction detection unit 11 applies the facial part feature point areas to the face area of the photographed image for each reference face direction, and cuts out the region images of the facial part feature points for each reference face direction.
- The line-of-sight direction detection unit 11 calculates the degree of correlation between the clipped region image and the template image held in the face part region template DB for each reference face direction.
- the line-of-sight direction detection unit 11 calculates a weight for each reference face direction in accordance with the high degree of correlation indicated by the calculated degree of correlation. For example, the line-of-sight direction detection unit 11 calculates, as a weight, the ratio of the correlation degree of each reference face direction to the sum of the correlation degrees of the reference face direction.
- Finally, as shown in FIG. 6, the line-of-sight direction detection unit 11 calculates the sum of the values obtained by multiplying the angle indicated by each reference face direction by the corresponding weight, and detects this weighted sum as the face orientation of the user. For example, if the weight for the reference face direction of +20 degrees is "0.85" and the weight for the front direction (0 degrees) is "0.14", the face orientation is detected as the correspondingly weighted sum of the reference angles.
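- A minimal sketch of this weighted-sum step is shown below. The correlation values are made up so that the normalized weights come out close to the 0.85 and 0.14 mentioned in the example; only the normalization and the weighted sum follow the description above.

```python
# Face-orientation detection as a correlation-weighted sum of reference face directions.
# The correlation degrees below are illustrative values, not data from the patent.

reference_angles_deg = [-20.0, 0.0, 20.0]   # reference face directions
correlations = [0.01, 0.16, 0.95]           # correlation with each reference template

total = sum(correlations)
weights = [c / total for c in correlations] # roughly 0.01, 0.14, 0.85 after normalization

face_orientation = sum(w * a for w, a in zip(weights, reference_angles_deg))
print(round(face_orientation, 1))           # the weighted sum of the reference angles
```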
- In the present embodiment, the line-of-sight direction detection unit 11 calculates the degree of correlation for the facial part feature point region images, but the present invention is not limited to this; for example, the degree of correlation may be calculated for the entire face area image.
- As another method of detecting the face orientation there is a method of detecting facial part feature points such as eyes, nose and mouth from the face image and calculating the face orientation from the positional relationship of the facial part feature points.
- As a method of calculating the face orientation from the positional relationship of the facial part feature points, there is a method of rotating, enlarging, and reducing a previously prepared three-dimensional model of facial part feature points so that it best matches the facial part feature points obtained from one camera, and calculating the face orientation from the rotation amount of the matched three-dimensional model. As another method of calculating the face orientation from the positional relationship of the facial part feature points, there is a method of calculating the three-dimensional position of each facial part feature point from the displacement of the feature point positions between the images of the left and right cameras, using the principle of stereo vision based on images taken by two cameras, and calculating the face orientation from the positional relationship of the obtained facial part feature points. For example, there is a method of detecting, as the face orientation, the normal direction of the plane defined by the three-dimensional coordinate points of both eyes and the mouth.
- the gaze direction detection unit 11 detects the three-dimensional positions of the left and right eyes of the user using the stereo image captured by the camera 30, and calculates the gaze direction reference plane using the detected three-dimensional positions of the left and right eyes. (S506). Subsequently, the line-of-sight direction detection unit 11 detects the three-dimensional position of the center of the left and right eyes of the user using the stereo image captured by the camera 30 (S507). The line-of-sight direction detection unit 11 detects the black-eye direction using the line-of-sight direction reference plane and the three-dimensional position of the left and right black-eye centers (S508).
- the line-of-sight direction detection unit 11 detects the user's line-of-sight direction using the detected face direction and black-eye direction of the user (S509).
- the gaze direction detection unit 11 first calculates a gaze direction reference plane. Subsequently, the line-of-sight direction detection unit 11 detects the three-dimensional position of the center of the black eye. Finally, the gaze direction detection unit 11 detects the black eye direction.
- FIG. 7 is a diagram for explaining the calculation of the line-of-sight direction reference plane in the embodiment of the present invention.
- The line-of-sight direction reference plane is a plane that serves as a reference when detecting the black-eye direction, and, as shown in FIG. 7, it coincides with the left-right symmetric plane of the face.
- Compared with other facial parts such as the outer corner of the eye, the corner of the mouth, or the eyebrows, the position of the inner corner of the eye is less affected by facial expressions and is less likely to be erroneously detected. Therefore, the line-of-sight direction detection unit 11 calculates the line-of-sight direction reference plane, which is the symmetry plane of the face, using the three-dimensional positions of the inner corners of the eyes.
- Specifically, the line-of-sight direction detection unit 11 detects the regions of the inner corners of the right and left eyes in each of the two images (stereo images) taken by the stereo camera serving as the camera 30, using a face detection module and a facial part detection module. Then, the line-of-sight direction detection unit 11 measures the three-dimensional position of each inner eye corner using the positional shift (parallax) of the detected regions between the images. Further, as shown in FIG. 7, the line-of-sight direction detection unit 11 calculates, as the line-of-sight direction reference plane, the perpendicular bisector plane of the line segment whose end points are the detected three-dimensional positions of the left and right inner eye corners.
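- The perpendicular-bisector construction can be sketched as follows; the plane is represented by its midpoint and unit normal, and the signed-distance helper is reused later for the black-eye direction. The coordinate values are illustrative.

```python
import numpy as np

def gaze_reference_plane(inner_corner_left: np.ndarray, inner_corner_right: np.ndarray):
    """Perpendicular bisector plane of the segment joining the two inner eye corners.

    Returned as (point_on_plane, unit_normal); the normal points from the left
    inner corner toward the right one.
    """
    midpoint = (inner_corner_left + inner_corner_right) / 2.0
    normal = inner_corner_right - inner_corner_left
    normal = normal / np.linalg.norm(normal)
    return midpoint, normal

def signed_distance(point: np.ndarray, plane_point: np.ndarray, plane_normal: np.ndarray) -> float:
    """Signed distance of a 3-D point from the plane."""
    return float(np.dot(point - plane_point, plane_normal))

# Illustrative 3-D positions (millimetres, camera coordinates).
left_corner = np.array([-15.0, 0.0, 600.0])
right_corner = np.array([15.0, 0.0, 600.0])
plane_pt, plane_n = gaze_reference_plane(left_corner, right_corner)
```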
- FIGS. 8 and 9 are diagrams for explaining the detection of the center of the black eye in the embodiment of the present invention.
- When detecting the black-eye direction, the line-of-sight direction detection unit 11 first performs detection of the center of the black eye.
- Specifically, the gaze direction detection unit 11 first detects the positions of the outer and inner corners of the eye from the captured image. Then, the line-of-sight direction detection unit 11 detects a region with low luminance within the region including the outer and inner corners of the eye as the black-eye region. Specifically, the line-of-sight direction detection unit 11 detects, for example, a region whose luminance is equal to or less than a predetermined threshold and which is larger than a predetermined size as the black-eye region.
- Next, the line-of-sight direction detection unit 11 sets a black-eye detection filter composed of a first region and a second region, as shown in FIG. 8, at an arbitrary position in the black-eye region. Then, the line-of-sight direction detection unit 11 searches for the position of the black-eye detection filter that maximizes the inter-region variance between the luminance of the pixels in the first region and the luminance of the pixels in the second region, and detects the position indicated by the search result as the center of the black eye. Finally, the line-of-sight direction detection unit 11 detects the three-dimensional position of the center of the black eye using the shift of the black-eye center positions in the stereo images, as described above.
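- The search for the filter position that maximizes the inter-region variance can be sketched as below. The filter geometry (a circular first region surrounded by a ring-shaped second region) and the exact variance expression are assumptions for illustration; the text above only states that the filter consists of a first region and a second region.

```python
import numpy as np

def find_black_eye_center(gray_eye: np.ndarray, candidates, r_in: int = 4, r_out: int = 8):
    """Return the candidate (row, col) maximizing the between-region luminance variance.

    gray_eye   : 2-D array of luminance values around the eye
    candidates : iterable of (row, col) positions inside the detected black-eye region
    """
    rows, cols = np.indices(gray_eye.shape)
    best_pos, best_var = None, -1.0
    for (r, c) in candidates:
        dist2 = (rows - r) ** 2 + (cols - c) ** 2
        first = gray_eye[dist2 <= r_in ** 2]                            # inner (dark) region
        second = gray_eye[(dist2 > r_in ** 2) & (dist2 <= r_out ** 2)]  # surrounding region
        n1, n2 = first.size, second.size
        if n1 == 0 or n2 == 0:
            continue
        m1, m2 = first.mean(), second.mean()
        between_var = n1 * n2 * (m1 - m2) ** 2 / (n1 + n2) ** 2         # between-region variance
        if between_var > best_var:
            best_var, best_pos = between_var, (r, c)
    return best_pos
```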
- Next, the gaze direction detection unit 11 detects the black-eye direction using the calculated gaze direction reference plane and the detected three-dimensional position of the center of the black eye. It is known that there is almost no individual difference in the diameter of an adult eyeball; for example, in the case of Japanese adults, it is about 24 mm. Accordingly, if the position of the center of the black eye when the user faces a reference direction (for example, the front) is known, the black-eye direction can be calculated by obtaining the displacement from that position to the current position of the center of the black eye.
- Here, the gaze direction detection unit 11 uses the fact that, when the user faces the front, the midpoint of the centers of the left and right black eyes lies on the center of the face, that is, on the gaze direction reference plane.
- That is, the black-eye direction is detected by calculating how far the midpoint of the centers of the left and right black eyes deviates from the gaze direction reference plane.
- Specifically, the line-of-sight direction detection unit 11 detects the black-eye direction using the eyeball radius R and the distance d between the line-of-sight direction reference plane and the midpoint of the line segment connecting the centers of the left and right black eyes (hereinafter referred to as the "black-eye midpoint").
- the gaze direction detection unit 11 detects the black eye direction using the gaze direction reference plane and the three-dimensional position of the black eye center.
- the gaze direction detection unit 11 detects the gaze direction of the user in the real space using the detected face direction and the black eye direction.
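- One way to realize this geometry is sketched below: the left-right rotation of the eyes is recovered from the eyeball radius R and the signed distance d of the black-eye midpoint from the reference plane, and then combined with the face orientation. The arcsine relationship and the simple addition of the two angles are assumptions for illustration; the text above only states that R, d, the reference plane, and the face orientation are used.

```python
import math
import numpy as np

EYEBALL_RADIUS_MM = 12.0  # about half of the roughly 24 mm eyeball diameter mentioned above

def black_eye_direction_deg(black_eye_midpoint: np.ndarray,
                            plane_pt: np.ndarray, plane_n: np.ndarray,
                            radius_mm: float = EYEBALL_RADIUS_MM) -> float:
    """Left-right rotation of the eyes relative to the face, in degrees (assumed arcsine model).

    d is the signed distance of the black-eye midpoint from the gaze direction reference plane.
    """
    d = float(np.dot(black_eye_midpoint - plane_pt, plane_n))
    return math.degrees(math.asin(max(-1.0, min(1.0, d / radius_mm))))

def gaze_direction_deg(face_orientation_deg: float, black_eye_deg: float) -> float:
    """Combine face orientation and black-eye direction (a simple sum, as an assumption)."""
    return face_orientation_deg + black_eye_deg
```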
- the line-of-sight dwell time calculation unit 12 calculates the dwell time in the line-of-sight direction for each object on the screen 26 based on the line-of-sight direction detected by the line-of-sight direction detection unit 11. That is, the line-of-sight dwell time calculation unit 12 calculates the time during which the line-of-sight direction dwells in order to see the object on the screen 26 for each object.
- the gaze residence time calculation unit 12 acquires, for example, the three-dimensional position of the user's eyes. Then, the line-of-sight residence time calculation unit 12 calculates an intersection point between the screen 26 and a straight line extending in the line-of-sight direction from the three-dimensional position of the user's eyes. Further, the line-of-sight residence time calculation unit 12 calculates, as the line-of-sight residence time, for each object, the time during which the gazing point is continuously present in the display area of the object.
- Note that the line-of-sight residence time calculation unit 12 does not necessarily have to calculate the line-of-sight residence time as described above.
- the line-of-sight dwell time calculation unit 12 may calculate, as the line-of-sight dwell time, for each object, the time during which the gazing point is continuously present within a certain range centered on the display area of the object.
- Further, when the gazing point moves outside the display area of the object and returns to the display area within a predetermined time, the gaze residence time calculation unit 12 may calculate the gaze dwell time by regarding the gazing point as having been continuously present in the display area.
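- The per-object bookkeeping described above can be sketched as follows: the gazing point is taken as the intersection of the gaze ray with the screen plane, and time is accumulated while it stays inside an object's display area. The screen is assumed to be the plane z = 0 and the frame-based accumulation is an illustrative simplification.

```python
import numpy as np

def gaze_point_on_screen(eye_pos: np.ndarray, gaze_dir: np.ndarray):
    """Intersection of the gaze ray with the screen plane z = 0 (assumed geometry)."""
    if abs(gaze_dir[2]) < 1e-9:
        return None                      # gaze is parallel to the screen plane
    t = -eye_pos[2] / gaze_dir[2]
    if t <= 0:
        return None                      # the screen is behind the user
    p = eye_pos + t * gaze_dir
    return p[0], p[1]                    # (x, y) on the screen

def accumulate_dwell_times(samples, object_rects, frame_dt: float):
    """samples: per-frame (eye_pos, gaze_dir) pairs; object_rects: id -> (x0, y0, x1, y1)."""
    dwell = {obj_id: 0.0 for obj_id in object_rects}
    for eye_pos, gaze_dir in samples:
        point = gaze_point_on_screen(eye_pos, gaze_dir)
        if point is None:
            continue
        x, y = point
        for obj_id, (x0, y0, x1, y1) in object_rects.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[obj_id] += frame_dt   # gazing point is inside this object's display area
    return dwell
```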
- the attracting strength indicates the degree of ease of visual attention.
- the attractiveness intensity is a value indicating how easily the image of each object displayed on the screen draws the user's line of sight.
- The attracting strength depends on the physical characteristics of the image. For example, the colors and patterns of an image draw the user's line of sight more easily as their lightness or saturation increases. In addition, an image draws the user's line of sight more easily the more its physical features differ from those of the surrounding images.
- The attracting strength also depends on the psychological distance between the user and the object. For example, an image that is more closely related to the user, or one that the user has viewed a larger number of times, tends to draw the user's line of sight more easily.
- the psychological distance indicates a psychological relationship between the user and the object.
- the psychological distance becomes shorter as the psychological relationship between the user and the object is higher.
- the attractiveness strength depends on the physical positional relationship between the user and the object. That is, the attractiveness intensity is also affected by the positional relationship between the position where the user exists and the position where the object is displayed. For example, an image displayed in front of the user tends to draw the user's line of sight.
- the attracting strength calculation unit 13 calculates the attracting strength A (i) of the object i as shown in Equation 2.
- Here, Aimage(i) represents the attracting strength based on the physical characteristics of the image of the object i. That is, the attracting strength calculation unit 13 calculates the attracting strength for each object based on the physical characteristics of the image of the object.
- Apsy (i) represents the attractiveness intensity based on the psychological distance between the user and the object i. That is, for each object, the attracting strength calculation unit 13 calculates the attracting strength so that the attracting strength increases as the psychological distance between the object and the user decreases.
- Aphy (i) represents the attractiveness intensity based on the physical positional relationship between the user and the object i. That is, the attraction strength calculator 13 calculates the attraction strength for each object based on the physical positional relationship between the object and the user.
- a1, a2, and a3 are adjustment parameters for adjusting the influence of each term on the attractive strength.
- a predetermined numerical value of 0 or more is set in a1, a2, and a3.
- 0 may be set to a2 and a3, and a numerical value greater than 0 may be set to a1. That is, a numerical value larger than 0 may be set in at least one of a1, a2, and a3.
- That is, the attracting strength calculation unit 13 may calculate the attracting strength based on at least one of the physical characteristics of the image of the object, the psychological distance between the object and the user, and the physical positional relationship between the object and the user.
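- Note that the body of Equation 2 is not reproduced in this text. Based on the description of its terms, a plausible reconstruction (an assumption, not a quotation of the patent) is the weighted sum below, which is consistent with a1, a2, and a3 being per-term adjustment parameters.
- A(i) = a1 × Aimage(i) + a2 × Apsy(i) + a3 × Aphy(i) (plausible form of Equation 2)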
- Here, i1, i2, s1, s2, s3, h1, and h2 are adjustment parameters for adjusting the influence of each term on the attracting strength, and predetermined numerical values of 0 or more are set for them.
- complex (i) represents the complexity of the image of the object i. Therefore, for each object, the attracting strength calculation unit 13 calculates the attracting strength so that the attracting strength increases as the complexity increases, based on the complexity of the image of the object.
- hetero (i) represents the degree of heterogeneity of the image of the object i with respect to the image around the object i. Therefore, for each object, the attracting strength calculation unit 13 calculates the attracting strength so that the attracting strength increases as the degree of heterogeneity increases, based on the dissimilarity of the image of the object with respect to the surrounding image of the object.
- e_st(i) is "1" if the user is the creator of the object i, and "0" otherwise. That is, for each object, based on whether or not the user is the creator of the object, the attracting strength calculation unit 13 calculates the psychological distance so that it becomes shorter when the user is the creator of the object than when the user is not the creator.
- For example, the user interface device 10 may receive in advance, from the user, input of identification information for identifying the user, and determine whether or not the user is the creator of the object based on the identification information.
- information for identifying the creator of an object may be stored in advance in a storage unit (not shown) in association with the object, for example.
- e_sb(i) is "1" if the user is a subject displayed in the object i, and "0" otherwise. That is, for each object, based on whether or not the user is a subject displayed in the object, the attracting strength calculation unit 13 calculates the psychological distance so that it becomes shorter when the user is a subject than when the user is not a subject.
- For example, the user interface device 10 may receive in advance, from the user, input of identification information for identifying the user, and determine, based on the identification information, whether or not the user is a subject displayed in the object.
- information for identifying a subject displayed in an object may be stored in advance in a storage unit (not shown) in association with the object, for example.
- e_w(i) represents the number of times the user has visually recognized the object i (hereinafter simply referred to as the "viewing count"). Therefore, the attracting strength calculation unit 13 calculates the psychological distance for each object so that the psychological distance becomes shorter as the number of times the user has viewed the object in the past increases.
- dist (i) represents a physical distance between the user and the object i. Therefore, for each object, the attracting strength calculation unit 13 calculates the attracting strength so that the attracting strength increases as the physical distance decreases, based on the physical distance between the object and the user.
- ang(i) represents the angle formed by the line connecting the object i and the user and the normal of the screen 26. Therefore, for each object, the attracting strength calculation unit 13 calculates the attracting strength so that the attracting strength increases as the angle decreases, based on the angle between the line connecting the object and the user and the normal of the screen 26.
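- Putting the terms above together, the following hedged sketch computes the three components of the attracting strength. The exact functional forms of the component equations are not reproduced in this text, so the linear combinations and the inverse-distance and inverse-angle forms below are assumptions; only the directions of the dependencies (more complex, more heterogeneous, psychologically closer, physically closer, smaller angle means a larger attracting strength) come from the description.

```python
# Hedged sketch of the attracting-strength components. The linear/inverse forms are
# assumptions; only the monotonic dependencies are taken from the description above.

def a_image(complexity: float, heterogeneity: float, i1: float = 1.0, i2: float = 1.0) -> float:
    # Higher complexity and higher heterogeneity -> higher image-based attracting strength.
    return i1 * complexity + i2 * heterogeneity

def a_psy(user_is_creator: bool, user_is_subject: bool, view_count: int,
          s1: float = 1.0, s2: float = 1.0, s3: float = 0.1) -> float:
    # Shorter psychological distance (creator, subject, frequently viewed) -> higher value.
    return s1 * float(user_is_creator) + s2 * float(user_is_subject) + s3 * view_count

def a_phy(distance_m: float, angle_deg: float, h1: float = 1.0, h2: float = 1.0) -> float:
    # Shorter physical distance and smaller angle from the screen normal -> higher value.
    return h1 / (1.0 + distance_m) + h2 / (1.0 + abs(angle_deg))

def attracting_strength(obj: dict, a1: float = 1.0, a2: float = 1.0, a3: float = 1.0) -> float:
    return (a1 * a_image(obj["complex"], obj["hetero"])
            + a2 * a_psy(obj["is_creator"], obj["is_subject"], obj["view_count"])
            + a3 * a_phy(obj["dist"], obj["ang"]))
```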
- the attracting strength calculation unit 13 divides the image of the object i into regions using a known image processing technique. Then, the attracting strength calculation unit 13 calculates complex (i) so that the complexity increases as the number of regions increases, according to the number of regions obtained by region division.
- the attracting strength calculation unit 13 may calculate the complexity using, for example, a method disclosed in Patent Document 3 (Japanese Patent Laid-Open No. 2007-18025).
- the method disclosed in Patent Document 3 is merely an example of a method for calculating the complexity of an image, and the present invention is not limited to this.
- the attracting strength calculation unit 13 calculates the degree of difference between the color and texture in the image around the object i and the color and texture in the image of the object i as the degree of heterogeneity.
- the color and texture in each image are, for example, the color and texture that occupy the largest area in the image.
- For example, the attracting strength calculation unit 13 may calculate the degree of heterogeneity using the method disclosed in Non-Patent Document 2 (Shoji Tanaka, Seiji Iguchi, Yuichi Iwabuchi, Ryohei Nakatsu: "Attracting degree evaluation model based on physical characteristics of image area", IEICE Transactions A, Vol. J83-A, No. 5, pp. 576-588, 2000).
- the method disclosed in Non-Patent Document 2 is merely an example of a method for calculating the degree of heterogeneity, and the present invention is not limited to this.
- Hereinafter, the method for calculating the degree of heterogeneity disclosed in Non-Patent Document 2 will be described.
- the heterogeneity of the physical feature amount is calculated by the following equation 7 where d is the difference between the feature amount and the average feature amount of the entire region, dm is the average value of d, and std is the standard deviation of d.
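- The body of Equation 7 is likewise missing from this text. Given that d is the deviation of a feature amount from the average, dm is the average of d, and std is the standard deviation of d, a plausible reading (an assumption, not a quotation of the patent) is the standardized score:
- H = (d − dm) / std (plausible form of Equation 7)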
- the attractiveness calculation unit 13 calculates the color difference between the average color of the image area of the object i and the average color of the entire screen including the image of the surrounding objects, the color difference average, and the standard deviation of the color difference. Then, the attraction intensity calculation unit 13 substitutes the calculation result into Equation 7 to calculate the color heterogeneity HC (i) of the image of the object i.
- Note that the color difference may be calculated using, for example, the color difference formula in the CIE L*a*b* color space established by the CIE (International Commission on Illumination).
- Further, the attracting strength calculation unit 13 may calculate the texture feature amounts using, for example, the method disclosed in Non-Patent Document 3 (B. S. Manjunath, W. Y. Ma: "Texture features for browsing and retrieval of image data", IEEE Trans. Pattern Anal. Mach. Intell., Vol. 18, No. 8, pp. 837-842, 1996).
- the attractiveness calculation unit 13 calculates the Euclidean distance between the texture feature vectors as the difference between the texture feature amounts. Then, the attracting strength calculation unit 13 substitutes the distance between the vector of the image of the object i and the average vector of the entire screen including the images of the objects around the object i, the average of the distance, and the standard deviation of the distance into Equation 7. Then, the texture heterogeneity HT (i) of the image of the object i is calculated.
- the interest level estimation unit 14 interests the user with respect to each of the plurality of objects displayed on the screen 26 so that the interest level increases as the gaze residence time increases and the interest level increases as the attracting strength decreases. Estimate the degree.
- the degree-of-interest estimation unit 14 first calculates the corrected gaze dwell time by correcting the gaze dwell time so that the greater the attractiveness strength is, the shorter the gaze dwell time is. Then, the degree-of-interest estimation unit 14 estimates the degree of interest so that the degree of interest increases as the calculated corrected gaze residence time increases.
- FIG. 10 is a diagram showing a display example of objects in the embodiment of the present invention.
- FIG. 11A and FIG. 11B are diagrams for explaining an example of a method of correcting the line-of-sight dwell time according to the embodiment of the present invention.
- FIG. 11A is a diagram for explaining a method of calculating a corrected gaze dwell time by subtracting a time corresponding to the attractive strength from the gaze dwell time.
- the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time according to the following formula 8. That is, the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time by subtracting the time corresponding to the attractiveness intensity from the gaze dwell time.
- Tc(i) = T(i) − A(i) × ga (ga > 0) (Formula 8)
- T (i) represents the gaze dwell time for the object i.
- Tc (i) represents the corrected gaze dwell time for the object i.
- ga is an adjustment parameter for adjusting the correction amount.
- the degree-of-interest estimation unit 14 does not necessarily need to calculate the corrected gaze dwell time by subtracting the time corresponding to the attracting intensity from the gaze dwell time.
- the degree-of-interest estimation unit 14 may calculate the corrected line-of-sight dwell time by dividing the line-of-sight dwell time by using a value corresponding to the attractiveness intensity.
- FIG. 11B is a diagram for explaining a method of calculating the corrected gaze dwell time by dividing the gaze dwell time by a value according to the attractive strength.
- the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time according to the following formula 9. That is, the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time by dividing the gaze dwell time by using a value corresponding to the attractive strength. That is, the degree-of-interest estimation unit 14 corrects the gaze residence time by contracting the gaze residence time according to the attractiveness intensity.
- Tc(i) = T(i) / (A(i) × gb) (gb > 0) (Formula 9)
- gb is an adjustment parameter for adjusting the correction amount.
- the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time by correcting the gaze dwell time so that the greater the attractiveness strength is, the shorter the gaze dwell time is.
- the degree-of-interest estimation unit 14 estimates the degree of interest using the corrected gaze dwell time calculated above as shown in the following equation 10. That is, the degree-of-interest estimation unit 14 estimates the degree of interest so that the degree of interest increases as the corrected gaze dwell time increases.
- I (i) represents the degree of interest.
- K is an adjustment parameter for adjusting the magnitude of the value of interest.
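- The following hedged sketch ties Formula 8, Formula 9, and the interest estimation together. The specific form I(i) = K × Tc(i) is an assumption for Equation 10, whose body is not reproduced in this text; it is consistent with K being a magnitude-adjustment parameter and with the degree of interest growing with the corrected gaze dwell time.

```python
def corrected_dwell_subtract(t: float, a: float, ga: float = 1.0) -> float:
    """Formula 8 (with an added clamp at zero for illustration): T(i) - A(i) * ga."""
    return max(0.0, t - a * ga)

def corrected_dwell_divide(t: float, a: float, gb: float = 1.0) -> float:
    """Formula 9: shrink the dwell time by a factor proportional to the attracting strength."""
    return t / (a * gb) if a * gb > 0 else t

def interest(tc: float, k: float = 1.0) -> float:
    """Assumed form of Equation 10: interest grows linearly with the corrected dwell time."""
    return k * tc

# Example: equal raw dwell times, but the less eye-catching object gets the higher interest.
for attracting in (0.5, 2.0):
    print(interest(corrected_dwell_subtract(3.0, attracting)))
```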
- As described above, instead of using the gaze dwell time as it is, the degree of interest is estimated from the corrected gaze dwell time obtained by correcting the gaze dwell time using the attracting strength of the object.
- This makes it possible to suppress the influence of the portion of the gaze dwell time that is unrelated to the degree of interest (for example, the time spent gazing at an object simply because its contents are complicated and cannot be understood at a glance, or because its stimulus intensity is strong).
- As described above, the user interface device 10 can estimate the degree of interest of the user based on the attracting strength of each object in addition to the gaze dwell time. If the attracting strength of an object is high, the user turns his or her gaze toward it even if he or she is not interested in it. That is, the gaze dwell time varies not only according to the degree of interest of the user with respect to the object but also according to the attracting strength of the object. Therefore, the user interface device 10 can suppress the influence that the portion of the gaze dwell time unrelated to the user's degree of interest has on the estimation, and can estimate the degree of interest of the user with respect to the object with high accuracy.
- the user interface device 10 can calculate the attracting strength based on the physical characteristics of the object.
- Here, the attracting strength depends on the physical characteristics of the object (for example, the complexity or heterogeneity of the image of the object). For example, if the complexity of the image of the object is high, the line of sight is directed toward the object for a long time in order to understand its contents. Therefore, by calculating the attracting strength based on the physical characteristics of the object, the user interface device 10 can calculate the attracting strength accurately and can estimate the degree of interest of the user with high accuracy.
- the user interface device 10 can calculate the attractiveness intensity based on the psychological distance between the object and the user.
- Here, the attracting strength depends on the psychological distance between the object and the user (which reflects, for example, the number of times the user has viewed the object, whether the user is the creator of the object, and whether the user is displayed in the object). Therefore, by calculating the attracting strength based on the psychological distance between the object and the user, the user interface device 10 can calculate the attracting strength accurately and can estimate the degree of interest of the user with high accuracy.
- the user interface device 10 can calculate the attractiveness intensity based on the physical positional relationship between the object and the user.
- Here, the attracting strength depends on the physical positional relationship between the object and the user (for example, the physical distance between the object and the user, or the direction of the object with respect to the user). For example, if the physical distance between the object and the user is short, the line of sight is directed toward the object for a long time. Therefore, by calculating the attracting strength based on the physical positional relationship between the object and the user, the user interface device 10 can calculate the attracting strength accurately and can estimate the degree of interest of the user with high accuracy.
- the user interface device 10 can estimate the degree of interest based on the gaze dwell time corrected according to the attractiveness intensity. Therefore, the user interface device 10 can remove the time not related to the user's interest level from the line-of-sight residence time, and thus can estimate the user's interest level with high accuracy.
- the user interface system 100 has been described based on the embodiments.
- However, the present invention is not limited to this embodiment. Various modifications of the present embodiment conceived by those skilled in the art without departing from the gist of the present invention are also included in the scope of the present invention.
- For example, in the above embodiment, the line-of-sight direction detection unit 11 detects the line-of-sight direction based on the image information generated by the camera 30, but the method of detecting the line-of-sight direction is of course not limited to one based on image information.
- Examples of methods for detecting the line-of-sight direction include a method using a measurement device attached to the user (first method) and a method using a non-contact device such as an infrared light source (second method).
- Examples of the first method include an EOG (electro-oculography) method that detects eye movement from changes in the cornea-retina potential obtained from electrodes attached to the head, a search coil method that detects eye movement from the induced current generated in a contact lens incorporating a coil, and a method using a wearable eye camera mounted on a helmet or glasses.
- An example of the second method is the corneal reflection method, in which the eye is irradiated with a near-infrared point light source and the line-of-sight direction is estimated from the position of the pupil and the Purkinje image reflected from the cornea.
- the degree-of-interest estimation unit 14 calculates the corrected gaze dwell time. However, it is not always necessary to calculate the corrected gaze dwell time. For example, the degree-of-interest estimation unit 14 may correct the estimated degree of interest using the attracting strength after estimating the degree of interest based on the gaze residence time.
- the display device 20 includes the screen control unit 25.
- the user interface device 10 may include the screen control unit 25.
- the user interface device 10 may be called a screen control device.
- the user interface device 10 may include a screen 26.
- the user interface device 10 may be called a display device.
- the display device 20 controls the display of the screen according to the estimated degree of interest, but it is not always necessary to perform the display control of the screen.
- the display device 20 may output a sound according to the degree of interest.
- the display device 20 may output a sound (for example, a machine sound) indicating information on an object having the highest degree of interest among a plurality of objects.
- the user interface device 10 is arranged outside the display device 20, but may be built in the display device 20.
- the above user interface device is specifically a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), a hard disk unit, a display unit, a keyboard, a mouse, and the like.
- A computer program is stored in the RAM or the hard disk unit, and each device achieves its functions by the microprocessor operating according to that computer program. The computer program is composed of a combination of instruction codes, each indicating an instruction to the computer, arranged to achieve a predetermined function.
- Each device is not limited to a computer system including all of a microprocessor, a ROM, a RAM, a hard disk unit, a display unit, a keyboard, a mouse, and the like; it may be a computer system including only some of them.
- Part or all of the components constituting the user interface device may be implemented as a single system LSI (Large Scale Integration). A system LSI is a super-multifunctional LSI manufactured by integrating a plurality of components on one chip; specifically, it is a computer system including a microprocessor, a ROM, a RAM, and the like.
- A computer program is stored in the RAM, and the system LSI achieves its functions by the microprocessor operating according to that computer program.
- The system LSI may also be called an IC, an LSI, a super LSI, or an ultra LSI depending on the degree of integration.
- The method of circuit integration is not limited to LSI; implementation using dedicated circuitry or a general-purpose processor is also possible. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.
- Part or all of the components constituting the user interface device may be configured as an IC card that can be attached to and detached from the user interface device, or as a stand-alone module.
- The IC card or the module is a computer system including a microprocessor, a ROM, a RAM, and the like, and may include the super-multifunctional LSI described above. The IC card or the module achieves its functions by the microprocessor operating according to the computer program, and may be tamper resistant.
- The present invention may be the input method described above. It may also be a computer program that causes a computer to execute the input method, or a digital signal composed of the computer program.
- The present invention may also be a computer-readable non-transitory recording medium, such as a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD (Blu-ray Disc (registered trademark)), or a semiconductor memory, on which the computer program or the digital signal is recorded. Further, the present invention may be the computer program or the digital signal itself recorded on these recording media.
- The computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network such as the Internet, a data broadcast, or the like.
- The present invention may also be a computer system including a microprocessor and a memory, in which the memory stores the computer program and the microprocessor operates according to the computer program. Furthermore, the program or the digital signal may be executed by another independent computer system, either by recording it on the recording medium and transferring it, or by transferring it via the network or the like.
- The user interface device according to the present invention is useful as a user interface device that performs input processing based on a user's degree of interest in a plurality of objects displayed on a screen, and can also be applied to applications such as measuring advertising effectiveness in digital signage.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Vision & Pattern Recognition (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
First, the line-of-sight direction detection process (S101) is described in detail with reference to FIGS. 4 to 9.
R: eyeball radius (12 mm)
d: distance between the line-of-sight direction reference plane and the midpoint of the irises
Next, the gaze dwell time calculation process (S102) is described in detail.
Next, the attractive strength calculation process (S103) is described in detail.
Apsy(i) = s1 × e_st(i) + s2 × e_sb(i) + s3 × e_w(i) (Equation 5)
Aphy(i) = h1 / dist(i) + h2 / ang(i) (Equation 6)
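Equations 5 and 6 translate directly into code once the per-object feature values are available. In the sketch below, the weights s1 to s3 and h1, h2 are placeholder values, and e_st, e_sb, e_w (image-feature terms), dist (physical distance to the user) and ang (angle between the object-user line and the screen normal) are assumed to be supplied by upstream image analysis and position measurement.

```python
def psychological_attractiveness(e_st, e_sb, e_w, s1=1.0, s2=1.0, s3=1.0):
    # Equation 5: Apsy(i) = s1*e_st(i) + s2*e_sb(i) + s3*e_w(i)
    return s1 * e_st + s2 * e_sb + s3 * e_w

def physical_attractiveness(dist, ang, h1=1.0, h2=1.0):
    # Equation 6: Aphy(i) = h1/dist(i) + h2/ang(i); the closer the object and
    # the smaller the viewing angle, the larger the attractive strength.
    return h1 / dist + h2 / ang

# Hypothetical object 1.5 m away, 20 degrees off the screen normal
print(physical_attractiveness(dist=1.5, ang=20.0))  # ~0.72
```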
Next, the degree-of-interest estimation process (S104) is described in detail.
First, the correction of the gaze dwell time is described with reference to FIGS. 10 to 11B.
Next, the degree-of-interest estimation unit 14 estimates the degree of interest according to Equation 10, using the corrected gaze dwell time calculated as described above. That is, the degree-of-interest estimation unit 14 estimates the degree of interest such that the longer the corrected gaze dwell time, the higher the degree of interest.
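Equation 10 itself is not reproduced in this excerpt; purely as an illustrative assumption, the sketch below treats the degree of interest as the corrected gaze dwell time normalized over all displayed objects and then selects the object with the highest estimated interest, which is the kind of result the screen control unit 25 would act on.

```python
def estimate_interest(corrected_times):
    """Map per-object corrected gaze dwell times to degrees of interest.

    corrected_times: dict mapping object id -> corrected dwell time (s).
    Returns a dict mapping object id -> degree of interest in [0, 1];
    the longer the corrected dwell time, the higher the degree of interest.
    """
    total = sum(corrected_times.values())
    if total == 0:
        return {obj: 0.0 for obj in corrected_times}
    return {obj: t / total for obj, t in corrected_times.items()}

interest = estimate_interest({"obj_a": 1.0, "obj_b": 2.75, "obj_c": 0.5})
most_interesting = max(interest, key=interest.get)  # -> "obj_b"
print(interest, most_interesting)
```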
11 Line-of-sight direction detection unit
12 Gaze dwell time calculation unit
13 Attractive strength calculation unit
14 Degree-of-interest estimation unit
20 Display device
25 Screen control unit
26 Screen
30 Camera
100 User interface system
Claims (20)
- Claim 1. A user interface device that estimates a user's degree of interest in order to perform input processing based on the user's degree of interest in a plurality of objects displayed on a screen, the user interface device comprising: a line-of-sight direction detection unit that detects the line-of-sight direction of the user; a gaze dwell time calculation unit that calculates, as a gaze dwell time, a dwell time of the line-of-sight direction on each of the plurality of objects; an attractive strength calculation unit that calculates an attractive strength for each of the objects; and a degree-of-interest estimation unit that estimates the user's degree of interest in each of the plurality of objects such that the longer the gaze dwell time, the higher the degree of interest, and the smaller the attractive strength, the higher the degree of interest.
- Claim 2. The user interface device according to claim 1, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on physical features of the image of the object.
- Claim 3. The user interface device according to claim 2, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on the complexity of the image of the object such that the higher the complexity, the larger the attractive strength.
- Claim 4. The user interface device according to claim 2 or 3, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on the heterogeneity of the image of the object with respect to the surrounding image such that the higher the heterogeneity, the larger the attractive strength.
- Claim 5. The user interface device according to any one of claims 1 to 4, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength such that the shorter the psychological distance between the object and the user, the larger the attractive strength.
- Claim 6. The user interface device according to claim 5, wherein the attractive strength calculation unit calculates, for each of the objects, the psychological distance such that the more times the user has viewed the object in the past, the shorter the psychological distance.
- Claim 7. The user interface device according to claim 5 or 6, wherein the attractive strength calculation unit calculates, for each of the objects, the psychological distance based on whether the user is the creator of the object, such that the psychological distance is shorter when the user is the creator of the object than when the user is not.
- Claim 8. The user interface device according to any one of claims 5 to 7, wherein the attractive strength calculation unit calculates, for each of the objects, the psychological distance based on whether the user is a subject shown in the object, such that the psychological distance is shorter when the user is a subject than when the user is not.
- Claim 9. The user interface device according to any one of claims 1 to 8, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on the physical positional relationship between the object and the user.
- Claim 10. The user interface device according to claim 9, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on the physical distance between the object and the user such that the shorter the physical distance, the larger the attractive strength.
- Claim 11. The user interface device according to claim 9 or 10, wherein the attractive strength calculation unit calculates, for each of the objects, the attractive strength based on the angle between the line connecting the object and the user and the normal of the screen such that the smaller the angle, the larger the attractive strength.
- Claim 12. The user interface device according to any one of claims 1 to 11, wherein the degree-of-interest estimation unit calculates a corrected gaze dwell time by correcting the gaze dwell time such that the larger the attractive strength, the shorter the corrected gaze dwell time, and estimates the degree of interest such that the longer the calculated corrected gaze dwell time, the higher the degree of interest.
- Claim 13. The user interface device according to claim 12, wherein the degree-of-interest estimation unit calculates the corrected gaze dwell time by subtracting, from the gaze dwell time, a time corresponding to the attractive strength.
- Claim 14. The user interface device according to claim 12, wherein the degree-of-interest estimation unit calculates the corrected gaze dwell time by dividing the gaze dwell time by a value corresponding to the attractive strength.
- Claim 15. The user interface device according to any one of claims 1 to 14, further comprising a screen control unit that performs display control of the screen according to the estimated degree of interest.
- Claim 16. The user interface device according to claim 15, wherein the screen control unit causes the screen to display information on the object with the highest estimated degree of interest among the plurality of objects.
- Claim 17. The user interface device according to claim 15, wherein the screen control unit changes the display mode of the object with the highest estimated degree of interest among the plurality of objects, or the display mode of the objects other than the object with the highest estimated degree of interest among the plurality of objects.
- Claim 18. The user interface device according to any one of claims 1 to 17, wherein the user interface device is configured as an integrated circuit.
- Claim 19. An input method in which a computer estimates a user's degree of interest in order to perform input processing based on the user's degree of interest in a plurality of objects displayed on a screen, the input method comprising: a line-of-sight direction detection step of detecting the line-of-sight direction of the user; a gaze dwell time calculation step of calculating, as a gaze dwell time, a dwell time of the detected line-of-sight direction on each of the plurality of objects; an attractive strength calculation step of calculating an attractive strength for each of the objects; and a degree-of-interest estimation step of estimating the user's degree of interest in each of the plurality of objects such that the longer the gaze dwell time, the higher the degree of interest, and the smaller the attractive strength, the higher the degree of interest.
- Claim 20. A program for causing a computer to execute the input method according to claim 19.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011545938A JP5602155B2 (ja) | 2009-12-14 | 2010-12-02 | ユーザインタフェース装置および入力方法 |
CN201080006051.5A CN102301316B (zh) | 2009-12-14 | 2010-12-02 | 用户界面装置以及输入方法 |
EP10837235.0A EP2515206B1 (en) | 2009-12-14 | 2010-12-02 | User interface apparatus and input method |
US13/147,246 US8830164B2 (en) | 2009-12-14 | 2010-12-02 | User interface device and input method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-282491 | 2009-12-14 | ||
JP2009282491 | 2009-12-14 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011074198A1 true WO2011074198A1 (ja) | 2011-06-23 |
Family
ID=44166973
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/007026 WO2011074198A1 (ja) | 2009-12-14 | 2010-12-02 | ユーザインタフェース装置および入力方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US8830164B2 (ja) |
EP (1) | EP2515206B1 (ja) |
JP (1) | JP5602155B2 (ja) |
CN (1) | CN102301316B (ja) |
WO (1) | WO2011074198A1 (ja) |
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2494235A (en) * | 2011-08-30 | 2013-03-06 | Gen Electric | Gaze- and/or pose-dependent interactive advertising |
KR101490505B1 (ko) * | 2014-07-08 | 2015-02-10 | 주식회사 테라클 | 관심도 생성 방법 및 장치 |
JP2015039487A (ja) * | 2013-08-21 | 2015-03-02 | 大日本印刷株式会社 | 生理指標を用いる視線分析システムおよび方法 |
JP2015052879A (ja) * | 2013-09-06 | 2015-03-19 | カシオ計算機株式会社 | 情報処理装置及びプログラム |
JP2015127897A (ja) * | 2013-12-27 | 2015-07-09 | ソニー株式会社 | 表示制御装置、表示制御システム、表示制御方法、およびプログラム |
JP2016046730A (ja) * | 2014-08-25 | 2016-04-04 | 学校法人早稲田大学 | 視聴者注目情報提供システム、時空間マーカ設定装置及びそのプログラム、並びに、情報提供装置及びそのプログラム |
JP5961736B1 (ja) * | 2015-08-17 | 2016-08-02 | 株式会社コロプラ | ヘッドマウントディスプレイシステムを制御する方法、および、プログラム |
JP2017041229A (ja) * | 2016-06-08 | 2017-02-23 | 株式会社コロプラ | ヘッドマウントディスプレイシステムを制御する方法、および、プログラム |
JP2017182628A (ja) * | 2016-03-31 | 2017-10-05 | 株式会社エヌ・ティ・ティ・データ | 拡張現実ユーザインタフェース適用装置および制御方法 |
CN108489507A (zh) * | 2018-03-20 | 2018-09-04 | 京东方科技集团股份有限公司 | 导航装置及其工作方法、计算机可读存储介质 |
JP2020038336A (ja) * | 2019-02-19 | 2020-03-12 | オムロン株式会社 | 情報処理装置、情報処理方法、および情報処理プログラム |
JP2020038562A (ja) * | 2018-09-05 | 2020-03-12 | オムロン株式会社 | 情報処理装置、情報処理方法、および情報処理プログラム |
US10642353B2 (en) | 2017-07-19 | 2020-05-05 | Fujitsu Limited | Non-transitory computer-readable storage medium, information processing apparatus, and information processing method |
WO2024047990A1 (ja) * | 2022-09-02 | 2024-03-07 | キヤノン株式会社 | 情報処理装置 |
US11972619B2 (en) | 2019-02-20 | 2024-04-30 | Evident Corporation | Information processing device, information processing system, information processing method and computer-readable recording medium |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011114567A1 (ja) * | 2010-03-18 | 2011-09-22 | 富士フイルム株式会社 | 立体表示装置及び立体撮影装置、並びに利き目判定方法及びこれに用いる利き目判定プログラム並びに記録媒体 |
WO2012105196A1 (ja) * | 2011-02-04 | 2012-08-09 | パナソニック株式会社 | 関心度推定装置および関心度推定方法 |
WO2012175785A1 (en) * | 2011-06-20 | 2012-12-27 | Nokia Corporation | Methods, apparatuses and computer program products for performing accurate pose estimation of objects |
US20130138499A1 (en) * | 2011-11-30 | 2013-05-30 | General Electric Company | Usage measurent techniques and systems for interactive advertising |
WO2013085193A1 (ko) * | 2011-12-06 | 2013-06-13 | 경북대학교 산학협력단 | 사용자 인지 향상 장치 및 그 인지 향상 방법 |
EP2795425A4 (en) * | 2011-12-23 | 2015-08-26 | Thomson Licensing | COMPUTER DEVICE WITH POWER CONSUMPTION MANAGEMENT AND METHOD FOR MANAGING THE POWER CONSUMPTION OF A COMPUTER DEVICE |
US9024844B2 (en) * | 2012-01-25 | 2015-05-05 | Microsoft Technology Licensing, Llc | Recognition of image on external display |
KR101922589B1 (ko) * | 2012-02-15 | 2018-11-27 | 삼성전자주식회사 | 디스플레이장치 및 그 시선추적방법 |
KR101408591B1 (ko) * | 2012-04-27 | 2014-06-17 | 삼성전기주식회사 | 무안경 3차원 영상 디스플레이 장치 및 방법 |
US9030505B2 (en) * | 2012-05-17 | 2015-05-12 | Nokia Technologies Oy | Method and apparatus for attracting a user's gaze to information in a non-intrusive manner |
JP6007600B2 (ja) * | 2012-06-07 | 2016-10-12 | ソニー株式会社 | 画像処理装置、画像処理方法およびプログラム |
US8965624B2 (en) | 2012-08-14 | 2015-02-24 | Ebay Inc. | Method and system of vehicle tracking portal |
CN103870146B (zh) * | 2012-12-17 | 2020-06-23 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
JP2014157466A (ja) * | 2013-02-15 | 2014-08-28 | Sony Corp | 情報処理装置及び記憶媒体 |
US11228805B2 (en) * | 2013-03-15 | 2022-01-18 | Dish Technologies Llc | Customized commercial metrics and presentation via integrated virtual environment devices |
KR102098277B1 (ko) | 2013-06-11 | 2020-04-07 | 삼성전자주식회사 | 시선 추적을 이용한 시인성 개선 방법, 저장 매체 및 전자 장치 |
JP5420793B1 (ja) * | 2013-09-10 | 2014-02-19 | テレパシー インク | 画像の視認距離を調整できるヘッドマウントディスプレイ |
US20150113454A1 (en) * | 2013-10-21 | 2015-04-23 | Motorola Mobility Llc | Delivery of Contextual Data to a Computing Device Using Eye Tracking Technology |
US9804753B2 (en) * | 2014-03-20 | 2017-10-31 | Microsoft Technology Licensing, Llc | Selection using eye gaze evaluation over time |
US10424103B2 (en) * | 2014-04-29 | 2019-09-24 | Microsoft Technology Licensing, Llc | Display device viewer gaze attraction |
US10416759B2 (en) * | 2014-05-13 | 2019-09-17 | Lenovo (Singapore) Pte. Ltd. | Eye tracking laser pointer |
KR102240632B1 (ko) * | 2014-06-10 | 2021-04-16 | 삼성디스플레이 주식회사 | 생체 효과 영상을 제공하는 전자 기기의 구동 방법 |
KR102253444B1 (ko) | 2014-07-08 | 2021-05-20 | 삼성디스플레이 주식회사 | 사용자의 눈 깜빡임을 유도하는 유도 영상 표시 방법, 장치, 및 컴퓨터 판독 가능한 기록매체 |
CN104199544B (zh) * | 2014-08-28 | 2018-06-22 | 华南理工大学 | 基于视线跟踪的广告定向投放方法 |
EP3009918A1 (en) * | 2014-10-13 | 2016-04-20 | Thomson Licensing | Method for controlling the displaying of text for aiding reading on a display device, and apparatus adapted for carrying out the method and computer readable storage medium |
WO2016058847A1 (en) | 2014-10-13 | 2016-04-21 | Thomson Licensing | Method for controlling the displaying of text for aiding reading on a display device, and apparatus adapted for carrying out the method, computer program, and computer readable storage medium |
US9898865B2 (en) * | 2015-06-22 | 2018-02-20 | Microsoft Technology Licensing, Llc | System and method for spawning drawing surfaces |
US11194398B2 (en) | 2015-09-26 | 2021-12-07 | Intel Corporation | Technologies for adaptive rendering using 3D sensors |
JP6282769B2 (ja) * | 2016-06-23 | 2018-02-21 | 株式会社ガイア・システム・ソリューション | エンゲージメント値処理システム及びエンゲージメント値処理装置 |
US9875398B1 (en) * | 2016-06-30 | 2018-01-23 | The United States Of America As Represented By The Secretary Of The Army | System and method for face recognition with two-dimensional sensing modality |
US20180007328A1 (en) * | 2016-07-01 | 2018-01-04 | Intel Corporation | Viewpoint adaptive image projection system |
JP6799063B2 (ja) * | 2016-07-20 | 2020-12-09 | 富士フイルム株式会社 | 注目位置認識装置、撮像装置、表示装置、注目位置認識方法及びプログラム |
EP3501014A1 (en) | 2016-08-17 | 2019-06-26 | VID SCALE, Inc. | Secondary content insertion in 360-degree video |
TWI642030B (zh) * | 2017-08-09 | 2018-11-21 | 宏碁股份有限公司 | 視覺效用分析方法及相關眼球追蹤裝置與系統 |
CN107577959A (zh) * | 2017-10-11 | 2018-01-12 | 厦门美图移动科技有限公司 | 一种隐私保护方法及移动终端 |
US11151600B2 (en) * | 2018-04-23 | 2021-10-19 | International Business Machines Corporation | Cognitive analysis of user engagement with visual displays |
USD914021S1 (en) | 2018-12-18 | 2021-03-23 | Intel Corporation | Touchpad display screen for computing device |
US11949934B2 (en) * | 2018-12-28 | 2024-04-02 | Gree, Inc. | Video distribution system, video distribution method, video distribution program, information processing terminal, and video viewing program |
EP3948492A4 (en) * | 2019-03-27 | 2022-11-09 | INTEL Corporation | SMART BILLBOARD APPARATUS AND RELATED METHODS |
US11379016B2 (en) | 2019-05-23 | 2022-07-05 | Intel Corporation | Methods and apparatus to operate closed-lid portable computers |
US11786694B2 (en) | 2019-05-24 | 2023-10-17 | NeuroLight, Inc. | Device, method, and app for facilitating sleep |
US11543873B2 (en) | 2019-09-27 | 2023-01-03 | Intel Corporation | Wake-on-touch display screen devices and related methods |
US11733761B2 (en) | 2019-11-11 | 2023-08-22 | Intel Corporation | Methods and apparatus to manage power and performance of computing devices based on user presence |
US11809535B2 (en) | 2019-12-23 | 2023-11-07 | Intel Corporation | Systems and methods for multi-modal user device authentication |
US11360528B2 (en) | 2019-12-27 | 2022-06-14 | Intel Corporation | Apparatus and methods for thermal management of electronic user devices based on user activity |
JP7327370B2 (ja) * | 2020-12-07 | 2023-08-16 | 横河電機株式会社 | 装置、方法およびプログラム |
CN118140169A (zh) | 2021-09-27 | 2024-06-04 | 三星电子株式会社 | 用于显示内容的电子装置和方法 |
US20240115933A1 (en) * | 2022-10-09 | 2024-04-11 | Sony Interactive Entertainment Inc. | Group control of computer game using aggregated area of gaze |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09190325A (ja) | 1996-01-09 | 1997-07-22 | Canon Inc | 表示装置 |
JP2004199695A (ja) * | 2002-12-18 | 2004-07-15 | Xerox Corp | 視覚的特性を自動的に選択して背景に対して目標を強調表示する方法 |
WO2006082979A1 (ja) * | 2005-02-07 | 2006-08-10 | Matsushita Electric Industrial Co., Ltd. | 画像処理装置および画像処理方法 |
JP2007018025A (ja) | 2004-06-30 | 2007-01-25 | Sharp Corp | 複雑度測定方法、処理選択方法、画像処理方法、そのプログラムおよび記録媒体、画像処理装置、並びに、画像処理システム |
JP2007286995A (ja) * | 2006-04-19 | 2007-11-01 | Hitachi Ltd | 注目度計測装置及び注目度計測システム |
JP2008502990A (ja) * | 2004-06-18 | 2008-01-31 | トビイ テクノロジー アーベー | 視線追跡に基づいてコンピュータ装置を制御するための装置、方法及びコンピュータプログラム |
JP2008112401A (ja) * | 2006-10-31 | 2008-05-15 | Mitsubishi Electric Corp | 広告効果測定装置 |
JP2008141484A (ja) | 2006-12-01 | 2008-06-19 | Sanyo Electric Co Ltd | 画像再生システム及び映像信号供給装置 |
JP2009193499A (ja) * | 2008-02-18 | 2009-08-27 | Hitachi Ltd | 注視商品データ取得方法および商品販売管理システム |
JP2009535683A (ja) * | 2006-04-28 | 2009-10-01 | トムソン ライセンシング | オブジェクト・ベース視覚的注意モデルの顕著性推定 |
JP2009245364A (ja) * | 2008-03-31 | 2009-10-22 | Nec Corp | 広告管理システム、広告管理装置、広告管理方法、及びプログラム |
WO2010070882A1 (ja) * | 2008-12-16 | 2010-06-24 | パナソニック株式会社 | 情報表示装置及び情報表示方法 |
WO2010143377A1 (ja) * | 2009-06-08 | 2010-12-16 | パナソニック株式会社 | 注視対象判定装置及び注視対象判定方法 |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3482923B2 (ja) * | 1999-10-28 | 2004-01-06 | セイコーエプソン株式会社 | 自動構図決定装置 |
US20050047629A1 (en) * | 2003-08-25 | 2005-03-03 | International Business Machines Corporation | System and method for selectively expanding or contracting a portion of a display using eye-gaze tracking |
US7963652B2 (en) * | 2003-11-14 | 2011-06-21 | Queen's University At Kingston | Method and apparatus for calibration-free eye tracking |
JP4207883B2 (ja) * | 2004-03-24 | 2009-01-14 | セイコーエプソン株式会社 | 視線誘導度算出システム |
US7260276B2 (en) | 2004-06-30 | 2007-08-21 | Sharp Laboratories Of America, Inc. | Methods and systems for complexity estimation and complexity-based selection |
JP4046121B2 (ja) * | 2005-03-24 | 2008-02-13 | セイコーエプソン株式会社 | 立体画像表示装置及び方法 |
EP2002322B1 (en) * | 2006-03-23 | 2009-07-15 | Koninklijke Philips Electronics N.V. | Hotspots for eye track control of image manipulation |
CN101432775B (zh) * | 2006-04-28 | 2012-10-03 | 汤姆逊许可公司 | 基于对象的视觉注意力模型的显著性评估方法 |
JP4775123B2 (ja) * | 2006-06-09 | 2011-09-21 | 日産自動車株式会社 | 車両用監視装置 |
US7542210B2 (en) * | 2006-06-29 | 2009-06-02 | Chirieleison Sr Anthony | Eye tracking head mounted display |
JP4991440B2 (ja) * | 2007-08-08 | 2012-08-01 | 株式会社日立製作所 | 商品販売装置、商品販売管理システム、商品販売管理方法およびプログラム |
EP2107787A1 (en) * | 2008-03-31 | 2009-10-07 | FUJIFILM Corporation | Image trimming device |
JP4282091B1 (ja) * | 2008-09-04 | 2009-06-17 | 株式会社モバイルビジネスプロモート | 端末装置、情報処理方法及びプログラム |
US8581838B2 (en) * | 2008-12-19 | 2013-11-12 | Samsung Electronics Co., Ltd. | Eye gaze control during avatar-based communication |
- 2010-12-02 EP EP10837235.0A patent/EP2515206B1/en active Active
- 2010-12-02 CN CN201080006051.5A patent/CN102301316B/zh active Active
- 2010-12-02 WO PCT/JP2010/007026 patent/WO2011074198A1/ja active Application Filing
- 2010-12-02 US US13/147,246 patent/US8830164B2/en active Active
- 2010-12-02 JP JP2011545938A patent/JP5602155B2/ja active Active
Patent Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH09190325A (ja) | 1996-01-09 | 1997-07-22 | Canon Inc | 表示装置 |
JP2004199695A (ja) * | 2002-12-18 | 2004-07-15 | Xerox Corp | 視覚的特性を自動的に選択して背景に対して目標を強調表示する方法 |
JP2008502990A (ja) * | 2004-06-18 | 2008-01-31 | トビイ テクノロジー アーベー | 視線追跡に基づいてコンピュータ装置を制御するための装置、方法及びコンピュータプログラム |
JP2007018025A (ja) | 2004-06-30 | 2007-01-25 | Sharp Corp | 複雑度測定方法、処理選択方法、画像処理方法、そのプログラムおよび記録媒体、画像処理装置、並びに、画像処理システム |
WO2006082979A1 (ja) * | 2005-02-07 | 2006-08-10 | Matsushita Electric Industrial Co., Ltd. | 画像処理装置および画像処理方法 |
JP2007286995A (ja) * | 2006-04-19 | 2007-11-01 | Hitachi Ltd | 注目度計測装置及び注目度計測システム |
JP2009535683A (ja) * | 2006-04-28 | 2009-10-01 | トムソン ライセンシング | オブジェクト・ベース視覚的注意モデルの顕著性推定 |
JP2008112401A (ja) * | 2006-10-31 | 2008-05-15 | Mitsubishi Electric Corp | 広告効果測定装置 |
JP2008141484A (ja) | 2006-12-01 | 2008-06-19 | Sanyo Electric Co Ltd | 画像再生システム及び映像信号供給装置 |
JP2009193499A (ja) * | 2008-02-18 | 2009-08-27 | Hitachi Ltd | 注視商品データ取得方法および商品販売管理システム |
JP2009245364A (ja) * | 2008-03-31 | 2009-10-22 | Nec Corp | 広告管理システム、広告管理装置、広告管理方法、及びプログラム |
WO2010070882A1 (ja) * | 2008-12-16 | 2010-06-24 | パナソニック株式会社 | 情報表示装置及び情報表示方法 |
WO2010143377A1 (ja) * | 2009-06-08 | 2010-12-16 | パナソニック株式会社 | 注視対象判定装置及び注視対象判定方法 |
Non-Patent Citations (3)
Title |
---|
B. S. MANJUNATH, W. Y. MA: "Texture features for browsing and retrieval of image data", IEEE TRANS. PATTERN ANAL. AND MACH. INTELL., vol. 18, no. 8, 1996, pages 837 - 842 |
MASANORI MIYAHARA, MASAKI AOKI, TETSUYA TAKIGUCHI, YASUO ARIKI: "Tagging Video Contents Based on Interest Estimation from Facial Expression", IPSJ JOURNAL, vol. 49, no. 10, 2008, pages 3694 - 3702 |
SHOJI TANAKA, SEIJI IGUCHI, YUICHI IWADATE, RYOHEI NAKATSU: "Attractiveness Evaluation Model based on the Physical Features of Image Regions", IEICE JOURNAL A, vol. J83-A, no. 5, 2000, pages 576 - 588 |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2494235B (en) * | 2011-08-30 | 2017-08-30 | Gen Electric | Person tracking and interactive advertising |
GB2494235A (en) * | 2011-08-30 | 2013-03-06 | Gen Electric | Gaze- and/or pose-dependent interactive advertising |
JP2015039487A (ja) * | 2013-08-21 | 2015-03-02 | 大日本印刷株式会社 | 生理指標を用いる視線分析システムおよび方法 |
JP2015052879A (ja) * | 2013-09-06 | 2015-03-19 | カシオ計算機株式会社 | 情報処理装置及びプログラム |
JP2015127897A (ja) * | 2013-12-27 | 2015-07-09 | ソニー株式会社 | 表示制御装置、表示制御システム、表示制御方法、およびプログラム |
KR101490505B1 (ko) * | 2014-07-08 | 2015-02-10 | 주식회사 테라클 | 관심도 생성 방법 및 장치 |
JP2016046730A (ja) * | 2014-08-25 | 2016-04-04 | 学校法人早稲田大学 | 視聴者注目情報提供システム、時空間マーカ設定装置及びそのプログラム、並びに、情報提供装置及びそのプログラム |
US10311640B2 (en) | 2015-08-17 | 2019-06-04 | Colopl, Inc. | Method and apparatus for providing virtual space, and non-transitory computer readable data storage medium storing program causing computer to perform method |
WO2017030037A1 (ja) * | 2015-08-17 | 2017-02-23 | 株式会社コロプラ | 仮想空間を提供するためにコンピュータで実行される方法、当該方法をコンピュータに実行させるプログラム、および、仮想空間を提供するための装置 |
JP5961736B1 (ja) * | 2015-08-17 | 2016-08-02 | 株式会社コロプラ | ヘッドマウントディスプレイシステムを制御する方法、および、プログラム |
JP2017040970A (ja) * | 2015-08-17 | 2017-02-23 | 株式会社コロプラ | ヘッドマウントディスプレイシステムを制御する方法、および、プログラム |
JP2017182628A (ja) * | 2016-03-31 | 2017-10-05 | 株式会社エヌ・ティ・ティ・データ | 拡張現実ユーザインタフェース適用装置および制御方法 |
JP2017041229A (ja) * | 2016-06-08 | 2017-02-23 | 株式会社コロプラ | ヘッドマウントディスプレイシステムを制御する方法、および、プログラム |
US10642353B2 (en) | 2017-07-19 | 2020-05-05 | Fujitsu Limited | Non-transitory computer-readable storage medium, information processing apparatus, and information processing method |
CN108489507A (zh) * | 2018-03-20 | 2018-09-04 | 京东方科技集团股份有限公司 | 导航装置及其工作方法、计算机可读存储介质 |
JP2020038562A (ja) * | 2018-09-05 | 2020-03-12 | オムロン株式会社 | 情報処理装置、情報処理方法、および情報処理プログラム |
JP2020038336A (ja) * | 2019-02-19 | 2020-03-12 | オムロン株式会社 | 情報処理装置、情報処理方法、および情報処理プログラム |
JP7263825B2 (ja) | 2019-02-19 | 2023-04-25 | オムロン株式会社 | 情報処理装置、情報処理方法、および情報処理プログラム |
US11972619B2 (en) | 2019-02-20 | 2024-04-30 | Evident Corporation | Information processing device, information processing system, information processing method and computer-readable recording medium |
WO2024047990A1 (ja) * | 2022-09-02 | 2024-03-07 | キヤノン株式会社 | 情報処理装置 |
Also Published As
Publication number | Publication date |
---|---|
CN102301316A (zh) | 2011-12-28 |
EP2515206A1 (en) | 2012-10-24 |
CN102301316B (zh) | 2015-07-22 |
EP2515206A4 (en) | 2015-01-21 |
JP5602155B2 (ja) | 2014-10-08 |
US8830164B2 (en) | 2014-09-09 |
JPWO2011074198A1 (ja) | 2013-04-25 |
EP2515206B1 (en) | 2019-08-14 |
US20110298702A1 (en) | 2011-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5602155B2 (ja) | ユーザインタフェース装置および入力方法 | |
JP5286371B2 (ja) | 情報表示装置及び情報表示方法 | |
US9538219B2 (en) | Degree of interest estimating device and degree of interest estimating method | |
WO2013018267A1 (ja) | 提示制御装置、及び提示制御方法 | |
CN111886564B (zh) | 信息处理装置、信息处理方法和程序 | |
US20170255260A1 (en) | Information processing apparatus, information processing method, and program | |
JP6123694B2 (ja) | 情報処理装置、情報処理方法、及びプログラム | |
WO2021095277A1 (ja) | 視線検出方法、視線検出装置、及び制御プログラム | |
JP6745518B1 (ja) | 視線検出方法、視線検出装置、及び制御プログラム | |
KR20200144196A (ko) | 전자 장치 및 각막 이미지를 이용한 전자 장치의 기능 제공 방법 | |
CN112585673B (zh) | 信息处理设备、信息处理方法及程序 | |
US20210400234A1 (en) | Information processing apparatus, information processing method, and program | |
US12001746B2 (en) | Electronic apparatus, and method for displaying image on display device | |
US20230185371A1 (en) | Electronic device | |
US20230188828A1 (en) | Electronic device | |
EP4374242A1 (en) | Screen interaction using eog coordinates | |
CN117730298A (zh) | 使用eog坐标的屏幕交互 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080006051.5 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2011545938 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010837235 Country of ref document: EP |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10837235 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13147246 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |