JP2009157634A - Irradiation control device, irradiation control program, and visual line analysis system - Google Patents

Irradiation control device, irradiation control program, and visual line analysis system

Info

Publication number
JP2009157634A
JP2009157634A (application number JP2007334850A)
Authority
JP
Japan
Prior art keywords
irradiation
means
line
image
sight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
JP2007334850A
Other languages
Japanese (ja)
Inventor
Yutaka Ando
Takeshi Mizunashi
Original Assignee
Fuji Xerox Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fuji Xerox Co Ltd
Priority: JP2007334850A
Publication: JP2009157634A
Application status: Withdrawn

Abstract

PROBLEM TO BE SOLVED: To provide a technique that makes it easy to grasp the movement of a person's line of sight.
SOLUTION: An irradiation control device 1 irradiates a human eyeball E with near-infrared rays from a light beam irradiating means 11, and an eyeball imaging means 12 photographs the eyeball E under that irradiation. A line-of-sight calculation means 13 measures the amount of movement of the eyeball E from the photographed image and detects the line-of-sight direction. An irradiation device drive control means 21 then changes the irradiation direction of an infrared irradiation means 23 to the detected line-of-sight direction. As a result, infrared rays are emitted in the gaze direction of the subject (customer) wearing the irradiation control device 1 on the head, so an analyst (service person) wearing a head-mounted display (HMD) 3 with an infrared imaging function can grasp each subject's gaze movement in real time.
[Selection] Figure 3

Description

  The present invention relates to an irradiation control device, an irradiation control program, and a line-of-sight analysis system.

As an invention that allows copyright information to be embedded and presented even in re-captured video, and that enables mechanical search, an invention has been proposed in which a video is projected and displayed on a display surface, and a light irradiation device projects a digital watermark in infrared or ultraviolet light onto the video displayed on that surface (see Patent Document 1).

As an invention for accurately presenting the user with information related to the object being viewed, an invention has been proposed in which, when the user views a posting through the half mirror of a head-mounted display (HMD), an invisible code reflected from the posting under irradiation by an infrared illumination device is received by a light receiving device; based on this code, a display image is created using content obtained from a server system and is shown to the user on the display by the HMD's display device (see Patent Document 2).
Japanese Patent Laid-Open No. 2005-026759; Japanese Patent Laid-Open No. 2003-242168

  In a space of a certain size, such as a showroom, observing how visitors and spectators move their lines of sight within the space can be an important means of evaluating the quality of that space for its customers. For example, if customers do not move their lines of sight as the space designer intended, or if the amount of gaze movement itself is very large, it can be determined that the design does not fully account for the customers' characteristics, or that it tires them. In other words, analyzing the movement of customers' lines of sight can serve as material for evaluating whether a space such as a showroom is well designed.

  In addition, gaze information can sometimes provide clues for adapting customer service on the fly. For example, if a customer's line of sight stays in one place, that is, if the customer is gazing at something, it may indicate a strong interest in that thing, and the service person can attract more of the customer's attention by deepening the conversation about it. Conversely, when it is detected that the customer is not looking in a desired direction, the service person can prompt the customer to look at the desired thing.

  However, since it is extremely difficult to watch the movement of a customer's eyeballs and the object at the end of the line of sight at the same time, a service person's grasp of the customer's gaze movement could only be an impression formed while serving the customer. It is therefore difficult to observe and understand customers on the spot thoroughly enough to evaluate the quality of the space in which they are served, and very difficult to make use of a customer's line of sight in moment-to-moment service. It is also easy to imagine that the difficulty increases when there are multiple customers.

One way to solve this problem is to record the customer's behavior with a video camera and later analyze the gaze movement or gaze target. However, this raises various problems from the viewpoint of protecting personal information, so smooth implementation is difficult, and it offers no real-time capability. It is also possible to attach a line-of-sight detection device to the customer and record the movement of the customer's line of sight; however, if the service person must pay attention to a carried device such as a display showing the customer's line of sight, the primary task of serving the customer is neglected.
Further, when analyzing the lines of sight of a plurality of customers, it is conceivable to overlay their lines of sight with respect to a given object on one image in order to compare differences between the customers. With a conventional line-of-sight measuring instrument, the visual field image seen by each freely moving customer and the gaze coordinates within that image are measured, and the lines of sight of the plurality of customers are then superimposed on one image based on the measurement results. This, however, requires measuring the position and direction from which each visual field image was taken. Although the shooting position and direction of the visual field image can be detected using GPS or sensors, sensors must then be installed indoors, and complicated processing such as correction according to the shooting position and direction is required.

  The present invention has been made in view of the above circumstances, and an object of the present invention is to provide a technique that makes it easy to grasp the movement of a person's line of sight.

  The irradiation control apparatus according to claim 1 comprises: detection means for detecting a human gaze direction; irradiation means that irradiates light and can change its irradiation direction; changing means for changing the irradiation direction of the light by the irradiation means to the detected gaze direction; and mounting means for mounting the detection means, the irradiation means, and the changing means on a human head.

  An irradiation control apparatus according to a second aspect is the irradiation control apparatus according to the first aspect, wherein the irradiation means focuses and irradiates a visible light beam at a substantially central point in the gaze direction.

  According to a third aspect of the present invention, there is provided the irradiation control device according to the first aspect, wherein the irradiation means focuses and irradiates an invisible light beam at a substantially central point in the line-of-sight direction.

  An irradiation control apparatus according to a fourth aspect is the irradiation control apparatus according to the third aspect, wherein the irradiation means irradiates infrared rays.

  The irradiation control device according to claim 5 is the irradiation control device according to claim 3 or claim 4, further comprising acquisition means for acquiring an image indicating information relating to the person wearing the irradiation control device, wherein the irradiation means projects the acquired image onto an object in the gaze direction.

  The irradiation control apparatus according to a sixth aspect is the irradiation control apparatus according to the fifth aspect, wherein the acquisition unit acquires an image for identifying a person wearing the irradiation control apparatus.

  An irradiation control apparatus according to a seventh aspect is the irradiation control apparatus according to the fifth aspect, further comprising measurement means for measuring a biological state of the person wearing the irradiation control apparatus, wherein the acquisition means acquires an image showing the measurement result.

  A line-of-sight analysis system according to an eighth aspect includes the irradiation control device according to any one of claims 3 to 7 and an imaging device, the imaging device comprising: first imaging means for imaging an object whose degree of gaze attention is to be analyzed; second imaging means for imaging the invisible light applied to the object; synthesizing means for synthesizing the image of the imaged object with an image visualizing the captured invisible light; and output means for outputting the synthesized video.

  The line-of-sight analysis system according to claim 9 is the line-of-sight analysis system according to claim 8, wherein the imaging device further includes display means for displaying video, and the output means displays the synthesized video on the display means.

  The line-of-sight analysis system according to claim 10 is the line-of-sight analysis system according to claim 8, wherein the imaging device further includes storage means for storing video, and the output means stores the synthesized video in the storage means.

  The line-of-sight analysis system according to claim 11 is the line-of-sight analysis system according to any one of claims 8 to 10, further comprising generation means for generating an image visualizing a region in which the invisible light irradiation time exceeds a threshold value, wherein the synthesizing means synthesizes the image of the photographed object with the generated image.

  An irradiation control program according to claim 12 causes a computer of an irradiation control device, which includes irradiation means capable of irradiating light and changing its irradiation direction and mounting means for mounting the irradiation means on a human head, to realize a detection function of detecting a human gaze direction and a change function of changing the light irradiation direction by the irradiation means to the detected gaze direction.

According to the irradiation control device of the first aspect, since light is irradiated in the direction of a person's gaze, the gaze direction can easily be grasped by identifying where the light lands.
Moreover, by irradiating illumination light over a fixed range, the device can also serve as a lighting fixture that brightens the direction of a person's gaze. A conventional flashlight requires the person to point the light deliberately, and a helmet-mounted lamp irradiates light in the direction of the face rather than the line of sight; the irradiation control device of the present invention, by contrast, conveniently and reliably brightens wherever the wearer is looking.
In addition, since the functional units are mounted on the head as an integrated unit, the gaze direction can be detected reliably regardless of the orientation of the head, and the gaze direction and irradiation direction can be linked easily.
Furthermore, by irradiating light in the direction of each person's gaze and photographing the space containing the object under analysis with an imaging device, the trajectories of the multiple light beams striking the object can be recorded as video. That is, since there is no need to measure the shooting position and direction of a visual field image for each person, the gazes (light beams) of multiple people in the same space can easily be superimposed.

  According to the irradiation control device of the second aspect, visible light is irradiated at the substantially central point in the gaze direction of the person (subject) wearing it, so the analyst can easily grasp the subject's gaze direction.

  According to the irradiation control device of claim 3, since invisible light is irradiated at the substantially central point of the subject's line of sight, the light does not affect the subject's awareness, and the analyst can grasp the subject's gaze direction by using a display device that visualizes the invisible light.

  According to the irradiation control device of the fourth aspect, infrared rays, which travel straight and reach far, make it possible to irradiate a light beam reliably in the direction of the subject's line of sight.

  According to the irradiation control device of the fifth aspect, since the image indicating the information related to the subject is projected on the object in the direction of the subject's line of sight, not only the direction of the subject's line of sight but also information related to the subject can be obtained.

  According to the irradiation control apparatus of the sixth aspect, the subject can be identified by the image projected in the subject's line-of-sight direction.

  According to the irradiation control device of the seventh aspect, the subject's biological state can be grasped from the image projected in the direction of the subject's line of sight.

  According to the line-of-sight analysis system of claim 8, since a composite video is output in which an image of the object under gaze-attention analysis is combined with an image visualizing the invisible light irradiated onto it, it is possible to see at a glance where on the object lines of sight are directed.

  According to the line-of-sight analysis system of claim 9, by displaying the composite image on the display means, the analyst can immediately analyze the line-of-sight direction on the spot.

  According to the line-of-sight analysis system of claim 10, the analyst can analyze the line-of-sight direction afterwards by storing the synthesized video in the storage means.

  According to the line-of-sight analysis system of the eleventh aspect, the analyst can grasp the region that is attracting the attention of the subject.

  According to the irradiation control program of the twelfth aspect, the computer of the irradiation control device can realize a function of changing the light irradiation direction to the visual line direction of the subject.

The present invention will be specifically described based on the embodiments exemplified below.
FIG. 1 shows an overview of the line-of-sight analysis system of this example, taking as an example a service person attending to a plurality of customers.
Each customer (customers 1, 2, ..., n) wears the irradiation control device 1 on the head, and an invisible light beam (infrared in this example) R is irradiated from the irradiation control device 1 in the direction of the customer's line of sight. The infrared ray R is focused at a substantially central point in the gaze direction, and line segments L1 and L2 in the figure indicate the trajectories of the irradiated infrared rays R.
The service person wears a head-mounted display (HMD) 3 with an infrared photographing function on the head and grasps each customer's gaze movement in real time by viewing, on the HMD 3, the video in which the infrared light is visualized.

FIG. 2 shows an appearance of the irradiation control apparatus 1 according to this example.
The irradiation control device 1 of this example includes a line-of-sight measurement device 10 and an infrared irradiation device 20 in a spectacle-shaped frame (mounting means). Note that the irradiation control device 1 may have another shape such as a helmet shape, for example, as long as it can be fixedly attached to a human head.

FIG. 3 shows functional configurations of the line-of-sight measurement device 10 and the infrared irradiation device 20 of the irradiation control device 1 according to this example.
The line-of-sight measurement device 10 includes light beam irradiation means 11 that irradiates the human eyeball E with near-infrared rays, eyeball imaging means 12 that photographs the eyeball E, and line-of-sight calculation means 13 that measures the amount of movement of the eyeball E from the photographed image and detects the gaze direction. When the line of sight is measured purely by image processing, however, the means for irradiating near-infrared rays is unnecessary.

The infrared irradiation device 20 includes infrared irradiation means 23 that emits infrared rays and can change the irradiation direction, irradiation device drive control means 21 that controls the infrared irradiation direction so that it follows the detected gaze direction, and infrared control means 22 that controls the shape of the image projected onto the object by the infrared rays.
In this example, each individual's identification information is input before the device is worn; the infrared control means 22 generates an identification image for identifying the individual from this information and supplies it to the infrared irradiation means 23, which projects the image onto the object in the irradiation direction. The image may instead be a preset image unique to the apparatus. In this example a two-dimensional barcode image is used, but an image of a number, character string, symbol, or the like may be used as long as each individual can be identified.
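Since the identification image only needs to map an individual's identification information onto a distinguishable pattern, the idea can be sketched in a few lines. The following is a minimal illustrative stand-in, not an actual two-dimensional barcode encoding: it assumes the subject ID is a string and derives a deterministic black/white bit pattern from its hash.

```python
import hashlib

def identification_matrix(subject_id, size=8):
    """Derive a deterministic size x size 0/1 pattern from a subject ID
    string. Illustrative stand-in for the real 2D barcode the device
    projects; a production system would use a proper barcode encoder."""
    digest = hashlib.sha256(subject_id.encode("utf-8")).digest()
    matrix = []
    for row in range(size):
        # Take one bit of the hash digest per cell, row-major.
        matrix.append([(digest[(row * size + col) // 8] >> ((row * size + col) % 8)) & 1
                       for col in range(size)])
    return matrix
```

Because the pattern is a pure function of the ID, the same individual always projects the same image, while different individuals almost surely project different ones.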

An embodiment in which the pupil-corneal reflection method is used as the method by which the line-of-sight measurement device 10 of this example detects the gaze direction will be described with reference to FIG.
When the human eyeball E is irradiated with near-infrared rays from the light beam irradiation means 11, reflected light (a Purkinje image) appears on the corneal surface of the eyeball E. Since this Purkinje image does not move even when the eyeball E moves, it is used as the reference point E3 when detecting the gaze direction. The eyeball imaging means 12 then photographs the eyeball E under the near-infrared irradiation, and the line-of-sight calculation means 13 measures the amount of movement of the eyeball E (the movement of the center point E2) from the center point E2 of the dark (iris) region E1 and the reference point (Purkinje image) E3 identified by image processing, and calculates the movement angle of the eyeball E from it. The horizontal and vertical deviation angles with respect to the reference gaze direction can thereby be specified, and the gaze direction detected.
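The computation performed by the line-of-sight calculation means 13 can be sketched as follows. The sketch assumes image processing has already located the pupil center E2 and the Purkinje reference point E3 in pixel coordinates, and that a per-subject linear gain (the name `gain_deg_per_px` and its value are hypothetical) maps pixel displacement to gaze angle; a real system would obtain this gain by the calibration described below.

```python
def gaze_angles(pupil_center, purkinje, gain_deg_per_px=0.1):
    """Estimate (horizontal, vertical) gaze angles in degrees from the
    displacement of the pupil center E2 relative to the Purkinje-image
    reference point E3 in the eyeball image."""
    dx = pupil_center[0] - purkinje[0]   # horizontal displacement, px
    dy = pupil_center[1] - purkinje[1]   # vertical displacement, px
    return dx * gain_deg_per_px, dy * gain_deg_per_px
```

Since the Purkinje image is fixed while the pupil center moves with the eyeball, the displacement (dx, dy) isolates eyeball rotation from head-relative camera position.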

  Note that detection of the gaze direction is not limited to the above method; it can also be performed by, for example, the sclera reflection method, which irradiates faint infrared light and measures eye movement from the difference in reflectance between the sclera (white of the eye) and the cornea (dark of the eye). It is also possible to detect the positions of both irises on the face by image processing; in this method the nose is usually detected first, and the gaze direction is calculated from the positions of both eyes relative to the nose.

  Here, the infrared irradiation means 23 is controlled by the irradiation device drive control means 21 so that infrared rays are emitted in the detected gaze direction. However, since eyeball movement varies among individuals, the infrared irradiation direction must be calibrated for each individual in advance so that the infrared rays are correctly emitted in the gaze direction.

Calibration is performed by measuring the eyeball position when a given coordinate in real space is viewed, and the movement angle of the eyeball when the gaze moves from one coordinate to another.
Specifically, as shown in FIG. 5, a plurality of LED lamps provided on a calibration plate C are lit in order, the subject tracks them from a position at a fixed distance (the viewing distance), and the gaze movement angle between them is measured. For example, the horizontal irradiation angle is calibrated from the gaze movement angle, the distance between the LED lamps, and the viewing distance when horizontally offset LED lamps are tracked in sequence; the vertical irradiation angle is calibrated likewise from the corresponding quantities when vertically offset LED lamps are tracked in sequence.
Calibration is not limited to this method; as shown in FIG. 6, the infrared irradiation device 20 may be provided with visible light irradiation means, and calibration may be achieved by having the subject follow the visible light as its direction is changed. Further, if the depth to the target during calibration differs from the depth to the target during measurement, an error may arise in the detected gaze direction; in that case the error can be reduced by measuring the convergence angle (the angle between the lines of sight of the two eyes).
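Under the geometry of FIG. 5, the calibration arithmetic reduces to comparing the true angular separation of adjacent LED lamps (computed from their spacing and the viewing distance) with the gaze-movement angle the measurement device reported while the subject tracked them. A minimal sketch, with all function names hypothetical:

```python
import math

def target_angle_deg(led_spacing, viewing_distance):
    """Angular separation, in degrees, of two adjacent LED lamps on the
    calibration plate as seen from the viewing distance."""
    return math.degrees(math.atan2(led_spacing, viewing_distance))

def calibration_gain(measured_gaze_deg, led_spacing, viewing_distance):
    """Per-subject gain: ratio of the true angular separation of the
    calibration targets to the gaze-movement angle the line-of-sight
    measurement device reported while the subject tracked them."""
    return target_angle_deg(led_spacing, viewing_distance) / measured_gaze_deg
```

The horizontal and vertical irradiation angles would each use their own gain, obtained from the horizontally and vertically offset lamp pairs respectively.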

FIG. 7 shows a flow of infrared irradiation processing in the line-of-sight direction by the irradiation control apparatus 1 of this example.
The line-of-sight measurement device 10 measures the amount of eye movement from an image of the eyeball of the person wearing the irradiation control device 1 (the subject; a customer in this example), calculates the eye movement angle, and detects the gaze direction (steps S1, S2).
The irradiation device drive control means 21 then controls the infrared irradiation angle of the infrared irradiation means 23 based on the detected gaze direction (step S3). The infrared control means 22 also generates the subject's identification image and supplies it to the infrared irradiation means 23 (step S4).
As a result, the infrared irradiation means 23 irradiates infrared rays in the gaze direction and projects the identification image onto the object ahead (step S5).
By wearing the HMD 3 described later, an analyst (a service person in this example) can grasp each subject's gaze movement in real time while distinguishing the subjects.
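The S1-S5 flow can be expressed as a single control cycle. In this sketch every hardware-dependent stage is injected as a callable so that the sequencing can be shown (and exercised) without the actual devices; all names are illustrative.

```python
def irradiation_cycle(measure_movement, angle_from_movement,
                      steer_emitter, make_id_image, project):
    """One pass through steps S1-S5 of the irradiation control flow."""
    movement = measure_movement()          # S1: amount of eye movement
    gaze = angle_from_movement(movement)   # S2: gaze direction (angle)
    steer_emitter(gaze)                    # S3: aim the infrared emitter
    image = make_id_image()                # S4: subject's identification image
    project(image)                         # S5: project it along the gaze
    return gaze, image
```

In a real device this cycle would run continuously so the infrared beam tracks the eye; here a single pass with stub callables demonstrates the ordering.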

FIG. 8 illustrates a basic functional configuration of the HMD 3.
The HMD 3 shown in the figure is a display device that can be worn on a human head, comprising display means 34 for displaying input video, video acquisition means 31 for photographing the object whose gaze attention is to be analyzed, additional information light receiving means 32 for photographing the invisible light (infrared in this example), and information synthesizing means 33 that combines the photographed image of the object (the visible light video) with a video visualizing the captured infrared light (the infrared video) and outputs the result to the display means 34.
That is, the video taken by the video acquisition means 31 and the video taken by the additional information light receiving means 32 are synthesized by the information synthesizing means 33 and output to the display means 34 for display.

  The line-of-sight analysis system of this example does not simply synthesize the visible light video and the infrared video as described above; it uses an HMD 3 that also has a decoding function for extracting information by decoding the subject's identification image in the infrared video, and that generates the composite video taking the extracted information into account.

FIG. 10 shows a functional configuration of the HMD 3 having a decoding function.
The HMD 3 in the figure is configured using two CCD cameras capable of photographing visible light and infrared light (serving as the video acquisition means 31 and the additional information light receiving means 32). Specifically, it comprises: an infrared filter 41 that blocks light other than infrared (i.e., visible light); a CCD camera 42 that photographs the object through the infrared filter 41 (that is, an infrared video); a CCD camera 43 that photographs the object directly (a visible light and infrared video); difference detection means 44 that extracts the infrared video by taking the difference between the video captured by the CCD camera 42 and the video captured by the CCD camera 43; decoding means 45 that identifies the identification image in the extracted infrared video, detects its position coordinates, and extracts information by decoding it; and information synthesizing means 46 that generates an image from the position coordinates of the identification image and the extracted information (the subject's identification information in this example) and combines that image with the visible light video captured by the CCD camera 43.
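The role of the difference detection means 44 can be illustrated with a per-pixel frame subtraction. This is a simplified sketch that assumes the two cameras produce aligned grayscale frames; a real implementation would additionally need image registration, exposure matching, and noise handling.

```python
def subtract_frames(combined, infrared_only):
    """Per-pixel difference of the two CCD frames: removing the
    infrared-only frame (from the camera behind the IR-pass filter)
    from the combined visible+infrared frame leaves approximately the
    visible component. Frames are row-major grayscale lists of lists;
    results clamp at zero."""
    return [[max(c - i, 0) for c, i in zip(crow, irow)]
            for crow, irow in zip(combined, infrared_only)]
```

The same operation run the other way (isolating pixels present only under infrared) yields the mask in which the decoding means 45 searches for the identification image.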

  In the line-of-sight analysis system of this example, a two-dimensional barcode image is used as the customer identification image, so other information related to the customer can also be embedded, as described later. Since the two-dimensional barcode images cannot be told apart by eye, simply visualizing them does not reveal who is looking where; by converting each into an image that humans can easily distinguish (for example, images of different colors and shapes), it becomes possible to understand who is looking where.

The line-of-sight analysis system of this example further includes a storage device 5 that stores the imaging result of the infrared irradiation object.
FIG. 11 shows a simplified functional configuration of the storage device 5 of this example.
Like the HMD 3 with a decoding function, the storage device 5 shown in the figure includes two cameras (video acquisition means 51 and additional information light receiving means 52). These cameras are fixedly installed, and the video taken by each camera is synthesized by information synthesizing means 53 and output to a database 54 for storage.

In this example, the position coordinates of each identification image in every video frame, together with the information extracted from the identification image, are additionally stored in the database 54.
Using this additional information, for example, as shown in FIG. 9, line segments L1 and L2 drawn along the position history of each identification image over a predetermined number of frames can be combined with the visible light video and displayed, which is useful for analyzing the trajectory of each customer's line of sight.
Also, for example, for each region of the visible light video, the time during which each identification image is projected in that region can be measured, and an image visualizing the regions where this time exceeds a threshold can be generated and combined with the visible light video for display, which is useful for analyzing the regions attracting the subjects' attention.
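The dwell-time thresholding that selects regions of high attention is straightforward. This sketch assumes the per-region projection times have already been accumulated into a 2D grid of seconds:

```python
def high_attention_mask(dwell_seconds, threshold):
    """Given the accumulated time (seconds) an identification image was
    projected in each region of the visible-light video, return a 0/1
    mask marking the regions whose dwell time exceeds the threshold --
    the regions to visualize and overlay on the video."""
    return [[1 if t > threshold else 0 for t in row]
            for row in dwell_seconds]
```

The resulting mask corresponds to the highlighted region (such as M1 in FIG. 12) that the synthesizing means overlays on the visible-light video.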

Further, an infrared irradiation device 55 may be provided in the storage device 5, and the generated images of the line segments for each identification image (the trajectory of each customer's line of sight) and of the regions where the projection time of an identification image exceeds the threshold (regions of high attention) may be projected onto the object by the infrared irradiation device 55.
In this way, a service person wearing the HMD 3 can immediately grasp, on the spot, the trajectory of each customer's line of sight and the regions attracting high attention. FIG. 12 shows an example of projecting the image of a region where the projection time of the identification image exceeds the threshold; it can be seen that region M1 is attracting the customers' attention.

In this example, the gaze attention degree is calculated using the formula described in Takehiko Ohno, "IMPACT: Browsing Support Method Based on Reuse of Gaze Information", Interactive Systems and Software VIII (WISS2000), ed. Jun Rekimoto, Kindai Kagaku Sha (Modern Science), 2000, pp. 137-146.
Specifically, suppose there are n fixation points in a certain region, the dwell time of fixation point f_i is t_i, and the position vector of f_i is F_i. The motion vector d_i of the fixation point is given by

d_i = F_i − F_(i−1)

With d_i = (d_x, d_y), the initial value of the region attention degree V_0, and the region area S_E, the region attention degree V is calculated as shown in (Formula 1).
Further, the region attention degree attenuates with the passage of time; after time u has elapsed, the attention degree V is obtained from the initial value V_0 as

V = V_0 · C^u

In the above formulas, A and C are constants.
Of course, the line-of-sight attention may be calculated by another calculation formula.
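Since (Formula 1) itself appears only as a figure and is not reproduced in the text, only the pieces the text does give can be sketched: the fixation motion vectors d_i = F_i − F_(i−1) and the exponential decay V = V_0 · C^u.

```python
def motion_vectors(positions):
    """d_i = F_i - F_(i-1) for successive fixation-point positions F_i,
    given as (x, y) tuples."""
    return [(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(positions, positions[1:])]

def decayed_attention(v0, c, u):
    """Region attention degree after elapsed time u: V = V0 * C**u,
    with 0 < C < 1 so that attention attenuates over time."""
    return v0 * c ** u
```

These would feed into whatever form (Formula 1) takes, together with the dwell times t_i and the region area S_E.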

  In the line-of-sight analysis system of this example, as shown in FIG. 13, a two-dimensional barcode image M2 is projected from the irradiation control device 1 by infrared rays R, so various information about the customer can be embedded in the two-dimensional barcode image M2 in addition to the identification information.

  For example, as indicated by the broken line in FIG. 3, biological information measuring means 25 that measures the biological state of the subject may be provided in the irradiation control device 1, with the measurement result supplied to the infrared control means 22. The infrared control means 22 then generates a two-dimensional barcode image containing the measurement result and irradiates the image via the infrared irradiation means 23.

Biological information and its main measurement methods include the following.
Electroencephalogram: electrodes are attached to the scalp to measure potential changes associated with brain activity.
Heartbeat: electrodes are attached to the body surface across the heart to measure potential changes due to cardiac activity.
Blood pressure: measured with a blood pressure measurement device.
Respiration: respiratory flow is measured to determine expiration and inspiration.
Electrodermal activity: electrodes are attached to the skin and sweating is measured electrically.
Body temperature: the temperature of a specific part is measured with a thermometer or by thermography.
Electromyogram: action potentials accompanying muscle contraction are derived and measured with electrodes.

  In the description so far, the case of irradiating infrared rays in the line-of-sight direction from the irradiation control device 1 has been described. However, invisible light in other wavelength ranges, such as ultraviolet rays, may instead be irradiated from the irradiation control device 1, with the HMD 3 or the storage device 5 provided with a photographing function corresponding to the wavelength of that invisible light.

Visible light may also be irradiated from the irradiation control device 1. In this case, the clerk, rather than the customer, wears the irradiation control device 1, so that visible light is emitted in the direction of the clerk's line of sight and the customer's gaze can be drawn to that point.
FIG. 14 shows the processing flow when visible light is irradiated from the irradiation control device 1; except that the irradiated light beam is visible light, it is the same as the infrared case, and its description is omitted.
Furthermore, as shown in FIG. 15, the irradiation control device 1 may be used as a lighting device that brightens the direction of a person's gaze by irradiating illumination light over a fixed range in the gaze direction.
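The operating modes above (invisible light for unobtrusive analysis, visible light for guiding a customer's gaze, and broadband illumination) can be summarized as a small configuration table. The mode names and wavelength values below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical emitter configuration per operating mode.
EMITTERS = {
    "infrared":     {"visible": False, "wavelength_nm": 850},
    "ultraviolet":  {"visible": False, "wavelength_nm": 365},
    "visible":      {"visible": True,  "wavelength_nm": 550},
    "illumination": {"visible": True,  "wavelength_nm": None},  # broadband
}

def select_emitter(mode):
    """Return the emitter configuration for the requested operating mode."""
    if mode not in EMITTERS:
        raise ValueError(f"unknown mode: {mode}")
    return EMITTERS[mode]
```

Whichever mode is selected, the HMD 3 or the storage device 5 would need a photographing function matched to the chosen wavelength, as noted above.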

  As shown in FIG. 16, the irradiation control device 1 of this example is configured as a computer having hardware resources such as a CPU 61 that performs various arithmetic processes, a RAM 62 serving as a work area for the CPU 61, a ROM 63 storing a basic control program, an input interface (I/F) 64 that receives operation input from a user, and a data storage unit 65 that stores an irradiation control program for realizing each function according to the present invention.

  By executing the irradiation control program according to the present invention using these hardware resources, the detection means (line-of-sight calculation means 13), the changing means (irradiation device drive control means 21), the acquisition means (infrared control means 22), and the like according to the present invention are realized in the computer of the irradiation control device 1. Each functional unit described above is not limited to realization by a software configuration as in this example, and may instead be configured as a dedicated hardware module.
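As a rough sketch of the control loop that the irradiation control program realizes on these hardware resources — detect the line-of-sight direction, then steer the irradiator toward it — the following fragment may help; all class and method names are illustrative, not from the patent:

```python
# Hypothetical sketch of the detect-then-steer loop. GazeDetector stands
# in for the line-of-sight calculation means; Irradiator stands in for
# the irradiation device drive control.

class GazeDetector:
    def __init__(self, samples):
        self._samples = iter(samples)

    def detect(self):
        """Return the current gaze direction as (yaw, pitch) in degrees."""
        return next(self._samples)

class Irradiator:
    def __init__(self):
        self.direction = (0.0, 0.0)

    def point_at(self, direction):
        """Change the light irradiation direction to the given direction."""
        self.direction = direction

def control_step(detector, irradiator):
    """One iteration: detect the gaze direction and steer the irradiator."""
    direction = detector.detect()
    irradiator.point_at(direction)
    return direction
```

In the actual device, the same loop would run continuously, with the drive control means 21 converting each detected direction into actuator commands.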

  Like the irradiation control device 1, the HMD 3 and the storage device 5 of this example are also configured as computers having hardware resources such as a CPU, a RAM, and a ROM. By executing the program according to the present invention using these hardware resources, the synthesizing means (information synthesizing means 33, 53) according to the present invention is realized in the computers of the HMD 3 and the storage device 5; it may instead be configured as a dedicated hardware module.

Brief description of drawings

FIG. 1 is a diagram showing an outline of the line-of-sight analysis system according to an example of the present invention.
FIG. 2 is a diagram showing the external appearance of the irradiation control device according to an example of the present invention.
FIG. 3 is a diagram showing the functional configuration of the irradiation control device according to an example of the present invention.
FIG. 4 is a diagram explaining the detection of the line-of-sight direction according to an example of the present invention.
FIG. 5 is a diagram explaining the calibration process according to an example of the present invention.
FIG. 6 is a diagram explaining the calibration process according to an example of the present invention.
FIG. 7 is a diagram explaining the infrared irradiation process in the line-of-sight direction according to an example of the present invention.
FIG. 8 is a diagram showing the functional configuration of the HMD according to an example of the present invention.
FIG. 9 is a diagram showing a display example of the photographing result of an infrared-irradiated object according to an example of the present invention.
FIG. 10 is a diagram showing the functional configuration of the HMD according to an example of the present invention.
FIG. 11 is a diagram showing the functional configuration of the storage device according to an example of the present invention.
FIG. 12 is a diagram showing a projection example of the photographing result of an infrared-irradiated object according to an example of the present invention.
FIG. 13 is a diagram showing an example of infrared irradiation by the irradiation control device according to an example of the present invention.
FIG. 14 is a diagram explaining the visible light irradiation process in the line-of-sight direction according to an example of the present invention.
FIG. 15 is a diagram showing a case where the irradiation control device according to an example of the present invention is used as a lighting fixture.
FIG. 16 is a diagram showing the hardware configuration of the irradiation control device according to an example of the present invention.

Explanation of symbols

1: irradiation control device,
3: Head mounted display (HMD),
5: Storage device
10: Eye gaze measuring device,
20: Infrared irradiation device,
11: Light irradiation means,
12: Eyeball imaging means,
13: Gaze calculation means,
21: Irradiation device drive control means,
22: Infrared control means,
23: infrared irradiation means,
24: biological information measuring means,
31, 51: Image acquisition means,
32, 52: Additional information light receiving means,
33, 53, 46: information synthesis means,
34: display means,
41: Infrared filter,
42, 43: CCD camera,
44: Difference detection means,
45: Decoding means,
54: Database

Claims (12)

  1. An irradiation control apparatus comprising:
    detection means for detecting a human line-of-sight direction;
    irradiation means capable of irradiating light and changing the irradiation direction;
    changing means for changing the light irradiation direction by the irradiation means to the detected line-of-sight direction; and
    mounting means for mounting the detection means, the irradiation means, and the changing means on a human head.
  2.   The irradiation control apparatus according to claim 1, wherein the irradiation means focuses and irradiates a visible light beam at substantially the center point in the line-of-sight direction.
  3.   The irradiation control apparatus according to claim 1, wherein the irradiation means focuses and irradiates an invisible light beam at substantially the center point in the line-of-sight direction.
  4.   The irradiation control apparatus according to claim 3, wherein the irradiation unit irradiates infrared rays.
  5. Further comprising an acquisition means for acquiring an image indicating information relating to a person wearing the irradiation control device;
    The irradiation control apparatus according to claim 3, wherein the irradiation unit projects the acquired image onto an object in a line-of-sight direction.
  6.   The irradiation control apparatus according to claim 5, wherein the acquisition unit acquires an image for identifying a person wearing the irradiation control apparatus.
  7. It further comprises measuring means for measuring the biological state of a human wearing the irradiation control device,
    The irradiation control apparatus according to claim 5, wherein the acquisition unit acquires an image indicating the measurement result.
  8. A line-of-sight analysis system comprising the irradiation control apparatus according to any one of claims 3 to 7 and an imaging device,
    the imaging device comprising:
    first imaging means for photographing an object whose line-of-sight attention level is to be analyzed;
    second imaging means for photographing the invisible light irradiated onto the object;
    synthesizing means for synthesizing the video of the photographed object and the video of the photographed invisible light; and
    output means for outputting the synthesized video.
  9. The imaging device further comprises display means for displaying video,
    The line-of-sight analysis system according to claim 8, wherein the output means outputs the synthesized video to the display means for display.
  10. The imaging device further comprises storage means for storing video,
    The line-of-sight analysis system according to claim 8, wherein the output means outputs the synthesized video to the storage means for storage.
  11. Generating means for generating an image that visualizes a region in which the time during which the invisible light is irradiated exceeds a threshold is further provided,
    The line-of-sight analysis system according to any one of claims 8 to 10, wherein the synthesizing means synthesizes the video of the photographed object and the generated image.
  12. An irradiation control program for causing a computer of an irradiation control apparatus, which comprises irradiation means capable of irradiating light and changing the irradiation direction and mounting means for mounting the irradiation means on a human head, to realize:
    a detection function for detecting a human line-of-sight direction; and
    a change function for changing the light irradiation direction by the irradiation means to the detected line-of-sight direction.
JP2007334850A 2007-12-26 2007-12-26 Irradiation control device, irradiation control program, and visual line analysis system Withdrawn JP2009157634A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007334850A JP2009157634A (en) 2007-12-26 2007-12-26 Irradiation control device, irradiation control program, and visual line analysis system


Publications (1)

Publication Number Publication Date
JP2009157634A true JP2009157634A (en) 2009-07-16

Family

ID=40961597

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007334850A Withdrawn JP2009157634A (en) 2007-12-26 2007-12-26 Irradiation control device, irradiation control program, and visual line analysis system

Country Status (1)

Country Link
JP (1) JP2009157634A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8500281B2 (en) 2009-08-19 2013-08-06 Electronics And Telecommunications Research Institute Apparatus for irradiating beam at user's eye gaze point and operation method thereof
KR101369775B1 (en) * 2009-08-19 2014-03-06 한국전자통신연구원 Apparatus for beaming at the eye-gaze point and operation method thereof
JP2014504762A (en) * 2011-01-19 2014-02-24 マチック ラブズ Method and apparatus for determining gaze direction
WO2013066634A1 (en) * 2011-11-02 2013-05-10 Google Inc. Eye gaze detection to determine speed of image movement
US8970452B2 (en) 2011-11-02 2015-03-03 Google Inc. Imaging method
WO2013138647A1 (en) * 2012-03-15 2013-09-19 Google Inc. Using convergence angle to select among different ui elements
US10146055B2 (en) 2013-09-06 2018-12-04 3M Innovative Properties Company Head mounted display with eye tracking
US10545340B2 (en) 2015-03-27 2020-01-28 3M Innovative Properties Company Head mounted display and low conspicuity pupil illuminator
WO2016200224A1 (en) * 2015-06-10 2016-12-15 주식회사 룩시드랩스 Method for monitoring bioactivity of user, system, and non-transitory computer-readable recording medium
CN106466185A (en) * 2015-08-18 2017-03-01 富士施乐株式会社 Optical measuring device and light irradiation method of reseptance

Legal Events

Date       Code  Description
2010-11-18 RD02  Notification of acceptance of power of attorney (JAPANESE INTERMEDIATE CODE: A7422)
2010-11-24 A621  Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2011-05-05 RD04  Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424)
2011-09-30 A761  Written withdrawal of application (JAPANESE INTERMEDIATE CODE: A761)