TWI769755B - An eye movement index analysis method for moving stimulating objects on areas of interest - Google Patents


Info

Publication number
TWI769755B
TWI769755B · TW110110678A
Authority
TW
Taiwan
Prior art keywords
eye movement
region
interest
file
time
Prior art date
Application number
TW110110678A
Other languages
Chinese (zh)
Other versions
TW202237023A (en)
Inventor
王岱伊
Original Assignee
靜宜大學
Priority date
Filing date
Publication date
Application filed by 靜宜大學 filed Critical 靜宜大學
Priority to TW110110678A priority Critical patent/TWI769755B/en
Application granted granted Critical
Publication of TWI769755B publication Critical patent/TWI769755B/en
Publication of TW202237023A publication Critical patent/TW202237023A/en

Landscapes

  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

This invention relates to an eye movement index analysis method for moving stimuli within areas of interest. A visual image capturing device captures and records the user's viewable range to generate a process video file containing moving stimuli, while an eye tracker synchronously detects the position and movement of the user's eyes to generate an eye movement data file. A processing device then performs temporal and spatial synchronization between the areas of interest (AOIs) defined by the researcher and the user's eye movements, and generates an eye tracking index file.

Description

An eye movement index analysis method for a region of interest of a dynamic stimulus

The present invention relates to an eye movement analysis method, and in particular to a method for analyzing a user's eye movement behavior within a movable region of interest.

In the past, a user's attention distribution and a learner's cognitive processes were usually studied through self-report questionnaires, or through interviews or think-aloud protocols followed by the researcher's verbal analysis. These process-analysis methods, however, are vulnerable to the subjects' memory bias and their desire to conform to social expectations, and therefore often fail to reflect the subjects' true internal cognitive processes.

To obtain more objective behavioral data, eye movement monitoring has in recent years provided natural, real-time measurements for exploring topics such as cognition, emotion, and motivation, and eye tracking technology has accordingly been adopted in many fields. Eye tracking uses optical instruments to track and record eye movements; by observing and analyzing the trajectory of those movements, the distribution of human visual perception or attention can be studied and the underlying cognitive processes inferred. Mayer (2010), in the Learning and Instruction special issue on multimedia research and eye-tracking data, noted that eye-tracking data let researchers understand how learners process knowledge during learning: when an eye tracker is used as an instructional research tool, it reveals where and on what teaching material an effect occurs, when it occurs, and how long it lasts, and, more importantly, how it occurs, exposing the finer internal processes of reading.

Although many studies have used eye trackers to explore learners' attention distribution while reading text or pictures, learning from dynamic videos or dynamic systems poses a different problem: unlike static text or pictures, a dynamic stimulus does not stay in the same position for long. Conventional analysis techniques therefore cannot focus on the eye movements directed at a dynamic video or dynamic system and produce the learner's eye movement index data. How to analyze a learner's eye movement behavior within a movable region of interest, and to obtain eye movement index data on the subject's eyes falling within that region, is thus the problem this application addresses.

One objective of the present invention is to record and integrate eye movement trajectories so as to analyze and obtain eye movement index data on the subject's eyes falling within a region of interest.

The eye movement index analysis method for a region of interest of a dynamic stimulus of the present invention is used in an eye movement index analysis system for a region of interest of a dynamic stimulus, for analyzing the attention distribution of a subject's visual field with respect to at least one stimulus. The analysis system includes a stimulus recording device, an eye movement detection device, a processing device, and a display device. The processing device is signal-connected to the stimulus recording device, the eye movement detection device, and the display device, respectively, and has a stimulus-and-ROI overlay playback module, a region-of-interest setting module, an image spatiotemporal synchronization module, a time synchronization module, and an ROI eye movement index analysis module. The eye movement analysis method includes: a) recording the visual field content browsed by the subject with the stimulus recording device; b) detecting and recording an eye movement trajectory of the subject with the eye movement detection device; c) upon completion of recording, generating eye movement trajectory data and a stimulus process video file; d) playing the stimulus process video file with the stimulus-and-ROI overlay playback module; e) presenting, on the display device, a stimulus process image of the stimulus process video file; f) setting a time interval and a spatial region on the stimulus process image with the region-of-interest setting module; g) generating a region-of-interest and interval-of-interest profile; h) synchronizing the stimulus process video file with the region-of-interest and interval-of-interest profile in time and space with the image spatiotemporal synchronization module; i) generating a spatiotemporal synchronization file; j) playing the synchronization file with the stimulus-and-ROI overlay playback module; k) presenting, on the display device, a process image of an ROI-framed process video file; m) synchronizing the timestamps of the ROI-framed process video file and the eye movement trajectory data with the time synchronization module; n) generating a time synchronization file; p) correcting the eye movement trajectory data; q) analyzing the time synchronization file with the ROI eye movement index analysis module; and r) generating an eye movement index file.

The eye movement index analysis method for a region of interest of a dynamic stimulus of the present invention captures and records the user's visual field with the stimulus recording device while the eye movement analysis module synchronously detects the user's eye movement trajectory to generate an eye movement trajectory data file. The data are analyzed, processed, and time-synchronized to obtain an eye movement index file, thereby recording and integrating the eye movement trajectory, analyzing and obtaining eye movement index data on the subject's eyes falling within the region of interest, and achieving the above objective.

1: Stimulus recording device

2: Eye movement detection device

2': Head-mounted eye movement detection device

3: Processing device

4: Display device

31: Stimulus-and-ROI overlay playback module

32: Region-of-interest setting module

33: Image spatiotemporal synchronization module

34: Time synchronization module

35: ROI eye movement index analysis module

301~319, 401~419, 3191~3193: Steps

Fig. 1 is a block diagram of an eye movement analysis system used by an eye movement index analysis method for a region of interest of a dynamic stimulus according to an embodiment of the present application;
Fig. 2 is a block diagram of the processing module of the eye movement index analysis method of Fig. 1;
Fig. 3 is a flowchart of the eye movement index analysis method of Fig. 1;
Fig. 4 is a block diagram of an eye movement analysis system used by an eye movement index analysis method for a region of interest of a dynamic stimulus according to an embodiment of the present application;
Fig. 5 is a flowchart of the eye movement index analysis method of Fig. 4;
Fig. 6 is a schematic diagram of regions of interest superimposed on the stimulus process video file in the eye movement index analysis methods of Figs. 3 and 5;
Fig. 7 is a flowchart of an eye movement index analysis method for a region of interest of a dynamic stimulus according to another embodiment of the present application; and
Fig. 8 is a schematic diagram of the ROI state transition table generated by the eye movement index analysis method of Fig. 7.

According to an embodiment of the present invention, an eye movement index analysis method for a region of interest of a dynamic stimulus is used in an eye movement index analysis system for a region of interest of a dynamic stimulus, for analyzing the attention distribution of a subject's visual field with respect to at least one stimulus. As shown in Figs. 1 and 2, the eye movement index analysis system includes a stimulus recording device 1, an eye movement detection device 2, a processing device 3, and a display device 4. The processing device 3 is signal-connected to the stimulus recording device 1, the eye movement detection device 2, and the display device 4, respectively, and has a stimulus-and-ROI overlay playback module 31, a region-of-interest setting module 32, an image spatiotemporal synchronization module 33, a time synchronization module 34, and an ROI eye movement index analysis module 35.

In this embodiment, the stimulus recording device 1 may be a screen recorder that records the content of a screen (not shown) of a stimulus display module (not shown) the user is observing, so as to capture the displayed content containing the stimulus. Referring also to Fig. 3: in step 301, the eye movement detection device 2 performs eye movement calibration over a display range of the screen of the stimulus display module; in step 302, the stimulus recording device 1 starts recording all of the subject's browsing behavior on the screen content; in step 303, the eye movement detection device 2 detects and records the subject's eye movement trajectory; in step 304, when recording is complete, eye movement trajectory data and a stimulus process video file are generated; and in step 305, the stimulus-and-ROI overlay playback module 31 plays the stimulus process video file.

In step 306, the display device 4 presents a stimulus process image of the stimulus process video file; in step 307, the region-of-interest setting module 32 sets a time interval and a spatial region on the stimulus process image; in step 308, a region-of-interest and interval-of-interest profile is generated; in step 309, the image spatiotemporal synchronization module 33 synchronizes the stimulus process video file with the region-of-interest and interval-of-interest profile in time and space; in step 310, a spatiotemporal synchronization file is generated; in step 311, the stimulus-and-ROI overlay playback module 31 plays the synchronization file; in step 312, the display device 4 presents a process image of an ROI-framed process video file; and in step 313, it is determined whether another region-of-interest and interval-of-interest profile needs to be set. If so, the method returns to step 307 and generates another profile in the same way.

If not, in step 314 the time synchronization module 34 synchronizes the timestamps of the ROI-framed process video file and the eye movement trajectory data; in step 315, a time synchronization file is generated; and in step 316, the stimulus-and-ROI overlay playback module 31 determines whether the time synchronization succeeded. If not, the method proceeds to step 317 and applies time correction to the eye movement trajectory data; if synchronization succeeded, the method proceeds to step 318, where the ROI eye movement index analysis module 35 analyzes the time synchronization file, and finally, in step 319, an eye movement index file is generated.
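The time synchronization of steps 314-317 aligns the timestamps of the eye movement trajectory data with those of the video file. The patent does not disclose the module's algorithm; at its simplest, such an alignment amounts to shifting one clock onto the other by the constant offset between the two recordings' start times. A minimal Python sketch under that assumption (all names illustrative):

```python
def align_to_video(eye_timestamps, eye_t0, video_t0):
    """Map eye-tracker timestamps onto the video clock by removing the
    constant offset between the two recordings' start times.
    A minimal sketch only: the patent's time synchronization module works
    on the two files' timestamps but does not disclose its algorithm."""
    offset = eye_t0 - video_t0
    # Each eye sample is re-expressed on the video's time axis.
    return [t - offset for t in eye_timestamps]
```

If the two devices' clocks drift rather than differ by a constant offset, a per-sample correction (e.g., a linear fit between matched timestamp pairs) would be needed instead; the constant-offset form is only the simplest consistent reading of the text.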

According to another embodiment of the present invention, an eye movement index analysis method for a region of interest of a dynamic stimulus is shown in Figs. 2, 4, and 5. In this embodiment, the eye movement detection device of the previous embodiment is replaced by a head-mounted eye movement detection device 2', which captures real-world images within the user's visual field. In step 401, eye movement calibration is performed for the head-mounted eye movement detection device 2'; in step 402, the stimulus recording device 1 starts recording the real world of the field the subject is browsing; in step 403, the head-mounted eye movement detection device 2' detects and records the subject's eye movement trajectory; in step 404, when recording is complete, eye movement trajectory data and a stimulus process video file are generated; and in step 405, the stimulus-and-ROI overlay playback module 31 plays the stimulus process video file.

In step 406, the display device 4 presents a stimulus process image of the stimulus process video file; in step 407, the region-of-interest setting module 32 sets a time interval and a spatial region on the stimulus process image; in step 408, a region-of-interest and interval-of-interest profile is generated; in step 409, the image spatiotemporal synchronization module 33 synchronizes the stimulus process video file with the region-of-interest and interval-of-interest profile in time and space; in step 410, a spatiotemporal synchronization file is generated; in step 411, the stimulus-and-ROI overlay playback module 31 plays the synchronization file; in step 412, the display device 4 presents a process image of an ROI-framed process video file; and in step 413, it is determined whether another region-of-interest and interval-of-interest profile needs to be set. If so, the method returns to step 407 and generates another profile in the same way.

If not, in step 414 the time synchronization module 34 synchronizes the timestamps of the ROI-framed process video file and the eye movement trajectory data; in step 415, a time synchronization file is generated; and in step 416, the stimulus-and-ROI overlay playback module 31 determines whether the time synchronization succeeded. If not, the method proceeds to step 417 and applies time correction to the eye movement trajectory data; if synchronization succeeded, the method proceeds to step 418, where the ROI eye movement index analysis module 35 analyzes the time synchronization file, and finally, in step 419, an eye movement index file is generated.

The region-of-interest and interval-of-interest profile in the two foregoing embodiments includes ROI name information, start time information, end time information, region-start upper-left x-coordinate information, region-start upper-left y-coordinate information, region-start lower-right x-coordinate information, region-start lower-right y-coordinate information, region-end upper-left x-coordinate information, region-end upper-left y-coordinate information, region-end lower-right x-coordinate information, region-end lower-right y-coordinate information, and annotation information. Referring also to Fig. 6, a user's area of interest (AOI) is described in the format "AOI name, start time, end time, upper-left x, upper-left y, lower-right x, lower-right y, [upper-left x, upper-left y, lower-right x, lower-right y] // annotation". For example, "heart,14.3,15.5,649,172,709,248,377,18,978,655//heart shifts and grows larger" means that an AOI named "heart" appears in the video between 14.3 and 15.5 seconds; when it first appears (at 14.3 s), its rectangle has upper-left corner (649,172) and lower-right corner (709,248), and it then continuously pans and resizes until, at the end (15.5 s), its rectangle becomes upper-left (377,18) and lower-right (978,655). A static AOI that neither moves nor resizes can be simplified to "heart,11.4,14.3,649,172,709,248//heart", meaning that the AOI named "heart" appears in the video between 11.4 and 14.3 seconds with fixed size and position, its rectangle being upper-left (649,172) and lower-right (709,248).
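The AOI description above is a simple comma-separated line with an optional second rectangle and a "//" annotation. A small Python parser for such lines might look as follows (the `AoiSegment` class and its field names are illustrative, not from the patent):

```python
from dataclasses import dataclass
from typing import Tuple

Rect = Tuple[float, float, float, float]  # (x1, y1, x2, y2): upper-left, lower-right

@dataclass
class AoiSegment:
    name: str
    start: float        # second at which this segment of the AOI begins
    end: float          # second at which this segment of the AOI ends
    start_rect: Rect    # rectangle when the segment begins
    end_rect: Rect      # rectangle when the segment ends (== start_rect for a static AOI)
    note: str = ""

def parse_aoi_line(line: str) -> AoiSegment:
    """Parse 'name,start,end,x1,y1,x2,y2[,x1,y1,x2,y2]//note'."""
    body, _, note = line.partition("//")
    parts = [p.strip() for p in body.split(",")]
    name = parts[0]
    start, end = float(parts[1]), float(parts[2])
    nums = [float(p) for p in parts[3:]]
    start_rect: Rect = (nums[0], nums[1], nums[2], nums[3])
    # The second rectangle is present only when the AOI pans or resizes.
    end_rect: Rect = (nums[4], nums[5], nums[6], nums[7]) if len(nums) >= 8 else start_rect
    return AoiSegment(name, start, end, start_rect, end_rect, note.strip())
```

Parsing the moving "heart" line from the text yields `start_rect == (649, 172, 709, 248)` and `end_rect == (377, 18, 978, 655)`; parsing the static line yields identical start and end rectangles.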

Because an AOI can behave in complex ways over the course of a video (appearing and disappearing repeatedly, continually changing size and position, and so on), it usually takes multiple description lines to define. For example, the complete definition of the "heart" AOI is as follows:

heart,11.4,14.3,649,172,709,248//heart

heart,14.3,15.5,649,172,709,248,377,18,978,655//heart shifts and grows larger

heart,15.5,40.4,377,18,978,655//heart

heart,43.5,57,423,31,913,636//heart
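Within a moving segment, the AOI continuously pans and resizes between its start and end rectangles, but the text does not state the interpolation scheme. The sketch below therefore assumes linear interpolation, and then tests whether a gaze sample falls inside the interpolated rectangle (function and parameter names are illustrative):

```python
from typing import Optional, Tuple

Rect = Tuple[float, float, float, float]  # (x1, y1, x2, y2): upper-left, lower-right

def rect_at(start: float, end: float,
            start_rect: Rect, end_rect: Rect, t: float) -> Optional[Rect]:
    """AOI rectangle at time t, assuming (as a sketch) linear interpolation
    between the segment's start and end rectangles; None when the AOI is
    not on screen at t."""
    if not (start <= t <= end):
        return None
    f = 0.0 if end == start else (t - start) / (end - start)
    x1, y1, x2, y2 = (a + f * (b - a) for a, b in zip(start_rect, end_rect))
    return (x1, y1, x2, y2)

def gaze_in_aoi(start: float, end: float,
                start_rect: Rect, end_rect: Rect,
                t: float, gx: float, gy: float) -> bool:
    """True if the gaze sample (gx, gy) at time t falls inside the AOI."""
    r = rect_at(start, end, start_rect, end_rect, t)
    return r is not None and r[0] <= gx <= r[2] and r[1] <= gy <= r[3]
```

For the moving "heart" segment (14.3 s to 15.5 s), the midpoint rectangle under this assumption is roughly (513, 95) to (843.5, 451.5), so a gaze point at (600, 100) at 14.9 s would count as inside the AOI.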

According to another embodiment of the present invention, an eye movement analysis method for the eye movement analysis system is shown in Figs. 3 and 7, in which step 319 further includes the following steps. First, in step 3191, the information in the eye movement index file is simplified and converted into a raw data file. Next, in step 3192, referring also to Fig. 8, the raw data file is converted into an ROI state transition table, which includes start time information, end time information, state information, ROI name information, and duration information.

In the ROI state transition table, many states last a very short time, often less than 100 milliseconds, so states that are too short can be deleted; for example, row 08, whose fixation lasts only 6 milliseconds, is removed.
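The deletion rule described above, dropping state rows whose duration falls below a threshold such as 100 ms, can be sketched as a simple filter (the row dictionary keys are illustrative, not from the patent):

```python
def drop_short_states(rows, min_duration_ms=100):
    """Keep only state-transition rows whose duration reaches the threshold.
    The text cites 100 ms as the cut-off and gives a 6 ms fixation row as
    an example of one that is removed.  Each row is a dict with an
    (illustrative) 'duration_ms' key."""
    return [row for row in rows if row["duration_ms"] >= min_duration_ms]
```

Applied to a table containing a 6 ms fixation and a 250 ms fixation, only the 250 ms row survives.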

Finally, in step 3193, an eye movement index statistics file is produced. The statistics file has a set of fields including each student's total fixation time and, for each region of interest, the time of the first fixation, the duration of the first fixation, the total number of fixations, and the total fixation duration.
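Given a list of fixation records already assigned to AOIs, the per-AOI fields named above can be aggregated in a few lines. A sketch with illustrative record keys, since the patent does not specify the file layout:

```python
def aoi_statistics(fixations):
    """Aggregate the per-AOI indices named in the text: time of first
    fixation, first-fixation duration, fixation count, and total fixation
    duration.  `fixations` is a list of dicts with (illustrative) keys
    'aoi', 'start', and 'duration'."""
    stats = {}
    for f in sorted(fixations, key=lambda f: f["start"]):
        # The first record seen for an AOI (in time order) defines its
        # first-fixation fields; later records only update the totals.
        s = stats.setdefault(f["aoi"], {
            "first_fixation_time": f["start"],
            "first_fixation_duration": f["duration"],
            "fixation_count": 0,
            "total_fixation_duration": 0.0,
        })
        s["fixation_count"] += 1
        s["total_fixation_duration"] += f["duration"]
    return stats
```

A student's total fixation time would then be the sum of `total_fixation_duration` across all AOIs (plus any fixations outside every AOI, depending on the chosen definition).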

The eye movement index analysis method for a region of interest of a dynamic stimulus of the present invention captures and records the user's visual field with the stimulus recording device while the eye movement analysis module synchronously detects the user's eye movement trajectory to generate an eye movement trajectory data file. The units of the processing module analyze, process, and time-synchronize the data, and record and integrate the user's eye movement trajectory, so as to analyze and obtain eye movement index data on the subject's eyes falling within the region of interest and achieve the above objective.

Although the present invention has been disclosed above by way of preferred embodiments, they are not intended to limit the invention. Anyone skilled in the art may make minor changes and refinements without departing from the spirit and scope of the invention; the scope of protection of the invention is therefore defined by the appended claims.

301~319: Steps

Claims (7)

1. An eye movement analysis method for use in an eye movement index analysis system for a region of interest of a dynamic stimulus, for analyzing the attention distribution of a subject's visual field with respect to at least one stimulus, the eye movement index analysis system comprising a stimulus recording device, an eye movement detection device, a processing device, and a display device, the processing device being signal-connected to the stimulus recording device, the eye movement detection device, and the display device, respectively, and having a stimulus-and-ROI overlay playback module, a region-of-interest setting module, an image spatiotemporal synchronization module, a time synchronization module, and an ROI eye movement index analysis module, wherein the eye movement analysis method comprises: a) recording the visual field content browsed by the subject with the stimulus recording device; b) detecting and recording an eye movement trajectory of the subject with the eye movement detection device; c) upon completion of recording, generating eye movement trajectory data and a stimulus process video file; d) playing the stimulus process video file with the stimulus-and-ROI overlay playback module; e) presenting, on the display device, a stimulus process image of the stimulus process video file; f) setting a time interval and a spatial region on the stimulus process image with the region-of-interest setting module; g) generating a region-of-interest and interval-of-interest profile; h) synchronizing the stimulus process video file with the region-of-interest and interval-of-interest profile in time and space with the image spatiotemporal synchronization module, wherein the region-of-interest and interval-of-interest profile includes ROI name information, start time information, end time information, region-start upper-left x-coordinate information, region-start upper-left y-coordinate information, region-start lower-right x-coordinate information, region-start lower-right y-coordinate information, region-end upper-left x-coordinate information, region-end upper-left y-coordinate information, region-end lower-right x-coordinate information, region-end lower-right y-coordinate information, and annotation information; i) generating a spatiotemporal synchronization file; j) playing the synchronization file with the stimulus-and-ROI overlay playback module; k) presenting, on the display device, a process image of an ROI-framed process video file; l) determining whether another region-of-interest and interval-of-interest profile needs to be set and, if so, repeating step f); m) synchronizing the timestamps of the ROI-framed process video file and the eye movement trajectory data with the time synchronization module; n) generating a time synchronization file; p) modifying the eye movement trajectory data; q) analyzing the time synchronization file with the ROI eye movement index analysis module; and r) generating an eye movement index file.

2. The eye movement analysis method of claim 1, further comprising, before step a), a step a') of performing eye movement calibration over a display range of a screen of a stimulus display module with the eye movement detection device.

3. The eye movement analysis method of claim 2, wherein step a) includes a step a1) in which the stimulus recording device starts recording all of the subject's browsing behavior on the screen content.

4. The eye movement analysis method of claim 1, wherein the eye movement detection device is a head-mounted eye movement detection device, and the method further comprises, before step a), a step a'') of performing eye movement calibration for the head-mounted eye movement detection device.

5. The eye movement analysis method of claim 2, wherein step a) includes a step a2) in which the stimulus recording device starts recording a real world of the field the subject is browsing.

6. The eye movement analysis method of claim 1, further comprising, between step n) and step p), a step o) of determining, with the stimulus-and-ROI overlay playback module, whether the time synchronization file succeeded and, if it did not, applying time correction to the eye movement trajectory data.

7. The eye movement analysis method of claim 1, wherein step r) includes the following steps: r1) simplifying and converting the information in the eye movement index file into a raw data file; r2) converting the raw data file into an ROI state transition table and deleting information whose occurrence time is below a first predetermined time, the state transition table including start time information, end time information, state information, ROI name information, and duration information; and r3) producing an eye movement index statistics file having a set of fields including each student's total fixation time and, for each region of interest, the time of the first fixation, the duration of the first fixation, the total number of fixations, and the total fixation duration.
TW110110678A 2021-03-24 2021-03-24 An eye movement index analysis method for moving stimulating objects on areas of interest TWI769755B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
TW110110678A TWI769755B (en) 2021-03-24 2021-03-24 An eye movement index analysis method for moving stimulating objects on areas of interest


Publications (2)

Publication Number Publication Date
TWI769755B true TWI769755B (en) 2022-07-01
TW202237023A TW202237023A (en) 2022-10-01

Family

ID=83439494

Family Applications (1)

Application Number Title Priority Date Filing Date
TW110110678A TWI769755B (en) 2021-03-24 2021-03-24 An eye movement index analysis method for moving stimulating objects on areas of interest

Country Status (1)

Country Link
TW (1) TWI769755B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW201928597A (en) * 2017-12-11 2019-07-16 國立臺灣師範大學 Eye movement analysis system for automatically extracting and dynamically comparing area of interest including a pre-processing module and a digitalized file playing module and applied to digital file playback
US20200195940A1 (en) * 2018-12-14 2020-06-18 Apple Inc. Gaze-Driven Recording of Video


Also Published As

Publication number Publication date
TW202237023A (en) 2022-10-01

Similar Documents

Publication Publication Date Title
Conklin et al. Eye-tracking: A guide for applied linguistics research
DeAngelus et al. Top-down control of eye movements: Yarbus revisited
US10342472B2 (en) Systems and methods for assessing and improving sustained attention
Wilkinson et al. Eye tracking research to answer questions about augmentative and alternative communication assessment and intervention
Wang et al. Multi-sensor eye-tracking systems and tools for capturing Student attention and understanding engagement in learning: A review
Ma et al. Glancee: An adaptable system for instructors to grasp student learning status in synchronous online classes
Chukoskie et al. Quantifying gaze behavior during real-world interactions using automated object, face, and fixation detection
Dawood et al. Affective computational model to extract natural affective states of students with Asperger syndrome (AS) in computer-based learning environment
Kassner et al. PUPIL: constructing the space of visual attention
Shukla et al. SMART-T: A system for novel fully automated anticipatory eye-tracking paradigms
Orlosky et al. Using eye tracked virtual reality to classify understanding of vocabulary in recall tasks
WO2023041940A1 (en) Gaze-based behavioural monitoring system
El Haddioui et al. Learner behavior analysis through eye tracking
TWI679558B (en) The method of identifying fixations real-time from the raw eye- tracking data and a real-time identifying fixations system applying this method
Pande et al. Eye-tracking in STEM education research: limitations, experiences and possible extensions
Prendinger et al. Understanding the effect of life-like interface agents through users' eye movements
Kacorri et al. Comparing native signers' perception of American Sign Language animations and videos via eye tracking
TWI769755B (en) An eye movement index analysis method for moving stimulating objects on areas of interest
Hynes et al. An evaluation of lower facial micro expressions as an implicit QoE metric for an augmented reality procedure assistance application
Mirault et al. Using virtual reality to assess reading fluency in children
JP2022135476A (en) Information processing apparatus and program
Ali et al. A Review on Different Approaches for Assessing Student Attentiveness in Classroom using Behavioural Elements
Chen et al. Applicable prospects of eye tracking technology in the research of landscape visual perception
El Haddioui Eye Tracking Applications for E-Learning Purposes: An Overview and Perspectives
Khosravi et al. Employing a Wearable Eye-tracker to Observe Mind-wandering in Dynamic Stimuli