CN116320233A - Automatic recording system and method for tracking eye movements by intelligent glasses - Google Patents
- Publication number
- CN116320233A (application number CN202211550608.2A)
- Authority
- CN
- China
- Prior art keywords
- intelligent glasses
- wearer
- image
- eye
- eyes
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/14—Arrangements specially adapted for eye photography
- A61B3/145—Arrangements specially adapted for eye photography by video means
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/76—Television signal recording
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02B—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
- Y02B20/00—Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
- Y02B20/40—Control techniques providing energy savings, e.g. smart controller or presence detection
Abstract
An automatic photographing and recording system and method that uses intelligent glasses to track eye movements. The intelligent glasses learn the wearer's eye-movement habits, capture action signals, and use eye-tracking technology to automatically photograph or record the scene the wearer is viewing. The glasses simultaneously analyze the quality of the wearer's field-of-view image and, combining the focal-point coordinates of the binocular gaze, perform region-based rendering of the image around the visual focus according to the gaze-point rendering principle. Exposure, illumination uniformity, and sharpness are adjusted automatically, and the processed image or video is stored on the terminal device. Several shooting modes are supported, including live photographing, video recording, and automatic shooting.
Description
Technical Field
The invention belongs to the technical field of eye movement tracking.
Background
Automatic photographing in current intelligent terminals, such as intelligent glasses, relies on software algorithms that evaluate parameters such as camera brightness, exposure, and sensitivity. A photograph is taken automatically only when these parameters meet preset requirements; otherwise further optimization is needed. Existing automatic photographing methods cannot be configured to individual users. The embodiments of this application provide an automatic photographing and recording system and method in which intelligent glasses track eye movements, solving the problem that automatic photographing cannot be personalized.
Disclosure of Invention
This patent discloses an automatic recording system and method in which intelligent glasses track eye movements; the method is as follows:
Step 1: the cameras of the intelligent glasses automatically acquire the wearer's eye information;
Step 11: two cameras in the intelligent glasses separately acquire the wearer's eye information;
Step 12: the eye information includes the wearer's pupil center and corneal reflection center, from which an eye image and a pupil image are acquired;
Step 2: tracking the wearer's eye movements;
Step 21: based on the eye image and pupil image acquired by the intelligent glasses' cameras, the gaze angles of the wearer's eyes are obtained. The gaze angles include the left-right and up-down rotation angles of the eyes. By automatically tracking the wearer's eye movements, the glasses learn the wearer's eye habits and form the wearer's field-of-view angle threshold.
Step 22: a sensor built into the intelligent glasses obtains the wearer's face angle. From the face angle and the gaze angles of both eyes, the origin coordinates of each eye's line of sight are determined, i.e., the starting points of the wearer's gaze, and the focal-point coordinates of the binocular gaze are calculated from these two coordinates.
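Step 22 can be sketched as follows. This is a minimal illustration under assumed conventions (eye origins in a head-fixed coordinate frame, gaze expressed as yaw/pitch angles), not the patent's actual implementation; the focal point is taken as the midpoint of the shortest segment between the two gaze rays.

```python
import numpy as np

def gaze_direction(yaw_deg, pitch_deg):
    """Unit gaze vector from left-right (yaw) and up-down (pitch) rotation angles."""
    yaw, pitch = np.radians(yaw_deg), np.radians(pitch_deg)
    return np.array([np.sin(yaw) * np.cos(pitch),
                     np.sin(pitch),
                     np.cos(yaw) * np.cos(pitch)])

def binocular_focus(o_left, d_left, o_right, d_right):
    """Midpoint of the shortest segment between the two gaze rays,
    taken as the focal point of the binocular gaze."""
    w0 = o_left - o_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w0, d_right @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:        # parallel lines of sight: focus at infinity
        return None
    s = (b * e - c * d) / denom   # parameter along the left gaze ray
    t = (a * e - b * d) / denom   # parameter along the right gaze ray
    return (o_left + s * d_left + o_right + t * d_right) / 2.0
```

With both rays aimed at the same point, the function recovers that point exactly; with slightly divergent measurements it degrades gracefully to the nearest mutual point.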
Step 3: the wearer defines the action commands for photographing and video recording, and the intelligent glasses record the commands to the terminal device;
Step 31: the wearer can define the action commands for photographing or recording, and the intelligent glasses learn and record these commands.
Step 32: the motion amplitude required to trigger recognition of an action feature is determined by a sensitivity level preset by the wearer; if the wearer selects no level, the system applies a default. When an action feature of the corresponding amplitude is recognized at the active sensitivity level, the photographing or recording operation is triggered.
Step 33: the action commands and the sensitivity level can be reviewed on the terminal device and, if needed, adjusted, changed, or deleted there.
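As a minimal illustration of steps 31-33 — the level names, threshold values, and default are assumptions for the sketch, not values from the patent — the sensitivity level can be modeled as a minimum motion amplitude:

```python
# Hypothetical sensitivity levels: higher sensitivity triggers on smaller motions.
SENSITIVITY_THRESHOLDS = {"low": 0.8, "medium": 0.5, "high": 0.3}
DEFAULT_LEVEL = "medium"  # applied when the wearer selects no level (step 32)

def should_trigger(motion_amplitude, level=None):
    """True when a recognized action's amplitude reaches the threshold
    for the wearer's sensitivity level (system default if unset)."""
    return motion_amplitude >= SENSITIVITY_THRESHOLDS[level or DEFAULT_LEVEL]
```

Adjusting or deleting entries in the table models the terminal-side editing described in step 33.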
Step 4: the intelligent glasses receive an action command for photographing or recording;
Step 41: after receiving a command, the intelligent glasses determine the wearer's current field-of-view image based on the field-of-view angle threshold and the focal-point coordinates of the binocular gaze.
Step 5: the intelligent glasses automatically render the wearer's field-of-view image;
Step 51: for a typical human, the image at the visual focus has the highest definition because light there is processed by the cone cells of the fovea. Toward the periphery of the visual focus the image becomes progressively blurred, as the density of cone cells falls and the density of rod cells rises.
The intelligent glasses analyze the quality of the wearer's field-of-view image and, using the focal-point coordinates of the binocular gaze, automatically adjust the image's exposure, illumination uniformity, and sharpness.
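The gaze-point rendering idea in step 51 might be sketched as below, assuming a grayscale float image; the blur kernel size and the blending ramp (sharp inside the radius, fully blurred at three times the radius) are illustrative assumptions, not parameters from the patent.

```python
import numpy as np

def box_blur(img, k=5):
    """Cheap separable-free box blur of a 2-D float image (edge padding)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def foveated_render(img, focus_yx, sharp_radius):
    """Keep full sharpness near the gaze focus; blend toward a blurred
    version as distance from the focus grows (gaze-point rendering)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.hypot(ys - focus_yx[0], xs - focus_yx[1])
    # 0 inside the sharp radius, ramping to 1 at 3x the radius
    alpha = np.clip((dist - sharp_radius) / (2 * sharp_radius), 0.0, 1.0)
    return (1 - alpha) * img + alpha * box_blur(img)
```

A production system would use a smoother kernel and eccentricity-dependent blur, but the structure — per-pixel blend weight driven by distance from the visual focus — is the same.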
Step 52: near the visual focus, image exposure uses center spot metering: only the brightness of the scene within a small central area is measured as the basis for automatic exposure. At the periphery of the visual focus, average metering is used, taking the mean brightness over that part of the image as the exposure basis. The vicinity of the visual focus is defined as a circle of radius a centered on the visual focus coordinate; the value of a can be set by the wearer via the terminal device or defaults to a system value. Because the central region carries a higher brightness weight and the edges a lower one, the exposure balances the brightness of the subject and of the surrounding scene. Metering is therefore more precise on the subject, which is particularly suitable for photographing a person against scenery.
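The two-zone metering of step 52 can be sketched as follows; representing the scene as a normalized luminance array and the 70/30 center/periphery weighting are illustrative assumptions (the patent specifies only that the center is weighted more heavily).

```python
import numpy as np

def metered_exposure(luma, focus_yx, a, center_weight=0.7):
    """Weighted auto-exposure basis: spot metering inside radius `a`
    around the visual focus, average metering over the periphery."""
    h, w = luma.shape
    ys, xs = np.mgrid[0:h, 0:w]
    center = np.hypot(ys - focus_yx[0], xs - focus_yx[1]) <= a
    center_mean = luma[center].mean()
    periphery_mean = luma[~center].mean() if (~center).any() else center_mean
    return center_weight * center_mean + (1 - center_weight) * periphery_mean
```

The returned value would feed the exposure controller; the wearer-settable radius `a` corresponds to the configurable "vicinity of the visual focus" in the text.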
Step 53: the human eye's response to luminance is not a linear proportional relation, and the input-output curves of photoelectric conversion devices are generally nonlinear as well. If the intelligent glasses adjusted image exposure without compensation, the illumination uniformity and sharpness of the whole image would not match what the human eye sees, because the response of the image sensor in the glasses' camera is nearly linear. Correction is therefore needed so that the terminal device outputs an image consistent with the eye's response to brightness.
Gamma correction is applied in the image sensor of the camera on the intelligent glasses: the image's gamma curve is edited to perform nonlinear tone adjustment, detecting the dark and light portions of the image signal and increasing their ratio, which improves image contrast. In image-quality adjustment, gamma correction chiefly serves to reveal detail and enhance contrast. Since the human eye is insensitive to differences at high luminance but notices small differences at low luminance, gamma correction widens the differences in the low-luminance portion so that its details become perceptible, and most of an image's perceptible detail lies there.
Because the visual focus in the human field of view is the sharpest and brightest region while the visual periphery is blurred and dim, raising the contrast between the dark and light portions through gamma correction improves the illumination uniformity and definition of the image.
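The gamma correction described above can be illustrated with a simple power-law transfer on a linear sensor image; the value 2.2 is a common display convention assumed here, not a figure specified by the patent.

```python
import numpy as np

def gamma_correct(img, gamma=2.2):
    """Apply gamma correction to a linear-response sensor image in [0, 1].
    out = in ** (1/gamma) stretches dark tones, so low-luminance detail
    becomes visible, approximating the eye's nonlinear brightness response."""
    return np.clip(img, 0.0, 1.0) ** (1.0 / gamma)
```

Note how mid-dark values are lifted while the endpoints 0 and 1 are preserved, which is exactly the "widen the low-luminance differences" behavior the text describes.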
Step 54: the wearer can preset and adjust the levels of the image's exposure, illumination uniformity, and sharpness. If the wearer presets no levels, the system applies default settings, and the image quality of the wearer's field-of-view image is adjusted automatically based on those defaults.
Step 55: the levels of exposure, illumination uniformity, and sharpness can be reviewed on the terminal device and, if needed, adjusted, changed, or reset there.
Step 6: the intelligent glasses complete the photographing or recording operation and transmit the pictures or videos to the terminal device for storage.
Step 61: the intelligent glasses transmit the automatically rendered pictures or videos to the terminal device for storage.
Drawings
FIG. 1 is a flow chart of an automatic recording system and method for tracking eye movements by smart glasses.
Detailed Description
The embodiments of the invention are described below clearly and completely with reference to the accompanying drawings, in which some, but not all, embodiments of the invention are shown. All other embodiments obtained by those skilled in the art based on these embodiments without inventive effort fall within the scope of the invention.
Step 1: the cameras of the intelligent glasses automatically acquire the wearer's eye information;
Step 11: two cameras in the intelligent glasses separately acquire the wearer's eye information: camera A on the left side of the glasses acquires the left eye's information, and camera B on the right side acquires the right eye's information;
Step 12: the eye information includes the wearer's pupil center and corneal reflection center, from which an eye image and a pupil image are acquired;
Step 2: tracking the wearer's eye movements;
Step 21: based on the eye images and pupil images of the left and right eyes acquired by the intelligent glasses' cameras, the gaze angles of the wearer's eyes are obtained. The gaze angles include the left-right and up-down rotation angles of the eyes. By automatically tracking the wearer's eye movements, the glasses learn the wearer's eye habits and form the wearer's field-of-view angle threshold.
For a typical human, for example, the field of view extends roughly 56 degrees upward, 74 degrees downward, 65 degrees toward the nasal side, and 91 degrees toward the temporal side.
Step 22: a sensor built into the intelligent glasses obtains the wearer's face angle. From the face angle and the gaze angles of both eyes, the origin coordinates of each eye's line of sight are determined, i.e., the starting points of the wearer's gaze, and the focal-point coordinates of the binocular gaze are calculated from these two coordinates.
Step 3: the wearer defines the action commands for photographing and video recording, and the intelligent glasses record the commands to the terminal device;
Step 31: the wearer can define the action commands for photographing or recording, for example: rapidly blinking both eyes twice to take a photograph, rapidly blinking the left eye twice to start video recording, and rapidly blinking the right eye twice to stop video recording. The intelligent glasses learn and record the wearer's action commands.
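The gesture-to-action binding described in step 31 could be modeled as a lookup table; the eye, gesture, and action names below are hypothetical placeholders for illustration, not identifiers from the patent.

```python
# Hypothetical mapping of learned eye gestures to camera actions,
# mirroring the example commands in step 31.
GESTURE_ACTIONS = {
    ("both", "double_blink"): "take_photo",
    ("left", "double_blink"): "start_recording",
    ("right", "double_blink"): "stop_recording",
}

def dispatch(eye, gesture):
    """Return the camera action bound to a recognized eye gesture, or None."""
    return GESTURE_ACTIONS.get((eye, gesture))
```

Because the table is plain data, the terminal-side editing of commands (adjust, change, delete) reduces to updating dictionary entries.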
Step 32: the motion amplitude required to trigger recognition of an action feature is determined by a sensitivity level preset by the wearer; if the wearer selects no level, the system applies a default. When an action feature of the corresponding amplitude is recognized at the active sensitivity level, the photographing or recording operation is triggered.
Step 33: the action commands and the sensitivity level can be reviewed on the terminal device and, if needed, adjusted, changed, or deleted there.
Step 4: the intelligent glasses receive a photographing action command;
Step 41: after receiving the photographing command, the intelligent glasses determine the wearer's current field-of-view image based on the field-of-view angle threshold and the focal-point coordinates of the binocular gaze.
Step 5: the intelligent glasses automatically render the wearer's field-of-view image;
Step 51: the intelligent glasses analyze the quality of the wearer's field-of-view image and, using the focal-point coordinates of the binocular gaze, automatically adjust the image's exposure, illumination uniformity, and sharpness.
Step 52: the wearer can preset the adjustment levels of the image's exposure, illumination uniformity, and sharpness. If the wearer presets no levels, the system applies default levels, and the image quality of the wearer's field-of-view image is adjusted automatically based on those defaults.
Step 53: the levels of exposure, illumination uniformity, and sharpness can be reviewed on the terminal device and, if needed, adjusted, changed, or reset there.
Step 6: the intelligent glasses complete the photographing operation and transmit the pictures or videos to the terminal device for storage.
Step 61: the intelligent glasses transmit the automatically rendered pictures to the terminal device for storage.
Claims (7)
1. An automatic photographing system and method for tracking eye movements by intelligent glasses, characterized in that: using eye-tracking technology, the intelligent glasses learn eye-movement habits, capture action signals, photograph or record the scene seen by the wearer, synchronously analyze the quality of the wearer's field-of-view image, automatically adjust the image's exposure, illumination uniformity, and sharpness using the focal-point coordinates of the binocular gaze, and store the processed image or video on the terminal device.
2. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: based on the eye image and pupil image acquired by the intelligent glasses' cameras, the gaze angles of the wearer's eyes are obtained; the gaze angles include the left-right and up-down rotation angles of the eyes; by automatically tracking the wearer's eye movements, the glasses learn the wearer's eye habits and form the wearer's field-of-view angle threshold.
3. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: a sensor built into the intelligent glasses obtains the wearer's face angle; from the face angle and the gaze angles of both eyes, the origin coordinates of each eye's line of sight, i.e., the starting points of the gaze, are determined, and the focal-point coordinates of the binocular gaze are calculated from these two coordinates.
4. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: the action commands for photographing or video recording can be user-defined; the intelligent glasses learn and record the wearer's action commands, and the sensitivity of action-command recognition is adjustable.
5. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: after receiving a command, the intelligent glasses determine the wearer's current field-of-view image based on the field-of-view angle threshold and the focal-point coordinates of the binocular gaze.
6. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: near the visual focus, image exposure uses center spot metering, measuring only the brightness of the scene within a small central area as the basis for automatic exposure; at the periphery of the visual focus, average metering is used, taking the mean brightness over that part of the image as the exposure basis.
7. The automatic recording system and method for tracking eye movements by intelligent glasses according to claim 1, wherein: gamma correction is applied in the image sensor of the camera on the intelligent glasses, i.e., the image's gamma curve is edited to perform nonlinear tone adjustment, detecting the dark and light portions of the image signal and increasing their ratio, thereby improving image contrast.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202211550608.2A CN116320233A (en) | 2022-12-05 | 2022-12-05 | Automatic recording system and method for tracking eye movements by intelligent glasses |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116320233A true CN116320233A (en) | 2023-06-23 |
Family
ID=86789351
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202211550608.2A Pending CN116320233A (en) | 2022-12-05 | 2022-12-05 | Automatic recording system and method for tracking eye movements by intelligent glasses |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116320233A (en) |
- 2022-12-05: application CN202211550608.2A filed in CN; CN116320233A status Pending
Similar Documents
Publication | Title
---|---
CN109633907B (en) | Method for automatically adjusting brightness of monocular AR (augmented reality) glasses and storage medium
EP3035681B1 (en) | Image processing method and apparatus
US10962808B2 (en) | Contact lens with image pickup control
CN103945121B (en) | A kind of information processing method and electronic equipment
CN107181918B (en) | A kind of dynamic filming control method and system for catching video camera of optics
TWI516804B (en) | Head mounted display apparatus and backlight adjustment method thereof
US20170223261A1 (en) | Image pickup device and method of tracking subject thereof
CN103747183B (en) | Mobile phone shooting focusing method
CN105827960A (en) | Imaging method and device
CN103369214A (en) | An image acquiring method and an image acquiring apparatus
US20150304625A1 (en) | Image processing device, method, and recording medium
KR20170011362A (en) | Imaging apparatus and method for the same
CN109725423B (en) | Method for automatically adjusting brightness of monocular AR (augmented reality) glasses and storage medium
CN108200340A (en) | The camera arrangement and photographic method of eye sight line can be detected
CN112666705A (en) | Eye movement tracking device and eye movement tracking method
CN106657801A (en) | Video information acquiring method and apparatus
US20200221005A1 (en) | Method and device for tracking photographing
CN108093170B (en) | User photographing method, device and equipment
CN107436681A (en) | Automatically adjust the mobile terminal and its method of the display size of word
CN108462831B (en) | Image processing method, image processing device, storage medium and electronic equipment
CN108156387A (en) | Terminate the device and method of camera shooting automatically by detecting eye sight line
CN116320233A (en) | Automatic recording system and method for tracking eye movements by intelligent glasses
US20220329740A1 (en) | Electronic apparatus, method for controlling electronic apparatus, and non-transitory computer readable storage medium
KR20180000580A (en) | cost volume calculation apparatus stereo matching system having a illuminator and method therefor
CN110536044B (en) | Automatic certificate photo shooting method and device
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||