WO2022196093A1 - Information processing device, line-of-sight detection method, and program - Google Patents
Information processing device, line-of-sight detection method, and program
- Publication number
- WO2022196093A1 (PCT/JP2022/002165)
- Authority
- WO
- WIPO (PCT)
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/113—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B7/00—Mountings, adjusting means, or light-tight connections, for optical elements
- G02B7/28—Systems for automatic generation of focusing signals
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
Definitions
- This technology relates to an information processing device, a line-of-sight detection method, and a program, and particularly to technology for line-of-sight detection processing.
- A technique for detecting a line of sight using the pupil-corneal reflection method is known. One application is the viewfinder of a camera (imaging device).
- The user may hold the camera vertically or horizontally, so the relative posture between the user and the camera is not constant. The tilt of the eye detected by the line-of-sight detection device may therefore differ significantly from that at the time of calibration, and, for example, multiple sets of calibration data corresponding to multiple camera orientations become necessary.
- Patent Document 1 discloses detecting whether the camera is held vertically or horizontally from its orientation with respect to gravity, using an acceleration sensor or the like, and storing multiple sets of calibration data according to the camera orientation.
- An object of the present disclosure is to handle various relative posture states using the calibration data obtained in a single calibration.
- An information processing apparatus according to the present technology includes a line-of-sight detection calculation unit that performs line-of-sight detection processing for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and that performs correction processing which corrects the calibration data used in the line-of-sight detection processing based on roll angle information, which is information on a relative angle change between the eye and the eye image capturing unit.
- Correction processing based on roll angle information further corrects the calibration data in accordance with the orientation of the user's eye at the time the line of sight is detected.
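For illustration only: if the calibration data reduces to a two-dimensional offset between the optical axis and the visual axis expressed in the eye camera's image plane, the correction might be sketched as follows. The function name and the offset representation are assumptions of this sketch, not part of the disclosure.

```python
import math

def correct_calibration_offset(offset, roll_angle_deg):
    """Rotate a 2D calibration offset (dx, dy) by the detected roll angle.

    Sketch assumption: the calibration data is an offset between the
    optical axis and the visual axis in the eye camera's image plane,
    so a relative roll between eye and camera is compensated by
    rotating that offset by the same angle.
    """
    theta = math.radians(roll_angle_deg)
    dx, dy = offset
    return (dx * math.cos(theta) - dy * math.sin(theta),
            dx * math.sin(theta) + dy * math.cos(theta))
```

Under this model a 90-degree roll maps an offset of (1, 0) to approximately (0, 1), i.e. the stored correction is simply re-expressed in the rolled eye coordinates.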
- The line-of-sight detection calculation unit may acquire, when performing the calibration processing that obtains the calibration data used in the line-of-sight detection processing, reference angle information used to calculate the roll angle information, and store the calibration data in association with the reference angle information. For example, information that changes according to the relative posture relationship between the user's eyes and the information processing device is acquired during the calibration processing and stored as reference angle information in association with the calibration data.
- The line-of-sight detection calculation unit may acquire, during the line-of-sight detection processing, current angle information of the same type as the reference angle information, calculate the roll angle information from the reference angle information and the current angle information, and perform the correction processing. That is, the same kind of information, which changes according to the relative posture relationship between the user's eyes and the information processing device, is acquired at detection time and compared with the reference angle information to obtain the roll angle information.
- The line-of-sight detection calculation unit may calculate the roll angle information from an iris code used in iris authentication processing. Some devices perform iris authentication as a method of personal user authentication, and the iris code used in this iris authentication can also be used to obtain roll angle information during line-of-sight detection processing.
- The line-of-sight detection calculation unit may perform, at an opportunity when an iris code is obtained, the calibration processing for obtaining the calibration data used in the line-of-sight detection processing, and store the obtained iris code as the reference angle information in association with the calibration data. The iris code used in the iris authentication process is information that changes according to the relative posture relationship between the user's eyes and the information processing device; it is therefore acquired when calibration is performed and stored as reference angle information in association with the calibration data.
- The line-of-sight detection calculation unit may use the iris code detected in the iris authentication process as the current angle information, calculate the roll angle information from the reference angle information and the current angle information, and perform the correction processing. That is, the iris code obtained during iris authentication is compared with the pre-stored reference angle information to obtain the roll angle information, and the calibration data is corrected so that it can be used in the line-of-sight detection processing.
- The line-of-sight detection calculation unit may calculate the roll angle information based on the Hamming distance between the iris code serving as the reference angle information and the iris code serving as the current angle information. Since a relative roll shifts the iris code by a number of bits corresponding to the amount of change in the relative posture relationship, the change in angle can be obtained from the Hamming distance.
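As a sketch of this idea: iris codes are sampled along the angular direction of the iris, so a relative roll appears as a circular bit shift, and the shift that minimises the Hamming distance gives the roll angle. The simplified one-bit-per-angular-sample layout below is an assumption of this sketch; practical iris codes pack several bits per angular sample.

```python
def roll_from_iris_codes(reference_code, current_code):
    """Estimate relative roll by finding the circular shift of the
    current iris code that minimises the Hamming distance to the
    reference code.

    Both codes are equal-length bit sequences assumed to be sampled
    uniformly over 360 degrees of the iris, so one bit position
    corresponds to 360 / len(code) degrees of eye roll.
    """
    n = len(reference_code)
    best_shift, best_dist = 0, n + 1
    for shift in range(n):
        rotated = current_code[shift:] + current_code[:shift]
        dist = sum(a != b for a, b in zip(reference_code, rotated))
        if dist < best_dist:
            best_shift, best_dist = shift, dist
    # Interpret shifts beyond half a turn as rotation in the other direction
    if best_shift > n // 2:
        best_shift -= n
    return best_shift * 360.0 / n
```

With a 16-bit code, for example, a circular shift of 3 bits between the stored and the current code corresponds to a roll of 3 × 22.5 = 67.5 degrees.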
- When the iris code detected in the iris authentication process is authenticated as belonging to the same person as the iris code stored as the reference angle information, the line-of-sight detection calculation unit may use the detected iris code as the current angle information and perform the correction processing using the roll angle information calculated from the reference angle information and the current angle information. In other words, correction using the iris code is performed when the user is determined to be a person whose iris code was previously stored as reference angle information.
- When the iris code detected in the iris authentication process is not authenticated as belonging to a person whose iris code is stored as reference angle information, the line-of-sight detection calculation unit may perform the calibration processing and store the iris code acquired in the iris authentication process as the reference angle information in association with the calibration data. If the user is not a person whose iris code was previously stored, comparing the reference angle information with the current angle information cannot yield correct roll angle information; correction processing is therefore not performed, and calibration is carried out anew instead.
- the line-of-sight detection calculation unit may calculate the roll angle information based on eyelid boundary information in the captured image of the eye.
- The boundary information is information related to the eyelid boundary: for example, the eyelid boundary line, feature points near the boundary line, or the circumscribed triangle of the eyelid boundary.
- The line-of-sight detection calculation unit may acquire the boundary information from the captured image of the eye when performing the calibration processing that obtains the calibration data used in the line-of-sight detection processing, and store the boundary information in association with the calibration data as reference angle information used to calculate the roll angle information. The shape of the eyelid boundary appearing in the captured image changes according to the relative posture relationship between the user's eyes and the information processing device, so this boundary information is acquired during calibration and stored as reference angle information in association with the calibration data.
- During line-of-sight detection, the line-of-sight detection calculation unit may acquire the boundary information from the captured image of the eye as current angle information, calculate the roll angle information from the reference angle information and the current angle information, and perform the correction processing. The boundary information obtained at detection time is compared with the pre-stored reference angle information to obtain the roll angle information, and the calibration data is corrected so that it can be used in the line-of-sight detection processing.
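For example, if the boundary information includes the inner and outer eye-corner feature points, the roll angle could be estimated from the change in the angle of the line joining them. The point layout and coordinate convention below are assumptions of this sketch:

```python
import math

def roll_from_eye_corners(ref_corners, cur_corners):
    """Estimate relative roll from eyelid-boundary feature points.

    Each argument is ((x_inner, y_inner), (x_outer, y_outer)) in
    eye-image coordinates.  The roll is the change, between calibration
    time and detection time, in the angle of the line joining the two
    eye corners.
    """
    def corner_line_angle(corners):
        (x1, y1), (x2, y2) = corners
        return math.degrees(math.atan2(y2 - y1, x2 - x1))

    return corner_line_angle(cur_corners) - corner_line_angle(ref_corners)
```

The same scheme extends to other boundary features (for instance the circumscribed triangle mentioned above) by comparing any orientation measure derived from them.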
- The line-of-sight detection calculation unit may calculate the roll angle information based on contact position information from a touch panel. For example, when a touch panel is provided below the viewfinder of the imaging device, roll angle information is obtained by detecting the position at which the user's nose or the like contacts the touch panel when the user looks into the viewfinder.
- The line-of-sight detection calculation unit may acquire the contact position information when performing the calibration processing that obtains the calibration data used in the line-of-sight detection processing, and store the contact position information in association with the calibration data as reference angle information used to calculate the roll angle information. The contact position on the touch panel changes according to the relative posture relationship between the user's eyes and the information processing device, so it is acquired during calibration and stored as reference angle information in association with the calibration data.
- During line-of-sight detection, the line-of-sight detection calculation unit may acquire the contact position information as current angle information, calculate the roll angle information from the reference angle information and the current angle information, and perform the correction processing. The contact position obtained at detection time is compared with the pre-stored reference angle information to obtain the roll angle information, and the calibration data is corrected so that it can be used in the line-of-sight detection processing.
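One crude geometric reading of this, offered purely as a sketch: if the head is assumed to roll about a pivot near the viewfinder axis, a lateral shift of the nose contact point at a known distance below that pivot maps to a roll angle. The pivot model, parameter names, and units here are all assumptions, not part of the disclosure.

```python
import math

def roll_from_contact_position(ref_x, cur_x, pivot_distance_mm):
    """Approximate relative roll from the nose contact point on a touch
    panel below the viewfinder.

    Sketch assumption: the head rolls about a pivot roughly on the
    viewfinder axis, so a lateral shift dx of the contact point at
    pivot_distance_mm below the pivot corresponds to a roll of
    atan(dx / pivot_distance_mm).
    """
    dx = cur_x - ref_x
    return math.degrees(math.atan2(dx, pivot_distance_mm))
```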
- The information processing device may include a detection unit that detects the posture of the device itself, and the line-of-sight detection calculation unit may receive device posture information transmitted from a head-mounted device and calculate the roll angle information based on that information and the device posture information detected by the detection unit. For example, assuming a combination of an imaging device and a head-mounted device, the posture of the head-mounted device corresponds to the posture of the user, so roll angle information for the correction processing can be obtained by comparing the two sets of device posture information.
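If each device reports a roll angle about the viewing axis from its own posture sensor, the comparison could be as simple as a normalised difference. The shared gravity-referenced roll convention below is an assumption of this sketch:

```python
def relative_roll(camera_roll_deg, head_roll_deg):
    """Relative roll between the user's head (head-mounted device) and
    the camera, each reporting its own roll about the viewing axis in
    degrees.

    The result is normalised to (-180, 180]; both devices are assumed
    to use the same gravity-referenced roll convention.
    """
    diff = (head_roll_deg - camera_roll_deg) % 360.0
    if diff > 180.0:
        diff -= 360.0
    return diff
```

The normalisation matters near the wrap-around: a head at 10 degrees and a camera at 350 degrees differ by 20 degrees, not 340.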
- The information processing apparatus may include the eye image capturing unit. That is, an information processing apparatus integrally provided with an eye image capturing section performs both the line-of-sight detection processing based on its eye images and the correction processing of the calibration data.
- The information processing apparatus may include an imaging unit that captures images as well as the eye image capturing unit; that is, it may be configured as an imaging device that performs line-of-sight detection processing.
- In a line-of-sight detection method according to the present technology, an information processing device executes a line-of-sight detection calculation that performs line-of-sight detection processing for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and performs correction processing that corrects the calibration data used in the line-of-sight detection processing based on roll angle information, which is information on a relative angle change between the eye and the eye image capturing unit.
- The information processing device then performs line-of-sight detection using the corrected calibration data.
- A program according to the present technology causes an information processing apparatus to execute the line-of-sight detection calculation described above. This facilitates the realization of the above information processing apparatus.
- FIG. 1 is an explanatory diagram of an example of an information processing device according to an embodiment of the present technology
- FIG. 1 is a perspective view of an imaging device according to an embodiment
- FIG. 1 is a block diagram of an imaging device according to an embodiment
- FIG. 1 is a block diagram of an information processing device according to an embodiment
- FIG. 10 is an explanatory diagram of line-of-sight detection processing
- FIG. 4 is an explanatory diagram of an optical axis and a visual axis
- FIG. 10 is an explanatory diagram of calibration using a plurality of points
- FIG. 4 is an explanatory diagram of generation and application of calibration data
- FIG. 7 is a flow chart of an example of processing when recording calibration data according to the first embodiment
- FIG. 10 is an explanatory diagram of comparing the roll angles of the eyeballs of the same person using iris patterns;
- FIG. 10 is an explanatory diagram of Hamming distance comparison of iris codes according to the first embodiment;
- FIG. 7 is a flowchart of an example of correction processing according to the first embodiment;
- FIG. 11 is an explanatory diagram of detection of roll angle information from an eye image according to the second embodiment;
- FIG. 11 is a flow chart of an example of processing when recording calibration data according to the second embodiment;
- FIG. 10 is a flowchart of an example of correction processing according to the second embodiment;
- FIG. 11 is a flow chart of an example of processing when recording calibration data according to the third embodiment;
- FIG. 11 is a flow chart of an example of correction processing according to the third embodiment;
- FIG. 14 is a flow chart of an example of processing when recording calibration data according to the fourth embodiment
- FIG. 16 is a flow chart of an example of correction processing according to the fourth embodiment
- FIG. 5 is an explanatory diagram of an application example of roll angle information according to the embodiment
- An information processing apparatus is an apparatus capable of performing information processing, and specifically, an apparatus including a microprocessor or the like and capable of performing line-of-sight detection calculations.
- a line-of-sight detection calculation is a calculation as a line-of-sight detection process for detecting a line-of-sight direction based on an eye image captured by an eye image pickup unit inside or outside the apparatus.
- the “eye image” referred to in the present disclosure is an image of a person's eye.
- correction processing is also performed for correcting the calibration data used in the line-of-sight detection processing based on roll angle information, which is information on relative angle changes between the eyes and the eye image capturing unit.
- a processor device such as a CPU or a DSP that performs at least line-of-sight detection processing and correction processing, or a device provided with such a processor device, is the information processing device according to the present disclosure.
- FIG. 1 illustrates an imaging device 1, a terminal device 100, and a table top display 120 as specific device examples corresponding to information processing devices.
- The imaging device 1 is assumed to have a function of detecting the line-of-sight direction of the user looking into the EVF (Electronic Viewfinder) 5.
- Examples of the terminal device 100 include a tablet device and a smartphone. For these terminal devices 100, for example, a configuration that detects the line-of-sight direction of the user with respect to the screen is conceivable.
- The equipment listed in FIG. 1 is only an example; the technology of the present disclosure can be applied to any equipment that detects a line-of-sight direction.
- Devices corresponding to the information processing device of the present disclosure are extremely diverse, such as, for example, television receivers, game machines, personal computers, workstations, robots, monitoring devices, and sensor devices.
- FIG. 2 is a perspective view of the imaging device 1 as seen from the rear side.
- In the following description, the subject side is referred to as the front side, and the photographer side is referred to as the rear side.
- The imaging device 1 includes a camera housing 2 and a lens barrel 3 attached to its front surface portion 2a. The lens barrel 3 being detachable as a so-called interchangeable lens is only an example; it may instead be fixed to the camera housing 2.
- A rear monitor 4 is arranged on the rear surface portion 2b of the camera housing 2.
- the rear monitor 4 displays live view images, reproduced images of recorded images, and the like.
- the rear monitor 4 is configured by a display device such as a liquid crystal display (LCD) or an organic EL (Electro-Luminescence) display.
- The EVF 5 is arranged on the upper surface portion 2c of the camera housing 2.
- the EVF 5 includes an EVF monitor 5a and a frame-shaped enclosing portion 5b that protrudes rearward so as to enclose an upper portion and left and right sides of the EVF monitor 5a.
- the EVF monitor 5a is formed using an LCD, an organic EL display, or the like.
- An optical viewfinder (OVF) may be provided instead of the EVF monitor 5a.
- Various operators 6 are provided on the rear surface portion 2b and the upper surface portion 2c: for example, a shutter button (release button), a playback menu activation button, an enter button, a cross key, a cancel button, a zoom key, a slide key, and the like.
- The operators 6 include various types such as buttons, dials, and composite operators that can be pressed and rotated.
- These various operators 6 enable shutter operation, menu operation, playback operation, mode selection/switching operation, focus operation, zoom operation, and selection/setting of parameters such as shutter speed and F-number.
- FIG. 3 shows the internal configuration of the imaging device 1.
- The imaging apparatus 1 includes, for example, a lens system 11, an imaging element section 12, a camera signal processing section 13, a recording control section 14, a display section 15, a communication section 16, an operation section 17, a camera control section 18, a memory section 19, a driver section 22, a sensor section 23, and a line-of-sight detection device section 41.
- the lens system 11 includes lenses such as a zoom lens and a focus lens, an aperture mechanism, and the like. Light (incident light) from a subject is guided by the lens system 11 and condensed on the imaging element section 12 .
- the imaging device unit 12 is configured by having an image sensor 12a (imaging device) such as a CMOS (Complementary Metal Oxide Semiconductor) type or a CCD (Charge Coupled Device) type.
- The imaging element section 12 performs, on the electrical signal obtained by photoelectric conversion in the image sensor 12a, for example, CDS (Correlated Double Sampling) processing and AGC (Automatic Gain Control) processing, and further performs A/D conversion. The imaging signal as digital data is then output to the camera signal processing section 13 and the camera control section 18 in the subsequent stage.
- the camera signal processing unit 13 is configured as an image processing processor such as a DSP (Digital Signal Processor).
- the camera signal processing section 13 performs various signal processing on the digital signal (captured image signal) from the imaging element section 12 .
- the camera signal processing unit 13 performs preprocessing, synchronization processing, YC generation processing, resolution conversion processing, and the like.
- The camera signal processing unit 13 performs, for example, compression encoding for recording or communication, formatting, and generation and addition of metadata on the image data subjected to the various processes described above, thereby generating a file for recording or communication. For example, a still image file in a format such as JPEG (Joint Photographic Experts Group), TIFF (Tagged Image File Format), or GIF (Graphics Interchange Format) is generated.
- the recording control unit 14 performs recording and reproduction on a recording medium such as a non-volatile memory.
- the recording control unit 14 performs a process of recording image files such as moving image data and still image data, and metadata including thumbnail images and the like on a recording medium, for example.
- the recording control unit 14 may be configured as a flash memory built in the imaging device 1 and its writing/reading circuit.
- the recording control unit 14 may be configured by a card recording/reproducing unit that performs recording/reproducing access to a recording medium detachable from the imaging apparatus 1, such as a memory card (portable flash memory, etc.).
- the recording control unit 14 may be implemented as an HDD (Hard Disk Drive) or the like as a form incorporated in the imaging device 1 .
- The display unit 15 provides various displays to the photographer.
- The display unit 15 is configured by a display device such as a liquid crystal panel (LCD: Liquid Crystal Display) or an organic EL (Electro-Luminescence) display.
- the display unit 15 executes various displays on the display screen based on instructions from the camera control unit 18 .
- the display unit 15 displays a reproduced image of image data read from the recording medium by the recording control unit 14 .
- The display unit 15 may also be supplied with image data of the captured image whose resolution has been converted for display by the camera signal processing unit 13, and display it in response to an instruction from the camera control unit 18. In this way, a so-called through image (a monitoring image of the subject) is displayed while the user checks the composition or records a moving image.
- the display unit 15 displays various operation menus, icons, messages, etc., that is, as a GUI (Graphical User Interface) on the screen based on instructions from the camera control unit 18 .
- the communication unit 16 performs wired or wireless data communication and network communication with external devices. For example, captured image data (still image files and moving image files) and metadata are transmitted and output to external information processing devices, display devices, recording devices, playback devices, and the like.
- The communication unit 16 performs communication via various networks such as the Internet, a home network, or a LAN (Local Area Network), and can transmit and receive various data to and from servers, terminals, and the like on the network.
- The communication unit 16 may also enable the imaging device 1 to perform mutual information communication with, for example, a PC, a smartphone, a tablet terminal, headphones, earphones, or a headset by short-range wireless communication such as Bluetooth (registered trademark), Wi-Fi communication, or NFC, or by infrared communication. Alternatively, the imaging device 1 and other equipment may communicate via a wired connection.
- The operation unit 17 collectively indicates input devices for the user to perform various operation inputs. Specifically, it indicates various operators (keys, dials, touch panels, touch pads, etc.) provided on the housing of the imaging device 1. For example, a touch panel is assumed to be provided on the surface of the rear monitor 4. A user's operation is detected by the operation unit 17, and a signal corresponding to the input operation is sent to the camera control unit 18.
- The line-of-sight detection device unit 41 is a device that captures an eye image for detecting the user's line of sight, and includes an eye image capturing unit 51 (for example, an infrared camera) and the like. For example, by arranging the line-of-sight detection device unit 41 in the EVF 5 shown in FIG. 2, an eye image of the user looking into the EVF 5 can be captured. Such a line-of-sight detection device unit 41 may also be arranged near the rear monitor 4 so that an eye image for detecting the line-of-sight direction of the user looking at the rear monitor 4 can be captured.
- the camera control unit 18 is configured by a microcomputer (information processing device) having a CPU (Central Processing Unit).
- The memory unit 19 stores information and the like that the camera control unit 18 uses for processing, and comprehensively represents, for example, a ROM (Read Only Memory), a RAM (Random Access Memory), a flash memory, and the like.
- the memory section 19 may be a memory area built into a microcomputer chip as the camera control section 18, or may be configured by a separate memory chip.
- The camera control unit 18 controls the entire imaging apparatus 1 by executing programs stored in the ROM, flash memory, or the like of the memory unit 19. For example, the camera control unit 18 controls the shutter speed of the image sensor unit 12, instructs various signal processing in the camera signal processing unit 13, performs imaging and recording operations in response to user operations, reproduces recorded image files, controls operations of the lens system 11 such as zoom, focus, and aperture adjustment in the lens barrel, and controls user interface operations and the like.
- the camera control unit 18 has a function as a line-of-sight detection calculation unit 40 by an application program.
- The line-of-sight detection calculation unit 40 analyzes the eye image obtained from the line-of-sight detection device unit 41 and performs line-of-sight detection processing for detecting the line-of-sight direction of the user. It also performs calibration processing in order to improve the accuracy of this line-of-sight detection. Furthermore, in the case of the present embodiment, a correction process is performed on the calibration data used for the line-of-sight detection process, based on roll angle information, which is information on the relative angle change between the eye and the eye image capturing unit in the line-of-sight detection device unit 41. Details of these processes will be described later.
- The camera control unit 18 can also perform various controls based on the user's line-of-sight direction detected by the function of the line-of-sight detection calculation unit 40. For example, depending on the line-of-sight direction, it is possible to set the focus area so that the subject in the line-of-sight direction is brought into focus, or to perform aperture adjustment control according to the brightness of the subject in the line-of-sight direction.
- the camera control unit 18 may perform so-called AI (artificial intelligence) processing for line-of-sight detection processing, correction processing, and other processing.
- the RAM in the memory unit 19 is used as a work area for the CPU of the camera control unit 18 to perform various data processing, and is used for temporary storage of data, programs, and the like.
- The ROM and flash memory (nonvolatile memory) in the memory unit 19 store an OS (Operating System) for the CPU to control each unit, content files such as image files, application programs for various operations, firmware, and various setting information.
- The various setting information includes communication setting information; setting information related to the imaging operation, such as the exposure setting, shutter speed setting, and mode setting; and setting information related to image processing, such as the white balance setting, color setting, and image effect setting.
- the memory unit 19 also stores programs for line-of-sight detection processing, calibration processing, and correction processing, which will be described later, and data used for these processing.
- the memory unit 19 can also function as a database regarding calibration data for line-of-sight detection processing.
- In addition, authentication data for iris authentication, which will be described later, code information obtained by encoding an iris pattern, pattern data used for object recognition by semantic segmentation, and the like are also stored.
- The driver unit 22 includes, for example, a motor driver for the zoom lens drive motor, a motor driver for the focus lens drive motor, a motor driver for the motor of the aperture mechanism, and the like. These motor drivers apply drive currents to the corresponding motors in accordance with instructions from the camera control unit 18 to move the focus lens and zoom lens, open and close the diaphragm blades of the diaphragm mechanism, and so on.
- the sensor unit 23 comprehensively indicates various sensors mounted on the imaging device.
- For example, an IMU (inertial measurement unit) is mounted as the sensor unit 23.
- the IMU can detect angular velocity with, for example, three-axis angular velocity (gyro) sensors of pitch, yaw, and roll, and acceleration with an acceleration sensor. This makes it possible to detect the orientation of the imaging device 1 with respect to the direction of gravity.
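To make the relationship between the IMU output and the device orientation concrete, here is a minimal sketch (not taken from the patent) of estimating roll and pitch relative to the direction of gravity from a static 3-axis accelerometer reading; the axis convention and function name are assumptions for illustration.

```python
import math

def tilt_from_gravity(ax, ay, az):
    """Estimate roll and pitch (degrees) from a static accelerometer
    reading (ax, ay, az) of the gravity vector in the device frame.
    The axis convention (x right, y up, z toward the scene) is an
    assumption for illustration only."""
    roll = math.degrees(math.atan2(ax, ay))                  # rotation about the z axis
    pitch = math.degrees(math.atan2(az, math.hypot(ax, ay)))
    return roll, pitch
```

In practice the gyro's angular velocities would be fused with such accelerometer-derived angles, since the accelerometer alone cannot separate tilt from linear acceleration.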
- a position information sensor, an illuminance sensor, a range sensor, etc. may be mounted.
- the line-of-sight detection device unit 41 is incorporated in the EVF 5 or near the rear monitor 4, and the line-of-sight direction of the user looking through the EVF 5 or looking at the rear monitor 4 can be detected.
- the camera control unit 18 (line-of-sight detection calculation unit 40) performs line-of-sight detection processing, calibration processing, and calibration data correction processing based on the eye image captured by the line-of-sight detection device unit 41.
- FIG. 4 shows a configuration example of the terminal device 100, such as the tablet device or smartphone illustrated in FIG. 1, as an example of the information processing device of the present disclosure.
- The CPU 71 of the terminal device 100 executes various processes according to a program stored in the ROM 72 or a nonvolatile memory unit 74 such as an EEP-ROM (Electrically Erasable Programmable Read-Only Memory), or a program loaded from the storage unit 79 into the RAM 73.
- the RAM 73 also appropriately stores data necessary for the CPU 71 to execute various processes.
- the CPU 71 , ROM 72 , RAM 73 and nonvolatile memory section 74 are interconnected via a bus 83 .
- An input/output interface 75 is also connected to this bus 83 .
- Since the terminal device 100 is assumed to perform image processing and AI (artificial intelligence) processing, a GPU (Graphics Processing Unit), a GPGPU (General-Purpose computing on Graphics Processing Units), an AI-dedicated processor, or the like may be provided instead of or together with the CPU 71.
- the input/output interface 75 is connected to an input section 76 including operators and operating devices.
- various operators and operation devices such as a keyboard, mouse, key, dial, touch panel, touch pad, remote controller, etc. are assumed.
- a user's operation is detected by the input unit 76 , and a signal corresponding to the input operation is interpreted by the CPU 71 .
- A microphone is also assumed as the input unit 76; a voice uttered by the user can also be input as operation information.
- Various sensing devices such as an image sensor (imaging unit), an acceleration sensor, an angular velocity sensor, a vibration sensor, an air pressure sensor, a temperature sensor, and an illuminance sensor are also assumed as the input unit.
- the input/output interface 75 is connected integrally or separately with a display unit 77 such as an LCD or an organic EL panel, and an audio output unit 78 such as a speaker.
- the display unit 77 is a display unit that performs various displays, and is configured by, for example, a display device provided in the housing of the terminal device 100, a separate display device connected to the terminal device 100, or the like.
- The display unit 77 displays images for various types of image processing, moving images to be processed, and the like on the display screen based on instructions from the CPU 71. Further, the display unit 77 displays various operation menus, icons, messages, and the like, that is, a GUI (Graphical User Interface), based on instructions from the CPU 71.
- the input/output interface 75 may be connected to a storage unit 79 made up of a hard disk, a solid-state memory, etc., and a communication unit 80 made up of a modem or the like.
- the storage unit 79 stores various programs, data files, and the like.
- a database may also be constructed.
- the communication unit 80 performs communication processing via a transmission line such as the Internet, and communication by wired/wireless communication with various devices, bus communication, and the like.
- a drive 81 is also connected to the input/output interface 75 as required, and a removable recording medium 82 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory is appropriately loaded.
- Data files such as image files and various computer programs can be read from the removable recording medium 82 by the drive 81 .
- the read data file is stored in the storage unit 79 , and the image and sound contained in the data file are output by the display unit 77 and the sound output unit 78 .
- Computer programs and the like read from the removable recording medium 82 are installed in the storage unit 79 as required.
- software for the processing of this embodiment can be installed via network communication by the communication unit 80 or via the removable recording medium 82.
- the software may be stored in advance in the ROM 72, the storage unit 79, or the like.
- the line-of-sight detection device section 41 is connected to the input/output interface 75 .
- The line-of-sight detection device unit 41 is a device that captures an eye image for detecting the line of sight of the user, and includes an LED light source and an eye image capturing unit 51 (for example, an infrared camera).
- the CPU 71 functions as the line-of-sight detection calculation unit 40 by means of an application program.
- the line-of-sight detection calculation unit 40 analyzes the eye image obtained from the line-of-sight detection device unit 41 and performs line-of-sight detection processing for detecting the line-of-sight direction of the user, as described for the imaging device 1 in FIG. 3 .
- Calibration processing, and correction processing for correcting calibration data based on roll angle information, which is information on the relative angle change between the eye and the eye image capturing unit 51 in the line-of-sight detection device unit 41, are also performed.
- the CPU 71 can also perform various controls based on the user's line-of-sight direction detected by the function of the line-of-sight detection calculation unit 40 . For example, it is possible to move a pointer on the display unit 77, switch images, output sound, and perform communication according to the line-of-sight direction.
- the non-volatile memory unit 74, ROM 72, and storage unit 79 can also function as a database regarding calibration data for line-of-sight detection processing. For example, authentication data for iris authentication, which will be described later, code information obtained by encoding an iris pattern, pattern data used for object recognition by semantic segmentation, and the like are also stored.
- The table top display 120 in FIG. 1, as well as game machines, personal computers, workstations, and the like, have almost the same configuration. It is sufficient if each device is additionally provided with the functions it requires: for example, the television receiver has a tuner function, the monitoring device and the sensor device have an imaging unit and a detection unit necessary for monitoring and sensing, and the robot has the functions necessary for its operation.
- <Line-of-Sight Detection and Calibration> The line-of-sight detection and calibration performed by the imaging device 1, the terminal device 100, and the like will be described.
- a "reference point” and a “moving part (moving point)" of the eye are found, and the line of sight is detected from the position of the moving point with respect to the reference point.
- the corneal reflection method and the image processing method are known as specific methods, the pupil corneal reflection method will be explained here.
- the eye is photographed while the cornea is irradiated with a point light source, and the center of corneal curvature obtained from this corneal reflection image (Purkinje image) is used as a reference point and the moving point is measured as the pupil.
- FIGS. 5A and 5B show a pupil 60, a cornea 61, an iris 62, an inner eye corner 63, and an outer eye corner 64 as structures of the eye.
- FIG. 5C schematically shows an infrared irradiation section 50 and an eye image pickup section 51 as an example of the line-of-sight detection device section 41 shown in FIGS.
- the infrared irradiation unit 50 is composed of, for example, an infrared LED (light emitting diode).
- the eye image pickup unit 51 is configured by, for example, an infrared camera. In this case, the infrared irradiation unit 50 irradiates the cornea 61 with a point light source.
- the eye image capturing unit 51 captures an image of the eye in a state where the cornea 61 is irradiated with the point light source, and outputs the captured eye image.
- This eye image is supplied to the line-of-sight detection calculation unit 40, that is, to the camera control unit 18 in FIG. 3 or the CPU 71 in FIG. 4.
- the line-of-sight detection calculation unit 40 detects the pupil 60 and the corneal reflection image 65 from the eye images shown in FIGS. 5D and 5E as line-of-sight detection processing.
- Then, the line-of-sight direction is calculated from the positions of the moving point and the reference point. That is, the viewpoint (coordinates) is measured by applying the imaging information of the corneal reflection image 65 to an eyeball model.
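As an illustration of the reference-point/moving-point idea, the following toy sketch treats the offset of the pupil center from the corneal reflection as roughly proportional to the gaze angle. This is an assumption for illustration only, not the embodiment's actual eyeball-model calculation; the gain constant is hypothetical and would come from calibration.

```python
def gaze_angles(pupil_xy, purkinje_xy, gain_deg_per_px=0.35):
    """Toy planar approximation of the pupil-corneal-reflection method:
    the offset of the pupil center (moving point) from the corneal
    reflection (reference point) is roughly proportional to the gaze
    angle. `gain_deg_per_px` is a hypothetical per-user constant that
    calibration would estimate."""
    dx = pupil_xy[0] - purkinje_xy[0]
    dy = pupil_xy[1] - purkinje_xy[1]
    return dx * gain_deg_per_px, dy * gain_deg_per_px
```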
- FIG. 6 shows the optical axis AX1 and visual axis AX2 of the human eye.
- the optical axis AX1 is the corneal normal passing through the center of the pupil 60 .
- This optical axis AX1 can be estimated by the eyeball 3D model.
- The visual axis AX2 is an axis that connects the nodal point (the central posterior surface of the lens 66) and the fovea 70. What a person actually sees lies along this visual axis AX2.
- the optical axis AX1 and the visual axis AX2 are tilted by about 5 degrees. There are individual differences in this inclination, and it is generally about 4 to 8 degrees.
- The direction of the optical axis AX1 can be calculated by observing the pupil 60, so by correcting it to the direction of the visual axis AX2, the line-of-sight direction of the person (the direction in which the person is actually looking) can be accurately detected. Therefore, calibration data for correcting the deviation between the optical axis AX1 and the visual axis AX2 for each individual is collected by performing calibration processing. As a result, correction can be performed using the calibration data in subsequent actual line-of-sight detection.
- In the calibration, the optical axis AX1 when viewing a certain "point" in the field of view is estimated.
- Then the difference between the vector from the center of curvature of the cornea 61 to that "point" and the vector of the optical axis AX1 is measured.
- From this difference, when an arbitrary point is viewed thereafter, the visual axis AX2 at that time can be estimated from the optical axis AX1.
- A plurality of points (for example, 5 to 9 points) in the normal visual field are used, because the calibration parameters differ depending on the orientation of the eyeball.
- FIG. 7 shows an example of calibration processing using multiple points.
- In the calibration, a gaze point marker 67 is displayed on the screen viewed by the user, and the user is made to gaze at this marker 67. The line of sight at that time is observed: the observable line-of-sight direction is the direction of the optical axis AX1, while the direction toward the center of the marker 67 is the direction of the visual axis AX2.
- the marker 67 is moved to the next point. As shown in the figure, data is collected while moving the marker 67 to, for example, the center, upper left corner, upper right corner, lower left corner, and lower right corner of the screen.
- vectors G and T shown in FIG. 8A are obtained for each point.
- Vector G is the detected viewing direction (that is, the optical axis AX1), and vector T is the vector connecting the center of the user's eye and the calibration point (the point indicated by the marker 67), that is, the visual axis AX2.
- a rotational transformation Q from this vector G to vector T is obtained.
- the rotation transform Q is a quaternion.
- the rotational transformation Q for each vector G is stored in the database 52 as calibration data for the person who made the measurement. This rotational transformation Q becomes a correction amount for each individual.
- the database 52 may be constructed, for example, in the memory section 19 of FIG. 3 or the non-volatile memory section 74 or storage section 79 of FIG.
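The rotational transformation Q from vector G to vector T can be sketched as a quaternion as follows. The function names are hypothetical, and the half-angle construction assumes G and T are unit vectors that are not exactly opposite:

```python
import math

def quat_between(g, t):
    """Return the quaternion (w, x, y, z) that rotates unit vector g
    onto unit vector t -- a sketch of the rotational transformation Q
    stored as calibration data (assumes g != -t)."""
    cx = g[1]*t[2] - g[2]*t[1]          # cross product g x t
    cy = g[2]*t[0] - g[0]*t[2]
    cz = g[0]*t[1] - g[1]*t[0]
    dot = g[0]*t[0] + g[1]*t[1] + g[2]*t[2]
    w = 1.0 + dot                        # half-angle trick
    n = math.sqrt(w*w + cx*cx + cy*cy + cz*cz)
    return (w/n, cx/n, cy/n, cz/n)

def rotate(q, v):
    """Apply quaternion q to vector v via v' = v + w*t + u x t,
    where u = (x, y, z) and t = 2*(u x v)."""
    w, x, y, z = q
    tx = 2*(y*v[2] - z*v[1]); ty = 2*(z*v[0] - x*v[2]); tz = 2*(x*v[1] - y*v[0])
    return (v[0] + w*tx + (y*tz - z*ty),
            v[1] + w*ty + (z*tx - x*tz),
            v[2] + w*tz + (x*ty - y*tx))
```

Applying `rotate(quat_between(G, T), G)` reproduces T, which is exactly the per-point correction the calibration stores.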
- In the subsequent line-of-sight detection, each correction amount is weighted by the reciprocal of the distance from the current gaze point P0 to the three neighboring calibration points P1, P2, and P3, and the weighted correction amounts are added.
- the correction amount is applied to the current gazing point P0 to obtain the post-correction gazing point P0A. This makes it possible to accurately detect the direction in which the user is actually looking.
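The inverse-distance weighting of the stored correction amounts can be sketched as follows, simplifying each correction amount to a 2D offset rather than a rotational transformation Q (an assumption made here purely so the blending step is easy to see):

```python
import math

def blend_corrections(p0, points, corrections):
    """Inverse-distance weighting of per-point correction amounts
    (simplified to 2D offsets). `points` are the calibration points
    neighboring the gaze point p0, `corrections` their stored
    correction amounts."""
    weights = []
    for p in points:
        d = math.dist(p0, p)
        if d == 0.0:                     # gaze point sits exactly on a calibration point
            return corrections[points.index(p)]
        weights.append(1.0 / d)
    total = sum(weights)
    cx = sum(w * c[0] for w, c in zip(weights, corrections)) / total
    cy = sum(w * c[1] for w, c in zip(weights, corrections)) / total
    return (cx, cy)

def apply_correction(p0, correction):
    """P0A = P0 + blended correction."""
    return (p0[0] + correction[0], p0[1] + correction[1])
```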
- The difference between the posture relationship when the calibration process is performed and the posture relationship when the line-of-sight detection is actually performed can be represented by the amount of rotation in the roll direction.
- Therefore, correction processing is performed to correct the calibration data according to the roll angle information.
- The roll angle can be said to be information on the relative angle (the amount of angle change) between the user's eye and the eye image capturing unit 51, with the time of the calibration process as the reference.
- The corrected calibration data is then used to correct the optical axis AX1 to the visual axis AX2. This makes it possible to detect the line of sight with high accuracy using only the calibration data obtained once, without repeating the calibration for each posture.
- the camera control unit 18 or the CPU 71 also has a function of performing iris authentication processing.
- Iris authentication is known as a personal authentication technology that uses the eyeball. This technique encodes the iris pattern into an iris code and compares iris codes to perform personal authentication. When the Hamming distance is calculated while shifting one iris code during this comparison, the shift amount that minimizes the distance is proportional to the roll angle of the eyeball. Therefore, correction is performed by rotating the calibration data by the difference (roll angle) between the angle at the time of the calibration process and the current angle.
- the processing of the line-of-sight detection calculation unit 40 for performing such processing will be described.
- the processing of the line-of-sight detection calculation unit 40 described below is processing performed by an information processing device such as the camera control unit 18 in FIG. 3 and the CPU 71 in FIG.
- FIG. 9 shows calibration processing of the line-of-sight detection calculation unit 40 .
- the line-of-sight detection calculation unit 40 acquires an image captured by the eye image capturing unit 51 of the line-of-sight detection device unit 41, that is, an eye image of the user.
- In step S102, the camera control unit 18 or the CPU 71 equipped with the line-of-sight detection calculation unit 40 performs encoding for iris authentication using the iris authentication function that it also provides. That is, an iris code is generated by encoding the iris pattern observed in the user's eye image.
- For example, assume that an eye image (image IM1) as shown in FIG. 10A is captured.
- For encoding, the donut-shaped portion of the iris 62 is extracted by polar coordinate transformation and, as shown in FIGS. 11A and 11B, divided into 256 steps in the angular direction and 4 lines in the radial direction.
- This is then converted into a 2048-bit code, as shown in FIG. 11C, by a Gabor wavelet filter.
- Each sample yields 2 bits, the real part and the imaginary part of the filter response, which gives 2048 bits as 256 steps × 2 bits × 4 lines. This is the iris code obtained from the eye image as the image IM1.
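A highly simplified sketch of this kind of encoding is shown below. A real encoder filters the 2-D unwrapped iris image, whereas this illustration applies a 1-D Gabor wavelet per angular sample and keeps only the signs of the real and imaginary responses (2 bits each), yielding 256 × 2 × 4 = 2048 bits; the window size and wavelet parameters are arbitrary assumptions.

```python
import cmath, math

def iris_code(intensity, lines=4, samples=256):
    """Simplified iris encoding sketch: `intensity(r, theta)` returns
    the unwrapped iris brightness on radial line r at angle theta.
    Each sample is correlated with a small 1-D Gabor wavelet and the
    signs of the real and imaginary responses give 2 bits per sample,
    i.e. 256 x 2 x 4 = 2048 bits in total."""
    bits = []
    for r in range(lines):
        for k in range(samples):
            acc = 0j
            for j in range(-4, 5):       # small Gabor window around sample k
                theta = 2 * math.pi * ((k + j) % samples) / samples
                gabor = cmath.exp(-1j * 2 * math.pi * j / 8) * math.exp(-(j * j) / 8.0)
                acc += intensity(r, theta) * gabor
            bits.append(1 if acc.real >= 0 else 0)
            bits.append(1 if acc.imag >= 0 else 0)
    return bits
```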
- In step S103 of FIG. 9, the line-of-sight detection calculation unit 40 collects calibration data.
- the vector G and the vector T are determined while the marker 67 is gazed at, and the rotational transformation Q is determined with the position of the marker 67 as the gaze point. While changing the position of the marker 67, this is performed for a plurality of fixation points. As a result, calibration data is obtained in the posture relation in which the eye image as the image IM1 was obtained.
- In step S104, the line-of-sight detection calculation unit 40 stores the calibration data and the iris code in the database 52 in association with each other as a pair.
- the iris code paired with the calibration data and stored in step S104 is used at least as information used for line-of-sight detection.
- the camera control unit 18 or the CPU 71 may perform registration processing separately as information for personal authentication for the generated iris code when the iris code is generated in step S102.
- the iris code stored for the line-of-sight detection process in step S104 may also be used as information for later personal authentication.
- The iris code registration for personal authentication may be performed simultaneously with the calibration process shown in FIG. 9, or may be performed on another occasion.
- In step S201, the line-of-sight detection calculation unit 40 acquires an image captured by the eye image capturing unit 51 of the line-of-sight detection device unit 41, that is, an eye image of the user.
- At this time, an eye image in the same posture relationship as in the calibration process may be obtained, as with the image IM1 in FIG. 10A, or an eye image in a different posture relationship may be obtained, as with the image IM2 in FIG. 10B. The image IM2 is an image rotated by a rotation angle θ compared to the image IM1.
- In step S202, the camera control unit 18 or the CPU 71 having the line-of-sight detection calculation unit 40 performs iris authentication using the iris authentication function that it also provides.
- As the iris authentication, the camera control unit 18 or the CPU 71 first generates an iris code from the eye image in the same manner as described above. If an eye image exactly the same as the image IM1, with zero rotation angle, is obtained for the same person from whom an iris code was obtained in the past, the same iris code as that in FIG. 11C is obtained. Some bits may differ depending on the imaging state of the eye image and the determination of the iris pattern, but the codes are generally the same. On the other hand, if a rotated eye image like the image IM2 in FIG. 10B is obtained for the same person, an iris code that is bit-shifted compared to the iris code in FIG. 11C is obtained, as in FIG. 11D.
- Therefore, the Hamming distance (that is, the amount of difference in bits) is calculated while cyclically shifting one of the iris codes, and the shift amount at which the Hamming distance is minimized is found.
- the iris code stored for iris authentication in the past is the iris code of the user who is registered as a person whose personal authentication is OK.
- the iris code may be stored during the calibration process of FIG. 9, or may be stored during a separate registration process.
- When the minimum value of the Hamming distance is found as described above, it is compared with a predetermined threshold value for authentication.
- If the minimum value of the Hamming distance is equal to or less than the predetermined value, it can be determined that the current iris code belongs to the same person as an iris code stored for iris authentication in the past. This is because the smaller the Hamming distance, the higher the similarity of the iris codes.
- If the minimum value of the Hamming distance is not equal to or less than the predetermined value, it can be determined that no iris code for this user has been stored in the past as a person who should be authenticated.
- the camera control unit 18 or the CPU 71 can thus perform personal authentication using the iris authentication method.
- In step S203, the camera control unit 18 or the CPU 71 continues processing using the function of the line-of-sight detection calculation unit 40. That is, the line-of-sight detection calculation unit 40 determines whether or not an iris code of the same person as the iris code acquired this time is stored in the database 52 as a pair with calibration data.
- If it is stored, the roll angle is calculated based on the Hamming distance between the iris code detected this time and the iris code stored in the database 52. That is, the iris code detected this time is positioned as the current angle information and is compared with the iris code stored as the reference angle information in step S104 of FIG. 9. For example, if the iris code of the image IM2 is obtained this time, the iris code detected this time as the current angle information is bit-shifted from the stored iris code by an amount corresponding to the rotation angle θ.
- Then, the bit shift amount at which the Hamming distance becomes minimum corresponds to the rotation angle θ. That is, the roll angle can be calculated from the bit shift amount that gives the minimum Hamming distance.
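The shift-and-compare step can be sketched as follows. For simplicity the code is represented here as 256 angular columns rather than the full 2048-bit layout (an assumption for illustration), and one column of shift is taken to correspond to 360/256 degrees of eyeball roll:

```python
def min_hamming_shift(stored, current):
    """Compare two iris codes given as 256 angular columns (a
    simplification of the 2048-bit layout). Returns (min_hamming,
    shift): the cyclic shift of `current` minimizing the Hamming
    distance to `stored`."""
    n = len(stored)
    best = None
    for shift in range(n):
        d = sum(1 for i in range(n) if stored[i] != current[(i + shift) % n])
        if best is None or d < best[0]:
            best = (d, shift)
    return best

def roll_angle_deg(shift, samples=256):
    """The minimizing shift is proportional to the eyeball roll angle:
    one angular step = 360/256 degrees. Large shifts are interpreted
    as negative rotation."""
    if shift > samples // 2:
        shift -= samples
    return shift * 360.0 / samples
```

The same minimum Hamming distance also feeds the authentication threshold described above, so one pass over the shifts serves both purposes.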
- In step S206, the line-of-sight detection calculation unit 40 calculates a calibration vector at the current gaze point from the stored calibration data. Then, in step S207, the calibration vector is rotated by the roll angle to obtain the corrected calibration vector. This means that the calibration data has been corrected according to the current relative posture relationship. Thereafter, the line-of-sight detection process can be performed using the corrected calibration data, as described with reference to FIG. 8B.
- On the other hand, if the corresponding iris code is not stored, the process of FIG. 12 proceeds from step S204 to step S210.
- the line-of-sight detection calculation unit 40 collects calibration data in step S210.
- In step S211, the line-of-sight detection calculation unit 40 stores the calibration data and the iris code in the database 52 in a state in which they are associated as a pair.
- Note that the storage of an iris code paired with calibration data may be performed independently of personal authentication by iris authentication. In iris authentication, for example, the iris code of a user whose authentication is acceptable is registered in advance, and if the iris code obtained at the time of personal authentication is determined to belong to the same person as the registered iris code, the authentication is accepted.
- In contrast, the iris code stored in a pair with calibration data is stored at least in the sense of reference angle information for correcting the calibration data. Therefore, regardless of whether the user is authenticated or not, when the calibration processing is performed in step S103 of FIG. 9 or step S210 of FIG. 12, the calibration data and the iris code may be stored in pairs. However, it is also conceivable to prevent the processing of steps S210 and S211 from being performed for users who have not been authenticated.
- There are various possible timings for executing the processing of FIG. 12, that is, iris authentication, roll angle detection, and calibration data correction. It may be executed irregularly in response to some trigger, or may be executed periodically. For example, in the case of a configuration in which the line of sight is detected through the EVF 5 of the imaging device 1, it is conceivable to execute the processing of FIG. 12.
- Alternatively, the movement of the imaging device 1 or the terminal device 100 may be acquired by an acceleration sensor, a gyro sensor, or the like, and if there is a large movement, it may be determined that the inclination may have changed. In such a case, roll angle detection and calibration data correction may be performed intermittently, only when movement such as rotation stops.
- <Second Embodiment: Correction Processing Based on Eye Image>
- This process is particularly applicable to devices that do not perform iris authentication.
- In the second embodiment, the eyelid region is detected by semantic segmentation of the eye image, and the roll angle relative to the eyelid region shape at the time of calibration is obtained.
- the line-of-sight detection calculation unit 40 also has a function of performing object recognition processing from an image by semantic segmentation.
- FIG. 13A is an eye image captured by the eye image capturing unit 51 .
- FIG. 13B shows the eyelid region 68 of this eye image with diagonal lines.
- FIGS. 14 and 15 show processing examples of the line-of-sight detection calculation unit 40. FIG. 14 is an example of calibration processing.
- In step S120, the line-of-sight detection calculation unit 40 acquires an eye image from the eye image capturing unit 51.
- In step S121, the line-of-sight detection calculation unit 40 determines the eyelid region 68 in the eye image by semantic segmentation processing and acquires the eyelid boundary information.
- The boundary information is information related to the boundary of the eyelids.
- For example, information on the boundary line 69 can be used as the boundary information.
- Alternatively, feature points near the boundary line 69 may be extracted and used as the boundary information.
- As another alternative, a triangle 86 circumscribing the boundary of the eyelid may be detected, as indicated by the dashed line in FIG. 13B, and the information of the circumscribing triangle 86 may be used as the boundary information.
- In step S122 in FIG. 14, the line-of-sight detection calculation unit 40 performs calibration processing and collects calibration data in the same manner as in step S103 in FIG. 9 described above.
- FIG. 15 shows an example of calibration data correction processing.
- In FIG. 15, the line-of-sight detection calculation unit 40 first acquires an eye image from the eye image capturing unit 51. Then, in step S221, the line-of-sight detection calculation unit 40 determines the eyelid region 68 in the eye image by semantic segmentation processing and acquires the eyelid boundary information.
- In step S222, the line-of-sight detection calculation unit 40 compares the boundary information acquired in step S221 with the boundary information stored in the database 52 to obtain the roll angle. That is, the boundary information detected this time is used as the current angle information and is compared with the boundary information stored as the reference angle information in step S123 of FIG. 14.
- Then, in step S223, the line-of-sight detection calculation unit 40 calculates a calibration vector at the current gaze point from the stored calibration data, and in step S224, rotates the calibration vector by the roll angle to obtain the corrected calibration vector. As a result, the calibration data is corrected in accordance with the current relative posture relationship, and thereafter the line-of-sight detection process can be performed using the corrected calibration data.
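As one concrete, hypothetical choice of boundary information, the roll angle could be derived from the angle of the line through the inner and outer eye corners, comparing the current value with the calibration-time value. The embodiment may instead use the full boundary line 69 or the circumscribing triangle 86; this sketch only illustrates the comparison step.

```python
import math

def corner_line_angle(inner_xy, outer_xy):
    """Angle (degrees) of the line through the inner and outer eye
    corners in the eye image -- one simple piece of eyelid boundary
    information (an assumption for illustration)."""
    return math.degrees(math.atan2(outer_xy[1] - inner_xy[1],
                                   outer_xy[0] - inner_xy[0]))

def roll_from_boundaries(calib_corners, current_corners):
    """Roll angle = change of the corner-line angle relative to the
    calibration-time boundary information, wrapped into (-180, 180]."""
    a0 = corner_line_angle(*calib_corners)
    a1 = corner_line_angle(*current_corners)
    return (a1 - a0 + 180.0) % 360.0 - 180.0
```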
- the imaging device 1 has a touch panel provided on the surface of the rear monitor 4 .
- When the user looks into the EVF 5, the user's nose may touch or come close to the touch panel. If there is a response on the right side of the touch panel, it can be determined that the user is looking with the left eye; if there is a response on the left side, the user is looking with the right eye. Further, since the deviation of the detection position on the touch panel reflects the change in the roll angle, the roll angle can be estimated from that deviation.
- FIGS. 16 and 17 show processing examples of the line-of-sight detection calculation unit 40. FIG. 16 is an example of calibration processing.
- In FIG. 16, the line-of-sight detection calculation unit 40 performs calibration processing and collects calibration data in the same manner as in step S103 of FIG. 9 described above.
- In step S141, the line-of-sight detection calculation unit 40 acquires contact position information from the touch panel of the operation unit 17.
- That is, the responses of the touch panel while the user is looking into the EVF 5 for the calibration process are accumulated, generating contact position information indicating where the nose touched the touch panel during the calibration process.
- In step S142, the line-of-sight detection calculation unit 40 pairs the calibration data and the contact position information and stores them in the database 52. That is, in this case, the contact position information is stored as the reference angle information, defining the stored calibration data as corresponding to a roll angle of zero.
- FIG. 17 shows an example of calibration data correction processing.
- In FIG. 17, the line-of-sight detection calculation unit 40 acquires the contact position information on the touch panel in step S240. The contact position information detected this time becomes the current angle information to be compared with the contact position information stored as the reference angle information in step S142 of FIG. 16. Therefore, in step S241, the line-of-sight detection calculation unit 40 compares the contact position information acquired in step S240 with the contact position information stored in the database 52 to obtain the roll angle.
- the difference in contact position on the touch panel corresponds to the roll angle centering on the position of the EVF 5, for example.
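As a rough sketch of that idea (the function, coordinate convention, and use of the EVF position as the pivot are assumptions for illustration), the roll angle can be estimated from the angle between the reference and current nose-contact positions:

```python
import math

def roll_from_contact(reference_pos, current_pos, pivot):
    """Estimate the roll angle (degrees) from the shift of the nose-contact
    position on the touch panel, treating rotation as centered on the EVF
    position given as `pivot`. Positions are (x, y) panel coordinates."""
    ref_angle = math.atan2(reference_pos[1] - pivot[1],
                           reference_pos[0] - pivot[0])
    cur_angle = math.atan2(current_pos[1] - pivot[1],
                           current_pos[0] - pivot[0])
    deg = math.degrees(cur_angle - ref_angle)
    # Normalize to (-180, 180] so small tilts stay small.
    return (deg + 180.0) % 360.0 - 180.0

# Contact point swung a quarter turn around the EVF => 90-degree roll:
angle = roll_from_contact((10.0, 0.0), (0.0, 10.0), (0.0, 0.0))
```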
- the line-of-sight detection calculation unit 40 calculates a calibration vector at the current gaze point from the stored calibration data in step S242. Then, in step S243, the calibration vector is rotated by the roll angle to obtain the corrected calibration vector. As a result, the calibration data is corrected in accordance with the current relative posture relationship, and thereafter the line-of-sight detection process can be performed using the corrected calibration data.
- FIG. 18 shows the yaw axis Yax, the pitch axis Pax, and the roll axis Rax.
- the sensor unit 23 of the imaging device 1 has a function of detecting the orientation of the device itself by using an acceleration sensor, an angular velocity sensor, or the like. For example, it detects postures in three axial directions of yaw, pitch, and roll.
- the headset 130 can also detect the posture of the device itself in three axial directions of yaw, pitch, and roll using an acceleration sensor, an angular velocity sensor, and the like. It is assumed that the imaging apparatus 1 receives the posture information of the headset 130 through the communication unit 16, and the camera control unit 18 can detect the posture state of the headset 130 in real time.
- the posture of the headset 130 can be assumed to be the posture of the user's head. Therefore, from the relative relationship between the posture of the imaging device 1 itself and the posture of the headset 130, the line-of-sight detection calculation unit 40 of the imaging device 1 can detect the relative posture relationship (tilt amount) between the user's eye and the eye image capturing unit 51 for the EVF 5 or the rear monitor 4. That is, the current roll angle can be determined.
- For example, the roll angle of the eye relative to the eye image capturing unit 51 is the difference in roll angle after rotation correction is applied so that the yaw angles of the depth directions of the imaging device 1 and the headset 130 match.
- In step S150, the line-of-sight detection calculation unit 40 performs calibration processing and collects calibration data in the same manner as in step S103 of FIG. 9 described above.
- In step S151, the line-of-sight detection calculation unit 40 acquires device posture information of the imaging device 1 itself.
- In step S152, the line-of-sight detection calculation unit 40 acquires device posture information of the headset 130.
- In step S153, the line-of-sight detection calculation unit 40 associates the device posture information of the imaging device 1, the device posture information of the headset 130, and the calibration data, and stores them in the database 52. That is, in this case, the calibration data to be stored includes the device posture information of the imaging device 1 and the device posture information of the headset 130 as reference angle information corresponding to a roll angle of zero. Note that the relative angle obtained from the device posture information of the imaging device 1 and that of the headset 130 at this time may instead be used as the reference angle information.
- FIG. 20 shows an example of calibration data correction processing.
- In step S250, the line-of-sight detection calculation unit 40 acquires current device posture information of the imaging device 1.
- In step S251, the line-of-sight detection calculation unit 40 acquires current device posture information of the headset 130.
- the device posture information of the imaging device 1 and the device posture information of the headset 130 acquired this time become the current angle information to be compared with the device posture information stored as the reference angle information in step S153.
- In step S252, the line-of-sight detection calculation unit 40 calculates the roll angle based on the device posture information. For example, the database 52 stores the device posture information of the imaging device 1 and the device posture information of the headset 130, from which a relative angle is obtained as the relative posture difference. A relative angle is likewise obtained from the device posture information acquired in steps S250 and S251. The difference between the stored relative angle and the current relative angle is the current roll angle.
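Restricting attention to the roll components for simplicity (the full computation after yaw alignment is more involved), the comparison in step S252 between stored and current relative angles might be sketched like this (a hypothetical illustration, not the patent's implementation):

```python
def wrap_deg(a):
    """Wrap an angle to (-180, 180] degrees."""
    return (a + 180.0) % 360.0 - 180.0

def current_roll(cal_cam_roll, cal_headset_roll, cur_cam_roll, cur_headset_roll):
    """Roll angle per step S252: the difference between the relative angle
    of the two devices stored at calibration time and their current
    relative angle. All inputs are roll components in degrees."""
    stored_relative = wrap_deg(cal_headset_roll - cal_cam_roll)
    now_relative = wrap_deg(cur_headset_roll - cur_cam_roll)
    return wrap_deg(now_relative - stored_relative)

# Camera level at calibration and now; head tilted 15 degrees since then:
roll = current_roll(0.0, 0.0, 0.0, 15.0)
```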
- the line-of-sight detection calculation unit 40 calculates a calibration vector at the current gaze point from the stored calibration data in step S253. Then, in step S254, the calibration vector is rotated by the roll angle to obtain the corrected calibration vector. As a result, the calibration data is corrected in accordance with the current relative posture relationship, and thereafter the line-of-sight detection process can be performed using the corrected calibration data.
- processing may be performed to notify the user that it is recommended to correct the inclination of the wearing state.
- Correction for eliminating the difference between the optical axis AX1 and the visual axis AX2 in the line-of-sight detection process uses three neighboring points P1, P2, and P3 with respect to the current gaze point P0, as described with reference to FIG. 8B. Accuracy is therefore higher for gaze points inside the area enclosed by the plurality of points at which the markers 67 were displayed and gazed at during the calibration process performed in advance.
- FIG. 21A assumes, for example, the screen of the EVF 5 or the rear monitor 4, and shows a hatched area 90 in the screen as an area with a high calibration effect.
- the four corners of this area 90 correspond to the four corners showing the markers 67 as shown in FIG.
- Within the area 90, three points P1, P2, and P3 near the current gaze point P0 can be obtained as shown in FIG. 8B, so the calibration effect is high.
- Outside the area 90, accuracy is lowered because three neighboring points cannot be obtained.
- Next, rotation of the screen is considered. When the area 90 with high calibration accuracy is not square, rotating the screen between horizontal and vertical orientations makes the region outside the area 90 relatively wide in the vertical direction, as shown in FIG. 21B. Therefore, for a UI that uses the entire screen when the imaging device 1 is held horizontally as shown in FIG. 21C, the icons and other UI elements are rearranged and scaled as shown in FIG. 21D when the screen is rotated by holding the device vertically, so that they fall within the area 90. Alternatively, the number and content of the icons to be displayed may be changed without changing their scale.
- FIG. 21 shows an example in which the screen of the EVF 5 or the rear monitor 4 is rotated from a horizontally elongated state to a vertically elongated state by holding the imaging device 1 vertically.
- the area outside the area 90 can be widened to the left and right.
- Adjustment of the placement, size, and the like of icons is similarly applicable in that case.
- When only the face is slanted, it is also possible to maintain the previous display state.
- It is also conceivable to adjust UI parts such as icons according to the detection accuracy of the roll angle. Since the detection accuracy differs depending on the method of obtaining the roll angle, if the accuracy is low, the UI parts are made larger to provide a screen that tolerates deviation.
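The remapping of UI element positions into the high-accuracy area 90 described above can be sketched as a simple coordinate transform (the function and coordinate conventions are assumptions; the patent does not specify the mapping):

```python
def fit_into_area(icons, area):
    """Scale and translate icon positions, given in unit coordinates of the
    full screen, so they fall within the high-calibration-accuracy area 90.

    icons: list of (x, y) in [0, 1] x [0, 1] full-screen coordinates.
    area:  (left, top, right, bottom) of area 90 in the same coordinates.
    """
    left, top, right, bottom = area
    w, h = right - left, bottom - top
    return [(left + x * w, top + y * h) for x, y in icons]

# Icons laid out for the full screen, squeezed into a central area 90:
moved = fit_into_area([(0.0, 0.0), (1.0, 1.0)], (0.1, 0.2, 0.9, 0.8))
```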
- the information processing apparatus (imaging device 1, terminal device 100, etc.) of the embodiment includes a line-of-sight detection calculation unit 40.
- The line-of-sight detection calculation unit 40 performs line-of-sight detection processing for detecting the line-of-sight direction based on the eye image captured by the eye image capturing unit 51 of the line-of-sight detection device unit 41, and also performs correction processing that corrects the calibration data used for the line-of-sight detection processing based on roll angle information, which is information on the relative angle change between the eye and the eye image capturing unit 51.
- the line-of-sight detection calculation unit 40 acquires reference angle information used for calculating roll angle information when performing calibration processing for acquiring calibration data used for line-of-sight detection processing, and stores the calibration data and the reference angle information in association with each other.
- the reference angle information is angle information with zero roll angle.
- Examples of the reference angle information are the iris code, boundary information, contact position information, device posture information, and the like acquired when performing calibration.
- the reference angle information is not limited to the iris code, boundary information, contact position information, and device posture information, and various other types of information are conceivable. Information that changes according to the relative posture relationship between the user's eyes and the eye image capturing unit 51 of the imaging device 1 or the terminal device 100 may be used as the reference angle information.
- An example is given in which the line-of-sight detection calculation unit 40 acquires current angle information, which is the same type of information as the reference angle information, during line-of-sight detection processing, and performs correction processing using roll angle information calculated from the reference angle information and the current angle information. The current angle information is angle information at the time of line-of-sight detection processing.
- Examples of the current angle information are the iris code, boundary information, contact position information, device posture information, and the like acquired during the line-of-sight detection process.
- Roll angle information can be easily calculated by comparing the current angle information and the reference angle information.
- the current angle information may be the same type of information as the reference angle information, and is not limited to the iris code, boundary information, contact position information, and apparatus orientation information. Various types of current angle information are conceivable according to the type of information employed as the reference angle information.
- the line-of-sight detection calculation unit 40 calculates the roll angle information from the iris code used in the iris authentication process. This makes it possible to obtain roll angle information using information obtained by iris authentication processing, and eliminates the need to separately perform detection processing only for roll angle information. Therefore, processing can be made more efficient.
- the line-of-sight detection calculation unit 40 performs calibration processing to acquire calibration data used for line-of-sight detection processing at an opportunity to acquire an iris code, and uses the acquired iris code to calculate roll angle information.
- An example was given in which the line-of-sight detection calculation unit 40 uses the iris code detected in the iris authentication process as the current angle information, and performs the correction process using roll angle information calculated from the reference angle information and the current angle information (see FIG. 12).
- the roll angle information can be easily calculated by comparing the reference angle information with the iris code acquired during the line-of-sight detection process as the current angle information. Therefore, when iris authentication is performed, calibration data correction processing can be executed without special detection.
- the line-of-sight detection calculation unit 40 calculates the roll angle information based on the Hamming distance between the iris code as the reference angle information and the iris code as the current angle information. The two iris codes are related by a bit shift corresponding to the relative posture change between the imaging device 1 and the user's eye. Therefore, the shift amount at which the Hamming distance is minimized represents the amount of angle change, that is, the roll angle information. Accordingly, roll angle information can be calculated, and calibration data correction processing can be executed.
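The minimum-Hamming-distance search described here can be sketched as follows (a simplified toy model: the iris code is a short circular bit list and each one-bit shift is assumed to correspond to a fixed angular step; real iris codes are far longer and the angular resolution depends on their encoding):

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit lists."""
    return sum(x != y for x, y in zip(a, b))

def roll_from_iris_codes(reference_code, current_code, deg_per_bit):
    """Find the circular bit shift minimizing the Hamming distance to the
    reference code; the shift amount times the angular step per bit
    approximates the roll angle."""
    n = len(reference_code)
    best_shift = min(
        range(n),
        key=lambda s: hamming(current_code[s:] + current_code[:s],
                              reference_code),
    )
    # Interpret shifts past the halfway point as negative rotation.
    if best_shift > n // 2:
        best_shift -= n
    return best_shift * deg_per_bit

ref = [0, 1, 1, 0, 1, 0, 0, 1]
cur = ref[-1:] + ref[:-1]  # the same code, circularly rotated by one bit
angle = roll_from_iris_codes(ref, cur, deg_per_bit=45.0)
```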
- An example has been described in which, when an iris code authenticated as belonging to the same person as the iris code detected in the iris authentication process is stored as reference angle information, the line-of-sight detection calculation unit 40 uses the detected iris code as current angle information, and performs correction processing using roll angle information calculated from the reference angle information and the current angle information.
- That is, correction processing using the iris code is performed in steps S205 to S207 of FIG. 12. As a result, calculation of the roll angle information using the iris code becomes appropriate, and correction processing of the calibration data appropriate for the current person is ensured.
- If an iris code authenticated as belonging to the same person as the iris code detected in the iris authentication process is not stored as the reference angle information, the line-of-sight detection calculation unit 40 performs the calibration process and stores the iris code acquired in the iris authentication process as reference angle information in association with the calibration data. That is, this is the processing of steps S210 and S211 in FIG. 12. In this case, even if the reference angle information and the current angle information were compared, the roll angle information could not be calculated correctly, so correction processing is not performed. Accordingly, correction processing based on inaccurate roll angle information can be prevented. Also, when a new person is detected, storing the current iris code as reference angle information in association with the calibration data allows the calibration data for that person to be appropriately corrected thereafter.
- the line-of-sight detection calculation unit 40 calculates the roll angle information based on the eyelid boundary information in the captured image of the eye.
- Boundary information is information representing features related to the eyelids.
- An example is given in which the line-of-sight detection calculation unit 40 acquires boundary information from the captured image of the eye when performing calibration processing for obtaining calibration data used in the line-of-sight detection processing, and stores the boundary information, as reference angle information used for calculating the roll angle information, in association with the calibration data (see FIG. 14).
- the line-of-sight detection calculation unit 40 acquires boundary information from the captured image of the eye and uses it as current angle information, and calculates roll angle information using the reference angle information and the current angle information to perform correction processing (see FIG. 15).
- the roll angle information can be easily calculated by acquiring the boundary information, which is the information on the boundary shape of the eyelids, from the eye image and using it as the current angle information, and comparing it with the reference angle information.
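One way such boundary information could be compared (an assumption for illustration; the patent does not specify the boundary representation) is to fit a line to sampled eyelid boundary points and take the change in the line's angle as the roll angle:

```python
import math

def boundary_angle(points):
    """Angle (degrees) of the least-squares line through eyelid boundary
    points given as (x, y) image coordinates."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points)
    sxx = sum((p[0] - mx) ** 2 for p in points)
    return math.degrees(math.atan2(sxy, sxx))

def roll_from_boundary(reference_points, current_points):
    """Roll angle as the change in eyelid boundary orientation between
    calibration time (reference) and line-of-sight detection time (current)."""
    return boundary_angle(current_points) - boundary_angle(reference_points)

ref = [(0, 0), (1, 0), (2, 0)]  # level eyelid boundary at calibration
cur = [(0, 0), (1, 1), (2, 2)]  # boundary tilted 45 degrees now
roll = roll_from_boundary(ref, cur)
```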
- the line-of-sight detection calculation unit 40 calculates the roll angle information based on the contact position information on the touch panel. By using the contact position on the touch panel, it is possible to easily obtain the roll angle information and perform correction processing of the calibration data.
- This method does not require personal authentication processing or object recognition from images, and is suitable for devices in which the user looks into the display unit 15, such as the imaging device 1 or a telescope.
- An example is given in which the line-of-sight detection calculation unit 40 acquires contact position information when performing calibration processing for acquiring calibration data used in the line-of-sight detection processing, and stores the contact position information, as reference angle information used for calculating the roll angle information, in association with the calibration data (see FIG. 16).
- the line-of-sight detection calculation unit 40 acquires the contact position information and uses it as current angle information, and calculates the roll angle information using the reference angle information and the current angle information to perform the correction process (see FIG. 17).
- In this way, the roll angle information can be easily calculated by acquiring the contact position information as the current angle information and comparing it with the reference angle information.
- the information processing device (for example, the imaging device 1) includes a detection unit (for example, the sensor unit 23) that detects its own posture, and an example has been given in which the line-of-sight detection calculation unit 40 calculates the roll angle information based on the device posture information transmitted from an external head-mounted device (for example, the headset 130) and the device posture information detected by the detection unit (see FIGS. 18, 19, and 20).
- From the device posture information of the imaging device 1 and the headset 130, the relative relationship between the posture of the user's head and the imaging device 1 can be known. Therefore, the roll angle can be calculated, and correction processing of the calibration data can be performed.
- the terminal device 100 and the imaging device 1 as the information processing device of the embodiment include the line-of-sight detection device unit 41 having the eye image capturing unit 51. That is, in an information processing apparatus integrally including the eye image capturing unit 51, line-of-sight detection processing and calibration data correction processing are performed by the line-of-sight detection calculation unit 40 based on the detection result of the line-of-sight detection device unit 41.
- the terminal device 100 and the imaging device 1 can implement a device that detects the line of sight after correcting the calibration data. Therefore, highly accurate line-of-sight detection processing can be performed in the terminal device 100 or the like, and highly accurate line-of-sight direction detection results can be applied to various types of processing.
- the angle at which the user views the screen is diverse. Therefore, the calibration data correction process of the present technology is extremely useful. Also, in the case of a device such as the tabletop display 120, the relative posture relationship varies depending on the user's standing position with respect to the table; in particular, a user standing at a corner may appear oblique to the eye image capturing unit. In such a situation, the correction processing of the present technology is useful.
- the device including the line-of-sight detection device unit 41 and the device including the line-of-sight detection calculation unit 40 are configured separately.
- the imaging device 1 is given as an example of an information processing device that includes an imaging unit that captures an image, the line-of-sight detection device unit 41 that has the eye image capturing unit 51 , and the line-of-sight detection calculation unit 40 .
- the imaging section is an imaging system including the lens system 11 , the imaging element section 12 and the camera signal processing section 13 .
- the imaging apparatus 1 is realized as an apparatus that detects the line of sight after correcting the calibration data. Therefore, highly accurate line-of-sight detection processing becomes possible, and the result of line-of-sight detection can be effectively used for imaging operations. For example, focus control according to the line-of-sight direction can be realized with high precision.
- Examples of the reference angle information include the iris code, boundary information, contact position information, and device posture information.
- The iris code, boundary information, contact position information, and device posture information may be stored for each of the right and left eyes.
- the tilt of the face may be obtained using skeleton estimation or face detection.
- A program according to an embodiment causes a CPU, DSP, GPU, GPGPU, AI processor, etc., or a device including these, to execute the calibration data correction process described above. That is, the program causes an information processing device such as the camera control unit 18 or the CPU 71 to execute a line-of-sight detection calculation that performs line-of-sight detection processing for detecting the line-of-sight direction based on the eye image captured by the eye image capturing unit 51, and that corrects the calibration data used for the line-of-sight detection processing based on roll angle information, which is information on the relative angle change between the eye and the eye image capturing unit. With such a program, the calibration data correction process of the present disclosure can be realized by various information processing apparatuses.
- Such a program can be recorded in advance in an HDD as a recording medium built into equipment such as a computer device, or in a ROM or the like in a microcomputer having a CPU.
- Alternatively, the program can be stored (recorded) temporarily or permanently in a removable recording medium such as a flexible disc, a CD-ROM (Compact Disc Read Only Memory), an MO (Magneto Optical) disc, a DVD (Digital Versatile Disc), a Blu-ray Disc (registered trademark), a magnetic disc, a semiconductor memory, or a memory card.
- Such removable recording media can be provided as so-called package software.
- In addition to installing such a program from a removable recording medium into a personal computer or the like, it can also be downloaded from a download site via a network such as a LAN (Local Area Network) or the Internet.
- Such a program is suitable for widely providing the information processing apparatus of the present disclosure.
- For example, such a program is suitable for a terminal device 100 such as a smartphone or tablet, a mobile phone, a personal computer, a game device, a video device, a PDA (Personal Digital Assistant), and the like, so that these devices can function as the information processing device of the present disclosure.
- (1) An information processing apparatus comprising a line-of-sight detection calculation unit that performs line-of-sight detection processing for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and performs correction processing for correcting calibration data used for the line-of-sight detection processing based on roll angle information, which is information on a relative angle change between the eye and the eye image capturing unit.
- (2) The information processing apparatus according to (1) above, wherein, when performing calibration processing for acquiring the calibration data used for the line-of-sight detection processing, the line-of-sight detection calculation unit acquires reference angle information used for calculating the roll angle information, and stores the calibration data and the reference angle information in association with each other.
- (3) The information processing apparatus according to (2) above, wherein, during the line-of-sight detection processing, the line-of-sight detection calculation unit acquires current angle information that is the same type of information as the reference angle information, and calculates the roll angle information using the reference angle information and the current angle information to perform the correction processing.
- (4) The line-of-sight detection calculation unit is The information processing apparatus according to any one of (1) to (3) above, wherein roll angle information is calculated from an iris code used in iris authentication processing.
- (5) The information processing apparatus according to (4) above, wherein the line-of-sight detection calculation unit performs calibration processing for acquiring the calibration data used in the line-of-sight detection processing at an opportunity to acquire the iris code, and stores the acquired iris code, as reference angle information used for calculating the roll angle information, in association with the calibration data.
- (6) The information processing apparatus according to (5) above, wherein the line-of-sight detection calculation unit uses the iris code detected in the iris authentication processing as current angle information, and calculates the roll angle information using the reference angle information and the current angle information to perform the correction processing.
- (7) The information processing apparatus according to (6) above, wherein the line-of-sight detection calculation unit calculates the roll angle information based on a Hamming distance between the iris code as the reference angle information and the iris code as the current angle information.
- (8) The information processing apparatus according to any one of (6) to (7) above, wherein, when an iris code authenticated as belonging to the same person as the iris code detected in the iris authentication processing is stored as the reference angle information, the line-of-sight detection calculation unit uses the iris code detected in the iris authentication processing as the current angle information, and performs the correction processing using the roll angle information calculated using the reference angle information and the current angle information.
- (9) The information processing apparatus according to any one of (6) to (8) above, wherein, if an iris code authenticated as belonging to the same person as the iris code detected in the iris authentication processing is not stored as the reference angle information, the line-of-sight detection calculation unit performs the calibration processing, and stores the iris code acquired in the iris authentication processing as the reference angle information in association with the calibration data.
- (10) The information processing apparatus according to any one of (1) to (3) above, wherein the line-of-sight detection calculation unit calculates the roll angle information based on eyelid boundary information in a captured image of an eye.
- (11) The information processing apparatus according to (10) above, wherein the line-of-sight detection calculation unit acquires the boundary information from the captured image of the eye when performing calibration processing for obtaining the calibration data used in the line-of-sight detection processing, and stores the boundary information, as reference angle information used for calculating the roll angle information, in association with the calibration data.
- (12) The information processing apparatus according to (11) above, wherein the line-of-sight detection calculation unit obtains the boundary information from the captured image of the eye and uses it as current angle information, and calculates the roll angle information using the reference angle information and the current angle information to perform the correction processing.
- (13) The information processing apparatus according to any one of (1) to (3) above, wherein the line-of-sight detection calculation unit calculates the roll angle information based on contact position information with respect to a touch panel.
- (14) The information processing apparatus according to (13) above, wherein the line-of-sight detection calculation unit acquires the contact position information when performing calibration processing for acquiring the calibration data used in the line-of-sight detection processing, and stores the contact position information, as reference angle information used for calculating the roll angle information, in association with the calibration data.
- (15) The information processing apparatus according to (14) above, wherein the line-of-sight detection calculation unit obtains the contact position information as current angle information, and calculates the roll angle information using the reference angle information and the current angle information to perform the correction processing.
- (16) The information processing apparatus according to any one of (1) to (3) above, wherein the line-of-sight detection calculation unit calculates the roll angle information based on device posture information of a head-mounted device transmitted from the head-mounted device and device posture information detected by a detection unit.
- (17) The information processing apparatus according to any one of (1) to (16) above, including the eye image capturing unit.
- (18) The information processing apparatus according to any one of (1) to (17) above, including an imaging unit that captures an image and the eye image capturing unit.
- A line-of-sight detection method executed by an information processing device, the method performing a line-of-sight detection calculation that performs line-of-sight detection processing for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and performs correction processing for correcting calibration data used for the line-of-sight detection processing based on roll angle information, which is information on a relative angle change between the eye and the eye image capturing unit.
- A program for causing an information processing device to execute a line-of-sight detection calculation that performs line-of-sight detection processing for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and performs correction processing for correcting calibration data used for the line-of-sight detection processing based on roll angle information, which is information on a relative angle change between the eye and the eye image capturing unit.
- imaging device 4 rear monitor 5 EVF 15 display unit 16 communication unit 18 camera control unit 19 memory unit 23 sensor unit 24 line-of-sight detection unit 40 line-of-sight detection calculation unit 41 line-of-sight detection device unit 50 infrared irradiation unit 51 eye image capturing unit 52 database 71 CPU 100 terminal device 120 table top display 130 headset
Abstract
The invention concerns an information processing device comprising a line-of-sight detection calculation unit that performs a line-of-sight detection process for detecting a line-of-sight direction based on an eye image captured by an eye image capturing unit, and performs a correction process for correcting calibration data to be used in the line-of-sight detection process based on roll angle information relating to a change in the relative angle between the eye and an image capturing unit in the line-of-sight detection device unit.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023506806A JPWO2022196093A1 (fr) | 2021-03-17 | 2022-01-21 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021043750 | 2021-03-17 | ||
JP2021-043750 | 2021-03-17 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022196093A1 true WO2022196093A1 (fr) | 2022-09-22 |
Family
ID=83320228
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/002165 WO2022196093A1 (fr) | 2021-03-17 | 2022-01-21 | Information processing device, line-of-sight detection method, and program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2022196093A1 (fr) |
WO (1) | WO2022196093A1 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000229067A (ja) * | 1999-02-09 | 2000-08-22 | Canon Inc | Line-of-sight detection device and optical apparatus equipped with the same |
JP2003132355A (ja) * | 2001-10-25 | 2003-05-09 | Matsushita Electric Ind Co Ltd | Iris authentication method and apparatus |
JP2004129927A (ja) * | 2002-10-11 | 2004-04-30 | Canon Inc | Line-of-sight detection device |
WO2015136908A1 (fr) * | 2014-03-13 | 2015-09-17 | Panasonic Intellectual Property Management Co., Ltd. | Gaze detection device |
2022
- 2022-01-21 JP JP2023506806A patent/JPWO2022196093A1/ja active Pending
- 2022-01-21 WO PCT/JP2022/002165 patent/WO2022196093A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
JPWO2022196093A1 (fr) | 2022-09-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6025690B2 (ja) | Information processing device and information processing method | |
CN106133649B (zh) | Eye gaze tracking using binocular gaze constraints | |
KR20190046845A (ko) | Information processing device and method, and program | |
WO2017126172A1 (fr) | Information processing device, information processing method, and recording medium | |
JP6897728B2 (ja) | Image processing device, image processing method, and program | |
US20140177926A1 (en) | Information notification apparatus that notifies information of motion of a subject | |
JP2010147769A (ja) | Imaging system, image presentation method, and program | |
US8400532B2 (en) | Digital image capturing device providing photographing composition and method thereof | |
JP2015088096A (ja) | Information processing device and information processing method | |
JP2015088095A (ja) | Information processing device and information processing method | |
US20150022627A1 (en) | Photographing apparatus, photographing method and computer-readable storage medium storing photographing program of photographing apparatus | |
JP7059934B2 (ja) | Information processing device, information processing method, and program | |
KR20170074742A (ko) | Image processing device, image processing method, and program | |
JPWO2010073619A1 (ja) | Imaging device | |
JP2009010987A (ja) | Electronic camera | |
JP2015088098A (ja) | Information processing device and information processing method | |
JP6677900B2 (ja) | Image processing device, image processing method, and program | |
JP2006040232A (ja) | Image processing device and method, imaging device, and program | |
WO2018146922A1 (fr) | Information processing device, information processing method, and program | |
JP7425562B2 (ja) | Imaging device and control method therefor | |
US11443719B2 (en) | Information processing apparatus and information processing method | |
WO2022061541A1 (fr) | Control method, handheld gimbal, system, and computer-readable storage medium | |
WO2018116582A1 (fr) | Control device, control method, and medical observation system | |
WO2022196093A1 (fr) | Information processing device, line-of-sight detection method, and program | |
US9679391B2 (en) | Image pickup system and image pickup method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 22770855 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2023506806 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 22770855 Country of ref document: EP Kind code of ref document: A1 |