WO2018207959A1 - Image processing device and method

Image processing device and method

Info

Publication number
WO2018207959A1
WO2018207959A1 PCT/KR2017/004881
Authority
WO
WIPO (PCT)
Prior art keywords
outline
target object
image
elliptic equation
shape information
Prior art date
Application number
PCT/KR2017/004881
Other languages
English (en)
Korean (ko)
Inventor
채용욱
최태준
Original Assignee
주식회사 룩시드랩스
Priority date
Filing date
Publication date
Application filed by 주식회사 룩시드랩스
Priority to KR1020197025494A (KR20190107738A)
Priority to PCT/KR2017/004881 (WO2018207959A1)
Publication of WO2018207959A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/40Image enhancement or restoration using histogram techniques
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/11Region-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • G06V40/19Sensors therefor
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30041Eye; Retina; Ophthalmic

Definitions

  • the present specification relates to an image processing apparatus and method for eye tracking, and more particularly, to an image processing apparatus and method for estimating the size of a pupil or an iris from an eye image.
  • the pupil is a hollow, circular space located in the center of the iris, through which external light is transmitted to the retina.
  • The main function of the pupil is to adjust the amount of light reaching the retina, which reduces chromatic and spherical aberration and increases the depth of focus to optimize visual perception.
  • In a dark place the pupil dilates so that more light is detected, while in a bright place the pupil constricts to reduce the amount of light reaching the retina and form a sharper image.
  • Pupil size also changes with emotion; for example, it increases in response to fear, surprise, and pain.
  • The size of the pupil is closely correlated with various cognitive neuroscientific phenomena such as changes in light brightness, attention, cognitive load, and emotion, and is widely used in research in related fields.
  • the present specification provides an image processing method for estimating the size of an iris or a pupil from an eye image.
  • The present specification also provides an image processing method that performs an elliptic equation evaluation process to calculate an accurate elliptic equation for an iris or pupil partially covered by the eyelid, so that the size of the pupil or iris can be estimated accurately.
  • the present disclosure provides a method for accurately estimating the size of the iris or the pupil by performing a process of correcting the outline of the iris image or the pupil image using a Fourier descriptor to remove the portion covered by the eyelid.
  • An image processing method according to an embodiment of the present specification may comprise: obtaining an eye image; detecting an outline of an image of a target object corresponding to a pupil or an iris from the eye image; calculating an elliptic equation for the target object based on the outline and determining whether the elliptic equation is recognized as a final elliptic equation for the target object; if the elliptic equation is not recognized as the final elliptic equation, correcting the outline of the target object image using a Fourier descriptor, the Fourier descriptor being obtained from shape information indicating a shape characteristic of the target object image; and, when the elliptic equation is recognized as the final elliptic equation, acquiring information on at least one of an area, a shape, or a location of the target object using the final elliptic equation.
  • After the outline is corrected, an elliptic equation for the target object may be recalculated based on the corrected outline, and it may again be determined whether the recalculated elliptic equation is recognized as the final elliptic equation for the target object.
  • Determining whether the elliptic equation is recognized as the final elliptic equation may include calculating an error rate based on the distance between each outline point constituting the outline and the ellipse represented by the elliptic equation, and determining, based on the error rate, whether the elliptic equation is the final elliptic equation for the target object.
  • the shape information may be obtained based on a radial distance between the center of gravity of the outline points constituting the outline and each outline point.
  • The step of correcting the outline of the target object image may include removing noise from the shape information using the Fourier descriptor, and removing from the outline, using a slope characteristic of the shape information, a part associated with a neighboring object image covering the target object image.
  • The removing of the noise from the shape information may include extracting a low frequency region of the Fourier descriptor, and restoring the shape information by inverse Fourier transforming the low frequency region.
  • the low frequency region may be a frequency region including frequency components lower than a reference frequency.
  • The removing of the part associated with the surrounding object image may include: dividing the outline of the target object image and the graph of the shape information into a plurality of parts based on a peak value of the graph representing the shape information; calculating a max-min gradient for each part; and removing the part having the largest value of the max-min gradient.
  • An image processing apparatus according to an embodiment may comprise a memory for storing data and a processor coupled to the memory, wherein the processor is configured to: obtain an eye image; detect an outline of an image of a target object corresponding to the pupil or iris from the eye image; calculate an elliptic equation for the target object based on the outline; determine whether the elliptic equation is recognized as the final elliptic equation for the target object; if the elliptic equation is not recognized as the final elliptic equation, correct the outline of the target object image using a Fourier descriptor, the Fourier descriptor being obtained from shape information indicating shape characteristics of the target object image; and, when the elliptic equation is recognized as the final elliptic equation, obtain information about at least one of the area, shape, or location of the target object using the final elliptic equation.
  • By performing an evaluation process on the elliptic equation for the iris or the pupil, an accurate elliptic equation may be calculated, and the size of an iris or pupil obscured by the eyelid may be accurately estimated.
  • The present specification can also remove the portion covered by the eyelid by correcting the outline of the iris image or the pupil image using the Fourier descriptor, through which the size of the iris or pupil covered by the eyelid can be accurately estimated.
  • FIG. 1 is a block diagram of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 2 is an exemplary view of an eye image processed by an image processing apparatus according to an embodiment of the present invention.
  • FIG. 3 is an enlarged view of an eye image processed by an image processing apparatus according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of an image processing method by an image processing apparatus according to an embodiment of the present invention.
  • FIG. 5 is a flowchart illustrating a process of evaluating an elliptic equation by an image processing apparatus according to an embodiment of the present invention.
  • FIG. 6 shows shape information and Fourier descriptor for an elliptic equation according to an embodiment of the present invention.
  • FIG. 7 is a flowchart illustrating an operation of correcting an outline of a target object image by an image processing apparatus according to an exemplary embodiment.
  • FIG. 8 is an example of a graph showing detailed steps of outline correction of FIG. 7.
  • FIGS. 9 and 10 illustrate changes in an ellipse calculated based on an outline before and after the image processing apparatus according to an embodiment of the present invention performs outline correction.
  • FIG. 11 is a comparison of results obtained by estimating the size of a target object based on an elliptic equation, with and without the elliptic equation evaluation and outline correction according to an embodiment of the present invention.
  • FIG. 12 illustrates a calculation time for an image processing method of an image processing apparatus according to an embodiment of the present invention.
  • FIG. 13 illustrates an eye-brain interface (EBI) system according to an exemplary embodiment of the present invention.
  • FIG. 14 is a block diagram of a host device and a slave device according to an embodiment of the present invention.
  • FIG. 15 illustrates an eye-brain interface (EBI) device according to an embodiment of the present invention.
  • FIG. 16 is a block diagram of an EBI device in accordance with an embodiment of the present invention.
  • FIG. 1 is a block diagram of an image processing apparatus according to an embodiment of the present invention.
  • an image processing apparatus 1 may include a processor 10 and a memory 20.
  • the processor 10 may be configured to perform an operation according to various embodiments of the present disclosure.
  • the processor 10 may be configured to perform image processing on an eye image.
  • the processor 10 may be configured to perform an elliptic equation evaluation and outline correction process to be described later. In this way, the processor 10 may accurately estimate the size of a target object corresponding to the pupil or the iris in real time from the eye image. This will be described in detail below with reference to FIGS. 2 to 12.
  • At least one of a module, data, a program, or software for implementing an operation of the image processing apparatus 1 according to various embodiments of the present disclosure may be stored in the memory 20 and executed by the processor 10.
  • the processor 10 may control other configurations.
  • the memory 20 is connected to the processor 10 and stores various information / data for driving the processor 10.
  • the memory 20 may be included in the processor 10 or installed outside the processor 10 and connected to the processor 10 by a known means.
  • the specific configuration of the image processing apparatus 1 of FIG. 1 may be implemented such that various embodiments of the present invention described below are applied independently or two or more embodiments are applied together.
  • The image processing apparatus 1 may be included as a module in a device or a system having a photographing function or a gaze tracking function for an eye image, or may be implemented as a separate device. If the image processing apparatus 1 is implemented as a separate device, it may further include a unit for obtaining an eye image from another device, for example, a communication unit for receiving an eye image from another device. In addition, embodiments of other apparatuses or systems including the image processing apparatus 1 will be described below with reference to FIGS. 13 to 17.
  • FIG. 2 is an exemplary view of an eye image processed by an image processing apparatus according to an embodiment of the present invention.
  • the image processing apparatus may estimate the size of the pupil or the iris by processing the eye image.
  • the eye image may include images of both eyes or one eye.
  • the eye image processed by the image processing apparatus may be a low resolution eye image.
  • The eye image may be a low-resolution image of the user's eye captured by a low-resolution camera (e.g., a camera with 640*480 resolution) mounted on a device worn by the user, such as a head mounted display (HMD) device or an eye-brain interface (EBI) system/device.
  • Alternatively, the eye image may be a low-resolution eye image of the user photographed from a distance.
  • the contour of this object image will generally consist of a small number of contour points.
  • all of the individual outline points constituting this outline have a significant meaning, so the precise processing thereof is very important to increase the accuracy of image processing.
  • small processing errors can have a big impact on the image processing results, so sophisticated processing of individual outline points is important.
  • Since an elliptic equation for the pupil is calculated based on the outline points of the pupil image in order to estimate the actual size of the pupil, precise processing of the individual outline points constituting the pupil image by the image processing apparatus is very important.
  • In particular, when the pupil image is obscured by the image of another surrounding object (e.g., an eyelid image), careful removal of the outline points belonging to the obscured portion is one of the most important factors for obtaining an accurate elliptic equation for the pupil.
  • An image processing apparatus may be a device suitable for processing such a low resolution eye image.
  • Hereinafter, various embodiments of the present disclosure will be described assuming that the eye image processed by the image processing apparatus is a low-resolution image.
  • the high resolution eye image may be processed by the image processing apparatus through the same or similar processing.
  • The following mainly describes the image processing process for the pupil, but it will be readily understood by those skilled in the art that the same or similar description can easily be applied to the image processing process for the iris region, which encloses the pupil.
  • the pupil and iris images in the eye image are hidden by the eyelid image. That is, the eye image of the embodiment of FIG. 3 may be an image showing the eye in a state where the pupil and the iris are covered by the eyelid.
  • the pupil image may be an image of the pupil or an image associated with the pupil part, and may indicate the shape of the pupil.
  • The iris image is the image of the iris or an image associated with the iris portion, and may represent the shape of the iris.
  • the eyelid image is an image of the eyelid or an image associated with the eyelid portion, which may represent the shape of the eyelid.
  • the pupil image 31 in the eye image may be covered by the eyelid image.
  • The actual pupil (or iris) is circular in shape, but since the eyeball is spherical, the image of the pupil (or iris) in the photographed image is generally seen as elliptical.
  • the contour of the pupil image (or iris image) is in the form of an ellipse.
  • the outline of the pupil image may not have the shape of an intact ellipse.
  • the outline 32 of the pupil image detected from the eye image of FIG. 3A may be as shown in FIG. 3B.
  • the image processing apparatus may calculate an elliptic equation for the pupil based on the outline 32 of the pupil image in order to estimate the size of the pupil. This will be described in detail below with reference to FIG. 4.
  • The outline 33 of the ellipse having the elliptic equation calculated in this way may be as shown in FIGS. 3(c) and 3(d). Specifically, FIG. 3(c) shows the outline 32 of the pupil image together with the outline 33 of the ellipse having the calculated elliptic equation, and FIG. 3(d) shows only the outline 33 of the ellipse having the calculated elliptic equation.
  • the image processing apparatus has calculated an elliptic equation for the pupil based on the outline of the pupil image including the part obscured by the eyelid image.
  • the shape of the ellipse represented by the calculated elliptic equation and the actual shape of the pupil differ. That is, the calculated elliptic equation cannot represent the exact size of the pupil.
  • the image processing apparatus may perform an image processing process of removing a portion of the pupil image, which is covered by the eyelid image, in order to calculate an accurate elliptic equation. That is, an outline correction process may be performed to remove the hidden portion, which will be described below in detail with reference to FIG. 7.
  • the image processing apparatus may again calculate an accurate elliptic equation for the pupil, and determine this as the final elliptic equation for the pupil.
  • An elliptic equation evaluation process will be described in detail below with reference to FIG. 5.
  • The outline 34 of the ellipse having the recalculated elliptic equation may be as shown in FIGS. 3(e) and 3(f). Specifically, FIG. 3(e) shows the outline 34 of the ellipse having the recalculated elliptic equation, and FIG. 3(f) compares the outline 33 of the ellipse having the first calculated elliptic equation (the ellipse of FIG. 3(d)) with the outline 34 of the ellipse having the recalculated elliptic equation (the ellipse of FIG. 3(e)).
  • Since the image processing apparatus calculates the elliptic equation for the pupil based on the outline of the pupil image from which the portion covered by the eyelid image has been removed, the shape of the ellipse represented by the calculated elliptic equation and the actual shape of the pupil may be substantially the same. That is, the calculated elliptic equation may represent the exact size of the pupil. Thus, this elliptic equation is the final elliptic equation and can be used to obtain geometric information about the pupil.
  • FIG. 4 is a flowchart of an image processing method by an image processing apparatus according to an embodiment of the present invention.
  • FIG. 4 illustrates a method of estimating the size of a pupil or an iris from an eye image by an image processing apparatus according to an exemplary embodiment.
  • Each step of FIG. 4 may be performed by the image processing apparatus, or the processor of the image processing apparatus, described above with reference to FIG. 1.
  • the image processing apparatus may acquire an eye image (S410).
  • the eye image may include an image of the target object (target object image) and / or an image of a surrounding object covering the target object (a peripheral object image).
  • the target object may be a pupil or an iris.
  • the surrounding object may be an eyelid that hides the pupil or iris (eg, the upper eyelid).
  • various embodiments of the present invention will be described with a focus on the case where the target object is the pupil or the iris and the surrounding object is the eyelid.
  • the image processing apparatus may detect an outline of an image of the target object from the eye image (S420). For example, the image processing apparatus may detect an outline of a pupil image or an iris image that is a target object image from the eye image. That is, the image processing apparatus may detect the outline of the image portion associated with the pupil or iris which is the target object from the eye image.
  • There are a variety of known methods for detecting the outline of the image of the target object, depending on the type of image, the resolution, the type of the target object, and the like.
  • the image processing apparatus may detect the outline of the image of the target object from the eye image using one or more of these known methods.
  • the outline of the detected target object image may be as shown in FIG. 3 (b).
  • the outline of the detected target object image may include a plurality of outline points. That is, the detected outline may be composed of a plurality of outline points.
  • the number of outline points constituting the outline may vary according to the resolution of the image. For example, in the case of a low resolution image, the outline may consist of a relatively small number of outline points. Therefore, in the case of low resolution images, the processing of one outline point is very important as described above. Meanwhile, in the case of a high resolution image, the outline may be configured by a relatively large number of outline points.
  • the outline of the detected target object image may include a portion associated with the image of the surrounding object covering the target object.
  • the outline of the detected pupil image or iris image may include a portion associated with the eyelid image that obscures the pupil image or iris image.
  • some of the plurality of outline points that make up the outline may be associated with an eyelid image that masks the pupil or iris. That is, some of the outline points may be outline points that indicate portions of the eyelid image. This has been described above with reference to FIG. 3.
  • the image processing apparatus may calculate an elliptic equation for the target object based on the outline of the target object image (S430).
  • the image processing apparatus may calculate an elliptic equation for the target object based on the plurality of outline points constituting the outline of the target object image.
  • For example, the image processing apparatus may calculate the parameters of the elliptic equation for the iris or pupil (e.g., the center of the ellipse, the lengths of the long and short axes, and the slope) using the outline points of the iris image or the pupil image.
  • the elliptic equation and its parameters may be as shown in Equation 1 below.
  • In Equation 1, Xc and Yc represent the center point of the ellipse, a and b represent the lengths of the long axis and the short axis, respectively, and θ represents the slope (rotation angle) of the ellipse.
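  • Equation 1 itself is not reproduced in this text. A standard form of a rotated-ellipse equation consistent with the parameter descriptions above (the exact notation of the original is an assumption, with a and b taken here as semi-axis lengths) is:

        \frac{\big( (x - X_c)\cos\theta + (y - Y_c)\sin\theta \big)^2}{a^2} + \frac{\big( -(x - X_c)\sin\theta + (y - Y_c)\cos\theta \big)^2}{b^2} = 1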
  • the image of the target object on the photographed eye image is generally seen as an ellipse. Therefore, the elliptic equation calculated based on the outline of the target object image may be used as an equation for representing the shape or size of the target object.
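  • As an illustration of the fitting step above, the sketch below fits an ellipse to detected outline points with OpenCV's fitEllipse (a least-squares conic fit). The patent does not specify which fitting method is used, so this is only a plausible stand-in; the function name fit_ellipse and its interface are hypothetical.

    import numpy as np
    import cv2

    def fit_ellipse(outline_points):
        """Fit an ellipse to outline points given as an (N, 2) array of (x, y) pixel coordinates.

        Returns (Xc, Yc, a, b, theta), with a and b the semi-axis lengths and theta in radians.
        cv2.fitEllipse requires at least 5 points.
        """
        pts = np.asarray(outline_points, dtype=np.float32).reshape(-1, 1, 2)
        (xc, yc), (width, height), angle_deg = cv2.fitEllipse(pts)
        # OpenCV returns the full axis lengths of the bounding rotated rectangle.
        return xc, yc, width / 2.0, height / 2.0, np.deg2rad(angle_deg)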
  • the image processing apparatus may determine whether the calculated elliptic equation is recognized as the final elliptic equation for the target object (S440). This will be described below in detail with reference to FIG. 5.
  • the final elliptic equation may be an elliptic equation used to obtain or estimate geometric information about the target object.
  • the final elliptic equation may be an elliptic equation used to obtain information about at least one of an area, shape or location of the target object.
  • When the elliptic equation is recognized as the final elliptic equation, the image processing apparatus may obtain information on at least one of the area, shape, or location of the target object using the final elliptic equation (S450). That is, the image processing apparatus may obtain at least one of size information indicating the size of the target object, shape information indicating the shape of the target object, or location information indicating the location of the target object. Through this, the actual size of the target object may be estimated.
  • Information about the estimated size of the pupil and its change may be used to acquire, in real time, information such as changes in light brightness, the user's attention, cognitive load, and emotional changes.
  • the estimated information may be used for gaze recognition and estimation.
  • the image processing apparatus may correct an outline of the target object image by using a Fourier descriptor (S460).
  • the Fourier descriptor may be obtained from shape information / data indicative of the shape characteristic of the target object image. This will be described below in detail with reference to FIGS. 6 to 7.
  • the image processing apparatus may repeat the above-described steps S430 to S460 on the corrected outline. For example, the image processing apparatus may recalculate an elliptic equation based on the corrected outline of the target object image, and determine whether the calculated elliptic equation is recognized as the final elliptic equation for the target object. When the calculated elliptic equation is recognized as the final elliptic equation for the target object, the image processing apparatus may acquire size information and the like of the target object using the elliptic equation. If the recalculated elliptic equation is not recognized as the final elliptic equation for the target object, the image processing apparatus may recorrect the outline of the target object image using the Fourier descriptor. In this case, the above-described process may be repeated until the calculated elliptic equation is recognized as the final elliptic equation for the target object.
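  • The iterative flow described above can be summarized as in the sketch below. Only the control flow is shown; fit_ellipse, error_rate, and correct_outline are hypothetical names standing in for the operations detailed with FIGS. 5 to 8, and the maximum iteration count is a safeguard added here, not something stated in the text.

    def estimate_target_object(outline_points, error_threshold=0.1, max_iter=10):
        """Iteratively fit an ellipse to the target object outline until it is accepted as final.

        outline_points: (N, 2) array of outline coordinates detected in step S420.
        Returns the final ellipse parameters (Xc, Yc, a, b, theta), or None if no
        acceptable ellipse is found within max_iter iterations.
        """
        pts = outline_points
        for _ in range(max_iter):
            ellipse = fit_ellipse(pts)                        # step S430
            if error_rate(pts, ellipse) < error_threshold:    # step S440 (FIG. 5)
                return ellipse                                # final elliptic equation, used in S450
            pts = correct_outline(pts)                        # step S460 (FIGS. 7 and 8)
        return None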
  • the elliptic equation evaluation process may correspond to step S440 of FIG. 4. That is, the elliptic equation evaluation process may be a step in which the image processing apparatus determines whether the elliptic equation is recognized as the final elliptic equation for the target object.
  • Each step of FIG. 5 may be performed by the image processing apparatus, or the processor of the image processing apparatus, described above with reference to FIG. 1.
  • the target object may be covered by the surrounding object.
  • The outline of the target object image may include a portion that is covered by a surrounding object image having different shape characteristics, so an elliptic equation calculated based on such an outline makes it difficult to accurately estimate the size of the target object. Accordingly, before obtaining the size information of the target object using the calculated elliptic equation, the image processing apparatus may perform an elliptic equation evaluation process to determine whether the calculated elliptic equation can be recognized as the final elliptic equation usable for estimating the size of the target object. In this way, the image processing apparatus may acquire accurate size information of the target object based on a correct elliptic equation, without the unnecessary step of estimating the size of the target object from an incorrect elliptic equation.
  • the image processing apparatus calculates an error ratio based on the distance between each outline point constituting the outline and an ellipse having an elliptic equation, and determines whether the elliptic equation is the final elliptic equation for the target object according to the error ratio.
  • the elliptic equation can be evaluated in this way. Hereinafter, this will be described in detail with reference to FIG. 5.
  • The error rate may be a value representing the proportion of outline points, among all the outline points, whose distance to the ellipse exceeds a reference distance; that is, the proportion of outline points spaced apart from the ellipse represented by the elliptic equation by the reference distance or more.
  • the image processing apparatus may calculate a distance from an ellipse having an elliptic equation with respect to each outline point constituting the outline (S510). That is, the image processing apparatus may calculate a distance (eg, a minimum distance) between each outline point and an ellipse. For example, the distance d between each outline point and an ellipse may be calculated using Equation 2 below.
  • the image processing apparatus may calculate an error rate based on the distance d between each outline point and the ellipse.
  • the image processing apparatus may calculate a ratio of outline points whose absolute value of the distance d is equal to or less than a reference value (for example, 0.1), and calculate an error ratio based on the ratio (S520). In this case, whether the absolute value of the distance d is less than or equal to the reference value may be determined using Equation 3 below.
  • The image processing apparatus may determine whether the elliptic equation is the final elliptic equation for the target object based on the error rate (S530). For example, the image processing apparatus may determine that the elliptic equation is the final elliptic equation for the target object when the error rate is less than the reference ratio (e.g., 0.1). That is, when the proportion of outline points whose distance to the ellipse exceeds the reference distance is less than the reference ratio, the image processing apparatus may determine that the elliptic equation is the final elliptic equation for the target object. Conversely, the image processing apparatus may determine that the elliptic equation is not the final elliptic equation for the target object when the error rate is equal to or greater than the reference ratio.
  • An embodiment in which the calculated elliptic equation has an error rate equal to or greater than the reference ratio is shown in FIG. 3(d), and an embodiment in which the calculated elliptic equation has an error rate less than the reference ratio is shown in FIG. 3(e).
  • In other words, the image processing apparatus checks to what extent the outline points lie on, or within the reference distance of, the ellipse calculated from them, i.e., how well the outline points fit the ellipse, and based on this it can determine whether the elliptic equation can be recognized as the final elliptic equation for the target object.
  • the image processing apparatus can evaluate the elliptic equation at a high processing speed.
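  • Equations 2 and 3 are not reproduced in this text. The sketch below illustrates one plausible reading of the evaluation: the normalized algebraic distance of each outline point to the ellipse is used as d, points with |d| above the reference value (0.1) are counted as errors, and the ellipse is accepted when the error rate is below the reference ratio. The exact distance measure of the original (Equation 2) may differ.

    import numpy as np

    def error_rate(points, ellipse, ref_value=0.1):
        """Fraction of outline points lying farther than the reference value from the ellipse.

        points: (N, 2) array of outline coordinates; ellipse: (Xc, Yc, a, b, theta).
        d is taken as the normalized algebraic distance |(x'/a)^2 + (y'/b)^2 - 1| (an assumption).
        """
        xc, yc, a, b, theta = ellipse
        dx, dy = points[:, 0] - xc, points[:, 1] - yc
        # Rotate each point into the ellipse's own axes.
        xp = dx * np.cos(theta) + dy * np.sin(theta)
        yp = -dx * np.sin(theta) + dy * np.cos(theta)
        d = np.abs((xp / a) ** 2 + (yp / b) ** 2 - 1.0)
        return np.mean(d > ref_value)

    def is_final_ellipse(points, ellipse, ref_ratio=0.1):
        """Accept the elliptic equation as final when the error rate is below the reference ratio (S530)."""
        return error_rate(points, ellipse) < ref_ratio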
  • FIG. 6 shows shape information and Fourier descriptor for an elliptic equation according to an embodiment of the present invention.
  • The Fourier descriptor is a contour-based (outline-based) shape descriptor, like the wavelet-based, triangle-based, and ellipse-based shape descriptors.
  • Among the contour-based shape descriptors, it may also be referred to as a Fourier shape descriptor or a Fourier-based shape descriptor.
  • Such a Fourier descriptor can be obtained from shape information / data that represents a characteristic (shape characteristic or shape signature) for the shape of a particular object.
  • the shape information may be obtained from the coordinates of the outline points of the particular object or image of the particular object.
  • shape information may be obtained from a radial distance between the outline points of a particular object or object image and the center of gravity (or centroid) of the outline points.
  • The radial distance may be information indicating the distance between the center point and each outline point according to the angle change (e.g., the counterclockwise angle change).
  • the shape information may be in the form of a distance function indicating a radius distance according to the change in angle. That is, the shape information may be information expressed as a distance function indicating a radius distance according to the angle change.
  • shape information may be referred to as a shape function, a distance function, or the like.
  • shape information about an ellipse and a Fourier descriptor will be described by way of example.
  • In this example, the shape information is represented by the distance function indicating the above-described radial distance.
  • the elliptic equation of the ellipse is as shown in Equation 5 below.
  • the radial distance of the specific coordinates (xi, yi) on the ellipse may be calculated through Equation 6 below.
  • Shape information can be calculated based on this radial distance information (or function).
  • the graph of the shape information thus calculated may be as shown in FIG. 6 (b).
  • In the graph of the shape information, the point with the maximum peak value and the point with the minimum peak value may correspond to an edge or an inflection point of the object (e.g., points a, b, c, d).
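  • A minimal sketch of the shape signature described above: the radial distance from the center of gravity of the outline points to each outline point, ordered by the counterclockwise angle. Equations 5 and 6 are not reproduced here; the centroid-distance signature below is an assumption about their exact form.

    import numpy as np

    def shape_signature(points):
        """Centroid-distance shape signature of an outline.

        points: (N, 2) array of outline coordinates.
        Returns (angles, radii) sorted by the counterclockwise angle around the centroid.
        """
        centroid = points.mean(axis=0)
        dx, dy = points[:, 0] - centroid[0], points[:, 1] - centroid[1]
        angles = np.mod(np.arctan2(dy, dx), 2 * np.pi)   # angle of each point around the centroid
        radii = np.hypot(dx, dy)                         # radial distance (cf. Equation 6)
        order = np.argsort(angles)
        return angles[order], radii[order]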
  • the Fourier descriptor can be obtained by Fourier transforming this shape information.
  • the graph of the Fourier descriptor thus obtained may be as shown in FIG. 6 (c).
  • Such a Fourier descriptor may represent shape information / characteristics of a corresponding object or object image for each frequency component.
  • the high frequency component of the Fourier descriptor for a specific object may be treated as a noise component.
  • The high frequency component of the Fourier descriptor may be treated as noise. Therefore, when the high frequency components are removed through a low-pass filter, the noise component of the object is removed, and the object having the noise-free outline may be restored by inverse Fourier transform.
  • As more high frequency components are removed, the object is increasingly simplified to only its major shape features. As such, the Fourier descriptor can be used to remove the noise portion of a particular object or object image while maintaining its original shape characteristics.
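  • The denoising just described can be sketched as a straightforward FFT low-pass: Fourier transform the shape signature, keep only the lowest-frequency components, and inverse transform. The cutoff of 16 components is taken from the example given later with FIG. 8; nothing else here is a claim about the original implementation.

    import numpy as np

    def denoise_signature(radii, keep=16):
        """Remove noise from a shape signature by keeping only its lowest-frequency components.

        radii: 1-D array of radial distances (the shape signature sampled over angle).
        keep:  number of lowest-frequency Fourier components to retain (the low frequency region).
        """
        spectrum = np.fft.rfft(radii)                  # Fourier descriptor of the signature
        spectrum[keep:] = 0.0                          # drop components at or above the reference frequency
        return np.fft.irfft(spectrum, n=len(radii))    # restored, denoised shape signature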
  • the image processing apparatus may perform outline correction to calculate an accurate elliptic equation. To this end, the image processing apparatus may use the above-described Fourier descriptor.
  • FIG. 7 is a flowchart illustrating an operation of correcting an outline of a target object image by an image processing apparatus according to an exemplary embodiment.
  • FIG. 8 is an example of a graph showing detailed steps of outline correction of FIG. 7. In the embodiment of FIG. 8, it is assumed that the target object is a pupil.
  • Each step of FIGS. 7 to 8 may be performed by the image processing apparatus, or the processor of the image processing apparatus, described above with reference to FIG. 1.
  • the correcting the outline of the image of the target object may be the step S460 of FIG. 4. That is, the step of correcting the outline of the image of the target object may be a step of correcting the outline of the target object image using a Fourier descriptor when the calculated elliptic equation is not recognized as the final elliptic equation for the target object.
  • the Fourier descriptor may be obtained from shape information / data indicative of the shape characteristic of the target object image.
  • The process of correcting the outline may include removing noise from the shape information using a Fourier descriptor, and removing from the outline a part associated with the surrounding object image covering the target object image, using a slope characteristic of the shape information.
  • a process of removing noise will be described with reference to steps S710 to S730, and a process of removing a part associated with the surrounding object image will be described with reference to steps S740 to S770.
  • the image processing apparatus may acquire shape information about the target object image (S710).
  • the shape information may provide information about shape characteristics of the target object image. This is the same as described above with reference to FIG. 6.
  • the image processing apparatus may obtain shape information about the target object image based on coordinate information of outline points of the target object image.
  • the image processing apparatus may obtain shape information about the pupil image based on the coordinate information of the outline points of the pupil image.
  • the image processing apparatus may obtain shape information about the target object image based on a radial distance between each outline point of the target object image and the center of gravity of the outline points.
  • the image processing apparatus may obtain shape information about the pupil image based on a radius distance between each outline point of the pupil image and the center of the outline points.
  • a graph representing shape information of the pupil image obtained from the eye image of FIG. 3A may be the same as that of FIG. 8A.
  • the x-axis of the graph of shape information may represent an angle and the y-axis may represent a radial distance. That is, in this case, the shape information may be in the form of a distance function representing the distance according to the angle change.
  • the image processing apparatus may obtain a Fourier descriptor by Fourier transforming the shape information (S720).
  • the Fourier descriptor may represent the shape characteristic / information of the target object image for each frequency component.
  • the Fourier descriptor may provide shape characteristics / information for each frequency component of the pupil image.
  • the graph representing the Fourier descriptor thus obtained may be as shown in FIG. 8 (b).
  • the x-axis of the graph of this Fourier descriptor may represent frequency and the y-axis may represent magnitude.
  • the Fourier descriptor for the pupil image shows a form in which amplitudes are concentrated on frequency components below a specific frequency. That is, the Fourier descriptor for the elliptical pupil image may have a shape in which the size is concentrated in the low frequency component.
  • the image processing apparatus may recover the shape information by extracting the low frequency region of the Fourier descriptor and inversely transforming the low frequency region (S730). Through this, noise on shape information may be removed.
  • the low frequency region means a frequency region including frequency components lower than a preset reference frequency.
  • the reference frequency may be a frequency that allows the recovered shape information to remain sufficiently free of noise while maintaining the original shape characteristics.
  • the reference frequency may be set differently according to, for example, the type of the target object, the resolution of the image, and various other factors. That is, the reference frequency can be variable or flexible.
  • For example, the image processing apparatus may extract, as the low frequency region, the portion of the Fourier descriptor from the lowest frequency component up to a frequency component lower than the reference frequency (e.g., up to the 16th lowest frequency component), and recover the shape information by inverse Fourier transforming the extracted region.
  • the reconstructed shape information may be in a state where noise is removed while maintaining original shape characteristics.
  • The graph showing the shape information from which the noise has been removed may be as shown in FIG. 8(c). FIG. 8(c) shows both the graph of the original shape information of FIG. 8(a) and the graph of the shape information from which noise has been removed.
  • The image processing apparatus may divide the outline of the target object image and the graph of the shape information into a plurality of parts based on the peak values of the graph representing the shape information (S740). In an embodiment, the image processing apparatus may subdivide the outline of the target object image and the graph into a plurality of parts based on the maximum peak values of the graph representing the restored shape information. For example, the image processing apparatus may subdivide the outline of the pupil image and the graph of the shape information into a plurality of parts based on the maximum peak values of the graph representing the restored shape information, as shown in FIG. 8(d).
  • each part of the outline of the target object image may be matched one-to-one with each part of the graph of shape information. That is, parts (1) to (6) of the outline of the left view of FIG. 8 (d) may correspond to parts (a) to (f) of the graph of the right view of FIG. 8 (d).
  • In the outline of the elliptical pupil image, the part covered by the roughly linear eyelid image is located far from the center of gravity. Therefore, since this part has the maximum peak value, as shown in the left figure of FIG. 8(d), it corresponds to the part of the shape information graph that has the maximum peak value.
  • the image processing apparatus may correspond one-to-one to each part of the outline of the target object image and each part of the graph of the shape information.
  • the image processing apparatus may calculate a first order gradient of a graph representing shape information (S750).
  • the graph of the primary gradient thus calculated may be as shown in FIG. 8 (e).
  • the image processing apparatus may calculate a max-min gradient for each part (S760). That is, the image processing apparatus may calculate a difference between the maximum gradient and the minimum gradient for each part.
  • the maximum-minimum gradient thus calculated may be as indicated by the arrow in FIG. 8 (e). In this case, as shown in FIG. 8E, the part (part 1) covered by the eyelid image is almost straight, and thus may have the largest maximum-minimum gradient value.
  • the image processing apparatus may remove the part having the largest maximum-minimum gradient from the outline of the target object image (S770).
  • The image processing apparatus may remove all or some of the outline points associated with the part having the largest maximum-minimum gradient, thereby removing that part from the outline of the target object image. In this way, the image processing apparatus may correct the outline of the target object image.
  • In this way, using the slope characteristic (e.g., the first-order gradient value) of the shape information, the image processing apparatus can remove from the outline of the target object image all or some of the portion covered by the surrounding object image whose shape characteristics differ from those of the target object.
  • the outline of the corrected target object image may be as shown in FIG. 8 (f).
  • The corrected outline of the pupil image may be in a state where part or all of the portion covered by the eyelid image has been removed.
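  • Correction steps S740 to S770 can be sketched as below, reusing the shape_signature and denoise_signature sketches above. Splitting the signature at its peaks, taking the first-order gradient, and removing the part with the largest max-min gradient follow the description; the peak-detection details (simple local maxima) are an assumption.

    import numpy as np

    def correct_outline(points):
        """Remove from the outline the part most likely covered by the eyelid (S740-S770)."""
        centroid = points.mean(axis=0)
        dx, dy = points[:, 0] - centroid[0], points[:, 1] - centroid[1]
        order = np.argsort(np.mod(np.arctan2(dy, dx), 2 * np.pi))
        pts = points[order]                                        # outline points ordered by angle
        radii = np.hypot(pts[:, 0] - centroid[0], pts[:, 1] - centroid[1])
        smooth = denoise_signature(radii)                          # S710-S730
        # S740: split the signature (and hence the outline) at its local maximum peaks.
        peaks = [i for i in range(1, len(smooth) - 1)
                 if smooth[i] > smooth[i - 1] and smooth[i] > smooth[i + 1]]
        boundaries = [0] + peaks + [len(smooth)]
        grad = np.gradient(smooth)                                 # S750: first-order gradient
        # S760: max-min gradient of each part; S770: remove the part with the largest value.
        spans = [(grad[s:e].max() - grad[s:e].min(), s, e)
                 for s, e in zip(boundaries[:-1], boundaries[1:]) if e - s > 1]
        _, s, e = max(spans)
        keep = np.ones(len(pts), dtype=bool)
        keep[s:e] = False
        return pts[keep]                                           # corrected outline points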
  • FIG. 9 and 10 illustrate changes in an ellipse calculated based on an outline before or after the image processing apparatus according to an embodiment of the present invention performs outline correction. Meanwhile, FIG. 9 shows a change of an ellipse on the eye image, and FIG. 10 shows a change of an ellipse on the outline of the target object image.
  • FIGS. 9(a) and 10(a) show an ellipse a1 (or the outline of an ellipse) calculated based on the outline a3 of the target object image before the image processing apparatus performs outline correction of the target object image.
  • The calculated ellipse a1 does not substantially coincide with the outline a3 of the target object image. As described above, this is because the outline includes a part associated with the surrounding object, which is not elliptical. Therefore, the image processing apparatus may not recognize the calculated elliptic equation as the final elliptic equation for the target object, and may perform outline correction of the target object image.
  • the image processing apparatus may recalculate an elliptic equation based on the corrected outline of the target object image, and determine whether the recalculated elliptic equation may be recognized as the final elliptic equation for the target object.
  • the ellipse a2 (or the outline of the ellipse) represented by the elliptic equation thus calculated may be the same as in FIGS. 9B and 10B.
  • FIGS. 9(b) and 10(b) illustrate the ellipse a2 calculated based on the outline a4 of the target object image after the image processing apparatus performs outline correction of the target object image.
  • the recalculated ellipse a2 may substantially coincide with the outline a4 of the target object image. As described above, this is because the part whose outline is covered by the surrounding object is corrected to be removed. Accordingly, the image processing apparatus may recognize the calculated elliptic equation as the final elliptic equation for the target object, and acquire information on at least one of the size, shape, or position of the target object by using the final elliptic equation. Meanwhile, whether or not the calculated ellipse substantially coincides with the outline of the target object image may be confirmed by evaluating the elliptic equation described above with reference to FIG. 5.
  • FIG. 9(c) is a diagram comparing the ellipses calculated before and after the image processing apparatus performs outline correction of the target object image. As shown in FIG. 9(c), it can be seen that the ellipse a2 calculated after outline correction matches the outline of the target object image more closely than the ellipse a1 calculated before outline correction.
  • In the embodiment of FIG. 11, the target object is a pupil.
  • In this case, the size of the target object may be the area of the pupil.
  • the area of the pupil can be calculated through the equation below.
  • a and b represent the length of the long axis and short axis of the ellipse, respectively.
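  • The area equation referenced above is not reproduced in this text. If a and b here denote the semi-axis lengths (as in the ellipse equation sketched earlier), the area of the elliptical pupil is S = \pi a b; if they instead denote the full axis lengths, the corresponding expression is S = \pi a b / 4.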
  • series 1 shows a result graph when elliptic equation evaluation and outline correction are not performed
  • series 2 shows a result graph when elliptic equation evaluation and outline correction are performed.
  • the x-axis of the graph represents the frame number of the video image
  • the y-axis represents the area of the pupil.
  • The graph may be obtained by applying a median filter over every 10 frames.
  • In the case of series 2, the change in the actual pupil area is represented more accurately than in series 1. That is, when elliptic equation evaluation and outline correction are performed, information about the actual change in pupil area can be obtained more accurately.
  • FIG. 12 illustrates a calculation time for an image processing method of an image processing apparatus according to an embodiment of the present invention.
  • the image processing apparatus may estimate the shape of the pupil or the iris in real time.
  • Hereinafter, with reference to FIGS. 13 to 17, exemplary apparatuses or systems that include, or correspond to, the image processing apparatus described above will be described.
  • The EBI device or EBI system of the embodiment of FIGS. 13 to 17 is merely an example of an apparatus or system that may include or correspond to the image processing apparatus, and the image processing apparatus may also be included in, or correspond to, various other apparatuses or devices.
  • FIG. 13 illustrates an eye-brain interface (EBI) system according to an exemplary embodiment of the present invention.
  • FIG. 14 is a block diagram of a host device and a slave device according to an embodiment of the present invention.
  • an EyeBrain Interface (EBI) system may include a host device 150 and a slave device 100.
  • the slave device 100 may represent various types of wearable devices that can be worn by a user.
  • the slave device 100 may represent a device that contacts / wears on a user's body part such as a head mounted display (HMD), a headset, a smart ring, a smart watch, an earset, an earphone, and the like.
  • the slave device 100 may include at least one sensor to sense the biosignal of the user through a body part of the user.
  • The biosignal may represent various signals generated from the user's body, such as pulse, blood pressure, and brain waves, according to the user's conscious and/or unconscious activity (e.g., breathing, heartbeat, metabolism).
  • the slave device 100 may sense the brain wave of the user as the biosignal of the user and transmit the sensing result to the host device 150.
  • the host device 150 may represent a device operating based on a sensing result of the biosignal received from the slave device 100.
  • the host device 150 may be various electronic devices that receive a biosignal sensing result of the user from the slave device 100 and perform various operations based on the received sensing result.
  • the host device 150 may be, for example, various electronic devices such as a TV, a smartphone, a tablet PC, a smart car, a PC, a laptop, and the like.
  • the EBI system includes a slave device 100 and a host device 150 to provide a control scheme based on a biosignal of a user.
  • the system directly senses the user's intention by sensing the user's biosignal and is controlled accordingly.
  • the EBI system provides the user with a more convenient, intention-based control method.
  • the configuration of the slave device 100 and the host device 150 will be described in more detail.
  • The slave device 100 may include a position marker unit 120, a gaze tracking unit 130, an EEG sensing unit 110, a sensor unit 260, a communication unit 250, and a processor 240.
  • the position marker unit 120 may include at least one light emitting device (eg, an infrared LED) that emits light.
  • The host device 150 may track the position marker unit of the slave device 100 in real time, and thereby detect the position of the user wearing the slave device 100, the distance between the host device 150 and the user, their relative position, and the like (hereinafter, the 'user's position').
  • the plurality of light emitting elements may be positioned in the position marker unit 120 spaced apart by a predetermined distance.
  • The host device 150 may detect the relative distance between the host device 150 and the user by tracking the light emitting elements of the position marker unit 120 and measuring the separation distance between the light emitting elements in real time. For example, when the position marker unit 120 moves away from the host device 150, the separation distance between the light emitting elements measured at the host device 150 decreases, and when the position marker unit 120 approaches the host device 150, the measured separation distance between the light emitting elements increases.
  • The host device 150 may calculate the ratio between the separation distance between the light emitting devices measured in real time and the predetermined separation distance between the actual light emitting devices, and thereby calculate the relative distance between the host device 150 and the user.
  • The position marker unit 120 for tracking the user's position may be included in the slave device 100 in various forms, and the host device 150 may detect the position of the user based on the position and size of these position marker units 120 and on the number, location, and separation distance of their light emitting devices.
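  • A minimal sketch of the distance estimation described above, under a pinhole-camera assumption that the text does not state: the apparent LED separation in the image is inversely proportional to distance, so the ratio of the known physical separation to the measured separation, scaled by the camera focal length, yields the distance. The focal-length parameter and all names are hypothetical.

    def estimate_user_distance(measured_separation_px, actual_separation_mm, focal_length_px):
        """Estimate the host-to-user distance from the apparent separation of the marker LEDs.

        measured_separation_px: LED separation measured in the camera image in real time.
        actual_separation_mm:   predetermined physical separation of the LEDs on the marker unit.
        focal_length_px:        camera focal length in pixels (a calibration value added here,
                                not mentioned in the text, to turn the ratio into a distance).
        Returns the estimated distance in millimetres, assuming a pinhole camera model.
        """
        return focal_length_px * actual_separation_mm / measured_separation_px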
  • the gaze tracking unit 130 may track the gaze of the user.
  • the gaze tracking unit 130 may be provided in the slave device 100 to be positioned around the eyes of the user to track the eyes of the user (eye movement) in real time.
  • the eye tracking unit 130 may include a light emitting device (eg, an infrared LED) that emits light and a camera sensor that receives (or senses) the light emitted from the light emitting device.
  • the gaze tracking unit 130 may photograph light reflected from the user's eyes with a camera sensor and transmit the photographed image to the processor 240 (video analysis method).
  • In addition to the above-described video analysis method, the eye tracking unit 130 may track the user's eyes using a contact lens method (an eye tracking method using light reflected from a mirror-embedded contact lens, or the magnetic field of a coil-embedded contact lens) or a sensor attachment method (an eye tracking method using the electric field produced by eye movement, measured by sensors attached around the eyes).
  • the EEG sensing unit 110 may sense the EEG of the user.
  • The EEG sensing unit 110 may include at least one electroencephalography (EEG) sensor, a magnetoencephalography (MEG) sensor, and/or a near-infrared spectrometer (NIRS).
  • the brain wave sensing unit 110 may be provided at a body (eg, head) contact position where the brain wave of the user may be measured when the user wears the slave device 100, and measure the brain wave of the user.
  • The EEG sensing unit 110 measures electrical/optical signals of various frequencies that vary according to the brain waves generated from the contacted body part of the user or according to the activation state of the brain.
  • Since EEG is a biosignal that differs from user to user, simply extracting the user's brain waves and analyzing them on a uniform basis is less accurate in distinguishing the user's current cognitive state. Therefore, in order to accurately measure the cognitive state of the user based on the EEG, the present disclosure provides a method of calibrating the EEG according to the current cognitive state of each user.
  • The sensor unit 260 may include at least one sensing means, and may sense the surrounding environment of the device 100 using the sensing means. In addition, the sensor unit 260 may transmit the sensing result to the processor. In particular, in the present specification, the sensor unit may sense the movement, motion, and the like of the slave device 100 and transmit the sensing result to the processor 240.
  • The sensor unit 260 may include, as sensing means, an inertial measurement unit (IMU) sensor, a gravity sensor, a geomagnetic sensor, a motion sensor, a gyro sensor, an accelerometer, a magnetometer, an acceleration sensor, an inclination sensor, an altitude sensor, an infrared sensor, an illuminance sensor, a global positioning system (GPS) sensor, and the like.
  • the sensor unit 260 collectively refers to the various sensing means described above.
  • the sensor unit 260 may sense various user inputs and environment of the device, and may transmit a sensing result to allow the processor to perform an operation accordingly.
  • the above-described sensing means may be included in the slave device 100 as a separate element or integrated into at least one or more elements.
  • the communication unit 250 may communicate with an external device using various protocols, and may transmit / receive data through the communication unit 250.
  • the communication unit 250 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
  • the slave device 100 may perform pairing with the host device 150 using the communication unit 250.
  • the slave device 100 may transmit / receive various signals / data with the host device 150 using the communication unit 250.
  • the processor 240 may control the position marker unit 120, the eye tracking unit 130, the brain wave sensing unit 110, the sensor unit 260, and the communication unit 250.
  • the processor 240 may control transmission / reception of signals (or data) between the above-described units.
  • the processor 240 may transmit a sensing result received from at least one sensor provided in the slave device 100 to the host device 150.
  • the sensing result may refer to raw data obtained by using at least one sensor included in the slave device 100 or data processed through a predetermined algorithm.
  • processor 240 may perform various operations for calibrating the user's gaze and brain waves, which will be described in detail later with reference to FIGS. 6 to 13.
  • the slave device 100 may optionally include some of the configuration units shown in FIGS. 1 and 2, and may further include various units required for the purpose and operation of the device, such as a memory unit, a camera unit, and a power supply unit.
  • the host device 150 may include a camera unit 140, a display unit 210, a communication unit 230, and a processor 220.
  • the camera unit 140 may photograph the position marker unit 120 of the slave device 100.
  • the camera unit 140 may capture the position marker unit 120 of the slave device 100 to obtain a captured image of the position marker unit 120.
  • the camera unit 140 may transmit the acquired captured image to the processor 220, and the processor 220 may process the captured image to acquire a position of a user wearing the slave device 100.
  • the processor 220 may acquire the position of the user by analyzing the position and size of the position marker unit 120 and the number, positions, and separation distances of the light emitting elements included therein, for example as in the sketch below.
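  • As a rough illustration of how marker geometry can yield position, the user's distance can be estimated from the apparent pixel separation of two light emitting elements under a simple pinhole camera model. The marker spacing and focal length below are assumed example values, not figures from this disclosure.

```python
def distance_from_marker(pixel_separation: float,
                         marker_separation_m: float = 0.06,
                         focal_length_px: float = 800.0) -> float:
    # Pinhole model: distance = focal_length * real_separation / pixel_separation
    return focal_length_px * marker_separation_m / pixel_separation

# e.g. two marker LEDs appearing 40 px apart -> roughly 1.2 m from the camera
# print(distance_from_marker(40.0))
```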
  • the camera unit 140 may be configured as a wide angle camera having an angle of view of about 60 degrees or more.
  • when the camera unit 140 is configured as a general camera (a camera having an angle of view of less than 60 degrees), the user's position can be tracked within a left-right angle of about 60 degrees in front of the host device 150 and at a distance of about 60 to 90 cm between the slave device 100 and the host device 150.
  • when the camera unit 140 is configured as a wide-angle camera (a camera having an angle of view of 60 degrees or more), position tracking is possible within a left-right angle of about 170 degrees in front of the host device 150 and up to a distance of about 3 m between the slave device 100 and the host device 150.
  • the camera unit 140 of the present invention may therefore be configured as a wide-angle camera to obtain more accurate position data of the user; the relation between angle of view and trackable area is illustrated below.
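  • Under a pinhole assumption, the horizontal coverage at distance d is 2 · d · tan(fov / 2), which is why a wider angle of view enlarges the trackable area. The numbers in the short sketch below are illustrative only.

```python
import math

def coverage_width(distance_m: float, fov_deg: float) -> float:
    # Horizontal extent visible to the camera at the given distance.
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

# print(coverage_width(0.9, 60))    # about 1.0 m wide at 0.9 m with a 60-degree camera
# print(coverage_width(3.0, 170))   # tens of meters wide at 3 m with a 170-degree camera
```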
  • the display unit 210 may display an image.
  • the image may represent a still image, a moving image, text, a virtual reality (VR) image, an augmented reality (AR) image, or various other visual expressions including the same, which may be displayed on the screen.
  • the display unit 210 may include at least one of a liquid crystal display (LCD), a thin-film-transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a 3D display, and a transparent OLED (TOLED) display.
  • the display unit 210 may be made of a metal foil, very thin glass, or a plastic substrate.
  • as the plastic substrate, a PC substrate, a PET substrate, a PES substrate, a PI substrate, a PEN substrate, an AryLite substrate, or the like may be used.
  • the communication unit 230 may communicate with an external device using various protocols, and may transmit / receive data through the communication unit 230.
  • the communication unit 230 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
  • the host device 150 may perform pairing with the slave device 100 using the communication unit 230. In addition, the host device 150 may transmit / receive various signals / data with the slave device 100 using the communication unit 230.
  • the processor 220 may control the camera unit 140, the display unit 210, and the communication unit 230.
  • the processor 220 may control transmission / reception of signals (or data) between the above-described units.
  • the processor 220 may perform various commands (or operations) corresponding to the sensing result received from the slave device 100. For example, when the gaze coordinates of the user are received as the sensing result, the processor 220 may execute a command for selecting a visual object (eg, an icon) at the specific position on the display unit 210 mapped to the gaze coordinates. In addition, when user EEG data corresponding to the “focused” state is received as the sensing result, the processor 220 may execute a command for executing the selected visual object (eg, executing an application corresponding to the selected icon), as sketched below.
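  • A hedged sketch of this interaction follows: gaze coordinates select the icon whose screen region contains them, and a sensing result labelled "focused" executes the selection. The Icon structure and the sensing-result format are assumptions made for illustration, not the actual interface of the host device 150.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Icon:
    name: str
    rect: Tuple[int, int, int, int]   # (x, y, width, height) on the display

def select_icon(gaze_xy: Tuple[int, int], icons: List[Icon]) -> Optional[Icon]:
    gx, gy = gaze_xy
    for icon in icons:
        x, y, w, h = icon.rect
        if x <= gx < x + w and y <= gy < y + h:
            return icon
    return None

def handle_sensing_result(result: dict, icons: List[Icon],
                          selected: Optional[Icon]) -> Optional[Icon]:
    if "gaze" in result:                                   # gaze coordinates -> selection
        selected = select_icon(result["gaze"], icons) or selected
    if result.get("eeg_state") == "focused" and selected:  # "focused" EEG -> execution
        print(f"launching application for {selected.name}")
    return selected
```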
  • EEG calibration for mapping a specific cognitive state of the user to EEG of a specific frequency also needs to be performed in advance. Accordingly, the present invention can provide an Eye Brain Calibration (EBC) interface for simultaneously calibrating the user's gaze and brain waves.
  • the host device 150 may optionally include some of the configuration units shown in FIGS. 13 and 14, and may further include various units required for the purpose and operation of the device, such as a sensor unit, a memory unit, and a power supply unit.
  • FIG. 15 illustrates an eye-brain interface (EBI) device according to an embodiment of the present invention.
  • the EBI device 400 may represent a device in which the slave device 100 and the host device 150 described above with reference to FIGS. 1 to 3 are integrated into one device. Therefore, the EBI device 400 may directly sense the biosignal and perform various operations based on the sensing result.
  • the EBI device 400 may be configured in the form of a wearable device that can be worn on a user's body.
  • the EBI device 400 may include an EEG sensing unit 500, a gaze tracking unit 510, a communication unit 530, a display unit 540, and a processor 520. Since the description of the units included in the EBI device 400 overlaps with the description above with reference to FIG. 14, the following description will focus on the differences.
  • the EEG sensing unit 500 may sense the EEG of the user.
  • the EEG sensing unit 500 may include at least one electroencephalography (EEG) sensor and/or a magnetoencephalography (MEG) sensor.
  • the brain wave sensing unit 500 may be provided at a body (eg, head) contact position where the user's brain wave may be measured when the user wears the EBI device, and measure the brain wave of the user.
  • the gaze tracking unit 510 may track the eyes of the user.
  • the gaze tracking unit 510 may be provided in the EBI device 400 to be positioned around the eyes of the user to track the eyes of the user (eye movement) in real time.
  • the gaze tracking unit 510 may include a light emitting device (eg, an infrared LED) that emits light and a camera sensor that receives (or senses) the light emitted from the light emitting device.
  • the eye image photographed by the eye tracking unit 510 may be as shown in FIG. 15C. This is the same as described above with reference to FIG. 2.
  • the communication unit 530 may communicate with an external device using various protocols, and may transmit / receive data through the communication unit 530.
  • the communication unit 530 may connect to a network by wire or wirelessly to transmit / receive various signals and / or data.
  • the display unit 540 may display an image.
  • the image may represent a still image, a moving image, text, a virtual reality (VR) image, an augmented reality (AR) image, or various other visual expressions including the same, which may be displayed on the screen.
  • the processor 520 may control the EEG sensing unit 500, the eye tracking unit 510, the communication unit 530, and the display unit 540.
  • the processor 520 may control transmission / reception of signals (or data) between the above-described units.
  • the processor 520 may perform various operations corresponding to the sensing result received from the EEG sensing unit 500 and / or the gaze tracking unit 510.
  • the EBI device 400 may optionally include some of the component units shown in FIG. 16.
  • the EBI device 400 may further include various units required for the purpose and operation of the device 400, such as a sensor unit, a memory unit, and a power supply unit.
  • although the drawings have been described separately, the embodiments described in the respective drawings may be merged to implement a new embodiment.
  • the device described above is not limited to the configurations and methods of the embodiments set forth herein; all or some of the above embodiments may be selectively combined so that various modifications can be made.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Human Computer Interaction (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

According to one embodiment of the invention, an image processing method comprises: acquiring an image of an eye; detecting an outline of a target object image corresponding to a pupil or an iris from the eye image; calculating an elliptic equation for the target object on the basis of the outline and determining whether the elliptic equation is accepted as the final elliptic equation for the target object; and correcting the outline of the target object image by using a Fourier descriptor when the elliptic equation is not accepted as the final elliptic equation, wherein the Fourier descriptor can be acquired from shape information indicating the shape characteristics of the target object image.
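As a non-authoritative illustration of the flow summarized above, the sketch below detects the target-object outline, fits an ellipse, and, if the fit is not accepted, corrects the outline with a truncated Fourier descriptor before refitting. The dark threshold, the acceptance test, and the number of retained harmonics are assumptions chosen for illustration, not values taken from the disclosure.

```python
import cv2
import numpy as np

def fit_ellipse(outline):
    # outline: (N, 1, 2) array of contour points, as returned by cv2.findContours
    return cv2.fitEllipse(outline) if len(outline) >= 5 else None

def fourier_correct(outline, harmonics=10):
    # Treat the outline as a complex signal, keep only low-frequency Fourier
    # coefficients (the recoverable shape information), and rebuild a smoothed outline.
    pts = outline.reshape(-1, 2).astype(np.float64)
    z = pts[:, 0] + 1j * pts[:, 1]
    coeffs = np.fft.fft(z)
    kept = np.zeros_like(coeffs)
    kept[:harmonics + 1] = coeffs[:harmonics + 1]
    kept[-harmonics:] = coeffs[-harmonics:]
    smoothed = np.fft.ifft(kept)
    return np.stack([smoothed.real, smoothed.imag], axis=1).astype(np.int32).reshape(-1, 1, 2)

def estimate_target_ellipse(gray_eye, dark_threshold=40):
    _, mask = cv2.threshold(gray_eye, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    outline = max(contours, key=cv2.contourArea)
    ellipse = fit_ellipse(outline)
    # Illustrative acceptance test: reject overly elongated fits, correct the
    # outline with the Fourier descriptor, and refit.
    if ellipse is None or min(ellipse[1]) == 0 or max(ellipse[1]) / min(ellipse[1]) > 2.0:
        ellipse = fit_ellipse(fourier_correct(outline))
    return ellipse   # ((cx, cy), (axis1, axis2), angle) or None
```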
PCT/KR2017/004881 2017-05-11 2017-05-11 Dispositif et procédé de traitement d'images WO2018207959A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020197025494A KR20190107738A (ko) 2017-05-11 2017-05-11 이미지 처리 장치 및 방법
PCT/KR2017/004881 WO2018207959A1 (fr) 2017-05-11 2017-05-11 Dispositif et procédé de traitement d'images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/KR2017/004881 WO2018207959A1 (fr) 2017-05-11 2017-05-11 Dispositif et procédé de traitement d'images

Publications (1)

Publication Number Publication Date
WO2018207959A1 true WO2018207959A1 (fr) 2018-11-15

Family

ID=64105631

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/004881 WO2018207959A1 (fr) 2017-05-11 2017-05-11 Dispositif et procédé de traitement d'images

Country Status (2)

Country Link
KR (1) KR20190107738A (fr)
WO (1) WO2018207959A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070160267A1 (en) * 2006-01-11 2007-07-12 Jones Michael J Method for localizing irises in images using gradients and textures
KR100794361B1 (ko) * 2006-11-30 2008-01-15 연세대학교 산학협력단 홍채 인식 성능 향상을 위한 눈꺼풀 검출과 속눈썹 보간방법
US20080170760A1 (en) * 2007-01-17 2008-07-17 Donald Martin Monro Shape representation using fourier transforms
US20110150334A1 (en) * 2008-07-23 2011-06-23 Indian University & Technology Corporation System and method for non-cooperative iris image acquisition
KR20140106926A (ko) * 2013-02-27 2014-09-04 한국전자통신연구원 동공 검출 장치 및 동공 검출 방법

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112651394A (zh) * 2020-12-31 2021-04-13 北京一起教育科技有限责任公司 一种图像检测方法、装置及电子设备
CN112651394B (zh) * 2020-12-31 2023-11-14 北京一起教育科技有限责任公司 一种图像检测方法、装置及电子设备

Also Published As

Publication number Publication date
KR20190107738A (ko) 2019-09-20

Similar Documents

Publication Publication Date Title
WO2017104869A1 (fr) Système d'une interface œil-cerveau (ebi) et son procédé de commande
WO2018155892A1 (fr) Procédé d'affichage d'une image, support de stockage et dispositif électronique associé
WO2020036343A1 (fr) Dispositif électronique et procédé de commande correspondant
WO2019088769A1 (fr) Procédé et système de fourniture d'informations médicales basé sur une api ouverte
WO2018012945A1 (fr) Procédé et dispositif d'obtention d'image, et support d'enregistrement associé
EP3718048A1 (fr) Procédé d'analyse d'objets dans des images enregistrées par une caméra d'un dispositif monté sur la tête
WO2015183033A1 (fr) Procédé de traitement de données et dispositif électronique correspondant
WO2018236058A1 (fr) Dispositif électronique pour fournir des informations de propriété d'une source de lumière externe pour un objet d'intérêt
EP3920782A1 (fr) Dispositif électronique pouvant être porté comprenant un capteur biométrique et un module de charge sans fil
WO2015122566A1 (fr) Dispositif d'affichage monté sur tête pour afficher un guide de capture d'image en réalité augmentée, et son procédé de commande
WO2018216868A1 (fr) Dispositif électronique et procédé d'entrée de dispositif d'entrée
WO2020197153A1 (fr) Dispositif électronique pour dépister le risque d'apnée obstructive du sommeil, et procédé de fonctionnement associé
WO2018097683A1 (fr) Dispositif électronique, dispositif électronique externe et procédé de connexion de dispositif électronique et de dispositif électronique externe
EP3921808A1 (fr) Appareil et procédé d'affichage de contenus sur un dispositif de réalité augmentée
WO2018208093A1 (fr) Procédé de fourniture de rétroaction haptique et dispositif électronique destiné à sa mise en œuvre
WO2020256325A1 (fr) Dispositif électronique et procédé de fourniture d'une fonction à l'aide d'une image cornéenne dans le dispositif électronique
US11029754B2 (en) Calibration method, portable device, and computer-readable storage medium
WO2020242087A1 (fr) Dispositif électronique et procédé de correction de données biométriques sur la base de la distance entre le dispositif électronique et l'utilisateur, mesurée à l'aide d'au moins un capteur
WO2016021907A1 (fr) Système de traitement d'informations et procédé utilisant un dispositif à porter sur soi
WO2018207959A1 (fr) Dispositif et procédé de traitement d'images
WO2021177622A1 (fr) Dispositif électronique portable pour fournir une image virtuelle
JP2005261728A (ja) 視線方向認識装置及び視線方向認識プログラム
US11137600B2 (en) Display device, display control method, and display system
WO2020022750A1 (fr) Dispositif électronique pouvant fournir multiples points focaux d'une lumière émise par un dispositif d'affichage
WO2019240564A1 (fr) Module de fonction détachable pour acquérir des données biométriques et visiocasque le comprenant

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17909467

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 20197025494

Country of ref document: KR

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17909467

Country of ref document: EP

Kind code of ref document: A1