CN104346427A - Apparatus and method for analyzing image including event information - Google Patents


Publication number
CN104346427A
CN104346427A (application CN201410366650.8A)
Authority
CN
China
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410366650.8A
Other languages
Chinese (zh)
Other versions
CN104346427B (en)
Inventor
李俊行
柳贤锡
李圭彬
Current Assignee
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Priority claimed from KR1020130098273A (KR102129916B1)
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN104346427A
Application granted
Publication of CN104346427B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/50: Depth or shape recovery
    • G06T 7/55: Depth or shape recovery from multiple images
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image


Abstract

An apparatus and method for analyzing an image including event information. The apparatus and method determine a pattern of at least one pixel group corresponding to event information included in an input image, and analyze at least one of a shape of an object and a motion of the object based on the at least one pattern.

Description

Apparatus and method for analyzing an image including event information

Technical Field

Methods and apparatuses consistent with exemplary embodiments relate to an apparatus for analyzing an image and, more particularly, to a method and apparatus for analyzing an object included in an input image.

Background

Image processing may refer to any form of information processing in which an image is input and output; for example, image processing may include the analysis or processing of photographs, video, and the like.

A device that senses input data for image processing may be a vision sensor and may include, for example, a photoelectric sensor based on semiconductor manufacturing technology, the photoelectric sensor having been implemented as an integrated circuit.

Summary

According to an aspect of an exemplary embodiment, there is provided an apparatus for analyzing an image, the apparatus including: a classifier configured to receive an event signal corresponding to at least one pixel of an event-based vision sensor and to determine a pattern of a pixel group of a plurality of pixels of the event-based vision sensor, wherein the pixel group includes the at least one pixel and a plurality of pixels of the event-based vision sensor neighboring the at least one pixel; and an analyzer configured to determine at least one of a shape of an object and a movement of the object based on the pattern of the pixel group.

The classifier may determine whether the pixel group corresponds to at least one edge pattern among a plurality of predetermined edge patterns.

The analyzer may include: a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; and a motion analyzer configured to analyze the movement of the object based on the velocity corresponding to the pixel group.

The apparatus for analyzing an image may further include a processor configured to calculate an amount of change in relative coordinates of a point related to a user input based on the movement of the object, and to process the user input based on the amount of change in the relative coordinates.

According to an aspect of an exemplary embodiment, there is provided an apparatus for analyzing an image, the apparatus including: a classifier configured to receive a first event signal corresponding to at least one first pixel of an event-based vision sensor and a second event signal corresponding to at least one second pixel of the event-based vision sensor, to determine a first pattern of a first pixel group of a first plurality of pixels of the event-based vision sensor, and to determine a second pattern of a second pixel group of a second plurality of pixels, wherein the first pixel group includes the at least one first pixel and a first plurality of pixels neighboring the at least one first pixel, and the second pixel group includes the at least one second pixel and a second plurality of pixels neighboring the at least one second pixel; and an analyzer configured to detect a first position of an object based on the first pattern, to detect a second position of the object based on the second pattern, and to determine a depth of the object based on the first position and the second position.

According to an aspect of an exemplary embodiment, there is provided a method of analyzing an image, the method including: receiving an event signal corresponding to at least one pixel of an event-based vision sensor; determining a pattern of a pixel group of a plurality of pixels of the event-based vision sensor; and analyzing at least one of a shape of an object and a movement of the object based on the pattern of the pixel group, wherein the pixel group includes the at least one pixel and a plurality of pixels of the event-based vision sensor neighboring the at least one pixel.

According to an aspect of an exemplary embodiment, there is provided a method of analyzing an image, the method including: receiving an input of an event signal corresponding to a pixel of an event-based vision sensor, the event signal indicating a movement of an object; selecting a plurality of pixels of the event-based vision sensor neighboring the pixel corresponding to the event signal; determining an edge pattern of the plurality of neighboring pixels; determining a position of an edge of the object based on the edge pattern of the plurality of neighboring pixels; and analyzing a shape of the object based on the position of the edge of the object.

Other features and aspects of the exemplary embodiments will be apparent from the following detailed description, the drawings, and the claims.

Brief Description of the Drawings

FIG. 1 is a block diagram illustrating an apparatus for analyzing an image according to an exemplary embodiment;

FIGS. 2A through 2C are diagrams illustrating a plurality of predetermined edge patterns according to an exemplary embodiment;

FIGS. 3A and 3B are diagrams illustrating a scheme for classifying patterns of pixel groups according to an exemplary embodiment;

FIG. 4 is a diagram illustrating a scheme for determining a direction of an edge of a pixel group according to an exemplary embodiment;

FIGS. 5A and 5B are diagrams illustrating a scheme for analyzing a shape of an object based on an input image according to an exemplary embodiment;

FIG. 6 is a diagram illustrating a scheme for calculating a velocity corresponding to a pixel group according to an exemplary embodiment;

FIG. 7 is a diagram illustrating a scheme for analyzing a motion of an object using a rigid body model according to an exemplary embodiment;

FIGS. 8A through 8D are diagrams illustrating a scheme for improving the accuracy of analyzing a motion of an object according to an exemplary embodiment;

FIG. 9 is a diagram illustrating a scheme for processing a user input based on a moving speed of an object according to an exemplary embodiment;

FIG. 10 is a diagram illustrating a scheme for processing a user input based on a depth of an object according to an exemplary embodiment;

FIG. 11 is a flowchart illustrating a method of analyzing an image according to an exemplary embodiment;

FIG. 12 is a block diagram illustrating an apparatus for analyzing a three-dimensional (3D) image according to an exemplary embodiment.

Throughout the drawings and the detailed description, unless otherwise described, the same reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.

Detailed Description

The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will suggest themselves to those of ordinary skill in the art. The described progression of processing steps and/or operations is an example; the order of the steps and/or operations is not limited to that set forth herein and, except for steps and/or operations that must occur in a particular order, may be changed as would be understood by one of ordinary skill in the art. Also, descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.

FIG. 1 is a block diagram illustrating an apparatus 100 for analyzing an image according to an exemplary embodiment.

Before the apparatus 100 for analyzing an image is described with reference to FIG. 1, the input image to be used by the apparatus 100 will be briefly discussed. An input image according to an exemplary embodiment may refer to the output of an event-based vision sensor capturing an object. The event-based vision sensor may asynchronously output an event signal in response to detecting a predetermined event. The predetermined event may include a change in the brightness of light incident on the event-based vision sensor. For example, when an event in which the light incident on a predetermined pixel becomes brighter, that is, in which the brightness increases, is detected, the event-based vision sensor may output an ON event corresponding to that pixel. Likewise, when an event in which the light incident on a predetermined pixel becomes dimmer, that is, in which the brightness decreases, is detected, the event-based vision sensor may output an OFF event corresponding to that pixel.
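The patent does not give an implementation of this ON/OFF behavior, but it can be sketched loosely as follows. The log-intensity contrast threshold, the function name, and the event representation are all illustrative assumptions, not taken from the patent:

```python
import math

CONTRAST_THRESHOLD = 0.15  # illustrative log-intensity contrast threshold


def emit_event(pixel_index, last_log_intensity, new_intensity, timestamp):
    """Return an ON/OFF event if the pixel's brightness changed enough, else None."""
    delta = math.log(new_intensity) - last_log_intensity
    if delta >= CONTRAST_THRESHOLD:
        return {"pixel": pixel_index, "type": "ON", "t": timestamp}
    if delta <= -CONTRAST_THRESHOLD:
        return {"pixel": pixel_index, "type": "OFF", "t": timestamp}
    return None  # brightness essentially unchanged: no event is produced
```

Note that a stationary scene under fixed lighting produces no events at all, which is exactly the property the surrounding text relies on.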

Unlike a frame-based vision sensor, an event-based vision sensor may output data only for the portion of pixels in which a change in light is detected, without scanning the photodiodes of all pixels in frame units. The change in brightness of the light incident on the vision sensor may be caused by a movement of the object. For example, suppose that the light source is substantially fixed and that the object does not emit light by itself. In this case, the light incident on the vision sensor refers to light generated by the light source and reflected from the object. When the object does not move, there is substantially no change in the light reflected from the stationary object, and thus the brightness of the light incident on the event-based vision sensor does not change. Conversely, when the object moves, the light reflected from the object changes, and thus the brightness of the light incident on the event-based vision sensor changes based on the movement of the object.

The event-based vision sensor may include a dynamic vision sensor. The dynamic vision sensor may include an artificial vision sensor that operates on the principle of the human retina and optic nerve. The dynamic vision sensor may output an event signal in response to a movement of an object. The event signal may include information generated asynchronously in response to the movement of the object, similar to the optic nerve information transmitted from the retina to the human brain. For example, an event signal may be generated when a moving object is detected, and may not be generated for a stationary object. At least one pixel included in the event signal may correspond to the object whose movement is detected.

Referring to FIG. 1, the apparatus 100 for analyzing an image may include a classifier 110 and an analyzer 120. The classifier 110 may classify a pattern of at least one pixel group based on an input image including an event signal in which a movement of an object is detected. The at least one pixel group may include a pixel corresponding to the event signal and a plurality of pixels neighboring the corresponding pixel.

Hereinafter, for ease of description, it is assumed that a pixel group includes a 3×3 matrix of 9 pixels, that the pixel corresponding to the event signal is arranged at the center of the pixel group, and that the 8 neighboring pixels arranged around the corresponding pixel are included in the pixel group. This scheme for configuring a pixel group is merely an example and may be modified in various ways.
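Under the 3×3 assumption above, collecting the pixel group around an event pixel can be sketched as follows; the function name and the row-major timestamp table are illustrative assumptions:

```python
def pixel_group_3x3(timestamp_table, x, y):
    """Collect the 3x3 block of timestamps centered on the event pixel (x, y).

    Assumes (x, y) is not on the sensor border; a real implementation
    would clamp or skip border pixels.
    """
    return [[timestamp_table[y + dy][x + dx] for dx in (-1, 0, 1)]
            for dy in (-1, 0, 1)]
```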

The classifier 110 may determine whether a pixel group corresponding to an event signal, that is, a group of pixels in which the event occurred, corresponds to one of a plurality of predetermined edge patterns. For example, referring to FIG. 2A, the plurality of predetermined edge patterns may include 24 edge patterns P1 through P24, which are patterns associated with the edge of an object. When the pixel group corresponding to the event signal is determined to correspond to any one of the plurality of predetermined edge patterns, the classifier 110 may determine the pattern of the pixel group to be that edge pattern. For example, based on a determination that the pixel group corresponding to the event signal corresponds to edge pattern P1 of FIG. 2A, the classifier 110 may classify the pixel group as edge pattern P1. The classifier 110 may discard a pixel group corresponding to an event signal that is not associated with any of the plurality of predetermined edge patterns.

A detailed description of the scheme in which the classifier 110 classifies the pattern of a pixel group corresponding to an event signal will be given with reference to FIGS. 2A through 3B.

The analyzer 120 may analyze at least one of an appearance of the object (such as a shape, a contour, or a position of the object relative to the event-based vision sensor) and a motion of the object based on the pattern of the at least one pixel group classified by the classifier 110. The analyzer 120 may use the pattern of the at least one pixel group to determine a direction of an edge corresponding to the at least one pixel group, thereby analyzing the appearance of the object. Alternatively, the analyzer 120 may calculate a velocity corresponding to a pixel group associated with an edge of the object and analyze the motion of the object based on the calculated velocity. The analyzer 120 may determine at least one of a translational velocity component, a rotational velocity component, and a scaling velocity component of the object to analyze the motion of the object.

A detailed description of the operation of the analyzer 120 will be given later with reference to FIGS. 4 through 8D.

FIGS. 2A through 2C are diagrams illustrating a plurality of predetermined edge patterns according to an exemplary embodiment. Referring to FIG. 2A, the plurality of edge patterns may be associated with the edge of an object.

An event signal may include a timestamp indicating the time at which a predetermined event was detected, an indicator indicating the type of the event, and an index of the pixel at which the predetermined event was detected. As discussed below, a timestamp for each pixel of the sensor resolution may be stored in a table in memory, so that the time at which each pixel last generated an event is available.
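The fields listed above, together with the per-pixel timestamp table, can be sketched as follows. The record layout, the field names, and the 128×128 resolution are illustrative assumptions, not specified by the patent:

```python
from dataclasses import dataclass


@dataclass
class EventSignal:
    timestamp: int   # time at which the event was detected
    event_type: str  # indicator of the event type, e.g. "ON" or "OFF"
    x: int           # pixel index (column)
    y: int           # pixel index (row)


WIDTH, HEIGHT = 128, 128  # illustrative sensor resolution

# one timestamp per pixel; 0 means "no event seen at this pixel yet"
timestamp_table = [[0] * WIDTH for _ in range(HEIGHT)]


def record_event(ev: EventSignal):
    """Overwrite the stored timestamp of the event's pixel with the new one."""
    timestamp_table[ev.y][ev.x] = ev.timestamp
```

Only the most recent timestamp per pixel is kept, which matches the text below: the previously stored value is discarded when a new event arrives.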

The apparatus for analyzing an image according to an exemplary embodiment may classify the pattern of a pixel group based on differences between the timestamp of the pixel at which the event was detected and the timestamps of the plurality of neighboring pixels. To classify the pattern of the pixel group, the apparatus may determine a type for each neighboring pixel: it may calculate the difference between the timestamp of the pixel at which the event was detected and the timestamp of the neighboring pixel, and determine the type of the neighboring pixel based on the result of the calculation.

The apparatus for analyzing an image may use a data structure that manages the timestamps of all pixels. When an event signal is detected, the apparatus may update the timestamp of the pixel included in the event signal; at this time, the apparatus may discard the previously stored value and store the newly updated value. When a current event is detected, the apparatus may update the timestamp of the current pixel corresponding to the current event to the current time. The apparatus may then determine the type of a neighboring pixel by calculating the difference between the updated timestamp of the current pixel and the timestamp of the neighboring pixel, where the timestamp of the neighboring pixel was last updated when a previous event corresponding to that neighboring pixel was detected.

The apparatus for analyzing an image may determine the type of a neighboring pixel based on Equation 1.

[Equation 1]

t_ev − t_nx ≥ T_E → E-type
t_ev − t_nx ≤ T_S → S-type

Here, t_ev denotes the timestamp of the pixel that generated the event, and t_nx denotes the timestamp of a neighboring pixel; the difference between t_ev and t_nx indicates the temporal correlation between pixel events, which in turn indicates the directional movement of an edge. T_E denotes the threshold used to determine the E-type, which corresponds to slow (old) events, and T_S denotes the threshold used to determine the S-type, which corresponds to fast (recent) events. T_E and T_S may be set based on the sensitivity of the pixels or on the application to be used. For example, when the object whose movement is to be detected corresponds to a user's hand, T_E and T_S may be set in a range from milliseconds (ms) to several tens of ms. Alternatively, when the object whose movement is to be detected moves significantly faster than a user's hand, T_E and T_S may be set to several microseconds (μs) or less. T_E and T_S may be set to different values (where T_S < T_E, as shown in Table 1) and, if necessary, may be set to equal values.

When a predetermined period of time has elapsed between a first time point at which a previous event was detected at a neighboring pixel and a subsequent second time point at which the current event is detected, the apparatus for analyzing an image may determine the neighboring pixel to be E-type. For example, among the pixels neighboring the pixel at which the current event is detected, a neighboring pixel at which no new event has been detected during a predetermined period of time (for example, T_E) may be classified as an E-type neighboring pixel.

When the current event is detected within a predetermined period of time from the time point at which a previous event was detected at a neighboring pixel, the apparatus for analyzing an image may determine the neighboring pixel to be S-type. For example, among the pixels neighboring the pixel at which the current event is detected, a neighboring pixel at which a new event has been detected within a predetermined period of time (for example, T_S) may be classified as an S-type neighboring pixel.
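Equation 1 and the two rules above can be sketched as a single classification function. The concrete threshold values below are illustrative assumptions (the patent only says they depend on pixel sensitivity and the application):

```python
T_E = 40_000  # E-type threshold, microseconds (illustrative: tens of ms)
T_S = 1_000   # S-type threshold, microseconds (illustrative)


def neighbor_type(t_ev, t_nx):
    """Classify a neighboring pixel by the age of its last event (Equation 1).

    t_ev: timestamp of the current event; t_nx: stored timestamp of the neighbor.
    """
    age = t_ev - t_nx
    if age >= T_E:
        return "E"  # no recent event at the neighbor: E-type
    if age <= T_S:
        return "S"  # event at almost the same time: S-type (edge direction)
    return None     # neither type; not used for pattern classification
```

With T_S < T_E, neighbors whose event age falls between the two thresholds are assigned neither type, consistent with the text stating that the thresholds may also be set equal to remove that gap.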

A predetermined edge pattern may include neighboring pixels. For example, as shown in FIG. 2A, when the pixels closest to the pixel that generated the event, including the top, bottom, left, and right pixels, are used, a predetermined edge pattern may include neighboring pixels n1 through n8. The combinations of neighboring-pixel types configuring the predetermined edge patterns may differ from one another.

For example, edge pattern P1 may include E-type pixels n1, n2, and n4 (210) and S-type pixels n3 and n6 (220). The apparatus for analyzing an image may classify, as edge pattern P1, a pixel group in which no new event has been detected during a predetermined period of time (for example, T_E) at the neighboring pixels in the directions of pixels n1, n2, and n4 (210), and a new event has been detected within a predetermined period of time (for example, T_S) at the neighboring pixels in the directions of pixels n3 and n6 (220). In this case, the apparatus may analyze the direction of the edge of the corresponding pixel group using the S-type pixels n3 and n6, at which a new event has been detected within the predetermined period of time (for example, T_S). This is because, when a new event is generated by the movement of an object, the events occurring at the positions of the pixels included in the edge of the object are detected at substantially the same point in time. As will be described in detail with reference to FIG. 4, edge pattern P1 may be mapped to an edge in the direction of the line connecting pixels n3 and n6 (220).

In a similar manner, edge pattern P24 may include E-type pixels n5, n7, and n8 (240) and S-type pixels n3 and n6 (250). The apparatus for analyzing an image may classify, as edge pattern P24, a pixel group in which no new event has been detected during a predetermined period of time (for example, T_E) at the neighboring pixels in the directions of pixels n5, n7, and n8 (240), and a new event has been detected within a predetermined period of time (for example, T_S) at the neighboring pixels in the directions of pixels n3 and n6 (250). In this case, the apparatus may analyze the direction of the edge of the corresponding pixel group using the S-type pixels n3 and n6 (250), at which a new event has been detected within the predetermined period of time (for example, T_S). As will be described in detail with reference to FIG. 4, edge pattern P24 may be mapped to an edge in the direction of the line connecting pixels n3 and n6 (250).

The apparatus for analyzing an image according to another exemplary embodiment may use more neighboring pixels than the 8 neighboring pixels shown in FIG. 2A. For example, the apparatus may use the 24 neighboring pixels of a 5×5 pixel matrix (as shown in FIG. 2B) or the 48 neighboring pixels of a 7×7 pixel matrix (as shown in FIG. 2C). The exemplary embodiments of FIGS. 2A, 2B, and 2C are merely examples and may be modified in various ways. The apparatus may compare the types of the neighboring pixels included in a pixel group with the types of the neighboring pixels included in the plurality of predetermined edge patterns, and determine the pattern of the pixel group to be one of the plurality of predetermined edge patterns. Based on the result of the comparison, the apparatus may determine, from among the plurality of predetermined edge patterns, the edge pattern that matches the pixel group. In one example, when a first edge pattern and a second edge pattern are included in the plurality of predetermined edge patterns, the apparatus may compare the types of the neighboring pixels included in the pixel group with the types of the neighboring pixels included in the first edge pattern, and likewise with the types of the neighboring pixels included in the second edge pattern. When the types of the neighboring pixels included in the pixel group correspond to the types of the neighboring pixels included in the first edge pattern, the apparatus may determine the pattern of the pixel group to be the first edge pattern. Alternatively, when the types of the neighboring pixels included in the pixel group correspond to the types of the neighboring pixels included in the second edge pattern, the apparatus may determine the pattern of the pixel group to be the second edge pattern. In short, the apparatus may determine, as the pattern of the pixel group, the edge pattern whose neighboring pixels correspond to the neighboring pixels included in the pixel group.

When necessary, a portion of the neighboring pixels included in a predetermined edge pattern may be agnostically designated as a "non-interest" type. Such pixels are classified as neither E-type nor S-type. For example, the edge pattern P1 may include the non-interest pixels n5, n7, and n8 230, and the edge pattern P24 may include the non-interest pixels n1, n2, and n4 260.

The apparatus for analyzing an image may classify the pattern of a pixel group using only those neighboring pixels of an edge pattern that do not correspond to the "non-interest" type. In other words, the apparatus for analyzing an image may classify the pattern of the pixel group using only the pixels classified as E-type or S-type. For example, when determining whether a pixel group corresponds to the edge pattern P1, the apparatus for analyzing an image may disregard the pixels n5, n7, and n8 230. Similarly, when determining whether a pixel group corresponds to the edge pattern P24, the apparatus for analyzing an image may disregard the pixels n1, n2, and n4 260.

The plurality of predetermined edge patterns may be stored in various ways. For example, the E-type neighboring pixels and the S-type neighboring pixels included in the 24 edge patterns P1 through P24 may be stored in the format of bit values, as shown in Table 1.

[Table 1]

Here, PnE denotes the E-type neighboring pixels included in the edge pattern Pn. When 8 neighboring pixels are assumed to be used, PnE may be configured as 8 bits corresponding to the pixels n1 through n8, respectively. Among the 8 bits, the bits corresponding to the E-type neighboring pixels may be set to "1", and the remaining bits (S-type or "non-interest" type) may be set to "0". For example, the edge pattern P1 may include the pixels n1, n2, and n4 210 as E-type neighboring pixels, and thus the bit value of P1E may be set to "11010000", in which the first, second, and fourth bits are "1". The bit value "11010000" of P1E may be expressed as a hexadecimal number, in which case P1E may be expressed as "D0". When 24 neighboring pixels are used according to another exemplary embodiment, PnE may be configured as 24 bits, and when 48 neighboring pixels are used, PnE may be configured as 48 bits.

In addition, PnS denotes the S-type neighboring pixels included in the edge pattern Pn. When 8 neighboring pixels are assumed to be used, PnS may be configured as 8 bits corresponding to the pixels n1 through n8, respectively. Among the 8 bits, the bits corresponding to the S-type neighboring pixels may be set to "1", and the remaining bits (E-type or "non-interest" type) may be set to "0". For example, the edge pattern P1 may include the pixels n3 and n6 220 as S-type neighboring pixels, and thus the bit value of P1S may be set to "00100100", in which the third and sixth bits are "1". The bit value "00100100" of P1S may be expressed as a hexadecimal number, in which case P1S may be expressed as "24". When 24 neighboring pixels are used according to another exemplary embodiment, PnS may be configured as 24 bits, and when 48 neighboring pixels are used, PnS may be configured as 48 bits.

The apparatus for analyzing an image may check whether the neighboring pixels indicated by PnE correspond to E-type and whether the neighboring pixels indicated by PnS correspond to S-type, and determine, based on the analyzed pixels, whether the pixel group corresponds to the edge pattern Pn.

When the pattern of a pixel group is classified, the apparatus for analyzing an image may disregard the "non-interest" type neighboring pixels. Accordingly, the apparatus for analyzing an image may not use information that explicitly indicates the "non-interest" type neighboring pixels. For example, in both PnE and PnS, the bits corresponding to the "non-interest" type neighboring pixels of the edge pattern Pn may be set to "0". However, the "0" bits in the result of performing a bitwise OR operation on PnE and PnS may indicate the "non-interest" type neighboring pixels. For example, when a bitwise logical OR operation is performed on P1E = "11010000" and P1S = "00100100", P1E OR P1S = "11110100". Because the "0" bits in P1E OR P1S correspond to the fifth, seventh, and eighth bits, P1E OR P1S = "11110100" may indicate that the "non-interest" type neighboring pixels included in the edge pattern P1 are the pixels n5, n7, and n8 230.
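The bit-mask scheme above can be sketched as follows. This is a minimal illustration, not the patented implementation: the helper names (`mask`, `matches`) are assumptions, and the P1 mask values follow the "11010000"/"00100100" example in the text, with pixel n1 mapped to the most significant bit.

```python
# Hypothetical sketch of the PnE/PnS bit-mask matching described above.
# Pixel n1 occupies the most significant bit, matching "11010000" for P1E.

def mask(pixels):
    """Build an 8-bit mask with a 1 at each listed pixel index (1..8)."""
    m = 0
    for n in pixels:
        m |= 1 << (8 - n)
    return m

P1E = mask([1, 2, 4])  # E-type pixels of edge pattern P1 -> 0b11010000 ("D0")
P1S = mask([3, 6])     # S-type pixels of edge pattern P1 -> 0b00100100 ("24")

def matches(group_e, group_s, pat_e, pat_s):
    """A pixel group matches Pn when every pixel flagged in PnE is E-type and
    every pixel flagged in PnS is S-type; "non-interest" pixels (0 in both
    masks) are disregarded."""
    return (group_e & pat_e) == pat_e and (group_s & pat_s) == pat_s

# The "0" bits of PnE OR PnS mark the "non-interest" pixels (n5, n7, n8 for P1).
non_interest = (~(P1E | P1S)) & 0xFF
```

Because the match test only requires the pattern's flagged bits to be present, a pixel group whose "non-interest" positions happen to contain E-type or S-type pixels still matches, exactly as the paragraph above describes.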

Table 1 is an exemplary embodiment for representing the E-type neighboring pixels and the S-type neighboring pixels included in the edge pattern Pn. Those skilled in the art will understand that various modifications may be made to Table 1 to represent the E-type neighboring pixels and the S-type neighboring pixels included in the edge pattern Pn.

FIGS. 3A and 3B are diagrams illustrating a scheme for classifying a pattern of a pixel group, according to an exemplary embodiment.

Referring to FIG. 3A, the 24 edge patterns P1 through P24 may be grouped into 6 groups 310, 320, 330, 340, 350, and 360. For example, the 24 edge patterns P1 through P24 may be grouped into the 6 groups 310 through 360 based on whether E-type neighboring pixels are present in common. The group 310 may be configured by the edge patterns P1, P2, P6, P7, and P8, each of which includes the E-type pixels n1, n2, and n4. The group 320 may be configured by the edge patterns P4, P5, P9, P10, and P13, each of which includes the E-type pixels n2, n3, and n5. As shown in FIG. 3A, among the pixels represented as E-type in FIGS. 2A through 2C, the pixels common to a group may be represented by a cross-hatch pattern.

As shown in Table 2, the 6 groups and the edge patterns included in each of the 6 groups may be represented using mask bit values (E) and additional bit values (G).

[Table 2]

The apparatus for analyzing an image may check whether, among the neighboring pixels of a pixel group, the neighboring pixels at the positions corresponding to a mask bit value (E) correspond to E-type, to determine which group the pattern of the pixel group corresponds to. The apparatus for analyzing an image may then check whether the neighboring pixels at the positions corresponding to the additional bit values (G) of the corresponding group correspond to E-type, to determine the pattern of the pixel group.

For example, because the edge patterns P1, P2, P6, P7, and P8 included in the group 310 include the E-type pixels n1, n2, and n4, the first, second, and fourth bits of the mask bit value E1 representing the group 310 may be set to "1". The apparatus for analyzing an image may use the mask bit value E1 to verify whether the pixels n1, n2, and n4 correspond to E-type neighboring pixels, to determine whether the pixel group is included in the group 310.

In addition, the apparatus for analyzing an image may use the additional bit values G11, G12, and G13 to determine the edge pattern corresponding to a pixel group classified into the group 310. For example, the additional bit value G11 may refer to a bit value in which the sixth bit is set to "1". The apparatus for analyzing an image may use G11 to verify whether the pixel n6 of the pixel group corresponds to E-type.

Based on a determination result that the pixel n6 of the pixel group corresponds to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P6 or the edge pattern P7. The apparatus for analyzing an image may then use the additional bit value G12 to verify whether the pixel n3 of the pixel group corresponds to E-type, to determine which of the edge pattern P6 and the edge pattern P7 the pixel group corresponds to. For example, when the pixel n3 of the pixel group corresponds to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P7. Conversely, when the pixel n3 of the pixel group does not correspond to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P6.

Based on a determination result that the pixel n6 of the pixel group does not correspond to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P1, P2, or P8. The apparatus for analyzing an image may then use the additional bit value G12 to verify whether the pixel n3 of the pixel group corresponds to E-type. When the pixel n3 of the pixel group does not correspond to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P1. When the pixel n3 of the pixel group corresponds to E-type, the apparatus for analyzing an image may use G13 to verify whether the pixel n5 of the pixel group corresponds to E-type. When the pixel n5 of the pixel group corresponds to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P8, and when the pixel n5 does not correspond to E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to the edge pattern P2.
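The decision tree for the group 310 described in the preceding two paragraphs can be sketched as follows. This is a hypothetical illustration: the function name and the set-based representation of E-type pixels are assumptions, not from the source.

```python
# Hypothetical sketch of the group-310 decision tree described above.
# e is a set of the pixel names (n1..n8) that were classified as E-type.

E1 = {"n1", "n2", "n4"}  # mask bit value E1: common E-type pixels of group 310

def classify_group_310(e):
    """Return the edge pattern of group 310 matching the E-type set e,
    or None when the group-310 mask does not match."""
    if not E1 <= e:                            # mask check: n1, n2, n4 must be E-type
        return None
    if "n6" in e:                              # additional bit value G11 checks n6
        return "P7" if "n3" in e else "P6"     # G12 checks n3
    if "n3" not in e:                          # G12 checks n3
        return "P1"
    return "P8" if "n5" in e else "P2"         # G13 checks n5
```

Checking the shared mask first prunes four of the five patterns in the group with a single test, which is the point of dividing Table 2 into mask bit values and additional bit values.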

When the neighboring pixels at the positions corresponding to the mask bit values E1 through E4 do not correspond to E-type, the apparatus for analyzing an image may determine that the pattern of the pixel group belongs to the group 350 or the group 360. The apparatus for analyzing an image may verify whether the neighboring pixels at the positions corresponding to the additional bit values G51 and G52 correspond to S-type, and determine which of the group 350 and the group 360 the pixel group belongs to.

In one example, the apparatus for analyzing an image may use the additional bit value G51 to determine whether the pixel group belongs to the group 360, where G51 may refer to a bit value in which the second and seventh bits are set to "1". The apparatus for analyzing an image may use G51 to verify whether the pixels n2 and n7 of the pixel group correspond to S-type, to determine whether the pixel group belongs to the group 360. In addition, the apparatus for analyzing an image may further verify whether the single remaining neighboring pixel, excluding the pixels n2 and n7, corresponds to E-type, to determine which of the edge pattern P11 and the edge pattern P14 the pixel group corresponds to.

In addition, the apparatus for analyzing an image may use the additional bit value G52 to determine whether the pixel group belongs to the group 350, where G52 may refer to a bit value in which the fourth and fifth bits are set to "1". The apparatus for analyzing an image may verify whether the pixels n4 and n5 of the pixel group correspond to S-type, to determine whether the pixel group belongs to the group 350. The apparatus for analyzing an image may further verify whether the single remaining neighboring pixel, excluding the pixels n4 and n5, corresponds to E-type, to determine which of the edge pattern P3 and the edge pattern P22 is to be classified as the pattern of the pixel group.

Referring to FIG. 3B, the apparatus for analyzing an image may classify the edge pattern of a pixel group using the E-type neighboring pixels included in the pixel group. As will be discussed later, the apparatus for analyzing an image may classify the edge pattern of a pixel group in this manner when the threshold TE for E-type neighboring pixels is equal to the threshold TS for S-type neighboring pixels.

More specifically, the apparatus for analyzing an image may check whether the neighboring pixels b0 through b7 correspond to E-type neighboring pixels. When a neighboring pixel corresponds to E-type, the apparatus for analyzing an image may set the bit corresponding to that pixel to "1"; otherwise, the apparatus may set the bit corresponding to that pixel to "0". The apparatus for analyzing an image may then calculate P-val based on Equation 2.

[Equation 2]

P-val = (B << 1) AND B AND (B >> 1)

Here, B denotes an 8-bit value, for example, b0b1b2b3b4b5b6b7. (B << 1) denotes the value obtained by circularly shifting the bits of B to the left by 1 bit, for example, b1b2b3b4b5b6b7b0, and (B >> 1) denotes the value obtained by circularly shifting the bits of B to the right by 1 bit, for example, b7b0b1b2b3b4b5b6. P-val denotes the bit value obtained by performing a bitwise logical AND operation on (B << 1), B, and (B >> 1).

As shown in Table 3, the apparatus for analyzing an image may determine the edge pattern from the calculated P-val using a look-up table (LUT). For example, when the calculated P-val is "00000001", the apparatus for analyzing an image may determine that the corresponding pixel group corresponds to the edge pattern P11. When the calculated P-val is a value not present in the LUT of Table 3, the apparatus for analyzing an image may determine that the calculated P-val does not correspond to any of the predetermined edge patterns.
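A minimal sketch of the Equation 2 computation, assuming the shifts are circular (rotate) operations on an 8-bit value. Only the single LUT entry stated above ("00000001" maps to P11) comes from the source, so the table here is deliberately incomplete; the helper names are illustrative.

```python
# Hypothetical sketch of P-val (Equation 2) with circular 1-bit shifts.

def rotl8(b):
    """Circular left shift of an 8-bit value by 1 bit."""
    return ((b << 1) | (b >> 7)) & 0xFF

def rotr8(b):
    """Circular right shift of an 8-bit value by 1 bit."""
    return ((b >> 1) | (b << 7)) & 0xFF

def p_val(b):
    """P-val = (B << 1) AND B AND (B >> 1): keeps only the bits whose two
    circular neighbors are also set, i.e. runs of 3 or more E-type pixels."""
    return rotl8(b) & b & rotr8(b)

# Incomplete LUT: only the entry given in the text (Table 3 has more).
LUT = {0b00000001: "P11"}

def classify(b):
    return LUT.get(p_val(b))  # None when P-val is not in the LUT
```

For example, three circularly consecutive set bits, B = 0b10000011, yield P-val = 0b00000001, which the LUT above maps to P11; an isolated set bit is erased by the AND, so it never produces a spurious pattern.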

[Table 3]

When P-val is equal to decimal 17, 34, 68, or 136, two edge patterns may be candidates. A P-val corresponding to two edge patterns arises when a "non-interest" type neighboring pixel corresponds to an E-type neighboring pixel, and the apparatus for analyzing an image may select one of the candidate edge patterns based on a predetermined rule. For example, the apparatus for analyzing an image may additionally determine the types of further neighboring pixels and select one of the two edge patterns accordingly. Alternatively, the apparatus for analyzing an image may randomly select either of the edge patterns.

FIG. 4 is a diagram illustrating a scheme for determining a direction of an edge of a pixel group, according to an exemplary embodiment. Referring to FIG. 4, the plurality of predetermined edge patterns may be mapped to edges having predetermined directions.

The plurality of predetermined edge patterns may be mapped to edges having a principal direction along which the S-type neighboring pixels are arranged. For example, the edge patterns P1, P7, P18, and P24 410 may be mapped to an edge 415 having the direction E2. In addition, the edge patterns P11, P12, P13, and P14 420 may be mapped to an edge 425 having the direction E4. As shown in Table 4, the 24 edge patterns may be mapped to edges along 8 directions.

[Table 4]

Pattern  Direction    Pattern  Direction
P1       E2           P13      E4
P2       E1           P14      E4
P3       E0           P15      E5
P4       E7           P16      E6
P5       E6           P17      E0
P6       E3           P18      E2
P7       E2           P19      E3
P8       E0           P20      E6
P9       E6           P21      E7
P10      E5           P22      E0
P11      E4           P23      E1
P12      E4           P24      E2

The apparatus for analyzing an image may classify the pattern of the pixel group corresponding to each pixel at which an event is detected. The classified patterns may be mapped to the edge directions of Table 4, and accordingly, the apparatus for analyzing an image may identify the direction of the edge at each of the pixels at which events are detected. For example, the edge directions mapped using Table 4 may be stored for the events, and the apparatus for analyzing an image may combine the edge information stored for the pixels to determine the appearance of the object.

More specifically, a movement of a user's hand is an example of a case in which the apparatus for analyzing an image receives an event signal generated by the movement. In this case, the event pixels may include both pixels corresponding to the edge of the user's hand and pixels corresponding to the inside of the user's hand. The apparatus for analyzing an image may determine the edge patterns corresponding to the event pixels. Here, the predetermined edge patterns may include the edge patterns corresponding to edges. Accordingly, the apparatus for analyzing an image may determine that the pixels corresponding to the inside of the hand do not correspond to any of the predetermined edge patterns, and that the pixels corresponding to the edge of the user's hand each correspond to one of the predetermined edge patterns. When the pixels corresponding to the edge of the user's hand are determined to correspond to the predetermined edge patterns, the apparatus for analyzing an image may determine, based on Table 4, the direction of the edge corresponding to the relevant edge pattern at each of those pixels. As a result, the apparatus for analyzing an image may determine the edge directions at the pixels corresponding to the edge of the user's hand, and determine the appearance of the user's hand by integrating the directions of the edges.

FIGS. 5A and 5B are diagrams illustrating a scheme for analyzing an appearance of an object based on an input image, according to an exemplary embodiment.

Referring to FIG. 5A, the input image may be the output of an event-based vision sensor capturing 8 rods arranged around the center of a cylinder and rotating clockwise at an equal rotational speed. In this case, the event-based vision sensor may output an event signal by detecting brightening events and dimming events. For example, the event-based vision sensor may output an event signal by detecting the degree to which the brightness of pixels in the image is increased or decreased by more than a predetermined value by the 8 rods rotating in the clockwise direction. In FIG. 5A, a black dot (■) may refer to the output of the sensor detecting a dimming event, in which the brightness is decreased by at least the predetermined value, and a white dot (□) may refer to the output of the sensor detecting a brightening event, in which the brightness is increased by at least the predetermined value. Referring to FIG. 5B, the apparatus for analyzing an image may analyze the appearance of the object using the input image of FIG. 5A.

The apparatus for analyzing an image may select the pixel groups corresponding to the predetermined edge patterns based on the schemes described with reference to FIGS. 1 through 4, and estimate the appearance of the object based on the directions of the edges corresponding to the selected pixel groups. In doing so, the apparatus for analyzing an image may effectively remove noise included in the input image due to a smearing effect and the like.

In addition to the appearance of the object, the apparatus for analyzing an image may analyze the motion of the object. The apparatus for analyzing an image may calculate velocities at the pixels corresponding to the edge of the object and, after analyzing the appearance of the object from the event signal, use the velocities at the pixels corresponding to the edge to analyze the motion of the object. Hereinafter, the operation of the apparatus for analyzing an image calculating the velocities at the pixels corresponding to the edge of the object will be described with reference to FIG. 6, and the operation of the apparatus for analyzing an image analyzing the motion of the object will be described with reference to FIG. 7.

FIG. 6 is a diagram illustrating a scheme for calculating a velocity corresponding to a pixel group, according to an exemplary embodiment. Referring to FIG. 6, a pixel group may include motion direction information, and the apparatus for analyzing an image may calculate the velocity corresponding to a pixel group using the neighboring pixel groups.

Here, the apparatus for analyzing an image may calculate the velocities of the pixel groups corresponding to the edge of the object. For example, the apparatus for analyzing an image may calculate the velocity of a pixel group only for the pixel groups classified into the predetermined edge patterns P1 through P24 of FIGS. 2A through 2C, rather than for all of the pixels included in the event signal. As described above, because the predetermined edge patterns P1 through P24 may include the edge patterns corresponding to the edge of the object, the apparatus for analyzing an image may calculate the velocities of the pixel groups corresponding to the edge of the object.

The apparatus for analyzing an image may calculate the x-axis direction velocity Vx and the y-axis direction velocity Vy corresponding to a pixel group based on Equation 3.

[Equation 3]

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \sum_{i=1,\ \text{S-type}}^{8} \alpha_i \begin{bmatrix} dx_i/dt_i \\ dy_i/dt_i \end{bmatrix}, \qquad \alpha_i = \left| \int_{\theta_{i,a}}^{\theta_{i,b}} \cos(\theta)\, d\theta \right|$$

Here, θi,a and θi,b denote the boundary angles subtending the i-th S-type neighboring pixel with respect to the center of the pixel group. For example, when the pixel n5 is S-type, θ5,a 620 and θ5,b 610 may be the boundary angles subtending the pixel n5 with respect to the center of the pixel group.
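Because the antiderivative of cos is sin, the weight α_i of Equation 3 has the closed form |sin θi,b − sin θi,a|. A minimal sketch follows; the function name and the sample boundary angles are illustrative assumptions, not values from the source.

```python
import math

def alpha(theta_a, theta_b):
    """Weight of an S-type neighboring pixel per Equation 3:
    alpha_i = | integral of cos(theta) from theta_a to theta_b |
            = | sin(theta_b) - sin(theta_a) |."""
    return abs(math.sin(theta_b) - math.sin(theta_a))

# Illustrative boundary angles for a neighboring pixel centered on the x-axis.
a = alpha(-math.pi / 8, math.pi / 8)
```

A neighboring pixel whose angular span is centered on the x-axis thus receives the largest weight, while a span centered on the y-axis, where cos changes sign symmetrically, contributes little.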

The apparatus for analyzing an image may mitigate the sensitivity to timestamp noise based on Equation 4.

[Equation 4]

$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \sum_{i=1,\ \text{S-type}}^{8} \alpha_i \begin{bmatrix} dx_i/\langle dt \rangle \\ dy_i/\langle dt \rangle \end{bmatrix}, \qquad \alpha_i = \left| \int_{\theta_{i,a}}^{\theta_{i,b}} \cos(\theta)\, d\theta \right|$$

Here, ⟨dt⟩ denotes the value defined by Equation 5.

[Equation 5]

The apparatus for analyzing an image may store the velocity corresponding to each pixel group, and the velocity may include the x-axis direction velocity Vx and the y-axis direction velocity Vy calculated through Equations 3 through 5. The velocity corresponding to a pixel group may refer to the velocity of the event pixel located at the center of the corresponding pixel group. As described above, the apparatus for analyzing an image may calculate the x-axis direction velocity Vx and the y-axis direction velocity Vy for the event pixels corresponding to the edge of the object using the predetermined edge patterns. Hereinafter, a method of analyzing the motion of the object using the velocities of the event pixels corresponding to the edge of the object will be described with reference to FIG. 7.

FIG. 7 is a diagram illustrating a scheme for analyzing a motion of an object using a rigid body model, according to an exemplary embodiment. Referring to FIG. 7, the apparatus for analyzing an image may analyze a 4-degree-of-freedom (4-DOF) motion of an object 700.

For example, according to an exemplary embodiment, an input image in which a movement of the object 700 is detected is used. The object 700 may move on a two-dimensional (2D) surface at a movement velocity Vp 740. Alternatively, the object 700 may rotate at an angular velocity ω 721 about a rotation center Oc 720. Alternatively, the object 700 may expand or contract at a scaling velocity Vz with respect to a scaling center Oz 730.

The apparatus for analyzing an image may analyze the movement velocity component, the rotation velocity component, and the scaling velocity component of the object 700. The apparatus for analyzing an image may calculate the velocity Vi at a predetermined point Pi 710 on the edge of the object 700 using the descriptions provided with reference to FIGS. 1 through 6. For example, the velocity Vi may refer to the x-axis direction velocity Vx and the y-axis direction velocity Vy calculated based on Equations 3 through 5.

The apparatus for analyzing an image may model the velocity Vi as shown in Equation 6.

[Equation 6]

$$V_{zi} + V_{ri} + V_p = V_i$$

Here, Vzi 731, Vri 722, and Vp 740 may refer to the scaling velocity component, the rotation velocity component, and the movement velocity component at the point Pi 710, respectively. As shown in Equation 6, the apparatus for analyzing an image may model the velocity Vi to decompose the velocity Vi at the point Pi 710 on the edge of the object 700 into a scaling velocity component, a rotation velocity component, and a movement velocity component. Equation 6 may be rewritten as Equation 7.

[Equation 7]

$$tP_i + \omega A(P_i - O_c) + V_p = V_i$$

Here, $tP_i$ denotes the scaling velocity component: the coordinate $P_i$ 710, expressed with $O_z$ 730 as the origin, represents the direction and magnitude of the vector $V_{zi}$ 731, and the parameter $t$ scales the magnitude of the vector $V_{zi}$ 731. $\omega A(P_i - O_c)$ denotes the rotation velocity component: the coordinate difference $(P_i - O_c)$ represents the direction and magnitude of the vector from the rotation center $O_c$ 720 toward the coordinate $P_i$ 710, and the matrix $A$ denotes the rotation matrix that rotates the vector from the rotation center $O_c$ 720 toward the coordinate $P_i$ 710, for example, $A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}$. The vector rotated by the matrix $A$ points in the direction of the vector $V_{ri}$ 722, and the parameter $\omega$ scales the magnitude of the vector $V_{ri}$ 722.

The apparatus for analyzing an image may calculate the scaling velocity component parameter t, the rotation velocity component parameter ω, the rotation center Oc, and the movement velocity component Vp based on Equation 7. This is possible because the apparatus for analyzing an image knows the coordinates Pi 710 of the points located on the edge and the velocities Vi at those points. The apparatus for analyzing an image may thus analyze at least one of the movement velocity component, the rotation velocity component, and the scaling velocity component, for example, the 4-DOF motion of the object.

The calculation of the scaling velocity parameter t, the rotation velocity parameter \omega, the rotation center O_c, and the movement velocity component V_p based on Equation 7 may be implemented in various ways. According to an exemplary embodiment, Equation 8 may be derived from Equation 7.

[Equation 8]

t(P_i - \bar{P}) + \omega A(P_i - \bar{P}) = V_i - \bar{V}

Here, P_i denotes the coordinates of the i-th point located on the edge of the object 700, and \bar{P} denotes the average of the coordinates of the points located on the edge of the object 700. V_i denotes the velocity at the i-th point located on the edge of the object 700, and \bar{V} denotes the average of the velocities at the points located on the edge of the object 700. These variables are defined in Equation 9 through Equation 12.

[Equation 9]

P_i = (x_i, y_i)

[Equation 10]

\bar{P} = \left( \frac{1}{N}\sum_{i=1}^{N} x_i,\; \frac{1}{N}\sum_{i=1}^{N} y_i \right)

[Equation 11]

V_i = (V_{xi}, V_{yi})

[Equation 12]

\bar{V} = \left( \frac{1}{N}\sum_{i=1}^{N} V_{xi},\; \frac{1}{N}\sum_{i=1}^{N} V_{yi} \right)

The apparatus for analyzing an image may calculate the x-axis velocity V_x and the y-axis velocity V_y based on Equation 3 through Equation 5, and store the calculated V_x and V_y as the velocity V_i at the pixel P_i. The apparatus may use the coordinates P_i of the plurality of pixels and the velocities V_i of the plurality of pixels to calculate \bar{P} and \bar{V}, and may use the plurality of P_i, the plurality of V_i, and Equation 8 to calculate the parameters t and \omega. For example, Equation 13 and Equation 14 may be derived from Equation 8 based on a pseudo-inverse scheme.

[Equation 13]

t = \frac{\sigma(x, V_x) + \sigma(y, V_y)}{\sigma^2(P)}

[Equation 14]

\omega = \frac{\sigma(x, V_y) - \sigma(y, V_x)}{\sigma^2(P)}

Here, \sigma^2(P) = \sigma^2(x) + \sigma^2(y); \sigma(\cdot,\cdot) denotes the covariance operator, \sigma(x, y) = E[(x - E[x])(y - E[y])], and \sigma^2(\cdot) denotes the variance; E[\cdot] denotes the expected value, or average. The apparatus for analyzing an image may calculate the scaling velocity parameter t and the rotation velocity parameter \omega based on Equation 13 and Equation 14.
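Equations 13 and 14 reduce to a handful of covariance computations. The sketch below (an illustration assuming NumPy and (N, 2) arrays of edge-point coordinates and velocities, not the patent's implementation) recovers t and \omega:

```python
import numpy as np

def scale_and_rotation(P, V):
    # Estimate the scaling parameter t (Eq. 13) and the rotation
    # parameter omega (Eq. 14) from edge-point coordinates P and
    # the velocities V measured at those points.
    x, y = P[:, 0], P[:, 1]
    Vx, Vy = V[:, 0], V[:, 1]
    cov = lambda a, b: np.mean((a - a.mean()) * (b - b.mean()))
    var_P = cov(x, x) + cov(y, y)                 # sigma^2(P)
    t = (cov(x, Vx) + cov(y, Vy)) / var_P         # Eq. 13
    omega = (cov(x, Vy) - cov(y, Vx)) / var_P     # Eq. 14
    return t, omega
```

Because the mean-centered form of Equation 8 cancels the translation term, the estimate is exact for noise-free data regardless of V_p.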

FIGS. 8A through 8D are diagrams illustrating a scheme for improving the accuracy of analyzing a motion of an object 800 according to an exemplary embodiment.

Referring to FIG. 8A, the apparatus for analyzing an image may improve motion analysis accuracy by using an observation area 810 whose size is larger than that of a pixel group. The observation area 810 may refer to a set of pixel groups including a plurality of pixel groups 811. For example, the observation area 810 may include an area generated by segmenting the object 800 into equal-sized sections along the edge of the object 800.

When selecting the observation area 810, the apparatus for analyzing an image may consider the various patterns of the pixel groups 811 to be included. For example, the apparatus may select an observation area containing pixel groups 811 with different patterns, and perform motion analysis.

For example, assume that the object 800 moves to the right 820 without rotation, contraction, or expansion. Here, the object 800 may be an object having a rectangular shape, and may be moving while tilted or oriented obliquely. The observation area 810 may include pixel groups 811 having identical or similar patterns. When motion analysis is performed based on the pixel groups 811 included in the observation area 810, the movement may be analyzed as a movement toward the lower right 812, although the actual direction of movement is the rightward direction 820. The apparatus for analyzing an image may therefore select, from the edge of the object 800, an observation area 830 that includes a non-straight portion, rather than the observation area 810 that includes only a straight portion. By selecting an observation area that includes pixel groups with different patterns, the apparatus may improve the accuracy of motion analysis.

An apparatus for analyzing an image according to another exemplary embodiment may improve the accuracy of motion analysis by using a level of homogeneity (LOH). For example, the apparatus may calculate the LOH of a patch based on Equation 15, and select a patch with a low LOH. Here, a patch may include a pixel group with a size larger than 3×3 pixels.

[Equation 15]

Here, \theta_{ref} denotes the edge angle (that is, the orientation) of the pixel located at the center of a patch, and \theta_i denotes the edge angle (that is, the orientation) of the i-th neighboring pixel. When the LOH is low, the patterns of the pixel groups included in the corresponding patch have a degree of similarity to one another; when the LOH is high, the patterns of the pixel groups included in the corresponding patch may be dissimilar. The apparatus for analyzing an image may select a patch with a low LOH, and select an observation area that includes pixel groups with different patterns. Accordingly, edges may be classified, and the orientations of the edges may be used to determine important features of the image, and may also be applied to determine the shape and/or movement of an object.
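The body of Equation 15 was not carried over into this text, so the following is purely a hypothetical stand-in consistent with the quantities described (\theta_{ref} and the neighboring \theta_i): an LOH computed as the mean angular deviation between the center pixel's edge angle and its neighbors' edge angles.

```python
import numpy as np

def loh(theta_ref, theta_neighbors):
    # Hypothetical stand-in for Eq. 15 (the extracted text omits the
    # formula): mean deviation between the centre pixel's edge angle
    # theta_ref and the neighbouring edge angles theta_i, with angles
    # wrapped so that directions pi apart are treated as equal.
    d = np.abs(np.asarray(theta_neighbors, dtype=float) - theta_ref) % np.pi
    d = np.minimum(d, np.pi - d)
    return float(np.mean(d))
```

With this form, a lower value means the neighboring edge angles agree more closely with the center pixel's angle.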

FIG. 8B illustrates an example in which the accuracy of motion analysis is improved by using the LOH. For example, assume that the object 800 moves in the rightward direction 820. When the movement of the object 800 is observed in the observation area 810, the edge of the object 800 may be observed as an edge 831 at a time point t, and as an edge 832 at a subsequent time point t+Δt. In this case, because the object 800 moves in the rightward direction 820, the actual movement velocity of the object 800 may be represented as a velocity vector 840. When motion analysis using the LOH is not applied, the movement velocity of the object 800 may be calculated as a velocity vector 850. The velocity vector 850 may differ in both direction and magnitude from the velocity vector 840 representing the actual movement velocity.

When motion analysis using the LOH is applied, the movement velocity of the object 800 may be calculated as a velocity vector 860. The direction of the velocity vector 860 may be the same as the direction of the velocity vector 840 representing the actual movement of the object 800; however, the magnitude of the velocity vector 860 may differ from the magnitude of the velocity vector 840.

Referring to FIG. 8C, although the object moves with the same actual velocity 872 (the same speed and direction), the magnitude of the velocity 873 calculated based on the shape 871 of the object 800 may differ. For example, the smaller the vector component along the direction of actual movement (for example, the x-axis direction), the smaller the magnitude of the calculated velocity.

Referring to FIG. 8D, the apparatus for analyzing an image may correct the magnitude of the movement velocity of the object 800 based on Equation 16 and Equation 17. More specifically, in operation 881, the apparatus may receive V_i as an edge event. In operation 882, the apparatus may calculate V_p, O_c, t, and \omega using V_i. Because similar features described with reference to FIGS. 1 through 7 may be referred to, detailed descriptions of operations 881 and 882 are omitted. In operation 883, the apparatus may analyze the motion using the LOH to calculate V_p. Similarly, the description of FIG. 8A may be applied to operation 883, and thus a detailed description of operation 883 is omitted.

In operation 884, the apparatus for analyzing an image may calculate V_i^{gen} based on Equation 16. The parameters used in Equation 16 may be defined identically to those used in Equation 7.

[Equation 16]

V_i^{gen} = tP_i + \omega A(P_i - O_c) + V_p

In operation 885, the apparatus for analyzing an image may calculate V_i^{cor} based on Equation 17. Here, \theta denotes the angular difference between V_i and V_i^{gen}, and may correspond to the angle 855 shown in FIG. 8B.

[Equation 17]

V_i^{cor} = \begin{pmatrix} 1 & -\tan\theta \\ \tan\theta & 1 \end{pmatrix} V_i

Although V_i has a magnitude similar to that of the actual movement velocity, V_i may have a direction different from the direction of the actual movement. This is because the movement velocity is calculated over all edges, regardless of the LOH. Conversely, although V_i^{gen} has a magnitude smaller than that of the actual movement velocity, V_i^{gen} may have the same direction as the actual movement. This is because the movement velocity is calculated only over edges within observation areas of low LOH. Therefore, the apparatus for analyzing an image may take the magnitude of the vector from V_i and the direction of the vector from V_i^{gen} to calculate V_i^{cor} based on Equation 17.
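Assuming \theta is measured from V_i toward V_i^{gen}, the correction of Equation 17 can be sketched as follows; the matrix [[1, -tan\theta], [tan\theta, 1]] rotates V_i by \theta (into the direction of V_i^{gen}) while scaling its magnitude by 1/cos\theta.

```python
import numpy as np

def correct_velocity(V_i, V_gen):
    # theta: angle from V_i to V_gen (the angle 855 in FIG. 8B)
    theta = np.arctan2(V_gen[1], V_gen[0]) - np.arctan2(V_i[1], V_i[0])
    t = np.tan(theta)
    M = np.array([[1.0, -t], [t, 1.0]])  # Eq. 17 matrix
    return M @ np.asarray(V_i, dtype=float)
```

For \theta near 90 degrees, tan \theta diverges, which is exactly the case the text addresses by rotating V_i in several smaller steps.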

According to another exemplary embodiment, the apparatus for analyzing an image may iteratively repeat operations 882 through 885. For example, when \theta is 90 degrees, the value of tan 90° is undefined, and it may therefore be difficult to calculate V_i^{cor}. In this case, the apparatus may rotate V_i iteratively to calculate V_i^{cor}. The number of iterations may be two or more. Here, the rotation angle in each iteration needs to satisfy a constraint denoting the maximum rotation angle allowed in the k-th iteration.

FIG. 9 is a diagram illustrating a scheme for processing a user input based on a movement speed of an object 910 according to an exemplary embodiment.

Referring to FIG. 9, the apparatus for analyzing an image may process a user input based on the motion of the object 910. The object 910 may be, for example, a user's hand.

The apparatus for analyzing an image may calculate the movement speed 915 of the object 910 using the descriptions provided above with reference to FIGS. 1 through 8. The apparatus may use the calculated movement speed 915 to calculate the amount of change in the relative coordinates of the point indicated by the user input.

The apparatus for analyzing an image may process the user input based on the amount of change in the relative coordinates. For example, the apparatus may move the position of a cursor displayed on a display 920 from a current position 921 to a new position 922.

Accordingly, the relative coordinates may indicate the position of a user interface (UI) indicator, such as a mouse pointer, relative to the indicator's current position. The apparatus for processing a user input may calculate the amount of change in the relative position of the UI indicator based on the motion of the object. For example, when the object moves to the right at 1 m/s for 1 second, the amount of change in the relative position of the UI indicator may be calculated as a vector whose direction is rightward and whose magnitude corresponds to 1 m.

The apparatus for processing a user input may determine the new position of the UI indicator by applying the amount of change in the relative position to the indicator's current position, and may move the UI indicator from the current position to the new position.

Although not shown in the drawings, the apparatus for analyzing an image may include a recognizer and a processor. The recognizer may recognize the motion of an object based on an input image including event signals in which the movement of the object is detected. For example, the recognizer may calculate the movement speed 915 of the object 910 using the descriptions provided above with reference to FIGS. 1 through 8. The processor may calculate relative coordinates for the user input based on the motion of the object recognized by the recognizer. For example, the processor may use the movement speed 915 calculated by the recognizer to calculate the amount of change in the relative coordinates for the user input. The processor may update the relative coordinates based on the amount of change, and process the user input using the updated relative coordinates.
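A minimal sketch of this relative-coordinate update (the function name and the pointer gain are illustrative, not from the patent):

```python
def update_cursor(pos, velocity, dt, gain=1.0):
    # Relative-coordinate update: displacement = velocity * elapsed time,
    # scaled by a hypothetical pointer gain factor.
    dx = gain * velocity[0] * dt
    dy = gain * velocity[1] * dt
    return (pos[0] + dx, pos[1] + dy)
```

With the example from the text, an object moving right at 1 m/s for 1 second displaces the indicator by a rightward vector of magnitude 1.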

FIG. 10 is a diagram illustrating a scheme for processing a user input based on a depth of an object, according to an exemplary embodiment.

Referring to FIG. 10, the apparatus for analyzing an image may also analyze the depth of an object using two different event signals detected, for the movement of the same object, at two positions spatially separated from each other. For example, the sensor 1020 may include a first sensor corresponding to a left eye and a second sensor corresponding to a right eye. The apparatus may measure the depth of the object using the disparity between the images output from the two sensors corresponding to the two eyes.

Referring to FIG. 10, the apparatus for analyzing an image may calculate the disparity between the two images output from the two sensors based on a scheme for maximizing a level of similarity (LOS) between a plurality of patches corresponding to the left eye and the right eye. More specifically, the apparatus may process both the event signal output from the sensor 1020 corresponding to the left eye and the event signal output from the sensor 1020 corresponding to the right eye. For example, the apparatus may detect the pixels corresponding to edges by classifying edge patterns using the two respective event signals, and determine the directions of the edges at those pixels based on the detected edge patterns. In this case, the apparatus may obtain an overlapping image in which the two edges of the object are spaced apart from each other. The two edges of the object, spaced apart from each other in the two images, constitute the disparity between the two images. The apparatus may apply the algorithm of Table 5 to the overlapping image in which the two edges of the object are spaced apart from each other.

[Table 5]

lLOS(x, y, d) \equiv \frac{1}{M} \sum_{(x_i, y_i) \in patch(x, y)} \cos^2\{\theta(x_i, y_i) - \theta(x_i + d, y_i)\}

\theta: azimuth (edge orientation), M: number of valid pixels

gLOS(d) \equiv \sum_{y} rLOS(y, d)

LOS(x, y, d) \equiv lLOS(x, y, d) \times rLOS(y, d) \times gLOS(d)

disparity(x, y) = \arg\max_{d} \{ LOS(x, y, d) \}

Here, (x, y) denotes the coordinates of a patch in the image, and a plurality of points may be included in the patch; (x_i, y_i) denotes the coordinates of the i-th point included in the patch at coordinates (x, y); and d denotes the disparity between the two images. The two images may be received from the sensor corresponding to the left eye and the sensor corresponding to the right eye, and therefore the two images are generally offset from each other along the direction of the x-axis. Accordingly, d may represent the degree to which the two images are offset from each other along the x-axis. \theta(x_i, y_i) denotes an azimuth, and may correspond to the direction of the edge calculated at the point at coordinates (x_i, y_i).

lLOS(x, y, d) is the formula for determining the LOS within a single patch at coordinates (x, y); rLOS(y, d) is the formula for determining the LOS along the one-dimensional (1D) line at coordinate y; and gLOS(d) is the formula for determining the LOS over the two-dimensional (2D) area of the entire image.

The apparatus for analyzing an image may calculate the d that maximizes LOS(x, y, d). To maximize LOS(x, y, d), the difference between \theta(x_i, y_i) and \theta(x_i + d, y_i) should be minimized; accordingly, the apparatus may calculate the d that minimizes the difference between \theta(x_i, y_i) and \theta(x_i + d, y_i).
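As an illustrative one-dimensional reduction of the Table 5 scheme (the full algorithm also combines the row-wise rLOS and global gLOS terms), the shift d that maximizes the mean cos² orientation agreement between the two edge-orientation maps can be found as:

```python
import numpy as np

def best_disparity(theta_left, theta_right, d_max):
    # Score each candidate shift d by the mean cos^2 of the orientation
    # differences (the lLOS term of Table 5) and keep the best one.
    scores = []
    for d in range(d_max + 1):
        n = len(theta_left) - d
        diff = np.asarray(theta_left[:n]) - np.asarray(theta_right[d:d + n])
        scores.append(np.mean(np.cos(diff) ** 2))
    return int(np.argmax(scores))
```

When the right map is a copy of the left map shifted by d pixels, every orientation difference at that shift is zero and the score reaches its maximum of 1.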

As the disparity between the two images increases, the apparatus for analyzing an image may estimate the depth of the object to be shallower; as the disparity between the two images decreases, the apparatus may estimate the depth of the object to be deeper. The apparatus may process a user input based on the estimated depth of the object.

The apparatus for analyzing an image may determine an operation mode corresponding to the depth of the object 1010 from the sensor 1020. For example, a first operation mode region 1031, a second operation mode region 1032, a third operation mode region 1033, and a background region 1034 behind the object 1010 may be predetermined based on the depth from the sensor 1020.

When the depth of the object 1010 is determined to correspond to the second operation mode region 1032, the apparatus for analyzing an image may process an input made with the object 1010 based on the scheme for processing a user input corresponding to the second operation mode region 1032.
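A sketch of the depth-to-mode mapping; the threshold values are hypothetical, since the text only states that the regions are predetermined by depth from the sensor.

```python
def operation_mode(depth, limits=(0.3, 0.6, 1.0)):
    # Regions 1031/1032/1033 by increasing depth (hypothetical limits in
    # metres); anything beyond the last limit falls into the background
    # region 1034, for which no operation mode applies.
    for mode, limit in enumerate(limits, start=1):
        if depth <= limit:
            return mode
    return 0  # background region
```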

FIG. 11 is a flowchart illustrating a method for analyzing an image according to an exemplary embodiment.

Referring to FIG. 11, in operation 1110, the apparatus for analyzing an image according to an exemplary embodiment may read event information. In operation 1120, the apparatus may update an event generation time map based on the position and the generation time of a generated event. In operation 1130, the apparatus may analyze the pattern of event generation times with respect to the neighboring events in the event generation time map.

In operation 1140, the apparatus for analyzing an image may classify the directionality of an edge based on the event generation pattern. In operation 1150, the apparatus may extract velocity components based on the edge direction pattern and the event generation pattern.

In operation 1160, the apparatus for analyzing an image may accumulate event information for analyzing the movement of the object. In operation 1170, the apparatus may determine whether the number of accumulated events is sufficient to accurately analyze the movement of the object. As a result of the determination, when the number of accumulated events is insufficient, in operation 1175 the apparatus may determine whether the accumulation time is sufficient to accurately analyze the movement of the object. When the accumulation time is also insufficient, the apparatus may return to operation 1110 and accumulate further event information.

When the number of accumulated events is sufficient to accurately analyze the movement of the object, or when the accumulation time is sufficient even though the number of accumulated events is not, in operation 1180 the apparatus may divide the position and the movement velocity of the object into a plurality of components. The apparatus may obtain, as components of the movement velocity, a translation velocity, an expansion or contraction velocity, and a rotation velocity.

In operation 1190, the apparatus for analyzing an image may determine the main movement component of the object. For example, the apparatus may determine, among the translation velocity, the expansion or contraction velocity, and the rotation velocity, at least one movement component that contributes most significantly to the movement of the object.
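The accumulate-then-analyze loop of operations 1110 through 1180 can be sketched as a generator; the two thresholds are illustrative stand-ins for the sufficiency checks of operations 1170 and 1175.

```python
def batch_events(events, min_events=64, max_time=0.01):
    # Collect events until either enough have accumulated (operation 1170)
    # or enough time has elapsed (operation 1175), then emit the batch for
    # the motion-analysis stage (operation 1180).
    batch, t0 = [], None
    for x, y, ts in events:          # each event = (x, y, timestamp)
        if t0 is None:
            t0 = ts
        batch.append((x, y, ts))
        if len(batch) >= min_events or ts - t0 >= max_time:
            yield batch
            batch, t0 = [], None
    if batch:
        yield batch
```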

Because similar features described with reference to FIGS. 1 through 10 may be referred to, a detailed description of FIG. 11 is omitted.

FIG. 12 is a block diagram illustrating an apparatus 1200 for analyzing a three-dimensional (3D) image according to an exemplary embodiment.

Referring to FIG. 12, the apparatus for analyzing a 3D image may include at least two image change detectors 1210 and 1215, edge direction information extractors 1220 and 1225, velocity information extractors 1230 and 1235, and average edge direction information extractors 1240 and 1245. The image change detectors 1210 and 1215 include event-based vision sensors 1211 and 1216, respectively; the edge direction information extractors 1220 and 1225 detect edge direction information for one or more events; the velocity information extractors 1230 and 1235 detect velocity information for one or more events; and the average edge direction information extractors 1240 and 1245 detect the average edge direction of pixels for one or more events at a predetermined time.

In addition, the apparatus 1200 for analyzing a 3D image may further include a disparity map extractor 1250, a distance information mapper 1260, and a 3D position/movement analyzer 1270. The disparity map extractor 1250 determines a disparity map based on the edge direction information from the edge direction information extractors 1220 and 1225, and the distance information mapper 1260 determines distance information for one or more events.

Because similar features described with reference to FIGS. 1 through 11 may be referred to, detailed descriptions of the modules shown in FIG. 12 are omitted.

The exemplary embodiments shown in the drawings may be implemented by an apparatus including a bus connected to each unit of the apparatus, at least one processor (for example, a central processing unit or a microprocessor) connected to the bus to control the operations of the apparatus, implement the functions described above, and execute commands, and a memory connected to the bus to store the commands as well as received and generated messages.

As will be understood by those skilled in the art, the exemplary embodiments may be implemented by any combination of software and/or hardware components, such as field-programmable gate arrays (FPGAs) or application-specific integrated circuits (ASICs), that perform particular tasks. A unit or module may conveniently be configured to reside on an addressable storage medium and to execute on one or more processors or microprocessors. Thus, by way of example, a unit or module may include components (such as software components, object-oriented software components, class components, and task components), processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided in the components and units may be combined into fewer components and units or modules, or further separated into additional components and units or modules.

The above-described exemplary embodiments may also be implemented in a computer-readable medium including program instructions for carrying out various operations executed by a computer. The medium may also include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable media include magnetic media (such as hard disks, floppy disks, and magnetic tape), optical media (such as CD-ROM disks and DVDs), magneto-optical media (such as optical disks), and hardware devices specially configured to store and execute program instructions (such as read-only memory (ROM), random-access memory (RAM), and flash memory). Examples of program instructions include both machine code (such as that produced by a compiler) and files containing higher-level code that may be executed by a computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described exemplary embodiments, and vice versa.

Several exemplary embodiments have been described above. Nevertheless, it should be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order, and/or if components in the described systems, architectures, devices, and circuits are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.

Claims (34)

1. An apparatus for analyzing an image, the apparatus comprising:
a classifier configured to receive an event signal corresponding to at least one pixel of an event-based vision sensor, and configured to determine a pattern of a pixel group of a plurality of pixels of the event-based vision sensor, wherein the pixel group includes the at least one pixel and a plurality of neighboring pixels of the event-based vision sensor adjacent to the at least one pixel; and
an analyzer configured to determine at least one of a shape of an object and a movement of the object based on the pattern of the pixel group.

2. The apparatus of claim 1, wherein the event signal indicates an occurrence of an event at a position of the at least one pixel of the event-based vision sensor.

3. The apparatus of claim 1, wherein the classifier determines whether the pixel group corresponds to at least one edge pattern among a plurality of predetermined edge patterns.

4. The apparatus of claim 3, wherein the classifier discards the pixel group in response to determining that the pixel group does not correspond to the at least one edge pattern.

5. The apparatus of claim 1, wherein the classifier comprises:
The device of claim 1, wherein the classifier comprises: 类型确定器,被配置为基于与所述至少一个像素相应的事件信号的时间戳和与所述多个邻近像素相应的事件信号的时间戳之间的差来确定所述多个邻近像素的像素类型;a type determiner configured to determine a pixel of the plurality of neighboring pixels based on a difference between a time stamp of an event signal corresponding to the at least one pixel and a time stamp of an event signal corresponding to the plurality of neighboring pixels type; 图案确定器,被配置为基于所述多个邻近像素的像素类型来确定像素组的图案。A pattern determiner configured to determine a pattern of pixel groups based on pixel types of the plurality of neighboring pixels. 6.如权利要求5所述的设备,其中,所述多个邻近像素的像素类型包括第一像素类型和第二像素类型,其中,在与所述至少一个像素相应的事件信号的时间戳和与第一像素类型的邻近像素相应的事件信号的时间戳之间的差小于第一阈值,在与所述至少一个像素相应的事件信号的时间戳和与第二像素类型的邻近像素相应的事件信号的时间戳之间的差大于第二阈值。6. The device of claim 5 , wherein the pixel types of the plurality of neighboring pixels include a first pixel type and a second pixel type, wherein the time stamp of the event signal corresponding to the at least one pixel and The difference between the time stamps of the event signals corresponding to neighboring pixels of the first pixel type is less than a first threshold, after the time stamps of the event signals corresponding to the at least one pixel and the event corresponding to neighboring pixels of the second pixel type The difference between the time stamps of the signals is greater than a second threshold. 7.如权利要求1所述的设备,其中,分析器包括:7. The device of claim 1, wherein the analyzer comprises: 外形分析器,被配置为基于像素组的图案来确定与像素组相应的对象的边缘的方向。A shape analyzer configured to determine the direction of an edge of an object corresponding to a pixel group based on the pattern of the pixel group. 8.如权利要求1所述的设备,其中,分析器包括:8. 
The device of claim 1, wherein the analyzer comprises: 计算器,被配置为基于像素组的图案计算与像素组相应的速度;a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; 运动分析器,被配置为基于与像素组相应的速度来分析对象的移动。A motion analyzer configured to analyze movement of objects based on velocities corresponding to groups of pixels. 9.如权利要求8所述的设备,其中,对象的移动包括对象的移动速度分量、对象的旋转速度分量和对象的缩放速度分量中的至少一个。9. The apparatus of claim 8, wherein the movement of the object includes at least one of a movement speed component of the object, a rotation speed component of the object, and a scaling speed component of the object. 10.如权利要求8所述的设备,其中,分析器还包括:10. The device of claim 8, wherein the analyzer further comprises: 选择器,被配置为基于包括在观察区域中的像素组的各种图案,从多个观察区域之中选择用于分析对象的移动的至少一个观察区域。A selector configured to select at least one observation area for analyzing movement of the object from among the plurality of observation areas based on various patterns of pixel groups included in the observation area. 11.如权利要求1所述的设备,还包括:11. The device of claim 1, further comprising: 处理器,被配置为基于对象的移动计算关于用户输入的点的相对坐标的变化量,并基于所述相对坐标的变化量来处理用户输入。A processor configured to calculate, based on the movement of the object, a change in relative coordinates of a point input by a user, and process the user input based on the change in relative coordinates. 12.一种用于分析图像的设备,所述设备包括:12. 
An apparatus for analyzing images, said apparatus comprising: 分类器,被配置为接收与基于事件的视觉传感器的第一像素相应的第一事件信号和与基于事件的视觉传感器的第二像素相应的第二事件信号,确定基于事件的视觉传感器的第一多个像素的第一像素组的第一图案,并确定第二多个像素的第二像素组的第二图案,其中,第一像素组包括所述至少一个第一像素和与所述至少一个第一像素邻近的基于事件的视觉传感器的第一多个邻近像素,第二像素组包括所述至少一个第二像素和与所述至少一个第二像素邻近的第二多个邻近像素;A classifier configured to receive a first event signal corresponding to a first pixel of the event-based vision sensor and a second event signal corresponding to a second pixel of the event-based vision sensor, determine a first event-based event signal of the event-based vision sensor a first pattern of a first pixel group of a plurality of pixels, and determine a second pattern of a second pixel group of a second plurality of pixels, wherein the first pixel group includes the at least one first pixel and the at least one a first plurality of adjacent pixels of the event-based vision sensor adjacent to the first pixel, the second pixel group comprising the at least one second pixel and a second plurality of adjacent pixels adjacent to the at least one second pixel; 分析器,基于第一图案检测对象的第一位置,基于第二图案检测对象的第二位置,并基于第一位置和第二位置确定对象的深度。An analyzer that detects a first location of the object based on the first pattern, detects a second location of the object based on the second pattern, and determines a depth of the object based on the first location and the second location. 13.如权利要求12所述的设备,其中,第一事件信号指示在基于事件的视觉传感器的第一像素的第一位置处的第一事件的发生,第二事件信号指示在基于事件的视觉传感器的第二像素的第二位置处的第二事件的发生。13. The device of claim 12 , wherein the first event signal is indicative of the occurrence of a first event at a first location of a first pixel of the event-based vision sensor, and the second event signal is indicative of an occurrence of a first event at a first pixel of the event-based vision sensor. Occurrence of a second event at a second location of a second pixel of the sensor. 14.如权利要求12所述的设备,其中,第一事件信号与在第一位置处检测到对象的移动的信号相应,第二事件信号与在空间上与第一位置隔开的第二位置处检测到所述移动的信号相应。14. 
The apparatus of claim 12 , wherein the first event signal corresponds to a signal detecting movement of the object at a first location, and the second event signal corresponds to a second location spatially separated from the first location. The signal corresponding to the movement is detected. 15.如权利要求12所述的设备,其中,分析器被配置为计算第一位置和第二位置之间的距离,当第一位置和第二位置之间的距离增加时,将对象的深度估计为变浅,当第一位置和第二位置之间的距离减小时,将对象的深度估计为变深。15. The device of claim 12, wherein the analyzer is configured to calculate the distance between the first location and the second location, and when the distance between the first location and the second location increases, the depth of the object Estimated to become shallower, the depth of the object is estimated to become darker as the distance between the first location and the second location decreases. 16.如权利要求12所述的设备,还包括:16. The device of claim 12, further comprising: 处理器,被配置为确定与对象的深度相应的运动模式,并基于运动模式来处理用户输入。A processor configured to determine a motion pattern corresponding to the depth of the object, and process user input based on the motion pattern. 17.一种分析图像的方法,所述方法包括:17. A method of analyzing an image, the method comprising: 接收与基于事件的视觉传感器的至少一个像素相应的事件信号;receiving an event signal corresponding to at least one pixel of the event-based vision sensor; 确定基于事件的视觉传感器的多个像素的像素组的图案,其中,像素组包括所述至少一个像素和与所述至少一个像素邻近的基于事件的视觉传感器的多个邻近像素;determining a pattern of pixel groups of a plurality of pixels of the event-based vision sensor, wherein the pixel group includes the at least one pixel and a plurality of adjacent pixels of the event-based vision sensor adjacent to the at least one pixel; 基于像素组的图案确定对象的外形和对象的移动中的至少一个。At least one of the shape of the object and the movement of the object is determined based on the pattern of groups of pixels. 18.如权利要求17所述的方法,其中,事件信号指示在基于事件的视觉传感器的所述至少一个像素的位置处的事件的发生。18. The method of claim 17, wherein the event signal is indicative of the occurrence of an event at the location of the at least one pixel of the event-based vision sensor. 
19.如权利要求17所述的方法,其中,确定像素组的图案的步骤包括:19. The method of claim 17, wherein determining the pattern of groups of pixels comprises: 基于与所述至少一个像素相应的事件信号的时间戳和与所述多个邻近像素相应的事件信号的时间戳之间的差来确定所述多个邻近像素的像素类型;determining a pixel type for the plurality of adjacent pixels based on a difference between a timestamp of an event signal corresponding to the at least one pixel and a timestamp of an event signal corresponding to the plurality of adjacent pixels; 基于所述多个邻近像素的像素类型来确定像素组是否与多个预定边缘图案之中的至少一个边缘图案相应;determining whether a group of pixels corresponds to at least one edge pattern among a plurality of predetermined edge patterns based on pixel types of the plurality of neighboring pixels; 响应于确定像素组不与所述至少一个边缘图案相应,舍弃该像素组,响应于确定像素组与所述至少一个边缘图案相应,确定该像素组的图案为所述至少一个边缘图案。Responsive to determining that the group of pixels does not correspond to the at least one edge pattern, discarding the group of pixels, and in response to determining that the group of pixels corresponds to the at least one edge pattern, determining the pattern of the group of pixels to be the at least one edge pattern. 20.如权利要求17所述的方法,其中,确定对象的外形和对象的移动中的至少一个的步骤包括:基于像素组的图案确定与像素组相应的边缘的方向。20. The method of claim 17, wherein determining at least one of the shape of the object and the movement of the object comprises determining a direction of an edge corresponding to the pixel group based on a pattern of the pixel group. 21.如权利要求17所述的方法,其中,确定对象的外形和对象的移动中的至少一个的步骤包括:21. 
The method of claim 17, wherein determining at least one of the shape of the object and the movement of the object comprises: 基于包括在观察区域中的像素组的各种图案,从多个观察区域之中选择用于分析对象的移动的至少一个观察区域;selecting at least one observation area for analyzing movement of the object from among a plurality of observation areas based on various patterns of pixel groups included in the observation area; 计算与包括在所述至少一个观察区域中的多个像素组相应的速度;calculating velocities corresponding to a plurality of pixel groups included in the at least one viewing area; 基于与所述多个像素组相应的速度来分析对象的移动。Movement of the object is analyzed based on velocities corresponding to the plurality of pixel groups. 22.如权利要求17所述的方法,还包括:22. The method of claim 17, further comprising: 基于包括在对象的移动中的对象的移动速度计算关于用户输入的点的相对坐标的变化量;calculating a change amount with respect to the relative coordinates of the point input by the user based on the moving speed of the object included in the movement of the object; 基于所述相对坐标的变化量来处理用户输入。User input is processed based on the amount of change in the relative coordinates. 23.一种用于分析图像的方法,所述方法包括:23. A method for analyzing an image, the method comprising: 接收指示对象的移动的与基于事件的视觉传感器的像素相应的事件信号的输入;receiving an input of an event signal corresponding to a pixel of the event-based vision sensor indicative of movement of the object; 选择和与事件信号相应的像素邻近的基于事件的视觉传感器的多个邻近像素;selecting a plurality of neighboring pixels of the event-based vision sensor adjacent to a pixel corresponding to the event signal; 确定所述多个邻近像素的边缘图案;determining an edge pattern for the plurality of neighboring pixels; 基于所述多个邻近像素的边缘图案来确定对象的边缘的位置;determining a location of an edge of an object based on an edge pattern of the plurality of neighboring pixels; 基于对象的边缘的位置来分析对象的外形。The shape of the object is analyzed based on the positions of the edges of the object. 24.如权利要求23所述的方法,其中,事件信号指示在基于事件的视觉传感器的像素的位置处的事件的发生。24. The method of claim 23, wherein the event signal is indicative of the occurrence of an event at the location of a pixel of the event-based vision sensor. 
25.如权利要求23所述的方法,其中,确定边缘图案的步骤包括:25. The method of claim 23, wherein determining the edge pattern comprises: 基于与所述多个邻近像素相应的事件信号的时间戳和与所述像素相应的事件信号的时间戳之间的差来确定所述多个邻近像素的像素类型;determining a pixel type for the plurality of neighboring pixels based on a time stamp of an event signal corresponding to the plurality of neighboring pixels and a difference between a time stamp of an event signal corresponding to the pixel; 基于所述多个邻近像素的像素类型来确定所述多个邻近像素的边缘图案是否与多个边缘图案中的至少一个边缘图案相应。Whether the edge pattern of the plurality of adjacent pixels corresponds to at least one edge pattern of the plurality of edge patterns is determined based on pixel types of the plurality of adjacent pixels. 26.如权利要求25所述的方法,其中,确定边缘图案的步骤还包括:26. The method of claim 25, wherein the step of determining the edge pattern further comprises: 如果所述多个邻近像素的边缘图案不与所述至少一个边缘图案相应,则舍弃所述像素和所述多个邻近像素。If the edge pattern of the plurality of neighboring pixels does not correspond to the at least one edge pattern, discarding the pixel and the plurality of neighboring pixels. 27.如权利要求23所述的方法,还包括:27. The method of claim 23, further comprising: 基于对象的外形来分析对象的移动。The movement of the object is analyzed based on the shape of the object. 28.一种用于处理用户输入的设备,所述设备包括:28. An apparatus for processing user input, the apparatus comprising: 识别器,被配置为基于输入图像来识别对象的移动,其中,输入图像包括指示检测到对象的移动的基于事件的视觉传感器的至少一个像素的事件信号;a recognizer configured to recognize movement of an object based on an input image, wherein the input image includes an event signal indicative of at least one pixel of an event-based vision sensor that detected movement of the object; 处理器,被配置为基于对象的移动来更新对象的与用户输入相应的相对位置。A processor configured to update the relative position of the object corresponding to the user input based on the movement of the object. 29.如权利要求28所述的设备,其中,事件信号指示在基于事件的视觉传感器的所述至少一个像素处的事件的发生。29. The device of claim 28, wherein the event signal is indicative of the occurrence of an event at the at least one pixel of the event-based vision sensor. 
30.如权利要求28所述的设备,其中,对象的移动包括对象的移动速度分量、对象的旋转速度分量和对象的缩放速度分量中的至少一个。30. The apparatus of claim 28, wherein the movement of the object includes at least one of a movement speed component of the object, a rotation speed component of the object, and a scaling speed component of the object. 31.如权利要求28所述的设备,其中,识别器包括:31. The device of claim 28, wherein the identifier comprises: 分类器,被配置为确定输入图像中的至少一个像素组的图案,其中,所述至少一个像素组包括与事件信号相应的所述至少一个像素和与所述至少一个像素邻近的多个邻近像素;a classifier configured to determine a pattern of at least one pixel group in the input image, wherein the at least one pixel group includes the at least one pixel corresponding to the event signal and a plurality of neighboring pixels adjacent to the at least one pixel ; 分析器,被配置为基于所述至少一个像素组的图案来分析对象的移动。An analyzer configured to analyze movement of the object based on the pattern of the at least one pixel group. 32.如权利要求31所述的设备,其中,分类器包括:32. The device of claim 31 , wherein the classifier comprises: 类型确定器,被配置为基于与所述至少一个像素相应的事件信号的时间戳和与所述多个邻近像素相应的事件信号的时间戳之间的差来确定所述多个邻近像素的像素类型;a type determiner configured to determine a pixel of the plurality of neighboring pixels based on a difference between a time stamp of an event signal corresponding to the at least one pixel and a time stamp of an event signal corresponding to the plurality of neighboring pixels type; 图案确定器,被配置为基于所述多个邻近像素的像素类型来确定像素组的图案。A pattern determiner configured to determine a pattern of pixel groups based on pixel types of the plurality of neighboring pixels. 33.如权利要求31所述的设备,其中,分析器包括:33. The device of claim 31 , wherein the analyzer comprises: 计算器,被配置为基于像素组的图案来计算与像素组相应的速度;a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; 运动分析器,被配置为基于与像素组相应的速度来分析对象的移动。A motion analyzer configured to analyze movement of objects based on velocities corresponding to groups of pixels. 34.如权利要求28所述的设备,其中,处理器还被配置为:基于对象的移动来计算对象的相对位置的变化量,基于所述变化量来更新对象的相对位置,并基于更新后的对象的相对位置来处理用户输入。34. 
The device according to claim 28, wherein the processor is further configured to: calculate an amount of change of the relative position of the object based on the movement of the object, update the relative position of the object based on the amount of change, and based on the updated The relative position of the object to handle user input.
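The depth relation recited in claims 12 and 15 (a larger separation between the two detected positions implies a shallower depth, a smaller separation a deeper one) matches ordinary stereo triangulation and can be illustrated as follows. The pinhole-stereo formula, focal length, and baseline below are assumptions for demonstration, not values from the patent, and the plain Euclidean distance stands in for whatever matching the claimed analyzer performs.

```python
import math

FOCAL_PX = 500.0   # assumed focal length, in pixels
BASELINE_M = 0.06  # assumed spatial separation of the two detections, in meters

def estimate_depth(first_pos, second_pos):
    """Estimate object depth from the distance between the first and second
    detected positions; depth shrinks as the separation grows (claim 15)."""
    disparity = math.dist(first_pos, second_pos)  # separation in pixels
    if disparity == 0.0:
        return float("inf")  # no separation: object effectively at infinity
    return FOCAL_PX * BASELINE_M / disparity
```

With these assumed constants, widening the separation from 5 px to 10 px halves the estimated depth, consistent with the monotonic relation in claim 15.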
CN201410366650.8A 2013-07-29 2014-07-29 Apparatus and method for analyzing images including event information Active CN104346427B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2013-0089254 2013-07-29
KR20130089254 2013-07-29
KR10-2013-0098273 2013-08-20
KR1020130098273A KR102129916B1 (en) 2013-07-29 2013-08-20 Apparatus and method for analyzing an image including event information

Publications (2)

Publication Number Publication Date
CN104346427A true CN104346427A (en) 2015-02-11
CN104346427B CN104346427B (en) 2019-08-30

Family

ID=51228312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410366650.8A Active CN104346427B (en) 2013-07-29 2014-07-29 Apparatus and method for analyzing images including event information

Country Status (4)

Country Link
US (1) US9767571B2 (en)
EP (1) EP2838069B1 (en)
JP (1) JP6483370B2 (en)
CN (1) CN104346427B (en)

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106488151A (en) * 2015-09-01 2017-03-08 三星电子株式会社 Sensor based on event and the pixel of the sensor based on event
CN106997453A (en) * 2016-01-22 2017-08-01 三星电子株式会社 Event signal processing method and equipment
CN107018357A (en) * 2016-01-27 2017-08-04 三星电子株式会社 Method and apparatus on the event sampling of the dynamic visual sensor of image formation
CN107750372A * 2015-03-16 2018-03-02 Université Pierre et Marie Curie (Paris VI) Method for three-dimensional (3D) reconstruction of a scene
CN110365912A (en) * 2015-04-20 2019-10-22 三星电子株式会社 Imaging unit, system, and image sensor unit
WO2019210546A1 (en) * 2018-05-04 2019-11-07 上海芯仑光电科技有限公司 Data processing method and computing device
US20200041258A1 (en) 2015-04-20 2020-02-06 Samsung Electronics Co., Ltd. Cmos image sensor for rgb imaging and depth measurement with laser sheet scan
CN111274834A (en) * 2018-12-04 2020-06-12 西克股份公司 Reading of optical codes
US10883821B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
CN114270804A (en) * 2019-08-28 2022-04-01 索尼互动娱乐股份有限公司 Sensor system, image processing apparatus, image processing method, and program
US11736832B2 (en) 2015-04-20 2023-08-22 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US11924545B2 (en) 2015-04-20 2024-03-05 Samsung Electronics Co., Ltd. Concurrent RGBZ sensor and system

Families Citing this family (18)

Publication number Priority date Publication date Assignee Title
KR102298652B1 2015-01-27 2021-09-06 삼성전자주식회사 Method and apparatus for determining disparity
KR102307055B1 (en) 2015-04-28 2021-10-01 삼성전자주식회사 Method and apparatus for extracting static pattern based on output of event-based sensor
EP3113108B1 (en) * 2015-07-02 2020-03-11 Continental Automotive GmbH Detection of lens contamination using expected edge trajectories
US10269131B2 (en) 2015-09-10 2019-04-23 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
KR102465212B1 (en) * 2015-10-30 2022-11-10 삼성전자주식회사 Photographing apparatus using multiple exposure sensor and photographing method thereof
US9934557B2 (en) 2016-03-22 2018-04-03 Samsung Electronics Co., Ltd Method and apparatus of image representation and processing for dynamic vision sensor
CN108574793B (en) 2017-03-08 2022-05-10 三星电子株式会社 Image processing equipment and electronic equipment including the same configured to regenerate time stamps
JP7357622B2 (en) * 2018-01-11 2023-10-06 ジェンサイト・バイオロジクス Method and device for processing asynchronous signals generated by event-based optical sensors
CN109919957B (en) * 2019-01-08 2020-11-27 同济大学 A Corner Detection Method Based on Dynamic Vision Sensor
EP3690736A1 (en) * 2019-01-30 2020-08-05 Prophesee Method of processing information from an event-based sensor
JP7120180B2 (en) * 2019-08-07 2022-08-17 トヨタ自動車株式会社 image sensor
JP7264028B2 (en) * 2019-12-05 2023-04-25 トヨタ自動車株式会社 Information providing system, information providing method, information terminal and information display method
CN111770245B (en) * 2020-07-29 2021-05-25 中国科学院长春光学精密机械与物理研究所 Pixel structure of a retina-like image sensor
JP2022073228A (en) 2020-10-30 2022-05-17 ソニーセミコンダクタソリューションズ株式会社 Information processing device and electronic equipment
US20240246241A1 (en) * 2021-06-03 2024-07-25 Sony Group Corporation Information processing apparatus, information processing system, and information processing method
US12243296B2 (en) * 2021-07-06 2025-03-04 Samsung Electronics Co., Ltd. Method of determining visual interference using a weighted combination of CIS and DVS measurement
US12262128B2 (en) * 2021-07-21 2025-03-25 Sony Group Corporation Image sensor control circuitry and image sensor control method
KR20230070091A (en) * 2021-11-12 2023-05-22 삼성전자주식회사 Operating method of dynamic vision sensor system

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101305398A (en) * 2005-10-12 2008-11-12 有源光学有限公司 Method for forming composite image based on multiple image frames
CN102177524A (en) * 2008-08-08 2011-09-07 实耐宝公司 Image-based inventory control system using advanced image recognition
CN102271253A (en) * 2010-06-07 2011-12-07 索尼公司 Image processing method using motion estimation and image processing apparatus

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
US20070268369A1 (en) * 2004-04-28 2007-11-22 Chuo Electronics Co., Ltd. Automatic Imaging Method and Apparatus
JP4140588B2 (en) 2004-09-01 2008-08-27 日産自動車株式会社 Moving object detection device
US7403866B2 (en) * 2004-10-06 2008-07-22 Telefonaktiebolaget L M Ericsson (Publ) High-resolution, timer-efficient sliding window
JP4650079B2 (en) * 2004-11-30 2011-03-16 日産自動車株式会社 Object detection apparatus and method
US20060197664A1 (en) 2005-01-18 2006-09-07 Board Of Regents, The University Of Texas System Method, system and apparatus for a time stamped visual motion sensor
US7613322B2 (en) * 2005-05-19 2009-11-03 Objectvideo, Inc. Periodic motion detection with applications to multi-grabbing
WO2006128315A1 (en) 2005-06-03 2006-12-07 Universität Zürich Photoarray for detecting time-dependent image data
KR100762670B1 (en) 2006-06-07 2007-10-01 삼성전자주식회사 Method and apparatus for generating disparity map from stereo image and method and apparatus for stereo matching therefor
EP2274725A1 (en) * 2008-04-04 2011-01-19 Advanced Micro Devices, Inc. Filtering method and apparatus for anti-aliasing
US20100079413A1 (en) * 2008-09-29 2010-04-01 Denso Corporation Control device
US20100118199A1 (en) * 2008-11-10 2010-05-13 Kabushiki Kaisha Toshiba Video/Audio Processor and Video/Audio Processing Method
JP5376906B2 (en) 2008-11-11 2013-12-25 パナソニック株式会社 Feature amount extraction device, object identification device, and feature amount extraction method
US8286102B1 (en) * 2010-05-27 2012-10-09 Adobe Systems Incorporated System and method for image processing using multi-touch gestures
US8698092B2 (en) * 2010-09-10 2014-04-15 Samsung Electronics Co., Ltd. Method and apparatus for motion recognition
KR101779564B1 (en) 2010-09-10 2017-09-20 삼성전자주식회사 Method and Apparatus for Motion Recognition
JP5624702B2 (en) * 2010-11-16 2014-11-12 日本放送協会 Image feature amount calculation apparatus and image feature amount calculation program
JP5645699B2 (en) * 2011-02-16 2014-12-24 三菱電機株式会社 Motion detection device and method, video signal processing device and method, and video display device
KR101880998B1 (en) 2011-10-14 2018-07-24 삼성전자주식회사 Apparatus and Method for motion recognition with event base vision sensor
KR102070562B1 (en) * 2012-06-19 2020-01-30 삼성전자주식회사 Event-based image processing device and method thereof

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN101305398A (en) * 2005-10-12 2008-11-12 有源光学有限公司 Method for forming composite image based on multiple image frames
CN102177524A (en) * 2008-08-08 2011-09-07 实耐宝公司 Image-based inventory control system using advanced image recognition
CN102271253A (en) * 2010-06-07 2011-12-07 索尼公司 Image processing method using motion estimation and image processing apparatus

Non-Patent Citations (1)

Title
JÜRGEN KOGLER et al.: "Event-based Stereo Matching Approaches for Frameless Address Event Stereo Data", Advances in Visual Computing *

Cited By (24)

Publication number Priority date Publication date Assignee Title
CN107750372A * 2015-03-16 2018-03-02 Université Pierre et Marie Curie (Paris VI) Method for three-dimensional (3D) reconstruction of a scene
CN107750372B (en) * 2015-03-16 2021-12-10 皮埃尔和玛利居里大学(巴黎第六大学) Method and device for three-dimensional reconstruction of scene and computer readable medium
US11002531B2 (en) 2015-04-20 2021-05-11 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US11431938B2 (en) 2015-04-20 2022-08-30 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
CN110365912A (en) * 2015-04-20 2019-10-22 三星电子株式会社 Imaging unit, system, and image sensor unit
US11924545B2 (en) 2015-04-20 2024-03-05 Samsung Electronics Co., Ltd. Concurrent RGBZ sensor and system
US20200041258A1 (en) 2015-04-20 2020-02-06 Samsung Electronics Co., Ltd. Cmos image sensor for rgb imaging and depth measurement with laser sheet scan
US11736832B2 (en) 2015-04-20 2023-08-22 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US11725933B2 (en) 2015-04-20 2023-08-15 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US10883821B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US10883822B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11650044B2 (en) 2015-04-20 2023-05-16 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11378390B2 (en) 2015-04-20 2022-07-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11131542B2 (en) 2015-04-20 2021-09-28 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US10893227B2 (en) 2015-04-20 2021-01-12 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
CN106488151A (en) * 2015-09-01 2017-03-08 三星电子株式会社 Sensor based on event and the pixel of the sensor based on event
CN106997453B (en) * 2016-01-22 2022-01-28 三星电子株式会社 Event signal processing method and device
CN106997453A (en) * 2016-01-22 2017-08-01 三星电子株式会社 Event signal processing method and equipment
CN107018357A (en) * 2016-01-27 2017-08-04 三星电子株式会社 Method and apparatus on the event sampling of the dynamic visual sensor of image formation
CN107018357B (en) * 2016-01-27 2020-07-14 三星电子株式会社 Method and Apparatus for Event Sampling of Dynamic Vision Sensors for Image Formation
US11481908B2 (en) 2018-05-04 2022-10-25 Omnivision Sensor Solution (Shanghai) Co., Ltd Data processing method and computing device
WO2019210546A1 (en) * 2018-05-04 2019-11-07 上海芯仑光电科技有限公司 Data processing method and computing device
CN111274834A (en) * 2018-12-04 2020-06-12 西克股份公司 Reading of optical codes
CN114270804A (en) * 2019-08-28 2022-04-01 索尼互动娱乐股份有限公司 Sensor system, image processing apparatus, image processing method, and program

Also Published As

Publication number Publication date
JP6483370B2 (en) 2019-03-13
EP2838069A2 (en) 2015-02-18
EP2838069B1 (en) 2017-12-27
EP2838069A3 (en) 2015-10-07
JP2015028780A (en) 2015-02-12
CN104346427B (en) 2019-08-30
US9767571B2 (en) 2017-09-19
US20150030204A1 (en) 2015-01-29

Similar Documents

Publication Publication Date Title
CN104346427B (en) Apparatus and method for analyzing images including event information
US10891500B2 (en) Method and apparatus for acquiring traffic sign information
EP3754295B1 (en) Three-dimensional measuring system and three-dimensional measuring method
US9025875B2 (en) People counting device, people counting method and people counting program
US20120121166A1 (en) Method and apparatus for three dimensional parallel object segmentation
JP2007527569A (en) Imminent collision detection based on stereoscopic vision
WO2012175731A1 (en) Depth measurement quality enhancement
US9928426B1 (en) Vehicle detection, tracking and localization based on enhanced anti-perspective transformation
JP2016071846A (en) Method and apparatus for detecting obstacle based on monocular camera
CN105335955A (en) Object detection method and object detection apparatus
CN106709895A (en) Image generating method and apparatus
JP2016205887A (en) Road surface gradient detector
JP6454984B2 (en) Hand position determination method and device based on depth image
CN105741312A (en) Target object tracking method and device
CN106355608A (en) Stereoscopic matching method on basis of variable-weight cost computation and S-census transformation
EP3879810A1 (en) Imaging device
CN110809766A (en) Advanced driver assistance system and method
KR102129916B1 (en) Apparatus and method for analyzing an image including event information
US20160196657A1 (en) Method and system for providing depth mapping using patterned light
EP3955207A1 (en) Object detection device
CN104992431A (en) Method and device for multispectral image registration
Yusuf et al. Data Fusion of Semantic and Depth Information in the Context of Object Detection
CN105138979A (en) Method for detecting the head of moving human body based on stereo visual sense
CN105488802A (en) Fingertip depth detection method and system
Srikakulapu et al. Depth estimation from single image using defocus and texture cues

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant