CN104346427A - Apparatus and method for analyzing image including event information


Info

Publication number
CN104346427A
Authority
CN
China
Prior art keywords
pixel
event
apparatus
pattern
pixel group
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410366650.8A
Other languages
Chinese (zh)
Other versions
CN104346427B (en)
Inventor
李俊行
柳贤锡
李圭彬
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020130098273A external-priority patent/KR102129916B1/en
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Publication of CN104346427A publication Critical patent/CN104346427A/en
Application granted granted Critical
Publication of CN104346427B publication Critical patent/CN104346427B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/50 Depth or shape recovery
    • G06T7/55 Depth or shape recovery from multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

An apparatus and a method for analyzing an image including event information determine a pattern of at least one pixel group corresponding to event information included in an input image, and analyze at least one of an appearance of an object and a motion of the object based on the determined pattern.

Description

Apparatus and method for analyzing an image including event information
Technical field
Methods and apparatuses consistent with exemplary embodiments relate to an apparatus for analyzing an image, and more particularly, to a method and apparatus for analyzing an object included in an input image.
Background
Image processing may refer to any form of information processing in which an image is input and output; for example, image processing may include the analysis or processing of photographs, video, and the like.
A device that senses input data for image processing may be a vision sensor and may include, for example, a photoelectric sensor fabricated as an integrated circuit using semiconductor device manufacturing technology.
Summary of the invention
According to an aspect of an exemplary embodiment, there is provided an apparatus for analyzing an image, the apparatus including: a classifier configured to receive an event signal corresponding to at least one pixel of an event-based vision sensor, and to determine a pattern of a pixel group of a plurality of pixels of the event-based vision sensor, wherein the pixel group includes the at least one pixel and a plurality of neighbor pixels of the event-based vision sensor adjacent to the at least one pixel; and an analyzer configured to determine, based on the pattern of the pixel group, at least one of an appearance of an object and a movement of the object.
The classifier may determine whether the pixel group corresponds to at least one edge pattern among a plurality of predetermined edge patterns.
The analyzer may include: a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; and a motion analyzer configured to analyze the movement of the object based on the velocity corresponding to the pixel group.
The apparatus for analyzing an image may further include a processor configured to calculate an amount of change in relative coordinates of a point related to a user input based on the movement of the object, and to process the user input based on the amount of change in the relative coordinates.
According to an aspect of an exemplary embodiment, there is provided an apparatus for analyzing an image, the apparatus including: a classifier configured to receive a first event signal corresponding to at least one first pixel of an event-based vision sensor and a second event signal corresponding to at least one second pixel of the event-based vision sensor, to determine a first pattern of a first pixel group of a first plurality of pixels of the event-based vision sensor, and to determine a second pattern of a second pixel group of a second plurality of pixels, wherein the first pixel group includes the at least one first pixel and a first plurality of neighbor pixels of the event-based vision sensor adjacent to the at least one first pixel, and the second pixel group includes the at least one second pixel and a second plurality of neighbor pixels adjacent to the at least one second pixel; and an analyzer configured to detect a first position of an object based on the first pattern, to detect a second position of the object based on the second pattern, and to determine a depth of the object based on the first position and the second position.
According to an aspect of an exemplary embodiment, there is provided a method for analyzing an image, the method including: receiving an event signal corresponding to at least one pixel of an event-based vision sensor; determining a pattern of a pixel group of a plurality of pixels of the event-based vision sensor; and analyzing at least one of an appearance of an object and a movement of the object based on the pattern of the pixel group, wherein the pixel group includes the at least one pixel and a plurality of neighbor pixels of the event-based vision sensor adjacent to the at least one pixel.
According to an aspect of an exemplary embodiment, there is provided a method for analyzing an image, the method including: receiving an input of an event signal corresponding to a pixel of an event-based vision sensor, the event signal indicating a movement of an object; selecting a plurality of neighbor pixels of the event-based vision sensor adjacent to the pixel corresponding to the event signal; determining an edge pattern of the plurality of neighbor pixels; determining a position of an edge of the object based on the edge pattern; and analyzing an appearance of the object based on the position of the edge of the object.
Other features and aspects of the exemplary embodiments will be apparent from the following detailed description, the accompanying drawings, and the claims.
Brief description of the drawings
Fig. 1 is a block diagram illustrating an apparatus for analyzing an image according to an exemplary embodiment;
Figs. 2A to 2C are diagrams illustrating a plurality of predetermined edge patterns according to exemplary embodiments;
Figs. 3A and 3B are diagrams illustrating a scheme for classifying a pattern of a pixel group according to exemplary embodiments;
Fig. 4 is a diagram illustrating a scheme for determining a direction of an edge of a pixel group according to an exemplary embodiment;
Figs. 5A and 5B are diagrams illustrating a scheme for analyzing an appearance of an object based on an input image according to exemplary embodiments;
Fig. 6 is a diagram illustrating a scheme for calculating a velocity corresponding to a pixel group according to an exemplary embodiment;
Fig. 7 is a diagram illustrating a scheme for analyzing a motion of an object using a rigid body model according to an exemplary embodiment;
Figs. 8A to 8D are diagrams illustrating schemes for improving an accuracy of analyzing a motion of an object according to exemplary embodiments;
Fig. 9 is a diagram illustrating a scheme for processing a user input based on a moving velocity of an object according to an exemplary embodiment;
Fig. 10 is a diagram illustrating a scheme for processing a user input based on a depth of an object according to an exemplary embodiment;
Fig. 11 is a flowchart illustrating a method for analyzing an image according to an exemplary embodiment;
Fig. 12 is a block diagram illustrating an apparatus for analyzing a three-dimensional (3D) image according to an exemplary embodiment.
Throughout the drawings and the detailed description, unless otherwise described, the same drawing reference numerals will be understood to refer to the same elements, features, and structures. The relative size and depiction of these elements may be exaggerated for clarity, illustration, and convenience.
Detailed description
The following detailed description is provided to assist the reader in gaining a comprehensive understanding of the methods, apparatuses, and/or systems described herein. Accordingly, various changes, modifications, and equivalents of the methods, apparatuses, and/or systems described herein will be suggested to those of ordinary skill in the art. The described progression of processing steps and/or operations is an example; however, the sequence of steps and/or operations is not limited to that set forth herein, and may be changed as is known in the art, with the exception of steps and/or operations necessarily occurring in a particular order. Also, respective descriptions of well-known functions and constructions may be omitted for increased clarity and conciseness.
Fig. 1 is a block diagram illustrating an apparatus 100 for analyzing an image according to an exemplary embodiment.
Before the apparatus 100 for analyzing an image is described with reference to Fig. 1, the input image to be used by the apparatus 100 is discussed briefly. An input image according to an exemplary embodiment may refer to the output image of an event-based vision sensor capturing an object. The event-based vision sensor may asynchronously output an event signal in response to detecting a predetermined event. The predetermined event may include a change in the brightness of light incident on the event-based vision sensor. For example, when an event in which light brightens at a predetermined pixel is detected, the event-based vision sensor may output an ON event corresponding to the relevant pixel, indicating an increase in brightness. Also, when an event in which light dims at a predetermined pixel is detected, the event-based vision sensor may output an OFF event corresponding to the relevant pixel, indicating a decrease in brightness.
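To make the ON/OFF event behavior concrete, the following is a minimal frame-based emulation in Python. It is only a sketch under stated assumptions (a fixed brightness threshold, synchronous input frames, and hypothetical function and parameter names); a real event-based sensor reports such changes asynchronously per pixel in hardware.

```python
import numpy as np

def emit_events(prev_frame, curr_frame, timestamp, threshold=0.15):
    """Emit ON/OFF events for pixels whose brightness change exceeds
    a threshold. prev_frame and curr_frame are 2D brightness arrays."""
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    events = []
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    for y, x in zip(ys, xs):
        polarity = 'ON' if diff[y, x] > 0 else 'OFF'  # brighten vs. dim
        events.append((timestamp, int(x), int(y), polarity))
    return events
```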
Unlike a frame-based vision sensor, the event-based vision sensor does not scan the photodiodes of its pixels in frame units; it outputs data only for the portion of pixels at which a change of light is detected. The change in the brightness of light incident on the vision sensor may be caused by a movement of an object. For example, suppose that the light source is substantially fixed, and that the object does not emit light spontaneously over time. In this case, the light incident on the vision sensor refers to light generated by the light source and reflected from the object. When the object does not move, the light reflected from the stationary object is substantially unchanged, and the brightness of the light incident on the event-based vision sensor therefore does not change. Conversely, when the object moves, the light reflected from the object changes with the movement of the object, and the brightness of the light incident on the event-based vision sensor therefore changes.
The event-based vision sensor may include a dynamic vision sensor. The dynamic vision sensor may include an artificial vision sensor that operates on the principle of the human retina and optic nerve. The dynamic vision sensor may output an event signal in response to a movement of an object. The event signal may include information generated asynchronously in response to the movement of the object, similar to the optic nerve information transmitted from the retina to the human brain. For example, an event signal may be generated when a moving object is detected, and no event signal may be generated for a stationary object. At least one pixel included in the event signal may correspond to the object whose movement is detected.
Referring to Fig. 1, the apparatus 100 for analyzing an image may include a classifier 110 and an analyzer 120. The classifier 110 may classify, based on a pattern, at least one pixel group of the input image corresponding to an event signal in which a movement of an object is detected. The at least one pixel group may include the pixel corresponding to the event signal and a plurality of pixels adjacent to that pixel.
Hereinafter, for ease of description, it is assumed that a pixel group includes a 3 × 3 matrix of 9 pixels, that the pixel corresponding to the event signal is disposed at the center of the pixel group, and that the 8 neighbor pixels arranged around that pixel are included in the pixel group. This scheme for configuring a pixel group is only an example and may be modified in various ways.
The classifier 110 may determine whether the pixel group corresponding to the event signal (that is, the group of pixels at which the event occurred) corresponds to one of a plurality of predetermined edge patterns. For example, referring to Fig. 2A, the plurality of predetermined edge patterns may include 24 edge patterns P1 to P24. The 24 edge patterns P1 to P24 refer to patterns associated with an edge of an object. When the pixel group corresponding to the event signal is determined to correspond to any one of the plurality of predetermined edge patterns, the classifier 110 may determine the pattern of the pixel group to be the corresponding edge pattern. For example, based on a result of determining that the pixel group corresponding to the event signal corresponds to edge pattern P1 of Fig. 2A, the classifier 110 may classify the pixel group as edge pattern P1. The classifier 110 may discard a pixel group corresponding to an event signal that is not associated with any one of the plurality of predetermined edge patterns.
A detailed description of the scheme by which the classifier 110 classifies the pattern of the pixel group corresponding to the event signal is provided with reference to Figs. 2A to 3B.
The analyzer 120 may analyze at least one of an appearance of the object (for example, a shape, a contour, or a position of the object relative to the event-based sensor) and a motion of the object, based on the pattern of the at least one pixel group classified by the classifier 110. The analyzer 120 may use the pattern of the at least one pixel group to determine the direction of the edge corresponding to the at least one pixel group, thereby analyzing the appearance of the object. Alternatively, the analyzer 120 may calculate velocities corresponding to the pixel groups associated with the edge of the object, and analyze the motion of the object based on the calculated velocities. The analyzer 120 may determine at least one of a translational velocity component of the object, a rotational velocity component of the object, and a scaling velocity component of the object to analyze the motion of the object.
A detailed description of the operation of the analyzer 120 is provided below with reference to Figs. 4 to 8D.
Figs. 2A to 2C are diagrams illustrating a plurality of predetermined edge patterns according to exemplary embodiments. Referring to Fig. 2A, the plurality of edge patterns may be associated with an edge of an object.
An event signal may include a timestamp indicating the time at which the predetermined event is detected, an indicator of the type of the event, and the index of the pixel at which the predetermined event is detected. As discussed below, a timestamp for each pixel of the sensor resolution may be stored in a table in memory, so that the time of the most recent event at each pixel is available.
An apparatus for analyzing an image according to an exemplary embodiment may classify the pattern of a pixel group based on differences between the timestamp of the pixel at which the event is detected and the timestamps of the plurality of neighbor pixels. The apparatus for analyzing an image may determine the types of the neighbor pixels in order to classify the pattern of the pixel group. The apparatus for analyzing an image may calculate the differences between the timestamp of the pixel at which the event is detected and the timestamps of the plurality of neighbor pixels, and determine the neighbor pixels to be of one or more types based on the result of the calculation.
The apparatus for analyzing an image may use a data structure for managing the timestamps corresponding to all pixels. When an event signal is detected, the apparatus for analyzing an image may update the timestamp of the pixel included in the event signal. In doing so, the apparatus for analyzing an image may discard the previously stored information and store the newly updated information. When a current event is detected, the apparatus for analyzing an image may update the value of the timestamp of the current pixel corresponding to the current event to the current time. The apparatus for analyzing an image may determine the pixel type of each neighbor pixel by calculating the difference between the updated timestamp of the current pixel corresponding to the current event and the timestamp of the neighbor pixel. The timestamp of a neighbor pixel may have been updated when a preceding event corresponding to that neighbor pixel was detected.
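As a sketch of such a data structure, one can keep a 2D array holding the latest event timestamp per pixel and overwrite it as each event signal arrives. The resolution and names below are assumptions for illustration.

```python
import numpy as np

WIDTH, HEIGHT = 128, 128                      # assumed sensor resolution
last_ts = np.full((HEIGHT, WIDTH), -np.inf)   # latest timestamp per pixel

def update_timestamp(x, y, timestamp):
    """Overwrite the stored timestamp for the pixel named in the event
    signal; the previously stored value is discarded."""
    last_ts[y, x] = timestamp

update_timestamp(10, 20, 0.0421)   # event at pixel (10, 20) at t = 42.1 ms
```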
The apparatus for analyzing an image may determine the type of a neighbor pixel based on equation 1.
[equation 1]
$$t_{ev} - t_{nx} \ge T_E \;\rightarrow\; \text{E-type}, \qquad t_{ev} - t_{nx} \le T_S \;\rightarrow\; \text{S-type}$$
Here, $t_{ev}$ denotes the timestamp of the pixel at which the event occurs, and $t_{nx}$ denotes the timestamp of a neighbor pixel, where the difference between $t_{ev}$ and $t_{nx}$ indicates a temporal correlation between pixel events, the temporal correlation indicating a displacement of the edge. $T_E$ denotes a threshold used to determine the E-type, which may correspond to slow events, and $T_S$ denotes a threshold used to determine the S-type, which may correspond to fast events. $T_E$ and $T_S$ may be set based on the sensitivity of the pixels or the application being employed. For example, when the detected moving object corresponds to a hand of a user, $T_E$ and $T_S$ may be set in a range of a millisecond (ms) to tens of ms. Alternatively, when the detected moving object moves noticeably faster than the hand of a user, $T_E$ and $T_S$ may be set to a few microseconds (μs) or less. $T_E$ and $T_S$ may be set to different values (where $T_S < T_E$, as shown in table 1) and, if desired, may be set to equal values.
When the current event is detected at a second time point after a predetermined period of time has elapsed since a first time point at which a preceding event was detected at the neighbor pixel, the apparatus for analyzing an image may determine that the neighbor pixel is E-type. For example, among the pixels adjacent to the pixel at which the current event is detected, a neighbor pixel at which no new event has been detected for a predetermined period of time (for example, $T_E$) may be classified as an E-type neighbor pixel.
When the current event is detected within a predetermined period of time from the time point at which a preceding event was detected at the neighbor pixel, the apparatus for analyzing an image may determine that the neighbor pixel is S-type. For example, among the pixels adjacent to the pixel at which the current event is detected, a neighbor pixel at which a new event has been detected within a predetermined period of time (for example, $T_S$) may be classified as an S-type neighbor pixel.
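A sketch of the neighbor typing of equation 1, assuming the timestamp table above and threshold values chosen for a hand-speed application (the concrete numbers are assumptions):

```python
T_E = 40e-3   # assumed E-type threshold (slow events), tens of ms
T_S = 5e-3    # assumed S-type threshold (fast events), T_S < T_E

def neighbor_type(t_ev, t_nx):
    """Classify one neighbor pixel from the timestamp difference."""
    dt = t_ev - t_nx
    if dt <= T_S:
        return 'S'    # a new event occurred at the neighbor very recently
    if dt >= T_E:
        return 'E'    # no new event at the neighbor for a long time
    return None       # between the thresholds: neither type
```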
A predetermined edge pattern may include neighbor pixels. For example, as shown in Fig. 2A, when the top, bottom, left, and right pixels closest to the pixel at which the event occurs are used, the predetermined edge pattern may include neighbor pixels n1 to n8. The combinations of the types of neighbor pixels configuring the predetermined edge patterns may differ from one another.
For example, edge pattern P1 may include E-type pixels n1, n2, and n4 (210) and S-type pixels n3 and n6 (220). The apparatus for analyzing an image may classify as edge pattern P1 a pixel group that includes neighbor pixels at the positions of pixels n1, n2, and n4 (210) at which no new event has been detected for a predetermined period of time (for example, $T_E$), and neighbor pixels at the positions of pixels n3 and n6 (220) at which a new event has been detected within a predetermined period of time (for example, $T_S$). In this case, the apparatus for analyzing an image may use the S-type pixels n3 and n6, at which a new event has been detected within the predetermined period of time (for example, $T_S$), to analyze the direction of the edge of the corresponding pixel group. This is because, when a new event is generated by a movement of the object, the events at the pixels included in the edge of the object may be detected at substantially the same point in time. As will be described in detail with reference to Fig. 4, edge pattern P1 may be mapped to an edge in the direction of the line connecting pixels n3 and n6 (220).
In a similar manner, edge pattern P24 may include E-type pixels n5, n7, and n8 (240) and S-type pixels n3 and n6 (250). The apparatus for analyzing an image may classify as edge pattern P24 a pixel group that includes neighbor pixels at the positions of pixels n5, n7, and n8 (240) at which no new event has been detected for a predetermined period of time (for example, $T_E$), and neighbor pixels at the positions of pixels n3 and n6 (250) at which a new event has been detected within a predetermined period of time (for example, $T_S$). In this case, the apparatus for analyzing an image may use the S-type pixels n3 and n6 (250), at which a new event has been detected within the predetermined period of time (for example, $T_S$), to analyze the direction of the edge of the corresponding pixel group. As will be described in detail with reference to Fig. 4, edge pattern P24 may be mapped to an edge in the direction of the line connecting pixels n3 and n6 (250).
An apparatus for analyzing an image according to another exemplary embodiment may utilize more neighbor pixels than the 8 neighbor pixels shown in Fig. 2A. For example, the apparatus for analyzing an image may use the 24 neighbor pixels of a 5 × 5 pixel matrix (as shown in Fig. 2B) or the 48 neighbor pixels of a 7 × 7 pixel matrix (as shown in Fig. 2C). The exemplary embodiments of Figs. 2A, 2B, and 2C are only examples and may be modified in various ways. The apparatus for analyzing an image may compare the types of the neighbor pixels included in the pixel group with the types of the neighbor pixels included in the plurality of predetermined edge patterns, and determine the pattern of the pixel group to be any one of the plurality of predetermined edge patterns. Based on the result of the comparison, the apparatus for analyzing an image may determine, from the plurality of predetermined edge patterns, the edge pattern matching the pixel group. To determine a match, the apparatus for analyzing an image may compare the types of the neighbor pixels included in the pixel group with the types of the neighbor pixels included in the plurality of predetermined edge patterns. In one example, when a first edge pattern and a second edge pattern are included in the plurality of predetermined edge patterns, the apparatus for analyzing an image may compare the types of the neighbor pixels included in the pixel group with the types of the neighbor pixels included in the first edge pattern, and may likewise compare the types of the neighbor pixels included in the pixel group with the types of the neighbor pixels included in the second edge pattern. When the types of the neighbor pixels included in the pixel group correspond to the types of the neighbor pixels included in the first edge pattern, the apparatus for analyzing an image may determine that the pattern of the pixel group is the first edge pattern. Alternatively, when the types of the neighbor pixels included in the pixel group correspond to the types of the neighbor pixels included in the second edge pattern, the apparatus for analyzing an image may determine that the pattern of the pixel group is the second edge pattern. In short, the apparatus for analyzing an image may determine the edge pattern whose neighbor pixels correspond to the neighbor pixels included in the pixel group to be the pattern of the pixel group.
Where necessary, a portion of the neighbor pixels included in a predetermined edge pattern may be designated, agnostically, as a "non-interesting" type. Such pixels are classified as neither E-type nor S-type. For example, edge pattern P1 may include "non-interesting" pixels n5, n7, and n8 (230), and edge pattern P24 may include "non-interesting" pixels n1, n2, and n4 (260).
The apparatus for analyzing an image may refrain from using the neighbor pixels corresponding to the "non-interesting" type among the neighbor pixels included in an edge pattern when classifying the pattern of a pixel group. In other words, the apparatus for analyzing an image may classify the pattern of the pixel group using only the pixels classified as E-type and S-type. For example, when the apparatus for analyzing an image determines whether a pixel group corresponds to edge pattern P1, it may disregard pixels n5, n7, and n8 (230). Similarly, when the apparatus for analyzing an image determines whether a pixel group corresponds to edge pattern P24, it may disregard pixels n1, n2, and n4 (260).
The plurality of predetermined edge patterns may be stored in various ways. For example, the E-type neighbor pixels and the S-type neighbor pixels included in the 24 edge patterns P1 to P24 may be stored in the form of bit values, as shown in table 1.
[table 1]
Here, PnE denotes the E-type neighbor pixels included in edge pattern Pn. Assuming that 8 neighbor pixels are used, PnE may be configured as 8 bits corresponding to pixels n1 to n8, respectively. Among the 8 bits, the bits corresponding to E-type neighbor pixels may be set to "1", and the remaining bits (S-type or "non-interesting" type) may be set to "0". For example, edge pattern P1 includes pixels n1, n2, and n4 (210) as E-type neighbor pixels, and the bit value of P1E may therefore be set to "11010000", in which the first, second, and fourth bits are "1". The bit value "11010000" of P1E may be expressed as a hexadecimal number, in which case P1E may be expressed as "D0". When 24 neighbor pixels are used according to another exemplary embodiment, PnE may be configured with 24 bits; when 48 neighbor pixels are used, PnE may be configured with 48 bits.
In addition, PnS denotes the S-type neighbor pixels included in edge pattern Pn. Assuming that 8 neighbor pixels are used, PnS may be configured as 8 bits corresponding to pixels n1 to n8, respectively. Among the 8 bits, the bits corresponding to S-type neighbor pixels may be set to "1", and the remaining bits (E-type or "non-interesting" type) may be set to "0". For example, edge pattern P1 includes pixels n3 and n6 (220) as S-type neighbor pixels, and the bit value of P1S may therefore be set to "00100100", in which the third and sixth bits are "1". The bit value "00100100" of P1S may be expressed as a hexadecimal number, in which case P1S may be expressed as "24". When 24 neighbor pixels are used according to another exemplary embodiment, PnS may be configured with 24 bits; when 48 neighbor pixels are used, PnS may be configured with 48 bits.
The apparatus for analyzing an image may check whether the neighbor pixels indicated by PnE correspond to the E-type and whether the neighbor pixels indicated by PnS correspond to the S-type, and determine, based on the checked pixels, whether the pixel group corresponds to edge pattern Pn.
When the pattern of a pixel group is classified, the apparatus for analyzing an image may disregard the "non-interesting" neighbor pixels. Accordingly, the apparatus for analyzing an image need not explicitly store information indicating the "non-interesting" neighbor pixels. For example, in PnE and PnS, the bits corresponding to the "non-interesting" neighbor pixels of edge pattern Pn may be set to "0". The "0" bits of the result of performing a bitwise OR operation on PnE and PnS then indicate the "non-interesting" neighbor pixels. For example, when a bitwise logical OR operation is performed on P1E = "11010000" and P1S = "00100100", P1E OR P1S = "11110100". Since the "0" bits of P1E OR P1S correspond to the fifth, seventh, and eighth bits, P1E OR P1S = "11110100" indicates that the "non-interesting" neighbor pixels included in edge pattern P1 are pixels n5, n7, and n8 (230).
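The bit-value storage lends itself to a simple mask test. The sketch below assumes the MSB-first ordering n1..n8 used above and a matching rule in which every PnE bit must be E-type and every PnS bit must be S-type in the pixel group; bits that are "0" in both masks are ignored as "non-interesting".

```python
# Pattern P1 from table 1, bits ordered n1..n8 (n1 is the MSB).
P1E = 0b11010000   # E-type neighbors n1, n2, n4 (hexadecimal D0)
P1S = 0b00100100   # S-type neighbors n3, n6 (hexadecimal 24)

def matches(group_e, group_s, pn_e, pn_s):
    """group_e/group_s: bits set where the pixel group's neighbors were
    classified as E-type/S-type. The 'non-interesting' bits of the
    pattern (zero in both pn_e and pn_s) are never consulted."""
    return (group_e & pn_e) == pn_e and (group_s & pn_s) == pn_s
```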
Table 1 is one exemplary embodiment for representing the E-type neighbor pixels and the S-type neighbor pixels included in edge pattern Pn. It should be appreciated by those skilled in the art that various modifications may be made to table 1 to represent the E-type neighbor pixels and the S-type neighbor pixels included in edge pattern Pn.
Figs. 3A and 3B are diagrams illustrating a scheme for classifying the pattern of a pixel group according to exemplary embodiments.
Referring to Fig. 3A, the 24 edge patterns P1 to P24 may be grouped into 6 groups 310, 320, 330, 340, 350, and 360. For example, the 24 edge patterns P1 to P24 may be grouped into the 6 groups 310 to 360 based on whether E-type neighbor pixels are present in common. Group 310 may be configured with edge patterns P1, P2, P6, P7, and P8, each of which includes E-type pixels n1, n2, and n4. Group 320 may be configured with edge patterns P4, P5, P9, P10, and P13, each of which includes E-type pixels n2, n3, and n5. As shown in Fig. 3A, the pixels common within a group among the pixels represented as E-type in Figs. 2A to 2C may be represented by a cross-hatched pattern.
As shown in table 2, the 6 groups and the edge patterns included in the respective groups may be distinguished by a mask bit value (E) and additional bit values (G).
[table 2]
The apparatus for analyzing an image may check whether the neighbor pixels of the pixel group at the positions corresponding to a mask bit value (E) correspond to the E-type, to determine which group the pattern of the pixel group corresponds to. The apparatus for analyzing an image may then check whether the neighbor pixels at the positions corresponding to the additional bit values (G) of the respective group correspond to the E-type, to determine the pattern of the pixel group.
For example, since the edge patterns P1, P2, P6, P7, and P8 included in group 310 include E-type pixels n1, n2, and n4, the first, second, and fourth bits of the mask bit E1 representing group 310 may be set to "1". The apparatus for analyzing an image may use the mask bit E1 to verify whether pixels n1, n2, and n4 correspond to E-type neighbor pixels, to determine whether the pixel group is included in group 310.
In addition, the apparatus for analyzing an image may use the additional bit values G11, G12, and G13 to determine the edge pattern corresponding to a pixel group classified into group 310. For example, the bit value G11 may refer to an additional bit value in which the sixth bit is set to "1". The apparatus for analyzing an image may use G11 to verify whether pixel n6 of the pixel group corresponds to the E-type.
Based on a result of determining that pixel n6 of the pixel group corresponds to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P6 or edge pattern P7. The apparatus for analyzing an image may then use the bit value G12 to verify whether pixel n3 of the pixel group corresponds to the E-type, to determine which of edge pattern P6 and edge pattern P7 the pixel group corresponds to. For example, when pixel n3 of the pixel group corresponds to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P7. Conversely, when pixel n3 of the pixel group does not correspond to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P6.
Based on a result of determining that pixel n6 of the pixel group does not correspond to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P1, P2, or P8. The apparatus for analyzing an image may then use the bit value G12 to verify whether pixel n3 of the pixel group corresponds to the E-type; when pixel n3 of the pixel group does not correspond to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P1. When pixel n3 of the pixel group corresponds to the E-type, the apparatus for analyzing an image may use G13 to verify whether pixel n5 of the pixel group corresponds to the E-type. When pixel n5 of the pixel group corresponds to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P8; when pixel n5 does not correspond to the E-type, the apparatus for analyzing an image may determine that the pixel group corresponds to edge pattern P2.
When the neighbor pixels at the positions corresponding to the mask bit values E1 to E4 do not correspond to the E-type, the apparatus for analyzing an image may determine that the pattern of the pixel group belongs to group 350 or group 360. The apparatus for analyzing an image may verify whether the neighbor pixels at the positions corresponding to the additional bit values G51 and G52 correspond to the S-type, and determine which of group 350 and group 360 the pixel group belongs to.
In one example, the apparatus for analyzing an image may use the bit value G51 to determine whether the pixel group belongs to group 360, where the bit value G51 may refer to an additional bit value in which the second and seventh bits are set to "1". The apparatus for analyzing an image may use the bit value G51 to verify whether pixels n2 and n7 of the pixel group correspond to the S-type, to determine whether the pixel group belongs to group 360. In addition, the apparatus for analyzing an image may verify whether the single remaining neighbor pixel (excluding pixels n2 and n7) corresponds to the E-type, to determine which of edge pattern P11 and edge pattern P14 the pixel group corresponds to.
In addition, the apparatus for analyzing an image may use the bit value G52 to determine whether the pixel group belongs to group 350, where the bit value G52 may refer to an additional bit value in which the fourth and fifth bits are set to "1". The apparatus for analyzing an image may verify whether pixels n4 and n5 of the pixel group correspond to the S-type, and determine whether the pixel group belongs to group 350. In addition, the apparatus for analyzing an image may verify whether the single remaining neighbor pixel (excluding pixels n4 and n5) corresponds to the E-type, and determine, between edge pattern P3 and edge pattern P22, the pattern to be classified as the pattern of the pixel group.
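Since table 2 itself is not reproduced in this text, the following sketch encodes only the decision steps for group 310 described above; the bit constants are hypothetical reconstructions using the same MSB-first n1..n8 ordering.

```python
E1  = 0b11010000   # mask bit for group 310: n1, n2, n4 must be E-type
G11 = 0b00000100   # additional bit: n6
G12 = 0b00100000   # additional bit: n3
G13 = 0b00001000   # additional bit: n5

def classify_group_310(e_bits):
    """Decision steps for group 310 as described above (P1, P2, P6, P7, P8)."""
    if (e_bits & E1) != E1:
        return None                          # not in group 310
    if e_bits & G11:                         # n6 is E-type -> P6 or P7
        return 'P7' if e_bits & G12 else 'P6'
    if not (e_bits & G12):                   # n3 not E-type
        return 'P1'
    return 'P8' if e_bits & G13 else 'P2'    # decide on n5
```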
Referring to Fig. 3B, the apparatus for analyzing an image may classify the edge pattern of a pixel group using the E-type neighbor pixels included in the pixel group. As will be discussed, when the threshold $T_E$ for E-type neighbor pixels equals the threshold $T_S$ for S-type neighbor pixels, the apparatus for analyzing an image may classify the edge pattern of the pixel group in this manner.
More particularly, the apparatus for analyzing an image may check whether the neighbor pixels b0 to b7 correspond to E-type neighbor pixels. For each neighbor pixel corresponding to the E-type, the apparatus for analyzing an image may set the bit corresponding to that pixel to "1"; otherwise, it may set the bit corresponding to that pixel to "0". The apparatus for analyzing an image may then calculate P-val based on equation 2.
[equation 2]
P-val = (B << 1) AND B AND (B >> 1)
Here, B denotes an 8-bit value, for example, b0 b1 b2 b3 b4 b5 b6 b7; (B << 1) denotes the value obtained by circularly shifting the 8-bit value B to the left by 1 bit, for example, b1 b2 b3 b4 b5 b6 b7 b0; (B >> 1) denotes the value obtained by circularly shifting the 8-bit value B to the right by 1 bit, for example, b7 b0 b1 b2 b3 b4 b5 b6; and P-val denotes the bit value obtained by performing a bitwise logical AND operation on (B << 1), B, and (B >> 1).
As shown in table 3, the apparatus for analyzing an image may use a look-up table (LUT) to determine the edge pattern from the calculated P-val. For example, when the calculated P-val is "00000001", the apparatus for analyzing an image may determine that the relevant pixel group corresponds to edge pattern P11. When the calculated P-val is a value that does not exist in the LUT of table 3, the apparatus for analyzing an image may determine that the calculated P-val does not correspond to any of the predetermined edge patterns.
[table 3]
When P-val equals the decimal values 17, 34, 68, or 136, two edge patterns may be candidates. A P-val corresponding to two edge patterns is considered a case in which a "non-interesting" neighbor pixel coincides with an E-type neighbor pixel, and the apparatus for analyzing an image may select either of the possible edge patterns based on a predetermined rule. For example, the apparatus for analyzing an image may additionally determine the type of a further neighbor pixel and select one of the two edge patterns accordingly. Alternatively, the apparatus for analyzing an image may select either of the edge patterns at random.
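A sketch of equation 2, assuming the shifts are 8-bit circular shifts as described; the look-up table fragment shows only the single mapping stated above ("00000001" to P11), the full table 3 not being reproduced in this text.

```python
def rotl8(b):
    return ((b << 1) | (b >> 7)) & 0xFF   # circular left shift by 1 bit

def rotr8(b):
    return ((b >> 1) | (b << 7)) & 0xFF   # circular right shift by 1 bit

def p_val(b):
    """Equation 2: bitwise AND of B with its two circular shifts."""
    return rotl8(b) & b & rotr8(b)

LUT = {0b00000001: 'P11'}   # hypothetical fragment of table 3

def classify(b):
    return LUT.get(p_val(b))   # None: no predetermined edge pattern
```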
Fig. 4 is a diagram illustrating a scheme for determining the direction of the edge of a pixel group according to an exemplary embodiment. Referring to Fig. 4, the plurality of predetermined edge patterns may be mapped to edges having predetermined directions.
The plurality of predetermined edge patterns may be mapped to edges having principal directions along which the S-type neighbor pixels are arranged. For example, edge patterns P1, P7, P18, and P24 (410) may be mapped to an edge 415 having direction E2. In addition, edge patterns P11, P12, P13, and P14 (420) may be mapped to an edge 425 having direction E4. As shown in table 4, the 24 edge patterns may be mapped to edges along 8 directions.
[table 4]
Pattern  Direction    Pattern  Direction
P1       E2           P13      E4
P2       E1           P14      E4
P3       E0           P15      E5
P4       E7           P16      E6
P5       E6           P17      E0
P6       E3           P18      E2
P7       E2           P19      E3
P8       E0           P20      E6
P9       E6           P21      E7
P10      E5           P22      E0
P11      E4           P23      E1
P12      E4           P24      E2
The apparatus for analyzing an image may classify the patterns of the pixel groups corresponding to the pixels at which events are detected. The plurality of classified patterns may be mapped to the edges of table 4, and the apparatus for analyzing an image may therefore identify the direction of the edge at each of the plurality of pixels at which events are detected. For example, the edge pattern mapped using table 4 may be stored as the edge at each of the plurality of event pixels, and the apparatus for analyzing an image may combine the stored edge information for the plurality of pixels to determine the appearance of the object.
More particularly, consider a case in which the apparatus for analyzing an image receives event signals generated by a movement of a hand of a user. In this case, the event pixels may include pixels corresponding to the edge of the hand of the user and pixels corresponding to the inside of the hand of the user. The apparatus for analyzing an image may determine the edge patterns corresponding to the plurality of event pixels. Here, the predetermined edge patterns include the patterns corresponding to edges. Accordingly, the apparatus for analyzing an image may determine that the pixels corresponding to the inside of the hand do not correspond to any of the predetermined edge patterns, and that the pixels corresponding to the edge of the hand of the user correspond to one of the predetermined edge patterns. When the plurality of pixels corresponding to the edge of the hand are determined to correspond to one of the predetermined edge patterns, the apparatus for analyzing an image may determine, based on table 4, the directions of the edges corresponding to the relevant edge patterns at the plurality of pixels corresponding to the edge of the hand. As a result, the apparatus for analyzing an image may determine the directions of the edges at the plurality of pixels corresponding to the edge of the hand, and determine the appearance of the hand of the user by integrating the directions of the plurality of edges.
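As a sketch, table 4 can be held as a dictionary from pattern index to direction label, and the stored per-pixel edge directions then outline the object; the data layout below is an assumption for illustration.

```python
# Table 4 as a dictionary: edge pattern index -> edge direction label.
PATTERN_TO_DIR = {
    1: 'E2', 2: 'E1', 3: 'E0', 4: 'E7', 5: 'E6', 6: 'E3',
    7: 'E2', 8: 'E0', 9: 'E6', 10: 'E5', 11: 'E4', 12: 'E4',
    13: 'E4', 14: 'E4', 15: 'E5', 16: 'E6', 17: 'E0', 18: 'E2',
    19: 'E3', 20: 'E6', 21: 'E7', 22: 'E0', 23: 'E1', 24: 'E2',
}

def contour_directions(classified):
    """classified: {(x, y): pattern index or None} for event pixels.
    Pixels with no matching pattern (e.g. inside the hand) are dropped;
    the rest carry a local edge direction that outlines the object."""
    return {pos: PATTERN_TO_DIR[p]
            for pos, p in classified.items() if p is not None}
```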
Figs. 5A and 5B are diagrams illustrating a scheme for analyzing the appearance of an object based on an input image according to exemplary embodiments.
Referring to Fig. 5A, an input image may be the output of an event-based vision sensor capturing 8 bars arranged around a center and rotating in a clockwise direction at an equal rotational speed. In this case, the event-based vision sensor may output event signals by detecting brightening events and dimming events. For example, the event-based vision sensor may output event signals by detecting that the brightness of a plurality of pixels in the image increases or decreases by more than a predetermined amount due to the 8 bars rotating in the clockwise direction. In Fig. 5A, the black dots (■) may refer to the output of the sensor detecting dimming events, in which the brightness decreases by at least the predetermined amount; the white dots may refer to the output of the sensor detecting brightening events, in which the brightness increases by at least the predetermined amount. Referring to Fig. 5B, the apparatus for analyzing an image may use the input image of Fig. 5A to analyze the appearance of the object.
The apparatus for analyzing an image may select the pixel groups corresponding to the predetermined edge patterns based on the schemes described with reference to Figs. 1 to 4, and estimate the appearance of the object based on the directions of the edges corresponding to the selected pixel groups. In this process, the apparatus for analyzing an image may effectively remove noise included in the input image due to smearing and the like.
In addition to the appearance of the object, the apparatus for analyzing an image may analyze the motion of the object. The apparatus for analyzing an image may calculate velocities at the plurality of pixels corresponding to the edge of the object and, after analyzing the appearance of the object from the event signals, analyze the motion of the object using the velocities at the plurality of pixels corresponding to the edge. Hereinafter, the operation by which the apparatus for analyzing an image calculates the velocities at the plurality of pixels corresponding to the edge of the object is described with reference to Fig. 6, and the operation by which the apparatus for analyzing an image analyzes the motion of the object is described with reference to Fig. 7.
Fig. 6 is a diagram illustrating a scheme for calculating a velocity corresponding to a pixel group according to an exemplary embodiment. Referring to Fig. 6, a pixel group may include motion direction information, and the apparatus for analyzing an image may use the neighbor pixels of the group to calculate the velocity corresponding to the pixel group.
Here, the apparatus for analyzing an image may calculate the velocities of the pixel groups corresponding to the edge of the object. For example, the apparatus for analyzing an image may calculate the velocities of the respective pixel groups classified as the predetermined edge patterns P1 to P24 of Figs. 2A to 2C, rather than calculating velocities for all of the pixels included in the event signals. As described above, since the predetermined edge patterns P1 to P24 include the patterns corresponding to the edge of the object, the apparatus for analyzing an image may calculate the velocities of the pixel groups corresponding to the edge of the object.
The apparatus for analyzing an image may calculate the x-axis direction velocity $V_x$ and the y-axis direction velocity $V_y$ corresponding to the pixel group based on equation 3.
[equation 3]
$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \sum_{\substack{i=1 \\ i \in \text{S-type}}}^{8} \alpha_i \begin{bmatrix} dx_i / dt_i \\ dy_i / dt_i \end{bmatrix}, \qquad \alpha_i = \left| \int_{\theta_{i,a}}^{\theta_{i,b}} \cos(\theta)\, d\theta \right|$$
Here, $\theta_{i,a}$ and $\theta_{i,b}$ denote the boundary angles subtended by the i-th S-type neighbor pixel as seen from the center of the pixel group. For example, when pixel n5 is S-type, $\theta_{5,a}$ (620) and $\theta_{5,b}$ (610) may be the boundary angles subtending pixel n5 from the center of the pixel group.
The apparatus for analyzing an image may mitigate its sensitivity to timestamp noise based on equation 4.
[equation 4]
$$\begin{bmatrix} V_x \\ V_y \end{bmatrix} = \sum_{\substack{i=1 \\ i \in \text{S-type}}}^{8} \alpha_i \begin{bmatrix} dx_i / \langle dt \rangle \\ dy_i / \langle dt \rangle \end{bmatrix}, \qquad \alpha_i = \left| \int_{\theta_{i,a}}^{\theta_{i,b}} \cos(\theta)\, d\theta \right|$$
Here, $\langle dt \rangle$ denotes the value given by equation 5.
[equation 5]
The apparatus for analyzing an image may store the velocity corresponding to the pixel group, and the velocity may include the x-axis direction velocity $V_x$ and the y-axis direction velocity $V_y$ calculated by equations 3 to 5. The velocity corresponding to the pixel group may refer to the velocity of the event pixel positioned at the center of the respective pixel group. As described above, the apparatus for analyzing an image may use the predetermined edge patterns to calculate the x-axis direction velocity $V_x$ and the y-axis direction velocity $V_y$ for the event pixels corresponding to the edge of the object. Hereinafter, a method for analyzing the motion of the object using the velocities of the event pixels corresponding to the edge of the object is described with reference to Fig. 7.
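A sketch of equation 3 for one pixel group, assuming the caller supplies, for each S-type neighbor, its offset (dx_i, dy_i) from the center pixel, its timestamp difference dt_i, and the boundary angles it subtends:

```python
import math

def group_velocity(s_neighbors):
    """s_neighbors: iterable of (dx, dy, dt, theta_a, theta_b) tuples,
    one per S-type neighbor pixel. Returns (Vx, Vy) per equation 3."""
    vx = vy = 0.0
    for dx, dy, dt, th_a, th_b in s_neighbors:
        alpha = abs(math.sin(th_b) - math.sin(th_a))  # |∫ cos(θ) dθ|
        vx += alpha * dx / dt
        vy += alpha * dy / dt
    return vx, vy
```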
Fig. 7 is a diagram illustrating a scheme for analyzing the motion of an object using a rigid body model according to an exemplary embodiment. Referring to Fig. 7, the apparatus for analyzing an image may analyze a four-degree-of-freedom (4-DOF) motion of an object 700.
For example, according to an exemplary embodiment, an input image in which a movement of the object 700 is detected is used. The object 700 may move on a two-dimensional (2D) surface at a translational velocity $V_p$ (740). Alternatively, the object 700 may rotate at an angular velocity $\omega$ (721) about a rotation center $O_c$ (720). Alternatively, the object 700 may expand or contract at a scaling velocity $V_z$ about a scaling center $O_z$ (730).
The apparatus for analyzing an image may analyze the translational velocity component, the rotational velocity component, and the scaling velocity component of the object 700. The apparatus for analyzing an image may use the schemes described with reference to Figs. 1 to 6 to calculate the velocity $V_i$ at a predetermined point $P_i$ (710) present on the edge of the object 700. For example, the velocity $V_i$ may refer to the x-axis direction velocity $V_x$ and the y-axis direction velocity $V_y$ calculated based on equations 3 to 5.
The apparatus for analyzing an image may model the velocity $V_i$ as shown in equation 6.
[equation 6]
$$V_{zi} + V_{ri} + V_p = V_i$$
Here, $V_{zi}$ (731), $V_{ri}$ (722), and $V_p$ (740) may refer to the scaling velocity component, the rotational velocity component, and the translational velocity component at the point $P_i$ (710). As shown in equation 6, the apparatus for analyzing an image may model the velocity $V_i$ so as to decompose the velocity $V_i$ at the point $P_i$ (710) located on the edge of the object 700 into a scaling velocity component, a rotational velocity component, and a translational velocity component. Equation 6 may be expressed as equation 7.
[equation 7]
$$tP_i + \omega A(P_i - O_c) + V_p = V_i$$
Here, $tP_i$ denotes the scaling velocity component; the coordinate $P_i$ (710), taken with $O_z$ (730) as the origin, may represent the direction and magnitude of the vector $V_{zi}$ (731), and the parameter $t$ scales the magnitude of the vector $V_{zi}$ (731). $\omega A(P_i - O_c)$ denotes the rotational velocity component; the coordinate difference $(P_i - O_c)$ may represent the direction and magnitude of the vector from the rotation center $O_c$ (720) toward the coordinate $P_i$ (710). The matrix $A$ denotes the rotation matrix for rotating the vector from the rotation center $O_c$ (720) toward the coordinate $P_i$ (710), for example,
$$A = \begin{bmatrix} 0 & -1 \\ 1 & 0 \end{bmatrix}.$$
The vector rotated by the matrix $A$ may point in the direction of the vector $V_{ri}$ (722), and the parameter $\omega$ scales the magnitude of the vector $V_{ri}$ (722).
The apparatus for analyzing an image may calculate the scaling velocity component parameter $t$, the rotational velocity component parameter $\omega$, the rotation center $O_c$, and the translational velocity component $V_p$ based on equation 7. This is possible because the apparatus for analyzing an image knows the coordinates $P_i$ (710) of the plurality of points located on the edge and the velocities $V_i$ at the respective points. The apparatus for analyzing an image may thereby analyze at least one of the translational velocity component, the rotational velocity component, and the scaling velocity component (that is, the 4-DOF motion of the object).
The method for calculating the scaling velocity component parameter $t$, the rotational velocity component parameter $\omega$, the rotation center $O_c$, and the translational velocity component $V_p$ based on equation 7 may be implemented in various ways. According to an exemplary embodiment, equation 8 may be derived from equation 7.
[equation 8]
$$t(P_i - \bar{P}) + \omega A(P_i - \bar{P}) = V_i - \bar{V}$$
Here, $P_i$ denotes the coordinates of the i-th point located on the edge of the object 700, and $\bar{P}$ denotes the mean of the coordinates of the points located on the edge of the object 700. $V_i$ denotes the velocity at the i-th point located on the edge of the object 700, and $\bar{V}$ denotes the mean of the velocities at the points located on the edge of the object 700. The variables are defined by equations 9 to 12.
[equation 9]
$$P_i = (x_i, y_i)$$
[equation 10]
$$\bar{P} = \left( \frac{1}{N} \sum_{i=1}^{N} x_i,\; \frac{1}{N} \sum_{i=1}^{N} y_i \right)$$
[equation 11]
$$V_i = (V_{xi}, V_{yi})$$
[equation 12]
$$\bar{V} = \left( \frac{1}{N} \sum_{i=1}^{N} V_{xi},\; \frac{1}{N} \sum_{i=1}^{N} V_{yi} \right)$$
The apparatus for analyzing an image may calculate the x-axis direction velocity $V_x$ and the y-axis direction velocity $V_y$ based on equations 3 to 5, and store the calculated $V_x$ and $V_y$ as the velocity $V_i$ at the pixel $P_i$. The apparatus for analyzing an image may use the coordinates $P_i$ and the velocities $V_i$ of the plurality of pixels to calculate $\bar{P}$ and $\bar{V}$. The apparatus for analyzing an image may use the plurality of $P_i$, the plurality of $V_i$, $\bar{P}$, and $\bar{V}$ together with equation 8 to calculate the parameters $t$ and $\omega$. For example, equations 13 and 14 may be derived from equation 8 based on a pseudo-inverse scheme.
[equation 13]
$$t = \frac{\sigma(x, V_x) + \sigma(y, V_y)}{\sigma^2(P)}$$
[equation 14]
$$\omega = \frac{\sigma(x, V_y) - \sigma(y, V_x)}{\sigma^2(P)}$$
Here, $\sigma^2(P) = \sigma^2(x) + \sigma^2(y)$, where $\sigma(\cdot)$ denotes the standard deviation operator; $\sigma(x, y) = E[(x - E[x])(y - E[y])]$; and $E[\cdot]$ denotes an expected value or a mean. The apparatus for analyzing an image may calculate the scaling velocity component parameter $t$ and the rotational velocity component parameter $\omega$ based on equations 13 and 14.
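A sketch of equations 13 and 14 using NumPy sample covariances over the edge points; the array shapes and function names are assumptions, and the sign of the rotational term follows the reconstruction of equation 14 above.

```python
import numpy as np

def fit_scale_and_rotation(P, V):
    """P: (N, 2) edge-point coordinates, V: (N, 2) their velocities.
    Least-squares estimates of the scaling parameter t and the angular
    velocity w from equations 13 and 14 (covariance form)."""
    x, y = P[:, 0], P[:, 1]
    vx, vy = V[:, 0], V[:, 1]
    cov = lambda a, b: np.mean((a - a.mean()) * (b - b.mean()))
    var_p = cov(x, x) + cov(y, y)            # sigma^2(P)
    t = (cov(x, vx) + cov(y, vy)) / var_p    # equation 13
    w = (cov(x, vy) - cov(y, vx)) / var_p    # equation 14
    return t, w
```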
Figs. 8A to 8D are diagrams illustrating schemes for improving the accuracy of analyzing the motion of an object 800 according to exemplary embodiments.
Referring to Fig. 8A, the apparatus for analyzing an image may use an observation region 810 of a size larger than the size of a pixel group to improve the accuracy of the motion analysis. The observation region 810 may refer to a set of pixel groups 811 including a plurality of pixel groups. For example, the observation region 810 may include a region produced by segmenting the object 800 into equal sizes along the edge of the object 800.
When selecting the observation region 810, the apparatus for analyzing an image may consider the variety of patterns of the pixel groups 811 included in it. For example, the apparatus for analyzing an image may select an observation region having pixel groups 811 of different patterns, and perform the motion analysis.
For example, suppose a case in which the object 800 moves in the rightward direction 820 without rotating, contracting, or expanding. Here, the object 800 may refer to an object having a rectangular shape, which may be tilted or oriented in an inclined manner while moving. The observation region 810 may include pixel groups 811 having the same or similar patterns. When the motion analysis is performed based on the pixel groups 811 included in the observation region 810, the direction of the movement may be determined to be the direction 812 although the direction of the actual movement is the rightward direction 820. The apparatus for analyzing an image may instead select an observation region 830 including a non-straight section of the edge of the object 800, rather than the observation region 810 including a straight section. The apparatus for analyzing an image may improve the accuracy of the motion analysis by selecting an observation region including pixel groups of different patterns.
An apparatus for analyzing an image according to another exemplary embodiment may use a level of homogeneity (LOH) to improve the accuracy of the motion analysis. For example, the apparatus for analyzing an image may calculate the LOH of a patch based on equation 15, and select a patch having a low LOH. Here, a patch may include a pixel group of a size greater than 3 × 3 pixels.
[equation 15]
Here, θ_ref and θ_i denote the edge angle (that is, the orientation) of the pixel positioned at the center of a patch and the edge angle (that is, the orientation) of the i-th neighboring pixel, respectively. When the LOH is high, the patterns of the pixel groups included in the corresponding patch are similar to one another to a certain degree; when the LOH is low, the patterns may be dissimilar. The apparatus for analyzing an image may select a patch having a low LOH, thereby selecting an observation area including pixel groups of different patterns. Accordingly, edges may be classified, and the orientations of the edges may be used to determine key features of an image, and may also be applied to determining a shape and/or a movement of an object.
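Because the body of Equation 15 is not legible above, the following sketch assumes one plausible definition consistent with the surrounding description: a patch-center reference angle θ_ref is compared against the neighboring edge angles θ_i, yielding a high value for homogeneous orientations and a low value for diverse ones. The function names and the threshold are illustrative assumptions:

```python
import numpy as np

def level_of_homogeneity(theta_ref, theta_neighbors):
    """A plausible LOH: mean cos^2 of the differences between the
    patch-center edge angle and each neighboring edge angle, so similar
    orientations score high and diverse orientations score low."""
    diffs = np.asarray(theta_neighbors) - theta_ref
    return float(np.mean(np.cos(diffs) ** 2))

def select_diverse_patches(patches, threshold=0.5):
    """Keep patches whose edge orientations are diverse (low LOH), which
    avoids the aperture problem illustrated in Fig. 8A."""
    return [p for p in patches
            if level_of_homogeneity(p["theta_ref"], p["theta_i"]) < threshold]
```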
Fig. 8B illustrates an example in which the accuracy of the motion analysis is improved by using the LOH. For example, suppose the object 800 moves in the rightward direction 820. When the moving object 800 is observed in the observation area 810, the edge of the object 800 may be observed as an edge 831 at a time point t, and subsequently as an edge 832 at a time point t + Δt. In this case, because the object 800 moves in the rightward direction 820, the velocity of the actual movement of the object 800 may be expressed as a velocity vector 840. When motion analysis without the LOH is used, the moving velocity of the object 800 may be calculated as a velocity vector 850. The velocity vector 850 may differ from the velocity vector 840 representing the actual moving velocity in both direction and magnitude.
When motion analysis using the LOH is used, the moving velocity of the object 800 may be calculated as a velocity vector 860. The direction of the velocity vector 860 may be identical to the direction of the velocity vector 840 representing the actual movement of the object 800; however, the magnitude of the velocity vector 860 may differ from the magnitude of the velocity vector 840.
With reference to Fig. 8C, although the object moves with an actual velocity 872 of identical speed and direction, the magnitudes of the velocities 873 calculated along the outline 871 of the object 800 may differ. For example, the smaller the vector component along the direction of the actual movement (for example, the x-axis direction), the smaller the magnitude of the calculated velocity.
With reference to Fig. 8D, the apparatus for analyzing an image may calibrate the magnitude of the moving velocity of the object 800 based on Equations 16 and 17. More particularly, in operation 881, the apparatus may receive V_i as an edge event. In operation 882, the apparatus may use V_i to calculate V_p, O_c, t, and ω. Because operations 881 and 882 are similar to the features described with reference to Figs. 1 to 7, a detailed description thereof is omitted. In operation 883, the apparatus may analyze the motion using the LOH to calculate V_p. Likewise, the description of Fig. 8A applies to operation 883, and a detailed description of operation 883 is therefore omitted.
In operation 884, the apparatus may calculate V_i^gen based on Equation 16. The definitions of the parameters used in Equation 16 are identical to the definitions of the parameters used in Equation 7.
[equation 16]
V_i^{gen} = t P_i + \omega A (P_i - O_c) + V_p
In operation 885, the apparatus may calculate V_i^cor based on Equation 17. Here, θ denotes the angular difference between V_i and V_i^gen, and may correspond to the angle 855 illustrated in Fig. 8B.
[equation 17]
V_i^{cor} = \begin{pmatrix} 1 & -\tan\theta \\ \tan\theta & 1 \end{pmatrix} V_i
Although V_i has a magnitude similar to the magnitude of the actual moving velocity, V_i may have a direction different from the direction of the actual movement. This is because the moving velocity is calculated for all edges, regardless of the LOH. Conversely, although V_i^gen has a magnitude smaller than the magnitude of the actual moving velocity, V_i^gen may have the same direction as the actual movement. This is because the moving velocity is calculated for edges in observation ranges of low LOH. Accordingly, the apparatus for analyzing an image may take the magnitude of the vector from V_i and the direction of the vector from V_i^gen, to calculate V_i^cor based on Equation 17.
According to another exemplary embodiment, the apparatus for analyzing an image may repeat operations 882 to 885 a plurality of times. For example, when θ is 90 degrees, the value of tan 90° is undefined, and calculating V_i^cor may be difficult. In this case, the apparatus may rotate V_i repeatedly to calculate V_i^cor. The number of repetitions may be two or more. Here, the rotation applied in the k-th repetition needs to satisfy the maximum rotation angle allowed for that repetition.
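Putting operations 884 and 885 together, the correction step might be sketched as follows; the rotation matrix A and the function names are assumed conventions for illustration rather than the patent's literal definitions:

```python
import numpy as np

def generated_velocity(t, omega, Vp, P_i, O_c):
    """Equation 16: the model velocity at pixel P_i composed of the
    scaling, rotation, and translation components."""
    A = np.array([[0.0, -1.0], [1.0, 0.0]])  # 90-degree rotation; assumed convention
    return t * P_i + omega * (A @ (P_i - O_c)) + Vp

def corrected_velocity(V_i, V_gen):
    """Equation 17: rotate V_i by the angle theta between V_i and V_gen,
    so the result keeps V_i's magnitude information and V_gen's direction."""
    theta = np.arctan2(V_gen[1], V_gen[0]) - np.arctan2(V_i[1], V_i[0])
    M = np.array([[1.0, -np.tan(theta)],
                  [np.tan(theta), 1.0]])
    return M @ V_i
```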
Fig. 9 is a diagram illustrating a scheme of processing a user input based on a moving velocity of an object 910 according to an exemplary embodiment.
With reference to Fig. 9, the apparatus for analyzing an image may process a user input based on a motion of the object 910. The object 910 may be a hand of a user.
The apparatus for analyzing an image may calculate the moving velocity 915 of the object 910 using the descriptions provided above with reference to Figs. 1 to 8. The apparatus may use the calculated moving velocity 915 to calculate an amount of change of a relative coordinate of a point of the user input.
The apparatus for analyzing an image may process the user input based on the amount of change of the relative coordinate. For example, the apparatus may move a position of a cursor indicated on a display 920 from a current position 921 to a new position 922.
Here, the relative coordinate may indicate a relative position of an indicator of a user interface (UI), such as a mouse pointer, with respect to the current position of the indicator of the UI. An apparatus for processing a user input may calculate an amount of change of the relative position of the indicator of the UI based on the motion of the object. For example, when the object moves to the right at 1 m/s for one second, the amount of change of the relative position of the indicator of the UI may be calculated as a vector whose direction is rightward and whose magnitude corresponds to 1 m.
The apparatus for processing a user input may determine the relative position of the indicator of the UI by calculating the new position to which the indicator of the UI moves from its current position according to the amount of change of the relative position. The apparatus for processing a user input may move the indicator of the UI from the current position to the new position.
Although not shown in the drawings, the apparatus for analyzing an image may include a recognizer and a processor. The recognizer may recognize the motion of the object based on an input image including an event signal in which a movement of the object is detected. For example, the recognizer may calculate the moving velocity 915 of the object 910 using the descriptions provided above with reference to Figs. 1 to 8. The processor may calculate a relative coordinate for the user input based on the motion of the object recognized by the recognizer. For example, the processor may use the moving velocity 915 calculated by the recognizer to calculate the amount of change of the relative coordinate for the user input. The processor may update the relative coordinate based on the amount of change of the relative coordinate, and process the user input using the updated relative coordinate.
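A rough sketch of this recognizer/processor split for the pointer example of Fig. 9 is shown below; the class name, the meters-to-pixels gain, and the update signature are invented for illustration:

```python
class PointerProcessor:
    """Updates a UI indicator's relative coordinate from an object's
    recognized moving velocity, as described for Fig. 9."""

    def __init__(self, gain_px_per_m=1000.0):
        self.gain = gain_px_per_m       # assumed mapping from meters to pixels
        self.position = (0.0, 0.0)      # current indicator position on the display

    def update(self, velocity_m_s, dt_s):
        """Amount of change of the relative coordinate = velocity * elapsed time."""
        x, y = self.position
        self.position = (x + velocity_m_s[0] * dt_s * self.gain,
                         y + velocity_m_s[1] * dt_s * self.gain)
        return self.position
```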
Fig. 10 is a diagram illustrating a scheme of processing a user input based on a depth of an object according to an exemplary embodiment.
With reference to Fig. 10, the apparatus for analyzing an image may also analyze the depth of an object using two different event signals in which a movement of the same object is detected at two spatially separated positions. For example, a sensor 1020 may include a first sensor corresponding to a left eye and a second sensor corresponding to a right eye. The apparatus may measure the depth of the object using a disparity between the images output from the two sensors corresponding to the two eyes.
With reference to Fig. 10, the apparatus for analyzing an image may calculate the disparity of the two images output from the two sensors, based on a scheme of maximizing a level of similarity (LOS) between corresponding patches of the left-eye and right-eye images. More particularly, the apparatus may process the event signal output from the sensor 1020 corresponding to the left eye and the event signal output from the sensor 1020 corresponding to the right eye. For example, the apparatus may detect pixels corresponding to edges by classifying each of the two event signals into edge patterns, and may determine the direction of the edge at each detected pixel based on the edge pattern of that pixel. In this case, the apparatus may obtain a superimposed image in which the two edges of the object are separated from each other. The two separated edges of the object in the two images form the disparity between the two images. The apparatus may apply the algorithm of Table 5 to the superimposed image in which the two edges of the object are separated from each other.
[table 5]
lLOS(x, y, d) \equiv \frac{1}{M} \sum_{(x_i, y_i) \in patch@(x, y)} \cos^2\{\theta(x_i, y_i) - \theta(x_i + d, y_i)\}
θ: azimuth angle, M: number of valid pixels
gLOS(d) \equiv \sum_y rLOS(y, d)
LOS(x, y, d) \equiv lLOS(x, y, d) \times rLOS(y, d) \times gLOS(d)
disparity(x, y) = \arg\max_d \{ LOS(x, y, d) \}
Here, (x, y) denotes the coordinates of a patch in the image, and a plurality of points may be included in the patch; (x_i, y_i) denotes the coordinates of the i-th point included in the patch at the (x, y) coordinates; and d denotes the disparity between the two images. The two images are received from the sensor corresponding to the left eye and the sensor corresponding to the right eye, and accordingly, the two images are generally separated from each other along the x-axis direction. Therefore, d may denote a degree to which the two images are separated from each other along the x-axis direction. θ(x_i, y_i) denotes an azimuth angle, and may correspond to the direction of the edge calculated at the point at the (x_i, y_i) coordinates.
lLOS(x, y, d) denotes a formula for determining the LOS in a single patch at the (x, y) coordinates; rLOS(y, d) denotes a formula for determining the LOS on a one-dimensional (1D) line at the y coordinate; and gLOS(d) denotes a formula for determining the LOS in the two-dimensional (2D) region of the entire image.
The apparatus for analyzing an image may calculate the d that maximizes LOS(x, y, d). To maximize LOS(x, y, d), the difference between θ(x_i, y_i) and θ(x_i + d, y_i) may be minimized, and accordingly, the apparatus may calculate the d that minimizes the difference between θ(x_i, y_i) and θ(x_i + d, y_i).
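A brute-force search over candidate disparities in the spirit of Table 5 might be sketched as follows. Because the definition of rLOS is not reproduced above, only the local lLOS term is scored here; the patch size, validity marking by NaN, and in-bounds coordinates are assumptions of the sketch:

```python
import numpy as np

def local_los(theta_l, theta_r, x, y, d, half=2):
    """lLOS from Table 5: mean cos^2 of the edge-angle differences between
    a patch at (x, y) in the left image and the right image shifted by d.
    Invalid pixels are marked NaN so that M counts valid pixels only."""
    ys, xs = np.mgrid[y - half:y + half + 1, x - half:x + half + 1]
    diffs = theta_l[ys, xs] - theta_r[ys, xs + d]
    valid = ~np.isnan(diffs)
    return float(np.mean(np.cos(diffs[valid]) ** 2)) if valid.any() else 0.0

def disparity_at(theta_l, theta_r, x, y, d_max=32):
    """Pick the candidate d maximizing the local LOS; the full scheme of
    Table 5 additionally weights by the row-wise rLOS and global gLOS."""
    scores = [local_los(theta_l, theta_r, x, y, d) for d in range(d_max)]
    return int(np.argmax(scores))
```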
When the disparity between the two images increases, the apparatus for analyzing an image may estimate the depth of the object to be shallower, and when the disparity between the two images decreases, the apparatus may estimate the depth of the object to be deeper. The apparatus may process a user input based on the estimated depth of the object.
The apparatus for analyzing an image may determine an operation mode corresponding to the depth of the object 1010 from the sensor 1020. For example, a first operation mode area 1031, a second operation mode area 1032, a third operation mode area 1033, and a background area 1034 behind the object 1010 may be predetermined based on the depth from the sensor 1020.
When the depth of the object 1010 is determined to correspond to the second operation mode area 1032, the apparatus for analyzing an image may process an input using the object 1010, based on a scheme of processing a user input corresponding to the second operation mode area 1032.
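A minimal sketch of this depth-to-mode mapping follows; the depth thresholds and mode labels are invented for illustration:

```python
def operation_mode(depth_m):
    """Map an estimated object depth to one of the predetermined operation
    mode areas of Fig. 10; the boundaries below are assumed values."""
    if depth_m < 0.3:
        return "mode_1"      # first operation mode area 1031
    if depth_m < 0.6:
        return "mode_2"      # second operation mode area 1032
    if depth_m < 1.0:
        return "mode_3"      # third operation mode area 1033
    return "background"      # background area 1034 behind the object
```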
Fig. 11 is a flowchart illustrating a method of analyzing an image according to an exemplary embodiment.
With reference to Fig. 11, in operation 1110, the apparatus for analyzing an image according to an exemplary embodiment reads event information. In operation 1120, the apparatus may update an event occurrence time map based on the position and the occurrence time of a produced event. In operation 1130, the apparatus may analyze a pattern of event occurrence times for events proximate to the produced event in the event occurrence time map.
In operation 1140, the apparatus may classify a directivity of an edge based on the event occurrence pattern. In operation 1150, the apparatus may extract a velocity component based on the edge direction pattern and the event occurrence pattern.
In operation 1160, the apparatus may accumulate event information to analyze a movement of the object. In operation 1170, the apparatus may determine whether the number of accumulated events is sufficient to analyze the movement of the object accurately. When it is determined that the number of accumulated events is insufficient to analyze the movement of the object, in operation 1175, the apparatus may determine whether the accumulation time is sufficient to analyze the movement of the object accurately. When it is determined that the accumulation time is insufficient, the apparatus may return to operation 1110 and accumulate new event information.
When the number of accumulated events is sufficient to analyze the movement of the object accurately, or when the number of accumulated events is insufficient but the accumulation time is sufficient, in operation 1180, the apparatus may separate the position and the moving velocity of the object into a plurality of components. The apparatus may obtain a translational velocity component, an expansion or contraction velocity, and a rotational velocity as the components of the moving velocity.
In operation 1190, the apparatus may determine a main movement component of the object. For example, the apparatus may determine, among the translational velocity, the expansion or contraction velocity, and the rotational velocity, at least one movement component that contributes most to the movement of the object.
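Read as pseudocode, the accumulation logic of operations 1160 to 1175 could be organized as below; read_event stands in for the per-event processing of operations 1110 to 1150, and the thresholds are illustrative assumptions:

```python
import time

def accumulate_events(read_event, min_events=100, max_time_s=0.1):
    """Accumulate per-event velocity information until either enough events
    have arrived (operation 1170) or enough time has elapsed (operation 1175);
    the caller then decomposes the motion into translation,
    expansion/contraction, and rotation (operation 1180) and picks the
    dominant component (operation 1190)."""
    events, t0 = [], time.monotonic()
    while len(events) < min_events and time.monotonic() - t0 < max_time_s:
        events.append(read_event())  # operations 1110-1150 happen inside
    return events
```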
Because these are similar to the features described with reference to Figs. 1 to 10, a further detailed description of Fig. 11 is omitted.
Fig. 12 is a block diagram illustrating an apparatus 1200 for analyzing a three-dimensional (3D) image according to an exemplary embodiment.
With reference to Fig. 12, the apparatus for analyzing a 3D image may include at least two image change detectors 1210 and 1215, edge direction information extractors 1220 and 1225, velocity information extractors 1230 and 1235, and average edge direction information extractors 1240 and 1245. The image change detectors 1210 and 1215 include event-based vision sensors 1211 and 1216, respectively; the edge direction information extractors 1220 and 1225 detect edge direction information for one or more events; the velocity information extractors 1230 and 1235 detect velocity information for one or more events; and the average edge direction information extractors 1240 and 1245 detect an average edge direction of pixels over a predetermined time for one or more events.
In addition, the apparatus 1200 for analyzing a 3D image may further include a disparity map extractor 1250, a distance information mapper 1260, and a 3D position/motion analyzer 1270. The disparity map extractor 1250 determines a disparity map based on the edge direction information from the edge direction information extractors 1220 and 1225, and the distance information mapper 1260 determines distance information for one or more events.
Because these are similar to the features described with reference to Figs. 1 to 11, a detailed description of the modules illustrated in Fig. 12 is omitted.
The exemplary embodiments illustrated in the drawings may be implemented by an apparatus including a bus connected to each unit of the apparatus, at least one processor (for example, a central processing unit or a microprocessor) connected to the bus to control the operations of the apparatus, to implement the functions described above, and to execute commands, and a memory connected to the bus to store the commands and the received and generated messages.
As will be appreciated by those skilled in the art, the exemplary embodiments may be implemented by any combination of software and/or hardware components, such as a field-programmable gate array (FPGA) or an application-specific integrated circuit (ASIC), that perform particular tasks. A unit or module may advantageously be configured to reside on an addressable storage medium and configured to execute on one or more processors or microprocessors. Thus, for example, a unit or module may include components (such as software components, object-oriented software components, class components, and task components), processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. The functionality provided in the components and units may be combined into fewer components and units or modules, or further separated into additional components and units or modules.
The above-described exemplary embodiments may also be implemented in computer-readable media including program instructions for implementing various operations executed by a computer. The media may also include, alone or in combination with the program instructions, data files, data structures, and the like. Examples of computer-readable media include magnetic media (such as hard disks, floppy disks, and magnetic tape), optical media (such as CD-ROM discs and DVDs), magneto-optical media (such as optical discs), and hardware devices specially configured to store and perform program instructions (such as read-only memory (ROM), random access memory (RAM), and flash memory). Examples of program instructions include both machine code (such as that produced by a compiler) and files containing higher-level code that may be executed by the computer using an interpreter. The described hardware devices may be configured to act as one or more software modules in order to perform the operations of the above-described exemplary embodiments, or vice versa.
A number of exemplary embodiments have been described above. Nevertheless, it should be understood that various modifications may be made. For example, suitable results may be achieved if the described techniques are performed in a different order, and/or if components in a described system, architecture, device, or circuit are combined in a different manner and/or replaced or supplemented by other components or their equivalents. Accordingly, other implementations are within the scope of the following claims.

Claims (34)

1. An apparatus for analyzing an image, the apparatus comprising:
a classifier configured to receive an event signal corresponding to at least one pixel of an event-based vision sensor, and configured to determine a pattern of a pixel group of a plurality of pixels of the event-based vision sensor, wherein the pixel group comprises the at least one pixel and a plurality of neighboring pixels of the event-based vision sensor adjacent to the at least one pixel; and
an analyzer configured to determine at least one of an outline of an object and a movement of the object based on the pattern of the pixel group.
2. The apparatus of claim 1, wherein the event signal indicates an occurrence of an event at a position of the at least one pixel of the event-based vision sensor.
3. The apparatus of claim 1, wherein the classifier determines whether the pixel group corresponds to at least one edge pattern among a plurality of predetermined edge patterns.
4. The apparatus of claim 3, wherein the classifier discards the pixel group in response to determining that the pixel group does not correspond to the at least one edge pattern.
5. The apparatus of claim 1, wherein the classifier comprises:
a type determiner configured to determine pixel types of the plurality of neighboring pixels based on differences between a timestamp of the event signal corresponding to the at least one pixel and timestamps of event signals corresponding to the plurality of neighboring pixels; and
a pattern determiner configured to determine the pattern of the pixel group based on the pixel types of the plurality of neighboring pixels.
6. The apparatus of claim 5, wherein the pixel types of the plurality of neighboring pixels comprise a first pixel type and a second pixel type, wherein a difference between the timestamp of the event signal corresponding to the at least one pixel and a timestamp of an event signal corresponding to a neighboring pixel of the first pixel type is less than a first threshold, and a difference between the timestamp of the event signal corresponding to the at least one pixel and a timestamp of an event signal corresponding to a neighboring pixel of the second pixel type is greater than a second threshold.
7. The apparatus of claim 1, wherein the analyzer comprises:
an outline analyzer configured to determine a direction of an edge of the object corresponding to the pixel group based on the pattern of the pixel group.
8. The apparatus of claim 1, wherein the analyzer comprises:
a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; and
a motion analyzer configured to analyze the movement of the object based on the velocity corresponding to the pixel group.
9. The apparatus of claim 8, wherein the movement of the object comprises at least one of a translational velocity component of the object, a rotational velocity component of the object, and a scaling velocity component of the object.
10. The apparatus of claim 8, wherein the analyzer further comprises:
a selector configured to select, from among a plurality of observation areas, at least one observation area to be used for analyzing the movement of the object, based on the diversity of the patterns of the pixel groups included in the observation areas.
11. The apparatus of claim 1, further comprising:
a processor configured to calculate an amount of change of a relative coordinate of a point of a user input based on the movement of the object, and to process the user input based on the amount of change of the relative coordinate.
12. An apparatus for analyzing an image, the apparatus comprising:
a classifier configured to receive a first event signal corresponding to at least one first pixel of an event-based vision sensor and a second event signal corresponding to at least one second pixel of the event-based vision sensor, to determine a first pattern of a first pixel group of a first plurality of pixels of the event-based vision sensor, and to determine a second pattern of a second pixel group of a second plurality of pixels, wherein the first pixel group comprises the at least one first pixel and a first plurality of neighboring pixels of the event-based vision sensor adjacent to the at least one first pixel, and the second pixel group comprises the at least one second pixel and a second plurality of neighboring pixels adjacent to the at least one second pixel; and
an analyzer configured to detect a first position of an object based on the first pattern, to detect a second position of the object based on the second pattern, and to determine a depth of the object based on the first position and the second position.
13. The apparatus of claim 12, wherein the first event signal indicates an occurrence of a first event at a first position of the first pixel of the event-based vision sensor, and the second event signal indicates an occurrence of a second event at a second position of the second pixel of the event-based vision sensor.
14. The apparatus of claim 12, wherein the first event signal corresponds to a signal in which a movement of the object is detected at a first position, and the second event signal corresponds to a signal in which the movement is detected at a second position spatially separated from the first position.
15. The apparatus of claim 12, wherein the analyzer is configured to calculate a distance between the first position and the second position, to estimate the depth of the object to be shallower when the distance between the first position and the second position increases, and to estimate the depth of the object to be deeper when the distance between the first position and the second position decreases.
16. The apparatus of claim 12, further comprising:
a processor configured to determine an operation pattern corresponding to the depth of the object, and to process a user input based on the operation pattern.
17. A method of analyzing an image, the method comprising:
receiving an event signal corresponding to at least one pixel of an event-based vision sensor;
determining a pattern of a pixel group of a plurality of pixels of the event-based vision sensor, wherein the pixel group comprises the at least one pixel and a plurality of neighboring pixels of the event-based vision sensor adjacent to the at least one pixel; and
determining at least one of an outline of an object and a movement of the object based on the pattern of the pixel group.
18. The method of claim 17, wherein the event signal indicates an occurrence of an event at a position of the at least one pixel of the event-based vision sensor.
19. The method of claim 17, wherein the determining of the pattern of the pixel group comprises:
determining pixel types of the plurality of neighboring pixels based on differences between a timestamp of the event signal corresponding to the at least one pixel and timestamps of event signals corresponding to the plurality of neighboring pixels;
determining whether the pixel group corresponds to at least one edge pattern among a plurality of predetermined edge patterns based on the pixel types of the plurality of neighboring pixels; and
discarding the pixel group in response to determining that the pixel group does not correspond to the at least one edge pattern, and determining the pattern of the pixel group to be the at least one edge pattern in response to determining that the pixel group corresponds to the at least one edge pattern.
20. The method of claim 17, wherein the determining of the at least one of the outline of the object and the movement of the object comprises: determining a direction of an edge corresponding to the pixel group based on the pattern of the pixel group.
21. The method of claim 17, wherein the determining of the at least one of the outline of the object and the movement of the object comprises:
selecting, from among a plurality of observation areas, at least one observation area to be used for analyzing the movement of the object, based on the diversity of the patterns of the pixel groups included in the observation areas;
calculating velocities corresponding to a plurality of pixel groups included in the at least one observation area; and
analyzing the movement of the object based on the velocities corresponding to the plurality of pixel groups.
22. The method of claim 17, further comprising:
calculating an amount of change of a relative coordinate of a point of a user input based on a moving velocity of the object included in the movement of the object; and
processing the user input based on the amount of change of the relative coordinate.
23. A method of analyzing an image, the method comprising:
receiving an input of an event signal corresponding to a pixel of an event-based vision sensor, the event signal indicating a movement of an object;
selecting a plurality of neighboring pixels of the event-based vision sensor adjacent to the pixel corresponding to the event signal;
determining an edge pattern of the plurality of neighboring pixels;
determining a position of an edge of the object based on the edge pattern of the plurality of neighboring pixels; and
analyzing an outline of the object based on the position of the edge of the object.
24. The method of claim 23, wherein the event signal indicates an occurrence of an event at a position of the pixel of the event-based vision sensor.
25. The method of claim 23, wherein the determining of the edge pattern comprises:
determining pixel types of the plurality of neighboring pixels based on differences between timestamps of event signals corresponding to the plurality of neighboring pixels and a timestamp of the event signal corresponding to the pixel; and
determining whether the edge pattern of the plurality of neighboring pixels corresponds to at least one edge pattern among a plurality of edge patterns based on the pixel types of the plurality of neighboring pixels.
26. The method of claim 25, wherein the determining of the edge pattern further comprises:
discarding the pixel and the plurality of neighboring pixels if the edge pattern of the plurality of neighboring pixels does not correspond to the at least one edge pattern.
27. The method of claim 23, further comprising:
analyzing the movement of the object based on the outline of the object.
28. An apparatus for processing a user input, the apparatus comprising:
a recognizer configured to recognize a movement of an object based on an input image, wherein the input image comprises an event signal indicating that the movement of the object is detected at at least one pixel of an event-based vision sensor; and
a processor configured to update, based on the movement of the object, a relative position of the object corresponding to the user input.
29. The apparatus of claim 28, wherein the event signal indicates an occurrence of an event at the at least one pixel of the event-based vision sensor.
30. The apparatus of claim 28, wherein the movement of the object comprises at least one of a translational velocity component of the object, a rotational velocity component of the object, and a scaling velocity component of the object.
31. The apparatus of claim 28, wherein the recognizer comprises:
a classifier configured to determine a pattern of at least one pixel group in the input image, wherein the at least one pixel group comprises the at least one pixel corresponding to the event signal and a plurality of neighboring pixels adjacent to the at least one pixel; and
an analyzer configured to analyze the movement of the object based on the pattern of the at least one pixel group.
32. The apparatus of claim 31, wherein the classifier comprises:
a type determiner configured to determine pixel types of the plurality of neighboring pixels based on differences between a timestamp of the event signal corresponding to the at least one pixel and timestamps of event signals corresponding to the plurality of neighboring pixels; and
a pattern determiner configured to determine the pattern of the pixel group based on the pixel types of the plurality of neighboring pixels.
33. The apparatus of claim 31, wherein the analyzer comprises:
a calculator configured to calculate a velocity corresponding to the pixel group based on the pattern of the pixel group; and
a motion analyzer configured to analyze the movement of the object based on the velocity corresponding to the pixel group.
34. The apparatus of claim 28, wherein the processor is further configured to calculate an amount of change of the relative position of the object based on the movement of the object, to update the relative position of the object based on the amount of change, and to process the user input based on the updated relative position of the object.
CN201410366650.8A 2013-07-29 2014-07-29 Apparatus and method for analyzing image including event information Active CN104346427B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR20130089254 2013-07-29
KR10-2013-0089254 2013-07-29
KR1020130098273A KR102129916B1 (en) 2013-07-29 2013-08-20 Apparatus and method for analyzing an image including event information
KR10-2013-0098273 2013-08-20

Publications (2)

Publication Number Publication Date
CN104346427A true CN104346427A (en) 2015-02-11
CN104346427B CN104346427B (en) 2019-08-30

Family

ID=51228312

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410366650.8A Active CN104346427B (en) Apparatus and method for analyzing image including event information

Country Status (4)

Country Link
US (1) US9767571B2 (en)
EP (1) EP2838069B1 (en)
JP (1) JP6483370B2 (en)
CN (1) CN104346427B (en)

Cited By (11)

Publication number Priority date Publication date Assignee Title
CN106488151A (en) * 2015-09-01 2017-03-08 三星电子株式会社 Sensor based on event and the pixel of the sensor based on event
CN106997453A (en) * 2016-01-22 2017-08-01 三星电子株式会社 Event signal processing method and equipment
CN107018357A (en) * 2016-01-27 2017-08-04 三星电子株式会社 Method and apparatus on the event sampling of the dynamic visual sensor of image formation
CN107750372A (en) * 2015-03-16 2018-03-02 皮埃尔和玛利居里大学(巴黎第六大学) The method that scene three-dimensional (3D) is rebuild
CN110365912A (en) * 2015-04-20 2019-10-22 三星电子株式会社 Imaging unit, system and image sensor cell
WO2019210546A1 (en) * 2018-05-04 2019-11-07 上海芯仑光电科技有限公司 Data processing method and computing device
US20200041258A1 (en) 2015-04-20 2020-02-06 Samsung Electronics Co., Ltd. Cmos image sensor for rgb imaging and depth measurement with laser sheet scan
CN111274834A (en) * 2018-12-04 2020-06-12 西克股份公司 Reading of optical codes
US10883822B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11736832B2 (en) 2015-04-20 2023-08-22 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US11924545B2 (en) 2015-04-20 2024-03-05 Samsung Electronics Co., Ltd. Concurrent RGBZ sensor and system

Families Citing this family (16)

Publication number Priority date Publication date Assignee Title
KR102298652B1 (en) 2015-01-27 2021-09-06 삼성전자주식회사 Method and apparatus for determining disparty
KR102307055B1 (en) 2015-04-28 2021-10-01 삼성전자주식회사 Method and apparatus for extracting static pattern based on output of event-based sensor
EP3113108B1 (en) * 2015-07-02 2020-03-11 Continental Automotive GmbH Detection of lens contamination using expected edge trajectories
US10269131B2 (en) 2015-09-10 2019-04-23 Kabushiki Kaisha Toshiba Image processing apparatus and image processing method
KR102465212B1 (en) * 2015-10-30 2022-11-10 삼성전자주식회사 Photographing apparatus using multiple exposure sensor and photographing method thereof
US9934557B2 (en) 2016-03-22 2018-04-03 Samsung Electronics Co., Ltd Method and apparatus of image representation and processing for dynamic vision sensor
CN108574793B (en) 2017-03-08 2022-05-10 三星电子株式会社 Image processing apparatus configured to regenerate time stamp and electronic apparatus including the same
WO2019137973A1 (en) * 2018-01-11 2019-07-18 Gensight Biologics Method and device for processing asynchronous signals generated by an event-based light sensor
CN109919957B (en) * 2019-01-08 2020-11-27 同济大学 Corner detection method based on dynamic vision sensor
EP3690736A1 (en) * 2019-01-30 2020-08-05 Prophesee Method of processing information from an event-based sensor
JP7120180B2 (en) * 2019-08-07 2022-08-17 トヨタ自動車株式会社 image sensor
JP7264028B2 (en) * 2019-12-05 2023-04-25 トヨタ自動車株式会社 Information providing system, information providing method, information terminal and information display method
CN111770245B (en) * 2020-07-29 2021-05-25 中国科学院长春光学精密机械与物理研究所 Pixel structure of retina-like image sensor
WO2022254836A1 (en) * 2021-06-03 2022-12-08 ソニーグループ株式会社 Information processing device, information processing system, and information processing method
US20230013877A1 (en) * 2021-07-06 2023-01-19 Samsung Electronics Co., Ltd. Method of determining visual interference using a weighted combination of cis and dvs measurement
US20230026592A1 (en) * 2021-07-21 2023-01-26 Sony Group Corporation Image sensor control circuitry and image sensor control method

Citations (3)

Publication number Priority date Publication date Assignee Title
CN101305398A (en) * 2005-10-12 2008-11-12 有源光学有限公司 Method for forming synthesis image based on a plurality of image frames
CN102177524A (en) * 2008-08-08 2011-09-07 实耐宝公司 Image-based inventory control system using advanced image recognition
CN102271253A (en) * 2010-06-07 2011-12-07 索尼公司 Image processing method using motion estimation and image processing apparatus

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
WO2005107240A1 (en) * 2004-04-28 2005-11-10 Chuo Electronics Co., Ltd. Automatic imaging method and apparatus
JP4140588B2 (en) 2004-09-01 2008-08-27 日産自動車株式会社 Moving object detection device
US7403866B2 (en) * 2004-10-06 2008-07-22 Telefonaktiebolaget L M Ericsson (Publ) High-resolution, timer-efficient sliding window
JP4650079B2 (en) * 2004-11-30 2011-03-16 日産自動車株式会社 Object detection apparatus and method
US20060197664A1 (en) 2005-01-18 2006-09-07 Board Of Regents, The University Of Texas System Method, system and apparatus for a time stamped visual motion sensor
US7613322B2 (en) * 2005-05-19 2009-11-03 Objectvideo, Inc. Periodic motion detection with applications to multi-grabbing
KR101331982B1 (en) 2005-06-03 2013-11-25 우니페르지타에트 취리히 Photoarray for detecting time­dependent image data
KR100762670B1 (en) 2006-06-07 2007-10-01 삼성전자주식회사 Method and device for generating disparity map from stereo image and stereo matching method and device therefor
CN102016916B (en) * 2008-04-04 2014-08-13 先进微装置公司 Filtering method and apparatus for anti-aliasing
US20100079413A1 (en) * 2008-09-29 2010-04-01 Denso Corporation Control device
US20100118199A1 (en) * 2008-11-10 2010-05-13 Kabushiki Kaisha Toshiba Video/Audio Processor and Video/Audio Processing Method
JP5376906B2 (en) 2008-11-11 2013-12-25 パナソニック株式会社 Feature amount extraction device, object identification device, and feature amount extraction method
US8286102B1 (en) * 2010-05-27 2012-10-09 Adobe Systems Incorporated System and method for image processing using multi-touch gestures
US8698092B2 (en) * 2010-09-10 2014-04-15 Samsung Electronics Co., Ltd. Method and apparatus for motion recognition
KR101779564B1 (en) 2010-09-10 2017-09-20 삼성전자주식회사 Method and Apparatus for Motion Recognition
JP5624702B2 (en) * 2010-11-16 2014-11-12 日本放送協会 Image feature amount calculation apparatus and image feature amount calculation program
JP5645699B2 (en) * 2011-02-16 2014-12-24 三菱電機株式会社 Motion detection device and method, video signal processing device and method, and video display device
KR101880998B1 (en) * 2011-10-14 2018-07-24 삼성전자주식회사 Apparatus and Method for motion recognition with event base vision sensor
EP2677500B1 (en) * 2012-06-19 2021-06-23 Samsung Electronics Co., Ltd. Event-based image processing apparatus and method

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
CN101305398A (en) * 2005-10-12 2008-11-12 有源光学有限公司 Method for forming synthesis image based on a plurality of image frames
CN102177524A (en) * 2008-08-08 2011-09-07 实耐宝公司 Image-based inventory control system using advanced image recognition
CN102271253A (en) * 2010-06-07 2011-12-07 索尼公司 Image processing method using motion estimation and image processing apparatus

Non-Patent Citations (1)

Title
JÜRGEN KOGLER et al.: "Event-based Stereo Matching Approaches for Frameless Address Event Stereo Data", ADVANCES IN VISUAL COMPUTING *

Cited By (23)

Publication number Priority date Publication date Assignee Title
CN107750372A (en) * 2015-03-16 2018-03-02 皮埃尔和玛利居里大学(巴黎第六大学) The method that scene three-dimensional (3D) is rebuild
CN107750372B (en) * 2015-03-16 2021-12-10 皮埃尔和玛利居里大学(巴黎第六大学) Method and device for three-dimensional reconstruction of scene and computer readable medium
US11431938B2 (en) 2015-04-20 2022-08-30 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US10883822B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
CN110365912A (en) * 2015-04-20 2019-10-22 三星电子株式会社 Imaging unit, system and image sensor cell
US11725933B2 (en) 2015-04-20 2023-08-15 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US20200041258A1 (en) 2015-04-20 2020-02-06 Samsung Electronics Co., Ltd. Cmos image sensor for rgb imaging and depth measurement with laser sheet scan
US11650044B2 (en) 2015-04-20 2023-05-16 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11378390B2 (en) 2015-04-20 2022-07-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US11736832B2 (en) 2015-04-20 2023-08-22 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US10883821B2 (en) 2015-04-20 2021-01-05 Samsung Electronics Co., Ltd. CMOS image sensor for 2D imaging and depth measurement with ambient light rejection
US10893227B2 (en) 2015-04-20 2021-01-12 Samsung Electronics Co., Ltd. Timestamp calibration of the 3D camera with epipolar line laser point scanning
US11002531B2 (en) 2015-04-20 2021-05-11 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US11131542B2 (en) 2015-04-20 2021-09-28 Samsung Electronics Co., Ltd. CMOS image sensor for RGB imaging and depth measurement with laser sheet scan
US11924545B2 (en) 2015-04-20 2024-03-05 Samsung Electronics Co., Ltd. Concurrent RGBZ sensor and system
CN106488151A (en) * 2015-09-01 2017-03-08 三星电子株式会社 Sensor based on event and the pixel of the sensor based on event
CN106997453B (en) * 2016-01-22 2022-01-28 三星电子株式会社 Event signal processing method and device
CN106997453A (en) * 2016-01-22 2017-08-01 三星电子株式会社 Event signal processing method and equipment
CN107018357B (en) * 2016-01-27 2020-07-14 三星电子株式会社 Method and apparatus for event sampling for dynamic vision sensor with respect to image formation
CN107018357A (en) * 2016-01-27 2017-08-04 三星电子株式会社 Method and apparatus on the event sampling of the dynamic visual sensor of image formation
US11481908B2 (en) 2018-05-04 2022-10-25 Omnivision Sensor Solution (Shanghai) Co., Ltd Data processing method and computing device
WO2019210546A1 (en) * 2018-05-04 2019-11-07 上海芯仑光电科技有限公司 Data processing method and computing device
CN111274834A (en) * 2018-12-04 2020-06-12 西克股份公司 Reading of optical codes

Also Published As

Publication number Publication date
EP2838069A2 (en) 2015-02-18
EP2838069A3 (en) 2015-10-07
US20150030204A1 (en) 2015-01-29
US9767571B2 (en) 2017-09-19
JP2015028780A (en) 2015-02-12
CN104346427B (en) 2019-08-30
JP6483370B2 (en) 2019-03-13
EP2838069B1 (en) 2017-12-27

Similar Documents

Publication Publication Date Title
CN104346427A (en) Apparatus and method for analyzing image including event information
JP5952001B2 (en) Camera motion estimation method and apparatus using depth information, augmented reality system
EP3258445B1 (en) Augmented reality occlusion
US20240331413A1 (en) Associating two dimensional label data with three-dimensional point cloud data
US11842514B1 (en) Determining a pose of an object from rgb-d images
US20150145959A1 (en) Use of Spatially Structured Light for Dynamic Three Dimensional Reconstruction and Reality Augmentation
Gupta et al. Real-time stereo matching using adaptive binary window
US20150098623A1 (en) Image processing apparatus and method
Häne et al. Stereo depth map fusion for robot navigation
US8634637B2 (en) Method and apparatus for reducing the memory requirement for determining disparity values for at least two stereoscopically recorded images
KR102359230B1 (en) Method and apparatus for providing virtual room
US20210124960A1 (en) Object recognition method and object recognition device performing the same
CN103916654A (en) Method Of Obtaining Depth Information And Display Apparatus
US9595125B2 (en) Expanding a digital representation of a physical plane
US20160232705A1 (en) Method for 3D Scene Reconstruction with Cross-Constrained Line Matching
KR20160009487A (en) Device and Method of 3D Image Display
CN108090953B (en) Region-of-interest reconstruction method, system and computer-readable storage medium
KR102129916B1 (en) Apparatus and method for analyzing an image including event information
KR20170047780A (en) Low-cost calculation apparatus using the adaptive window mask and method therefor
CN116912417A (en) Texture mapping method, device, equipment and storage medium based on three-dimensional reconstruction of human face
Karbasi et al. Analysis and enhancement of the denoising depth data using Kinect through iterative technique
CN113126944B (en) Depth map display method, display device, electronic device, and storage medium
Dong et al. Resolving incorrect visual occlusion in outdoor augmented reality using TOF camera and OpenGL frame buffer
JP6060612B2 (en) Moving surface situation recognition device, moving object, and program
EP2975850A1 (en) Method for correcting motion estimation between at least two frames of a video sequence, corresponding device, computer program and non-transitory computer-readable medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant