WO2022264405A1 - Signal processing device, signal processing method, and program - Google Patents


Info

Publication number
WO2022264405A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
sensor
output
signal processing
polarity
Prior art date
Application number
PCT/JP2021/023211
Other languages
French (fr)
Japanese (ja)
Inventor
Hiromasa Naganuma (長沼 宏昌)
Original Assignee
Sony Interactive Entertainment Inc. (株式会社ソニー・インタラクティブエンタテインメント)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Interactive Entertainment Inc.
Priority to PCT/JP2021/023211 priority Critical patent/WO2022264405A1/en
Publication of WO2022264405A1 publication Critical patent/WO2022264405A1/en

Definitions

  • The present invention relates to a signal processing device, a signal processing method, and a program.
  • An event-driven vision sensor is known in which pixels that detect a change in the intensity of incident light generate signals asynchronously in time.
  • An event-driven vision sensor is advantageous in that it can operate at lower power and higher speed than a frame-based vision sensor that scans all pixels at predetermined intervals, such as a CCD or CMOS image sensor. Techniques related to such event-driven vision sensors are described in, for example, Patent Document 1 and Patent Document 2.
  • An object of the present invention is to provide a signal processing device, a signal processing method, and a program capable of effectively correcting afterimages, noise, and the like in the output of an event-driven vision sensor.
  • According to one aspect of the present invention, there is provided a signal processing device including a data correction unit that corrects at least one event signal among a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including the position information of the sensor within the sensor array, the polarity of a light intensity change, and a time stamp, based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
  • According to another aspect of the present invention, there is provided a signal processing method including a step of correcting at least one event signal among a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including the position information of the sensor within the sensor array, the polarity of a light intensity change, and a time stamp, based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
  • According to yet another aspect of the present invention, there is provided a program for causing a computer to implement a function of correcting at least one event signal among a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including the position information of the sensor within the sensor array, the polarity of a light intensity change, and a time stamp, based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
  • FIG. 1 is a diagram showing a schematic configuration of a system according to one embodiment of the present invention.
  • FIG. 2 is a diagram showing an example of a correction value table in one embodiment of the present invention.
  • FIG. 3 is a flowchart showing an example of processing for correcting an event signal in one embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of a method of creating the correction value table in one embodiment of the present invention.
  • FIG. 5 is a flowchart showing an example of processing for creating the correction value table in one embodiment of the present invention.
  • FIG. 6 is a diagram for explaining a specific example of a method of correcting an event signal.
  • FIG. 7 is a diagram for explaining another specific example of a method of correcting an event signal.
  • FIG. 1 is a diagram showing a schematic configuration of a system according to one embodiment of the present invention.
  • In the illustrated example, the system 10 includes a vision sensor 100, which is an event-driven sensor (EDS), and a signal processing device 200.
  • The vision sensor 100 includes a sensor array 120 made up of a plurality of sensors 110 corresponding to pixels of an image, and a processing circuit 130 connected to the sensor array 120.
  • Each sensor 110 includes a light-receiving element and generates an event signal upon detecting a change in the intensity of incident light, more specifically a change in luminance.
  • In the sensor array 120, the sensors 110 are arranged in two orthogonal directions (illustrated as the x direction and the y direction). Event signals are read out from the sensors 110 according to addresses that an address generator included in the processing circuit 130 generates for the x and y directions, respectively.
  • If the sensor 110 at a given address has not detected an event, no event signal is read out when that address is scanned, and reading of the next address is executed.
  • Therefore, the event signals output from the vision sensor 100 are asynchronous in time.
  • The event signal (X, Y, P, t) output from the vision sensor 100 includes the position information X, Y of the sensor 110 within the sensor array 120, the polarity P of the luminance change (p: positive or n: negative), and a time stamp t.
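As a concrete illustration, an event signal of the (X, Y, P, t) form above can be modeled as a small tuple type. This is a hypothetical sketch for readers; the class name, field names, and values are illustrative and do not come from the patent.

```python
from typing import NamedTuple

class EventSignal(NamedTuple):
    """One event: sensor position (x, y), polarity, and time stamp.

    Polarity is 'p' (positive: luminance increased) or 'n' (negative:
    luminance decreased), mirroring the (X, Y, P, t) tuple in the text.
    """
    x: int
    y: int
    p: str    # 'p' or 'n'
    t: float  # time stamp

# Events arrive asynchronously, so a stream is simply a list of such tuples.
events = [
    EventSignal(10, 20, "p", 0.0010),
    EventSignal(11, 20, "p", 0.0012),
    EventSignal(10, 20, "n", 0.0040),
]

# All events for one pixel, in time order:
pixel_events = sorted(
    (e for e in events if (e.x, e.y) == (10, 20)), key=lambda e: e.t
)
```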
  • The signal processing device 200 includes a communication interface 210, a buffer memory 220, a processing circuit 230, and a storage unit 240.
  • The communication interface 210 receives event signals from the processing circuit 130 of the vision sensor 100.
  • The received event signals are temporarily stored in the buffer memory 220.
  • The processing circuit 230 operates, for example, according to a program stored in the storage unit 240 and processes the event signals read from the buffer memory 220.
  • In this embodiment, the processing circuit 230 includes a data correction unit 231 that corrects event signals as described below. Based on the event signals corrected by the data correction unit 231, the processing circuit 230 generates, in time series, images mapping the positions where luminance changes occurred, and stores them temporarily or persistently in the storage unit 240.
  • Alternatively, the processing circuit 230 may transmit the event signals to yet another device via the communication interface 210.
  • The processing circuit 230 may further include an illumination intensity detection unit 232.
  • The storage unit 240 stores a correction value table 241 that is referenced by the data correction unit 231. The configuration of each unit is described further below.
  • The data correction unit 231 corrects the event signals temporarily stored in the buffer memory 220 based on at least one of a temporal pattern and a spatial pattern of event signals defined in the correction value table 241. More specifically, for a series of event signals in which at least one of the position information and the time stamps satisfies a predetermined relationship, the data correction unit 231 corrects at least one event signal in the series when the temporal pattern specified by the time stamps of the series, or the spatial pattern specified by the position information of the series, matches a pattern defined in the correction value table 241.
  • For example, the data correction unit 231 may correct at least one event signal based on a temporal pattern specified by the time stamps of a series of event signals whose position information satisfies a predetermined relationship with the position information of the at least one event signal to be processed. More specifically, when there are a plurality of event signals output by sensors 110 with the same position information X, Y, the data correction unit 231 corrects the event signals based on the temporal pattern specified by their time stamps t and polarities P.
  • When the temporal pattern specified by the time stamps t matches a pattern defined in the correction value table 241, the data correction unit 231 may invalidate the output of a specific sensor 110 in the event signals for a predetermined period of time.
  • An event signal whose output has been invalidated is not displayed, for example, in the image mapping the positions where luminance changes occurred, and is not transmitted to another device.
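A minimal sketch of this kind of invalidation, assuming events are plain (x, y, p, t) tuples (an illustrative encoding, not the patent's implementation): events from the flagged pixel are dropped for a hold-off window, so they are neither rendered nor forwarded.

```python
def invalidate_for_period(events, pixel, t_start, duration):
    """Drop events from `pixel` whose time stamps fall in [t_start, t_start + duration).

    `events` is a list of (x, y, p, t) tuples; invalidated events are simply
    removed from the stream.
    """
    x0, y0 = pixel
    return [
        (x, y, p, t)
        for (x, y, p, t) in events
        if not ((x, y) == (x0, y0) and t_start <= t < t_start + duration)
    ]

events = [(5, 5, "p", 1.0), (5, 5, "n", 1.2), (5, 5, "p", 1.4), (6, 5, "p", 1.2)]
# Invalidate pixel (5, 5) for 0.5 time units starting just after its first event.
cleaned = invalidate_for_period(events, (5, 5), 1.1, 0.5)
```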
  • Also, for example, the data correction unit 231 may correct at least one event signal based on a spatial pattern specified by the position information of a series of event signals whose time stamps satisfy a predetermined relationship with the time stamp of the at least one event signal to be processed. More specifically, when there are a plurality of event signals with time stamps t close to one another, the data correction unit 231 corrects the event signals based on the spatial pattern specified by their position information X, Y and polarities P. When the spatial pattern specified by the position information X, Y matches a pattern defined in the correction value table 241, the data correction unit 231 may synchronize the output of a specific sensor 110 in the event signals with the output of an adjacent sensor 110.
  • Here, synchronizing the outputs of a plurality of sensors 110 may mean, for example, changing the polarity of an event signal output from one sensor 110 to match the polarity of an event signal output from another sensor 110, or changing the time stamp of an event signal output from one sensor 110 to align with the time stamp of an event signal output from another sensor 110.
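The two synchronization variants just described can be sketched as follows. The tuple layout (x, y, p, t), function name, and mode strings are illustrative assumptions, not the patent's interface.

```python
def sync_to_neighbor(event, neighbor, mode):
    """Align `event` with `neighbor` as described in the text.

    Events are (x, y, p, t) tuples. mode='polarity' copies the neighbor's
    polarity onto the event; mode='timestamp' copies the neighbor's time stamp.
    """
    x, y, p, t = event
    _, _, n_p, n_t = neighbor
    if mode == "polarity":
        return (x, y, n_p, t)
    return (x, y, p, n_t)

lagging = (3, 4, "n", 2.00)   # out-of-sync sensor output
neighbor = (3, 5, "p", 1.95)  # adjacent sensor's output
```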
  • FIG. 2 is a diagram showing an example of a correction value table in one embodiment of the invention.
  • In the correction value table 241, temporal patterns and spatial patterns are defined for each position information X, Y of the sensors 110 and for each event signal polarity P (p, n). In this case, the temporal pattern and the spatial pattern differ according to the position information and the polarity of the at least one event signal to be corrected.
  • In the illustrated example, the records of the correction value table 241 are stored using the position information X, Y and the polarity P of the sensor 110 as keys; for example, for the sensor 110 with position information (X, Y) = (x1, y1), patterns are defined for polarity P = p and for polarity P = n. Each sensor 110 constituting the sensor array 120 has different characteristics, and the characteristics may differ even between detection of the p and n polarities, so defining patterns in this way makes it possible to correct event signals more appropriately according to the characteristics of each individual sensor 110 and each polarity. As shown, multiple patterns (pattern 1 through pattern n) may be defined for the same position information and polarity.
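The structure of such a table can be sketched as a dictionary keyed by (x, y, polarity), as in FIG. 2. The pattern and correction-method encodings below are invented for illustration; the patent does not specify a data format.

```python
# Hypothetical correction value table keyed by (x, y, polarity).
# Each key maps to a list of pattern entries (pattern 1 .. pattern n).
correction_table = {
    (1, 1, "p"): [
        {"type": "temporal", "pattern": ("p", "n", "p"), "fix": "keep_first_p"},
        {"type": "spatial", "pattern": "8_of_9_p", "fix": "fill_ninth_p"},
    ],
    (1, 1, "n"): [
        {"type": "temporal", "pattern": ("n", "p", "n"), "fix": "keep_first_n"},
    ],
}

def lookup_patterns(table, x, y, polarity):
    """Return the pattern definitions for one pixel and polarity, or [] if none."""
    return table.get((x, y, polarity), [])
```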
  • Such correction prevents afterimages caused by a sensor 110 repeatedly detecting a luminance change multiple times when, in reality, the luminance changed only once.
  • Here, the period T1 functions as a threshold for determining whether the repeated output is due to the characteristics of the sensor or whether luminance changes actually occurred multiple times.
  • In that case, the time stamp of the event signal is rewritten according to the time stamp of the event signal of the other sensor 110.
  • Such correction can be useful for preventing the output of a sensor 110 that fails to detect a luminance change that actually occurred, even though the change was detected by an adjacent sensor, from becoming noise in the output of the vision sensor 100.
  • The illumination intensity detection unit 232 detects the illumination intensity in the environment of the sensor array 120 using an illuminance sensor 140 arranged near the sensor array 120. When the illumination intensity detection unit 232 is present, the data correction unit 231 corrects the output of the vision sensor 100 based on a correction value table 241 defined for each illumination intensity detected by the illumination intensity detection unit 232. Specifically, for example, a plurality of correction value tables 241 may be prepared for respective illumination intensity ranges, or the records of the correction value table 241 as shown in FIG. 2 may be stored using keys (X, Y, P, L) to which an illumination intensity range L is added.
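A sketch of the illumination-extended key (X, Y, P, L) described above; the bin boundaries and pattern names are invented for illustration.

```python
def illumination_range(lux, bins=((0, 100), (100, 1000), (1000, float("inf")))):
    """Map a measured illuminance to a coarse range index L (bins are illustrative)."""
    for i, (lo, hi) in enumerate(bins):
        if lo <= lux < hi:
            return i
    raise ValueError("illuminance out of range")

# Records keyed by (x, y, polarity, L), as suggested in the text.
table = {
    (1, 1, "p", 0): "pattern_for_dim_light",
    (1, 1, "p", 2): "pattern_for_bright_light",
}
selected = table[(1, 1, "p", illumination_range(5000))]
```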
  • FIG. 3 is a flow chart showing an example of processing for correcting an event signal in one embodiment of the present invention.
  • When an event signal is received (step S101), the data correction unit 231 searches the correction value table 241 using the position information (X, Y) and the polarity P of the received event signal as a key (step S102). If there is a record in the correction value table 241 that matches the key, that is, a pattern definition for the position information (X, Y) and the polarity P of the received event signal (step S103), the data correction unit 231 determines, based on the pattern, whether the received event signal needs to be corrected.
  • Specifically, the data correction unit 231 reads from the buffer memory the event signals related to the pattern defined in the correction value table 241 (step S104). For example, if a temporal pattern is defined for the position information (X, Y) and the polarity P of the event signal, the data correction unit 231 reads from the buffer memory the time-series event signals acquired by the sensor with the same position information (X, Y). If a spatial pattern is defined for the position information (X, Y) and the polarity P, the data correction unit 231 reads from the buffer memory the event signals acquired by sensors whose position information has a predetermined relationship with (X, Y), for example adjacent position information.
  • Next, the data correction unit 231 determines whether the event signal received in step S101 and the event signals additionally read in step S104 match the pattern defined in the correction value table 241 (step S105). If the event signals match the defined pattern, the data correction unit 231 corrects the event signals according to the correction method associated with that pattern in the correction value table 241 (step S106). Specifically, for example, the data correction unit 231 performs the correction by invalidating a specific event signal or by rewriting an event signal as described above.
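The flow of steps S101 through S106 can be sketched as follows, with pattern matching and correction reduced to callables. The table layout and the example temporal pattern are illustrative assumptions; only the control flow follows the flowchart.

```python
def correct_event(event, table, buffer):
    """Sketch of steps S101-S106 for one received (x, y, p, t) event."""
    x, y, p, _ = event                                # S101: event received
    entries = table.get((x, y, p))                    # S102: search table by (X, Y, P) key
    if not entries:                                   # S103: no pattern defined
        return [event]
    for entry in entries:
        related = entry["read"](buffer, event)        # S104: read related events from buffer
        if entry["matches"](event, related):          # S105: does the pattern apply?
            return entry["correct"](event, related)   # S106: apply the associated correction
    return [event]

# Hypothetical temporal pattern: a second 'p' from the same pixel within 0.5
# time units of an earlier 'p' is treated as an afterimage and invalidated.
table = {(2, 2, "p"): [{
    "read": lambda buf, e: [b for b in buf if b[:2] == e[:2]],
    "matches": lambda e, rel: any(b[2] == "p" and e[3] - b[3] < 0.5 for b in rel),
    "correct": lambda e, rel: [],  # invalidate the repeated event
}]}
buffer = [(2, 2, "p", 1.0)]
corrected = correct_event((2, 2, "p", 1.2), table, buffer)
```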
  • FIG. 4 is a diagram showing an example of a method of creating a correction value table according to one embodiment of the present invention.
  • The correction value table 241 described in the above example is created, for example, as part of the calibration procedure of the vision sensor 100.
  • In the illustrated example, the vision sensor 100 is attached to a two-axis camera platform 501, and the vision sensor 100 is moved using the camera platform 501 while a test pattern 502 is at least partially arranged within the angle of view of the vision sensor 100.
  • The reflectance in each area of the test pattern 502, the positional relationship between the vision sensor 100 and the test pattern 502, and the movement speed of the vision sensor 100 on the camera platform 501 are known, and ideal output values 521 of the vision sensor calculated from these are stored in the storage unit 520 of a computing device 500.
  • The correction value determination unit 510 of the computing device 500 compares the event signals received from the vision sensor 100 with the ideal output values 521, thereby identifying temporal or spatial patterns of the event signals and the correction methods corresponding to those patterns, and stores them in the storage unit 520 as the correction value table 241.
  • If an illumination 503 with adjustable intensity is arranged when the correction value table 241 is created, and an illuminance sensor 140 similar to that shown in FIG. 1 is used, correction value tables 241 that differ according to the illumination intensity can be created.
  • FIG. 5 is a flowchart showing an example of processing for creating a correction value table in one embodiment of the present invention.
  • In the illustrated example, the correction value determination unit 510 of the computing device 500 compares, for each of the sensors 110 constituting the sensor array 120 of the vision sensor 100, the ideal output values with the event signals actually generated (step S201), and determines whether correction is necessary (step S202). For example, correction is determined to be necessary if the detection of a luminance change is delayed by a predetermined time or more from the ideal timing, if a luminance change that should have been detected is not detected, or if a luminance change that should not have been detected is detected.
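The decision in step S202 can be sketched as a comparison of ideal and actual detection time stamps per sensor. The delay threshold and the use of None to encode "no event" are illustrative assumptions.

```python
def needs_correction(ideal_t, actual_t, max_delay=0.01):
    """S202: correction is needed when a luminance change is detected too late,
    missed entirely, or detected when it should not have been.

    `ideal_t` and `actual_t` are time stamps, or None when no event should
    occur / no event was generated.
    """
    if ideal_t is None and actual_t is None:
        return False                       # nothing expected, nothing seen
    if ideal_t is None or actual_t is None:
        return True                        # spurious or missed detection
    return (actual_t - ideal_t) >= max_delay  # detected too late
```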
  • If correction is necessary, the correction value determination unit 510 recognizes the temporal or spatial pattern of the target event signals (step S203). If a temporal or spatial pattern can be recognized for the event signals requiring correction (step S204), the correction value determination unit 510 records the recognized pattern, together with a correction method for bringing the event signals closer to the ideal output values, in the correction value table 241 (step S205).
  • FIGS. 6 and 7 are diagrams for explaining specific examples of the event signal correction method.
  • In the example shown in FIG. 6, sensor A outputs an event signal (x1, y1, p, t1) at time t1 after time t0 and outputs no event signal thereafter. In contrast, sensor B outputs an event signal (x1, y1, p, t1) at time t1, then an event signal (x1, y1, n, t2) at time t2, and an event signal (x1, y1, p, t3) at time t3. In this case, the repeated output in which the polarity P appears in the order p, n, p is identified as the temporal pattern to be corrected, and the correction method may be defined as "invalidate all outputs other than the first p output".
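The FIG. 6 correction can be sketched as follows; the tuple layout and helper name are illustrative, and the three-element prefix check is a simplification of the pattern match.

```python
def suppress_afterimage(pixel_events):
    """If one pixel's outputs alternate p, n, p for what was really a single
    luminance change, keep only the first 'p' and invalidate the rest.

    `pixel_events` is a time-ordered list of (x, y, polarity, t) tuples
    from a single pixel.
    """
    polarities = [e[2] for e in pixel_events]
    if polarities[:3] == ["p", "n", "p"]:
        return pixel_events[:1]  # keep only the first p output
    return pixel_events

# Sensor B's repetitive output from the example above:
sensor_b = [(1, 1, "p", 0.10), (1, 1, "n", 0.15), (1, 1, "p", 0.20)]
```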
  • FIG. 7 shows the event signals of nine adjacent sensors 110 included in a region on the sensor array 120 where a luminance change occurs due to the movement of the test pattern 502 shown in FIG. 4. In this case, the spatial pattern may be defined as "p is output from only eight of the nine adjacent pixels", and the correction method may be defined as "make all nine adjacent pixels output p".
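The FIG. 7 correction can likewise be sketched. The 3x3 patch geometry follows the text; the tuple layout, function name, and the choice to reuse the neighbors' time stamp for the filled-in event are illustrative assumptions.

```python
def fill_nine_patch(events, center, t):
    """If 'p' was output by only eight of the nine pixels in the 3x3 patch
    around `center`, add a matching 'p' event for the missing pixel.

    `events` holds (x, y, polarity, t) tuples observed near time t.
    """
    cx, cy = center
    patch = {(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)}
    fired = {(x, y) for (x, y, p, _) in events if p == "p" and (x, y) in patch}
    missing = patch - fired
    if len(missing) == 1:           # exactly the 8-of-9 spatial pattern
        (mx, my), = missing
        return events + [(mx, my, "p", t)]
    return events

# Eight of the nine pixels around (5, 5) fired; (4, 4) did not.
evs = [(x, y, "p", 0.2) for x in (4, 5, 6) for y in (4, 5, 6) if (x, y) != (4, 4)]
filled = fill_nine_patch(evs, (5, 5), 0.2)
```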
  • In the embodiment of the present invention described above, a temporal or spatial pattern is defined for each sensor's position information and polarity, and event signals are corrected when there is output that matches a defined pattern.
  • This makes it possible, for example, to invalidate repeated outputs whose polarities do not correspond to actual luminance changes, or to synchronize the output of a sensor that is out of step with adjacent sensors to those adjacent sensors.
  • By defining a pattern for each sensor's position information and polarity through an actual measurement procedure as shown in FIG. 4, it is possible to effectively correct afterimages, noise, and the like in the output of the vision sensor in accordance with the characteristics of each individual sensor.

Landscapes

  • Photometry And Measurement Of Optical Pulse Characteristics (AREA)

Abstract

Provided is a signal processing device comprising a data correction unit that corrects at least one event signal among a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including the position information of the sensor in the sensor array, the polarity of a light intensity change, and a time stamp, on the basis of at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.

Description

SIGNAL PROCESSING DEVICE, SIGNAL PROCESSING METHOD, AND PROGRAM
Patent Document 1: Japanese Patent Publication No. 2014-535098. Patent Document 2: JP 2018-85725 A.
Here, in an event-driven vision sensor, afterimages, noise, and the like that do not correspond to actual changes in light intensity may occur, for example, due to differences in the characteristics of the individual sensors. However, no technique for correcting such afterimages and noise has yet been proposed.
Preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings. In the present specification and drawings, components having substantially the same functional configuration are denoted by the same reference numerals, and redundant description is omitted.
 図1は、本発明の一実施形態に係るシステムの概略的な構成を示す図である。図示された例において、システム10は、イベント駆動型のセンサ(EDS:Event Driven Sensor)であるビジョンセンサ100と、信号処理装置200とを含む。 FIG. 1 is a diagram showing a schematic configuration of a system according to one embodiment of the present invention. In the illustrated example, the system 10 includes a vision sensor 100 which is an Event Driven Sensor (EDS) and a signal processor 200 .
 ビジョンセンサ100は、画像のピクセルに対応する複数のセンサ110で構成されるセンサアレイ120と、センサアレイ120に接続される処理回路130とを含む。センサ110は、受光素子を含み、入射する光の強度変化、より具体的には輝度変化を検出したときにイベント信号を生成する。センサアレイ120において、センサ110は直交する2方向(x方向およびy方向として図示する)に配列されている。例えば、処理回路130に含まれるアドレス発生装置がそれぞれx方向およびy方向について発生させるアドレスに従ってセンサ110からイベント信号が読み出される。ここで、あるアドレスのセンサ110がイベントを検出しなかった場合には読み出しが実行されてもイベント信号は読み出されず、次のアドレスの読み出しが実行される。従って、ビジョンセンサ100から出力されるイベント信号は時間非同期的である。ビジョンセンサ100から出力されるイベント信号(X,Y,P,t)は、センサアレイ120内におけるセンサ110の位置情報X,Yと、輝度変化の極性P(p:positiveまたはn:negative)と、タイムスタンプtとを含む。 Vision sensor 100 includes a sensor array 120 made up of a plurality of sensors 110 corresponding to pixels of an image, and processing circuitry 130 coupled to sensor array 120 . The sensor 110 includes a light-receiving element and generates an event signal upon detecting a change in the intensity of incident light, more specifically a change in luminance. In sensor array 120, sensors 110 are arranged in two orthogonal directions (illustrated as x-direction and y-direction). For example, event signals are read out from sensor 110 according to addresses generated by an address generator included in processing circuitry 130 for the x and y directions, respectively. Here, if the sensor 110 at a certain address does not detect an event, the event signal is not read even if reading is executed, and reading of the next address is executed. Therefore, the event signals output from the vision sensor 100 are time asynchronous. The event signal (X, Y, P, t) output from the vision sensor 100 includes the positional information X, Y of the sensor 110 in the sensor array 120 and the polarity P (p: positive or n: negative) of the luminance change. , and timestamp t.
 信号処理装置200は、通信インターフェース210と、バッファメモリ220と、処理回路230と、記憶部240とを含む。通信インターフェース210は、ビジョンセンサ100の処理回路130からイベント信号を受信する。受信されたイベント信号は、バッファメモリ220に一時的に格納される。処理回路230は、例えば記憶部240に格納されたプログラムに従って動作し、バッファメモリ220から読み出されたイベント信号を処理する。本実施形態において、処理回路230は、以下で説明するようにイベント信号を補正するデータ補正部231を含む。処理回路230は、データ補正部231によって補正されたイベント信号に基づいて、例えば輝度変化が発生した位置をマッピングした画像を時系列で生成し、記憶部240に一時的または持続的に格納する。あるいは、処理回路230は、通信インターフェース210を介してイベント信号をさらに別の装置に送信してもよい。また、処理回路230は、照明強度検出部232をさらに含んでもよい。記憶部240には、データ補正部231によって参照される補正値テーブル241が格納される。以下、各部の構成についてさらに説明する。 The signal processing device 200 includes a communication interface 210, a buffer memory 220, a processing circuit 230, and a storage section 240. Communication interface 210 receives event signals from processing circuitry 130 of vision sensor 100 . The received event signal is temporarily stored in buffer memory 220 . The processing circuit 230 operates, for example, according to a program stored in the storage section 240 and processes event signals read from the buffer memory 220 . In this embodiment, the processing circuitry 230 includes a data corrector 231 that corrects the event signal as described below. Based on the event signal corrected by the data correction unit 231, the processing circuit 230 generates, in time series, an image mapping the position where the luminance change occurs, and temporarily or permanently stores it in the storage unit 240. Alternatively, processing circuitry 230 may transmit the event signal to yet another device via communication interface 210 . Also, the processing circuit 230 may further include an illumination intensity detector 232 . The storage unit 240 stores a correction value table 241 referenced by the data correction unit 231 . The configuration of each unit will be further described below.
 データ補正部231は、補正値テーブル241に定義されたイベント信号の時間的なパターン、または空間的なパターンの少なくともいずれかに基づいて、バッファメモリ220に一時的に格納されているイベント信号を補正する。より具体的には、データ補正部231は、位置情報、またはタイムスタンプの少なくともいずれかが所定の関係を満たす一連のイベント信号について、一連のイベント信号のタイムスタンプによって特定される時間的なパターン、または一連のイベント信号の位置情報によって特定される空間的なパターンが補正値テーブル241に定義されたパターンにあてはまる場合に、一連のイベント信号のうち少なくとも1つのイベント信号を補正する。 The data correction unit 231 corrects the event signal temporarily stored in the buffer memory 220 based on at least one of the temporal pattern and the spatial pattern of the event signal defined in the correction value table 241. do. More specifically, the data correction unit 231 corrects a series of event signals in which at least one of position information and time stamps satisfies a predetermined relationship, a temporal pattern specified by the time stamps of the series of event signals, Alternatively, when the spatial pattern specified by the position information of the series of event signals fits the pattern defined in the correction value table 241, at least one of the series of event signals is corrected.
 例えば、データ補正部231は、処理対象になる少なくとも1つのイベント信号の位置情報に対して所定の関係を満たす位置情報を有する一連のイベント信号のタイムスタンプによって特定される時間的なパターンに基づいて、少なくとも1つのイベント信号を補正してもよい。より具体的には、データ補正部231は、同じ位置情報X,Yのセンサ110の出力による複数のイベント信号がある場合に、それらのイベント信号のタイムスタンプtおよび極性Pによって特定される時間的なパターンに基づいてイベント信号を補正する。データ補正部231は、タイムスタンプtによって特定される時間的なパターンが補正値テーブル241に定義されたパターンにあてはまる場合に、イベント信号において特定のセンサ110の出力を所定の時間にわたって無効化してもよい。無効化された出力のイベント信号は、例えば輝度変化が発生した位置をマッピングした画像には表示されず、別の装置にも送信されない。 For example, the data correcting unit 231 may be configured based on a temporal pattern specified by time stamps of a series of event signals having position information that satisfies a predetermined relationship with the position information of at least one event signal to be processed. , at least one event signal may be corrected. More specifically, when there are a plurality of event signals output from the sensors 110 with the same positional information X and Y, the data correction unit 231 corrects the temporally specified by the time stamp t and the polarity P of these event signals. corrects the event signal based on the pattern. If the temporal pattern specified by the time stamp t matches the pattern defined in the correction value table 241, the data correction unit 231 may disable the output of the specific sensor 110 in the event signal for a predetermined period of time. good. The disabled output event signal is not displayed, for example, in the image mapping the location where the luminance change occurred, nor is it transmitted to another device.
 Also, for example, the data correction unit 231 may correct at least one event signal based on a spatial pattern specified by the position information of a series of event signals whose time stamps satisfy a predetermined relationship with the time stamp of the at least one event signal to be processed. More specifically, when there are a plurality of event signals with time stamps t close to one another, the data correction unit 231 corrects the event signals based on the spatial pattern specified by their position information X, Y and polarities P. When the spatial pattern specified by the position information X, Y matches a pattern defined in the correction value table 241, the data correction unit 231 may synchronize the output of a specific sensor 110 in the event signals with the outputs of other adjacent sensors 110. Here, synchronizing the outputs of a plurality of sensors 110 may mean, for example, changing the polarity of the event signal output from one sensor 110 to match the polarity of the event signals output from the other sensors 110, or changing the time stamp of the event signal output from one sensor 110 to align with the time stamps of the event signals output from the other sensors 110.
 FIG. 2 is a diagram showing an example of the correction value table in one embodiment of the present invention. In this embodiment, the correction value table 241 defines temporal and spatial patterns for each piece of sensor position information X, Y and for each event signal polarity P (p, n). In this case, the temporal and spatial patterns differ according to the position information and the polarity of the at least one event signal to be corrected. In the illustrated example, the records of the correction value table 241 are stored using the sensor position information X, Y and the polarity P as keys. For example, for the sensor 110 with position information (X, Y) = (x1, y1), patterns are defined for the polarity P = p and for the polarity P = n, respectively. Since the characteristics of the individual sensors 110 constituting the sensor array 120 differ, and may also differ between detection of the p and n polarities, defining patterns in this way makes it possible to correct event signals more appropriately according to the characteristic differences of each sensor 110 and each polarity. As illustrated, multiple patterns (pattern 1 to pattern n) may be defined for the same position information and polarity.
 Specifically, for example, in the record whose key is (x1, y1, p), "repeated output with a period of less than T1" is defined as "pattern 1", and "invalidate all but the first p output" is defined as "correction method 1". Following this definition, when, for example, the event signal (x1, y1, p, t1) — that is, an event signal indicating that a luminance change of polarity P = p was detected at time t1 by the sensor 110 with position information (X, Y) = (x1, y1) — is followed by the event signal (x1, y1, n, t2) and the event signal (x1, y1, p, t3), and t3 − t1 < T1, the data correction unit 231 determines that the pattern "repeated output with a period of less than T1" has occurred and invalidates the event signals at times t2 and t3. Such correction can be effective, as in an example described later, for preventing afterimages caused by the sensor 110 repeatedly detecting multiple luminance changes when only a single luminance change actually occurred. The period T1 functions as a threshold for determining whether the repeated output is due to the characteristics of the sensor or luminance changes actually occurred multiple times.
 In the record whose key is (x1, y1, n) — that is, the same position information as the first record but a different polarity — "repeated output with a period of less than T2" is defined as "pattern 1", and "invalidate all but the first n output" is defined as "correction method 1". When the sensor 110 with the same position information (X, Y) produces repeated output for both polarities P = p and P = n, a pattern is thus defined for each polarity. The periods T1 and T2 may be the same value (T1 = T2), or may be different values (T1 ≠ T2) based on actual measurement results as described later.
 On the other hand, in the record whose key is (x2, y2, p), "p output from only 8 of 9 adjacent pixels" is defined as "pattern 1", and "set all 9 adjacent pixels to p output" is defined as "correction method 1". Following this definition, when eight of the nine adjacent sensors 110 (constituting a 3 × 3 area) including position information (X, Y) = (x2, y2) output event signals of the same polarity P = p while the output of the remaining sensor 110 is not synchronized, the data correction unit 231 synchronizes the output of the remaining sensor 110 with the outputs of the eight sensors 110. Specifically, when the remaining sensor 110 outputs no event signal, or outputs an event signal of polarity P = n, the data correction unit 231 rewrites the output of that sensor 110 into an event signal of polarity P = p. Alternatively, when an event signal of polarity P = p was output with a later time stamp, the time stamp of that event signal is rewritten according to the time stamps of the event signals of the other sensors 110. Such correction can be effective, as in an example described later, for preventing the output of a sensor 110 that fails to detect a luminance change — even though a luminance change actually occurred and was detected by the adjacent sensors — from becoming noise in the output of the vision sensor 100.
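The 8-of-9 spatial correction just described can be sketched as follows. The polarity map layout and names are assumptions for illustration; the publication does not prescribe a data structure:

```python
# Hypothetical sketch of the spatial correction: if eight of the nine
# pixels in a 3x3 neighborhood output polarity 'p', the remaining pixel
# is rewritten to 'p' so that its output is synchronized with its neighbors.
def synchronize_neighborhood(polarity_map, cx, cy):
    """polarity_map: dict (x, y) -> 'p', 'n', or None for the current window;
    (cx, cy) is the center of the 3x3 neighborhood."""
    cells = [(cx + dx, cy + dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
    p_cells = [c for c in cells if polarity_map.get(c) == 'p']
    if len(p_cells) == 8:
        (odd,) = [c for c in cells if polarity_map.get(c) != 'p']
        polarity_map[odd] = 'p'  # synchronize the out-of-sync pixel
    return polarity_map
```

Rewriting a time stamp instead of a polarity, as the text also allows, would follow the same shape with a timestamp map.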
 Note that in the example shown in FIG. 2, no records are stored for the sensors 110 with position information (X, Y) = (x3, y3) and (X, Y) = (x4, y4), because no pattern requiring correction occurs for either polarity P = p or P = n. Since no key exists in the correction value table 241 for the event signals of these sensors 110, the data correction unit 231 skips the correction processing for them. The format of the correction value table 241 shown in FIG. 2 is an example, and the data recording format is not particularly limited as long as definitions of temporal or spatial patterns of event signals are associated with correction methods.
 Referring again to FIG. 1, the illumination intensity detection unit 232 detects the illumination intensity in the environment of the sensor array 120 using the illuminance sensor 140 arranged near the sensor array 120. When the illumination intensity detection unit 232 is present, the data correction unit 231 corrects the output values of the vision sensor 100 based on a correction value table 241 defined for each illumination intensity detected by the illumination intensity detection unit 232. Specifically, for example, a plurality of correction value tables 241 may be prepared for respective ranges of illumination intensity, or the records of the correction value table 241 as shown in FIG. 2 may be stored using (X, Y, P, L) as a key, where an illumination intensity range L is added to the position information X, Y and the polarity P.
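One way to picture the (X, Y, P, L)-keyed table described above is a plain dictionary mapping keys to pattern/method pairs. The concrete keys, pattern labels, and method names below are invented for illustration only:

```python
# Hypothetical sketch of a correction value table keyed by position,
# polarity, and an illumination-intensity range L. Absence of a key
# means no correction is needed for that sensor/polarity/illumination.
correction_table = {
    # (x, y, polarity, illumination_range): list of (pattern, method)
    (1, 1, 'p', 'low'): [('repeat<T1', 'invalidate_after_first_p')],
    (1, 1, 'n', 'low'): [('repeat<T2', 'invalidate_after_first_n')],
    (2, 2, 'p', 'low'): [('8_of_9_p', 'set_all_9_to_p')],
}

def lookup(x, y, polarity, illumination_range):
    # Returning an empty list lets the caller skip correction entirely.
    return correction_table.get((x, y, polarity, illumination_range), [])
```

Multiple patterns per key (pattern 1 to pattern n in FIG. 2) map naturally onto the list value.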
 FIG. 3 is a flowchart showing an example of processing for correcting an event signal in one embodiment of the present invention. In the illustrated example, when an event signal is received by the communication interface 210 (step S101), the data correction unit 231 searches the correction value table 241 using the position information (X, Y) and the polarity P of the received event signal as a key (step S102). If a record of the correction value table 241 matching the key exists — that is, if a pattern is defined for the position information (X, Y) and the polarity P of the acquired event signal (step S103) — the data correction unit 231 determines whether the received event signal requires correction based on the pattern.
 Specifically, the data correction unit 231 reads event signals related to the pattern defined in the correction value table 241 from the buffer memory (step S104). For example, when a temporal pattern is defined for the position information (X, Y) and the polarity P of the event signal, the data correction unit 231 reads from the buffer memory the time series of event signals acquired by the sensor with the same position information (X, Y). When a spatial pattern is defined for the position information (X, Y) and the polarity P of the event signal, the data correction unit 231 reads from the buffer memory the event signals acquired by sensors whose position information has a predetermined relationship with (X, Y), for example adjacent position information.
 Further, the data correction unit 231 determines whether the event signal received in step S101 and the event signals additionally read in step S104 match the pattern defined in the correction value table 241 (step S105). If the event signals match the defined pattern, the data correction unit 231 corrects the event signals according to the correction method associated with the pattern in the correction value table 241 (step S106). Specifically, for example, the data correction unit 231 performs the correction by invalidating a specific event signal or rewriting an event signal, as described above.
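The flow of steps S101 to S106 can be summarized in code. The buffer interface, the callable pattern/method pairs, and all names are assumptions sketched for illustration, not the publication's implementation:

```python
# Hypothetical sketch of the correction flow of FIG. 3 (steps S101-S106).
def process_event(event, table, buffer):
    x, y, p, t = event                         # S101: event signal received
    patterns = table.get((x, y, p))            # S102: search table by (X, Y, P)
    if patterns is None:                       # S103: no record -> skip correction
        return [event]
    related = buffer.read_related(x, y, t)     # S104: read related buffered events
    for pattern, correct in patterns:
        if pattern(related + [event]):         # S105: does the pattern match?
            return correct(related + [event])  # S106: apply the correction method
    return [event]
```

Here `pattern` is a predicate over a list of events and `correct` returns the corrected list, so invalidation is expressed as returning fewer events.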
 FIG. 4 is a diagram showing an example of a method of creating the correction value table in one embodiment of the present invention. The correction value table 241 described in the above example is created, for example, as part of a calibration procedure for the vision sensor 100. In the illustrated example, the vision sensor 100 is mounted on a two-axis pan head 501, and with a test pattern 502 at least partially arranged within the angle of view of the vision sensor 100, the vision sensor 100 is moved using the pan head 501. At this time, the reflectance of each area of the test pattern 502, the positional relationship between the vision sensor 100 and the test pattern 502, and the movement speed of the vision sensor 100 by the pan head 501 are known, and ideal output values 521 of the vision sensor 100 calculated from these are stored in the storage unit 520 of the computing device 500. The correction value determination unit 510 of the computing device 500 compares the event signals received from the vision sensor 100 with the ideal output values 521 to identify temporal or spatial patterns of the event signals and the correction methods corresponding to those patterns, and stores them in the storage unit 520 as the correction value table 241.
 As described above, when a correction value table 241 is defined for each illumination intensity, an illumination 503 whose intensity can be adjusted is arranged when the correction value table 241 is created, and by detecting the illumination intensity in the environment of the sensor array 120 with an illuminance sensor 140 similar to that shown in FIG. 1, different correction value tables 241 are created according to the illumination intensity.
 FIG. 5 is a flowchart showing an example of processing for creating the correction value table in one embodiment of the present invention. In the illustrated example, the correction value determination unit 510 of the computing device 500 compares, for each of the sensors 110 constituting the sensor array 120 of the vision sensor 100, the ideal output values with the actually generated event signals (step S201), and determines whether correction is necessary (step S202). For example, correction is determined to be necessary when detection of a luminance change is delayed by a predetermined time or more from the ideal timing, when a luminance change that should have been detected is not detected, or when a luminance change that should not have been detected is detected. When correction is necessary, the correction value determination unit 510 recognizes a temporal or spatial pattern in the target event signals (step S203). When a temporal or spatial pattern can be recognized for the event signals requiring correction (step S204), the correction value determination unit 510 records in the correction value table 241 the recognized pattern and a correction method for bringing the event signals closer to the ideal output values (step S205).
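The table-building loop of steps S201 to S205 can be sketched as below. The per-sensor event representation and the `recognize_pattern` callback are assumptions; in practice the pattern recognizer would encode the afterimage and dropout cases described in the text:

```python
# Hypothetical sketch of correction value table creation (FIG. 5):
# compare each sensor's actual events with the ideal output and record
# a (pattern, method) entry when a correctable deviation is recognized.
def build_table(ideal, actual, recognize_pattern):
    table = {}
    for key, ideal_events in ideal.items():      # key = (x, y, polarity)
        got = actual.get(key, [])
        if got == ideal_events:                  # S202: no correction needed
            continue
        entry = recognize_pattern(got, ideal_events)  # S203: recognize pattern
        if entry is not None:                    # S204: pattern recognizable?
            table.setdefault(key, []).append(entry)   # S205: record it
    return table
```

Sensors whose actual output matches the ideal output get no record, which is exactly why the lookup in FIG. 3 can skip them.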
 FIGS. 6 and 7 are diagrams for explaining specific examples of the event signal correction method. The example shown in FIG. 6 shows the event signals of the sensor with position information (X, Y) = (x1, y1) (sensor A) and the sensor with position information (X, Y) = (x2, y2) (sensor B) when the movement of the test pattern 502 shown in FIG. 4 causes a single luminance change of polarity P = p at the same time t0. Sensor A outputs the event signal (x1, y1, p, t1) at time t1, after time t0, and outputs no event signal thereafter. Sensor B, on the other hand, outputs the event signal (x2, y2, p, t1) at time t1, then the event signal (x2, y2, n, t2) at time t2, and further the event signal (x2, y2, p, t3) at time t3. In this case, for sensor B, the repeated output in which the polarity P appears in the order p, n, p is identified as a temporal pattern to be corrected. Specifically, for example, a period T1 may be set such that t3 − t1 < T1, "repeated output with a period of less than T1" may be defined as the temporal pattern for sensor B with polarity P = p, and "invalidate all but the first p output" may be defined as the correction method.
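The p, n, p burst of FIG. 6 suggests a very small check, sketched here under the assumption (not stated in the publication) that the three events from one sensor are already grouped together:

```python
# Hypothetical check for the p, n, p burst of FIG. 6: if three events
# from one sensor arrive with polarities p, n, p and t3 - t1 < T1,
# only the first p output is kept; otherwise the burst is left alone.
def filter_pnp_burst(events, T1):
    if (len(events) == 3
            and [e[2] for e in events] == ['p', 'n', 'p']
            and events[2][3] - events[0][3] < T1):
        return events[:1]   # keep only the first p output
    return events
```

When t3 − t1 is not below T1, the burst is treated as genuine repeated luminance changes and passes through unchanged, which is the threshold role T1 plays in the text.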
 In the example shown in FIG. 7, on the other hand, the event signals of the sensors 110 of nine adjacent pixels (constituting a 3 × 3 area) included in the area on the sensor array 120 where a luminance change occurred due to the movement of the test pattern 502 shown in FIG. 4 are shown. In this example, although the sensors of eight of the nine adjacent pixels output event signals of polarity P = p, the sensor D at the right end of the area outputs no event signal. In this case, for the sensor E adjacent to sensor D, the fact that eight of the nine adjacent sensors output event signals of polarity P = p while the remaining one sensor does not is identified as a spatial pattern to be corrected. Specifically, for example, "p output from only 8 of 9 adjacent pixels" may be defined as the spatial pattern for sensor E with polarity P = p, and "set all 9 adjacent pixels to p output" may be defined as the correction method.
 According to the embodiment of the present invention described above, temporal or spatial patterns are defined in the event signals for each piece of sensor position information and each polarity, and the event signals are corrected when an output matching a pattern occurs. This makes it possible, for example, to invalidate repeated outputs whose polarity differs from the actual luminance change, or to synchronize the output of a sensor that is out of sync with its adjacent sensors. By defining patterns for each piece of sensor position information and each polarity through an actual measurement procedure such as that shown in FIG. 4, the characteristic differences of individual sensors and of the detected polarities can be reflected in the correction values, and afterimages, noise, and the like in the output of the vision sensor can be corrected effectively.
 Although preferred embodiments of the present invention have been described in detail above with reference to the accompanying drawings, the present invention is not limited to these examples. It is obvious that a person having ordinary knowledge in the technical field to which the present invention belongs can conceive of various alterations or modifications within the scope of the technical ideas described in the claims, and it is understood that these naturally belong to the technical scope of the present invention.
DESCRIPTION OF SYMBOLS: 10 System, 100 Vision sensor, 110 Sensor, 120 Sensor array, 130 Processing circuit, 140 Illuminance sensor, 200 Signal processing device, 210 Communication interface, 220 Buffer memory, 230 Processing circuit, 231 Data correction unit, 232 Illumination intensity detection unit, 240 Storage unit, 241 Correction value table.

Claims (12)

  1. A signal processing device comprising a data correction unit that corrects at least one event signal of a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including position information of a sensor in the sensor array, a polarity of a light intensity change, and a time stamp, the correction being based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
  2. The signal processing device according to claim 1, wherein the series of event signals have position information satisfying a predetermined relationship with the position information of the at least one event signal, and the data correction unit corrects the at least one event signal based on the temporal pattern specified by the time stamps of the series of event signals.
  3. The signal processing device according to claim 1, wherein the series of event signals have time stamps satisfying a predetermined relationship with the time stamp of the at least one event signal, and the data correction unit corrects the at least one event signal based on the spatial pattern specified by the position information of the series of event signals.
  4. The signal processing device according to any one of claims 1 to 3, wherein at least one of the temporal pattern and the spatial pattern differs according to the position information of the at least one event signal.
  5. The signal processing device according to any one of claims 1 to 4, wherein at least one of the temporal pattern and the spatial pattern differs according to the polarity of the at least one event signal.
  6. The signal processing device according to any one of claims 1 to 5, further comprising an illumination intensity detection unit that detects an illumination intensity in an environment of the sensor array, wherein at least one of the temporal pattern and the spatial pattern is defined for each illumination intensity.
  7. The signal processing device according to any one of claims 1 to 6, wherein the temporal pattern includes repeated output of the polarity.
  8. The signal processing device according to claim 7, wherein the data correction unit invalidates the repeated output of the polarity.
  9. The signal processing device according to any one of claims 1 to 8, wherein the spatial pattern includes an output of a first sensor not being synchronized with an output of a second sensor adjacent to the first sensor.
  10. The signal processing device according to claim 9, wherein the data correction unit synchronizes the output of the first sensor with the output of the second sensor.
  11. A signal processing method comprising a step of correcting at least one event signal of a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including position information of a sensor in the sensor array, a polarity of a light intensity change, and a time stamp, the correction being based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
  12. A program for causing a computer to realize a function of correcting at least one event signal of a series of event signals output from an event-driven vision sensor including a plurality of sensors constituting a sensor array, each event signal including position information of a sensor in the sensor array, a polarity of a light intensity change, and a time stamp, the correction being based on at least one of a temporal pattern specified by the time stamps of the series of event signals and a spatial pattern specified by the position information of the series of event signals.
PCT/JP2021/023211 2021-06-18 2021-06-18 Signal processing device, signal processing method, and program WO2022264405A1 (en)

Priority Applications (1)

Application Number: PCT/JP2021/023211 — Priority/Filing Date: 2021-06-18 — Title: Signal processing device, signal processing method, and program (WO2022264405A1)

Publications (1)

Publication Number: WO2022264405A1

Family ID: 84526944

Citations (3) — * cited by examiner, † cited by third party

- US20180262705A1 (priority 2017-03-08, published 2018-09-13, Samsung Electronics Co., Ltd.): Image processing device configured to regenerate timestamp and electronic device including the same
- US20190362256A1 (priority 2018-05-24, published 2019-11-28, Samsung Electronics Co., Ltd.): Event-based sensor that filters for flicker
- WO2021125031A1 (priority 2019-12-17, published 2021-06-24, Sony Semiconductor Solutions Corporation): Solid-state imaging device and distance measurement device



Legal Events

- 121: EP — the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 21946080; Country of ref document: EP; Kind code of ref document: A1)
- NENP: Non-entry into the national phase (Ref country code: DE)
- 122: EP — PCT application non-entry in European phase (Ref document number: 21946080; Country of ref document: EP; Kind code of ref document: A1)