US20220172375A1 - Vision sensor, image processing device including the same, and operating method of the vision sensor - Google Patents

Vision sensor, image processing device including the same, and operating method of the vision sensor

Info

Publication number
US20220172375A1
Authority
US
United States
Prior art keywords: event, timestamp, map, vision sensor, occurred
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/515,755
Inventor
Jongseok Seo
Junhyuk Park
Hyunku Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung Electronics Co Ltd filed Critical Samsung Electronics Co Ltd
Assigned to SAMSUNG ELECTRONICS CO., LTD. reassignment SAMSUNG ELECTRONICS CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Park, Junhyuk, SEO, JONGSEOK, LEE, HYUNKU
Publication of US20220172375A1
Priority claimed by US18/120,251 (published as US20230217123A1)
Legal status: Pending

Classifications

    • H04N 25/7013: Line sensors using abutted sensors forming a long line (under H04N 25/70, SSIS architectures; H04N 25/00, circuitry of solid-state image sensors [SSIS])
    • H04N 25/44: Extracting pixel data from image sensors by controlling scanning circuits, e.g. by modifying the number of pixels sampled or to be sampled, by partially reading an SSIS array
    • H04N 25/443: Partially reading an SSIS array by reading pixels from selected 2D regions of the array, e.g. for windowing or digital zooming
    • H04N 25/47: Image sensors with pixel address output; event-driven image sensors; selection of pixels to be read out based on image data
    • G06T 7/215: Image analysis; analysis of motion; motion-based segmentation
    • H04N 21/4344: Remultiplexing of multiplex streams, e.g. by modifying time stamps or remapping the packet identifiers
    • H04N 21/8547: Content authoring involving timestamps for synchronizing content
    • H04N 23/80: Camera processing pipelines; components thereof
    • H04N 25/46: Extracting pixel data from image sensors by combining or binning pixels
    • H04N 25/585: Control of the dynamic range involving two or more exposures acquired simultaneously, with pixels having different sensitivities within the sensor, e.g. fast or slow pixels or pixels having different sizes
    • H04N 25/76: Addressed sensors, e.g. MOS or CMOS sensors
    • H04N 5/144: Picture signal circuitry for the video frequency region; movement detection
    • H04N 5/3454 and H04N 5/37457: legacy codes listed without titles in the record
    • G06T 2207/20112: Indexing scheme for image analysis; special algorithmic details; image segmentation details

Definitions

  • Example embodiments of the present disclosure relate to a vision sensor, and more particularly, to a vision sensor capable of generating and transmitting a timestamp map, an image processing device including the vision sensor, and an operating method of the vision sensor.
  • A vision sensor, for example a dynamic vision sensor, generates an event signal, that is, information about an event, upon occurrence of the event, for example a variation in the intensity of light, and transmits the event signal to a processor.
  • A dynamic vision sensor according to the related art transmits only event data and a timestamp; data processing such as writing a timestamp map from that data is mainly performed by a processor located outside the dynamic vision sensor.
  • Writing a timestamp map from raw data consumes host resources, and thus methods of reducing this burden on the host need to be researched.
  • One or more example embodiments provide a vision sensor generating a timestamp map and an optical flow map based on event data and a timestamp, an image processing device including the vision sensor, and an operating method of the vision sensor.
  • a vision sensor including a pixel array including a plurality of pixels disposed in a matrix form, an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, a map data processor configured to generate a timestamp map based on the event signals, and an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
  • an image processing device including a vision sensor configured to generate a plurality of event signals corresponding to a plurality of pixels in which an event has occurred based on a movement of an object, from among the plurality of pixels included in a pixel array, generate a timestamp map based on polarity information, address information, and event occurrence time information of a pixel included in the plurality of event signals, and output vision sensor data including the event signals and the timestamp map, and a processor configured to detect the movement of the object by processing the vision sensor data output from the vision sensor.
  • an operating method of a vision sensor including detecting whether an event has occurred in a plurality of pixels and generating event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, generating vision sensor data including a timestamp map based on the event signals, and transmitting the vision sensor data to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal.
  • FIG. 1 is a block diagram illustrating an image processing device according to an example embodiment
  • FIG. 2 is a block diagram illustrating a vision sensor according to an example embodiment
  • FIG. 3 is a block diagram illustrating an event detection circuit according to an example embodiment
  • FIGS. 4A and 4B are block diagrams illustrating a map data module according to an example embodiment
  • FIG. 5 is a flowchart of an operating method of a vision sensor according to an example embodiment
  • FIG. 6 is a circuit diagram illustrating an implementation example of a pixel
  • FIGS. 7A and 7B are diagrams illustrating a method of transmitting, by a vision sensor according to an example embodiment, data via an interface circuit
  • FIGS. 8A, 8B, and 8C are diagrams illustrating event binning of a vision sensor according to an example embodiment
  • FIG. 9 is a diagram for describing a method of generating a plurality of timestamp maps by using a vision sensor according to an example embodiment
  • FIG. 10 is a diagram for describing a timestamp map generator in a vision sensor according to an example embodiment
  • FIG. 11 is a diagram for describing a method of reducing, by a vision sensor according to an example embodiment, data of a timestamp map
  • FIG. 12 is a diagram for describing a time when a timestamp map is generated and a time when an optical flow map is generated in a vision sensor according to an example embodiment
  • FIGS. 13A and 13B are diagrams for describing generation of an optical flow map in a vision sensor according to an example embodiment.
  • FIG. 14 is a block diagram illustrating an example of an electronic device to which a vision sensor according to an example embodiment is applied.
  • FIG. 1 is a block diagram illustrating an image processing device according to an example embodiment.
  • An image processing device 10 may be mounted in an electronic device having an image or light sensing function.
  • For example, the image processing device 10 may be mounted in an electronic device such as a camera, a smartphone, a wearable device, an Internet of Things (IoT) device, a tablet personal computer (PC), a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a drone, an advanced driver-assistance system (ADAS), or the like.
  • the image processing device 10 may be included as a component of a vehicle, furniture, manufacturing equipment, a door, various measuring instruments, or the like.
  • the image processing device 10 may include a vision sensor 100 and a processor 200 .
  • the vision sensor 100 detects a variation in an intensity of incident light and transmits vision sensor data VSD including at least one of an event signal EVS, a timestamp map TSM, and an optical flow map OFM, to the processor 200 .
  • the vision sensor 100 may detect a variation in an intensity of incident light and output an event signal.
  • the vision sensor 100 may be a dynamic vision sensor outputting event signals EVS with respect to pixels, in which a variation in a light intensity is detected, for example, pixels in which an event has occurred.
  • the variation in the intensity of light may be due to a movement of an object, of which an image is captured using the vision sensor 100 , or due to a movement of the vision sensor 100 or the image processing device 10 .
  • the vision sensor 100 may transmit event signals EVS to the processor 200 periodically or non-periodically.
  • the vision sensor 100 may transmit, to the processor 200 , not only the event signals EVS including an address, a polarity, and a timestamp but also a timestamp map TSM or an optical flow map OFM generated based on the event signal EVS.
  • the vision sensor 100 may selectively transmit the event signals EVS to the processor 200 .
  • the vision sensor 100 may transmit, to the processor 200 , those event signals EVS generated from pixels PX corresponding to a region of interest (ROI) set in a pixel array 110 from among event signals generated to correspond to the pixel array 110 .
  • the vision sensor 100 may apply crop or event binning to the event signals EVS. Also, the vision sensor 100 may generate a timestamp map TSM or an optical flow map OFM based on an event signal EVS. According to an example embodiment, the vision sensor 100 may selectively transmit event signals EVS, a timestamp map TSM, or an optical flow map OFM to the outside according to a transmission mode.
  • The processor 200 may process the event signal EVS received from the vision sensor 100 and detect a movement of an object, or a movement of an object in an image recognized by the image processing device 10, and may use an algorithm such as simultaneous localization and mapping (SLAM).
  • the processor 200 may include an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a dedicated microprocessor, a microprocessor, a general purpose processor, or the like.
  • the processor 200 may include an application processor or an image processor.
  • the vision sensor 100 and the processor 200 may be each implemented as an integrated circuit (IC).
  • The vision sensor 100 and the processor 200 may be implemented as separate semiconductor chips, or may be implemented as a single chip.
  • the vision sensor 100 and the processor 200 may be implemented as system on chips (SoCs).
  • FIG. 2 is a block diagram illustrating a vision sensor according to an example embodiment.
  • the vision sensor 100 may include the pixel array 110 , an event detection circuit 120 , a map data module 130 , and an interface circuit 140 .
  • the pixel array 110 may include a plurality of pixels PX arranged in a matrix form. Each of the pixels PX may detect events such as an increase or a decrease in an intensity of received light. For example, each of the pixels PX may be connected to the event detection circuit 120 via column lines extending in a column direction and row lines extending in a row direction. A signal indicating that an event has occurred and polarity information of an event, for example, whether the event is an on-event where a light intensity increases or an off-event where a light intensity decreases, may be output from a pixel PX in which the event has occurred, to the event detection circuit 120 .
  • the event detection circuit 120 may read events from the pixel array 110 and process the events.
  • The event detection circuit 120 may generate an event signal EVS including polarity information of the event that has occurred, an address of the pixel in which the event has occurred, and a timestamp.
  • the event signal EVS may be generated in various formats, for example, in an address event representation (AER) format including address information, timestamp information, and polarity information of a pixel in which an event has occurred or a raw format including event occurrence information of all pixels.
  • the event detection circuit 120 may process events that occurred in the pixel array 110 , in units of pixels, in units of pixel groups including a plurality of pixels, in units of columns or in units of frames.
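  • To make the AER format concrete, the following is a minimal sketch of one event record in Python; the field names and widths are illustrative assumptions, since the text specifies only that an event signal carries address, timestamp, and polarity information.

```python
from dataclasses import dataclass

@dataclass
class AerEvent:
    """One event in an address event representation (AER) style record.

    Field names and widths are illustrative assumptions; the patent only
    requires address, timestamp, and polarity information per event.
    """
    timestamp: int   # time at which the event occurred (e.g., a 32-bit counter)
    col: int         # column address C_ADDR of the pixel
    row: int         # row address R_ADDR of the pixel
    polarity: int    # +1 for an on-event (light increased), -1 for an off-event

# A raw-format frame, by contrast, would carry per-pixel occurrence flags
# for every pixel, whether or not an event occurred there.
```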
  • The map data module 130 may adjust the amount of event signals EVS by post-processing such as cropping, which refers to setting an ROI in the pixel array with respect to the event signals EVS, or event binning, which sets a data amount, and may thereby adjust the size of the timestamp map TSM and optical flow map OFM that are generated.
  • the map data module 130 may generate map data MDT including a timestamp map or an optical flow map and output the map data MDT to the interface circuit 140 , or output the event signals EVS received from the event detection circuit 120 to the interface circuit 140 .
  • the interface circuit 140 may receive the event signals EVS and pieces of map data MDT, and transmit vision sensor data VSD to the processor 200 ( FIG. 1 ) according to a set protocol.
  • the interface circuit 140 may pack the event signals EVS and the pieces of the map data MDT into individual signal units, packet units, or frame units, according to a set protocol to generate vision sensor data VSD, and transmit the vision sensor data VSD to the processor 200 .
  • the interface circuit 140 may include a mobile industry processor interface (MIPI).
  • In other words, when the event signals EVS or pieces of map data MDT are output, they are converted into vision sensor data VSD via the interface circuit 140, and the vision sensor data VSD is transmitted to the processor 200.
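  • As a rough illustration of the data the map data module 130 maintains, the sketch below records, for each pixel, the timestamp and polarity of its most recent event; keeping only the latest event per pixel is an assumption for illustration, not a structure the text mandates.

```python
import numpy as np

def update_timestamp_map(tsm, pol_map, events):
    """Record each event's occurrence time (and polarity) at its pixel address.

    tsm and pol_map are H x W arrays; `events` is an iterable of records like
    the AerEvent sketch above. Keeping only the latest timestamp per pixel is
    an assumption; the patent leaves the exact map contents to the embodiment.
    """
    for ev in events:
        tsm[ev.row, ev.col] = ev.timestamp
        pol_map[ev.row, ev.col] = ev.polarity
    return tsm, pol_map

# Example: a 480 x 640 map, initialized to 0 (no event seen yet)
tsm = np.zeros((480, 640), dtype=np.int64)
pol_map = np.zeros((480, 640), dtype=np.int8)
```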
  • FIG. 3 is a block diagram illustrating an event detection circuit according to an example embodiment.
  • the event detection circuit 120 may include a voltage generator 121 , a digital timing/AER generator (DTAG) 122 , and an event signal processing (ESP) unit 123 .
  • the DTAG 122 may include a column AER generator and a row AER generator.
  • the voltage generator 121 may generate a voltage provided to the pixel array 110 .
  • the voltage generator 121 may generate threshold voltages used to detect an on-event or an off-event from the pixel PX ( FIG. 2 ) or bias voltages.
  • the voltage generator 121 may change a voltage level of threshold voltages provided to pixels of an ROI, and may differently change voltage levels of the threshold voltages with respect to a plurality of ROIs.
  • the DTAG 122 may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, and generate a timestamp TS including information about a time when the event of the pixel PX has occurred, and an address ADDR including a column address, a row address, or a group address.
  • the column AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a column request, and generate a column address C_ADDR of the pixel PX in which the event has occurred.
  • the row AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a row request, and generate a row address R_ADDR of the pixel PX in which the event has occurred.
  • a group address G_ADDR may also be generated in units of preset groups.
  • the pixel array 110 may be scanned in units of columns, and when a request is received from a certain column, for example, a first column, the column AER generator may transmit a response signal to the first column.
  • the pixel PX in which the event has occurred and which has received the response signal may transmit polarity information Pol, for example, a signal indicating the occurrence of an on-event or an off-event, to the row AER generator.
  • polarity information Pol for example, a signal indicating the occurrence of an on-event or an off-event
  • the row AER generator may transmit a reset signal to the pixel PX in which the event has occurred.
  • the pixel PX in which the event has occurred may be reset.
  • the row AER generator may control a period at which a reset signal is generated.
  • the row AER generator may generate information about a time when an event has occurred, that is, a timestamp TS.
  • the operations of the row AER generator and the column AER generator are described above by assuming that the pixel array 110 is scanned in units of columns.
  • the operations of the row AER generator and the column AER generator are not limited thereto, and the row AER generator and the column AER generator may read, from the pixel PX in which an event has occurred, whether the event has occurred and polarity information Pol, in various manners.
  • The pixel array 110 may be scanned in units of rows, and the operations of the row AER generator and the column AER generator may be exchanged with each other; that is, the column AER generator may receive polarity information Pol and transmit a reset signal to the pixel array 110.
  • the row AER generator and the column AER generator may individually access the pixel PX in which the event has occurred.
  • The ESP unit 123 may generate an event signal EVS based on a column address C_ADDR, a row address R_ADDR, a group address G_ADDR, polarity information Pol, and a timestamp TS received from the DTAG 122.
  • the ESP unit 123 may remove a noise event, and generate an event signal EVS with respect to valid events. For example, when an amount of events that occurred for a certain period of time is less than a threshold, the ESP unit 123 may determine the events as noise, and may not generate an event signal EVS with respect to the noise event.
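  • The noise rejection described above might look like the following sketch, which drops all events in any time window containing fewer than a threshold number of events; the fixed, non-overlapping window is a simplifying assumption, since the text states only that event counts below a threshold over a period are treated as noise.

```python
def filter_noise(events, window, threshold):
    """Drop events from any time window containing fewer than `threshold` events.

    `events` must be sorted by timestamp. The non-overlapping windowing
    scheme and the parameter values are assumptions for illustration.
    """
    valid = []
    bucket = []
    window_start = None
    for ev in events:
        if window_start is None or ev.timestamp - window_start >= window:
            if len(bucket) >= threshold:
                valid.extend(bucket)     # enough activity: keep the bucket
            bucket, window_start = [], ev.timestamp
        bucket.append(ev)
    if len(bucket) >= threshold:         # flush the final window
        valid.extend(bucket)
    return valid
```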
  • FIGS. 4A and 4B are block diagrams illustrating a map data module according to an example embodiment.
  • the map data module 130 may include a crop unit 131 , a binning unit 132 , a timestamp map generator 133 , and a map data buffer 135 .
  • The crop unit 131 may set at least one ROI on a pixel array including a plurality of pixels, according to a user's settings or a preset algorithm. For example, the crop unit 131 may set, as an ROI, a region in which many events occur from among a plurality of regions set with respect to the pixel array 110, or an arbitrary region corresponding to pixels in which many events occur during a certain period of time. According to another example embodiment, a region set arbitrarily by a user, or a region corresponding to pixels PX from which a certain object is sensed, may be set as an ROI. However, the ROI is not limited thereto and may be set in various manners.
  • the binning unit 132 may adjust a data amount used in generating a map according to a user's settings or a preset algorithm. For example, the binning unit 132 may perform a binning operation on an ROI by grouping pixels included in a set ROI, in certain units.
  • the timestamp map generator 133 may generate a timestamp map TSM based on an event signal EVS adjusted using the crop unit 131 and the binning unit 132 . For example, the timestamp map generator 133 may generate a timestamp map TSM only with respect to an ROI that is designated by cropping. Also, the timestamp map generator 133 may generate a timestamp map TSM having a size adjusted by binning.
  • the map data buffer 135 may store the timestamp map TSM generated using the timestamp map generator 133 , and transmit the stored timestamp map TSM to the interface circuit 140 in the vision sensor 100 as map data MDT.
  • the map data module 130 may include the crop unit 131 , the binning unit 132 , the timestamp map generator 133 , an optical flow map generator 134 , and the map data buffer 135 .
  • the optical flow map generator 134 may generate an optical flow map OFM based on the timestamp map TSM stored in the map data buffer 135 .
  • the optical flow map OFM may be stored in the map data buffer 135 in which the timestamp map TSM is stored.
  • An optical flow may be a pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene, and may be a distribution of apparent velocities of movement of a brightness pattern in an image.
  • In other words, an optical flow may be information about the direction and velocity with which a pixel in which an event signal is generated varies.
  • FIG. 5 is a flowchart of an operating method of a vision sensor according to an example embodiment.
  • the event detection circuit 120 may detect whether an event has occurred in each of a plurality of pixels, and generate event signals EVS corresponding to pixels in which an event has occurred.
  • the event detection circuit 120 may generate event signals EVS in an AER format including address information, timestamp information, and polarity information of a pixel in which an event has occurred or in a raw format including event occurrence information of all pixels.
  • the map data module 130 may receive the event signals EVS from the event detection circuit 120 (S 110 ).
  • the map data module 130 may perform crop of selecting event signals generated in an ROI of a pixel array, from among event signals, according to a user's settings or a preset algorithm (S 120 ).
  • The map data module 130 may perform event binning (S 130). When the number of events that have occurred in a preset N×N pixel region is greater than a preset threshold value, the map data module 130 may determine that an event has occurred in that region and store an event signal EVS in the array for the timestamp map entry corresponding to the N×N pixel region.
  • the map data module 130 may generate a timestamp map TSM (S 140 ).
  • The map data module 130 may generate a plurality of timestamp maps TSM having reference frames different from each other, within the event signal frames that are generated periodically.
  • the map data module 130 may periodically generate a first timestamp reference and a first timestamp map based on a preset number of event signal frames counted from the first timestamp reference, and periodically generate a second timestamp reference and a second timestamp map based on a preset number of event signal frames counted from the second timestamp reference.
  • For example, the first timestamp map may be generated by setting a first timestamp reference frame as a first event signal frame and accumulating up to an eleventh event signal frame.
  • The second timestamp map may be generated by setting a second timestamp reference frame as a fifth event signal frame and accumulating from the fifth event signal frame up to a fifteenth event signal frame.
  • Three or more timestamp maps TSM may also be generated according to the occurrence of an event signal EVS or user's settings.
  • the map data module 130 may store an offset value of a timestamp based on a preset timestamp reference of a timestamp map TSM, and set a timestamp reference of the timestamp map TSM based on event occurrence information.
  • When transmitting, the map data module 130 may restore the raw data by adding the timestamp reference value corresponding to the timestamp map TSM to the stored offset value. For example, when an event occurrence time is r+k, the timestamp reference may be set as r, and the offset value of the timestamp may be stored as k.
  • In this case, the value r+k, obtained by adding the timestamp reference r to the offset value k, may be transmitted.
  • The map data module 130 may generate an optical flow map OFM by estimating, from the timestamp map TSM according to a preset algorithm, an optical flow including information about the direction and velocity with which pixels in which event signals are generated vary.
  • the map data module 130 may store the optical flow map OFM in a location of the map data buffer 135 in which the timestamp map TSM is stored, instead of the timestamp map TSM. Accordingly, a map data buffer size may be more effectively controlled.
  • the interface circuit 140 may transmit vision sensor data VSD including the event signal EVS, the timestamp map TSM, or the optical flow map OFM, to an external processor (S 150 ).
  • the interface circuit 140 may selectively transmit each piece of data according to an output mode.
  • the interface circuit 140 may output only an event signal included in the vision sensor data VSD to an external processor.
  • the interface circuit 140 may generate at least one virtual channel and transmit an event signal to an external processor via a first virtual channel, and transmit timestamp map data to an external processor via a second virtual channel.
  • the interface circuit 140 may generate at least one virtual channel, and transmit an event signal to an external processor via the first virtual channel, and transmit optical flow map data to the external processor via the second virtual channel.
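  • The mode-dependent output selection could be modeled as below. The mode names and the (channel, payload) tuples are assumptions for illustration; in an actual MIPI link, virtual channels are a link-layer feature rather than an API.

```python
def pack_vsd(mode, event_signals, tsm_data=None, ofm_data=None):
    """Select what goes out, tagged with a (hypothetical) virtual channel id.

    The mode names and the (channel, payload) tuple format are assumptions;
    the patent specifies only that event signals, timestamp map data, and
    optical flow map data may be sent, possibly on separate virtual channels.
    """
    packets = [(1, ev) for ev in event_signals]      # VC1: event signals
    if mode == "events_plus_tsm" and tsm_data is not None:
        packets.append((2, tsm_data))                # VC2: timestamp map
    elif mode == "events_plus_ofm" and ofm_data is not None:
        packets.append((2, ofm_data))                # VC2: optical flow map
    # mode == "events_only": only the event signals are transmitted
    return packets
```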
  • FIG. 6 is a circuit diagram illustrating an implementation example of a pixel.
  • a pixel PX may include a photoelectric conversion device PD, an amplifier 111 , a first comparator 112 , a second comparator 113 , an on-event holder 114 , an off-event holder 115 , and a reset switch SW.
  • the pixel PX may further include a capacitor for removing noise generated in the pixel PX or coming from the outside or various switches.
  • the photoelectric conversion device PD may convert incident light, that is, an optical signal, into an electrical signal, for example, an electric current.
  • The photoelectric conversion device PD may include, for example, a photodiode, a phototransistor, a photogate, a pinned photodiode, or the like.
  • the photoelectric conversion device PD may generate an electrical signal having a higher level as an intensity of incident light increases.
  • the amplifier 111 may convert a received current into a voltage and amplify a voltage level. An output voltage of the amplifier 111 may be provided to the first comparator 112 and the second comparator 113 .
  • the first comparator 112 may compare an output voltage Vout of the amplifier 111 with an on-threshold voltage TH 1 , and generate an on signal E_ON based on a comparison result.
  • the second comparator 113 may compare an output voltage Vout of the amplifier 111 with an off-threshold voltage TH 2 , and generate an off signal E_OFF based on a comparison result.
  • the first comparator 112 and the second comparator 113 may generate an on signal E_ON or an off signal E_OFF when a variation amount of light received by the photoelectric conversion device PD is equal to or greater than a certain level of variation.
  • the on signal E_ON may be at a high level when a light amount received by the photoelectric conversion device PD increases to a certain level or more
  • the off signal E_OFF may be at a high level when a light amount received by the photoelectric conversion device PD is reduced to a certain level or less.
  • the on-event holder 114 and the off-event holder 115 may respectively hold the on signal E_ON and the off signal E_OFF and then output the same.
  • the on signal E_ON and the off signal E_OFF may be output.
  • levels of the on-threshold voltage TH 1 and the off-threshold voltage TH 2 may be modified.
  • For example, the light sensitivity may be reduced by increasing the level of the on-threshold voltage TH 1 and reducing the level of the off-threshold voltage TH 2. Then, the first comparator 112 and the second comparator 113 generate the on signal E_ON or the off signal E_OFF only when the variation in light received by the photoelectric conversion device PD is greater than it had to be before the levels of the on-threshold voltage TH 1 and the off-threshold voltage TH 2 were modified.
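  • The comparator behavior of FIG. 6 can be summarized in the following behavioral sketch (not a circuit simulation); comparing the amplifier output against the level latched at the last reset is an assumed simplification of the amplifier/comparator/reset loop.

```python
class PixelModel:
    """Behavioral model of the pixel of FIG. 6 (not a circuit simulation)."""

    def __init__(self, th_on, th_off):
        self.th_on = th_on      # on-threshold TH1 (required positive excursion)
        self.th_off = th_off    # off-threshold TH2 (required negative excursion)
        self.v_ref = None       # voltage latched at the last reset (assumption)

    def sense(self, v_out):
        """Return 'on', 'off', or None for one amplified output sample."""
        if self.v_ref is None:
            self.v_ref = v_out
            return None
        if v_out - self.v_ref >= self.th_on:
            self.v_ref = v_out  # reset after the event is read out
            return "on"         # E_ON: light intensity increased enough
        if self.v_ref - v_out >= self.th_off:
            self.v_ref = v_out
            return "off"        # E_OFF: light intensity decreased enough
        return None             # variation below both thresholds: no event
```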
  • FIGS. 7A and 7B are diagrams illustrating a method of transmitting, by a vision sensor according to an example embodiment, data via an interface circuit.
  • Event data may be generated based on an event signal EVS generated at, for example, 2 bits/pixel and 1000 fps, and on a pixel array size (a×b).
  • a packet including at least one event signal EVS may be output from the interface circuit 140 as event data.
  • a packet may include a timestamp, a column address, a row address, and polarity information of an event signal, and an arrangement order thereof is not limited.
  • a header indicating a start of a packet may be added to a front end of the packet, and a tail indicating an end of the packet may be added to a back end of the packet.
  • a packet may include at least one event signal.
  • a timestamp may include information about a time when an event has occurred.
  • a timestamp may include 32 bits, but is not limited thereto.
  • a column address and a row address may each include a plurality of bits, for example, 8 bits.
  • In this case, a vision sensor including a plurality of pixels arranged in up to 2^8 (256) rows and up to 2^8 (256) columns may be supported.
  • this is an example, and the number of bits of a column address and a row address may vary according to the number of pixels.
  • Polarity information may include information about an on-event and an off-event.
  • the polarity information may include 1 bit including information about whether an on-event has occurred and 1 bit including information about whether an off-event has occurred.
  • a bit indicating an on-event and a bit indicating an off-event may not be both ‘1’ but may be both ‘0.’
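  • One possible packet layout for these fields, given purely as an assumed example (the text notes that the arrangement order is not limited): a 32-bit timestamp, 8-bit column and row addresses, and two polarity bits that are never both 1.

```python
def pack_event(timestamp, col, row, on_event, off_event):
    """Pack one event into a 50-bit integer: 32-bit timestamp, 8-bit column,
    8-bit row, 1 on-event bit, 1 off-event bit.

    The field order and the 50-bit total are assumptions for illustration;
    the patent states the arrangement of the fields is not limited.
    """
    assert not (on_event and off_event)  # both may be 0, never both 1
    word = timestamp & 0xFFFFFFFF
    word = (word << 8) | (col & 0xFF)
    word = (word << 8) | (row & 0xFF)
    word = (word << 1) | int(on_event)
    word = (word << 1) | int(off_event)
    return word
```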
  • the map data MDT includes more information in each frame than the event signal EVS, and thus, a size and generation period of the map data MDT may be adjusted according to a user's settings.
  • the map data MDT may be generated based on 16 bits/pixel and 50 fps.
  • a size (c ⁇ d) of the map data MDT may be adjusted by cropping and event binning setting.
  • the interface circuit 140 may be implemented as an MIPI interface, and for example, a D-PHY interface, which is an interface between a camera and a display, may be used.
  • The map data MDT may be transmitted simultaneously with the periodic transmission of the event data EDT.
  • the map data MDT may be transmitted via a second virtual channel VC 2 .
  • the map data MDT may also be additionally transmitted without affecting transmission of the event data EDT.
  • FIGS. 8A through 8C are diagrams illustrating event binning of a vision sensor according to an example embodiment.
  • Binning refers to a data preprocessing technique of dividing raw data into smaller sections (bins) and replacing a value of the sections with a median value or the like.
  • In event binning, an event is determined to have occurred when the number of event signals generated in a region of a preset N×N size on a pixel array is greater than a threshold.
  • For example, each 4×4 region of a pixel array may be set as a binning region, and when two or more events have occurred in a binning region (the condition labeled "con"), it is determined that an event has occurred in that binning region.
  • n denotes pixels in which an off-event has occurred
  • p denotes pixels in which an on-event has occurred. For example, it may be determined that an off-event has occurred in a pixel in a first binning region b 1 , and an on-event has occurred in four pixels in a second binning region b 2 , and an off-event has occurred in three pixels of a third binning region b 3 , and one on-event has occurred in a pixel in a fourth binning region b 4 .
  • data obtained after applying event binning may include information indicating that no event has occurred in the first binning region b 1 , and an on-event has occurred in the second binning region b 2 , and an off-event has occurred in the third binning region b 3 , and no event has occurred in the fourth binning region b 4 .
  • By applying event binning of a 4×4 size to event signals generated in a pixel array of an 8×8 size, data of a 2×2 size may be obtained.
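  • A sketch of this 4×4 event binning using numpy follows, with the two-or-more-events condition from the example; counting on-events and off-events separately and letting the majority polarity decide a mixed bin is an assumption, since FIGS. 8A through 8C show only single-polarity bins.

```python
import numpy as np

def bin_events(on_map, off_map, n=4, threshold=2):
    """Reduce H x W boolean event maps to (H/n) x (W/n) binned polarities.

    Returns +1 (on-event), -1 (off-event), or 0 (no event) per bin.
    H and W must be multiples of n. The majority-polarity rule for mixed
    bins is an assumption where the text is silent.
    """
    h, w = on_map.shape
    on_counts = on_map.reshape(h // n, n, w // n, n).sum(axis=(1, 3))
    off_counts = off_map.reshape(h // n, n, w // n, n).sum(axis=(1, 3))
    totals = on_counts + off_counts
    binned = np.where(on_counts >= off_counts, 1, -1)
    return np.where(totals >= threshold, binned, 0)  # below threshold: no event
```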
  • FIG. 9 is a diagram for describing a method of generating a plurality of timestamp maps by using a vision sensor according to an example embodiment.
  • When the vision sensor 100 generates a timestamp map TSM, the number of event signal frames or a time period may be set. Referring to FIG. 9, timestamp map data is generated for every twelve event signal frames. The event signals here may be in an AER format.
  • the vision sensor 100 may generate a plurality of timestamp maps TSM, for example, first through third timestamp maps. In this case, periods of frames for generating the first through third timestamp maps may overlap each other.
  • the plurality of timestamp maps TSM may be generated based on identical event signal frames, but may differ in a timestamp reference, which is a reference time point of a timestamp map generation period.
  • A first timestamp map may be generated by generating timestamp map data 1 TSM1, accumulating a total of twelve event signal frames, from an nth event signal frame to an (n+11)th event signal frame included in a first map accumulate time window (Map 1 accumulate time window), with respect to a first timestamp reference frame TSR1.
  • A second timestamp map may be generated by generating timestamp map data 2 TSM2, accumulating a total of twelve event signal frames from an (n+4)th event signal frame included in a second map accumulate time window (Map 2 accumulate time window), with respect to a second timestamp reference frame TSR2.
  • A third timestamp map may be generated by generating timestamp map data 3 TSM3, accumulating a total of twelve event signal frames from an (n+8)th event signal frame included in a third map accumulate time window (Map 3 accumulate time window), with respect to a third timestamp reference frame TSR3.
  • That is, the timestamp reference differs for each map, but the event signal frames used in generating the timestamp maps may overlap each other.
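  • The staggered accumulation windows of FIG. 9 can be modeled as below; twelve-frame windows offset by four frames match the figure, while the generator form is merely one convenient way to express the schedule.

```python
def staggered_windows(frame_indices, window=12, stride=4):
    """Yield (reference_frame, frames_in_window) for overlapping timestamp maps.

    `frame_indices` is the sorted list of event signal frame numbers. With
    window=12 and stride=4, successive windows share eight frames, as in
    FIG. 9; the parameter values are the figure's example, not a requirement.
    """
    starts = range(frame_indices[0], frame_indices[-1] - window + 2, stride)
    for ref in starts:
        yield ref, list(range(ref, ref + window))

# For frames n..n+19, windows start at n, n+4, and n+8 (maps 1-3 of FIG. 9)
for ref, frames in staggered_windows(list(range(100, 120))):
    print(ref, frames[0], frames[-1])
```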
  • FIG. 10 is a diagram for describing a timestamp map generator in a vision sensor according to an example embodiment.
  • the timestamp map generator 133 may include an address generator 133 a and an offset calculator 133 b.
  • the address generator 133 a may determine an address WA to which information including the event signal EVS and the timestamp TS is to be recorded on a timestamp map TSM, based on a column address C_ADDR, a group address G_ADDR, an on-event PE, and/or an off-event NE.
  • the offset calculator 133 b generates the timestamp map TSM by using a frame counter FC indicating an order in which the timestamp TS and an event signal frame have occurred and a timestamp reference TSR. An operation of the offset calculator 133 b will be described in detail with reference to FIG. 11 .
  • FIG. 11 is a diagram for describing a method of reducing, by a vision sensor according to an example embodiment, data of a timestamp map.
  • the offset calculator 133 b may generate a timestamp map TSM including offset values, which are difference values between a reference time Ref and a timestamp. For example, when the reference time Ref is set, only an offset value from the reference time Ref with respect to a generated event signal EVS is stored in a timestamp map, and later when transmitting timestamp map data, the reference time Ref may be added to the offset value to be transmitted. Accordingly, the storage efficiency of the map data buffer 135 , which is a limited storage space, may be improved.
  • the offset calculator 133 b may set the reference time Ref based on information about generation of the event signal EVS.
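  • A sketch of this offset scheme: store only k = t - Ref in the map buffer and add Ref back when transmitting. The 8-bit offset width is an assumed example chosen to show the saving over a full 32-bit timestamp; the patent does not fix a width.

```python
import numpy as np

def store_offset(tsm_offsets, row, col, timestamp, ref):
    """Store only the offset k = timestamp - ref in the map buffer."""
    k = timestamp - ref
    assert 0 <= k < 256, "event outside the range representable from ref"
    tsm_offsets[row, col] = k

def restore_timestamp(tsm_offsets, row, col, ref):
    """On transmission, add the timestamp reference r back: t = r + k."""
    return ref + int(tsm_offsets[row, col])

# uint8 offsets instead of uint32 timestamps: a 4x smaller map buffer
tsm_offsets = np.zeros((480, 640), dtype=np.uint8)
```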
  • FIG. 12 is a diagram for describing a time when a timestamp map is generated and a time when an optical flow map is generated in a vision sensor according to an example embodiment.
  • A timestamp map or an optical flow map may be generated for every preset number of event signal frames. For example, referring to FIG. 12, a timestamp map may be generated immediately after eight event signal frames are generated, and an optical flow map may be generated based on the timestamp map after the timestamp map is generated. For example, the optical flow map may be generated immediately after the timestamp map and before the ninth event signal frame is generated. The timestamp map and the optical flow map may be generated periodically, each time eight event signal frames are generated.
  • An event signal here may be in an AER format.
  • FIGS. 13A and 13B are diagrams for describing generation of an optical flow map in a vision sensor according to an example embodiment.
  • a 5 ⁇ 5 size digital filter (z) having an appropriate filter coefficient with respect to a timestamp map (x) may be used.
  • dummy data (y) may be located at an edge of the timestamp map (x) to apply the digital filter (z).
  • a user may set a digital filter coefficient by considering a relationship between a kernel size and a filter coefficient.
  • The filters may be used to calculate an optical flow OF.
  • example embodiments are not limited thereto.
  • various algorithms other than a method of applying a filter may be used.
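  • As one concrete possibility among such algorithms, the sketch below applies the known time-surface gradient idea: convolve the timestamp map with derivative kernels, then take the motion direction from the gradient angle and the speed from the inverse gradient magnitude. The 3×3 Sobel pair standing in for the 5×5 filter (z), the zero padding standing in for the dummy data (y), and the 1/|∇T| speed estimate are all assumptions, not the patent's prescribed filter.

```python
import numpy as np
from scipy.signal import convolve2d

def optical_flow_from_tsm(tsm):
    """Estimate per-pixel flow direction and speed from a timestamp map.

    A Sobel pair stands in for the digital filter (z) of FIG. 13A; zero
    padding stands in for the dummy data (y) at the map's edge. Both
    choices, and the 1/|gradient| speed estimate, are assumptions.
    """
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    sobel_y = sobel_x.T
    # 'same' with zero fill pads the edges, mirroring the dummy data (y)
    gx = convolve2d(tsm, sobel_x, mode="same", boundary="fill", fillvalue=0)
    gy = convolve2d(tsm, sobel_y, mode="same", boundary="fill", fillvalue=0)
    direction = np.arctan2(gy, gx)            # motion direction per pixel
    mag = np.hypot(gx, gy)
    speed = np.where(mag > 0, 1.0 / np.where(mag > 0, mag, 1), 0.0)
    return direction, speed                   # together: the optical flow map
```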
  • the optical flow map OFM may be calculated and stored in a storage location of the timestamp map TSM.
  • the optical flow map OFM may be stored in that particular location, instead of the timestamp map TSM.
  • a size of a memory, for example, synchronous static random access memory (SRAM), in the map data buffer 135 may be more effectively managed in this manner.
  • FIG. 14 is a block diagram illustrating an example of an electronic device to which a vision sensor according to an example embodiment is applied.
  • an electronic device 1000 may include a vision sensor 1100 , an image sensor 1200 , a main processor 1300 , a working memory 1400 , a storage 1500 , a display device 1600 , a user interface 1700 , and a communicator 1800 .
  • the vision sensor 100 described with reference to FIGS. 1 through 13 may be applied as the vision sensor 1100 .
  • the vision sensor 1100 may sense an object to generate event signals, and generate a timestamp map (e.g., TSM in FIG. 1 ) or an optical flow map (e.g., OFM in FIG. 1 ), based on the generated event signals, and transmit data of at least one of an event signal (e.g., EVS of FIG. 2 ), a timestamp map TSM, or an optical flow map OFM to the main processor 1300 .
  • the image sensor 1200 may generate image data, such as raw image data, based on a received optical signal and provide the image data to the main processor 1300 .
  • the main processor 1300 may control the overall operation of the electronic device 1000 , and may detect a movement of an object by processing event data, that is, event signals received from the vision sensor 1100 . Also, an image frame may be received from the image sensor 1200 and image processing may be performed based on preset information. Similar to the processor 200 shown in FIG. 1 , the main processor 1300 may include an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a dedicated microprocessor, a microprocessor, a general purpose processor, or the like. According to an example embodiment, the main processor 1300 may include an application processor or an image processor.
  • the working memory 1400 may store data used for an operation of the electronic device 1000 .
  • the working memory 1400 may temporarily store packets or frames processed by the main processor 1300 .
  • the working memory 1400 may include a volatile memory such as dynamic random access memory (DRAM) and SRAM, and/or a non-volatile memory such as phase-change RAM (PRAM), magneto-resistive RAM (MRAM), resistive RAM (ReRAM), and ferro-electric RAM (FRAM).
  • The storage 1500 may store data whose storage is requested by the main processor 1300 or other components.
  • the storage 1500 may include a non-volatile memory such as flash memory, PRAM, MRAM, ReRAM, and FRAM.
  • the display device 1600 may include a display panel, a display driving circuit, and a display serial interface (DSI).
  • the display panel may be implemented as various devices such as a liquid crystal display (LCD) device, a light-emitting diode (LED) display, an organic LED (OLED) display, and an active matrix OLED (AMOLED) display.
  • the display driving circuit may include a timing controller, a source driver, and the like, needed to drive a display panel.
  • a DSI host embedded in the main processor 1300 may perform serial communication with the display panel through the DSI.
  • the user interface 1700 may include at least one of input interfaces such as a keyboard, a mouse, a keypad, a button, a touch panel, a touch screen, a touch pad, a touch ball, a gyroscope sensor, a vibration sensor, and an acceleration sensor.
  • the communicator 1800 may exchange signals with an external device/system through an antenna 1830 .
  • a transceiver 1810 and a modulator/demodulator (MODEM) 1820 of the communicator 1800 may process signals exchanged with external devices/systems according to wireless communication protocols such as Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WIMAX), Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Bluetooth, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), or Radio Frequency Identification (RFID).
  • Components of the electronic device 1000 may exchange data according to at least one of various interface protocols such as Universal Serial Bus (USB), Small Computer System Interface (SCSI), MIPI, Inter-Integrated Circuit (I2C), Peripheral Component Interconnect Express (PCIe), Mobile PCIe (M-PCIe), Advanced Technology Attachment (ATA), Parallel ATA (PATA), Serial ATA (SATA), Serial Attached SCSI (SAS), Integrated Drive Electronics (IDE), Enhanced IDE (EIDE), Nonvolatile Memory Express (NVMe), or Universal Flash Storage (UFS).
  • At least one of the components, elements, modules or units represented by a block in the drawings including FIGS. 3 and 4A may be embodied as various numbers of hardware, software and/or firmware structures that execute respective functions described above, according to an exemplary embodiment.
  • at least one of these units may use a direct circuit structure, such as a memory, a processor, a logic circuit, a look-up table, etc. that may execute the respective functions through controls of one or more microprocessors or other control apparatuses.
  • At least one of these units may be specifically embodied by a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions, and executed by one or more microprocessors or other control apparatuses.
  • at least one of these units may include or may be implemented by a processor such as a central processing unit (CPU) that performs the respective functions, a microprocessor, or the like. Two or more of these units may be combined into one single unit which performs all operations or functions of the combined two or more units. Also, at least part of functions of at least one of these units may be performed by another of these units.
  • Although a bus is not illustrated in the above block diagrams, communication between the units may be performed through the bus. Functional aspects of the above exemplary embodiments may be implemented in algorithms that execute on one or more processors.
  • the units represented by a block or processing steps may employ any number of related art techniques for electronics configuration, signal processing and/or control, data processing and the like.


Abstract

Provided is a vision sensor including a pixel array including a plurality of pixels disposed in a matrix form, an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, a map data processor configured to generate a timestamp map based on the event signals, and an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.

Description

    CROSS-REFERENCE TO THE RELATED APPLICATION
  • This application is based on and claims priority to Korean Patent Application No. 10-2020-0165943, filed on Dec. 1, 2020, in the Korean Intellectual Property Office, the disclosure of which is incorporated by reference herein in its entirety.
  • BACKGROUND
  • 1. Field
  • Example embodiments of the present disclosure relate to a vision sensor, and more particularly, to a vision sensor capable of generating and transmitting a timestamp map, an image processing device including the vision sensor, and an operating method of the vision sensor.
  • 2. Background of Related Art
  • A vision sensor, for example, a dynamic vision sensor, generates, upon occurrence of an event, for example, a variation in an intensity of light, information about the event, that is, an event signal, and transmits the event signal to a processor.
  • A dynamic vision sensor according to related art transmits only event data and a timestamp, and data processing such as writing a timestamp map by using the data is mainly performed using a processor located outside the dynamic vision sensor. An operation of writing a timestamp map by using raw data consumes resources of a host, and thus, a method of reducing the burden of the host needs to be researched.
  • SUMMARY
  • One or more example embodiments provide a vision sensor generating a timestamp map and an optical flow map based on event data and a timestamp, an image processing device including the vision sensor, and an operating method of the vision sensor.
  • According to an aspect of an example embodiment, there is provided a vision sensor including a pixel array including a plurality of pixels disposed in a matrix form, an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, a map data processor configured to generate a timestamp map based on the event signals, and an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
  • According to another aspect of an example embodiment, there is provided an image processing device including a vision sensor configured to generate a plurality of event signals corresponding to a plurality of pixels in which an event has occurred based on a movement of an object, from among the plurality of pixels included in a pixel array, generate a timestamp map based on polarity information, address information, and event occurrence time information of a pixel included in the plurality of event signals, and output vision sensor data including the event signals and the timestamp map, and a processor configured to detect the movement of the object by processing the vision sensor data output from the vision sensor.
  • According to an aspect of an example embodiment, there is provided an operating method of a vision sensor, the operating method including detecting whether an event has occurred in a plurality of pixels and generating event signals corresponding to pixels from among the plurality of pixels in which an event has occurred, generating vision sensor data including a timestamp map based on the event signals, and transmitting the vision sensor data to an external processor, wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and/or other aspects, features, and advantages of example embodiments will be more clearly understood from the following detailed description, taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a block diagram illustrating an image processing device according to an example embodiment;
  • FIG. 2 is a block diagram illustrating a vision sensor according to an example embodiment;
  • FIG. 3 is a block diagram illustrating an event detection circuit according to an example embodiment;
  • FIGS. 4A and 4B are block diagrams illustrating a map data module according to an example embodiment;
  • FIG. 5 is a flowchart of an operating method of a vision sensor according to an example embodiment;
  • FIG. 6 is a circuit diagram illustrating an implementation example of a pixel;
  • FIGS. 7A and 7B are diagrams illustrating a method of transmitting, by a vision sensor according to an example embodiment, data via an interface circuit;
  • FIGS. 8A, 8B, and 8C are diagrams illustrating event binning of a vision sensor according to an example embodiment;
  • FIG. 9 is a diagram for describing a method of generating a plurality of timestamp maps by using a vision sensor according to an example embodiment;
  • FIG. 10 is a diagram for describing a timestamp map generator in a vision sensor according to an example embodiment;
  • FIG. 11 is a diagram for describing a method of reducing, by a vision sensor according to an example embodiment, data of a timestamp map;
  • FIG. 12 is a diagram for describing a time when a timestamp map is generated and a time when an optical flow map is generated in a vision sensor according to an example embodiment;
  • FIGS. 13A and 13B are diagrams for describing generation of an optical flow map in a vision sensor according to an example embodiment; and
  • FIG. 14 is a block diagram illustrating an example of an electronic device to which a vision sensor according to an example embodiment is applied.
  • DETAILED DESCRIPTION
  • Hereinafter, various example embodiments will be described with reference to the attached drawings.
  • FIG. 1 is a block diagram illustrating an image processing device according to an example embodiment.
  • An image processing device 10 according to an example embodiment may be mounted in an electronic device having an image or light sensing function. For example, the image processing device 10 may be mounted in an electronic device such as a camera, a smartphone, a wearable device, an Internet of Things (IoT) device, a tablet personal computer (PC), a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a drone, an advanced driver-assistance system (ADAS), or the like. Also, the image processing device 10 may be included as a component of a vehicle, furniture, manufacturing equipment, a door, various measuring instruments, or the like.
  • Referring to FIG. 1, the image processing device 10 may include a vision sensor 100 and a processor 200. The vision sensor 100 detects a variation in an intensity of incident light and transmits vision sensor data VSD including at least one of an event signal EVS, a timestamp map TSM, and an optical flow map OFM, to the processor 200.
  • The vision sensor 100 may detect a variation in an intensity of incident light and output an event signal. The vision sensor 100 may be a dynamic vision sensor that outputs event signals EVS with respect to pixels in which a variation in light intensity is detected, that is, pixels in which an event has occurred. The variation in the intensity of light may be due to a movement of an object, of which an image is captured using the vision sensor 100, or due to a movement of the vision sensor 100 or the image processing device 10. The vision sensor 100 may transmit event signals EVS to the processor 200 periodically or non-periodically. The vision sensor 100 may transmit, to the processor 200, not only the event signals EVS including an address, a polarity, and a timestamp but also a timestamp map TSM or an optical flow map OFM generated based on the event signals EVS.
  • The vision sensor 100 may selectively transmit the event signals EVS to the processor 200. The vision sensor 100 may transmit, to the processor 200, those event signals EVS generated from pixels PX corresponding to a region of interest (ROI) set in the pixel array 110, from among event signals generated to correspond to the pixel array 110.
  • According to an example embodiment, the vision sensor 100 may apply crop or event binning to the event signals EVS. Also, the vision sensor 100 may generate a timestamp map TSM or an optical flow map OFM based on an event signal EVS. According to an example embodiment, the vision sensor 100 may selectively transmit event signals EVS, a timestamp map TSM, or an optical flow map OFM to the outside according to a transmission mode.
  • The processor 200 may process the event signals EVS received from the vision sensor 100 and detect a movement of an object, or a movement of an object in an image recognized by the image processing device 10, and may use an algorithm such as simultaneous localization and mapping (SLAM). The processor 200 may include an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a dedicated microprocessor, a microprocessor, a general purpose processor, or the like. According to an example embodiment, the processor 200 may include an application processor or an image processor.
  • In addition, the vision sensor 100 and the processor 200 may each be implemented as an integrated circuit (IC). For example, the vision sensor 100 and the processor 200 may be implemented as separate semiconductor chips, or may be implemented as a single chip. For example, the vision sensor 100 and the processor 200 may be implemented as a system on chip (SoC).
  • FIG. 2 is a block diagram illustrating a vision sensor according to an example embodiment.
  • Referring to FIG. 2, the vision sensor 100 may include the pixel array 110, an event detection circuit 120, a map data module 130, and an interface circuit 140.
  • The pixel array 110 may include a plurality of pixels PX arranged in a matrix form. Each of the pixels PX may detect events such as an increase or a decrease in an intensity of received light. For example, each of the pixels PX may be connected to the event detection circuit 120 via column lines extending in a column direction and row lines extending in a row direction. A signal indicating that an event has occurred and polarity information of an event, for example, whether the event is an on-event where a light intensity increases or an off-event where a light intensity decreases, may be output from a pixel PX in which the event has occurred, to the event detection circuit 120.
  • The event detection circuit 120 may read events from the pixel array 110 and process the events. The event detection circuit 120 may generate an event signal EVS including polarity information of the occurred event, an address of a pixel in which the event has occurred, and a timestamp. The event signal EVS may be generated in various formats, for example, in an address event representation (AER) format including address information, timestamp information, and polarity information of a pixel in which an event has occurred or a raw format including event occurrence information of all pixels.
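  • For illustration only, the two formats may be sketched as follows; the field names, types, and the raw-frame encoding below are assumptions, not the sensor's actual layout.
```python
from dataclasses import dataclass

# A minimal sketch of an event signal in the AER format described
# above; field names and types are illustrative assumptions.
@dataclass
class AerEvent:
    timestamp: int  # time at which the event occurred
    col_addr: int   # column address of the pixel
    row_addr: int   # row address of the pixel
    polarity: int   # +1 for an on-event, -1 for an off-event

# A raw-format frame, by contrast, carries event occurrence information
# for every pixel: 0 = no event, +1 = on-event, -1 = off-event.
def make_raw_frame(rows: int, cols: int) -> list[list[int]]:
    return [[0] * cols for _ in range(rows)]
```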
  • The event detection circuit 120 may process events that occurred in the pixel array 110, in units of pixels, in units of pixel groups including a plurality of pixels, in units of columns or in units of frames.
  • The map data module 130 may adjust the amount of the event signals EVS by post-processing, such as crop, which refers to setting an ROI in the pixel array with respect to the event signals EVS, or event binning, which sets a data amount, and may adjust the size of the timestamp map TSM and the optical flow map OFM that are generated. According to an example embodiment, the map data module 130 may generate map data MDT including a timestamp map or an optical flow map and output the map data MDT to the interface circuit 140, or output the event signals EVS received from the event detection circuit 120 to the interface circuit 140.
  • The interface circuit 140 may receive the event signals EVS and pieces of map data MDT, and transmit vision sensor data VSD to the processor 200 (FIG. 1) according to a set protocol. The interface circuit 140 may pack the event signals EVS and the pieces of the map data MDT into individual signal units, packet units, or frame units, according to a set protocol to generate vision sensor data VSD, and transmit the vision sensor data VSD to the processor 200. For example, the interface circuit 140 may include a mobile industry processor interface (MIPI).
  • Hereinafter, outputting the event signals EVS or pieces of map data MDT may refer to converting the event signals EVS or the pieces of map data MDT into vision sensor data VSD via the interface circuit 140 and transmitting the vision sensor data VSD to the processor 200.
  • FIG. 3 is a block diagram illustrating an event detection circuit according to an example embodiment.
  • Referring to FIG. 3, the event detection circuit 120 may include a voltage generator 121, a digital timing/AER generator (DTAG) 122, and an event signal processing (ESP) unit 123. The DTAG 122 may include a column AER generator and a row AER generator.
  • The voltage generator 121 may generate a voltage provided to the pixel array 110. For example, the voltage generator 121 may generate threshold voltages used to detect an on-event or an off-event from the pixel PX (FIG. 2) or bias voltages. The voltage generator 121 may change a voltage level of threshold voltages provided to pixels of an ROI, and may differently change voltage levels of the threshold voltages with respect to a plurality of ROIs.
  • The DTAG 122 may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, and generate a timestamp TS including information about a time when the event of the pixel PX has occurred, and an address ADDR including a column address, a row address, or a group address.
  • For example, the column AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a column request, and generate a column address C_ADDR of the pixel PX in which the event has occurred.
  • The row AER generator may receive, from the pixel PX in which an event has occurred, a signal indicating that the event has occurred, for example, a row request, and generate a row address R_ADDR of the pixel PX in which the event has occurred. Instead of the row AER generator generating a row address, a group address G_ADDR may also be generated in units of preset groups.
  • According to an example embodiment, the pixel array 110 may be scanned in units of columns, and when a request is received from a certain column, for example, a first column, the column AER generator may transmit a response signal to the first column. The pixel PX in which the event has occurred and which has received the response signal may transmit polarity information Pol, for example, a signal indicating the occurrence of an on-event or an off-event, to the row AER generator. Upon receiving the polarity information Pol, the row AER generator may transmit a reset signal to the pixel PX in which the event has occurred. In response to the reset signal, the pixel PX in which the event has occurred may be reset. The row AER generator may control a period at which a reset signal is generated. The row AER generator may generate information about a time when an event has occurred, that is, a timestamp TS.
  • Operations of the row AER generator and the column AER generator are described above by assuming that the pixel array 110 is scanned in units of columns. However, the operations of the row AER generator and the column AER generator are not limited thereto, and the row AER generator and the column AER generator may read, from the pixel PX in which an event has occurred, whether the event has occurred and polarity information Pol, in various manners. For example, the pixel array 110 may be scanned in units of rows, and the operations of the row AER generator and the column AER generator may be exchanged with each other, that is, the column AER generator may receive polarity information Pol and transmit a reset signal to the pixel array 110. Also, the row AER generator and the column AER generator may individually access the pixel PX in which the event has occurred.
  • The ESP unit 123 may generate an event signal EVS based on a column address C_ADDR, a row address R_ADDR, a group address G_ADDR, polarity information Pol, and a timestamp TS received from the DTAG 122. According to an example embodiment, the ESP unit 123 may remove a noise event and generate event signals EVS only with respect to valid events. For example, when the number of events that occurred for a certain period of time is less than a threshold, the ESP unit 123 may determine that the events are noise, and may not generate an event signal EVS with respect to the noise events.
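  • As a rough illustration of this noise criterion, the following sketch drops every event in a fixed time window whose event count falls below a threshold; the window length, threshold, and tuple layout are assumptions.
```python
from collections import defaultdict

def filter_noise(events, window_us=1000, min_count=3):
    """events: iterable of (timestamp_us, col, row, polarity) tuples.
    Groups events into fixed time windows and treats every window
    holding fewer than min_count events as noise."""
    bins = defaultdict(list)
    for ev in events:
        bins[ev[0] // window_us].append(ev)   # bucket by time window
    return [ev for evs in bins.values() if len(evs) >= min_count
            for ev in evs]
```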
  • FIGS. 4A and 4B are block diagrams illustrating a map data module according to an example embodiment.
  • Referring to FIG. 4A, the map data module 130 may include a crop unit 131, a binning unit 132, a timestamp map generator 133, and a map data buffer 135.
  • The crop unit 131 may set at least one ROI on a pixel array including a plurality of pixels, according to a user's settings or a preset algorithm. For example, the crop unit 131 may set a region in which multiple events occur, as an ROI, from among a plurality of regions set with respect to the pixel array 110, or set an arbitrary region corresponding to pixels in which multiple events occur during a certain period of time, as an ROI. According to another example embodiment, an arbitrary region that is set by a user arbitrarily or corresponds to pixels PX from which a certain object is sensed may be set as an ROI. However, the ROI is not limited thereto and may be set in various manners.
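  • A minimal sketch of the crop operation, assuming a single rectangular ROI given by inclusive pixel-address bounds; multiple ROIs could be handled by applying the same test per region.
```python
def crop(events, col_min, col_max, row_min, row_max):
    """Keep only events whose pixel address lies inside the ROI.
    events: iterable of (timestamp, col, row, polarity) tuples."""
    return [ev for ev in events
            if col_min <= ev[1] <= col_max and row_min <= ev[2] <= row_max]
```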
  • The binning unit 132 may adjust a data amount used in generating a map according to a user's settings or a preset algorithm. For example, the binning unit 132 may perform a binning operation on an ROI by grouping pixels included in a set ROI, in certain units.
  • The timestamp map generator 133 may generate a timestamp map TSM based on an event signal EVS adjusted using the crop unit 131 and the binning unit 132. For example, the timestamp map generator 133 may generate a timestamp map TSM only with respect to an ROI that is designated by cropping. Also, the timestamp map generator 133 may generate a timestamp map TSM having a size adjusted by binning.
  • The map data buffer 135 may store the timestamp map TSM generated using the timestamp map generator 133, and transmit the stored timestamp map TSM to the interface circuit 140 in the vision sensor 100 as map data MDT.
  • Referring to FIG. 4B, the map data module 130 may include the crop unit 131, the binning unit 132, the timestamp map generator 133, an optical flow map generator 134, and the map data buffer 135. The optical flow map generator 134 may generate an optical flow map OFM based on the timestamp map TSM stored in the map data buffer 135. The optical flow map OFM may be stored in the map data buffer 135 in which the timestamp map TSM is stored.
  • An optical flow may be a pattern of apparent motion of objects, surfaces, and edges in a visual scene caused by the relative motion between an observer and a scene, and may be a distribution of apparent velocities of movement of a brightness pattern in an image. According to example embodiments, an optical flow may be information about a direction and velocity in and at which a pixel in which an event signal is generated is varied.
  • FIG. 5 is a flowchart of an operating method of a vision sensor according to an example embodiment.
  • Referring to FIGS. 2 and 5, the event detection circuit 120 may detect whether an event has occurred in each of a plurality of pixels, and generate event signals EVS corresponding to pixels in which an event has occurred. The event detection circuit 120 may generate event signals EVS in an AER format including address information, timestamp information, and polarity information of a pixel in which an event has occurred or in a raw format including event occurrence information of all pixels.
  • The map data module 130 may receive the event signals EVS from the event detection circuit 120 (S110).
  • The map data module 130 may perform cropping, that is, selecting event signals generated in an ROI of the pixel array from among the event signals, according to a user's settings or a preset algorithm (S120).
  • The map data module 130 may perform event binning (S130). When the number of events that occurred in a preset N*N size-pixel is greater than a preset threshold value, the map data module 130 may determine that an event has occurred, and store an event signal EVS in an array for storing a timestamp map corresponding to the N*N size-pixel.
  • The map data module 130 may generate a timestamp map TSM (S140). The map data module 130 may generate a plurality of timestamp maps TSM having reference frames different from each other, based on event signal frames that are periodically generated.
  • For example, the map data module 130 may periodically generate a first timestamp reference and a first timestamp map based on a preset number of event signal frames counted from the first timestamp reference, and periodically generate a second timestamp reference and a second timestamp map based on a preset number of event signal frames counted from the second timestamp reference. For example, the first timestamp map may be generated by setting a first timestamp reference frame as a first event signal frame and accumulating up to an eleventh event signal frame, and the second timestamp map may be generated by setting a second timestamp reference frame as a fifth event signal frame and accumulating from the fifth event signal frame up to a fifteenth event signal frame. Three or more timestamp maps TSM may also be generated according to the occurrence of an event signal EVS or user's settings.
  • The map data module 130 may store an offset value of a timestamp based on a preset timestamp reference of a timestamp map TSM, and set the timestamp reference of the timestamp map TSM based on event occurrence information. When transmitting the timestamp map TSM to another device later, the map data module 130 may transmit raw data by adding a timestamp reference value corresponding to the timestamp map TSM. For example, when an event generation time is r+k, a timestamp reference may be set as r, and an offset value of a timestamp may be stored as k. When transmitting the timestamp map to another device after the timestamp map is generated, an r+k value obtained by adding the timestamp reference r to the offset value k may be transmitted.
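  • A minimal sketch of this offset scheme, assuming per-pixel storage and integer timestamps; all names are illustrative.
```python
class OffsetTimestampMap:
    """Stores per-pixel timestamp offsets k relative to a timestamp
    reference r, so each map entry needs fewer bits than a full
    timestamp; r + k is reconstructed only on transmission."""

    def __init__(self, rows: int, cols: int, reference: int):
        self.reference = reference                    # reference r
        self.offsets = [[None] * cols for _ in range(rows)]

    def record(self, row: int, col: int, timestamp: int) -> None:
        self.offsets[row][col] = timestamp - self.reference  # store k

    def absolute(self, row: int, col: int):
        k = self.offsets[row][col]
        return None if k is None else self.reference + k     # r + k
```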
  • The map data module 130 may generate an optical flow map OFM by estimating, from the timestamp map TSM according to a preset algorithm, an optical flow including information about a direction and velocity in and at which a pixel in which an event signal is generated is varied. The map data module 130 may store the optical flow map OFM in a location of the map data buffer 135 in which the timestamp map TSM is stored, instead of the timestamp map TSM. Accordingly, a map data buffer size may be more effectively controlled.
  • The interface circuit 140 may transmit vision sensor data VSD including the event signal EVS, the timestamp map TSM, or the optical flow map OFM, to an external processor (S150). The interface circuit 140 may selectively transmit each piece of data according to an output mode.
  • For example, in a first output mode, the interface circuit 140 may output only an event signal included in the vision sensor data VSD to an external processor. In a second output mode, the interface circuit 140 may generate at least one virtual channel and transmit an event signal to an external processor via a first virtual channel, and transmit timestamp map data to an external processor via a second virtual channel. In a third output mode, the interface circuit 140 may generate at least one virtual channel, and transmit an event signal to an external processor via the first virtual channel, and transmit optical flow map data to the external processor via the second virtual channel.
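  • The mode selection may be sketched as follows, with virtual-channel transmission reduced to tagged tuples; the channel names and mode encoding are assumptions, and the real interface packs these streams into protocol packets.
```python
def build_output(mode: int, event_signals, tsm=None, ofm=None):
    """Select what the interface circuit transmits per output mode."""
    if mode == 1:    # first output mode: event signals only
        return [("VC1", event_signals)]
    if mode == 2:    # second output mode: events plus timestamp map
        return [("VC1", event_signals), ("VC2", tsm)]
    if mode == 3:    # third output mode: events plus optical flow map
        return [("VC1", event_signals), ("VC2", ofm)]
    raise ValueError("unknown output mode")
```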
  • FIG. 6 is a circuit diagram illustrating an implementation example of a pixel.
  • Referring to FIG. 6, a pixel PX may include a photoelectric conversion device PD, an amplifier 111, a first comparator 112, a second comparator 113, an on-event holder 114, an off-event holder 115, and a reset switch SW. The pixel PX may further include a capacitor for removing noise generated in the pixel PX or coming from the outside or various switches.
  • The photoelectric conversion device PD may convert incident light, that is, an optical signal, into an electrical signal, for example, an electric current. The photoelectric conversion device PD may include, for example, a photodiode, a phototransistor, a photogate, a pinned photodiode, or the like. The photoelectric conversion device PD may generate an electrical signal having a higher level as an intensity of incident light increases.
  • The amplifier 111 may convert a received current into a voltage and amplify a voltage level. An output voltage of the amplifier 111 may be provided to the first comparator 112 and the second comparator 113.
  • The first comparator 112 may compare an output voltage Vout of the amplifier 111 with an on-threshold voltage TH1, and generate an on signal E_ON based on a comparison result. The second comparator 113 may compare an output voltage Vout of the amplifier 111 with an off-threshold voltage TH2, and generate an off signal E_OFF based on a comparison result. The first comparator 112 and the second comparator 113 may generate an on signal E_ON or an off signal E_OFF when a variation amount of light received by the photoelectric conversion device PD is equal to or greater than a certain level of variation.
  • For example, the on signal E_ON may be at a high level when a light amount received by the photoelectric conversion device PD increases to a certain level or more, and the off signal E_OFF may be at a high level when a light amount received by the photoelectric conversion device PD is reduced to a certain level or less. The on-event holder 114 and the off-event holder 115 may respectively hold the on signal E_ON and the off signal E_OFF and then output the same. When the pixel PX is scanned, the on signal E_ON and the off signal E_OFF may be output. As described above, when light sensitivity is adjusted, the levels of the on-threshold voltage TH1 and the off-threshold voltage TH2 may be modified. For example, to reduce the light sensitivity, the level of the on-threshold voltage TH1 may be increased and the level of the off-threshold voltage TH2 may be reduced. Accordingly, the first comparator 112 and the second comparator 113 may generate the on signal E_ON or the off signal E_OFF only when the variation in light received by the photoelectric conversion device PD is greater than before the levels of the on-threshold voltage TH1 and the off-threshold voltage TH2 were modified.
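  • A behavioral sketch of this comparator pair, with the analog voltages reduced to plain numbers; the threshold values below are assumptions chosen only to show how widening the band lowers the sensitivity.
```python
def detect_event(v_out: float, th_on: float, th_off: float):
    """Compare the amplifier output against the two thresholds."""
    if v_out >= th_on:
        return "on"    # E_ON: light amount increased enough
    if v_out <= th_off:
        return "off"   # E_OFF: light amount decreased enough
    return None        # variation too small: no event

# Widening the threshold band (reduced sensitivity) suppresses a
# variation that a narrower band would have reported as an on-event.
assert detect_event(0.4, th_on=0.3, th_off=-0.3) == "on"
assert detect_event(0.4, th_on=0.5, th_off=-0.5) is None
```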
  • FIGS. 7A and 7B are diagrams illustrating a method of transmitting, by a vision sensor according to an example embodiment, data via an interface circuit.
  • Referring to FIG. 7A, event data may be generated based on an event signal EVS generated based on, for example, 2 bits/pixel and 1000 fps, and a pixel array size (a×b). A packet including at least one event signal EVS may be output from the interface circuit 140 as event data. A packet may include a timestamp, a column address, a row address, and polarity information of an event signal, and an arrangement order thereof is not limited. A header indicating a start of a packet may be added to a front end of the packet, and a tail indicating an end of the packet may be added to a back end of the packet.
  • A timestamp may include information about a time when an event has occurred. For example, a timestamp may include 32 bits, but is not limited thereto.
  • A column address and a row address may each include a plurality of bits, for example, 8 bits. In this case, a vision sensor including a plurality of pixels arranged in up to 2^8 (256) rows and up to 2^8 (256) columns may be supported. However, this is an example, and the number of bits of a column address and a row address may vary according to the number of pixels.
  • Polarity information may include information about an on-event and an off-event. For example, the polarity information may include 1 bit including information about whether an on-event has occurred and 1 bit including information about whether an off-event has occurred. For example, a bit indicating an on-event and a bit indicating an off-event may not be both ‘1’ but may be both ‘0.’
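  • Under the field widths above, one packet layout could be sketched as follows; the field order and the 4-bit header and tail marker values are assumptions, since the arrangement order is not limited.
```python
HEADER, TAIL = 0xA, 0x5  # hypothetical 4-bit frame markers

def pack_event(timestamp: int, col: int, row: int,
               on_event: bool, off_event: bool) -> int:
    """Pack one event into a single integer: header | 32-bit timestamp
    | 8-bit column address | 8-bit row address | 2 polarity bits | tail."""
    assert not (on_event and off_event)  # bits may both be 0, never both 1
    pol = (int(on_event) << 1) | int(off_event)
    word = HEADER
    word = (word << 32) | (timestamp & 0xFFFFFFFF)
    word = (word << 8) | (col & 0xFF)
    word = (word << 8) | (row & 0xFF)
    word = (word << 2) | pol
    word = (word << 4) | TAIL
    return word
```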
  • The map data MDT includes more information in each frame than the event signal EVS, and thus, a size and generation period of the map data MDT may be adjusted according to a user's settings. For example, the map data MDT may be generated based on 16 bits/pixel and 50 fps. A size (c×d) of the map data MDT may be adjusted by cropping and event binning setting.
  • Referring to FIG. 7B, according to an example embodiment, the interface circuit 140 may be implemented as an MIPI interface, and for example, a D-PHY interface, which is an interface between a camera and a display, may be used. Here, by using a plurality of virtual channels VC, the map data MDT may be transmitted simultaneously with the periodic transmission of the event data EDT. For example, as illustrated in FIG. 7B, while event data EDT is periodically transmitted via a first virtual channel VC1, the map data MDT may be transmitted via a second virtual channel VC2. By appropriately setting a size and transmission period of the map data MDT, the map data MDT may be additionally transmitted without affecting transmission of the event data EDT.
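  • A back-of-the-envelope check of that claim, using the example rates above (2 bits/pixel at 1000 fps for event data, 16 bits/pixel at 50 fps for map data) and assumed array and map sizes:
```python
def mbps(width: int, height: int, bits_per_pixel: int, fps: int) -> float:
    """Raw data rate in Mbit/s for one stream."""
    return width * height * bits_per_pixel * fps / 1e6

event_rate = mbps(640, 480, 2, 1000)  # hypothetical a x b = 640 x 480
map_rate = mbps(160, 120, 16, 50)     # hypothetical c x d after crop/binning
print(f"event data: {event_rate:.1f} Mbit/s, map data: {map_rate:.2f} Mbit/s")
# 614.4 vs 15.36 Mbit/s: the map stream adds only a few percent here,
# so it can ride a second virtual channel without disturbing the
# event stream.
```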
  • FIGS. 8A through 8C are diagrams illustrating event binning of a vision sensor according to an example embodiment.
  • Binning refers to a data preprocessing technique of dividing raw data into smaller sections (bins) and replacing the value of each section with a median value or the like. In the present disclosure, according to event binning, an event is determined to have occurred when the number of event signals generated on a pixel array of a preset N×N size is greater than a threshold. Referring to FIG. 8A, each 4×4-size region of a pixel array may be set as a binning region, and when two or more events have occurred in a binning region (con), it is determined that an event has occurred in that binning region.
  • Referring to FIG. 8B, events that have occurred in a pixel array of an 8×8 size are shown. n denotes pixels in which an off-event has occurred, and p denotes pixels in which an on-event has occurred. For example, it may be determined that an off-event has occurred in a pixel in a first binning region b1, and an on-event has occurred in four pixels in a second binning region b2, and an off-event has occurred in three pixels of a third binning region b3, and one on-event has occurred in a pixel in a fourth binning region b4.
  • Referring to FIG. 8C, data obtained after applying event binning may include information indicating that no event has occurred in the first binning region b1, and an on-event has occurred in the second binning region b2, and an off-event has occurred in the third binning region b3, and no event has occurred in the fourth binning region b4. By applying event binning in a 4×4 size, to an event signal generated in a pixel array of an 8×8 size, data of a 2×2 size may be obtained.
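  • The reduction in FIGS. 8A through 8C may be sketched as follows; the rule that the majority polarity decides the binned value is an assumption for illustration, since the figures do not show a tie case.
```python
def bin_events(frame, n=4, threshold=2):
    """Reduce an event frame by event binning: each n x n region yields
    an event only when at least `threshold` events occurred there.
    Cell values: +1 on-event, -1 off-event, 0 no event."""
    size = len(frame)
    out = [[0] * (size // n) for _ in range(size // n)]
    for bi in range(size // n):
        for bj in range(size // n):
            cells = [frame[bi * n + i][bj * n + j]
                     for i in range(n) for j in range(n)]
            events = [c for c in cells if c != 0]
            if len(events) >= threshold:
                # majority polarity wins (an illustrative assumption)
                out[bi][bj] = 1 if sum(events) > 0 else -1
    return out

# An 8x8 frame binned in 4x4 regions yields 2x2 data, as in FIG. 8C.
```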
  • FIG. 9 is a diagram for describing a method of generating a plurality of timestamp maps by using a vision sensor according to an example embodiment.
  • When the vision sensor 100 generates a timestamp map TSM, the number of event signal frames or a time period may be set. Referring to FIG. 9, timestamp map data is generated for every twelve event signal frames. An event signal here may be in an AER format.
  • The vision sensor 100 may generate a plurality of timestamp maps TSM, for example, first through third timestamp maps. In this case, periods of frames for generating the first through third timestamp maps may overlap each other. The plurality of timestamp maps TSM may be generated based on identical event signal frames, but may differ in a timestamp reference, which is a reference time point of a timestamp map generation period.
  • For example, when generating three timestamp maps TSM, a first timestamp map may be generated by generating timestamp map data 1 TSM1 by accumulating from an nth event signal frame to a (n+11)th event signal frame included in a first map accumulate time window (Map 1 accumulate time window), a total of twelve event signal frames, with respect to a first timestamp reference frame TSR1. A second timestamp map may be generated by generating timestamp map data 2 TSM2 by accumulating a total of twelve event signal frames, from an (n+4)th event signal frame included in a second map accumulate time window (Map 2 accumulate time window), with respect to a second timestamp reference frame TSR2. A third timestamp map may be generated by generating timestamp map data 3 TSM3 by accumulating a total of twelve event signal frames, from an (n+8)th event signal frame included in a third map accumulate time window (Map 3 accumulate time window), with respect to a third timestamp reference frame TSR3. In the timestamp maps, a timestamp reference used as a reference differs, but event signal frames used in generating the timestamp maps may overlap each other.
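  • The staggered windows of FIG. 9 may be sketched as below, with frames represented by their indices; the window length and stride follow the twelve-frame and four-frame figures above.
```python
WINDOW = 12  # event signal frames accumulated per timestamp map
STRIDE = 4   # frames between consecutive timestamp reference frames

def accumulation_windows(n: int, num_maps: int = 3):
    """Return (reference_frame, accumulated_frames) for each map,
    starting from event signal frame n."""
    return [(n + m * STRIDE,
             list(range(n + m * STRIDE, n + m * STRIDE + WINDOW)))
            for m in range(num_maps)]

# Map 1 spans frames n..n+11, map 2 spans n+4..n+15, and map 3 spans
# n+8..n+19: the timestamp references differ but the frames overlap.
```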
  • FIG. 10 is a diagram for describing a timestamp map generator in a vision sensor according to an example embodiment.
  • Referring to FIG. 10, the timestamp map generator 133 may include an address generator 133a and an offset calculator 133b. When an event is sensed in the pixel array 110 and an event signal EVS is generated, the address generator 133a may determine an address WA at which information including the event signal EVS and the timestamp TS is to be recorded on a timestamp map TSM, based on a column address C_ADDR, a group address G_ADDR, an on-event PE, and/or an off-event NE. The offset calculator 133b generates the timestamp map TSM by using the timestamp TS, a frame counter FC indicating an order in which an event signal frame has occurred, and a timestamp reference TSR. An operation of the offset calculator 133b will be described in detail with reference to FIG. 11.
  • FIG. 11 is a diagram for describing a method of reducing, by a vision sensor according to an example embodiment, data of a timestamp map.
  • Referring to FIGS. 10 and 11, the offset calculator 133b may generate a timestamp map TSM including offset values, which are difference values between a reference time Ref and a timestamp. For example, when the reference time Ref is set, only the offset value from the reference time Ref with respect to a generated event signal EVS is stored in the timestamp map, and later, when transmitting timestamp map data, the reference time Ref may be added to the offset value to be transmitted. Accordingly, the storage efficiency of the map data buffer 135, which is a limited storage space, may be improved. The offset calculator 133b may set the reference time Ref based on information about generation of the event signal EVS.
  • FIG. 12 is a diagram for describing a time when a timestamp map is generated and a time when an optical flow map is generated in a vision sensor according to an example embodiment.
  • When a map generation period is set, a timestamp map or an optical flow map may be generated for every preset number of event signal frames. For example, referring to FIG. 12, a timestamp map may be generated immediately after eight event signal frames are generated, and an optical flow map may be generated based on the timestamp map after the timestamp map is generated. For example, the optical flow map may be generated immediately after the timestamp map, before the ninth event signal frame is generated. The timestamp map and the optical flow map may be periodically generated each time eight event signal frames are generated. An event signal here may be in an AER format.
  • FIGS. 13A and 13B are diagrams for describing generation of an optical flow map in a vision sensor according to an example embodiment.
  • For example, to generate an optical flow map, a 5×5-size digital filter (z) having an appropriate filter coefficient with respect to a timestamp map (x) may be used. Referring to FIG. 13A, dummy data (y) may be located at an edge of the timestamp map (x) to apply the digital filter (z). A user may set a digital filter coefficient by considering a relationship between a kernel size and a filter coefficient. Various forms of filters may be used to calculate an optical flow OF. However, example embodiments are not limited thereto. For example, various algorithms other than a method of applying a filter may be used.
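  • A sketch of this filtering step, padding the edge of a toy timestamp map with dummy data so the 5×5 kernel applies at every position; the kernel coefficients below are placeholders, not the coefficients a user would actually choose.
```python
import numpy as np

def optical_flow_map(tsm: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Apply a square digital filter to a timestamp map, padding the
    borders with dummy zeros as in FIG. 13A."""
    k = kernel.shape[0] // 2                   # 2 for a 5x5 kernel
    padded = np.pad(tsm, k, mode="constant")   # dummy border data (y)
    ofm = np.zeros_like(tsm, dtype=float)
    for i in range(tsm.shape[0]):
        for j in range(tsm.shape[1]):
            window = padded[i:i + 2 * k + 1, j:j + 2 * k + 1]
            ofm[i, j] = float(np.sum(window * kernel))
    return ofm

tsm = np.arange(64, dtype=float).reshape(8, 8)  # toy timestamp map (x)
kernel = np.ones((5, 5)) / 25.0                 # placeholder coefficients (z)
ofm = optical_flow_map(tsm, kernel)             # same shape as tsm
```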
  • When the timestamp map TSM is completed, an optical flow with respect to the timestamp map TSM may be calculated. Referring to FIG. 13B, the optical flow map OFM may be calculated and stored in a storage location of the timestamp map TSM. For example, when a timestamp map TSM is generated with respect to certain data on which crop or binning is performed, and the timestamp map TSM is stored at a particular location of a map data buffer, the optical flow map OFM may be stored in that particular location, instead of the timestamp map TSM. A size of a memory, for example, synchronous static random access memory (SRAM), in the map data buffer 135 may be more effectively managed in this manner.
  • FIG. 14 is a block diagram illustrating an example of an electronic device to which a vision sensor according to an example embodiment is applied.
  • Referring to FIG. 14, an electronic device 1000 may include a vision sensor 1100, an image sensor 1200, a main processor 1300, a working memory 1400, a storage 1500, a display device 1600, a user interface 1700, and a communicator 1800.
  • The vision sensor 100 described with reference to FIGS. 1 through 13 may be applied as the vision sensor 1100. The vision sensor 1100 may sense an object to generate event signals, and generate a timestamp map (e.g., TSM in FIG. 1) or an optical flow map (e.g., OFM in FIG. 1), based on the generated event signals, and transmit data of at least one of an event signal (e.g., EVS of FIG. 2), a timestamp map TSM, or an optical flow map OFM to the main processor 1300.
  • The image sensor 1200 may generate image data, such as raw image data, based on a received optical signal and provide the image data to the main processor 1300.
  • The main processor 1300 may control the overall operation of the electronic device 1000, and may detect a movement of an object by processing event data, that is, event signals received from the vision sensor 1100. Also, the main processor 1300 may receive an image frame from the image sensor 1200 and perform image processing based on preset information. Similar to the processor 200 shown in FIG. 1, the main processor 1300 may include an application specific integrated circuit (ASIC), a field-programmable gate array (FPGA), a dedicated microprocessor, a microprocessor, a general purpose processor, or the like. According to an example embodiment, the main processor 1300 may include an application processor or an image processor.
  • The working memory 1400 may store data used for an operation of the electronic device 1000. For example, the working memory 1400 may temporarily store packets or frames processed by the main processor 1300. For example, the working memory 1400 may include a volatile memory such as dynamic random access memory (DRAM) and SRAM, and/or a non-volatile memory such as phase-change RAM (PRAM), magneto-resistive RAM (MRAM), resistive RAM (ReRAM), and ferro-electric RAM (FRAM).
  • The storage 1500 may store data, of which storage is requested from the main processor 1300 or other components. The storage 1500 may include a non-volatile memory such as flash memory, PRAM, MRAM, ReRAM, and FRAM.
  • The display device 1600 may include a display panel, a display driving circuit, and a display serial interface (DSI). For example, the display panel may be implemented as various devices such as a liquid crystal display (LCD) device, a light-emitting diode (LED) display, an organic LED (OLED) display, and an active matrix OLED (AMOLED) display. The display driving circuit may include a timing controller, a source driver, and the like, needed to drive a display panel. A DSI host embedded in the main processor 1300 may perform serial communication with the display panel through the DSI.
  • The user interface 1700 may include at least one input interface such as a keyboard, a mouse, a keypad, a button, a touch panel, a touch screen, a touch pad, a touch ball, a gyroscope sensor, a vibration sensor, or an acceleration sensor.
  • The communicator 1800 may exchange signals with an external device/system through an antenna 1830. A transceiver 1810 and a modulator/demodulator (MODEM) 1820 of the communicator 1800 may process signals exchanged with external devices/systems according to wireless communication protocols such as Long Term Evolution (LTE), Worldwide Interoperability for Microwave Access (WIMAX), Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), Bluetooth, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), or Radio Frequency Identification (RFID).
  • Components of the electronic device 1000, for example, the vision sensor 1100, the image sensor 1200, the main processor 1300, the working memory 1400, the storage 1500, the display device 1600, the user interface 1700, and the communicator 1800, may exchange data according to at least one of various interface protocols such as Universal Serial Bus (USB), Small Computer System Interface (SCSI), MIPI, Inter-Integrated Circuit (I2C), Peripheral Component Interconnect Express (PCIe), Mobile PCIe (M-PCIe), Advanced Technology Attachment (ATA), Parallel ATA (PATA), Serial ATA (SATA), Serial Attached SCSI (SAS), Integrated Drive Electronics (IDE), Enhanced IDE (EIDE), Nonvolatile Memory Express (NVMe), or Universal Flash Storage (UFS).
  • At least one of the components, elements, modules or units (collectively “units” in this paragraph) represented by a block in the drawings including FIGS. 3 and 4A may be embodied as various numbers of hardware, software and/or firmware structures that execute respective functions described above, according to an exemplary embodiment. For example, at least one of these units may use a direct circuit structure, such as a memory, a processor, a logic circuit, a look-up table, etc. that may execute the respective functions through controls of one or more microprocessors or other control apparatuses. Also, at least one of these units may be specifically embodied by a module, a program, or a part of code, which contains one or more executable instructions for performing specified logic functions, and executed by one or more microprocessors or other control apparatuses. Further, at least one of these units may include or may be implemented by a processor such as a central processing unit (CPU) that performs the respective functions, a microprocessor, or the like. Two or more of these units may be combined into one single unit which performs all operations or functions of the combined two or more units. Also, at least part of functions of at least one of these units may be performed by another of these units. Further, although a bus is not illustrated in the above block diagrams, communication between the units may be performed through the bus. Functional aspects of the above exemplary embodiments may be implemented in algorithms that execute on one or more processors. Furthermore, the units represented by a block or processing steps may employ any number of related art techniques for electronics configuration, signal processing and/or control, data processing and the like.
  • While example embodiments have been illustrated and described above, it will be apparent to those skilled in the art that modifications and variations could be made without departing from the scope as defined by the appended claims and their equivalents.

Claims (20)

What is claimed is:
1. A vision sensor comprising:
a pixel array comprising a plurality of pixels disposed in a matrix form;
an event detection circuit configured to detect whether an event has occurred in the plurality of pixels and generate event signals corresponding to pixels from among the plurality of pixels in which an event has occurred;
a map data processor configured to generate a timestamp map based on the event signals; and
an interface circuit configured to transmit vision sensor data including at least one of the event signals and the timestamp map to an external processor,
wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal corresponding to the pixel.
2. The vision sensor of claim 1, wherein the event signals are generated in an address-event representation (AER) format including address information, timestamp information, and polarity information of the pixels in which an event has occurred or a raw format including event occurrence information of all of the plurality of pixels.
3. The vision sensor of claim 1, wherein the map data processor is further configured to select event signals generated in a region of interest (ROI) of the pixel array, from among the event signals.
4. The vision sensor of claim 1, wherein, based on a number of events that occurred in a preset N*N size-pixel being greater than a preset threshold value, the map data processor is further configured to determine that an event has occurred in the N*N size-pixel, and store an event signal corresponding to the N*N size-pixel.
5. The vision sensor of claim 1, wherein the interface circuit is further configured to:
in a first output mode, output an event signal included in the vision sensor data to the external processor;
in a second output mode, generate at least one virtual channel, transmit the event signal to the external processor via a first virtual channel, and transmit the timestamp map data to the external processor via a second virtual channel; and
in a third output mode, generate at least one virtual channel, transmit the event signal to the external processor via the first virtual channel, and transmit optical flow map data to the external processor via the second virtual channel.
6. The vision sensor of claim 1, wherein the map data processor is further configured to receive event signals periodically generated in each of frames, and generate a plurality of timestamp maps based on different frame periods.
7. The vision sensor of claim 6, wherein the map data processor is further configured to periodically generate a first timestamp reference frame and a first timestamp map based on a preset number of event signal frames from the first timestamp reference frame, and periodically generate a second timestamp reference frame and a second timestamp map based on a preset number of event signal frames from the second timestamp reference frame, and
wherein the first timestamp map and the second timestamp map have overlapping event signal frame periods for generating the first timestamp map and the second timestamp map.
8. The vision sensor of claim 1, wherein the timestamp map is configured to store an offset value of a timestamp based on a preset timestamp reference.
9. The vision sensor of claim 8, wherein the timestamp map is configured to set a timestamp reference based on event occurrence information.
10. The vision sensor of claim 1, wherein the map data processor is further configured to generate an optical flow map by estimating an optical flow including information about a direction and a velocity in and at which a pixel in which an event signal is generated is varied, from the timestamp map, based on a preset algorithm.
11. The vision sensor of claim 10, wherein the map data processor is further configured to store the optical flow map in a location of a map data buffer, in which the timestamp map is stored, instead of the timestamp map.
12. An image processing device comprising:
a vision sensor configured to:
generate a plurality of event signals corresponding to a plurality of pixels in which an event has occurred based on a movement of an object, from among the plurality of pixels included in a pixel array;
generate a timestamp map based on polarity information, address information, and event occurrence time information of a pixel included in the plurality of event signals, and output vision sensor data including the event signals and the timestamp map; and
a processor configured to detect the movement of the object by processing the vision sensor data output from the vision sensor.
13. The image processing device of claim 12, wherein the vision sensor is further configured to:
generate an optical flow map by estimating an optical flow including information about a direction and a velocity in and at which a pixel in which an event signal is generated is varied, from the timestamp map based on a preset algorithm; and
replace the generated optical flow map in a location of a map data buffer in which the timestamp map is stored.
14. The image processing device of claim 12, wherein the vision sensor is further configured to generate at least one virtual channel, transmit the event signals to the processor via a first virtual channel, and transmit the timestamp map data or optical flow map data to the processor via a second virtual channel.
15. The image processing device of claim 12, wherein, based on a number of events that occurred in a preset N*N size-pixel being greater than a preset threshold value, the vision sensor determines that an event has occurred.
16. An operating method of a vision sensor, the operating method comprising:
detecting whether an event has occurred in a plurality of pixels and generating event signals corresponding to pixels from among the plurality of pixels in which an event has occurred;
generating vision sensor data including a timestamp map based on the event signals; and
transmitting the vision sensor data to an external processor,
wherein the timestamp map includes timestamp information indicating polarity information, address information, and an event occurrence time of a pixel included in an event signal.
17. The operating method of claim 16, wherein the event signals are generated in an address-event representation (AER) format including address information, timestamp information, and polarity information of the pixels in which an event has occurred or a raw format including event occurrence information of all of the plurality of pixels.
18. The operating method of claim 16, further comprising selecting event signals generated in a region of interest (ROI) of a pixel array including the plurality of pixels, from among the event signals.
19. The operating method of claim 16, wherein, based on a number of events that occurred in a preset N*N size-pixel being greater than a preset threshold value, the vision sensor determines that an event has occurred in the preset N*N size-pixel.
20. The operating method of claim 16, further comprising generating an optical flow map by estimating an optical flow including information about a direction and a velocity in and at which a pixel in which an event signal is generated is varied, from the timestamp map based on a preset algorithm.
US17/515,755 2020-12-01 2021-11-01 Vision sensor, image processing device including the same, and operating method of the vision sensor Pending US20220172375A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/120,251 US20230217123A1 (en) 2020-12-01 2023-03-10 Vision sensor, image processing device including the same, and operating method of the vision sensor

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020200165943A KR20220076944A (en) 2020-12-01 2020-12-01 Vision sensor, image processing device comprising thereof and operating method of vision sensor
KR10-2020-0165943 2020-12-01

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US18/120,251 Continuation US20230217123A1 (en) 2020-12-01 2023-03-10 Vision sensor, image processing device including the same, and operating method of the vision sensor

Publications (1)

Publication Number Publication Date
US20220172375A1 true US20220172375A1 (en) 2022-06-02

Family

ID=81752846

Family Applications (2)

Application Number Title Priority Date Filing Date
US17/515,755 Pending US20220172375A1 (en) 2020-12-01 2021-11-01 Vision sensor, image processing device including the same, and operating method of the vision sensor
US18/120,251 Pending US20230217123A1 (en) 2020-12-01 2023-03-10 Vision sensor, image processing device including the same, and operating method of the vision sensor

Family Applications After (1)

Application Number Title Priority Date Filing Date
US18/120,251 Pending US20230217123A1 (en) 2020-12-01 2023-03-10 Vision sensor, image processing device including the same, and operating method of the vision sensor

Country Status (3)

Country Link
US (2) US20220172375A1 (en)
KR (1) KR20220076944A (en)
CN (1) CN114584721A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117372941A (en) * 2022-06-30 2024-01-09 清华大学 Event data processing method and related equipment

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130335595A1 (en) * 2012-06-19 2013-12-19 Samsung Electronics Co., Ltd. Event-based image processing apparatus and method
EP2677500A2 (en) * 2012-06-19 2013-12-25 Samsung Electronics Co., Ltd Event-based image processing apparatus and method
US20160078321A1 (en) * 2014-09-16 2016-03-17 Qualcomm Incorporated Interfacing an event based system with a frame based processing system
US20170213105A1 (en) * 2016-01-27 2017-07-27 Zhengping Ji Method and apparatus for event sampling of dynamic vision sensor on image formation
US20180262705A1 (en) * 2017-03-08 2018-09-13 Samsung Electronics Co., Ltd. Image processing device configured to regenerate timestamp and electronic device including the same
US20180295298A1 (en) * 2017-04-06 2018-10-11 Samsung Electronics Co., Ltd. Intensity image acquisition from dynamic vision sensors
US20200260022A1 (en) * 2017-09-28 2020-08-13 Zermatt Technologies Llc System and method for event camera data processing
US20190289230A1 (en) * 2018-03-14 2019-09-19 Insightness Ag Event-based vision sensor with direct memory control
US20190356849A1 (en) * 2018-05-18 2019-11-21 Samsung Electronics Co., Ltd. Cmos-assisted inside-out dynamic vision sensor tracking for low power mobile platforms
US20190043583A1 (en) * 2018-08-23 2019-02-07 Intel Corporation Clustering events in a content addressable memory
US20220030185A1 (en) * 2018-09-28 2022-01-27 Sony Semiconductor Solutions Corporation Data processing device, data processing method, and program
US20220030186A1 (en) * 2018-09-28 2022-01-27 Sony Semiconductor Solutions Corporation Solid imaging element, control method for solid imaging element, and electronic apparatus
US20210152757A1 (en) * 2018-12-05 2021-05-20 Sony Semiconductor Solutions Corporation Solid-state imaging device, signal processing chip, and electronic apparatus

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11895414B1 (en) * 2022-08-04 2024-02-06 Qualcomm Incorporated Virtual channel configuration session of a camera sensor
US20240048858A1 (en) * 2022-08-04 2024-02-08 Qualcomm Incorporated Virtual channel configuration session of a camera sensor
CN116757968A (en) * 2023-08-18 2023-09-15 深圳时识科技有限公司 Noise reduction method and device, chip, event imaging device and electronic equipment

Also Published As

Publication number Publication date
CN114584721A (en) 2022-06-03
US20230217123A1 (en) 2023-07-06
KR20220076944A (en) 2022-06-08

Similar Documents

Publication Publication Date Title
US11532143B2 (en) Vision sensor, image processing device including the vision sensor, and operating method of the vision sensor
US11575849B2 (en) Image processing device configured to regenerate timestamp and electronic device including the same
US20220172375A1 (en) Vision sensor, image processing device including the same, and operating method of the vision sensor
US10972691B2 (en) Dynamic vision sensor, electronic device and data transfer method thereof
US11582410B2 (en) Dynamic vision sensor and image processing device including the same
KR20150120124A (en) Dynamic vision sensor and motion recognition device including the same
US20230283870A1 (en) Vision sensor and operating method thereof
US11950003B2 (en) Vision sensor and operating method of the same
US11418735B2 (en) Image processing device including vision sensor and operating method thereof
US11765486B2 (en) Vision sensor and image processing device including the same
US11558573B2 (en) Sensor for accumulation signal
EP4093021A1 (en) Vision sensor and operating method thereof
US20240107188A1 (en) Vision sensor and image processing device including the same
US20230300282A1 (en) Vision sensor, image processing device including the same, and operating method of the vision sensor
KR102422392B1 (en) Image processing device configured to generate depth map and method of operating the saem

Legal Events

Date Code Title Description
AS Assignment

Owner name: SAMSUNG ELECTRONICS CO., LTD., KOREA, REPUBLIC OF

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SEO, JONGSEOK;PARK, JUNHYUK;LEE, HYUNKU;SIGNING DATES FROM 20211005 TO 20211020;REEL/FRAME:058597/0603

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED