US9122994B2 - Apparatus and methods for temporally proximate object recognition - Google Patents
- Publication number
- US9122994B2 (U.S. application Ser. No. 13/152,105)
- Authority
- US
- United States
- Prior art keywords
- detector
- regime
- frames
- excitability
- detection signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
Definitions
- the present invention relates generally to object recognition and identification in a computerized processing system, and more particularly in one exemplary aspect to a computer vision apparatus and methods of temporally proximate object recognition.
- Object recognition in the context of computer vision relates to finding a given object in an image or a sequence of frames in a video segment.
- Temporally proximate features that have high temporal correlations are identified within the sequence of frames, with each successive frame containing a temporally proximate representation of an object.
- Object representations, also referred to as “views”, may change from frame to frame due to a variety of object transformations, such as rotation, movement/translation, changes in lighting, background, noise, appearance of other objects, partial blocking/unblocking of the object, etc.
- Temporally proximate object representations occur when the frame rate of object capture is commensurate with the timescales of these transformations, so that at least a subset of a particular object representation appears in several consecutive frames. Temporal proximity of object representations allows a computer vision system to recognize and associate different views with the same object (for example, different phases of a rotating triangle are recognized and associated with the same triangle). Such temporal processing (also referred to as learning), enables object detection and tracking based on an invariant system response with respect to commonly appearing transformations (e.g., rotation, scaling, and translation).
- Temporal correlations between successive frames are reduced by discontinuities, sudden object movements, and noise.
- Temporal correlations are typically useful for tracking objects evolving continuously and slowly, e.g., on time scales that are comparable to the frame interval, such as tracking human movements in a typical video stream of about 24 frames per second (fps).
- the present invention satisfies the foregoing needs by providing, inter alia, apparatus and methods for temporally proximate object recognition.
- In one aspect, an image processing apparatus includes a receiver configured to receive a sequence of image frames comprising an object undergoing a transformation, an encoder configured to encode at least a portion of an image frame within the sequence of image frames into a group of pulses, and a detector.
- the detector is in one variant coupled by a plurality of transmission channels to the encoder, and is configured to receive the group of pulses and to generate a detection signal based at least in part on the group of pulses, such that an excitability parameter of the detector is configurable to be increased above a reference value responsive to generation of the detection signal.
- a gain of at least one of the plurality of transmission channels is configurable to be adjusted from a first value to a second value responsive to an arrival of at least one pulse of the group of pulses at the detector within a time interval relative to the detection signal.
- the excitability parameter and the gain of at least one of the plurality of transmission channels cooperate to effect recognition of the object in the image frame of the sequence of image frames invariantly with respect to the transformation.
- The second gain value is configured based at least in part on the time between the detection signal and the at least one pulse, such that the second value is greater than the first value responsive to the at least one pulse arriving at the detector prior to the detection signal, and the second value is less than the first value responsive to the at least one pulse arriving at the detector subsequent to the detection signal.
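The timing-dependent gain rule described above can be sketched in code. This is an illustrative model only: the exponential time window, the parameter values, and the function name are assumptions for the sketch, not prescribed by the patent.

```python
import math

def adjust_gain(gain, pulse_time_ms, detection_time_ms,
                a_plus=0.05, a_minus=0.05, tau_ms=20.0):
    """Return the updated channel gain given one pulse/detection-signal pair."""
    dt = detection_time_ms - pulse_time_ms
    if dt > 0:
        # pulse arrived before the detection signal: increase the gain
        return gain * (1.0 + a_plus * math.exp(-dt / tau_ms))
    if dt < 0:
        # pulse arrived after the detection signal: decrease the gain
        return gain * (1.0 - a_minus * math.exp(dt / tau_ms))
    return gain
```

The exponential factor makes the adjustment largest for pulses arriving closest in time to the detection signal, which is one common way to realize a timing-dependent plasticity window.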
- an apparatus configured for object recognition.
- the apparatus comprises a receiving module configured to receive a first plurality of frames relating to an object undergoing a transformation, an encoder configured to encode at least a portion of a frame within the first plurality of frames into a group of pulses, and at least one detector in communication with the encoder.
- the detector is configured to receive the group of pulses, and to generate a first signal based at least in part on the group of pulses such that at least a subset of the first plurality of frames comprises a plurality of views of the object undergoing the transformation; the apparatus is configured to recognize the object in a view of the plurality of views invariantly with respect to the transformation.
- the at least one detector comprises a first detector and a second detector
- the first detector is configurable to be adjusted from a first regime to a second regime responsive to a second signal associated with the second detector, such that the second regime is configured based at least in part on a value associated with the second signal, the second regime comprising one of: (i) a detection inhibition regime, and (ii) a detection enhancement regime.
- the at least one detector is adjustable from a first regime to a second regime responsive to a second signal such that the second signal comprises a detection signal generated in response to a first frame of the first plurality of frames, and the first regime comprises a first parameter and the second regime comprises a second parameter, the second parameter configured substantially different from the first parameter until at least receiving of a second frame of the first plurality of frames, the second frame being generated subsequent to the first frame.
- the at least one detector is further configurable to be adjusted from a third regime to a fourth regime based at least in part on a second detection signal, the second detection signal being generated responsive to a second plurality of frames, the second plurality of frames temporally preceding the first plurality of frames.
- the apparatus further comprises a plurality of transmission channels coupled between the at least one detector and the encoder, and logic configured to adjust at least one of the plurality of transmission channels from a first scheme to a second scheme based at least in part on the second detection signal being generated.
- the apparatus includes an encoder configured to receive a first plurality of views of an object undergoing a transformation, and to encode at least a portion of a first view of the first plurality of views into a group of pulses.
- the apparatus further includes a first detector configured to generate a detection signal based at least in part on the receiving the group of pulses, such that the first detector is adjustable from a first regime to a second regime responsive to a receipt of a pulse of the group of pulses; the apparatus is configured to recognize the object in the first view of the first plurality of views invariantly with respect to the transformation.
- the detection signal is generated in response to the first view of the first plurality of views, and the first regime comprises a first parameter and the second regime comprises a second parameter, the second parameter configured substantially different from the first parameter until at least a receiving of a second view of the first plurality of views, the second view being received subsequent to the first view.
- the apparatus further comprises a plurality of channels coupled between the first detector and the encoder.
- the apparatus comprises logic configured to adjust at least one of the plurality of channels from a first scheme to a second scheme responsive to an arrival of at least one pulse of the group of pulses at the first detector via the at least one of the plurality of channels within a first time interval relative to the signal.
- the adjustment from the first scheme to the second scheme is in one variant configured based at least in part on the interval, such that the first scheme is characterized by a first channel gain, and the second scheme is characterized by a second channel gain.
- the apparatus comprises logic configured to adjust at least one of the plurality of channels from a first scheme to a second scheme based at least in part on a second detection signal, the second detection signal being generated responsive to a second plurality of views.
- the second plurality of views temporally precedes the first plurality of views such that the adjustment of the at least one of the plurality of channels is effected responsive to generation of the second detection signal.
- the apparatus further comprises a second detector; the first detector is configurable to be adjusted from a third regime to a fourth regime responsive to a signal associated with the second detector such that the fourth regime is configured based at least in part on a value associated with the signal, the fourth regime comprising one of (i) a detection inhibition regime, and (ii) detection enhancement regime.
- a computer readable apparatus comprises non-transient data stored thereon, the data being generated via a computerized apparatus configured to process a plurality of frames relating to an object undergoing a transformation.
- the plurality of frames is received by a detector apparatus of the computerized apparatus in communication with a plurality of communication channels according to a method comprising: receiving a first group of pulses via at least a subset of the plurality of communication channels, the first group of pulses associated with a first view of the object; generating a detection signal based at least in part on the receiving; and adjusting at least one communication channel within the subset from a first scheme to a second scheme responsive to the detection signal, so that the generating and adjusting cooperate to effect recognition of the object in the first view invariantly with respect to the transformation.
- a method for use in a computerized apparatus configured to process a first plurality of frames comprising views of an object undergoing a transformation.
- the first plurality of frames is received by a detector apparatus in communication with a plurality of channels, and the method comprises generating a detection signal responsive to receiving a first group of pulses via the plurality of channels, the first group associated with a first view of the object, and adjusting at least one of the channels from a first scheme to a second scheme responsive to the detection signal, such that the generating and adjusting cooperate to effect recognition of the object in the first view.
- the recognition is invariant with respect to the transformation.
- the method comprises adjusting the detector apparatus from a first regime to a second regime responsive to a signal.
- the method comprises adjusting the detector apparatus from a third regime to a fourth regime based at least in part on a second detection signal, the second detection signal being generated responsive to a second plurality of frames relating to the object undergoing the transformation, the second plurality of frames temporally preceding the first plurality of frames.
- the method comprises adjusting at least one of the plurality of channels from a third scheme to a fourth scheme responsive to an arrival of at least one pulse of the group of pulses at the detector apparatus via said at least one of the plurality of channels within a first interval relative to the signal, such that the adjustment from the third scheme to the fourth scheme is configured based at least in part on the interval.
- FIG. 1 is a block diagram illustrating an exemplary object recognition apparatus according to one embodiment of the invention.
- FIG. 1A is a graphical representation illustrating encoding of an input signal into a pattern of pulses, and a sample response of the detector node, according to one embodiment of the invention.
- FIG. 2A is a graphical illustration of detector node operation in response to an input according to one embodiment of the invention.
- FIG. 2B is a graphical illustration of detector sensitivity adjustment in response to a detection signal generation according to one embodiment of the invention.
- FIG. 2C is a graphical illustration of detector sensitivity adjustment in response to an input according to one embodiment of the invention.
- FIG. 2D is a graphical illustration of detector sensitivity adjustment in response to both an input and a detection signal generation according to one embodiment of the invention.
- FIG. 2E is a graphical illustration of channel gain modulation in response to an input according to one embodiment of the invention.
- FIG. 3 is a block diagram illustrating one embodiment of the invention where the object detection apparatus is equipped with detector inhibition.
- FIG. 3A is a graphical representation illustrating mutual inhibition between two detector nodes in response to a signal representative of different objects according to a first embodiment of the invention.
- FIG. 4 is a graphical representation illustrating mutual inhibition between two detector nodes in response to a signal comprising a mixture of two objects, according to a second embodiment of the invention.
- As used herein, the terms “computer”, “computing device”, and “computerized device” include, but are not limited to, mainframe computers, workstations, servers, personal computers (PCs) and minicomputers, whether desktop, laptop, or otherwise, personal digital assistants (PDAs), handheld computers, embedded computers, programmable logic devices, digital signal processor systems, personal communicators, tablet computers, portable navigation aids, J2ME equipped devices, cellular telephones, smartphones, personal integrated communication or entertainment devices, neurocomputers, neuromorphic chips, or literally any other device capable of executing a set of instructions and processing an incoming data signal.
- As used herein, the term “computer program” or “software” is meant generally to include any sequence of human or machine cognizable steps which perform a function.
- Such program may be rendered in virtually any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., BREW), and the like.
- As used herein, the term “connection” refers without limitation to a causal link between any two or more entities (whether physical or logical/virtual), which enables information exchange between the entities.
- the term “invariant” is meant generally to refer to, without limitation, the response of a recognition system or its components that is not substantially different when one or more parameters of the incoming signal are varied.
- The system, or some of its subsystems, may generate a complex pattern of pulses in response to an input signal, and changing parameters of the signal would not substantially change the pattern of pulses, but would only affect the time of its generation.
- As used herein, the term “memory” includes any type of integrated circuit or other storage device adapted for storing digital data including, without limitation, ROM, PROM, EEPROM, DRAM, SDRAM, DDR/2 SDRAM, EDO/FPMS, RLDRAM, SRAM, “flash” memory (e.g., NAND/NOR), memristor, and PSRAM.
- As used herein, the terms “microprocessor” and “digital processor” are meant generally to include all types of digital processing devices including, without limitation, digital signal processors (DSPs), reduced instruction set computers (RISC), general-purpose (CISC) processors, microprocessors, gate arrays (e.g., FPGAs), PLDs, reconfigurable compute fabrics (RCFs), array processors, secure microprocessors, and application-specific integrated circuits (ASICs).
- As used herein, the terms “pulse pattern”, “pattern of pulses”, or “pattern of pulse latencies” are meant generally and without limitation to denote a set of pulses, arranged (in space and time) in a predictable manner that is recognizable at a predetermined level of statistical significance.
- As used herein, the terms “pulse”, “spike”, “burst of spikes”, and “pulse train” are meant generally to refer to, without limitation, any type of pulsed signal, e.g., a rapid change in some characteristic of a signal (e.g., amplitude, intensity, phase, or frequency) from a baseline value to a higher or lower value, followed by a rapid return to the baseline value, and may refer to any of a single spike, a burst of spikes, an electronic pulse, a pulse in voltage, a pulse in electrical current, a software representation of a pulse and/or burst of pulses, a software representation of a latency or timing of the pulse, and any other pulse or pulse type associated with a pulsed transmission system or mechanism.
- As used herein, the terms “pulse latency”, “absolute latency”, and “latency” are meant generally to refer to, without limitation, a temporal delay or a spatial offset between an event (e.g., the onset of a stimulus, an initial pulse, or just a point in time) and a pulse.
- As used herein, the terms “pulse group latency” or “pulse pattern latency” refer to, without limitation, an absolute latency of a group (pattern) of pulses that is expressed as a latency of the earliest pulse within the group.
- As used herein, the term “relative pulse latencies” refers to, without limitation, a latency pattern or distribution within a group (or pattern) of pulses that is referenced with respect to the pulse group latency.
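The latency terms defined above can be made concrete with a short sketch; the function name and list representation are illustrative assumptions. Note that a uniform time shift of the whole group changes the group latency but leaves the relative latencies unchanged, which is what makes relative-latency patterns a useful invariant code.

```python
def relative_latencies(pulse_times_ms):
    """Split absolute pulse times into a group latency and relative latencies."""
    group_latency = min(pulse_times_ms)   # latency of the earliest pulse in the group
    return group_latency, [t - group_latency for t in pulse_times_ms]
```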
- As used herein, the term “pulse-code” is meant generally to denote, without limitation, information encoding into a pattern of pulses (or pulse latencies) along a single pulsed channel, or relative pulse latencies along multiple channels.
- As used herein, the term “wireless” means any wireless signal, data, communication, or other interface including without limitation Wi-Fi, Bluetooth, 3G (e.g., 3GPP, 3GPP2, and UMTS), HSDPA/HSUPA, TDMA, CDMA (e.g., IS-95A, WCDMA, etc.), FHSS, DSSS, GSM, PAN/802.15, WiMAX (802.16), 802.20, narrowband/FDMA, OFDM, PCS/DCS, Long Term Evolution (LTE) or LTE-Advanced (LTE-A), analog cellular, CDPD, satellite systems such as GPS, millimeter wave or microwave systems, optical, acoustic, and infrared (i.e., IrDA).
- the present invention provides, in one salient aspect, apparatus and methods for detecting and recognizing objects invariantly with respect to one or more temporally proximate object transformations. These transformations may include, inter alia, rotation, translation, position change, and scaling. Many other transformations useful with the invention exist, such as for example pitch change for object/feature recognition in sound signals, texture for tactile signals, and/or transparency and color for visual objects.
- Temporally proximate time scales are determined by, inter alia, the nature of the object and its transformations, and typically require a time period sufficient to capture adequate information about the object. By way of example, when processing visual images containing people in applications such as surveillance or human-computer interaction, such time scales are typically on the order of 0.1-0.3 seconds (s), which translates into 2-7 frames at a typical rate of 24 frames per second (fps).
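The frame-count figure quoted above follows directly from the duration and the frame rate; a minimal check (the function name is illustrative):

```python
def frames_spanned(duration_s, fps=24):
    """Approximate number of frames covering a transformation of the given duration."""
    return round(duration_s * fps)

# 0.1-0.3 s at 24 fps spans roughly 2-7 frames, matching the text
low = frames_spanned(0.1)
high = frames_spanned(0.3)
```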
- the incoming signal is encoded to produce a pulse-code output that depends only on the predetermined object type. Invariant representations of features and objects emerge from the temporal proximity of different transformations of these features, and objects in the input.
- the encoded signal is transmitted from the encoder to one or more detectors over a plurality of transmission channels. Each detector is configured to generate a detection signal upon recognizing the predetermined representation of the object in the encoded signal.
- Another implementation of the present invention uses pulse timing-dependent plasticity, wherein the response of detectors and/or transmission channels is dynamically adjusted based in part on prior detector activity and/or a prior input. Transmission characteristics of different channels (for example, the conduction delay, or the strength of transmission describing the impact of the incoming pulse onto the receiving unit) are adaptively adjusted based on prior input signals (history), so that the detection apparatus acquires, through learning and adaptation, invariant recognition properties not initially present.
- the pattern of relative pulse latencies is generated in the pulsed output signal upon the occurrence of one or more of cyclic events, such as a clock signal, an internally generated oscillatory wave, an arrival of the input frame, an appearance of a new feature in the frame, and/or a time related to a previous event.
- the detector nodes are configurable to interact with each other on comparatively short time scales. For example, a detector that is the first to recognize the object of interest transmits an indication to neighboring detectors, the indication configured to prevent the other nodes from generating detection signals.
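A toy winner-take-all sketch of this inhibition scheme, assuming a fixed inhibition window and instantaneous signal propagation (both simplifying assumptions not specified in the patent):

```python
def resolve_detections(detection_times_ms, inhibit_window_ms=10.0):
    """Return the ids of detectors whose detection signals survive inhibition."""
    if not detection_times_ms:
        return set()
    # the earliest-firing detector wins and inhibits its neighbors
    winner = min(detection_times_ms, key=detection_times_ms.get)
    t0 = detection_times_ms[winner]
    # neighbors that would fire within the inhibition window are silenced
    return {d for d, t in detection_times_ms.items()
            if d == winner or t >= t0 + inhibit_window_ms}
```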
- portions of the object recognition apparatus are embodied in a remote server, comprising a computer readable apparatus.
- Embodiments of object recognition functionality of the present invention are useful in a variety of applications including for instance a prosthetic device, autonomous robotic apparatus, and other electromechanical devices requiring object recognition functionality.
- embodiments of the invention may also be used for processing of signals of other, often non-visual modalities, including various bands of electromagnetic waves (microwave, x-ray, infrared, etc.) and pressure (sound, seismic, tactile) signals.
- Embodiments of the invention may be, for example, deployed in a hardware and/or software implementation of a computer-vision system, provided in one or more of a prosthetic device, robotic device and any other specialized visual system.
- an image processing system may include a processor embodied in an application specific integrated circuit (“ASIC”), which can be adapted or configured for use in an embedded application such as a prosthetic device.
- Referring now to FIGS. 1 through 4, exemplary embodiments of the pulse-code temporally proximate object recognition apparatus and methods of the invention are described.
- the image processing apparatus 100 includes an encoder 104 configured to receive an input signal 102 .
- the input signal is presented as a sequence of frames.
- the input signal in this case is a sequence of images (image frames) received from a CCD camera via a receiver apparatus or downloaded from a file.
- the image is a two-dimensional matrix of RGB values refreshed at a 24 Hz frame rate. It will be appreciated by those skilled in the art that the above image parameters are merely exemplary, and many other image representations (e.g., bitmap, CMYK, grayscale, etc.) and/or frame rates are equally useful with the present invention.
- the encoder 104 transforms (encodes) the input signal into an encoded signal 106 .
- the encoded signal comprises a plurality of pulses (also referred to as a group of pulses) configured to model neuron behavior. It is known in the field of neuroscience that neurons generate action potentials, often called “spikes”, “impulses”, or “pulses”, and transmit them to other neurons. Individual pulses within the pulse group typically last on the order of 1-2 ms and are approximated by discrete temporal events. In a different approach, each pulse of the pulse group is composed of several individual spikes.
- the encoded signal is transmitted from the encoder 104 via multiple communication links (also referred to as transmission channels, or communication channels) 108 to one or more detectors (also referred to as the pulse receiving unit, the detector node, or the receiving unit) of the first detector bank 110 .
- a single encoder can be coupled to any number of detector nodes that is compatible with the detection apparatus hardware and software limitations. Furthermore, a single detector node may be coupled to any practical number of encoders.
- each of the detectors 110_1, 110_n contains logic (which may be implemented as software code, hardware logic, or a combination thereof) configured to recognize a predetermined pattern of pulses in the encoded signal 106, using any of the mechanisms described below, and to produce detection output signals transmitted over communication channels 112.
- the designators 112_1, 112_n denote the outputs of the detectors 110_1, 110_n, respectively.
- the detection signals are delivered to a downstream bank of detectors 114 (that includes several detectors 114_1, 114_m, 114_k) for recognition of complex object features and objects, similar to the exemplary embodiment described in commonly owned and co-pending U.S. patent application Ser. No. 13/152,084 entitled “Apparatus and Methods for Pulse-Code Invariant Object Recognition”, filed contemporaneously herewith, and incorporated herein by reference in its entirety.
- each subsequent bank of detectors is configured to receive signals from the previous bank, and to detect more complex features and objects (compared to the features detected by the preceding detector bank). For example, a bank of edge detectors is followed by a bank of bar detectors, followed by a bank of corner detectors, and so on, thereby enabling alphabet recognition by the apparatus.
- detectors may be interconnected with loops and detection of features can be intermixed.
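The cascade of detector banks described above can be sketched schematically; the bank contents (“edge”/“bar” detectors) and the set-based signal representation are hypothetical placeholders for illustration, not the patent's actual detector logic.

```python
def run_cascade(active_signals, banks):
    """Feed each bank's detection signals into the next, more complex bank."""
    for bank in banks:
        active_signals = bank(active_signals)  # detection signals feed the next bank
    return active_signals

# toy banks: a horizontal "bar" detector fires only when both of its
# constituent edge detectors fired in the previous bank
def edge_bank(signals):
    return {s for s in signals if s.startswith("edge")}

def bar_bank(signals):
    return {"bar_h"} if {"edge_top", "edge_bottom"} <= signals else set()
```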
- Each of the detectors within the upstream detector bank 110 generates detection signals on communication channels 112_1, 112_n (with appropriate latency) that propagate with different conduction delays to the detectors of the downstream bank of detectors 114.
- the detector cascade of the embodiment of FIG. 1 may contain any practical number of detector nodes and detector banks determined, inter alia, by the software/hardware resources of the detection apparatus and complexity of the objects being detected.
- the input signal includes a sequence of frames 121 - 125 .
- Each of the frames 121 - 125 is encoded into a respective group of pulses (e.g., pulse groups 146 , 147 corresponding to the frames 123 , 124 , respectively, in FIG. 1 ).
- the encoded pulses 142 - 144 are transmitted along respective communication channels 131 - 134 using e.g., any of the mechanisms described below.
- Pulse latency is determined in the illustrated embodiment as a temporal delay between a reference event and an arrival of the pulse along a channel (e.g., line 140 denotes the latency of pulse 142 corresponding to the frame 121 in FIG. 1A ). Pulse latency is measured with respect to the corresponding frame, as denoted by vertical broken lines 171 - 175 .
- latency for each pulse within the pulse group 147 is configured with respect to the onset of the frame 174 .
- An event trigger, such as a sudden change in the visual signal (e.g., due to a visual saccade or a sudden movement of the image camera, movement of parts of the visual signal, or appearance or disappearance of an object in the visual scene), or alternatively a clock signal, may be used as the temporal reference.
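One plausible image-to-pulse encoding consistent with the latency scheme above maps stronger feature values to earlier pulses relative to the frame onset. The linear mapping, the 40 ms maximum latency, and the function name are illustrative assumptions; the patent does not prescribe a specific encoding function.

```python
def encode_frame(feature_values, frame_onset_ms, max_latency_ms=40.0):
    """Map per-channel feature values in [0, 1] to pulse arrival times (ms)."""
    pulses = {}
    for channel, value in enumerate(feature_values):
        if value > 0.0:                 # zero-valued channels stay silent
            latency = (1.0 - value) * max_latency_ms
            pulses[channel] = frame_onset_ms + latency
    return pulses
```

Under this mapping, two views of the same object produce different pulse patterns (different feature values), while the latency of each pulse remains referenced to the frame onset, as in FIG. 1A.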
- Each of the frames 121 - 125 in FIG. 1A contains a representation of an object (an upright cup 161 and rotated cups 162 - 165 ) that is undergoing a rotational transformation.
- Other transformations such as translation, scaling, lighting, transparency, color changes, and/or a combination thereof are equally compatible with the invention, provided the transformations occur slowly, compared to the frame rate, and sequential phases (views) of the object transformation appear in a temporal proximity in the captured frames, as illustrated in the frames 121 - 125 of FIG. 1A .
- the term “temporal proximity” is used in the present context to describe object representations (views) that appear within a sequence of input frames taken over a period of time commensurate with the object transformation time scale. The exact duration of this interval may be application-specific. For example, in an embodiment of the object recognition apparatus configured to process visual signals containing one or more people, it is useful if object transformation lasts for about 2-7 frames (or for a period of 100-300 ms) in order for the detection apparatus to capture sufficient information related to the object.
- the image-to-pulse encoding is configured to produce different patterns of pulses in response to different representations of the same object, as illustrated by the pulse groups corresponding to object representations 161-165 in the frames 121-125.
- Even relatively similar object representations, such as cups 164 , 165 of close orientation, are encoded into two very different pulse patterns, as illustrated by the pulse groups 147 , 148 in FIG. 1A .
- in the pulse groups 147 , 148 in FIG. 1A lies one salient advantage of the invention; i.e., the ability to discriminate minute distinctions between two images.
- two different objects are encoded into the same pattern of pulses, in which case internal representation invariance is then a property of the encoder. Therefore, a detector that receives such patterns inherits that particular invariance. For example, contrast and/or color information can be lost in the encoding stage, in which case the object detection apparatus responds invariantly to frames of different contrast and/or color.
- a detector receives the pulse group (such as 145 - 148 ), and generates a detection signal (pulses 151 - 156 ) in response to every pulse group that contains the predetermined pattern of pulses corresponding to the object of interest.
- As the detector receives the input pulses, it makes a determination whether or not to “fire” a detection signal. In one implementation the detector is likely to fire when input pulses arrive fairly synchronously along some subset of input channels. In another implementation the detector is likely to fire if the incoming pattern of pulses exhibits certain inter-pulse intervals. In one implementation, the detector logic relies on the continuous nature of the natural world, wherein pulse patterns that are similar and arrive in close temporal proximity are very likely to encode the same object. The detector logic adjusts the likelihood of generating the detection signal based on the input/detection history. This is an exemplary adjustment mechanism of the detection apparatus that increases the likelihood of the detector response to that particular object.
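- The firing decision described above can be sketched as a simple coincidence counter with a decaying excitability boost. All names, thresholds, and time constants below are illustrative assumptions, not taken from the patent:

```python
import math

class PulsePatternDetector:
    """Toy detector: fires when enough pulses arrive near-synchronously;
    each detection temporarily lowers the effective threshold, so
    temporally proximate (similar) patterns are easier to detect."""

    def __init__(self, threshold=3, sync_window=2.0, boost=1.5, tau=30.0):
        self.threshold = threshold      # pulses required for a detection
        self.sync_window = sync_window  # max latency spread counted as synchronous
        self.boost = boost              # threshold reduction right after a detection
        self.tau = tau                  # decay time constant of the boost
        self.last_fire = None

    def effective_threshold(self, t):
        if self.last_fire is None:
            return self.threshold
        # excitability decays exponentially back to the baseline threshold
        return self.threshold - self.boost * math.exp(-(t - self.last_fire) / self.tau)

    def process(self, t, latencies):
        """latencies: pulse arrival times (one per active channel) for one frame."""
        first = min(latencies)
        n_sync = sum(1 for p in latencies if p - first <= self.sync_window)
        fired = n_sync >= self.effective_threshold(t)
        if fired:
            self.last_fire = t
        return fired
```

A pattern with only two synchronous pulses fails on a fresh detector, but succeeds shortly after a detection, mirroring the history-based adjustment described above.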
- the detection signals are transmitted from the detector node to downstream nodes along respective downstream transmission channels (such as the channel 135 in FIG. 1A ).
- Such appearance of a consecutive sequence of views in temporal proximity facilitates object identification by the apparatus invariantly to the object transformation.
- the detection apparatus of FIG. 1A recognizes the rotated cup in each of the frames 162 - 165 as being the same object of interest as in the frame 161 , even though the views of the object, and consequently the representations thereof, are different.
- while FIG. 1A shows different or distinct views of the object transformation within the input frame sequence 161 - 165 , other frame sequence configurations are compatible with the invention (for example, repetitions of the same view for more than one frame, etc.).
- Such repeated frames nonetheless allow the detection apparatus to recognize the object invariantly to the transformation, while (in one implementation) increasing the processing time required for detection.
- the first frame 121 comprises a default representation of the object (the upright cup 161 ) that corresponds to the target state of the detector, described in detail below with reference to FIGS. 2A-2C .
- the detector is configured to recognize the pulse pattern corresponding to that default representation, and to generate the detection signal (a positive response).
- the exemplary apparatus of FIG. 1A may not necessarily produce the detection signal when a new object (or objects) first appears in the input signal. If the first input pattern of pulses corresponds to the target state (for example the upright cup), the detector generates the detection signal. When, however, the detector receives an input pulse pattern corresponding to a different object representation (such as an upside down cup), it may not recognize it based on the pulse group of one such frame alone. However, receipt of subsequent pulse groups corresponding to the upside down cup (over many frames) by the detector causes the detector to recognize the upside down object representation (even in representations that were previously not recognized) due to the temporal proximity-based adjustment mechanism described below in further detail.
- the sensitivity of the detector is in one embodiment adjusted (increased), so that the detector node becomes more sensitive to that specific object representation, and is more likely to recognize that specific object in the subsequent pulse groups.
- the detector is configured to generate the detection signal only after receiving the whole input pulse group, as illustrated by the detection signal 153 corresponding to the pulse group 146 .
- the detector is configured to respond to an input pattern even before all of the input pulses arrive at the detector, as illustrated by the detection signal 152 corresponding to the pulse group 145 in FIG. 1A .
- the detection signal 152 is generated before the last pulse of the pulse group has arrived at the detector (such as pulse 143 of pulse group 145 propagating on the channel 134 ).
- the leading portion (also referred to as the “prefix”) of the pulse group 145 is sufficient to trigger the detection logic of the detector such that subsequent pulses are not necessary to perform recognition. It also enables the detection apparatus to report object detection sooner.
- the remaining input pulses may be used to trigger additional detection pulses, and contribute to detector excitability adjustment.
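- The prefix-triggered behavior above can be sketched as a hypothetical latency-template matcher (the `min_prefix` and `tol` parameters are assumptions for illustration):

```python
def prefix_detect(expected, received, min_prefix=3, tol=0.5):
    """Report a detection as soon as the first `min_prefix` pulses of the
    expected latency pattern have arrived within tolerance `tol`, without
    waiting for the remaining pulses of the group."""
    matched = 0
    for exp, got in zip(expected, received):
        if got is None or abs(got - exp) > tol:
            break
        matched += 1
        if matched >= min_prefix:
            return True   # detection reported before the full group arrives
    return False
```

This is how a leading portion ("prefix") of a pulse group can trigger the detection logic while trailing pulses are still in flight.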
- the encoder is configured to generate more than one pulse for one or more selected transmission channels, as illustrated by the pulses 144 transmitted on the channel 132 , corresponding to the input frame 125 in FIG. 1A .
- the detection signal generated by the receiving unit contains more than one pulse, corresponding to the same input pulse group, as illustrated by pulses 155 , 156 corresponding to the same pulse group 148 and frame 125 in FIG. 1A .
- multiple pulses sent over a particular channel within the same pulse group serve to emphasize the importance of that particular channel for object detection, and to facilitate detection response from the receiving unit.
- transmissions of multiple pulses are used to combat effects of noise, interference and/or intermittent connections during transmission.
- the timing of the detection signal (i.e., the detection pulse latency) encodes the level of confidence generated by the detection algorithm that the input pulse group represents the object of interest.
- a delayed response corresponds to a low confidence of the detection algorithm.
- Such delay may be caused by, for instance, performing additional computations (e.g., additional iterations of the algorithm, etc.) by the detector.
- a timely detector response conversely corresponds to a higher confidence of the detector.
- such delayed detection signal is followed by a lower latency (‘fast’) detection signal that corresponds to a subsequent pulse group that is a better match (closer to the actual target state).
- a late-generated detection signal facilitates the detector response to the next frame, and causes a downstream detector to receive two input pulses.
- object encoding apparatus and methods useful with the present invention are described in commonly owned and co-pending U.S. patent application Ser. No. 13/152,084 entitled APPARATUS AND METHODS FOR PULSE-CODE INVARIANT OBJECT RECOGNITION, incorporated by reference, supra.
- This approach encodes an object into a group of pulses such that an identity (or type) of each object is encoded into relative (to one another) pulse latencies, and parameters of the object, such as scale, position, and rotation, are encoded into the group delay (that is common to all pulses within the group) of the pulse group.
- This encoding approach enables object recognition that is invariant to object parameters, such as scale, position, rotation, hence advantageously simplifying the object detection apparatus.
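- The decomposition can be illustrated with a short sketch (the function names are ours, not from the referenced application):

```python
def split_pulse_group(latencies):
    """Split a pulse group into (group_delay, relative_pattern):
    the common group delay carries object parameters (scale, position,
    rotation); the relative latencies carry object identity."""
    group_delay = min(latencies)
    return group_delay, [t - group_delay for t in latencies]

def same_identity(group_a, group_b, tol=1e-6):
    """Match two pulse groups on relative latencies only, which makes the
    comparison invariant to the parameter-dependent group delay."""
    _, rel_a = split_pulse_group(group_a)
    _, rel_b = split_pulse_group(group_b)
    return (len(rel_a) == len(rel_b)
            and all(abs(x - y) <= tol for x, y in zip(rel_a, rel_b)))
```

Two pulse groups that differ only by a common delay (e.g., the same object at a different scale or position) compare equal on their relative pattern.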
- the object recognition apparatus (such as the apparatus 100 of FIG. 1A ) is configured to dynamically adjust dynamic properties of the detectors, such as excitability (also referred to as the excitability parameter, node sensitivity or node responsiveness), as a result of performing detection computations and/or pulsed input to the detector.
- excitability is typically used to describe the propensity of neurons to excite (or fire) an action potential in response to stimulation.
- increased excitability is used herein, without limitation, to describe increased sensitivity of the detector node to a particular input pattern of pulses. This increase results in a detection enhancement regime and is achieved in a variety of ways including, but not limited to, lowering the detection threshold (e.g., a correlation score), amplifying the input, releasing the detector from inhibition, or a combination thereof.
- v⃗ is the detector state vector
- F(v⃗) is the vector state function of v⃗ ∈ V, V being the set of all possible states of the system
- I(t) is an input into the detector (such as pulse groups 145 - 148 in FIG. 1A ).
- the detector response is determined based on the condition that v⃗ ∈ S for some set S ⊂ V.
- the set S corresponds to the detector target state, also referred to as the pulse regime set, where the detector is configured to generate the detection signal.
- v⃗ → G(v⃗) (Eqn. 2), where G(v⃗) is a predetermined reset function.
- the increase in detector sensitivity to subsequent inputs is achieved by introducing a state adjustment variable e⃗, which describes the state vector adjustment such that for all v⃗ the following condition is satisfied: ‖v⃗ + e⃗, S‖ ≤ ‖v⃗, S‖ (Eqn. 3), where ‖·, S‖ denotes the distance to the set S. In one approach the distance ‖·, S‖ is a Euclidean norm.
- the state adjustment parameter is chosen such that it converges to zero with time to reflect the fact that temporally close inputs are likely to represent the same object.
- the convergence is exponential and is expressed as:
- the non-zero state adjustment parameter e⃗ is configured to push the state vector towards the set S (make the set S ‘more attractive’).
- when e⃗ is substantially greater than zero, a weaker input is sufficient for the detector to reach the state S and to generate the detection response.
- state adjustment in response to the detection sets the state adjustment parameter to a non-zero value, which in turn increases the likelihood of subsequent responses.
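- A minimal one-dimensional caricature of this mechanism (all constants are illustrative assumptions; the target set S is modeled as a scalar threshold):

```python
import math

class AdjustableDetector:
    """1-D sketch of the state-adjustment idea: the variable `e` shrinks
    the effective distance to the target set S on each detection and
    decays exponentially back to zero between inputs."""

    def __init__(self, threshold=1.0, e_on_detect=0.4, tau=50.0):
        self.threshold = threshold      # stands in for the target set S
        self.e_on_detect = e_on_detect  # value of e set upon each detection
        self.tau = tau                  # decay time constant of e
        self.e = 0.0                    # state adjustment variable
        self.t_last = 0.0

    def input(self, t, strength):
        # exponential convergence of e back to zero (temporal proximity)
        self.e *= math.exp(-(t - self.t_last) / self.tau)
        self.t_last = t
        detected = strength + self.e >= self.threshold
        if detected:
            self.e = self.e_on_detect   # non-zero e makes S 'more attractive'
        return detected
```

A weaker input arriving soon after a detection can still reach the target set, while the same input alone would not.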
- F 1 and F 2 are state functions governing the dynamics of the state variables v 1 and v 2 , respectively.
- FIG. 2A illustrates one exemplary approach of detector operation, where excitability (sensitivity) of the detector is kept constant, and is governed by the following implementation of the state Eqn. 6:
- FIG. 2A depicts detector response governed by Eqns. 7-9 in accordance with one embodiment of the invention.
- the trace 212 depicts the system state variable v 1
- the trace 202 depicts input I(t)
- the trace 210 denotes the state adjustment variable e 1 .
- the horizontal axis shows time progression.
- the detection apparatus 200 receives an input pulse 204 of an amplitude a 1 , corresponding to one of the pulses of the pulse group generated by the encoder (and transmitted along one of the transmission channels, as described above with respect to FIGS. 1 and 1A ). Note that although the apparatus 200 is configured to receive pulses within the pulse group on different channels, only a single input channel is shown for clarity in FIG. 2A .
- another input pulse 206 of amplitude a 3 ⁇ a 1 is received by the detector 200 .
- the weaker input 206 does not evoke a detection signal output from the detector.
- lack of the detector output is due to the detector object recognition logic producing an output that is outside the detection boundary (for example, below a detection threshold).
- the detection logic is configured to screen out input pulses with amplitudes that are below a predetermined level.
- the detector of FIG. 2A produces a weak transient deviation 216 from the rest state with amplitude a 4 that is much lower than the amplitude of the detection signal 214 .
- the detector output channel stays quiet under this same condition.
- the detection apparatus 200 has the state adjustment variable e 1 set to zero, as depicted by the trace 210 which remains at the base state throughout the FIG. 2A .
- FIG. 2B depicts the dynamics of the detector apparatus when configured to adjust the detector state every time the detector produces a detection output.
- the adjustment variable e 1 is configured to decay exponentially with time according to the following expression:
- the detector 220 receives a weaker input 226 of amplitude a 7 ⁇ a 5 . Contrary to the detector 200 , the weaker input 226 evokes a detection output pulse 236 of amplitude a 6 at time t 7 .
- the difference in the outcomes between the two detector embodiments of FIGS. 2A and 2B is due to, inter alia, the state adjustment variable being substantially greater than zero in the detector embodiment of FIG. 2B , when the second, weaker, input arrives at the detector at time t 6 .
- the state adjustment variable e 1 is reset again to the predetermined level (step 246 in FIG. 2 ), thereby sustaining the increased sensitivity of the detector to subsequent inputs.
- the detection apparatus corresponding to FIG. 2B is referred to as the learning detector. That is, a prior input (“a history”) effects a change in the dynamic behavior of the detection apparatus, thereby causing the apparatus to respond differently to subsequent input pulses.
- the detection apparatus 220 is configured to adjust its dynamic state ( 244 , 246 ) every time the detection signal ( 234 , 236 ) is generated. This behavior is substantially different from the detector behavior shown in FIG. 2A above, where the detector dynamic state is not affected by prior inputs.
- Another approach to detector state adjustment is shown and described with respect to FIG. 2C .
- the detector apparatus of FIG. 2C is configured to adjust its dynamics for every input pulse it receives, without regard for the detection signal generation.
- An exemplary operation sequence of the FIG. 2C detector embodiment is as follows: at time t 8 the detector receives an input pulse 252 of an amplitude a 8 and generates a detection signal 254 at time t 9 . Subsequent to the detection signal generation, the detector output decays with time, as shown by the trace 270 . Contrary to the embodiment of FIG. 2B , the detector state in FIG. 2C is adjusted at time t 8 (step 256 ), for example, in accordance with Eqn. 10.
- the state adjustment variable e 1 260 decays exponentially with time, for example, in accordance with Eqn. 11. However, the time decay constant ⁇ in the detector embodiment of FIG. 2C is greater than that of the embodiment of FIG. 2B , as seen from comparing the traces 260 and 240 , respectively.
- the detector 250 receives a weaker input 262 of amplitude a 10 ⁇ a 8 .
- the weaker input pulse 262 does not evoke the detection signal, but results in the detector 250 producing a weak transient deviation 264 from the rest state with an amplitude a 11 that is much lower than the amplitude a 9 of the detection signal 254 .
- the detector output channel stays quiet.
- the state adjustment variable e 1 is reset (step 266 in FIG. 2 ), thereby increasing the excitability (sensitivity) of the detector to subsequent inputs.
- because the detection apparatus 250 employs the state adjustment (such as 256 , 266 ) in response to an input ( 252 , 262 ), the detector response differs from that of the detector 220 described above. While the state of the detection apparatus 250 is adjusted for every received input, the detection signal is not generated in response to every input pulse (such as the input 262 ). This is due to a faster decay of the state adjustment variable e 1 in the detector 250 compared to the detector 220 , as is seen by comparing the traces 240 , 260 of FIGS. 2B and 2C , respectively.
- the state of the detector of 250 progresses further away from the target set (firing set) S between the time t 8 (when the state is adjusted) and the time t 10 (when the subsequent input pulse 262 arrives).
- A detector state that is farther away from the target set (as indicated by a lower level of trace 260 at time t 10 , compared to the level of trace 240 at time t 6 ) prevents the detector 250 from generating the detection signal in response to the input pulse 262 .
- the state adjustment 266 causes the detector to respond differently to a subsequent input pulse 272 that arrives at the detection apparatus 250 at time t 11 and triggers generation of the detection signal 274 .
- the difference in the detection apparatus 250 responses to input pulses 262 , 272 is due to, inter alia, a shorter time lag Δt 5 between the pulses 272 and 262 , compared to the time lag Δt 4 between the input 262 and 252 .
- the state of the detector 250 at time t 11 is closer to the target set S, as illustrated by a higher level of the parameter e 1 (trace 260 in FIG. 2C ) at time t 11 .
- the detector state is adjusted in response to both input arrival and detection signal generation, as is shown and described below with respect to FIG. 2D .
- detector state in FIG. 2D is adjusted responsive to both the input pulse at time t 8 (step 256 ) and the detection pulse at t 9 (step 286 ), for example, in accordance with Eqn. 10.
- the state adjustment variable e 1 260 decays exponentially with time, for example, in accordance with Eqn. 11.
- the weaker input pulse 262 of amplitude a 10 ⁇ a 8 does not evoke the detection signal, but results in the detector state adjustment 266 at time t 10 , and therefore an increased sensitivity of the detector 280 to subsequent inputs.
- Multiple detector state adjustments 256 , 286 , 266 move the detector 280 state closer to the target set S, thereby causing the detector to respond more quickly to a subsequent input pulse 272 that arrives at time t 11 and triggers generation of the detection signal 284 at t 12 .
- the detector state is adjusted at times t 11 , t 12 (steps 276 , 288 in FIG. 2D ), thereby increasing the excitability (sensitivity) of the detector to subsequent inputs.
- the adjustment variable is a running, decaying average of the input activity.
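- One simple form of such a variable is a leaky integrator of the input amplitudes; a sketch under that assumption (the patent does not prescribe the exact form):

```python
import math

def decaying_average(events, tau=30.0):
    """Running, exponentially decaying average of input activity.

    events: (time, amplitude) pairs in increasing time order.
    Returns the adjustment-variable value sampled just after each event."""
    e, t_prev, trace = 0.0, None, []
    for t, amp in events:
        if t_prev is not None:
            e *= math.exp(-(t - t_prev) / tau)  # decay since last event
        e += amp                                # each input bumps the average
        trace.append(e)
        t_prev = t
    return trace
```

Closely spaced inputs accumulate (raising excitability), while widely spaced inputs leave the variable near zero.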
- the detector implementations described with respect to FIGS. 2B-2D are in one embodiment obtained by a detector state model developed by the Assignee hereof, which causes the detection apparatus to adjust its response based on a history of pulsed inputs so that a prior input effects a change in the dynamic behavior of the detector (i.e., by increasing detector excitability and causing the detection apparatus to respond differently to subsequent input pulses).
- the timescales t det are typically on the order of several frames (2-7), and are configured to effect object detection within a sequence of temporally proximate views of the object during a transformation (see description of FIG. 1A , supra).
- invariant object detection is facilitated via an adjustment of transmission channels, such as the communication channels 108 , 112 in FIG. 1 .
- the detection apparatus of this approach is configured to adjust transmission channel properties in response to a channel pulse transmission history and/or detector activity history. These adjustments change future susceptibility of the detection apparatus thus causing it to respond differently to subsequent pulse inputs.
- the term “susceptibility” is used here to refer generally and without limitation to a combined effect of channel adjustment and detector node adjustment, and is differentiated from the term “excitability” (the latter which only applies to the detector node adjustment).
- the detection apparatus combines a quadratic integrate-and-fire detector model of Eqn. 7 with a simulated transmission channel as follows:
- dv1/dt = a·v1^2 + b·v1 + c + I1(t) + f(g, v) (Eqn. 12)
- g(t) is the channel gain
- f(g, v) is the channel response function.
- the transmission channel is modeled as the N-Methyl-D-aspartate (NMDA) channel, where f(g NMDA ) is a function of the channel parameter g NMDA :
- the parameters a, b, c, v 1 , τ, and τ NMDA are configured to achieve the desired model behavior.
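- A forward-Euler sketch of this model follows. All constant values are illustrative assumptions (the patent only fixes the form of the equations), and the pulsed input here enters solely through the channel gain g rather than as a separate I(t) term: pulses increment g, g decays with time constant tau_nmda, and positive g pushes the state v toward the firing region.

```python
def simulate_detector(pulses, T=200.0, dt=0.1,
                      a=0.04, b=5.0, c=140.0,
                      v_reset=-65.0, v_peak=30.0,
                      tau_nmda=150.0, e_nmda=0.0, k=0.05):
    """Euler integration of a quadratic integrate-and-fire detector with
    an NMDA-like channel:
        dv/dt = a*v^2 + b*v + c + f(g, v),   f(g, v) = k*g*(e_nmda - v)
        dg/dt = -g / tau_nmda
    pulses: dict {time: amplitude}; each pulse increments g.
    Returns the list of firing (detection) times."""
    v, g, t, fires = v_reset, 0.0, 0.0, []
    while t < T:
        g += pulses.get(round(t, 1), 0.0)   # input pulse increments channel gain
        f = k * g * (e_nmda - v)            # positive g pushes v toward e_nmda
        v += dt * (a * v * v + b * v + c + f)
        g -= dt * g / tau_nmda              # channel gain decays between pulses
        if v >= v_peak:
            fires.append(round(t, 1))
            v = v_reset                     # reset after detection
        t = round(t + dt, 1)
    return fires
```

With no input the state relaxes to rest and never fires; a sufficiently strong pulse raises g enough to drive the state through the quadratic threshold.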
- the exemplary NMDA channel works as follows: when the detector state v is near rest and g NMDA is positive, the state v is weakly pushed towards the target state.
- the input pulses arriving at the detector node cause a channel gain adjustment by incrementing g NMDA by the value of the input signal at that time. In this approach, stronger input pulses (pulses of higher amplitude) produce a larger adjustment, and hence make the detector apparatus more sensitive to subsequent inputs.
- the adjustment logic may be implemented separately in each channel if desired. Such channels are referred to as the “smart channels”. Alternatively, the adjustment logic for all of the channels is implemented in the detector node, therefore allowing for simpler channel architecture. Combinations of the foregoing may also be employed.
- the detector node uses a linear superposition of the channel contributions g ch NMDA in order to generate the detection signal.
- this equivalence advantageously simplifies implementation of the detector apparatus by implementing gain adjustment g NMDA in the detector node itself and allowing for a simpler channel configuration.
- each of the transmission channels is characterized by its own unique parameter g ch NMDA as:
- the channel gain is set to a value (predetermined or dynamically derived), which may be unique to that particular channel or shared among many channels.
- a transmission delay of the channel is adjusted as required (either up or down) in response to the channel carrying an input to the detector.
- the channel parameter (gain or delay) is adjusted for every channel that is coupled to the detector, after that detector has generated a detection signal based on a pulse group, even if that particular channel did not carry an input pulse for that group.
- the detection apparatus adjusts channel parameters in response to a detector receiving an input pulse via the channel.
- Such functionality provides for increasing the likelihood of detecting an object of interest based on prior inputs, even when these inputs had not evoked a prior detection response from the detector.
- channel adjustment persists over only a set of several frames (e.g., 2-7). Therefore, the channel adjustment timescales t gch are comparable to the detector excitability adjustment timescale t det shown and described with respect to FIGS. 2B-2C .
- Channel gain adjustment is also configured to effect object detection within a sequence of temporally proximate views of the object during a transformation (see description of FIG. 1A , supra).
- channel characteristics are modulated so that the channel contribution to the detector varies with time.
- channel modulation is configured to suppress some of the channels, such as by setting channel gain to zero or transmission delay to a value outside of the limits that are allowable by the detection logic.
- Channel modulation advantageously provides a mechanism allowing the detection apparatus to search for an optimal channel configuration to account for varying inputs and changes of the detector node configuration (detector network topology).
- channel modulation is implemented on time scales t mod that are relatively long, compared to the inter-frame interval.
- the channel modulation allows the detection apparatus to select transmission channels that are the most relevant (i.e., provide the largest contribution) to the detection computations.
- the channels that are relevant to a particular detector are those that are active (carry a pulse within the pulse group) whenever the detector recognizes an object of interest (and thus responds with a detection signal).
- the detector is also likely to respond to subsequent inputs that follow the triggering pulse group, and thus the channels involved in carrying pulses of the subsequent pulse groups become relevant. If the input has temporal consistency, certain sequences of inputs appear together frequently.
- the transmission channels that are inconsistent with the representations of the object of interest (that is, whose pulses do not correlate with the activity of the receiving unit) may be turned down and eventually ignored.
- channel modulation is effected via (i) probabilistic methods, or (ii) a combination of deterministic and probabilistic techniques, where one portion of the channels is modulated deterministically, and other channels are modulated using statistical analysis.
- the deterministic/probabilistic channel modulation is performed with respect to time, such that during one period of time (e.g., the search for an optimal channel) the channels are modulated using a random-variable approach. Once the optimal channel (or a group of channels) is detected, these channel(s) are modulated deterministically.
- the pulse timing-dependent plasticity is used as the long-term modulation to potentiate (amplify) or suppress certain transmission channels.
- the detector (such as the detector 135 in FIG. 1A ) generates the detection signal 154 that is based on the pulse group 147 .
- t c is the arrival time of the last input pulse to the detector on channel c, and ƒ(t) is a predetermined detector gain function.
- the transmission channel gain is maintained within a predetermined range, typically between zero and a maximum allowable value, that is configured based on a specific implementation.
- FIG. 2E depicts an exemplary channel modulating function.
- the magnitude of the exemplary gain adjustment of FIG. 2E is configured with respect to the time interval between the received pulse t p and the detection pulse t f (such as the detection pulse 154 of FIG. 1A ) generated in response to the pulse group (for example, the pulse group 147 of FIG. 1A ).
- when the pulse arrives at the detector before the detection signal (branch 296 ), the respective channel is amplified, with the most recent contributions (smaller |t f − t p |) amplified the most.
- the channels that transmit pulses that correspond to pulse groups prior to the detection signal are amplified, but the amount of extra gain is greatly reduced due to a fast decaying characteristic of the gain function (such as for example of Eqn. 18), thus making these older channel pulses less relevant.
- when the pulse arrives at the detector after the detection signal (branch 298 in FIG. 2E ), the respective channel is suppressed, with the most recent contributions (smaller |t f − t p |) suppressed the most.
- the channel gain function g(t) is selected in the same form as the detector gain function ƒ(t).
- the parameters of Eqns. 17 and 19 are selected, e.g., such that the transmission channel gain is modulated within a bounded range that is specific to a particular application. This approach enables the detecting apparatus to amplify the transmission channels that frequently carry input pulses (within the predetermined time window) prior to the detection pulse, and to suppress the other channels that deliver pulses inconsistent with the temporally proximate views of the object (see description of FIG. 1A supra).
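- The two branches of FIG. 2E can be sketched as exponential STDP-like windows. The amplitudes, time constants, and clipping bound below are illustrative assumptions, not the patent's parameter values:

```python
import math

def stdp_gain_update(g, t_pulse, t_fire, a_plus=0.10, a_minus=0.12,
                     tau_plus=20.0, tau_minus=20.0, g_max=5.0):
    """Pulse timing-dependent channel gain modulation: a channel whose
    pulse precedes the detection pulse is amplified, a channel whose
    pulse follows it is suppressed, with the effect decaying
    exponentially as |t_fire - t_pulse| grows. The gain is kept within
    the allowed range [0, g_max]."""
    dt = t_fire - t_pulse
    if dt >= 0:   # pulse arrived before (or at) the detection signal: amplify
        g += a_plus * math.exp(-dt / tau_plus)
    else:         # pulse arrived after the detection signal: suppress
        g -= a_minus * math.exp(dt / tau_minus)
    return min(max(g, 0.0), g_max)
```

Repeated application across frames amplifies channels that consistently carry pulses just before detections and suppresses the rest.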
- the pulse timing-dependent plasticity provides for a selection of the smallest set of channels that are required for detection of each specific object.
- a detector responding with detection pulses to a particular group of pulses that arrives over a set of channels causes amplification of these channels and reduction of other channels using pulse timing-dependent plasticity (for example of FIG. 2D ). Over time, only the channels that contribute to that particular pulse group sequence remain active. However, as the plasticity of the detector apparatus continues to amplify active channels, the detector generates detection pulses before all of the pulses (within the pulse group) arrive at the detector (see description with respect to FIG. 1A above). As a result, the pulse timing plasticity in effect selects a subset of channels that carry a prefix of the group of pulses that the detector reports, thus advantageously resulting in the smallest possible channel subset and allowing for faster detection.
- pulse timing dependent plasticity is used in combination with a recovery mechanism which amplifies suppressed channels over a longer time scale.
- This approach advantageously provides a recovery mechanism that prevents detector input starvation: i.e., where all of the inputs to a particular detector (such as the detector 135 in FIG. 1A ) are suppressed (inactive) due to an adverse combination of different factors (such as input, channel noise, detector and channel gain, detector inhibition, etc.).
- the recovery mechanism enables amplification of (at least a portion of) the transmission channels over a certain period of time t rec so that the detector becomes active.
- the detection apparatus achieves increased susceptibility to certain inputs by modulating the strength of pulse timing-dependent plasticity dependent on the detector activity history.
- the potentiation of a transmission channel in response to a detection pulse is increased if the detector has been previously active (such as the detector generated one or more detection pulses during a predetermined time window prior to the detection pulse 154 ).
- channel suppression is reduced following pulse delivery to a recently active detector.
- channel modulation timescales t mod are configured to be at least an order of magnitude longer, as compared to the channel gain adjustment t gch and detector excitability adjustment t det timescales described above.
- the slow channel modulation is configured to depend on the overall activity of the detector apparatus, and is effective over thousands of frames.
- one or more detector nodes are configurable to affect activity of other nodes by providing a signal (or signals). These signals may promote or inhibit secondary node activity and control the degree of inhibition, resulting in a competition between nodes. Such competition advantageously facilitates the detection apparatus recognizing multiple objects invariantly with respect to temporally consistent transformations of these objects. In turn, in one approach, multiple nodes are configured to mutually inhibit each other.
- an inhibited detector is still allowed to generate detection signals (provided it receives a sufficiently favorable input).
- FIG. 3 is a block diagram of an exemplary detector network apparatus configured for invariant object recognition using detector inhibition according to one embodiment of the invention.
- the apparatus 300 comprises several banks of detectors 110 , 114 , that are similar to the detectors described above with respect to FIG. 1 .
- different detectors of the same bank are denoted by a “_n” designator, such that for example designator 110 _ 1 denotes the first detector of the detector bank 110 .
- the detectors of the detector bank 110 are coupled to an encoder (for example, the encoder 104 of FIG. 1 ) via a plurality of transmission channels (not shown), and to the downstream detector bank 114 via a second plurality of transmission channels 112 . Furthermore, some or all individual detectors within each bank are in operable communication with each other via links 301 - 304 .
- the links 301 - 304 are effected via transmission channels that are similar to the channels 112 .
- links 301 - 304 are implemented via simple communication channels, such as side-band channels.
- the channels delivering inhibition signals between detectors (such as the channels 301 , 302 ) are implemented differently compared to the transmission channels (such as the channels 112 ).
- different detectors of the detection apparatus 300 are linked via a one way link ( 302 , 304 ) or a two way link ( 301 , 303 ). These can span immediate nodes (for example links 301 , 302 ) or distant nodes ( 303 , 304 ). Many other topographies (such as one-to-all, etc.) will be readily recognized and implemented by one of ordinary skill when provided the present disclosure, and are equally compatible with the present invention.
- the links 301 - 304 are used by the nodes to transmit and receive detector inhibition according to any of the mechanisms described in detail below.
- each of the detectors 110 _ 1 , 110 — n contains logic configured to recognize a predetermined pattern of pulses in the encoded signal 106 , and to produce a detection output signal transmitted over the communication channels 112 .
- Each of the detectors within the upstream detector bank 110 generates detection signals (with appropriate latency) that propagate with different conduction delays on channels 112 _ 1 , 112 — n to detectors of the downstream detector bank 114 .
- the detector cascade of the embodiment of FIG. 3 may contain any practical number of detector nodes and detector banks determined, inter alia, by the software/hardware resources of the detection apparatus and complexity of the objects being detected.
- the detector apparatus 300 comprises a few detector nodes (for example, fewer than 10), each of which is configurable to inhibit any other node. In another implementation (comprising a large number of nodes), each detector node is configured to inhibit a smaller subset (typically 5-10) of neighboring nodes.
- One approach, commonly referred to as “hard” inhibition or winner take all (WTA), prevents inhibited nodes from generating detection signals altogether. Still another approach, commonly referred to as “soft” inhibition, impedes object detection by the other nodes while still allowing generation of the detection signals.
- such inhibition is effected via an increase of the detection threshold of the second nodes.
- an additional delay is used to delay detection signal output from the secondary nodes. In the latter case, it is possible that two or more detector nodes report the same object of interest. However, the responses by the secondary nodes are delayed with respect to the primary node response.
- node inhibition is configured to reduce the magnitude of the detection pulse generated by the secondary node. A combination of the above and/or similar approaches may also be used consistent with the present invention.
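The soft-inhibition variants enumerated above (raised detection threshold, added response delay, reduced pulse magnitude) can be sketched as follows; the class, method names, and numeric values are illustrative assumptions, not taken from the patent:

```python
# Sketch of "soft" inhibition: an inhibited detector may still fire,
# but with a raised threshold, an added output delay, and a reduced
# pulse amplitude. All parameter values are illustrative.

class SoftInhibitedDetector:
    def __init__(self, threshold=1.0, delay=0.0, amplitude=1.0):
        self.base_threshold = threshold
        self.base_delay = delay
        self.base_amplitude = amplitude
        self.inhibited = False

    def inhibit(self):
        self.inhibited = True

    def release(self):
        self.inhibited = False

    def respond(self, drive):
        """Return (fired, latency, amplitude) for a given input drive."""
        threshold = self.base_threshold + (0.5 if self.inhibited else 0.0)
        if drive < threshold:
            return (False, None, None)
        latency = self.base_delay + (0.3 if self.inhibited else 0.0)
        amplitude = self.base_amplitude * (0.5 if self.inhibited else 1.0)
        return (True, latency, amplitude)
```

A sufficiently favorable input still elicits a response from an inhibited detector, only later and with a smaller pulse, which matches the "soft" behavior described above.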
- the inhibition remains in effect until the arrival of the next pulse group (frame).
- the nodes remain inhibited for more than one frame.
- many other inhibition schemes are equally compatible and useful with the present invention, such as a combination of hard/soft inhibition rules configured over varying time periods (for example, some nodes are soft inhibited over a first number of frames, while other nodes are hard inhibited over a second number of frames).
- inhibition of one detector (for example 110 _ 1 in FIG. 3 ) by another detector (for example 110 _ 2 in FIG. 3 ) is configured to diminish exponentially over time.
- Such inhibition configuration allows the 110 _ 1 detector to respond to a particular object once the inhibition signal drops below a certain threshold. The inhibition level is maintained above the threshold by periodic transmissions of the inhibition signal by the detector 110 _ 2 .
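A minimal sketch of this exponentially diminishing inhibition, assuming an exponential decay kernel and illustrative constants:

```python
import math

# Inhibition decays exponentially after the last inhibition signal; the
# inhibited detector may respond again once the level falls below a
# threshold. Time constant, initial level, and threshold are illustrative.

def inhibition_level(i0, tau, t_since_inhibition):
    """Inhibition level t_since_inhibition frames after the last signal."""
    return i0 * math.exp(-t_since_inhibition / tau)

def may_respond(i0, tau, t_since_inhibition, threshold=0.2):
    """True once the decayed inhibition drops below the threshold."""
    return inhibition_level(i0, tau, t_since_inhibition) < threshold
```

With i0 = 1.0 and tau = 2.0 frames, the detector remains blocked one frame after an inhibition pulse but is free to respond again after four frames; periodic retransmission of the inhibition signal resets the clock and keeps the level above threshold.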
- a node that is inhibited from responding to representations of a first object responds to views of other objects. For example, consider an input sequence of frames containing representations of object A, frequently followed by representations of object B.
- a detector (such as, for example, the detector 335 in FIG. 3A ) that detects input patterns relating to object A also learns to detect object B via the mechanisms described above.
- the input frame sequence further contains representations of objects C, D, E, F, and G, which are not detected by the detector 335 (for example, due to having a less pronounced temporal structure compared to the representations of objects A and B).
- a second detector (such as, for example, the detector 336 in FIG. 3A ) that is inhibited by the detector 335 from responding to representations of objects A and B is adjusted to respond to the representation of object C.
- Another approach combines multiple node inhibition with the long-term modulation of transmission channels described supra.
- This approach advantageously allows adjustment of dynamic parameters (gain, transmission delay, detection pulse amplitude and timing, etc.) of individual detectors and transmission channels given the appropriate input frames (also referred to as the “training” input or cycle).
- Upon performing a training cycle, the object detection apparatus becomes responsive to a certain set of objects, each response being invariant to temporally proximate views of the objects.
- the apparatus 300 is configured to receive a sequence of input frames 301 - 305 .
- Some of the input frames ( 301 - 302 , 304 - 305 ) contain representations of one of two objects ( 321 , 322 ), separated by one or more frames (such as the frame 303 , also referred to as the blank frame) where the objects 321 , 322 are not present.
- Such situations occur frequently in the physical world, where objects do not suddenly change into one another, but instead appear in series of consecutive frames interleaved by blank frames.
- While a single blank frame 303 is shown in FIG. 3A , several blank frames may be present in the input sequence of frames.
- blank frames contain representations of objects other than the objects 321 , 322 .
- the temporal separation of objects as shown in FIG. 3A allows the apparatus to distinguish between different objects and to report the detection via different detectors.
- the increased excitability of detector 355 which responds to the first object 321 decays over the duration of the blank frame 313 (or a plurality of blank frames in general) and by the time the second object 322 appears in frame 314 , it is less likely to be confused with the first object 321 by, for example, the detector 355 .
- the detector 356 has an opportunity to respond to the second object 322 .
- otherwise, the detection apparatus may respond to both object representations as though they are a single object.
- objects typically do not change into each other so abruptly.
- representations of the same object with respect to some transformation (e.g., rotation or scaling) may appear in consecutive frames.
- the objects 321 , 322 are being subjected to a rotational transformation. It is appreciated by those skilled in the art that a variety of other transformations are useable and compatible with the present invention, such as was described with respect to FIG. 1A .
- the detection apparatus 300 of FIG. 3 includes an encoder module configured to encode each of the input frames 301 - 305 into a respective pattern (group) of pulses 341 - 345 that propagate along the transmission channels 131 - 134 .
- the detection apparatus 300 comprises two (or more) detectors whose activity is depicted by the traces 355 , 356 , respectively.
- the detector response trace 355 contains the detection pulses 333 , 334 generated in response to receiving the pulse patterns 341 , 342 , indicating the presence of the object 321 in the frames 301 , 302 .
- the frame 302 contains a weak representation of the object 321 , due, for example, to the object moving out of the sensing field or fading away.
- the detector 355 is configured without learning functionality and it, therefore, does not respond to the pulse pattern 342 , because the weaker input does not move the detector state sufficiently towards the target set (firing set) S.
- the detection apparatus 300 is configured to adjust the state of the detector 335 (according to, inter alia, Eqn. 10), and to increase detector excitability upon generation of the detection signal 333 in response to the pulse group 341 .
- This adjustment moves the detector state closer to the target state prior to receipt of the subsequent pulse group 343 .
- Higher detector excitability aids the detector 355 in recognizing the object of interest in the pulse pattern 342 , causing generation of the detection pulse 334 .
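The interplay described above (a detection raises excitability, which then helps a weaker subsequent representation cross threshold) can be sketched with a scalar drive; the threshold, boost, and decay values are illustrative assumptions:

```python
# Sketch of transient excitability increase: after a detection, the
# detector state is moved closer to the firing threshold, so a weaker
# representation of the same object in the next frame still fires.
# All constants are illustrative.

THRESHOLD = 1.0
BOOST = 0.4          # excitability adjustment applied after each detection
DECAY = 0.5          # fraction of the boost surviving each non-firing frame

def run(frames):
    """frames: list of input drives; returns a list of fired flags."""
    boost = 0.0
    fired = []
    for drive in frames:
        f = (drive + boost) >= THRESHOLD
        fired.append(f)
        boost = BOOST if f else boost * DECAY
    return fired
```

`run([1.0, 0.7, 0.0, 0.7])` fires on the strong frame and on the weak frame immediately following it (0.7 + 0.4 crosses threshold), but not on the same weak frame arriving after a blank frame (the boost has decayed to 0.2), mirroring the behavior of the detector 355 with and without the excitability adjustment.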
- the detector apparatus 300 of FIG. 3A further comprises mutual detector inhibition, which is illustrated by the detector response traces 355 , 356 .
- As the detector 355 produces detection pulses 333 , 334 , it prevents the detector 356 from detecting the same object by transmitting an inhibition signal (not shown).
- This approach advantageously ensures that only a single detector (for example, the detector 355 ) produces detection output for the specific object (for example, 321 ).
- the blank frame 303 does not trigger detection signal generation by either detector 355 , 356 as the frame 303 contains no relevant object representations.
- the increased susceptibility of the detector node 355 diminishes subsequent to the frame 303 .
- the frames 304 , 305 in FIG. 3 each contain representations of a different object ( 322 ), which the detector 356 is configured to recognize. As with the object 321 described supra, the detector 356 is configured to inhibit responses of the detector 355 , as indicated by the absence of detection pulses in the trace 355 corresponding to the detection pulses 336 , 337 generated in response to the frames 304 , 305 . Such “selective filtering” significantly simplifies operation of downstream detectors (not shown), which no longer need to deal with weaker and/or late detection responses that may be produced by the detector 355 in response to the frames 304 , 305 , or by the detector 356 in response to the frames 301 , 302 .
- FIG. 4 illustrates another exemplary embodiment of object recognition apparatus according to the invention.
- the apparatus 400 is configured to recognize data frames comprising a mix of two different objects. Similar to the embodiment described above with respect to FIG. 3 , the detection apparatus 400 comprises an encoder module configured to encode each of the input frames 401 - 405 into a respective pattern (group) of pulses 441 - 445 that propagate along the transmission channels 131 - 134 .
- the detector apparatus 400 further comprises two detectors whose activity is depicted by the traces 455 , 456 .
- the apparatus 400 receives some number of blank frames ( 401 , 402 ).
- the blank frames 401 - 402 are nevertheless encoded into pulse patterns 441 - 442 .
- pulse patterns 441 - 442 include pulses propagating only on channels 151 , 152 and neither of the detectors 455 , 456 responds to the pulse patterns 441 - 442 .
- the blank frame comprises the object of interest for a certain detector that is configured to respond to blank frames.
- both detectors 455 , 456 remain in approximately similar dynamic states, and are characterized by comparable excitability to subsequent inputs.
- the three subsequent input frames 403 - 405 in FIG. 4 comprise representations of an object 406 that is a composite of two different sub-objects.
- the sub-objects are the objects 321 , 322 of FIG. 3 described above.
- the frames 402 - 405 are encoded into the respective pulse groups 443 - 445 .
- Each of the detectors 455 , 456 is configured to detect and respond to representations of the specific sub-object 321 , 322 , respectively. This is illustrated by the detection pulses 432 , 436 generated by the detectors 455 , 456 , respectively, for example, responsive to the receipt of the pulse group 443 .
- the detector 455 is the primary node, as it is first to generate the detection response (the pulse 432 occurs prior to the pulse 436 ).
- the primary node prevents the secondary node (the detector 456 ) from generating subsequent detection responses to the same object representation (subsequent potential responses are denoted by dashed pulses 435 , 438 ), even though the corresponding input frames contain representation of the sub-object 322 .
- the inhibition feature of the detection apparatus 400 ensures that only the detector 455 generates the detection signals 434 , 435 in response to the composite object 406 represented in frames 404 , 405 .
- the apparatus 400 in such case reports only the object of interest on the channel corresponding to the detector 455 while suppressing reporting of the presence of the other features. Once the detectors of the apparatus 400 detect the object of interest, they in effect become “specialized detectors” that keep responding to various representations of that object.
- small variations within input frames may cause the detector 456 to become the primary node by responding first to the pulse group 443 .
- the primary node inhibits other detectors (the detector 455 ) from producing detection signals for the subsequent frames.
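The primary/secondary competition described above can be sketched as a latency race: whichever detector responds first to a frame becomes the primary node and suppresses the others' reports on subsequent frames. Detector labels and latency values below are illustrative assumptions:

```python
# Primary-node selection by response latency, with subsequent suppression
# of secondary detectors. Latencies are illustrative (None = no response).

def select_primary(latencies):
    """latencies: dict detector -> response latency (None = no response)."""
    responders = {d: t for d, t in latencies.items() if t is not None}
    if not responders:
        return None
    return min(responders, key=responders.get)

def run_sequence(frame_latencies):
    """For each frame, only the primary node's detection is reported;
    secondary would-be responses (cf. dashed pulses 435, 438) are dropped."""
    primary = None
    reports = []
    for latencies in frame_latencies:
        if primary is None:
            primary = select_primary(latencies)
        reports.append(primary if latencies.get(primary) is not None else None)
    return reports
```

Once one detector wins the first frame, the other detector's later (slower) responses to the same composite object are never reported, even on frames where it would otherwise have fired.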
- In the illustrated embodiment, channel or detector adjustment is configurable to last over longer periods of time.
- channel long-term modulation persists over several hundreds or thousands of frame sequences (such as frame sequence described with respect to FIG. 1A supra).
- t_mod is greater than 1000 frames at 24 fps.
- There are three time scales relevant to the exemplary embodiment of the invention: (i) the dynamics of detectors (the shortest scale); (ii) the transformation time scale (where temporal proximity applies); and (iii) the long time scale, on which the system can adjust its parameters to better detect objects. Detector adjustments are transient in all cases; that is, they span only a few frames (1-10). The inhibition may last from under one frame (on the order of 20 ms) up to a few frames.
- On the intermediate time scale, embodiments of the invention implement an increased excitability; i.e., once an object has been detected, it is more likely to be reported in the subsequent frame. On the long time scale, this means that if frame A is statistically often followed by frame B, the detector will learn to report frame B even if it only reported frame A in the beginning.
- This functionality can be achieved by pulse timing dependent plasticity alone, provided that on the intermediate time scale there is an increased excitability period.
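A minimal sketch of this mechanism, assuming two input channels (one active for pattern A, one for pattern B), a transient excitability boost after each detection, and a simple timing-dependent gain increment for channels active at firing; all constants and the training routine are illustrative:

```python
# A detector initially responsive only to pattern A (channel 0) learns to
# respond to pattern B (channel 1) because B reliably follows A: the
# post-detection excitability boost lets B trigger a pulse, and each such
# pulse potentiates channel 1. All constants are illustrative.

THRESHOLD = 1.0
BOOST = 0.95     # transient excitability following a detection
RATE = 0.05      # per-event long-term gain increment

def train(sequence, gains, epochs=40):
    """sequence: list of per-channel activity vectors (one per frame)."""
    gains = list(gains)
    for _ in range(epochs):
        boost = 0.0
        for x in sequence:
            drive = sum(g * xi for g, xi in zip(gains, x)) + boost
            fired = drive >= THRESHOLD
            if fired:
                # potentiate channels active just before the output pulse
                gains = [min(1.0, g + RATE * xi) for g, xi in zip(gains, x)]
            boost = BOOST if fired else 0.0
    return gains
```

Starting from gains [1.0, 0.1], pattern B alone cannot fire the detector (drive 0.1), but after training on the A-then-B sequence, its channel gain saturates at the maximum, so B is reported even without a preceding A.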
- Channel modulation may be transient as well (sometimes referred to as “short-term plasticity”), which can implement increased susceptibility as described above.
- Long-term channel modulation (colloquially “learning”) works on slower time scales. In fact, the slight changes to the channel gain may appear frequently or even constantly, but it will take anywhere from hundreds up to tens of thousands of frames to actually change the gain from zero to some maximum allowed value.
- the short term gain modulation may be proportional to the long term gain, in which case weak channels are allowed only weak transient effects.
- Other solutions, in which the short-term modulation is, in general, a function of channel parameters determined by long-term learning, are also possible.
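One way to sketch this coupling of short-term modulation to the long-term gain, with an assumed proportionality constant:

```python
# Short-term (transient) modulation scaled by the long-term learned gain:
# weak channels are allowed only weak transient effects. The scaling
# factor k is an illustrative assumption.

def effective_gain(long_term_gain, recently_active, k=0.5):
    """Channel gain including a transient boost proportional to the
    long-term gain, applied only while the channel is recently active."""
    transient = k * long_term_gain if recently_active else 0.0
    return long_term_gain + transient
```

A strong channel (gain 1.0) gets a large transient boost, while a weak channel (gain 0.1) gets a correspondingly small one, so transient effects cannot dominate channels that long-term learning has left weak.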
- Pulse coded representations advantageously allow for a very high representation capacity in an object recognition apparatus, as compared to rate encoding. This is due to, inter alia, the invariant properties of pulse encoding, which produces the same encoding (invariant output) for many different representations of the object (input). Additionally, by simultaneously connecting a single detector to several encoders (and other detectors), the same detector can receive and process signals representing different objects, thereby effecting resource reuse while increasing processing throughput.
- the detectors of the network are dynamically reused, wherein some or all detectors are a part of many “detection sequences”; i.e. cascades of pulses going through that system.
- signal representation throughout the detector cascade is uniformly configured. That is, detector output is of the same form as the input, thereby allowing dynamic detector network configuration without the need for supplementary signal conditioning.
- An object detection apparatus equipped with long-term channel modulation based on pulse timing plasticity offers a competitive mechanism that enables some of the detectors to specialize in certain objects of interest.
- the specialized detector responds much more quickly (compared to the non-specialized detectors) to input patterns of interest, leaving other detectors inactive by the use of an inhibition mechanism.
- some of the detectors are left unspecialized, thus providing redundancy in the detector network.
- pulse timing dependent plasticity, wherein the subsequent dynamics of detectors and transmission channels are determined in part by prior activity and/or prior input, enables the detector apparatus to adapt its configuration (learn) and to develop invariant recognition properties through learning and adaptation.
- exemplary embodiments of the present invention are useful in a variety of devices including without limitation prosthetic devices, autonomous and robotic apparatus, and other electromechanical devices requiring object recognition functionality.
- Examples of robotic devices include manufacturing robots (e.g., automotive), military robots, and medical robots (e.g., for processing of microscopy, x-ray, ultrasonography, and tomography imagery).
- Examples of autonomous vehicles include rovers, unmanned air vehicles, underwater vehicles, smart appliances (e.g., ROOMBA®), etc.
- Embodiments of the present invention are further applicable to a wide assortment of applications including computer human interaction (e.g., recognition of gestures, voice, posture, face, etc.), controlling processes (e.g., an industrial robot, autonomous and other vehicles), augmented reality applications, organization of information (e.g., for indexing databases of images and image sequences), access control (e.g., opening a door based on a gesture, opening an access way based on detection of an authorized person), detecting events (e.g., for visual surveillance or people or animal counting, tracking), data input, financial transactions (payment processing based on recognition of a person or a special payment symbol) and many others.
- the present invention can be used to simplify tasks related to motion estimation, such as where an image sequence is processed to produce an estimate of the object position (and hence velocity) either at each point in the image or in the 3D scene, or even of the camera that produced the images.
- Examples of such tasks are: ego motion, i.e., determining the three-dimensional rigid motion (rotation and translation) of the camera from an image sequence produced by the camera; and following the movements of a set of interest points or objects (e.g., vehicles or humans) in the image sequence with respect to the image plane.
- portions of the object recognition system are embodied in a remote server, comprising a computer readable apparatus storing computer executable instructions configured to perform pattern recognition in data streams for various applications, such as scientific, geophysical exploration, surveillance, navigation, data mining (e.g., content-based image retrieval).
Description
$$\vec{\nu} := G(\vec{\nu}) \tag{Eqn. 2}$$

where $G(\vec{\nu})$ is a predetermined reset function. The increase in detector sensitivity to subsequent inputs is achieved by introducing a state adjustment variable $\vec{e}$, which describes the state vector adjustment such that for all $\vec{\nu}$ the following condition is satisfied:

$$\|\vec{\nu}+\vec{e},\,S\| < \|\vec{\nu},\,S\|, \tag{Eqn. 3}$$

where $\|\cdot,\,S\|$ denotes the distance to the set $S$. In one approach, the distance $\|\cdot,\,S\|$ is a Euclidean norm.
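The condition of Eqn. 3 can be checked numerically for the Euclidean case; the example state, adjustment vector, and target set below are illustrative assumptions:

```python
import math

# Verify the Eqn. 3 condition: the adjustment vector e must bring the
# state v strictly closer (in Euclidean distance) to the target set S.
# The points used here are illustrative.

def dist_to_set(v, S):
    """Euclidean distance from point v to the nearest point of a finite set S."""
    return min(math.dist(v, s) for s in S)

def satisfies_eqn3(v, e, S):
    """True if ||v + e, S|| < ||v, S||."""
    v_adj = tuple(vi + ei for vi, ei in zip(v, e))
    return dist_to_set(v_adj, S) < dist_to_set(v, S)
```

An adjustment pointing toward the target set satisfies the condition, while one pointing away from it does not.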
Using Eqn. 4, the vector state Eqn. 2 becomes:
where $F_1$ and $F_2$ are state functions governing the dynamics of the state variables $\nu_1$ and $\nu_2$, respectively.
where parameters of the Eqn. 7 are set as follows:
$$a=0.007,\ b=0.7,\ c=16.8,\ d=-0.007,\ p=-0.06,\ q=-3.6,\ r=-0.03. \tag{Eqn. 8}$$

$$\nu_1 := -50, \qquad \nu_2 := \nu_2 + 100. \tag{Eqn. 9}$$

$$e_1 := 106. \tag{Eqn. 10}$$
where $g(t)$ is the channel gain, and $f(g,\nu)$ is the channel response function. In one approach, the transmission channel is modeled as the N-Methyl-D-aspartate (NMDA) channel, where $f(g_{NMDA})$ is a function of the channel parameter $g_{NMDA}$:

$$g_{NMDA} := g_{NMDA} + I(t). \tag{Eqn. 15}$$

$$g_{NMDA} = \sum_{ch} g^{NMDA}_{ch}, \tag{Eqn. 16}$$

where $g_{NMDA}$ is equivalent to a single channel gain parameter of Eqn. 12. That is, for a linear channel superposition, an adjustment of the detector gain is equivalent to a gain adjustment of all respective channels, provided that the condition of Eqn. 16 holds. In one approach, this equivalence advantageously simplifies implementation of the detector apparatus by implementing the gain adjustment $g_{NMDA}$ in the detector node itself, allowing for a simpler channel configuration. The functionality obtained is exactly the same as above, but the increased susceptibility to subsequent input is now a property of the pulse-carrying channel. Other channel superposition implementations and/or channel scaling are usable with the exemplary embodiment of the present invention, provided that the condition of Eqn. 16 is satisfied.
The channel gain is set to a value (predetermined or dynamically derived), which may be unique to that particular channel or shared among many channels.
$$df_c = f(t - t_c) \tag{Eqn. 17}$$

$$f(t) = \alpha e^{-t}. \tag{Eqn. 18}$$

$$dg_c = g(t - t_f), \tag{Eqn. 19}$$

where $t_f$ is the time of the last output pulse from the detector and $g(t)$ is a gain function. In order to implement a bounded detection system, the transmission channel gain is maintained within a predetermined range, typically between zero and a maximum allowable value, which is configured based on a specific implementation.
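A sketch of this bounded update in the spirit of Eqn. 19, assuming an exponentially decaying gain function and illustrative constants:

```python
import math

# Bounded long-term gain update: the per-event increment depends on the
# time since the detector's last output pulse, and the accumulated gain
# is clipped to [0, G_MAX]. The kernel and constants are illustrative.

G_MAX = 1.0

def gain_increment(t, t_f, alpha=0.1, tau=1.0):
    """Illustrative gain function g(t - t_f): decaying exponential kernel,
    zero before the detector's last output pulse."""
    dt = t - t_f
    return alpha * math.exp(-dt / tau) if dt >= 0 else 0.0

def update_gain(g, t, t_f):
    """Apply one bounded update; the gain stays within [0, G_MAX]."""
    return min(G_MAX, max(0.0, g + gain_increment(t, t_f)))
```

A channel already near the maximum is clipped at G_MAX rather than growing without bound, and events preceding the last output pulse contribute nothing.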
Claims (35)
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/152,105 US9122994B2 (en) | 2010-03-26 | 2011-06-02 | Apparatus and methods for temporally proximate object recognition |
PCT/US2012/040567 WO2012167164A1 (en) | 2011-06-02 | 2012-06-01 | Apparatus and methods for temporally proximate object recognition |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US31819110P | 2010-03-26 | 2010-03-26 | |
US12/869,573 US8315305B2 (en) | 2010-03-26 | 2010-08-26 | Systems and methods for invariant pulse latency coding |
US12/869,583 US8467623B2 (en) | 2010-03-26 | 2010-08-26 | Invariant pulse latency coding systems and methods systems and methods |
US13/117,048 US9311593B2 (en) | 2010-03-26 | 2011-05-26 | Apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices |
US13/152,105 US9122994B2 (en) | 2010-03-26 | 2011-06-02 | Apparatus and methods for temporally proximate object recognition |
Publications (2)
Publication Number | Publication Date |
---|---|
US20120308076A1 US20120308076A1 (en) | 2012-12-06 |
US9122994B2 true US9122994B2 (en) | 2015-09-01 |
Family
ID=47259923
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/152,105 Expired - Fee Related US9122994B2 (en) | 2010-03-26 | 2011-06-02 | Apparatus and methods for temporally proximate object recognition |
Country Status (2)
Country | Link |
---|---|
US (1) | US9122994B2 (en) |
WO (1) | WO2012167164A1 (en) |
US9367798B2 (en) | 2012-09-20 | 2016-06-14 | Brain Corporation | Spiking neuron network adaptive control apparatus and methods |
US9311594B1 (en) | 2012-09-20 | 2016-04-12 | Brain Corporation | Spiking neuron network apparatus and methods for encoding of sensory data |
US8793205B1 (en) | 2012-09-20 | 2014-07-29 | Brain Corporation | Robotic learning and evolution apparatus |
US9189730B1 (en) | 2012-09-20 | 2015-11-17 | Brain Corporation | Modulated stochasticity spiking neuron network controller apparatus and methods |
US9082079B1 (en) | 2012-10-22 | 2015-07-14 | Brain Corporation | Proportional-integral-derivative controller effecting expansion kernels comprising a plurality of spiking neurons associated with a plurality of receptive fields |
US9111226B2 (en) | 2012-10-25 | 2015-08-18 | Brain Corporation | Modulated plasticity apparatus and methods for spiking neuron network |
US9183493B2 (en) | 2012-10-25 | 2015-11-10 | Brain Corporation | Adaptive plasticity apparatus and methods for spiking neuron network |
US9218563B2 (en) | 2012-10-25 | 2015-12-22 | Brain Corporation | Spiking neuron sensory processing apparatus and methods for saliency detection |
US9275326B2 (en) | 2012-11-30 | 2016-03-01 | Brain Corporation | Rate stabilization through plasticity in spiking neuron network |
US9123127B2 (en) | 2012-12-10 | 2015-09-01 | Brain Corporation | Contrast enhancement spiking neuron network sensory processing apparatus and methods |
US9195934B1 (en) | 2013-01-31 | 2015-11-24 | Brain Corporation | Spiking neuron classifier apparatus and methods using conditionally independent subsets |
US9177245B2 (en) | 2013-02-08 | 2015-11-03 | Qualcomm Technologies Inc. | Spiking network apparatus and method with bimodal spike-timing dependent plasticity |
US8996177B2 (en) | 2013-03-15 | 2015-03-31 | Brain Corporation | Robotic training apparatus and methods |
US9008840B1 (en) | 2013-04-19 | 2015-04-14 | Brain Corporation | Apparatus and methods for reinforcement-guided supervised learning |
KR102143225B1 (en) * | 2013-05-06 | 2020-08-11 | 삼성전자주식회사 | Method and apparatus for transmitting spike event information of neuromorphic chip, and neuromorphic chip |
US9384443B2 (en) | 2013-06-14 | 2016-07-05 | Brain Corporation | Robotic training apparatus and methods |
US9436909B2 (en) | 2013-06-19 | 2016-09-06 | Brain Corporation | Increased dynamic range artificial neuron network apparatus and methods |
US9239985B2 (en) | 2013-06-19 | 2016-01-19 | Brain Corporation | Apparatus and methods for processing inputs in an artificial neuron network |
US9552546B1 (en) | 2013-07-30 | 2017-01-24 | Brain Corporation | Apparatus and methods for efficacy balancing in a spiking neuron network |
US9579789B2 (en) | 2013-09-27 | 2017-02-28 | Brain Corporation | Apparatus and methods for training of robotic control arbitration |
US9296101B2 (en) | 2013-09-27 | 2016-03-29 | Brain Corporation | Robotic control arbitration apparatus and methods |
US9489623B1 (en) | 2013-10-15 | 2016-11-08 | Brain Corporation | Apparatus and methods for backward propagation of errors in a spiking neuron network |
US9463571B2 (en) | 2013-11-01 | 2016-10-11 | Brain Corporation | Apparatus and methods for online training of robots |
US9248569B2 (en) | 2013-11-22 | 2016-02-02 | Brain Corporation | Discrepancy detection apparatus and methods for machine learning |
US11385673B1 (en) * | 2014-01-03 | 2022-07-12 | David James Ellis | Digital data processing circuitry |
US9364950B2 (en) | 2014-03-13 | 2016-06-14 | Brain Corporation | Trainable modular robotic methods |
US9533413B2 (en) | 2014-03-13 | 2017-01-03 | Brain Corporation | Trainable modular robotic apparatus and methods |
US9987743B2 (en) | 2014-03-13 | 2018-06-05 | Brain Corporation | Trainable modular robotic apparatus and methods |
US20150278641A1 (en) * | 2014-03-27 | 2015-10-01 | Qualcomm Incorporated | Invariant object representation of images using spiking neural networks |
US9613308B2 (en) | 2014-04-03 | 2017-04-04 | Brain Corporation | Spoofing remote control apparatus and methods |
US9630317B2 (en) | 2014-04-03 | 2017-04-25 | Brain Corporation | Learning apparatus and methods for control of robotic devices via spoofing |
US9098753B1 (en) | 2014-04-25 | 2015-08-04 | Google Inc. | Methods and systems for object detection using multiple sensors |
US9195903B2 (en) | 2014-04-29 | 2015-11-24 | International Business Machines Corporation | Extracting salient features from video using a neurosynaptic system |
US9346167B2 (en) | 2014-04-29 | 2016-05-24 | Brain Corporation | Trainable convolutional network apparatus and methods for operating a robotic vehicle |
US9713982B2 (en) | 2014-05-22 | 2017-07-25 | Brain Corporation | Apparatus and methods for robotic operation using video imagery |
US9475422B2 (en) * | 2014-05-22 | 2016-10-25 | Applied Invention, Llc | Communication between autonomous vehicle and external observers |
US10194163B2 (en) | 2014-05-22 | 2019-01-29 | Brain Corporation | Apparatus and methods for real time estimation of differential motion in live video |
US9939253B2 (en) | 2014-05-22 | 2018-04-10 | Brain Corporation | Apparatus and methods for distance estimation using multiple image sensors |
US9373058B2 (en) | 2014-05-29 | 2016-06-21 | International Business Machines Corporation | Scene understanding using a neurosynaptic system |
US9848112B2 (en) | 2014-07-01 | 2017-12-19 | Brain Corporation | Optical detection apparatus and methods |
US10115054B2 (en) | 2014-07-02 | 2018-10-30 | International Business Machines Corporation | Classifying features using a neurosynaptic system |
US9798972B2 (en) | 2014-07-02 | 2017-10-24 | International Business Machines Corporation | Feature extraction using a neurosynaptic system for object classification |
US10057593B2 (en) | 2014-07-08 | 2018-08-21 | Brain Corporation | Apparatus and methods for distance estimation using stereo imagery |
US9860077B2 (en) | 2014-09-17 | 2018-01-02 | Brain Corporation | Home animation apparatus and methods |
US9849588B2 (en) | 2014-09-17 | 2017-12-26 | Brain Corporation | Apparatus and methods for remotely controlling robotic devices |
US9579790B2 (en) | 2014-09-17 | 2017-02-28 | Brain Corporation | Apparatus and methods for removal of learned behaviors in robots |
US9821470B2 (en) | 2014-09-17 | 2017-11-21 | Brain Corporation | Apparatus and methods for context determination using real time sensor data |
US10055850B2 (en) | 2014-09-19 | 2018-08-21 | Brain Corporation | Salient features tracking apparatus and methods using visual initialization |
US9881349B1 (en) | 2014-10-24 | 2018-01-30 | Gopro, Inc. | Apparatus and methods for computerized object identification |
US9426946B2 (en) | 2014-12-02 | 2016-08-30 | Brain Corporation | Computerized learning landscaping apparatus and methods |
US9840003B2 (en) | 2015-06-24 | 2017-12-12 | Brain Corporation | Apparatus and methods for safe navigation of robotic devices |
US10197664B2 (en) | 2015-07-20 | 2019-02-05 | Brain Corporation | Apparatus and methods for detection of objects using broadband signals |
US10295972B2 (en) | 2016-04-29 | 2019-05-21 | Brain Corporation | Systems and methods to operate controllable devices with gestures and/or noises |
US10241514B2 (en) | 2016-05-11 | 2019-03-26 | Brain Corporation | Systems and methods for initializing a robot to autonomously travel a trained route |
US9987752B2 (en) | 2016-06-10 | 2018-06-05 | Brain Corporation | Systems and methods for automatic detection of spills |
US10282849B2 (en) | 2016-06-17 | 2019-05-07 | Brain Corporation | Systems and methods for predictive/reconstructive visual object tracker |
US10016896B2 (en) | 2016-06-30 | 2018-07-10 | Brain Corporation | Systems and methods for robotic behavior around moving bodies |
US11238337B2 (en) * | 2016-08-22 | 2022-02-01 | Applied Brain Research Inc. | Methods and systems for implementing dynamic neural networks |
TWI622938B (en) * | 2016-09-13 | 2018-05-01 | 創意引晴(開曼)控股有限公司 | Image recognizing method for preventing recognition result from confusion |
US10274325B2 (en) | 2016-11-01 | 2019-04-30 | Brain Corporation | Systems and methods for robotic mapping |
US10001780B2 (en) | 2016-11-02 | 2018-06-19 | Brain Corporation | Systems and methods for dynamic route planning in autonomous navigation |
US10723018B2 (en) | 2016-11-28 | 2020-07-28 | Brain Corporation | Systems and methods for remote operating and/or monitoring of a robot |
US10377040B2 (en) | 2017-02-02 | 2019-08-13 | Brain Corporation | Systems and methods for assisting a robotic apparatus |
US10852730B2 (en) | 2017-02-08 | 2020-12-01 | Brain Corporation | Systems and methods for robotic mobile platforms |
US10293485B2 (en) | 2017-03-30 | 2019-05-21 | Brain Corporation | Systems and methods for robotic path planning |
US11565411B2 (en) * | 2019-05-29 | 2023-01-31 | Lg Electronics Inc. | Intelligent robot cleaner for setting travel route based on video learning and managing method thereof |
US11270127B1 (en) * | 2021-05-05 | 2022-03-08 | Marc Joseph Kirch | Synchronized pulses identify and locate targets rapidly |
Citations (108)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US5063603A (en) | 1989-11-06 | 1991-11-05 | David Sarnoff Research Center, Inc. | Dynamic method for recognizing objects and image processing system therefor |
US5138447A (en) | 1991-02-11 | 1992-08-11 | General Instrument Corporation | Method and apparatus for communicating compressed digital video signals using multiple processors |
US5216752A (en) | 1990-12-19 | 1993-06-01 | Baylor College Of Medicine | Interspike interval decoding neural network |
US5272535A (en) | 1991-06-13 | 1993-12-21 | Loral Fairchild Corporation | Image sensor with exposure control, selectable interlaced, pseudo interlaced or non-interlaced readout and video compression |
US5355435A (en) | 1992-05-18 | 1994-10-11 | New Mexico State University Technology Transfer Corp. | Asynchronous temporal neural processing element |
US5638359A (en) | 1992-12-14 | 1997-06-10 | Nokia Telecommunications Oy | Method for congestion management in a frame relay network and a node in a frame relay network |
US5652594A (en) | 1967-12-28 | 1997-07-29 | Lockheed Martin Corporation | Signal processor affording improved immunity to medium anomalies and interference in remote object detection system |
US5673367A (en) | 1992-10-01 | 1997-09-30 | Buckley; Theresa M. | Method for neural network control of motion using real-time environmental feedback |
RU2108612C1 (en) | 1994-09-14 | 1998-04-10 | Круглов Сергей Петрович | Adaptive control system with identifier and implicit reference model |
US5875108A (en) | 1991-12-23 | 1999-02-23 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6009418A (en) | 1996-05-02 | 1999-12-28 | Cooper; David L. | Method and apparatus for neural networking using semantic attractor architecture |
US6014653A (en) | 1996-01-26 | 2000-01-11 | Thaler; Stephen L. | Non-algorithmically implemented artificial neural networks and components thereof |
US6035389A (en) | 1998-08-11 | 2000-03-07 | Intel Corporation | Scheduling instructions with different latencies |
US20020038294A1 (en) * | 2000-06-16 | 2002-03-28 | Masakazu Matsugu | Apparatus and method for detecting or recognizing pattern by employing a plurality of feature detecting elements |
US6418424B1 (en) | 1991-12-23 | 2002-07-09 | Steven M. Hoffberg | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6458157B1 (en) | 1997-08-04 | 2002-10-01 | Suaning Gregg Joergen | Retinal stimulator |
US6509854B1 (en) | 1997-03-16 | 2003-01-21 | Hitachi, Ltd. | DA conversion circuit |
US20030050903A1 (en) | 1997-06-11 | 2003-03-13 | Jim-Shih Liaw | Dynamic synapse for signal processing in neural networks |
US6546291B2 (en) | 2000-02-16 | 2003-04-08 | Massachusetts Eye & Ear Infirmary | Balance prosthesis |
US6545708B1 (en) | 1997-07-11 | 2003-04-08 | Sony Corporation | Camera controlling device and method for predicted viewing |
US6545705B1 (en) | 1998-04-10 | 2003-04-08 | Lynx System Developers, Inc. | Camera with object recognition/data output |
US6581046B1 (en) | 1997-10-10 | 2003-06-17 | Yeda Research And Development Co. Ltd. | Neuronal phase-locked loops |
US6625317B1 (en) | 1995-09-12 | 2003-09-23 | Art Gaffin | Visual imaging system and method |
US20030216919A1 (en) | 2002-05-13 | 2003-11-20 | Roushar Joseph C. | Multi-dimensional method and apparatus for automated language interpretation |
US20040136439A1 (en) | 2002-11-15 | 2004-07-15 | Brandon Dewberry | Methods and systems acquiring impulse signals |
US20040170330A1 (en) | 1998-08-12 | 2004-09-02 | Pixonics, Inc. | Video coding reconstruction apparatus and methods |
US20040193670A1 (en) | 2001-05-21 | 2004-09-30 | Langan John D. | Spatio-temporal filter and method |
US20050015351A1 (en) | 2003-07-18 | 2005-01-20 | Alex Nugent | Nanotechnology neural network methods and systems |
US20050036649A1 (en) | 2001-08-23 | 2005-02-17 | Jun Yokono | Robot apparatus, face recognition method, and face recognition apparatus |
US20050283450A1 (en) | 2004-06-11 | 2005-12-22 | Masakazu Matsugu | Information processing apparatus, information processing method, pattern recognition apparatus, and pattern recognition method |
US20060094001A1 (en) | 2002-11-29 | 2006-05-04 | Torre Vicent E | Method and device for image processing and learning with neuronal cultures |
US20060129728A1 (en) | 2004-12-09 | 2006-06-15 | Hampel Craig E | Memory interface with workload adaptive encode/decode |
US20060161218A1 (en) | 2003-11-26 | 2006-07-20 | Wicab, Inc. | Systems and methods for treating traumatic brain injury |
US20070022068A1 (en) | 2005-07-01 | 2007-01-25 | Ralph Linsker | Neural networks for prediction and control |
US20070176643A1 (en) | 2005-06-17 | 2007-08-02 | Alex Nugent | Universal logic gate utilizing nanotechnology |
US20070208678A1 (en) | 2004-03-17 | 2007-09-06 | Canon Kabushiki Kaisha | Parallel Pulse Signal Processing Apparatus, Pattern Recognition Apparatus, And Image Input Apparatus |
US20080100482A1 (en) | 2003-05-27 | 2008-05-01 | Lazar Aurel A | Multichannel Time Encoding And Decoding Of A Signal |
JP4087423B2 (en) | 2006-10-17 | 2008-05-21 | 京セラミタ株式会社 | Portable communication device |
WO2008083335A2 (en) | 2006-12-29 | 2008-07-10 | Neurosciences Research Foundation, Inc. | Solving the distal reward problem through linkage of stdp and dopamine signaling |
US20080199072A1 (en) | 2003-02-27 | 2008-08-21 | Sony Corporation | Image processing device and method, learning device and method, recording medium, and program |
US20080237446A1 (en) | 2007-02-16 | 2008-10-02 | Texas Instruments Incorporated | Solid-state image pickup device and method |
WO2008132066A1 (en) | 2007-04-27 | 2008-11-06 | Siemens Aktiengesellschaft | A method for computer-assisted learning of one or more neural networks |
US20090043722A1 (en) | 2003-03-27 | 2009-02-12 | Alex Nugent | Adaptive neural network utilizing nanotechnology-based components |
US7580907B1 (en) | 2004-01-14 | 2009-08-25 | Evolved Machines, Inc. | Invariant object recognition |
US20090287624A1 (en) | 2005-12-23 | 2009-11-19 | Societe De Commercialisation De Produits De La Recherche Applique-Socpra-Sciences Et Genie S.E.C. | Spatio-temporal pattern recognition using a spiking neural network and processing thereof on a portable and/or distributed computer |
US7653255B2 (en) | 2004-06-02 | 2010-01-26 | Adobe Systems Incorporated | Image region of interest encoding |
US20100036457A1 (en) | 2008-08-07 | 2010-02-11 | Massachusetts Institute Of Technology | Coding for visual prostheses |
US20100081958A1 (en) | 2006-10-02 | 2010-04-01 | She Christy L | Pulse-based feature extraction for neural recordings |
US20100086171A1 (en) | 2008-10-02 | 2010-04-08 | Silverbrook Research Pty Ltd | Method of imaging coding pattern having merged data symbols |
US20100100482A1 (en) | 2007-01-23 | 2010-04-22 | Sxip Identity Corp. | Intermediate Data Generation For Transaction Processing |
US7737933B2 (en) | 2000-09-26 | 2010-06-15 | Toshiba Matsushita Display Technology Co., Ltd. | Display unit and drive system thereof and an information display unit |
US20100166320A1 (en) | 2008-12-26 | 2010-07-01 | Paquier Williams J F | Multi-stage image pattern recognizer |
US7765029B2 (en) | 2005-09-13 | 2010-07-27 | Neurosciences Research Foundation, Inc. | Hybrid control device |
US20100225824A1 (en) | 2007-06-28 | 2010-09-09 | The Trustees Of Columbia University In The City Of New York | Multi-Input Multi-Output Time Encoding And Decoding Machines |
US20100235310A1 (en) | 2009-01-27 | 2010-09-16 | Gage Fred H | Temporally dynamic artificial neural networks |
US20100299296A1 (en) | 2009-05-21 | 2010-11-25 | International Business Machines Corporation | Electronic learning synapse with spike-timing dependent plasticity using unipolar memory-switching elements |
US7849030B2 (en) | 2006-05-31 | 2010-12-07 | Hartford Fire Insurance Company | Method and system for classifying documents |
RU2406105C2 (en) | 2006-06-13 | 2010-12-10 | Филипп Геннадьевич Нестерук | Method of processing information in neural networks |
US20110016071A1 (en) | 2009-07-20 | 2011-01-20 | Guillen Marcos E | Method for efficiently simulating the information processing in cells and tissues of the nervous system with a temporal series compressed encoding neural network |
US20110119215A1 (en) | 2009-11-13 | 2011-05-19 | International Business Machines Corporation | Hardware analog-digital neural networks |
US20110119214A1 (en) | 2009-11-18 | 2011-05-19 | International Business Machines Corporation | Area efficient neuromorphic circuits |
US20110137843A1 (en) | 2008-08-28 | 2011-06-09 | Massachusetts Institute Of Technology | Circuits and Methods Representative of Spike Timing Dependent Plasticity of Neurons |
US20110160741A1 (en) | 2008-06-09 | 2011-06-30 | Hiroyuki Asano | Medical treatment tool for tubular organ |
RU2424561C2 (en) | 2005-08-31 | 2011-07-20 | Майкрософт Корпорейшн | Training convolutional neural network on graphics processing units |
US8000967B2 (en) | 2005-03-09 | 2011-08-16 | Telefonaktiebolaget Lm Ericsson (Publ) | Low-complexity code excited linear prediction encoding |
CN102226740A (en) | 2011-04-18 | 2011-10-26 | 中国计量学院 | Bearing fault detection method based on manner of controlling stochastic resonance by external periodic signal |
US20120011090A1 (en) | 2010-07-07 | 2012-01-12 | Qualcomm Incorporated | Methods and systems for three-memristor synapse with stdp and dopamine signaling |
US20120083982A1 (en) | 2010-10-05 | 2012-04-05 | Zachary Thomas Bonefas | System and method for governing a speed of an autonomous vehicle |
US20120084240A1 (en) | 2010-09-30 | 2012-04-05 | International Business Machines Corporation | Phase change memory synaptronic circuit for spiking computation, association and recall |
US8154436B2 (en) | 2005-10-24 | 2012-04-10 | Mitsubishi Electric Information Technology Centre Europe B.V. | Object detection |
US20120109866A1 (en) | 2010-10-29 | 2012-05-03 | International Business Machines Corporation | Compact cognitive synaptic computing circuits |
US8281997B2 (en) | 2008-02-19 | 2012-10-09 | Bilcare Technologies Singapore Pte. Ltd. | Reading device for identifying a tag or an object adapted to be identified, related methods and systems |
US8315305B2 (en) | 2010-03-26 | 2012-11-20 | Brain Corporation | Systems and methods for invariant pulse latency coding |
US20120303091A1 (en) | 2010-03-26 | 2012-11-29 | Izhikevich Eugene M | Apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices |
US20120308136A1 (en) | 2010-03-26 | 2012-12-06 | Izhikevich Eugene M | Apparatus and methods for pulse-code invariant object recognition |
US20120308076A1 (en) | 2010-03-26 | 2012-12-06 | Filip Lukasz Piekniewski | Apparatus and methods for temporally proximate object recognition |
US20130046716A1 (en) | 2011-08-16 | 2013-02-21 | Qualcomm Incorporated | Method and apparatus for neural temporal coding, learning and recognition |
US8390707B2 (en) | 2008-02-28 | 2013-03-05 | Kabushiki Kaisha Toshiba | Solid-state imaging device and manufacturing method thereof |
US20130073495A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for neuromorphic systems |
US20130073498A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient link between neuronal models and neuromorphic systems |
US20130073500A1 (en) | 2011-09-21 | 2013-03-21 | Botond Szatmary | High level neuromorphic network description apparatus and methods |
US20130073499A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Apparatus and method for partial evaluation of synaptic updates based on system events |
US20130073492A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient implementation of event-triggered plasticity rules in neuromorphic systems |
US20130073484A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient memory management in neuromorphic systems |
US20130073496A1 (en) | 2011-09-21 | 2013-03-21 | Botond Szatmary | Tag-based apparatus and methods for neural networks |
US20130073491A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Apparatus and methods for synaptic update in a pulse-coded network |
US8416847B2 (en) | 1998-12-21 | 2013-04-09 | Zin Stai Pte. In, Llc | Separate plane compression using plurality of compression methods including ZLN and ZLD methods |
US20130151450A1 (en) | 2011-12-07 | 2013-06-13 | Filip Ponulak | Neural network apparatus and methods for signal conversion |
US20130218821A1 (en) | 2011-09-21 | 2013-08-22 | Botond Szatmary | Round-trip engineering apparatus and methods for neural networks |
US20130297541A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Spiking neural network feedback apparatus and methods |
US20130297539A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Spiking neural network object recognition apparatus and methods |
US20130297542A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Sensory input processing apparatus in a spiking neural network |
US20130325774A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Learning stochastic apparatus and methods |
US20130325775A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Dynamically reconfigurable stochastic learning apparatus and methods |
US20130325773A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Stochastic apparatus and methods for implementing generalized learning rules |
US20130325768A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Stochastic spiking network learning apparatus and methods |
US20130325766A1 (en) | 2012-06-04 | 2013-12-05 | Csaba Petre | Spiking neuron network apparatus and methods |
US20130325777A1 (en) | 2012-06-04 | 2013-12-05 | Csaba Petre | Spiking neuron network apparatus and methods |
US20140012788A1 (en) | 2012-07-03 | 2014-01-09 | Filip Piekniewski | Conditional plasticity spiking neuron network apparatus and methods |
US20140016858A1 (en) | 2012-07-12 | 2014-01-16 | Micah Richert | Spiking neuron network sensory processing apparatus and methods |
US20140032459A1 (en) | 2012-07-27 | 2014-01-30 | Brain Corporation | Apparatus and methods for generalized state-dependent learning in spiking neuron networks |
US20140032458A1 (en) | 2012-07-27 | 2014-01-30 | Oleg Sinyavskiy | Apparatus and methods for efficient updates in spiking neuron network |
US20140052679A1 (en) | 2011-09-21 | 2014-02-20 | Oleg Sinyavskiy | Apparatus and methods for implementing event-based updates in spiking neuron networks |
US20140064609A1 (en) | 2010-08-26 | 2014-03-06 | Csaba Petre | Sensory input processing apparatus and methods |
US20140122397A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Adaptive plasticity apparatus and methods for spiking neuron network |
US20140122399A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Apparatus and methods for activity-based plasticity in a spiking neuron network |
US20140122398A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Modulated plasticity apparatus and methods for spiking neuron network |
US20140156574A1 (en) | 2012-11-30 | 2014-06-05 | Brain Corporation | Rate stabilization through plasticity in spiking neuron network |
- 2011-06-02 — US application US13/152,105 filed; granted as US9122994B2 (status: not active, Expired - Fee Related)
- 2012-06-01 — PCT application PCT/US2012/040567 filed; published as WO2012167164A1 (status: active, Application Filing)
Patent Citations (119)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---
US5652594A (en) | 1967-12-28 | 1997-07-29 | Lockheed Martin Corporation | Signal processor affording improved immunity to medium anomalies and interference in remote object detection system |
US5063603A (en) | 1989-11-06 | 1991-11-05 | David Sarnoff Research Center, Inc. | Dynamic method for recognizing objects and image processing system therefor |
US5216752A (en) | 1990-12-19 | 1993-06-01 | Baylor College Of Medicine | Interspike interval decoding neural network |
US5138447A (en) | 1991-02-11 | 1992-08-11 | General Instrument Corporation | Method and apparatus for communicating compressed digital video signals using multiple processors |
US5272535A (en) | 1991-06-13 | 1993-12-21 | Loral Fairchild Corporation | Image sensor with exposure control, selectable interlaced, pseudo interlaced or non-interlaced readout and video compression |
US5875108A (en) | 1991-12-23 | 1999-02-23 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6418424B1 (en) | 1991-12-23 | 2002-07-09 | Steven M. Hoffberg | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US5355435A (en) | 1992-05-18 | 1994-10-11 | New Mexico State University Technology Transfer Corp. | Asynchronous temporal neural processing element |
US5673367A (en) | 1992-10-01 | 1997-09-30 | Buckley; Theresa M. | Method for neural network control of motion using real-time environmental feedback |
US5638359A (en) | 1992-12-14 | 1997-06-10 | Nokia Telecommunications Oy | Method for congestion management in a frame relay network and a node in a frame relay network |
RU2108612C1 (en) | 1994-09-14 | 1998-04-10 | Круглов Сергей Петрович | Adaptive control system with identifier and implicit reference model |
US6625317B1 (en) | 1995-09-12 | 2003-09-23 | Art Gaffin | Visual imaging system and method |
US6014653A (en) | 1996-01-26 | 2000-01-11 | Thaler; Stephen L. | Non-algorithmically implemented artificial neural networks and components thereof |
US6009418A (en) | 1996-05-02 | 1999-12-28 | Cooper; David L. | Method and apparatus for neural networking using semantic attractor architecture |
US6509854B1 (en) | 1997-03-16 | 2003-01-21 | Hitachi, Ltd. | DA conversion circuit |
US20030050903A1 (en) | 1997-06-11 | 2003-03-13 | Jim-Shih Liaw | Dynamic synapse for signal processing in neural networks |
US6545708B1 (en) | 1997-07-11 | 2003-04-08 | Sony Corporation | Camera controlling device and method for predicted viewing |
US6458157B1 (en) | 1997-08-04 | 2002-10-01 | Suaning Gregg Joergen | Retinal stimulator |
US6581046B1 (en) | 1997-10-10 | 2003-06-17 | Yeda Research And Development Co. Ltd. | Neuronal phase-locked loops |
US6545705B1 (en) | 1998-04-10 | 2003-04-08 | Lynx System Developers, Inc. | Camera with object recognition/data output |
US6035389A (en) | 1998-08-11 | 2000-03-07 | Intel Corporation | Scheduling instructions with different latencies |
US20040170330A1 (en) | 1998-08-12 | 2004-09-02 | Pixonics, Inc. | Video coding reconstruction apparatus and methods |
US8416847B2 (en) | 1998-12-21 | 2013-04-09 | Zin Stai Pte. In, Llc | Separate plane compression using plurality of compression methods including ZLN and ZLD methods |
US6546291B2 (en) | 2000-02-16 | 2003-04-08 | Massachusetts Eye & Ear Infirmary | Balance prosthesis |
US20020038294A1 (en) * | 2000-06-16 | 2002-03-28 | Masakazu Matsugu | Apparatus and method for detecting or recognizing pattern by employing a plurality of feature detecting elements |
US7054850B2 (en) | 2000-06-16 | 2006-05-30 | Canon Kabushiki Kaisha | Apparatus and method for detecting or recognizing pattern by employing a plurality of feature detecting elements |
US7737933B2 (en) | 2000-09-26 | 2010-06-15 | Toshiba Matsushita Display Technology Co., Ltd. | Display unit and drive system thereof and an information display unit |
US20040193670A1 (en) | 2001-05-21 | 2004-09-30 | Langan John D. | Spatio-temporal filter and method |
US20050036649A1 (en) | 2001-08-23 | 2005-02-17 | Jun Yokono | Robot apparatus, face recognition method, and face recognition apparatus |
US20030216919A1 (en) | 2002-05-13 | 2003-11-20 | Roushar Joseph C. | Multi-dimensional method and apparatus for automated language interpretation |
US20040136439A1 (en) | 2002-11-15 | 2004-07-15 | Brandon Dewberry | Methods and systems acquiring impulse signals |
US20060094001A1 (en) | 2002-11-29 | 2006-05-04 | Torre Vicent E | Method and device for image processing and learning with neuronal cultures |
US20080199072A1 (en) | 2003-02-27 | 2008-08-21 | Sony Corporation | Image processing device and method, learning device and method, recording medium, and program |
US20090043722A1 (en) | 2003-03-27 | 2009-02-12 | Alex Nugent | Adaptive neural network utilizing nanotechnology-based components |
US20080100482A1 (en) | 2003-05-27 | 2008-05-01 | Lazar Aurel A | Multichannel Time Encoding And Decoding Of A Signal |
US20050015351A1 (en) | 2003-07-18 | 2005-01-20 | Alex Nugent | Nanotechnology neural network methods and systems |
US20060161218A1 (en) | 2003-11-26 | 2006-07-20 | Wicab, Inc. | Systems and methods for treating traumatic brain injury |
US7580907B1 (en) | 2004-01-14 | 2009-08-25 | Evolved Machines, Inc. | Invariant object recognition |
US20070208678A1 (en) | 2004-03-17 | 2007-09-06 | Canon Kabushiki Kaisha | Parallel Pulse Signal Processing Apparatus, Pattern Recognition Apparatus, And Image Input Apparatus |
US7653255B2 (en) | 2004-06-02 | 2010-01-26 | Adobe Systems Incorporated | Image region of interest encoding |
US20050283450A1 (en) | 2004-06-11 | 2005-12-22 | Masakazu Matsugu | Information processing apparatus, information processing method, pattern recognition apparatus, and pattern recognition method |
US8015130B2 (en) | 2004-06-11 | 2011-09-06 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, pattern recognition apparatus, and pattern recognition method |
US20060129728A1 (en) | 2004-12-09 | 2006-06-15 | Hampel Craig E | Memory interface with workload adaptive encode/decode |
US8000967B2 (en) | 2005-03-09 | 2011-08-16 | Telefonaktiebolaget Lm Ericsson (Publ) | Low-complexity code excited linear prediction encoding |
US20070176643A1 (en) | 2005-06-17 | 2007-08-02 | Alex Nugent | Universal logic gate utilizing nanotechnology |
US20070022068A1 (en) | 2005-07-01 | 2007-01-25 | Ralph Linsker | Neural networks for prediction and control |
RU2424561C2 (en) | 2005-08-31 | 2011-07-20 | Майкрософт Корпорейшн | Training convolutional neural network on graphics processing units |
US8583286B2 (en) | 2005-09-13 | 2013-11-12 | Neurosciences Research Foundation, Inc. | Hybrid control device |
US7765029B2 (en) | 2005-09-13 | 2010-07-27 | Neurosciences Research Foundation, Inc. | Hybrid control device |
US8154436B2 (en) | 2005-10-24 | 2012-04-10 | Mitsubishi Electric Information Technology Centre Europe B.V. | Object detection |
US20090287624A1 (en) | 2005-12-23 | 2009-11-19 | Societe De Commercialisation De Produits De La Recherche Applique-Socpra-Sciences Et Genie S.E.C. | Spatio-temporal pattern recognition using a spiking neural network and processing thereof on a portable and/or distributed computer |
US8346692B2 (en) | 2005-12-23 | 2013-01-01 | Societe De Commercialisation Des Produits De La Recherche Appliquee-Socpra-Sciences Et Genie S.E.C. | Spatio-temporal pattern recognition using a spiking neural network and processing thereof on a portable and/or distributed computer |
US7849030B2 (en) | 2006-05-31 | 2010-12-07 | Hartford Fire Insurance Company | Method and system for classifying documents |
RU2406105C2 (en) | 2006-06-13 | 2010-12-10 | Филипп Геннадьевич Нестерук | Method of processing information in neural networks |
US20100081958A1 (en) | 2006-10-02 | 2010-04-01 | She Christy L | Pulse-based feature extraction for neural recordings |
JP4087423B2 (en) | 2006-10-17 | 2008-05-21 | 京セラミタ株式会社 | Portable communication device |
US8103602B2 (en) | 2006-12-29 | 2012-01-24 | Neurosciences Research Foundation, Inc. | Solving the distal reward problem through linkage of STDP and dopamine signaling |
WO2008083335A2 (en) | 2006-12-29 | 2008-07-10 | Neurosciences Research Foundation, Inc. | Solving the distal reward problem through linkage of stdp and dopamine signaling |
US20100100482A1 (en) | 2007-01-23 | 2010-04-22 | Sxip Identity Corp. | Intermediate Data Generation For Transaction Processing |
US20080237446A1 (en) | 2007-02-16 | 2008-10-02 | Texas Instruments Incorporated | Solid-state image pickup device and method |
WO2008132066A1 (en) | 2007-04-27 | 2008-11-06 | Siemens Aktiengesellschaft | A method for computer-assisted learning of one or more neural networks |
US20100225824A1 (en) | 2007-06-28 | 2010-09-09 | The Trustees Of Columbia University In The City Of New York | Multi-Input Multi-Output Time Encoding And Decoding Machines |
US8281997B2 (en) | 2008-02-19 | 2012-10-09 | Bilcare Technologies Singapore Pte. Ltd. | Reading device for identifying a tag or an object adapted to be identified, related methods and systems |
US8390707B2 (en) | 2008-02-28 | 2013-03-05 | Kabushiki Kaisha Toshiba | Solid-state imaging device and manufacturing method thereof |
US20110160741A1 (en) | 2008-06-09 | 2011-06-30 | Hiroyuki Asano | Medical treatment tool for tubular organ |
US20100036457A1 (en) | 2008-08-07 | 2010-02-11 | Massachusetts Institute Of Technology | Coding for visual prostheses |
US20110137843A1 (en) | 2008-08-28 | 2011-06-09 | Massachusetts Institute Of Technology | Circuits and Methods Representative of Spike Timing Dependent Plasticity of Neurons |
US20100086171A1 (en) | 2008-10-02 | 2010-04-08 | Silverbrook Research Pty Ltd | Method of imaging coding pattern having merged data symbols |
US20100166320A1 (en) | 2008-12-26 | 2010-07-01 | Paquier Williams J F | Multi-stage image pattern recognizer |
US8160354B2 (en) | 2008-12-26 | 2012-04-17 | Five Apes, Inc. | Multi-stage image pattern recognizer |
US20100235310A1 (en) | 2009-01-27 | 2010-09-16 | Gage Fred H | Temporally dynamic artificial neural networks |
US20100299296A1 (en) | 2009-05-21 | 2010-11-25 | International Business Machines Corporation | Electronic learning synapse with spike-timing dependent plasticity using unipolar memory-switching elements |
US20110016071A1 (en) | 2009-07-20 | 2011-01-20 | Guillen Marcos E | Method for efficiently simulating the information processing in cells and tissues of the nervous system with a temporal series compressed encoding neural network |
US8200593B2 (en) | 2009-07-20 | 2012-06-12 | Corticaldb Inc | Method for efficiently simulating the information processing in cells and tissues of the nervous system with a temporal series compressed encoding neural network |
US20110119215A1 (en) | 2009-11-13 | 2011-05-19 | International Business Machines Corporation | Hardware analog-digital neural networks |
US8311965B2 (en) | 2009-11-18 | 2012-11-13 | International Business Machines Corporation | Area efficient neuromorphic circuits using field effect transistors (FET) and variable resistance material |
US20110119214A1 (en) | 2009-11-18 | 2011-05-19 | International Business Machines Corporation | Area efficient neuromorphic circuits |
US20130251278A1 (en) | 2010-03-26 | 2013-09-26 | Eugene M. Izhikevich | Invariant pulse latency coding systems and methods |
US8315305B2 (en) | 2010-03-26 | 2012-11-20 | Brain Corporation | Systems and methods for invariant pulse latency coding |
US20120303091A1 (en) | 2010-03-26 | 2012-11-29 | Izhikevich Eugene M | Apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices |
US20120308136A1 (en) | 2010-03-26 | 2012-12-06 | Izhikevich Eugene M | Apparatus and methods for pulse-code invariant object recognition |
US20120308076A1 (en) | 2010-03-26 | 2012-12-06 | Filip Lukasz Piekniewski | Apparatus and methods for temporally proximate object recognition |
US8467623B2 (en) | 2010-03-26 | 2013-06-18 | Brain Corporation | Invariant pulse latency coding systems and methods systems and methods |
US20120011090A1 (en) | 2010-07-07 | 2012-01-12 | Qualcomm Incorporated | Methods and systems for three-memristor synapse with stdp and dopamine signaling |
US20140064609A1 (en) | 2010-08-26 | 2014-03-06 | Csaba Petre | Sensory input processing apparatus and methods |
US20120084240A1 (en) | 2010-09-30 | 2012-04-05 | International Business Machines Corporation | Phase change memory synaptronic circuit for spiking computation, association and recall |
US20120083982A1 (en) | 2010-10-05 | 2012-04-05 | Zachary Thomas Bonefas | System and method for governing a speed of an autonomous vehicle |
US20120109866A1 (en) | 2010-10-29 | 2012-05-03 | International Business Machines Corporation | Compact cognitive synaptic computing circuits |
CN102226740A (en) | 2011-04-18 | 2011-10-26 | 中国计量学院 | Bearing fault detection method based on manner of controlling stochastic resonance by external periodic signal |
US20130046716A1 (en) | 2011-08-16 | 2013-02-21 | Qualcomm Incorporated | Method and apparatus for neural temporal coding, learning and recognition |
US20130073498A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient link between neuronal models and neuromorphic systems |
US20140052679A1 (en) | 2011-09-21 | 2014-02-20 | Oleg Sinyavskiy | Apparatus and methods for implementing event-based updates in spiking neuron networks |
US20130073491A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Apparatus and methods for synaptic update in a pulse-coded network |
US20130073484A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient memory management in neuromorphic systems |
US20130073492A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for efficient implementation of event-triggered plasticity rules in neuromorphic systems |
US20130218821A1 (en) | 2011-09-21 | 2013-08-22 | Botond Szatmary | Round-trip engineering apparatus and methods for neural networks |
US20130073499A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Apparatus and method for partial evaluation of synaptic updates based on system events |
US20130073496A1 (en) | 2011-09-21 | 2013-03-21 | Botond Szatmary | Tag-based apparatus and methods for neural networks |
US8712941B2 (en) | 2011-09-21 | 2014-04-29 | Brain Corporation | Elementary network description for efficient link between neuronal models and neuromorphic systems |
US20130073495A1 (en) | 2011-09-21 | 2013-03-21 | Eugene M. Izhikevich | Elementary network description for neuromorphic systems |
US20130073500A1 (en) | 2011-09-21 | 2013-03-21 | Botond Szatmary | High level neuromorphic network description apparatus and methods |
US20130151450A1 (en) | 2011-12-07 | 2013-06-13 | Filip Ponulak | Neural network apparatus and methods for signal conversion |
US20130297541A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Spiking neural network feedback apparatus and methods |
US20130297542A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Sensory input processing apparatus in a spiking neural network |
US20130297539A1 (en) | 2012-05-07 | 2013-11-07 | Filip Piekniewski | Spiking neural network object recognition apparatus and methods |
US20130325773A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Stochastic apparatus and methods for implementing generalized learning rules |
US20130325768A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Stochastic spiking network learning apparatus and methods |
US20130325766A1 (en) | 2012-06-04 | 2013-12-05 | Csaba Petre | Spiking neuron network apparatus and methods |
US20130325777A1 (en) | 2012-06-04 | 2013-12-05 | Csaba Petre | Spiking neuron network apparatus and methods |
US20130325775A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Dynamically reconfigurable stochastic learning apparatus and methods |
US20130325774A1 (en) | 2012-06-04 | 2013-12-05 | Brain Corporation | Learning stochastic apparatus and methods |
US20140012788A1 (en) | 2012-07-03 | 2014-01-09 | Filip Piekniewski | Conditional plasticity spiking neuron network apparatus and methods |
US20140016858A1 (en) | 2012-07-12 | 2014-01-16 | Micah Richert | Spiking neuron network sensory processing apparatus and methods |
US20140032458A1 (en) | 2012-07-27 | 2014-01-30 | Oleg Sinyavskiy | Apparatus and methods for efficient updates in spiking neuron network |
US20140032459A1 (en) | 2012-07-27 | 2014-01-30 | Brain Corporation | Apparatus and methods for generalized state-dependent learning in spiking neuron networks |
US20140122397A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Adaptive plasticity apparatus and methods for spiking neuron network |
US20140122399A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Apparatus and methods for activity-based plasticity in a spiking neuron network |
US20140122398A1 (en) | 2012-10-25 | 2014-05-01 | Brain Corporation | Modulated plasticity apparatus and methods for spiking neuron network |
US20140156574A1 (en) | 2012-11-30 | 2014-06-05 | Brain Corporation | Rate stabilization through plasticity in spiking neuron network |
Non-Patent Citations (91)
Title |
---|
Berkes and Wiskott, Slow feature analysis yields a rich repertoire of complex cell properties. Journal of Vision (2005) vol. 5 (6). |
Bohte, 'Spiking Neural Networks', Doctoral thesis, University of Leiden, Holland, Mar. 5, 2003, pp. 1-133 [retrieved on Nov. 14, 2012]. |
Brette et al., Brian: a simple and flexible simulator for spiking neural networks, The Neuromorphic Engineer, Jul. 1, 2009, pp. 1-4, doi: 10.2417/1200906.1659. |
Brette, et al., "Simulation of Networks of Spiking Neurons: A Review of Tools and Strategies", Received Nov. 29, 2006, Revised Apr. 2, 2007, Accepted Apr. 12, 2007, Springer Science, 50 pages. |
Cessac et al. 'Overview of facts and issues about neural coding by spikes.' Journal of Physiology, Paris 104.1 (2010): 5. |
Cuntz et al., 'One Rule to Grow Them All: A General Theory of Neuronal Branching and Its Practical Application', PLoS Computational Biology, 6 (8), Published Aug. 5, 2010. |
Davison et al., PyNN: a common interface for neuronal network simulators, Frontiers in Neuroinformatics, Jan. 2009, pp. 1-10, vol. 2, Article 11. |
Djurfeldt, Mikael, The Connection-set Algebra: a formalism for the representation of connectivity structure in neuronal network models, implementations in Python and C++, and their use in simulators, BMC Neuroscience, Jul. 18, 2011, 12(Suppl 1):P80. |
Dorval et al. 'Probability distributions of the logarithm of inter-spike intervals yield accurate entropy estimates from small datasets.' Journal of neuroscience methods 173.1 (2008): 129. |
Fidjeland et al., Accelerated Simulation of Spiking Neural Networks Using GPUs [online], 2010 [retrieved on Jun. 15, 2013]. Retrieved from the Internet: <URL: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5596678&tag=1>. |
Field, G.; Chichilnisky, E., Information Processing in the Primate Retina: Circuitry and Coding. Annual Review of Neuroscience, 2007, 30(1), 1-30. |
Fiete, et al., Spike-Time-Dependent Plasticity and Heterosynaptic Competition Organize Networks to Produce Long Scale-Free Sequences of Neural Activity. Neuron 65, Feb. 25, 2010, pp. 563-576. |
Floreano et al., 'Neuroevolution: from architectures to learning', Evol. Intel., Jan. 2008, 1:47-62 [retrieved Dec. 30, 2013]. Retrieved from the Internet: <URL: http://inforscience.epfl.ch/record/112676/files/FloreanoDuerrMattiussi2008.pdf>. |
Földiák, P., Learning invariance from transformation sequences. Neural Computation, 1991, 3(2), 194-200. |
Froemke et al., Temporal modulation of spike-timing-dependent plasticity, Frontiers in Synaptic Neuroscience, vol. 2, Article 19, pp. 1-16 [online], Jun. 2010 [retrieved on Dec. 16, 2013]. Retrieved from the Internet: <frontiersin.org>. |
Gerstner et al. (1996) A neuronal learning rule for sub-millisecond temporal coding. Nature vol. 383 (6595) pp. 76-78. |
Gewaltig et al., 'NEST (Neural Simulation Tool)', Scholarpedia, 2007, pp. 1-15, 2(4): 1430, doi: 10.4249/scholarpedia.1430. |
Gleeson et al., NeuroML: A Language for Describing Data Driven Models of Neurons and Networks with a High Degree of Biological Detail, PLoS Computational Biology, Jun. 2010, pp. 1-19 vol. 6 Issue 6. |
Gluck, Stimulus Generalization and Representation in Adaptive Network Models of Category Learning [online], 1991 [retrieved on Aug. 24, 2013]. Retrieved from the Internet: <URL: http://www.google.com/url?sa=t&rct=j&q=Gluck+%22STIMULUS+GENERALIZATION+AND+REPRESENTATION+IN+ADAPTIVE+NETWORK+MODELS+OF+CATEGORY+LEARNING%22+1991>. |
Gollisch et al., 'Rapid neural coding in the retina with relative spike latencies', Science 319.5866 (2008): 1108-1111. |
Goodman et al., Brian: a simulator for spiking neural networks in Python, Frontiers in Neuroinformatics, Nov. 2008, pp. 1-10, vol. 2, Article 5. |
Gorchetchnikov et al., NineML: declarative, mathematically-explicit descriptions of spiking neuronal networks, Frontiers in Neuroinformatics, Conference Abstract: 4th INCF Congress of Neuroinformatics, doi: 10.3389/conf.fninf.2011.08.00098. |
Graham, Lyle J., The Surf-Hippo Reference Manual, http://www.neurophys.biomedicale.univ-paris5.fr/~graham/surf-hippo-files/Surf-Hippo%20Reference%20Manual.pdf, Mar. 2002, pp. 1-128. |
Hopfield JJ (1995) Pattern recognition computation using action potential timing for stimulus representation. Nature 376: 33-36. |
Izhikevich E. M. and Hoppensteadt F.C. (2009) Polychronous Wavefront Computations. International Journal of Bifurcation and Chaos. 19:1733-1739. |
Izhikevich E.M. (2004) Which Model to Use for Cortical Spiking Neurons? IEEE Transactions on Neural Networks, 15:1063-1070. |
Izhikevich E.M. (2006) Polychronization: Computation With Spikes. Neural Computation,18:245-282. |
Izhikevich E.M., "Neural Excitability, Spiking and Bursting", Neurosciences Institute, Received Jun. 9, 1999, Revised Oct. 25, 1999, 1171-1266, 96 pages. |
Izhikevich et al., 'Relating STDP to BCM', Neural Computation (2003) 15, 1511-1523. |
Izhikevich, E.M. (2007) Dynamical Systems in Neuroscience: The Geometry of Excitability and Bursting, The MIT Press, 2007. |
Izhikevich, 'Simple Model of Spiking Neurons', IEEE Transactions on Neural Networks, vol. 14, No. 6, Nov. 2003, pp. 1569-1572. |
Janowitz, M.K.; Van Rossum, M.C.W. Excitability changes that complement Hebbian learning. Network, Computation in Neural Systems, 2006, 17 (1), 31-41. |
Karbowski et al., 'Multispikes and Synchronization in a Large Neural Network with Temporal Delays', Neural Computation 12, 1573-1606 (2000). |
Kazantsev, et al., "Active Spike Transmission in the Neuron Model With a Winding Threshold Manifold", Jan. 3, 2012, 205-211, 7 pages. |
Khotanzad, Alireza, 'Classification of invariant image representations using a neural network', IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 38, No. 6, Jun. 1990, pp. 1028-1038 [online], [retrieved on Aug. 24, 2012]. Retrieved from the Internet: <URL: http://www-ee.uta.edu/eeweb/IP/Courses/SPR/Reference/Khotanzad.pdf>, p. 1028 col. 2 para 2, p. 1029 col. 1 para 3. |
Kling-Petersen, PhD, "Sun and HPC: From Systems to PetaScale" Sun Microsystems, no date, 31 pages. |
Knoblauch, et al. Memory Capacities for Synaptic and Structural Plasticity, Neural Computation 2009, pp. 1-45. |
Laurent, 'Issue 1-nnql-Refactor Nucleus into its own file-Neural Network Query Language' [retrieved on Nov. 12, 2013]. Retrieved from the Internet: <URL: https://code.google.com/p/nnql/issues/detail?id=1>. |
Laurent, 'The Neural Network Query Language (NNQL) Reference' [retrieved on Nov. 12, 2013]. Retrieved from the Internet: <URL: https://code.google.com/p/nnql/issues/detail?id=1>. |
Lazar et al. 'A video time encoding machine', in Proceedings of the 15th IEEE International Conference on Image Processing (ICIP '08), 2008, pp. 717-720. |
Lazar et al. 'Multichannel time encoding with integrate-and-fire neurons.' Neurocomputing 65 (2005): 401-407. |
Lazar et al.,'Consistent recovery of sensory stimuli encoded with MIMO neural circuits.' Computational intelligence and neuroscience (2010): 2. |
Martinez-Perez, et al., "Automatic Activity Estimation Based on Object Behavior Signature", 2010, 10 pages. |
Masquelier and Thorpe, Learning to recognize objects using waves of spikes and Spike Timing-Dependent Plasticity, Neural Networks (IJCNN), The 2010 International Joint Conference on, DOI: 10.1109/IJCNN.2010.5596934 (2010), pp. 1-8. |
Masquelier, Timothee, 'Relative spike time coding and STDP-based orientation selectivity in the early visual system in natural continuous and saccadic vision: a computational model', Journal of Computational Neuroscience 32.3 (2012): 425-441. |
Matsugu, et al., "Convolutional Spiking Neural Network for Robust Object Detection with Population Code Using Structured Pulse Packets", 2004, 39-55, 17 pages. |
Meister, M., Multineuronal codes in retinal signaling. Proceedings of the National Academy of sciences. 1996, 93, 609-614. |
Meister, M.; Berry, M.J. The neural code of the retina, Neuron. 1999, 22, 435-450. |
Nichols, A Reconfigurable Computing Architecture for Implementing Artificial Neural Networks on FPGA, Master's Thesis, The University of Guelph, 2003, pp. 1-235. |
Oster M., Lichtsteiner P., Delbruck T., Liu S., A Spike-Based Saccadic Recognition System, ISCAS 2007, IEEE International Symposium on Circuits and Systems, 2007, pp. 3083-3086. |
Paugam-Moisy et al., 'Computing with spiking neuron networks', in G. Rozenberg, T. Back, J. Kok (Eds.), Handbook of Natural Computing, 40 pages, Springer, Heidelberg (2010) [retrieved Dec. 30, 2013], [retrieved online from link.springer.com]. |
Pavlidis, N.G., Tasoulis, D.K., Plagianakos, V.P., Nikiforidis, G., Vrahatis, M.N., Spiking neural network training using evolutionary algorithms, in Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN'05), vol. 4, pp. 2190-2194, Jul. 31, 2005 [online], [retrieved on Aug. 24, 2012]. Retrieved from the Internet: <URL: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.5.4346&rep=rep1&type=pdf>. |
Ramachandran, et al., "The Perception of Phantom Limbs", The D.O. Hebb Lecture, Center for Brain and Cognition, University of California, 1998, 121, 1603-1630, 28 pages. |
Rekeczky, et al., "Cellular Multiadaptive Analogic Architecture: A Computational Framework for UAV Applications." May 2004. |
Revow M., Williams C., and Hinton, G.E., 1996, Using Generative Models for Handwritten Digit Recognition, IEEE Trans. on Pattern Analysis and Machine Intelligence, 18, No. 6, Jun. 1996. |
Sanchez, Efficient Simulation Scheme for Spiking Neural Networks, Doctoral Thesis, Universidad de Granada, Mar. 28, 2008, pp. 1-104. |
Sato et al., 'Pulse interval and width modulation for video transmission.' Cable Television, IEEE Transactions on 4 (1978): 165-173. |
Schemmel, J., Grubl, A., Meier, K., Mueller, E., Implementing synaptic plasticity in a VLSI spiking neural network model, in Proceedings of the 2006 International Joint Conference on Neural Networks (IJCNN'06), IEEE Press, Jul. 16-21, 2006, pp. 1-6 [online], [retrieved on Aug. 24, 2012]. Retrieved from the Internet: <URL: http://www.kip.uni-heidelberg.de/Veroeffentlichungen/download.cgi/4620/ps/1774.pdf>. |
Schnitzer, M.J.; Meister, M.; Multineuronal Firing Patterns in the Signal from Eye to Brain. Neuron, 2003, 37, 499-511. |
Serrano-Gotarredona, et al, "On Real-Time: AER 2-D Convolutions Hardware for Neuromorphic Spike-based Cortical Processing", Jul. 2008. |
Simulink.Rtm. model [online], [Retrieved on Dec. 10, 2013] Retrieved from <URL: http://www.mathworks.com/ products/simulink/index.html>. |
Sinyavskiy et al. 'Reinforcement learning of a spiking neural network in the task of control of an agent in a virtual discrete environment' Rus. J. Nonlin. Dyn., 2011, vol. 7, No. 4 (Mobile Robots), pp. 859-875, chapters 1-8 (Russian Article with English Abstract). |
Sjostrom et al., 'Spike-Timing Dependent Plasticity' Scholarpedia, 5(2):1362 (2010), pp. 1-18. |
Stringer, et al., "Invariant Object Recognition in the Visual System with Novel Views of 3D Objects", 2002, 2585-2596, 12 pages. |
Szatmary et al., 'Spike-timing Theory of Working Memory', PLoS Computational Biology, vol. 6, Issue 8, Aug. 19, 2010 [retrieved on Dec. 30, 2013]. Retrieved from the Internet: <URL: http://www.ploscompbiol.org/article/info%3Adoi%2F10.1371%2Fjournal.pcbi.1000879>. |
Thomas S. and Riesenhuber, M, 2004, Realistic Modeling of Simple and Complex Cell Tuning in the HMAX Model, and Implications for Invariant Object Recognition in Cortex, AI Memo 2004-017 Jul. 2004. |
Thorpe, S.J., Delorme, A. & Vanrullen, R. (2001). Spike-based strategies for rapid processing. Neural Networks 14, pp. 715-725. |
Thorpe, S.J., Guyonneau, R., Guilbaud, N., Allegraud, J-M. & Vanrullen, R. (2004). SpikeNet: real-time visual processing with one spike per neuron. Neurocomputing, 58-60, pp. 857-864. |
Tim Gollisch and Markus Meister (2008) Rapid Neural Coding in the Retina with Relative Spike Latencies. Science 319:1108-1111. |
Van Rullen R.; Thorpe, S. Rate Coding versus temporal order coding: What the Retinal ganglion cells tell the visual cortex. Neural computation, 2001, 13, 1255-1283. |
Vanrullen, R. & Koch, C. (2003). Is perception discrete or continuous? Trends in Cognitive Sciences 7(5), pp. 207-213. |
Vanrullen, R., Guyonneau, R. & Thorpe, S.J. (2005). Spike times make sense. Trends in Neurosciences 28(1). |
Wallis, G.; Rolls, E. T. A model of invariant object recognition in the visual system. Progress in Neurobiology. 1997, 51, 167-194. |
Wang, 'The time dimension for scene analysis.' Neural Networks, IEEE Transactions on 16.6 (2005): 1401-1426. |
Wiskott, et al., "Slow Feature Analysis", 2002, 29 pages. |
Wiskott, L.; Sejnowski, T.J. Slow feature analysis: Unsupervised learning of invariances. Neural Computation, 2002, 14, (4), 715-770. |
Zarandy, et al., "Bi-i: A Standalone Ultra High Speed Cellular Vision System", Jun. 2005. |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140267606A1 (en) * | 2013-03-15 | 2014-09-18 | The Trustees Of Columbia University In The City Of New York | Systems and Methods for Time Encoding and Decoding Machines |
US10155310B2 (en) | 2013-03-15 | 2018-12-18 | Brain Corporation | Adaptive predictor apparatus and methods |
US9821457B1 (en) | 2013-05-31 | 2017-11-21 | Brain Corporation | Adaptive robotic interface apparatus and methods |
US9950426B2 (en) | 2013-06-14 | 2018-04-24 | Brain Corporation | Predictive robotic controller apparatus and methods |
US9792546B2 (en) | 2013-06-14 | 2017-10-17 | Brain Corporation | Hierarchical robotic controller apparatus and methods |
US9844873B2 (en) | 2013-11-01 | 2017-12-19 | Brain Corporation | Apparatus and methods for haptic training of robots |
US9789605B2 (en) | 2014-02-03 | 2017-10-17 | Brain Corporation | Apparatus and methods for control of robot actions based on corrective user inputs |
US10322507B2 (en) | 2014-02-03 | 2019-06-18 | Brain Corporation | Apparatus and methods for control of robot actions based on corrective user inputs |
US9902062B2 (en) | 2014-10-02 | 2018-02-27 | Brain Corporation | Apparatus and methods for training path navigation by robots |
US10105841B1 (en) * | 2014-10-02 | 2018-10-23 | Brain Corporation | Apparatus and methods for programming and training of robotic devices |
US10131052B1 (en) | 2014-10-02 | 2018-11-20 | Brain Corporation | Persistent predictor apparatus and methods for task switching |
US10376117B2 (en) | 2015-02-26 | 2019-08-13 | Brain Corporation | Apparatus and methods for programming and training of robotic household appliances |
US11893474B2 (en) | 2015-10-23 | 2024-02-06 | Semiconductor Energy Laboratory Co., Ltd. | Semiconductor device and electronic device |
Also Published As
Publication number | Publication date |
---|---|
US20120308076A1 (en) | 2012-12-06 |
WO2012167164A1 (en) | 2012-12-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9122994B2 (en) | Apparatus and methods for temporally proximate object recognition | |
US9405975B2 (en) | Apparatus and methods for pulse-code invariant object recognition | |
US9177245B2 (en) | Spiking network apparatus and method with bimodal spike-timing dependent plasticity | |
US9111226B2 (en) | Modulated plasticity apparatus and methods for spiking neuron network | |
US9224090B2 (en) | Sensory input processing apparatus in a spiking neural network | |
US9129221B2 (en) | Spiking neural network feedback apparatus and methods | |
US9183493B2 (en) | Adaptive plasticity apparatus and methods for spiking neuron network | |
US8942466B2 (en) | Sensory input processing apparatus and methods | |
US9275326B2 (en) | Rate stabilization through plasticity in spiking neuron network | |
US9460385B2 (en) | Apparatus and methods for rate-modulated plasticity in a neuron network | |
US20130297539A1 (en) | Spiking neural network object recognition apparatus and methods | |
US8972315B2 (en) | Apparatus and methods for activity-based plasticity in a spiking neuron network | |
US9111215B2 (en) | Conditional plasticity spiking neuron network apparatus and methods | |
US9098811B2 (en) | Spiking neuron network apparatus and methods | |
US9218563B2 (en) | Spiking neuron sensory processing apparatus and methods for saliency detection | |
US8977582B2 (en) | Spiking neuron network sensory processing apparatus and methods | |
US9311593B2 (en) | Apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices | |
US8990133B1 (en) | Apparatus and methods for state-dependent learning in spiking neuron networks | |
Becker et al. | Unsupervised neural network learning procedures for feature extraction and classification | |
US9552546B1 (en) | Apparatus and methods for efficacy balancing in a spiking neuron network | |
Cinelli | Anomaly detection in surveillance videos using deep residual networks | |
KR20190079188A (en) | Gesture recognition system and methods based on deep learning using sensor data | |
Zhu | Nonlinear Time Series Prediction by Using RBF Network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BRAIN CORPORATION, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:PIEKNIEWSKI, FILIP LUKASZ;PETRE, CSABA;SOKOL, SACH HANSEN;AND OTHERS;REEL/FRAME:027667/0806 Effective date: 20111007 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Expired due to failure to pay maintenance fee |
Effective date: 20190901 |