

Title:
APPARATUS AND METHODS FOR POLYCHRONOUS ENCODING AND MULTIPLEXING IN NEURONAL PROSTHETIC DEVICES
Document Type and Number:
WIPO Patent Application WO/2012/162658
Kind Code:
A1
Abstract:
Apparatus and methods for encoding sensory input information into patterns of pulses and message multiplexing. In one implementation, the patterns of pulses are polychronous (time-locked but not necessarily synchronous), and a retinal prosthetic encodes the input signal into the polychronous patterns for delivery via stimulating electrodes. Different polychronous patterns simultaneously encode different sensory signals (such as different features of the image), thus providing for message multiplexing. Increasing data transmission capacity allows for a reduction in the number of electrodes required for data transmission. In one implementation, an adaptive feedback mechanism is employed to facilitate encoder operation. In another aspect, a computer vision system is described.

Inventors:
IZHIKEVICH EUGENE M (US)
Application Number:
PCT/US2012/039696
Publication Date:
November 29, 2012
Filing Date:
May 25, 2012
Assignee:
BRAIN CORP (US)
IZHIKEVICH EUGENE M (US)
International Classes:
A61F2/16
Foreign References:
US6458157B1 (2002-10-01)
US20060161218A1 (2006-07-20)
US20040193670A1 (2004-09-30)
JPH0487423A (1992-03-19)
US6546291B2 (2003-04-08)
Other References:
SZATMARY ET AL.: "Spike-Timing Theory of Working Memory", PLoS Computational Biology, vol. 6, 19 August 2010 (2010-08-19), Retrieved from the Internet [retrieved on 2012-08-19]
Attorney, Agent or Firm:
GUTIERREZ, Peter, J. (PC, 16644 West Bernardo Drive, Suite 20, San Diego, CA, US)
Claims:
WHAT IS CLAIMED:

1. A sensory input encoding apparatus, comprising:

a processing apparatus configured to receive and encode the sensory input into a plurality of pulses, wherein:

the sensory input comprises information indicative of an object characteristic; and at least a portion of the plurality of pulses is configured in a first polychronous pattern comprising at least two non-synchronous pulses.

2. The apparatus of Claim 1, wherein the sensory input comprises a frame of pixels.

3. The apparatus of Claim 1, wherein the sensory input comprises a visual input signal.

4. The apparatus of Claim 3, wherein the first polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses, the group delay configured based at least in part on the visual input signal.

5. The apparatus of Claim 3, wherein the first polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses; and the group delay is based at least in part on an output generated by a spatiotemporal filter using the visual input signal.

6. The apparatus of Claim 1, wherein the first polychronous pattern is generated responsive to a trigger selected from a group consisting of: (i) a temporal event; (ii) receipt of the sensory input; (iii) an appearance of the object in the sensory input; and (iv) a timer event.

7. The apparatus of Claim 1, wherein the first polychronous pattern is adapted for transmission via a plurality of transmission channels.

8. The apparatus of Claim 7, wherein at least a subset of the plurality of transmission channels is associated with a channel delay configured to effect a coincident arrival of the at least two non-synchronous pulses at a decoder.

9. The apparatus of Claim 8, wherein the channel delay is configured based at least in part on the first polychronous pattern.

10. The apparatus of Claim 8, wherein the decoder comprises a coincidence detector configured to receive the at least two non-synchronous pulses and to decode the object characteristic responsive to the reception of the at least two non-synchronous pulses.

11. The apparatus of Claim 7, wherein at least a subset of the plurality of the transmission channels is configurable based on a second polychronous pattern generated prior to the first polychronous pattern.

12. A sensory prosthetic apparatus, comprising:

an encoder configured to:

receive a sensory input;

encode the sensory input into a plurality of pulses configured in a polychronous pattern comprising at least two non-synchronous pulses; and

transmit the plurality of pulses via a plurality of transmission channels.

13. The apparatus of Claim 12, wherein each of the plurality of the transmission channels is configured for operable coupling to a receptor.

14. The apparatus of Claim 13, wherein the receptor is configured to interface to at least a portion of a vertebrate nervous system.

15. The apparatus of Claim 14, wherein the apparatus comprises a retinal prosthetic device.

16. The apparatus of Claim 15, wherein the receptor is configured to interface to a retinal ganglion cell.

17. The apparatus of Claim 15, wherein the receptor is configured to interface to a retinal bipolar cell.

18. The apparatus of Claim 12, wherein the encoder is configurable to be adaptively adjusted based at least in part on a feedback signal provided by the host of the sensory prosthetic apparatus.

19. The apparatus of Claim 18, wherein

encoding of the sensory input into the plurality of pulses is characterized by at least one parameter; and

the encoder adaptive adjustment is configured to modify the at least one parameter.

20. A method of encoding sensory input for use in a processing apparatus, the method comprising: receiving the sensory input comprising information indicative of an object characteristic;

encoding the sensory input into a plurality of pulses; and

configuring at least a portion of the plurality of pulses in a first polychronous pattern comprising at least two non-synchronous pulses.

21. The method of Claim 20, wherein:

the sensory input comprises a visual input signal; and

the first polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses, the group delay configured based at least in part on the visual input signal.

22. The method of Claim 21, wherein the group delay is configured based at least in part on an output generated by a spatiotemporal filter responsive to the visual input signal.

23. An apparatus configured to transmit a signal to a vertebrate nervous system, the apparatus comprising:

a processor configured to receive an input signal representative of at least a portion of a visual frame;

an encoder configured to encode the input signal into a plurality of pulses; and a plurality of stimulating electrodes;

wherein:

at least a subset of the plurality of pulses is configured in a polychronous pattern comprising at least two non-synchronous pulses;

information related to at least the portion of the visual frame is encoded into the polychronous pattern;

the polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses, the group delay being determined by a spatiotemporal filter applied to the input signal; and

the polychronous pattern is adapted for transmission via a plurality of electrodes operably coupled to the nervous system.

Description:
APPARATUS AND METHODS FOR POLYCHRONOUS ENCODING AND MULTIPLEXING IN NEURONAL PROSTHETIC DEVICES

Priority and Cross-Reference to Related Applications

[0001] This application claims priority to U.S. Patent Application Serial No. 13/117,048 filed May 26, 2011 of the same title, which is incorporated herein by reference in its entirety.

[0002] This application is related to co-owned U.S. provisional patent application No. 61/318,191, filed March 26, 2010 and entitled "Systems and Method for Pulse-Code Invariant Object Recognition", U.S. patent application No. 12/869,573, filed August 26, 2010 and entitled "Systems and Methods for Invariant Pulse Latency Coding", and U.S. patent application No. 12/869,583, filed August 26, 2010 and entitled "Invariant Pulse Latency Coding Systems and Methods", each of the foregoing incorporated herein by reference in its entirety.

BACKGROUND OF THE INVENTION

1. Field of the Invention

[0003] The present invention relates generally to visual encoding in a computerized processing system, and more particularly in one exemplary aspect to a computer vision apparatus and methods of encoding of visual information for vision prosthetic devices.

2. Description of Related Art

[0004] In a vertebrate vision system, the retina is a light-sensitive tissue lining the inner surface of the eye. Light falling upon the retina triggers nerve impulses that are sent to visual centers of the brain via fibers of the optic nerve. The optic fibers from the retina to the thalamus represent a visual information bottleneck. The retina, consisting of millions of photoreceptors, is coupled to millions of neurons in the primary visual cortex to process visual information, yet there are only hundreds of thousands of retinal ganglion cells giving rise to the optic fibers that connect the retina to the brain.

[0005] When retinal photoreceptors are damaged, a retinal prosthetic device (e.g., a semiconductor chip) is utilized to encode visual information, and transmit it to the retinal ganglion cells (RGC) via stimulating electrodes. In this case, the number of electrodes (typically limited to tens or hundreds per chip) acts as the information bottleneck that limits the transmission bandwidth and hence the resolution of the perceived image. Presently available retinal prosthetic devices use a small number of individual light sensors directly coupled to the RGCs via individual stimulating electrodes. The sensors are typically arranged into a square pattern (e.g., a 3x3 pattern in the exemplary retinal prosthetic manufactured by Second Sight Medical Products, Inc.), and are configured to operate independently from one another by generating electrical pulses in the electrodes in response to light stimulation. The electric pulses evoke firings of the RGCs to mimic their natural firing patterns. It remains an open question what the optimal firing pattern of RGCs is for encoding visual signals in a retinal prosthetic device.

[0006] Existing models used for retinal prosthetic signal encoding utilize rate encoding; the RGCs are stimulated with pulses of current of various amplitudes or durations (or waveforms) so as to make the RGCs fire at various firing rates. This is in line with the common neuroscience theory that the frequency of random firing of retinal ganglion cells (and not the precise timing of pulses) is used to transmit the information to the brain (see, e.g., Field, G.; Chichilnisky, E. Information Processing in the Primate Retina: Circuitry and Coding. Annual Review of Neuroscience, 2007, 30(1), 1-30). In another existing approach (see Van Rullen, R.; Thorpe, S. Rate coding versus temporal order coding: what the retinal ganglion cells tell the visual cortex. Neural Computation, 2001, 13, 1255-1283), a coding scheme is suggested where each retinal ganglion cell encodes information into pulse timing (or latency) that is measured with respect to a reference timing signal (e.g., the onset of the image). Here, the RGC with the strongest signal fires first, the RGC with the second strongest signal fires next, and so on. Each RGC fires only once.

[0007] In both cases (i.e., rate coding and latency coding), each RGC encodes the photoreceptor signal (or other features of the image) into a single analog value (rate or latency), and RGCs encode their values independently. That is, the message transmitted along one RGC is the same regardless of the activities of the other RGCs.

[0008] In a different approach described in e.g., Meister, M. Multineuronal codes in retinal signaling. Proceedings of the National Academy of Sciences. 1996, 93, 609-614; Meister, M.; Berry M. J. II. The neural code of the retina. Neuron. 1999, 22, 435-450; and Schnitzer, M.J.; Meister, M. Multineuronal Firing Patterns in the Signal from Eye to Brain. Neuron, 2003, 37, 499-511, encoding and multiplexing of visual information into patterns of synchronous pulses involving multiple cells is performed. The advantage of such a neuronal code is higher information transmission capacity, as it allows for multiplexing of various photoreceptor signals into the pulsed output of multiple RGCs. For example, 4 photoreceptor signals or features, 1, 2, 3, and 4, can be encoded into 3 RGC synchronous firings 1->(1,0,0), 2->(0,1,0), 3->(0,0,1), and 4->(1,1,0), where 1 represents the corresponding RGC firing, and 0 represents a quiescent state. When photoreceptors 1 and 3 are active, the output is a superposition 1+3 -> (1,0,1), resulting in multiplexing. However, when too many photoreceptors are active, the output consists of a synchronous barrage of pulses (e.g., (1,1,1)) and the information is lost.
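For illustration only (not part of the cited prior work), the following minimal Python sketch reproduces the synchronous-code arithmetic of the example above; the feature-to-pattern mapping is taken from the text, while the function name and the logical-OR superposition are assumptions.

# Minimal sketch of the synchronous multiplexing example above; the mapping of
# features 1-4 to 3-cell firing patterns follows the text, everything else
# (names, logical-OR superposition) is an illustrative assumption.

SYNC_CODE = {            # feature -> (RGC1, RGC2, RGC3) firing pattern
    1: (1, 0, 0),
    2: (0, 1, 0),
    3: (0, 0, 1),
    4: (1, 1, 0),
}

def multiplex(active_features):
    """Superpose the firing patterns of all active features."""
    out = (0, 0, 0)
    for f in active_features:
        out = tuple(a | b for a, b in zip(out, SYNC_CODE[f]))
    return out

print(multiplex([1, 3]))     # (1, 0, 1) -> features 1 and 3 are recoverable
print(multiplex([1, 2, 3]))  # (1, 1, 1) -> synchronous barrage
print(multiplex([2, 3, 4]))  # (1, 1, 1) -> same output: information is lost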

[0009] All existing approaches have limited information transmission capacity, at least in part because they (i) do not fully utilize multichannel patterns of pulses to encode visual signals, and (ii) do not fully take advantage of the brain's ability to learn to decode such patterns. Accordingly, there is a salient need for a more efficient and scalable visual encoding solution that utilizes data compression at the retinal level prior to data transmission to the brain, in order to, among other things, increase the resolution capabilities of retinal prosthetic devices.

SUMMARY OF THE INVENTION

[0010] The present invention satisfies the foregoing needs by providing, inter alia, apparatus and methods for polychronous encoding and multiplexing in, e.g., neuronal prosthetic devices.

[0011] In one aspect of the invention, an apparatus configured to transmit a signal to a vertebrate nervous system is disclosed. In one embodiment, the apparatus comprises: a processor configured to receive an input signal representative of at least a portion of a visual frame, an encoder configured to encode the input signal into a plurality of pulses, and a plurality of stimulating electrodes. In one variant, at least a subset of the plurality of pulses is configured in a polychronous pattern comprising at least two non-synchronous pulses. Information related to at least the portion of the visual frame is encoded into the polychronous pattern, and the polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses. The group delay is determined in one implementation by a spatiotemporal filter applied to the input signal, and the polychronous pattern is adapted for transmission via a plurality of electrodes operably coupled to the nervous system.

[0012] In another aspect of the invention, a sensory input encoding apparatus is disclosed. In one embodiment, the apparatus comprises a processing apparatus configured to receive and encode the sensory input, comprising information indicative of an object characteristic, into a plurality of pulses such that at least a portion of the plurality of pulses is configured in a first polychronous pattern comprising at least two non-synchronous pulses.

[0013] In one variant, the sensory input comprises a visual input signal comprising a frame of pixels.

[0014] In another variant, the first polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses, the group delay configured based at least in part on the visual input signal. The group delay is based, in one implementation, at least in part on an output generated by a spatiotemporal filter using the visual input signal.

[0015] In another variant, the first polychronous pattern is generated responsive to a trigger such as: (i) a temporal event; (ii) receipt of the sensory input; (iii) an appearance of the object in the sensory input; and/or (iv) a timer event.

[0016] In yet another variant, the first polychronous pattern is adapted for transmission via a plurality of transmission channels such that at least a subset of the plurality of transmission channels is associated with a channel delay configured to effect a coincident arrival of the at least two non-synchronous pulses at a decoder, and the channel delay is configured based at least in part on the first polychronous pattern.

[0017] In still another variant, at least a subset of the plurality of the transmission channels is configurable based on a second polychronous pattern generated prior to the first polychronous pattern.

[0018] In a third aspect of the invention, a sensory prosthetic apparatus is disclosed.

In one embodiment, the apparatus comprises: an encoder configured to receive a sensory input, and to encode the sensory input into a plurality of pulses configured in a polychronous pattern comprising at least two non-synchronous pulses, and to transmit the plurality of pulses via a plurality of transmission channels, such that each of the plurality of the transmission channels is configured for operable coupling to a receptor.

[0019] In one variant, the receptor is configured to interface to at least a portion of a vertebrate nervous system, and the apparatus comprises a retinal prosthetic device.

[0020] In another variant, the receptor is configured to interface to a retinal ganglion cell.

[0021] In yet another variant, the encoder is configurable to be adaptively adjusted based at least in part on a feedback signal provided by the host of the sensory prosthetic apparatus. Encoding of the sensory input into the plurality of pulses is characterized by at least one parameter, and the encoder adaptive adjustment is configured to modify the at least one parameter.

[0022] In a fourth aspect of the invention, a method of encoding sensory input for use in a processing apparatus is disclosed. In one embodiment, the method comprises receiving the sensory input comprising information indicative of an object characteristic, encoding the sensory input into a plurality of pulses, and configuring at least a portion of the plurality of pulses in a first polychronous pattern comprising at least two non-synchronous pulses.

[0023] In one variant, the sensory input comprises a visual input signal, and the first polychronous pattern is characterized by a group delay that is common to all pulses within the plurality of pulses, the group delay configured based at least in part on the visual input signal.

[0024] In another variant, the group delay is configured based at least in part on an output generated by a spatiotemporal filter responsive to the visual input signal.

[0025] In another aspect of the invention, an image processing system is disclosed. In one embodiment, the system comprises a processor configured to execute instructions maintained in a storage medium; the instructions cause the processor to encode and multiplex visual signals in neuronal prosthetic devices.

[0026] Further features of the present invention, its nature and various advantages will be more apparent from the accompanying drawings and the following detailed description.

BRIEF DESCRIPTION OF THE DRAWINGS

[0027] FIG. 1 is a graphical illustration of synchronous, asynchronous and polychronous pulse groups.

[0028] FIG. 1A is a graphical illustration of polychronous pulse groups according to one embodiment of the invention.

[0029] FIG. 2 is a block diagram illustrating an exemplary encoding apparatus according to one embodiment of the invention.

[0030] FIG. 3 is a block diagram illustrating visual signal encoding and decoding using pulse latencies according to one embodiment of the invention.

[0031] FIG. 3A is a block diagram illustrating visual signal encoding and message multiplexing using pulse latencies according to one embodiment of the invention.

[0032] FIG. 3B is a block diagram illustrating encoding and decoding of multiple objects using pulse latencies according to one embodiment of the invention.

[0033] FIG. 4 is a block diagram illustrating visual signal encoding, decoding and message multiplexing using a bank of filters according to one embodiment of the invention.

[0034] FIG. 5 is a logical flow chart illustrating one exemplary embodiment of the method of encoding and decoding of objects.

[0035] All Figures disclosed herein are © Copyright 2011 Brain Corporation. All rights reserved.

DETAILED DESCRIPTION OF THE INVENTION

[0036] Embodiments of the present invention will now be described in detail with reference to the drawings, which are provided as illustrative examples so as to enable those skilled in the art to practice the invention. Notably, the figures and examples below are not meant to limit the scope of the present invention to a single embodiment; other embodiments are possible by way of interchange of, or combination with, some or all of the described or illustrated elements. Wherever convenient, the same reference numbers will be used throughout the drawings to refer to same or like parts.

[0037] Where certain elements of these embodiments can be partially or fully implemented using known components, only those portions of such known components that are necessary for an understanding of the present invention will be described, and detailed descriptions of other portions of such known components will be omitted so as not to obscure the invention.

[0038] In the present specification, an embodiment showing a singular component should not be considered limiting; rather, the invention is intended to encompass other embodiments including a plurality of the same component, and vice- versa, unless explicitly stated otherwise herein.

[0039] Further, the present invention encompasses present and future known equivalents to the components referred to herein by way of illustration.

[0040] As used herein, the terms "computer", "computing device", and "computerized device" include, but are not limited to, mainframe computers, workstations, servers, personal computers (PCs) and minicomputers, whether desktop, laptop, or otherwise, personal digital assistants (PDAs), handheld computers, embedded computers, programmable logic devices, digital signal processor systems, personal communicators, tablet computers, portable navigation aids, J2ME equipped devices, cellular telephones, smartphones, personal integrated communication or entertainment devices, neurocomputers, neuromorphic chips, or literally any other device capable of executing a set of instructions and processing an incoming data signal.

[0041] As used herein, the term "computer program" or "software" is meant generally to include any sequence of human or machine cognizable steps which perform a function. Such program may be rendered in virtually any programming language or environment including, for example, C/C++, C#, Fortran, COBOL, MATLAB™, PASCAL, Python, assembly language, markup languages (e.g., HTML, SGML, XML, VoXML), and the like, as well as object-oriented environments such as the Common Object Request Broker Architecture (CORBA), Java™ (including J2ME, Java Beans, etc.), Binary Runtime Environment (e.g., BREW), and the like.

[0042] As used herein, the terms "connection", "link", "transmission channel", "delay line", refer without limitation to a causal link between any two or more entities (whether physical, wired/wireless, or logical/virtual), which enables information exchange between the entities.

[0043] As used herein, the term "memory" includes any type of integrated circuit or other storage device adapted for storing digital data including, without limitation, ROM, PROM, EEPROM, DRAM, SDRAM, DDR/2 SDRAM, EDO/FPMS, RLDRAM, SRAM, "flash" memory (e.g., NAND/NOR), memristor, and PSRAM.

[0044] As used herein, the terms "microprocessor" and "digital processor" are meant generally to include all types of digital processing devices including, without limitation, digital signal processors (DSPs), reduced instruction set computers (RISC), general-purpose (CISC) processors, microprocessors, gate arrays (e.g., FPGAs), PLDs, reconfigurable compute fabrics (RCFs), array processors, secure microprocessors, and application-specific integrated circuits (ASICs). Such digital processors may be contained on a single unitary IC die, or distributed across multiple components.

[0045] As used herein the terms "pulse pattern", "pattern of pulses", or "pattern of pulse latencies" are meant generally and without limitation to denote a group of pulses, arranged (in space and time) in a manner that is recognizable at a predetermined level of statistical significance.

[0046] As used herein, the terms "pulse", "spike", "burst of spikes", and "pulse train" are meant generally to refer to, without limitation, any type of a pulsed signal, e.g., a rapid change in some characteristic of a signal, e.g., amplitude, intensity, phase or frequency, from a baseline value to a higher or lower value, followed by a rapid return to the baseline value, and may refer to any of a single spike, a burst of spikes, an electronic pulse, a pulse in voltage, a pulse in electrical current, a software representation of a pulse and/or burst of pulses, a software representation of a latency or timing of the pulse, and any other pulse or pulse type associated with a pulsed transmission system or mechanism.

[0047] As used herein, the terms "pulse latency", "absolute latency", and "latency" are meant generally to refer to, without limitation, a temporal delay or a spatial offset between an event (e.g., the onset of a stimulus, an initial pulse, or just a point in time) and a pulse.

[0048] As used herein, the terms "pulse group latency", or "pulse pattern latency" refer to, without limitation, an absolute latency of a group (pattern) of pulses that is expressed as a latency of the earliest pulse within the group.

[0049] As used herein, the terms "relative pulse latencies" refer to, without limitation, a latency pattern or distribution within a group (or pattern) of pulses that is referenced with respect to the pulse group latency.

[0050] As used herein, the term "pulse-code" is meant generally to denote, without limitation, information encoding into a pattern of pulses (or pulse latencies) along a single pulsed channel, or into relative pulse latencies along multiple channels.

[0051] As used herein, the term "wireless" means any wireless signal, data, communication, or other interface including without limitation Wi-Fi, Bluetooth, 3G (e.g., 3GPP, 3GPP2, and UMTS), HSDPA/HSUPA, TDMA, CDMA (e.g., IS-95A, WCDMA, etc.), FHSS, DSSS, GSM, PAN/802.15, WiMAX (802.16), 802.20, narrowband/FDMA, OFDM, PCS/DCS, Long Term Evolution (LTE) or LTE-Advanced (LTE-A), analog cellular, CDPD, satellite systems such as GPS, millimeter wave or microwave systems, optical, acoustic, and infrared (i.e., IrDA).

Overview

[0052] The present invention provides, in one salient aspect, improved apparatus and methods for polychronous encoding and multiplexing in neuronal prosthetic devices.

[0053] In one embodiment of the invention, a precise spike-timing code is utilized to achieve a higher data transmission rate. An exemplary encoding apparatus comprises an encoder that is configured to receive an input signal from an image sensor (such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active pixel sensor device), and to transmit encoded data via multiple transmission channels. The encoder utilizes a precise time base (that is common between different channels), and encodes information into patterns of pulses that are time-locked to each other but not necessarily synchronous. Such pulses are called polychronous (see Izhikevich, E.M. Polychronization: Computation With Spikes. Neural Computation, 2006, 18, 245-282), as shown in FIG. 1. Such patterns of polychronous pulses are referred to as polychronous patterns or polychronous pulse groups. Such pulse groups are the basis of the polychronous code that has even greater information capacity and multiplexing capability than synchronous patterns, because the code uses the relative timing of pulses as an additional variable.

[0054] In one implementation, a pulse delay encoding is employed in order to multiplex signals from different sensing elements onto transmission channels, and to generate precise pulse-timing patterns. In another implementation, banks of basis spatial filters are employed to project (partition) the incoming visual signal onto a set of basis projections prior to multiplexing. In one variant, banks of spatio-temporal filters are utilized when the input signals exhibit temporal variability.

[0055] In another aspect of the invention, the precise pulse-timing patterns are utilized in a retinal prosthetic device in order to increase data transmission capacity, and to create prosthetic devices to deliver richer (higher-resolution) signals using fewer stimulation electrodes compared to presently available approaches.

[0056] In another implementation, the polychronous encoding scheme described here is used in generalized neuronal prosthetic devices that benefit from an increased transmission bandwidth in order to deliver rich sensory information into the brain via a smaller number of stimulating electrodes.

[0057] Embodiments of the polychronous encoding and multiplexing of visual information functionality are useful in a variety of robotic electronic devices and computer vision systems requiring object recognition functionality.

Detailed Description of Exemplary Embodiments

[0058] Detailed descriptions of the various embodiments and variants of the apparatus and methods of the invention are now provided. Although certain aspects of the invention can best be understood in the context of the conversion of visual input into polychronous pulse pattern output and subsequent data transmission to the RGCs via stimulating electrodes, embodiments of the invention may also be used for processing of other sensory signals of other, non-visual modalities, including various bands of electromagnetic waves (e.g., infrared, etc.) and pressure (e.g., sound, tactile) signals, and chemical signals (e.g., odor, taste).

[0059] Embodiments of the invention may be for example deployed in a hardware and/or software implementation of a computer-vision system, provided in one or more of a prosthetic device, robotic device and any other specialized visual system. In one such implementation, an image processing system may include a processor embodied in an application specific integrated circuit ("ASIC"), which can be adapted or configured for use in an embedded application such as a prosthetic device.

Exemplary Encoding Apparatus

[0060] Referring now to FIGS. 1 through 5A, exemplary embodiments of polychronous pulse-code encoding apparatus and methods of the invention are described. In one embodiment, the apparatus and methods encode each object (or a portion of an object or a visual feature) into a pattern of pulses and the timing of the occurrence of the pattern as described in detail below.

[0061] It is known in the field of neuroscience that neurons generate action potentials, often called "spikes", "impulses", or "pulses", and transmit them to other neurons. Such pulses are discrete temporal events, and there can be many pulses per unit of time. Conventionally, a stereotypical burst of a few spikes is often considered to be a pulse (also referred to as the "complex spike").

Polychronous pulse groups

[0062] In one aspect of the invention, the encoding apparatus utilizes a precise time base (that is common between different channels), and encodes information into patterns of pulses that are time-locked to each other (but not necessarily synchronous). FIG. 1 illustrates one example of synchronous (106), non-synchronous (114), and polychronous (116) pulse patterns that utilize two transmission channels (102, 104). In another variant (not shown), a single transmission channel is used instead, in conjunction with a reference time base such as a periodic clock signal.

[0063] The synchronous pulse transmissions (106) comprise pulse groups 108, 110, 112 that appear in synchronization on both transmission channels 102, 104. The synchronous pulse group 108 contains two distinct time delays (dt1, dt2), and therefore is capable of transmission of two bits of information.

[0064] Appearance of pulses and/or pulse groups on the channels 102, 104 at undetermined times with respect to each other (or a reference event) corresponds to the non-synchronous (or asynchronous) pulse transmission 114. Such transmissions do not imply a reproducible time-locking pattern, but usually describe noisy, random, non-synchronous events, such as data communication at random intervals.

[0065] Polychronous pulse transmissions (such as transmissions 116 in FIG. 1) comprise pulses that appear on two channels (102, 104) at predetermined time intervals with respect to a reference event (such as marked by the vertical dashed line 119).

[0066] In polychronous patterns, such as shown by the pulse groups 118 of FIG. 1, there is a predetermined relationship between the pulses, and all of the pulses within the polychronous group 118 appear at predetermined time intervals dt3-dt6. Hence, the polychronous transmissions of four pulses carry 4 bits of information, compared to the two bits carried by the synchronous pulse group 106. The polychronous pulse structure 120 shown in FIG. 1 comprises two polychronous pulse groups 118 that are detectable by a variety of decoding techniques, such as, for example, correlation, matched filter, or coincidence detection coupled with a variable delay.

[0067] FIG. 1A shows other exemplary polychronous pulse patterns useful with the invention. Pulses 122, 124 are generated on the two channels 102, 104, respectively, at predetermined times dt7, dt8, all referenced to a common time event. The pulses 122, 124 form the polychronous pattern 126 that may be repeated as shown in FIG. 1A. Another polychronous pattern 130 is comprised of two pulses 122, 128 on the first channel 102 and the pulse 124 on the second channel 104, all generated at predetermined time intervals dt7, dt8, dt9. The pattern 130 is repeated several times with additional randomly appearing pulses 134, which are considered as noise and are removed during processing. The deterministic structure of the polychronous pulse pattern, as characterized by the predetermined pulse timing on different channels that is referenced to the common event, aids in noise removal using e.g., methodologies described below.

[0068] Polychronous patterns (such as, for example, the patterns 126, 130 of FIG. 1A) form a basis of the polychronous code that provides greater information capacity and multiplexing capability compared to synchronous patterns, because the polychronous pulse patterns allow the use of pulse timing (such as time delays dt7, dt8, dt9, for example) as an additional variable for data transmission.
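By way of illustration only, the following Python sketch represents a polychronous pattern as a set of (channel, delay) pairs referenced to a common event, and tests whether a set of observed pulses contains it while ignoring stray "noise" pulses. The numeric delay values, the jitter tolerance, and all names are assumptions for the sketch.

# Sketch: a polychronous pattern as (channel, delay) pairs relative to a
# reference event; stray pulses (like the noise pulses 134 above) are ignored.
# Delay values, tolerance, and names are illustrative assumptions.

PATTERN_130 = [(0, 7.0), (0, 9.0), (1, 8.0)]   # two pulses on channel 0, one on channel 1

def contains_pattern(pulses, pattern, t_ref, tol=0.5):
    """pulses: list of (channel, time). True if every element of the pattern
    has a matching pulse within +/- tol of its expected time t_ref + delay."""
    return all(
        any(ch == c and abs(t - (t_ref + dt)) <= tol for c, t in pulses)
        for ch, dt in pattern
    )

observed = [(0, 17.0), (1, 18.1), (0, 19.0), (1, 22.3)]     # last pulse is noise
print(contains_pattern(observed, PATTERN_130, t_ref=10.0))  # True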

[0069] One embodiment of the invention that utilizes polychronous encoding using pulse delays is shown and described below with respect to FIG. 2. An object 202 (such as a letter 'I', although any other visual feature can be used here, such as an edge or a "blob") is placed in the field of view of a visual sensing device, such as a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) active pixel sensor. The visual input signal from the object is captured by light detectors inside the visual sensor, which produce a two-dimensional m × n matrix of pixels 204. The pixel matrix comprises a representation of various object features (such as a plurality of edges) and parameters (size, orientation). In the embodiment of FIG. 2, each pixel 206 within the matrix 204 carries a value corresponding to bright or dark areas of the object as represented by the white and the black squares, respectively. In another implementation, each pixel 206 comprises an 8-bit red-green-blue (RGB) three-color model digitized value (e.g., refreshed at a suitable frame rate, for example 10-30 Hz). It will be appreciated by those skilled in the art that the above image parameters are merely exemplary, and many other image representations (e.g., bitmap, cyan-magenta-yellow-key (CMYK) four-color model, n-bit gray scale, etc.) are equally applicable to, and useful with, the present invention. In yet another implementation, the pixel matrix provides a suitably conditioned (amplified, impedance matched and/or filtered) analog output from the light sensors.

[0070] The visual sensor output 208 is encoded by the encoder apparatus 210 into polychronous patterns of pulses 212, which are transmitted via transmission channels 214. In one approach, the transmission channels comprise the stimulating electrodes (and accompanying analog signal conditioning circuitry) that couple the encoder to the retinal ganglion cells (or other receptors in retina or in the brain), and stimulate the desired pattern of spikes within the RGCs.

[0071] When electrodes are inserted into nervous tissue, pulses of current injected via the electrodes evoke electrical responses in the tissue. A strong brief pulse of current often results in one or more pulses induced in the neurons in the tissue proximal to the electrode. In order to evoke a reliable response in one or a small subset of neurons next to the tip of the stimulating electrode, appropriate conditioning and/or waveform shaping of the stimulating signal may be required. For the purposes of the present discussion, it is assumed that the stimulating signal waveform is adequately configured in order to evoke the desirable pattern of firing in the stimulated neurons.

[0072] In another approach, the transmission channels comprise data links that deliver polychronous pulses to a computerized processing apparatus.

Delay-based encoding

[0073] In another implementation, the encoding apparatus 300 utilizes a delay-based approach in order to encode the input, as illustrated and described with respect to FIG. 3. The encoding apparatus 300 uses delay-transmission lines to convert visual input signal information into polychronous pulse patterns. The exemplary encoding apparatus 300 comprises two transmission channels 318, 320, although other numbers of channels may be used. The input visual pixel matrix (such as the matrix 204 in FIG. 2) is partitioned into two-by-two blocks of pixels, such as the pixel block 302. The visual input signal appears at a time t0, which acts as a reference for all subsequent pulsed events. The encoder apparatus 300 comprises two pulse-generating units, also referred to as the encoder nodes 314, 316. Each pixel in the grid 302 evokes a unique pattern of polychronous pulses in the pulse-generating units 314, 316, which generate pulses based on predetermined delay values. In the embodiment of FIG. 3, dark pixels correspond to a logical value of "one" and white pixels correspond to a logical value of "zero". Other nomenclatures (such as the inverse, or n-level) may be used as well. The encoder delays are configured based on the position of the dark pixel within the grid (as shown by pixel grids 304-310) and the path to the encoder node 314, 316. In FIG. 3, the delays are labeled by numerals 1-4, next to the arrows linking the pixel grids 304-310 and the pulse-generating units.

[0074] For example, if only the top right pixel is active, as in the grid 304, the encoder generates a polychronous pattern 330 along the transmission channels 318, 320. The pulse pattern 330 comprises two pulses (one per channel) generated with the predetermined delays 326, 328. The delays 326, 328 shown in the example in FIG. 3 are 1 and 2 time units, respectively. Here, the time units are arbitrary (comprising, e.g., milliseconds, units of time of the numerical solver, or units of the global clock).

[0075] The pulse pattern 330 is received by a decoding apparatus 340, which comprises two 1-to-4 switching nodes 333, 335 coupling the transmission channels 318, 320 to the detector nodes 332-338. The detection apparatus 340 applies predetermined delay compensation to the received input. The values of the delay compensation are shown in FIG. 3 as numerals 1-4 next to the arrows linking the nodes 333, 335 and the detectors 332-338.

[0076] Each of the detectors 332-338 is configured to receive input from each of the transmission channels 318, 320, and to use the received input pulses to decode information about the stimulus objects by acting as a coincidence detector. For example, the detector node 332, initially in a zero (FALSE) state, transitions to a 'TRUE' state and generates a pulse output if the received pulses are coincident (or nearly coincident within some allowable jitter or tolerance). If the received pulses are not coincident, the detector node 332 remains in the zero state. It will be appreciated by those skilled in the art that the above coincidence detector can be implemented using many existing neuronal models, including e.g., the integrate-and-fire model, Hodgkin-Huxley model, FitzHugh-Nagumo model, or quadratic integrate-and-fire model, as well as others.

[0077] In the example shown in FIG. 3, the detector 332 receives a coincident pulse pair, and generates the detection signal, causing the detection apparatus 340 to report detection of the pixel pattern 304 at its output. All remaining detectors 334-338 remain in their previous states. In another variant comprising more than two transmission channels, the detectors 332-338 generate detection signals whenever they receive two or more coincident pulses.
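For illustration, the following Python sketch implements a delay-based encode/decode scheme of the kind shown in FIGS. 3-3B, under stated assumptions: the pixel-to-delay table uses the pattern assignments P_1-P_4 = (1,2), (2,4), (3,1), (4,3) given later in the text, the decoder checks for coincidence at the common reference time after undoing each candidate's channel delays, and all function and variable names are illustrative.

# Sketch of delay-based polychronous encoding (FIG. 3) and decoding by delay
# compensation plus coincidence detection, under the assumptions stated above.

# pixel index within the 2x2 grid -> (delay on channel A, delay on channel B)
ENCODER_DELAYS = {0: (1, 2), 1: (2, 4), 2: (3, 1), 3: (4, 3)}

def encode(active_pixels, t0=0.0):
    """Each active pixel contributes one pulse per channel at its delays."""
    pulses = []
    for p in active_pixels:
        da, db = ENCODER_DELAYS[p]
        pulses += [('A', t0 + da), ('B', t0 + db)]
    return pulses

def decode(pulses, t0=0.0, tol=0.25):
    """Undo each candidate pixel's channel delays; report the pixel when both
    channels then carry a pulse at the reference time t0 (coincidence)."""
    detected = []
    for p, (da, db) in ENCODER_DELAYS.items():
        hit_a = any(ch == 'A' and abs(t - da - t0) <= tol for ch, t in pulses)
        hit_b = any(ch == 'B' and abs(t - db - t0) <= tol for ch, t in pulses)
        if hit_a and hit_b:
            detected.append(p)
    return detected

print(decode(encode([0])))   # [0] -> the single active pixel is reported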

[0078] FIG. 3A depicts exemplary encoder operation in response to a pixel grid 342 comprising two active pixels, located at the top right and the bottom left of the pixel grid. As shown in FIG. 3A, in addition to the encoder generating the pulses 322, 324 with delays of 1 and 2 respectively (corresponding to the pixel pattern 304), the encoder generates pulses 344, 346 with delays 348, 352 of 4 and 3, respectively. The resulting polychronous pulse pattern is depicted in the dashed box 350 of FIG. 3A. Notice that the pulse pattern 350 is a superposition of two polychronous patterns - one for the first pixel 304, and the other for the second pixel 310. Thus, the pulse pattern 350 conveys independently two different messages using pulse multiplexing.
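Continuing the illustrative sketch above, the superposition of two patterns can be decoded back into both messages, mirroring the multiplexed pattern 350:

mux = encode([0, 3])   # top-right pixel (delays 1, 2) plus bottom-left pixel (delays 4, 3)
print(sorted(mux))     # [('A', 1.0), ('A', 4.0), ('B', 2.0), ('B', 3.0)]
print(decode(mux))     # [0, 3] -> both messages recovered from a single pulse group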

[0079] In another embodiment, distinct polychronous patterns are designated in order to encode conjunctions of input signals. Certain conjunctions, referred to as higher-order features (corresponding to, for example, intersections of horizontal and vertical lines forming corners of various orientations, etc.), are encoded by their own specific polychronous patterns.

[0080] Referring now to FIG. 3B, exemplary encoding, transmission, and decoding of multiple input pixel grids (or pixel frames) is shown. The encoder 312 receives two separate pixel grids 302, 342 and encodes each grid into a separate polychronous group of pulses (330, 350, respectively). The pulse groups 330, 350 are transmitted from the encoder 312 to the decoder apparatus 340. The decoder decodes the received pulse groups to recover the original pixel grids 302, 342.

[0081] The exemplary embodiments of FIGS. 3-3B illustrate the encoding and transmission of a 4-bit binary input via 2 transmission channels. This represents a doubling of data throughput per channel, compared to the presently available solutions that utilize banks of independent transmission channels. Although a 2-by-2 pixel grid is shown in FIGS. 3-3B, it is appreciated by those skilled in the art that higher density implementations can readily be constructed using the principles of the invention described above.

[0082] As it applies to the exemplary embodiments of FIGS. 3-3B, channel information capacity is typically limited by the ability of the encoder to unambiguously encode and transmit, and the decoder to unambiguously receive and decode, adjacent pulse pairs without causing pulse-to-pulse interference, also referred to as inter-symbol interference or ISI. For a typical pulse of duration T, the maximum pulse rate is limited to less than M = 1/T pulses per second (or bits per second, bps). Higher bit rates may be achievable using pulse shaping techniques.

[0083] A polychronous encoder, comprising N encoding elements coupled to N output channels (with each carrying pulses at M discrete time intervals), has a data capacity of 2^(N×M) different polychronous patterns to encode 2^(N×M) different messages. This number is exponentially larger than the number of messages (2^N) that can be encoded using synchronous patterns (as demonstrated by Meister, 1996, Meister and Berry, 1999, and Schnitzer and Meister, 2003), or the number of messages (M^N) obtainable using rank-order coding (Van Rullen and Thorpe, 2001). Thus, the encoding apparatus and methods described herein advantageously deliver a higher data transmission bandwidth that may be used to deliver a higher-resolution visual signal (in space, such as higher pixel count; in time, such as higher frame rate; or in color, intensity, or luminance contrast representation) using fewer channels.
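As a worked numeric illustration of the capacity comparison above (the values N = 2 channels and M = 4 time slots are arbitrary illustrative choices, not taken from the text):

# Worked illustration of the capacity formulas above; N and M are arbitrary
# illustrative values.
N, M = 2, 4
polychronous = 2 ** (N * M)   # 2^(N*M) = 256 distinct polychronous patterns
synchronous = 2 ** N          # 2^N     = 4 distinct synchronous patterns
rank_order = M ** N           # M^N     = 16 distinct rank-order codes
print(polychronous, synchronous, rank_order)   # 256 4 16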

[0084] Depending on the number and density of stimulation electrodes, one can set up the encoding scheme to map more pixels into more pulse-generating units, so as to maximize the information characteristic of the encoder.

[0085] When pulse patterns are delivered to the brain via stimulating electrodes, they evoke polychronous firing of action potentials (also called spikes or pulses) in neurons. Since the brain has neuronal fibers with different conduction delays, it is configured to adaptively adjust (learn) the neuronal network in order to decode such polychronous code, and extract the transmitted information.

[0086] It is established that the human brain has multiple redundant synaptic connections with a diversity of conduction delays and experience-dependent plasticity, thereby enabling a person to recognize different polychronous patterns by learning and adjusting synaptic strength or the conduction delays. Similarly, the decoder 340 disclosed herein is configurable to adaptively recognize the polychronous patterns by adjusting the delays or transmission characteristics of the connections onto its pulse-generating units 333-338.

[0087] Another embodiment of the present invention comprises a visual sensor with a number of pixels that is substantially greater (by, for example, a factor of 10) than the number of pulse-generating units. The pulse-generating units (that are similar to the encoder nodes 314, 316) are configured to generate pulses based on the state of pixels that are proximal to each other, so that there is a continuity and locality of the transformation from the pixel space to the pulse space. Such a configuration facilitates learning and decoding of the polychronous patterns by the nervous system.
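A minimal Python sketch of such a locality-preserving pixel-to-unit mapping follows; the 2x2 block size, the non-overlapping tiling, and all names are assumptions made for illustration.

# Minimal sketch: map a larger pixel array onto fewer pulse-generating units by
# grouping spatially proximal pixels (locality-preserving mapping). Block size,
# the non-overlapping tiling, and all names are illustrative assumptions.
import numpy as np

def map_pixels_to_units(frame, block=2):
    """frame: 2-D array of pixel values. Each non-overlapping block x block
    neighbourhood drives one pulse-generating unit, so nearby pixels are
    handled by the same (or a nearby) unit."""
    h, w = frame.shape
    units = {}
    for r in range(0, h - h % block, block):
        for c in range(0, w - w % block, block):
            units[(r // block, c // block)] = frame[r:r + block, c:c + block]
    return units

frame = np.random.randint(0, 2, size=(8, 8))   # 64 binary pixels
units = map_pixels_to_units(frame)             # 16 units, 4 pixels each
print(len(units))                              # 16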

Filter-based encoding

[0088] In another aspect of the invention, a set of basis filters is used by the encoder in encoding the sensory input signal into polychronous pulse patterns as shown and described below with respect to FIG. 4. The encoding sequence is divided into two stages; during the first stage, a sensory input (such as the pixel grid 302 described above with respect to FIG. 3) is received by a bank of filters 412 to extract various features. The filter bank comprises a set of basis filters {F_i(x), i = 1:4}, denoted by the designators 404-410, respectively, that are configured to determine a projection of the input sensory signal pattern I(x) onto four filter components according to the following expression:

f_i = ∫ I(x) F_i(x) dx, i = 1, ..., 4.     (Eqn. 1)

In some embodiments, it might be desirable to have a set of orthogonal filters; i.e., filters satisfying the orthogonality condition:

∫ F_i(x) F_j(x) dx = 0 for i ≠ j.     (Eqn. 2)

The integration in Eqns. 1-2 is performed with respect to a spatial vector-variable x, which describes a horizontal and/or vertical dimension of the pixel grid 302.

[0089] At the second stage, each of the basis filter projection values f_i is used by the encoder 416 to determine the presence of the active pixel (or a feature, such as the active pattern of pixels corresponding to the filter F_i), and to construct a corresponding polychronous pattern P_i. In one approach, a binary determination algorithm is used, such that if the value of the projection f_i exceeds a certain threshold T, then the output contains P_i; note the polychronous pulse pattern 422 generated by the pulse-generating nodes 418, 420 of the encoder. Table 1 below provides an exemplary relationship between different filter projections and the corresponding pixel location and encoder delays.

[0090] As shown in FIG. 4, the pixel grid 302 comprising an active pixel in the first row and second column evokes the polychronous pattern P_1, with the corresponding delay values of 1, 2, which are used by the encoder nodes 418, 420, respectively.

[0091] In another variant, an analog basis projection discrimination algorithm is used, such that the value of the filter projection determines the timing (for example, the group delay) of the appearance of the polychronous pulse pattern 422. The group delay is defined herein as a time interval that is common to all pulses within the pulse group, as referenced to an event (such as, for example, an appearance of the sensory input). In one variant, the pulse pattern P_i is considered not present when the group delay exceeds a predetermined maximum allowable value. When multiple basis filter projections (for example, f_1 and f_3) satisfy the condition of Eqn. 2, the encoder 416 produces a pulse pattern comprised of a superposition of the several respective polychronous patterns (P_1, P_3), thereby producing a multiplexed pulse pattern.

[0092] The embodiment in FIG. 2 corresponds to the bank of binary filters F_i(x) as in 211-214 (black corresponds to the value of one, and white corresponds to the value of zero) and P_1 = (1,2), P_2 = (2,4), P_3 = (3,1), and P_4 = (4,3), with binary determination of the presence of the polychronous pattern in the output based on the nonzero value of the corresponding projection f_i. Notice that the pattern 242 is a superposition of P_1 and P_4. A superposition of two polychronous patterns is a new polychronous pattern.
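The two-stage scheme can be sketched in Python as follows; the pattern assignments P_1-P_4 follow the text, while the filter-to-pixel assignments for F_2 and F_3, the threshold value, and all names are assumptions.

# Sketch of the two-stage filter-bank encoder: stage 1 computes projections
# f_i (a discrete analogue of Eqn. 1), stage 2 emits pattern P_i when f_i > T.
# The F_2/F_3 pixel assignments, threshold, and names are assumptions.
import numpy as np

FILTERS = [
    np.array([[0, 1], [0, 0]]),   # F_1: top-right pixel (pattern P_1 = (1, 2))
    np.array([[1, 0], [0, 0]]),   # F_2: top-left pixel (assignment assumed)
    np.array([[0, 0], [0, 1]]),   # F_3: bottom-right pixel (assignment assumed)
    np.array([[0, 0], [1, 0]]),   # F_4: bottom-left pixel (pattern P_4 = (4, 3))
]
PATTERNS = [(1, 2), (2, 4), (3, 1), (4, 3)]   # P_i: (delay on ch. A, delay on ch. B)

def filter_encode(frame, threshold=0.5):
    pulses = []
    for F, (da, db) in zip(FILTERS, PATTERNS):
        f_i = float(np.sum(frame * F))        # projection of the input onto F_i
        if f_i > threshold:                   # binary determination
            pulses += [('A', da), ('B', db)]
    return pulses

frame = np.array([[0, 1], [1, 0]])   # top-right and bottom-left pixels active
print(filter_encode(frame))          # superposition of P_1 and P_4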

[0093] In another embodiment, not shown, the sensory input I (such as the input pixel grid 302) comprises a temporal component, I = I(x, t). Accordingly, a bank of spatio-temporal filters, that is, filters of the general form F_i(x, t), is used.

Exemplary Method of Encoder Operation

[0094] In another aspect of the invention, a method of encoder apparatus operation, described in detail with respect to FIG. 5, allows for adaptive adjustment of the encoding parameters (such as encoding of the information from the pixel space to polychronous pulse groups).

[0095] At step 522 of the method 500 of FIG. 5, a sensory input corresponding to an object of interest (such as a frame of pixels 204 or a digitized image representing the object 202) is received by the encoding apparatus, such as, for example, the encoder 210 of the apparatus 200 in FIG. 2. At step 524, the encoder 210 configures encoding parameters corresponding to an initial encoder state. At step 526, the sensory input 204 is encoded into a polychronous pattern of pulses (for example, the pattern 212), and the encoder output is transmitted at step 528 to the destination via transmission channels (such as the channels 214 of FIG. 2). In one implementation, the destination comprises RGCs of a vertebrate nervous system. In another implementation, encoded pulse data are transmitted to a computerized image processing apparatus comprising a plurality of pulse detectors.

[0096] At step 530, the received polychronous pulse patterns are analyzed, and detection of object features (such as, for example, different edges of the object 202) is performed. Based on the detection results (such as, for example, a comparison of the detection score to a preset threshold, a timing delay, etc.), at step 532 a feedback signal 534 may be provided to the encoder in order to adaptively adjust specific aspects of the encoding and/or parameters of the transmission channels (such as the channels 214 of FIG. 2). In one implementation, the feedback comprises verbal or other feedback from a patient wearing the prosthetic device. For example, the temporal span of the polychronous pattern (i.e., the average time between the first and the last pulse in the pattern), set initially to a large value, is gradually decreased until the patient stops reporting improvements in perception of the sensory signals. In another variant, the number of pulses per stimulating electrode is adjusted. In yet another variant, the complexity of the polychronous patterns (the number of channels that each polychronous pattern spans) is adjusted, or the set of spatiotemporal filters is adjusted, or a combination thereof is adjusted.
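By way of illustration only, a minimal Python sketch of the feedback-driven adjustment of the pattern temporal span described above is given here; the starting span, step size, and the patient feedback callback are assumptions.

# Minimal sketch of feedback-driven adjustment of the polychronous pattern
# temporal span. Starting span, step size, and the feedback callback are
# illustrative assumptions.

def tune_temporal_span(reports_improvement, start_span=50.0, step=5.0, min_span=5.0):
    """Start with a large temporal span (time between the first and last pulse,
    arbitrary units) and shrink it while the host keeps reporting improved
    perception; return the last span that still brought an improvement."""
    span = start_span
    while span - step >= min_span:
        if not reports_improvement(span - step):
            break                  # patient no longer reports an improvement
        span -= step               # accept the smaller span and keep going
    return span

# Example with a stand-in feedback function (assume perception stops improving
# once the span drops below 20 time units).
print(tune_temporal_span(lambda s: s >= 20.0))   # 20.0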

[0097] In yet another variant, the feedback signal 534 comprises one or more parameters related to the detector state (detection score, detector state adjustment, detection confidence) that are received by the encoder in order to determine if encoding and/or transmission channel adjustment is required.

[0098] In one implementation, the generation of the polychronous pulse patterns of FIGS. 3 through 4 is triggered by an internal event, e.g., an internal clock cycle. Alternatively, the trigger may comprise an external event, such as, for example, an appearance of a new input signal (e.g., new visual frame) or sudden change of the input signal (e.g., appearance of new object or feature).

Exemplary Uses and Applications of Certain Aspects of the Invention

[0099] Polychronous pulse-coded representations advantageously allow for a very high representation capacity in an object recognition apparatus, as compared with single-message-per-channel data transmission by individual channels. This is due to, inter alia, the use of pulse timing as an additional data encoding parameter, and the ability to multiplex different messages (corresponding to different object features and/or objects) into a single pulse group transmission. Such multiplexing is enabled, at least in part, by the polychronous properties of the pulse group, where the relative timing of each pulse set is uniquely determined with respect to a reference event, thus enabling unambiguous object decoding using, for example, synchronous detection as described above with respect to FIG. 3. Although not commonly accepted in the neuroscience field, the brain has the capability to learn to decode and correctly interpret such patterns, and the present invention takes advantage of such a capability.

[00100] Advantageously, the higher information density achieved by the polychronous encoding allows delivering richer (higher-resolution) signals using fewer stimulation electrodes, compared to presently available approaches.

[00101] It will be recognized that while certain aspects of the invention are described in terms of binary pixel image processing, these descriptions are only illustrative of the broader methods of the invention, and may be modified as required by the particular application. Therefore, polychronous encoding may be applied to the encoding of other visual characteristics such as color, transparency, contrast, luminance, or visual motion, as well as to non-visual sensory input signals (e.g., infrared signals or other electromagnetic signals outside the visible spectrum).

[00102] It will be further recognized that while certain aspects of the invention are described in terms of a retinal prosthetic device, principles of the invention are useful in a wide variety of other sensory prosthetic applications comprising delivery of sensory signals to the brain via a peripheral prosthetic device, such as auditory prosthetics (cochlear), devices using skin or tongue stimulation, or devices stimulating nervous tissue inside the brain using wireless connections.

[00103] Advantageously, exemplary embodiments of the present invention are useful in a variety of robotics and computer vision devices to efficiently transfer visual and other sensory information from peripheral sensors (e.g., the camera) to the information-processing unit (e.g., artificial nervous system), including without limitation prosthetic devices, autonomous and robotic apparatus, and other electromechanical devices requiring object recognition functionality. Examples of such robotic devices are manufacturing robots (e.g., automotive), military robots, and medical robots (e.g., for processing of microscopy, x-ray, ultrasonography, or tomography data). Examples of autonomous vehicles include rovers, unmanned air vehicles, underwater vehicles, smart appliances (e.g., ROOMBA®), etc.

[0104] The foregoing descriptions of the invention are intended to be illustrative, and not in any way limiting; those skilled in the art will appreciate that the invention can be practiced with various combinations of the functionalities and capabilities described above, and can include fewer or additional components than described above. Certain additional aspects and features of the invention are further set forth below, and can be obtained using the functionalities and components described in more detail above. These improvements advantageously translate into a system that requires fewer detectors and fewer processing units, compared to the prior art, and that allows taking advantage of the combinatorial richness of the pulse code and the brain's ability to decode and interpret it.

[0105] Embodiments of the present invention are further applicable to a wide assortment of applications including computer human interaction (e.g., recognition of gestures, voice, posture, face, etc.), controlling processes (e.g., an industrial robot, autonomous and other vehicles), augmented reality applications, organization of information (e.g., for indexing databases of images and image sequences), access control (e.g., opening a door based on a gesture, opening an access way based on detection of an authorized person), detecting events (e.g., for visual surveillance or people or animal counting, tracking), data input, financial transactions (payment processing based on recognition of a person or a special payment symbol) and many others.

[0106] Advantageously, the present invention can be used to simplify tasks related to motion estimation, such as where an image sequence is processed to produce an estimate of the object position (and hence velocity) either at each point in the image or in the 3D scene, or even of the camera that produces the images. Examples of such tasks are: ego motion, i.e., determining the three-dimensional rigid motion (rotation and translation) of the camera from an image sequence produced by the camera; and following the movements of a set of interest points or objects (e.g., vehicles or humans) in the image sequence and with respect to the image plane.

[0107] In another approach, portions of the object recognition system are embodied in a remote server configured to perform pattern recognition in data streams for various applications, such as scientific, geophysical exploration, surveillance, navigation, data mining (e.g., content-based image retrieval). Myriad other applications exist that will be recognized by those of ordinary skill given the present disclosure.

[0108] It will be recognized that while certain aspects of the invention are described in terms of a specific sequence of steps of a method, these descriptions are only illustrative of the broader methods of the invention, and may be modified as required by the particular application. Certain steps may be rendered unnecessary or optional under certain circumstances. Additionally, certain steps or functionality may be added to the disclosed embodiments, or the order of performance of two or more steps permuted. All such variations are considered to be encompassed within the invention disclosed and claimed herein.

[00109] While the above detailed description has shown, described, and pointed out novel features of the invention as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or process illustrated may be made by those skilled in the art without departing from the invention. The foregoing description is of the best mode presently contemplated of carrying out the invention. This description is in no way meant to be limiting, but rather should be taken as illustrative of the general principles of the invention. The scope of the invention should be determined with reference to the claims.
