WO2019005051A1 - Camera communications system using high speed camera sensors - Google Patents

Camera communications system using high speed camera sensors Download PDF

Info

Publication number
WO2019005051A1
WO2019005051A1 (PCT/US2017/039852)
Authority
WO
WIPO (PCT)
Prior art keywords
pixel
light
change
module
sensing device
Prior art date
Application number
PCT/US2017/039852
Other languages
French (fr)
Inventor
Javier Perez-Ramirez
Richard D. Roberts
Abdullah SEVINCER
Original Assignee
Intel Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corporation filed Critical Intel Corporation
Priority to PCT/US2017/039852 priority Critical patent/WO2019005051A1/en
Publication of WO2019005051A1 publication Critical patent/WO2019005051A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • H04B10/116Visible light communication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/60Receivers
    • H04B10/66Non-coherent receivers, e.g. using direct detection
    • H04B10/67Optical arrangements in the receiver

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Optical Communication System (AREA)
  • Studio Devices (AREA)

Abstract

Camera Communication (CamCom) systems may be used to communicate data using visible light as the medium, and in which the camera is the receiver. In various embodiments, high speed communications may be achieved with a high frame rate digital camera, but without using region of interest (ROI) subsampling to reduce the number of pixels being considered. In particular, the camera may be a dynamic vision sensor (DVS) camera in which each pixel sensor detects a change in light level (such as a pulse) that exceeds a threshold value, and generates a 'pixel event' in response that indicates various information about the detection.

Description

CAMERA COMMUNICATIONS SYSTEM USING HIGH SPEED
CAMERA SENSORS
TECHNICAL FIELD OF THE INVENTION
[0001] In general, the technical field of various embodiments of the invention involves using visible light as a communications medium, and in particular using a digital camera as the receiver in such communications.
BACKGROUND
[0002] Using visible light as a communications medium is known, and in fact using visible light to communicate through optical fibers is a mature technology. More recently, however, using visible light to communicate through a non-solid medium has received more attention. In particular, an Institute of Electrical and Electronics Engineers (IEEE) standard, 802.15.7, is being developed to promote industry standardization for the PHY and MAC layers, and the Abstract of revision 2 of that document mentions the possibility of using a digital camera as the receiver in such a system. However, one issue with using a camera is that the frame rate of the transmitter may not be synchronized with the frame rate of the camera. Another issue is that the light from the transmitter may be spread across an unknown number of pixels, requiring identification of those pixels and coordination of their processing.
[0003] In general, to avoid undesirable flicker effects, communication with visible light may be performed using modulation at a rate too rapid for the human eye to detect. The human eye is generally agreed to have a flicker cutoff frequency of about 100 hertz, so a modulation rate of 200 hertz is generally considered invisible to humans. In various embodiments, different forms of pulse modulation may be used, such as but not limited to pulse width modulation, pulse position modulation, and pulse amplitude modulation. Although each may use a different method of encoding data, they all have in common that the brightness perceived by a human viewer (assuming constant pulse amplitude) may be the effective average over time of the on-off states, and that flicker caused by the pulses will be invisible to the human eye if the modulation rate (pulses per second) is sufficiently high. LED light sources are well suited to such rapid switching between light levels, but other light sources may be used as well.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Some embodiments of the invention may be better understood by referring to the following description and accompanying drawings that are used to illustrate embodiments of the invention. In the drawings:
[0005] Fig. 1 shows modulated light sources, according to an embodiment of the invention.
[0006] Figs. 2A and 2B show an example of how data may be encoded by a light source using pulse modulation, according to an embodiment of the invention.
[0007] Fig. 3 shows processing modules for a pulsed-light digital receiver for a high speed sensor communications system, according to an embodiment of the invention.
[0008] Fig. 4 shows a pixel cluster, according to an embodiment of the invention.
DETAILED DESCRIPTION
[0009] In the following description, numerous specific details are set forth. However, it is understood that embodiments of the invention may be practiced without these specific details. In other instances, well-known circuits, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
[0010] References to "one embodiment", "an embodiment", "example embodiment",
"various embodiments", etc., indicate that the embodiment(s) of the invention so described may include particular features, structures, or characteristics, but not every embodiment necessarily includes the particular features, structures, or characteristics. Further, some embodiments may have some, all, or none of the features described for other embodiments.
[0011] In the following description and claims, the terms "coupled" and "connected," along with their derivatives, may be used. It should be understood that these terms are not intended as synonyms for each other. Rather, in particular embodiments, "connected" is used to indicate that two or more elements are in direct physical or electrical contact with each other. "Coupled" is used to indicate that two or more elements co-operate or interact with each other, but they may or may not have intervening physical or electrical components between them.
[0012] As used in the claims, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common element merely indicates that different instances of like elements are being referred to, and is not intended to imply that the elements so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
[0013] As used in this document, the term 'camera' may be used to describe a device that senses modulated light coming through a lens with an array of light-sensitive pixel receptors. The device might or might not be able to take standard pictures or video, but such standard picture/video functionality is not described herein.
[0014] As used herein, the term dynamic vision sensor (DVS) camera may be used to describe a camera with an image array that is designed to detect, at a pixel level, modulated light that exceeds a threshold value and report when such modulation occurs.
[0015] Various embodiments of the invention may be implemented fully or partially in software and/or firmware. This software and/or firmware may take the form of instructions contained in or on a non-transitory computer-readable storage medium, with the intention that the instructions may eventually be read and executed by one or more processors to enable performance of the operations described herein, even if the instructions are initially external to the processors. The instructions may be in any suitable form, such as but not limited to source code, compiled code, interpreted code, executable code, static code, dynamic code, and the like. Such a computer-readable medium may include any tangible non-transitory medium for storing information in a form readable by one or more computers, such as but not limited to read only memory (ROM), random access memory (RAM), magnetic disk storage media, optical storage media, flash memory, etc.
[0016] Fig. 1 shows modulated light sources, according to an embodiment of the invention. In this example, the light sources may be LED lights capable of pulsing their light output at a rate too fast for the human eye to detect, so to a human viewer they appear the same as standard light bulbs. The transmitters may be light sources mounted in the ceiling of a room; three of the seven ceiling lights are indicated as modulated light sources. Other types and quantities of light sources may also be used.
[0017] The left-hand portion of Fig. 1 shows an image of a part of the room that includes the ceiling lights, as that image might be seen by a camera or other pulsed-light reader. The right-hand portion of Fig. 1 shows where the modulated light from those three light sources would strike the pixel matrix in the image sensor of the camera, indicated as C1, C2, and C3. Light sources C1, C2, and C3 may each be spread across multiple pixels of the image sensor. For simplicity, the embodiments in this document are described in terms of light received from a single light source.
[0018] Figs. 2A and 2B show an example of how data may be encoded by a light source using pulse modulation, according to an embodiment of the invention. This particular embodiment uses pulse position modulation, though other types of pulse modulation might be used in different embodiments.
[0019] Pulse Position Modulation (PPM) - This is a technique for encoding 1's and 0's by the time placement of pulses within designated bit times. For example, Fig. 2A shows a modulation scheme in which a logic 0 is indicated by placing a pulse at the beginning of a bit time, and a logic 1 is indicated by placing a pulse in the middle of a bit time, effectively dividing the bit time into two portions. Alternately, the opposite polarity could also be used.
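As an illustration of this encoding, the following Python fragment is a minimal sketch that builds such a pulse train, placing a pulse in the first half of a bit time for a 0 and in the second half for a 1. The sample-based representation, function name, and parameters are illustrative assumptions, not taken from the patent.

```python
# Minimal sketch of the 2-PPM scheme of Fig. 2A: a pulse in the first half of a
# bit time encodes 0, a pulse in the second half encodes 1. Names and the
# sample-based representation are illustrative assumptions only.

def ppm_encode(bits, samples_per_bit=8, pulse_width=2):
    """Return a list of 0/1 light levels, samples_per_bit samples per bit."""
    signal = []
    half = samples_per_bit // 2
    for bit in bits:
        slot = [0] * samples_per_bit
        start = half if bit else 0          # logic 1 -> pulse in middle of the bit time
        for i in range(start, start + pulse_width):
            slot[i] = 1                     # light on during the pulse
        signal.extend(slot)
    return signal

if __name__ == "__main__":
    print(ppm_encode([0, 1, 1, 0], samples_per_bit=8, pulse_width=2))
```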
[0020] Start Frame Delimiter (SFD) - To place each pulse within a bit time, it may be necessary to synchronize the receiver to the data from the transmitter, so the receiver will know when each bit time begins and ends. This may be indicated by starting with a pattern of pulses that will not be found within a normal data stream (i.e., an 'illegal' pulse pattern), plus some indication of how to measure subsequent bit times from that pattern. This in turn may require knowing how long each bit time lasts (which may be pre-defined, or alternately, derived from the SFD), and may also require some sort of timing marker to delineate a starting point for subsequent bit time boundaries.
[0021] In the example of Fig. 2B, three pulses half a bit time apart may be transmitted. Based on the PPM of Fig. 2A, this clearly represents an illegal combination of pulses, since under that definition a maximum of two pulses can legally be placed within the time duration occupied by a single bit time (i.e., a '1' followed by a '0'), regardless of when the bit time boundaries occur. Any legal 3-bit combination would place at least one of those pulses a full bit time away from the nearest other pulse.
[0022] The third of these three pulses may then be followed immediately by 1-1/2 bit times without any pulses. This is also an illegal combination, since both 0's and 1's require a pulse somewhere within a single bit time. The particular pattern of Fig. 2B may be used to 1) indicate an SFD, and 2) indicate the leading edge of the following bit times, with the leading edge of the middle pulse indicating the leading edge of a bit time and the third pulse indicating the middle of a bit time. Accordingly, the leading edge of all subsequent bit times may then be measured from the leading edge of the second pulse in this SFD.
[0023] Even if the duration of bit times is not predefined, in this embodiment the duration of a bit time may be determined as double the spacing between consecutive pulses in an SFD, or alternately, equal to the spacing between the first and third pulses of the SFD. An additional advantage of the illustrated SFD is that it contains three pulses within three bit times, which is the same average distribution of pulses that is found in the subsequent data. This in turn may result in a steady level of illumination over time, as that illumination is perceived by the human eye.
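To make the timing recovery concrete, the sketch below derives the bit time and the first bit boundary from the leading-edge times of the three SFD pulses, following the relationships described above. The function name, units, and the assumption that the three edge times are already available are illustrative only.

```python
# Sketch: recover the bit-time duration and the first bit boundary from the
# three SFD pulses of Fig. 2B. Assumes the leading-edge times of the pulses
# are already known; names and units are illustrative, not from the patent.

def sfd_timing(pulse_edges):
    """pulse_edges: leading-edge times of the three SFD pulses, in seconds."""
    t1, t2, t3 = pulse_edges
    bit_time = 2.0 * (t2 - t1)      # double the spacing of consecutive SFD pulses
    # equivalently: bit_time = t3 - t1 (spacing between first and third pulses)
    first_bit_start = t2            # bit boundaries are measured from the 2nd pulse
    return bit_time, first_bit_start

if __name__ == "__main__":
    # Example: pulses 0.5 ms apart -> 1 ms bit time, boundaries start at 0.5 ms
    print(sfd_timing([0.0, 0.0005, 0.001]))
```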
[0024] In some embodiments, the image array in a digital camera may contain a dynamic vision sensor (DVS), which may also be referred to as a 'smart sensor array'. Each pixel receptor in a smart sensor array may be referred to as a 'smart pixel receptor'. Unlike a standard camera sensor array, in which pixel receptors in the array are globally triggered at a constant rate, each smart pixel receptor in a DVS may work as an independent entity, capturing light in an asynchronous manner without regard to what the other pixel receptors are doing. A smart pixel receptor may react only to changes in the level of illumination it receives, generating a 'pixel event' when the change in illumination exceeds a particular threshold. Each pixel event may generate a data packet containing information indicating the polarity of the event (positive or negative change in perceived light illumination), the pixel address, and a time stamp indicating when the pixel event happened. In some embodiments, the data packet may also indicate the amount of change in illumination that was detected.
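The pixel-event packet described above can be pictured as a small record such as the following sketch; the field names and types are illustrative assumptions, not the patent's (or any particular DVS vendor's) format.

```python
# Illustrative sketch of the pixel-event packet described above; field names
# and types are assumptions, not a specification from the patent.
from dataclasses import dataclass
from typing import Optional

@dataclass
class PixelEvent:
    x: int                              # pixel address (column)
    y: int                              # pixel address (row)
    polarity: int                       # +1 for an increase in light, -1 for a decrease
    timestamp_us: int                   # time stamp of the event, e.g. in microseconds
    magnitude: Optional[float] = None   # optional amount of change, if reported
```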
[0025] In a DVS, since the illumination of most pixels may remain constant over a particular time period, only data packets from a relatively small number of pixels may need to be considered during that time period, thus reducing the complexity of computation considerably. The effect of this may be to significantly increase the effective frame rate of the camera.
[0026] DVS cameras may have excellent temporal resolution (for example, 10 microseconds). Thus, operation of a camera that only examines the output of a small number of pixels may be equivalent to a normal camera that examines the output of all pixels in the array at tens of thousands of frames per second. In addition, each smart pixel may have a large dynamic range, allowing a large illumination range in the scene (such as, for example, 120 dB).
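For a rough sense of the scale implied by these example figures (a simple restatement of the numbers above, not additional data from the patent): a temporal resolution of 10 microseconds corresponds to 1 / (10 µs) = 100,000 possible event reports per second for a single pixel, which is why examining only a small set of active pixels can be comparable to a conventional camera running at tens of thousands of frames per second. Likewise, under the common 20·log10 convention for image-sensor dynamic range, 120 dB corresponds to an illumination ratio of roughly 10^(120/20) = 10^6 between the brightest and dimmest levels a pixel can handle.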
[0027] Fig. 3 shows processing modules for a pulsed-light digital receiver for a high speed sensor communications system 300, according to an embodiment of the invention. In the described DVS embodiment, each module may perform the operations described below.
[0028] Async/Sync module 310 may receive the pixel events asynchronously and produce synchronous frames which are then fed to the Modulated Light Detector module.
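One plausible way to picture this conversion is to bin the asynchronously timestamped events into fixed-duration synchronous frames; the patent does not specify the frame period or the event layout, so everything in this sketch is an assumption.

```python
# Sketch of an Async/Sync stage: bin asynchronously timestamped pixel events
# into fixed-duration synchronous frames. Frame period and event format are
# illustrative assumptions; the patent does not specify them.
from collections import defaultdict

def events_to_frames(events, frame_us=100):
    """events: iterable of (timestamp_us, x, y, polarity) tuples.
    Returns {frame_index: [events in that frame]}, ordered by frame index."""
    frames = defaultdict(list)
    for ts, x, y, pol in events:
        frames[ts // frame_us].append((ts, x, y, pol))
    return dict(sorted(frames.items()))

if __name__ == "__main__":
    evts = [(5, 10, 12, +1), (130, 10, 12, -1), (140, 11, 12, -1)]
    print(events_to_frames(evts, frame_us=100))   # {0: [...], 1: [...]}
```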
[0029] The Modulated Light Detector 320 may find a cluster of pixels that could potentially be associated with a single modulated light source, such as, for example, one of the ceiling lamps of Fig. 1. It may be assumed that each pixel in a cluster is receiving the same pulse train at the same time from that light source, such as, for example, light source C3 in Fig. 1. After processing, the pixel cluster may be passed on to the Pixel Combining module 330.
[0030] Fig. 4 shows a pixel cluster, according to an embodiment of the invention. In this example, modulated light from light source C3 may strike the darkened pixels of Fig. 4. Unmodulated light (for example, from the area around light source C3) may also strike surrounding pixels, but may have previously been eliminated from consideration by the Modulated Light Detector. In some embodiments, light that is modulated at the wrong frequency may also have been eliminated (for example, but not limited to, LED light pulses that flicker at a 120 Hz rate with a variable duty cycle to permit dimming).
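The patent does not describe how the Modulated Light Detector groups pixels into a cluster, but one plausible illustration is a simple 4-connected flood fill over the pixels already flagged as modulated; the sketch below uses that substitute technique only to make the idea of a pixel cluster concrete.

```python
# Sketch of one plausible way to group modulated pixels into clusters
# (4-connected flood fill). The patent does not describe the Modulated Light
# Detector's algorithm; this is an illustrative assumption only.

def cluster_pixels(active):
    """active: set of (x, y) pixels flagged as modulated. Returns a list of sets."""
    remaining, clusters = set(active), []
    while remaining:
        seed = remaining.pop()
        cluster, frontier = {seed}, [seed]
        while frontier:
            x, y = frontier.pop()
            for nb in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nb in remaining:
                    remaining.remove(nb)
                    cluster.add(nb)
                    frontier.append(nb)
        clusters.append(cluster)
    return clusters

if __name__ == "__main__":
    print(cluster_pixels({(0, 0), (0, 1), (5, 5)}))   # two clusters
```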
[0031] Returning to Fig. 3, the Pixel Combining module 330 may process each pixel in a pixel cluster to generate a waveform representing modulated light for the entire light source. Various approaches may be used for the pixel combining function, such as but not limited to Equal Gain Combining (EGC) or Maximal Ratio Combining (MRC). In one embodiment, the output of the Pixel Combining module may be a waveform with a positive spike in the signal at the front edge of each pulse in the original pulse train, and a negative spike in the signal at the trailing edge of each pulse. Other embodiments may use other techniques, such as but not limited to using the reverse polarity for these spikes. The waveforms produced by the Pixel Combining module may be passed on to Alignment module 340.
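For the Equal Gain Combining option named above, the per-pixel waveforms of a cluster are summed with equal weight; the sketch below assumes each pixel contributes an equal-length list of samples, which is an illustrative representation rather than the patent's data format.

```python
# Sketch of Equal Gain Combining over a pixel cluster: sum the per-pixel
# waveforms with equal weight. The sample-list representation is an
# illustrative assumption.

def equal_gain_combine(pixel_waveforms):
    """pixel_waveforms: list of equal-length sample lists, one per pixel."""
    n = len(pixel_waveforms[0])
    return [sum(w[i] for w in pixel_waveforms) for i in range(n)]

if __name__ == "__main__":
    cluster = [[0, 1, 0, -1], [0, 1, 0, -1], [0, 0, 0, -1]]
    print(equal_gain_combine(cluster))   # [0, 2, 0, -3]
```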
[0032] In order for the Alignment module 340 to extract the data bits from the waveform, the SFD may be used to indicate the beginning of a data packet. In one embodiment a matched filter may be used to detect each SFD. The aligned waveforms produced by the Alignment module may include not only an indication of the SFDs, but of all the other pulses in the payload following each SFD.
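A matched filter for the SFD can be pictured as a sliding correlation of the combined waveform against a template of the SFD's spike pattern; in the sketch below the template, sample spacing, and threshold are illustrative assumptions rather than values from the patent.

```python
# Sketch of matched-filter SFD detection: correlate the combined waveform with
# an SFD template and report indices where the correlation exceeds a threshold.
# Template, sample spacing, and threshold are illustrative assumptions.

def correlate(signal, template):
    n = len(template)
    return [sum(s * t for s, t in zip(signal[i:i + n], template))
            for i in range(len(signal) - n + 1)]

def find_sfds(signal, template, threshold):
    scores = correlate(signal, template)
    return [i for i, c in enumerate(scores) if c >= threshold]

if __name__ == "__main__":
    # +1/-1 spikes at the leading/trailing edges of three short pulses
    template = [1, -1, 0, 1, -1, 0, 1, -1]
    signal = [0, 0] + template + [0, 0, 0, 0]
    print(find_sfds(signal, template, threshold=5))   # [2]
```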
[0033] The waveforms being output from the Alignment module may be sent to the Subsampling module. Subsampling module 350 may subsample the waveform extracted from the Alignment module, and pass on the results to the Demodulation module to demodulate the encoded bits.
[0034] In the Demodulation module, samples may be processed to extract the transmitted encoded bits from symbols. In one embodiment, by selecting consecutive pairs of samples corresponding to a symbol and subtracting them, a decision may be made based on the sign of the resulting number. In one embodiment, a positive outcome denotes a logical '0', while a negative outcome denotes a logical '1'. Other embodiments may use the opposite polarity. Regardless of the technique used, the output of the Demodulation module may be the bit train that was originally encoded by the transmitter in the light source.
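A minimal sketch of that pair-wise decision rule follows, assuming two samples per symbol taken in the first and second halves of each bit time; the sampling positions and names are assumptions.

```python
# Sketch of the demodulation decision described above: for each symbol, take a
# pair of samples, subtract them, and decide the bit from the sign. Exactly two
# samples per symbol is an illustrative assumption.

def demodulate(sample_pairs):
    """sample_pairs: list of (first_half_sample, second_half_sample) per symbol.
    Positive difference -> logical 0, negative difference -> logical 1."""
    bits = []
    for a, b in sample_pairs:
        bits.append(0 if (a - b) > 0 else 1)
    return bits

if __name__ == "__main__":
    # A pulse in the first half gives a larger first sample -> 0, and vice versa
    print(demodulate([(1.0, 0.1), (0.1, 1.0), (0.9, 0.0)]))   # [0, 1, 0]
```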
[0035] Not shown in Fig. 3 is an optical lens to focus the incoming light onto the array of smart pixel receptors.
[0036] As the number of modulated pixels within a pixel cluster increases, the ability of the system to successfully decode the modulated pulse train and resulting bits may also increase. Various techniques may be used to achieve this result. In one embodiment, a larger light source may be used. In another embodiment, the distance between the light source and camera may be decreased. In another embodiment, the image may be defocused so that it effectively strikes more pixels.
EXAMPLES
[0037] The following examples pertain to particular embodiments:
[0038] Example 1 shows a light sensing device, comprising: an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, each smart pixel receptor configured to detect a change in intensity of light exceeding a threshold value and configured to generate a pixel event reporting the change; a first module to convert asynchronously received pixel events from multiple smart pixel receptors to a synchronous frame signal; a second module to convert the synchronous frame signal representing multiple adjacent pixels to a pixel cluster signal; a third module to convert the pixel cluster signal from the second module into a waveform representing the multiple pixels in the pixel cluster; a fourth module to find start frame delimiters (SFDs) in the waveform; a fifth module to sample the output of the fourth module; and a sixth module to demodulate the sampled output and produce a series of bits.
[0039] Example 2 shows the light sensing device of example 1, wherein the change in intensity of light is to result from leading and trailing edges of light pulses.
[0040] Example 3 shows the light sensing device of example 1, wherein the light pulses are to be configured as pulse position modulation.
[0041] Example 4 shows the light sensing device of example 1, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
[0042] Example 5 shows the light sensing device of example 4, wherein the pixel event further comprises an indication of an amount of the change in illumination.
[0043] Example 6 shows the light sensing device of example 1, further comprising an optical lens to focus the light on the array of smart pixel receptors.
[0044] Example 7 shows a method of sensing modulated light, comprising: detecting, by each of multiple smart pixel receptors in an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, a change in intensity of light exceeding a threshold value and generating a pixel event reporting the change; converting asynchronously received changes in the intensity from multiple smart pixel receptors into a synchronous frame signal; converting the synchronous frame signal representing multiple adjacent pixels into a pixel cluster signal; converting the pixel cluster signal into a waveform representing the multiple pixels in the pixel cluster; finding start frame delimiters (SFDs) in the waveform; sampling an output of signals following the SFDs; and demodulating the sampled output and producing a series of bits from the demodulated sampled output.
[0045] Example 8 shows the method of example 7, wherein the change in intensity of light results from leading and trailing edges of light pulses.
[0046] Example 9 shows the method of example 7, wherein the light pulses are configured as pulse position modulation.
[0047] Example 10 shows the method of example 7, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
[0048] Example 11 shows the method of example 10, wherein the pixel event further comprises an indication of an amount of the change in illumination.
[0049] Example 12 shows a computer-readable non-transitory storage medium that contains instructions, which when executed by one or more processors result in performing operations comprising: receiving multiple pixel events, each pixel event comprising information indicating a change in intensity of light detected by a different smart pixel receptor in an array of smart pixel receptors in a dynamic vision sensor (DVS) camera; converting the received changes in the intensity from multiple smart pixel receptors into a synchronous frame signal; converting the synchronous frame signal representing multiple adjacent pixels into a pixel cluster signal; converting the pixel cluster signal into a waveform representing the multiple pixels in the pixel cluster; finding start frame delimiters (SFDs) in the waveform; sampling an output of signals following the SFDs; and demodulating the sampled output and producing a series of bits from the demodulated sampled output.
[0050] Example 13 shows the medium of example 12, wherein the change in intensity of light results from leading and trailing edges of light pulses.
[0051] Example 14 shows the medium of example 12, wherein the light pulses are configured as pulse position modulation.
[0052] Example 15 shows the medium of example 12, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
[0053] Example 16 shows the medium of example 15, wherein the pixel event further comprises an indication of an amount of the change in illumination.
[0054] Example 17 shows a light sensing device, comprising: an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, each smart pixel receptor having means to detect a change in intensity of light exceeding a threshold value and means to generate a pixel event reporting the change; a first module having means to convert asynchronously received changes in the intensity from multiple smart pixel receptors to a synchronous frame signal; a second module having means to convert the synchronous frame signal representing multiple adjacent pixels to a pixel cluster signal; a third module having means to convert signals from the second module into a waveform representing the multiple pixels in the pixel cluster; a fourth module having means to find start frame delimiters (SFDs) in the waveform; a fifth module having means to sample the output of the fourth module; and a sixth module to demodulate the sampled output and produce a series of bits.
[0055] Example 18 shows the light sensing device of example 17, wherein the change in intensity of light is to result from leading and trailing edges of light pulses.
[0056] Example 19 shows the light sensing device of example 17, wherein the light pulses are to be configured as pulse position modulation.
[0057] Example 20 shows the light sensing device of example 17, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
[0058] Example 21 shows the light sensing device of example 20, wherein the pixel event further comprises an indication of an amount of the change in illumination.
[0059] Example 22 shows the light sensing device of example 17, further comprising a means to focus light on the array of smart pixel receptors.
[0060] The foregoing description is intended to be illustrative and not limiting.
Variations will occur to those of skill in the art. Those variations are intended to be included in the various embodiments of the invention, which are limited only by the scope of the following claims.

Claims

1. A light sensing device, comprising:
an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, each smart pixel receptor configured to detect a change in intensity of light exceeding a threshold value and configured to generate a pixel event reporting the change;
a first module to convert asynchronously received pixel events from multiple smart pixel receptors to a synchronous frame signal;
a second module to convert the synchronous frame signal representing multiple adjacent pixels to a pixel cluster signal;
a third module to convert the pixel cluster signal from the second module into a waveform representing the multiple pixels in the pixel cluster;
a fourth module to find start frame delimiters (SFDs) in the waveform;
a fifth module to sample the output of the fourth module; and
a sixth module to demodulate the sampled output and produce a series of bits.
2. The light sensing device of claim 1, wherein the change in intensity of light is to result from leading and trailing edges of light pulses.
3. The light sensing device of claim 1, wherein the light pulses are to be configured as pulse position modulation.
4. The light sensing device of claim 1, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
5. The light sensing device of claim 4, wherein the pixel event further comprises an indication of an amount of the change in illumination.
6. The light sensing device of claim 1, further comprising an optical lens to focus the light on the array of smart pixel receptors.
7. A method of sensing modulated light, comprising: detecting, by each of multiple smart pixel receptors in an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, a change in intensity of light exceeding a threshold value and generating a pixel event reporting the change;
converting asynchronously received changes in the intensity from multiple smart pixel receptors into a synchronous frame signal;
converting the synchronous frame signal representing multiple adjacent pixels into a pixel cluster signal;
converting the pixel cluster signal into a waveform representing the multiple pixels in the pixel cluster;
finding start frame delimiters (SFDs) in the waveform;
sampling an output of signals following the SFDs; and
demodulating the sampled output and producing a series of bits from the demodulated sampled output.
8. The method of claim 7, wherein the change in intensity of light results from leading and trailing edges of light pulses.
9. The method of claim 7, wherein the light pulses are configured as pulse position modulation.
10. The method of claim 7, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
11. The method of claim 10, wherein the pixel event further comprises an indication of an amount of the change in illumination.
12. A computer-readable non-transitory storage medium that contains instructions, which when executed by one or more processors result in performing operations comprising:
receiving multiple pixel events, each pixel event comprising information indicating a change in intensity of light detected by a different smart pixel receptor in an array of smart pixel receptors in a dynamic vision sensor (DVS) camera;
converting the received changes in the intensity from multiple smart pixel receptors into a synchronous frame signal;
converting the synchronous frame signal representing multiple adjacent pixels into a pixel cluster signal;
converting the pixel cluster signal into a waveform representing the multiple pixels in the pixel cluster;
finding start frame delimiters (SFDs) in the waveform;
sampling an output of signals following the SFDs; and
demodulating the sampled output and producing a series of bits from the demodulated sampled output.
13. The medium of claim 12, wherein the change in intensity of light results from leading and trailing edges of light pulses.
14. The medium of claim 12, wherein the light pulses are configured as pulse position modulation.
15. The medium of claim 12, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
16. The medium of claim 15, wherein the pixel event further comprises an indication of an amount of the change in illumination.
17. A light sensing device, comprising:
an array of smart pixel receptors in a dynamic vision sensor (DVS) camera, each smart pixel receptor having means to detect a change in intensity of light exceeding a threshold value and means to generate a pixel event reporting the change;
a first module having means to convert asynchronously received changes in the intensity from multiple smart pixel receptors to a synchronous frame signal;
a second module having means to convert the synchronous frame signal representing multiple adjacent pixels to a pixel cluster signal;
a third module having means to convert signals from the second module into a waveform representing the multiple pixels in the pixel cluster;
a fourth module having means to find start frame delimiters (SFDs) in the waveform;
a fifth module having means to sample the output of the fourth module; and
a sixth module to demodulate the sampled output and produce a series of bits.
18. The light sensing device of claim 17, wherein the change in intensity of light is to result from leading and trailing edges of light pulses.
19. The light sensing device of claim 17, wherein the light pulses are to be configured as pulse position modulation.
20. The light sensing device of claim 17, wherein the pixel event comprises information indicating a polarity of the change in illumination, information indicating the pixel address, and a time stamp indicating when the pixel event happened.
21. The light sensing device of claim 20, wherein the pixel event further comprises an indication of an amount of the change in illumination.
22. The light sensing device of claim 17, further comprising a means to focus light on the array of smart pixel receptors.
PCT/US2017/039852 2017-06-29 2017-06-29 Camera communications system using high speed camera sensors WO2019005051A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2017/039852 WO2019005051A1 (en) 2017-06-29 2017-06-29 Camera communications system using high speed camera sensors

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2017/039852 WO2019005051A1 (en) 2017-06-29 2017-06-29 Camera communications system using high speed camera sensors

Publications (1)

Publication Number Publication Date
WO2019005051A1 true WO2019005051A1 (en) 2019-01-03

Family

ID=64742587

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/039852 WO2019005051A1 (en) 2017-06-29 2017-06-29 Camera communications system using high speed camera sensors

Country Status (1)

Country Link
WO (1) WO2019005051A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110177200A (en) * 2019-06-28 2019-08-27 Oppo广东移动通信有限公司 Camera module, electronic equipment and image shooting method
CN112019835A (en) * 2020-08-08 2020-12-01 欧菲微电子技术有限公司 Frame rate verification device and method for dynamic vision sensor module and storage medium
CN113408671A (en) * 2021-08-18 2021-09-17 成都时识科技有限公司 Object identification method and device, chip and electronic equipment
CN113424516A (en) * 2019-02-11 2021-09-21 普罗费塞公司 Method of processing a series of events received asynchronously from a pixel array of an event-based photosensor
EP3930218A1 (en) 2020-06-23 2021-12-29 IMRA Europe S.A.S. Vlc in streets
EP3930220A1 (en) 2020-06-23 2021-12-29 IMRA Europe S.A.S. Vlc in factories
CN114777764A (en) * 2022-04-20 2022-07-22 中国科学院光电技术研究所 High-dynamic star sensor star point extraction method based on event camera

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070058987A1 (en) * 2005-09-13 2007-03-15 Kabushiki Kaisha Toshiba Visible light communication system and method therefor
US20100054748A1 (en) * 2007-03-13 2010-03-04 Yoshiyuki Sato Receiver and system for visible light communication
US20110229130A1 (en) * 2008-11-25 2011-09-22 Atsuya Yokoi Visible ray communication system, transmission apparatus, and signal transmission method
US20140093234A1 (en) * 2012-09-28 2014-04-03 Richard D. Roberts Methods and apparatus for multiphase sampling of modulated light
US8750719B2 (en) * 2010-05-14 2014-06-10 Taiyo Yuden Co., Ltd. Visible light communication receiver, visible light communication system, and visible light communication method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070058987A1 (en) * 2005-09-13 2007-03-15 Kabushiki Kaisha Toshiba Visible light communication system and method therefor
US20100054748A1 (en) * 2007-03-13 2010-03-04 Yoshiyuki Sato Receiver and system for visible light communication
US20110229130A1 (en) * 2008-11-25 2011-09-22 Atsuya Yokoi Visible ray communication system, transmission apparatus, and signal transmission method
US8750719B2 (en) * 2010-05-14 2014-06-10 Taiyo Yuden Co., Ltd. Visible light communication receiver, visible light communication system, and visible light communication method
US20140093234A1 (en) * 2012-09-28 2014-04-03 Richard D. Roberts Methods and apparatus for multiphase sampling of modulated light

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113424516A (en) * 2019-02-11 2021-09-21 普罗费塞公司 Method of processing a series of events received asynchronously from a pixel array of an event-based photosensor
CN110177200A (en) * 2019-06-28 2019-08-27 Oppo广东移动通信有限公司 Camera module, electronic equipment and image shooting method
CN110177200B (en) * 2019-06-28 2020-11-27 Oppo广东移动通信有限公司 Camera module, electronic equipment and image shooting method
EP3930218A1 (en) 2020-06-23 2021-12-29 IMRA Europe S.A.S. Vlc in streets
EP3930220A1 (en) 2020-06-23 2021-12-29 IMRA Europe S.A.S. Vlc in factories
US11496215B2 (en) 2020-06-23 2022-11-08 Aisin Corporation VLC in factories
US11722892B2 (en) 2020-06-23 2023-08-08 Aisin Corporation VLC in streets
CN112019835A (en) * 2020-08-08 2020-12-01 欧菲微电子技术有限公司 Frame rate verification device and method for dynamic vision sensor module and storage medium
CN113408671A (en) * 2021-08-18 2021-09-17 成都时识科技有限公司 Object identification method and device, chip and electronic equipment
CN113408671B (en) * 2021-08-18 2021-11-16 成都时识科技有限公司 Object identification method and device, chip and electronic equipment
CN114777764A (en) * 2022-04-20 2022-07-22 中国科学院光电技术研究所 High-dynamic star sensor star point extraction method based on event camera
CN114777764B (en) * 2022-04-20 2023-06-30 中国科学院光电技术研究所 High-dynamic star sensor star point extraction method based on event camera

Similar Documents

Publication Publication Date Title
WO2019005051A1 (en) Camera communications system using high speed camera sensors
Luo et al. Undersampled-based modulation schemes for optical camera communications
EP2974080B1 (en) Method and apparatus of decoding low-rate visible light communication signals
Aoyama et al. Visible light communication using a conventional image sensor
US11303804B2 (en) Method and apparatus of processing a signal from an event-based sensor
Liu et al. Foundational analysis of spatial optical wireless communication utilizing image sensor
Ji et al. Vehicular visible light communications with LED taillight and rolling shutter camera
Ghassemlooy et al. Optical camera communications
RU2557802C2 (en) Method of detecting data to be transmitted in visible light using sensor of standard camera
CN108292958B (en) Optical wireless communication technology
Nagura et al. Tracking an LED array transmitter for visible light communications in the driving situation
US9667865B2 (en) Optical demodulation using an image sensor
RU2682427C2 (en) Coded light
WO2022237591A1 (en) Moving object identification method and apparatus, electronic device, and readable storage medium
CN103430626A (en) Light detection system and method
US10070067B2 (en) Systems, methods, and media for extracting information and a display image from two captured images
EP3753379A1 (en) Devices and methods for the transmission and reception of coded light
Schmid et al. Using smartphones as continuous receivers in a visible light communication system
KR20170084709A (en) Image sensor communication system based on dimmable M-PSK
Liu et al. Undersampled differential phase shift on-off keying for optical camera communications
Marcu et al. Flicker free VLC system with automatic code resynchronization using low frame rate camera
Wang et al. Demonstration of a covert camera-screen communication system
Teli et al. Selective capture based high-speed optical vehicular signaling system
Marcu et al. Flicker free VLC system with enhanced transmitter and low frame rate camera
CN108076295B (en) Machine vision communication system based on multi-dimensional code

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17915442

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17915442

Country of ref document: EP

Kind code of ref document: A1