WO2022254789A1 - Receiving device and transmission/reception system - Google Patents


Info

Publication number
WO2022254789A1
Authority
WO
WIPO (PCT)
Prior art keywords
event
optical signal
transmission
light
receiving device
Prior art date
Application number
PCT/JP2022/004582
Other languages
English (en)
Japanese (ja)
Inventor
Shun Kaizu
Original Assignee
Sony Group Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022254789A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04BTRANSMISSION
    • H04B10/00Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/11Arrangements specific to free-space transmission, i.e. transmission through air or vacuum
    • H04B10/114Indoor or close-range type systems
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70SSIS architectures; Circuits associated therewith

Definitions

  • the present technology relates to a receiving device and a transmission/reception system, and more particularly to a receiving device and a transmission/reception system capable of realizing optical communication in which transmission information can be acquired at high speed and the light emission position can also be acquired.
  • the remote controller of a television receiver transmits signals such as a channel number to the television receiver on the receiving side by blinking infrared light. Since this signal transmission method cannot acquire light emission position information, its application range is limited.
  • there is also a method of capturing an image of a marker represented by a two-dimensional code and acquiring information and the marker position by image recognition (see, for example, Non-Patent Document 1). With this method, data may not be obtained depending on the distance to the marker and its inclination.
  • This technology has been developed in view of the situation described above, and makes it possible to realize optical communication that can acquire transmission information at high speed and also acquire the light emission position.
  • a receiving device includes: an event sensor that receives an optical signal representing "transmission start", "0", and "1", transmitted as transmission information at predetermined blinking intervals, and outputs the temporal change of the optical signal as event data; and a demodulator that demodulates the optical signal based on the event data to acquire the transmission information.
  • an optical signal representing "transmission start", "0", and "1", transmitted as transmission information at predetermined blinking intervals, is received; a temporal change in the optical signal is output as event data; the optical signal is demodulated based on the event data; and the transmission information is acquired.
  • a transmission/reception system includes a transmitting device and a receiving device,
  • the transmission device transmits, as transmission information, an optical signal representing each of "transmission start", "0", and "1" at predetermined blinking intervals,
  • the receiving device includes an event sensor that receives the optical signal and outputs a temporal change of the optical signal as event data, and a demodulator that demodulates the optical signal based on the event data to acquire the transmission information.
  • the system comprises a transmitting device and a receiving device; the transmitting device transmits, as transmission information, an optical signal representing each of "transmission start", "0", and "1" at predetermined blinking intervals; the receiving device receives the optical signal, the event sensor outputs the temporal change of the optical signal as event data, the optical signal is demodulated based on the event data, and the transmission information is obtained.
  • the receiving device and the transmitting/receiving system may be independent devices, or may be internal blocks forming one device.
  • FIG. 1 is a block diagram showing a configuration example of a first embodiment of a transmission/reception system to which the present technology is applied;
  • FIG. 2 is a diagram showing an application example of the transmission/reception system of FIG. 1;
  • FIG. 3 is a diagram explaining event data output by a light receiving sensor;
  • FIG. 4 is a diagram explaining event data output by a light receiving sensor;
  • FIG. 5 is a diagram for explaining an emission signal generated by a modulating section of a transmission device;
  • FIG. 6 is a diagram for explaining an emission signal generated by a modulating section of a transmission device;
  • FIG. 7 is a block diagram showing a configuration example of a demodulator;
  • FIG. 8 is a diagram for explaining divided regions of a pixel array that are processing units of a demodulation unit;
  • FIG. 9 is a diagram for explaining processing of an event accumulation unit;
  • FIG. 10 is a diagram for explaining processing of a signal demodulator;
  • FIG. 11 is a diagram explaining an example of dynamically setting a detection threshold;
  • FIG. 12 is a diagram explaining processing of a coordinate calculation unit;
  • FIG. 13 is a diagram explaining processing of a coordinate calculation unit;
  • FIG. 14 is a flowchart for explaining transmission/reception processing according to the first embodiment;
  • FIG. 15 is a diagram explaining a modification of the first embodiment;
  • FIG. 16 is a block diagram showing a configuration example of a demodulator in a modified example of the first embodiment;
  • FIG. 17 is a block diagram showing a configuration example of a transmission/reception system according to a second embodiment of the present technology;
  • FIG. 18 is a diagram showing an application example of the transmission/reception system according to the second embodiment;
  • FIG. 19 is a diagram showing another application example of the transmission/reception system according to the second embodiment;
  • FIG. 20 is a diagram illustrating processing of a coordinate calculation unit in the application example of FIG. 19;
  • FIG. 21 is a diagram illustrating a pixel configuration example for simultaneously acquiring event detection and an RGB image;
  • FIG. 22 is a diagram explaining an application example of the transmission/reception system according to a third embodiment;
  • FIG. 23 is a block diagram showing a configuration example of a transmission/reception system according to a third embodiment to which the present technology is applied;
  • FIG. 24 is a diagram explaining a calculation method of self-position estimation;
  • FIG. 25 is a block diagram showing a modification of the third embodiment;
  • FIG. 26 is a block diagram showing a configuration example of an imaging element used as the light receiving sensor in FIG. 1;
  • FIG. 27 is a block diagram showing a configuration example of an address event detection circuit;
  • FIG. 28 is a circuit diagram showing detailed configurations of a current-voltage conversion circuit, a subtractor, and a quantizer;
  • FIG. 29 is a more detailed circuit configuration example of the current-voltage conversion circuit, the buffer, the subtractor, and the quantizer;
  • FIG. 30 is a circuit diagram showing another configuration example of a quantizer;
  • FIG. 31 is a more detailed circuit configuration example of a current-voltage conversion circuit, a buffer, a subtractor, and a quantizer when the quantizer of FIG. 30 is employed;
  • FIG. 32 is a block diagram showing a configuration example of an embodiment of a computer to which the present technology is applied.
  • FIG. 1 is a block diagram showing a configuration example of a first embodiment of a transmission/reception system to which the present technology is applied.
  • the transmission/reception system 1 of FIG. 1 includes a transmitting device 11 and a receiving device 12.
  • the transmission device 11 outputs an optical signal obtained by modulating the transmission information onto the blinking timing of light.
  • the receiving device 12 receives the optical signal output from the transmitting device 11 and demodulates it based on the blinking timing of the light, thereby obtaining transmission information from the transmitting device 11 as received information.
  • the transmission information is composed of digital data in which predetermined characters, symbols, images (moving or still images), sounds, etc. are represented by 0 or 1.
  • the transmission information may be information stored in the transmission device 11 in advance, information generated by the transmission device 11, or information supplied from outside the transmission device 11.
  • the transmitting device 11 has a modulating section 21 and a light emitting section 22 .
  • the receiver 12 has a light receiving sensor 31 and a demodulator 32 .
  • the modulation unit 21 modulates digital data of "0" or "1", which is transmission information, at the blinking timing of light. Also, the modulation unit 21 adds a start code, which indicates the start of transmission, to the beginning of the transmission information. The modulation unit 21 supplies the light emitting unit 22 with a light emission signal indicating the blinking timing of the light, which is generated according to the transmission information.
  • the light emitting unit 22 has a light emitting source such as a light emitting diode or a semiconductor laser that outputs infrared light with a wavelength ranging from approximately 850 nm to 940 nm.
  • the light emitting unit 22 outputs an optical signal corresponding to transmission information by turning on and off infrared light emission based on the light emission signal supplied from the modulation unit 21 .
  • the type of light source and the wavelength range of the illuminating light can be appropriately selected according to the use of the transmitting/receiving system 1 and the like.
  • the light receiving sensor 31 receives the optical signal output from the light emitting section 22 of the transmitting device 11.
  • the light-receiving sensor 31 has a pixel array in which a plurality of pixels are arranged in a matrix. Each pixel detects a temporal change in the optical signal from a temporal change in the electrical signal obtained by photoelectrically converting the optical signal, and outputs it as event data.
  • Such a sensor that outputs a temporal change in an electrical signal obtained by photoelectrically converting an optical signal as event data is also called an event sensor or EVS (event-based vision sensor).
  • the light-receiving sensor 31 supplies the demodulator 32 with event data obtained by capturing the luminance change of the optical signal.
  • the demodulator 32 restores (demodulates) the optical signal from the transmitter 11 based on the event data supplied from the light receiving sensor 31, and acquires the restored transmission information as reception information.
  • the demodulator 32 outputs the restored “0” or “1” data and the two-dimensional positional information obtained by acquiring the data, specifically, the (x, y) coordinates of the pixel array to the subsequent stage.
  • the transmission/reception system 1 is configured as described above, and can transmit and receive predetermined information by transmitting and receiving optical signals.
  • the transmitting device 11 can transmit a predetermined image as transmission information, and the receiving device 12 can receive the image and display it on an external display.
  • the transmitting device 11 can transmit a character string of a predetermined URL as transmission information, and the receiving device 12 can access the received URL and display the website on an external display.
  • in addition to one-to-one communication between the transmitting device 11 and the receiving device 12, the transmission/reception system 1 may include a plurality of transmitting devices 11, a plurality of receiving devices 12, or both, and perform one-to-many or many-to-many communication.
  • Transmitting device 11 may be configured as part of a host device that generates or provides transmission information.
  • receiving device 12 may be configured as part of a host device that utilizes received information.
  • FIG. 2 is a diagram showing an application example of the transmission/reception system 1.
  • three transmitters 11 are arranged at predetermined intervals on the wall surface of a predetermined indoor space.
  • three transmitters 11 are distinguished as transmitters 11A to 11C.
  • the receiving device 12 is built into the VR goggles 41.
  • the receiving device 12 is built into the smart phone 42 .
  • the receiving devices 12 built into the VR goggles 41 and the smartphone 42 are not shown in FIG. 2.
  • the transmission/reception system 1 in Fig. 2 can be used for VR attractions.
  • the receiving device 12 of the VR goggles 41 receives an optical signal emitted by one of the transmitting devices 11A to 11C, and outputs characters, sounds, or images indicated by the transmission information from the display or speaker of the VR goggles 41.
  • directivity can be given to the optical signals emitted by the transmission devices 11A to 11C.
  • the optical signal emitted by the transmitter 11A can be received by a predetermined area 43A in the indoor space, and the optical signals emitted by the transmitters 11B and 11C can be received by predetermined areas 43B and 43C, respectively.
  • the user carries the smartphone 42 and moves around the room, and follows the instructions displayed on the display of the smartphone 42 to perform predetermined tasks. More specifically, when the user moves to the predetermined area 43A, the user performs a predetermined work according to instructions displayed on the display of the smartphone 42 based on the transmission information from the transmission device 11A. When the user moves to the predetermined area 43B, the user performs a predetermined work according to instructions displayed on the display of the smartphone 42 based on the transmission information from the transmission device 11B. When the user moves to the predetermined area 43C, the user performs a predetermined work according to instructions displayed on the display of the smartphone 42 based on the transmission information from the transmission device 11C. The work instructions indicated by the transmission information from the transmitters 11A to 11C are different.
  • the transmission/reception system 1 can be used for information communication in attractions, operation instructions to operators, etc., as described above.
  • the transmission/reception system 1 can be used, for example, for transmission/reception of sign information for movement of an AGV (automated guided vehicle). It goes without saying that the example of use of the transmitting/receiving system 1 illustrated is merely an example, and that other methods of use are also possible.
  • the light-receiving sensor 31 detects a temporal change in light-receiving luminance according to the on/off of the infrared light output by the light-emitting unit 22 of the transmitting device 11 as transmission information, and outputs it as event data.
  • FIG. 3 shows an example of event data output by the light receiving sensor 31.
  • the light-receiving sensor 31 outputs, as event data, the time t i at which the event occurred, the coordinates (x i , y i ) representing the position of the pixel at which the event occurred, and the polarity p i of the luminance change.
  • the time t i of the event is a time stamp representing the time when the event occurs, and is represented, for example, by the count value of a counter based on a predetermined clock signal within the sensor.
  • a time stamp corresponding to the timing at which an event occurs can be said to be time information representing the (relative) time at which the event occurred, as long as the interval between events is maintained.
  • polarity p i represents the direction of luminance change when a luminance change (light amount change) exceeding a predetermined threshold (hereinafter referred to as event threshold EV_TH) occurs as an event, that is, whether the change is in the positive or negative direction.
  • the polarity p i of the event is, for example, represented by "1" when the change is in the positive direction and by "0" when the change is in the negative direction.
  • an event with a positive luminance change is also referred to as a positive event
  • an event with a negative luminance change is also referred to as a negative event.
  • the light-receiving sensor 31 outputs only the position coordinates, polarity, and time information of pixels that have detected luminance changes. Since the light-receiving sensor 31 generates and outputs only the net change (difference) as position coordinates, polarity, and time information, the amount of data has no redundancy, and the sensor has a high time resolution on the order of μsec. As a result, high-speed optical signals can be captured efficiently and accurately.
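As an illustrative sketch (not part of the publication), the event data described above can be modeled as a stream of (t, x, y, p) records; the class and field names below are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Event:
    t: float  # time stamp of the event (counter value, usec order)
    x: int    # pixel column at which the event occurred
    y: int    # pixel row at which the event occurred
    p: int    # polarity: 1 = positive (brighter), 0 = negative (darker)

# A rising edge of the optical signal shows up as a burst of positive
# events at the pixels receiving the light:
burst = [Event(t=100.0 + i, x=64, y=48, p=1) for i in range(5)]
```

Only pixels whose luminance actually changed produce records, which is why the data volume stays small compared with full frames.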
  • event data is output each time an event occurs.
  • a general image sensor performs photographing in synchronization with a vertical synchronizing signal and outputs frame data, which is image data of one frame (screen), at the cycle of the vertical synchronizing signal. In contrast, since event data is output only at the timing when an event occurs, the event sensor can be said to be an asynchronous (or address-controlled) sensor.
  • FIG. 4 is a diagram illustrating an example of event data output from the light receiving sensor 31.
  • the event data are plotted as dots at the spatio-temporal location (x, y, t) of the event.
  • when displaying event data on a display, frame data is generated by accumulating event data over a predetermined time width (frame width). According to the polarity of the light amount change of each event, a pixel is set to white when the polarity is a change in the positive direction (positive) and to black when it is a change in the negative direction (negative), and pixels at other positions in the frame are set to a predetermined color such as gray; the resulting image can be generated and displayed as an event image.
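The event-image generation described above can be sketched as follows; the function name, the (x, y, polarity) event encoding, and the 255/0/128 gray levels standing in for white/black/gray are illustrative assumptions:

```python
def render_event_image(events, width, height):
    """Accumulate events over one frame width into a grayscale event image:
    255 (white) for positive events, 0 (black) for negative events,
    128 (gray) for pixels at which no event occurred."""
    img = [[128] * width for _ in range(height)]
    for x, y, p in events:
        img[y][x] = 255 if p == 1 else 0
    return img

# One positive and one negative event inside a 5x3 pixel array:
img = render_event_image([(2, 1, 1), (3, 1, 0)], width=5, height=3)
```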
  • <Example of light emission signal> The light emission signal generated by the modulation section 21 of the transmission device 11 will be described with reference to FIGS. 5 and 6.
  • FIG. 5 shows an example of the light emission signal generated by the modulation section 21.
  • the transmission information output by the transmission device 11 can be expressed by repeating the code of "0" or "1".
  • the modulation unit 21 generates a light emission signal by converting each of “0” and “1” of digital data as transmission information into a predetermined blinking interval, and outputs the light emission signal to the light emission unit 22 .
  • the light emitting unit 22 outputs an optical signal based on the light emission signal. Also, before outputting the light emission signal corresponding to the transmission information, the modulation section 21 generates a light emission signal corresponding to a start code meaning “start of transmission” and outputs the light emission signal to the light emission section 22 .
  • FIG. 5 shows an example of a light emission signal that transmits "0", “1”, “0”, “0”, . . . following the start code.
  • "transmission start", "0", and "1" light emission signals are represented by different blinking intervals.
  • the light emission signal of "transmission start” has a code length of 5 msec
  • the first 4.5 msec is a light period (light emission period)
  • the last 0.5 msec is a dark period (light off period).
  • the light emission signal of "0” has a code length of 1 msec
  • the first 0.5 msec being a bright period (light emitting period)
  • the last 0.5 msec being a dark period (extinguishing period).
  • the light emission signal of "1" has a code length of 2.5 msec, the first 2.0 msec being a bright period (light emission period), and the last 0.5 msec being a dark period (light-off period).
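The blinking-interval coding of FIG. 5 can be sketched as follows; the 2.0 msec bright period for "1" is inferred from its 2.5 msec code length minus the common 0.5 msec dark period, and all names are assumptions:

```python
# Bright/dark durations in msec, following the FIG. 5 description:
# each code is a bright period followed by a 0.5 msec dark period.
SYMBOLS = {
    "start": (4.5, 0.5),  # "transmission start": 5 msec code length
    "0":     (0.5, 0.5),  # 1 msec code length
    "1":     (2.0, 0.5),  # 2.5 msec code length
}

def emission_signal(bits):
    """Return the light emission signal for a bit string as a list of
    ("on"/"off", duration_msec) segments, preceded by the start code."""
    segments = []
    for sym in ["start"] + list(bits):
        bright, dark = SYMBOLS[sym]
        segments.append(("on", bright))
        segments.append(("off", dark))
    return segments

# Start code followed by the transmission information "0", "1", "0", "0":
sig = emission_signal("0100")
```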
  • the code lengths of "transmission start", "0", and "1" are preferably set by selecting a period corresponding to a frequency that is less affected by ambient light flicker. For example, flicker of 100 Hz or 120 Hz may occur indoors depending on the frequency of the power supply. By selecting code lengths that differ from these frequencies (flicker frequencies), reception stability can be improved.
  • the light-receiving sensor 31 detects changes in brightness, but a brightness change due to the transmission information may, for example, be buried in ambient light flicker and fail to be detected. As a result, the receiving device 12 may not be able to acquire the transmission information accurately.
  • FIG. 6 shows an example of a light emission signal in which the code length of "0" is 1 msec and the code length of "1" is 2 msec which is twice the code length of "0".
  • the upper part of FIG. 6 shows an example of a light emission signal that transmits "0", "0", "0", "0" as transmission information.
  • the lower part of FIG. 6 shows an example of a light emission signal that transmits "0", "1", and "0" as transmission information.
  • even when the transmitting device 11 outputs the light emission signals "0", "0", "0", "0" in the upper part of FIG. 6, if the light receiving sensor 31 misses one of the blinks, the transmission information may be erroneously detected as "0", "1", "0" as in the lower row.
  • when the code lengths of "0" and "1" are in an integral multiple relationship, such erroneous detection may occur, so it is preferable that the code lengths of "0" and "1" do not have an integral multiple relationship. Moreover, it is preferable that the difference (time difference) between the code lengths of "0" and "1" be set larger than the variation in the blinking detection response of the light receiving sensor 31. For example, by setting the ratio of the code lengths of "0" and "1" to 1.5 or 2.5, such erroneous detection can be reduced.
  • in the example of FIG. 5, the dark period (light-off period) of the light emission signal is the same for "0" and "1"; it is sufficient that the bright period (light emission period) is set to a different length for "0" and "1", and this can also be done with the code lengths of "0" and "1" kept the same. For example, the code lengths of "0" and "1" may both be set to 1 msec, with the code of "0" having a bright period (light emission period) of 0.25 msec at the beginning and a dark period (light-off period) of 0.75 msec at the end, and the code of "1" having a bright period of 0.75 msec at the beginning and a dark period of 0.25 msec at the end.
  • FIG. 7 is a block diagram showing a more detailed configuration example of the demodulator 32.
  • the demodulation section 32 is composed of an event integration section 51, a signal demodulation section 52, a coordinate calculation section 53, and an area integration section 54.
  • the event data output from the light receiving sensor 31 is supplied to the event integrating section 51 and the coordinate calculating section 53 .
  • Each part of the demodulator 32 divides the pixel array corresponding to the light receiving area of the light receiving sensor 31 into a plurality of areas (hereinafter referred to as divided areas) and performs predetermined signal processing for each divided area.
  • the event integration unit 51 integrates the event data output from the light receiving sensor 31 for each divided area in a predetermined integration period.
  • the integration period here is a period much shorter than the frame period (1/30 sec or 1/60 sec) of a general image sensor, and sufficiently shorter than the code lengths of "0" and "1".
  • the event integration unit 51 supplies the event integration result for each divided area to the signal demodulation unit 52 .
  • the signal demodulation unit 52 detects the blinking timing of the optical signal based on the event integration result for each divided area supplied from the event integration unit 51. Based on the detected blinking timing, the signal demodulator 52 detects "transmission start" and restores (demodulates) data (codes) of "0" or "1". Here, if a blinking event occurs with a time difference corresponding to a communication code in a divided area, the data is demodulated to "0" or "1"; if a blinking event occurs with a time difference other than those of the communication codes, the signal demodulation section 52 supplies an error to the area integrating section 54.
  • the coordinate calculation unit 53 calculates the representative position (x, y) of event occurrence for each divided region of the pixel array, and supplies it to the region integration unit 54 as the representative coordinates (x, y) of the received optical signal.
  • the area integration unit 54 acquires data of "0" or "1" or an error, which is the demodulation result of each divided area of the pixel array, from the signal demodulation unit 52. Also, the area integration unit 54 acquires the representative coordinates (x, y) of each divided area of the pixel array from the coordinate calculation unit 53. For the divided areas in which "0" or "1" data is demodulated, the area integration unit 54 combines the demodulated data and the representative coordinates (x, y) and outputs the result to the subsequent stage.
  • FIG. 8 shows an example of a divided area of the pixel array that is the processing unit of the demodulator 32 .
  • the pixel array PA of the light receiving sensor 31 is divided into a plurality of divided regions RG.
  • the divided region RG is composed of H ⁇ K pixels PX, H pixels in the X direction and K pixels in the Y direction (H, K>0).
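The pixel-to-region mapping implied by this division can be sketched as follows (the function name is an assumption):

```python
def region_index(x, y, H, K):
    """Map a pixel coordinate (x, y) to the (column, row) index of the
    H-by-K-pixel divided region RG that contains it."""
    return (x // H, y // K)
```

For example, with H = 8 and K = 4, pixel (15, 7) falls in region (1, 1).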
  • the event integration unit 51 integrates the event data output from the light receiving sensor 31 for each divided area RG.
  • FIG. 9 is a diagram explaining the processing of the event integration unit 51 for a predetermined divided area.
  • the event integration unit 51 integrates the event data output from the light receiving sensor 31 for each divided area in a predetermined integration period.
  • the event data accumulation period is 0.25 msec.
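A minimal sketch of the per-region, per-period counting performed by the event integration unit 51; the names and the (t, x, y, p) event encoding are assumptions:

```python
from collections import defaultdict

def integrate_events(events, H, K, period=0.25):
    """Count negative/positive events per divided region per integration
    period. events: iterable of (t_msec, x, y, polarity) tuples.
    Returns {(region_col, region_row, period_index): [neg_count, pos_count]}."""
    counts = defaultdict(lambda: [0, 0])
    for t, x, y, p in events:
        key = (x // H, y // K, int(t // period))
        counts[key][p] += 1
    return counts

# Two positive events in the first 0.25 msec period, one negative in the next:
counts = integrate_events(
    [(0.05, 3, 2, 1), (0.10, 4, 2, 1), (0.30, 3, 2, 0)], H=8, K=8)
```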
  • FIG. 10 is a diagram for explaining the processing of the signal demodulator 52 for the predetermined divided area RG.
  • the signal demodulation unit 52 detects the blinking timing of the optical signal based on the event integration number supplied from the event integration unit 51 for each integration period. Specifically, the signal demodulator 52 detects the blinking timing of the optical signal by determining whether or not the number of positive events or negative events integrated in each integration period is greater than a predetermined detection threshold DET_TH.
  • the time of the integration period when the event accumulated number of positive events is greater than the detection threshold DET_TH is set as the optical signal ON time
  • the time of the integration period when the event accumulated number of negative events is greater than the detection threshold DET_TH is set as the optical signal OFF time
  • the signal demodulator 52 calculates the period from the ON time to the OFF time of the optical signal as the light period, and determines the signs of "transmission start", "0", and "1" from the length of the light period.
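The code determination from the length of the light period can be sketched as follows; the nominal bright periods (4.5 msec for "transmission start", 2.0 msec for "1", 0.5 msec for "0") follow the FIG. 5 description, while the tolerance is an assumed parameter:

```python
def classify_bright_period(duration_msec, tol=0.2):
    """Classify a measured light period (ON time to OFF time) into one of
    the codes of FIG. 5, or "error" when it matches none of them."""
    nominal = {"start": 4.5, "1": 2.0, "0": 0.5}
    for code, length in nominal.items():
        if abs(duration_msec - length) <= tol:
            return code
    return "error"
```

A measured bright period that falls between the nominal lengths is reported as an error, matching the error handling of the signal demodulation unit 52.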
  • the detection threshold DET_TH can be set to a value that is tuned so as to detect only optical signals by checking in advance the number of accumulated events with and without optical signals, noise, etc.
  • since the number of event accumulations can vary depending on the event threshold EV_TH, which is a parameter of the light receiving sensor 31, a plurality of detection thresholds DET_TH may be used according to the parameter.
  • for example, the event threshold EV_TH set in the light receiving sensor 31 may be classified into predetermined ranges, and a corresponding detection threshold DET_TH may be set for each range.
  • the detection threshold DET_TH may be set according to the accumulation period.
  • the detection threshold DET_TH may be dynamically set (adjusted) based on the accumulated number of events that are currently occurring instead of a fixed value determined in advance.
  • FIG. 11 is a diagram illustrating an example of dynamically setting the detection threshold DET_TH.
  • the signal demodulation unit 52 calculates the average value of the event accumulated numbers detected in each divided area RG, and uses it as the event accumulated number due to ambient light.
  • the average value of the number of accumulated events may be the average value over all the divided regions RG of the pixel array or, for example, the average value of Q (Q>1) divided regions RG with the smallest numbers of accumulated events in the pixel array.
  • the signal demodulator 52 subtracts the event integration number due to ambient light from the event integration number of each divided region RG, thereby calculating the event integration number with ambient light removed.
  • the blinking timing of the light emission signal is detected by determining whether or not the cumulative number of positive events or negative events from which ambient light is removed is greater than a predetermined detection threshold DET_TH.
  • the detection threshold DET_TH can be dynamically adjusted according to the number of events accumulated by ambient light.
  • the light receiving sensor 31 does not receive optical signals in the divided regions RG1 and RG2 of the divided regions RG of the pixel array, but receives an optical signal in the divided region RG3 .
  • a light-receiving point 61 of the divided region RG3 represents the region that received the optical signal.
  • the events counted in divided regions RG1 and RG2 are due to ambient light and noise, while the events counted in divided region RG3 include the received optical signal in addition to ambient light and noise.
  • the signal demodulator 52 calculates the average of the accumulated event counts of divided regions RG1 and RG2, which have small counts, and uses it as the accumulated event count of the non-receiving regions that receive no optical signal. The signal demodulator 52 then subtracts the non-receiving-region count from the accumulated event count of divided region RG3, thereby calculating the accumulated event count of the transmission information with ambient light, noise, and the like removed. As a result, light-source variations such as flicker can be removed, and events corresponding to the optical signal can be detected stably.
  • alternatively, the detection threshold DET_TH may be increased according to the accumulated event count of the non-receiving regions (the larger the ambient-light event count, the larger the detection threshold DET_TH).
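  • the ambient-light subtraction and dynamic detection threshold described above can be sketched as follows (a minimal sketch; the names `detect_blink_regions`, `q`, `base_th`, and `k`, and the linear threshold rule, are assumptions not taken from this description):

```python
import numpy as np

def detect_blink_regions(counts, q=2, base_th=10.0, k=1.0):
    """Remove the ambient-light event count estimated from non-receiving
    regions, then apply a dynamically adjusted detection threshold DET_TH."""
    counts = np.asarray(counts, dtype=float)  # accumulated events per divided region RG
    # Estimate the ambient-light contribution as the average of the q regions
    # with the smallest accumulated event counts (the non-receiving regions).
    ambient = np.sort(counts.ravel())[:q].mean()
    # Subtract ambient light, and raise DET_TH with the ambient level.
    cleaned = counts - ambient
    det_th = base_th + k * ambient
    return cleaned > det_th, cleaned, det_th
```

For example, with per-region counts [5, 6, 40], the two smallest regions give an ambient estimate of 5.5, so only the third region exceeds the adjusted threshold and is treated as receiving the optical signal.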
  • next, the processing of the coordinate calculation unit 53 will be described with reference to FIGS. 12 and 13.
  • FIG. 12 shows 3×3 divided regions RG, and the light receiving point 61, which is the region where the optical signal is received, lies in the central divided region RG.
  • a method of calculating the representative coordinates (x, y) of the light receiving point 61 will be described with reference to FIG.
  • FIG. 13A is a diagram explaining a first method of calculating the representative coordinates (x, y) of the light receiving point 61.
  • in the first method, the coordinate calculation unit 53 checks the accumulated event count of each of the plurality of pixels PX forming the divided region RG in a predetermined order, for example by the raster scan shown in A of FIG. 13, and thereby calculates the representative coordinates (x, y) of the light receiving point 61. However, since this first calculation method requires searching all pixels of the two-dimensional region, its calculation cost is high and it is not suited to high-speed detection.
  • FIG. 13B is a diagram explaining a second method of calculating the representative coordinates (x, y) of the light receiving point 61.
  • in the second method, the coordinate calculation unit 53 calculates integrated values by summing the accumulated event counts of the plurality of pixels PX forming the divided region RG along each pixel column and each pixel row, in the vertical and horizontal directions respectively. The coordinate calculation unit 53 then takes the x-coordinate (horizontal position) of the pixel column whose vertical integrated value is the maximum as the representative x-coordinate of the light receiving point 61, and the y-coordinate (vertical position) of the pixel row whose horizontal integrated value is the maximum as the representative y-coordinate, thereby calculating the representative coordinates (x, y). By determining the representative coordinates (x, y) from one-dimensional searches in the horizontal and vertical directions in this way, the calculation cost can be suppressed and detection can be performed at high speed.
  • the coordinate calculation unit 53 selects either the first calculation method or the second calculation method described above as necessary, and calculates the representative coordinates (x, y).
  • in FIG. 12, the entire area of the light receiving point 61 is contained in one divided region RG, but even when the area of the light receiving point 61 straddles multiple divided regions RG, the representative coordinates (x, y) can be calculated by the first or second calculation method described above.
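  • the second calculation method above can be sketched as follows (a minimal sketch under the assumption that the per-pixel accumulated event counts of one divided region RG are available as a 2-D array; the name `representative_coords` is hypothetical):

```python
import numpy as np

def representative_coords(region):
    """Two one-dimensional searches: sum the accumulated event counts along
    each pixel column and each pixel row, then take the argmax of each."""
    region = np.asarray(region)
    col_sums = region.sum(axis=0)  # vertical integration, one value per column
    row_sums = region.sum(axis=1)  # horizontal integration, one value per row
    x = int(np.argmax(col_sums))   # column with the maximum integrated value
    y = int(np.argmax(row_sums))   # row with the maximum integrated value
    return x, y
```

This replaces the full two-dimensional pixel search of the first method with two short one-dimensional argmax passes over the row and column sums.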
  • in step S1, the modulation section 21 of the transmission device 11 generates a light emission signal corresponding to the transmission information. More specifically, the modulation unit 21 generates the light emission signal of a start code, and then generates the light emission signal by modulating the digital data of "0" or "1", which is the transmission information supplied from the outside, into the blinking timing of light.
  • the light emission signals for the start code, "0", and "1" have different light blinking timings.
  • in step S2, the light emitting section 22 outputs an optical signal corresponding to the transmission information by turning infrared light emission on and off based on the light emission signal supplied from the modulating section 21.
  • in step S3, the light receiving sensor 31 of the receiving device 12 receives the optical signal, detects temporal changes in the optical signal, and outputs event data. More specifically, the light receiving sensor 31 receives the optical signal output from the light emitting unit 22 of the transmission device 11. Each pixel of the light receiving sensor 31 outputs event data to the demodulation unit 32 when it detects a temporal change in the electrical signal obtained by photoelectrically converting the optical signal.
  • in step S4, the event integration section 51 of the demodulation section 32 integrates the event data output from the light receiving sensor 31 for each divided region RG of the pixel array over a predetermined integration period.
  • the event integration unit 51 supplies the event integration result for each divided region RG to the signal demodulation unit 52 .
  • in step S5, the signal demodulator 52 demodulates the data into "0" or "1" based on the event integration result for each divided region RG. That is, the signal demodulator 52 detects the blinking timing of the optical signal based on the event integration result for each divided region RG supplied from the event integration unit 51. The signal demodulator 52 then detects the "start of transmission" based on the detected blinking timing, and restores (demodulates) the data to "0" or "1". For a divided region RG in which blinking events occur with a time difference that does not correspond to any communication code, an error is supplied to the region integration section 54.
  • in step S6, the coordinate calculation unit 53 calculates the representative position of event occurrence for each divided region RG of the pixel array, and supplies it to the region integration unit 54 as the representative coordinates (x, y).
  • as the method for calculating the representative coordinates (x, y), either the first calculation method or the second calculation method described with reference to FIGS. 12 and 13 is selected, for example by operation settings.
  • the processing of steps S4 and S5 and the processing of step S6 can be executed in parallel.
  • in step S7, the region integration section 54 outputs to the outside the received information based on the demodulation result from the signal demodulation section 52, together with the two-dimensional position information at which the received information was acquired. More specifically, the region integration unit 54 acquires from the signal demodulation unit 52 the data of "0" or "1", or an error, which is the demodulation result for each divided region RG of the pixel array. The region integration unit 54 also acquires the representative coordinates (x, y) of each divided region RG of the pixel array from the coordinate calculation unit 53. For each divided region RG in which data of "0" or "1" has been demodulated, the region integration unit 54 combines the demodulated data with the representative coordinates (x, y) and outputs the result.
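  • the flow of steps S1 through S7 can be sketched end to end as follows (a minimal sketch; the description states only that the start code, "0", and "1" use different blinking timings, so the concrete interval values and function names here are assumptions):

```python
# Blink-to-blink gaps in clock ticks for each symbol (assumed values).
INTERVALS = {"start": 3, "0": 1, "1": 2}
LOOKUP = {v: k for k, v in INTERVALS.items()}

def modulate(bits):
    """Steps S1-S2: turn transmission bits into a sequence of blink times."""
    t, times = 0, [0]
    for sym in ["start"] + list(bits):
        t += INTERVALS[sym]
        times.append(t)
    return times

def demodulate(times):
    """Step S5: recover the bit string from detected blink times; a gap that
    matches no communication code yields an error (None)."""
    syms = [LOOKUP.get(b - a) for a, b in zip(times, times[1:])]
    if "start" not in syms:
        return None  # "start of transmission" never detected
    payload = syms[syms.index("start") + 1:]
    if any(s not in ("0", "1") for s in payload):
        return None  # corresponds to the error output of step S5
    return "".join(payload)
```

A round trip such as `demodulate(modulate("0110"))` returns `"0110"`.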
  • as described above, since an event sensor capable of detecting changes in the optical signal (luminance changes) at high speed is used as the light receiving sensor 31, the desired transmission information can be transmitted and received at high speed.
  • furthermore, since the light receiving sensor 31 has a two-dimensional light receiving area (pixel array), the light emission position of the optical signal can also be acquired. That is, optical communication can be realized in which transmission information is acquired at high speed and the light emission position is acquired as well.
  • unlike information acquisition using a marker as disclosed in Non-Patent Document 1, the transmission/reception system 1 does not suffer difficulty in acquiring information due to distance or angle (inclination), and is therefore robust against distance and angle.
  • the transmitting device 11 and the receiving device 12 were arranged at a certain distance from each other, or the position of the transmitting device 11 as seen from the receiving device 12 was unknown.
  • a transmission/reception system 1 is also conceivable in which the light-emitting part 22 and the light-receiving sensor 31 are fixed close together so that their relative positions are fixed.
  • the light-emitting part 22 and the light-receiving sensor 31 may be separated by a short distance, or they may be attached to each other.
  • the relationship between the housing 71 and the housing 72 can be the relationship between the smartphone or VR goggles and the attachments attached thereto, or the relationship between the camera body and the interchangeable lens in an interchangeable-lens imaging device.
  • in this case, it is known in advance at which pixel position in the pixel array of the light-receiving sensor 31 the blinking of the light-emitting unit 22 occurs.
  • the demodulation section 32 of the receiving device 12 can therefore be composed only of the event integration section 51 and the signal demodulation section 52, as shown in FIG. That is, since the light source position of the light emitting unit 22 is uniquely determined by the pixel PX where blinking is detected, the coordinate calculation unit 53 can be omitted, and the region integration unit 54 can also be omitted.
  • the configuration shown in FIGS. 15 and 16 has the following effects in addition to the effects of the first embodiment described above.
  • the pixel PX corresponding to the light source position of the light emitting unit 22 can be specified with respect to the pixel array PA of the light receiving sensor 31. If the pixel PX that receives the optical signal can be specified, event detection can be disabled for the pixels PX that do not correspond to the light source position of the light emitting section 22. By limiting the activated pixels (effective pixels) of the pixel array PA in this way, readout of the pixels PX can be sped up and the power consumption of the light receiving sensor 31 can be reduced.
  • in addition, since the light-emitting unit 22 and the light-receiving sensor 31 are close to each other within a predetermined distance (including contact), the influence of ambient light can be suppressed, so the region size of the divided regions RG for which the event integration unit 51 calculates the accumulated event count can be set to a small number of pixels.
  • FIG. 17 is a block diagram showing a configuration example of a transmission/reception system according to a second embodiment of the present technology.
  • the configuration of the transmission device 11 is the same as that of the first embodiment, and the configuration of the reception device 12 is different.
  • the receiver 12 additionally includes an RGB sensor 81 , a coordinate corrector 82 , and a superimposing unit 83 .
  • the RGB sensor 81 is an image sensor that captures a subject and generates a two-dimensional color image, and corresponds to a so-called normal camera.
  • the RGB sensor 81 generates a two-dimensional color image obtained by photographing a subject and supplies it to the coordinate correction section 82 .
  • the coordinate correction unit 82 acquires the received information and the two-dimensional position information from which the received information is acquired from the demodulation unit 32 . Also, the coordinate correction unit 82 acquires a two-dimensional color image from the RGB sensor 81 . The coordinate correction unit 82 aligns the coordinate system of the two-dimensional color image with the coordinate system of the received information based on the positional information of the reference light emitting point that serves as a reference for positional correction.
  • the superimposing unit 83 generates and outputs an image in which the received information demodulated by the demodulating unit 32 is superimposed on a predetermined position of the two-dimensional color image obtained by the RGB sensor 81 .
  • the superimposing unit 83 generates an image in which the received information is superimposed on a predetermined position of the two-dimensional color image, and displays it on a predetermined display.
  • FIG. 18 is a diagram showing an application example of the transmission/reception system 1 according to the second embodiment.
  • a display device 91 and two transmission devices 11A and 11B are provided on a predetermined wall surface.
  • the two transmitters 11A and 11B are installed at diagonal positions of the display device 91.
  • the display device 91 displays a predetermined image.
  • the two transmitters 11A and 11B transmit (emit) optical signals corresponding to character information to be superimposed on the predetermined image displayed by the display device 91 as transmission information. For simplicity, it is assumed that the two transmitters 11A and 11B transmit the same information, but the transmitters 11A and 11B may transmit different information.
  • the receiving device 12 is built into the smartphone 42 owned by the user.
  • the RGB sensor 81 of the receiving device 12 captures the image displayed by the display device 91 and the optical signals of the transmitting devices 11A and 11B.
  • a light receiving sensor 31 of the receiving device 12 receives an optical signal from at least one of the transmitting devices 11A and 11B and detects an event.
  • the demodulator 32 of the receiving device 12 demodulates into received information based on the result of the event integration.
  • the received information demodulated by the demodulation unit 32 (character "A" in the example of FIG. 18) is superimposed on the image displayed by the display device 91 and displayed.
  • Alignment by the coordinate correction unit 82 can be performed, for example, as follows.
  • the positions of the transmitting devices 11A and 11B detected by the light receiving sensor 31 and the RGB sensor 81, respectively, are used as the reference light emitting points that serve as the reference for position correction, and the two coordinate systems are aligned based on the positional relationship between the coordinate system of the two-dimensional color image and the coordinate system of the received information.
  • the shift amounts in the X and Y directions may simply be obtained from the positions of the transmitters 11A and 11B, and one coordinate system may be shifted to the other coordinate system.
  • when the light-receiving sensor 31 and the RGB sensor 81 are arranged adjacent to each other at an extremely short distance, it is sufficient to add a predetermined offset amount without performing coordinate calculation for alignment.
  • in the example described here, the transmission information is only the letter "A", but the transmission devices 11A and 11B may transmit information about the region in which to superimpose it as well as the character to be superimposed.
  • the transmission information is not limited to characters, and may be images.
  • FIG. 19 is a diagram showing another application example of the transmission/reception system 1 according to the second embodiment.
  • when the display device 91 is an LCD (liquid crystal display) that displays an image using a backlight, the backlight can be used as the light emitting unit 22 and made to blink according to the transmission information.
  • in this case, the coordinate calculation unit 53 of the demodulation unit 32 sums the accumulated event counts along each pixel column and each pixel row of the entire pixel array PA, as shown in FIG. Then, the coordinate calculation unit 53 detects the pixel columns in which the vertically summed event count is equal to or greater than a predetermined value, and thereby detects the x-coordinate start position x1 and the x-coordinate end position x2.
  • similarly, the coordinate calculation unit 53 detects the pixel rows in which the horizontally summed event count is equal to or greater than a predetermined value, and thereby detects the y-coordinate start position y1 and the y-coordinate end position y2. Then, using the light emission area of the backlight given by the rectangular region defined by the coordinates (x1, y1) and (x2, y2), the coordinate correction unit 82 aligns the coordinate system of the two-dimensional color image with the coordinate system of the received information.
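  • the detection of the backlight's rectangular emission area can be sketched as follows (a minimal sketch; `backlight_rect` and `min_count` are assumed names, and `acc` stands in for the 2-D array of accumulated event counts over the pixel array PA):

```python
import numpy as np

def backlight_rect(acc, min_count):
    """Find (x1, y1) and (x2, y2): the first/last pixel columns and rows whose
    summed accumulated event counts reach min_count."""
    acc = np.asarray(acc)
    col_ok = np.flatnonzero(acc.sum(axis=0) >= min_count)  # vertical sums
    row_ok = np.flatnonzero(acc.sum(axis=1) >= min_count)  # horizontal sums
    if col_ok.size == 0 or row_ok.size == 0:
        return None  # no emission area detected
    return int(col_ok[0]), int(row_ok[0]), int(col_ok[-1]), int(row_ok[-1])
```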
  • in the second embodiment, the superimposition unit 83 functions as an integration unit that integrates the two-dimensional image generated by the RGB sensor 81 with the character information transmitted by the optical signal.
  • the transmission information is not limited to characters and may be voice (audio). When voice is transmitted as the transmission information, the synthesizing unit serving as the integration unit can output the voice as transmission information in synchronization with, for example, the two-dimensional image generated by the RGB sensor 81.
  • in FIG. 17, the receiving device 12 has the RGB sensor 81, but the RGB sensor 81 may instead be provided separately as an external device, and the two-dimensional color image obtained by the RGB sensor 81 may be input to the coordinate correction unit 82 of the receiving device 12.
  • a sensor capable of simultaneously performing event detection and capturing a two-dimensional color image of a subject may also be used as the light receiving sensor 31.
  • for example, a light receiving sensor 31 having a pixel array PA in which event detection pixels EV_PX that detect events and RGB detection pixels RGB_PX that detect R, G, or B light are mixed among the pixels PX may be used.
  • the pixel array PA of FIG. 21 has a configuration in which each pixel PX is limited to either an event detection pixel EV_PX or an RGB detection pixel RGB_PX, but each pixel PX may instead be configured to simultaneously detect both an event and the received RGB light.
  • the coordinate correction section 82 can be omitted.
  • the received information can be displayed superimposed on the two-dimensional color image.
  • instead of the RGB sensor 81 of the second embodiment shown in FIG. 17, a monochrome sensor that generates a monochrome two-dimensional image may be used.
  • FIG. 22 shows an application example of the transmission/reception system 1 according to the third embodiment.
  • a plurality of transmitters 11 are installed at predetermined locations.
  • six transmitters 11A to 11F are installed, but the number of transmitters 11 is not limited.
  • the receiving device 12 is attached to a predetermined location on the moving object 101 such as an AGV.
  • the mobile object 101 is, for example, a vehicle that moves on the ground according to programmed rules.
  • the mobile object 101 may be a flying mobile object such as a drone.
  • the receiving device 12 receives optical signals output from a plurality of transmitting devices 11 and moves while estimating its own position.
  • FIG. 23 is a block diagram showing a configuration example of a transmission/reception system according to the third embodiment.
  • the configuration of the transmitting device 11 is the same as that of the first embodiment, and the configuration of the receiving device 12 is different.
  • the receiving device 12 is additionally provided with a storage unit 121 and a self-position estimation unit 122.
  • although FIG. 23 shows only one transmission device 11, the transmission/reception system 1 according to the third embodiment includes a plurality of transmission devices 11, as shown in FIG. 22.
  • the transmission device 11 transmits, as transmission information, a light source ID, which is information for identifying its own light emitting unit 22, by means of an optical signal. As shown in FIG. 22, each of a plurality of transmitters 11 arranged at different locations transmits the light source ID of its own light emitter 22 by an optical signal.
  • the storage unit 121 stores the light source position information (Xi, Yi, Zi) of (the light emitting unit 22i of) each transmission device 11i that the receiving device 12 can receive, in association with the light source IDi.
  • the light source IDi and the light source position information (Xi, Yi, Zi) stored in the storage unit 121 are supplied to the self-position estimation unit 122 as necessary.
  • the self-position estimation unit 122 estimates the self-position using the light source IDi and (xi, yi) coordinates of the plurality of light emitting units 22i supplied from the demodulation unit 32, together with the corresponding light source position coordinates (Xi, Yi, Zi), and outputs its own position information.
  • a calculation method for estimating the self-position from a plurality of light source positions detected by the light receiving sensor 31 will be briefly described with reference to FIG.
  • here, the heights (Z positions) of the plurality of transmission devices 11 and the reception device 12 in the three-dimensional space (X, Y, Z) are assumed to be the same, and self-position estimation in the two-dimensional (X, Y) plane will be described.
  • assume that the current position of the receiving device 12 is (Xc, Yc), and that the light receiving sensor 31 receives the optical signals (hereinafter also referred to as bright spots) of the light emitting units 22A, 22B, 22I, 22N, and so on.
  • the position where the light emitting unit 22A is installed is (X1, Y1),
  • the position where the light emitting unit 22B is installed is (X2, Y2),
  • the position where the light emitting unit 22I is installed is (Xi, Yi), and
  • the position where the light emitting unit 22N is installed is (Xn, Yn).
  • the current position (Xc, Yc) and the orientation θc of the receiving device 12 are the unknown variables to be obtained here.
  • cx is the x-coordinate of the center of the angle of view of the light-receiving sensor 31
  • xi is the x-coordinate of the bright spot on the light-receiving sensor 31 of the light emitting unit 22I
  • f is the focal length of the light-receiving sensor 31.
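  • using these quantities, each bright spot gives one bearing constraint, and the unknowns (Xc, Yc, θc) can be solved by nonlinear least squares. The sketch below is an illustration, not the patent's own calculation method; it assumes a pinhole camera model for the bearing and uses a small Gauss-Newton loop, with all function names invented here:

```python
import math

def wrap(a):
    # Wrap an angle to (-pi, pi].
    return (a + math.pi) % (2.0 * math.pi) - math.pi

def pixel_bearing(xi, cx, f):
    # Bearing of a bright spot relative to the optical axis, from its pixel
    # x-coordinate xi, the image-center x-coordinate cx, and focal length f.
    return math.atan2(xi - cx, f)

def estimate_pose(landmarks, bearings, guess, iters=30):
    """Gauss-Newton for (Xc, Yc, theta_c), given light source positions
    (Xi, Yi) and bearings measured relative to the device orientation."""
    p = list(guess)

    def residuals(p):
        xc, yc, th = p
        return [wrap(math.atan2(Y - yc, X - xc) - th - b)
                for (X, Y), b in zip(landmarks, bearings)]

    for _ in range(iters):
        r = residuals(p)
        eps = 1e-6
        # Numeric Jacobian of the residual vector.
        J = [[(residuals([p[j] + eps if j == k else p[j]
                          for j in range(3)])[i] - r[i]) / eps
              for k in range(3)] for i in range(len(r))]
        # Normal equations A * dp = g with A = J^T J and g = -J^T r.
        A = [[sum(J[i][a] * J[i][b] for i in range(len(r))) for b in range(3)]
             for a in range(3)]
        g = [-sum(J[i][a] * r[i] for i in range(len(r))) for a in range(3)]
        # Solve the 3x3 system by Gaussian elimination with partial pivoting.
        M = [A[i] + [g[i]] for i in range(3)]
        for c in range(3):
            piv = max(range(c, 3), key=lambda rr: abs(M[rr][c]))
            M[c], M[piv] = M[piv], M[c]
            for rr in range(c + 1, 3):
                fct = M[rr][c] / M[c][c]
                for cc in range(c, 4):
                    M[rr][cc] -= fct * M[c][cc]
        dp = [0.0, 0.0, 0.0]
        for c in (2, 1, 0):
            dp[c] = (M[c][3] - sum(M[c][cc] * dp[cc]
                                   for cc in range(c + 1, 3))) / M[c][c]
        p = [p[0] + dp[0], p[1] + dp[1], wrap(p[2] + dp[2])]
    return p
```

With three or more well-spread bright spots and accurate bearings, the loop converges to the true pose from a rough initial guess.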
  • FIG. 25 is a block diagram showing a modification of the third embodiment.
  • the transmission device 11 is configured to transmit the light source ID for identifying its own light emitting unit 22 as transmission information by means of an optical signal.
  • in the modification shown in FIG. 25, the transmission device 11 transmits its own light source position information (Xi, Yi, Zi) as the transmission information.
  • the receiving device 12 can omit the storage unit 121 that stores the light source position information (X i , Y i , Z i ).
  • the demodulation unit 32 of the receiving device 12 demodulates the optical signal output from each transmission device 11 to acquire the light source position information (Xi, Yi, Zi), and supplies it to the self-position estimation unit 122 together with the (xi, yi) coordinates of the pixel array at which it was acquired.
  • the self-position estimation unit 122 estimates the self-position using the light source position information (Xi, Yi, Zi) of the plurality of light emitting units 22i supplied from the demodulation unit 32 and the corresponding (xi, yi) coordinates of the pixel array, and outputs its own position information.
  • in this way, the storage unit 121 can be omitted, and the self-position information can still be output.
  • the receiving device 12 may further include additional sensors such as an acceleration sensor, a gyro sensor, and an IMU (inertial measurement unit), and may use the sensor information of the additional sensors to estimate its own position.
  • regardless of whether the distance between the transmission device 11 and the reception device 12 is short or long, information can be obtained stably as long as the optical signal from each transmission device 11 can be received, so information can be transmitted and received robustly with respect to distance.
  • the receiving device 12 can also be configured to have all the configurations of the first to third embodiments and to perform one of their operations according to an operation mode.
  • each pixel PX of the pixel array PA may be provided with a filter, such as a color filter, that passes only light in a predetermined wavelength band, as in a normal image sensor, so that the optical signal that can be received differs from pixel PX to pixel PX.
  • in this case, the light emitting unit 22 of the transmission device 11 selectively uses different wavelengths for the output optical signal, for example R (red), G (green), B (blue), or IR (infrared), according to the type of transmission information. This makes it possible to transmit and receive more complex transmission information.
  • in the transmission/reception system 1 described above, the blinking timing of light is detected and converted (demodulated) into digital data of "0" or "1", but the transmission/reception system 1 can also be used as a system for transmitting and receiving analog numerical information.
  • for example, the transmission device 11 assigns temperatures from 0°C to 40°C to frequencies from 100 Hz to 500 Hz, and outputs an optical signal at the frequency corresponding to the temperature to be transmitted.
  • the receiving device 12 detects the frequency of the received optical signal and converts it into the corresponding temperature for output. For example, when it receives a blinking frequency of 300 Hz, the receiving device 12 converts it into a numerical value of 20°C (temperature information) and outputs it.
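  • this analog mapping can be sketched as a simple linear conversion (the linearity is an assumption; the description gives only the endpoints 0°C ↔ 100 Hz and 40°C ↔ 500 Hz, and the function names are invented here):

```python
def temp_to_freq(temp_c, t_range=(0.0, 40.0), f_range=(100.0, 500.0)):
    """Transmitter side: map a temperature to a blinking frequency."""
    (t0, t1), (f0, f1) = t_range, f_range
    return f0 + (temp_c - t0) * (f1 - f0) / (t1 - t0)

def freq_to_temp(freq_hz, t_range=(0.0, 40.0), f_range=(100.0, 500.0)):
    """Receiver side: map a detected blinking frequency back to a temperature."""
    (t0, t1), (f0, f1) = t_range, f_range
    return t0 + (freq_hz - f0) * (t1 - t0) / (f1 - f0)
```

This reproduces the example above: a received frequency of 300 Hz corresponds to 20°C.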
  • FIG. 26 is a block diagram showing a configuration example of the imaging element 311 used as the light receiving sensor 31.
  • the imaging device 311 includes a pixel array section 341 , a drive section 342 , a Y arbiter 343 , an X arbiter 344 and an output section 345 .
  • a plurality of pixels 361 are arranged in a two-dimensional grid in the pixel array section 341 .
  • Each pixel 361 has a photodiode 371 as a photoelectric conversion element and an address event detection circuit 372 .
  • the address event detection circuit 372 detects a change in the photocurrent as an event when the photocurrent as an electrical signal generated by the photoelectric conversion of the photodiode 371 changes beyond a predetermined threshold.
  • address event detection circuit 372 outputs a request to Y arbiter 343 and X arbiter 344 requesting output of event data representing the occurrence of the event.
  • the driving section 342 drives the pixel array section 341 by supplying a control signal to each pixel 361 of the pixel array section 341 .
  • the Y arbiter 343 arbitrates requests from pixels 361 in the same row in the pixel array section 341 and returns a response indicating permission or non-permission of event data output to the pixel 361 that has sent the request.
  • the X arbiter 344 arbitrates requests from the pixels 361 in the same column in the pixel array section 341 and returns a response indicating permission or non-permission of event data output to the pixel 361 that has transmitted the request.
  • a pixel 361 to which a permission response has been returned from both the Y arbiter 343 and the X arbiter 344 can output event data to the output unit 345 .
  • the imaging device 311 may be configured to include only one of the Y arbiter 343 and the X arbiter 344 .
  • for example, when only the X arbiter 344 is provided, the data of all pixels 361 in the same column, including the pixel 361 that transmitted the request, are transferred to the output unit 345.
  • in the output unit 345, only the event data of the pixels 361 that actually generated events are selected.
  • similarly, when only the Y arbiter 343 is provided, pixel data is transferred to the output unit 345 in units of rows, and only the event data of the necessary pixels 361 is selected in the subsequent stage.
  • the output unit 345 performs necessary processing on the event data output by each pixel 361 forming the pixel array unit 341, and outputs the processed data through an output terminal.
  • FIG. 27 is a block diagram showing a configuration example of the address event detection circuit 372.
  • the address event detection circuit 372 includes a current-voltage conversion circuit 381, a buffer 382, a subtractor 383, a quantizer 384, and a transfer circuit 385.
  • the current-voltage conversion circuit 381 converts the photocurrent from the corresponding photodiode 371 into a voltage signal.
  • the current-voltage conversion circuit 381 generates a voltage signal corresponding to the logarithmic value of the photocurrent and outputs it to the buffer 382 .
  • the buffer 382 buffers the voltage signal from the current-voltage conversion circuit 381 and outputs it to the subtractor 383 .
  • this buffer 382 makes it possible to isolate noise associated with the switching operation of the subsequent stage and to improve the drive capability for driving the subsequent stage. Note that the buffer 382 can be omitted.
  • the subtractor 383 reduces the level of the voltage signal from the buffer 382 according to the control signal from the driving section 342.
  • the subtractor 383 outputs the lowered voltage signal to the quantizer 384 .
  • the quantizer 384 quantizes the voltage signal from the subtractor 383 into a digital signal and supplies it to the transfer circuit 385 as event data.
  • the transfer circuit 385 transfers (outputs) the event data to the output unit 345 . That is, the transfer circuit 385 supplies the Y arbiter 343 and the X arbiter 344 with a request for outputting event data. When the transfer circuit 385 receives a response from the Y arbiter 343 and the X arbiter 344 to the effect that the output of the event data is permitted in response to the request, the transfer circuit 385 transfers the event data to the output unit 345 .
  • FIG. 28 is a circuit diagram showing the detailed configurations of the current-voltage conversion circuit 381, the subtractor 383, and the quantizer 384. FIG. 28 also shows the photodiode 371 connected to the current-voltage conversion circuit 381.
  • the current-voltage conversion circuit 381 is composed of FETs 411 to 413.
  • as the FETs 411 and 413, for example, N-type MOS (NMOS) FETs can be used, and as the FET 412, for example, a P-type MOS (PMOS) FET can be used.
  • the photodiode 371 receives incident light, performs photoelectric conversion, and generates and flows a photocurrent as an electrical signal.
  • the current-voltage conversion circuit 381 converts the photocurrent from the photodiode 371 into a voltage (hereinafter also referred to as photovoltage) VLOG corresponding to the logarithm of the photocurrent, and outputs it to the buffer 382 .
  • the source of the FET 411 is connected to the gate of the FET 413, and the photocurrent from the photodiode 371 flows through the connection point between the source of the FET 411 and the gate of the FET 413.
  • the drain of the FET 411 is connected to the power supply VDD, and its gate is connected to the drain of the FET 413.
  • the source of the FET 412 is connected to the power supply VDD, and its drain is connected to the connection point between the gate of the FET 411 and the drain of the FET 413.
  • a predetermined bias voltage Vbias is applied to the gate of the FET 412.
  • the source of the FET 413 is grounded.
  • the FET 411, whose drain is connected to the power supply VDD side, operates as a source follower.
  • the photodiode 371 is connected to the source of the FET 411 acting as a source follower, and the photocurrent due to the charge generated by the photoelectric conversion of the photodiode 371 flows through the FET 411 (from its drain to its source).
  • the FET 411 operates in a subthreshold region, and a photovoltage VLOG corresponding to the logarithm of the photocurrent flowing through the FET 411 appears at the gate of the FET 411 .
  • the FET 411 converts the photocurrent from the photodiode 371 into the photovoltage VLOG corresponding to the logarithm of the photocurrent.
  • the photovoltage VLOG is output to the subtractor 383 via the buffer 382 from the connection point between the gate of the FET 411 and the drain of the FET 413 .
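The subthreshold behavior described above can be sketched numerically. The model below is an idealized illustration (the dark-current, slope-factor, and thermal-voltage parameters are assumed example values, not figures from the patent); it shows the defining property of a logarithmic front end, namely that a fixed ratio of photocurrents maps to a fixed voltage step regardless of absolute light level:

```python
import math

def log_photovoltage(i_photo, i_dark=1e-12, n=1.0, v_t=0.026):
    """Idealized logarithmic photoreceptor model.

    A FET biased in the subthreshold region develops a gate voltage
    proportional to the logarithm of the current flowing through it,
    so the photovoltage VLOG tracks log(photocurrent).  The parameters
    i_dark, n and v_t are illustrative assumptions.
    """
    return n * v_t * math.log(i_photo / i_dark)

# Doubling the photocurrent adds the same voltage step whether the
# scene is dim (nA) or bright (uA) -- the hallmark of a log front end.
step_low  = log_photovoltage(2e-9) - log_photovoltage(1e-9)
step_high = log_photovoltage(2e-6) - log_photovoltage(1e-6)
```

This compressive response is what lets an event sensor react to relative (contrast) changes over a very wide dynamic range.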
  • the subtractor 383 calculates the difference between the current photovoltage and the photovoltage at a timing that differs from the present by a minute time, and outputs a difference signal Vdiff corresponding to that difference.
  • the subtractor 383 includes a capacitor 431, an operational amplifier 432, a capacitor 433, and a switch 434.
  • the quantizer 384 comprises comparators 451 and 452.
  • one end of the capacitor 431 is connected to the output of the buffer 382, and the other end is connected to the input terminal of the operational amplifier 432. Therefore, the inverting input terminal (-) of the operational amplifier 432 receives the photovoltage VLOG through the capacitor 431.
  • the output terminal of the operational amplifier 432 is connected to the non-inverting input terminals (+) of the comparators 451 and 452 of the quantizer 384.
  • one end of the capacitor 433 is connected to the input terminal of the operational amplifier 432, and the other end is connected to the output terminal of the operational amplifier 432.
  • a switch 434 is connected to the capacitor 433 so as to turn on/off the connection across the capacitor 433 .
  • the switch 434 turns on/off the connection between both ends of the capacitor 433 by turning on/off according to the control signal of the driving section 342 .
  • the capacitor 433 and the switch 434 constitute a switched capacitor. By temporarily turning on the normally-off switch 434 and then turning it off again, the capacitor 433 is discharged and reset to a state in which it can store new charge.
  • let Vinit be the photovoltage VLOG on the photodiode 371 side of the capacitor 431 when the switch 434 is turned on, and let C1 be the capacitance (electrostatic capacitance) of the capacitor 431.
  • the input terminal of the operational amplifier 432 is a virtual ground, so the charge Qinit accumulated in the capacitor 431 while the switch 434 is on is expressed by equation (5):

    Qinit = C1 × Vinit   (5)

  • while the switch 434 is on, both ends of the capacitor 433 are short-circuited, so the charge accumulated in the capacitor 433 is zero.
  • when the switch 434 is subsequently turned off and the photovoltage changes to Vafter, the charge Qafter accumulated in the capacitor 431 is expressed by equation (6):

    Qafter = C1 × Vafter   (6)

  • letting C2 be the capacitance of the capacitor 433 and Vdiff be the difference signal, which is the output voltage of the operational amplifier 432, the charge Q2 accumulated in the capacitor 433 is expressed by equation (7):

    Q2 = -C2 × Vdiff   (7)

  • since the total charge at the virtual-ground node is conserved (Qinit = Qafter + Q2), the difference signal is given by equation (8):

    Vdiff = -(C1/C2) × (Vafter - Vinit)   (8)

  • in this way, the subtractor 383 subtracts the photovoltages Vafter and Vinit, that is, calculates the difference signal Vdiff corresponding to the difference (Vafter - Vinit).
  • the gain of the subtraction by the subtractor 383 is C1/C2. Therefore, the subtractor 383 outputs, as the difference signal Vdiff, a voltage obtained by multiplying the change in the photovoltage VLOG after the capacitor 433 is reset by C1/C2.
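The subtraction behavior described above (gain C1/C2 acting on the change in photovoltage since the last reset) can be checked with a small idealized model; the capacitance values below are illustrative assumptions, not component values from the patent:

```python
class SwitchedCapSubtractor:
    """Idealized model of the switched-capacitor subtractor 383.

    The ratio C1/C2 sets the gain applied to the change in photovoltage
    after the reset (switch 434 momentarily on).  Parasitics and
    amplifier non-idealities are ignored.
    """
    def __init__(self, c1=20e-12, c2=1e-12):
        self.c1, self.c2 = c1, c2
        self.v_init = 0.0

    def reset(self, v_log):
        # Switch 434 on: C2 is shorted and C1 samples the photovoltage.
        self.v_init = v_log

    def output(self, v_log):
        # Switch 434 off: Vdiff = -(C1/C2) * (Vafter - Vinit)
        return -(self.c1 / self.c2) * (v_log - self.v_init)

sub = SwitchedCapSubtractor(c1=20e-12, c2=1e-12)
sub.reset(0.50)            # photovoltage at the moment of reset (V)
vdiff = sub.output(0.51)   # photovoltage rose by 10 mV afterwards
```

With C1/C2 = 20, a 10 mV rise in VLOG produces a -200 mV difference signal, which is what the quantizer then compares against its thresholds.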
  • the subtractor 383 outputs the difference signal Vdiff when the switch 434 is turned on and off by the control signal output by the driving section 342.
  • the difference signal Vdiff output from the subtractor 383 is supplied to the non-inverting input terminals (+) of the comparators 451 and 452 of the quantizer 384.
  • the comparator 451 compares the difference signal Vdiff from the subtractor 383 with the + side threshold Vrefp input to its inverting input terminal (-).
  • the comparator 451 outputs an H (High) level or L (Low) level detection signal DET(+), indicating whether or not the + side threshold Vrefp is exceeded, to the transfer circuit 385 as the quantized value of the difference signal Vdiff.
  • the comparator 452 compares the difference signal Vdiff from the subtractor 383 with the - side threshold Vrefn input to its inverting input terminal (-).
  • the comparator 452 outputs an H (High) level or L (Low) level detection signal DET(-), indicating whether or not the - side threshold Vrefn is exceeded, to the transfer circuit 385 as the quantized value of the difference signal Vdiff.
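Functionally, the two-comparator quantizer can be modeled as below. The threshold values, and the convention that DET(-) goes H when Vdiff falls below the - side threshold (an event of negative polarity), are illustrative assumptions rather than details taken from the patent:

```python
def quantize(vdiff, vrefp=0.2, vrefn=-0.2):
    """Sketch of the two-comparator quantizer 384.

    Returns (DET_plus, DET_minus): 'H' on DET(+) when Vdiff exceeds the
    + side threshold Vrefp, 'H' on DET(-) when Vdiff falls below the
    - side threshold Vrefn; 'L' otherwise.  Thresholds are assumed values.
    """
    det_plus  = 'H' if vdiff > vrefp else 'L'
    det_minus = 'H' if vdiff < vrefn else 'L'
    return det_plus, det_minus
```

In this model, a large positive difference signal yields ('H', 'L') (a positive event), a large negative one yields ('L', 'H') (a negative event), and a small change yields ('L', 'L') (no event).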
  • FIG. 29 shows a more detailed circuit configuration example of the current-voltage conversion circuit 381, the buffer 382, the subtractor 383, and the quantizer 384 shown in FIG. 28.
  • FIG. 30 is a circuit diagram showing another configuration example of the quantizer 384.
  • the quantizer 384 shown in FIG. 29 always compares the difference signal Vdiff from the subtractor 383 with both the + side threshold (voltage) Vrefp and the - side threshold (voltage) Vrefn and outputs both comparison results.
  • in contrast, the quantizer 384 of FIG. 30 includes one comparator 453 and a switch 454, and outputs the result of comparison with one of two thresholds (voltages), VthON or VthOFF, selected by the switch 454.
  • the switch 454 is connected to the inverting input terminal (-) of the comparator 453 and selects the terminal a or b according to the control signal from the driving section 342.
  • a voltage VthON as a threshold is supplied to the terminal a, and a voltage VthOFF (< VthON) as a threshold is supplied to the terminal b. Therefore, the voltage VthON or VthOFF is supplied to the inverting input terminal of the comparator 453.
  • the comparator 453 compares the difference signal Vdiff from the subtractor 383 with the voltage VthON or VthOFF, and outputs an H-level or L-level detection signal DET representing the comparison result to the transfer circuit 385 as the quantized value of the difference signal Vdiff.
  • FIG. 31 shows a more detailed circuit configuration example of the current-voltage conversion circuit 381, buffer 382, subtractor 383, and quantizer 384 when the quantizer 384 shown in FIG. 30 is employed.
  • a terminal VAZ for initialization (AutoZero) is added as a terminal of the switch 454 in addition to the voltage VthON and the voltage VthOFF.
  • the switch 454 of the quantizer 384 selects the terminal VAZ at the timing when the H (High) level initialization signal AZ is supplied to the gate of the FET 471, an N-type MOS (NMOS) FET in the subtractor 383, and the initialization (AutoZero) operation is performed.
  • thereafter, the switch 454 selects the terminal of the voltage VthON or the voltage VthOFF based on the control signal from the driving section 342, and the detection signal DET representing the comparison result with the selected threshold is output from the quantizer 384 to the transfer circuit 385.
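The single-comparator arrangement of FIG. 30 can likewise be modeled in a few lines. The threshold values and the select/compare interface below are illustrative assumptions; the point is that one comparator serves both polarities by time-multiplexing the threshold through the switch 454:

```python
class SwitchedThresholdQuantizer:
    """Sketch of the single-comparator quantizer of FIG. 30.

    The switch 454 feeds one of two thresholds to the comparator 453:
    terminal 'a' supplies VthON and terminal 'b' supplies VthOFF.
    Threshold values are assumed for illustration.
    """
    def __init__(self, vth_on=0.2, vth_off=-0.2):
        self.vth_on, self.vth_off = vth_on, vth_off
        self.selected = 'a'   # start by checking against VthON

    def select(self, terminal):
        # Driving section 342 switches which threshold is examined.
        self.selected = terminal

    def compare(self, vdiff):
        # Comparator 453: H when Vdiff exceeds the selected threshold.
        vth = self.vth_on if self.selected == 'a' else self.vth_off
        return 'H' if vdiff > vth else 'L'
```

Compared with the two-comparator version, this halves the comparator count at the cost of checking only one polarity at a time.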
  • by using the imaging device 311 described above, the light-receiving sensor 31 in FIG. 1 can output, as event data, temporal changes in the electrical signal obtained by photoelectrically converting the optical signal.
  • FIG. 32 is a block diagram showing a hardware configuration example of a computer as a signal processing device that executes each process of the above-described transmitting device 11 or receiving device 12 by a program.
  • in the computer, a CPU (Central Processing Unit) 501, a ROM (Read Only Memory) 502, and a RAM (Random Access Memory) 503 are interconnected by a bus 504.
  • An input/output interface 505 is further connected to the bus 504 .
  • An input unit 506 , an output unit 507 , a storage unit 508 , a communication unit 509 and a drive 510 are connected to the input/output interface 505 .
  • the input unit 506 consists of a keyboard, mouse, microphone, touch panel, input terminals, and the like.
  • the output unit 507 includes a display, a speaker, an output terminal, and the like.
  • a storage unit 508 includes a hard disk, a RAM disk, a nonvolatile memory, or the like.
  • a communication unit 509 includes a network interface and the like.
  • a drive 510 drives a removable recording medium 511 such as a magnetic disk, optical disk, magneto-optical disk, or semiconductor memory.
  • the CPU 501 loads, for example, a program stored in the storage unit 508 into the RAM 503 via the input/output interface 505 and the bus 504 and executes it, whereby the above-described series of processes is performed.
  • the RAM 503 also appropriately stores data necessary for the CPU 501 to execute various processes.
  • the program executed by the computer (CPU 501) can be provided by being recorded on a removable recording medium 511 such as a package medium, for example. The program can also be provided via a wired or wireless transmission medium such as a local area network, the Internet, or digital satellite broadcasting.
  • the program can be installed in the storage unit 508 via the input/output interface 505 by loading the removable recording medium 511 into the drive 510 . Also, the program can be received by the communication unit 509 and installed in the storage unit 508 via a wired or wireless transmission medium. In addition, programs can be installed in the ROM 502 and the storage unit 508 in advance.
  • the program executed by the computer may be a program whose processing is performed in chronological order following the sequence described in this specification, or a program whose processing is performed in parallel or at necessary timing, such as when a call is made.
  • Embodiments of the present technology are not limited to the above-described embodiments, and various modifications are possible without departing from the gist of the present technology.
  • each step described in the flowchart above can be executed by a single device, or can be shared by a plurality of devices.
  • when one step includes multiple processes, the multiple processes included in that one step can be executed by one device or shared among multiple devices.
  • note that the present technology can take the following configurations.
  • (1) A receiving device including: an event sensor that receives an optical signal in which each of "transmission start", "0", and "1" transmitted as transmission information is represented by a predetermined blinking interval, and outputs a temporal change of the optical signal as event data; and a demodulation unit that demodulates the optical signal based on the event data to acquire the transmission information.
  • (2) The receiving device according to (1), wherein the demodulation unit demodulates the transmission information into "transmission start", "0", or "1" based on the result of integrating the event data in a predetermined integration period.
  • (3) The receiving device according to (1) or (2) above, wherein the code lengths of the optical signals of "0" and "1" are not in an integral-multiple relationship.
  • (4) The receiving device according to any one of (1) to (3), wherein the difference in code length between the optical signals of "0" and "1" is configured to be a time difference greater than response variation.
  • (5) The receiving device according to any one of (1) to (4), wherein the code lengths of the optical signals of "0" and "1" are composed of periods of a frequency different from the flicker frequency of ambient light.
  • (6) The receiving device according to any one of (1) to (5), wherein the event sensor outputs, as the event data, the two-dimensional position where the temporal change of the optical signal is detected and the polarity of the temporal change of the optical signal.
  • The receiving device according to (9) above, wherein the demodulation unit detects a predetermined blinking timing of the optical signal by detecting an accumulation period in which the accumulated number of event data is greater than a predetermined detection threshold, and demodulates it into data of "0" or "1".
  • The receiving device wherein the demodulation unit determines an error when the detected predetermined blinking timing does not correspond to the time difference for data "0" or "1".
  • The receiving device wherein the predetermined detection threshold is dynamically set according to the accumulated number of the event data for each area.
  • (13) The receiving device according to any one of (9) to (12), wherein the demodulation unit calculates the representative coordinates of the optical signal based on the integrated number of the event data for each region, and outputs the demodulated "0" or "1" data together with the representative coordinates.
  • (14) The receiving device according to (13), wherein the demodulation unit calculates horizontal-row and vertical-column integrated values of the region, and calculates the above-mentioned representative coordinates from them.
  • (15) The receiving device according to any one of (1) to (14), wherein the event data includes a positive event representing a positive temporal change of the optical signal and a negative event representing a negative temporal change of the optical signal, and the demodulation unit demodulates data into "0" or "1" based on the period from the positive event to the negative event.
  • the event data includes a positive event representing a positive temporal change of the optical signal and a negative event representing a negative temporal change of the optical signal,
  • the wavelength of the optical signal differs depending on the type of the transmission information;
  • The receiving device further comprising: an image sensor that captures a subject and generates a two-dimensional image; and an integration unit that integrates the two-dimensional image generated by the image sensor and the transmission information demodulated by the demodulation unit.
  • The receiving device wherein the event sensor receives, as the transmission information, the optical signal corresponding to identification information or position information of a light emitting unit that outputs the optical signal; the demodulation unit demodulates and acquires the identification information or position information of the light emitting unit; and the receiving device further comprises a self-position estimating unit that estimates the self-position based on the demodulated transmission information.
  • A transmission/reception system comprising a transmission device and a receiving device, wherein the transmission device transmits, as transmission information, an optical signal representing each of "transmission start", "0", and "1" at predetermined blinking intervals, and the receiving device includes: an event sensor that receives the optical signal and outputs a temporal change of the optical signal as event data; and a demodulation unit that demodulates the optical signal based on the event data to acquire the transmission information.
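As a rough, purely illustrative sketch of the demodulation scheme described in the configurations above (decoding "0"/"1" from the period between a positive event, light on, and the following negative event, light off), the symbol pulse widths and tolerance below are assumed values, not those of the patent:

```python
# Hypothetical pulse widths (seconds) for each symbol; the configurations
# above only require that they be distinct, non-integer-multiple intervals.
SYMBOLS = {0.001: 'start', 0.002: '0', 0.003: '1'}
TOLERANCE = 0.0003

def demodulate(events):
    """Decode a stream of (timestamp, polarity) events, polarity +1/-1.

    Measures the interval from each positive event to the following
    negative event and maps it to a symbol; intervals matching no
    symbol are reported as 'error', as in the error-detection
    configuration above.
    """
    out = []
    on_time = None
    for t, pol in events:
        if pol > 0:
            on_time = t                     # light turned on
        elif on_time is not None:
            width = t - on_time             # light turned off: pulse width
            for ref, sym in SYMBOLS.items():
                if abs(width - ref) < TOLERANCE:
                    out.append(sym)
                    break
            else:
                out.append('error')         # interval matches no symbol
            on_time = None
    return out

msg = demodulate([(0.000, +1), (0.001, -1),    # 'start' interval
                  (0.010, +1), (0.013, -1),    # '1' interval
                  (0.020, +1), (0.022, -1)])   # '0' interval
```

A real receiver would additionally integrate events per region and track representative coordinates of the blinking light source; this sketch covers only the per-source interval decoding.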

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Optical Communication System (AREA)

Abstract

The present invention relates to a receiving device and a transmission/reception system that enable optical communication in which transmission information can be acquired at high speed and the light-emitting position can also be acquired. The receiving device comprises: an event sensor that receives an optical signal in which each of "transmission start", "0", and "1" transmitted as transmission information is represented by a prescribed blinking interval, and outputs temporal changes in the optical signal as event data; and a demodulation unit that demodulates the optical signal based on the event data so as to acquire the transmission information. The present invention can be applied, for example, to a transmission/reception system or the like that transmits prescribed characters, symbols, images, sounds, etc., as transmission information using an optical signal.
PCT/JP2022/004582 2021-05-31 2022-02-07 Dispositif de réception et système de transmission/réception WO2022254789A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021091304 2021-05-31
JP2021-091304 2021-05-31

Publications (1)

Publication Number Publication Date
WO2022254789A1 true WO2022254789A1 (fr) 2022-12-08

Family

ID=84324100

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/004582 WO2022254789A1 (fr) 2021-05-31 2022-02-07 Dispositif de réception et système de transmission/réception

Country Status (1)

Country Link
WO (1) WO2022254789A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230142456A1 (en) * 2016-10-21 2023-05-11 Panasonic Intellectual Property Corporation Of America Transmission device, reception device, communication system, transmission method, reception method, and communication method
WO2024018920A1 (fr) * 2022-07-21 2024-01-25 ソニーセミコンダクタソリューションズ株式会社 Dispositif de traitement d'informations et procédé de traitement d'informations

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10221468A (ja) * 1997-02-04 1998-08-21 Nec Aerospace Syst Ltd イベント測定装置
JPH11122186A (ja) * 1997-10-14 1999-04-30 Sharp Corp ワイヤレス光通信システム
JP2003132478A (ja) * 2001-10-22 2003-05-09 Tokyo Gas Co Ltd ガスメータ
JP2004242209A (ja) * 2003-02-07 2004-08-26 Victor Co Of Japan Ltd 光無線伝送システム及び光無線伝送装置
JP2011023819A (ja) * 2009-07-13 2011-02-03 Casio Computer Co Ltd 撮像装置、撮像方法及びプログラム
JP2013009072A (ja) * 2011-06-23 2013-01-10 Casio Comput Co Ltd 情報伝送システム、受光装置、情報伝送方法、及び、プログラム
JP2018509847A (ja) * 2015-03-16 2018-04-05 ユニヴェルシテ・ピエール・エ・マリ・キュリ・(パリ・6) 非同期信号を処理するための方法


Similar Documents

Publication Publication Date Title
WO2022254789A1 (fr) Dispositif de réception et système de transmission/réception
TWI722283B (zh) 多工高動態範圍影像
US10193627B1 (en) Detection of visible light communication sources over a high dynamic range
EP3922007B1 (fr) Systèmes et procédés d'imagerie numérique faisant appel à des imageurs de pixels informatiques avec de multiples compteurs de pixels intégrés
US20070171298A1 (en) Image capturing element, image capturing apparatus, image capturing method, image capturing system, and image processing apparatus
KR20070045126A (ko) 이미지 신호 처리 방법 및 장치
CN103365481B (zh) 投影系统及其自动校正方法
CN112449130A (zh) 具有闪烁分析电路的事件传感器
JP4830270B2 (ja) 固体撮像装置および固体撮像装置の信号処理方法
US20210327090A1 (en) Sensor calibration system, display control apparatus, program, and sensor calibration method
US10582122B2 (en) Image processing apparatus, image processing method, and image pickup apparatus
CN111491109B (zh) 光检测芯片、图像处理装置及其运作方法
US11074715B2 (en) Image processing apparatus and method
US9531968B2 (en) Imagers having image processing circuitry with error detection capabilities
US9819853B2 (en) Imaging device and focusing control method
US20190014281A1 (en) Solid-state image sensor, image capturing apparatus and image capturing method
US9648222B2 (en) Image capturing apparatus that controls supply of amperage when reading out pixel signals and method for controlling the same
US20200358959A1 (en) Imaging device and signal processing device
US9560287B2 (en) Noise level based exposure time control for sequential subimages
US9179084B2 (en) Solid-state imaging device
US20230177713A1 (en) Information processing apparatus, information processing method, and program
US10694125B2 (en) Solid-state imaging element, method of operating solid-state imaging element, imaging apparatus, and electronic device
WO2019144262A1 (fr) Procédé et appareil de détection de tache, et dispositif électronique mobile
KR102216505B1 (ko) 안면 추적 카메라 모듈 및 방법
US9134812B2 (en) Image positioning method and interactive imaging system using the same

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22815548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22815548

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP