WO2022008989A1 - Multi-domain optical sensor - Google Patents
Multi-domain optical sensor
- Publication number
- WO2022008989A1 (PCT/IB2021/054262)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- pixels
- signal
- sensing
- data
- doppler
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/491—Details of non-pulse systems
- G01S7/4912—Receivers
- G01S7/4917—Receivers superposing optical signals in a photodetector, e.g. optical heterodyne detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/10—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves
- G01S17/26—Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves wherein the transmitted pulses use a frequency-modulated or phase-modulated carrier wave, e.g. for pulse compression of received signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
- G01S17/06—Systems determining position data of a target
- G01S17/08—Systems determining position data of a target for measuring distance only
- G01S17/32—Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
- G01S17/34—Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated using transmission of continuous, frequency-modulated waves while heterodyning the received signal, or a signal derived therefrom, with a locally-generated signal related to the contemporaneously transmitted signal
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/93—Lidar systems specially adapted for specific applications for anti-collision purposes
- G01S17/931—Lidar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/481—Constructional features, e.g. arrangements of optical elements
- G01S7/4811—Constructional features, e.g. arrangements of optical elements common to transmitter and receiver
Abstract
In many applications, such as autonomous vehicles and ADAS (advanced driver assistance systems), LIDAR and camera sensors play important and complementary roles in sensing the surroundings. The scanless LIDAR sensor-on-chip architecture disclosed in this application is suitable for building a LIDAR and a camera sensor on a single chip sharing one set of optics. The combined FMCW Doppler LIDAR and camera sensor inherently works as one, jointly sensing all directions in the field of view simultaneously and in parallel, without mechanical, electronic or photonic scanning; no extra effort is needed to align the two sensors, and the multiple domains of sensed information are inherently fused. The combined optical sensor provides object sensing information in multiple domains: angle of view, distance, relative velocity, color (as a red-green-blue vector) and light strength. It also improves on sequential transmission of the massive sensed data by transferring the data according to priority.
Description
This invention relates generally to optical sensors, and in particular to LIDAR and camera sensors.
LIDAR (light detection and ranging) devices are viewed as a major sensing means in an ADAS (advanced driver assistance system) of a vehicle, as well as in the driving control system of an autonomous vehicle. In order to "see" objects in various directions around the LIDAR receiver device, and to determine the directions of those objects, the LIDAR system may use mechanical means to scan across directions, e.g. the continuously rotating platform used in prior art patent US8,836,922. It is known that mechanical scanning parts, especially continuously moving ones, are subject to failures, with shorter mean time to failure (MTTF) and higher costs.
CW (continuous wave) and FMCW (frequency modulated continuous wave) Doppler LIDAR, as disclosed in prior patent US6,697,148, is a powerful sensing tool for applications such as ADAS and autonomous vehicles; however, it performs speed and/or distance measurements in a single direction at a time. To sense objects in various directions, it may still have to use scanning means such as a rotating mirror or other mechanically moving aiming means.
Furthermore, in time-critical applications, not all directions of LIDAR sensing data are equally urgent. Sequentially scanning over the field of view and sequentially transferring the data is generally non-optimal and negatively affects detection and response time.
On the other hand, camera sensors are able to sense objects by color, are good at identifying traffic signs, and have other advantages in daylight. Using the two types of optical sensors (LIDAR and camera) separately requires extra effort to align their angles of view, takes more installation space, and duplicates optics.
There is a need for a new optical sensor that performs CW and/or FMCW Doppler detection and ranging in many or all directions of interest, without mechanically moving parts or indeed any form of scanning; that differentiates sensing data obtained from various directions and conveys the most urgent data with higher priority; and that combines LIDAR and camera sensor functions in one device to gain advantages in cost and combined sensing performance.
Mechanical scanning parts of a LIDAR, especially continuously moving ones, are subject to failures, with shorter mean time to failure (MTTF) and higher costs. LIDARs using other types of scanning, such as MEMS, OPA (optical phased array) or Rotman-lens scanning, detect objects over discontinuous periods of time: detection of any sudden change is delayed by the gap between scanning-beam revisits. When LIDAR sensing information and camera sensing information are obtained by two separate devices, the cost is high and extra fusion effort is needed. Further, LIDAR and camera sensing information across a field of view is not equally urgent; in time-critical applications, treating it all equally and transferring it sequentially may miss valuable action time and cause serious consequences.
In one aspect, the invention provides an embodiment of an optical sensor, comprising: an array of pixels; and an interface module, coupled with the pixels, for conveying sensing results outside the sensor chip; wherein at least some of the pixels are Doppler sensing pixels, comprising a photo-detector and at least one mixer, for detecting a modulated light signal, mixing the detected signal with at least one LO (local oscillator) signal, and producing at least one mixing product signal; or comprising an LO (local oscillator)-pumped photo-detector, for detecting a modulated light signal, and mixing the detected signal with at least one LO signal to produce at least one mixing product signal.
In another aspect, at least one embodiment of the invention provides a Doppler LIDAR receiver, comprising: a Doppler sensor having an array of Doppler sensing pixels, for producing at least one Doppler sensing signal; an optical scope, optically coupled with the Doppler sensor, for producing an optical image of objects on the Doppler sensor; and a digital signal processor module, coupled with the Doppler sensor, for processing the at least one Doppler sensing signal; whereby each pixel in the array of Doppler sensing pixels is operable to detect a modulated light signal from or associated with an object under detection, mix the detected signal with a local replica signal, and produce at least one of the Doppler sensing signals; and whereby the digital signal processor module is operable, based on at least one of the Doppler sensing signals produced by a pixel and the address of said pixel in the array, to report at least one of: a relative moving speed of said object and its direction of movement, in terms of approaching or leaving said Doppler LIDAR receiver; a distance between said object and said Doppler LIDAR receiver; and a direction of said object in the field of view relative to said Doppler LIDAR receiver.
In yet another aspect, at least one embodiment of the invention provides a method for effectively conveying data from pixels of an optical sensor device to a processor, comprising steps of: sending initial data produced by a sample set of pixels to a processor; associating, with each of the pixels by the processor, a table of parameters among a number of priority categories for conveying data in a next period; queuing, for each of the categories, the data produced by pixels into queuing buffers according to the parameters for the corresponding category that a pixel is associated with; multiplexing, according to priority and scheduling parameters in the table, the queued data from the buffers into a transmission channel; if the transmission channel cannot convey all queued data in the buffers according to the scheduling parameters specified in the table, discarding some most stale data from the lowest priority buffers; and repeating the associating, queuing, multiplexing and discarding steps above in another next period.
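The associating, queuing, multiplexing and discarding steps above can be sketched in software. The sketch below is illustrative only; the class and parameter names (`PriorityConveyor`, `max_buffer`, etc.) are hypothetical and not taken from the claims:

```python
from collections import deque


class PriorityConveyor:
    """Illustrative sketch of the priority-based conveying loop.

    One FIFO buffer per priority category (0 = highest priority);
    a per-pixel priority table is supplied by the processor as feedback.
    """

    def __init__(self, num_priorities=4, max_buffer=64):
        self.buffers = [deque() for _ in range(num_priorities)]
        self.max_buffer = max_buffer
        # priority_table[pixel_address] -> priority category
        self.priority_table = {}

    def associate(self, feedback_table):
        """Adopt the per-pixel priority feedback table from the processor."""
        self.priority_table.update(feedback_table)

    def enqueue(self, pixel_address, data):
        """Queue pixel data into the buffer of its assigned category;
        unknown pixels default to the lowest priority."""
        cat = self.priority_table.get(pixel_address, len(self.buffers) - 1)
        self.buffers[cat].append((pixel_address, data))

    def multiplex(self, channel_capacity):
        """Drain buffers highest-priority first, up to channel capacity,
        then discard the most stale (oldest) data from buffers whose
        backlog exceeds the allowed maximum."""
        sent = []
        for buf in self.buffers:
            while buf and len(sent) < channel_capacity:
                sent.append(buf.popleft())
        for buf in reversed(self.buffers):  # lowest priority first
            while len(buf) > self.max_buffer:
                buf.popleft()  # oldest entries are the most stale
        return sent
```

The discard policy here (oldest entries of over-full low-priority buffers) is one possible reading of "discarding some most stale data from the lowest priority buffers"; a real implementation would tie the capacities to the scheduling parameters in the feedback table.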
In further yet another aspect, the invention provides an optical sensing apparatus for detecting a shift in frequency in an amplitude envelope waveform of an optical signal, comprising at least one of: at least one avalanche photodiode (APD), at least one single-photon avalanche diode (SPAD), and at least one photo-sensing device whose optical to electrical conversion rate depends on an instantaneous bias voltage; wherein said APD, SPAD or photo-sensing device is biased by a time-varying voltage based, at least in part, on a replica signal of the amplitude envelope waveform.
Other aspects of the invention will become clear thereafter in the detailed description of the preferred embodiments and the claims.
The disclosed scanless Doppler sensing LIDAR will provide a longer MTTF and lower costs than mechanical scanning LIDAR; its pixel-wise parallel sensing provides faster detection than beam-steering scanning LIDARs and makes full use of processing gain in detection time. The disclosed LIDAR and camera combo sensor device senses more domains of information jointly and inherently in fusion, at further reduced cost. The priority-based sensing data transfer feature enables more timely sensing in time-critical applications.
For a better understanding of the invention and to show more clearly how it may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings which illustrate distinctive features of at least one exemplary embodiment of the invention, in which:
It will be appreciated that in the description herein, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those of ordinary skill in the art that the invention may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the invention. Furthermore, this description is not to be considered as limiting the scope of the invention, but rather as merely providing a particular preferred working embodiment thereof.
By way of example, an accompanying figure illustrates a functional diagram of a Doppler LIDAR receiver device using a LIDAR sensor with direct Doppler sensing pixels. In the figure, objects 101 illuminated by a modulated light source (such as disclosed in patent US6,697,148), or lights emitted by modulated light beacons 101 (such as disclosed in patent application US16917805) attached to the objects being sensed, are sensed by at least one LIDAR receiver device 100 in an application field, comprising a Doppler sensing unit 102, a digital signal processor (DSP) module 60, an interface module 50 and interconnect wires 40. The Doppler sensing unit 102 includes: a housing 10, which may be designed in different shapes to hold the components of the Doppler sensing unit and be mounted on a platform using the sensing unit, such as a car; a LIDAR sensor chip 20, which contains an array of Doppler sensing pixels, e.g. the pixel 70 as one of them, explained in more detail hereinafter; and an optical scope 30, which may be as simple as a lens as shown in the drawing, or a pinhole (including a pinhole-equivalent structure), or may be more complex, including a plurality of lenses, optical filter(s), aperture control means, focal adjustment means and zoom adjustment means, and may further include mirrors and optical fiber or light guide, etc.
(not shown in drawing). The modulated light signals from objects 101, either reflected by the surfaces of the objects or directly emitted from light beacon devices, project their images 103 onto the pixels on the LIDAR sensor chip 20, where they are Doppler-sensed by individual pixels, e.g. the pixel 70 as one of them, as will be explained hereinafter. On the chip, a portion of the semiconductor area implements an interface circuit (not shown in drawing) to collect the Doppler sensing output signals from the pixels on chip and pass them through the wires 40 to the mating interface module 50 for further processing at DSP 60. As can be seen in the figure, the direction information of individual objects, as long as they are within the scope of view, is indicated by the positions of pixels in the array on chip, electronically represented by their addresses (position indices of pixels), without the need for scanning by any moving mechanical means. The pixel address carries important information about the direction of objects relative to the Doppler sensing unit 102. In certain applications, it may be desirable to place the pixels on chip unevenly so as to optimize the resolution of direction detection and/or compensate for deformation caused by the optics. The physical shape of individual pixels does not have to be square or rectangular; other shapes may be used to optimize or trade off performance and/or cost. The drawing is not to scale.
The area containing the Doppler sensing pixel array on sensor chip 20 does not have to be rectangular; in some application scenarios, other shapes may be preferred. Another figure illustrates an exemplary embodiment of an omnidirectional Doppler sensing unit 102, in which the sensing unit structure is supported by a housing 10 whose upper portion is transparent to allow light signals to come in from all 360 degrees around the horizontal plane and nearly the entire lower hemisphere; a lens or a set of lenses 30 focuses the images on the sensing pixels of the Doppler sensor chip 20; and a specially designed convex mirror 310, which may be built according to what patent US6,744,569 teaches, reflects light signals of objects from all directions around the horizontal plane and lower hemisphere onto the sensor pixels on chip 20. As will be appreciated, the effective sensing area of the pixel array may preferably be shaped as a ring plate, and through DSP means the images sensed by the pixel array on the ring-plate-shaped area may be electronically reconstructed into a Doppler sensing panorama, superimposed onto a visible-light panorama as needed for human viewing. For machine sensing and autonomous driving purposes, reconstructing a panorama might not be necessary, as driving decisions may be made just as conveniently from the sensed information "flattened" on a ring-shaped plate. The drawing is conceptual and not drawn to scale. The center part and corners of the silicon not used for building pixels may be used to build supplementary circuits for the chip, e.g. the interface, LO buffering and distribution, and power regulation and distribution, which will not be elaborated herein.
For a pixel array of rectangular shape on the sensor chip, individual pixels may be evenly placed on a grid of Cartesian coordinates, parallel to the edges, and the addresses of the pixels numbered accordingly to represent the direction of sensed objects. For pixel arrays of circular or ring shape, the individual pixels may be placed along polar coordinates, e.g. spaced by equal polar angle and radial length, to reflect equal angular step sizes of incoming light from objects that form images at the pixel positions. Since in some applications not all directions are equally important, multiple zones of the pixel array may be implemented with different pixel densities. Unevenly spaced pixels may also be implemented to correct optical deformation.
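As a rough illustration of how a pixel address encodes direction, a linear pinhole-style mapping for a rectangular array and an equal-angle mapping for a ring-shaped array might look as follows. Both mappings and all names are assumptions for illustration; real optics would require calibration for the deformations mentioned above:

```python
def cartesian_pixel_direction(row, col, rows, cols, fov_h_deg, fov_v_deg):
    """Map a pixel address on an evenly spaced rectangular array to an
    approximate (azimuth, elevation) in degrees within the field of view.
    A simple linear pinhole-style mapping is assumed for illustration."""
    az = (col + 0.5) / cols * fov_h_deg - fov_h_deg / 2.0
    el = (row + 0.5) / rows * fov_v_deg - fov_v_deg / 2.0
    return az, el


def polar_pixel_direction(sector, sectors):
    """Map the angular index of a pixel on a ring-shaped array (as in the
    omnidirectional embodiment) to an azimuth in degrees; the radial ring
    index would map to elevation via the mirror geometry (not modeled here)."""
    return (sector + 0.5) / sectors * 360.0
```

A lookup table computed this way (or measured during calibration) is all the DSP needs to turn a pixel address into a direction of view, with no scanning involved.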
In an alternative embodiment, the convex mirror(s) and the scope(s) in embodiments of and may be replaced by a fish-eye optical scope and achieve nearly 180 degree view of a hemisphere, and two of such structure together will be able to perform Doppler sensing substantially in both upper and lower hemispheres. shows an example of such omnidirectional Doppler sensing unit, in which the two Doppler sensors 20 and 20’are placed on upper and lower side of a drone, and fish-eye scopes 610 and 610’ are installed in front of the sensor chips to project omnidirectional image onto the Doppler sensors 20 and 20’, one to sense the lower hemisphere and the other to sense the upper hemisphere (The drawing is not drawn to scale).
The amount of Doppler sensing data to be transferred out of the sensor chip 20 depends on 1) the total number of pixels; and 2) the maximum bandwidth of the mixing product signals, which is proportional to the maximum Doppler shift of concern in the application and, in embodiments using an FMCW modulating signal, also depends on the FM sweep rate and the maximum design range. If the data interface is able to convey digitized data from all pixels, then the chip may simply pass the mixing product signals through an anti-aliasing filter (not shown in the drawings), digitize the filtered analog signals with analog-to-digital converters (not shown either), time-multiplex the data in the interface circuits on chip (not shown), and send them out. If the amount of data is too large to pass in full, it is preferable to pre-process the results locally and pass only the output signals from "important" pixels, or to provide a variable amount of data depending on the determined priority of pixels.
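To give a feel for the numbers, a back-of-the-envelope estimate of the raw data rate can use the standard FMCW beat-frequency relation, beat = 2·R·S/c plus the Doppler term, with Nyquist sampling. The relation is textbook FMCW, not a formula quoted from this text, and the parameter values in the usage below are hypothetical:

```python
def raw_data_rate_bps(num_pixels, max_doppler_hz, sweep_rate_hz_per_s,
                      max_range_m, bits_per_sample=12, c=3.0e8):
    """Rough raw data rate if every pixel's mixing product were digitized.

    Beat bandwidth = range term (2*R*S/c) + maximum Doppler shift;
    each pixel is sampled at twice that bandwidth (Nyquist).
    Illustrative only; a real design also budgets for guard bands,
    framing overhead, etc."""
    range_beat_hz = 2.0 * max_range_m * sweep_rate_hz_per_s / c
    bandwidth_hz = range_beat_hz + max_doppler_hz
    sample_rate_hz = 2.0 * bandwidth_hz  # Nyquist rate
    return num_pixels * sample_rate_hz * bits_per_sample
```

For example, 100,000 pixels with a 10 MHz maximum Doppler shift, a 1 THz/s sweep and a 300 m design range would already yield tens of terabits per second of raw data, which is why the priority-based selection described next matters.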
Which pixels are "important", and how does the sensor chip 20 determine this? The answer is application dependent. Take the example of an autonomous vehicle in a "controlled" traffic region, in which all vehicles are equipped with beacon transmitters (e.g., the ones described in patent application US16917805) and all land structures in the region are also marked by such beacons; then the important pixels may be those exposed to beacon signals much stronger than the reflected background signals. The pixels indicating closer distances and positive Doppler shifts (i.e., approaching objects) are the most important, since those objects have a higher potential risk of collision with the vehicle in question. In application scenarios that detect reflected signals, signal strength may not be a reliable measure, as it depends not only on the distance and size of objects but also on their surface properties. In this case, a high positive Doppler shift together with a close distance may be a good criterion for selecting important pixels to output data.
On-chip hot-spot detection is a pre-selection of pixels, and their neighbors, that need to be watched with higher attention, so that their data can be output to the off-chip DSP for further processing. Signal-strength-based selection may use sum-and-dump (integrate and discharge) of the absolute values/magnitudes of the mixing product signals over a given time interval, comparing the results against a threshold. Doppler-shift-based selection may use estimators of frequency or of angular speed (of phase): for example, an estimator based on a frequency counter may use a threshold comparator (preferably with an appropriate amount of hysteresis) to detect and count the number of "sign" changes, during a given time interval, in the mixing product signals from mixers that mix with CW local replicas, or alternatively measure the time needed for a given number of such "sign" changes. In either case, it may choose to count only the "sign" changes in the direction of phase rotation corresponding to positive Doppler shifts (approaching objects). As known in the art, distance may also be determined from frequency information using the FMCW technique. In the selection of important pixels, quick and potentially less accurate processing may be used, relying on the more accurate processing on DSP 60 for the final result.
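One possible realization of the sign-change counter described above is sketched below. The hysteresis handling and the normalization to hertz are choices of this example, not mandated by the text; note that counting only one rotation direction (approaching objects) would require I/Q (complex) mixing products, whereas this real-valued sketch counts all crossings:

```python
def estimate_frequency_sign_changes(samples, sample_rate_hz, hysteresis=0.0):
    """Coarse frequency estimate by counting sign changes (zero crossings)
    of a real-valued mixing product signal over the sampled interval.

    The comparator only flips state once the signal exceeds +/-hysteresis,
    rejecting small noise around zero. Two sign changes per signal cycle."""
    count = 0
    state = None  # +1 once above +hysteresis, -1 once below -hysteresis
    for x in samples:
        if x > hysteresis:
            if state == -1:
                count += 1
            state = 1
        elif x < -hysteresis:
            if state == 1:
                count += 1
            state = -1
    duration_s = len(samples) / sample_rate_hz
    return count / (2.0 * duration_s)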
Alternatively, in another preferred embodiment, since both the distance and the radial velocity of an object can be derived from the frequency information of the pixels' mixing product signals, the pixels may output only the detected frequency values, or a quantity associated with the frequency such as a phase angular rate, instead of the raw mixing product signals. Frequency estimators (or equivalents) may be implemented in the pixels to obtain these values; frequency estimators are well known to those skilled in the art, including the examples in the previous paragraph. In a further alternative embodiment, pixels may output estimated frequency values as the baseline output, and, based on feedback from the external DSP 60, some subset of pixels may be selected to provide raw digitized mixing product signals.
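As a worked example of deriving both quantities from frequency alone, the standard triangular-FMCW relations can be used: the up-chirp and down-chirp beat frequencies separate the range term f_r = 2·R·S/c from the Doppler term f_d = 2·v/λ. These relations are well known in the art and are not quoted from this disclosure; the sign convention below (positive velocity = approaching) is one of several in use:

```python
def fmcw_range_velocity(f_beat_up_hz, f_beat_down_hz,
                        sweep_slope_hz_per_s, wavelength_m=1550e-9, c=3.0e8):
    """Recover range and radial velocity from triangular-FMCW beat
    frequencies (textbook relations, illustrative sign convention).

    f_beat_up   = f_r - f_d   (up-chirp segment)
    f_beat_down = f_r + f_d   (down-chirp segment)
    where f_r = 2*R*S/c and f_d = 2*v/lambda."""
    f_r = (f_beat_up_hz + f_beat_down_hz) / 2.0  # range term
    f_d = (f_beat_down_hz - f_beat_up_hz) / 2.0  # Doppler term
    range_m = c * f_r / (2.0 * sweep_slope_hz_per_s)
    velocity_mps = wavelength_m * f_d / 2.0  # positive = approaching here
    return range_m, velocity_mps
```

This is why per-pixel frequency values (rather than raw waveforms) suffice for the DSP to report distance and relative velocity, drastically cutting the data volume.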
A priority-based data interface protocol is an important feature for a device producing massive sensing data in time-critical, mission-critical and/or consequence-critical applications, such as the example discussed herein: the massively parallel sensing pixels of a LIDAR in autonomous vehicle control. In the following paragraphs, we describe some preferred embodiments of data interface protocols suitable for the LIDAR architecture disclosed in this patent application as well as in the parent applications (application numbers US16926400, US17126623 and US17194389), including the camera sensing data that will be described hereinafter.
In one preferred embodiment, a set of initial sensing data may be conveyed with equal priority, simply conveying the sensing data of all pixels at a low initial update rate (on a simple "frame by frame" basis); alternatively, to quickly get an overall picture, the initial resolution may be reduced, for example conveying only one pixel in every L columns and one in every P rows. The output of each pixel may be transferred as a truncated, limited-length block, with pixels served one by one in sequential order of pixel location address, such as by incrementing the column number within each row and then incrementing the row number. After receiving and processing the initial data, the DSP processor 60 provides a feedback table to the sensor chip 20 in which each pixel is assigned a priority level i, where i = 1, 2, …, N. According to the priority levels assigned in the table, the sensor chip 20 adjusts its data conveying settings, and it continues to receive new feedback tables from the DSP processor 60 and readjust the settings accordingly. Alternatively, the feedback table provided by the DSP processor 60 may contain more parameters than just the N priority levels; for example, it may include sampling rate, block size, update rate, the order in which pixels send data, etc. When the DSP processor 60 detects that the LIDAR orientation is changing, for example when the vehicle is making a turn, the feedback table may provide predicted aiming adjustment parameters. The table may also include additional motion prediction parameters: for example, when a set of pixels corresponds to a brick on the road in the lane being driven along, that set of pixels may currently be assigned a high priority for data transfer, and the table may predict the new set of pixels after the motion and assign them high priority automatically in the next period of time without further feedback.
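The motion-prediction idea in the feedback table can be sketched as a simple image-plane shift of the priority assignments, so that pixels expected to cover a tracked object keep their high priority without waiting for new feedback. Function and argument names below are illustrative, not from the text:

```python
def predict_priority_shift(priority_table, drow, dcol):
    """Shift per-pixel priority assignments by a predicted image-plane
    motion (drow, dcol) for the next period. priority_table maps
    (row, col) pixel addresses to priority levels; the returned table
    carries the same priorities at the predicted positions."""
    return {(r + drow, c + dcol): prio
            for (r, c), prio in priority_table.items()}
```

A fuller version would also clip shifted addresses to the array bounds and merge the predicted table with fresh feedback when it arrives.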
In another preferred embodiment, in addition to determining data conveying parameters based on feedback from the DSP processor 60, the chip may also implement on-chip preprocessing to determine the priority of pixels to convey data from. This reacts more quickly to sudden changes. For example, in a highway driving scenario the LIDAR may be installed on a car following another car driving in the same direction. Feedback from the DSP processor 60 may be very good at setting data priorities for the pixels corresponding to surrounding cars that have been in the field of view for a while, and for the pixels corresponding to open sky, but it may not react quickly enough if a brick on the highway, previously blocked by the car in front, suddenly appears once the front car no longer blocks its view. On-chip processing must quickly detect the sudden change once the brick becomes visible, approaching at high speed, and quickly assign the pixels around the image of the brick a priority possibly even higher than the highest that the feedback table may have assigned. The on-chip processing may be less accurate, and may erroneously assign a high priority when it is not necessary, but the nature of time-, mission- and consequence-critical control cannot afford missing or delaying any data for a truly dangerous event.
In addition to the raw sensing data (e.g., the digitized mixing product signal) or estimated/pre-processed sensing data (e.g., the estimated frequency values of mixing product signals, or quantities associated with the frequency), the data output from each pixel may further include the pixel position index in the array, a timestamp, an estimated level of time criticality, an estimated level of consequence criticality, parameters related to data quality (level of confidence, e.g., signal strength, signal-to-interference ratio, margin to saturation, level of platform vibration, weather visibility, etc.), and the time to the next scheduled data.
According to the level of time criticality or priority, pixel data packets may be queued in a plurality of queuing data buffers, each assigned an update rate that must be met. A scheduler controls a multiplexer to select data among the queuing buffers to send through the transmission channel. The data packet structure may differ among the queuing buffers, e.g. different block lengths, or data of different sampling rates or decimation factors. For example, the pixels corresponding to open sky may be queued in a buffer with a low update rate and a high decimation factor in time and space (number of pixels to skip), while pixels corresponding to or close to object boundaries (e.g., contours of vehicles, pedestrians, curbs, poles, lane lines and other objects) may be queued in a dedicated queue or queues. For certain processing purposes, a set of adjacent pixels may be grouped to combine their mixing product signals into one single output, in effect forming a larger "super pixel"; such sensing data may be queued separately with special parameter settings for transmission.
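The "super pixel" grouping might be sketched as below. Here the adjacent pixels' mixing product signals are combined by sample-wise averaging, which is just one plausible combining rule; the text does not prescribe a specific method, and a coherent design would need the pixels' signals phase-aligned first:

```python
def super_pixel_signal(pixel_signals):
    """Combine the mixing product signals of a group of adjacent pixels
    into one 'super pixel' output by sample-wise averaging.

    pixel_signals: list of equal-length sample lists, one per pixel.
    Returns a single sample list of the same length."""
    n_samples = len(pixel_signals[0])
    n_pixels = len(pixel_signals)
    return [sum(sig[i] for sig in pixel_signals) / n_pixels
            for i in range(n_samples)]
```

Averaging trades angular resolution for signal-to-noise ratio, which is why such grouped outputs would plausibly get their own queue and transmission parameters.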
In an alternative embodiment, the DSP 60 may be implemented on the sensor chip 20, in whole or in part, so that the processing of signals created by the pixels is performed within the chip 20, at least partially.
In some application scenarios, it is desirable to illuminate the surroundings simultaneously using said modulated light source, so that all directions of sensing interest are illuminated. One embodiment achieves this by using the apparatus described above with the light propagation direction reversed, i.e., the modulated light source is placed at the position of chip 20, emits the modulated light, and the light comes out through the optics 30 and is reflected by the convex mirror 310 towards surrounding objects. Light energy may also be focused more densely towards directions that need a longer illumination range, e.g., more concentrated towards the front than the back and sides in vehicular applications.
In applications such as autonomous vehicles, it is commonly known that LIDAR and camera sensors each have advantages over the other, and combining their sensing results to make driving decisions is required. If a camera sensor (of visible or infrared light, for example) and a LIDAR sensor are installed separately, combining their sensing results requires aligning their angles of view, which is often not an easy task when they look through separate optical scopes: the separate installation locations cause an offset in angle of view, and the two sets of optics cause different image distortions and image sizes. Physically combining the two types of sensors into one unified optical sensor brings more advantages, and sharing one set of optics reduces cost. A unified optical sensor device also saves installation space and reduces decoration costs. In one embodiment, a unified camera and LIDAR sensor chip contains a mix of two types of pixels: camera sensing pixels, which may sense red, green and blue colors and light intensity, similar or identical to the ones used in a camera sensor; and LIDAR sensing pixels, such as those described in the embodiments above, possibly also including other types of LIDAR sensing pixels such as ones based on Time of Flight (ToF). The angle-of-view information is represented by the physical positions of the pixels on the chip, so the two types of sensing information are inherently aligned in their angles of view. Alternatively, each pixel may be implemented to sense both camera information and LIDAR information. In either embodiment, micro optical filters may be placed on top of the photo-sensing areas of pixels to selectively pass the red, green, blue and LIDAR-sensing infrared wavelength bands. Other color-separating and/or selective color-sensing technologies known in the art may also be used.
The priority determination methods described hereinabove may be used to determine priority not only in transferring LIDAR sensing information but also in transferring the corresponding camera sensing information of the same pixels (or nearby pixels) through the priority-based interface described hereinabove. Since camera sensing pixels are well known in the art, they are not described in further detail herein.
Certain terms are used to refer to particular components. As one skilled in the art will appreciate, people may refer to a component by different names; this document does not intend to distinguish between components that differ in name but not in function. For example, a camera sensor is also known as an image sensor, a vision sensor, a video sensor, a video camera sensor, a cam sensor, etc.
The terms “including” and “comprising” are used in an open-ended fashion, and thus should be interpreted to mean “including, but not limited to”. The terms “example” and “exemplary” are used simply to identify instances for illustrative purposes and should not be interpreted as limiting the scope of the invention to the stated instances.
Also, the term “couple” in any form is intended to mean either a direct or indirect connection through other devices and connections.
It should be understood that various modifications can be made to the embodiments described and illustrated herein, without departing from the invention, the scope of which is defined in the appended claims.
Claims (42)
- An optical sensor, comprising:
an array of pixels; and
an interface module, coupled with the pixels, for conveying sensing results outside the sensor;
wherein at least some of the pixels are Doppler sensing pixels, comprising at least one group of:
group 1:
a photo-detector for detecting a modulated light signal from or associated with objects being sensed, and producing a detected signal; and
at least one mixer, coupled with the photo-detector and the interface module, for mixing at least one LO (local oscillator) signals with the detected signal or a signal derived from the detected signal, and producing at least one mixing product signal(s);
group 2:
at least one LO-pumped photo-detector, coupled with the interface module, for detecting a modulated light signal from or associated with objects being sensed, and mixing with at least one LO signal to produce at least one mixing product signal(s).
- The optical sensor of claim 1, wherein said Doppler sensing pixels each further include:
a grating coupler, coupled optically with at least one of the photo-detector and the LO-pumped photo-detector, for selectively coupling the modulated light signal at a given wavelength band into said at least one of the photo-detector and the LO-pumped photo-detector.
- The optical sensor of claim 1, wherein said at least one LO signals include at least one of:
a local replica signal, which is identical in frequency to a modulating signal in a LiDAR transmitter;
a local replica signal, which is identical in frequency to at least one component of a modulating signal in a LiDAR transmitter;
a pair of local replica signals, which are phase shifted by 90 degrees with respect to each other and identical in frequency to at least one component of a modulating signal in a LiDAR transmitter;
a local replica signal, which is a counterpart of a modulating signal in a LiDAR transmitter by shifting a constant frequency;
a local replica signal, which is a counterpart of at least one component of a modulating signal in a LiDAR transmitter by shifting a constant frequency;
a pair of local replica signals, which are counterparts of at least one component of a modulating signal in a LiDAR transmitter, by shifting a constant frequency, and said pair are phase shifted by 90 degrees with respect to each other in component(s) of said modulating signal; and
a rectangular wave counterpart of the above list.
- The optical sensor of claim 3, wherein said at least one mixer includes:
a mixer, a quadrature mixer or a single sideband mixer;
a plurality of mixers, quadrature mixers, or single sideband mixers each separately mixing the detected signal or a signal derived from the detected signal with one of:
at least one CW components of said LO signals;
at least one FMCW components of said LO signals;
one CW component of the at least one CW components of said LO signals;
one FMCW component of the at least one FMCW components of said LO signals; and
a rectangular wave counterpart of the above list.
- The optical sensor of claim 3, wherein said at least one LO-pumped photo-detector includes:
an LO-pumped photo-detector, a quadrature LO-pumped photo-detector, or a single sideband quadrature LO-pumped photo-detector;
a plurality of LO-pumped photo-detectors, quadrature LO-pumped photo-detectors, or single sideband quadrature LO-pumped photo-detectors, each pumped by one of:
at least one CW components of said LO signals;
at least one FMCW components of the LO signals;
one CW component of the at least one CW components of the LO signals;
one FMCW component of the at least one FMCW components of the LO signals;
a rectangular wave counterpart of the above list.
- The optical sensor of claim 1, wherein said Doppler sensing pixels further include at least one of:
a first filter, coupled with the photo-detector and the mixer, for attenuating out-of-band interference and noise components in the detected electrical signal;
a second filter or filters, coupled with the mixer or the LO-pumped photo-detector, for attenuating frequency components outside band of interest in the mixing product signal;
a first estimator, coupled with the mixer or the LO-pumped photo-detector, for estimating a frequency or a quantity associated with a frequency of the mixing product signal(s); and
a second estimator, coupled with the mixer or the LO-pumped photo-detector, for estimating at least one of a signal strength or a signal to interference and noise ratio of the modulated light signal.
- The optical sensor of claim 6, wherein said Doppler sensing pixels provide sensing information in the form of at least one of:
digitized mixing product signal(s);
estimated frequency of the mixing product signal(s);
estimated quantities related to frequency of the mixing product signal(s);
estimated signal strength;
a velocity of an object being sensed by said Doppler sensing pixel;
a range (distance) of an object being sensed by said Doppler sensing pixel; and
a signal strength from an object being sensed by said Doppler sensing pixel.
- The optical sensor of claim 1, wherein said Doppler sensing pixels each further includes a micro optical lens on top of each said same, for directing light exposed onto an area of said same more onto a photo-detecting area of said same.
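As a numerical illustration of the Doppler sensing described in the claims above, the sketch below estimates the dominant frequency of a digitized mixing product via an FFT peak and converts it to a radial velocity using the envelope-Doppler relation f_d = 2·v·f_mod/c. The sampling rate and envelope (modulating) frequency are assumed values for illustration only, not parameters taken from the claims.

```python
import numpy as np

# Illustrative sketch only: estimate the beat (Doppler) frequency of the
# mixing product signal, then convert it to a radial velocity. All
# parameter values below are assumptions.
C = 3.0e8     # speed of light, m/s
F_MOD = 1.0e9 # assumed envelope (modulating) frequency, Hz
FS = 1.0e6    # assumed ADC sampling rate of the mixing product, Hz

def estimate_beat_frequency(samples, fs=FS):
    """Dominant frequency of the mixing product via a windowed FFT peak."""
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

def radial_velocity(beat_hz, f_mod=F_MOD):
    """Envelope Doppler shift f_d = 2 * v * f_mod / c, solved for v."""
    return beat_hz * C / (2.0 * f_mod)
    # e.g. a 5 kHz beat at a 1 GHz envelope corresponds to 750 m/s
```

The frequency estimator here stands in for the "first estimator" of claim 6; a production design would typically refine the FFT peak (interpolation, averaging) before reporting velocity.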
- The optical sensor of claim 2, wherein said Doppler sensing pixels each further includes a micro optical lens on top of each said same, for at least one of:
directing light being exposed onto the micro optical lens towards said grating coupler; and
directing light being exposed onto the micro optical lens with substantially parallel light rays at a desired incident angle towards said grating coupler.
- The optical sensor of claim 3, wherein said LO-pumped photo-detector comprises at least one of:
at least one avalanche photodiode (APD);
at least one single-photon avalanche diode (SPAD); and
at least one photo-sensing device which exhibits an optical to electrical conversion rate that is dependent on an instantaneous bias voltage;
which is biased by a time-varying voltage based, at least in part, on said at least one LO signal.
- The optical sensor of claim 10, wherein said LO-pumped photo-detector further includes at least one component whose effective capacitance varies with the at least one LO signal.
- The optical sensor of claim 11, wherein said at least one component is a part of a tuning circuit.
- The optical sensor of claim 1, further includes at least one pre-processing functional module, coupled with the mixers or the LO-pumped photo-detectors of the Doppler sensing pixels, for pre-processing the mixing product signals, and prioritizing a sensing data from the Doppler sensing pixels to be exported for further processing.
- The optical sensor of claim 13, wherein the pre-processing functional module is operable to perform at least one of:
to estimate a quantity associated with a strength of said modulated light signals detected by a Doppler sensing pixel;
to estimate a quantity associated with a signal to interference and noise ratio of said modulated light signals detected by a Doppler sensing pixel;
to estimate a quantity associated with a Doppler shift in envelope waveform of said modulated light signals detected by a Doppler sensing pixel;
to estimate a quantity associated with a frequency shift in envelope waveform of said modulated light signals detected by a Doppler sensing pixel;
to estimate a quantity associated with a distance of an object associated with said modulated light signals detected by a Doppler sensing pixel; and
to determine, based on at least one of said estimated quantities listed above, priority parameters for transferring data for further processing.
- The optical sensor of claim 1, further includes at least one of:
at least one first amplifiers, coupled with the photo-detectors and the mixers, for amplifying the detected electrical signals;
at least one second amplifiers, coupled with the mixers or the LO-pumped photo-detectors, for amplifying the mixing product signals;
at least one filters, coupled with the mixers or the LO-pumped photo-detectors, for attenuating frequency components outside band of interest in the mixing product signals;
at least one analog to digital converters, coupled with the mixers or the LO-pumped photo-detectors, for digitizing the mixing product signals; and
at least one digital signal processor module, coupled with the Doppler sensing pixels for processing the mixing product signals.
- The optical sensor of claim 1 is built on a semiconductor chip.
- The optical sensor of claim 16, wherein the array of Doppler sensing pixels are placed on the semiconductor chip in an area of at least one of:
a rectangular shape;
a round shape;
a ring shape;
an oval shape;
an oval ring shape; and
a curved belt shape.
- The optical sensor of claim 16, wherein the array of Doppler sensing pixels are placed on the semiconductor chip and spaced according to at least one of:
Cartesian coordinates; and
polar coordinates.
- The optical sensor of claim 16, wherein the array of Doppler sensing pixels are placed on the semiconductor chip in a plurality of zones, and wherein, in each of the zones, the Doppler sensing pixels are placed evenly according to one of Cartesian or polar coordinates, and densities of placement are based on the zone the Doppler sensing pixels belong to.
- A Doppler LIDAR receiver, comprising:
a Doppler sensor having an array of Doppler sensing pixels, for producing at least one Doppler sensing signals;
an optical scope, optically coupled with the Doppler sensor, for producing an optical image of objects on the Doppler sensor;
a digital signal processor module, coupled with the Doppler sensor, for processing the at least one Doppler sensing signals; and
whereby, the array of Doppler sensing pixels each is operable to detect a modulated light signal from or associated with an object under detection, mix a detected signal with a local replica signal, and produce at least one of the Doppler sensing signals;
and whereby, the digital signal processor module is operable, based on at least one of the Doppler sensing signals produced by a Doppler sensing pixel and an address of said Doppler sensing pixel in the array, to report at least one of:
a relative moving speed of said object and a direction of motion, in terms of approaching or leaving said Doppler LIDAR receiver;
a distance between said object and said Doppler LIDAR receiver; and
a direction of said object relative to said Doppler LIDAR receiver.
- An optical sensor chip, comprising at least one of:
an array of mixed camera sensing pixels and LIDAR sensing pixels; and
an array of dual function pixels which sense both camera information and LIDAR information.
- The optical sensor chip of claim 21, wherein the camera sensing pixels, LIDAR sensing pixels and the dual function pixels (herein referred to as “the pixels”) are operable to sense and/or indicate at least one of:
a color of visible light and a direction of a sensed portion of an object in a field of view;
a strength of visible light and a direction of a sensed portion of an object in a field of view;
a strength of light in an infrared range and a direction of a sensed portion of an object in a field of view;
a strength of light in an ultraviolet range and a direction of a sensed portion of an object in a field of view;
a human-invisible color in an infrared range and a direction of a sensed portion of an object;
a human-invisible color in an ultraviolet range and a direction of a sensed portion of an object in a field of view;
at least one quantity from which a range (distance) of a sensed portion of an object can be derived, and data to indicate a direction of said sensed portion of said object in a field of view; and
at least one quantity from which a velocity of a sensed portion of an object can be derived, and data to indicate a direction of said sensed portion of said object in a field of view.
- The optical sensor chip of claim 21, wherein said array of dual function pixels, array of mixed camera sensing pixels and LIDAR sensing pixels are placed on the chip in an area of at least one of:
a rectangular shape;
a round shape;
a ring shape;
an oval shape;
an oval ring shape; and
a curved belt shape.
- The optical sensor chip of claim 21, wherein said array of dual function pixels, array of mixed camera sensing pixels and LIDAR sensing pixels are placed on the chip and spaced according to at least one of:
Cartesian coordinates; and
polar coordinates.
- The optical sensor chip of claim 21, wherein said array of dual function pixels, array of mixed camera sensing pixels and LIDAR sensing pixels (herein referred to generally as “the pixels”) are placed on the chip in a plurality of zones, and wherein, in each of the zones, the pixels are placed evenly according to one of Cartesian or polar coordinates, and densities of placement are based on the zone the pixels belong to.
- The optical sensor chip of claim 21 is operable to sense the camera information and the LIDAR information and associate said camera information and said LIDAR information with each other based on an angle of view.
- The optical sensor chip of claim 21 is operable to perform at least one of:
determining a priority of sensed data;
sharing said determined priority between a camera sensing data and a LIDAR sensing data obtained by one said dual function pixel or obtained by a pair of adjacent camera sensing pixel and LIDAR sensing pixel; and
transferring both said LIDAR sensing data and said camera sensing data based, at least in part, on said determined and/or shared priority.
- The optical sensor chip of claim 21 further includes micro optical filters for selectively passing red, green, blue and other wavelength bands of light and light signals.
- A method for effectively conveying data from pixels of an optical sensor device to a processor, comprising steps of:
sending initial data produced by a sample set of pixels to a processor;
associating, with each of the pixels by the processor, a table of parameters among a number of priority categories for conveying data in a next period;
queuing, for each of the categories, the data produced by pixels into queuing buffers according to the parameters for the corresponding category that a pixel is associated with;
multiplexing, according to priority and scheduling parameters in the table, the queued data from the buffers into a transmission channel;
if the transmission channel cannot convey all queued data in the buffers according to the parameters specified in the table, discarding some of the stalest data from the lowest priority buffers; and
repeating the associating, queuing, multiplexing and discarding steps above in another next period.
- The method of claim 29, further comprising steps of:
selecting, by a pre-processor locally coupled with the pixels, a subset of pixels;
caching data from the selected pixels;
expediting the cached data to the processor over the transmission channel;
receiving, from the processor, feedback indicators;
adding/removing pixels for caching according to the feedback indicators over a period; and
repeating the selecting, caching, expediting, receiving, and adding/removing steps.
- The method of claim 30, wherein the period is specified by the feedback indicators.
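The queuing, multiplexing and discarding steps of the method claims above can be sketched as a simple strict-priority loop. The class below is a minimal illustration under assumed units (a fixed per-period channel capacity in packets and a backlog limit); the names and the specific discard policy are assumptions, not the claimed method itself.

```python
from collections import deque

class PriorityConveyor:
    """Sketch of priority-based data conveyance: each priority category
    owns a FIFO queue; the multiplexer drains higher priorities first,
    and stale (oldest) data is discarded from the lowest-priority queues
    when the channel cannot carry everything."""

    def __init__(self, num_categories, channel_capacity, max_backlog):
        self.queues = [deque() for _ in range(num_categories)]  # index 0 = highest priority
        self.capacity = channel_capacity  # packets the channel carries per period (assumed unit)
        self.max_backlog = max_backlog    # packets allowed to wait for the next period

    def enqueue(self, category, packet):
        self.queues[category].append(packet)

    def transmit_period(self):
        sent, budget = [], self.capacity
        for q in self.queues:             # strict priority: drain high first
            while q and budget > 0:
                sent.append(q.popleft())
                budget -= 1
        backlog = sum(len(q) for q in self.queues)
        for q in reversed(self.queues):   # discard from lowest priority first,
            while backlog > self.max_backlog and q:
                q.popleft()               # dropping the stalest (oldest) packets
                backlog -= 1
        return sent
```

Repeating `transmit_period` once per period mirrors the repeat step of the claims; a real implementation would add the per-category parameter tables and feedback-driven pixel selection described above.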
- The method of claim 29, wherein the data is grouped in packets, and said packets include a sensor data, and at least one of:
a position index of the pixel from which said sensor data in said packet is produced;
a timestamp of the data packet at which said sensor data is produced;
a sampling rate of said sensor data;
an under sampling range indicator;
a decimation factor of said sensor data;
an estimated level of time criticalness;
an estimated level of consequence criticalness;
a data queuing category indicator;
time to expire for said sensor data;
a confidence level indicator;
time to next scheduled data of this kind; and
error correction check bits.
- The method of claim 29, wherein the table of parameters includes at least one of:
position index or range of position index of pixels;
prediction of position index or range of position index of pixels in a future interval based on movement and aiming changes;
category index for data queuing and transmission;
level of time criticalness;
level of consequence criticalness;
sensor signal type;
sensor signal sampling rate;
sensor signal decimation factor;
sensor data block length in a packet;
packet update rate;
desired packet update rate;
guaranteed minimum packet update rate; and
conditions for discarding data.
- A method for effectively conveying data from pixels of a sensor device to a processor, comprising steps of:
determining, by a pre-processor locally coupled with the pixels, which of a plurality of priority categories each of the pixels is assigned to;
queuing data produced by the pixels into queuing buffers, according to the priority category being assigned to, and according to a set of parameters associated with the priority category;
multiplexing, according to priority and scheduling parameters specified for the priority categories, the queued data from the buffers into a transmission channel;
discarding, if the transmission channel cannot convey all queued data in the buffers according to the parameters specified for the priority categories, some of the stalest data from the lowest priority buffers; and
repeating the determining, queuing, multiplexing and discarding steps above in a next period.
- The method of claim 34, wherein the data is grouped in packets, and said packets include a sensor data, and at least one of:
a position index of the pixel from which said sensor data in said packet is produced;
a timestamp of the data packet at which said sensor data is produced;
a sampling rate of said sensor data;
an under sampling range indicator;
a decimation factor of said sensor data;
an estimated level of time criticalness;
an estimated level of consequence criticalness;
a data queuing category indicator;
time to expire for said sensor data;
a confidence level indicator;
time to next scheduled data of this kind; and
error correction check bits.
- The method of claim 34, wherein the set of parameters associated with each of the categories includes at least one of:
level of time criticalness;
level of consequence criticalness;
sensor signal type;
sensor signal sampling rate;
sensor signal decimation factor;
sensor data block length in a packet;
packet update rate;
desired packet update rate;
guaranteed minimum packet update rate;
maximum allowed queuing delay; and
conditions for discarding data.
- An omnidirectional optical sensor receiver, comprising:
an optical sensor having an array of pixels operable to detect at least one of Doppler information, light color information and light strength information, and produce sensing signals;
one group of:
group 1: a convex mirror, and an optical scope, optically coupled with the optical sensor, for redirecting light from objects under detection in all directions of a hemisphere or a partial sphere and producing an optical image of the objects on the optical sensor;
group 2: a fish-eye scope, optically coupled with the optical sensor, for redirecting light from objects under detection in all directions of a hemisphere or a partial sphere and producing an optical image of the objects on the optical sensor;
a digital signal processor module, coupled with the optical sensor, for processing the sensing signals; and
whereby, the digital signal processor module is operable, based on at least one of the sensing signals produced by an optical sensing pixel and an address of said optical sensing pixel in said array, to detect at least one of:
a relative moving speed of said object and a direction of motion, in terms of approaching or leaving said omnidirectional optical sensor receiver;
a distance between said object and said omnidirectional optical sensor receiver;
a direction of said object relative to said omnidirectional optical sensor receiver;
quantities associated with color of light from or associated with said object; and
a strength of light from or associated with said object.
- The omnidirectional optical sensor receiver of claim 37, wherein said convex mirror is substantially a hyperbolic convex mirror.
- The omnidirectional optical sensor receiver of claim 37, wherein the array of pixels are placed in an area shaped in one of:
a rectangular shape;
a round shape; and
a ring shape.
- The omnidirectional optical sensor receiver of claim 37, further includes another duplicated set of same on the opposite side, so that the two hemispherical omnidirectional optical sensor receivers together form a spherical omnidirectional optical sensor receiver.
- An optical sensing apparatus for detecting a shift in frequency in an amplitude envelope waveform of an optical signal, comprising at least one of:
at least one avalanche photodiode (APD);
at least one single-photon avalanche diode (SPAD); and
at least one photo-sensing device whose optical to electrical conversion rate depends on an instantaneous bias voltage;
wherein said APD, SPAD or photo-sensing device is biased by a time-varying voltage based, at least in part, on a replica signal of the amplitude envelope waveform.
- The optical sensing apparatus of claim 41 further includes at least one of:
at least one amplifier, coupled with said APD, SPAD or said photo-sensing device, for amplifying an electrical signal sensed from the optical signal;
at least one filter or tuning circuit, coupled with said APD, SPAD or said photo-sensing device, for attenuating frequency components outside a band of interest;
a frequency estimator, coupled with said APD, SPAD or said photo-sensing device, for estimating a frequency;
at least one grating coupler, optically coupled with said APD, SPAD or said photo-sensing device, for selectively coupling optical signals into said APD, SPAD or said photo-sensing device; and
an optical filter, placed in a ray path of said optical signal, for selectively passing and stopping components of light wavelengths.
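The packet side information enumerated in claims 32 and 35 can be sketched as a simple record. The field names and types below are illustrative assumptions, not a defined wire format.

```python
from dataclasses import dataclass

@dataclass
class SensorDataPacket:
    # Hypothetical layout for the packet contents listed in the method
    # claims; names and types are illustrative assumptions only.
    pixel_index: int                   # position index of the producing pixel
    data: bytes                        # the sensor data block itself
    timestamp: float                   # when the sensor data was produced
    sampling_rate_hz: float = 0.0
    decimation_factor: int = 1
    time_criticalness: int = 0         # estimated level of time criticalness
    consequence_criticalness: int = 0  # estimated level of consequence criticalness
    queue_category: int = 0            # data queuing category indicator
    time_to_expire: float = 0.0        # seconds until the data is useless
    confidence: float = 1.0            # confidence level indicator

    def expired(self, now: float) -> bool:
        """True once the packet has outlived its time-to-expire."""
        return now - self.timestamp > self.time_to_expire
```

A receiver can then drop expired packets cheaply before processing, implementing the "time to expire" and discard conditions of the claims in an obvious way.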
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/926,400 | 2020-07-10 | ||
US16/926,400 US20220011431A1 (en) | 2020-07-10 | 2020-07-10 | Camera sensor for lidar with doppler-sensing pixels |
US17/126,623 | 2020-12-18 | ||
US17/126,623 US20220011430A1 (en) | 2020-07-10 | 2020-12-18 | Lidar sensor on chip with doppler-sensing pixels |
US17/194,389 US20220011438A1 (en) | 2020-07-10 | 2021-03-08 | Multi-domain optical sensor chip and apparatus |
US17/194,389 | 2021-03-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022008989A1 true WO2022008989A1 (en) | 2022-01-13 |
Family
ID=79172459
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2021/054262 WO2022008989A1 (en) | 2020-07-10 | 2021-05-18 | Multi-domain optical sensor |
Country Status (2)
Country | Link |
---|---|
US (1) | US20220011438A1 (en) |
WO (1) | WO2022008989A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115128581B (en) * | 2022-08-31 | 2022-12-20 | 上海羲禾科技有限公司 | Silicon optical chip and laser radar based on same |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100312500A1 (en) * | 2007-07-20 | 2010-12-09 | Stephen Morgan | Array of Electromagnetic Radiation Sensors with On-Chip Processing Circuitry |
US9179062B1 (en) * | 2014-11-06 | 2015-11-03 | Duelight Llc | Systems and methods for performing operations on pixel data |
US20190018115A1 (en) * | 2017-07-12 | 2019-01-17 | Airbus Defence and Space GmbH | Lidar arrangement and lidar method |
US20190086518A1 (en) * | 2017-09-19 | 2019-03-21 | Veoneer Us, Inc. | Scanning lidar system and method |
US20190154439A1 (en) * | 2016-03-04 | 2019-05-23 | May Patents Ltd. | A Method and Apparatus for Cooperative Usage of Multiple Distance Meters |
US20190317219A1 (en) * | 2018-04-11 | 2019-10-17 | Aurora Innovation, Inc. | Control of Autonomous Vehicle Based on Environmental Object Classification Determined Using Phase Coherent LIDAR Data |
US20200057151A1 (en) * | 2018-08-16 | 2020-02-20 | Sense Photonics, Inc. | Integrated lidar image-sensor devices and systems and related methods of operation |
2021
- 2021-03-08 US US17/194,389 patent/US20220011438A1/en active Pending
- 2021-05-18 WO PCT/IB2021/054262 patent/WO2022008989A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US20220011438A1 (en) | 2022-01-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7429274B2 (en) | Optical imaging transmitter with enhanced brightness | |
US10637574B2 (en) | Free space optical communication system | |
US8760634B2 (en) | Optical synthetic aperture radar | |
US20140085629A1 (en) | Active Hyperspectral Imaging Systems | |
EP0766101A2 (en) | Scanning optical rangefinder | |
KR20170015160A (en) | sensor ASSEMBLY with selective infrared filter ARRAY | |
US7369309B2 (en) | Confocal microscope | |
EP0619502B1 (en) | Scanning optical rangefinder | |
WO2022008989A1 (en) | Multi-domain optical sensor | |
IL275956B1 (en) | Parallax compensating spatial filters | |
US20220011430A1 (en) | Lidar sensor on chip with doppler-sensing pixels | |
US11555894B2 (en) | System and method for adaptive optical tracking with selectable tracking modes | |
CN101170359B (en) | Optical communication receiving device capable of tracking signal movement | |
US5898791A (en) | Spinning focal plane array camera particularly suited for real time pattern recognition | |
US20220011431A1 (en) | Camera sensor for lidar with doppler-sensing pixels | |
US11137591B1 (en) | System and method for compact, adaptive optical sensor for obtaining areal images | |
WO2020235458A1 (en) | Image-processing device, method, and electronic apparatus | |
US6774366B1 (en) | Image integration and multiple laser source projection | |
US20200067603A1 (en) | Heterodyne starring array active imager | |
JP2004325202A (en) | Laser radar system | |
JP2007507929A (en) | Infrared (IR) receiver | |
Kurimoto et al. | Visible light communication system using low-speed image sensor and two-dimensional optical scanner | |
US20240069285A1 (en) | Optical transceiver arrays | |
US20040007660A1 (en) | Optical sensor device for receiving light signal | |
JP2024057783A (en) | Receiving device, communication device, and communication system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21838388 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 21838388 Country of ref document: EP Kind code of ref document: A1 |