CN111273050A - Signal acquisition processing method and device - Google Patents
- Publication number
- CN111273050A (application CN202010089554.9A)
- Authority
- CN
- China
- Prior art keywords
- signal
- target scene
- modulation
- sub
- frequency
- Prior art date
- Legal status (assumed from the record; not a legal conclusion)
- Granted
Classifications
- G01P3/36: Measuring linear or angular speed; devices characterised by the use of optical means, e.g. using infrared, visible, or ultraviolet light
- G01C11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; photographic surveying
- G01M11/00: Testing of optical apparatus; testing structures by optical methods not otherwise provided for
- G01P15/00: Measuring acceleration; measuring deceleration; measuring shock, i.e. sudden change of acceleration
Abstract
The embodiments of the disclosure relate to a signal acquisition and processing method and device. The method comprises the following steps: performing first processing on an optical signal from a target scene; and collecting the optical signal after the first processing. The method and device of the embodiments reduce the data volume, lower the performance requirements on the signal acquisition device, and increase the signal acquisition and processing speed.
Description
Technical Field
The disclosure belongs to the technical field of signal processing, and relates to a signal acquisition and processing method and to a signal acquisition and processing device corresponding to that method.
Background
Images are one of the most important categories of information in the modern intelligent age. From industry to entertainment, from autonomous driving to mobile terminals, image information is required in every segment to help humans and machines understand the world.
In some application scenarios, the acquired image is often not used directly; instead, it is first processed by a computer algorithm, after which the picture's information has higher use value. A typical processing flow is as follows: after the image information is acquired, it is stored in a computer, and the computer then executes an image processing algorithm on the acquired image. Because the computer must both store the acquired data and process a large amount of it, the existing approach has a low processing speed, poor real-time performance, and low efficiency. For example, the speed of current image processing systems is generally below 100 Hz, which is difficult to reconcile with the requirements of some application scenarios.
Disclosure of Invention
The embodiments of the disclosure provide a signal acquisition and processing method and a signal acquisition and processing device to address the technical problems described above.
According to at least one embodiment of the present disclosure, there is provided a signal acquisition and processing method comprising: performing first processing on an optical signal from a target scene; and collecting the optical signal after the first processing.
The method according to any of the preceding embodiments of the present disclosure, for example, further comprising: obtaining information in the target scene based on the collected data.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the obtaining information in the target scene based on the acquired data comprises: performing second processing on the acquired data to obtain information in the target scene; and forming the obtained information into an image.
According to the method of any of the preceding embodiments of the present disclosure, for example, the first processing of the optical signal from the target scene comprises: determining an encoding signal of the target scene; and performing encoding processing on the optical signal from the target scene based on the encoding signal, the signal after the encoding processing being the first-processed optical signal.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the collecting of the first-processed optical signal comprises: collecting the encoded optical signal, the encoded optical signal carrying both the information of the encoding signal and the optical signal from the target scene.
According to the method of any of the previous embodiments of the present disclosure, for example, the encoded optical signal is the result of integrating the product of the encoding signal and the optical signal from the target scene.
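The integrate-the-product relationship above can be sketched numerically. This is a minimal illustration only: the rectangle-rule discrete approximation, the sinusoidal encoding signal, and all names are ours, not taken from the patent.

```python
import math

def encoded_measurement(scene_signal, code_signal, dt):
    """Approximate the integral of the product of the encoding signal
    and the scene's optical signal over one exposure (rectangle rule)."""
    return sum(s * c for s, c in zip(scene_signal, code_signal)) * dt

# A steady scene intensity encoded by a 3 Hz cosine over a 1 s exposure:
dt = 1e-3
t = [i * dt for i in range(1000)]
scene = [2.0 for _ in t]                              # constant intensity
code = [math.cos(2 * math.pi * 3 * ti) for ti in t]   # encoding signal
y = encoded_measurement(scene, code, dt)
```

For a constant scene and a full number of encoding periods the measurement integrates to approximately zero, so only scene components that beat against the encoding frequency survive the integration.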
The method according to any of the preceding embodiments of the present disclosure, for example, the first processing the light signal from the target scene further comprises: projecting the light signal from the target scene to a spatial light modulator; the determining the coded signal of the target scene comprises: determining a modulation signal of the spatial light modulator; the encoding processing the light signal from the target scene based on the encoded signal comprises: modulating the light signal from the target scene based on the modulation signal of the spatial light modulator.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the modulation signal comprises a frequency parameter and a phase parameter, and the determining the modulation signal of the spatial light modulator comprises: determining the frequency parameter of the modulated signal; determining the phase parameter of the modulated signal; determining the modulation signal based on the frequency parameter and the phase parameter.
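A minimal sketch of composing a modulation signal from its frequency and phase parameters, as the steps above describe. The sinusoidal waveform and the scaling into a [0, 1] transmittance are our assumptions; the claims only require that the signal be determined from a frequency and a phase.

```python
import math

def modulation_signal(freq_hz, phase_rad, times):
    """Build a modulation waveform from its frequency and phase
    parameters; a sinusoid scaled into [0, 1] so it can serve as a
    transmittance applied by the spatial light modulator."""
    return [0.5 * (1.0 + math.cos(2 * math.pi * freq_hz * t + phase_rad))
            for t in times]
```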
The method according to any of the preceding embodiments of the present disclosure, for example, wherein determining the frequency parameter of the modulation signal comprises: determining a property and/or type of the optical signal to be collected, and determining the frequency parameter of the modulation signal based on that property and/or type.
The method according to any of the preceding embodiments of the present disclosure, for example, determining the frequency parameter of the modulated signal comprises: determining the complexity of the target scene; determining a frequency parameter of the modulated signal based on the complexity.
According to the method of any of the previous embodiments of the present disclosure, for example, the target scene includes a moving object or a brightness-varying object, and determining the frequency parameter of the modulation signal includes: and determining the motion parameter of the moving object or the brightness change parameter of the brightness change object, and determining the frequency parameter of the modulation signal based on the motion parameter or the brightness change parameter.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein said projecting the optical signal from the target scene to a spatial light modulator comprises: passing the optical signal from the target scene through a first optical element; projecting the optical signal that has passed through the first optical element onto the spatial light modulator; and the spatial light modulator receiving and modulating the optical signal so projected onto it.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the spatial light modulator comprises a plurality of sub-modulation units, and the projecting the light signal from the target scene to the spatial light modulator comprises: projecting the light signal from the target scene onto the sub-modulation units of the spatial light modulator, the determining the modulation signal of the spatial light modulator comprising: determining a sub-modulation signal of a sub-modulation unit of each of the spatial light modulators; the modulating the light signal from the target scene based on the modulation signal of the spatial light modulator comprises: and modulating the light signal from the target scene based on the sub-modulation signals of the sub-modulation units.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the sub-modulation signals comprise a frequency parameter and a phase parameter, and the determining the sub-modulation signals of the sub-modulation units of each of the spatial light modulators comprises: determining a frequency parameter of each of the sub-modulation signals; determining a phase parameter for each of the sub-modulated signals; determining the sub-modulation signal based on a frequency parameter and a phase parameter of the sub-modulation signal.
The method according to any of the preceding embodiments of the present disclosure, for example, the determining the frequency parameter of the sub-modulation signal comprises: information to be obtained from the target scene is predetermined, and the number of frequencies and/or the value of frequencies in the frequency parameters of the sub-modulation signals of the sub-modulation units are determined based on the information to be obtained from the target scene.
The method according to any of the preceding embodiments of the present disclosure, for example, the determining the frequency parameter and the phase parameter of each of the sub-modulation signals further comprises: determining a position of each of the sub-modulation units in the spatial light modulator, and further determining a frequency parameter and/or a phase parameter of each of the sub-modulation units based on the position.
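One way the position-dependent assignment above could look in code. The particular scheme (frequency cycling through a list row by row, phase ramping across each row) is a hypothetical illustration, not the patent's method; the claims only state that frequency and/or phase may depend on each sub-modulation unit's position.

```python
import math

def assign_sub_modulation(n_rows, n_cols, freqs):
    """Assign each sub-modulation unit a (frequency, phase) pair based
    on its position in the modulator array: frequency cycles through
    the given list row by row, and the phase ramps across each row
    over [0, 2*pi)."""
    params = {}
    for r in range(n_rows):
        for c in range(n_cols):
            params[(r, c)] = (freqs[r % len(freqs)],
                              2 * math.pi * c / n_cols)
    return params
```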
According to the method of any of the preceding embodiments of the present disclosure, for example, the modulation signal of each of the plurality of sub-modulation units uses the same frequency.
According to the method of any of the preceding embodiments of the present disclosure, for example, the modulation signals of at least two sub-modulation units in the plurality of sub-modulation units use different frequencies.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the frequency value is in a low frequency range.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein 0 is included in the plurality of frequency values.
The method of any preceding embodiment of the present disclosure, for example, wherein 0 is not included in the plurality of frequency values.
According to the method of any of the preceding embodiments of the present disclosure, for example, the frequency parameter of the modulation signal of the sub-modulation unit is related to a first time, and the first time is related to a performance parameter of a device that acquires the optical signal.
According to the method of any of the preceding embodiments of the present disclosure, for example, the frequency of the modulation signal of the sub-modulation unit is less than or equal to the reciprocal of the first time.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the number of frequencies is determined based on the number of images formed within a first time.
According to the method of any preceding embodiment of the present disclosure, for example, the number of frequencies is proportional to the number of images formed in the first time.
According to the method of any one of the preceding embodiments of the present disclosure, for example, when the number of frequencies is plural, the numerical interval between the plural frequencies is determined based on the first time.
The method of any preceding embodiment of the present disclosure, for example, wherein the numerical interval is less than or equal to a reciprocal of the first time.
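The two constraints above (each modulation frequency no greater than the reciprocal of the first time, and the spacing between frequencies also no greater than that reciprocal) can be checked with a small helper. This is a sketch under a literal reading of the claims; the helper name and test values are ours.

```python
def frequencies_valid(freqs, first_time):
    """Check the stated constraints: every modulation frequency is at
    most 1 / first_time, and adjacent frequencies are spaced at most
    1 / first_time apart."""
    limit = 1.0 / first_time
    fs = sorted(freqs)
    if any(f > limit for f in fs):
        return False
    return all(b - a <= limit for a, b in zip(fs, fs[1:]))
```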
According to the method of any of the preceding embodiments of the present disclosure, for example, the phase parameters of at least three sub-modulation units of the plurality of sub-modulation units are different.
According to the method of any of the previous embodiments of the present disclosure, for example, the phases of the phase parameters in the modulation signals of the sub-modulation units are distributed in the range of 0 to 2π.
According to the method of any of the preceding embodiments of the present disclosure, for example, determining the sub-modulation signals of the sub-modulation units of each of the spatial light modulators further comprises: dividing the plurality of sub-modulation units into a plurality of groups, each group comprising at least two sub-modulation units, all sub-modulation units in each group of sub-modulation units being adjacent in spatial position in the spatial light modulator; a sub-modulation signal of the sub-modulation unit is determined based on the group.
The method according to any of the preceding embodiments of the present disclosure, for example, said determining the sub-modulation signals of the sub-modulation units based on said group comprises: determining the frequency value of the modulation signal of each group of sub-modulation units, wherein the frequency values of each group of sub-modulation units are the same, and determining the phase value of the modulation signal of each group of sub-modulation units, and the phase value of each group of sub-modulation units is different.
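A sketch of the grouping scheme above, assuming the phases within a group are spread evenly over 0 to 2π. The even spread is our assumption for illustration; the claims only require that the phase values within a group differ while the frequency is shared.

```python
import math

def group_sub_modulation(groups, base_freqs):
    """Give every unit in a group the same frequency and a distinct
    phase, spreading each group's phases evenly over [0, 2*pi)."""
    out = {}
    for g_idx, units in enumerate(groups):
        f = base_freqs[g_idx % len(base_freqs)]
        for k, unit in enumerate(units):
            out[unit] = (f, 2 * math.pi * k / len(units))
    return out
```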
According to the method of any one of the preceding embodiments of the present disclosure, for example, the frequency value of the sub-modulation signal of each group of sub-modulation units is different from the frequency value of the modulation signal of the adjacent other groups of sub-modulation units.
According to the method of any one of the preceding embodiments of the present disclosure, for example, the frequency value of the sub-modulation signal of each group of sub-modulation units is the same as the frequency value of the modulation signal of the adjacent other groups of sub-modulation units.
According to the method of any of the preceding embodiments of the present disclosure, for example, the sub-modulation units comprise optical lenses which are pivoted about respective central axes at a frequency and phase corresponding to the modulation signals of the sub-modulation units.
According to the method of any one of the preceding embodiments of the present disclosure, for example, the sub-modulation unit includes a liquid crystal cell, and the liquid crystal cell is turned on and off at a frequency and a phase corresponding to a modulation signal of the sub-modulation unit.
According to the method of any one of the preceding embodiments of the present disclosure, for example, the target scene includes at least one moving object or at least one brightness change object, and the first processing the light signal from the target scene includes: determining at least one frequency parameter and/or at least one phase parameter in the modulation signal according to at least one motion parameter or brightness change parameter of the moving object or brightness change object; determining the modulation signal based on the at least one frequency parameter and/or at least one phase parameter.
The method according to any of the preceding embodiments of the present disclosure, for example, the frequency value range of the frequency parameter of the modulation signal is greater than or equal to the frequency value range of the at least one frequency parameter.
The method according to any of the preceding embodiments of the present disclosure, for example, wherein the motion parameter comprises a motion speed, the brightness variation parameter comprises a brightness variation speed, and the determining at least one frequency value in the modulation signal according to at least one motion parameter or at least one brightness variation parameter of the moving object or brightness variation object comprises: determining at least one time value of the at least one moving object or brightness change object passing through a detection unit of an optical signal acquisition device or at least one time value of a brightness change point of the brightness change object passing through the detection unit according to the movement speed or the brightness change speed; determining the at least one frequency value from the at least one time value.
According to the method of any one of the preceding embodiments of the present disclosure, for example, the motion parameter further includes acceleration, the brightness change parameter further includes brightness change acceleration, and determining at least one time value when the at least one moving object or brightness change object passes through a detection unit of an optical signal acquisition device or at least one time value when a brightness change point of the brightness change object passes through the detection unit according to the motion speed or the brightness change speed includes: respectively determining a plurality of motion speeds of the moving object according to the motion acceleration and the motion speed; respectively determining a plurality of time values of the moving object passing through a detection unit of the optical signal acquisition device according to the plurality of movement speeds; or respectively determining a plurality of brightness change speeds of the brightness change object according to the brightness change acceleration and the brightness change speed; and respectively determining a plurality of time values of the brightness change point of the brightness change object passing through one detection unit of the optical signal acquisition device according to the plurality of brightness change speeds.
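The speed-and-acceleration reasoning above can be sketched as follows: the time an object spends crossing one detection unit is the unit's size divided by the instantaneous speed, and a candidate modulation frequency is the reciprocal of that dwell time. Sampling the motion at whole-second instants and every parameter name here are illustrative assumptions.

```python
def modulation_freqs_from_motion(speed, accel, pixel_size, n_samples=3):
    """For a few sample instants of the motion, estimate the dwell time
    of the object on one detection unit (pixel_size / v) and take its
    reciprocal as a candidate modulation frequency."""
    freqs = []
    for k in range(n_samples):
        v = speed + accel * k          # speed at sample instant k seconds
        freqs.append(v / pixel_size)   # reciprocal of (pixel_size / v)
    return freqs
```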
According to the method of any of the preceding embodiments of the present disclosure, for example, the acquiring the light signal after the first processing includes: projecting the optical signal modulated by the spatial light modulator to an image detector; and the image detector collects the modulated optical signals.
The method according to any of the previous embodiments of the present disclosure, for example, the spatial light modulator includes a plurality of sub-modulation units, the image detector includes a plurality of sub-detection units, and one sub-modulation unit corresponds to one or more sub-detection units of the image detector; the collecting of the modulated optical signal by the image detector comprises: and the sub-detection units collect the optical signals modulated by the sub-modulation units corresponding to the sub-detection units.
According to the method of any of the preceding embodiments of the present disclosure, for example, the acquiring, by the image detector, the modulated light signal includes: and collecting the coded optical signal by the image detector within a first time, wherein the first time is related to a performance index of the image detector.
According to the method of any of the preceding embodiments of the present disclosure, for example, projecting the light signal modulated by the spatial light modulator to an image detector comprises: the optical signal modulated by the spatial light modulator passes through a second optical element; the light signal passing through the second optical element is projected to an image detector.
The method according to any of the preceding embodiments of the present disclosure, for example, the obtaining information in the target scene based on the acquired data includes: acquiring a coded signal corresponding to an optical signal from the target scene; and obtaining information in the target scene according to the coded signals and the acquired data.
The method according to any of the preceding embodiments of the present disclosure, for example, the acquiring an encoded signal corresponding to a light signal from the target scene comprises: acquiring a frequency parameter in the coded signal as a first parameter; the obtaining information in the target scene according to the encoded signal and the acquired data comprises: obtaining a second parameter based on the collected data; second processing the acquired data to obtain information in the target scene based on the first and second parameters.
According to the method of any one of the preceding embodiments of the present disclosure, for example, the acquired data includes a plurality of pixel points, each pixel point is associated with a sub-modulation unit of the spatial light modulator performing the first processing, the sub-modulation units are multiple and are divided into a plurality of groups, and the obtaining the second parameter based on the acquired data includes: and taking a plurality of pixel points corresponding to a group of sub-modulation units as a pixel point group, and determining the second parameter based on one or more pixel point groups.
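A sketch of the second processing above: with the encoding frequency (the first parameter) known, a lock-in-style quadrature correlation can recover the amplitude carried by a collected time series. The quadrature-correlation approach is our illustration of one plausible recovery scheme, not necessarily the patent's algorithm.

```python
import math

def demodulate(samples, freq_hz, dt):
    """Recover the amplitude at a known encoding frequency from the
    collected data by correlating against quadrature references; a
    lock-in-style second processing."""
    n = len(samples)
    i_sum = sum(y * math.cos(2 * math.pi * freq_hz * k * dt)
                for k, y in enumerate(samples))
    q_sum = sum(y * math.sin(2 * math.pi * freq_hz * k * dt)
                for k, y in enumerate(samples))
    return 2.0 / n * math.hypot(i_sum, q_sum)

# A pixel whose light was modulated at 5 Hz with amplitude 0.7:
dt = 1e-3
data = [0.7 * math.cos(2 * math.pi * 5 * k * dt) for k in range(1000)]
amp = demodulate(data, 5.0, dt)
```

Averaging such estimates over the pixel points corresponding to one group of sub-modulation units would give the per-group second parameter described above.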
According to another embodiment of the present disclosure, there is also provided a signal acquisition and processing apparatus, including: a signal processing device configured to perform a first processing on a light signal from a target scene; a signal acquisition device configured to acquire the first processed optical signal.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, further comprises an information obtaining device configured to obtain information in the target scene based on the acquired data.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the information obtaining apparatus is further configured to perform a second processing on the collected data to obtain information in the target scene; and forming the obtained information into an image.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the signal processing apparatus is further configured to determine an encoded signal of the target scene; and carrying out coding processing on the optical signal from the target scene based on the coding signal, wherein the signal after the coding processing is the optical signal.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the signal acquisition apparatus is further configured to acquire the coded light signal, which includes information of the coded signal and a light signal from the target scene.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the coded processed light signal is an integration result of multiplication of the coded signal and a light signal from the target scene.
The apparatus according to any one of the preceding embodiments of the present disclosure, for example, the signal processing apparatus includes a modulation signal determination unit and a spatial light modulator; the modulation signal determination unit is configured to determine a modulation signal; the spatial light modulator modulates the light signal from the target scene based on a modulation signal.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the modulation signal comprises a frequency parameter and a phase parameter, and the modulation signal determination unit is further configured to determine the frequency parameter and the phase parameter and determine the modulation signal based on the frequency parameter and the phase parameter.
According to the apparatus of any one of the preceding embodiments of the present disclosure, for example, the modulation signal determination unit determines a property and/or a type of the pre-collected optical signal, and determines the frequency parameter of the modulation signal based on the property and/or the type of the pre-collected optical signal.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the modulation signal determination unit determines a complexity of the target scene, and determines the frequency parameter of the modulation signal based on the complexity.
According to the device of any one of the preceding embodiments of the present disclosure, for example, a moving object or a brightness variation object is included in the target scene, the modulation signal determination unit determines a motion parameter of the moving object or a brightness variation parameter of the brightness variation object, and determines the frequency parameter of the modulation signal based on the motion parameter or the brightness variation parameter.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, further comprises a first optical element through which the optical signal from the target scene passes; the optical signal passing through the first optical element is projected onto the spatial light modulator, and the spatial light modulator receives and modulates the optical signal so projected onto it.
The apparatus according to any of the foregoing embodiments of the present disclosure, for example, the spatial light modulator includes a plurality of sub-modulation units, the light signal from the target scene is projected onto the sub-modulation units of the spatial light modulator, and the modulation signal determination unit determines the sub-modulation signal of the sub-modulation unit of each of the spatial light modulators; the spatial light modulator modulates the light signal from the target scene based on the determined sub-modulation signal of the sub-modulation unit.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the sub-modulation signal comprises a frequency parameter and a phase parameter, the modulation signal determination unit determines the frequency parameter and the phase parameter, and determines the sub-modulation signal based on the frequency parameter and the phase parameter of the sub-modulation signal.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the modulation signal determination unit is further configured to determine information to be obtained from the target scene in advance, and determine the number of frequencies and/or the frequency value in the frequency parameter of the sub-modulation signals of the sub-modulation units based on the information to be obtained from the target scene.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the modulation signal determination unit is further configured to determine a position of each of the sub-modulation units in the spatial light modulator, and further determine a frequency parameter and/or a phase parameter of each of the sub-modulation units based on the position.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the modulation signal of each of the plurality of sub-modulation units uses the same frequency.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the modulation signals of at least two sub-modulation units in the plurality of sub-modulation units use different frequencies.
The apparatus according to any preceding embodiment of the present disclosure, for example, the frequency value is in a low frequency range.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, comprises 0 in the plurality of frequency values.
The apparatus of any preceding embodiment of the present disclosure, for example, does not include 0 in the plurality of frequency values.
According to the device of any one of the preceding embodiments of the present disclosure, for example, the frequency parameter of the modulation signal of the sub-modulation unit is related to a first time, and the first time is related to a performance parameter of the signal acquisition device.
According to the apparatus of any one of the preceding embodiments of the present disclosure, for example, the frequency of the modulation signal of the sub-modulation unit is less than or equal to the inverse of the first time.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the number of frequencies is determined based on the number of images formed in a first time.
According to the apparatus of any preceding embodiment of the present disclosure, for example, the number of frequencies is proportional to the number of images formed in the first time.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, when the number of frequencies is plural, the numerical interval between the plural frequencies is determined based on the first time.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the numerical interval is less than or equal to the reciprocal of the first time.
According to the apparatus of any of the preceding embodiments of the present disclosure, for example, the phase parameters of at least three sub-modulation units of the plurality of sub-modulation units are different.
According to the device of any one of the preceding embodiments of the present disclosure, for example, the phase distribution of the phase parameter in the modulation signal of the sub-modulation unit is in the range of 0-2 pi.
According to the apparatus of any of the previous embodiments of the present disclosure, for example, the plurality of sub-modulation units are divided into a plurality of groups, each group includes at least two sub-modulation units, and all sub-modulation units in each group of sub-modulation units are adjacent in spatial position in the spatial light modulator; the modulation signal determination unit determines a sub-modulation signal of the sub-modulation unit based on the group.
According to the apparatus of any previous embodiment of the present disclosure, for example, the frequency values of each group of sub-modulation signals are the same, and the phase values of each group of sub-modulation signals are different.
According to the apparatus of any one of the preceding embodiments of the present disclosure, for example, the frequency value of the sub-modulation signal of each group of sub-modulation units is different from the frequency value of the modulation signal of the adjacent other groups of sub-modulation units.
According to the apparatus of any one of the preceding embodiments of the present disclosure, for example, the frequency value of the sub-modulation signal of each group of sub-modulation units is the same as the frequency value of the modulation signal of the adjacent other groups of sub-modulation units.
According to the device of any one of the preceding embodiments of the present disclosure, for example, the sub-modulation units comprise optical lenses which are pivoted about respective central axes at a frequency and a phase corresponding to the modulation signals of the sub-modulation units.
According to the device of any one of the preceding embodiments of the present disclosure, for example, the sub-modulation unit includes a liquid crystal cell, and the liquid crystal cell is turned on and off at a frequency and a phase corresponding to a modulation signal of the sub-modulation unit.
According to the device of any one of the preceding embodiments of the present disclosure, for example, at least one moving object or at least one brightness variation object is included in the target scene, and the modulation signal determination unit is further configured to determine at least one frequency parameter and/or at least one phase parameter in the modulation signal according to at least one motion parameter or brightness variation parameter of the moving object or brightness variation object; determining the modulation signal based on the at least one frequency parameter and/or at least one phase parameter.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, a frequency value range of the frequency parameter of the modulation signal is greater than or equal to a frequency value range of the at least one frequency parameter.
According to the device of any one of the previous embodiments of the present disclosure, for example, the motion parameter includes a motion speed and the brightness change parameter includes a brightness change speed, and the modulation signal determination unit determines, according to the motion speed or the brightness change speed, at least one time value at which the at least one moving object passes through a detection unit of the optical signal acquisition device, or at least one time value at which a brightness change point of the brightness change object passes through the detection unit; and determines the at least one frequency value from the at least one time value.
According to the device of any one of the foregoing embodiments of the present disclosure, for example, the motion parameters further include acceleration, the brightness change parameters further include brightness change acceleration, and the modulation signal determination unit determines a plurality of motion speeds of the moving object according to the motion acceleration and the motion speed, respectively; respectively determining a plurality of time values of the moving object passing through a detection unit of the optical signal acquisition device according to the plurality of movement speeds; or, the modulation signal determination unit determines a plurality of brightness change speeds of the brightness change object according to the brightness change acceleration and the brightness change speed, respectively; and respectively determining a plurality of time values of the brightness change point of the brightness change object passing through one detection unit of the optical signal acquisition device according to the plurality of brightness change speeds.
The apparatus according to any of the foregoing embodiments of the present disclosure, for example, the signal acquisition apparatus includes an image detector, to which the optical signal modulated by the spatial light modulator is projected; and the image detector collects the modulated optical signals.
The apparatus according to any of the foregoing embodiments of the present disclosure, for example, the spatial light modulator includes a plurality of sub-modulation units, the image detector includes a plurality of sub-detection units, and one sub-modulation unit corresponds to one or more sub-detection units of the image detector; and the sub-detection units collect the optical signals modulated by the sub-modulation units corresponding to the sub-detection units.
According to the apparatus of any of the previous embodiments of the present disclosure, for example, the image detector collects the encoded optical signal at a first time, where the first time is related to a performance index of the image detector.
The apparatus according to any of the foregoing embodiments of the present disclosure, for example, further includes a second optical element through which the optical signal modulated by the spatial light modulator passes; the light signal passing through the second optical element is projected to an image detector.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the information obtaining apparatus is further configured to obtain a coded signal corresponding to the light signal from the target scene, and obtain the information in the target scene according to the coded signal and the acquired data.
The apparatus according to any of the preceding embodiments of the present disclosure, for example, the acquiring an encoded signal corresponding to a light signal from the target scene includes: acquiring a frequency parameter in the coded signal as a first parameter; the obtaining information in the target scene according to the encoded signal and the acquired data comprises: obtaining a second parameter based on the collected data; second processing the acquired data to obtain information in the target scene based on the first and second parameters.
According to the apparatus of any of the foregoing embodiments of the present disclosure, for example, the acquired data includes a plurality of pixel points, each pixel point is associated with a sub-modulation unit of the spatial light modulator performing the first processing, the sub-modulation units are multiple and are divided into a plurality of groups, and the obtaining the second parameter based on the acquired data includes: and taking a plurality of pixel points corresponding to a group of sub-modulation units as a pixel point group, and determining the second parameter based on one or more pixel point groups.
In the signal acquisition processing method and apparatus of the embodiments of the present disclosure, the acquired image is already the processed information that the user wants to obtain; the data volume is therefore greatly reduced, the performance requirements on the signal acquisition device are lowered, and the signal acquisition and processing speed is greatly improved.
Drawings
Fig. 1 shows a flow chart of a signal acquisition processing method according to an embodiment of the present disclosure.
Fig. 2 shows a schematic diagram of a signal acquisition processing apparatus according to an embodiment of the present disclosure.
Fig. 3 shows a schematic structural diagram of a spatial light modulator according to an embodiment of the present disclosure.
Fig. 4 shows a schematic diagram of another signal acquisition processing apparatus according to an embodiment of the present disclosure.
Fig. 5 shows a schematic diagram of a target scene and the acquired data.
Fig. 6 is a flow chart of another signal acquisition and processing method according to an embodiment of the disclosure.
Fig. 7 shows a schematic diagram of a third signal acquisition processing apparatus according to an embodiment of the present disclosure.
Fig. 8 shows a comparison of images obtained in different ways.
Fig. 9 shows a time domain waveform of a pixel during an exposure time.
Detailed Description
The target scene described in the embodiments of the present disclosure is, for example, a scene on which a user wishes to perform signal processing and acquisition. The scene may be a static scene, a moving scene, or a scene containing both moving and static objects. Moving objects include objects that are displaced over a period of time, as well as objects that are not displaced during that period but whose brightness changes, such as neon lights.
Fig. 1 illustrates a signal acquisition processing method provided according to an embodiment of the present disclosure, and the signal acquisition processing method according to the embodiment of the present disclosure will be described below with reference to fig. 1. Referring to fig. 1, the signal acquisition processing method 100 includes the following steps S101-S102.
In S101, a first process is performed on a light signal from a target scene.
In S102, the optical signal after the first processing is collected.
According to the embodiment of the present disclosure, the optical signal from the target scene is processed first, and the processed optical signal is then collected. Compared with the prior art of processing after acquisition, the collected signals are already the processed information that the user expects to obtain, with the signals the user does not want removed, so the data volume is greatly reduced and the efficiency of signal processing and acquisition is greatly improved.
Fig. 2 shows a schematic diagram of a signal acquisition processing apparatus 200 according to an embodiment of the present disclosure. The signal acquisition processing method 100 corresponds to the signal acquisition processing apparatus 200. For simplicity of description, the method and the apparatus are described together, and all embodiments and examples relating to the method correspond one-to-one to those of the signal acquisition processing apparatus. The method and the apparatus are further described below with reference to figs. 1 and 2.
Referring to fig. 2, the signal acquisition processing device 200 includes a signal processing device 210 and a signal acquisition device 220. The signal processing device 210 performs a first processing on the light signal from the target scene S, and the signal acquisition device 220 acquires the processed optical signal. The signal processing device 210 is, for example, a device capable of processing signals, such as a filtering, amplifying, or encoding device. The signal acquisition device 220 may be a device that acquires an optical signal, such as an imaging apparatus: a camera, a charge-coupled device, an image detector, a photodetector, or the like.
In S101, a first process is performed on a light signal from a target scene. In the signal acquisition device, the signal processing device 210 performs a first process on the optical signal from the target scene S.
In one example, the light signal from the target scene may be, for example, light projected from the target scene by reflection, refraction, transmission, diffraction, or any combination thereof. The first processing includes various ways of processing the optical signal, for example filtering, selecting, or encoding it.
In one example, the first process may encode the light signal from the target scene. For example, the encoded signal of the target scene is first determined; a first process is then performed on the optical signal from the target scene based on that encoded signal, for example by encoding the optical signal with it. In one example, the encoded signal may also be a signal that controls the light signal: for example, the optical path of the light signal from the target scene is controlled by an optical lens or a liquid crystal cell, and/or parameters such as light intensity are controlled, and/or the light signals projected at different locations are marked or loaded with information. That is, the first processing applies the encoded signal to the optical signal of the target scene, and the processed signal is still an optical signal. Since the first process is performed in the optical domain, its processing speed can reach the level of the speed of light; compared with the prior art, in which a computer processes the acquired data, the processing speed is greatly improved.
After the first processing, the encoded light signal includes information of both the encoded signal and the light signal from the target scene. For example, the encoded light signal may be the result of integrating the product of the encoded signal and the light signal from the target scene: if x(t) represents the light signal from the target scene and y(t) represents the encoded signal, then the encoded light signal s may be represented by s = ∫ x(t)·y(t) dt, where t is a time parameter.
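The integral above can be sketched numerically. In this minimal Python example (the signal shapes and values are assumed for illustration and are not part of the original disclosure), the encoded value for one exposure is approximated by a discrete sum; because the toy scene signal contains a component at the encoding frequency, the product integrates to a non-zero value:

```python
import numpy as np

# Discretize s = ∫ x(t)·y(t) dt over one exposure window [0, T).
# x(t) and y(t) below are toy signals chosen for illustration only.
T = 1.0                                       # exposure time (arbitrary units)
n = 1000                                      # number of time samples
t = np.linspace(0.0, T, n, endpoint=False)
dt = T / n

x = 1.0 + 0.5 * np.cos(2 * np.pi * 3 * t)    # light signal from the scene
y = np.cos(2 * np.pi * 3 * t)                # encoding (modulation) signal

s = np.sum(x * y) * dt                        # s ≈ ∫ x(t)·y(t) dt ≈ 0.25
print(s)
```

The constant part of x(t) averages out against the zero-mean encoding signal, while the component at the encoding frequency survives — which is what lets the modulation select which scene content is collected.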
According to an example of the present disclosure, the signal processing device that performs the first processing on the light signal from the target scene may include a spatial light modulator. The spatial light modulator is capable of modulating a parameter of a light field through a unit device, including but not limited to a micro-mirror device, a liquid crystal unit, and the like, under active control, thereby loading information into a one-dimensional or two-dimensional light field to achieve the purpose of light wave modulation. The parameters include: amplitude, phase, polarization, and incoherent-coherent light conversion, etc. The embodiment of the disclosure uses the spatial light modulator to modulate the optical signal, and can encode the optical signal with different attributes, so that the optical signal can be selectively collected in the subsequent signal collection process.
Thus, the first processing may include projecting the light signal from the target scene onto a spatial light modulator, which then modulates the light signal from the target scene. Accordingly, the encoded signal may be a modulation signal of a spatial light modulator. Before modulation, the modulation signal of the spatial light modulator may first be determined, and then the optical signal from the target scene may be code modulated based on the modulation signal of the spatial light modulator. For example, the signal acquisition processing means may comprise a modulation signal determination unit for determining the modulation signal of the spatial light modulator. The modulation signal determination unit may be any one of a central processing unit, a microprocessor, a computer, a single chip, a chip or a chip set, and the like, and may be implemented by software, hardware or firmware.
In one example, the modulation signal may include one or more of a frequency parameter and a phase parameter, as well as other parameters such as amplitude. The present disclosure takes the frequency parameter and the phase parameter as examples. In determining the modulation signal of the spatial light modulator, a frequency parameter and a phase parameter of the modulation signal may be determined, respectively, and the modulation signal determined on that basis. The modulated signal may be represented by a function, for example a cosine function y(t) = cos(2πft + φ), where f represents the frequency parameter in the modulated signal, φ represents the phase parameter, and t denotes the time parameter, i.e. the value of the modulation signal at different points in time.
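As a small sketch of this representation (the function name and sample values are illustrative assumptions, not from the disclosure):

```python
import numpy as np

# y(t) = cos(2*pi*f*t + phi): value of the modulation signal at time t
# for frequency parameter f and phase parameter phi.
def modulation_signal(t, f, phi):
    return np.cos(2 * np.pi * f * t + phi)

# Sample one period at 8 points with a quarter-period phase offset.
t = np.linspace(0.0, 1.0, 8, endpoint=False)
y = modulation_signal(t, f=1.0, phi=np.pi / 2)
```

Varying f changes how quickly the sub-modulation unit oscillates within an exposure, while phi shifts where in the exposure its peaks fall — the two knobs the later examples assign per sub-modulation unit.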
In the embodiment of the present disclosure, the frequency parameter and the phase parameter may be determined in various ways to determine different modulation signals, so as to achieve different modulation effects, thereby facilitating the subsequent collection of the optical signal from the target scene.
In one example, when determining the frequency parameter, the complexity of the target scene may first be determined, and the frequency parameter of the modulated signal determined based on that complexity. The complexity of the target scene may be assessed in several ways, such as the number of objects in the target scene, the number and/or motion patterns of moving objects, the motion parameters of the moving objects such as speed and acceleration, and object properties such as texture and size. Determining the frequency parameter according to the complexity of the target scene allows the data volume to be properly controlled and adjusted, effectively reducing unnecessary data.
Furthermore, in another example, the property and/or type of the pre-collected optical signal may also be determined first, and the frequency parameter of the modulation signal may be determined based on the property and/or type of the pre-collected optical signal. Here, the pre-collected optical signal may be an optical signal that the user wishes to collect, with an optical signal that the user does not wish to collect removed. The pre-collected light signals are related to the information the user wishes to obtain from the target scene, and the user only wishes to collect the light signals corresponding to the information he wishes to obtain. For example, the pre-collected optical signal includes only foreground information of a specific frequency band in the target scene, but does not include background information, etc.
The properties and/or types of the pre-collected optical signals may include, for example, the brightness, frequency band, wavelength, phase, and amplitude of the optical signals. For example, if the pre-collected optical signal is an optical signal with a large luminance value, the frequency parameter of the modulation signal can be set in the low frequency region, since in most real scenes the spectral components of such signals are concentrated there. As another example, if the pre-captured light signal is a background signal in the target scene, the frequency parameter may be set to 0, since the spectrum of background information has no high-frequency components and generally has a value only at the direct-current component; the captured data obtained in this way is a background image. In the embodiment of the present disclosure, the modulation signal may be determined based on the parameters of the pre-collected optical signal, so that the collected data is exactly the information the user desires to obtain from the target scene; this effectively reduces the collection of unnecessary information, improves processing and collection efficiency, and reduces the data volume.
According to an embodiment of the present disclosure, a moving object may be included in the target scene. As described above, a moving object is an object that moves, and/or an object that does not move but whose brightness changes, for example a lamp whose light varies in brightness. In one example, when a moving object is included in the target scene, the frequency parameter of the modulation signal may also be determined according to a motion parameter of the moving object. For example, the spectral bandwidth of a high-speed moving object may be much larger than that of a low-speed moving object, so the frequency parameter can be determined from the speed of the moving object. Since the modulation signal can be determined based on the motion parameters of the moving object in the target scene, the trajectory of the moving object can be captured directly in the subsequent acquisition process, without performing trajectory detection on the acquired image sequence frame by frame as in the prior art, thereby effectively improving the processing speed.
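A rough way to see the relation between object speed and the frequency range required of the modulation (all numbers and names below are assumed for illustration): a point moving at speed v sweeps one detection unit's footprint d in d/v seconds, so the brightness pulse it produces has a bandwidth on the order of v/d.

```python
# Hypothetical sketch: frequency scale implied by a moving object's speed.
def transit_time(footprint_m, speed_m_s):
    """Time for the object to sweep one detection unit's footprint."""
    return footprint_m / speed_m_s

def characteristic_frequency(footprint_m, speed_m_s):
    """Order-of-magnitude bandwidth of the brightness pulse: 1 / transit time."""
    return 1.0 / transit_time(footprint_m, speed_m_s)

slow = characteristic_frequency(footprint_m=1e-3, speed_m_s=0.1)   # 100 Hz
fast = characteristic_frequency(footprint_m=1e-3, speed_m_s=10.0)  # 10 kHz
```

The faster object's spectrum extends two orders of magnitude higher, matching the statement that high-speed objects have a much wider spectral bandwidth than low-speed ones.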
Since the optical signals from the target scene include light from each direction of the scene, in order to modulate the optical signals from each direction of the target scene, in the embodiment of the present disclosure, modulation signals may be respectively set and modulated for the optical signals from each direction of the target scene, so that information of each direction in the target scene may be accurately acquired in a subsequent acquisition process. Fig. 3 shows a schematic structural diagram of a spatial light modulator according to an embodiment of the present disclosure. Referring to fig. 3, the spatial light modulator 210 includes a plurality of sub-modulation units 211. For example comprising tens, hundreds or thousands, or tens of thousands of sub-modulation units. Only 16 sub-modulation units 211 are drawn in fig. 3 as an example, and the other sub-modulation units are not drawn. The sub-modulation units are independent units, the sub-modulation units can be arranged in a one-dimensional or two-dimensional array in space, and each unit can independently receive control of optical or electric signals and the like and change the optical property of the unit according to the modulation signals. In the embodiment of the present disclosure, an optical signal from a target scene may be projected onto a plurality of sub-modulation units of a spatial light modulator, so as to implement modulation on optical waves of various directions irradiated on the spatial light modulator, and implement modulation on amplitude or intensity, phase, polarization state, wavelength, coherence state, and the like of the optical signal from the target scene.
According to one embodiment of the present disclosure, the sub-modulation unit may include various optical devices. For example, optical lenses such as mirrors are included, the mirrors being oscillated about respective central axes by parameters corresponding to modulation signals of the sub-modulation units. For example, parameters such as frequency and phase corresponding to the modulation signal may be preset, and the frequency and phase of the mirror may be controlled according to the set frequency and phase. According to another example, the sub-modulation unit may further include a liquid crystal unit, and the liquid crystal unit may be turned on and off at a frequency and a phase corresponding to a modulation signal of the sub-modulation unit, thereby modulating the optical signal.
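A binary on/off modulator such as the liquid crystal unit can be modeled, under an assumed convention (not stated in the disclosure), as transmitting whenever the underlying cosine modulation signal is non-negative:

```python
import numpy as np

# Assumed model: the cell is open (transmission 1) when cos(2*pi*f*t + phi) >= 0
# and closed (transmission 0) otherwise -- a square-wave approximation of the
# cosine modulation at the same frequency and phase.
def lc_transmission(t, f, phi):
    return (np.cos(2 * np.pi * f * t + phi) >= 0).astype(float)

t = np.linspace(0.0, 1.0, 1000, endpoint=False)
gate = lc_transmission(t, f=5.0, phi=0.0)   # 5 open/close cycles per unit time
duty = gate.mean()                          # close to 0.5
```

The cell thus chops the incoming light at the frequency and phase of its sub-modulation signal, which is the behavior the subsequent acquisition relies on.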
In determining the modulation signals of the spatial light modulators, the sub-modulation signals of the sub-modulation units of each spatial light modulator may be determined separately. For example, the sub-modulation signals of the sub-modulation units of the spatial light modulator are determined by a modulation signal determination unit. In this way, when encoding the optical signal from the target scene, the optical signal from the target scene can be code-modulated based on the sub-modulation signal of the sub-modulation unit.
Similarly, the sub-modulation signal may include parameters such as a frequency parameter and a phase parameter. The frequency parameter and the phase parameter of the sub-modulated signal may be determined separately, the sub-modulated signal being determined based on the frequency parameter and the phase parameter of the sub-modulated signal.
Similar or identical to the method for determining the modulation signal of the spatial light modulator in the foregoing embodiments, in determining the sub-modulation signal, in addition to the factors mentioned in all the foregoing embodiments, the frequency parameter and the phase parameter of the sub-modulation signal may be determined separately from one or more of the following aspects.
In one example, the frequency parameter of the modulated signal may be determined from a performance parameter of the signal acquisition device. The signal acquisition device includes an imaging device such as a camera, a charge-coupled device, or an image detector. The performance parameters of the signal acquisition device may include resolution, exposure time, and the like. For example, in order to apply the principle of the discrete Fourier transform to data acquisition, the frequency of the modulation signal of the sub-modulation units may be related to a first time. For example, to image a moving scene, where the minimum time of a scene transition is generally less than the first time, the modulation frequency value may be set to the reciprocal of the first time or less. Alternatively, the frequency of the modulation signal may be an integer or non-integer multiple of the reciprocal of the first time. The first time is related to a performance parameter of the signal acquisition device; for example, the first time may be the exposure time of the signal acquisition device.
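The frequency bound described here can be sketched as follows (the exposure value is assumed, and taking the "first time" to be the exposure time is one of the disclosure's own examples):

```python
# Sketch: modulation frequency bounded by the reciprocal of the first time.
def max_modulation_frequency(exposure_time_s):
    return 1.0 / exposure_time_s

def frequency_allowed(f_hz, exposure_time_s):
    """True if f does not exceed the reciprocal of the exposure time."""
    return f_hz <= max_modulation_frequency(exposure_time_s)

f_max = max_modulation_frequency(0.02)       # 20 ms exposure -> 50 Hz ceiling
print(frequency_allowed(25.0, 0.02))         # True
print(frequency_allowed(80.0, 0.02))         # False
```

A shorter exposure raises the ceiling; a longer exposure lowers it, which is why the bound is stated in terms of the detector's performance parameters.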
In one example, the number of different frequency values used by the sub-modulation signals may be determined according to the number of images to be formed in the first time; that is, the number of different frequency values is proportional to the number of images to be formed. For example, in a compressive sampling application, when a user wishes to obtain many images within the first time, many different frequencies may be set, e.g. 4 or more; when the user wishes to obtain fewer images in the first time, fewer different frequencies may be set, e.g. 3 or fewer.
Furthermore, since the positions of the sub-modulation units in the spatial light modulator differ, in one example, when determining the frequency parameter of a sub-modulation signal, the position of the sub-modulation unit in the spatial light modulator may first be determined, and the frequency and/or phase of the sub-modulation unit determined based on that position. Referring to fig. 3, the plurality of sub-modulation units 211 in the spatial light modulator 210 are labeled X1-X16 and are distributed at different positions of the spatial light modulator 210.
In one example, the modulation signals of each of all the sub-modulation units in the spatial light modulator use the same frequency, and the phases may be different. For example, the frequencies of all sub-modulation signals, including X1-X16, are the same. At least two of all the sub-modulation signals including X1-X16 are different in phase.
In one example, at least two of the plurality of sub-modulation units in the spatial light modulator use different frequencies.
For example, the plurality of sub-modulation units of the spatial light modulator may be divided into a plurality of groups, all sub-modulation units in each group being adjacent in spatial position in the spatial light modulator, and each group including at least two sub-modulation units, for example 2, 3, 4, 9, or 16 sub-modulation units. The frequency and phase of the modulation signal of each sub-modulation unit in a group are then determined according to one or more of the information of each group category, the position of the group in the spatial light modulator, and the position of each sub-modulation unit in the group. For example, the frequency value is the same within each group of sub-modulation signals and differs from the frequency values of the adjacent groups of sub-modulation units, and the phase values within each group of sub-modulation signals may differ. Taking the 16 sub-modulation units X1-X16 in fig. 3 as an example, they can be divided into four groups, i.e., X1-X4, X5-X8, X9-X12, and X13-X16. X1-X4 can use the same frequency f1 and different phases 0, pi/2, pi, and 3 pi/2; X5-X8 the same frequency f2 and different phases 0, pi/2, pi, and 3 pi/2; X9-X12 the same frequency f3 and different phases 0, pi/2, pi, and 3 pi/2; and X13-X16 the same frequency f4 and different phases 0, pi/2, pi, and 3 pi/2. f1, f2, f3 and f4 are different from each other.
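The grouping in the example above can be written out as frequency and phase maps. The concrete frequency values f1..f4, the four-phase pattern, and the 4x4 row-wise layout are assumptions for illustration:

```python
import numpy as np

# 16 sub-modulation units in a 4x4 grid, grouped row by row (assumed layout):
# each group shares one frequency, and the phases differ within the group.
freqs = [10.0, 20.0, 30.0, 40.0]                  # f1..f4 (assumed values)
phases = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]   # phases inside each group

freq_map = np.array([[f] * 4 for f in freqs])     # same frequency per group
phase_map = np.array([phases for _ in freqs])     # distinct phases per group
```

Within each row all entries of freq_map are equal while the four phase_map entries differ, and adjacent rows use different frequencies — matching the grouping rule described above.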
Of course, according to different information that the user wants to obtain from the target scene, the frequency value of each group of sub-modulation signals may also be set to be the same as the frequency values of sub-modulation signals of other adjacent groups of sub-modulation units, and the phase of each group of sub-modulation signals is different. Or the frequency value of each group of sub-modulated signals is different, etc.
In one example, the information to be obtained from the target scene by the user or the type of the pre-collected optical signal is predetermined, and the number of frequencies and/or the value of the frequencies in the frequency parameters of the sub-modulation signals of the sub-modulation units are determined based on the information desired to be obtained from the target scene by the user or the parameters of the pre-collected optical signal. Here, the information to be obtained by the user from the target scene and the light signal pre-collected by the user may be correlated.
For example, when a user wishes to compressively sample a moving-object scene by processing and capturing it, a sequence of continuously changing natural images is desired; that is, the collected intensity value of each pixel point of the target scene is a continuously changing time-domain waveform. Such a waveform is sparse in the frequency domain: only a few spectral components have large non-zero values, while most spectral components have small values close to zero. To compressively sample the target scene, these smaller values may be discarded during signal processing. Since in most real scenes the spectral components corresponding to points with larger brightness values are concentrated in the low-frequency region, in one example some frequencies can be selected from the low-frequency region as the frequency parameters of the modulation signal and used to modulate the optical signal from the target scene, so that only the spectrum in the low-frequency region is collected and compressive sampling is realized.
In one example, the frequency values used for the compressive-sampling modulation signal may increase gradually from 0. The frequency interval, i.e. the increment step, may be the inverse of the first time, or less. In addition, the spacing between all adjacent frequencies may be the same. The first time may be, for example, the exposure time of a camera, image detector, or other signal acquisition device.
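A minimal sketch of this frequency grid, assuming a hypothetical exposure time `T` and frequency count (the patent specifies neither):

```python
# Frequency grid for the modulation signal: values increase from 0 in equal
# steps, the step being the inverse of the first time (taken here to be an
# assumed exposure time T of the signal acquisition device).
T = 0.01                     # assumed exposure time (s)
num_frequencies = 8          # assumed number of modulation frequencies
delta_f = 1.0 / T            # increment step: inverse of the first time
frequencies = [k * delta_f for k in range(num_frequencies)]
# All adjacent spacings equal delta_f; a smaller step than 1/T would also
# satisfy the rule described above.
```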
Further, in one example, the number of frequencies may be determined based on the number of images formed within a first time, the first time being related to or the same as the exposure time of the signal acquisition device. For example, when the signal acquisition processing method of the present disclosure is used to compressively sample a target scene, the number of frequencies used for the modulation signal may be set according to the compression ratio desired by the user. The number of frequencies used determines the compression ratio of the video signal: the fewer the frequencies used, the larger the compression ratio and the fewer the imaged images; the more the frequencies used, the smaller the compression ratio and the more the imaged images. In addition, different numbers of frequencies can be selected according to the complexity of the target scene and the number, motion mode, and motion parameters of the moving objects in it, achieving a better balance for the compression ratio of compressive sampling.
For example, the user may wish to obtain the motion trajectory of a moving object in the target scene by processing and then capturing the scene. The image detector serving as the optical signal acquisition device comprises a plurality of detection units, each of which collects the data of one pixel point. When an object moves through the scene, the brightness value of the pixel point of some detection unit changes as the object appears there; on a time-domain waveform whose horizontal axis is time and whose vertical axis is the pixel brightness value, this change appears as a pulse signal. Because the moving object appears at different pixel points at different moments, the pulses on the time-domain waveforms of different pixel points appear at different times, and the corresponding frequency-domain phases differ. Therefore, in this case, the phases of all the sub-modulation signals may be different. For example, to avoid phase wrapping, the phase of the modulation frequency may be restricted to the range 0-2π within one exposure time, so that different times correspond one-to-one to different phases. Further, the frequencies of all the sub-modulation signals may be set to be the same, for example equal to or less than the inverse of the exposure time. It is also possible to set the frequency within each group of sub-modulation signals to be the same, with differences between groups. In this way, only the motion trajectory and its phase distribution at different time points are acquired in step S102, and no other information is collected.
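The one-to-one mapping between appearance time and phase described above can be sketched as follows; the exposure time `T` and the linear form of the mapping are illustrative assumptions, not values from the patent:

```python
import math

T = 0.01  # assumed exposure time (s); the patent gives no specific value

def time_to_phase(t: float) -> float:
    """Map an appearance time t in [0, T) to a phase in [0, 2*pi).

    Restricting the full phase range to one exposure avoids phase wrapping:
    each appearance time corresponds to exactly one phase, and vice versa.
    """
    return 2 * math.pi * t / T

def phase_to_time(phi: float) -> float:
    """Invert the mapping: recover the appearance time from the phase."""
    return phi * T / (2 * math.pi)

t_appear = 0.004                       # object reaches this pixel at t = 4 ms
phi = time_to_phase(t_appear)
```

Because the phase stays strictly below 2π within one exposure, the inverse mapping is unambiguous, which is the point of the anti-wrapping constraint.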
Because the frequency and phase parameters of the modulation signal are set to particular values or ranges, the motion trajectory information in the target scene can be obtained directly from the collected data. There is no need, as in the prior art, to image the target scene into a sequence of images and then extract the trajectory from that sequence, which greatly improves the speed and efficiency of trajectory detection.
As another example, the frequency value of the sub-modulation signal may be set to include 0 or not, according to the information the user wants to obtain from the target scene. For example, when the optical signal the user wants to acquire is the background signal of the target scene, the frequency parameter may be set to 0, since the spectrum of the background information has no high-frequency components and takes a value only at the direct-current component; the acquired data is then a background image. When the optical signal the user wants to collect is the foreground information with the background removed, the frequency parameter may instead be set to frequencies other than 0, so that the acquired data excludes the background information. If the information the user wishes to obtain may include background information, the frequency parameter of the sub-modulation signal may include 0.
As another example, when the target scene contains a moving object (including a brightness-changing object), the frequency parameters of the sub-modulation signals may be determined not only from the factors mentioned in the foregoing examples but also from the motion parameters of the moving object. The frequency parameters, such as the number of frequencies and/or the frequency values, of the sub-modulation signals of the sub-modulation units are then determined from the motion parameters alone or in combination with one or more of those factors. The frequency parameters of all the sub-modulation signals may be the same or different, or partly the same and partly different. In one example, a user desires target recognition of a particular target in the scene. Owing to differences in texture, the time-domain waveforms of objects moving at different speeds often differ greatly, and so do their spectra. The spectral distribution is the joint result of the various parameters of the moving objects, and a change in any physical quantity changes the spectral composition, for example a sharp increase in intensity at frequency components that are otherwise weak. Therefore, by adopting such frequency values as its frequency parameters, the modulation signal can realize detection of a specific moving object. To avoid recognition errors, the frequency value of the modulation signal may also be adjusted suitably, for example to a frequency range containing the above value.
In one example, the motion parameters of the moving object include a motion speed, and the brightness-change parameters of a brightness-changing object include a brightness-change speed. When the frequency value is determined from the motion parameters, the time the moving object takes to pass one detection unit of the optical signal acquisition device, such as an image detector, can be determined from the motion speed or the brightness-change speed, and the frequency value then determined from that time value. There may be one time value or several, and correspondingly one frequency value or several, the latter forming a frequency distribution. The frequency value is related to the time value, for example equal to its inverse or less. If the motion parameters also include a motion acceleration, or the brightness-change parameters a change acceleration, the object's speeds at different moments can be determined from the acceleration and the initial speed.
From the resulting plurality of speeds, a plurality of time values for the moving object to pass one detection unit of the optical signal acquisition device can then be determined. Likewise, for a brightness-changing object, a plurality of brightness-change speeds can be determined from the change acceleration and the initial change speed, and from them a plurality of time values for the object to pass one detection unit. The frequency values are then determined from the time values; for example, each frequency value is the inverse of the corresponding time value.
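As a numeric sketch of this step, with a hypothetical detection-unit width, initial speed, and acceleration (none of which is specified in the patent):

```python
# Hypothetical numbers: an object crosses one detection unit of width d in
# time d / v(t); the frequency value is the inverse of that time, v(t) / d.
# With an acceleration, the speed (hence the frequency) differs per moment.
d = 1e-3                       # assumed detection-unit width (m)
v0 = 0.5                       # assumed initial speed (m/s)
a = 2.0                        # assumed acceleration (m/s^2)
sample_times = [0.0, 0.05, 0.1]

speeds = [v0 + a * t for t in sample_times]        # v(t) = v0 + a * t
crossing_times = [d / v for v in speeds]           # time through one unit
frequencies = [1.0 / dt for dt in crossing_times]  # frequency = 1 / time value
```

An accelerating object thus yields a set of frequency values rather than a single one, matching the frequency distribution described above.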
Furthermore, according to embodiments of the present disclosure, the phase parameter of a sub-modulation signal may be determined in relation to its frequency and/or to the group of sub-modulation units. For example, with the grouped sub-modulation units of the foregoing embodiment, the modulation signals within one group of sub-modulation units may be given different phase values. To avoid phase-wrapping problems, the phase distribution of the modulation signals of the sub-modulation units comprises a plurality of different phases, for example 3, 4, or more, which may lie in the range 0-2π. Within one exposure time of the image detector, the phase distribution of one detection unit spans 0-2π, so phase wrapping is avoided: the moments at which a moving object appears correspond one-to-one to frequency-domain phases, and no ambiguous values arise.
Fig. 4 shows a schematic diagram of another signal acquisition processing device according to an embodiment of the present disclosure. To project the optical signal from the target scene onto the spatial light modulator more accurately, referring to fig. 4, in an embodiment the signal acquisition processing device 400 may further include a first optical element 430 in addition to the signal processing device, such as the spatial light modulator 410, and the signal acquisition device, such as the image detector 420, of all the embodiments described above. The optical signal from the target scene S then passes through the first optical element 430 before being projected onto the spatial light modulator 410, which receives and modulates it. The first optical element may be a single optical element or a group of optical elements, and may include, for example, one or more of lenses, lens groups, prisms, mirrors, and optical fibers for relaying the optical path or imaging. In one example, the first optical element 430 projects the optical signal from the target scene S onto the spatial light modulator 410 and may also form an image on it. In this way the first optical element adjusts the optical path so that the optical signal from the target scene is projected onto the spatial light modulator more accurately.
The method of performing the first processing on the optical signal from the target scene according to the present disclosure is described above, and the method of acquiring the first processed optical signal is further described below.
In S102, the optical signal after the first processing is collected. In the signal acquisition processing device, the signal acquisition device acquires the optical signal after the first processing.
See the signal acquisition and processing apparatus 400 of fig. 4. In the signal acquisition processing device 400, the signal acquisition device may be an image detector 420 or a Charge Coupled Device (CCD), a camera, or the like capable of forming a pixel array. When the signal processing device is the spatial light modulator 410, then, in step S102, the collecting the first processed optical signal includes: the optical signal modulated by the spatial light modulator 410 is projected to the image detector 420, and then the modulated optical signal is collected by the image detector 420.
The data collected by the signal acquisition device may be a one-dimensional or two-dimensional pixel lattice, for example one formed by all the sub-detection units on the image detector converting the received modulated optical signals into electrical signals. Each pixel point has its own pixel value, including luminance, chrominance, and other information, and contains information of the modulated optical signal at a certain position, frequency, and phase, or a combination of such information. Fig. 5 shows a schematic diagram of a target scene and its acquired data. Referring to fig. 5, the left diagram is the target scene, the middle diagram the pixel lattice of the acquired data, and the right diagram an enlarged view of a portion of the lattice, Y1-Y16, as an example. These collected pixel values can be used directly as information from the target scene, or processed further to obtain more information from it.
The image detector is described as an example in this disclosure. The other signal acquisition devices operate in the same or similar manner.
The image detector may comprise a plurality of detection units, each of which, in an embodiment of the present disclosure, corresponds to one or more sub-modulation units of the spatial light modulator. That is, the modulated optical signal of one or more sub-modulation units is detected by one detection unit of the image detector. Of course, one or more detection units may instead correspond to one sub-modulation unit of the spatial light modulator; the two may be in a many-to-one or one-to-many relationship. The sub-detection units of the image detector collect the optical signals modulated by their corresponding sub-modulation units. The pixel point corresponding to each sub-detection unit in the image collected by the image detector contains information of the modulated optical signal at a certain position, frequency, and phase, or a combination of such information. Each sub-detection unit corresponds to one pixel point. Referring to the middle diagram of fig. 5, the collected data includes a plurality of pixel points; Y1-Y16 denote 16 of them, and each pixel point may correspond to a sub-modulation unit of the spatial light modulator, for example Y1-Y16 in fig. 5 corresponding to X1-X16 in fig. 3, respectively.
In one example, the acquisition process may be limited to a certain time: for example, the encoded optical signal is acquired within a first time related to the performance of the image detector, such as its exposure time. That is, the acquisition process is completed within the exposure time.
To project the modulated optical signals onto the image detector more accurately, referring to fig. 4, in one example a second optical element 440 may be disposed between the spatial light modulator 410 and the image detector 420. Likewise, the second optical element 440 may be a single element or a combination of elements, for example one or more of lenses, lens groups, mirrors, and optical fibers. The modulated optical signal from the spatial light modulator 410 thus first passes through the second optical element 440 and is then projected onto the image detector 420 for collection, so that it can be focused on the image detector more accurately.
According to the embodiment of the present disclosure, the optical signal of the target scene is processed before being collected, so the collected image is exactly the information the user needs to obtain, and the data volume is greatly reduced. At the same time, the acquired information contains more useful information than direct imaging of the target scene would, recording the information of the target scene over a plurality of frequency components; this avoids information loss caused by the limited performance of the acquisition device and lowers the requirements on that performance, such as camera frame rate, transmission-link bandwidth, and storage space. In addition, since the processing is performed in the optical domain, the processing speed can reach the level of the speed of light.
The signal acquisition processing method and apparatus according to the above embodiment of the present disclosure are introduced above, and the signal acquisition processing method and apparatus according to another embodiment of the present disclosure will be further described below.
Fig. 6 illustrates another signal acquisition processing method according to an embodiment of the disclosure. Referring to fig. 6, on the basis of all the foregoing embodiments and examples, the signal acquisition processing method further includes step S103, in S103, obtaining information in the target scene based on the acquired data. That is, in the signal acquisition processing method shown in fig. 6, in S101, the first processing is performed on the light signal from the target scene. In S102, the optical signal after the first processing is collected. In S103, information in the target scene is obtained based on the acquired data.
Fig. 7 shows a schematic diagram of a third signal acquisition and processing device according to an embodiment of the disclosure. The signal acquisition processing method 600 of fig. 6 corresponds to the signal acquisition processing apparatus 700 shown in fig. 7. Referring to fig. 7, the signal acquisition processing apparatus 700 includes a signal processing apparatus 710, such as a spatial light modulator, a signal acquisition apparatus 720, such as an image detector, and an information obtaining apparatus 730. The information obtaining device 730 can be, for example, a computer, a microprocessor, a central processing unit, etc., and can be implemented in any one of three implementations of software, hardware, and firmware.
Then, the signal acquisition processing method shown in fig. 6 may be, for example, that the signal processing device 710 performs first processing on the optical signal from the target scene S, the signal acquisition device 720 acquires the optical signal after the first processing, and the information obtaining device 730 obtains information in the target scene based on the acquired data.
That is, after the processed light signals are acquired, information in the target scene is also obtained based on the acquired data. The obtained information in the target scene may include, for example, an imaged image of the entire target scene, an image of a portion of the target scene, an image of one or more objects therein, an image of a point in time, an image of the target scene after other processing, a motion trajectory of an object, information of a point or points, background information, foreground information, and so on.
In one example, second processing is performed on the acquired data to obtain information in the target scene, and the obtained information is imaged. The result may be one image or several, such as pictures or a video sequence. The information in the target scene may also be displayed in other ways, for example as a waveform or a distribution of pixel values.
In the foregoing embodiment, it can be known that the data collected in step S102 is pixel points formed by converting the received optical signals into electrical signals by all the sub-detection units on the image detector, and each pixel point has its own pixel value, i.e., a luminance value and/or a chrominance value, etc. Each pixel point contains information of the modulated optical signal at a certain position, a certain frequency and a certain phase or the combination information of the information.
Since the acquisition of the modulated optical signal by the image detector takes place over a period of time, for example the exposure time of the image detector (different image detectors have different exposure times T, for example T = 0.01 seconds), each pixel point contains a combination of information from multiple time points of the modulated optical signal within the exposure period. Because the optical signal from the target scene was modulated beforehand, the embodiments of the present disclosure can recover the information of the target scene at a particular time point within the exposure time from the collected combined information.
Fig. 8 shows a comparison of images obtained in different ways, i.e. the image formed by the image detector over the exposure time versus the images at particular time points recovered by an embodiment of the present disclosure. Referring to fig. 8, the upper-left image, fig. 8-1, is the target scene; the lower-left image, fig. 8-2, is the image formed by the image detector during the exposure time; and the three images on the right, figs. 8-3, 8-4, and 8-5, are the images recovered by the embodiment of the present disclosure at three different time points within the exposure time, t = 6 ms, t = 60 ms, and t = 100 ms. The images restored by the embodiment of the present disclosure are clearly sharper than the image obtained by normal photography with the image detector, and detail information of the image can be obtained.
In one example, the time-domain waveform of each pixel point in the collected data within the exposure time is obtained separately, so as to obtain information in the target scene. The following description takes one pixel point as an example. Fig. 9 shows the time-domain waveform of a pixel point during the exposure time. The waveform may take any shape, for example a sine, a cosine, or another waveform; the diagram of fig. 9 is merely schematic. The abscissa is the time parameter t, ranging over 0-T, where T is the exposure time of the image detector; the ordinate is the pixel value, e.g. the luminance value, of the pixel point at different moments. Because the target scene may include moving or brightness-changing objects, for example a person blinking during a photograph or a fast-moving car, the brightness value of the same pixel point may differ between time points, and the time-domain waveform shows how the brightness value of one pixel point changes within the exposure time.
The time-domain waveform of every other pixel point in the data acquired by the image detector is obtained in the same way. Similarly, the pixel values of all the pixel points at a given time point can be obtained, so that the scene information at that time point can be recovered from them.
To obtain the time-domain waveform of a pixel point in the collected data, according to the embodiment of the present disclosure, the scene information may be restored taking as a unit the sub-detection units of the image detector corresponding to a plurality of sub-modulation units of the spatial light modulator. For example, the sub-modulation units are divided into groups, and the scene information is restored taking the pixel points of the sub-detection units corresponding to each group as a unit; restoration may also take the pixel points corresponding to several groups together as a unit. The more groups in a unit, the lower the resolution of the restored image; the fewer groups, the higher the resolution. The following description takes the pixel points Y1-Y16 of the sub-detection units corresponding to the sub-modulation units X1-X16 in fig. 3 as an example.
As shown in fig. 5, the pixel group Y1-Y16 is spatially close and corresponds approximately to the same position of the object, so the information for that position of the target scene can be obtained in units of Y1-Y16, for example a luminance value P1 for that position. A chrominance value can additionally be obtained with a color image sensor. In one example, since the target scene position corresponding to Y1-Y16 may cover several pixel points, the luminance values of all of those pixel points may be set to P1. In another example, they may be set to values related to P1 according to prior data, for example values obtained from P1 by processing such as filtering and smoothing. Performing this operation for the pixel group of the sub-detection units corresponding to each group of sub-modulation units yields the information at each position of the target scene, forming an image.
The following description will be made by taking an example of obtaining the luminance value P1 from Y1-Y16.
In one example, to obtain information in the target scene, such as the luminance value P1, from the collected data Y1-Y16, a coded signal corresponding to the position of the target scene may be first obtained, and then the collected data may be parsed according to the coded signal. For example, the modulation signals of the sub-modulation units X1-X16 corresponding to Y1-Y16 are obtained. And acquiring a frequency parameter in each sub-modulation signal as a first parameter for data analysis. In addition, the second parameter can be obtained based on the acquired data Y1-Y16. Then, information in the target scene is obtained based on the first and second parameters.
In one example, the first parameter may be acquired in units of one of the groups of sub-modulation units of the spatial light modulator described above. For example, as in the previous embodiments, the sub-modulation units X1-X16 may be divided into four groups, X1-X4, X5-X8, X9-X12, and X13-X16, with the frequency parameters of the modulation signals within each group the same: X1-X4 may use the same frequency f1, X5-X8 the same frequency f2, X9-X12 the same frequency f3, and X13-X16 the same frequency f4. One or more of the group frequency parameters f1, f2, f3, and f4 may then be used as the first parameter.
In one example, the collected data includes a plurality of pixel points, e.g. Y1-Y16, and the brightness information of each pixel point is obtained. The pixel points collected by the detection units corresponding to one group of sub-modulation units can be taken as a pixel group, and the second parameter determined from each group. For example, with the sub-modulation units X1-X16 divided into the four groups X1-X4, X5-X8, X9-X12, and X13-X16, Y1-Y16 can similarly be divided into four groups, Y1-Y4, Y5-Y8, Y9-Y12, and Y13-Y16. A second parameter is then determined from the pixel values of each group of pixel points.
In one example, the second parameter may be calculated according to the pixel values of all the pixels in each pixel group. For example, for the pixel values of all the pixels in each pixel group, a four-phase shift recovery algorithm is used to obtain the second parameter. The second parameter may be one or more.
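The patent names a four-phase-shift recovery algorithm without spelling it out. The sketch below uses a common textbook four-step formulation, with measurements at phase offsets 0, π/2, π, and 3π/2; treating this as the intended variant is an assumption:

```python
import math

def four_step_phase_shift(i1, i2, i3, i4):
    """Recover amplitude B and phase phi from four measurements
    I_k = A + B * cos(phi + k * pi / 2), k = 0..3."""
    phase = math.atan2(i4 - i2, i1 - i3)      # (2B sin phi) / (2B cos phi)
    amplitude = 0.5 * math.sqrt((i1 - i3) ** 2 + (i4 - i2) ** 2)
    return amplitude, phase

# Synthetic check with an assumed offset A, amplitude B, and phase phi.
A, B, phi = 1.0, 0.4, 0.8
measurements = [A + B * math.cos(phi + k * math.pi / 2) for k in range(4)]
amp, ph = four_step_phase_shift(*measurements)
```

The subtraction of opposite-phase measurements cancels the constant offset A, which is why four shifted measurements suffice to recover both amplitude and phase.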
In one example, a third parameter may also be obtained, which may be a phase parameter in each sub-modulated signal. And determining a second parameter based on the third parameter and a plurality of pixel points acquired by the detection units corresponding to the group of sub-modulation units. For example, a four-phase shift recovery algorithm is performed based on the third parameter to obtain the second parameter.
According to the second parameter obtained from each pixel point group and the first parameter obtained from the coding information, the frequency spectrum of the target scene at the position corresponding to the pixel point group can be obtained.
In one example, taking the first and second parameters as inputs, an inverse Fourier transform may be performed on the corresponding part of the acquired data to obtain the time-domain waveform of the target scene at the corresponding position. For example, as shown in fig. 9, a time-domain waveform gives the luminance value L of the pixel point at the corresponding position of the target scene at different times t within the exposure time. Obtaining the luminance values of the target scene at each position and each time point in the same way, an image can be formed.
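A sketch of this inverse-Fourier step, with hypothetical frequencies standing in for the first parameter and hypothetical complex spectral values for the second; the explicit sum of complex exponentials represents the inverse transform restricted to the measured frequencies:

```python
import numpy as np

# Assumed inputs: the first parameter is the set of modulation frequencies,
# the second parameter the complex spectral values recovered for them.
T = 0.01                                             # assumed exposure time (s)
frequencies = np.array([0.0, 100.0, 200.0])          # first parameter (Hz)
coefficients = np.array([1.0, 0.3 - 0.1j, 0.05j])    # second parameter

t = np.linspace(0.0, T, 200, endpoint=False)

# Explicit inverse Fourier synthesis over the measured frequencies only;
# the real part is the luminance value L(t) of the pixel over the exposure.
waveform = np.real(
    sum(c * np.exp(2j * np.pi * f * t) for f, c in zip(frequencies, coefficients))
)
```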
In one example, where the target scene includes moving objects (including brightness-changing objects), the changing images can be recovered from the acquired data. For example, when the user wishes to acquire information about the target scene within a predetermined time, such as one exposure of the image sensor, i.e. one photographing time, a plurality of images showing how a moving object in the scene changes during the exposure time can be recovered, whereas normal photographing would yield only a single image.
The prior art is limited by the performance of the camera, such as its exposure time: when a moving object is photographed, the image captured within the exposure time is often blurred, and scene details cannot be seen clearly. According to the embodiments of the present disclosure, because the optical signal from the target scene is effectively processed before being collected, a plurality of sharp images within the camera's exposure time can be obtained. As shown in fig. 8, fig. 8-1 on the left is the target scene; fig. 8-2 is the single image obtained by the image detector in normal photographing, i.e. without any processing of the pre-acquired data; and figs. 8-3, 8-4, and 8-5 on the right are three images of the target scene at different time points acquired by the signal acquisition device after the first processing by the signal processing device. The comparison shows that whereas an ordinary camera generally yields only blurred ghosting for a fast-moving object, the images recovered by the embodiment of the present disclosure restore the object's details at different time points, which is especially valuable for imaging fast-moving objects. And because the processing is optical signal processing, it is ultra-fast, reaching the level of the speed of light.
In the embodiment of the present disclosure, in step S101, the frequency parameter and the phase parameter of the modulation signal of each group of sub-modulation units may be set in advance according to the information that the user wants to obtain from the target scene. Thus, in step S103, the user can obtain different information in the target scene from the collected data. Or, in the signal acquisition and processing device, the signal processing device presets the frequency parameters and the phase parameters of the modulation signals of each group of sub-modulation units; then, the signal information obtaining means may obtain different information in the target scene from the collected data. That is to say, the embodiment of the present disclosure can be applied to different technical fields according to user requirements.
In one example, a user wishes to compressively sample a target scene. The target scene may include a moving object, such as the running car shown in the upper left diagram of fig. 8. After the target scene is compressively sampled, in the plurality of sampled images obtained at different time points, the intensity values of the pixel points corresponding to the same position of the target scene form a continuously changing time-domain waveform over a certain time T, for example, the waveform shown in fig. 9. In fig. 9, the abscissa represents time t, the ordinate represents the intensity value L of a pixel, and the waveform represents the change in the luminance value of a certain pixel at different times t. Thus, the images composed of all the pixel points form a continuous natural image sequence over the time T.
Since the time-domain waveform is sparse in the frequency domain, that is, only a few spectral components have large nonzero values while most spectral components are close to zero, in S101, when the signal processing apparatus performs the first processing on the optical signal from the target scene, the small values can be discarded to implement compressive sampling (video compression). For example, by setting the frequency parameter and the phase parameter of the encoded signal according to the above analysis during the first processing in step S101, the signal acquisition means acquires in S102 only the spectral components whose values are large, and the information obtaining means may then directly obtain the compressed data in S103. For example, in step S101, the signal processing device selects some frequencies from the low-frequency region as the frequency parameters of the modulation signal and modulates the optical signal from the target scene accordingly. When the signal acquisition processing method of the present disclosure is used to compressively sample a target scene, the number of frequencies used in the modulation signals may also be set according to the compression ratio desired by the user, because the number of frequencies used determines the compression ratio of the video signal: the fewer the frequencies used, the larger the compression ratio; the more frequencies used, the smaller the compression ratio.
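The frequency-domain sparsity that this compressive-sampling step relies on can be sketched numerically. The following is a minimal illustration (all waveform parameters and the retained-frequency count are illustrative assumptions, not values from the disclosure): a slowly varying per-pixel intensity waveform is dominated by a few low-frequency components, so keeping only those components preserves the signal while compressing it.

```python
import numpy as np

T, n = 1.0, 256                      # observation window and sample count
t = np.linspace(0.0, T, n, endpoint=False)

# Smooth pixel-intensity waveform L(t): slowly varying brightness.
signal = 1.0 + 0.8 * np.sin(2 * np.pi * 3 * t) + 0.3 * np.cos(2 * np.pi * 5 * t)

spectrum = np.fft.rfft(signal)

# Keep only the lowest-frequency components (the "frequencies from the
# low-frequency region" selected in step S101); discard the rest.
k = 8                                # number of retained frequencies (assumed)
compressed = np.zeros_like(spectrum)
compressed[:k] = spectrum[:k]

recovered = np.fft.irfft(compressed, n=n)
compression_ratio = n / (2 * k)      # real + imaginary parts per retained bin

# The few retained components reconstruct the waveform almost exactly.
error = np.max(np.abs(recovered - signal))
print(f"compression ratio {compression_ratio:.0f}:1, max error {error:.2e}")
```

As the text notes, using fewer frequencies (smaller `k`) raises the compression ratio at the cost of discarding more spectral detail.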
In one example, a user wishes to detect the trajectory of a moving object in the target scene. When the object moves through space, it passes different sub-detection units of the image detector at different times, so the different sub-detection units capture the object at different times. For each pixel point, the brightness values at different time points differ, so a time-domain waveform can be generated from the time points and the brightness values of that pixel point. The pulse signals in the time-domain waveforms of different pixel points appear at different times, which correspond to different phases in the frequency domain. In the present disclosure, in step S101, the signal processing apparatus performs the first processing on the optical signal from the target scene through the spatial light modulator; for example, the encoded signal corresponding to each pixel point of the sub-detection units of the image detector is a cosine signal of the same frequency, where the frequency is equal to or greater than the reciprocal of the exposure time of the image detector, and within one exposure time the phase for the same pixel point is distributed over 0 to 2π. Thus, in step S103, the information obtaining device can obtain, from the collected data, the times at which the moving object appears at different positions in the target scene, so that the motion trajectory of the moving object can be obtained directly. The trajectory information is obtained directly from the target scene, without first acquiring a plurality of image sequences of the target scene and then performing trajectory detection on them, as in the prior art.
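The link between pulse timing and frequency-domain phase used here can be sketched as follows. This is an illustrative model, not the patented apparatus: the exposure time, pulse shape, and pulse width are assumed values. A pixel whose waveform is a short pulse at time t0, projected onto a cosine whose frequency is the reciprocal of the exposure time, yields a Fourier coefficient whose phase directly encodes t0.

```python
import numpy as np

T = 1.0                               # exposure time of the image detector (assumed)
f = 1.0 / T                           # modulation frequency = reciprocal of exposure
n = 1000
t = np.linspace(0.0, T, n, endpoint=False)

t0 = 0.37                             # moment the object crosses this pixel (assumed)
waveform = np.exp(-((t - t0) / 0.01) ** 2)   # short brightness pulse

# Fourier coefficient at frequency f: what the modulated acquisition
# effectively integrates for this pixel over one exposure.
coeff = np.sum(waveform * np.exp(-2j * np.pi * f * t)) / n

# The phase of the coefficient recovers the pulse time, i.e. when the
# moving object appeared at this pixel's position.
t_recovered = (-np.angle(coeff) / (2 * np.pi * f)) % T
print(f"true {t0:.3f}, recovered {t_recovered:.3f}")
```

Repeating this per pixel gives the arrival time at each position, which is the motion trajectory described above.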
That is, based on the data collected by the signal collection device in step S102, the information obtaining device in step S103 can directly obtain the image information of the target scene by performing only one inverse Fourier transform. This avoids the prior-art process of first reconstructing a video image and then processing it, is simpler and easier, and significantly improves imaging efficiency.
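The single-inverse-transform recovery can be sketched as below. The frame count and frame size are illustrative assumptions: if the collected data amount to the per-pixel frequency coefficients, one inverse Fourier transform along the time axis recovers the whole image sequence, with no iterative reconstruction step.

```python
import numpy as np

n_frames, h, w = 16, 4, 4
rng = np.random.default_rng(1)
frames = rng.random((n_frames, h, w))           # target scene over time (toy data)

# What the acquisition device would collect under this model: the
# frequency-domain coefficients of every pixel's time-domain waveform.
collected = np.fft.rfft(frames, axis=0)

# Step S103: a single inverse Fourier transform yields the image sequence.
recovered = np.fft.irfft(collected, n=n_frames, axis=0)

print(np.max(np.abs(recovered - frames)))       # numerically zero
```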
In one embodiment, the user wishes to remove the background of the target scene. In general, stationary objects, the environment, and the like may be considered background. The user is interested in the foreground content, such as people and objects, including moving objects. The background often acts as an interference term, and the user wants it removed, as is done with a background removal algorithm. In the embodiments of the present disclosure, the time-domain waveform of the background information is stable and may be regarded as a constant that does not change with time. After the time-domain waveform of the background is Fourier transformed, its spectrum has no high-frequency components; only the direct-current component has a value. Therefore, in the present disclosure, the signal processing apparatus excludes the direct-current component from the encoded signal in step S101. For example, in the encoded signal, the frequency f = 0 is not used as a modulation frequency, and the modulation signals of all the sub-modulation units are cosine functions with f ≠ 0. Thus, in step S102, no direct-current component is collected, and the background information is not captured by the image detector. In step S103, the information obtaining apparatus may directly obtain an image or a sequence of images with the background removed, based on the collected data.
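A minimal sketch of this background-removal idea (the waveform values are illustrative assumptions): a static background is constant in time, so its spectrum consists only of a direct-current (f = 0) component, and excluding f = 0 leaves only the time-varying foreground in the recovered signal.

```python
import numpy as np

n = 128
t = np.arange(n) / n

background = 5.0 * np.ones(n)                   # static scene content: constant in time
foreground = np.sin(2 * np.pi * 4 * t)          # moving / brightness-changing object
pixel_waveform = background + foreground

spectrum = np.fft.rfft(pixel_waveform)
spectrum[0] = 0.0                               # exclude the DC (f = 0) component

recovered = np.fft.irfft(spectrum, n=n)

# Only the time-varying foreground survives; the background is gone.
print(np.max(np.abs(recovered - foreground)))
```

In the disclosure the exclusion happens optically, in the encoded signal of step S101, rather than in post-processing as simulated here.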
In one example, a user desires target recognition in the target scene. The target may be a moving object, or an object whose brightness varies even though it does not move, such as a neon light. The time-domain waveforms of objects with different textures and moving speeds often differ greatly, so their frequency spectra also differ greatly. The spectral distribution is the combined result of these physical quantities, and changes in the physical quantities change the spectral composition; for example, the intensity of frequency components that are otherwise weak may increase sharply. Therefore, detection of a specific moving object can be realized by using, during encoding, the frequencies associated with the moving or brightness-changing object of interest. For example, the spectral bandwidth of a high-speed moving object may be much larger than that of a low-speed moving object. In order to detect a specific target (e.g., a specific moving object) in the target scene, the present disclosure may use specific frequency parameters when setting the modulation signal, so that only the image of the target is obtained from the target scene, implementing a target recognition function.
For example, the motion law of the object of interest may be analyzed in advance to obtain its frequency spectrum. The motion law may include, for example, texture information, velocity information, acceleration information, and the like of the object. These physical quantities are reflected in the spectral distribution of the time-domain waveforms of the corresponding pixel points of the image detector.
In one example, the target scene includes a moving object or a brightness-changing object. In step S101, the signal processing device may determine at least one frequency parameter of the modulation signal according to at least one motion parameter of the moving object, or according to a brightness-change parameter of the brightness-changing object, and then determine the modulation signal based on the at least one frequency parameter.
For example, the frequency parameter of the modulation signal may be one or more frequency values determined according to the above-mentioned motion parameter or brightness-change parameter. In order to avoid recognition errors, the frequency value of the modulation signal may also be suitably adjusted; for example, it may be broadened to a frequency range containing the determined frequency value.
In one example, the motion parameter of the moving object includes a motion speed, and the brightness-change parameter of the brightness-changing object includes a brightness-change speed. When the frequency value is determined according to the motion parameter, the time the moving object takes to pass through one detection unit of the optical signal acquisition device can be determined from the motion speed or the brightness-change speed, and the frequency can then be determined from that time value. If the motion parameters further include a motion acceleration, or the brightness-change parameters further include a change acceleration, the different speeds of the moving object at different moments can be determined from the acceleration and the initial motion speed, and a plurality of time values for the moving object to pass through a detection unit of the optical signal acquisition device can be determined from the obtained speeds. Likewise, for the brightness-changing object, a plurality of brightness-change speeds may be determined from the brightness-change acceleration and the initial change speed, and a plurality of corresponding time values may be determined from them. The time value may be one or more non-zero time values, and the corresponding frequency value may accordingly have one non-zero value or a plurality of non-zero values.
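The derivation above can be sketched as a small helper. The function name, its parameters, and the numeric values are hypothetical, introduced only for illustration: the dwell time of the object in one detection unit is the unit size divided by the speed, and the candidate modulation frequency is the reciprocal of that dwell time; with an acceleration, different sample times give different speeds and hence multiple frequencies.

```python
def modulation_frequencies(pixel_size, v0, accel=0.0, times=(0.0,)):
    """Candidate frequencies 1/t_cross for the speeds reached at the given times.

    pixel_size: width of one detection unit (same length unit as the speeds).
    v0, accel:  initial speed and acceleration of the moving object.
    times:      moments at which to evaluate the speed.
    """
    freqs = []
    for tm in times:
        v = v0 + accel * tm          # speed of the object at time tm
        if v <= 0:
            continue                 # object is not crossing the unit
        t_cross = pixel_size / v     # dwell time in one detection unit
        freqs.append(1.0 / t_cross)  # frequency = reciprocal of dwell time
    return freqs

# Illustrative numbers: a 1 mm detection unit, an object starting at
# 2 m/s and accelerating at 1 m/s^2, sampled at t = 0, 1, 2 s.
freqs = modulation_frequencies(pixel_size=1e-3, v0=2.0, accel=1.0,
                               times=(0.0, 1.0, 2.0))
print(freqs)   # dwell times shrink as the object speeds up
```

As noted above, each such frequency may in practice be widened to a small range around it to avoid recognition errors.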
In this way, after determining the frequency parameters of the modulation signal according to the properties of the moving object, the signal processing device may set the encoded signal in step S101 and encode the optical signal from the target scene accordingly. Thus, in step S102, non-target data in the target scene is not captured by the image detector, and in step S103 the information obtained by the information obtaining device from the collected data includes only the image information of the specific target desired by the user, thereby implementing target recognition.
Thus, according to the embodiments of the present disclosure, a specific target in the target scene can be acquired and detected by modulating the optical signal of the target scene, which increases the processing speed and effectively reduces the amount of computation.
The above-mentioned embodiments are intended to illustrate the objects, aspects and advantages of the present disclosure in further detail, and it should be understood that the above-mentioned embodiments are only illustrative of the present disclosure and are not intended to limit the present disclosure, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.
Claims (10)
1. A signal acquisition processing method, comprising:
performing a first processing on a light signal from a target scene;
and collecting the optical signal after the first processing.
2. The method of claim 1, further comprising: obtaining information in the target scene based on the collected data.
3. The method of claim 2, wherein the obtaining information in the target scene based on the acquired data comprises:
carrying out second processing on the acquired data to obtain information in the target scene;
the information obtained is formed into an image.
4. The method according to any one of claims 1 to 3,
the first processing of the light signal from the target scene comprises:
determining an encoding signal of the target scene;
and carrying out coding processing on the optical signal from the target scene based on the coding signal, wherein the signal after the coding processing is the optical signal.
5. The method of claim 4, wherein the acquiring the first processed optical signal comprises:
and acquiring the optical signal after the coding processing, wherein the optical signal after the coding processing comprises the information of the coded signal and the optical signal from the target scene.
6. The method of claim 5, wherein the encoded processed light signal is a result of a multiplication of the encoded signal and a light signal from the target scene.
7. The method of claim 4, wherein first processing the light signal from the target scene further comprises:
projecting the light signal from the target scene to a spatial light modulator;
the determining the coded signal of the target scene comprises:
determining a modulation signal of the spatial light modulator;
the encoding processing the light signal from the target scene based on the encoded signal comprises:
modulating the light signal from the target scene based on the modulation signal of the spatial light modulator.
8. The method of claim 7, wherein the modulation signal comprises a frequency parameter and a phase parameter, and wherein determining the modulation signal of the spatial light modulator comprises:
determining the frequency parameter of the modulated signal;
determining the phase parameter of the modulated signal;
determining the modulation signal based on the frequency parameter and the phase parameter.
9. The method of claim 8, wherein determining the frequency parameter of the modulated signal comprises:
determining properties and/or types of the pre-collected optical signals,
determining a frequency parameter of the modulation signal based on a property and/or type of the pre-collected optical signal.
10. The method of claim 8 or 9, wherein determining the frequency parameter of the modulated signal comprises:
determining the complexity of the target scene;
determining a frequency parameter of the modulated signal based on the complexity.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010089554.9A CN111273050B (en) | 2020-02-12 | 2020-02-12 | Signal acquisition processing method and device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111273050A true CN111273050A (en) | 2020-06-12 |
CN111273050B CN111273050B (en) | 2022-05-20 |
Family
ID=70999426
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010089554.9A Active CN111273050B (en) | 2020-02-12 | 2020-02-12 | Signal acquisition processing method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111273050B (en) |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS63171364A (en) * | 1987-01-09 | 1988-07-15 | Hitachi Ltd | Apparatus for detecting rotational speed |
CN101706259A (en) * | 2009-11-25 | 2010-05-12 | 天津大学 | Concrete crack width test method based on wavefront coding technology and hand-held tester |
JPWO2009141973A1 (en) * | 2008-05-20 | 2011-09-29 | Panasonic Corporation | Moving picture coding apparatus and moving picture coding method
CN102509283A (en) * | 2011-09-30 | 2012-06-20 | 西安理工大学 | DSP (digital signal processor)-based target perceiving and encoding method facing optic nerve prosthesis |
CN104267407A (en) * | 2014-09-12 | 2015-01-07 | 清华大学 | Initiative imaging method and system based on compressed sampling |
CN104581166A (en) * | 2014-12-08 | 2015-04-29 | 天津大学 | Multichannel acquired image-based compressive imaging system and method |
CN105868700A (en) * | 2016-03-25 | 2016-08-17 | 哈尔滨工业大学深圳研究生院 | Vehicle type recognition and tracking method and system based on monitoring video |
CN106204586A (en) * | 2016-07-08 | 2016-12-07 | 华南农业大学 | A kind of based on the moving target detecting method under the complex scene followed the tracks of |
CN108093237A (en) * | 2017-12-05 | 2018-05-29 | 西北工业大学 | High spatial resolution optical field acquisition device and image generating method |
US20180367742A1 (en) * | 2016-03-31 | 2018-12-20 | Panasonic Intellectual Property Management Co., Ltd. | Imaging apparatus including light source, reflective encoding device, and image sensor |
CN109285132A (en) * | 2018-09-20 | 2019-01-29 | 南京大学 | A kind of spectrum reconstruction method based on Frequency Domain Coding |
CN110545379A (en) * | 2019-09-09 | 2019-12-06 | 北京理工大学 | Parallel time-space domain combined compression imaging method and device adopting DMD |
CN110650340A (en) * | 2019-04-25 | 2020-01-03 | 长沙理工大学 | Space-time multiplexing compressed video imaging method |
Non-Patent Citations (3)
Title |
---|
JIANG TANG, ET AL: "Single-Shot Temporal Ghost Imaging Based on Orthogonal Frequency-Division Multiplexing", 《 IEEE PHOTONICS TECHNOLOGY LETTERS》 * |
TENG JJ, ET AL: "Time-encoded single-pixel 3D imaging", 《APL PHOTONICS》 * |
CHEN Hongwei, et al: "Ultrafast planar microscopic imaging technology", Journal of Data Acquisition and Processing *
Also Published As
Publication number | Publication date |
---|---|
CN111273050B (en) | 2022-05-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Wang et al. | High-speed hyperspectral video acquisition with a dual-camera architecture | |
US7924315B2 (en) | Image processing method, image processing apparatus, image processing program, and image file format | |
Nagahara et al. | Programmable aperture camera using LCoS | |
KR101612165B1 (en) | Method for producing super-resolution images and nonlinear digital filter for implementing same | |
CN101305398B (en) | Method for forming synthesis image based on a plurality of image frames | |
TW512286B (en) | Real-time opto-electronic image processor | |
CN107306333B (en) | High-speed single-pixel imaging method | |
US20120330162A1 (en) | Modulated aperture imaging for automatic moving target detection | |
CN108184075B (en) | Method and apparatus for generating image | |
EP3204812B1 (en) | Microscope and method for obtaining a high dynamic range synthesized image of an object | |
US20170206633A1 (en) | Method and apparatus for up-scaling an image | |
CN106375675B (en) | A kind of more exposure image fusion methods of aerial camera | |
EP3143583B1 (en) | System and method for improved computational imaging | |
CN102564924B (en) | Automatic scanning method of single-frame image of blood cell | |
CN111273050B (en) | Signal acquisition processing method and device | |
US11663708B2 (en) | Image processing apparatus, imaging apparatus, and image processing method | |
CN114119428B (en) | Image deblurring method and device | |
KR20220040025A (en) | Thermal image reconstruction device and method based on deep learning | |
CN111915697A (en) | One-step harmonic single-pixel imaging method | |
Malviya et al. | Multi-focus image fusion of digital images | |
Malviya et al. | Wavelet based multi-focus image fusion | |
CN116156144B (en) | Integrated system and method for hyperspectral information acquisition and transmission | |
Gunturk et al. | Frequency division multiplexed imaging | |
Lagunas et al. | Human eye visual hyperacuity: Controlled diffraction for image resolution improvement | |
JP2005202907A (en) | Video forming device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
TR01 | Transfer of patent right | ||
Effective date of registration: 20230419

Address after: 101407 28-3, floor 1, building 28, yard 13, Paradise West Street, Huairou District, Beijing

Patentee after: Beijing Qingzhi Yuanshi Technology Co.,Ltd.

Address before: 100084 No. 1 Tsinghua Yuan, Beijing, Haidian District

Patentee before: TSINGHUA University