CN113924768A - Image processing method and device - Google Patents

Image processing method and device

Info

Publication number
CN113924768A
CN113924768A
Authority
CN
China
Prior art keywords
polarization, image, illumination, polarized, acquisition
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202180003570.4A
Other languages
Chinese (zh)
Other versions
CN113924768B (English)
Inventor
吕笑宇
马莎
罗达新
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Publication of CN113924768A
Application granted
Publication of CN113924768B
Status: Active

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80 Camera processing pipelines; Components thereof
    • H04N 23/70 Circuitry for compensating brightness variation in the scene
    • H04N 23/71 Circuitry for evaluating the brightness variation
    • H04N 23/74 Circuitry for compensating brightness variation in the scene by influencing the scene brightness using illuminating means

Abstract

The present application provides an image processing method and apparatus in the field of image processing, applicable to automatic driving or intelligent driving scenarios, for reducing the amount of computation required to determine polarization information. In the method, an image processing apparatus acquires first feature information, which characterizes scene features and indicates that polarization information of at least one target is to be acquired, and which corresponds to a first illumination mode, a first acquisition mode, and a first image processing mode. The apparatus controls polarized-light illumination according to the first illumination mode, controls acquisition of a first polarization image of the at least one target under the first illumination mode according to the first acquisition mode, and processes the first polarization image according to the first image processing mode to obtain the polarization information of the at least one target. In this way, the image processing apparatus outputs only the polarization information that is required, and its computational load can be reduced.

Description

Image processing method and device
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method and apparatus.
Background
Currently, a terminal device may use a polarization camera to detect information about its surrounding environment and determine properties such as the shape and color of nearby objects (e.g., vehicles, obstacles, and roads).
However, a polarization image acquired by the terminal device carries polarization information of multiple dimensions; commonly used polarization information includes at least the polarization image after polarization modulation, the Stokes vector, the degree of polarization, the angle of polarization, and the Mueller matrix. Acquiring different polarization information of a target requires the terminal device to adopt different processing modes. At present, when acquiring polarization information, a terminal device typically outputs all of the polarization information that can be obtained for the target (generally all of the quantities listed above), which imposes a large computational load when the terminal device processes the polarization image and degrades its computing performance.
Disclosure of Invention
The present application provides an image processing method and apparatus, addressing the prior-art problem that a terminal device incurs a large computational load when processing a polarization image, which affects its computing performance.
To solve this technical problem, the following technical solutions are adopted in this application:
In a first aspect, an image processing method is provided, including: acquiring first feature information, where the first feature information is used to characterize scene features and to indicate acquisition of polarization information of at least one target, and corresponds to a first illumination mode, a first acquisition mode, and a first image processing mode; controlling polarized-light illumination according to the first illumination mode, and controlling acquisition of a first polarization image of the at least one target under the first illumination mode according to the first acquisition mode; and processing the first polarization image according to the first image processing mode to obtain the polarization information of the at least one target.
Based on the above technical solution, in the image processing method provided in the embodiments of this application, the image processing apparatus can select the image processing mode corresponding to the polarization information of the target that is actually requested, thereby avoiding the large computational load that would result from acquiring all possible polarization information of the target.
In addition, the image processing apparatus can determine the Mueller matrix of the target from the polarized light provided to the target and the collected polarized light reflected by the target, and thereby determine quantitative polarization information of the target. The image processing apparatus can also provide illumination for the target, improving the imaging clarity of the target in low-light scenes.
With reference to the first aspect, in a possible implementation manner, the method further includes: and determining a first illumination mode, a first acquisition mode and a first image processing mode according to the first characteristic information.
Based on this, the image processing apparatus can determine, for each scene and each kind of polarization information to be acquired, the illumination mode to be provided to the target, the image acquisition mode, and the image processing mode, thereby better meeting the requirements of acquiring different polarization information in different scenes.
With reference to the first aspect, in one possible implementation manner, the scene characteristic includes light intensity information of an environment.
Based on this, illumination can be provided for the scene when the ambient light intensity is low, thereby increasing the ambient light intensity of the scene.
With reference to the first aspect, in one possible implementation manner, the polarization information includes at least one of the following: a second polarization image, qualitative polarization information, or quantitative polarization information.
Based on this, the image processing apparatus can acquire, as required, one or more of the second polarization image, the qualitative polarization information, and the quantitative polarization information of the target.
With reference to the first aspect, in a possible implementation manner, the first illumination mode belongs to a set of polarized illumination modes, and the set includes at least one of the following: synchronous illumination by multiple polarized light sources, asynchronous illumination by multiple polarized light sources, or no illumination.
Here, synchronous multi-polarized-source illumination means that polarized illumination is provided simultaneously by a plurality of polarized light sources with different polarization states; asynchronous multi-polarized-source illumination means that polarized illumination is provided in sequence by a plurality of polarized light sources with different polarization states; and no illumination means that no illumination is provided.
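The three illumination modes can be sketched as a firing schedule over the available polarized light sources. A minimal illustration; the mode names and the schedule representation are assumptions made for this sketch and do not appear in the patent:

```python
# Firing schedule for the three polarized illumination modes: each schedule
# entry is one time slot listing the sources that emit in that slot.
# Mode names and data shapes are illustrative, not from the patent.

def run_illumination(mode, sources):
    if mode == "sync_multi_pol":      # all polarization states at once
        return [list(sources)]
    if mode == "async_multi_pol":     # one polarization state per slot, in turn
        return [[s] for s in sources]
    if mode == "no_illumination":     # rely on ambient light only
        return []
    raise ValueError(f"unknown illumination mode: {mode}")
```

For example, `run_illumination("async_multi_pol", ["0deg", "45deg", "90deg", "135deg"])` yields four single-source slots, matching the sequential illumination later used for Mueller-matrix measurement.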
Based on this, different illumination modes suit different scenes and different polarization-information requirements. Because the image processing apparatus selects the illumination mode according to the scene and the required polarization information, the applicability of the image processing method provided by this application is improved.
With reference to the first aspect, in a possible implementation manner, the first acquisition mode belongs to a set of polarization acquisition modes, and the set includes: a single-frame polarization image synchronous acquisition operation and a multi-frame polarization image asynchronous acquisition operation.
The single-frame polarization image synchronous acquisition operation obtains a single image of the at least one target and extracts from that single image a plurality of polarization images with different polarization states. The multi-frame polarization image asynchronous acquisition operation obtains a plurality of images of the at least one target, where each image of the plurality of images contains a plurality of polarization images with different polarization states.
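Single-frame synchronous acquisition is typically realized with a division-of-focal-plane sensor, in which each 2x2 super-pixel carries micro-polarizers at four angles, so that four polarization sub-images can be extracted from one raw frame. The patent does not fix a sensor layout; the sketch below assumes a common 90/45 over 135/0 arrangement:

```python
import numpy as np

# Split one raw frame from a polarization mosaic sensor into four
# half-resolution images, one per micro-polarizer angle. The assumed 2x2
# layout is (row 0: 90, 45 / row 1: 135, 0); real sensors vary.

def split_polarization_mosaic(raw):
    """raw: (H, W) array with even H and W. Returns {angle_deg: sub_image}."""
    return {
        90:  raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        0:   raw[1::2, 1::2],
    }
```

Each sub-image has half the resolution of the raw frame in each dimension, which is the price of capturing all four polarization states in a single exposure.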
Based on this, different image acquisition modes suit different scenes and polarization-information requirements, and the image processing apparatus selects the acquisition mode accordingly, improving the applicability of the image processing method provided by this application. In addition, the illumination mode and the image acquisition mode cooperate: the illumination mode provides polarized light in different polarization states, the image acquisition apparatus captures polarization images under each of those illumination conditions, and from these the Mueller matrix of the target can be determined, yielding quantitative polarization information of the target.
With reference to the first aspect, in a possible implementation manner, the first image processing mode belongs to a set of polarization image processing modes, and the set includes: a polarization image obtaining operation, a qualitative polarization information obtaining operation, and a quantitative polarization information obtaining operation.
The polarization image obtaining operation is used to obtain a second polarization image, which is one of the polarization images with different polarization states contained in the first polarization image. The qualitative polarization information obtaining operation is used to obtain at least one of the following for the at least one target: the Stokes vector, the degree of polarization, or the angle of polarization. The quantitative polarization information obtaining operation is used to obtain the Mueller matrix of the at least one target.
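The qualitative quantities named here follow from the four polarization sub-images by standard relations. A sketch assuming analyzer angles of 0, 45, 90, and 135 degrees; the patent names the quantities but gives no formulas, so these are the usual textbook definitions:

```python
import numpy as np

# Linear Stokes components, degree of linear polarization (DoLP) and angle of
# linear polarization (AoLP) from intensity images behind analyzers at
# 0, 45, 90 and 135 degrees. Standard relations, not taken from the patent.

def qualitative_polarization(i0, i45, i90, i135, eps=1e-12):
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # horizontal minus vertical
    s2 = i45 - i135                      # +45 minus -45
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, eps)
    aolp = 0.5 * np.arctan2(s2, s1)      # radians
    return s0, s1, s2, dolp, aolp
```

For fully horizontally polarized light (i0 = 1, i90 = 0, i45 = i135 = 0.5) this gives DoLP = 1 and AoLP = 0, as expected.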
Based on this, different image processing modes suit different polarization-information requirements, and the image processing apparatus selects the processing mode accordingly, improving the applicability of the image processing method provided by this application.
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that the second polarization image is to be acquired; the first illumination mode is synchronous multi-polarized-source illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the polarization image obtaining operation.
Based on this, in a scene where the ambient light is weak and the image processing apparatus needs to acquire a polarization image, the illumination apparatus provides synchronous multi-polarized-source illumination, increasing the ambient light intensity of the scene; the image acquisition apparatus captures a single image of the target, reducing the number of images to be processed; and the image processing apparatus obtains the second polarization image from that single image, reducing the computational load of the image processing apparatus.
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that qualitative polarization information is to be acquired; the first illumination mode is synchronous multi-polarized-source illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the qualitative polarization information obtaining operation.
Based on this, in a scene where the ambient light is weak and the image processing apparatus needs to acquire qualitative polarization information, the illumination apparatus provides synchronous multi-polarized-source illumination, increasing the ambient light intensity of the scene; the image acquisition apparatus captures a single image of the target, reducing the number of images to be processed; and the image processing apparatus obtains the qualitative polarization information of the target from that single image, reducing the computational load of the image processing apparatus.
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that quantitative polarization information is to be acquired; the first illumination mode is asynchronous multi-polarized-source illumination, the first acquisition mode is the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode is the quantitative polarization information obtaining operation.
Based on this, in a scene where the ambient light is weak and the image processing apparatus needs to acquire quantitative polarization information, the illumination apparatus provides asynchronous multi-polarized-source illumination, which both increases the ambient light intensity of the scene and supplies the target with polarized light in different polarization states; the image acquisition apparatus captures a plurality of images of the target under the different polarized illuminations, and the image processing apparatus, combining the polarized light provided by the polarized light sources with the captured images, can determine the Mueller matrix of the target and thereby its quantitative polarization information.
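One way to carry out the Mueller-matrix determination described above is a least-squares fit over the paired illumination and measurement Stokes vectors. The patent states only the principle (combine the provided polarized light with the acquired images), so the estimator below is an illustrative assumption:

```python
import numpy as np

# Recover the 4x4 Mueller matrix M of a target from s_out = M @ s_in, where
# the columns of s_in are the Stokes vectors of the sequentially provided
# polarized illumination and the columns of s_out are the Stokes vectors
# measured from the corresponding acquired frames. Least squares via the
# pseudo-inverse; this estimator is an assumption, not from the patent.

def estimate_mueller(s_in, s_out):
    """s_in, s_out: (4, N) arrays, N >= 4, s_in of full row rank."""
    s_in = np.asarray(s_in, dtype=float)
    s_out = np.asarray(s_out, dtype=float)
    return s_out @ np.linalg.pinv(s_in)   # minimizes ||M @ s_in - s_out||_F
```

At least four linearly independent illumination states are needed for the system to be determined, which is why this mode pairs asynchronous multi-source illumination with multi-frame acquisition.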
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that the second polarization image is to be acquired; the first illumination mode is no illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the polarization image obtaining operation.
Based on this, in a scene where the ambient light is sufficiently strong and the image processing apparatus needs to acquire a polarization image, the illumination apparatus need not provide synchronous multi-polarized-source illumination, reducing the energy consumption of the illumination apparatus; the image acquisition apparatus captures a single image of the target, reducing the number of images to be processed; and the image processing apparatus obtains the second polarization image from that single image, reducing the computational load of the image processing apparatus.
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that qualitative polarization information is to be acquired; the first illumination mode is no illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the qualitative polarization information obtaining operation.
Based on this, in a scene where the ambient light is sufficiently strong and the image processing apparatus needs to acquire qualitative polarization information, the illumination apparatus need not provide synchronous multi-polarized-source illumination, reducing the energy consumption of the illumination apparatus; the image acquisition apparatus captures a single image of the target, reducing the number of images to be processed; and the image processing apparatus obtains the qualitative polarization information of the target from that single image, reducing the computational load of the image processing apparatus.
With reference to the first aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that quantitative polarization information is to be acquired; the first illumination mode is asynchronous multi-polarized-source illumination, the first acquisition mode is the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode is the quantitative polarization information obtaining operation.
Based on this, in a scene where the ambient light is sufficiently strong and the image processing apparatus needs to acquire quantitative polarization information, the illumination apparatus provides asynchronous multi-polarized-source illumination, supplying the target with polarized light in different polarization states; the image acquisition apparatus captures a plurality of images of the target under the different polarized illuminations, and the image processing apparatus, combining the polarized light provided by the polarized light sources with the captured images, can determine the Mueller matrix of the target and thereby its quantitative polarization information.
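The six implementation manners above amount to a fixed mapping from the first feature information to the three operating modes, which can be collected into a lookup table. A sketch: the pairings follow the text, while the mode names and the threshold comparison are illustrative assumptions:

```python
# (ambient light below preset value?, requested polarization info)
#   -> (illumination mode, acquisition mode, processing mode).
# The pairings follow the six implementation manners above; all identifiers
# are illustrative names, not taken from the patent.

MODE_TABLE = {
    (True,  "image"):        ("sync_multi_pol",  "single_frame_sync", "get_image"),
    (True,  "qualitative"):  ("sync_multi_pol",  "single_frame_sync", "get_qualitative"),
    (True,  "quantitative"): ("async_multi_pol", "multi_frame_async", "get_quantitative"),
    (False, "image"):        ("no_illumination", "single_frame_sync", "get_image"),
    (False, "qualitative"):  ("no_illumination", "single_frame_sync", "get_qualitative"),
    (False, "quantitative"): ("async_multi_pol", "multi_frame_async", "get_quantitative"),
}

def select_modes(ambient_intensity, preset_value, requested):
    """Map first feature information to the three corresponding modes."""
    return MODE_TABLE[(ambient_intensity < preset_value, requested)]
```

Note that the quantitative row is the same regardless of ambient light: Mueller-matrix measurement always needs sequentially switched polarized illumination and multi-frame acquisition.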
In a second aspect, an image processing apparatus is provided, including a processing unit and an acquisition unit. The acquisition unit is configured to acquire first feature information, where the first feature information is used to characterize scene features and to indicate acquisition of polarization information of at least one target, and corresponds to a first illumination mode, a first acquisition mode, and a first image processing mode. The processing unit is configured to control polarized-light illumination according to the first illumination mode, and to control acquisition of a first polarization image of the at least one target under the first illumination mode according to the first acquisition mode. The processing unit is further configured to process the first polarization image according to the first image processing mode to obtain the polarization information of the at least one target.
With reference to the second aspect, in a possible implementation manner, the processing unit is further configured to determine the first illumination mode, the first acquisition mode, and the first image processing mode according to the first feature information.
With reference to the second aspect, in a possible implementation manner, the processing unit is specifically configured to: generate a first instruction indicating the first illumination mode and a second instruction indicating the first acquisition mode; and instruct the acquisition unit to send the first instruction to the illumination apparatus and the second instruction to the image acquisition apparatus.
With reference to the second aspect, in a possible implementation manner, the processing unit is further specifically configured to instruct the acquisition unit to receive the first polarization image from the image acquisition apparatus.
In combination with the above second aspect, in one possible implementation manner, the scene characteristic includes light intensity information of an environment.
With reference to the second aspect, in one possible implementation manner, the polarization information includes at least one of the following: a second polarization image, qualitative polarization information, or quantitative polarization information.
With reference to the second aspect, in a possible implementation manner, the first illumination mode belongs to a set of polarized illumination modes, and the set includes at least one of the following: synchronous illumination by multiple polarized light sources, asynchronous illumination by multiple polarized light sources, or no illumination. Here, synchronous multi-polarized-source illumination means that polarized illumination is provided simultaneously by a plurality of polarized light sources with different polarization states; asynchronous multi-polarized-source illumination means that polarized illumination is provided in sequence by a plurality of polarized light sources with different polarization states; and no illumination means that no illumination is provided.
With reference to the second aspect, in a possible implementation manner, the first acquisition mode belongs to a set of polarization acquisition modes, and the set includes: a single-frame polarization image synchronous acquisition operation and a multi-frame polarization image asynchronous acquisition operation. The single-frame polarization image synchronous acquisition operation obtains a single image of the at least one target and extracts from that single image a plurality of polarization images with different polarization states; the multi-frame polarization image asynchronous acquisition operation obtains a plurality of images of the at least one target, where each image of the plurality of images contains a plurality of polarization images with different polarization states.
With reference to the second aspect, in a possible implementation manner, the first image processing mode belongs to a set of polarization image processing modes, and the set includes: a polarization image obtaining operation, a qualitative polarization information obtaining operation, and a quantitative polarization information obtaining operation. The polarization image obtaining operation is used to obtain a second polarization image, which is one of the polarization images with different polarization states contained in the first polarization image; the qualitative polarization information obtaining operation is used to obtain at least one of the following for the at least one target: the Stokes vector, the degree of polarization, or the angle of polarization; the quantitative polarization information obtaining operation is used to obtain the Mueller matrix of the at least one target.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that the second polarization image is to be acquired; the first illumination mode is synchronous multi-polarized-source illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the polarization image obtaining operation.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that qualitative polarization information is to be acquired; the first illumination mode is synchronous multi-polarized-source illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the qualitative polarization information obtaining operation.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and that quantitative polarization information is to be acquired; the first illumination mode is asynchronous multi-polarized-source illumination, the first acquisition mode is the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode is the quantitative polarization information obtaining operation.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that the second polarization image is to be acquired; the first illumination mode is no illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the polarization image obtaining operation.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that qualitative polarization information is to be acquired; the first illumination mode is no illumination, the first acquisition mode is the single-frame polarization image synchronous acquisition operation, and the first image processing mode is the qualitative polarization information obtaining operation.
With reference to the second aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and that quantitative polarization information is to be acquired; the first illumination mode is asynchronous multi-polarized-source illumination, the first acquisition mode is the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode is the quantitative polarization information obtaining operation.
In a third aspect, an illumination apparatus is provided, including: a plurality of polarized light sources with different polarization states, at least one processor, and a communication interface. The communication interface is configured to receive a first instruction from the image processing apparatus, the first instruction indicating a first illumination mode, where the first illumination mode characterizes how illumination is provided by the plurality of polarized light sources with different polarization states. The processor is configured to control, according to the first instruction, the plurality of polarized light sources with different polarization states to provide illumination.
With reference to the third aspect, in a possible implementation manner, the first illumination mode belongs to a set of polarized illumination modes, and the set includes at least one of the following: synchronous illumination by multiple polarized light sources, asynchronous illumination by multiple polarized light sources, or no illumination. Here, synchronous multi-polarized-source illumination means that polarized illumination is provided simultaneously by a plurality of polarized light sources with different polarization states; asynchronous multi-polarized-source illumination means that polarized illumination is provided in sequence by a plurality of polarized light sources with different polarization states; and no illumination means that no illumination is provided.
With reference to the third aspect, in a possible implementation manner, the first illumination mode is synchronous multi-polarized-source illumination, and the processor is specifically configured to control one or more of the plurality of polarized light sources with different polarization states to provide polarized illumination simultaneously.
With reference to the third aspect, in a possible implementation manner, the first illumination mode is asynchronous multi-polarized-source illumination, and the processor is specifically configured to control each of the plurality of polarized light sources with different polarization states to provide polarized illumination in sequence.
With reference to the third aspect, in a possible implementation manner, the first illumination mode is no illumination, and the processor is specifically configured to control the plurality of polarized light sources with different polarization states not to provide illumination.
In the case where the lighting device according to the third aspect is integrated with the image capturing device according to the fourth aspect and the image processing device according to the fifth aspect, the integrated devices may share at least one processor, and the at least one processor may execute the processing operations of the lighting device, the image capturing device, and the image processing device.
In a fourth aspect, an image capturing apparatus is provided, including: a polarization image collector, at least one processor, and a communication interface. The communication interface is configured to receive a second instruction from an image processing apparatus, the second instruction indicating a first acquisition mode, where the first acquisition mode characterizes the manner in which a polarization image is acquired. The processor is configured to control, according to the second instruction, the polarization image collector to acquire the polarization image.
With reference to the fourth aspect, in a possible implementation manner, the first acquisition mode belongs to a set of polarization acquisition modes, and the set includes: a single-frame polarization image synchronous acquisition operation and a multi-frame polarization image asynchronous acquisition operation. The single-frame polarization image synchronous acquisition operation includes: acquiring a single image of at least one target, and obtaining a plurality of polarization images in different polarization states from the single image. The multi-frame polarization image asynchronous acquisition operation includes: acquiring a plurality of images of the at least one target, where each of the plurality of images contains a plurality of polarization images in different polarization states.
With reference to the fourth aspect, in a possible implementation manner, the first acquisition mode includes the single-frame polarization image synchronous acquisition operation, and the processor is specifically configured to control, according to the second instruction, the polarization image collector to acquire a single image of at least one target and to obtain a plurality of polarization images in different polarization states from the single image.
With reference to the fourth aspect, in a possible implementation manner, the first acquisition mode includes the multi-frame polarization image asynchronous acquisition operation, and the processor is specifically configured to control, according to the second instruction, the polarization image collector to image at least one target multiple times, determine a plurality of images of the at least one target, and obtain a plurality of polarization images in different polarization states from each of the plurality of images.
In the case where the image capturing device according to the fourth aspect is integrated with the lighting device according to the third aspect or the image processing device according to the fifth aspect, the integrated devices may share at least one processor, and the at least one processor may execute the processing operations of the lighting device, the image capturing device, and the image processing device.
In a fifth aspect, an image processing apparatus is provided, including at least one processor and a communication interface. The communication interface is configured to obtain first feature information, where the first feature information characterizes a scene feature, indicates acquisition of polarization information of at least one target, and corresponds to a first illumination mode, a first acquisition mode, and a first image processing mode. The processor is configured to control polarized illumination according to the first illumination mode, and to control acquisition, according to the first acquisition mode, of a first polarization image of the at least one target under the first illumination mode. The processor is further configured to process the first polarization image according to the first image processing mode to obtain the polarization information of the at least one target.
With reference to the fifth aspect, in a possible implementation manner, the processor is further configured to: and determining a first illumination mode, a first acquisition mode and a first image processing mode according to the first characteristic information.
With reference to the fifth aspect, in a possible implementation manner, the processor is specifically configured to: generating a first instruction and a second instruction; the first instructions are for indicating a first lighting mode; the second instruction is used for indicating the first acquisition mode; the communication interface is instructed to send a first instruction to the illumination device and a second instruction to the image acquisition device.
With reference to the fifth aspect, in a possible implementation manner, the processor is specifically further configured to: the communication interface is instructed to receive a first polarization image from the image capture device.
With reference to the fifth aspect, in one possible implementation manner, the scene characteristic includes light intensity information of an environment.
With reference to the fifth aspect, in one possible implementation manner, the polarization information includes at least one of the following: a second polarization image, qualitative polarization information, or quantitative polarization information.
With reference to the fifth aspect, in a possible implementation manner, the first illumination mode belongs to a set of polarized illumination modes, and the set includes at least one of the following: multi-polarized-light-source synchronous illumination, multi-polarized-light-source asynchronous illumination, or no illumination. Multi-polarized-light-source synchronous illumination means that a plurality of polarized light sources in different polarization states provide polarized illumination simultaneously; multi-polarized-light-source asynchronous illumination means that the plurality of polarized light sources in different polarization states provide polarized illumination in sequence; and no illumination means that no illumination is provided.
With reference to the fifth aspect, in a possible implementation manner, the first acquisition mode belongs to a set of polarization acquisition modes, and the set includes: a single-frame polarization image synchronous acquisition operation and a multi-frame polarization image asynchronous acquisition operation. The single-frame polarization image synchronous acquisition operation includes: acquiring a single image of at least one target, and obtaining a plurality of polarization images in different polarization states from the single image. The multi-frame polarization image asynchronous acquisition operation includes: acquiring a plurality of images of the at least one target, where each of the plurality of images contains a plurality of polarization images in different polarization states.
With reference to the fifth aspect, in a possible implementation manner, the first image processing mode belongs to a set of polarization image processing modes, and the set includes: a polarization image obtaining operation, a qualitative polarization information obtaining operation, and a quantitative polarization information obtaining operation. The polarization image obtaining operation is used to obtain a second polarization image, where the second polarization image is one of the polarization images in different polarization states contained in the first polarization image. The qualitative polarization information obtaining operation is used to obtain at least one of the following polarization information of the at least one target: a Stokes vector, a degree of polarization, or an angle of polarization. The quantitative polarization information obtaining operation is used to obtain a Mueller matrix of the at least one target.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and indicates acquisition of the second polarization image; the first illumination mode includes multi-polarized-light-source synchronous illumination, the first acquisition mode includes the single-frame polarization image synchronous acquisition operation, and the first image processing mode includes the polarization image obtaining operation.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and indicates acquisition of qualitative polarization information; the first illumination mode includes multi-polarized-light-source synchronous illumination, the first acquisition mode includes the single-frame polarization image synchronous acquisition operation, and the first image processing mode includes the qualitative polarization information obtaining operation.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is less than a preset value and indicates acquisition of quantitative polarization information; the first illumination mode includes multi-polarized-light-source asynchronous illumination, the first acquisition mode includes the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode includes the quantitative polarization information obtaining operation.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of the second polarization image; the first illumination mode includes no illumination, the first acquisition mode includes the single-frame polarization image synchronous acquisition operation, and the first image processing mode includes the polarization image obtaining operation.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of qualitative polarization information; the first illumination mode includes no illumination, the first acquisition mode includes the single-frame polarization image synchronous acquisition operation, and the first image processing mode includes the qualitative polarization information obtaining operation.
With reference to the fifth aspect, in a possible implementation manner, the first feature information indicates that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of quantitative polarization information; the first illumination mode includes multi-polarized-light-source asynchronous illumination, the first acquisition mode includes the multi-frame polarization image asynchronous acquisition operation, and the first image processing mode includes the quantitative polarization information obtaining operation.
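The correspondence between the first feature information and the three modes in the implementations above can be sketched as a simple dispatch function. This is only an illustration: the function and the mode names are hypothetical and are not part of the claimed apparatus.

```python
from typing import NamedTuple

class ModeSelection(NamedTuple):
    illumination: str
    acquisition: str
    processing: str

def select_modes(ambient_below_threshold: bool, requested: str) -> ModeSelection:
    """Map scene feature info (ambient light intensity vs. a preset value,
    plus the kind of polarization information requested: "image",
    "qualitative", or "quantitative") to the illumination, acquisition,
    and processing modes described above. Names are illustrative."""
    if requested == "quantitative":
        # Quantitative (Mueller matrix) information requires sequential
        # illumination by differently polarized sources, regardless of
        # the ambient light level.
        return ModeSelection("multi_source_async", "multi_frame_async", "quantitative")
    # Second polarization image or qualitative info: illuminate only
    # when ambient light is insufficient; a single frame suffices.
    illumination = "multi_source_sync" if ambient_below_threshold else "none"
    processing = "polarization_image" if requested == "image" else "qualitative"
    return ModeSelection(illumination, "single_frame_sync", processing)
```

For example, low ambient light plus a request for the second polarization image selects synchronous multi-source illumination with single-frame synchronous acquisition.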
In the case where the image processing apparatus according to the fifth aspect is integrated with the lighting device according to the third aspect, the integrated devices may share at least one processor, and the at least one processor may execute the processing operations of the lighting device, the image capturing device, and the image processing apparatus.
In a sixth aspect, the present application provides a computer-readable storage medium comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method as described in the first aspect and any one of the possible implementations of the first aspect.
In a seventh aspect, the present application provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the method as described in the first aspect and any one of the possible implementations of the first aspect.
It should be appreciated that the description of technical features, solutions, benefits, or similar language in this application does not imply that all of the features and advantages may be realized in any single embodiment. Rather, it is to be understood that the description of a feature or advantage is intended to include the specific features, aspects or advantages in at least one embodiment. Therefore, the descriptions of technical features, technical solutions or advantages in the present specification do not necessarily refer to the same embodiment. Furthermore, the technical features, technical solutions and advantages described in the present embodiments may also be combined in any suitable manner. One skilled in the relevant art will recognize that an embodiment may be practiced without one or more of the specific features, aspects, or advantages of a particular embodiment. In other instances, additional features and advantages may be recognized in certain embodiments that may not be present in all embodiments.
Drawings
FIG. 1 is a functional block diagram of a vehicle according to an embodiment of the present disclosure;
fig. 2 is a system architecture diagram of an image processing system according to an embodiment of the present application;
fig. 3 is a schematic view of a lighting device according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of an image processing method according to an embodiment of the present application;
fig. 5 is a schematic flowchart of another image processing method provided in the embodiment of the present application;
FIGS. 6 a-6 d are schematic diagrams illustrating the comparison of polarization images with different polarization directions according to the embodiments of the present application;
FIGS. 7a and 7b are schematic diagrams illustrating a comparison between a visible light image and a polarization degree image provided in an embodiment of the present application;
FIGS. 8a and 8b are schematic diagrams illustrating a comparison between a visible light image and a polarization angle image according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a processing apparatus according to an embodiment of the present disclosure;
fig. 10 is a schematic hardware structure diagram of a processing apparatus according to an embodiment of the present disclosure;
fig. 11 is a schematic hardware structure diagram of another processing device according to an embodiment of the present disclosure.
Detailed Description
In the description of this application, "/" means "or" unless otherwise stated; for example, A/B may mean A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone. Further, "at least one" means one or more, and "a plurality of" means two or more. The terms "first", "second", and the like do not limit a quantity or an execution order, nor do they indicate that the referenced objects are necessarily different.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In order to make the present application clearer, a brief description of some concepts related to the present application will be given below.
1. Polarization
Polarization refers to the phenomenon in which a transverse wave vibrates, during propagation, in a direction perpendicular to the direction of propagation of the wave. Polarization is a characteristic of transverse waves; longitudinal waves are not polarized.
Since light is also a transverse wave, light is likewise polarized during propagation. Objects made of different materials reflect light in different polarization states after being illuminated (a polarization state may be characterized by polarization properties such as the degree of polarization and the angle of polarization; in this application, the degree of polarization is used as an example). Moreover, the same material may reflect light in different polarization states when illuminated by light of different polarization states.
Based on the characteristic that different objects reflect light in different polarization states, objects made of different materials can be distinguished by detecting the polarization states of the light they reflect. This can solve the problem that similar objects are difficult to identify under visible light because different kinds of objects may present the same spectral characteristics in a certain spectral band (the "different objects, same spectrum" phenomenon).
2. Quantitative polarization information
Quantitative polarization information is polarization information of the object itself; it is fixed and does not change with the environment in which the object is located. A parameter commonly used to characterize the quantitative polarization information of an object is the Mueller matrix. After the Mueller matrix of an object is determined, the material of the object can be determined by a table lookup.
In the table lookup method, objects of different materials and their corresponding Mueller matrices are configured in advance. After the Mueller matrix of an object is determined, the material corresponding to that Mueller matrix can be queried from the table.
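As an illustration of such a table lookup, the following sketch matches a measured Mueller matrix against a preconfigured table by nearest Frobenius distance. The table entries here are textbook idealizations, not measured values; a real system would preconfigure experimentally measured matrices for each material.

```python
import numpy as np

# Hypothetical reference table: material -> 4x4 Mueller matrix.
# Real entries would come from laboratory measurements.
MUELLER_TABLE = {
    "ideal_depolarizer": np.diag([1.0, 0.0, 0.0, 0.0]),
    "ideal_mirror":      np.diag([1.0, 1.0, -1.0, -1.0]),
    "linear_polarizer_0deg": 0.5 * np.array([
        [1, 1, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 0, 0],
        [0, 0, 0, 0],
    ], dtype=float),
}

def lookup_material(mueller, table=MUELLER_TABLE):
    """Return the material whose stored Mueller matrix is closest
    (in Frobenius norm) to the measured matrix."""
    return min(table, key=lambda name: np.linalg.norm(table[name] - mueller))
```

A measured matrix that is close to, but not exactly equal to, a stored entry still resolves to the nearest material, which makes the lookup tolerant of measurement noise.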
3. Qualitative polarization information
Qualitative polarization information refers to the polarization information that an object exhibits in its current environment. Qualitative polarization information may vary with the environment in which the object is located (light intensity, illumination direction, polarization of the illuminating light, and the like).
The qualitative polarization information typically includes at least one of the following: a Stokes vector, a degree of polarization, or an angle of polarization.
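Assuming intensity measurements behind linear polarizers at 0°, 45°, 90°, and 135°, these quantities can be computed as follows. This is a standard sketch rather than the method of the application; note that the circular Stokes component S3 cannot be recovered from linear measurements alone.

```python
import numpy as np

def linear_stokes(i0, i45, i90, i135):
    """Linear Stokes parameters (S0, S1, S2) from intensities measured
    behind linear polarizers at 0, 45, 90, and 135 degrees."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                       # 0/90 preference
    s2 = i45 - i135                     # 45/135 preference
    return s0, s1, s2

def degree_of_linear_polarization(s0, s1, s2):
    """Fraction of the light that is linearly polarized, in [0, 1]."""
    return np.sqrt(s1**2 + s2**2) / s0

def angle_of_polarization(s1, s2):
    """Orientation of the polarization plane, in radians."""
    return 0.5 * np.arctan2(s2, s1)
```

For fully polarized light at 0° (i0 = 1, i45 = i135 = 0.5, i90 = 0), the degree of polarization is 1 and the angle is 0; for unpolarized light (all four intensities equal), the degree of polarization is 0.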
4. Pixel-level coated polarization sensor
By coating the sensor of an image acquisition device with polarizers in different polarization directions, polarization images modulated at different polarization angles can be obtained; by additionally coating RGB filter layers, color polarization images can be obtained.
The working principle of the pixel-level coated polarization sensor is described as follows:
In an image collected by a pixel-level coated sensor, the polarization states of every four adjacent pixels (four is an example; the number is not limited) are different. In one example, the polarization directions of the four adjacent pixels are 0°, 45°, 90°, and 135°, respectively. In this way, after the image acquisition device collects an image, the pixels with the same polarization state are extracted from the image to obtain an image in one polarization state. In the same way, the image acquisition device can obtain images of the target in four polarization states.
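The extraction of same-polarization pixels described above can be sketched with array slicing. The 2×2 layout assumed here (0°/45° on even rows, 135°/90° on odd rows) is illustrative; the actual arrangement depends on the sensor.

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a pixel-level polarizer mosaic into four sub-images,
    one per polarization direction. Assumes a repeating 2x2 pattern:
        0   45
        135 90
    Each sub-image has half the raw resolution in each axis."""
    return {
        0:   raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        90:  raw[1::2, 1::2],
    }
```

From a single raw frame this yields the four polarization-state images at once, which is exactly what the single-frame synchronous acquisition operation relies on.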
The foregoing is a brief introduction to some of the matters and concepts related to this application.
Hereinafter, application scenarios of the embodiments of the present application will be briefly described.
The embodiments of the application provide an image processing method and apparatus, applied to a terminal device and used to determine polarization information of at least one target.
A terminal device may also be referred to as user equipment (UE), a terminal, an access terminal, a subscriber unit, a subscriber station, a mobile station, a remote terminal, a mobile device, a user terminal, a wireless communication device, a user agent, or a user apparatus. The terminal device may be a vehicle-to-everything (V2X) device, such as a smart car, a digital car, an unmanned (self-driving) car, a battery electric vehicle (BEV), a hybrid electric vehicle (HEV), a range-extended electric vehicle (REEV), a plug-in hybrid electric vehicle (PHEV), or another new energy vehicle. The terminal device may also be a device-to-device (D2D) device. The terminal device may also be a mobile station (MS), a subscriber unit, a drone, an internet of things (IoT) device, a station (ST) in a WLAN, a cellular phone, a smart phone, a cordless phone, a wireless data card, a tablet, a session initiation protocol (SIP) phone, a wireless local loop (WLL) station, a personal digital assistant (PDA) device, a laptop computer, a machine type communication (MTC) terminal, a handheld device with wireless communication capability, a computing device or other processing device connected to a wireless modem, a vehicle-mounted device, or a wearable device (also referred to as a wearable smart device).
In the embodiments of the present application, a vehicle is taken as an example of the terminal device.
Fig. 1 is a functional block diagram of a vehicle 100 provided in an embodiment of the present application, where the vehicle 100 may be an intelligent vehicle. In one embodiment, the vehicle 100 determines a first illumination mode and a first acquisition mode according to the first characteristic information, provides illumination according to the first illumination mode, acquires a first polarization image of at least one object according to the first acquisition mode, and the vehicle 100 processes the first polarization image according to the first image processing mode to determine polarization information of the at least one object. The vehicle 100 determines the surroundings of the vehicle based on the polarization information of the at least one object, providing a basis for the automatic driving of the vehicle.
Vehicle 100 may include various subsystems such as a travel system 110, a sensor system 120, a control system 130, one or more peripherals 140, as well as a power supply 150, a computer system 160, and a user interface 170. Alternatively, vehicle 100 may include more or fewer subsystems, and each subsystem may include multiple elements. In addition, each of the sub-systems and elements of the vehicle 100 may be interconnected by wire or wirelessly.
The travel system 110 may include components that provide powered motion to the vehicle 100. In one embodiment, the travel system 110 may include an engine 111, a transmission 112, an energy source 113, and wheels 114. The engine 111 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine consisting of a gasoline engine and an electric motor, a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 111 converts the energy source 113 into mechanical energy.
Examples of energy sources 113 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 113 may also provide energy to other systems of the vehicle 100.
The transmission 112 may transmit mechanical power from the engine 111 to the wheels 114. The transmission 112 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 112 may also include other devices, such as a clutch. Wherein the drive shaft may comprise one or more shafts that may be coupled to one or more wheels 114.
The sensor system 120 may include several sensors that sense information about the environment surrounding the vehicle 100. For example, the sensor system 120 may include a positioning system 121 (which may be the global positioning system (GPS), the BeiDou system, or another positioning system), an inertial measurement unit (IMU) 122, a radar 123, a lidar 124, and a camera 125. The sensor system 120 may also include sensors that monitor internal systems of the vehicle 100 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect objects and their corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification are key functions for the safe autonomous operation of the vehicle 100.
The positioning system 121 may be used to estimate the geographic location of the vehicle 100. The IMU 122 is used to sense position and orientation changes of the vehicle 100 based on inertial acceleration. In one embodiment, the IMU 122 may be a combination of an accelerometer and a gyroscope.
The radar 123 may utilize radio signals to sense objects within the surrounding environment of the vehicle 100. In some embodiments, in addition to sensing objects, radar 123 may also be used to sense the speed and/or heading of an object.
Lidar 124 may utilize a laser to sense objects in the environment in which vehicle 100 is located. In some embodiments, lidar 124 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The camera 125 may be used to capture multiple images of the surroundings of the vehicle 100, as well as multiple images within the vehicle cabin. The camera 125 may be a still camera or a video camera. In the embodiments of the present application, the camera 125 may be a polarization camera capable of acquiring a polarization image of the target.
The control system 130 may control the operation of the vehicle 100 and its components. Control system 130 may include various elements including a steering system 131, a throttle 132, a braking unit 133, a computer vision system 134, a route control system 135, and an obstacle avoidance system 136.
The steering system 131 is operable to adjust the heading of the vehicle 100. For example, in one embodiment, the steering system 131 may be a steering wheel system.
The throttle 132 is used to control the operating speed of the engine 111 and thus the speed of the vehicle 100.
The brake unit 133 is used to control the vehicle 100 to decelerate. The brake unit 133 may use friction to slow the wheel 114. In other embodiments, the brake unit 133 may convert the kinetic energy of the wheel 114 into an electrical current. The brake unit 133 may take other forms to slow the rotational speed of the wheels 114 to control the speed of the vehicle 100.
The computer vision system 134 may be operable to process and analyze images captured by the camera 125 to identify objects and/or features in the environment surrounding the vehicle 100, as well as limb and facial features of a driver within the vehicle cabin. The objects and/or features may include traffic signals, road conditions, and obstacles; the limb and facial features of the driver include the driver's behavior, line of sight, expression, and the like. The computer vision system 134 may use object recognition algorithms, structure from motion (SFM) algorithms, video tracking, and other computer vision techniques. In some embodiments, the computer vision system 134 may be used to map an environment, track objects, estimate the speed of objects, determine driver behavior, perform face recognition, and so on.
The route control system 135 is used to determine a travel route of the vehicle 100. In some embodiments, route control system 135 may combine data from sensors, positioning system 121, and one or more predetermined maps to determine a travel route for vehicle 100.
Obstacle avoidance system 136 is used to identify, assess, and avoid or otherwise negotiate potential obstacles in the environment of vehicle 100.
Of course, in one example, the control system 130 may additionally include components not shown above, replace some of the components shown above with other components, or omit some of the components shown above.
Vehicle 100 interacts with external sensors, other vehicles, other computer systems, or users through peripherals 140. The peripheral devices 140 may include a wireless communication system 141, an in-vehicle computer 142, a microphone 143, and/or a speaker 144.
In some embodiments, the peripheral device 140 provides a means for a user of the vehicle 100 to interact with the user interface 170. For example, the in-vehicle computer 142 may provide information to a user of the vehicle 100. The user interface 170 may also operate the in-vehicle computer 142 to receive user input. The in-vehicle computer 142 may be operated through a touch screen. In other cases, the peripheral device 140 may provide a means for the vehicle 100 to communicate with other devices located within the vehicle. For example, the microphone 143 may receive audio (e.g., voice commands or other audio input) from a user of the vehicle 100. Similarly, the speaker 144 may output audio to a user of the vehicle 100.
Wireless communication system 141 may communicate wirelessly with one or more devices, either directly or via a communication network. For example, wireless communication system 141 may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication. The wireless communication system 141 may communicate with a wireless local area network (WLAN) using WiFi. In some embodiments, the wireless communication system 141 may communicate directly with devices using an infrared link, Bluetooth, or ZigBee. The wireless communication system 141 may also communicate with devices using other wireless protocols, such as various vehicle communication systems; for example, the wireless communication system 141 may include one or more dedicated short range communications (DSRC) devices.
The power supply 150 may provide power to various components of the vehicle 100. In one embodiment, the power supply 150 may be a rechargeable lithium-ion or lead-acid battery. One or more battery packs of such batteries may be configured as a power source to provide power to the various components of the vehicle 100. In some embodiments, the power supply 150 and the energy source 113 may be implemented together, as in some battery electric vehicles or hybrid electric vehicles.
Some or all of the functions of vehicle 100 are controlled by computer system 160. The computer system 160 may include at least one processor 161, the processor 161 executing instructions 1621 stored in a non-transitory computer readable medium, such as the data storage device 162. The computer system 160 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 100 in a distributed manner.
Processor 161 may be any conventional processor, such as a commercially available Central Processing Unit (CPU). Alternatively, the processor may be a dedicated device such as an Application Specific Integrated Circuit (ASIC) or other hardware-based processor. Although fig. 1 functionally illustrates a processor, memory, and other elements within the same physical housing, those skilled in the art will appreciate that the processor, computer system, or memory may actually comprise multiple processors, computer systems, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard drive, or other storage medium located in a different physical enclosure. Thus, references to a processor or computer system are to be understood as including references to a collection of processors or computer systems or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the retarding component, may each have their own processor that performs only computations related to the component-specific functions.
In various aspects described herein, the processor may be located in a device remote from and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to perform a single maneuver.
In some embodiments, the data storage device 162 may include instructions 1621 (e.g., program logic), which instructions 1621 may be executed by the processor 161 to perform various functions of the vehicle 100, including all or part of the functions described above. The data storage device 162 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of the travel system 110, the sensor system 120, the control system 130, and the peripheral devices 140.
In addition to instructions 1621, data storage device 162 may also store data such as road maps, route information, the location, direction, speed, and other such vehicle data of the vehicle, as well as other information. Such information may be used by vehicle 100 and computer system 160 during operation of vehicle 100 in autonomous, semi-autonomous, and/or manual modes.
For example, in one possible embodiment, the data storage device 162 may obtain information about obstacles in the surrounding environment, such as the positions of other vehicles, road edges, and green belts, the distance between each obstacle and the vehicle, and the distances between obstacles, all obtained by the vehicle based on the sensors in the sensor system 120. The data storage device 162 may also obtain environmental information from the sensor system 120 or other components of the vehicle 100, such as whether a green belt, a lane, or a pedestrian is present near the environment in which the vehicle is currently located, whether detected directly or calculated by a machine learning algorithm. In addition, the data storage device 162 may store state information of the vehicle itself, including but not limited to the position, speed, acceleration, and heading angle of the vehicle, as well as state information of other vehicles with which the vehicle interacts. As such, the processor 161 may acquire such information from the data storage device 162, determine a passable area of the vehicle based on the environmental information of the environment in which the vehicle is located, the state information of the vehicle itself, the state information of other vehicles, and the like, and determine a final driving strategy based on the passable area to control the vehicle 100 to drive autonomously.
The user interface 170 is used to provide information to, or receive information from, a user of the vehicle 100. Optionally, the user interface 170 may interact with one or more input/output devices within the set of peripheral devices 140, such as one or more of the wireless communication system 141, the in-vehicle computer 142, the microphone 143, and the speaker 144.
The computer system 160 may control the vehicle 100 based on information obtained from various subsystems (e.g., the travel system 110, the sensor system 120, and the control system 130) and information received from the user interface 170. For example, computer system 160 may control steering system 131 to alter the vehicle heading to avoid obstacles detected by sensor system 120 and obstacle avoidance system 136 based on information from control system 130. In some embodiments, the computer system 160 may control many aspects of the vehicle 100 and its subsystems.
Alternatively, one or more of these components described above may be mounted or associated separately from the vehicle 100. For example, the data storage device 162 may exist partially or completely separate from the vehicle 100. The above components may be coupled together for communication by wired and/or wireless means.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 1 should not be construed as limiting the embodiment of the present application.
An autonomous automobile traveling on a roadway, such as vehicle 100 above, may determine an adjustment command for the current speed based on other vehicles within its surrounding environment. The objects in the environment around the vehicle 100 may be traffic control devices, or other types of objects such as green belts, among others. In some examples, each object within the surrounding environment may be considered independently, and the speed adjustment instructions for the vehicle 100 may be determined based on respective characteristics of the object, such as its current speed, acceleration, separation from the vehicle, and so forth.
Alternatively, the vehicle 100, as an autonomous automobile, or a computing device associated with it (e.g., the computer system 160, the computer vision system 134, or the data storage device 162 of fig. 1), may derive the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.) from the identified measurement data and determine the position of each obstacle in the surrounding environment relative to the vehicle at the current time. Since the boundaries of the passable areas formed by the individual obstacles depend on one another, the boundaries of the passable area of the vehicle may also be determined jointly from all the acquired measurement data, removing from the passable area those regions that are not actually passable. The vehicle 100 can then adjust its driving strategy based on the detected passable area. In other words, the autonomous vehicle can determine what state it needs to adjust to (e.g., accelerate, decelerate, turn, or stop) based on the detected passable area of the vehicle. In this process, other factors may also be considered when determining the speed adjustment command for the vehicle 100, such as the lateral position of the vehicle 100 in the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computer device may also provide instructions to modify the steering angle of the vehicle 100 to cause the autonomous vehicle to follow a given trajectory and/or to maintain a safe lateral and longitudinal distance of the autonomous vehicle from nearby objects (e.g., cars in adjacent lanes).
The vehicle 100 may be a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a lawn mower, a recreational vehicle, an amusement park vehicle, construction equipment, a cart, a golf cart, a train, a tram, or the like; this is not particularly limited in the embodiment of the present application.
In other embodiments of the present application, the autonomous vehicle may further include a hardware structure and/or a software module, and the functions described above are implemented in the form of a hardware structure, a software module, or a hardware structure plus a software module. Whether any of the above-described functions is implemented as a hardware structure, a software module, or a hardware structure plus a software module depends upon the particular application and design constraints imposed on the technical solution.
In still another possible implementation, the image processing method provided in the embodiment of the present application may be applied to an image processing system 100 shown in fig. 2. As shown in fig. 2, the image processing system 100 includes: an image processing device 10, an illumination device 20, and an image acquisition device 30.
The image processing apparatus 10 is configured to control and manage the image processing method provided in the embodiment of the present application, and process the acquired polarization image to obtain polarization information of the polarization image.
The illumination device 20 is configured to provide polarized illumination for the environment. The illumination device 20 includes one or more polarized light sources, and different polarized light sources provide polarized light of the same or different polarization states. In the embodiment of the present application, the illumination device 20 includes at least two polarized light sources, and the at least two polarized light sources can provide polarized light in more than two polarization states, so as to meet the requirement of the image processing device for obtaining quantitative polarization information of the target.
In one example, as shown in FIG. 3, the illumination device 20 includes 4 polarized light sources, and the 4 polarized light sources respectively provide polarized light of 4 different polarization states.
The light source of the lighting device may be a Light Emitting Diode (LED), or a laser.
The illumination device may provide polarized light in the following ways: directly emitting polarized light with a polarized light source; filtering an unpolarized light source with a polarizer to obtain polarized light; passing an unpolarized light source through a wave plate to obtain polarized light; or using a polarizer and a wave plate together on an unpolarized light source to obtain polarized light.
Specifically, the 4 polarized light sources are: a 0° polarized light source 201, a 45° polarized light source 202, a 90° polarized light source 203, and a 135° polarized light source 204.
It should be noted that the polarization states provided by the polarized light sources above are only exemplary. In a specific implementation, the illumination device 20 may include any number of polarized light sources, and each polarized light source may provide polarized light of any polarization state, which is not limited in this application. For example, the 4 polarized light sources in the illumination device 20 may instead provide: 0° polarized light, 45° polarized light, 90° polarized light, and left-circularly polarized light (or right-circularly polarized light).
The image acquisition device 30 is configured to acquire polarization images. The image acquisition device 30 may be an image acquisition device configured with the pixel-level coated polarization sensor described above. In the embodiment of the present application, the image acquisition device 30 can acquire polarization images of the target in 3 or more polarization states. For example, the image acquisition device 30 acquires polarization images of the target in the 4 polarization states described above.
It should be noted that, in the embodiment of the present application, the image processing device 10, the illumination device 20, and the image acquisition device 30 may be integrated in the same device, for example, in a mobile phone, a vehicle, or the like. If the three are integrated, they may share at least one processor to perform the corresponding processing operations, or each may be configured with its own processor. As an example, in a smart driving scenario, the image acquisition device 30 may include the camera 125 described above in the vehicle 100, and the illumination device 20 may include an illumination device newly added to the vehicle 100. The image processing device 10 is the computer system 160 described in the vehicle 100; or, more specifically, the image processing device 10 may be the processor 161 described in the vehicle 100.
Further, the image processing device 10, the illumination device 20, and the image acquisition device 30 may be separately provided in different apparatuses. For example, in the smart driving scenario, the image acquisition device 30 is the camera 125 described in the vehicle 100, and the illumination device 20 is an additional illumination device in the vehicle 100. The image processing device 10 is a processing apparatus provided at a remote end, for example, a Road Side Unit (RSU) provided at the roadside, or a Mobile Edge Computing (MEC) device disposed at the remote end. In the case where the image processing device 10, the illumination device 20, and the image acquisition device 30 are separately provided, each of the three may be configured with its own processor to perform the corresponding processing operations. In this case, the image acquisition device 30 may be a smart camera or a similar image acquisition device with processing capability, and the illumination device 20 may be an intelligent illumination device or a similar illumination device with a processing function.
The embodiment of the present application is described by taking as an example that the image processing device 10 is an MEC device disposed at the remote end, and that the illumination device and the image acquisition device are disposed in a vehicle.
In the current automatic driving scenario, a vehicle needs to use a visible light camera to detect information such as the light intensity, shape, and color of each object present in the vehicle's surroundings. In an automatic driving scenario, targets are the various objects present around the vehicle, for example, other vehicles, obstacles, lane markings, signposts, and signal lights, all of which the vehicle needs to identify to assist automatic driving.
However, the application of visible light cameras in automatic driving scenarios has many limitations. For example, in a scene where the ambient light is too strong, the target is prone to producing glare, which makes it difficult to distinguish the target from other surrounding objects.
In a scene where the ambient light is too weak, the intensity of the light reflected by the target and its surrounding objects is low (dim light), which may also make the target difficult to distinguish from other surrounding objects.
In addition, the target and its surrounding objects may present the problem of different objects sharing the same spectrum, so that even when the ambient light intensity is normal, the target and its surrounding objects have similar light reflection characteristics, making the target difficult to distinguish from other surrounding objects.
To solve the problems of glare, dim light, different objects sharing the same spectrum, and the like that affect visible light cameras, a polarization camera has been proposed. In addition to the gray-scale information and color information acquired by a visible light camera, a polarization camera can acquire polarization information of a target. Since different objects typically have different polarization information, the polarization camera can well solve the problem that targets are difficult to distinguish from other surrounding objects in glare, dim light, and same-spectrum scenes.
For example, by determining the polarization information of the target when the visible light intensity is strong or weak, the target can be identified under both strong- and weak-light conditions. When the target is similar in color to other objects, the polarization information, which characterizes the material of an object, can be used to distinguish targets of similar colors but different materials.
Currently, in the field of automatic driving, the method of determining objects around a vehicle using a polarization camera is mainly passive polarization imaging. That is, the polarization camera collects images of the surrounding environment under the current ambient light, performs polarization processing on them to generate a polarization image, and transmits the polarization image to the image processing device. After receiving the image, the image processing device processes the polarization image and extracts its polarization information. Generally, the image processing device can process the polarization image with a trained neural network model to obtain the polarization information of the polarization image; alternatively, the polarization information of the polarization image can be determined by an image fusion method.
However, when the polarization camera is currently used to determine the polarization information of the target, the following problems still exist:
1. A polarization image has a variety of polarization information, such as a polarization-modulated polarization image, the Stokes vector, the polarization angle, the degree of polarization, and the Mueller matrix. Different polarization information can represent different characteristics of the target, and the processing modes for acquiring different polarization information of the polarization image are different. Currently, when determining the polarization information of a polarization image, the image processing device directly acquires all the polarization information of the polarization image that can be acquired, which increases the calculation load of the image processing device.
2. The polarization camera can only acquire a polarization image of the target under ambient light, so the image processing device can only determine qualitative polarization information of the target from the polarization image. Qualitative polarization information can only be used to distinguish the target from other objects in the scene to which it belongs, and cannot be used to determine the material of the target.
3. The polarization image collected by the polarization camera suffers energy loss after being filtered by the polarizer, so the imaging effect is poor in dim-light scenes.
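For orientation, the Stokes vector, degree of polarization, and polarization angle listed above are related by simple closed-form expressions. The following is a minimal NumPy sketch, assuming four co-registered intensity images taken behind 0°/45°/90°/135° polarizers (the function names are illustrative):

```python
import numpy as np

def stokes_from_four(i0, i45, i90, i135):
    """Linear Stokes parameters from co-registered 0/45/90/135-degree images."""
    s0 = 0.5 * (i0 + i45 + i90 + i135)  # total intensity
    s1 = i0 - i90                        # 0-vs-90 preference
    s2 = i45 - i135                      # 45-vs-135 preference
    return s0, s1, s2

def dolp_aop(s0, s1, s2, eps=1e-12):
    """Degree of linear polarization and angle of polarization (radians)."""
    dolp = np.sqrt(s1 ** 2 + s2 ** 2) / np.maximum(s0, eps)
    aop = 0.5 * np.arctan2(s2, s1)
    return dolp, aop
```

For fully 0°-polarized light (i0 = 1, i90 = 0, i45 = i135 = 0.5), these expressions give DoLP = 1 and AoP = 0, as expected.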
In order to solve the above technical problem, an embodiment of the present application provides an image processing method, in which an image processing apparatus determines first feature information capable of characterizing a scene feature and indicating acquisition of polarization information of an object, and the image processing apparatus provides polarized illumination for a scene in which the object is located according to the first feature information, acquires a polarized image of the object, and processes the polarized image of the object to determine the polarization information of the object.
Based on this method, the image processing device can select a corresponding image processing mode according to the polarization information of the target that is to be acquired, and acquire only that polarization information, thereby avoiding the large calculation load on the image processing device caused by acquiring the full amount of polarization information of the target.
In addition, the image processing device can determine the Mueller matrix of the target according to the polarized light provided for the target and the collected polarized light reflected by the target, and further determine quantitative polarization information of the target. In addition, the image processing device can also provide illumination for the target, and the imaging definition of the target in a dim light scene is improved.
It should be noted that the image processing method provided in the embodiment of the present application may be applied to other scenes that need to acquire a polarization image of a target, such as various scenes applied to image recognition (e.g., face recognition, telemedicine, etc.), besides an automatic driving scene, and the present application does not limit this.
The image processing method provided by the embodiment of the present application can be applied to an automatic driving scenario, an intelligent driving scenario, or an Advanced Driver Assistance System (ADAS); the present application does not limit the scenario.
As shown in fig. 4, the image processing method provided by the embodiment of the present application includes the following steps S400 to S403. The following will specifically describe steps S400 to S403:
S400: The image processing device acquires first characteristic information.
The first feature information is used to characterize a scene feature and to indicate acquisition of polarization information of at least one target. The first characteristic information corresponds to a first illumination mode, a first acquisition mode and a first image processing mode.
In one possible implementation, the scene characteristic is a scene characteristic of an environment in which the at least one object is located. For example, the scene characteristic may be an ambient light intensity of a scene in which the at least one object is located.
Obtaining the polarization information of the at least one target refers to which of the plurality of kinds of polarization information of the target is to be obtained for the at least one target.
It is to be noted that the first feature information may be feature information generated after the image processing apparatus acquires the scene feature and the polarization information of the at least one target; the feature information may be feature information that is generated by another device and then transmitted to the image processing apparatus, which is not limited in the present application.
S401, the image processing device determines a first illumination mode, a first acquisition mode and a first image processing mode according to the first characteristic information.
Wherein the first illumination mode is used to characterize: an illumination mode that provides polarized light illumination for at least one target.
The first acquisition mode is used for characterizing: and acquiring a polarized image of the at least one target after being illuminated according to the first illumination mode.
The first image processing mode is used for representing: and an image processing means for processing the polarization image of the at least one object acquired in accordance with the first acquisition means.
In one possible implementation manner, the image processing apparatus is configured with a correspondence relationship between the first characteristic information and the first illumination manner, the first acquisition manner, and the first image processing manner in advance. After the image processing device acquires the first characteristic information, the image processing device determines a corresponding first illumination mode, a first acquisition mode and a first image processing mode according to the corresponding relation.
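Such a pre-configured correspondence can be as simple as a lookup table. A minimal Python sketch follows; the key names and table entries are illustrative assumptions for demonstration, not the mapping the embodiment actually configures:

```python
# Illustrative correspondence table: keys are (ambient light, requested
# polarization info); values are (illumination, acquisition, processing)
# modes. All names are assumptions, not the patent's configured mapping.
MODE_TABLE = {
    ("strong", "polarization_image"): ("no_illumination",    "single_frame_sync", "get_polarization_image"),
    ("normal", "qualitative"):        ("no_illumination",    "single_frame_sync", "get_qualitative_info"),
    ("weak",   "qualitative"):        ("multi_source_sync",  "single_frame_sync", "get_qualitative_info"),
    ("weak",   "quantitative"):       ("multi_source_async", "multi_frame_async", "get_quantitative_info"),
}

def select_modes(ambient_light, requested_info):
    """Return (first illumination mode, first acquisition mode, first processing mode)."""
    return MODE_TABLE[(ambient_light, requested_info)]
```

In this sketch, the first characteristic information (ambient light plus requested polarization information) indexes directly into the table, mirroring the one-to-one correspondence described above.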
S402: The image processing device controls polarized illumination according to the first illumination mode, and controls, according to the first acquisition mode, acquisition of a first polarization image of the at least one target in the first illumination mode.
It should be noted that the image processing apparatus may control the polarized illumination by itself, or the image processing apparatus may control the illumination apparatus to illuminate, which is not limited in the present application.
Similarly, the controlling of the image processing device to acquire the first polarization image may be controlling of the image acquisition device to acquire the first polarization image of the at least one target in the first illumination mode, or controlling of the image processing device to acquire the first polarization image of the at least one target in the first illumination mode. This is not limited in this application.
And S403, the image processing device processes the first polarization image according to the first image processing mode to acquire the polarization information of at least one target.
Based on the above technical solution, in the image processing method provided in the embodiment of the present application, the image processing device may determine a corresponding image processing manner according to the acquisition of the polarization information of the target, and acquire the polarization information of the target, thereby avoiding a problem of a large calculation amount of the image processing device caused by acquiring the full amount of polarization information of the target.
In addition, the image processing device can determine the Mueller matrix of the target according to the polarized light provided for the target and the collected polarized light reflected by the target, and further determine quantitative polarization information of the target. The image processing device can also provide illumination for the target, and improves the imaging definition of the target in a dim light scene.
First characteristic information, a first illumination mode, a first acquisition mode, and a first image processing mode according to an embodiment of the present application are exemplarily described below:
a) and the scene characteristic in the first characteristic information comprises light intensity information of the environment. The polarization information in the first characteristic information includes at least one of: a second polarization image, qualitative polarization information, or quantitative polarization information.
b) The first illumination mode belongs to a polarized light illumination mode set, and the polarized light illumination mode set includes at least one of the following: synchronous illumination by multiple polarized light sources, asynchronous illumination by multiple polarized light sources, or no illumination.
Here, synchronous illumination by multiple polarized light sources means that polarized illumination is provided simultaneously by a plurality of polarized light sources of different polarization states; asynchronous illumination by multiple polarized light sources means that polarized illumination is provided sequentially by a plurality of polarized light sources of different polarization states; and no illumination means that no illumination is provided.
It should be noted that, in the embodiment of the present application, the multi-polarization light source synchronous illumination may be specifically implemented as: all of the polarized light sources of the plurality of polarized light sources provide illumination simultaneously, or some of the polarized light sources of the plurality of polarized light sources provide illumination simultaneously. This is not limited in this application.
c) The first collection mode belongs to a polarization collection mode set, and the polarization collection mode set comprises: synchronous collection operation of single-frame polarized images and asynchronous collection operation of multi-frame polarized images.
The single-frame polarization image synchronous acquisition operation is: acquiring a single image of the at least one target, and obtaining from that single image a plurality of polarization images in different polarization states.
The multi-frame polarization image asynchronous acquisition operation is: acquiring a plurality of images of the at least one target, where each of the plurality of images comprises a plurality of polarization images in different polarization states.
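In the single-frame synchronous case, the pixel-level coated polarization sensor mentioned earlier interleaves the polarization states in a mosaic, and the sub-images are recovered by de-interleaving. A minimal sketch, assuming the common 2×2 superpixel layout (90°/45° on the top row, 135°/0° on the bottom row) — the actual layout is sensor-specific:

```python
import numpy as np

def split_polarization_mosaic(raw):
    """Split a pixel-level polarization mosaic into four polarization sub-images.

    Assumes a 2x2 superpixel layout with 90/45 on the top row and 135/0
    on the bottom row; real sensors may use a different arrangement.
    """
    i90 = raw[0::2, 0::2]   # even rows, even columns
    i45 = raw[0::2, 1::2]   # even rows, odd columns
    i135 = raw[1::2, 0::2]  # odd rows, even columns
    i0 = raw[1::2, 1::2]    # odd rows, odd columns
    return i0, i45, i90, i135
```

Each sub-image has half the resolution of the raw frame in each dimension; production pipelines typically interpolate the missing samples rather than simply subsampling.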
d) The first image processing mode belongs to a polarization image processing mode set, and the polarization image processing mode set comprises: obtaining polarized images, obtaining qualitative polarized information and obtaining quantitative polarized information.
The polarization image obtaining operation is used to obtain a second polarization image, where the second polarization image is one of the polarization images, in different polarization states, that belong to the first polarization image.
The qualitative polarization information obtaining operation is for obtaining at least one of the following polarization information of at least one target: stokes vector, degree of polarization, or angle of polarization.
The quantitative polarization information acquisition operation is used to acquire a mueller matrix of the at least one target.
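The quantitative case rests on the relation S_out = M · S_in between the Stokes vector of the provided polarized illumination and that of the measured reflection; with several known illumination states, M can be estimated by least squares. A minimal sketch under that assumption (the function name is illustrative):

```python
import numpy as np

def estimate_mueller(s_in, s_out):
    """Least-squares estimate of the 4x4 Mueller matrix M in S_out = M @ S_in.

    s_in, s_out: 4xN arrays whose columns are the Stokes vectors of the
    N illumination states and the corresponding measured reflections.
    Requires at least 4 linearly independent illumination states.
    """
    return s_out @ np.linalg.pinv(s_in)
```

This is why the illumination device needs multiple polarization states for quantitative polarization information: with fewer than four independent states, the 4×4 Mueller matrix is underdetermined.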
Based on this, under the condition of different ambient light intensities and different polarization information acquisition, the image processing device can determine to provide corresponding illumination modes for the target and collect the polarization images of the target under the corresponding illumination modes. The image processing device can obtain the polarization information required by the target by processing the polarization images of the target acquired under different illumination modes.
In a possible implementation manner, referring to fig. 4, as shown in fig. 5, the above S400 may be specifically implemented by the following S400a to S400c.
S400a, the image processing apparatus determines the ambient light intensity.
In one possible implementation, the ambient light intensity of the environment is detected by an ambient light detection module. When the image processing device needs to determine the first characteristic information, the image processing device sends an ambient light intensity detection instruction to the ambient light detection module. The ambient light detection module detects the intensity of ambient light in the environment where the at least one target is currently located according to the indication. After the ambient light detection module determines the ambient light intensity of the environment where the at least one target is currently located, the ambient light intensity is sent to the image processing device. Accordingly, the image processing apparatus receives the ambient light intensity from the ambient light detection module.
In yet another possible implementation, the image information of the environment in which the at least one target is located is acquired by a visible light image acquisition device. When the image processing device needs to determine the first characteristic information, it sends a visible light image acquisition instruction to the visible light image acquisition device. The visible light image acquisition device acquires a visible light image of the environment in which the at least one target is currently located according to the instruction, and sends the visible light image to the image processing device. Accordingly, the image processing device receives the visible light image from the visible light image acquisition device, and processes the visible light image to determine the ambient light intensity.
It should be noted that, the above is merely an exemplary description of the method for determining the ambient light intensity by the image processing apparatus, and in a specific implementation, the image processing apparatus may also obtain the ambient light intensity by other methods, which is not limited in this application.
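For the visible-light-image route, one simple way to turn an image into an ambient-light level is to threshold its mean gray level. A minimal sketch, where the threshold values are illustrative assumptions for 8-bit images, not values specified by the method:

```python
import numpy as np

def ambient_light_level(gray_image, dark_thresh=50.0, bright_thresh=200.0):
    """Classify ambient light from the mean gray level of a visible-light image.

    The thresholds are illustrative assumptions for 8-bit gray levels.
    """
    mean_level = float(np.mean(gray_image))
    if mean_level < dark_thresh:
        return "weak"
    if mean_level > bright_thresh:
        return "strong"
    return "normal"
```

A real implementation would likely also account for exposure settings and local contrast, since a fixed gray-level threshold conflates scene brightness with camera gain.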
S400b, the image processing device determines polarization information to be acquired by the target.
In one possible implementation manner, the image processing apparatus is preset with polarization information to be acquired of a target corresponding to each scene. After the image processing apparatus determines a scene in which the polarization information is acquired, the image processing apparatus may determine the polarization information to be acquired according to the scene.
The following description takes as an example the image processing device determining the polarization information to be acquired in an automatic driving scenario (the following scenarios may equally be intelligent driving scenarios; this application is not particularly limited, and the scenarios are described here only by way of example):
When the scene is a lane line detection or driver detection task, the image processing device determines that the polarization information to be acquired for the target is the second polarization image.
When the scene is detection of surrounding vehicles, the image processing device determines that the polarization information to be acquired for the target is qualitative polarization information.
When the scene is metal manhole cover detection, the image processing device determines that the polarization information to be acquired for the target is quantitative polarization information.
It should be noted that this application does not limit the order of S400a and S400b. For example, the image processing device may perform S400a first and then S400b; alternatively, it may perform S400b first and then S400a; still alternatively, it may perform S400a and S400b simultaneously.
S400c, the image processing device generates the first characteristic information according to the ambient light intensity and the polarization information to be acquired for the target.
Based on this, the image processing device can generate the corresponding first characteristic information according to the acquired ambient light intensity and the current scene information.
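As an illustration of S400a-S400c, the combination of ambient light intensity and scene into the first characteristic information can be sketched as follows. The function and key names and the threshold value are hypothetical; only the scene-to-information pairs from the driving-scenario examples above are taken from the text.

```python
# Illustrative sketch only: names, keys, and the threshold are assumptions.
AMBIENT_THRESHOLD = 50.0  # hypothetical preset value separating dim from bright scenes

# Per-scene polarization information to acquire, per the driving-scenario examples
SCENE_TO_TARGET_INFO = {
    "lane_line_detection": "second_polarization_image",
    "driver_detection": "second_polarization_image",
    "surrounding_vehicle_detection": "qualitative_polarization_info",
    "metal_manhole_cover_detection": "quantitative_polarization_info",
}

def first_characteristic_info(ambient_intensity, scene):
    """Return (ambient light indication, polarization information to acquire)."""
    ambient = ("below_preset" if ambient_intensity < AMBIENT_THRESHOLD
               else "at_or_above_preset")
    return ambient, SCENE_TO_TARGET_INFO[scene]
```

For example, a dim-light lane line detection task would map to acquiring the second polarization image under weak ambient light.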
In a possible implementation, with reference to fig. 4 and as shown in fig. 5, the above S402 may be specifically implemented by the following S402a-S402g. S402a-S402g are explained in detail below:
S402a, the image processing device generates a first instruction according to the first illumination mode.
The first instruction indicates the first illumination mode. Specifically, the first instruction indicates which illumination mode in the polarized illumination mode set the first illumination mode is.
For example, the first instruction indicates that the first illumination mode is multi-polarized light source synchronous illumination; or the first instruction indicates that the first illumination mode is multi-polarized light source asynchronous illumination; or the first instruction indicates that the first illumination mode is no illumination.
S402b, the image processing device sends the first instruction to the illumination device. Accordingly, the illumination device receives the first instruction from the image processing device.
S402c, the illumination device controls the polarized light sources with different polarization states to provide illumination according to the first instruction.
In one possible implementation, the illumination device has at least 3 illumination modes:
In illumination mode 1, the illumination device controls all or some of the plurality of polarized light sources to be turned on simultaneously. In illumination mode 2, the illumination device controls each of the plurality of polarized light sources to be turned on in sequence. In illumination mode 3, the illumination device controls all of the plurality of polarized light sources to be turned off.
When the first instruction indicates that the first illumination mode is multi-polarized light source synchronous illumination, the illumination device provides illumination for the target in illumination mode 1.
When the first instruction indicates that the first illumination mode is multi-polarized light source asynchronous illumination, the illumination device provides illumination for the target in illumination mode 2.
When the first instruction indicates that the first illumination mode is no illumination, the illumination device applies illumination mode 3 and provides no illumination for the target.
That is, in a case where the first instruction is used to instruct the first illumination mode to be multi-polarized light source synchronous illumination, the illumination device controls all or some of the plurality of polarized light sources to be simultaneously turned on, and all or some of the plurality of polarized light sources simultaneously provide illumination for the target.
When the first instruction is used for indicating that the first illumination mode is multi-polarized light source asynchronous illumination, the illumination device controls each polarized light source in the plurality of polarized light sources to be sequentially turned on, and the plurality of polarized light sources sequentially provide illumination for the target.
When the first instruction is used for indicating that the first illumination mode is no illumination, the illumination device controls each polarized light source in the plurality of polarized light sources to be turned off, and the plurality of polarized light sources do not provide illumination for the target.
In one example, referring to fig. 3, in illumination mode 1 the illumination device may control the 4 polarized light sources with different polarization states to illuminate simultaneously; or the illumination device may control two polarized light sources whose polarization directions differ by 90° to illuminate simultaneously, for example, the 0° and 90° polarized light sources, or the 45° and 135° polarized light sources.
In illumination mode 2, the illumination apparatus controls the 0 ° polarized light source to illuminate for a first time period, controls the 45 ° polarized light source to illuminate for a second time period, controls the 90 ° polarized light source to illuminate for a third time period, and controls the 135 ° polarized light source to illuminate for a fourth time period. The second time period is a time period after the first time period, the third time period is a time period after the second time period, and the fourth time period is a time period after the third time period.
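The three illumination modes above can be sketched as a schedule of time slots. The representation below (one tuple of simultaneously lit sources per slot) is an assumption made for illustration, not an interface defined by this application.

```python
# Hypothetical sketch of illumination modes 1-3; the slot representation is assumed.
SOURCES = (0, 45, 90, 135)  # polarization directions of the polarized light sources

def illumination_schedule(mode, sources=SOURCES):
    """Return a list of time slots; each slot is the tuple of sources lit together."""
    if mode == 1:                       # all (or a chosen subset of) sources lit at once
        return [tuple(sources)]
    if mode == 2:                       # one source per consecutive time period
        return [(s,) for s in sources]
    if mode == 3:                       # no illumination
        return []
    raise ValueError("unknown illumination mode")
```

Mode 2 thus yields four consecutive slots, matching the first through fourth time periods described above.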
S402d, the image processing device generates a second instruction according to the first acquisition mode.
The second instruction is used for indicating the first acquisition mode. Specifically, the second instruction is used to indicate which image acquisition mode of the polarization acquisition mode set the first acquisition mode is.
For example, the second instruction is used for indicating that the first acquisition mode is a single-frame polarized image synchronous acquisition operation; or the second instruction is used for indicating that the first acquisition mode is multi-frame polarized image asynchronous acquisition operation.
It should be noted that, in this embodiment of the application, the order of S402d relative to S402a-S402c is not limited: the image processing device may perform S402d while performing any of S402a-S402c, or before or after any of S402a-S402c. For example, the image processing device may perform S402a first and then S402d; alternatively, it may perform S402d first and then S402a; still alternatively, it may perform S402a and S402d simultaneously. This is not limited in this application.
S402e, the image processing device sends a second instruction to the image acquisition device. Correspondingly, the image acquisition device receives a second instruction from the image processing device.
It should be noted that S402e is a step performed after S402d, and the relationship between S402e and S402a-S402c may refer to the relationship between S402d and S402a-S402c, which is not described herein in detail.
S402f, the image acquisition device acquires a first polarization image according to the second instruction.
In one possible implementation, the image acquisition device has at least two image acquisition modes:
In image acquisition mode 1, the image acquisition device acquires a single image of the at least one target and obtains a plurality of polarization images in different polarization states from that single image. In image acquisition mode 2, the image acquisition device acquires a plurality of images of the at least one target, each of which includes a plurality of polarization images in different polarization states.
When the second instruction indicates that the first acquisition mode is the single-frame polarized image synchronous acquisition operation, the image acquisition device acquires the polarized images of the at least one target in image acquisition mode 1.
When the second instruction indicates that the first acquisition mode is the multi-frame polarized image asynchronous acquisition operation, the image acquisition device acquires the polarized images of the at least one target in image acquisition mode 2.
That is, when the second instruction indicates that the first acquisition mode is the single-frame polarized image synchronous acquisition operation, the image acquisition device acquires a single image of the at least one target and obtains a plurality of polarization images in different polarization states from that single image.
When the second instruction indicates that the first acquisition mode is the multi-frame polarized image asynchronous acquisition operation, the image acquisition device acquires a plurality of images of the at least one target, each of which includes a plurality of polarization images in different polarization states.
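Image acquisition mode 1 can be sketched under the assumption of a division-of-focal-plane style sensor, in which every 2x2 super-pixel of the single frame carries the four polarization directions. The mosaic layout below is hypothetical; the text does not specify how the four polarization images are encoded in the single image.

```python
# Sketch of image acquisition mode 1 under an assumed 2x2 polarization mosaic:
#   [0°  45°]
#   [90° 135°]
def split_polarization_mosaic(frame):
    """Split one frame (list of pixel rows) into four sub-images keyed by angle."""
    angles = {(0, 0): 0, (0, 1): 45, (1, 0): 90, (1, 1): 135}
    subs = {a: [] for a in angles.values()}
    for r in range(0, len(frame), 2):
        rows = {a: [] for a in angles.values()}
        for c in range(0, len(frame[r]), 2):
            for (dr, dc), a in angles.items():
                rows[a].append(frame[r + dr][c + dc])
        for a in subs:
            subs[a].append(rows[a])
    return subs
```

A single captured frame thus yields the four first polarization images in one exposure, which is what makes the synchronous single-frame operation possible.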
It is noted that in embodiments of the present application, the image capturing device captures polarized images of at least one target upon determining that the illumination device provides illumination.
For example, the illumination device provides illumination for a period of time, and the image capture device captures polarized images of the at least one target for the period of illumination.
In one possible implementation, when the illumination device provides illumination for the at least one target in illumination mode 1, the illumination device sends trigger information to the image acquisition device, where the trigger information triggers the image acquisition device to acquire an image. In response to the trigger information, the image acquisition device acquires a polarized image of the at least one target.
When the illumination device provides illumination for the at least one target in illumination mode 2 and the image acquisition device acquires polarized images in image acquisition mode 2, the illumination device sends trigger information to the image acquisition device each time one polarized light source is used to illuminate the at least one target, triggering the image acquisition device to acquire the polarized image of the at least one target under that light source. In this way, the image acquisition device sequentially acquires the polarization images of the target under the illumination of each light source.
S402g, the image acquisition device sends the first polarization image to the image processing device. Accordingly, the image processing device receives the first polarization image from the image acquisition device.
Based on the technical scheme, the image processing device can acquire the first polarization image of at least one target according to the first illumination mode and the first acquisition mode.
In one possible implementation of S403:
When the first image processing mode includes the second polarization image acquisition operation, S403 can be implemented in the manner described in mode 1 below.
When the first image processing mode includes the qualitative polarization information acquisition operation, S403 can be implemented in the manner described in mode 2 below.
When the first image processing mode includes the quantitative polarization information acquisition operation, S403 can be implemented in the manner described in mode 3 below.
Modes 1, 2, and 3 are described in detail below:
Mode 1: the image processing device determines the second polarization image from the plurality of first polarization images of the target.
Specifically, the image processing apparatus determines a plurality of first polarization images of the target, the plurality of first polarization images having different polarization states.
The image processing device determines, among the plurality of first polarization images, the polarization image whose imaging effect satisfies a preset condition as the second polarization image.
The imaging effect satisfying the preset condition can be represented by at least one of the following: the sharpness of the target in the image satisfies a preset condition, or the contrast between the target and other objects in the image satisfies a preset condition. Alternatively, the imaging effect satisfying the preset condition may be expressed in other manners, which is not limited in this application.
In one example, in a glare scenario, a vehicle needs to determine a traffic sign line on the road ahead.
Fig. 6a shows the first polarization image with a 0° polarization direction acquired by the image acquisition device.
Fig. 6b shows the first polarization image with a 45° polarization direction acquired by the image acquisition device.
Fig. 6c shows the first polarization image with a 90° polarization direction acquired by the image acquisition device.
Fig. 6d shows the first polarization image with a 135° polarization direction acquired by the image acquisition device.
In fig. 6a to 6d, under the influence of glare, some traffic sign lines on the road may not be accurately displayed in the polarized images collected by the image acquisition device.
For example, in fig. 6a to 6d, the vehicle needs to recognize the traffic sign line on the road ahead but is affected by glare: the left-turn sign inside the frame lines in fig. 6a, 6b, and 6d may not be correctly recognized.
After receiving the first polarization images shown in fig. 6a to 6d, the image processing device processes them and determines that, among the 4 first polarization images, the one with the highest contrast is the first polarization image shown in fig. 6c. The image processing device therefore determines the first polarization image shown in fig. 6c as the second polarization image of the target, and outputs the second polarization image.
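The selection in mode 1 can be sketched as follows. RMS contrast is used here as one possible "imaging effect" metric; the text does not fix a specific metric, so both the metric and the function names are illustrative.

```python
# Sketch of mode 1: pick, among the first polarization images, the one with
# the highest contrast. RMS contrast is an assumed, not prescribed, metric.
def rms_contrast(pixels):
    """Root-mean-square contrast of a flat list of gray values."""
    mean = sum(pixels) / len(pixels)
    return (sum((p - mean) ** 2 for p in pixels) / len(pixels)) ** 0.5

def select_second_polarization_image(images):
    """images: dict polarization angle -> flat pixel list; return the winning angle."""
    return max(images, key=lambda a: rms_contrast(images[a]))
```

In the fig. 6 example, the 90° image would be the one returned, since it shows the left-turn sign with the highest contrast.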
Mode 2, the image processing apparatus determines qualitative polarization information of the target according to the plurality of first polarization images of the target.
The qualitative polarization information of the target includes at least: the Stokes vector of the target, the degree of polarization of the target, and the polarization angle of the target.
Specifically, the image processing apparatus acquires a first polarization image of a 0 ° polarization direction, a first polarization image of a 45 ° polarization direction, a first polarization image of a 90 ° polarization direction, and a first polarization image of a 135 ° polarization direction of at least one target.
The image processing device determines that the Stokes vector of the target satisfies the following formula 1:
$$S=\begin{pmatrix}S_0\\S_1\\S_2\\S_3\end{pmatrix}=\begin{pmatrix}I_0+I_{90}\\I_0-I_{90}\\I_{45}-I_{135}\\I_r-I_l\end{pmatrix}\qquad\text{(formula 1)}$$
where S is the Stokes vector of the target, and $S_0$, $S_1$, $S_2$, $S_3$ are the components of S. $I_0$ denotes the light intensity of the polarized light reflected by the target under illumination by the 0° polarized light source, $I_{45}$ the light intensity under illumination by the 45° polarized light source, $I_{90}$ the light intensity under illumination by the 90° polarized light source, $I_{135}$ the light intensity under illumination by the 135° polarized light source, $I_r$ the light intensity of the polarized light reflected by the target under right-handed illumination, and $I_l$ the light intensity under left-handed illumination.
When plane-polarized light propagates along the optical axis in some anisotropic media, its vibration direction rotates continuously as the propagation distance increases. Right-handed means that the vibration plane rotates clockwise when viewed along the light propagation direction; left-handed means that the vibration plane rotates counterclockwise when viewed along the light propagation direction.
After determining the Stokes vector of the target, the image processing device determines that the degree of polarization of the target satisfies the following formula 2.1:
$$\mathrm{DoLP}=\frac{\sqrt{S_1^2+S_2^2+S_3^2}}{S_0}\qquad\text{(formula 2.1)}$$
where DoLP is the degree of polarization of the target, and $S_0$, $S_1$, $S_2$, $S_3$ are the components of the Stokes vector determined in formula 1 above.
It should be noted that, in an automatic driving scenario, the light intensity $I_l$ of the polarized light reflected by the target under left-handed illumination and the light intensity $I_r$ under right-handed illumination are both 0, in which case the value of $S_3$ is 0.
In this case, the image processing device determines that the degree of polarization of the target satisfies the following formula 2.2:
$$\mathrm{DoLP}=\frac{\sqrt{S_1^2+S_2^2}}{S_0}\qquad\text{(formula 2.2)}$$
Polarization degree images are commonly used to enhance the contrast between targets of different reflectivity and the background.
In one example, the target is a vehicle in the image. The visible light image of the object acquired by the image processing apparatus is shown in fig. 7a, and the polarization degree image of the object acquired by the image processing apparatus is shown in fig. 7 b.
As the comparison between fig. 7a and fig. 7b shows, the polarization degree image of the target output by the image processing device greatly improves the contrast between the target and the background, which makes it easier for the image processing device to detect the target in the image.
The image processing device determines that the polarization angle of the target satisfies the following formula 3:
$$\mathrm{AoLP}=\frac{1}{2}\arctan\frac{S_2}{S_1}\qquad\text{(formula 3)}$$
where AoLP is the polarization angle of the target, and $S_1$, $S_2$ are components of the Stokes vector determined in formula 1 above.
Polarization angle images are typically used to enhance the contrast between targets with different surface orientations and different surface roughness.
In one example, the targets are the asphalt road surface and the dirt road surface in the image. The visible light image of the targets acquired by the image processing device is shown in fig. 8a, and the polarization angle image of the targets acquired by the image processing device is shown in fig. 8b.
As the comparison between fig. 8a and fig. 8b shows, the contrast between the asphalt road surface and the dirt road surface is greatly improved in the polarization angle image of the target output by the image processing device, which is more conducive to the vehicle's material-based segmentation of the road.
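Formulas 1, 2.1/2.2, and 3 can be sketched as per-pixel computations (in practice they are applied element-wise over whole images). `atan2` is used so the sign of $S_1$ is handled in formula 3; the function names are illustrative.

```python
import math

# Formula 1: Stokes vector from measured intensities.
# ir and il default to 0, as in the driving scenario above, where S3 = 0.
def stokes(i0, i45, i90, i135, ir=0.0, il=0.0):
    return (i0 + i90, i0 - i90, i45 - i135, ir - il)

# Formulas 2.1/2.2: degree of (linear) polarization.
def dolp(s):
    s0, s1, s2, s3 = s
    return math.sqrt(s1 ** 2 + s2 ** 2 + s3 ** 2) / s0

# Formula 3: polarization angle, in radians.
def aolp(s):
    _, s1, s2, _ = s
    return 0.5 * math.atan2(s2, s1)
```

For fully horizontally polarized light ($I_0 = 1$, $I_{90} = 0$, $I_{45} = I_{135} = 0.5$) this gives DoLP = 1 and AoLP = 0; for unpolarized light (all four intensities equal) it gives DoLP = 0.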
Mode 3, the image processing apparatus determines quantitative polarization information of the target from the plurality of first polarization images of the target.
Wherein the quantitative polarization information of the target at least comprises a mueller matrix of the target.
The image processing device determines that the Mueller matrix of the target satisfies the following formula 4:
$$S = M \times S' \qquad\text{(formula 4)}$$
where S′ is the input Stokes vector, S is the output Stokes vector, and M is the Mueller matrix.
In the embodiment of the present application, S is a stokes vector of a target determined by an image processing apparatus, that is, a stokes vector determined according to the above formula 1.
S′ is determined according to the polarization states of the polarized light sequentially provided by the illumination device in illumination mode 2.
Expressed in matrix form, formula 4 can be written as the following formula 5:
$$\begin{pmatrix}S_0\\S_1\\S_2\\S_3\end{pmatrix}=\begin{pmatrix}m_{11}&m_{12}&m_{13}&m_{14}\\m_{21}&m_{22}&m_{23}&m_{24}\\m_{31}&m_{32}&m_{33}&m_{34}\\m_{41}&m_{42}&m_{43}&m_{44}\end{pmatrix}\begin{pmatrix}S_0'\\S_1'\\S_2'\\S_3'\end{pmatrix}\qquad\text{(formula 5)}$$
In formula 5 above, the components $S_0'$, $S_1'$, $S_2'$, $S_3'$ of S′ satisfy the following formula 6:

$$S'=\begin{pmatrix}S_0'\\S_1'\\S_2'\\S_3'\end{pmatrix}=\begin{pmatrix}I_0'+I_{90}'\\I_0'-I_{90}'\\I_{45}'-I_{135}'\\I_r'-I_l'\end{pmatrix}\qquad\text{(formula 6)}$$

where $S_0'$, $S_1'$, $S_2'$, $S_3'$ are the components of the Stokes vector S′. $I_0'$ denotes the light intensity of the polarized light provided by the 0° polarized light source of the illumination device, $I_{45}'$ the light intensity provided by the 45° polarized light source, $I_{90}'$ the light intensity provided by the 90° polarized light source, and $I_{135}'$ the light intensity provided by the 135° polarized light source ($I_r'$ and $I_l'$ are the corresponding right- and left-handed intensities, which are 0 for the linearly polarized sources used here).
In formula 5 above, the Mueller matrix M satisfies the following formula 7:

$$M=\begin{pmatrix}m_{11}&m_{12}&m_{13}&m_{14}\\m_{21}&m_{22}&m_{23}&m_{24}\\m_{31}&m_{32}&m_{33}&m_{34}\\m_{41}&m_{42}&m_{43}&m_{44}\end{pmatrix}\qquad\text{(formula 7)}$$
Specifically, the image processing device derives formula 5 as follows:
Based on formulas 1 and 6 and the relationship between the incident light on the target (i.e., the polarized light provided by the illumination device) and the reflected light (i.e., the reflected light of the target collected by the image acquisition device), the image processing device determines that $S_0$, $S_1$, $S_2$, $S_3$ and $S_0'$, $S_1'$, $S_2'$, $S_3'$ satisfy the following formula 8:

$$\begin{pmatrix}I_0+I_{90}\\I_0-I_{90}\\I_{45}-I_{135}\\I_r-I_l\end{pmatrix}=\begin{pmatrix}m_{11}&m_{12}&m_{13}&m_{14}\\m_{21}&m_{22}&m_{23}&m_{24}\\m_{31}&m_{32}&m_{33}&m_{34}\\m_{41}&m_{42}&m_{43}&m_{44}\end{pmatrix}\begin{pmatrix}I_0'+I_{90}'\\I_0'-I_{90}'\\I_{45}'-I_{135}'\\I_r'-I_l'\end{pmatrix}\qquad\text{(formula 8)}$$
based on equation 8, further transforming equation 8 can obtain equation 5
Figure BDA0003375610110000223
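Formula 4 ($S = M \times S'$) can be sketched as a plain matrix-vector product. The ideal horizontal-polarizer Mueller matrix below is a textbook example used only to exercise the formula; it is not taken from this application.

```python
# Formula 4 applied directly: a 4x4 Mueller matrix acting on an input Stokes vector.
def mueller_apply(m, s_in):
    """Multiply the 4x4 Mueller matrix m by the input Stokes vector s_in."""
    return [sum(m[i][j] * s_in[j] for j in range(4)) for i in range(4)]

# Textbook Mueller matrix of an ideal linear polarizer with a horizontal
# transmission axis (an assumed example target, not from this application).
HORIZONTAL_POLARIZER = [
    [0.5, 0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
```

Applying this matrix to unpolarized light S′ = (1, 0, 0, 0) yields (0.5, 0.5, 0, 0): half the intensity, fully linearly polarized (DoLP = 1 by formula 2.1), as expected of an ideal polarizer.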
Based on the above technical solution, the image processing device can process the first polarization images of the target in different processing modes under different conditions and determine the corresponding polarization parameters of the target. This meets the vehicle's need to acquire different polarization parameters in different driving scenarios.
In one possible implementation, the first characteristic information corresponds to one of the following scenes 1-6. Scenes 1-6 are described in detail below:
Scene 1: the first characteristic information indicates that the ambient light intensity is less than a preset value and indicates acquisition of the second polarization image.
That is, in scene 1, the first characteristic information indicates: acquiring the second polarization image of the at least one target in a scene where the ambient light of the target is weak.
The second polarization image refers to a polarization image in which an imaging effect (e.g., sharpness, contrast) satisfies a preset condition among a plurality of polarization images of the object.
It should be noted that a scene with weak ambient light may also be referred to as a dim light scene.
In one example, the second polarization image may be a highest contrast polarization image of the plurality of polarization images of the target.
Accordingly, in this scene, with reference to S401 above, the first illumination operation determined by the image processing device includes multi-polarized light source synchronous illumination; the first acquisition mode includes the single-frame polarized image synchronous acquisition operation; and the first image processing mode includes the second polarization image acquisition operation.
That is to say, in a scene with weak ambient light, to acquire the second polarization image of the at least one target, the image processing device determines to use a plurality of polarized light sources to simultaneously provide illumination for the scene to which the target belongs; after the illumination, the image acquisition device acquires a single image of the scene and a plurality of first polarization images in different polarization states are determined from that single image. The image processing device determines the second polarization image from the plurality of first polarization images in different polarization states and outputs the second polarization image.
In one possible implementation manner, in combination with the above-mentioned S402a-S402f, in S402a, the first instruction generated by the image processing apparatus is used for instructing the first polarized illumination mode to be multi-polarized light source synchronous illumination.
In S402c, the lighting device provides lighting to the target in lighting pattern 1.
In S402d, the second instruction generated by the image processing apparatus is used to indicate that the first acquisition mode is a single-frame polarized image synchronous acquisition operation.
In S402f, the image capture device captures a polarized image of at least one target using image capture mode 1.
In conjunction with the above S403, in S403, the image processing apparatus processes the acquired first polarization image by using a polarization image acquisition operation to determine a second polarization image. The image processing device outputs a second polarization image.
Based on the above technical solution, in a scene with weak ambient light, when the second polarization image of the at least one target needs to be acquired, the illumination device provides multi-polarized light source synchronous illumination for the at least one target, which can increase the light intensity of the scene to which the at least one target belongs and alleviates the poor imaging effect of polarized images acquired in a dim-light scene.
The image acquisition device only acquires a single image of at least one target, so that the number of the images required to be acquired by the image acquisition device can be reduced, and further, the calculated amount of the image acquisition device is reduced.
The image processing device determines a second polarization image according to the plurality of first polarization images of the single image. The number of images that the image processing apparatus needs to process can also be reduced, thereby reducing the amount of computation of the image processing apparatus.
Scene 2: the first characteristic information indicates that the ambient light intensity is less than the preset value and indicates acquisition of qualitative polarization information.
That is, in scene 2, the first characteristic information indicates: acquiring the qualitative polarization information of the at least one target in a scene where the ambient light of the target is weak.
In this scene, the first illumination operation determined by the image processing device includes multi-polarized light source synchronous illumination; the first acquisition mode includes the single-frame polarized image synchronous acquisition operation; and the first image processing mode includes the qualitative polarization information acquisition operation.
That is to say, in a scene where the ambient light of the target is weak, to acquire the qualitative polarization information of the at least one target, the image processing device determines to use a plurality of polarized light sources to simultaneously provide illumination for the scene to which the target belongs; after the illumination, the image acquisition device acquires a single image of the scene and a plurality of first polarization images in different polarization states are determined from that single image. The image processing device then performs the qualitative polarization information acquisition operation on the plurality of first polarization images in different polarization states to determine the qualitative polarization information of the at least one target.
In one possible implementation manner, in combination with the above-mentioned S402a-S402f, in S402a, the first instruction generated by the image processing apparatus is used for instructing the first polarized illumination mode to be multi-polarized light source synchronous illumination.
In S402c, the lighting device provides lighting to the target in lighting pattern 1.
In S402d, the second instruction generated by the image processing apparatus is used to indicate that the first acquisition mode is a single-frame polarized image synchronous acquisition operation.
In S402f, the image capture device captures a polarized image of at least one target using image capture mode 1.
In conjunction with the above S403, in S403, the image processing apparatus processes the acquired first polarization image by using a qualitative polarization information acquisition operation to determine qualitative polarization information of at least one target. The image processing device outputs qualitative polarization information of at least one target.
Based on the above technical solution, in a scene with weak ambient light, when the qualitative polarization information of the at least one target needs to be acquired, the illumination device provides multi-polarized light source synchronous illumination for the at least one target, which can increase the light intensity of the scene to which the at least one target belongs and alleviates the poor imaging effect of polarized images acquired in a dim-light scene.
The image acquisition device only acquires a single image of at least one target, so that the number of the images required to be acquired by the image acquisition device can be reduced, and further, the calculated amount of the image acquisition device is reduced.
The image processing device determines qualitative polarization information of at least one target according to the plurality of first polarization images of the single image. The number of images that the image processing apparatus needs to process can also be reduced, thereby reducing the amount of computation of the image processing apparatus.
Scene 3: the first characteristic information indicates that the ambient light intensity is less than the preset value and indicates acquisition of quantitative polarization information.
That is, in scene 3, the first characteristic information indicates: acquiring the quantitative polarization information of the at least one target in a scene where the ambient light of the target is weak.
Accordingly, in this scene, the first illumination operation determined by the image processing device includes multi-polarized light source asynchronous illumination; the first acquisition mode includes the multi-frame polarized image asynchronous acquisition operation; and the first image processing mode includes the quantitative polarization information acquisition operation.
That is to say, in a scene with weak ambient light, to acquire the quantitative polarization information of the at least one target, the image processing device determines to use a plurality of polarized light sources to provide illumination for the scene to which the target belongs in sequence; after the illumination, the image acquisition device acquires a plurality of images of the scene, and first polarization images in different polarization states are determined from the plurality of images. The image processing device determines the Mueller matrix of the at least one target from the first polarization images in different polarization states, determines the quantitative polarization information of the at least one target according to the Mueller matrix, and outputs the quantitative polarization information of the at least one target.
In one possible implementation, with reference to the foregoing S402a-S402f: in S402a, the first instruction generated by the image processing device indicates that the first polarized illumination mode is non-synchronous illumination with multiple polarized light sources.
In S402c, the illumination device provides illumination for the target in illumination mode 2.
In S402d, the second instruction generated by the image processing device indicates that the first acquisition mode is the non-synchronous acquisition of multiple frames of polarization images.
In S402f, the image acquisition device acquires polarization images of the at least one target using image acquisition mode 2.
With reference to the foregoing S403: in S403, the image processing device processes the acquired first polarization images using the quantitative polarization information acquisition operation to determine the quantitative polarization information of the at least one target, and outputs the quantitative polarization information of the at least one target.
Based on this technical solution, in a scene with weak ambient light, when quantitative polarization information of at least one target needs to be acquired, the illumination device provides non-synchronous illumination with multiple polarized light sources for the at least one target, so that the image acquisition device can acquire the polarization images formed by reflection from the at least one target under the different polarized light sources. Based on these polarization images and the polarization information provided by the different polarized light sources, the image processing device can determine the Mueller matrix of the at least one target, and from that Mueller matrix determine the quantitative polarization information of the at least one target.
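As an illustration of the last step, the following sketch (our own, not part of the patented method; all numeric values are hypothetical) shows how a Mueller matrix can be recovered by least squares from measurements taken under several polarized light sources with known input polarization states:

```python
import numpy as np

# Illustrative sketch: recovering the Mueller matrix M of a target,
# given the known input Stokes vectors of the polarized light sources
# used for non-synchronous illumination and the output Stokes vectors
# measured from the reflected polarization images.

# One column per light source: known input Stokes vectors S_in.
S_in = np.array([
    [1.0,  1.0,  0.0,  0.0],   # horizontal linear polarization
    [1.0, -1.0,  0.0,  0.0],   # vertical linear polarization
    [1.0,  0.0,  1.0,  0.0],   # 45-degree linear polarization
    [1.0,  0.0,  0.0,  1.0],   # right circular polarization
]).T

# Output Stokes vectors measured for one pixel under each source
# (illustrative numbers only).
S_out = np.array([
    [1.0,  0.60, 0.10, 0.00],
    [1.0, -0.50, 0.00, 0.00],
    [1.0,  0.10, 0.55, 0.00],
    [1.0,  0.00, 0.00, 0.40],
]).T

# With four linearly independent inputs, M satisfies S_out = M @ S_in,
# so it can be solved by (pseudo-)inversion of S_in.
M = S_out @ np.linalg.pinv(S_in)
```

With more than four light sources, the pseudo-inverse yields the least-squares estimate of M, which is more robust to measurement noise.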
Scene 4: the first feature information indicates that the ambient light intensity is greater than or equal to the preset value, and indicates that a second polarization image is to be acquired.
That is, in scene 4, the first feature information indicates that a second polarization image of the at least one target is to be acquired in a scene in which the ambient light of the target is strong.
Accordingly, in this scene, the first illumination operation determined by the image processing device includes no illumination, the first acquisition mode includes the synchronous acquisition of a single frame of polarization image, and the first image processing mode includes the second polarization image acquisition operation.
That is, when the second polarization image of the at least one target is to be acquired in a scene with strong ambient light, it is determined that no light source is needed to illuminate the scene to which the target belongs; the image acquisition device directly acquires a single image of the scene and determines from it a plurality of first polarization images in different polarization states. The image processing device determines the second polarization image from the first polarization images in different polarization states and outputs the second polarization image.
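The step of obtaining several first polarization images in different polarization states from a single image is commonly implemented with a division-of-focal-plane (DoFP) polarization sensor, whose micro-polarizers are interleaved in a 2x2 pattern. A minimal sketch under that assumption (the 0/45/90/135-degree layout below is illustrative; the actual layout depends on the sensor):

```python
import numpy as np

# Hypothetical sketch: splitting a single DoFP frame into four first
# polarization images by strided slicing of the 2x2 micro-polarizer
# pattern.

def split_polarization_channels(raw):
    """Split a DoFP frame into 0, 45, 90 and 135 degree sub-images."""
    i0   = raw[0::2, 0::2]   # top-left pixel of each 2x2 cell
    i45  = raw[0::2, 1::2]   # top-right
    i90  = raw[1::2, 1::2]   # bottom-right
    i135 = raw[1::2, 0::2]   # bottom-left
    return i0, i45, i90, i135

raw = np.arange(16.0).reshape(4, 4)   # toy 4x4 single frame
i0, i45, i90, i135 = split_polarization_channels(raw)
# Each sub-image has a quarter of the pixels of the single frame.
```

Because all four polarization states come from one exposure, this matches the single-frame synchronous acquisition described above, at the cost of reduced spatial resolution per polarization channel.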
In one possible implementation, with reference to the foregoing S402a-S402f: in S402a, the first instruction generated by the image processing device indicates that the first polarized illumination mode is no illumination.
In S402c, the illumination device provides illumination for the target in illumination mode 3.
In S402d, the second instruction generated by the image processing device indicates that the first acquisition mode is the synchronous acquisition of a single frame of polarization image.
In S402f, the image acquisition device acquires a polarization image of the at least one target using image acquisition mode 1.
With reference to the foregoing S403: in S403, the image processing device processes the acquired first polarization images using the second polarization image acquisition operation to determine the second polarization image, and outputs the second polarization image.
Based on this technical solution, in a scene with strong ambient light, when the second polarization image of the at least one target needs to be acquired, the illumination device does not need to provide illumination for the at least one target, which reduces the power consumption of the illumination device.
The image acquisition device acquires only a single image of the at least one target, which reduces the number of images the image acquisition device needs to acquire and thus the computation load of the image acquisition device.
The image processing device determines the second polarization image from the plurality of first polarization images extracted from the single image, which likewise reduces the number of images the image processing device needs to process and thus the computation load of the image processing device.
Scene 5: the first feature information indicates that the ambient light intensity is greater than or equal to the preset value, and indicates that qualitative polarization information is to be acquired.
That is, in scene 5, the first feature information indicates that qualitative polarization information of the at least one target is to be acquired in a scene in which the ambient light of the target is strong.
Accordingly, in this scene, the first illumination operation determined by the image processing device includes no illumination, the first acquisition mode includes the synchronous acquisition of a single frame of polarization image, and the first image processing mode includes the qualitative polarization information acquisition operation.
That is, when the qualitative polarization information of the at least one target is to be acquired in a scene with strong ambient light, it is determined that no light source is needed to illuminate the scene to which the target belongs; the image acquisition device directly acquires a single image of the scene and determines from it a plurality of first polarization images in different polarization states. The image processing device determines the qualitative polarization information of the at least one target from the first polarization images in different polarization states and outputs the qualitative polarization information.
In one possible implementation, with reference to the foregoing S402a-S402f: in S402a, the first instruction generated by the image processing device indicates that the first polarized illumination mode is no illumination.
In S402c, the illumination device provides illumination for the target in illumination mode 3.
In S402d, the second instruction generated by the image processing device indicates that the first acquisition mode is the synchronous acquisition of a single frame of polarization image.
In S402f, the image acquisition device acquires a polarization image of the at least one target using image acquisition mode 1.
With reference to the foregoing S403: in S403, the image processing device processes the acquired first polarization images using the qualitative polarization information acquisition operation to determine the qualitative polarization information of the at least one target, and outputs the qualitative polarization information of the at least one target.
Based on this technical solution, in a scene with strong ambient light, when the qualitative polarization information of the at least one target needs to be acquired, the illumination device does not need to provide illumination for the at least one target, which reduces the power consumption of the illumination device.
The image acquisition device acquires only a single image of the at least one target, which reduces the number of images the image acquisition device needs to acquire and thus the computation load of the image acquisition device.
The image processing device determines the qualitative polarization information of the at least one target from the plurality of first polarization images extracted from the single image, which likewise reduces the number of images the image processing device needs to process and thus the computation load of the image processing device.
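Qualitative polarization information of this kind is commonly expressed as the degree and angle of linear polarization computed from the linear Stokes parameters of the first polarization images. A minimal sketch under that assumption (the Stokes relations are standard; the function names are our own):

```python
import numpy as np

# Hypothetical sketch of a qualitative polarization information
# acquisition operation: per-pixel degree of linear polarization (DoLP)
# and angle of linear polarization (AoLP) from four first polarization
# images at 0, 45, 90 and 135 degrees.

def qualitative_polarization(i0, i45, i90, i135):
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                        # 0/90-degree difference
    s2 = i45 - i135                      # 45/135-degree difference
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aolp = 0.5 * np.arctan2(s2, s1)      # radians
    return dolp, aolp

# A fully horizontally polarized pixel: everything passes the 0-degree
# analyzer, nothing passes 90 degrees, half passes 45 and 135 degrees.
dolp, aolp = qualitative_polarization(
    np.array([1.0]), np.array([0.5]), np.array([0.0]), np.array([0.5]))
# dolp is close to 1 (fully polarized), aolp close to 0 (horizontal).
```

Unlike the Mueller matrix of the quantitative case, these quantities need only a single frame, which is why the single-frame synchronous acquisition mode suffices here.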
Scene 6: the first feature information indicates that the ambient light intensity is greater than or equal to the preset value, and indicates that quantitative polarization information is to be acquired.
That is, in scene 6, the first feature information indicates that quantitative polarization information of the at least one target is to be acquired in a scene in which the ambient light of the target is strong.
Accordingly, in this scene, the first illumination operation determined by the image processing device includes non-synchronous illumination with multiple polarized light sources, the first acquisition mode includes the non-synchronous acquisition of multiple frames of polarization images, and the first image processing mode includes the quantitative polarization information acquisition operation.
That is, in a scene with strong ambient light, to acquire quantitative polarization information of the at least one target, the illumination device provides illumination for the scene to which the target belongs with a plurality of polarized light sources in sequence. The image acquisition device acquires a plurality of images of the illuminated scene and determines, from each of the plurality of images, a plurality of first polarization images of the scene in different polarization states. The image processing device determines the Mueller matrix of the at least one target from the first polarization images in different polarization states, determines the quantitative polarization information of the at least one target from that Mueller matrix, and outputs the quantitative polarization information of the at least one target.
In one possible implementation, with reference to the foregoing S402a-S402f: in S402a, the first instruction generated by the image processing device indicates that the first polarized illumination mode is non-synchronous illumination with multiple polarized light sources.
In S402c, the illumination device provides illumination for the target in illumination mode 2.
In S402d, the second instruction generated by the image processing device indicates that the first acquisition mode is the non-synchronous acquisition of multiple frames of polarization images.
In S402f, the image acquisition device acquires polarization images of the at least one target using image acquisition mode 2.
With reference to the foregoing S403: in S403, the image processing device processes the acquired first polarization images using the quantitative polarization information acquisition operation to determine the quantitative polarization information of the at least one target, and outputs the quantitative polarization information of the at least one target.
Based on this technical solution, in a scene with strong ambient light, when the quantitative polarization information of at least one target needs to be acquired, the illumination device provides non-synchronous illumination with multiple polarized light sources for the at least one target, so that the image acquisition device can acquire the polarization images formed by reflection from the at least one target under the different polarized light sources. Based on these polarization images and the polarization information provided by the different polarized light sources, the image processing device can determine the Mueller matrix of the at least one target, and from that Mueller matrix determine the quantitative polarization information of the at least one target.
It should be noted that the wavelength band of the polarized light provided by the illumination device and the wavelength band in which the image acquisition device acquires the polarization image of the target are not limited in the embodiments of this application.
For example, the illumination device may provide the various polarized light sources in the visible light band; alternatively, it may provide a polarized light source in the infrared (IR) band, or in another band. This is not limited in this application.
Similarly, when acquiring the polarization image of the target, the image acquisition device may acquire a visible-light polarization image of the target, a polarization image of the target in the IR band, or a polarization image of the target in another band. This is not limited in this application.
It should be noted that, to acquire polarization images in different wavelength bands, the image acquisition device may be provided with a channel coating for the corresponding band.
For example, to acquire the polarization image of the target in the IR band, an IR channel coating may be added to the image acquisition device.
It should be noted that the foregoing embodiments mainly take application of the image processing method to a vehicle as an example. In a specific implementation, the image processing method may also be applied to other devices having illumination, image acquisition, and image processing capabilities, such as a mobile phone, a tablet computer, or a camera with image processing capability. This is not limited in this application.
In addition, this application mainly takes as an example the case in which the image processing device, the illumination device, and the image acquisition device are provided separately. In a specific implementation, the image processing device, the illumination device, and the image acquisition device may also be integrated in the same piece of equipment. In that case, the implementation of the image processing method in the embodiments of this application is similar to that described in the foregoing embodiments, and is not repeated here.
All the schemes in the above embodiments of the present application can be combined without contradiction.
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of internal implementation of the device and interaction between the devices. It is to be understood that each of the devices, for example, the image processing device, the image acquisition device and the illumination device, includes at least one of a hardware structure and a software module corresponding to each function in order to realize the above-described functions. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments of the present application, the image processing device, the image acquisition device, and the illumination device may be divided into functional units according to the above method examples; for example, each functional unit may correspond to one function, or two or more functions may be integrated into one processing unit. The integrated unit may be implemented in the form of hardware, or in the form of a software functional unit. It should be noted that the division of units in the embodiments of the present application is schematic and is only a division by logical function; there may be other division manners in actual implementation.
In the case of an integrated unit, fig. 9 shows a schematic diagram of a possible structure of the processing device (denoted as processing device 90) in the above embodiment, where the processing device 90 includes a processing unit 901, a communication unit 902, and a storage unit 903. The schematic structural diagram shown in fig. 9 can be used to illustrate the structures of the image processing device, the image capturing device, and the illumination device in the above embodiments.
When the schematic configuration diagram shown in fig. 9 is used to illustrate the configuration of the image processing apparatus according to the above-described embodiment, the processing unit 901 is configured to control and manage the operation of the image processing apparatus, for example, to control the image processing apparatus to perform the operations performed by the image processing apparatus in S400 to S403 in fig. 4, S400a to S400c, S401, S402a, S402b, S402d, S402e, S402g, and S403 in fig. 5, and/or other processes described in the embodiment of the present application. The processing unit 901 may communicate with other device entities, e.g. with the image acquisition device and the lighting device shown in fig. 2, via the communication unit 902. The storage unit 903 is used to store program codes and data of the image processing apparatus.
When the schematic configuration diagram shown in fig. 9 is used to illustrate the configuration of the image processing apparatus according to the above embodiment, the processing apparatus 90 may be an image processing apparatus or a chip in the image processing apparatus.
When the schematic configuration shown in fig. 9 is used to illustrate the configuration of the image capturing apparatus in the above-described embodiment, the processing unit 901 is used to control and manage the actions of the image capturing apparatus, for example, control the image capturing apparatus to perform the actions performed by the image capturing apparatus in S402e to S402g in fig. 5, and/or other processes described in the embodiment of the present application. The processing unit 901 may communicate with other device entities, for example, the image processing device and the lighting device shown in fig. 2, through the communication unit 902. The storage unit 903 is used to store program codes and data of the image acquisition apparatus.
When the schematic structural diagram shown in fig. 9 is used to illustrate the structure of the image capturing device in the above embodiment, the processing device 90 may be an image capturing device or a chip in the image capturing device.
When the schematic configuration diagram shown in fig. 9 is used to illustrate the configuration of the lighting device in the above embodiment, the processing unit 901 is used to perform control management on the actions of the lighting device, for example, to control the lighting device to perform the actions performed by the lighting device in S402b and S402c in fig. 5, and/or in other processes described in the embodiments of the present application. The processing unit 901 may communicate with other device entities, for example, the image processing apparatus and the image capturing apparatus shown in fig. 2, through the communication unit 902. The storage unit 903 is used to store program codes and data of the lighting device.
When the schematic configuration diagram shown in fig. 9 is used to illustrate the configuration of the lighting device in the above embodiment, the processing device 90 may be a lighting device or a chip in the lighting device.
When the processing device 90 is an image processing device, an image capturing device or an illumination device, the processing unit 901 may be a processor or a controller, and the communication unit 902 may be a communication interface, a transceiver circuit, a transceiver device, or the like. The communication interface is a generic term, and may include one or more interfaces. The storage unit 903 may be a memory. When the processing device 90 is an image processing device, an image acquisition device or a chip in a lighting device, the processing unit 901 may be a processor or a controller, and the communication unit 902 may be an input interface and/or an output interface, a pin or a circuit, etc. The storage unit 903 may be a storage unit (e.g., a register, a cache, etc.) in the chip, or may also be a storage unit (e.g., a read-only memory (ROM), a Random Access Memory (RAM), etc.) in the image processing apparatus, the image capturing apparatus, or the lighting apparatus, which is located outside the chip.
The communication unit may also be referred to as a transceiver unit. The antenna and the control circuit having the transmitting and receiving functions in the processing device 90 may be regarded as a communication unit 902 of the processing device 90, and the processor having the processing function may be regarded as a processing unit 901 of the processing device 90. Optionally, a device in the communication unit 902 for implementing a receiving function may be regarded as a receiving unit, where the receiving unit is configured to perform the receiving step in the embodiment of the present application, and the receiving unit may be a receiver, a receiving circuit, and the like.
The integrated unit in fig. 9, if implemented in the form of a software functional module and sold or used as a separate product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, or an optical disc.
The elements in fig. 9 may also be referred to as modules, for example, the processing elements may be referred to as processing modules.
The embodiments of the present application further provide a schematic diagram of a hardware structure of a processing device (denoted as the processing device 100). Referring to fig. 10 or fig. 11, the processing device 100 includes a processor 1001 and, optionally, a memory 1002 connected to the processor 1001. The processor 1001 includes at least one processor.
In a first possible implementation, referring to fig. 10, the processing device 100 further comprises a communication interface 1003. The processor 1001, the memory 1002, and the communication interface 1003 are connected by a bus. The communication interface 1003 is used for communicating with other devices or communication networks. Optionally, communication interface 1003 may include a transmitter and a receiver. The device for implementing the receiving function in the communication interface 1003 may be regarded as a receiver for performing the receiving step in the embodiment of the present application. The means for implementing the transmission function in the communication interface 1003 may be regarded as a transmitter for performing the steps of transmission in the embodiment of the present application.
Based on a first possible implementation manner, the schematic structural diagram shown in fig. 10 may be used to illustrate the structure of the image processing apparatus, the image acquisition apparatus, or the lighting device in the above embodiments.
When the schematic configuration diagram shown in fig. 10 is used to illustrate the configuration of the image processing apparatus according to the above-described embodiment, the processor 1001 is configured to control and manage the operations of the image processing apparatus, and for example, the processor 1001 is configured to support the image processing apparatus to perform the operations performed by the image processing apparatus in S400 to S403 in fig. 4, S400a to S400c, S401, S402a, S402b, S402d, S402e, S402g, S403 in fig. 5, and/or other processes described in the embodiments of the present application. The processor 1001 may communicate with other device entities, such as the image capture device and the illumination device shown in fig. 2, through the communication interface 1003. The memory 1002 is used to store program codes and data of the image processing apparatus.
When the schematic configuration shown in fig. 10 is used to illustrate the configuration of the image capturing apparatus in the above embodiment, the processor 1001 is configured to control and manage the actions of the image capturing apparatus, for example, the processor 1001 is configured to support the image capturing apparatus to perform the actions performed by the image capturing apparatus in S402e to S402g in fig. 5 and/or other processes described in the embodiment of the present application. The processor 1001 may communicate with other device entities, for example, the image processing device and the lighting device shown in fig. 2, through the communication interface 1003. The memory 1002 is used to store program codes and data for the image acquisition apparatus. It should be noted that, when the processing apparatus 100 is used to illustrate the structure of the image capturing apparatus in the above embodiment, the processing apparatus 100 further includes a polarization image collector for collecting a polarization image of the target.
When the schematic configuration shown in fig. 10 is used to illustrate the configuration of the lighting device in the above embodiment, the processor 1001 is used to control and manage the actions of the lighting device, for example, the processor 1001 is used to support the lighting device to perform the actions performed by the lighting device in S402b and S402c in fig. 5 and/or other processes described in the embodiments of the present application. The processor 1001 may communicate with other device entities, for example, the image processing device and the image capturing device shown in fig. 2, through the communication interface 1003. The memory 1002 is used to store program codes and data for the lighting device. It should be noted that, when the processing device 100 is used to illustrate the structure of the illumination device in the above embodiments, the processing device 100 further includes a plurality of light sources with polarization states for providing polarized illumination for the target.
In a second possible implementation, the processor 1001 comprises logic circuits and at least one of an input interface and an output interface. Wherein the output interface is used for executing the sent action in the corresponding method, and the input interface is used for executing the received action in the corresponding method.
Based on the second possible implementation manner, referring to fig. 11, the schematic structural diagram shown in fig. 11 may be used to illustrate the structure of the image processing apparatus, the image capturing apparatus, or the lighting device in the above embodiments.
When the schematic configuration diagram shown in fig. 11 is used to illustrate the configuration of the image processing apparatus according to the above-described embodiment, the processor 1001 is configured to control and manage the operations of the image processing apparatus, and for example, the processor 1001 is configured to support the image processing apparatus to perform the operations performed by the image processing apparatus in S400 to S403 in fig. 4, S400a to S400c, S401, S402a, S402b, S402d, S402e, S402g, S403 in fig. 5, and/or other processes described in the embodiments of the present application. The processor 1001 may communicate with other device entities, for example, the image capture device and the illumination device shown in fig. 2, through at least one of the input interface and the output interface. The memory 1002 is used to store program codes and data of the image processing apparatus.
When the schematic configuration shown in fig. 11 is used to illustrate the configuration of the image capturing apparatus in the above embodiment, the processor 1001 is configured to control and manage the actions of the image capturing apparatus, for example, the processor 1001 is configured to support the image capturing apparatus to perform the actions performed by the image capturing apparatus in S402e to S402g in fig. 5 and/or other processes described in the embodiment of the present application. The processor 1001 may communicate with other device entities, for example, the image processing device and the lighting device shown in fig. 2, through at least one of the input interface and the output interface. The memory 1002 is used to store program codes and data for the image acquisition apparatus.
When the schematic configuration shown in fig. 11 is used to illustrate the configuration of the lighting device in the above embodiment, the processor 1001 is used to control and manage the actions of the lighting device, for example, the processor 1001 is used to support the lighting device to perform the actions performed by the lighting device in S402b and S402c in fig. 5 and/or other processes described in the embodiments of the present application. The processor 1001 may communicate with other device entities, for example, the image processing apparatus and the image capturing apparatus shown in fig. 2, through at least one of the input interface and the output interface. The memory 1002 is used to store program codes and data for the lighting device.
Fig. 10 and 11 may also illustrate a system chip in the image processing apparatus. In this case, the actions executed by the image processing apparatus may be implemented by the system chip, and the specific actions executed may be referred to above and are not described herein again. Fig. 10 and 11 may also illustrate a system chip in the image acquisition apparatus. In this case, the actions executed by the image acquisition device may be implemented by the system chip, and the specific actions executed may be referred to above and are not described herein again. Fig. 10 and 11 may also illustrate a system chip in the lighting device. In this case, the actions performed by the lighting device may be implemented by the system chip, and the specific actions performed may be referred to above and are not described herein again.
In implementation, the steps of the methods provided by the embodiments may be completed by hardware integrated logic circuits in a processor or by instructions in the form of software. The steps of the methods disclosed in connection with the embodiments of the present application may be carried out directly by a hardware processor, or by a combination of hardware and software modules in a processor.
Processors in the present application may include, but are not limited to, at least one of the following: various computing devices that run software, such as a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a microcontroller unit (MCU), or an artificial intelligence processor. Each such device may include one or more cores for executing software instructions to perform operations or processing. The processor may be a single semiconductor chip, or may be integrated with other circuits to form a semiconductor chip; for example, it may form a system on chip (SoC) with other circuits (such as a codec circuit, a hardware acceleration circuit, or various bus and interface circuits), or may be integrated into an application-specific integrated circuit (ASIC) as a built-in processor, where the ASIC may be packaged separately or together with other circuits. In addition to the cores for executing software instructions, the processor may further include necessary hardware accelerators, such as field programmable gate arrays (FPGAs), programmable logic devices (PLDs), or logic circuits implementing dedicated logic operations.
The memory in the embodiments of the present application may include at least one of the following types: read-only memory (ROM) or other types of static storage devices that can store static information and instructions, random access memory (RAM) or other types of dynamic storage devices that can store information and instructions, and electrically erasable programmable read-only memory (EEPROM). In some scenarios, the memory may also be, but is not limited to, a compact disc read-only memory (CD-ROM) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, and the like), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Embodiments of the present application also provide a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform any of the above methods.
Embodiments of the present application also provide a computer program product containing instructions which, when run on a computer, cause the computer to perform any of the methods described above.
An embodiment of the present application further provides an image processing system, comprising: the image processing apparatus, the image acquisition apparatus, and the illumination apparatus described above.
Embodiments of the present application further provide a chip, where the chip includes a processor and an interface circuit, where the interface circuit is coupled to the processor, the processor is configured to execute a computer program or instructions to implement the method, and the interface circuit is configured to communicate with other modules outside the chip.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented using software, the embodiments may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. The procedures or functions described in accordance with the embodiments of the present application are generated in whole or in part when the computer program instructions are loaded and executed on a computer. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Finally, it should be noted that the above description is only a specific embodiment of the present application, and the protection scope of the present application is not limited thereto; any change or substitution within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (39)

1. An image processing method, comprising:
acquiring first characteristic information; the first characteristic information is used for representing scene characteristics and indicating the acquisition of polarization information of at least one target, and corresponds to a first illumination mode, a first acquisition mode and a first image processing mode;
controlling polarized light illumination according to the first illumination mode, and controlling and acquiring a first polarized image of the at least one target under the first illumination mode according to the first acquisition mode;
and processing the first polarization image according to the first image processing mode to acquire the polarization information of the at least one target.
2. The method of claim 1, wherein after the obtaining the first feature information, the method further comprises:
and determining the first illumination mode, the first acquisition mode and the first image processing mode according to the first characteristic information.
3. The method according to claim 1 or 2, wherein the scene characteristics comprise light intensity information of the environment.
4. The method according to any of claims 1-3, wherein the polarization information comprises at least one of: a second polarization image, qualitative polarization information, or quantitative polarization information.
5. The method of any of claims 1-4, wherein the first illumination mode belongs to a set of polarized illumination modes, the set of polarized illumination modes comprising at least one of: multi-polarized light source synchronous illumination, multi-polarized light source asynchronous illumination, or no illumination;
wherein the multi-polarized light source synchronous illumination is: providing polarized illumination by a plurality of polarized light sources of different polarization states simultaneously;
the multi-polarized light source asynchronous illumination is: providing polarized illumination by a plurality of polarized light sources of different polarization states in sequence;
and the no illumination is: providing no illumination.
6. The method according to any one of claims 1-5, wherein the first acquisition mode belongs to a set of polarization acquisition modes, the set of polarization acquisition modes comprising: a single-frame polarized image synchronous acquisition operation and a multi-frame polarized image asynchronous acquisition operation;
wherein the single-frame polarized image synchronous acquisition operation comprises: acquiring a single image of the at least one target, and obtaining a plurality of polarization images in different polarization states from the single image;
and the multi-frame polarized image asynchronous acquisition operation comprises: acquiring a plurality of images of the at least one target, wherein each image of the plurality of images comprises a plurality of polarization images in different polarization states.
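The single-frame synchronous acquisition above is commonly realized with a division-of-focal-plane sensor, where a micro-polarizer mosaic over the pixel array lets several polarization images be sliced out of one raw frame. A minimal sketch in Python; the 2x2 mosaic layout, function name, and shapes are illustrative assumptions, not details taken from the patent:

```python
import numpy as np

def demosaic_polarization(raw: np.ndarray) -> dict:
    """Split one raw frame from a division-of-focal-plane sensor into
    four polarization images (single-frame synchronous acquisition).

    Assumes a hypothetical 2x2 micro-polarizer layout (degrees):
        [[ 90, 45],
         [135,  0]]
    Real sensors may use a different arrangement.
    """
    if raw.shape[0] % 2 or raw.shape[1] % 2:
        raise ValueError("raw frame must have even dimensions")
    # Strided slicing pulls out one mosaic position per channel.
    return {
        90:  raw[0::2, 0::2],
        45:  raw[0::2, 1::2],
        135: raw[1::2, 0::2],
        0:   raw[1::2, 1::2],
    }

# Example: a 4x4 raw frame yields four 2x2 polarization images.
raw = np.arange(16, dtype=float).reshape(4, 4)
imgs = demosaic_polarization(raw)
print(imgs[0].shape)  # (2, 2)
```

Slicing by stride halves the resolution in each dimension; practical pipelines typically interpolate each channel back to full resolution afterwards.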
7. The method according to any one of claims 1-6, wherein the first image processing mode belongs to a set of polarized image processing modes, the set of polarized image processing modes comprising: a polarization image obtaining operation, a qualitative polarization information obtaining operation, and a quantitative polarization information obtaining operation;
wherein the polarization image obtaining operation is used to obtain the second polarization image, and the second polarization image belongs to polarization images of the first polarization image in different polarization states;
the qualitative polarization information obtaining operation is used to obtain at least one of the following polarization information of the at least one target: a Stokes vector, a degree of polarization, or an angle of polarization;
and the quantitative polarization information obtaining operation is used to obtain a Mueller matrix of the at least one target.
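For the qualitative quantities named above, the linear Stokes parameters, degree of linear polarization (DoLP), and angle of polarization (AoP) follow from four intensity images taken behind linear polarizers at 0°, 45°, 90°, and 135°. The relations below are standard linear polarimetry, not a formula quoted from the patent, and all names are illustrative:

```python
import numpy as np

def qualitative_polarization(i0, i45, i90, i135):
    """Compute linear Stokes parameters, DoLP, and AoP from four
    intensity images at polarizer angles 0/45/90/135 degrees."""
    i0, i45, i90, i135 = (np.asarray(x, dtype=float)
                          for x in (i0, i45, i90, i135))
    s0 = 0.5 * (i0 + i45 + i90 + i135)   # total intensity
    s1 = i0 - i90                         # horizontal vs. vertical
    s2 = i45 - i135                       # +45 vs. -45
    dolp = np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)
    aop = 0.5 * np.arctan2(s2, s1)        # radians
    return s0, s1, s2, dolp, aop

# Fully horizontally polarized light: I0 = 1, I90 = 0, I45 = I135 = 0.5
s0, s1, s2, dolp, aop = qualitative_polarization(1.0, 0.5, 0.0, 0.5)
print(s0, s1, s2)   # 1.0 1.0 0.0
print(dolp, aop)    # 1.0 0.0
```

These per-pixel quantities are what a qualitative polarization information obtaining operation would produce from a demosaiced single frame.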
8. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of the second polarization image; the first illumination mode comprises: the multi-polarized light source synchronous illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the polarization image obtaining operation.
9. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of qualitative polarization information; the first illumination mode comprises: the multi-polarized light source synchronous illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the qualitative polarization information obtaining operation.
10. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of quantitative polarization information; the first illumination mode comprises: the multi-polarized light source asynchronous illumination; the first acquisition mode comprises: the multi-frame polarized image asynchronous acquisition operation; and the first image processing mode comprises: the quantitative polarization information obtaining operation.
11. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of the second polarization image; the first illumination mode comprises: no illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the polarization image obtaining operation.
12. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of qualitative polarization information; the first illumination mode comprises: no illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the qualitative polarization information obtaining operation.
13. The method according to any one of claims 1-7, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of quantitative polarization information; the first illumination mode comprises: the multi-polarized light source asynchronous illumination; the first acquisition mode comprises: the multi-frame polarized image asynchronous acquisition operation; and the first image processing mode comprises: the quantitative polarization information obtaining operation.
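Claims 8-13 together define a lookup from the scene characteristic (ambient light below or above a preset value) and the requested polarization information to an illumination/acquisition/processing triple. A hedged sketch of that dispatch; the threshold value and all identifiers are illustrative assumptions, not from the patent:

```python
# Maps (ambient light below threshold?, requested info) to a
# (illumination, acquisition, processing) triple, per claims 8-13.
MODE_TABLE = {
    (True,  "image"):        ("sync_multi_source",  "single_frame", "polarization_image"),
    (True,  "qualitative"):  ("sync_multi_source",  "single_frame", "qualitative"),
    (True,  "quantitative"): ("async_multi_source", "multi_frame",  "quantitative"),
    (False, "image"):        ("no_illumination",    "single_frame", "polarization_image"),
    (False, "qualitative"):  ("no_illumination",    "single_frame", "qualitative"),
    (False, "quantitative"): ("async_multi_source", "multi_frame",  "quantitative"),
}

def select_modes(ambient_lux: float, requested: str, threshold: float = 50.0):
    """Return the (illumination, acquisition, processing) triple for a scene."""
    return MODE_TABLE[(ambient_lux < threshold, requested)]

print(select_modes(10.0, "image"))
# ('sync_multi_source', 'single_frame', 'polarization_image')
```

Note that the quantitative (Mueller matrix) path needs controlled illumination in every case, since the matrix is defined relative to known input polarization states.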
14. An image processing apparatus characterized by comprising: a processing unit and an acquisition unit;
the acquiring unit is used for acquiring first characteristic information; the first characteristic information is used for representing scene characteristics and indicating the acquisition of polarization information of at least one target, and corresponds to a first illumination mode, a first acquisition mode and a first image processing mode;
the processing unit is used for controlling polarized light illumination according to the first illumination mode and controlling and acquiring a first polarized image of the at least one target under the first illumination mode according to the first acquisition mode;
the processing unit is further configured to process the first polarization image according to the first image processing manner, and acquire polarization information of the at least one target.
15. The apparatus of claim 14, wherein the processing unit is further configured to:
and determining the first illumination mode, the first acquisition mode and the first image processing mode according to the first characteristic information.
16. The apparatus according to claim 15, wherein the processing unit is specifically configured to:
generating a first instruction and a second instruction; the first instruction is used for indicating the first illumination mode; the second instruction is used for indicating the first acquisition mode;
and instructing the acquisition unit to send the first instruction to an illumination device and to send the second instruction to an image acquisition device.
17. The apparatus according to claim 16, wherein the processing unit is further configured to:
instructing the acquisition unit to receive the first polarization image from the image acquisition device.
18. The apparatus of any of claims 14-17, wherein the scene characteristics comprise light intensity information of the environment.
19. The apparatus according to any of claims 14-18, wherein the polarization information comprises at least one of: a second polarization image, qualitative polarization information, or quantitative polarization information.
20. The apparatus of any one of claims 14-19, wherein the first illumination mode belongs to a set of polarized illumination modes, the set of polarized illumination modes comprising at least one of: multi-polarized light source synchronous illumination, multi-polarized light source asynchronous illumination, or no illumination;
wherein the multi-polarized light source synchronous illumination is: providing polarized illumination by a plurality of polarized light sources of different polarization states simultaneously;
the multi-polarized light source asynchronous illumination is: providing polarized illumination by a plurality of polarized light sources of different polarization states in sequence;
and the no illumination is: providing no illumination.
21. The apparatus according to any one of claims 14-20, wherein the first acquisition mode belongs to a set of polarization acquisition modes, the set of polarization acquisition modes comprising: a single-frame polarized image synchronous acquisition operation and a multi-frame polarized image asynchronous acquisition operation;
wherein the single-frame polarized image synchronous acquisition operation comprises: acquiring a single image of the at least one target, and obtaining a plurality of polarization images in different polarization states from the single image;
and the multi-frame polarized image asynchronous acquisition operation comprises: acquiring a plurality of images of the at least one target, wherein each image of the plurality of images comprises a plurality of polarization images in different polarization states.
22. The apparatus according to any one of claims 14-21, wherein the first image processing mode belongs to a set of polarized image processing modes, the set of polarized image processing modes comprising: a polarization image obtaining operation, a qualitative polarization information obtaining operation, and a quantitative polarization information obtaining operation;
wherein the polarization image obtaining operation is used to obtain the second polarization image, and the second polarization image belongs to polarization images of the first polarization image in different polarization states;
the qualitative polarization information obtaining operation is used to obtain at least one of the following polarization information of the at least one target: a Stokes vector, a degree of polarization, or an angle of polarization;
and the quantitative polarization information obtaining operation is used to obtain a Mueller matrix of the at least one target.
23. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of the second polarization image; the first illumination mode comprises: the multi-polarized light source synchronous illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the polarization image obtaining operation.
24. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of qualitative polarization information; the first illumination mode comprises: the multi-polarized light source synchronous illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the qualitative polarization information obtaining operation.
25. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is less than a preset value and indicates acquisition of quantitative polarization information; the first illumination mode comprises: the multi-polarized light source asynchronous illumination; the first acquisition mode comprises: the multi-frame polarized image asynchronous acquisition operation; and the first image processing mode comprises: the quantitative polarization information obtaining operation.
26. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of the second polarization image; the first illumination mode comprises: no illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the polarization image obtaining operation.
27. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of qualitative polarization information; the first illumination mode comprises: no illumination; the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation; and the first image processing mode comprises: the qualitative polarization information obtaining operation.
28. The apparatus according to any one of claims 14-22, wherein the first characteristic information characterizes that the ambient light intensity is greater than or equal to a preset value and indicates acquisition of quantitative polarization information; the first illumination mode comprises: the multi-polarized light source asynchronous illumination; the first acquisition mode comprises: the multi-frame polarized image asynchronous acquisition operation; and the first image processing mode comprises: the quantitative polarization information obtaining operation.
29. An illumination device, comprising: a plurality of polarized light sources of different polarization states, at least one processor, and a communication interface;
the communication interface is used for receiving a first instruction from the image processing device, and the first instruction is used for indicating a first illumination mode; the first illumination mode is used for representing a mode of providing illumination by adopting the plurality of polarized light sources with different polarization states;
the at least one processor is configured to control the plurality of polarized light sources of different polarization states to provide illumination according to the first instruction.
30. The apparatus of claim 29, wherein the first illumination mode belongs to a set of polarized illumination modes, the set of polarized illumination modes comprising at least one of: multi-polarized light source synchronous illumination, multi-polarized light source asynchronous illumination, or no illumination;
wherein the multi-polarized light source synchronous illumination is as follows: simultaneously providing polarized illumination by a plurality of polarized light sources of different polarization states;
the non-synchronous illumination of the multi-polarization light source is as follows: providing polarized illumination by a plurality of polarized light sources of different polarization states in sequence;
the no-lighting is: no illumination is provided.
31. The apparatus of claim 30, wherein the first illumination mode comprises: the multi-polarized light source synchronous illumination;
the processor is specifically configured to: control one or more of the plurality of polarized light sources of different polarization states to provide polarized illumination.
32. The apparatus of claim 30, wherein the first illumination mode comprises: the multi-polarized light source asynchronous illumination;
the processor is specifically configured to: control each of the plurality of polarized light sources of different polarization states to provide polarized illumination in sequence.
33. The apparatus of claim 30, wherein the first illumination mode comprises: no illumination;
the processor is specifically configured to: control none of the plurality of polarized light sources of different polarization states to provide illumination.
34. An image acquisition apparatus, comprising: the system comprises a polarization image collector, at least one processor and a communication interface;
the communication interface is used for receiving a second instruction from the image processing device, and the second instruction is used for indicating a first acquisition mode; the first acquisition mode is used for representing a mode of acquiring a polarization image;
and the at least one processor is used for controlling the polarization image collector to collect the polarization image according to the second instruction.
35. The apparatus of claim 34, wherein the first acquisition mode belongs to a set of polarization acquisition modes, the set of polarization acquisition modes comprising: a single-frame polarized image synchronous acquisition operation and a multi-frame polarized image asynchronous acquisition operation;
the single-frame polarized image synchronous acquisition operation comprises the following steps: acquiring a single image of at least one target, and acquiring a plurality of polarization images in different polarization states from the single image;
the asynchronous collecting operation of the multi-frame polarized image comprises the following steps: acquiring a plurality of images of the at least one target, wherein each image of the plurality of images comprises a plurality of polarization images with different polarization states.
36. The apparatus of claim 35, wherein the first acquisition mode comprises: the single-frame polarized image synchronous acquisition operation;
the processor is specifically configured to control the polarization image collector to collect a single image of the at least one target according to the second instruction, and obtain a plurality of polarization images in different polarization states from the single image.
37. The apparatus of claim 35, wherein the first acquisition mode comprises: the multi-frame polarized image asynchronous acquisition operation;
the processor is specifically configured to control, according to the second instruction, the polarization image collector to collect images of the at least one target multiple times to obtain a plurality of images of the at least one target, and to obtain a plurality of polarization images in different polarization states from each of the plurality of images.
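The multi-frame asynchronous acquisition of claim 37 supplies the data for the quantitative (Mueller matrix) processing named in claims 7 and 22: each frame observes the target under a different, known illumination polarization state. A generic least-squares recovery of the Mueller matrix from such measurements might look as follows; this is standard polarimetry math, not the patent's specific procedure, and all names are illustrative:

```python
import numpy as np

def estimate_mueller(s_in: np.ndarray, s_out: np.ndarray) -> np.ndarray:
    """Least-squares estimate of a 4x4 Mueller matrix M from K probing
    illumination states (asynchronous multi-frame acquisition).

    s_in : (4, K) known input Stokes vectors, one per illumination state.
    s_out: (4, K) measured output Stokes vectors for the same states.
    Solves M @ s_in ~ s_out; needs K >= 4 well-conditioned states.
    """
    return s_out @ np.linalg.pinv(s_in)

# Sanity check: an ideal horizontal linear polarizer has a known
# Mueller matrix; probing with 4 independent states recovers it.
M_true = 0.5 * np.array([[1, 1, 0, 0],
                         [1, 1, 0, 0],
                         [0, 0, 0, 0],
                         [0, 0, 0, 0]], dtype=float)
s_in = np.array([[1, 1, 1, 1],
                 [1, -1, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 1]], dtype=float)  # H, V, +45, circular probes
s_out = M_true @ s_in
M_est = estimate_mueller(s_in, s_out)
print(np.allclose(M_est, M_true))  # True
```

The pseudoinverse makes the same code work for K > 4 probing states, where the extra frames average down measurement noise.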
38. A computer-readable storage medium, comprising a computer program or instructions which, when run on a computer, cause the computer to perform the method of any one of claims 1-13.
39. A terminal device comprising one or more of: the image processing device of any one of claims 14 to 28, the illumination device of any one of claims 29 to 33, or the image capturing device of any one of claims 34 to 37.
CN202180003570.4A 2021-05-08 2021-05-08 Image processing method and device Active CN113924768B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/092444 WO2022236492A1 (en) 2021-05-08 2021-05-08 Image processing method and apparatus

Publications (2)

Publication Number Publication Date
CN113924768A true CN113924768A (en) 2022-01-11
CN113924768B CN113924768B (en) 2022-12-13

Family

ID=79248997

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202180003570.4A Active CN113924768B (en) 2021-05-08 2021-05-08 Image processing method and device

Country Status (2)

Country Link
CN (1) CN113924768B (en)
WO (1) WO2022236492A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100283883A1 (en) * 2008-06-26 2010-11-11 Panasonic Corporation Image processing apparatus, image division program and image synthesising method
US20150219498A1 (en) * 2014-02-06 2015-08-06 The Boeing Company Systems and Methods for Measuring Polarization of Light in Images
US20180130444A1 (en) * 2016-11-07 2018-05-10 Electronics And Telecommunications Research Institute Information image display apparatus and method
WO2020160439A1 (en) * 2019-02-01 2020-08-06 Children's National Medical Center System and method for intraoperative, non-invasive nerve identification using snapshot polarimetry
CN111562223A (en) * 2019-03-25 2020-08-21 上海昊量光电设备有限公司 Polarizing imaging device and method based on micro-polarizer array

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070146634A1 (en) * 2005-12-22 2007-06-28 Leblanc Richard A Illumination characteristic selection system for imaging during an ophthalmic laser procedure and associated methods
CN105946522B (en) * 2016-05-09 2018-01-09 京东方科技集团股份有限公司 One kind polarization light regulation method and device
CN108548603A (en) * 2018-04-12 2018-09-18 中国科学院光电技术研究所 A kind of non co axial four-way polarization imaging method and system
KR102008249B1 (en) * 2018-06-25 2019-09-04 사이정보통신(주) Apparatus and method for automatic polarization control
US20220146434A1 (en) * 2019-04-03 2022-05-12 Sony Group Corporation Image processing apparatus, information generation apparatus, and method thereof
CN112163586A (en) * 2020-09-30 2021-01-01 北京环境特性研究所 Feature extraction method and device of target object and computer readable medium

Also Published As

Publication number Publication date
CN113924768B (en) 2022-12-13
WO2022236492A1 (en) 2022-11-17

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant