WO2022105381A1 - Method and apparatus for adjusting exposure parameters - Google Patents

Method and apparatus for adjusting exposure parameters

Info

Publication number
WO2022105381A1
Authority
WO
WIPO (PCT)
Prior art keywords
light source
image
target light
moment
grayscale
Prior art date
Application number
PCT/CN2021/117598
Other languages
English (en)
Chinese (zh)
Inventor
林培埌
姜艺
余本德
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date
Filing date
Publication date
Application filed by 华为技术有限公司 (Huawei Technologies Co., Ltd.)
Publication of WO2022105381A1

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 Circuitry for compensating brightness variation in the scene
    • H04N23/73 Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • the present application relates to the field of imaging technology, and in particular, to a method and device for adjusting exposure parameters.
  • The camera device provides road-image input for autonomous driving algorithms and plays an important role in autonomous driving.
  • In an advanced driving assistance system (ADAS), functions such as lane departure warning (LDW), blind spot monitoring (BSM), parking assist (PA), surround view parking (SVP), traffic sign recognition (TSR), and lane keeping assist (LKA) are highly dependent on the camera device.
  • In L3/L4/L5 autonomous driving, functions such as obstacle recognition, traffic light recognition, and lane line detection must also use camera devices.
  • Therefore, the requirements for the camera device are relatively high: first, the quality of the images captured by the camera device must be high; second, the camera device needs a high capture frame rate. If image quality is poor, it affects the neural network algorithms that recognize the image content, which can easily lead to accidents.
  • Taking a vehicle speed of 100 km/h and a camera frame rate of 10 Hz as an example, the car travels 2.78 m within the 100 ms interval of one frame. Over this 2.78 m the car takes no pictures, which is equivalent to having no perception of the car's surroundings. Such a driving scene is undoubtedly very dangerous.
  • In addition, lighting conditions on the road are complex, especially at night when meeting other cars with their lights turned on around an intersection, or when facing the flash of road capture devices: the brightness of the light changes greatly, so the images captured by the camera are prone to overexposure or underexposure.
  • The camera device cannot adjust the exposure parameters in time in these scenarios, so clear images cannot be obtained; as a result, the various perception algorithms fail to recognize the surrounding environment, which easily creates driving hazards.
  • In view of this, the present application provides an exposure parameter adjustment method and device, which determine the influence of the target light source on the imaging device by predicting the position of the target light source, so that in scenes with drastic light changes the exposure parameters can be adjusted in time to obtain clear images.
  • In a first aspect, an embodiment of the present application provides an exposure parameter adjustment method; the method can be executed by an electronic device that manages a camera device (or by the camera device itself).
  • the method may include the following steps:
  • The electronic device acquires a first image captured by the camera at a first moment; determines a target light source and acquires first position information of the target light source, wherein the first position information is used to indicate the position of the target light source in the first image, or to indicate the physical position of the target light source at the first moment; predicts, according to the first position information, second position information of the target light source in a second image to be captured by the camera at a second moment, wherein the second moment is after the first moment; generates a predicted image of the second image according to the first image and the second position information; calculates the grayscale value of the first image and the grayscale value of the predicted image; and adjusts the exposure parameters of the camera according to the grayscale value of the first image and the grayscale value of the predicted image.
  • In the above method, the electronic device acquires the first image captured by the camera at the first moment and generates a predicted image of the second image to be captured at the second moment. Because the second position information of the target light source at the second moment can be predicted, the predicted image can be generated from that information, and the exposure parameters of the camera can be adjusted according to the change between the grayscale value of the first image and the grayscale value of the predicted image. In this way, this solution determines the influence of the position of the target light source on the imaging device by predicting that position; in the face of scenes with drastic changes in light, the exposure parameters can be adjusted in time to obtain clear images and ensure driving safety.
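  • For illustration, this control loop can be sketched in a few lines of Python. This is a minimal sketch, not the claimed implementation: the camera API and the helper callables are hypothetical stand-ins for the designs detailed below.

```python
import numpy as np

def exposure_adjustment_step(camera, predict_position, render_prediction, adjust):
    """One iteration of the exposure-adjustment loop of the first aspect.

    camera            -- hypothetical object with capture() and exposure settings
    predict_position  -- returns the target light source's position at the second moment
    render_prediction -- builds the predicted image from the first image and that position
    adjust            -- maps the grayscale change to new exposure parameters
    """
    first_image = camera.capture()                    # first image, at the first moment
    pos2 = predict_position(first_image)              # second position information
    predicted = render_prediction(first_image, pos2)  # predicted image of the second image
    g1 = float(np.mean(first_image))                  # grayscale value of the first image
    g2 = float(np.mean(predicted))                    # grayscale value of the predicted image
    adjust(camera, g2 - g1)                           # adjust exposure before the second moment
```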
  • In one design, before determining the target light source, the electronic device further performs the following steps:
  • Obtain brightness information of at least one light source, wherein the brightness information of any light source at the first moment is used to represent the brightness of that light source at the first moment; and determine, according to the brightness information of the at least one light source, the target light source whose brightness exceeds a set brightness threshold from the at least one light source.
  • In this way, the target light source whose brightness exceeds the set brightness threshold is selected from the at least one light source, which avoids computing position changes for light sources that do not matter for the adjustment, reduces the amount of calculation, and enables faster adjustment of the exposure parameters.
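  • A minimal sketch of this screening step, assuming each candidate light source is described by a boolean mask of its region in the first image (all names and the threshold value are illustrative):

```python
import numpy as np

def select_target_light_sources(first_image, source_masks, brightness_threshold=200):
    """Keep only the light sources whose brightness exceeds the set threshold.

    first_image  -- 2-D grayscale array captured at the first moment
    source_masks -- dict mapping a light-source id to a boolean mask of its area
    """
    targets = []
    for source_id, mask in source_masks.items():
        brightness = first_image[mask].mean()  # brightness from the region's gray values
        if brightness > brightness_threshold:
            targets.append(source_id)
    return targets
```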
  • In one design, obtaining the brightness information of the at least one light source includes: determining the brightness of the at least one light source at the first moment according to the gray value of the area where each light source is located in the first image; or receiving the brightness of the at least one light source at the first moment sent by other devices.
  • In this way, the target light source to be predicted can be selected from a larger screening range, avoiding the problem that the position of some light sources at the second moment goes unpredicted and the adjustment of the exposure parameters is affected.
  • In one design, the first position information is used to indicate the position of the target light source in the first image, and predicting, according to the first position information, the second position information of the target light source in the second image to be captured by the camera at the second moment includes: acquiring the relative speed between the target light source and the camera device; and determining the second position information according to the time difference between the first moment and the second moment, the first position information, and the relative speed.
  • In this way, from the position of the target light source in the first image and the relative speed of the target light source and the camera, the second position information of the target light source in the second image to be shot by the camera device at the second moment can be predicted. Moreover, the second position information can be predicted using only the first image and the relative speed, without the electronic device having to determine the specific physical position of the target light source; the requirements on the electronic device are low and the calculation is simple.
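  • A sketch of this design, assuming the relative speed has already been expressed in pixels per second in the image plane (a unit conversion the application leaves to the implementation):

```python
def predict_second_position(p1, v_px, t1, t2):
    """Predict the target light source's pixel position in the second image.

    p1     -- (u, v) pixel coordinates of the light source in the first image
    v_px   -- (du, dv) relative speed in pixels per second
    t1, t2 -- first and second moments in seconds, t2 > t1
    """
    dt = t2 - t1
    return (p1[0] + v_px[0] * dt, p1[1] + v_px[1] * dt)
```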
  • In one design, the first position information is used to indicate the physical position of the target light source at the first moment, and predicting, according to the first position information, the second position information of the target light source in the second image to be captured by the camera at the second moment includes: acquiring the moving speed of the target light source, and determining the physical position of the target light source at the second moment according to the moving speed, the first position information, and the time difference between the first moment and the second moment; and determining the second position information according to the physical position of the target light source at the second moment and the physical position of the camera device at the second moment, through various image prediction methods or various motion prediction methods.
  • In this way, the electronic device can determine the moving speed of various target light sources. For example, when the target light source is a car light, the moving speed of the target light source can be sent by the other vehicle through the V2X system, that vehicle's speed being the speed of the target light source. With this design, even if the target light source does not appear in the first image, the second position information of the target light source can still be determined from its moving speed and physical position through various motion prediction methods, avoiding the problem that the second position information cannot be determined when the target light source is absent from the first image.
  • In one design, determining the second position information according to the physical position of the target light source at the second moment and the physical position of the camera device at the second moment includes: determining, from those two physical positions, third position information in the camera coordinate system at the second moment, wherein the camera coordinate system is a coordinate system centered on the camera; and converting the third position information into the second position information.
  • In this way, the electronic device can use a camera calibration method to convert the coordinate position of the target light source in the camera coordinate system into the second position information of the target light source in the second image to be captured at the second moment, generate the predicted image from that second position information, and then adjust the exposure parameters of the camera.
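  • The conversion from the camera coordinate system to pixel coordinates can be sketched with a standard pinhole projection; the application does not fix a particular calibration model, so the intrinsic parameters below are assumptions:

```python
def camera_to_pixel(p_cam, fx, fy, cx, cy):
    """Project the third position information (camera coordinates) to pixel coordinates.

    p_cam  -- (x_m, y_m, z_m) coordinates in the camera coordinate system, z_m > 0
    fx, fy -- focal lengths in pixels; cx, cy -- principal point (assumed calibration)
    """
    x_m, y_m, z_m = p_cam
    u = fx * x_m / z_m + cx
    v = fy * y_m / z_m + cy
    return (u, v)  # second position information in the image to be captured
```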
  • In one design, acquiring the motion speed of the target light source includes: acquiring at least one third image captured by the camera before the first moment, and determining the movement speed of the target light source according to the first image and the third image; or receiving the movement speed of the target light source sent by other devices; or receiving sensor data sent by a sensor and determining the movement speed of the target light source according to the sensor data.
  • In this way, the motion speed determined in different ways can be used: the motion speed of the target light source can be predicted from multiple images (the first image and the third image); the motion speed sent by other devices can be received, for example the speed of another vehicle whose lights are the target light source; or a sensor on the electronic device can detect the motion speed of the target light source, for example by perceiving the target light source with millimeter-wave radar or lidar and predicting its movement speed.
  • When the motion speed of the same target light source is determined in multiple ways, the results can also be fused and analyzed: according to the characteristics of the different determination methods in different scenarios, different weight values are set for the different methods to obtain a more accurate movement speed of the target light source.
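  • A minimal sketch of such a fusion; the per-scenario weights are assumptions, since the application only requires that different determination methods receive different weights in different scenarios:

```python
def fuse_motion_speeds(estimates, weights):
    """Fuse speed estimates of the same target light source from several methods.

    estimates -- e.g. {"images": 13.2, "v2x": 13.6, "radar": 13.4} in m/s
    weights   -- non-negative weight per method for the current scenario
    """
    total = sum(weights[name] for name in estimates)
    return sum(estimates[name] * weights[name] for name in estimates) / total
```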
  • In one design, generating the predicted image of the second image according to the first image and the second position information includes: determining an initialization image of the predicted image, wherein the initialization image is the same as the first image; determining, in the initialization image, the area indicated by the second position information; determining a grayscale adjustment area according to the area indicated by the second position information, wherein the grayscale adjustment area includes the area indicated by the second position information; and adjusting the grayscale value of the pixels included in the grayscale adjustment area of the initialization image to a set grayscale value to obtain the predicted image.
  • In this way, the electronic device can determine the area indicated by the second position information in the initialization image according to the second position information. Because the distance between the target light source and the camera device differs at the first moment and the second moment, the size of the grayscale adjustment area in the predicted image also differs. The area corresponding to the second position information may be an area that extends outward from the pixel coordinate point at its center, and the shape of the expanded area can be a circular area, a rectangular area, an irregular polygonal area, and so on.
  • Optionally, the set grayscale value for the grayscale adjustment area can be the maximum grayscale value (255) or the minimum grayscale value (0). In this way, when the exposure parameter is adjusted according to the gray values of the first image and the predicted image, the adjustment effect is more obvious.
  • In another design, the set gray value is the gray value of the pixels contained in the area indicated by the first position information in the first image. In this way, when generating the predicted image, the electronic device can determine the grayscale value of the pixels in the grayscale adjustment area according to the grayscale value of the target light source in the first image, so that the exposure parameters adjusted from the gray values of the first image and the predicted image are more accurate.
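  • A sketch of this generation step with NumPy, using a circular grayscale adjustment area; the region shape and the choice of set gray value follow the options above, and the radius policy is illustrative:

```python
import numpy as np

def generate_predicted_image(first_image, center, radius, set_gray=255):
    """Build the predicted image from the first image and the second position information.

    first_image -- 2-D uint8 grayscale array, used as the initialization image
    center      -- (row, col) of the area indicated by the second position information
    radius      -- extent of the grayscale adjustment area, in pixels
    set_gray    -- set grayscale value: 255 (max), 0 (min), or the gray value of the
                   light source's area in the first image
    """
    predicted = first_image.copy()  # the initialization image equals the first image
    rows, cols = np.ogrid[:predicted.shape[0], :predicted.shape[1]]
    area = (rows - center[0]) ** 2 + (cols - center[1]) ** 2 <= radius ** 2
    predicted[area] = set_gray      # adjust the grayscale adjustment area
    return predicted
```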
  • In one design, the grayscale value of the first image is the average of the grayscale values of all pixels in the first image, and the grayscale value of the predicted image is the average of the grayscale values of all pixels in the predicted image. In this way, the electronic device can adjust the exposure parameters according to the change between the average gray level of the first image and the average gray level of the predicted image.
  • Optionally, the gray value of the first image may also be the mode of the gray values of the pixels corresponding to the target light source in the first image.
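  • The two grayscale statistics mentioned here, sketched with NumPy:

```python
import numpy as np

def mean_gray(image):
    """Average gray value over all pixels of an image."""
    return float(image.mean())

def mode_gray(light_source_pixels):
    """Mode of the gray values of the pixels corresponding to the target light source."""
    values, counts = np.unique(light_source_pixels, return_counts=True)
    return int(values[np.argmax(counts)])
```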
  • In one design, the exposure parameter includes an exposure duration, and adjusting the exposure parameter of the camera device according to the grayscale difference includes: when the grayscale difference is not less than a first difference, decreasing the exposure time of the imaging device; or, when the grayscale difference is not greater than a second difference, increasing the exposure time of the imaging device; or, when the grayscale difference is greater than the second difference and smaller than the first difference, adjusting the image signal processor of the camera device to an automatic exposure time adjustment mode; wherein the first difference is a positive number and the second difference is a negative number.
  • In this way, the electronic device can establish an exposure time adjustment table indexed by grayscale difference intervals; those skilled in the art can obtain exposure time curves at different vehicle speeds and different grayscale values through experiments and build the adjustment table from those curves. The electronic device then determines the corresponding exposure time adjustment value from the interval in which the grayscale difference falls. When the grayscale difference is not less than the first difference, it indicates that at the second moment the camera device will be irradiated by the target light source, so the exposure time of the camera should be reduced to prevent overexposure.
  • When the grayscale difference is not greater than the second difference, it indicates that at the second moment the imaging device leaves the irradiation of the target light source and enters a dark environment; the exposure time is increased at this time to prevent underexposure.
  • When the grayscale difference is greater than the second difference and smaller than the first difference, it indicates that the ambient light does not change significantly, and the image signal processor of the camera device can be adjusted to the automatic exposure time adjustment mode to adjust the exposure time automatically.
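  • A sketch of this rule; the first/second difference values and the camera-side API are assumptions, and the adjustment step would come from the experimentally built exposure time adjustment table:

```python
FIRST_DIFFERENCE = 40    # positive threshold (assumed value)
SECOND_DIFFERENCE = -40  # negative threshold (assumed value)

def adjust_exposure_time(camera, gray_difference, adjustment_table):
    """Adjust the exposure time from the grayscale difference (predicted - first).

    camera           -- hypothetical object exposing exposure_time and set_auto_exposure()
    adjustment_table -- callable mapping a grayscale difference to a step value
    """
    if gray_difference >= FIRST_DIFFERENCE:
        # A bright light source will irradiate the camera: reduce to prevent overexposure.
        camera.exposure_time -= adjustment_table(gray_difference)
    elif gray_difference <= SECOND_DIFFERENCE:
        # The camera moves into a dark environment: increase to prevent underexposure.
        camera.exposure_time += adjustment_table(gray_difference)
    else:
        # No drastic change: hand control back to the ISP's automatic mode.
        camera.set_auto_exposure()
```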
  • In one design, the exposure parameter includes an exposure duration, and adjusting the exposure parameter of the camera device according to the grayscale difference includes: determining a grayscale change rate according to the grayscale difference and the grayscale value of the first image; when the grayscale change rate is not less than a first threshold, decreasing the exposure time of the camera device; or, when the grayscale change rate is not greater than a second threshold, increasing the exposure time of the camera; or, when the grayscale change rate is greater than the second threshold and less than the first threshold, adjusting the image signal processor of the camera to the automatic exposure time adjustment mode; wherein the first threshold is a positive number and the second threshold is a negative number.
  • In this way, the electronic device can likewise establish an exposure time adjustment table indexed by grayscale change rate intervals, with those skilled in the art obtaining exposure time curves at different vehicle speeds and different grayscale values through experiments and building the table from those curves. The adjusted exposure value is then better suited to the changing scene, and the image quality captured by the adjusted camera device is better.
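  • The change-rate variant differs only in normalizing the grayscale difference by the gray value of the first image before comparing against the thresholds; a one-line sketch:

```python
def gray_change_rate(gray_first, gray_predicted):
    """Relative grayscale change between the first image and the predicted image."""
    return (gray_predicted - gray_first) / gray_first
```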
  • It should be noted that adjusting the exposure parameters of the camera device is not limited to the exposure time; parameters such as the aperture size may also be adjusted. The captured image quality can also be changed by expanding or narrowing the aperture, or by adjusting the exposure time, aperture size, and sensitivity together. The main purpose of adjusting the exposure parameters is to increase or decrease the amount of light entering the camera device, thereby preventing overexposure or underexposure of the captured image.
  • In a second aspect, an embodiment of the present application provides an exposure parameter adjustment apparatus, which includes units for performing each step of the method in the above first aspect.
  • In a third aspect, an embodiment of the present application provides an exposure parameter adjustment device, comprising at least one processing element and at least one storage element, wherein the at least one storage element is used to store programs and data, and the at least one processing element is used to execute the method provided in the first aspect of this application.
  • In a fourth aspect, an embodiment of the present application further provides a computer program, which, when run on a computer, causes the computer to execute the method provided in the first aspect.
  • In a fifth aspect, an embodiment of the present application further provides a computer storage medium in which a computer program is stored; when the computer program is executed by a computer, the computer is caused to execute the method provided in the first aspect.
  • In a sixth aspect, an embodiment of the present application further provides a chip, where the chip is configured to read a computer program stored in a memory and execute the method provided in the first aspect.
  • In a seventh aspect, an embodiment of the present application further provides a chip system, where the chip system includes a processor for supporting a computer device in implementing the method provided in the first aspect.
  • the chip system further includes a memory for storing necessary programs and data of the computer device.
  • Optionally, the chip system may consist of a chip, or may include a chip and other discrete devices.
  • FIG. 1 is a schematic diagram of the architecture of a vehicle-mounted system
  • FIG. 2 is a schematic flowchart of a method for adjusting exposure parameters
  • FIG. 3a is a spatial schematic diagram of the first vehicle at the first moment
  • FIG. 3b is a schematic diagram of a predicted intersection of the first vehicle at the second moment
  • FIG. 4 is a schematic flowchart of a first vehicle executing an exposure parameter adjustment method
  • FIG. 5a is a first image captured by a camera at a first moment
  • FIG. 5b is a second image to be captured by the camera at a second moment
  • FIG. 5c is the predicted image generated by the light source prediction module
  • FIG. 5d is a schematic diagram of the generation of a predicted image
  • FIG. 6 is a unit structure diagram of an exposure parameter adjustment device
  • FIG. 7 is a structural diagram of an exposure parameter adjustment device.
  • The present application provides an exposure parameter adjustment method and device, which determine the influence of the light source's position on the camera device by predicting the position of the light source, so that the exposure parameters of the camera device can be adjusted in advance; in scenes with drastic changes in light, the exposure parameters are adjusted in time to obtain clear images and ensure driving safety.
  • The method and the device are based on the same technical concept. Since the principles by which they solve the problem are similar, the implementations of the device and the method can refer to each other, and repeated descriptions are omitted.
  • Kalman filtering is an algorithm that uses the linear system state equation to optimally estimate the state of the system through the system input and output observation data. Since the observation data includes the influence of noise and interference in the system, the optimal estimation can also be regarded as a filtering process.
  • Camera calibration is the process of converting the physical coordinates of an object into coordinates in the camera coordinate system established with the camera device as the center, and then converting the object's coordinates in the camera coordinate system into the pixel coordinates of the image captured by the camera device.
  • Exposure parameters are the parameters set by the camera device when capturing images. An exposure parameter can be used to indicate the total amount of light that the camera device receives from a scene when photographing it. Exposure parameters may include the shutter time, the sensitivity (measured as an ISO value), and the aperture.
  • The shutter time is also called the exposure time; the shutter is the gate that controls how long light is let in. The longer the exposure time, the greater the amount of light entering the imaging device when capturing an image, so the captured image is brighter; if the exposure time is short, the amount of light entering is small, so the brightness of the captured image is low.
  • Sensitivity is the sensitivity of the photosensitive element of the imaging device to light, and the sensitivity is measured by the ISO value.
  • The aperture is used to control the amount of light that enters the camera body through the lens. If the aperture is enlarged, the amount of incoming light increases and the captured image is brighter; if the aperture is reduced, the amount of incoming light decreases and the captured image is darker.
  • A millimeter wave radar is a radar that detects targets in the millimeter wave band. Its working frequency band is generally 30 GHz to 300 GHz, with a wavelength of 1 mm to 10 mm, between microwave and centimeter wave. Its basic principle is to use a high-frequency circuit to generate electromagnetic waves of a specific modulation frequency, transmit them through an antenna, receive the electromagnetic waves reflected from the target, and calculate the target's parameters from the transmitted and received waves.
  • Lidar is a radar system that emits laser beams to detect a target's position, speed, distance, size, and other information. Lidar detects targets by emitting laser beams and forms point cloud data by collecting the reflected beams; after photoelectric processing, these data yield accurate three-dimensional images, from which high-precision information about the physical environment can be obtained, with ranging accuracy reaching the centimeter level.
  • A multi-sensor fusion algorithm makes full use of multi-sensor data resources across time and space: computer technology is used to analyze, synthesize, and exploit the multi-sensor data obtained in time series under certain criteria, so as to obtain a consistent interpretation and description of the observed object.
  • the pixel involved in the embodiments of the present application may be the smallest imaging unit on an image.
  • a pixel can correspond to a coordinate point on the image.
  • a pixel can correspond to one parameter (such as grayscale), or it can be a collection of multiple parameters (such as grayscale, color, etc.). If a pixel corresponds to a parameter, the pixel value is the value of the parameter, and if the pixel is a set of multiple parameters, the pixel value includes the value of each parameter in the set.
  • the first image, the second image, and the third image in the embodiments of the present application are the output images of the camera device, that is, the original image data obtained by the camera device converting the light information reflected by the collected object into a digital image signal.
  • the raw data may be raw format data.
  • the raw format data may include information of the object and parameters of the camera device.
  • The predicted image in the embodiments of the present application is an image obtained by using the first image as an initialization image and adjusting its grayscale. Each pixel of an image corresponds to a gray value, which is its brightness level (for example, 0 to 255). If the gray value of a pixel is 255, the pixel appears white; if it is 0, the pixel appears black.
  • The exposure parameter adjustment method provided in the embodiments of the present application can be applied to various systems, such as a single-camera system, a single-camera optical image stabilization (OIS) system, a multi-camera optical image stabilization system, and the like.
  • The camera device in the embodiments of the present application may be any device with a camera function, such as a mobile phone, a computer, or a tablet computer.
  • the embodiments of the present application may be introduced based on the camera coordinate system of the camera device, and the camera coordinate system is a coordinate system formulated with the camera device as the center.
  • the embodiments of the present application are also applicable to several other coordinate systems, such as a world coordinate system and the like.
  • the world coordinate system can also be called the real or real world coordinate system, which is the absolute coordinate of the objective world.
  • the methods provided in the embodiments of the present application may also be applied to the in-vehicle system of intelligent driving/automatic driving of the vehicle 100 .
  • For example, the exposure parameter adjustment method can be carried in a separate on-board electronic device (also referred to as a vehicle control device), or coupled to an automatic driving assistance system such as an advanced driving assistance system (ADAS), which is not limited in this application.
  • the in-vehicle system may include: N operating devices ( 1011 - 101N), an automatic driving device 102 , a vehicle control device 103 , a sensing device 104 , and a communication device 105 .
  • the operating devices (1011-101N) may specifically include a steering wheel, an accelerator pedal, a manual gear lever, etc.
  • the operating devices are used to receive the driver's driving intention and generate corresponding vehicle control commands.
  • the driver can control the steering of the vehicle and control the steering speed of the vehicle by controlling the rotation angle of the steering wheel, the rotational speed of the steering wheel, and the like.
  • The driver can also control the acceleration of the vehicle by controlling the opening degree and the opening and closing speed of the accelerator pedal.
  • the driver can also control the direction of the vehicle by controlling the direction of the manual gear, such as forward gear and reverse gear.
  • the operating devices (1011-101N) in the embodiments of the present application may be divided into two parts: a mechanical operating part and a controller.
  • the controller may receive the automatic driving instruction sent by the automatic driving device 102, and control the mechanical operation part according to the automatic driving instruction.
  • the controller may control the opening and closing speed of the accelerator pedal after receiving the automatic driving command indicating the opening and closing speed of the accelerator pedal.
  • Optionally, the automatic driving instruction may also include vehicle acceleration information; the accelerator pedal controller can convert the acceleration information into an opening and closing speed of the accelerator pedal, and then control the opening and closing of the mechanical operation part accordingly.
  • The autopilot device 102, also known as the autopilot brain, may be a chip capable of executing autonomous driving algorithms, such as an artificial intelligence (AI) chip, a graphics processing unit (GPU) chip, or a central processing unit (CPU) chip, and may also be a system composed of multiple chips, which is not limited in the embodiments of the present application.
  • the automatic driving device 102 may receive sensing data provided by the sensing device 104, and generate automatic driving instructions according to the sensing data.
  • the vehicle control device 103 is configured to control the vehicle 100 by receiving the vehicle control commands sent by the operating devices ( 1011 to 101N ) and the control commands sent by the automatic driving device 102 .
  • Optionally, the vehicle control device 103 may also perform fusion analysis on the sensor data sent by the multiple types of sensors in the sensing device 104 through a multi-sensor fusion algorithm, assigning different weight values to different sensors according to their characteristics in different scenarios. For example, if the camera device detects low road edges well, then in a low road edge scene the weight value of the camera device is increased to improve its confidence, and so on.
  • the sensing device 104 may include a lidar, a millimeter-wave radar, a camera, a speed sensor, a GPS (global positioning system) sensor, and the like.
  • the speed sensor is used to collect the speed of the vehicle in real time
  • the GPS sensor is used to obtain the position information of the current vehicle
  • the camera device is used to collect images of the environment around the vehicle.
  • The communication device 105 is used to communicate, through the Internet of Vehicles or the vehicle-to-everything (V2X) system, with other vehicles, on-board units (OBU), road side units (RSU), roadside equipment (RSE), and vehicle electronic control units (ECU) in the Internet of Vehicles.
  • the communication device 105 can establish a communication connection with the application server through the V2X communication network to perform communication interaction.
  • the communication device 105 may acquire vehicle information of any member of the Internet of Vehicles server from the Internet of Vehicles server.
  • the vehicle information includes information such as vehicle speed, vehicle position, and illuminance of vehicle headlights and taillights.
  • Optionally, the communication device 105 of the vehicle is connected to an electronic control unit (ECU) node on a controller area network (CAN) bus, and obtains various types of vehicle information through the ECU node.
  • It should be noted that the system architecture shown in FIG. 1 does not constitute a limitation on the in-vehicle system for realizing intelligent driving/automatic driving of a vehicle provided by the embodiments of the present application; the vehicle in the above in-vehicle system may also include more or fewer components. For example, the automatic driving device 102 may not be included in the vehicle.
  • An embodiment of the present application provides a method for adjusting exposure parameters, and the method may be executed by an electronic device that manages the camera device (it may also be the camera device itself).
  • The device may be the vehicle control device 103 in the vehicle-mounted system shown in FIG. 1, or an independent electronic device separate from the vehicle control device 103, or a component coupled into another device to control the exposure parameters of the camera device. In the following, the device that manages the camera device is simply referred to as an electronic device.
  • the camera device is arranged on the intelligent driving/autonomous driving vehicle 100 to collect an image of the environment to perceive the surrounding environment.
  • the physical position is the coordinate position of the object in the world coordinate system.
  • the speed of the camera device and the speed of the vehicle can be regarded as the same value, and similarly, the acceleration of the camera device and the acceleration of the vehicle can also be regarded as the same value.
  • the method specifically includes the following steps:
  • S201: The electronic device acquires the first image captured by the camera at the first moment.
  • Since the camera is arranged on the vehicle, if the relative positional relationship between the camera and the vehicle is not considered, then when the camera captures the first image at the first moment, a coordinate system can be formulated with the vehicle as the center, and the first image is captured from the first perspective of the vehicle. If the relative positional relationship between the camera and the vehicle is considered, then after the physical position of the vehicle is determined, the physical position of the camera is determined from the relative position of the camera and the vehicle together with the physical position of the vehicle; then, when the camera captures the first image at the first moment, a camera coordinate system may be formulated with the camera device as the center, and the first image is captured from the first angle of view of the camera device.
  • It should be noted that the first image captured by the camera device and acquired by the electronic device is not limited to one image; there may be multiple images. For example, if the camera device has a continuous shooting function, the images acquired by the electronic device may be multiple images captured by the camera device at the first moment, and the electronic device may determine one or more of them as the first image.
  • S202: The electronic device determines the target light source and obtains first position information of the target light source, where the first position information is used to indicate the position of the target light source in the first image, or to indicate the physical position of the target light source at the first moment; and, according to the first position information, predicts second position information of the target light source in the second image to be captured by the camera device at the second moment, wherein the second moment is after the first moment.
  • The second moment is the moment when the camera device next shoots, and the first moment may be the current moment or any moment before it. For example, if the second moment is 10:10:15 on October 19, 2020, and the current moment is 10:10:14 on October 19, 2020, then the first moment can be 10:10:14 on October 19, 2020, or any moment before that (such as 10:10:13 on October 19, 2020).
  • The target light source can be any object that emits light or has light-reflecting ability.
  • For example, the target light source may be a natural light source (such as the sun), a headlight of an oncoming vehicle, a taillight of a vehicle ahead, a street light on the roadside, a searchlight in a tunnel, and the like.
  • The target light source may also be a light-emitting LED screen, a traffic sign made of reflective material, and the like.
  • In the embodiments of the present application, the first position information can indicate the coordinate position of the target light source in the world coordinate system (that is, the physical position of the target light source), the coordinate position of the target light source in the camera coordinate system, the coordinate position of the target light source in a coordinate system centered on any specified object, or the pixel position coordinates of the target light source in the first image; this is not specifically limited here.
  • The second position information is used to indicate the pixel position coordinates of the target light source in the second image; the pixel position coordinates are not limited to a single coordinate value and can also be an area (a set of coordinates) including multiple pixel position coordinates, which is not limited here. Since the second moment is the next moment to be photographed, the pixel position coordinates of the target light source in the second image need to be determined by prediction.
  • the electronic device may, but is not limited to, use the following method to predict the second position information of the target light source in the second image to be captured by the camera at the second moment:
  • Method 1: The electronic device determines the physical position of the target light source at the second moment according to the relative speed between the target light source and the camera, the physical position of the target light source at the first moment, and the time difference between the first moment and the second moment; it then predicts, according to the physical position of the target light source at the second moment, the pixel position coordinates of the target light source in the second image to be captured by the camera device at the second moment.
  • In this manner, the electronic device uses the relative speed between the target light source and the camera together with the physical position of the target light source at the first moment to predict the physical position of the target light source at the second moment, and determines the pixel position coordinates of the target light source in the second image from that physical position.
  • the above determination method may specifically include the following steps:
  • the electronic device acquires the movement speed of the camera device and the movement speed of the target light source.
  • the electronic device may regard the speed of the vehicle at the current moment as the movement speed of the camera.
  • the speed of the vehicle may be obtained directly from the vehicle dashboard.
  • the electronic device receives sensor data sent by various types of sensors in the sensing device 104, and determines the speed of the vehicle according to the sensor data.
  • various sensors can collect sensor data such as longitudinal acceleration, lateral acceleration, and wheel speed of the vehicle at the current moment, and the electronic device can estimate the speed of the vehicle at the current moment according to the sensor data, and then determine the The moving speed of the camera.
  • the electronic device may further determine the movement speed and position of the target light source according to the sensor data.
  • various types of sensors are but not limited to: millimeter wave radar, lidar and so on.
  • the millimeter-wave radar determines the movement speed of the target light source by emitting electromagnetic waves with a specific modulation frequency.
  • the lidar transmits laser beams and collects the reflected beams to determine point cloud data, and according to the point cloud data forms a three-dimensional stereoscopic image containing the target light source to determine the movement speed of the target light source.
  • the electronic device may also receive the movement speed of the target light source sent by other devices through the communication device 105 .
  • For example, the electronic device may acquire the moving speed of the target light source (another vehicle) from the V2X (vehicle-to-everything) system of that vehicle.
  • Optionally, the electronic device may further acquire at least one third image captured by the camera before the first moment, and predict the movement speed of the target light source through a neural network according to the pixel coordinate position of the target light source in the third image and its pixel coordinate position in the first image.
  • For example, the first image and the third image can be input into a first neural network model, and the movement speed of the target light source can be obtained from the output of the first neural network model, wherein the first neural network model is a neural network model trained to predict the movement speed of an object from multiple images of that same object.
  • the electronic device determines the relative speed between the target light source and the camera device.
  • Specifically, the electronic device determines the relative speed between the target light source and the camera device according to the motion speed and direction of the camera device and the motion speed and direction of the target light source.
  • It should be noted that the present application is not limited to motion directions in a two-dimensional coordinate system; the direction of the motion speed may also be expressed in a three-dimensional coordinate system. The two-dimensional case is used below as an example for ease of understanding.
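  • In the two-dimensional case used as the example here, the relative speed is simply a vector difference (a three-dimensional version is identical with one more component):

```python
import numpy as np

def relative_velocity(v_light, v_camera):
    """Relative velocity of the target light source with respect to the camera device.

    v_light, v_camera -- (vx, vy) velocity vectors in a shared coordinate frame
    """
    return np.asarray(v_light, dtype=float) - np.asarray(v_camera, dtype=float)
```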
  • Then, the electronic device determines the physical position of the target light source at the second moment according to the relative speed between the target light source and the camera and the physical position of the target light source at the first moment, and predicts, from that physical position, the pixel position coordinates of the target light source in the second image to be captured by the camera device at the second moment.
  • the manner in which the electronic device determines the physical position of the target light source at the first moment may be, but not limited to, the following manners:
  • Mode (1) The electronic device receives the position of the target light source sent by other devices.
  • the electronic device may receive the physical location of the target light source at the first moment sent by other devices supporting the V2X system.
  • For example, when other vehicles join the Internet of Vehicles, the electronic device can receive the vehicle information broadcast by the V2X systems of those vehicles; the vehicle information includes the physical position of the other vehicle and the positions of the headlights and taillights relative to that vehicle.
  • the electronic device determines the physical positions of the headlights and taillights of other vehicles according to the positions of the headlights and taillights relative to the vehicle and the physical positions of other vehicles.
  • the electronic device may also use the position of the other vehicle as the position of the target light source.
  • Mode (2): The electronic device receives sensor data sent by a sensor and determines the position of the target light source according to the sensor data. Specifically, the millimeter-wave radar can determine the physical position of the target light source by emitting electromagnetic waves of a specific modulation frequency, and the lidar can determine the physical position of the target light source by emitting laser beams and collecting the reflected beams.
  • Manner (3) The electronic device uses the pixel coordinate position of the target light source in the first image to determine the physical position of the target light source at the first moment.
  • Specifically, the electronic device may, but is not limited to, determine the physical position of the target light source at the first moment as follows: using a camera calibration method, the pixel coordinate position of the target light source in the first image is converted into the physical position of the target light source at the first moment. First, the pixel coordinate position of the target light source in the first image is converted into the coordinate position of the target light source in the camera coordinate system established with the camera as the center.
  • The conversion from the pixel coordinate position of the target light source in the first image to its coordinate position in the camera coordinate system can be calculated by, but is not limited to, a calibration formula relating the following quantities: u, the pixel position coordinate of the target light source in the vertical direction; a, the pixel resolution of the image captured by the camera in the vertical direction; α, the direction angle of the camera in the vertical direction; x_m, the x-axis coordinate value of the target light source in the camera coordinate system; and z_m, the z-axis coordinate value of the target light source in the camera coordinate system.
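  • The exact calibration formula is not reproduced above; as one concrete possibility consistent with those terms, a pinhole model with a flat-road assumption recovers (x_m, z_m) from a pixel coordinate as follows (all parameters are assumptions, not the application's formula):

```python
def pixel_to_camera_ground(u, v, fx, fy, cx, cy, camera_height):
    """Back-project a pixel onto the road plane in the camera coordinate system.

    Assumes a pinhole camera looking along +z, image v increasing downward, and
    the road plane at y = camera_height below the optical center; this is an
    illustrative stand-in for the calibration formula referenced above.
    """
    dx = (u - cx) / fx          # ray direction through the pixel
    dy = (v - cy) / fy
    if dy <= 0:
        raise ValueError("pixel does not look down toward the road plane")
    t = camera_height / dy      # scale at which the ray meets the road plane
    return (dx * t, t)          # (x_m, z_m) in the camera coordinate system
```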
  • After determining the coordinate position of the target light source in the camera coordinate system, the electronic device converts that coordinate position into the physical position of the target light source in the world coordinate system. Specifically, the electronic device determines the physical position of the camera and the coordinate position of the target light source in the camera coordinate system, and converts the latter into the physical position of the target light source in the world coordinate system.
  • After determining the physical position of the target light source at the first moment, the electronic device can predict the pixel position coordinates of the target light source in the second image.
  • The method for predicting the physical position of the target light source at the second moment may be, but is not limited to, the following: the Kalman filter method, the multi-Bayesian estimation method, the proportional-integral-derivative (PID) algorithm, and so on. The Kalman filter method is introduced here as an example.
  • The Kalman filter method mainly uses a Kalman filter to combine the predicted physical position of the target light source at the second moment with the physical position of the target light source at the second moment detected by sensors or sent by other devices, obtaining a physical position estimate of the target light source at the second moment that is more accurate than any physical position obtained only from sensors, only from other devices, or only from prediction. For example, the physical position of the target light source at the second moment is predicted from its physical position at the first moment, the relative speed between the target light source and the camera, and the time difference between the first moment and the second moment.
  • Specifically, after the physical position P of the target light source at the first moment, the relative velocity V between the target light source and the camera, and the time difference Δt between the first moment and the second moment are known, the first state vector X1 = (P_X1, P_Y1, V_X, V_Y) of the target light source at the first moment is established, where P_X1 is the abscissa of the physical position of the target light source at the first moment, P_Y1 is the ordinate of that physical position, V_X is the lateral relative velocity between the target light source and the imaging device, and V_Y is the longitudinal relative velocity between the target light source and the imaging device.
  • Then, a motion model with process noise can be used to predict the physical position of the target light source at the second moment. The second state vector can be predicted by, but is not limited to, the following constant-velocity model, and the physical position of the target light source at the second moment is determined from the second state vector:
  • X2 = F·X1 + A, where F = [[1, 0, Δt, 0], [0, 1, 0, Δt], [0, 0, 1, 0], [0, 0, 0, 1]] is the state transition matrix, X2 is the second state vector of the target light source at the second moment, A is the process noise, and Δt is the time difference between the first moment and the second moment; that is, P_X2 = P_X1 + V_X·Δt and P_Y2 = P_Y1 + V_Y·Δt.
  • The process noise can be the acceleration and deceleration caused by driving with or against the wind, the acceleration and deceleration caused by road slope, artificial acceleration and deceleration, and so on.
  • the above-mentioned process noise may also be represented by a covariance matrix, which should be known to those skilled in the art, and details are not described here.
  • In the above manner, the electronic device can predict the physical position of the target light source at the second moment through the motion model; the electronic device can also determine that physical position through sensor detection or from other devices, although a position detected by a sensor or sent by another device is not necessarily accurate. Therefore, the physical position of the target light source at the second moment detected by a sensor or sent by other devices can be used to correct the predicted physical position, forming a new motion model. Each subsequent prediction then uses the most recently formed motion model, so that the physical position of the target light source can be predicted more accurately.
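  • A sketch of this predict-and-correct cycle with the state vector X = (P_X, P_Y, V_X, V_Y); the noise covariances Q and R are assumed values that would be tuned in practice:

```python
import numpy as np

def kalman_predict(x, P, dt, Q):
    """Predict the state at the second moment under the constant-velocity motion model."""
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    return F @ x, F @ P @ F.T + Q

def kalman_update(x, P, z, R):
    """Correct the prediction with a position z = (P_X, P_Y) detected by a sensor
    or sent by another device."""
    H = np.array([[1, 0, 0, 0],
                  [0, 1, 0, 0]], dtype=float)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                  # Kalman gain
    x_new = x + K @ (np.asarray(z, float) - H @ x)  # corrected position estimate
    P_new = (np.eye(4) - K @ H) @ P
    return x_new, P_new
```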
  • After predicting the physical position of the target light source at the second moment, the electronic device determines, according to that physical position, the coordinate position of the target light source in the camera coordinate system at the second moment, and uses the camera calibration method again to convert this coordinate position into the second position information of the target light source in the second image to be captured by the camera device at the second moment.
  • the coordinate transformation method in the embodiment of the present application and the coordinate transformation method provided in the above-mentioned embodiments are based on the same concept, which will not be repeated here.
  • Method 2 The electronic device determines the pixel position coordinates of the target light source in the second image according to the moving speed of the target light source.
  • the method for the electronic device to obtain the movement speed of the target light source is the same as that provided in the first method, and details are not described here.
  • Specifically, the electronic device may predict the physical position of the target light source at the second moment according to the moving speed of the target light source and the physical position of the target light source at the first moment; the prediction method may be, but is not limited to, the Kalman filtering method provided in Method 1 above.
  • The electronic device then acquires the physical position of the camera at the second moment, and determines the second position information according to the physical position of the target light source at the second moment and the physical position of the camera at the second moment.
  • Specifically, the electronic device determines, according to the physical position of the target light source at the second moment and the physical position of the camera device at the second moment, third position information in the camera coordinate system at the second moment, wherein the camera coordinate system is a coordinate system centered on the camera device; using the third position information, the second position information is then determined by a camera calibration method.
  • Method 3 The electronic device determines the pixel position coordinates of the target light source in the second image according to the relative speed between the target light source and the camera, and the pixel position coordinates of the target light source in the first image.
  • the method for the electronic device to obtain the relative speed between the target light source and the camera device is the same as that provided in the first mode, which will not be introduced here.
• The Kalman filtering method can also be used to determine the pixel position coordinates of the target light source in the second image from its pixel position coordinates in the first image. Alternatively, the pixel position coordinates of the target light source in the first image and the relative speed of the target light source and the camera device can be jointly input into a second neural network model. The second neural network model is trained by taking the pixel position coordinates of a light source in the image at the first moment and the relative speed of the light source and the camera as input, and the pixel position coordinates of the light source in the image at the second moment as output.
  • S203 The electronic device generates a predicted image of the second image according to the first image and the second position information.
• After acquiring the second position information of the target light source, the electronic device generates a predicted image of the second image.
  • the method for generating the predicted image of the second image may include, but is not limited to, the following methods:
• Method 1: The electronic device determines the area indicated by the second position information in an initialization image, determines the grayscale adjustment area according to that area, and adjusts the grayscale values of the pixels included in the grayscale adjustment area in the initialization image to the grayscale value of the target light source in the first image, to obtain the predicted image of the second image. Here, the initialization image is the same as the first image, so the predicted image is generated on the basis of the first image, and the grayscale adjustment area includes the area indicated by the second position information.
  • the method may specifically include the following steps:
• The electronic device determines the area indicated by the second position information in the initialization image, and determines the grayscale adjustment area according to that area. The grayscale adjustment area may be an area formed by taking the indicated pixel coordinate point as the center and extending outward; its shape may be a circular area, a rectangular area, an irregular polygonal area, and so on.
• The grayscale adjustment area may be determined according to the distance between the target light source and the camera device: the farther the target light source is from the camera device, the smaller the grayscale adjustment area; the closer the target light source is to the camera device, the larger the grayscale adjustment area.
  • the grayscale adjustment area may also be determined according to the illuminance of the target light source.
• The illuminance of the target light source may be determined from an image captured by the camera device, received from a broadcast by other devices such as a V2X system, or detected by a sensor.
• The weaker the illuminance of the target light source, the smaller the grayscale adjustment area; the stronger the illuminance of the target light source, the larger the grayscale adjustment area.
• The above methods of determining the grayscale adjustment area can be used independently or in combination; that is, the closer the target light source is to the camera device and the stronger its illuminance, the larger the grayscale adjustment area, and the farther the target light source is from the camera device and the weaker its illuminance, the smaller the grayscale adjustment area. A sketch of the combined rule is given below.
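• A minimal sketch of the combined rule, assuming a circular area whose radius grows with proximity and illuminance (the functional form and constants are illustrative assumptions):

```python
# Radius of the grayscale adjustment area as a function of distance and illuminance.
def grayscale_adjustment_radius(distance_m, illuminance_lux,
                                base_radius_px=40.0, ref_distance_m=10.0,
                                ref_illuminance_lux=400.0):
    """Return the radius (in pixels) of a circular grayscale adjustment area."""
    distance_factor = ref_distance_m / max(distance_m, 1e-3)    # closer -> larger
    illuminance_factor = illuminance_lux / ref_illuminance_lux  # brighter -> larger
    return base_radius_px * distance_factor * illuminance_factor

r_near_bright = grayscale_adjustment_radius(5.0, 800.0)   # close, strong: large area
r_far_dim = grayscale_adjustment_radius(40.0, 100.0)      # far, weak: small area
```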
• The electronic device then determines the grayscale value of the target light source in the first image. The grayscale value of the target light source in the first image may be the average of the grayscale values of the pixels corresponding to the target light source in the first image, or the mode of the grayscale values of those pixels.
• The electronic device adjusts the grayscale adjustment area in the initialization image to the grayscale value of the target light source in the first image, to generate the predicted image of the second image.
• Alternatively, the electronic device can directly adjust the grayscale value of each pixel in the grayscale adjustment area to a set grayscale value. For example, the electronic device can adjust the grayscale value of each pixel in the grayscale adjustment area to 255, to indicate that the grayscale adjustment area in the initialization image is completely illuminated by the target light source, as sketched below.
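• A sketch of generating the predicted image on this basis, assuming a grayscale image stored as a NumPy array and a circular grayscale adjustment area (shapes and values are illustrative):

```python
# Copy the first image as the initialization image and set a circular grayscale
# adjustment area, centered on the predicted pixel position, to a gray value.
import numpy as np

def generate_predicted_image(first_image, center_uv, radius_px, gray_value):
    predicted = first_image.copy()                 # initialization image
    h, w = predicted.shape
    v, u = np.ogrid[:h, :w]
    mask = (u - center_uv[0]) ** 2 + (v - center_uv[1]) ** 2 <= radius_px ** 2
    predicted[mask] = gray_value                   # grayscale adjustment area
    return predicted

first_image = np.full((720, 1280), 60, dtype=np.uint8)       # dim night scene
predicted = generate_predicted_image(first_image, (640, 360), 50, 255)
```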
• Method 2: The electronic device determines the area indicated by the second position information in the initialization image, determines the grayscale adjustment area according to that area, determines a predicted grayscale value according to the distance between the target light source and the camera device and the illuminance of the target light source, and adjusts the grayscale values of the pixels included in the grayscale adjustment area to the predicted grayscale value.
• In some cases, the target light source may not appear in the first image captured at the first moment; that is, at the first moment the target light source is not yet within the shooting range of the camera device, but the electronic device predicts that it will appear in the second image to be captured at the second moment. In this case, the electronic device cannot determine the grayscale value of the absent target light source from the first image in order to generate the predicted image.
• Instead, the electronic device can predict the grayscale value that the target light source will present in the second image according to the distance between the target light source and the camera device and the illuminance of the target light source at the second moment.
• A correspondence between the distance from the target light source to the camera device, the illuminance, and the predicted grayscale value may be established in advance. For example, for a given distance and illuminance, the correspondence may give a predicted grayscale value of 240 for the target light source; a sketch of such a lookup follows.
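• A sketch of such a pre-established correspondence, assuming a simple interval table (the breakpoints are illustrative; only the value 240 follows the example above):

```python
# Look up the predicted gray value from (distance, illuminance).
def predicted_gray_value(distance_m, illuminance_lux):
    table = [                          # (max distance, min illuminance, gray)
        (10.0, 600.0, 255),
        (20.0, 400.0, 240),            # e.g. the example's value of 240
        (40.0, 200.0, 200),
    ]
    for max_d, min_lux, gray in table:
        if distance_m <= max_d and illuminance_lux >= min_lux:
            return gray
    return 160                         # weak or distant light source default

gray = predicted_gray_value(15.0, 450.0)   # -> 240
```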
  • the grayscale values of the pixels included in the grayscale adjustment area are adjusted to the predicted grayscale values.
• The generated predicted image can be saved in an image format (such as jpg or jpeg), or stored directly as an array combining each pixel's position coordinates with its grayscale value; for example, the array (1, 1, 255) indicates that the grayscale value of the pixel in the 1st row and 1st column is 255.
  • S204 The electronic device calculates the gray value of the first image and the gray value of the predicted image.
• The grayscale value of the first image may be the average of the grayscale values of all pixels in the first image, and the grayscale value of the predicted image may be the average of the grayscale values of all pixels in the predicted image.
• Alternatively, the electronic device may calculate the grayscale value of the first image as a weighted value according to the distribution of the gray levels in the grayscale histogram of the first image; the weighting scheme is not limited here. A sketch of both computations is given below.
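• A sketch of both computations, assuming 8-bit grayscale images (the weighting scheme shown is an illustrative assumption):

```python
# Plain mean versus histogram-weighted grayscale value of an image.
import numpy as np

def mean_gray(image):
    """Plain average of all pixel grayscale values."""
    return float(image.mean())

def histogram_weighted_gray(image, weights=None):
    """Weighted grayscale value computed from the grayscale histogram."""
    hist = np.bincount(image.ravel(), minlength=256).astype(float)
    hist /= hist.sum()                  # distribution over the 256 gray levels
    levels = np.arange(256, dtype=float)
    if weights is None:
        weights = np.ones(256)          # uniform weights reduce to the plain mean
    w = hist * weights
    return float((levels * w).sum() / w.sum())

ya = mean_gray(np.full((4, 4), 100, dtype=np.uint8))               # -> 100.0
yw = histogram_weighted_gray(np.full((4, 4), 100, dtype=np.uint8))  # -> 100.0
```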
  • S205 Adjust the exposure parameters of the imaging device according to the gray value of the first image and the gray value of the predicted image.
• According to the grayscale value of the first image and the grayscale value of the predicted image, the electronic device adjusts the exposure parameters of the camera device before the second moment arrives. The adjustment method may be, but is not limited to, the following:
• Mode 1: The electronic device subtracts the grayscale value of the predicted image from the grayscale value of the first image to obtain a grayscale difference. For example, the grayscale value of the first image can be denoted YA and the grayscale value of the predicted image YB, so the grayscale difference is (YA-YB).
• When the grayscale difference is not less than a first difference, the exposure time of the imaging device is reduced; when the grayscale difference is not greater than a second difference, the exposure time of the imaging device is increased; otherwise, the electronic device sets the image signal processor of the camera device to the mode of automatically adjusting the exposure time. The first difference is a positive number and the second difference is a negative number.
• A corresponding exposure time adjustment table is established in which each grayscale difference interval corresponds to an exposure time adjustment value; see Table 1 below.
• The exposure time adjustment table is established according to an exposure time curve, and the electronic device determines the corresponding exposure time adjustment value in the table according to the grayscale difference interval in which the grayscale difference falls.
• The correspondence between the exposure time adjustment values and the grayscale difference intervals provided in the exposure time adjustment table is only an example and does not limit the content provided in this application; an illustrative lookup is sketched below.
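• A sketch of such a table lookup in the spirit of Table 1 (since the table itself is not reproduced here, every interval and adjustment value below is an assumed placeholder):

```python
# Map a grayscale difference to an exposure-time adjustment in milliseconds.
def exposure_time_adjustment(gray_diff):
    table = [                          # (lower bound, upper bound, adjustment_ms)
        (64, 256, -8.0),               # much brighter ahead: shorten a lot
        (16, 64, -3.0),
        (-16, 16, 0.0),                # small change: hand back to auto exposure
        (-64, -16, +3.0),
        (-256, -64, +8.0),             # much darker ahead: lengthen a lot
    ]
    for lower, upper, adj_ms in table:
        if lower <= gray_diff < upper:
            return adj_ms
    return 0.0

adj = exposure_time_adjustment(80)     # -> -8.0 ms (reduce exposure time)
```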
• Mode 2: The electronic device subtracts the grayscale value of the predicted image from the grayscale value of the first image to obtain a grayscale difference, and determines the grayscale change rate as the ratio of the grayscale difference to the grayscale value of the first image. Denoting the grayscale value of the first image YA and the grayscale value of the predicted image YB, the grayscale change rate can be expressed as (YA-YB)/YA.
• When the grayscale change rate is not less than a first threshold, the exposure time of the imaging device is reduced; when the grayscale change rate is not greater than a second threshold, the exposure time of the imaging device is increased; otherwise, the electronic device sets the image signal processor of the camera device to the mode of automatically adjusting the exposure time. The first threshold is a positive number and the second threshold is a negative number.
• An exposure time adjustment table for this mode is shown in Table 2 below; those skilled in the art can obtain the exposure times under different vehicle speeds and different grayscale change rates through experiments.
• The exposure time adjustment table is established according to the exposure time curve, and the corresponding exposure time adjustment value is found in the table according to the interval in which the grayscale change rate falls; a sketch of such a lookup follows.
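• A sketch in the spirit of Table 2, combining vehicle speed and grayscale change rate (all breakpoints and values are assumed placeholders; per the text, real values would be obtained experimentally):

```python
# Exposure-time adjustment keyed by vehicle speed and grayscale change rate.
def exposure_time_adjustment_v2(speed_kmh, gray_change_rate):
    # Faster vehicles need more aggressive adjustment within one frame interval.
    speed_gain = 1.0 if speed_kmh < 60 else 1.5
    if gray_change_rate >= 0.5:
        base_ms = -8.0                 # scene about to brighten sharply
    elif gray_change_rate >= 0.2:
        base_ms = -3.0
    elif gray_change_rate <= -0.5:
        base_ms = +8.0                 # scene about to darken sharply
    elif gray_change_rate <= -0.2:
        base_ms = +3.0
    else:
        base_ms = 0.0                  # small change: hand back to auto exposure
    return base_ms * speed_gain

adj = exposure_time_adjustment_v2(100, 0.6)   # -> -12.0 ms
```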
• An exposure time adjustment value determined from the change rate of the image grayscale value is better suited to the current scene, so the adjusted camera device captures clearer images. The correspondence between the exposure time adjustment values and the grayscale change rate intervals provided in the exposure time adjustment table is only an example and does not limit the content provided in this application.
• After obtaining the exposure time adjustment value, the electronic device adjusts the exposure time of the imaging device before the second moment arrives.
  • adjusting the exposure parameters of the camera device is not limited to exposure time, but may also be parameters such as aperture size.
  • the aperture can also be enlarged or narrowed.
• The above adjustments of the exposure time, aperture size, and sensitivity among the exposure parameters mainly serve to increase or decrease the light intake of the camera device, preventing the image captured by the camera device from being overexposed or underexposed.
• Before acquiring the first position information of the target light source at the first moment, the electronic device further needs to acquire brightness information of at least one light source. The brightness information of the at least one light source at the first moment indicates the brightness of each light source within the shooting range of the camera device at the first moment; according to this brightness information, the electronic device determines, among the at least one light source, a light source whose brightness exceeds a set brightness threshold as the target light source.
  • the electronic device needs to first acquire the brightness information of at least one light source at the first moment.
• The at least one light source comprises all the light sources within the shooting range of the camera device. Selecting as target light sources only those whose brightness exceeds the set brightness threshold, or equivalently filtering out the light sources whose brightness is below the threshold, avoids tracking position changes of light sources that are irrelevant to the calculation, reduces the computation load, and allows the exposure parameters to be adjusted more quickly.
  • the method for the electronic device to obtain the brightness information of the at least one light source at the first moment may be, but not limited to, the following ways:
• Manner 1: The electronic device determines the at least one light source in the first image, and determines the brightness information of the at least one light source at the first moment according to the grayscale value of the area where each light source is located in the first image. Specifically, the electronic device first determines the light sources present in the first image and the grayscale values of the areas where they are located; the larger the grayscale value, the higher the brightness of the light source, and the greater the brightness indicated by the brightness information at the first moment.
  • Manner 2 The electronic device receives the brightness information of the at least one light source at the first moment sent by other devices.
• The electronic device may, but is not limited to, receive the brightness information of the at least one light source at the first moment sent by other devices in the following manner: it receives the various types of vehicle information broadcast by the V2X systems of other vehicles, and determines from this information the position and illuminance of the at least one light source at the first moment. It first determines, from the position information of the at least one light source at the first moment and the shooting range of the camera device, whether each light source is within the shooting range; for the light sources within the range, it determines their brightness information at the first moment from the position information and the illuminance, where the farther a light source is from the camera device and the lower its illuminance, the lower its brightness. Alternatively, the electronic device may receive sensing data collected by a photosensitive element provided in the camera device, and determine the brightness information of the at least one light source at the first moment from the brightness indicated by the sensing data. A selection sketch is given below.
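• A sketch of such a selection, assuming V2X-reported positions and illuminances and an inverse-square brightness falloff (the data layout and threshold are illustrative assumptions):

```python
# Keep only light sources inside the shooting range whose estimated brightness
# at the camera exceeds the set threshold.
import math

def select_target_light_sources(light_sources, camera_pos, max_range_m,
                                brightness_threshold):
    """light_sources: list of dicts with 'pos' (x, y) and 'illuminance' (lux)."""
    targets = []
    for src in light_sources:
        dx = src["pos"][0] - camera_pos[0]
        dy = src["pos"][1] - camera_pos[1]
        dist = math.hypot(dx, dy)
        if dist > max_range_m:
            continue                               # outside the shooting range
        brightness = src["illuminance"] / max(dist * dist, 1.0)
        if brightness > brightness_threshold:
            targets.append(src)                    # keep as a target light source
    return targets

sources = [{"pos": (30.0, 2.0), "illuminance": 5000.0},   # oncoming headlight
           {"pos": (80.0, -3.0), "illuminance": 800.0}]   # distant street lamp
targets = select_target_light_sources(sources, (0.0, 0.0), 60.0, 1.0)
```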
  • the present application provides an example of an exposure parameter adjustment method.
  • the method can be applied to a first vehicle 301 equipped with a camera device.
  • Fig. 3a is a schematic diagram of the space of the first vehicle at the first moment; wherein, the first moment corresponds to the current moment.
• When the first vehicle 301 is driving on the road, it will meet a vehicle with its headlights turned on at the second moment.
  • Fig. 3b is a schematic diagram of a predicted meeting of the first vehicle at a second time, where the second time is the time when the vehicle is about to meet.
  • the lights of the vehicle and the street lamp will affect the image capturing quality of the camera device of the first vehicle 301 .
• Without prediction, the camera device on the first vehicle 301 cannot adjust its exposure parameters in time at the second moment, making it difficult to obtain a clear image.
  • the electronic device on the first vehicle 301 includes an information acquisition module, a light source prediction module, an image generation module and an exposure adjustment module.
  • the functions of each module in the first vehicle are described below:
  • the information acquisition module is configured to determine the position, speed and light information of at least one light source within the shooting range of the camera device on the first vehicle 301 .
  • the light source prediction module is configured to determine a target light source according to the position, speed and light information of at least one light source acquired by the information acquisition module, and predict the position of the target light source at the second moment.
• The image generation module is configured to generate a predicted image of the second image to be captured at the second moment, according to the position of the target light source at the second moment predicted by the light source prediction module and an initialization image that is the same as the first image captured by the camera device at the first moment.
  • the exposure adjustment module is configured to adjust the exposure parameter of the camera according to the grayscale value of the first image and the grayscale value of the predicted image.
  • the information acquisition module acquires a first image captured by a camera at a first moment.
  • the first moment may be the current moment, or may be any moment before the current moment.
• Referring to FIG. 5a, which is a schematic diagram of the first image:
  • the content displayed in the first image includes: a second vehicle 302 with headlights turned on, a third vehicle 303 with taillights turned on, and street lights 304 .
• The first image is an image captured at the first moment by the camera device on the first vehicle 301.
  • the information acquisition module acquires the brightness information of the at least one light source, and according to the brightness information of the at least one light source at the first moment, determines the light source whose brightness exceeds the set brightness threshold as the target light source in the at least one light source.
• The information acquisition module can acquire multiple light sources: the headlights of the second vehicle 302, the taillights of the third vehicle 303, the street light 304, illuminated road signs, distant LED screens, and so on. After selecting from these the light sources whose brightness exceeds the brightness threshold, it predicts the positions of those light sources at the second moment.
• In this example, the light sources whose brightness exceeds the brightness threshold are the headlights of the second vehicle 302, the taillights of the third vehicle 303, and the street light 304.
  • the information acquisition module further includes: a V2X system, an image detection module and a perception module.
  • the V2X system is used to receive vehicle lamp information and position information sent by the second vehicle 302 and the third vehicle 303
  • the image detection module is used to detect the existing light source within the shooting range of the camera
• The perception module is used to further determine the distance between each detected light source and the camera device.
• The information acquisition module can determine the target light source among the at least one light source according to the vehicle lamp information and the position information, or according to the distance of each light source from the camera device.
  • Step S402 may be executed before step S401 or before step S403, and the execution order is not limited here.
• The information acquisition module acquires the first position information of the target light source at the first moment and sends it to the light source prediction module, and the light source prediction module predicts, according to the first position information, the second position information of the target light source in the second image to be captured by the camera device at the second moment.
• Specifically, the V2X system in the information acquisition module is configured to receive the first position information, speed information and lamp information of the target light sources at the first moment, sent by the second vehicle 302 and the third vehicle 303.
  • the first position information indicates the physical positions of the headlights of the second vehicle 302 and the taillights of the third vehicle 303 at the first moment;
• The speed information indicates the driving speeds of the second vehicle 302 and the third vehicle 303.
  • the light information is used to indicate the illuminance of the headlights of the second vehicle 302 and the taillights of the third vehicle 303.
  • the image detection module is used for determining the illuminance of the street light 304, and determining the physical position and speed of the street light 304 at the first moment through the sensing module.
• The light source prediction module determines, according to the physical positions of the headlights of the second vehicle 302, the taillights of the third vehicle 303 and the street light 304 at the first moment, the second position information of the target light sources in the second image to be captured by the camera device at the second moment.
• The light source prediction module can use the Kalman filtering method to predict the physical position of the target light source at the second moment according to the moving speed of the target light source and its physical position at the first moment, and then, according to the physical position of the camera device at the second moment, determine the position coordinates of the target light source at the second moment in the coordinate system centered on the camera device.
• The light source prediction module finally uses the camera calibration method to determine, from the position coordinates of the target light source in the coordinate system centered on the camera device, the second position information of the target light source in the second image to be captured by the camera device at the second moment.
  • the image generation module generates a predicted image of the second image according to the first image and the second position information.
• The image generation module determines an initialization image of the predicted image, wherein the initialization image is the same as the first image; determines the area indicated by the second position information; determines the grayscale adjustment area according to that area; and adjusts the grayscale values of the pixels included in the grayscale adjustment area to the grayscale value of the target light source in the first image, to generate the predicted image.
  • FIG. 5a is a first image captured by the camera at a first moment.
  • FIG. 5b is a schematic diagram of predicting a second image to be captured by the camera at a second moment. Using the predicted position of the target light source, the second position information of the target light source at the second moment is determined, and the grayscale adjustment area is enlarged or reduced. In some possible embodiments, the grayscale adjustment area may be determined according to the distance between the target light source and the camera device.
• After determining the grayscale value of the target light source in the first image, the image generation module adjusts the grayscale adjustment region to that grayscale value. As the changes between Fig. 5a and Fig. 5b show, in this scenario the first vehicle 301 and the third vehicle 303 are driving in the same direction and gradually approach each other, while the first vehicle 301 and the second vehicle 302 are driving in opposite directions. The positions of the headlights of the second vehicle 302, the taillights of the third vehicle 303 and the street light 304 at the second moment are all closer to the camera device than their positions at the first moment.
• Suppose the grayscale value of the headlights of the second vehicle 302 is Y1, the grayscale value of the taillights of the third vehicle 303 is Y2, and the grayscale value of the street light 304 is Y3; the predicted image of the second image is generated according to these grayscale values of the target light sources in the first image. Referring to FIG. 5c, which shows the generated predicted image of the second image: because the target light sources move closer to the camera device while the vehicles are driving, the illumination range of each light source within the shooting range of the camera device increases, and accordingly the grayscale adjustment areas in the predicted image become larger.
• If the area 501 indicated by the first position information in the initialization image and the grayscale adjustment area 502 in the predicted image do not completely coincide, the grayscale values of the pixels in the part of the area 501 that does not overlap the grayscale adjustment area 502 can be adjusted to the average grayscale value of the initialization image, so as to reduce the error in calculating the grayscale value of the predicted image, as sketched below.
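• A sketch of this correction, assuming boolean masks for area 501 and area 502 over a NumPy grayscale image (all shapes and values are illustrative):

```python
# Reset pixels lit at the first moment (area 501) but outside the new grayscale
# adjustment area (502) to the initialization image's mean grayscale value.
import numpy as np

def reset_nonoverlapping_area(predicted, init_image, mask_501, mask_502):
    leftover = mask_501 & ~mask_502               # lit at t1, no longer lit at t2
    predicted[leftover] = int(init_image.mean())  # mean gray of initialization image
    return predicted

img = np.full((720, 1280), 60, dtype=np.uint8)
m501 = np.zeros(img.shape, dtype=bool); m501[300:360, 500:560] = True
m502 = np.zeros(img.shape, dtype=bool); m502[310:380, 520:590] = True
predicted = reset_nonoverlapping_area(img.copy(), img, m501, m502)
```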
  • the image generation module calculates the gray value of the first image and the gray value of the predicted image.
• The grayscale value of the first image is the average value YA of the grayscale values of all pixels in the first image, and the grayscale value of the predicted image is the average value YB of the grayscale values of all pixels in the predicted image.
  • S406 Adjust the exposure parameter of the camera according to the grayscale value of the first image and the grayscale value of the predicted image.
• The image generation module calculates the grayscale value YA of the first image minus the grayscale value YB of the predicted image to obtain the grayscale difference (YA-YB), and determines the grayscale change rate according to the grayscale difference and the grayscale value of the first image.
• If the grayscale change rate is not less than the first threshold, it means that at the second moment the camera device will be illuminated over a large area and strongly affected; therefore, the exposure time of the camera device is reduced.
• If the grayscale change rate is not greater than the second threshold, it means that at the second moment the imaging device will enter a dark environment; therefore, the exposure time of the imaging device is increased.
• Here, the first threshold is a positive number and the second threshold is a negative number.
• The amount by which the exposure duration is increased or decreased can be determined from the current vehicle speed and the grayscale change rate, from which the specific exposure adjustment time is obtained.
  • the embodiments of the present application further provide an exposure parameter adjustment device 600, which can be applied to the vehicle shown in FIG. 1 to implement the exposure parameter adjustment methods provided by the above embodiments.
  • the apparatus may include:
  • a first image acquisition unit 601 configured to acquire a first image captured by a camera at a first moment
  • a target light source position determination unit 602 configured to determine a target light source and obtain first position information of the target light source, wherein the first position information is used to indicate the position of the target light source in the first image, or indicating the physical location of the target light source at the first moment;
  • a position prediction unit 603, configured to predict, according to the first position information, the second position information of the target light source in the second image to be captured by the camera at the second moment; wherein the second moment is located at the after the first moment;
  • a predicted image generation unit 604 configured to generate a predicted image of the second image according to the first image and the second position information
• The exposure parameter adjustment unit 605 is configured to calculate the grayscale value of the first image and the grayscale value of the predicted image, and to adjust the exposure parameters of the camera device according to the grayscale value of the first image and the grayscale value of the predicted image.
  • the exposure parameter adjustment device 600 further includes:
  • the target light source determination unit 606 is configured to obtain brightness information of at least one light source before determining the target light source, wherein the brightness information of any light source at the first moment is used to represent the brightness of the light source at the first moment ;
  • the target light source whose brightness exceeds a set brightness threshold is determined among the at least one light source.
• The target light source determination unit 606 is further configured to: determine, according to the grayscale value of the region where the at least one light source is located in the first image, the brightness information of the at least one light source at the first moment; or receive the brightness information of the at least one light source at the first moment sent by other devices.
  • the first position information is used to indicate the position of the target light source in the first image
• The position prediction unit 603 is further configured to: acquire the relative speed between the target light source and the camera device;
  • the second position information is determined according to the time difference between the first moment and the second moment, the first position information, and the relative speed.
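• A minimal sketch of this determination, assuming the relative speed is expressed in pixels per second (the linear-displacement form and all values are illustrative assumptions):

```python
# Pixel position at the second moment = first-moment pixel position plus the
# relative speed (in pixels per second) times the time difference.
def predict_pixel_position(p1_uv, rel_speed_px_per_s, t1, t2):
    dt = t2 - t1
    return (p1_uv[0] + rel_speed_px_per_s[0] * dt,
            p1_uv[1] + rel_speed_px_per_s[1] * dt)

p2 = predict_pixel_position((640, 360), (-120.0, 15.0), 0.0, 0.1)  # -> (628.0, 361.5)
```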
  • the first position information is used to indicate the physical position of the target light source at the first moment
• The position prediction unit 603 is further configured to: obtain the moving speed of the target light source;
  • the second position information is determined according to the physical position of the target light source at the second moment and the physical position of the imaging device at the second moment.
• The position prediction unit 603 is further configured to: determine, according to the physical position of the target light source at the second moment and the physical position of the camera device at the second moment, third position information of the target light source in the camera coordinate system at the second moment; wherein the camera coordinate system is a coordinate system centered on the camera device;
  • the third position information is converted into the second position information.
• The position prediction unit 603 is further configured to: acquire at least one third image captured by the camera device before the first moment, and determine the moving speed of the target light source according to the first image and the third image; or
  • the sensor data sent by the sensor is received, and the movement speed of the target light source is determined according to the sensor data.
  • the predicted image generation unit 604 is further configured to: determine an initialization image of the predicted image, wherein the initialization image is the same as the first image;
• The predicted image is obtained by adjusting the grayscale values of the pixels included in the grayscale adjustment region in the initialization image to a set grayscale value, wherein the set grayscale value is the grayscale value of the pixels contained in the area indicated by the first position information in the first image.
  • the grayscale value of the first image is an average value of the grayscale values of all pixels in the first image; the grayscale value of the predicted image is all pixels in the predicted image The average value of the gray value of the point.
• The exposure parameter adjustment unit 605 is further configured to: calculate the grayscale difference between the grayscale value of the first image and the grayscale value of the predicted image, and adjust the exposure parameter of the camera device according to the grayscale difference.
• In a possible implementation, the exposure parameters include the exposure duration, and the exposure parameter adjustment unit 605 is further configured to: reduce the exposure duration of the camera device when the grayscale difference is not less than a first difference, and increase the exposure duration when the grayscale difference is not greater than a second difference; wherein the first difference is a positive number and the second difference is a negative number.
• In another possible implementation, the exposure parameters include the exposure duration, and the exposure parameter adjustment unit 605 is further configured to: reduce the exposure duration of the camera device when the grayscale change rate is not less than a first threshold, and increase the exposure duration when the grayscale change rate is not greater than a second threshold; wherein the first threshold is a positive number and the second threshold is a negative number.
• Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist physically alone, or two or more units may be integrated into one unit.
  • the above-mentioned integrated units may be implemented in the form of hardware, or may be implemented in the form of software functional units.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium.
• The technical solutions of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application.
• The aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a mobile hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
• The embodiments of the present application also provide an exposure parameter adjustment device, which can be applied to the vehicle shown in FIG. 1 to implement the above exposure parameter adjustment method, and which has the functions of the device shown in FIG. 6.
  • the apparatus 700 includes: a communication module 701 , a processor 702 , and a memory 703 .
  • the communication module 701 , the memory 703 and the processor 702 are connected to each other.
  • the communication module 701, the memory 703 and the processor 702 can be connected to each other through a bus;
• The bus can be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like.
• The bus can be divided into an address bus, a data bus, a control bus, and the like. For ease of presentation, only one thick line is shown in FIG. 7, but this does not mean that there is only one bus or one type of bus.
  • the communication module 701 is used to communicate with other devices.
  • the communication module 701 may include a communication interface and a wireless communication module.
  • the communication interface is used to communicate with other components in the vehicle.
  • the in-vehicle device may acquire various data from components such as a sensing device, an operating device, and the like through the communication interface.
  • the wireless communication module may include: a Bluetooth module, a WiFi module, an RF circuit, and the like.
  • the processor 702 is configured to implement the exposure parameter adjustment method provided by the embodiment shown in FIG. 2 .
  • the processor 702 may be a central processing unit (central processing unit, CPU), or other hardware chips.
  • the above-mentioned hardware chip may be an application-specific integrated circuit (ASIC), a programmable logic device (PLD) or a combination thereof.
  • the above-mentioned PLD can be a complex programmable logic device (CPLD), a field-programmable gate array (FPGA), a general-purpose array logic (generic array logic, GAL) or any combination thereof.
  • the memory 703 is used to store program instructions, data, and the like.
  • the program instructions may include program code, which includes instructions for computer operation.
  • the memory 703 may include random access memory (RAM), and may also include non-volatile memory (non-volatile memory), such as at least one disk storage.
  • the processor 702 executes the program stored in the memory 703, and implements the above functions through the above components, thereby finally realizing the exposure parameter adjustment method provided by the above embodiments.
  • the memory 703 in FIG. 7 of the present application may be a volatile memory or a non-volatile memory, or may include both volatile and non-volatile memory.
• The non-volatile memory may be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), or a flash memory.
  • Volatile memory may be Random Access Memory (RAM), which acts as an external cache.
• By way of example but not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate synchronous DRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct Rambus RAM (DR RAM).
  • the embodiments of the present application further provide a computer program, which, when the computer program runs on a computer, causes the computer to execute the exposure parameter adjustment method provided by the above embodiments.
• The embodiments of the present application further provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a computer, the computer executes the exposure parameter adjustment method provided by the above embodiments.
  • the storage medium may be any available medium that the computer can access.
• By way of example, computer-readable media may include RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage media or other magnetic storage devices, or any other medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
  • the embodiments of the present application further provide a chip, which is used for reading a computer program stored in a memory to implement the exposure parameter adjustment methods provided by the above embodiments.
  • the embodiments of the present application provide a chip system, where the chip system includes a processor for supporting a computer device to implement the functions involved in the service equipment, forwarding equipment, or site equipment in the above embodiments.
  • the chip system further includes a memory for storing necessary programs and data of the computer device.
  • the chip system may be composed of chips, or may include chips and other discrete devices.
• The exposure parameter adjustment method and device provided by the present application solve the problem in the prior art that, in scenes with drastic light changes, the camera device cannot be adjusted to appropriate exposure parameters in time, so that the adjustment of the exposure parameters lags and the quality of the images captured during the adjustment is relatively poor.
• The present application acquires the first image captured by the camera device at the first moment, generates a predicted image of the second image to be captured at the second moment, and adjusts the exposure parameters of the camera device according to the grayscale value of the first image and the grayscale value of the predicted image. Further, the second position information of the target light source at the second moment can be predicted from its first position information at the first moment, so that the predicted image can be generated according to the second position information.
  • the present application can determine the influence of the position of the target light source on the imaging device by predicting the position of the target light source.
• In scenes where the light changes drastically, the exposure parameters can be adjusted in time to avoid overexposure or underexposure and to obtain a clear image, ensuring driving safety.
  • the embodiments of the present application may be provided as a method, a system, or a computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having computer-usable program code embodied therein.
• These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means that implement the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to an exposure parameter adjustment method and apparatus. The method comprises: acquiring a first image captured by a camera device; determining a target light source, acquiring first position information of the target light source, and predicting, according to the first position information of the target light source, second position information of the target light source in a second image to be captured by the camera device at a second moment; and generating a predicted image of the second image according to the first image and the second position information, and adjusting an exposure parameter of the camera device by means of a grayscale value of the first image and a grayscale value of the predicted image. According to this method, the impact of the position of the target light source on the camera device is determined by predicting the position of the target light source. In a scenario in which light changes drastically, the exposure parameter can be adjusted in a timely manner so as to obtain a clear image.
PCT/CN2021/117598 2020-11-18 2021-09-10 Procédé et appareil d'ajustement de paramètre d'exposition WO2022105381A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202011291411.2A CN114520880B (zh) 2020-11-18 2020-11-18 一种曝光参数调节方法及装置
CN202011291411.2 2020-11-18

Publications (1)

Publication Number Publication Date
WO2022105381A1 true WO2022105381A1 (fr) 2022-05-27

Family

ID=81595463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/117598 WO2022105381A1 (fr) 2020-11-18 2021-09-10 Procédé et appareil d'ajustement de paramètre d'exposition

Country Status (2)

Country Link
CN (1) CN114520880B (fr)
WO (1) WO2022105381A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115060723A (zh) * 2022-05-31 2022-09-16 浙江大学高端装备研究院 一种金属温度场测量装置和测量方法
CN115225820A (zh) * 2022-07-28 2022-10-21 东集技术股份有限公司 拍摄参数自动调整方法、装置、存储介质及工业相机
CN115633259A (zh) * 2022-11-15 2023-01-20 深圳市泰迅数码有限公司 基于人工智能的智能摄像头自动调控方法及系统
CN116503369A (zh) * 2023-05-06 2023-07-28 北京思莫特科技有限公司 结构体的变形监测方法、图像曝光参数调整方法
CN117333483A (zh) * 2023-11-30 2024-01-02 中科慧远视觉技术(洛阳)有限公司 一种用于金属凹陷结构底部的缺陷检测方法及装置
CN117761338A (zh) * 2023-12-26 2024-03-26 创材深造(苏州)科技有限公司 一种高通量力学测试系统、方法、存储介质及电子设备
CN117939751A (zh) * 2024-03-25 2024-04-26 济宁医学院附属医院 一种紫外线的灯光控制系统

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115657404B (zh) * 2022-12-12 2023-04-07 合肥安迅精密技术有限公司 改善相机光源线性度的调光方法及系统、存储介质

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1914115A2 (fr) * 2006-10-18 2008-04-23 Schefenacker Vision Systems Germany GmbH Système de phare pour véhicules, de préférence pour véhicules automobiles
CN103196550A (zh) * 2012-01-09 2013-07-10 西安智意能电子科技有限公司 一种对发射光源的成像信息进行筛选处理的方法与设备
CN109624666A (zh) * 2018-12-26 2019-04-16 侯力宇 一种汽车智能防炫目方法及系统
CN111246091A (zh) * 2020-01-16 2020-06-05 北京迈格威科技有限公司 一种动态自动曝光控制方法和装置及电子设备
CN111448529A (zh) * 2017-12-12 2020-07-24 索尼公司 信息处理装置、移动物体、控制系统、信息处理方法以及程序
CN111460865A (zh) * 2019-01-22 2020-07-28 阿里巴巴集团控股有限公司 辅助驾驶方法、辅助驾驶系统、计算设备及存储介质

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005300855A (ja) * 2004-04-09 2005-10-27 Fuji Xerox Co Ltd 画像形成装置およびその階調処理方法
JP2008070611A (ja) * 2006-09-14 2008-03-27 Casio Comput Co Ltd 撮像装置、露出条件調整方法及びプログラム
JP4812846B2 (ja) * 2009-02-19 2011-11-09 アキュートロジック株式会社 撮像装置及び撮像方法
JP5337905B2 (ja) * 2011-12-26 2013-11-06 公益財団法人日本交通管理技術協会 速度計測システム、速度計測方法及びプログラム
CN108156369B (zh) * 2017-12-06 2020-03-13 Oppo广东移动通信有限公司 图像处理方法和装置
CN110753178B (zh) * 2018-07-24 2021-03-12 杭州海康威视数字技术股份有限公司 一种曝光时间调整方法及装置、摄像机

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1914115A2 (fr) * 2006-10-18 2008-04-23 Schefenacker Vision Systems Germany GmbH Système de phare pour véhicules, de préférence pour véhicules automobiles
CN103196550A (zh) * 2012-01-09 2013-07-10 西安智意能电子科技有限公司 一种对发射光源的成像信息进行筛选处理的方法与设备
CN111448529A (zh) * 2017-12-12 2020-07-24 索尼公司 信息处理装置、移动物体、控制系统、信息处理方法以及程序
CN109624666A (zh) * 2018-12-26 2019-04-16 侯力宇 一种汽车智能防炫目方法及系统
CN111460865A (zh) * 2019-01-22 2020-07-28 阿里巴巴集团控股有限公司 辅助驾驶方法、辅助驾驶系统、计算设备及存储介质
CN111246091A (zh) * 2020-01-16 2020-06-05 北京迈格威科技有限公司 一种动态自动曝光控制方法和装置及电子设备

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115060723A (zh) * 2022-05-31 2022-09-16 浙江大学高端装备研究院 一种金属温度场测量装置和测量方法
CN115225820A (zh) * 2022-07-28 2022-10-21 东集技术股份有限公司 拍摄参数自动调整方法、装置、存储介质及工业相机
CN115225820B (zh) * 2022-07-28 2023-05-26 东集技术股份有限公司 拍摄参数自动调整方法、装置、存储介质及工业相机
CN115633259A (zh) * 2022-11-15 2023-01-20 深圳市泰迅数码有限公司 基于人工智能的智能摄像头自动调控方法及系统
CN116503369A (zh) * 2023-05-06 2023-07-28 北京思莫特科技有限公司 结构体的变形监测方法、图像曝光参数调整方法
CN116503369B (zh) * 2023-05-06 2024-01-26 北京思莫特科技有限公司 结构体的变形监测方法、图像曝光参数调整方法
CN117333483A (zh) * 2023-11-30 2024-01-02 中科慧远视觉技术(洛阳)有限公司 一种用于金属凹陷结构底部的缺陷检测方法及装置
CN117761338A (zh) * 2023-12-26 2024-03-26 创材深造(苏州)科技有限公司 一种高通量力学测试系统、方法、存储介质及电子设备
CN117939751A (zh) * 2024-03-25 2024-04-26 济宁医学院附属医院 一种紫外线的灯光控制系统
CN117939751B (zh) * 2024-03-25 2024-06-04 济宁医学院附属医院 一种紫外线的灯光控制系统

Also Published As

Publication number Publication date
CN114520880B (zh) 2023-04-18
CN114520880A (zh) 2022-05-20

Similar Documents

Publication Publication Date Title
WO2022105381A1 (fr) Procédé et appareil d'ajustement de paramètre d'exposition
JP6930613B2 (ja) 車載カメラ・システム、並びに画像処理装置及び画像処理方法
TWI703064B (zh) 用於在不良照明狀況下定位運輸工具的系統和方法
US11748620B2 (en) Generating ground truth for machine learning from time series elements
KR102554643B1 (ko) 동적 범위를 확장하기 위한 다수의 동작 모드들
US20220107651A1 (en) Predicting three-dimensional features for autonomous driving
CN113490863B (zh) 雷达辅助的单个图像三维深度重建
EP3367303A1 (fr) Procédé de traitement d'images de conduite autonome et appareil correspondant
US11970156B1 (en) Parking assistance using a stereo camera and an added light source
CN113212498B (zh) 车间距离测量方法、车间距离测量装置、电子设备、计算机程序以及计算机可读记录介质
US11715180B1 (en) Emirror adaptable stitching
US10872419B2 (en) Method and apparatus for evaluating a vehicle travel surface
US20190041038A1 (en) Dynamic control of vehicle lamps during maneuvers
US10699376B1 (en) eMirror with 3-in-1 stitching by non-rectilinear warping of camera views
US20210064913A1 (en) Driving assistant system, electronic device, and operation method thereof
JPWO2019194256A1 (ja) 演算処理装置、オブジェクト識別システム、学習方法、自動車、車両用灯具
WO2016194296A1 (fr) Système de caméra embarquée dans un véhicule, et appareil de traitement d'image
CN110293973B (zh) 驾驶支援系统
JP2018041209A (ja) 物体認識装置、モデル情報生成装置、物体認識方法、および物体認識プログラム
US11182623B2 (en) Flexible hardware design for camera calibration and image pre-procesing in autonomous driving vehicles
CN117082332A (zh) 使用雷达的相机设置的自动配置
US11435191B2 (en) Method and device for determining a highly precise position and for operating an automated vehicle
CN116419072A (zh) 车辆摄像头动态
KR20210094475A (ko) 차량 영상 기반의 차간 거리 측정 방법, 차간 거리 측정 장치, 전자 기기, 컴퓨터 프로그램 및 컴퓨터 판독 가능한 기록 매체
US20230394844A1 (en) System for avoiding accidents caused by wild animals crossing at dusk and at night

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21893532

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21893532

Country of ref document: EP

Kind code of ref document: A1