WO2022099482A1 - Exposure control method and apparatus, movable platform, and computer-readable storage medium


Info

Publication number
WO2022099482A1
Authority
WO
WIPO (PCT)
Prior art keywords
exposure
target
shutter sensor
rolling shutter
sensing unit
Application number
PCT/CN2020/127905
Other languages
English (en)
Chinese (zh)
Inventor
杜劼熹
周游
彭梦龙
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/127905
Publication of WO2022099482A1

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 — Circuitry for compensating brightness variation in the scene
    • H04N25/00 — Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50 — Control of the SSIS exposure
    • H04N25/53 — Control of the integration time

Definitions

  • the present application relates to the technical field of exposure control, and in particular, to an exposure control method, a device, a movable platform, and a computer-readable storage medium.
  • the movable platform can perceive the surrounding environment through multiple vision sensors.
  • The vision sensor may use a global shutter or a rolling shutter. With a global shutter, the entire image is exposed at the same time; with a rolling shutter, the image is exposed line by line, so the exposure time of each line of the image differs. Because the exposure control methods of the global shutter and the rolling shutter differ, an image captured by a vision sensor using the global shutter and an image captured by a vision sensor using the rolling shutter are not exposed synchronously, which affects image quality and results in a poor user experience.
  • embodiments of the present application provide an exposure control method, device, movable platform, and computer-readable storage medium, which aim to ensure exposure synchronization between a rolling shutter sensor and a global shutter sensor.
  • An embodiment of the present application provides an exposure control method applied to an electronic device. The electronic device includes a rolling shutter sensor and a global shutter sensor; the rolling shutter sensor includes a plurality of sensing units that are sequentially exposed in a preset order to collect image data, and the global shutter sensor includes a plurality of sensing units that are exposed synchronously to collect image data. The method includes:
  • determining a target sensing unit from the plurality of sensing units of the rolling shutter sensor; determining, according to an exposure time parameter of the rolling shutter sensor, exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor; and triggering the global shutter sensor to perform exposure according to the exposure time information to obtain a target image.
  • An embodiment of the present application further provides an image acquisition device. The image acquisition device includes a rolling shutter sensor, a global shutter sensor, a processor, and a memory; the rolling shutter sensor includes a plurality of sensing units that are sequentially exposed in a preset order to collect image data, and the global shutter sensor includes a plurality of sensing units that are exposed synchronously to collect image data;
  • the memory is used to store a computer program;
  • the processor is configured to execute the computer program and implement the following steps when executing the computer program:
  • determining a target sensing unit from the plurality of sensing units of the rolling shutter sensor; determining, according to an exposure time parameter of the rolling shutter sensor, exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor; and triggering the global shutter sensor to perform exposure according to the exposure time information to obtain a target image.
  • An embodiment of the present application further provides a movable platform, wherein the movable platform includes:
  • a platform body;
  • a power system arranged on the platform body, for providing moving power for the movable platform; and
  • the above-mentioned image acquisition device, provided on the platform body, for acquiring the target image and for controlling the movement of the movable platform.
  • An embodiment of the present application further provides a computer-readable storage medium that stores a computer program. When the computer program is executed by a processor, the processor implements the steps of the above-mentioned exposure control method.
  • Embodiments of the present application provide an exposure control method, an apparatus, a movable platform, and a computer-readable storage medium. A target sensing unit is determined from the plurality of sensing units of a rolling shutter sensor; the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor is determined according to the exposure time parameters of the sensing units; and finally the global shutter sensor is triggered to perform exposure according to that exposure time information to obtain a target image. This ensures that the exposures of the rolling shutter sensor and the global shutter sensor are synchronized, facilitating subsequent fusion of the image obtained by the rolling shutter sensor and the image obtained by the global shutter sensor.
  • FIG. 1 is a schematic structural diagram of an electronic device implementing the exposure control method provided by the embodiment of the present application.
  • FIG. 2 is a schematic flowchart of steps of an exposure control method provided by an embodiment of the present application.
  • FIG. 3 is a schematic flow chart of sub-steps of the exposure control method in FIG. 2;
  • FIG. 4 is a schematic diagram of the relationship between the exposure time of the rolling shutter sensor and the image line in the embodiment of the present application;
  • FIG. 5 is a schematic diagram of a scene in which a target sensing unit is determined based on a target exposure time of a candidate sensing unit and multiple work start times of an inertial measurement unit in an embodiment of the present application;
  • FIG. 6 is a schematic diagram of another scenario in which the target sensing unit is determined based on the target exposure time of the candidate sensing unit and multiple work start times of the inertial measurement unit in the embodiment of the present application;
  • FIG. 7 is a schematic diagram of another scenario in which the target sensing unit is determined based on the target exposure time of the candidate sensing unit and the multiple work start times of the inertial measurement unit in the embodiment of the present application;
  • FIG. 8 is a schematic diagram of a scene in which a target exposure time difference is determined based on an exposure start time in an embodiment of the present application
  • FIG. 9 is a schematic diagram of another scenario in which the target exposure time difference is determined based on the exposure start time in an embodiment of the present application.
  • FIG. 10 is a schematic diagram of another scene in which the target exposure time difference is determined based on the exposure start time in the embodiment of the present application;
  • FIG. 11 is a schematic block diagram of the structure of an image acquisition device provided by an embodiment of the present application.
  • FIG. 12 is a schematic structural block diagram of a movable platform provided by an embodiment of the present application.
  • The embodiments of the present application provide an exposure control method, a device, a movable platform, and a computer-readable storage medium. By determining a target sensing unit from the plurality of sensing units of a rolling shutter sensor, determining the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor according to the exposure time parameters of its sensing units, and finally triggering the global shutter sensor to perform exposure according to that exposure time information to obtain the target image, the exposures of the rolling shutter sensor and the global shutter sensor can be kept synchronized, which facilitates subsequent fusion of the image obtained by the rolling shutter sensor and the image obtained by the global shutter sensor.
  • FIG. 1 is a schematic structural diagram of an electronic device implementing the exposure control method provided by the embodiment of the present application.
  • the electronic device 100 includes a device body 110 , a rolling shutter sensor 120 provided on the device body 110 and a global shutter sensor 130 provided on the device body 110 .
  • The rolling shutter sensor 120 includes a plurality of sensing units that are sequentially exposed in a preset order to collect image data; the global shutter sensor 130 includes a plurality of sensing units that are exposed synchronously to collect image data.
  • In some embodiments, the electronic device 100 further includes a power system 140 disposed on the device body 110, and the power system 140 provides power for moving the electronic device 100. The electronic device 100 may be a movable platform or a handheld device; movable platforms include drones, unmanned vehicles, and unmanned ships, and handheld devices include cameras, smartphones, and tablets.
  • one or more of the power systems 140 in the horizontal direction may rotate in a clockwise direction
  • one or more of the power systems 140 in the horizontal direction may rotate in a counterclockwise direction.
  • the rotational rate of each horizontally oriented power system 140 can be varied independently to control the lift and/or thrust produced by each power system 140, so as to adjust the spatial orientation, speed, and/or acceleration of the electronic device 100 (e.g., rotation and translation with respect to up to three degrees of freedom).
  • the power system 140 enables the electronic device 100 (a drone) to take off vertically from the ground, or to land vertically on the ground, without any horizontal movement of the electronic device 100 (e.g., without taxiing on a runway).
  • the power system 140 may allow the electronic device 100 (a drone) to hover in the air at a preset position and/or orientation.
  • One or more of the power systems 140 may be controlled independently of the other power systems 140.
  • one or more power systems 140 may be controlled simultaneously.
  • the electronic device 100 (an unmanned aerial vehicle) may have multiple horizontally oriented power systems 140 to provide lift and/or thrust while tracking a target.
  • the horizontally oriented power system 140 can be actuated to provide the electronic device 100 (drone) with the ability to take off vertically, land vertically, and hover.
  • the electronic device 100 may also include a sensing system, which may include one or more sensors to sense the spatial orientation, velocity, and/or acceleration of the electronic device 100 (e.g., rotation and translation with respect to up to three degrees of freedom), as well as its angular acceleration, attitude, position (absolute or relative), and the like.
  • the one or more sensors include GPS sensors, motion sensors, inertial sensors, proximity sensors, or image sensors.
  • the sensing system may also be used to collect data on the environment in which the electronic device 100 is located, such as climatic conditions, potential obstacles to be approached, locations of geographic features, locations of man-made structures, and the like.
  • the electronic device 100 may further include a processor (not shown in FIG. 1 ), and the processor is configured to process the input control instruction, or send and receive signals, and the like.
  • the processor may be provided inside the device body 110 .
  • the processor may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like.
  • a general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
  • the processor can control the exposure of the rolling shutter sensor 120 and the global shutter sensor 130 to capture image data.
  • Specifically, a target sensing unit is determined from the multiple sensing units of the rolling shutter sensor 120; according to the exposure time parameter of the rolling shutter sensor 120, the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor 120 is determined; and finally, according to this exposure time information, the global shutter sensor 130 is triggered to perform exposure to obtain a target image, thereby ensuring that the exposures of the rolling shutter sensor 120 and the global shutter sensor 130 are synchronized.
  • the electronic device in FIG. 1 and the above naming of the components of the electronic device are only for the purpose of identification, and therefore do not limit the embodiments of the present application.
  • the exposure control method provided by the embodiments of the present application will be described in detail with reference to the electronic device in FIG. 1 .
  • the electronic device in FIG. 1 is only used to explain the exposure control method provided by the embodiment of the present application, but does not constitute a limitation on the application scenario of the exposure control method provided by the embodiment of the present application.
  • FIG. 2 is a schematic flowchart of steps of an exposure control method provided by an embodiment of the present application.
  • the exposure control method can be applied to electronic equipment for controlling the exposure of the rolling shutter sensor and the global shutter sensor to collect image data, so as to ensure the exposure synchronization of the rolling shutter sensor and the global shutter sensor.
  • the exposure control method includes steps S101 to S103.
  • Step S101: determine a target sensing unit from a plurality of sensing units of the rolling shutter sensor;
  • Step S102: determine, according to the exposure time parameter of the rolling shutter sensor, the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor;
  • Step S103: trigger the global shutter sensor to perform exposure according to the exposure time information to obtain a target image.
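Steps S101 to S103 can be sketched as code. This is a hypothetical illustration, not the patent's implementation: the linear row-timing model, the center-row selection rule, and every name here (`RollingShutterTiming`, `select_target_row`, `schedule_global_shutter`) are assumptions made for illustration.

```python
# Hypothetical sketch of steps S101-S103; all names and the linear
# row-timing model are assumptions, not the patent's literal method.
from dataclasses import dataclass

@dataclass
class RollingShutterTiming:
    t_blank: float  # exposure waiting time before the first row starts
    dt_row: float   # exposure time difference between adjacent rows
    t_exp: float    # exposure duration of each row (sensing unit)

    def start_time(self, row: int) -> float:
        # Rows are exposed sequentially: row n (1-based) starts after the
        # waiting time plus (n - 1) row-to-row offsets.
        return self.t_blank + (row - 1) * self.dt_row

    def center_time(self, row: int) -> float:
        return self.start_time(row) + self.t_exp / 2.0

def select_target_row(bbox_rows: range) -> int:
    # S101: pick the sensing unit (image row) at the center of the
    # target object's first pixel area.
    return (bbox_rows.start + bbox_rows.stop - 1) // 2

def schedule_global_shutter(timing: RollingShutterTiming, bbox_rows: range) -> float:
    # S102: compute the target row's exposure center time; S103 would then
    # trigger the global shutter so its exposure is centered on that instant.
    return timing.center_time(select_target_row(bbox_rows))

timing = RollingShutterTiming(t_blank=1.0, dt_row=0.02, t_exp=0.5)
print(schedule_global_shutter(timing, range(100, 141)))  # target rows 100..140
```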
  • the rolling shutter sensor includes multiple sensing units, the multiple sensing units of the rolling shutter sensor are sequentially exposed in a preset order to collect image data, the global shutter sensor includes multiple sensing units, The multiple sensing units of the global shutter sensor are exposed simultaneously to collect image data.
  • In the related art, the rolling shutter sensor and the global shutter sensor are simply controlled to start exposure at the same time to capture images. However, because the sensing units of the rolling shutter sensor are exposed sequentially in a preset order while the sensing units of the global shutter sensor are exposed synchronously, the exposure of local image regions is not synchronized, which introduces errors in the subsequent fusion of the image data and makes it impossible to accurately locate and track the target.
  • Therefore, in this application, the target sensing unit is determined from the plurality of sensing units of the rolling shutter sensor, the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor is determined according to the exposure time parameter of the rolling shutter sensor, and the global shutter sensor is then triggered to perform exposure according to that information to obtain the target image. This ensures that the exposure of the corresponding local images in the image data collected by the rolling shutter sensor and the global shutter sensor is synchronized, facilitating accurate subsequent fusion and improving the positioning and tracking performance of the electronic device.
  • step S101 may include: sub-steps S1011 to S1012.
  • Sub-step S1011: acquire the first pixel area of the target object in the image coordinate system where the rolling shutter sensor is located.
  • The target object may be determined in the image captured by the rolling shutter sensor, in the image captured by the global shutter sensor, or from the sampling data of other sensors. For example, one or more sampling points belonging to a specific target are identified in the point cloud data obtained by a radar, the spatial position of the target object is determined from the positions of these sampling points, and the first pixel area of the target object in the image coordinate system where the rolling shutter sensor is located is then determined from that spatial position.
  • the target object includes a salient target in the image collected by the rolling shutter sensor, a specific target, a tracking target currently being tracked, a tracking target re-selected by the user in the image collected by the rolling shutter sensor, etc.
  • The category of the specific target is located in the preset category library, and the category of the salient target is different from the category of the specific target.
  • the categories in the preset category library include categories of objects that can be recognized by the object detection algorithm, such as pedestrians, vehicles, and ships.
  • The salient target is determined according to the saliency of an object in the image captured by the rolling shutter sensor: when the saliency of the object in the captured image is greater than or equal to a preset saliency, the object is determined to be a salient target; when the saliency of the object in the captured image is less than the preset saliency, the object is determined not to be a salient target.
  • The preset saliency may be set based on the actual situation, which is not specifically limited in this embodiment of the present application.
  • The saliency of an object in the image captured by the rolling shutter sensor may be determined according to the duration for which the object stays at a preset position in the image, and/or according to the saliency value between the image area where the object is located and the adjacent image area. It can be understood that the longer the object stays at the preset position in the image, the more salient it is in the image, and the shorter it stays, the less salient it is.
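The two saliency cues just described (dwell time at a preset position, and contrast with the adjacent image area) could be combined as in the following sketch. The weights and the preset threshold are invented values for illustration, not values from the application.

```python
# Illustrative saliency sketch; the weight (10.0) and the preset
# threshold (50.0) are arbitrary, not from the application.
def saliency_score(dwell_frames: int, region_mean: float, neighbor_mean: float) -> float:
    # Longer dwell at a preset image position and stronger contrast with
    # the adjacent image area both raise the score.
    contrast = abs(region_mean - neighbor_mean)
    return dwell_frames + 10.0 * contrast

def is_salient(score: float, preset_saliency: float = 50.0) -> bool:
    # Greater than or equal to the preset saliency -> salient target.
    return score >= preset_saliency

score = saliency_score(dwell_frames=45, region_mean=0.75, neighbor_mean=0.25)
print(score, is_salient(score))  # 50.0 True
```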
  • Specifically, the movement speed of the target object is obtained, and the historical three-dimensional position of the target object at the previous moment is obtained; the target three-dimensional position of the target object at the current moment is predicted according to the movement speed and the historical three-dimensional position; and the first pixel area of the target object in the image coordinate system where the rolling shutter sensor is located is determined according to the target three-dimensional position.
  • In this way, the first pixel area of the target object at the current moment in the image coordinate system where the rolling shutter sensor is located can be predicted, so that the target sensing unit can subsequently be determined within the predicted first pixel area. This ensures exposure synchronization of the local images, facilitating accurate subsequent fusion and improving the positioning and tracking performance of the electronic device.
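Determining the first pixel area from a predicted three-dimensional position amounts to projecting the object into the rolling-shutter image. The sketch below assumes a pinhole camera with made-up intrinsics (`fy`, `cy`) and a known physical half-height for the object; none of these values come from the application.

```python
# Hedged sketch: project the predicted 3D position into the image to get
# the row range of the first pixel area. Intrinsics are assumed values.
def project_row(y_m: float, z_m: float, fy: float = 500.0, cy: float = 240.0) -> int:
    # Pinhole projection of the vertical coordinate: v = fy * Y / Z + cy.
    return round(fy * y_m / z_m + cy)

def predict_pixel_rows(pos, half_height_m: float):
    # Map the object's predicted 3D position and vertical extent to the
    # range of image rows covered by its first pixel area.
    _, y, z = pos
    top = project_row(y - half_height_m, z)
    bottom = project_row(y + half_height_m, z)
    return range(top, bottom + 1)

rows = predict_pixel_rows((0.0, 0.1, 5.0), half_height_m=0.5)
print(rows.start, rows.stop - 1)  # 200 300
```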
  • In some embodiments, the first pixel area of the target object at the next moment in the image coordinate system where the rolling shutter sensor is located may be predicted as follows: obtain the three-dimensional position of the target object at the current moment; predict the target three-dimensional position of the target object at the next moment; and determine, according to the target three-dimensional position, the first pixel area of the target object at the next moment in the image coordinate system where the rolling shutter sensor is located.
  • In this way, the first pixel area of the target object at the next moment in the image coordinate system where the rolling shutter sensor is located can be predicted, the target sensing unit can be determined according to the predicted first pixel area, and the exposure of the global shutter sensor at the next moment can then be scheduled according to the exposure time information of the target sensing unit. This ensures that the exposure of the corresponding local images in the image data collected by the rolling shutter sensor and the global shutter sensor is synchronized, facilitating accurate subsequent fusion and improving the positioning and tracking performance of the electronic device.
  • The movement speed of the target object may be obtained as follows: obtain the three-dimensional position coordinates of the target object at different times, and determine the movement speed from those coordinates and the interval between adjacent times. After the three-dimensional position coordinates at different times are obtained, a Kalman filter algorithm may be used to determine the movement speed from them, thereby improving the accuracy of the movement speed and facilitating the subsequent accurate prediction of the target three-dimensional position of the target object at the current moment.
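A minimal constant-velocity version of this estimate can be written as follows. The application suggests a Kalman filter for accuracy; this finite-difference sketch only illustrates the data flow and is not the described algorithm.

```python
# Minimal constant-velocity sketch; a real implementation would use a
# Kalman filter as the text notes.
def estimate_velocity(p_prev, p_curr, dt: float):
    # Per-axis displacement divided by the interval between adjacent times.
    return tuple((c - p) / dt for p, c in zip(p_prev, p_curr))

def predict_position(p_hist, velocity, dt: float):
    # Historical 3D position at the previous moment plus velocity * interval
    # gives the predicted target 3D position at the current moment.
    return tuple(p + v * dt for p, v in zip(p_hist, velocity))

v = estimate_velocity((0.0, 0.0, 10.0), (0.5, 0.0, 9.0), dt=0.5)
print(v)                                             # (1.0, 0.0, -2.0)
print(predict_position((0.5, 0.0, 9.0), v, dt=0.5))  # (1.0, 0.0, 8.0)
```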
  • The three-dimensional position of the target object at the current moment may be obtained as follows: obtain a first image and a second image collected by the global shutter sensors at the current moment; determine, from the first image and the second image, matching pairs of feature points corresponding to multiple spatial points on the target object; determine the depth information of the target object according to the multiple matching pairs of feature points; and determine the three-dimensional position of the target object at the current moment according to the depth information.
  • the global shutter sensor includes a first global shutter sensor and a second global shutter sensor, the first image is an image collected by exposure of the first global shutter sensor, and the second image is an image collected by exposure of the second global shutter sensor.
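The depth computation from matched feature-point pairs can be sketched with the standard rectified-stereo relation Z = f·B/d. The focal length and baseline below are assumed values, and averaging per-match depths is just one simple choice, not necessarily the application's method.

```python
# Stereo triangulation sketch; focal length and baseline are assumed.
def depth_from_disparity(u_left: float, u_right: float,
                         focal_px: float = 700.0, baseline_m: float = 0.12) -> float:
    # Standard rectified-stereo relation: Z = f * B / d.
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("matched pair must have positive disparity")
    return focal_px * baseline_m / disparity

def object_depth(matches) -> float:
    # One depth value for the target from several feature-point pairs.
    depths = [depth_from_disparity(ul, ur) for ul, ur in matches]
    return sum(depths) / len(depths)

print(object_depth([(320.0, 306.0), (330.0, 316.0)]))  # both disparities are 14 px
```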
  • Sub-step S1012: determine a target sensing unit from a plurality of sensing units of the rolling shutter sensor according to the first pixel area.
  • Specifically, each sensing unit of the rolling shutter sensor is sequentially exposed in a preset order, yielding the relationship between image lines and exposure times shown in FIG. 4. One image line corresponds to one second pixel area, and one second pixel area corresponds to one sensing unit of the rolling shutter sensor.
  • the first pixel area 10 corresponding to the target object includes a second pixel area 11 , a second pixel area 12 and a second pixel area 13 .
  • In some embodiments, the target sensing unit may be determined from the multiple sensing units of the rolling shutter sensor as follows: acquire the second pixel areas located in the first pixel area, and determine a sensing unit corresponding to a second pixel area located in the first pixel area as the target sensing unit.
  • For example, the first pixel area 10 corresponding to the target object includes a second pixel area 11, a second pixel area 12, and a second pixel area 13; therefore, the sensing unit corresponding to the second pixel area 11, the second pixel area 12, or the second pixel area 13 is determined as the target sensing unit.
  • In some embodiments, the target sensing unit may also be determined from the multiple sensing units of the rolling shutter sensor as follows: obtain the pixel sub-area corresponding to the center position of the first pixel area, and determine the sensing unit corresponding to the second pixel area that overlaps this pixel sub-area as the target sensing unit. For example, the first pixel area 10 corresponding to the target object includes a second pixel area 11, a second pixel area 12, and a second pixel area 13, and the pixel sub-area corresponding to the center position of the first pixel area overlaps the second pixel area 12; therefore, the sensing unit corresponding to the second pixel area 12 is determined as the target sensing unit.
  • In some embodiments, the target sensing unit may be determined as follows: according to the first pixel area, determine a plurality of candidate sensing units from the multiple sensing units of the rolling shutter sensor; obtain the target exposure time of each candidate sensing unit and the sampling time of the inertial measurement unit of the electronic device; and determine the target sensing unit from the plurality of candidate sensing units according to the sampling time and the target exposure time of each candidate sensing unit.
  • In this way, the target sensing unit can be accurately determined so that the exposure time of the selected target sensing unit is synchronized with the sampling time of the inertial measurement unit, which facilitates the subsequent fusion of the data collected by the inertial measurement unit, the image data collected by the rolling shutter sensor, and the image data collected by the global shutter sensor.
  • The plurality of candidate sensing units may be determined from the multiple sensing units of the rolling shutter sensor as follows: acquire the second pixel areas located in the first pixel area, and determine each sensing unit corresponding to a second pixel area located in the first pixel area as a candidate sensing unit.
  • the pixel area corresponding to each candidate sensing unit is located in the first pixel area.
  • For example, the first pixel area 10 corresponding to the target object includes a second pixel area 11, a second pixel area 12, and a second pixel area 13; therefore, the sensing units corresponding to the second pixel area 11, the second pixel area 12, and the second pixel area 13 are determined as candidate sensing units, yielding three candidate sensing units.
  • In some embodiments, the target sensing unit may be determined from the plurality of candidate sensing units as follows: determine multiple work start times of the inertial measurement unit according to the sampling time, where adjacent work start times are separated by the sampling interval; then determine the target sensing unit from the multiple candidate sensing units according to the target exposure time of each candidate sensing unit and the multiple work start times.
  • In this way, the target sensing unit can be accurately determined through the target exposure time of each candidate sensing unit and the multiple work start times of the inertial measurement unit, so that the exposure time of the selected target sensing unit is synchronized with the sampling time of the inertial measurement unit, which facilitates fusing the data collected by the inertial measurement unit, the image data collected by the rolling shutter sensor, and the image data collected by the global shutter sensor.
  • The target exposure time of a candidate sensing unit includes any one of the exposure start time, exposure end time, and exposure center time of the candidate sensing unit, and the time difference between the target exposure time of the target sensing unit and a work start time is smaller than a preset time difference.
  • the preset time difference value can be set based on the actual situation, which is not specifically limited in this embodiment of the present application.
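The selection rule above (pick the candidate whose target exposure time lands closest to an inertial-measurement-unit work start time) could look like the following sketch. The function names and the example times are invented for illustration.

```python
# Hedged sketch of candidate-to-IMU matching; names and example times
# are invented, not from the application.
def imu_work_start_times(t0: float, sampling_dt: float, count: int):
    # Adjacent work start times are separated by the sampling interval.
    return [t0 + k * sampling_dt for k in range(count)]

def pick_target_unit(candidate_times: dict, work_starts) -> int:
    # candidate_times maps row index -> target exposure time (start,
    # center, or end time, depending on the embodiment in use).
    def gap(row: int) -> float:
        t = candidate_times[row]
        return min(abs(t - w) for w in work_starts)
    return min(candidate_times, key=gap)

ticks = imu_work_start_times(0.0, 0.005, 10)
candidates = {11: 0.0132, 12: 0.0150, 13: 0.0168}  # exposure center times
print(pick_target_unit(candidates, ticks))  # row 12 coincides with a tick
```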
  • For example, as shown in FIG. 5, the target exposure time of each candidate sensing unit is its exposure start time, and the multiple work start times of the inertial measurement unit include the work start time 31, the work start time 32, and so on. The exposure start times of the candidate sensing units corresponding to the second pixel area 11, the second pixel area 12, and the second pixel area 13 are the exposure start time 21, the exposure start time 22, and the exposure start time 23, respectively. Through comparison, the difference between the exposure start time 21 and the work start time 32 is found to be zero; therefore, the candidate sensing unit corresponding to the second pixel area 11 is determined as the target sensing unit.
  • For another example, as shown in FIG. 6, the target exposure time of each candidate sensing unit is its exposure center time, and the multiple work start times of the inertial measurement unit include the work start time 41, the work start time 42, and so on. The exposure center times of the candidate sensing units corresponding to the second pixel area 11, the second pixel area 12, and the second pixel area 13 are the exposure center time 24, the exposure center time 25, and the exposure center time 26, respectively. The difference between the exposure center time 25 and the work start time 45 is zero; therefore, the candidate sensing unit corresponding to the second pixel area 12 is determined as the target sensing unit.
  • For another example, the target exposure time of a candidate sensing unit is the exposure end time of that candidate sensing unit. The multiple work start times of the inertial measurement unit are work start time 51, work start time 52, work start time 53, work start time 54, work start time 55, and work start time 56, and the exposure end times of the candidate sensing units corresponding to the second pixel area 11, the second pixel area 12, and the second pixel area 13 are exposure end time 27, exposure end time 28, and exposure end time 29, respectively. The difference between exposure end time 29 and work start time 55 is zero; therefore, the candidate sensing unit corresponding to the second pixel area 13 is determined as the target sensing unit.
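The selection rule in the examples above — compare each candidate sensing unit's target exposure time against the inertial measurement unit's work start times and accept the unit whose difference falls below the preset threshold — can be sketched as follows. This is a minimal illustration: the function name, dictionary layout, threshold value, and concrete times are assumptions, not taken from the patent text.

```python
# Hypothetical sketch of the selection rule: among the candidate sensing
# units, pick the one whose target exposure time (start, center, or end)
# lies closest to any IMU work start time, provided the difference stays
# below the preset time difference threshold.

def select_target_unit(candidate_times, imu_start_times, max_diff):
    """candidate_times: {unit_id: target exposure time (s)};
    imu_start_times: work start times of the IMU (s);
    max_diff: preset time difference threshold (s)."""
    best_unit, best_diff = None, None
    for unit_id, t_exp in candidate_times.items():
        # Distance from this unit's exposure time to the nearest IMU sample.
        diff = min(abs(t_exp - t_imu) for t_imu in imu_start_times)
        if diff < max_diff and (best_diff is None or diff < best_diff):
            best_unit, best_diff = unit_id, diff
    return best_unit

# Unit 12's exposure time coincides with an IMU work start time, so it wins.
target = select_target_unit({11: 0.010, 12: 0.020, 13: 0.030},
                            [0.005, 0.020, 0.035], max_diff=0.002)
# target == 12
```

If no candidate falls within the threshold, the sketch returns `None`, which corresponds to the case where no sensing unit can be synchronized with the IMU sampling times.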
  • The exposure time parameters of the rolling shutter sensor include the exposure waiting time of the rolling shutter sensor, the exposure order of the target sensing unit, the exposure time difference between adjacent sensing units, and the exposure duration of each sensing unit of the rolling shutter sensor; the exposure time information of the target sensing unit includes any one of the exposure start time, the exposure center time, and the exposure end time.
  • The exposure start time of the target sensing unit is determined according to the exposure waiting time, the exposure order, and the exposure time difference; or, the exposure center time or the exposure end time of the target sensing unit is determined according to the exposure waiting time, the exposure order, the exposure time difference, and the exposure duration.
  • Let the exposure waiting time of the rolling shutter sensor be T blank, the exposure order of the target sensing unit be n, the exposure time difference between adjacent sensing units be ΔT, and the exposure duration of each sensing unit of the rolling shutter sensor be T lexp.
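Using the symbols just introduced, the exposure time information can be written out directly. A minimal sketch, assuming a 1-based exposure order and that unit n starts exposing after n−1 inter-unit offsets (the exact offset convention is not fixed by the text above):

```python
# Hypothetical sketch: derive the exposure start/center/end times of the
# target sensing unit from the rolling shutter parameters T_blank, n, dT,
# and T_lexp. Times are relative to the sensor's exposure trigger.

def exposure_times(t_blank, n, dt, t_lexp):
    """t_blank: exposure waiting time; n: exposure order (1-based);
    dt: exposure time difference between adjacent sensing units;
    t_lexp: exposure duration of each sensing unit."""
    start = t_blank + (n - 1) * dt   # unit n begins after n-1 unit offsets
    center = start + t_lexp / 2.0    # midpoint of the unit's exposure
    end = start + t_lexp             # unit's exposure finishes here
    return start, center, end

start, center, end = exposure_times(t_blank=0.001, n=3, dt=0.0001, t_lexp=0.004)
# start ≈ 0.0012 s, center ≈ 0.0032 s, end ≈ 0.0052 s
```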
  • Triggering the global shutter sensor to perform exposure according to the exposure time information to obtain the target image may be: determining a target exposure time difference between the global shutter sensor and the rolling shutter sensor according to the exposure time information, and triggering the global shutter sensor to perform exposure according to the target exposure time difference to obtain the target image. That is, after the rolling shutter sensor is controlled to start exposure, once the target exposure time difference has elapsed, the global shutter sensor is controlled to start exposure to collect image data.
  • By spacing the two exposures by the target exposure time difference and controlling the global shutter sensor to start exposure according to the exposure parameters of the rolling shutter sensor, it can be further ensured that the local images in the image data collected by the rolling shutter sensor exposure and by the global shutter sensor exposure are synchronized, which facilitates subsequent accurate fusion and thereby improves the positioning and tracking effects of the electronic device.
  • The method of determining the target exposure time difference between the global shutter sensor and the rolling shutter sensor may be: determining the target exposure time difference according to the exposure start time of the target sensing unit.
  • the method of determining the target exposure time difference between the global shutter sensor and the rolling shutter sensor may be: according to the exposure center moment of the target sensing unit and the exposure duration of the global shutter sensor, determine the target exposure time difference.
  • Exemplarily, as shown in FIG. 9, the exposure duration of the global shutter sensor is T vexp, the sensing unit corresponding to the second pixel area 12 is the target sensing unit, and the target exposure time difference between the global shutter sensor and the rolling shutter sensor is determined such that the exposure center time of the global shutter sensor coincides with the exposure center time of the target sensing unit.
  • the method of determining the target exposure time difference between the global shutter sensor and the rolling shutter sensor may be: according to the exposure end time of the target sensing unit and the exposure duration of the global shutter sensor, determine the target exposure time difference.
  • Exemplarily, as shown in the figure, the exposure duration of the global shutter sensor is T vexp and the sensing unit corresponding to the second pixel area 11 is the target sensing unit; the target exposure time difference between the global shutter sensor and the rolling shutter sensor is then determined from the exposure end time of the target sensing unit and T vexp.
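The three ways of determining the target exposure time difference described above can be summarized in one sketch. Times are taken relative to the moment the rolling shutter sensor starts exposing; the function name and the alignment convention (matching the global shutter's exposure start, center, or end to that of the target sensing unit) are illustrative assumptions.

```python
# Hypothetical sketch: delay, after the rolling shutter starts exposing, at
# which the global shutter sensor should be triggered so that its exposure
# start/center/end lines up with that of the target sensing unit.

def target_exposure_time_diff(mode, t_start, t_center, t_end, t_vexp):
    """t_start/t_center/t_end: exposure times of the target sensing unit,
    relative to the rolling shutter's exposure start; t_vexp: exposure
    duration of the global shutter sensor."""
    if mode == "start":
        return t_start                    # align the two exposure starts
    if mode == "center":
        return t_center - t_vexp / 2.0    # align the two exposure centers
    if mode == "end":
        return t_end - t_vexp             # align the two exposure ends
    raise ValueError(f"unknown mode: {mode}")

delay = target_exposure_time_diff("center", t_start=0.0012, t_center=0.0032,
                                  t_end=0.0052, t_vexp=0.002)
# delay ≈ 0.0022 s: trigger the global shutter about 2.2 ms after the
# rolling shutter starts, so both exposure centers coincide.
```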
  • In another embodiment, the target sensing unit is determined from a plurality of sensing units of the rolling shutter sensor; according to the exposure time parameters of the rolling shutter sensor, the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor is determined; and according to the exposure time information, the candidate image closest in time is selected from multiple candidate images as the target image.
  • The multiple candidate images are collected by the global shutter sensor; that is, the global shutter sensor is controlled to start exposure at the same time as the rolling shutter sensor is controlled to start exposure, so as to collect the multiple candidate images.
  • By using the exposure time information of the target sensing unit to select the candidate image closest in time from the multiple candidate images as the target image, the exposure synchronization of the image data collected by the rolling shutter sensor and the global shutter sensor can be ensured, which facilitates subsequent accurate fusion and thereby improves the positioning and tracking effects of the electronic device.
  • The exposure time information includes any one of the exposure start time, the exposure center time, and the exposure end time of the target sensing unit, and the candidate image closest in time is selected from the multiple candidate images according to the exposure time information. Specifically, the candidate image closest to the exposure start time, the exposure center time, or the exposure end time can be selected from the plurality of candidate images as the target image. For example, the exposure start time, exposure center time, and exposure end time of the target sensing unit are t1, t2, and t3, respectively; those of candidate image 1 are t1-1, t1-2, and t1-3; those of candidate image 2 are t2-1, t2-2, and t2-3; and those of candidate image 3 are t3-1, t3-2, and t3-3. If t1-1 among t1-1, t2-1, and t3-1 is closest to the exposure start time t1 of the target sensing unit, candidate image 1 is determined as the target image; if t2-2 among t1-2, t2-2, and t3-2 is closest to the exposure center time t2 of the target sensing unit, candidate image 2 is determined as the target image.
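The selection step in this example reduces to a nearest-in-time search over the candidate images. A minimal sketch with illustrative identifiers and times (not taken from the patent text):

```python
# Hypothetical sketch: pick the candidate image whose exposure time (start,
# center, or end -- whichever kind the exposure time information carries)
# is closest to the corresponding time of the target sensing unit.

def closest_candidate(target_time, candidate_times):
    """candidate_times: {image_id: exposure time of that candidate image}."""
    return min(candidate_times,
               key=lambda image_id: abs(candidate_times[image_id] - target_time))

# Candidate image 2 is nearest the target unit's exposure center time.
image = closest_candidate(0.020, {1: 0.011, 2: 0.019, 3: 0.033})
# image == 2
```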
  • In the embodiments of the present application, the target sensing unit is determined from the plurality of sensing units of the rolling shutter sensor; according to the exposure time parameters of the rolling shutter sensor, the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor is determined; and finally the global shutter sensor is triggered to perform exposure according to the exposure time information to obtain the target image. This ensures exposure synchronization of the local images in the image data collected by the rolling shutter sensor exposure and the global shutter sensor exposure, which facilitates subsequent accurate fusion and improves the positioning and tracking of electronic devices.
  • FIG. 11 is a schematic structural block diagram of an image acquisition apparatus provided by an embodiment of the present application.
  • the image acquisition device 200 includes a rolling shutter sensor 201 , a global shutter sensor 202 , a processor 203 and a memory 204 , and the rolling shutter sensor 201 , the global shutter sensor 202 , the processor 203 and the memory 204 are connected through a bus 205 , the bus 205 is, for example, an I2C (Inter-integrated Circuit) bus.
  • The rolling shutter sensor 201 includes multiple sensing units, and the multiple sensing units of the rolling shutter sensor 201 are sequentially exposed in a preset order to collect image data; the global shutter sensor 202 also includes multiple sensing units, and the multiple sensing units of the global shutter sensor 202 are exposed simultaneously to collect image data.
  • the processor 203 may be a micro-controller unit (Micro-controller Unit, MCU), a central processing unit (Central Processing Unit, CPU), or a digital signal processor (Digital Signal Processor, DSP) or the like.
  • The memory 204 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
  • the processor 203 is used for running the computer program stored in the memory 204, and implements the following steps when executing the computer program:
  • the global shutter sensor is triggered to perform exposure according to the exposure time information to obtain a target image.
  • When the processor determines a target sensing unit from a plurality of sensing units of the rolling shutter sensor, the processor is configured to implement:
  • a target sensing unit is determined from a plurality of sensing units of the rolling shutter sensor according to the first pixel area.
  • When the processor determines a target sensing unit from a plurality of sensing units of the rolling shutter sensor according to the first pixel area, the processor is configured to implement:
  • a target sensing unit is determined from a plurality of sensing units of the rolling shutter sensor according to the first pixel area and each of the second pixel areas.
  • When the processor determines a target sensing unit from a plurality of sensing units of the rolling shutter sensor according to the first pixel area and each of the second pixel areas, the processor is configured to implement:
  • a sensing unit corresponding to the second pixel area overlapping with the pixel sub-area is determined as the target sensing unit.
  • When the processor determines a target sensing unit from a plurality of sensing units of the rolling shutter sensor according to the first pixel area, the processor is configured to implement:
  • a target sensing unit is determined from the plurality of candidate sensing units according to the sampling time and the target exposure time of each of the candidate sensing units.
  • the target exposure time includes any one of an exposure start time, an exposure end time, and an exposure center time.
  • When the processor determines a target sensing unit from the plurality of candidate sensing units according to the sampling time and the target exposure time of each candidate sensing unit, the processor is configured to implement:
  • a target sensing unit is determined from the plurality of candidate sensing units according to the target exposure time and the plurality of work start times.
  • the time difference between the target exposure time of the target sensing unit and one of the working start times is less than a preset time difference.
  • the pixel area corresponding to each of the candidate sensing units is located in the first pixel area.
  • When the processor acquires the first pixel area of the target object in the image coordinate system where the rolling shutter sensor is located, the processor is configured to implement:
  • a first pixel area of the target object in the image coordinate system where the rolling shutter sensor is located is determined.
  • The exposure time parameters include the exposure waiting time of the rolling shutter sensor, the exposure order of the target sensing unit, the exposure time difference between adjacent sensing units, and the exposure duration of each sensing unit. When the processor determines the exposure time information of the target sensing unit during the exposure process of the rolling shutter sensor according to the exposure time parameters of the rolling shutter sensor, the processor is configured to implement:
  • The exposure center time or the exposure end time of the target sensing unit is determined according to the exposure waiting time, the exposure order, the exposure time difference, and the exposure duration.
  • When the processor triggers the global shutter sensor to perform exposure according to the exposure time information to obtain a target image, the processor is configured to implement:
  • the global shutter sensor is triggered to perform exposure to obtain a target image.
  • When the processor determines the target exposure time difference between the global shutter sensor and the rolling shutter sensor according to the exposure time information, the processor is configured to implement:
  • a target exposure time difference between the global shutter sensor and the rolling shutter sensor is determined according to the exposure end time of the target sensing unit and the exposure duration of the global shutter sensor.
  • FIG. 12 is a schematic structural block diagram of a movable platform provided by an embodiment of the present application.
  • The movable platform 300 includes a platform body 310, a power system 320 provided on the platform body 310, and an image acquisition device 330 provided on the platform body 310. The power system 320 is used to provide the movable platform 300 with power for movement, and the image acquisition device 330 is used for acquiring the target image and for controlling the movable platform 300 to move.
  • The movable platform 300 includes, for example, an unmanned aerial vehicle, an unmanned ship, or an unmanned vehicle.
  • Embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored in the computer-readable storage medium. The computer program includes program instructions, and a processor executes the program instructions to implement the steps of the exposure control method provided by the above embodiments.
  • the computer-readable storage medium may be an internal storage unit of the electronic device described in any of the foregoing embodiments, such as a hard disk or a memory of the electronic device.
  • The computer-readable storage medium may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (SMC), a secure digital (SD) card, or a flash card equipped on the electronic device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The method comprises the steps of: determining a target sensing unit among a plurality of sensing units of a rolling shutter sensor (S101); determining exposure time information of the target sensing unit during an exposure process of the rolling shutter sensor (S102); and triggering, according to the exposure time information, a global shutter sensor to perform exposure (S103). By means of the present invention, exposure synchronization of a rolling shutter sensor and a global shutter sensor can be ensured.
PCT/CN2020/127905 2020-11-10 2020-11-10 Exposure control method and apparatus, movable platform, and computer-readable storage medium WO2022099482A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/127905 WO2022099482A1 (fr) Exposure control method and apparatus, movable platform, and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/127905 WO2022099482A1 (fr) Exposure control method and apparatus, movable platform, and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2022099482A1 true WO2022099482A1 (fr) 2022-05-19

Family

ID=81601916

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/127905 WO2022099482A1 (fr) Exposure control method and apparatus, movable platform, and computer-readable storage medium

Country Status (1)

Country Link
WO (1) WO2022099482A1 (fr)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080278610A1 (en) * 2007-05-11 2008-11-13 Micron Technology, Inc. Configurable pixel array system and method
CN105940390A (zh) * 2013-12-30 2016-09-14 Google Technology Holdings LLC Method and system for synchronizing data received from multiple sensors of a device
CN106385538A (zh) * 2015-06-30 2017-02-08 Parrot Drones High-resolution camera unit for a drone, with correction of wobble-type distortions
CN106454044A (zh) * 2016-10-25 2017-02-22 Zhejiang Uniview Technologies Co., Ltd. Strobe fill-light device and method
CN109922260A (zh) * 2019-03-04 2019-06-21 Shanghai Institute of Microsystem and Information Technology, Chinese Academy of Sciences Data synchronization method and device for images and inertial sensors
CN110198415A (zh) * 2019-05-26 2019-09-03 Momenta (Suzhou) Technology Co., Ltd. Method and device for determining an image timestamp
CN110771153A (zh) * 2017-06-21 2020-02-07 Conti Temic microelectronic GmbH Camera system with different shutter modes


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115134492A (zh) * 2022-05-31 2022-09-30 Beijing Jihao Technology Co., Ltd. Image acquisition method, electronic device, and computer-readable medium
CN115134492B (zh) * 2022-05-31 2024-03-19 Beijing Jiguang Zhixin Technology Co., Ltd. Image acquisition method, electronic device, and computer-readable medium
CN115988331A (zh) * 2022-12-09 2023-04-18 Zhejiang Huaruijie Technology Co., Ltd. Exposure control method, apparatus, device, and medium

Similar Documents

Publication Publication Date Title
WO2020224375A1 (fr) Positioning method, apparatus and device, and computer-readable storage medium
US20210012520A1 (en) Distance measuring method and device
CN109215433B (zh) Vision-based driving scenario generator for autonomous driving simulation
US20210279444A1 (en) Systems and methods for depth map sampling
JP6754856B2 (ja) Sensor aggregation framework for autonomous driving vehicles
US10339387B2 (en) Automated multiple target detection and tracking system
US9846043B2 (en) Map creation apparatus, map creation method, and computer-readable recording medium
US11057604B2 (en) Image processing method and device
US11537149B2 (en) Route generation device, moving body, and program
JP2018523865A (ja) Information processing method, device, and terminal
CN110096053A (zh) Driving trajectory generation method, system, and machine-readable medium for autonomous vehicles
WO2018120350A1 (fr) Unmanned aerial vehicle positioning method and device
US20210365038A1 (en) Local sensing based autonomous navigation, and associated systems and methods
WO2022099482A1 (fr) Exposure control method and apparatus, movable platform, and computer-readable storage medium
WO2020119567A1 (fr) Data processing method, apparatus and device, and machine-readable medium
CN111812698A (zh) Positioning method, apparatus, medium, and device
JP2016157197A (ja) Self-position estimation device, self-position estimation method, and program
WO2019161663A1 (fr) Port area monitoring method and system, and central control system
WO2020024134A1 (fr) Track switching method and device
CN108416044B (zh) Scene thumbnail generation method and apparatus, electronic device, and storage medium
CN115556769A (zh) Obstacle state quantity determination method and apparatus, electronic device, and medium
JP2020064029A (ja) Moving body control device
JP7179687B2 (ja) Obstacle detection device
WO2024036984A1 (fr) Target positioning method and related system, and storage medium
WO2022133911A1 (fr) Target detection method and apparatus, movable platform, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20961041

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20961041

Country of ref document: EP

Kind code of ref document: A1