WO2021212278A1 - Data processing method and apparatus, movable platform, and wearable device - Google Patents


Publication number
WO2021212278A1
Authority
WO
WIPO (PCT)
Application number
PCT/CN2020/085640
Other languages
English (en)
French (fr)
Inventor
刘洁
周游
徐彬
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/085640 priority Critical patent/WO2021212278A1/zh
Publication of WO2021212278A1 publication Critical patent/WO2021212278A1/zh

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 — Image analysis
    • G06T 7/70 — Determining position or orientation of objects or cameras
    • G06T 7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Definitions

  • The invention belongs to the technical field of visual detection, and particularly relates to a data processing method, a data processing device, a movable platform, and a wearable device.
  • In the related art, the initial external parameters preset for the vision module are usually used directly for processing. However, under the influence of the temperature of the actual use environment, the shape of, and the stress transmitted through, the connections between the various components in the vision module will change. As a result, the initial external parameters set beforehand become inaccurate, which degrades the processing accuracy when information is processed based on those external parameters.
  • The present invention provides a data processing method, a device, a movable platform and a wearable device, so as to solve the problem that the initial external parameters of the vision module are not accurate enough under the influence of temperature, which in turn leads to relatively low processing accuracy when information processing is performed based on the initial external parameters.
  • the present invention is implemented as follows:
  • an embodiment of the present invention provides a data processing method, the method including:
  • the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • an embodiment of the present invention provides a data processing device, the data processing device includes a computer-readable storage medium and a processor; the processor is configured to perform the following operations:
  • the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • an embodiment of the present invention provides a movable platform that includes a vision module and the above-mentioned data processing device; the data processing device is configured to perform the following operations:
  • the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • an embodiment of the present invention provides a wearable device, wherein the wearable device includes a vision module and the aforementioned data processing device; the data processing device is configured to perform the following operations:
  • the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • an embodiment of the present invention provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the following operations are implemented:
  • the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • In the embodiment of the present invention, the current temperature of the vision module can be obtained first, and the external parameter compensation amount corresponding to the current temperature can be determined according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts; then the initial external parameters of the vision module are obtained, and finally the initial external parameters are compensated according to the external parameter compensation amount to obtain the target external parameters of the vision module.
  • the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • In this way, the initial external parameters are compensated according to the current temperature, so the problem of insufficient accuracy of the external parameters caused by temperature can be corrected to a certain extent. The external parameters can therefore more accurately represent the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, which in turn ensures the processing accuracy when the external parameters are subsequently used for information processing.
  • FIG. 1 is a flow chart of the steps of a data processing method provided by an embodiment of the present invention.
  • FIG. 2A is a flow chart of the steps of another data processing method provided by an embodiment of the present invention.
  • FIG. 2B is a schematic diagram of the camera coordinate systems of a dual vision sensor.
  • FIG. 2C is a schematic diagram of an angle according to an embodiment of the present invention.
  • FIG. 2D is a schematic diagram of a corresponding curve provided by an embodiment of the present invention.
  • FIG. 2E is a schematic diagram of another corresponding curve provided by an embodiment of the present invention.
  • FIG. 3 is a block diagram of a data processing device provided by an embodiment of the present invention.
  • FIG. 4 is a schematic diagram of the hardware structure of a device for implementing various embodiments of the present invention.
  • FIG. 5 is a block diagram of a computing processing device provided by an embodiment of the present invention.
  • FIG. 6 is a block diagram of a portable or fixed storage unit provided by an embodiment of the present invention.
  • FIG. 1 is a flow chart of the steps of a data processing method provided by an embodiment of the present invention. As shown in FIG. 1, the method may include:
  • Step 101 Obtain the current temperature of the vision module.
  • the data processing method provided by the embodiment of the present invention can be applied to a processor.
  • the processor can be a processor included in the vision module, or a processor included in a device equipped with the vision module, which is not limited in the embodiment of the present invention.
  • the current temperature may be the temperature of the vision module at the current moment. The temperature will cause changes in the shape and stress transmission of the connections between the various components in the vision module, which in turn causes the initial external parameters previously set to be inaccurate. Therefore, in this step, the current temperature of the vision module can be obtained first, so that the subsequent steps can compensate the initial external parameters based on the current temperature.
  • Specifically, the current temperature can be obtained by detecting the vision module with a temperature sensor.
  • Step 102 Determine the corresponding external parameter compensation amount at the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts.
  • the corresponding relationship may be pre-generated and stored in the processor. The corresponding relationship may represent the external parameter compensation amounts corresponding to different temperatures, and each external parameter compensation amount may be determined based on the actual external parameter at that temperature.
  • Specifically, the corresponding relationship can be searched according to the current temperature to obtain the external parameter compensation amount corresponding to the current temperature.
  • Step 103 Obtain initial external parameters of the vision module.
  • the initial external parameters may be calibrated and stored parameters when the vision module leaves the factory, or may be manually calibrated and stored by the user during use. Specifically, when the initial external parameter is acquired, it can be directly read from the storage area where the initial external parameter is stored.
  • Step 104 Compensate the initial external parameter according to the external parameter compensation amount to obtain the target external parameter of the vision module.
  • the initial external parameters are not accurate enough for the vision module at the current temperature. Therefore, in this step, the initial external parameter can be compensated with reference to the external parameter compensation amount at the current temperature to obtain the target external parameter of the vision module.
  • The target external parameters obtained by compensating the initial external parameters with the external parameter compensation amount corresponding to the current temperature are better adapted to the vision module to a certain extent, which corrects the problem that inaccurate external parameters would otherwise lower the precision of subsequent information processing based on them.
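As an illustrative sketch (not taken from the patent), steps 101 to 104 can be expressed in Python; the correspondence table, its numeric values, and the use of linear interpolation between sample points are all assumptions made for this example:

```python
# Sketch of steps 101-104: look up the compensation amount for the
# current temperature, then add it to the initial extrinsic parameters.
# The table values below are hypothetical, purely for illustration.
import numpy as np

# Hypothetical correspondence: temperature (deg C) -> (yaw, roll, pitch)
# compensation in radians, with the -10 C entry as the reference (zero).
SAMPLE_TEMPS = np.array([-10.0, 0.0, 20.0, 40.0, 50.0])
SAMPLE_COMP = np.array([
    [0.000, 0.000, 0.000],
    [0.001, 0.000, 0.001],
    [0.002, 0.001, 0.002],
    [0.004, 0.002, 0.003],
    [0.005, 0.002, 0.004],
])

def target_extrinsics(current_temp, initial_extrinsics):
    """Step 102: interpolate the compensation amount at current_temp;
    step 104: add it to the initial extrinsic angles."""
    comp = np.array([np.interp(current_temp, SAMPLE_TEMPS, SAMPLE_COMP[:, i])
                     for i in range(3)])
    return initial_extrinsics + comp

initial = np.array([0.10, 0.00, 0.05])   # factory-calibrated yaw/roll/pitch
target = target_extrinsics(30.0, initial)
```

A lookup with interpolation is only one reading of "searching the corresponding relationship"; a table of discrete entries or a fitted curve (as described later for Step D) would serve equally well.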
  • the vision module in the embodiment of the present invention may include two vision sensors. Accordingly, the external parameter in the embodiment of the present invention may be used to characterize the relative pose relationship between the two vision sensors. It should be noted that when the vision module includes two vision sensors, the vision module may also include a pose sensor.
  • In this case, the data processing method provided in the embodiment of the present invention may also be used to compensate and calibrate the external parameters between the vision sensor and the pose sensor in the vision module, which is not limited in the embodiment of the present invention.
  • the vision module may include a vision sensor and a pose sensor.
  • the external parameters may be used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • The number of vision sensors in the vision module can be one or more. When the number of vision sensors is more than one, the data processing method provided in the embodiment of the present invention can also be used to compensate and calibrate the external parameters between the vision sensors among the multiple vision sensors, which is not limited in the embodiment of the present invention.
  • In summary, the data processing method provided by the embodiment of the present invention first obtains the current temperature of the vision module, and determines the external parameter compensation amount corresponding to the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts; then the initial external parameter of the vision module is obtained, and finally the initial external parameter is compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module.
  • the vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • In this way, the initial external parameters are compensated according to the current temperature, so the problem of insufficient accuracy of the external parameters caused by temperature can be corrected to a certain extent. The external parameters can therefore more accurately represent the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, which in turn ensures the processing accuracy when the external parameters are subsequently used for information processing.
  • FIG. 2A is a flow chart of the steps of another data processing method provided by an embodiment of the present invention. As shown in FIG. 2A, the method may include:
  • Step 201 Obtain the current temperature of the vision module.
  • the vision module may include two vision sensors.
  • the external parameter in the embodiment of the present invention may be used to characterize the relative pose relationship between the two vision sensors.
  • the vision module may also include a vision sensor and a pose sensor.
  • the external parameter may be used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • the relative pose relationship between the two vision sensors may be used to represent the relative change relationship between the camera coordinate systems adopted by the two vision sensors.
  • the relative pose relationship between the vision sensor and the pose sensor may be used to indicate the relative change relationship between the camera coordinate system adopted by the vision sensor and the coordinate system adopted by the pose sensor.
  • the pose sensor may be an inertial measurement unit (IMU), and the relative pose relationship may include relative rotation and/or relative translation.
  • By setting the external parameter as the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, the compensated external parameters can more accurately represent the relative pose relationship between these components, which in turn improves the accuracy of fusing, based on the external parameters, the information collected by these components.
  • The camera coordinate system, also called the optical center coordinate system, is a coordinate system with the optical center as the coordinate origin, the horizontal and vertical directions of the imaging plane as the X axis and the Y axis, and the optical axis as the Z axis.
  • The vision sensor, that is, the camera, can convert 3D points in the world coordinate system of the real world into points in the camera coordinate system through the camera external parameters.
  • The world coordinate system is the absolute coordinate system of the objective three-dimensional world, also known as the objective coordinate system. It serves as a reference coordinate system for describing the position of the camera, as well as the position of any other object placed in this three-dimensional environment.
  • the conversion can be realized by the following formula:

  z · [u, v, 1]^T = K · (R · [x_w, y_w, z_w]^T + T)

  • where [u, v, 1]^T represents a 2D point in the image coordinate system, [x_w, y_w, z_w]^T represents a point in the world coordinate system, and [x, y, z]^T represents the corresponding point in the camera coordinate system.
  • The matrix K is called the camera calibration matrix, that is, the internal parameters of the camera (Intrinsic Parameters). The internal parameters describe the correspondence between three-dimensional rays and two-dimensional pixel coordinates; their accuracy determines the accuracy of the transformation from two-dimensional pixel coordinate information to three-dimensional ray information.
  • [c_x, c_y]^T represents the optical center, usually near the center of the image; f_x and f_y represent the focal length, in pixels; k_1, k_2, k_3, k_4, k_5 and k_6 represent radial distortion, and p_1 and p_2 represent tangential distortion.
  • The matrix R is a rotation matrix (Rotation Matrix), and the matrix T is a translation matrix (Translation Matrix). R and T are the camera's external parameters (Extrinsic Parameters), used to express the rotation and displacement transformation from the world coordinate system to the camera coordinate system in three-dimensional space.
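As a minimal sketch of the world-to-pixel conversion described above, assuming the standard pinhole relation z·[u, v, 1]^T = K·(R·X_w + T) and ignoring lens distortion; all numeric values are illustrative, not from the patent:

```python
# Project a 3D world point to pixel coordinates with intrinsics K and
# extrinsics R, T. The values below are made up for demonstration.
import numpy as np

K = np.array([[500.0,   0.0, 320.0],   # f_x, 0,   c_x
              [  0.0, 500.0, 240.0],   # 0,   f_y, c_y
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # world frame aligned with camera
T = np.array([0.0, 0.0, 0.0])          # no translation, for simplicity

X_w = np.array([0.2, -0.1, 2.0])       # 3D point in the world frame
X_c = R @ X_w + T                      # world -> camera coordinates
uv1 = K @ X_c / X_c[2]                 # perspective division by depth z
u, v = uv1[0], uv1[1]                  # resulting pixel coordinates
```

With these numbers the point lands at pixel (370, 215): f_x·x/z + c_x = 500·0.2/2 + 320 and f_y·y/z + c_y = 500·(−0.1)/2 + 240.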
  • FIG. 2B is a schematic diagram of the camera coordinate system of a dual vision sensor.
  • the relative pose relationship between the two vision sensors can be represented by R and T.
  • R and T represent the rotation and displacement transformation from the three-dimensional coordinate system with O_l as the origin to the three-dimensional coordinate system with O_r as the origin.
  • Through R and T, the coordinate information of the point P detected by the left vision sensor, in the coordinate system formed by the X_l, Y_l, and Z_l axes, can be converted into the coordinate information of P in the coordinate system formed by the X_r, Y_r, and Z_r axes of the right vision sensor. The coordinate information in the X_l-Y_l-Z_l coordinate system is mapped and transformed to obtain the two-dimensional image point P_l of point P in the left vision sensor, and the coordinate information in the X_r-Y_r-Z_r coordinate system is mapped and transformed to obtain the two-dimensional image point P_r of point P in the right vision sensor.
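A minimal sketch of this left-to-right frame transformation (the extrinsic values here are assumed purely for illustration):

```python
# Map a point P from the left camera frame (origin O_l) to the right
# camera frame (origin O_r) via the stereo extrinsics: X_r = R @ X_l + T.
import numpy as np

R = np.eye(3)                        # assumed parallel stereo rig
T = np.array([-0.1, 0.0, 0.0])       # assumed 0.1 m horizontal baseline

X_l = np.array([0.05, 0.02, 1.5])    # P in the left camera frame
X_r = R @ X_l + T                    # P in the right camera frame
```

A temperature-induced change in R or T therefore shifts X_r directly, which is why compensating the extrinsics matters for stereo processing.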
  • the vision module can be mounted on a movable platform, and the current temperature is determined according to the ambient temperature of the environment where the movable platform is currently located and/or the body temperature of the movable platform.
  • Since the vision module is mounted on the movable platform, its temperature is often related to the ambient temperature of the environment where the movable platform is located. In this way, the ambient temperature of the environment where the movable platform is currently located and the body temperature of the movable platform can represent the temperature of the vision module to a certain extent. For example, the movable platform may be a drone, and the vision module may be embedded in the body of the drone; the temperature of the vision module is then often close to the ambient temperature of the drone's current environment and the temperature of the drone's body. Therefore, in the embodiment of the present invention, the current temperature may be determined based on the ambient temperature of the environment where the movable platform is currently located and/or the body temperature of the movable platform.
  • Specifically, the ambient temperature of the environment where the movable platform is currently located, or the body temperature of the movable platform, can be detected, and the detected temperature can be determined as the current temperature; alternatively, the difference between the detected temperature and a preset temperature can be determined as the current temperature.
  • The preset temperature may be predetermined according to the deviation between the temperature of the vision module and the ambient temperature of the environment where the movable platform is currently located, or between the temperature of the vision module and the temperature of the body.
  • the ambient temperature of the environment where the movable platform is currently located and the body temperature of the movable platform can be detected, and the current temperature can be calculated based on these two temperatures, for example, the average value of the two can be calculated as the current temperature.
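A hedged reading of these strategies in code; whether to average the two readings, and the preset offset value, are assumptions made for illustration rather than anything the patent fixes numerically:

```python
# Derive the vision module's current temperature from the ambient
# temperature and/or the platform body temperature, per the strategies
# described above (average of both, or single reading minus a preset
# deviation). Offset semantics are an illustrative assumption.
def current_temperature(ambient=None, body=None, preset_offset=0.0):
    readings = [t for t in (ambient, body) if t is not None]
    if not readings:
        raise ValueError("need at least one temperature reading")
    if len(readings) == 2:
        return sum(readings) / 2.0       # average of ambient and body
    return readings[0] - preset_offset   # single reading minus preset deviation
```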
  • the current temperature is determined according to the ambient temperature of the environment where the movable platform is currently located and/or the body temperature of the movable platform, and a more accurate current temperature can be flexibly obtained in a variety of ways. Further, in order to accurately obtain the current temperature, the current temperature in this step may also be collected based on a temperature sensor set on the vision module.
  • the temperature sensor may be newly added to the vision module specifically for temperature detection, or it may be an original temperature sensor in the vision module.
  • some vision modules are equipped with a Time Of Flight (TOF) sensor. Accordingly, in this case, the temperature sensor can be provided on the printed circuit board of the TOF sensor.
  • In this way, there is no need to add a dedicated temperature sensor, and the implementation cost can be saved. Moreover, because the temperature sensor is arranged on the printed circuit board of the TOF sensor, and the TOF sensor is located in the vision module, it can be ensured to a certain extent that the current temperature detected by the temperature sensor accurately represents the current actual temperature of the vision module, thereby improving the accuracy of the external parameter compensation amount subsequently determined based on the current temperature.
  • Step 202 Determine the corresponding external parameter compensation amount at the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts.
  • the corresponding relationship in this step may be generated in advance before performing this step.
  • the corresponding relationship may be generated in advance before the vision module leaves the factory.
  • The generation of the corresponding relationship can be achieved through the following steps A to D. Step A: control the equipment equipped with the test vision module to operate at different thermal powers under different ambient temperatures.
  • the test vision module may be the same as or different from the vision module involved in the data processing method provided by the embodiment of the present invention.
  • the test vision module may be a vision module specifically used to generate correspondences.
  • The equipment equipped with the test vision module can be the equipment in which the vision module will be used in actual practice, such as drones, unmanned vehicles, aerial photography aircraft, Virtual Reality (VR) glasses, Augmented Reality (AR) glasses, and so on.
  • The ambient temperature may be the temperature of the environment where the device equipped with the test vision module is located. It should be noted that, due to the structure and material of the vision module itself, the vision module often has an operating temperature range. In order to ensure normal operation, when controlling the ambient temperature in this step, it can be ensured that the ambient temperature is within the operating temperature range. At the same time, multiple temperature values can be uniformly selected from the operating temperature range, and the ambient temperature can be controlled correspondingly to reach each of these temperature values.
  • In this way, the uniformity of the subsequently collected sample temperatures can be ensured, and the accuracy of the corresponding relationship generated based on the sample temperatures can be improved to a certain extent.
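For instance, uniform selection of test temperatures from an assumed operating range might look like the following (the range endpoints and sample count are illustrative):

```python
# Uniformly sample test temperatures across an assumed operating range.
import numpy as np

t_min, t_max = -10.0, 50.0                         # assumed operating range, deg C
sample_temps = np.linspace(t_min, t_max, num=8)    # 8 evenly spaced values
```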
  • When the equipment equipped with the test vision module is controlled to operate at different thermal powers, the equipment can be turned on first, and then the number of components running in the equipment can be gradually adjusted. Generally, the more components are running, the larger the thermal power of the equipment will be; in this way, the equipment can be controlled to operate at different thermal powers.
  • Step B Obtain the temperature of the test vision module as the sample temperature.
  • the sample temperature can be collected by a temperature sensor set on the vision module.
  • the temperature sensor can be arranged on the printed circuit board of the TOF sensor.
  • the temperature value collected by the temperature sensor can be read to obtain the sample temperature.
  • The operation of obtaining the sample temperature may be performed when the thermal power of the equipment equipped with the test vision module reaches an equilibrium state. The thermal power reaching the equilibrium state means that there is no net heat exchange between the device and the outside world. Correspondingly, the heat exchange value between the device and the outside world can be detected; if the exchange value is 0, it can be determined that the thermal power of the device has reached the equilibrium state, and at this time the operation of obtaining the sample temperature can be performed.
  • When the thermal power of the device equipped with the test vision module reaches the equilibrium state, there is no heat exchange with the outside world, that is, the temperature of the test vision module is already stable at that time. Therefore, in the embodiment of the present invention, obtaining the sample temperature only when the device equipped with the test vision module reaches the equilibrium state can ensure the accuracy of the obtained sample temperature.
  • The equilibrium state may include a minimum thermal equilibrium state and/or a maximum thermal equilibrium state. The minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when working at the rated thermal power at the lowest operating ambient temperature, and the maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when working at the maximum thermal power at the highest operating ambient temperature. Since the device generates heat when it is working, the temperature of the test vision module is often higher than the ambient temperature, and the heat generated when the device works at different thermal powers under the same ambient temperature differs; in this way, the temperature of the test vision module under different thermal powers will be different.
  • In this way, the collected sample temperatures include the lowest temperature and/or the highest temperature of the test vision module, that is, they contain the representative extreme temperature values, so the accuracy of the corresponding relationship subsequently generated based on the sample temperatures can be improved to a certain extent. For example, suppose the lowest operating ambient temperature is -10°C and the highest operating ambient temperature is +50°C; then the lowest sample temperature Tmin of the test vision module at -10°C and the highest sample temperature Tmax of the test vision module at +50°C can be collected.
  • Step C: obtain the external parameter compensation amount corresponding to the test vision module at the sample temperature, where the external parameter compensation amount is the difference between the actual external parameter of the test vision module at the sample temperature and the reference external parameter.
  • the reference external parameter may be an external parameter of the determined test vision module at a certain ambient temperature.
  • Take the case where the external parameter is the external parameter between the vision sensor and the IMU in the vision module as an example. When performing external parameter calibration, a known calibration target can be constructed first, and image sequences can be collected through certain controllable motion excitations; specifically, the vision sensor can be controlled to take photos of the calibration target in different relative poses.
  • Take the case where the external parameters are the external parameters between the vision sensors in the vision module as an example. Some known calibration targets can be constructed offline, and the camera external parameters can be calculated through a certain number of controllable image sequences.
  • the calibration target can be a checkerboard, dots or some three-dimensional objects.
  • the two vision sensors can be used to take photos of the calibration target in different relative poses, and then the spatial information of the calibration target is extracted from the photos, and then the external parameters are solved based on the spatial information.
  • The relative change of the external parameter is determined as the external parameter compensation amount corresponding to the sample temperature, and the corresponding relationship is then constructed from the sample temperature and the relative change of the external parameter. This can reduce the impact of individual differences between vision modules to a certain extent, thereby improving the versatility of the corresponding relationship.
  • the reference external parameter can be used to test the external parameter of the vision module when the ambient temperature is the lowest working environment temperature.
  • In this way, the external parameter compensation amounts calculated based on the reference external parameter are non-negative, which facilitates the subsequent calculation of the external parameter compensation amount based on the corresponding relationship.
  • Alternatively, the absolute amount of the external parameter can also be used as the external parameter compensation amount; that is, the reference external parameter is set to 0. In this way, only the calibration of the test vision module is required and no additional calculation is needed for data collection, which saves, to a certain extent, the processing resources required to generate the corresponding relationship.
  • Step D: Determine the corresponding relationship according to the sample temperatures and the corresponding external parameter compensation amounts.
  • The sample temperatures and their corresponding external parameter compensation amounts can be used as sample point pairs, and a corresponding curve of temperature versus external parameter compensation amount can be fitted through these pairs. Since the number of collected sample temperatures and corresponding compensation amounts is often limited, generating the corresponding curve from the limited collected sample point pairs yields the temperature-compensation correspondence represented by the countless points on the curve, which improves the coverage of the corresponding relationship. When fitting the corresponding curve, one curve can be fitted for each type of parameter included in the external parameter compensation amount. For example, suppose the reference external parameter is the external parameter at Tmin, and 8 sample temperatures and their corresponding external parameter compensation values are collected.
  • If the external parameter is an angle matrix composed of the yaw angle yaw, the roll angle roll, and the pitch angle pitch, a functional relationship can be constructed whose corresponding curve is f(T):
  • f(T) = [YAW(T), ROLL(T), PITCH(T)]
  • YAW(T) represents the relative change of the yaw angle corresponding to the sample temperature T
  • ROLL(T) represents the relative change of the roll angle corresponding to the sample temperature T
  • PITCH(T) represents the relative change of the pitch angle corresponding to the sample temperature T.
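  • The per-component fitting described above can be sketched as follows; the eight sample pairs below are hypothetical, and the quadratic model is an assumption (any low-order fit over the limited sample point pairs serves the same purpose):

```python
import numpy as np

# Hypothetical sample pairs: 8 sample temperatures (deg C) and the measured
# yaw / roll / pitch compensation amounts (degrees) relative to Tmin.
temps = np.array([-10.0, 0.0, 10.0, 20.0, 30.0, 40.0, 50.0, 60.0])
yaw   = np.array([0.00, 0.02, 0.05, 0.09, 0.14, 0.20, 0.27, 0.35])
roll  = np.array([0.00, 0.01, 0.03, 0.06, 0.10, 0.15, 0.21, 0.28])
pitch = np.array([0.00, 0.01, 0.02, 0.04, 0.07, 0.11, 0.16, 0.22])

# Fit one quadratic curve per angle component; together they form
# f(T) = [YAW(T), ROLL(T), PITCH(T)].
coeffs = {name: np.polyfit(temps, vals, deg=2)
          for name, vals in [("yaw", yaw), ("roll", roll), ("pitch", pitch)]}

def f(T):
    """Evaluate the fitted compensation curve at temperature T."""
    return np.array([np.polyval(coeffs[k], T) for k in ("yaw", "roll", "pitch")])
```

Evaluating `f` at any temperature between the sampled ones then yields a compensation amount not present in the original 8 samples, which is exactly the coverage benefit described above.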
  • FIG. 2C is a schematic diagram of an angle provided by an embodiment of the present invention.
  • FIG. 2D is a schematic diagram of a corresponding curve provided by an embodiment of the present invention. As shown in FIG. 2D, based on the yaw angles of the eight external parameter compensation amounts, the corresponding curve shown in FIG. 2D can be fitted. It should be noted that the collected sample temperatures and the corresponding external parameter compensation amounts may also be stored directly as a corresponding list, which is not limited in the embodiment of the present invention.
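  • A sketch of the corresponding-list alternative, with linear interpolation between stored samples as an added assumption (a nearest-sample lookup would also match the stored-list description):

```python
import numpy as np

# Hypothetical stored sample list: (temperature, yaw compensation) pairs.
table = [(-10.0, 0.00), (10.0, 0.05), (30.0, 0.14), (50.0, 0.27)]

def lookup(T, table):
    """Return the compensation at T; values between stored samples are
    linearly interpolated (an assumption, not mandated by the method)."""
    temps, comps = zip(*sorted(table))
    return float(np.interp(T, temps, comps))

print(round(lookup(20.0, table), 3))  # 0.095, midway between 0.05 and 0.14
```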
  • the processor that executes the step of generating the correspondence relationship may be the same as or different from the processor that executes the data processing method provided in the embodiment of the present invention. For example, each step of generating the corresponding relationship described above may be performed by a processor dedicated to generating the corresponding relationship.
  • Step 203: Obtain the initial external parameters of the vision module.
  • Step 204: Compensate the initial external parameter according to the external parameter compensation amount to obtain the target external parameter of the vision module.
  • Specifically, compensation can be achieved through the following operations: find, in the corresponding relationship, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameter; calculate the difference between the external parameter compensation amount corresponding to the current temperature and the external parameter compensation amount corresponding to the calibration temperature; and calculate the sum of the difference and the initial external parameter to obtain the target external parameter.
  • the calibration temperature may be the temperature of the vision module when the initial external parameters are obtained, and the calibration temperature may be stored at the same time when the initial external parameters are calibrated.
  • When the processor executes this step, it can directly read the pre-stored calibration temperature, and then find the external parameter compensation amount corresponding to the calibration temperature from the corresponding relationship based on that temperature.
  • The initial external parameter may be the same as the reference external parameter, and may also be the same as the actual external parameter collected at the calibration temperature when the corresponding relationship was generated. In that case, the external parameter compensation amount corresponding to the calibration temperature is 0. Therefore, in the embodiment of the present invention, before searching the corresponding relationship for the external parameter compensation amount at the calibration temperature of the initial external parameter, a first difference of the initial external parameter relative to the reference external parameter, and a second difference of the initial external parameter relative to the actual external parameter collected at the calibration temperature when the corresponding relationship was generated, can be determined first. If the first difference and the second difference are both 0, the sum of the external parameter compensation amount corresponding to the current temperature and the initial external parameter can be calculated directly to obtain the target external parameter. If the first difference and/or the second difference is not 0, the operation of searching the corresponding relationship for the external parameter compensation amount at the calibration temperature is performed. In this way, unnecessary searching operations can be avoided.
  • the initial external parameters calibrated at the calibration temperature may be different from the actual external parameters collected at the calibration temperature when the corresponding relationship is generated.
  • For example, the initial external parameters may be manually calibrated by the user at the calibration temperature, while the actual external parameters collected at the calibration temperature when the corresponding relationship was generated are obtained through a calibration algorithm. In that case, although both are calibrated at the same temperature, their values may differ.
  • In that case, the operation of finding the external parameter compensation amount corresponding to the calibration temperature can be omitted, and the external parameter compensation amount corresponding to the current temperature is used directly for compensation.
  • the calibration temperature is T0
  • the current temperature is T1
  • the initial external parameters are: [yaw0,roll0,pitch0]
  • f(T1) is the external parameter compensation amount at the current temperature
  • f(T0) is the corresponding external parameter compensation amount at the calibrated temperature.
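  • Putting the pieces above together, the compensation in step 204 amounts to target = initial + (f(T1) - f(T0)); the linear curve and the numbers below are hypothetical stand-ins:

```python
import numpy as np

def f(T):
    """Toy stand-in for the fitted compensation curve
    f(T) = [YAW(T), ROLL(T), PITCH(T)]; a linear model is an assumption."""
    return np.array([0.005, 0.004, 0.003]) * (T + 10.0)

def compensate(initial, T0, T1):
    """target = initial + (f(T1) - f(T0)), as in step 204."""
    return initial + (f(T1) - f(T0))

initial = np.array([1.20, -0.30, 0.10])  # [yaw0, roll0, pitch0], hypothetical
target = compensate(initial, T0=25.0, T1=45.0)
```

Note that when the current temperature equals the calibration temperature, f(T1) - f(T0) vanishes and the target external parameter equals the initial one, as expected.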
  • Figure 2E is a schematic diagram of another corresponding curve provided by an embodiment of the present invention.
  • The change in the yaw angle in the external parameter at the calibration temperature T0 is YAW(T0), and the change in the yaw angle in the external parameter at the current temperature T1 is YAW(T1).
  • The yaw angle in the initial external parameter calibrated at the calibration temperature T0 is yaw0; accordingly, the compensated yaw angle in the target external parameter is yaw0 + YAW(T1) - YAW(T0).
  • Step 205: Perform subsequent processing based on the target external parameter.
  • The subsequent processing can be used to process the sensor data collected by the vision module according to the target external parameter to obtain a processing result, and to determine, according to the processing result, the relative positional relationship between the object to be positioned and the device equipped with the vision module, that is, to perform positioning.
  • the posture of the object to be positioned is determined according to the processing result, that is, posture estimation is performed.
  • the subsequent processing may also be other operations, for example, it may be a map drawing operation, etc., which is not limited in the embodiment of the present invention.
  • The target external parameter may be verified before subsequent processing is performed based on it, and subsequent processing is then performed when the target external parameter passes verification, thereby ensuring the accuracy of subsequent processing. Specifically, verification can be performed through the following operations: compare the target external parameter with a preset external parameter range, where the preset external parameter range is used to characterize the value range of the external parameter corresponding to the current temperature; if the target external parameter falls within the preset external parameter range, perform the subsequent processing operation based on the target external parameter.
  • The preset external parameter range may be the numerical range into which a normal external parameter may fall at the current temperature. If the target external parameter does not fall within the preset external parameter range at the current temperature, the target external parameter is abnormal and likely wrong; accordingly, it can be considered unreliable. If it falls within the preset external parameter range, the target external parameter is plausible; accordingly, it can be considered more credible, and subsequent processing can be performed based on it.
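  • A minimal sketch of this range check, with hypothetical per-component bounds:

```python
def verify_in_range(target, preset_range):
    """Pass verification only if every component of the target external
    parameter lies inside the preset per-component range for the current
    temperature (the bounds below are illustrative, not from the method)."""
    return all(lo <= v <= hi for v, (lo, hi) in zip(target, preset_range))

# Hypothetical yaw / roll / pitch bounds (degrees) for the current temperature.
preset = [(-2.0, 2.0), (-1.0, 1.0), (-0.5, 0.5)]
print(verify_in_range([1.3, -0.22, 0.16], preset))  # True
print(verify_in_range([2.5, 0.0, 0.0], preset))     # False
```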
  • Verification may also be performed by the following operations: processing preset sensing data according to the target external parameter to obtain a preprocessing result; comparing the preprocessing result with the standard processing result corresponding to the preset sensing data; and, if the two are consistent, executing the subsequent processing operation based on the target external parameter.
  • the standard processing result may be the result obtained when the preset sensor data is processed with correct external parameters.
  • If the preprocessing result is consistent with the standard processing result, the target external parameter is correct; accordingly, it can be considered more reliable, and subsequent processing can proceed based on it.
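  • This second verification path can be sketched as follows; the "processing" here is a toy rotation of preset points, and comparing with a tolerance rather than exact equality is an assumption:

```python
import numpy as np

def preprocess(points, R):
    """Toy stand-in for processing preset sensing data with an external
    parameter: map the points through the rotation R."""
    return points @ np.asarray(R).T

def passes_verification(points, R_target, standard_result, tol=1e-3):
    """Pass only when the preprocessing result matches the standard result
    (the result computed once with known-correct external parameters)."""
    return np.allclose(preprocess(points, R_target), standard_result, atol=tol)

# Hypothetical preset sensing data and its stored standard processing result.
points = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
standard = preprocess(points, np.eye(3))
```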
  • Before using the target external parameter, whether to verify it can first be determined according to the estimation subject of the external parameter. If the target external parameter can be considered more reliable, it can be added directly as an initial value in subsequent processing, for example, added to the state estimation step. Otherwise, the target external parameter is verified through the above verification methods, and after verification passes, it is added into subsequent processing. In this way, by performing verification selectively in some cases, the processing resources spent on verifying the target external parameter can be saved to a certain extent.
  • Take as an example a vision module that is a visual-inertial odometer including a visual sensor and an IMU.
  • The outputs of the visual sensor and the IMU need to be synchronized in time to ensure an acceptable, stable data delay.
  • The external parameters between the two sensor coordinate systems of the visual sensor and the IMU directly determine the accuracy of the visual-inertial odometer.
  • various parameters such as the camera internal parameters, the external parameters between the vision sensor and the IMU, are determined at the design stage.
  • The data processing method provided in the embodiment of the present invention can compensate and correct the external parameters between the visual sensor and the IMU according to the current temperature during actual use, thereby improving the accuracy of the external parameters to a certain extent and improving the accuracy of the visual-inertial odometer.
  • In summary, the data processing method provided by the embodiment of the present invention first obtains the current temperature of the vision module, and determines the external parameter compensation amount corresponding to the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts; then, the initial external parameter of the vision module is obtained and compensated according to the external parameter compensation amount to obtain the target external parameter of the vision module, and subsequent processing is performed according to the target external parameter.
  • In this way, the initial external parameters are adaptively compensated according to the current temperature, which, to a certain extent, corrects the problem of insufficient external parameter accuracy caused by temperature and ensures the processing accuracy when the external parameters are used for subsequent processing.
  • Compensation calibration is performed automatically when the external parameters are inaccurate, so the user does not need to perform calibration manually, which can reduce the implementation cost of calibration to a certain extent.
  • FIG. 3 is a block diagram of a data processing device provided by an embodiment of the present invention.
  • the device 30 may include: a first obtaining module 301 configured to obtain the current temperature of the vision module.
  • the first determining module 302 is configured to determine the corresponding external parameter compensation amount at the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts.
  • the second acquiring module 303 is used to acquire the initial external parameters of the vision module.
  • the compensation module 304 is configured to compensate the initial external parameter according to the external parameter compensation amount to obtain the target external parameter of the vision module; wherein, the vision module includes two vision sensors, and the external parameter The parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the vision sensor and the pose sensor The relative pose relationship between.
  • the vision module is mounted on a movable platform, and the current temperature is determined according to the ambient temperature of the environment where the movable platform is currently located and/or the body temperature of the movable platform.
  • the movable platform is a drone.
  • the relative pose relationship includes a relative rotation amount and/or a relative translation amount.
  • the pose sensor is an inertial measurement unit IMU.
  • the corresponding relationship is set according to the difference between the actual external parameter of the vision module and the reference external parameter at different temperatures.
  • the corresponding relationship is generated by the following modules: a control module, which is used to control the equipment equipped with the test vision module to operate with different thermal powers in different ambient temperatures.
  • the third acquiring module is used to acquire the temperature of the test vision module as the sample temperature.
  • the fourth acquiring module is configured to acquire the external parameter compensation amount corresponding to the test vision module at the sample temperature, where the external parameter compensation amount is the actual external parameter corresponding to the test vision module at the sample temperature The difference relative to the benchmark external parameter.
  • the second determining module is configured to determine the corresponding relationship according to the sample temperature and the corresponding external parameter compensation amount.
  • The third acquiring module is specifically configured to obtain the temperature of the test vision module as the sample temperature when the thermal power of the device equipped with the test vision module reaches an equilibrium state.
  • the equilibrium state includes a minimum thermal equilibrium state and/or a maximum thermal equilibrium state.
  • the minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when working at the rated thermal power at the lowest working environment temperature.
  • the maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when working with the maximum thermal power under the highest working environment temperature.
  • the reference external parameter is an external parameter of the test vision module when the ambient temperature is the lowest working ambient temperature.
  • the compensation module 304 is specifically configured to: find the corresponding external parameter compensation amount at the calibration temperature of the initial external parameter in the corresponding relationship. Calculate the difference between the external parameter compensation amount corresponding to the current temperature and the external parameter compensation amount corresponding to the calibration temperature. The sum of the difference and the initial external parameter is calculated to obtain the target external parameter.
  • The compensation module 304 is further specifically configured to: determine a first difference of the initial external parameter relative to the reference external parameter, and a second difference of the initial external parameter relative to the actual external parameter at the calibration temperature when the corresponding relationship was generated; if the first difference and/or the second difference is not 0, perform the step of searching the corresponding relationship for the external parameter compensation amount at the calibration temperature of the initial external parameter.
  • the current temperature is collected based on a temperature sensor provided on the vision module.
  • the vision module includes a time-of-flight TOF sensor, and the temperature sensor is arranged on a printed circuit board of the TOF sensor.
  • the device 30 further includes: a processing module, configured to perform subsequent processing based on the target external parameter.
  • the subsequent processing is used to process the sensor data collected by the vision module according to the target external parameter to obtain a processing result; according to the processing result, determine the object to be positioned and the device carrying the vision module The relative position relationship between the devices, or the posture of the object to be positioned is determined according to the processing result.
  • The device 30 further includes: a verification module, configured to compare the target external parameter with a preset external parameter range, where the preset external parameter range is used to characterize the value range of the external parameter corresponding to the current temperature.
  • In summary, the data processing device first obtains the current temperature of the vision module, and determines the external parameter compensation amount corresponding to the current temperature according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts. The vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor. In this way, the initial external parameters are compensated according to the current temperature, and the problem of insufficient external parameter accuracy caused by temperature can be corrected to a certain extent, so that the external parameters more accurately represent the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, which in turn ensures the processing accuracy when the external parameters are subsequently used for information processing.
  • An embodiment of the present invention also provides a data processing device, which includes a computer-readable storage medium and a processor. The processor is configured to perform the following operations: obtain the current temperature of the vision module; determine, according to the current temperature and the corresponding relationship between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature; obtain the initial external parameter of the vision module; and compensate the initial external parameter according to the external parameter compensation amount to obtain the target external parameter of the vision module. The vision module includes two vision sensors, and the external parameter is used to characterize the relative pose relationship between the two vision sensors; or, the vision module includes a vision sensor and a pose sensor, and the external parameter is used to characterize the relative pose relationship between the vision sensor and the pose sensor.
  • the vision module is mounted on a movable platform, and the current temperature is determined according to the ambient temperature of the environment where the movable platform is currently located and/or the body temperature of the movable platform.
  • the movable platform is a drone.
  • the relative pose relationship includes a relative rotation amount and/or a relative translation amount.
  • the pose sensor is an inertial measurement unit IMU.
  • the corresponding relationship is set according to the difference between the actual external parameter of the vision module and the reference external parameter at different temperatures.
  • the corresponding relationship is generated by the following operation: controlling the equipment equipped with the test vision module to operate with different thermal powers in different ambient temperatures. Obtain the temperature of the test vision module as the sample temperature.
  • the obtaining the temperature of the test vision module as a sample temperature includes: obtaining the temperature of the test vision module when the thermal power of the device equipped with the test vision module reaches an equilibrium state, As the sample temperature.
  • the equilibrium state includes a minimum thermal equilibrium state and/or a maximum thermal equilibrium state. The minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at the rated thermal power at the lowest working environment temperature. The maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when working with the maximum thermal power under the highest working environment temperature.
  • the reference external parameter is an external parameter of the test vision module when the ambient temperature is the lowest working ambient temperature.
  • The processor compensates the initial external parameter according to the external parameter compensation amount by executing the following operations: find, in the corresponding relationship, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameter; calculate the difference between the external parameter compensation amount corresponding to the current temperature and the external parameter compensation amount corresponding to the calibration temperature; and calculate the sum of the difference and the initial external parameter to obtain the target external parameter.
  • The processor is further configured to perform the following operations: determine a first difference of the initial external parameter relative to the reference external parameter, and a second difference of the initial external parameter relative to the actual external parameter at the calibration temperature when the corresponding relationship was generated; if the first difference and/or the second difference is not 0, perform the step of finding, in the corresponding relationship, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameter.
  • the current temperature is collected based on a temperature sensor provided on the vision module.
  • the vision module includes a time-of-flight TOF sensor, and the temperature sensor is arranged on a printed circuit board of the TOF sensor.
  • the processor is further configured to perform the following operations: perform subsequent processing based on the target external parameter.
  • the subsequent processing is used to process the sensor data collected by the vision module according to the target external parameter to obtain a processing result; according to the processing result, determine the object to be positioned and the device carrying the vision module The relative position relationship between the devices, or the posture of the object to be positioned is determined according to the processing result.
  • The processor is further configured to perform the following operations: compare the target external parameter with a preset external parameter range, where the preset external parameter range is used to characterize the value range of the external parameter corresponding to the current temperature; if the target external parameter falls within the preset external parameter range, perform the subsequent processing operation based on the target external parameter.
  • the preset sensing data is processed according to the target external parameter to obtain a preprocessing result.
  • the preprocessing result is compared with the standard processing result corresponding to the preset sensing data; if the two are consistent, the subsequent processing operation based on the target external parameter is executed.
  • the operations performed by the above-mentioned processor are similar to the corresponding steps in the above-mentioned data processing method, and can achieve the same technical effect. In order to avoid repetition, details are not repeated here.
  • an embodiment of the present invention also provides a movable platform that includes a vision module and the above-mentioned data processing device; the data processing device is used to execute each step in the above-mentioned data processing method, and can achieve The same technical effect, in order to avoid repetition, will not be repeated here.
  • the movable platform includes a power propeller and a driving motor for driving the power propeller.
  • an embodiment of the present invention also provides a wearable device, the wearable device includes a vision module and the above-mentioned data processing device; the data processing device is used to execute each step in the above-mentioned data processing method, and To achieve the same technical effect, in order to avoid repetition, I will not repeat them here.
  • the embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, each step in the above-mentioned data processing method is implemented and can be To achieve the same technical effect, in order to avoid repetition, I will not repeat them here.
  • the device 400 includes but is not limited to: a radio frequency unit 401, a network module 402, an audio output unit 403, an input unit 404, a sensor 405, a display unit 406, User input unit 407, interface unit 408, memory 409, processor 410, power supply 411 and other components.
  • devices include, but are not limited to, mobile phones, tablet computers, notebook computers, palmtop computers, vehicle-mounted devices, wearable devices, and pedometers.
  • The radio frequency unit 401 can be used for receiving and sending signals during the process of sending and receiving information or during a call. Specifically, downlink data received from the base station is passed to the processor 410 for processing, and uplink data is sent to the base station.
  • the radio frequency unit 401 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency unit 401 can also communicate with the network and other devices through a wireless communication system.
  • the device provides users with wireless broadband Internet access through the network module 402, such as helping users to send and receive emails, browse web pages, and access streaming media.
  • the audio output unit 403 may convert the audio data received by the radio frequency unit 401 or the network module 402 or stored in the memory 409 into an audio signal and output it as sound. Moreover, the audio output unit 403 may also provide audio output related to a specific function performed by the device 400 (for example, call signal reception sound, message reception sound, etc.).
  • the audio output unit 403 includes a speaker, a buzzer, a receiver, and the like.
  • the input unit 404 is used to receive audio or video signals.
  • the input unit 404 may include a graphics processing unit (GPU) 4041 and a microphone 4042.
  • The graphics processor 4041 processes the image data of still pictures or videos obtained by an image capture device (such as a camera) in a video capture mode or an image capture mode.
  • the processed image frame can be displayed on the display unit 406.
  • the image frame processed by the graphics processor 4041 may be stored in the memory 409 (or other storage medium) or sent via the radio frequency unit 401 or the network module 402.
  • the microphone 4042 can receive sound, and can process such sound into audio data.
  • the processed audio data can be converted into a format that can be sent to a mobile communication base station via the radio frequency unit 401 in the case of a telephone call mode for output.
  • the device 400 also includes at least one sensor 405, such as a light sensor, a motion sensor, and other sensors.
  • the light sensor includes an ambient light sensor and a proximity sensor.
  • the ambient light sensor can adjust the brightness of the display panel 4061 according to the brightness of the ambient light.
  • the proximity sensor can turn off the display panel 4061 and/or the backlight when the device 400 is moved to the ear.
  • as one kind of motion sensor, the accelerometer sensor can detect the magnitude of acceleration in various directions (generally on three axes), can detect the magnitude and direction of gravity when stationary, and can be used to identify the device's posture (such as switching between landscape and portrait modes, related games, and magnetometer attitude calibration) and for vibration-recognition functions (such as a pedometer or tap detection); the sensor 405 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, which will not be repeated here.
  • the display unit 406 is used to display information input by the user or information provided to the user.
  • the display unit 406 may include a display panel 4061, and the display panel 4061 may be configured in the form of a liquid crystal display (LCD), an organic light-emitting diode (OLED), etc.
  • the user input unit 407 may be used to receive inputted numeric or character information, and generate key signal input related to user settings and function control of the device.
  • the user input unit 407 includes a touch panel 4041 and other input devices 4072.
  • the touch panel 4041, also called a touch screen, can collect the user's touch operations on or near it (for example, operations performed by the user on or near the touch panel 4041 with a finger, a stylus, or any other suitable object or accessory).
  • the touch panel 4041 may include two parts: a touch detection device and a touch controller.
  • the touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 410, and receives and executes the commands sent by the processor 410.
  • the touch panel 4041 can be implemented in multiple types such as resistive, capacitive, infrared, and surface acoustic wave.
  • the user input unit 407 may also include other input devices 4072.
  • other input devices 4072 may include, but are not limited to, a physical keyboard, function keys (such as volume control buttons, switch buttons, etc.), trackball, mouse, and joystick, which will not be repeated here.
  • the touch panel 4041 can be overlaid on the display panel 4061; when the touch panel 4041 detects a touch operation on or near it, it transmits the operation to the processor 410 to determine the type of the touch event, and the processor 410 then provides a corresponding visual output on the display panel 4061 according to the type of the touch event.
  • the interface unit 408 is an interface for connecting an external device with the device 400.
  • the external device may include a wired or wireless headset port, an external power source (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device with an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like.
  • the interface unit 408 can be used to receive input (for example, data information, power, etc.) from an external device and transmit the received input to one or more elements in the device 400, or can be used to transfer data between the device 400 and the external device.
  • the memory 409 can be used to store software programs and various data.
  • the memory 409 may mainly include a storage program area and a storage data area.
  • the storage program area may store an operating system, an application program required by at least one function (such as a sound playback function or an image playback function), and the like; the storage data area may store data created according to the use of the device (such as audio data, a phone book, etc.).
  • the memory 409 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other volatile solid-state storage devices.
  • the processor 410 is the control center of the device; it connects the various parts of the entire device through various interfaces and lines, and executes the device's various functions and processes data by running or executing software programs and/or modules stored in the memory 409 and by calling data stored in the memory 409, thereby monitoring the device as a whole.
  • the processor 410 may include one or more processing units; preferably, the processor 410 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs, etc., and the modem processor mainly handles wireless communication. It can be understood that the foregoing modem processor may alternatively not be integrated into the processor 410.
  • the device 400 may also include a power source 411 (such as a battery) for supplying power to various components.
  • the power source 411 may be logically connected to the processor 410 through a power management system, so that functions such as charging management, discharging management, and power consumption management are implemented through the power management system.
  • in addition, the device 400 includes some functional modules that are not shown, which will not be repeated here.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement them without creative work.
  • the various component embodiments of the present invention may be implemented by hardware, or by software modules running on one or more processors, or by a combination of them.
  • FIG. 5 is a block diagram of a computing processing device provided by an embodiment of the present invention. As shown in FIG. 5, the figure illustrates a computing processing device that can implement the method according to the present invention.
  • the computing processing device traditionally includes a processor 510 and a computer program product in the form of a memory 520 or a computer readable medium.
  • the memory 520 may be an electronic memory such as flash memory, EEPROM (Electrically Erasable Programmable Read Only Memory), EPROM, hard disk, or ROM.
  • the memory 520 has a storage space 530 for executing program codes of any method steps in the above methods.
  • the storage space 530 for program codes may include various program codes respectively used to implement various steps in the above method. These program codes can be read from or written into one or more computer program products.
  • These computer program products include program code carriers such as hard disks, compact disks (CDs), memory cards, or floppy disks.
  • Such a computer program product is usually a portable or fixed storage unit as described with reference to FIG. 6.
  • the storage unit may have storage segments, storage spaces, etc. arranged similarly to the memory 520 in the computing processing device of FIG. 5.
  • the program code may, for example, be compressed in an appropriate form.
  • the storage unit includes computer-readable codes, that is, codes that can be read by a processor such as the processor 510; when run by a computing processing device, these codes cause the computing processing device to perform the steps of the methods described above.
  • any reference signs placed between parentheses should not be construed as limiting the claims.
  • the word “comprising” does not exclude the presence of elements or steps not listed in the claims.
  • the word “a” or “an” preceding an element does not exclude the presence of multiple such elements.
  • the invention can be implemented by means of hardware comprising several different elements and by means of a suitably programmed computer. In a unit claim enumerating several devices, several of these devices may be embodied by one and the same item of hardware.
  • the use of the words first, second, third, etc. does not indicate any order; these words may be interpreted as names.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

A data processing method and apparatus, a movable platform, and a wearable device. The method may first acquire the current temperature of a vision module and determine, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature; it then acquires the initial external parameters of the vision module and finally compensates the initial external parameters according to the external parameter compensation amount to obtain the target external parameters of the vision module. Compensating the initial external parameters according to the current temperature corrects, to a certain extent, the inaccuracy of the external parameters caused by temperature, thereby ensuring the processing accuracy when the external parameters are subsequently used for information processing.

Description

Data processing method and apparatus, movable platform, and wearable device

Technical Field
The present invention belongs to the technical field of visual detection, and in particular relates to a data processing method and apparatus, a movable platform, and a wearable device.
Background
At present, more and more devices perform visual detection through an on-board vision module to implement operations such as attitude estimation, motion detection, and autonomous driving. Specifically, during visual detection, visual information is usually collected by the various components of the vision module, and the visual information collected by each component is then processed based on the external parameters between the components of the vision module.
In the prior art, the initial external parameters preset for the vision module are usually used directly for this processing. However, affected by the temperature of the actual use environment, the shape of, and the stress transmitted through, the connecting parts between the components of the vision module will change. As a result, the previously set initial external parameters become insufficiently accurate, which in turn reduces the processing accuracy when information is processed based on these external parameters.
Summary of the Invention
The present invention provides a data processing method and apparatus, a movable platform, and a wearable device, so as to solve the problem that the initial external parameters of a vision module become inaccurate under the influence of temperature, which in turn lowers the processing accuracy when information is processed based on these initial external parameters.
In order to solve the above technical problem, the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides a data processing method, the method including:
acquiring a current temperature of a vision module;
determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
acquiring initial external parameters of the vision module;
compensating the initial external parameters according to the external parameter compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
In a second aspect, an embodiment of the present invention provides a data processing apparatus, the data processing apparatus including a computer-readable storage medium and a processor, the processor being configured to perform the following operations:
acquiring a current temperature of a vision module;
determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
acquiring initial external parameters of the vision module;
compensating the initial external parameters according to the external parameter compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
In a third aspect, an embodiment of the present invention provides a movable platform, the movable platform including a vision module and the above data processing apparatus, the data processing apparatus being configured to perform the following operations:
acquiring a current temperature of a vision module;
determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
acquiring initial external parameters of the vision module;
compensating the initial external parameters according to the external parameter compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
In a fourth aspect, an embodiment of the present invention provides a wearable device, the wearable device including a vision module and the above data processing apparatus, the data processing apparatus being configured to perform the following operations:
acquiring a current temperature of a vision module;
determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
acquiring initial external parameters of the vision module;
compensating the initial external parameters according to the external parameter compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
In a fifth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the following operations:
acquiring a current temperature of a vision module;
determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
acquiring initial external parameters of the vision module;
compensating the initial external parameters according to the external parameter compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
In the embodiments of the present invention, the current temperature of the vision module may first be acquired; the external parameter compensation amount corresponding to the current temperature is determined according to the current temperature and the correspondence between different temperature values and different external parameter compensation amounts; the initial external parameters of the vision module are then acquired; and finally the initial external parameters are compensated according to the compensation amount to obtain the target external parameters of the vision module. The vision module includes two vision sensors, with the external parameters characterizing the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, with the external parameters characterizing the relative pose relationship between the vision sensor and the pose sensor. In this way, compensating the initial external parameters according to the current temperature corrects, to a certain extent, the inaccuracy of the external parameters caused by temperature, so that the external parameters more precisely characterize the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, thereby ensuring the processing accuracy when the external parameters are subsequently used for information processing.
Brief Description of the Drawings
FIG. 1 is a flowchart of the steps of a data processing method provided by an embodiment of the present invention;
FIG. 2A is a flowchart of the steps of another data processing method provided by an embodiment of the present invention;
FIG. 2B is a schematic diagram of the camera coordinate systems of a dual-vision-sensor arrangement;
FIG. 2C is a schematic diagram of angles provided by an embodiment of the present invention;
FIG. 2D is a schematic diagram of a correspondence curve provided by an embodiment of the present invention;
FIG. 2E is a schematic diagram of another correspondence curve provided by an embodiment of the present invention;
FIG. 3 is a block diagram of a data processing apparatus provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of the hardware structure of a device implementing the various embodiments of the present invention;
FIG. 5 is a block diagram of a computing processing device provided by an embodiment of the present invention;
FIG. 6 is a block diagram of a portable or fixed storage unit provided by an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings of the embodiments. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative work fall within the protection scope of the present invention.
FIG. 1 is a flowchart of the steps of a data processing method provided by an embodiment of the present invention. As shown in FIG. 1, the method may include:
Step 101: acquire a current temperature of a vision module.
The data processing method provided by the embodiment of the present invention may be applied to a processor, which may be a processor included in the vision module or a processor included in a device carrying the vision module; this is not limited in the embodiments of the present invention. Further, the current temperature may be the temperature of the vision module at the current moment. Since temperature causes the shape of, and the stress transmitted through, the connecting parts between the components of the vision module to change, the previously set initial external parameters become insufficiently accurate. Therefore, in this step, the current temperature of the vision module may first be acquired so that subsequent steps can compensate the initial external parameters based on it. Specifically, the vision module may be monitored by a temperature sensor to obtain the current temperature.
Step 102: determine, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature.
In the embodiment of the present invention, the correspondence may be generated in advance and stored in the processor. The correspondence characterizes the external parameter compensation amounts corresponding to different temperatures, and each compensation amount may be determined based on the actual external parameters at the corresponding temperature. Accordingly, the compensation amount corresponding to the current temperature can be looked up in the correspondence to obtain the compensation amount at the current temperature.
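The lookup in Step 102 can be sketched as a simple table interpolation. The table below is an assumed stand-in for the stored correspondence; all temperatures and compensation values are illustrative and not taken from the embodiments:

```python
from bisect import bisect_left

# Correspondence table: sample temperature (deg C) -> yaw compensation (deg).
# The values are illustrative placeholders, not from the patent.
TEMPS = [-10.0, 0.0, 10.0, 20.0, 30.0, 40.0, 50.0]
YAW_COMP = [0.00, 0.02, 0.05, 0.09, 0.14, 0.20, 0.27]

def compensation_at(t: float) -> float:
    """Look up the compensation amount at temperature t, interpolating
    linearly between the stored sample temperatures."""
    if t <= TEMPS[0]:
        return YAW_COMP[0]
    if t >= TEMPS[-1]:
        return YAW_COMP[-1]
    i = bisect_left(TEMPS, t)
    t0, t1 = TEMPS[i - 1], TEMPS[i]
    c0, c1 = YAW_COMP[i - 1], YAW_COMP[i]
    return c0 + (c1 - c0) * (t - t0) / (t1 - t0)

print(round(compensation_at(25.0), 3))  # midway between 0.09 and 0.14 -> 0.115
```

Clamping outside the table mirrors the fact that the correspondence only covers the module's operating temperature range; a fitted curve (see Step D below) could replace the linear interpolation.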
Step 103: acquire the initial external parameters of the vision module.
In the embodiment of the present invention, the initial external parameters may be parameters calibrated and stored when the vision module leaves the factory, or parameters manually calibrated and stored by the user during use. Specifically, the initial external parameters may be read directly from the storage area in which they are stored.
Step 104: compensate the initial external parameters according to the external parameter compensation amount to obtain the target external parameters of the vision module.
In the embodiment of the present invention, the current temperature may differ from the calibration temperature at which the initial external parameters were calibrated, so the shape of, and the stress transmitted through, the connecting parts between the components of the vision module at the current temperature will differ from those at the calibration temperature. Accordingly, the initial external parameters are not sufficiently accurate for the vision module at the current temperature. Therefore, in this step, the initial external parameters may be compensated with reference to the compensation amount at the current temperature to obtain the target external parameters of the vision module. The target external parameters obtained by compensating the initial external parameters with the compensation amount corresponding to the current temperature fit the vision module better to a certain extent, correcting the problem that inaccurate external parameters lower the processing accuracy when the external parameters are subsequently used for information processing. The vision module in the embodiment of the present invention may include two vision sensors, in which case the external parameters characterize the relative pose relationship between the two vision sensors. It should be noted that, when the vision module includes two vision sensors, it may also include a pose sensor; accordingly, the data processing method provided by the embodiment of the present invention may also be used to compensate and calibrate the external parameters between a vision sensor and the pose sensor, which is not limited in the embodiments of the present invention. Alternatively, the vision module may include a vision sensor and a pose sensor, in which case the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor. Further, when the vision module includes a vision sensor and a pose sensor, the number of vision sensors may be one or more; when there are multiple vision sensors, the data processing method provided by the embodiment of the present invention may also be used to compensate and calibrate the external parameters between the vision sensors, which is not limited in the embodiments of the present invention.
In summary, the data processing method provided by the embodiment of the present invention first acquires the current temperature of the vision module; determines, according to the current temperature and the correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature; then acquires the initial external parameters of the vision module; and finally compensates the initial external parameters according to the compensation amount to obtain the target external parameters of the vision module. The vision module includes two vision sensors, with the external parameters characterizing the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, with the external parameters characterizing the relative pose relationship between the vision sensor and the pose sensor. In this way, compensating the initial external parameters according to the current temperature corrects, to a certain extent, the inaccuracy of the external parameters caused by temperature, so that the external parameters more precisely characterize the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, thereby ensuring the processing accuracy when the external parameters are subsequently used for information processing.
FIG. 2A is a flowchart of the steps of another data processing method provided by an embodiment of the present invention. As shown in FIG. 2A, the method may include:
Step 201: acquire a current temperature of a vision module.
In the embodiment of the present invention, the vision module may include two vision sensors, in which case the external parameters characterize the relative pose relationship between the two vision sensors. Alternatively, the vision module may include a vision sensor and a pose sensor, in which case the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor. Specifically, the relative pose relationship between two vision sensors may represent the relative transformation between the camera coordinate systems adopted by the two vision sensors, and the relative pose relationship between a vision sensor and a pose sensor may represent the relative transformation between the camera coordinate system adopted by the vision sensor and the coordinate system adopted by the pose sensor. The pose sensor may be an inertial measurement unit (IMU), and the relative pose relationship may include a relative rotation and/or a relative translation. By defining the external parameters as the relative pose relationship between vision sensors, or between a vision sensor and a pose sensor, the compensated external parameters can subsequently represent the relative poses of these components more accurately, which improves the accuracy of fusing the information collected by these components based on the external parameters. Further, the camera coordinate system, also called the optical-center coordinate system, takes the optical center as the origin, the horizontal and vertical directions of the imaging plane as the X and Y axes, and the optical axis as the Z axis. Through the camera external parameters, a vision sensor (i.e., a camera) can convert a 3D point in the world coordinate system of the real world into a point in the camera coordinate system. The world coordinate system is the absolute coordinate system of the objective three-dimensional world, also called the objective coordinate system; because devices are placed in three-dimensional space, this reference coordinate system is needed to describe the position of the imaging device and of any other object placed in the same three-dimensional environment. Specifically, the conversion can be achieved by the following formula:

$$[x,\ y,\ z]^T = R\,[x_w,\ y_w,\ z_w]^T + T$$

Further, in practical applications, a point in the camera coordinate system can also be converted into the image coordinate system through the following formulas:

$$x' = x/z,\qquad y' = y/z$$

$$x'' = x'\,\frac{1 + k_1 r^2 + k_2 r^4 + k_3 r^6}{1 + k_4 r^2 + k_5 r^4 + k_6 r^6} + 2p_1 x'y' + p_2\,(r^2 + 2x'^2),\qquad
y'' = y'\,\frac{1 + k_1 r^2 + k_2 r^4 + k_3 r^6}{1 + k_4 r^2 + k_5 r^4 + k_6 r^6} + p_1\,(r^2 + 2y'^2) + 2p_2 x'y'$$

$$r^2 = x'^2 + y'^2$$

$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K \begin{bmatrix} x'' \\ y'' \\ 1 \end{bmatrix},\qquad
K = \begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}$$
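A minimal numeric sketch of these two transformations (world coordinates to camera coordinates via the extrinsic parameters R and T, then camera coordinates to pixel coordinates via the intrinsic matrix K; the distortion terms are omitted for brevity, and all numeric values are illustrative):

```python
import numpy as np

# Illustrative extrinsics: a small rotation about the Z axis plus a translation.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
T = np.array([0.1, 0.0, 0.0])

# Illustrative intrinsics: focal lengths f_x, f_y and optical center (c_x, c_y) in pixels.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(p_world: np.ndarray) -> np.ndarray:
    """Project a 3D world point to 2D pixel coordinates."""
    p_cam = R @ p_world + T        # world -> camera coordinates
    uvw = K @ (p_cam / p_cam[2])   # normalize by depth, then apply K
    return uvw[:2]

u, v = project(np.array([0.0, 0.0, 2.0]))
print(u, v)  # -> 360.0 240.0 (the 0.1 m translation shifts u by 40 px)
```

The same pipeline run with the extrinsics of each of the two vision sensors yields the image points P_l and P_r of the same world point P described for FIG. 2B.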
Here, $[u, v, 1]^T$ denotes a 2D point in the image coordinate system, $[x_w, y_w, z_w]^T$ a point in the world coordinate system, and $[x, y, z]^T$ a point in the camera coordinate system. The matrix K is called the camera calibration matrix, i.e., the intrinsic parameters of each camera; the intrinsic parameters describe the correspondence between three-dimensional light rays and two-dimensional pixel coordinates, and their accuracy determines the accuracy of the transformation from two-dimensional pixel coordinates to three-dimensional ray information. Further, $[c_x, c_y]^T$ denotes the optical center, usually near the center of the picture; $f_x$ and $f_y$ denote the focal lengths in pixels; $k_1, k_2, k_3, k_4, k_5, k_6$ denote radial distortion; and $p_1$ and $p_2$ denote tangential distortion. The matrix R is the rotation matrix and the matrix T is the translation matrix; R and T are the camera's external parameters (extrinsic matrix) and express the rotation and translation from the world coordinate system to the camera coordinate system in three-dimensional space. As an example, FIG. 2B is a schematic diagram of the camera coordinate systems of a dual-vision-sensor arrangement; in FIG. 2B, the relative pose relationship between the two vision sensors can be expressed by R and T. Specifically, R and T represent the rotation and translation from the three-dimensional coordinate system with origin $O_l$ to the three-dimensional coordinate system with origin $O_r$. Through R and T, the coordinates of a point P detected by the left vision sensor in the coordinate system formed by the $X_l$, $Y_l$, and $Z_l$ axes can be converted to and from the coordinates of P detected by the right vision sensor in the coordinate system formed by the $X_r$, $Y_r$, and $Z_r$ axes. Meanwhile, based on the intrinsic parameters of each vision sensor, the coordinates in the $X_l$-$Y_l$-$Z_l$ coordinate system are mapped to obtain the two-dimensional image point $P_l$ of P in the left vision sensor, and the coordinates in the $X_r$-$Y_r$-$Z_r$ coordinate system are mapped to obtain the two-dimensional image point $P_r$ of P in the right vision sensor. Further, the vision module may be mounted on a movable platform, and the current temperature may be determined according to the ambient temperature of the environment in which the movable platform is currently located and/or the body temperature of the movable platform. Since the vision module is carried on the movable platform, and the movable platform is usually correlated with the ambient temperature of its environment, the ambient temperature of the environment and the body temperature of the movable platform can to a certain extent represent the temperature of the vision module. For example, the movable platform may be an unmanned aerial vehicle (UAV) with the vision module embedded in its body, in which case the temperature of the vision module is usually close to the ambient temperature of the UAV's environment and to the UAV's body temperature. Therefore, in the embodiment of the present invention, the current temperature may be determined according to the ambient temperature of the environment in which the movable platform is currently located and/or the body temperature of the movable platform.
As an example, the ambient temperature of the environment in which the movable platform is currently located, or the body temperature of the movable platform, may be measured, and the measured temperature may be determined as the current temperature; alternatively, the difference between the measured temperature and a preset temperature may be determined as the current temperature, where the preset temperature may be predetermined according to the deviation between the vision module's temperature and the ambient or body temperature. As another alternative, both the ambient temperature and the body temperature may be measured and the current temperature computed from the two, for example as their average. In the embodiment of the present invention, determining the current temperature according to the ambient temperature and/or the body temperature of the movable platform makes it possible to flexibly obtain a relatively accurate current temperature in multiple ways. Further, in order to obtain the current temperature precisely, the current temperature in this step may also be collected by a temperature sensor provided on the vision module. The temperature sensor may be newly added to the vision module specifically for temperature measurement, or may be a temperature sensor already present in the vision module. For example, some vision modules carry a time-of-flight (TOF) sensor, in which case the temperature sensor may be arranged on the printed circuit board of the TOF sensor. In this way, no additional sensor needs to be added: directly using the temperature sensor already present in the vision module to measure the current temperature saves implementation cost. Meanwhile, when the current temperature is measured by a temperature sensor arranged on the printed circuit board of the TOF sensor, since the TOF sensor is located inside the vision module, it can be ensured to a certain extent that the measured current temperature precisely represents the actual current temperature of the vision module, which in turn improves the accuracy of the external parameter compensation amount subsequently determined based on the current temperature.
Step 202: determine, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature.
Optionally, the correspondence in this step may be generated in advance before this step is performed; for example, it may be generated before the vision module leaves the factory. As an example, the correspondence may be generated through the following Steps A to D. Step A: control a device carrying a test vision module to run at different thermal powers at different ambient temperatures. In this step, the test vision module may be the same as, or different from, the vision module involved in the data processing method provided by the embodiment of the present invention; for example, the test vision module may be a vision module dedicated to generating the correspondence. Further, the device carrying the test vision module may be a device with which the vision module would actually be used, such as a UAV, an unmanned vehicle, an aerial photography aircraft, or virtual reality (VR) / augmented reality (AR) glasses. The ambient temperature may be the temperature of the environment in which the device carrying the test vision module is located. It should be noted that, limited by its structure and materials, a vision module usually has an operating temperature range; to ensure normal operation, the ambient temperature may be kept within this range when it is controlled in this step. Meanwhile, multiple temperature values may be selected uniformly from this operating temperature range, and the ambient temperature controlled to reach each of these values, which ensures the uniformity of the subsequently collected sample temperatures and thus, to a certain extent, improves the accuracy of the correspondence generated from them. Further, when controlling the device carrying the test vision module to run at different thermal powers, the device may first be switched on and the number of running components then increased step by step; generally, the more components are running, the greater the device's thermal power, so the device can be controlled to run at different thermal powers.
Step B: acquire the temperature of the test vision module as a sample temperature.
In this step, the sample temperature may be collected by a temperature sensor provided on the vision module; when the test vision module carries a TOF sensor, the temperature sensor may be arranged on the printed circuit board of the TOF sensor. Accordingly, the sample temperature is obtained by reading the temperature value collected by this temperature sensor. Optionally, the operation of acquiring the sample temperature may be performed when the thermal power of the device carrying the test vision module reaches an equilibrium state. The thermal power of the device reaching an equilibrium state means that there is no heat exchange between the device and the outside; accordingly, the heat exchange value between the device and the outside can be monitored, and if it is 0, the device's thermal power is determined to have reached equilibrium, at which point the sample temperature can be acquired. Since there is no heat exchange with the outside at equilibrium, i.e., the temperature of the test vision module has stabilized, acquiring the sample temperature only after the device carrying the test vision module reaches the equilibrium state ensures the accuracy of the acquired sample temperature. Further, the equilibrium state may include a minimum thermal equilibrium state and/or a maximum thermal equilibrium state, where the minimum thermal equilibrium state means that the test vision module reaches thermal equilibrium when operating at rated thermal power at the lowest operating ambient temperature, and the maximum thermal equilibrium state means that the test vision module reaches thermal equilibrium when operating at maximum thermal power at the highest operating ambient temperature. Since a device generates heat when operating, the temperature of the test vision module is usually higher than the ambient temperature, and at the same ambient temperature the heat generated differs with the thermal power, so the temperature of the test vision module differs at different thermal powers. Therefore, in the embodiment of the present invention, collecting sample temperatures in the minimum and maximum thermal equilibrium states ensures that the collected sample temperatures include the lowest and/or highest temperature of the test vision module, i.e., the highly representative end-point temperatures, which to a certain extent improves the accuracy of the correspondence subsequently generated from the sample temperatures. As an example, assuming the lowest operating ambient temperature is -10 °C and the highest is +50 °C, the lowest sample temperature Tmin of the test vision module at -10 °C and the highest sample temperature Tmax at +50 °C can be collected.
Step C: acquire the external parameter compensation amount corresponding to the test vision module at the sample temperature, the compensation amount being the difference between the actual external parameters of the test vision module at the sample temperature and reference external parameters.
In this step, external parameter calibration may first be performed on the test vision module to obtain its actual external parameters at the sample temperature, and the difference between the actual external parameters and the reference external parameters is then computed to obtain the compensation amount corresponding to the sample temperature. The reference external parameters may be the external parameters of the test vision module determined at a certain ambient temperature. Taking the external parameters between a vision sensor and an IMU in the vision module as an example, calibration may proceed by first constructing a known calibration target and collecting an image sequence under certain controllable motion excitation: specifically, the vision sensor is controlled to photograph the calibration target at different relative attitudes while the IMU is moved in multiple degrees of freedom to generate IMU excitation and its output is collected; finally, the external parameters between the vision sensor and the IMU are solved. Taking the external parameters between vision sensors in the vision module as an example, certain known calibration targets may first be constructed offline, and the camera external parameters solved from a certain number of controllable image sequences. The calibration target may be a checkerboard, dots, or some three-dimensional object. When collecting the image sequences, the two vision sensors each photograph the calibration target at different relative attitudes; the spatial information of the calibration target is extracted from the photographs, and the external parameters are then solved based on this spatial information. Because of limited manufacturing precision, there are certain differences between individual vision modules. In the embodiment of the present invention, the relative change of the external parameters is determined as the compensation amount corresponding to the sample temperature; building the correspondence from sample temperatures and relative changes of the external parameters reduces, to a certain extent, the influence of individual differences between vision modules and thus improves the generality of the correspondence. Further, the reference external parameters may be the external parameters of the test vision module when the ambient temperature is the lowest operating ambient temperature. Choosing the external parameters acquired at the lowest operating ambient temperature as the reference makes all compensation amounts computed against it non-negative, which facilitates subsequent computation based on the compensation amounts in the correspondence. Of course, in another optional embodiment of the present invention, the absolute external parameters may also be used as the compensation amounts, i.e., the reference external parameters are set to 0; in that case only calibration of the test vision module is needed, without additional computation, to collect the data, which to a certain extent saves the processing resources required to generate the correspondence.
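Step C can be sketched as follows, assuming the actual extrinsics have already been calibrated at each sample temperature; all angle values are illustrative placeholders:

```python
# Sample temperatures (deg C) mapped to the actual calibrated
# [yaw, roll, pitch] angles (deg) at each temperature; values are illustrative.
samples = {
    -10.0: [10.00, 0.50, -0.20],
     20.0: [10.06, 0.52, -0.17],
     50.0: [10.15, 0.55, -0.12],
}

# Reference extrinsics: those calibrated at the lowest operating
# ambient temperature, so every compensation amount is non-negative.
reference = samples[-10.0]

# Compensation amount = actual extrinsics minus reference extrinsics.
compensation = {
    t: [a - r for a, r in zip(angles, reference)]
    for t, angles in samples.items()
}
print(compensation[50.0])  # roughly [0.15, 0.05, 0.08]
```

Setting `reference` to `[0.0, 0.0, 0.0]` instead reproduces the alternative embodiment in which the absolute extrinsics themselves serve as the compensation amounts.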
Step D: determine the correspondence according to the sample temperatures and their corresponding external parameter compensation amounts.
In this step, the sample temperatures and their corresponding compensation amounts may be used as sample point pairs, and a curve of compensation amount versus temperature fitted through these pairs. Since the number of collected sample temperatures and compensation amounts is usually limited, generating a correspondence curve from the limited sample point pairs yields the correspondence between temperature and compensation amount for the infinitely many points on the curve, which extends the coverage of the correspondence. When fitting the curve, one curve may be fitted for each type of parameter contained in the compensation amount. As an example, assume the reference external parameters are those at Tmin, eight sample temperatures and their compensation amounts have been collected, and the external parameters form an angle matrix composed of the yaw, roll, and pitch angles; then a correspondence curve expressing the functional relationship f(T) can be constructed, where f(T) = [YAW(T), ROLL(T), PITCH(T)], with YAW(T), ROLL(T), and PITCH(T) denoting the relative changes of the yaw, roll, and pitch angles, respectively, at sample temperature T. Further, FIG. 2C is a schematic diagram of angles provided by an embodiment of the present invention; as shown in FIG. 2C, yaw, roll, and pitch represent the three angles of rotation about different axes. FIG. 2D is a schematic diagram of a correspondence curve provided by an embodiment of the present invention; based on the yaw angles in the eight compensation amounts, the curve shown in FIG. 2D can be fitted. It should be noted that the collected sample temperatures and their compensation amounts may also be stored directly as a correspondence table, which is not limited in the embodiments of the present invention. Further, the processor that performs the correspondence-generation steps may be the same as, or different from, the processor that performs the data processing method provided by the embodiments of the present invention; for example, the generation steps above may be performed by a processor dedicated to generating the correspondence.
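Step D's curve fitting can be sketched with an ordinary least-squares polynomial fit; the eight sample pairs and the quadratic degree below are assumptions for illustration only:

```python
import numpy as np

# Eight sample temperatures (deg C) and the corresponding yaw-angle
# compensation amounts (deg); the values are illustrative placeholders.
T = np.array([-10.0, 0.0, 10.0, 20.0, 25.0, 30.0, 40.0, 50.0])
yaw_comp = np.array([0.000, 0.018, 0.048, 0.090, 0.115, 0.143, 0.205, 0.278])

# Fit one curve per parameter type; here a quadratic for YAW(T).
coeffs = np.polyfit(T, yaw_comp, deg=2)
YAW = np.poly1d(coeffs)

# The fitted curve now yields a compensation amount for any temperature
# in the operating range, not just the eight sampled values.
print(float(YAW(35.0)))
```

ROLL(T) and PITCH(T) would be fitted the same way, giving the vector-valued f(T) = [YAW(T), ROLL(T), PITCH(T)] used in the compensation step.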
Step 203: acquire the initial external parameters of the vision module.
Specifically, reference may be made to the foregoing Step 103; details are not repeated here.
Step 204: compensate the initial external parameters according to the external parameter compensation amount to obtain the target external parameters of the vision module.
Specifically, the compensation in this step may be achieved through the following operations: looking up, in the correspondence, the compensation amount corresponding to the calibration temperature of the initial external parameters; computing the difference between the compensation amount corresponding to the current temperature and the compensation amount corresponding to the calibration temperature; and computing the sum of this difference and the initial external parameters to obtain the target external parameters. The calibration temperature may be the temperature of the vision module when the initial external parameters were acquired, and may be stored together with the initial external parameters at calibration time. Accordingly, when performing this step, the processor may directly read the pre-stored calibration temperature and then look up the compensation amount at that temperature in the correspondence. Next, the difference between the compensation amount at the current temperature and that at the calibration temperature is computed, i.e., it is determined how much the external parameters should change as the temperature changes from the calibration temperature of the initial external parameters to the current temperature; finally, this difference may be added to the initial external parameters to obtain the target external parameters. Determining the target external parameters as the sum of the initial external parameters and this difference makes up for the external parameter deviation caused by the temperature change and thus improves the accuracy of the target external parameters. It should be noted that in practical application scenarios the initial external parameters may be identical to the reference external parameters and to the actual external parameters collected at the calibration temperature when the correspondence was generated; in that case the compensation amount at the calibration temperature is 0. Therefore, in the embodiment of the present invention, before looking up the compensation amount at the calibration temperature of the initial external parameters, a first difference of the initial external parameters relative to the reference external parameters, and a second difference relative to the actual external parameters collected at the calibration temperature when the correspondence was generated, may first be determined. If both the first and second differences are 0, the sum of the compensation amount at the current temperature and the initial external parameters may be computed directly to obtain the target external parameters; if the first difference and/or the second difference is not 0, the operation of looking up the compensation amount at the calibration temperature is performed, which avoids unnecessary lookup operations. Further, affected by calibration precision, the initial external parameters calibrated at the calibration temperature may differ from the actual external parameters collected at the same temperature when the correspondence was generated; for example, the initial external parameters may have been calibrated manually by the user, while the actual external parameters collected when generating the correspondence were obtained through a calibration algorithm, so although the calibration temperature is the same, the values may differ. In the embodiment of the present invention, the lookup of the compensation amount at the calibration temperature is omitted, and the compensation amount at the current temperature used directly, only when both the first and second differences are 0; this ensures the compensation accuracy of the initial external parameters while avoiding unnecessary lookups. Further, assuming the calibration temperature is T0, the current temperature is T1, and the initial external parameters are [yaw0, roll0, pitch0], the corrected target external parameters may be: [yaw1, roll1, pitch1] = [yaw0, roll0, pitch0] + [f(T1) - f(T0)], where f(T1) is the compensation amount at the current temperature and f(T0) the compensation amount at the calibration temperature. As an example, taking the correspondence curve shown in FIG. 2D, FIG. 2E is a schematic diagram of another correspondence curve provided by an embodiment of the present invention. As shown in FIG. 2E, the change of the yaw angle at the calibration temperature T0 is YAW(T0) and at the current temperature T1 is YAW(T1); as the temperature changes from T0 to T1, the yaw change is Δyaw = YAW(T1) - YAW(T0). Assuming the yaw angle in the initial external parameters calibrated at T0 is yaw0, the actual yaw angle at T1, i.e., the yaw angle in the target external parameters, may be: yaw1 = yaw0 + Δyaw = yaw0 + [YAW(T1) - YAW(T0)].
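The correction [yaw1, roll1, pitch1] = [yaw0, roll0, pitch0] + [f(T1) - f(T0)] can be sketched directly; the linear form of f used here is a stand-in for the fitted correspondence and is purely illustrative:

```python
def f(t: float) -> list[float]:
    """Stand-in correspondence f(T) = [YAW(T), ROLL(T), PITCH(T)] in degrees.
    A real implementation would evaluate the fitted curves; this linear
    form is purely illustrative."""
    return [0.004 * (t + 10.0), 0.001 * (t + 10.0), 0.002 * (t + 10.0)]

def compensate(initial: list[float], t_calib: float, t_now: float) -> list[float]:
    """target = initial + [f(T1) - f(T0)]."""
    delta = [a - b for a, b in zip(f(t_now), f(t_calib))]
    return [p + d for p, d in zip(initial, delta)]

initial = [10.00, 0.50, -0.20]  # [yaw0, roll0, pitch0] calibrated at T0
target = compensate(initial, t_calib=20.0, t_now=40.0)
print(target)  # each angle shifted by the corresponding component of f(40) - f(20)
```

Because only the difference f(T1) - f(T0) enters the result, any constant offset in the reference extrinsics cancels out, which is why the relative compensation amounts generalize across individual modules.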
Step 205: perform subsequent processing based on the target external parameters.
In this step, the subsequent processing may be used to process the sensing data collected by the vision module according to the target external parameters to obtain a processing result, and to determine, according to the processing result, the relative positional relationship between an object to be located and the device carrying the vision module, i.e., to perform positioning; or to determine the attitude of the object to be located according to the processing result, i.e., to perform attitude estimation. Of course, depending on the actual application scenario, the subsequent processing may also be other operations, for example a map-drawing operation, which is not limited in the embodiments of the present invention. Further, in order to improve the accuracy of the external parameters, in the embodiment of the present invention, the target external parameters may also be verified before the subsequent processing is performed, and the subsequent processing performed only when the target external parameters pass verification, so as to ensure the accuracy of the subsequent processing. Specifically, verification may be performed through the following operations: comparing the target external parameters with a preset external parameter range, where the preset range characterizes the value range of the external parameters corresponding to the current temperature; and if the compensated current external parameters fall within the preset range, performing the subsequent processing based on the target external parameters. The preset range may be the range of values into which normal external parameters may fall at the current temperature: if the external parameters do not fall into this range at the current temperature, the target external parameters are abnormal and quite possibly wrong, in which case they may be regarded as untrustworthy; conversely, if they fall within the preset range, the target external parameters are correct and may be regarded as highly trustworthy, so subsequent processing may be performed based on them. Optionally, in the embodiment of the present invention, verification may also be performed through the following operations: processing preset sensing data according to the target external parameters to obtain a preprocessing result; comparing the preprocessing result with a standard processing result corresponding to the preset sensing data; and if the two are consistent, performing the subsequent processing based on the target external parameters. The standard processing result may be the result that would be obtained if the preset sensing data were processed with correct external parameters; accordingly, if the preprocessing result is consistent with the standard processing result, the target external parameters are correct and may be regarded as highly trustworthy, and subsequent processing may be performed based on them. It should be noted that, in the embodiment of the present invention, whether to verify the target external parameters may first be decided according to the entity that estimates the external parameters. Specifically, when the external parameters are estimated by the visual-inertial odometry (VIO) algorithm adopted by the vision module, the compensated target external parameters may be regarded as highly trustworthy; in that case they may be added directly as initial values to the subsequent processing, for example to the state estimation step. When the external parameters are not estimated by the VIO algorithm adopted by the vision module, i.e., when the positioning system does not participate in external parameter estimation, the target parameters are verified through the verification methods above and added to the subsequent processing only after passing verification. Selectively verifying only in some cases saves, to a certain extent, the processing resources consumed by verifying the target parameters.
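The first verification option (the range check) can be sketched as follows; the per-angle bounds are assumed placeholders for the preset external parameter range:

```python
# Preset per-angle value ranges (deg) into which normal extrinsics may
# fall at the current temperature; the bounds are illustrative.
PRESET_RANGE = {
    "yaw":   (9.5, 10.5),
    "roll":  (0.0, 1.0),
    "pitch": (-0.5, 0.5),
}

def is_trustworthy(target: dict[str, float]) -> bool:
    """Accept the compensated extrinsics only if every angle falls
    inside the preset range for the current temperature."""
    return all(lo <= target[name] <= hi
               for name, (lo, hi) in PRESET_RANGE.items())

print(is_trustworthy({"yaw": 10.08, "roll": 0.52, "pitch": -0.16}))  # True
print(is_trustworthy({"yaw": 12.30, "roll": 0.52, "pitch": -0.16}))  # False
```

In practice `PRESET_RANGE` would itself depend on the current temperature, e.g. derived from the correspondence curve plus a tolerance band.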
Further, taking as an example a vision module that is a visual-inertial odometer comprising a vision sensor and an IMU: in visual-inertial odometry, the outputs of the vision sensor and the IMU need to be synchronized in time to guarantee an acceptable and stable data latency, and the external parameters between the coordinate systems of the vision sensor and the IMU directly determine the accuracy of the visual-inertial odometer. For a visual-inertial positioning system, the various parameters, e.g., the camera intrinsic parameters and the external parameters between the vision sensor and the IMU, are determined at the design stage. However, precision errors of the parts during manufacturing, assembly errors when the components are assembled, and mechanical and temperature changes during use all cause these parameters to change, which in turn lowers the accuracy of the visual-inertial odometer. The data processing method provided by the embodiments of the present invention can compensate and correct the external parameters between the vision sensor and the IMU according to the current temperature during actual use, which to a certain extent improves the accuracy of the external parameters and thus the accuracy of the visual-inertial odometer.
In summary, the data processing method provided by the embodiment of the present invention first acquires the current temperature of the vision module; determines, according to the current temperature and the correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature; then acquires the initial external parameters of the vision module, compensates them according to the compensation amount to obtain the target external parameters, and continues processing according to the target parameters. Adaptively compensating the initial external parameters according to the current temperature corrects, to a certain extent, the inaccuracy of the external parameters caused by temperature, thereby ensuring the processing accuracy when the external parameters are used for subsequent processing. Meanwhile, since manual recalibration by the user is difficult and costly, the embodiment of the present invention automatically performs compensation calibration when the external parameters are inaccurate, so the user does not need to calibrate manually, which to a certain extent reduces the cost of calibration.
FIG. 3 is a block diagram of a data processing apparatus provided by an embodiment of the present invention. The apparatus 30 may include: a first acquisition module 301, configured to acquire a current temperature of a vision module; a first determination module 302, configured to determine, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature; a second acquisition module 303, configured to acquire initial external parameters of the vision module; and a compensation module 304, configured to compensate the initial external parameters according to the compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor.
Optionally, the vision module is mounted on a movable platform, and the current temperature is determined according to the ambient temperature of the environment in which the movable platform is currently located and/or the body temperature of the movable platform. Optionally, the movable platform is a UAV. Optionally, the relative pose relationship includes a relative rotation and/or a relative translation. Optionally, the pose sensor is an inertial measurement unit (IMU). Optionally, the correspondence is set according to the differences between the actual external parameters of the vision module at different temperatures and reference external parameters. Optionally, the correspondence is generated by the following modules: a control module, configured to control a device carrying a test vision module to run at different thermal powers at different ambient temperatures; a third acquisition module, configured to acquire the temperature of the test vision module as a sample temperature; a fourth acquisition module, configured to acquire the external parameter compensation amount corresponding to the test vision module at the sample temperature, the compensation amount being the difference between the actual external parameters of the test vision module at the sample temperature and the reference external parameters; and a second determination module, configured to determine the correspondence according to the sample temperatures and their corresponding compensation amounts.
Optionally, the third acquisition module 303 is specifically configured to acquire the temperature of the test vision module as the sample temperature when the thermal power of the device carrying the test vision module reaches an equilibrium state. Optionally, the equilibrium state includes a minimum thermal equilibrium state and/or a maximum thermal equilibrium state, where the minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at rated thermal power at the lowest operating ambient temperature, and the maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at maximum thermal power at the highest operating ambient temperature. Optionally, the reference external parameters are the external parameters of the test vision module when the ambient temperature is the lowest operating ambient temperature. Optionally, the compensation module 304 is specifically configured to: look up, in the correspondence, the compensation amount corresponding to the calibration temperature of the initial external parameters; compute the difference between the compensation amount corresponding to the current temperature and that corresponding to the calibration temperature; and compute the sum of the difference and the initial external parameters to obtain the target external parameters. Optionally, the compensation module 304 is further specifically configured to: determine a first difference of the initial external parameters relative to the reference external parameters, and a second difference of the initial external parameters relative to the actual external parameters at the calibration temperature when the correspondence was generated; and if the first difference and/or the second difference is not 0, perform the step of looking up, in the correspondence, the compensation amount corresponding to the calibration temperature of the initial external parameters. Optionally, the current temperature is collected by a temperature sensor provided on the vision module. Optionally, the vision module includes a time-of-flight (TOF) sensor, and the temperature sensor is arranged on the printed circuit board of the TOF sensor. Optionally, the apparatus 30 further includes: a processing module, configured to perform subsequent processing based on the target external parameters, where the subsequent processing is used to process the sensing data collected by the vision module according to the target external parameters to obtain a processing result, and to determine, according to the processing result, the relative positional relationship between an object to be located and the device carrying the vision module, or to determine the attitude of the object to be located according to the processing result. Optionally, the apparatus 30 further includes: a verification module, configured to compare the target external parameters with a preset external parameter range, where the preset range characterizes the value range of the external parameters corresponding to the current temperature, and if the compensated current external parameters fall within the preset range, perform the operation of performing subsequent processing based on the target external parameters; or to process preset sensing data according to the target external parameters to obtain a preprocessing result, compare the preprocessing result with the standard processing result corresponding to the preset sensing data, and if the two are consistent, perform the operation of performing subsequent processing based on the target external parameters. In summary, the data processing apparatus provided by the embodiment of the present invention first acquires the current temperature of the vision module; determines the compensation amount corresponding to the current temperature according to the correspondence; then acquires the initial external parameters; and finally compensates them according to the compensation amount to obtain the target external parameters. Compensating the initial external parameters according to the current temperature corrects, to a certain extent, the inaccuracy of the external parameters caused by temperature, so that the external parameters more precisely characterize the relative pose relationship between the vision sensors, or between the vision sensor and the pose sensor, thereby ensuring the processing accuracy when the external parameters are subsequently used for information processing.
Further, an embodiment of the present invention also provides a data processing apparatus, which includes a computer-readable storage medium and a processor, the processor being configured to: acquire a current temperature of a vision module; determine, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature; acquire initial external parameters of the vision module; and compensate the initial external parameters according to the compensation amount to obtain target external parameters of the vision module; wherein the vision module includes two vision sensors, and the external parameters characterize the relative pose relationship between the two vision sensors; or the vision module includes a vision sensor and a pose sensor, and the external parameters characterize the relative pose relationship between the vision sensor and the pose sensor. Optionally, the vision module is mounted on a movable platform, and the current temperature is determined according to the ambient temperature of the environment in which the movable platform is currently located and/or the body temperature of the movable platform. Optionally, the movable platform is a UAV. Optionally, the relative pose relationship includes a relative rotation and/or a relative translation. Optionally, the pose sensor is an IMU. Optionally, the correspondence is set according to the differences between the actual external parameters of the vision module at different temperatures and reference external parameters. Optionally, the correspondence is generated through the following operations: controlling a device carrying a test vision module to run at different thermal powers at different ambient temperatures; acquiring the temperature of the test vision module as a sample temperature; acquiring the compensation amount corresponding to the test vision module at the sample temperature, the compensation amount being the difference between the actual external parameters of the test vision module at the sample temperature and the reference external parameters; and determining the correspondence according to the sample temperatures and their corresponding compensation amounts. Optionally, acquiring the temperature of the test vision module as the sample temperature includes: acquiring the temperature of the test vision module as the sample temperature when the thermal power of the device carrying the test vision module reaches an equilibrium state. Optionally, the equilibrium state includes a minimum thermal equilibrium state and/or a maximum thermal equilibrium state; the minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at rated thermal power at the lowest operating ambient temperature, and the maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at maximum thermal power at the highest operating ambient temperature.
Optionally, the reference external parameters are the external parameters of the test vision module when the ambient temperature is the lowest operating ambient temperature. Optionally, the processor implements the determination of the compensation amount corresponding to the current temperature, according to the current temperature and the correspondence, by performing the following operations: looking up, in the correspondence, the compensation amount corresponding to the calibration temperature of the initial external parameters; computing the difference between the compensation amount corresponding to the current temperature and that corresponding to the calibration temperature; and computing the sum of the difference and the initial external parameters to obtain the target external parameters. Optionally, the processor is further configured to: determine a first difference of the initial external parameters relative to the reference external parameters, and a second difference of the initial external parameters relative to the actual external parameters at the calibration temperature when the correspondence was generated; and if the first difference and/or the second difference is not 0, perform the step of looking up, in the correspondence, the compensation amount corresponding to the calibration temperature of the initial external parameters. Optionally, the current temperature is collected by a temperature sensor provided on the vision module. Optionally, the vision module includes a TOF sensor, and the temperature sensor is arranged on the printed circuit board of the TOF sensor. Optionally, the processor is further configured to perform subsequent processing based on the target external parameters, where the subsequent processing is used to process the sensing data collected by the vision module according to the target external parameters to obtain a processing result, and to determine, according to the processing result, the relative positional relationship between an object to be located and the device carrying the vision module, or to determine the attitude of the object to be located according to the processing result. Optionally, the processor is further configured to: compare the target external parameters with a preset external parameter range, where the preset range characterizes the value range of the external parameters corresponding to the current temperature, and if the compensated current external parameters fall within the preset range, perform the operation of performing subsequent processing based on the target external parameters; or process preset sensing data according to the target external parameters to obtain a preprocessing result, compare the preprocessing result with the standard processing result corresponding to the preset sensing data, and if the two are consistent, perform the operation of performing subsequent processing based on the target external parameters. The operations performed by the above processor are similar to the corresponding steps of the above data processing method and achieve the same technical effects; to avoid repetition, they are not described again here. Further, an embodiment of the present invention also provides a movable platform including a vision module and the above data processing apparatus; the data processing apparatus is configured to perform the steps of the above data processing method and achieves the same technical effects, which are not repeated here. Optionally, the movable platform includes a power propeller and a drive motor for driving the power propeller. Further, an embodiment of the present invention also provides a wearable device including a vision module and the above data processing apparatus; the data processing apparatus is configured to perform the steps of the above data processing method and achieves the same technical effects, which are not repeated here. Further, an embodiment of the present invention also provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above data processing method and achieves the same technical effects, which are not repeated here.
图4为实现本发明各个实施例的一种设备的硬件结构示意图,该设备400包括但不限于:射频单元401、网络模块402、音频输出单元403、输入单元404、传感器405、显示单元406、 用户输入单元407、接口单元408、存储器409、处理器410、以及电源411等部件。本领域技术人员可以理解,图4中示出的设备结构并不构成对设备的限定,设备可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。在本发明实施例中,设备包括但不限于手机、平板电脑、笔记本电脑、掌上电脑、车载设备、可穿戴设备、以及计步器等。应理解的是,射频单元401可用于收发信息或通话过程中,信号的接收和发送,将来自基站的下行数据接收后,给处理器410处理;将上行的数据发送给基站。通常,射频单元401包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频单元401还可以通过无线通信系统与网络和其他设备通信。设备通过网络模块402为用户提供了无线的宽带互联网访问,如帮助用户收发电子邮件、浏览网页和访问流式媒体等。音频输出单元403可以将射频单元401或网络模块402接收的或者在存储器409中存储的音频数据转换成音频信号并且输出为声音。而且,音频输出单元403还可以提供与设备400执行的特定功能相关的音频输出(例如,呼叫信号接收声音、消息接收声音等等)。音频输出单元403包括扬声器、蜂鸣器以及受话器等。输入单元404用于接收音频或视频信号。输入单元404可以包括图形处理器(Graphics Processing Unit,GPU)4041和麦克风4042,图形处理器4041对在视频捕获模式或图像捕获模式中由图像捕获设备(如摄像头)获得的静态图片或视频的图像数据进行处理。处理后的图像帧可以显示在显示单元406上。经图形处理器4041处理后的图像帧可以存储在存储器409(或其它存储介质)中或者经由射频单元401或网络模块402进行发送。麦克风4042可以接收声音,并且能够将这样的声音处理为音频数据。处理后的音频数据可以在电话通话模式的情况下转换为可经由射频单元401发送到移动通信基站的格式输出。设备400还包括至少一种传感器405,比如光传感器、运动传感器以及其他传感器。光传感器包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板4061的亮度,接近传感器可在设备400移动到耳边时,关闭显示面板4061和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别设备姿态(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;传感器405还可以包括指纹传感器、压力传感器、虹膜传感器、分子传感器、陀螺仪、气压计、湿度计、温度计、红外线传感器等,在此不再赘述。显示单元406用于显示由用户输入的信息或提供给用户的信息。显示单元406可包括显示面板4061,可以采用液晶显示器(Liquid Crystal Display,LCD)、有机发光二极管(Organic Light-Emitting Diode,OLED)等形式来配置显示面板4061。用户输入单元407可用于接收输入的数字或字符信息,以及产生与设备的用户设置以及功能控制有关的键信号输入。具体地,用户输入单元407包括触控面板4041以及其他输入设备4072。触控面板4041,也称为触摸屏,可收集用户在其上或附近的触摸操作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板4041上或在触控面板4041附近的操作)。触控面板4041可包括触摸检测设备和触摸控制器两个部分。其中, 
触摸检测设备检测用户的触摸方位,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测设备上接收触摸信息,并将它转换成触点坐标,再送给处理器410,接收处理器410发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板4041。除了触控面板4041,用户输入单元407还可以包括其他输入设备4072。具体地,其他输入设备4072可以包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆,在此不再赘述。进一步的,触控面板4041可覆盖在显示面板4061上,当触控面板4041检测到在其上或附近的触摸操作后,传送给处理器410以确定触摸事件的类型,随后处理器410根据触摸事件的类型在显示面板4061上提供相应的视觉输出。虽然触控面板4041与显示面板4061是作为两个独立的部件来实现设备的输入和输出功能,但是在某些实施例中,可以将触控面板4041与显示面板4061集成而实现设备的输入和输出功能,具体此处不做限定。接口单元408为外部设备与设备400连接的接口。例如,外部设备可以包括有线或无线头戴式耳机端口、外部电源(或电池充电器)端口、有线或无线数据端口、存储卡端口、用于连接具有识别模块的设备的端口、音频输入/输出(I/O)端口、视频I/O端口、耳机端口等等。接口单元408可以用于接收来自外部设备的输入(例如,数据信息、电力等等)并且将接收到的输入传输到设备400内的一个或多个元件或者可以用于在设备400和外部设备之间传输数据。存储器409可用于存储软件程序以及各种数据。存储器409可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据手机的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器409可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。处理器410是设备的控制中心,利用各种接口和线路连接整个设备的各个部分,通过运行或执行存储在存储器409内的软件程序和/或模块,以及调用存储在存储器409内的数据,执行设备的各种功能和处理数据,从而对设备进行整体监控。处理器410可包括一个或多个处理单元;优选的,处理器410可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器410中。设备400还可以包括给各个部件供电的电源411(比如电池),优选的,电源411可以通过电源管理系统与处理器410逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。另外,设备400包括一些未示出的功能模块,在此不再赘述。
以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。本领域普通技术人员在不付出创造性的劳动的情况下,即可以理解并实施。本发明的各个部件实施例可以以硬件实现,或者以在一个或者多个处理器上运 行的软件模块实现,或者以它们的组合实现。本领域的技术人员应当理解,可以在实践中使用微处理器或者数字信号处理器来实现根据本发明实施例的计算处理设备中的一些或者全部部件的一些或者全部功能。本发明还可以实现为用于执行这里所描述的方法的一部分或者全部的设备或者装置程序(例如,计算机程序和计算机程序产品)。这样的实现本发明的程序可以存储在计算机可读介质上,或者可以具有一个或者多个信号的形式。这样的信号可以从因特网网站上下载得到,或者在载体信号上提供,或者以任何其他形式提供。例如,图5为本发明实施例提供的一种计算处理设备的框图,如图5所示,图5示出了可以实现根据本发明的方法的计算处理设备。该计算处理设备传统上包括处理器510和以存储器520形式的计算机程序产品或者计算机可读介质。存储器520可以是诸如闪存、EEPROM(电可擦除可编程只读存储器)、EPROM、硬盘或者ROM之类的电子存储器。存储器520具有用于执行上述方法中的任何方法步骤的程序代码的存储空间530。例如,用于程序代码的存储空间530可以包括分别用于实现上面的方法中的各种步骤的各个程序代码。这些程序代码可以从一个或者多个计算机程序产品中读出或者写入到这一个或者多个计算机程序产品中。这些计算机程序产品包括诸如硬盘,紧致盘(CD)、存储卡或者软盘之类的程序代码载体。这样的计算机程序产品通常为如参考图6所述的便携式或者固定存储单元。该存储单元可以具有与图5的计算处理设备中的存储器520类似布置的存储段、存储空间等。程序代码可以例如以适当形式进行压缩。通常,存储单元包括计算机可读代码,即可以由例如诸如510之类的处理器读取的代码,这些代码当由计算处理设备运行时,导致该计算处理设备执行上面所描述的方法中的各个步骤。本说明书中的各个实施例均采用递进的方式描述,每个实施例重点说明的都是与其他实施例的不同之处,各个实施例之间相同相似的部分互相参见即可。本文中所称的“一个实施例”、“实施例”或者“一个或者多个实施例”意味着,结合实施例描述的特定特征、结构或者特性包括在本发明的至少一个实施例中。此外,请注意,这里“在一个实施例中”的词语例子不一定全指同一个实施例。在此处所提供的说明书中,说明了大量具体细节。然而,能够理解,本发明的实施例可以在没有这些具体细节的情况下被实践。在一些实例中,并未详细示出公知的方法、结构和技术,以便不模糊对本说明书的理解。在权利要求中,不应将位于括号之间的任何参考符号构造成对权利要求的限制。单词“包含”不排除存在未列在权利要求中的元件或步骤。位于元件之前的单词“一”或“一个”不排除存在多个这样的元件。本发明可以借助于包括有若干不同元件的硬件以及借助于适当编程的计算机来实现。在列举了若干装置的单元权利要求中,这些装置中的若干个可以是通过同一个硬件项来具体体现。单词第一、第二、以及第三等的使用不表示任何顺序。可将这些单词解释为名称。最后应说明的是:以上实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述实施例对本发明进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的精神和范围。

Claims (36)

  1. A data processing method, characterized in that the method comprises:
    acquiring a current temperature of a vision module;
    determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, an external parameter compensation amount corresponding to the current temperature;
    acquiring initial external parameters of the vision module;
    compensating the initial external parameters according to the external parameter compensation amount, so as to obtain target external parameters of the vision module;
    wherein the vision module comprises two vision sensors, and the external parameters are used to characterize a relative pose relationship between the two vision sensors; or the vision module comprises a vision sensor and a pose sensor, and the external parameters are used to characterize a relative pose relationship between the vision sensor and the pose sensor.
  2. The method according to claim 1, characterized in that the vision module is mounted on a movable platform, and the current temperature is determined according to an ambient temperature of the environment in which the movable platform is currently located and/or a body temperature of the movable platform.
  3. The method according to claim 1, characterized in that the movable platform is an unmanned aerial vehicle.
  4. The method according to claim 1, characterized in that the relative pose relationship comprises a relative rotation amount and/or a relative translation amount.
  5. The method according to claim 1, characterized in that the pose sensor is an inertial measurement unit (IMU).
  6. The method according to claim 1, characterized in that the correspondence is set according to differences between actual external parameters of the vision module at different temperatures and reference external parameters.
  7. The method according to claim 1, characterized in that the correspondence is generated through the following operations:
    controlling a device carrying a test vision module to run at different thermal powers at different ambient temperatures;
    acquiring a temperature of the test vision module as a sample temperature;
    acquiring an external parameter compensation amount corresponding to the test vision module at the sample temperature, the external parameter compensation amount being a difference between actual external parameters of the test vision module at the sample temperature and reference external parameters;
    determining the correspondence according to the sample temperature and its corresponding external parameter compensation amount.
  8. The method according to claim 7, characterized in that acquiring the temperature of the test vision module as the sample temperature comprises:
    acquiring the temperature of the test vision module as the sample temperature when the thermal power of the device carrying the test vision module reaches an equilibrium state.
  9. The method according to claim 8, characterized in that the equilibrium state comprises a minimum thermal equilibrium state and/or a maximum thermal equilibrium state;
    wherein the minimum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at a rated thermal power at a lowest operating ambient temperature;
    the maximum thermal equilibrium state indicates that the test vision module reaches thermal equilibrium when operating at a maximum thermal power at a highest operating ambient temperature.
  10. The method according to claim 7, characterized in that the reference external parameters are external parameters of the test vision module when the ambient temperature is the lowest operating ambient temperature.
  11. The method according to claim 6, characterized in that determining, according to the current temperature and the correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature comprises:
    looking up, in the correspondence, an external parameter compensation amount corresponding to a calibration temperature of the initial external parameters;
    computing a difference between the external parameter compensation amount corresponding to the current temperature and the external parameter compensation amount corresponding to the calibration temperature;
    computing a sum of the difference and the initial external parameters to obtain the target external parameters.
  12. The method according to claim 11, characterized in that before looking up, in the correspondence, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameters, the method further comprises:
    determining a first difference of the initial external parameters relative to the reference external parameters, and a second difference of the initial external parameters relative to the actual external parameters at the calibration temperature when the correspondence was generated;
    if the first difference and/or the second difference is not 0, performing the step of looking up, in the correspondence, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameters.
  13. 根据权利要求1所述方法,其特征在于,所述当前温度是基于设置在所述视觉模组上的温度传感器采集的。
  14. 根据权利要求13所述方法,其特征在于,其中,在所述视觉模组中包含飞行时间TOF传感器,所述温度传感器设置在所述TOF传感器的印制电路板上。
  15. 根据权利要求1所述方法,其特征在于,所述根据所述外参补偿量对所述初始外参进行补偿,以得到所述视觉模组的目标外参之后,所述方法还包括:
    基于所述目标外参,进行后续处理;
    其中,所述后续处理用于根据所述目标外参对所述视觉模组采集到的传感数据进行处理,得到处理结果;根据所述处理结果确定待定位物体与搭载所述视觉模组的设备之间的相对位置关系,或者是,根据所述处理结果确定所述待定位物体的姿态。
  16. The method of claim 15, wherein before performing subsequent processing based on the target external parameter, the method further comprises:
    comparing the target external parameter against a preset external parameter range, the preset external parameter range characterizing the range of external parameter values corresponding to the current temperature, and, if the compensated external parameter falls within the preset external parameter range, performing the operation of subsequent processing based on the target external parameter;
    or, processing preset sensing data according to the target external parameter to obtain a preprocessing result, comparing the preprocessing result with a standard processing result corresponding to the preset sensing data, and, if the two are consistent, performing the operation of subsequent processing based on the target external parameter.
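The two gating checks of claim 16 can be sketched as follows. Both the per-parameter preset range and the reference-result comparison are illustrative stand-ins, and the names `within_preset_range` and `matches_reference` are hypothetical:

```python
# Hypothetical sketch of the two sanity checks of claim 16 that gate the
# use of the compensated (target) external parameter before any downstream
# processing.

def within_preset_range(target, low, high):
    """First check: every parameter of the target external parameter must
    lie in a preset per-parameter [low, high] range for the current
    temperature."""
    return all(l <= p <= h for p, l, h in zip(target, low, high))

def matches_reference(process, target, preset_data, expected, tol=1e-6):
    """Second check: run `process` (any callable taking (extrinsic, data))
    on known preset sensing data and compare the preprocessing result with
    the stored standard result, within a tolerance."""
    result = process(target, preset_data)
    return all(abs(a - b) <= tol for a, b in zip(result, expected))
```

Either check passing is enough to proceed under the claim's "or" structure; a caller would typically fall back to recalibration if both fail.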
  17. A data processing apparatus, characterized in that the data processing apparatus comprises a computer-readable storage medium and a processor, the processor being configured to perform the following operations:
    obtaining a current temperature of a vision module;
    determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature;
    obtaining an initial external parameter of the vision module;
    compensating the initial external parameter with the external parameter compensation amount to obtain a target external parameter of the vision module;
    wherein the vision module comprises two vision sensors, and the external parameter characterizes the relative pose relationship between the two vision sensors; or, the vision module comprises a vision sensor and a pose sensor, and the external parameter characterizes the relative pose relationship between the vision sensor and the pose sensor.
  18. The apparatus of claim 17, wherein the vision module is carried on a movable platform, and the current temperature is determined according to the ambient temperature of the environment in which the movable platform is currently located and/or the body temperature of the movable platform.
  19. The apparatus of claim 17, wherein the movable platform is an unmanned aerial vehicle.
  20. The apparatus of claim 17, wherein the relative pose relationship comprises a relative rotation and/or a relative translation.
  21. The apparatus of claim 17, wherein the pose sensor is an inertial measurement unit (IMU).
  22. The apparatus of claim 17, wherein the correspondence is set according to differences between actual external parameters of the vision module at different temperatures and a reference external parameter.
  23. The apparatus of claim 17, wherein the correspondence is generated by the following operations:
    controlling a device carrying a test vision module to operate at different thermal powers in different ambient temperatures;
    obtaining a temperature of the test vision module as a sample temperature;
    obtaining the external parameter compensation amount corresponding to the test vision module at the sample temperature, the external parameter compensation amount being the difference between the actual external parameter of the test vision module at the sample temperature and a reference external parameter;
    determining the correspondence according to the sample temperature and its corresponding external parameter compensation amount.
  24. The apparatus of claim 23, wherein obtaining the temperature of the test vision module as the sample temperature comprises:
    obtaining the temperature of the test vision module as the sample temperature when the thermal power of the device carrying the test vision module reaches a balanced state.
  25. The apparatus of claim 24, wherein the balanced state comprises a minimum thermal balance state and/or a maximum thermal balance state;
    wherein the minimum thermal balance state indicates that the test vision module reaches thermal balance when operating at rated thermal power at the lowest working ambient temperature;
    and the maximum thermal balance state indicates that the test vision module reaches thermal balance when operating at maximum thermal power at the highest working ambient temperature.
  26. The apparatus of claim 23, wherein the reference external parameter is the external parameter of the test vision module when the ambient temperature is the lowest working ambient temperature.
  27. The apparatus of claim 22, wherein the processor implements the determining, according to the current temperature and the correspondence between different temperature values and different external parameter compensation amounts, of the external parameter compensation amount corresponding to the current temperature by performing the following operations:
    looking up, in the correspondence, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameter;
    calculating the difference between the external parameter compensation amount corresponding to the current temperature and the external parameter compensation amount corresponding to the calibration temperature;
    calculating the sum of the difference and the initial external parameter to obtain the target external parameter.
  28. The apparatus of claim 27, wherein the processor is further configured to perform the following operations:
    determining a first difference between the initial external parameter and the reference external parameter, and a second difference between the initial external parameter and the actual external parameter at the calibration temperature when the correspondence was generated;
    if the first difference and/or the second difference is not 0, performing the step of looking up, in the correspondence, the external parameter compensation amount corresponding to the calibration temperature of the initial external parameter.
  29. The apparatus of claim 17, wherein the current temperature is collected by a temperature sensor disposed on the vision module.
  30. The apparatus of claim 29, wherein the vision module comprises a time-of-flight (TOF) sensor, and the temperature sensor is disposed on a printed circuit board of the TOF sensor.
  31. The apparatus of claim 17, wherein the processor is further configured to perform the following operations:
    performing subsequent processing based on the target external parameter;
    wherein the subsequent processing is used to process, according to the target external parameter, sensing data collected by the vision module to obtain a processing result, and to determine, according to the processing result, the relative positional relationship between an object to be located and the device carrying the vision module, or to determine, according to the processing result, the attitude of the object to be located.
  32. The apparatus of claim 31, wherein the processor is further configured to perform the following operations:
    comparing the target external parameter against a preset external parameter range, the preset external parameter range characterizing the range of external parameter values corresponding to the current temperature, and, if the compensated external parameter falls within the preset external parameter range, performing the operation of subsequent processing based on the target external parameter;
    or, processing preset sensing data according to the target external parameter to obtain a preprocessing result, comparing the preprocessing result with a standard processing result corresponding to the preset sensing data, and, if the two are consistent, performing the operation of subsequent processing based on the target external parameter.
  33. A movable platform, characterized in that the movable platform comprises a vision module and the data processing apparatus of any one of claims 17 to 32, the data processing apparatus being configured to perform the following operations:
    obtaining a current temperature of the vision module;
    determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature;
    obtaining an initial external parameter of the vision module;
    compensating the initial external parameter with the external parameter compensation amount to obtain a target external parameter of the vision module;
    wherein the vision module comprises two vision sensors, and the external parameter characterizes the relative pose relationship between the two vision sensors; or, the vision module comprises a vision sensor and a pose sensor, and the external parameter characterizes the relative pose relationship between the vision sensor and the pose sensor.
  34. The movable platform of claim 33, wherein the movable platform comprises a power propeller and a drive motor for driving the power propeller.
  35. A wearable device, characterized in that the wearable device comprises a vision module and the data processing apparatus of any one of claims 17 to 32, the data processing apparatus being configured to perform the following operations:
    obtaining a current temperature of the vision module;
    determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature;
    obtaining an initial external parameter of the vision module;
    compensating the initial external parameter with the external parameter compensation amount to obtain a target external parameter of the vision module;
    wherein the vision module comprises two vision sensors, and the external parameter characterizes the relative pose relationship between the two vision sensors; or, the vision module comprises a vision sensor and a pose sensor, and the external parameter characterizes the relative pose relationship between the vision sensor and the pose sensor.
  36. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the following operations are implemented:
    obtaining a current temperature of a vision module;
    determining, according to the current temperature and a correspondence between different temperature values and different external parameter compensation amounts, the external parameter compensation amount corresponding to the current temperature;
    obtaining an initial external parameter of the vision module;
    compensating the initial external parameter with the external parameter compensation amount to obtain a target external parameter of the vision module;
    wherein the vision module comprises two vision sensors, and the external parameter characterizes the relative pose relationship between the two vision sensors; or, the vision module comprises a vision sensor and a pose sensor, and the external parameter characterizes the relative pose relationship between the vision sensor and the pose sensor.
PCT/CN2020/085640 2020-04-20 2020-04-20 Data processing method and apparatus, movable platform, and wearable device WO2021212278A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/085640 WO2021212278A1 (zh) 2020-04-20 2020-04-20 Data processing method and apparatus, movable platform, and wearable device

Publications (1)

Publication Number Publication Date
WO2021212278A1 true WO2021212278A1 (zh) 2021-10-28

Family

ID=78271026



Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114216482A (zh) * 2021-12-14 2022-03-22 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for determining ground-truth trajectory external parameters, storage medium, and electronic device
CN114459615A (zh) * 2021-12-14 2022-05-10 Zhejiang Dahua Technology Co., Ltd. Compensation method and apparatus for infrared thermal-imaging temperature measurement equipment
CN116124081A (zh) * 2023-04-18 2023-05-16 Fitow (Tianjin) Detection Technology Co., Ltd. Non-contact workpiece inspection method and apparatus, electronic device, and medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105354854A (zh) * 2015-12-01 2016-02-24 State Grid Corporation of China Dynamic joint calibration method and system for camera parameters based on a 3D digital model
CN107016144A (zh) * 2015-10-12 2017-08-04 Airbus Operations SAS Method for predicting the temperature tolerated by a component, device, or structure of an aircraft
CN107401977A (zh) * 2017-08-15 2017-11-28 Hefei University of Technology Imaging compensation method accounting for refraction deviation in high-temperature binocular stereo vision measurement
US20180150976A1 (en) * 2016-11-25 2018-05-31 Continental Teves Ag & Co. Ohg Method for automatically establishing extrinsic parameters of a camera of a vehicle



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20932248; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20932248; Country of ref document: EP; Kind code of ref document: A1)