WO2019119282A1 - Method, apparatus and movable platform for associating image and position information - Google Patents

Method, apparatus and movable platform for associating image and position information

Info

Publication number
WO2019119282A1
Authority
WO
WIPO (PCT)
Prior art keywords: time, target, observation value, exposure time, exposure
Application number: PCT/CN2017/117302
Other languages: English (en), French (fr)
Inventors: 钟承群, 崔留争
Original Assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority: CN201780004751.2A (published as CN108513710A)
Priority: PCT/CN2017/117302 (published as WO2019119282A1)
Publication: WO2019119282A1


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time

Definitions

  • the present invention relates to the field of mapping technology, and in particular, to a method and device for associating image and location information, and a movable platform.
  • the shooting mode adopted is generally hover shooting.
  • GPS: Global Positioning System.
  • the embodiment of the invention provides a method, a device and a movable platform for associating image and location information, which can improve the accuracy and efficiency of mapping.
  • an embodiment of the present invention provides a method for associating image and location information, including:
  • a shooting instruction is sent to the shooting device, the shooting instruction is used to instruct the shooting device to perform shooting to obtain a captured image;
  • the captured image is associated with the target location information.
  • an embodiment of the present invention provides an apparatus for associating image and location information, including:
  • a communication interface configured to send a shooting instruction to the shooting device when the processor detects a shooting triggering event, the shooting instruction is used to instruct the shooting device to perform shooting to obtain a captured image
  • the processor is further configured to calculate a target exposure time at which the photographing device exposes the captured image, determine target position information at the target exposure time, and associate the captured image with the target position information.
  • an embodiment of the present invention provides a mobile platform, including:
  • a power system disposed on the fuselage for providing flight power
  • a photographing device configured to receive a photographing instruction and perform photographing according to the photographing instruction to obtain a photographed image
  • an embodiment of the present invention provides a computer readable storage medium, where the computer storage medium stores a computer program, the computer program includes program instructions, and the program instructions, when executed by a processor, cause the processor to perform the method for associating image and location information of the first aspect described above.
  • by implementing the embodiments of the present invention, the movable platform can accurately associate the captured image with its own position information while flying and shooting, thereby improving the accuracy and efficiency of mapping.
  • FIG. 1 is a schematic flowchart of a method for associating image and location information according to an embodiment of the present invention
  • FIG. 2 is a schematic flowchart diagram of another method for associating image and location information according to an embodiment of the present invention
  • FIG. 3 is a schematic diagram of area division of a captured image according to an embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of an apparatus for associating image and location information according to an embodiment of the present invention
  • FIG. 5 is a schematic structural diagram of a mobile platform according to an embodiment of the present invention.
  • a movable platform such as a drone can obtain accurate GPS position information of the scene, but the movable platform needs to hover every time an image is taken, which makes the mapping efficiency low.
  • an embodiment of the present invention provides a method for associating an image and location information.
  • the method for associating the image and the location information may be performed by a mobile platform.
  • the movable platform is associated with a photographing device, and the photographing device performs a photographing operation to obtain a captured image after receiving the photographing instruction transmitted by the movable platform.
  • the manner in which the photographing device performs photographing to obtain a photographed image may be fixed-point photographing, in which case the movable platform transmits a photographing instruction to the photographing device after flying to a specified GPS range.
  • the manner in which the photographing device performs photographing to obtain a photographed image may also be a fixed distance photographing.
  • the movable platform transmits a photographing instruction to the photographing device after flying a certain distance.
  • the manner in which the photographing device performs photographing to obtain a photographed image may also be timed photographing.
  • the movable platform sends a photographing instruction to the photographing device after a certain time.
  • when the movable platform transmits a shooting instruction to the photographing device, the movable platform acquires the exposure time of the central exposure position of the captured image (hereinafter referred to as the central exposure time).
  • central exposure time = the exposure time at which the captured image starts to be exposed at the initial exposure position (hereinafter referred to as the initial exposure time) + the unit exposure time required for the photographing device to expose one pixel * the number of target pixels from the initial exposure position of the captured image to the central exposure position.
  • the image sensor will issue a target interruption before starting the exposure, and the movable platform can record the moment when the target interruption occurs.
  • the target interrupt may be, for example, a VSYNC (field sync signal) interrupt of VIN (input signal).
  • the exposure preparation time is the time that elapses from the occurrence of the target interrupt until the image sensor starts the exposure.
  • the number of target pixels from the initial exposure position of the captured image to the central exposure position may be determined according to the number of rows of pixels of the captured image and the number of column pixels included in a column of pixels in the captured image.
  • the method may specifically include: acquiring a starting exposure time of the captured image; calculating the unit exposure time required for the photographing device to expose one pixel; calculating the number of target pixels from the initial exposure position of the captured image to the central exposure position; and using the sum of the starting exposure time and the product of the unit exposure time and the number of target pixels as the central exposure time of the captured image.
  • the method may specifically include: recording the occurrence time of the target interrupt; acquiring the exposure preparation time; and using the sum of the interrupt occurrence time and the exposure preparation time as the starting exposure time of the captured image.
  • the movable platform calculating the unit exposure time required for the photographing device to expose one pixel may include: acquiring the line exposure time required by the photographing device to expose one row of pixels; acquiring the number of pixels included in one row of pixels of the captured image; and using the quotient of the line exposure time and the number of row pixels as the unit exposure time required for the photographing device to expose one pixel.
  • the method may further include: acquiring the number of row pixels included in one row of pixels in the captured image; acquiring the number of column pixels included in one column of pixels in the captured image; and using the sum of the product of half the number of column pixels and the number of row pixels, plus half the number of row pixels, as the number of target pixels from the initial exposure position of the captured image to the central exposure position.
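Taken together, the calculation described in the preceding bullets can be sketched as follows. This is an illustrative sketch only; the function and parameter names and the time unit (microseconds) are assumptions, not from the patent:

```python
# Sketch of the central-exposure-time calculation (names and units assumed).
def central_exposure_time(interrupt_time_us, exposure_prep_us,
                          line_exposure_us, row_pixels, col_pixels):
    # starting exposure time = target-interrupt moment + exposure preparation time
    start_time_us = interrupt_time_us + exposure_prep_us
    # unit exposure time = line exposure time / number of pixels per row
    unit_time_us = line_exposure_us / row_pixels
    # target pixels from the initial to the central exposure position:
    # half the column pixel count in full rows, plus half a row
    target_pixels = (col_pixels // 2) * row_pixels + row_pixels // 2
    return start_time_us + unit_time_us * target_pixels
```

With one row of 4860 pixels and one column of 3648 pixels (the FIG. 3 numbers), the target pixel count works out to 1824*4860 + 2430 = 8,867,070.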
  • the mobile platform includes a GPS-RTK (Real-Time Kinematic) module.
  • after the movable platform acquires the central exposure time, the movable platform transmits the central exposure time to the GPS-RTK module; the GPS-RTK module calculates accurate GPS position information at the central exposure time by an interpolation operation and feeds the GPS position information back to the movable platform, and the movable platform associates the GPS position information with the captured image.
  • alternatively, the movable platform stores the central exposure time, and the central exposure time is jointly post-processed with the PPK (Post-Processed Kinematic) raw data saved by the movable platform to calculate the GPS position information of the movable platform, and the GPS position information is associated with the captured image.
  • the central exposure time may be stored in an event log of a Secure Digital (SD) card of the camera.
  • the manner in which the movable platform associates the GPS position information with the captured image may specifically be to store the GPS position information into the Exif data and XMP data of the captured image, so as to facilitate subsequent generation of a map.
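Exif stores GPS coordinates as degree/minute/second rationals. The patent does not specify how the fields are written; as a purely illustrative sketch of the conversion step:

```python
# Convert a decimal-degree coordinate into the (numerator, denominator)
# rational triple used by the Exif GPSLatitude/GPSLongitude tags.
# Illustrative only; the patent does not describe this step.
def to_exif_dms(decimal_degrees):
    degrees = int(decimal_degrees)
    minutes = int((decimal_degrees - degrees) * 60)
    # keep two decimal places of the seconds, as a rational over 100
    seconds_x100 = round(((decimal_degrees - degrees) * 60 - minutes) * 60 * 100)
    return ((degrees, 1), (minutes, 1), (seconds_x100, 100))
```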
  • by implementing the embodiments of the present invention, the movable platform can accurately associate the captured image with its own position information while flying and shooting, thereby improving the accuracy and efficiency of mapping.
  • FIG. 1 is a schematic flowchart diagram of a method for associating image and location information according to an embodiment of the present invention. As shown in FIG. 1, the method for associating the image and the location information may include:
  • the photographing device may be, for example, a camera, a video camera, or the like.
  • the steps of the method for associating image and location information provided by the embodiments of the present invention may be performed by a mobile platform and applied to the field of mapping.
  • the movable platform may be an aircraft such as a drone, or may be a car, a ship or the like.
  • the shooting triggering event may include: the movable platform flying within a specified GPS range, the movable platform flying a specified distance, the movable platform flying for a specified time, the movable platform receiving a specified remote control instruction or voice instruction, the movable platform acquiring specified attitude data of itself, the movable platform detecting a specified gesture action, and the like; the embodiment of the present invention places no restriction on this.
  • S102 Calculate a target exposure time at which the photographing device exposes the captured image.
  • the movable platform performing the calculating of the target exposure time at which the photographing device exposes the captured image may specifically include: acquiring an exposure preparation time of the photographing device; and calculating, according to the exposure preparation time, the target exposure time at which the photographing device exposes the captured image.
  • the exposure preparation time refers to a time between a time when the photographing instruction is received by the photographing device and a time when the photographing device starts exposure.
  • the exposure preparation time of the photographing device may be pre-stored in the movable platform.
  • the movable platform may use the sending moment of the shooting instruction as the moment when the photographing device receives the photographing instruction; that is, the exposure preparation time is the time between the sending moment of the shooting instruction and the moment when the photographing device starts exposure.
  • the movable platform performing the calculating, according to the exposure preparation time, of the target exposure time at which the photographing device exposes the captured image may specifically include: recording the sending moment of the shooting instruction, and determining the sum of the sending moment of the shooting instruction and the exposure preparation time as the target exposure time at which the photographing device exposes the captured image.
  • the method may specifically include: acquiring a specified exposure time at which the photographing device exposes the captured image; and calculating, according to the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • the movable platform performing the calculating, according to the specified exposure time, of the target exposure time at which the photographing device exposes the captured image may specifically include: determining the sum of the sending moment of the shooting instruction and the specified exposure time as the target exposure time at which the photographing device exposes the captured image.
  • the specified exposure time may be the time between the moment when the photographing device starts exposure and the moment when the photographing device is exposed to a specified exposure position; in this case, the movable platform performing the acquiring of the specified exposure time at which the photographing device exposes the captured image may specifically include: calculating a unit exposure time required for the photographing device to expose a single pixel; calculating the number of target pixels from the initial exposure position in the captured image to the specified exposure position; and determining, according to the unit exposure time and the number of target pixels, the specified exposure time at which the photographing device exposes the captured image.
  • the initial exposure position is the position where the first exposed pixel of the captured image is located, and the specified exposure position is the position where the designated pixel in the captured image is located.
  • the movable platform performing the calculating of the target exposure time at which the photographing device exposes the captured image may specifically include: acquiring an exposure preparation time of the photographing device; acquiring a specified exposure time at which the photographing device exposes the captured image; and calculating, according to the exposure preparation time and the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • the movable platform performing the determining, according to the exposure preparation time and the specified exposure time, of the target exposure time at which the photographing device exposes the captured image may specifically include: determining the sum of the sending moment of the shooting instruction, the exposure preparation time and the specified exposure time as the target exposure time at which the photographing device exposes the captured image.
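All of the variants above (exposure preparation time only, specified exposure time only, or both) reduce to a single sum. A minimal sketch with assumed names, where the reference moment is either the sending moment of the shooting instruction or the occurrence moment of the target interrupt:

```python
# Target exposure time as the sum of a reference moment and the offsets
# described above (all values in the same time unit; names assumed).
def target_exposure_time(reference_moment, exposure_prep_time=0.0,
                         specified_exposure_time=0.0):
    return reference_moment + exposure_prep_time + specified_exposure_time
```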
  • S103 Determine target position information at the target exposure time.
  • the determining, by the mobile platform, the target location information at the target exposure time may include: acquiring a first carrier phase observation value; determining, according to the first carrier phase observation value, Target position information at the target exposure time.
  • the first carrier phase observation value is satellite positioning data received by the movable platform.
  • the determining, by the movable platform, of the target position information at the target exposure time may include: determining, according to the first carrier phase observation value, position information at an observation value acquisition time; and determining, according to the position information at the observation value acquisition time, the target position information at the target exposure time.
  • the observation value acquisition time refers to the acquisition time of the first carrier phase observation value.
  • the movable platform performing the determining, according to the position information at the observation value acquisition times, of the target position information at the target exposure time may specifically include: determining whether a target observation value acquisition time exists among the recorded observation value acquisition times, the target observation value acquisition time being the same moment as the target exposure time; and if so, determining the position information at the target observation value acquisition time as the target position information at the target exposure time.
  • in this case, the target exposure time is itself an observation value acquisition time of the movable platform, so the movable platform can directly calculate the target position information at the target exposure time according to the first carrier phase observation value.
  • the movable platform may estimate the target position information at the target exposure time by an interpolation operation.
  • the movable platform periodically acquires the first carrier phase observation value.
  • the preset interval may be, for example, n times the period at which the movable platform acquires the first carrier phase observation value (n being a positive integer).
  • the movable platform may estimate the target position information at the target exposure time according to the position information at the greatest observation value acquisition time among all observation value acquisition times less than the target exposure time, and the position information at the smallest observation value acquisition time among all observation value acquisition times greater than the target exposure time.
  • the movable platform can estimate the target position information at the target exposure time based on the position information at the time of the two observations closest to the target exposure time.
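The exact-match case and the interpolation fallback described above can be sketched as follows; linear interpolation between the two nearest acquisition times is assumed, since the patent only says the target position is estimated by an interpolation operation:

```python
# Estimate the position at the target exposure time from timestamped
# observations: a dict mapping acquisition time -> (x, y, z) position.
# Illustrative sketch; linear interpolation is an assumption.
def position_at(target_time, observations):
    if target_time in observations:      # a target observation value exists
        return observations[target_time]
    # greatest acquisition time below and smallest above the target time
    before = max(t for t in observations if t < target_time)
    after = min(t for t in observations if t > target_time)
    w = (target_time - before) / (after - before)
    return tuple(p0 + w * (p1 - p0) for p0, p1 in
                 zip(observations[before], observations[after]))
```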
  • the performing, by the mobile platform, performing the associating the captured image with the target location information may include: storing the target location information into target metadata, the target element The data is used to record attribute information of the captured image.
  • the target metadata is Exif data or XMP data of the captured image.
  • the movable platform stores the target location information at the target exposure time into the target metadata of the captured image, so as to facilitate subsequent generation of the map.
  • by implementing the embodiments of the present invention, the movable platform can associate the captured image with its own accurate position information while flying and shooting, improving mapping accuracy and efficiency.
  • FIG. 2 is a schematic flowchart diagram of another method for associating image and location information according to an embodiment of the present invention.
  • the method for associating the image and the location information may include:
  • the photographing device may be, for example, a camera, a video camera, or the like.
  • each step of the method for associating images and location information provided by the embodiments of the present invention may be performed by a mobile platform.
  • the movable platform may be an aircraft such as a drone, or may be a car, a ship or the like.
  • the shooting triggering event may include: the movable platform flying within a specified GPS range, the movable platform flying a specified distance, the movable platform flying for a specified time, the movable platform receiving a specified remote control instruction or voice instruction, the movable platform acquiring specified attitude data of itself, the movable platform detecting a specified gesture action, and the like; the embodiment of the present invention places no restriction on this.
  • S202 Generate a captured image identifier corresponding to the captured image.
  • the captured image identifier (i.e., the ID of the captured image) is used to distinguish captured images taken by the photographing device at different target exposure times; that is, there is a one-to-one correspondence between the captured image and the captured image identifier.
  • S203 Calculate a target exposure time at which the photographing device exposes the photographed image.
  • the movable platform performing the calculating of the target exposure time at which the photographing device exposes the captured image may specifically include: acquiring an exposure preparation time of the photographing device; and calculating, according to the exposure preparation time, the target exposure time at which the photographing device exposes the captured image.
  • the exposure preparation time refers to a time between a time when the photographing instruction is received by the photographing device and a time when the photographing device starts exposure. It should be noted that the exposure preparation time is determined by an image sensor in the imaging device. That is to say, photographing apparatuses having different image sensors have different exposure preparation times. As an optional implementation manner, the exposure preparation time of the photographing device may be pre-stored in the movable platform.
  • the movable platform may use the sending moment of the shooting instruction as the moment when the photographing device receives the photographing instruction; that is, the exposure preparation time is the time between the sending moment of the shooting instruction and the moment when the photographing device starts exposure.
  • the movable platform performing the calculating, according to the exposure preparation time, of the target exposure time at which the photographing device exposes the captured image may specifically include: recording the sending moment of the shooting instruction, and determining the sum of the sending moment of the shooting instruction and the exposure preparation time as the target exposure time at which the photographing device exposes the captured image.
  • the photographing device issues a target interrupt when the photographing device receives the photographing instruction.
  • the target interrupt can be, for example, a VSYNC interrupt of VIN.
  • the movable platform may use the moment when the target interrupt occurs as the moment when the photographing device receives the photographing instruction; that is, the exposure preparation time is the time between the moment when the target interrupt occurs and the moment when the photographing device starts exposure.
  • the photographing device may record the occurrence time of the target interruption and transmit the occurrence timing of the target interruption to the movable platform.
  • the movable platform performing the calculating, according to the exposure preparation time, of the target exposure time at which the photographing device exposes the captured image may specifically include: receiving the occurrence time of the target interrupt sent by the photographing device, and determining the sum of the occurrence time of the target interrupt and the exposure preparation time as the target exposure time at which the photographing device exposes the captured image.
  • the method may specifically include: acquiring a specified exposure time at which the photographing device exposes the captured image; and calculating, according to the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • the movable platform performing the calculating, according to the specified exposure time, of the target exposure time at which the photographing device exposes the captured image may specifically include: determining the sum of the sending moment of the shooting instruction and the specified exposure time as the target exposure time at which the photographing device exposes the captured image.
  • the movable platform performing the calculating, according to the specified exposure time, of the target exposure time at which the photographing device exposes the captured image may further include: determining the sum of the occurrence moment of the target interrupt and the specified exposure time as the target exposure time at which the photographing device exposes the captured image.
  • the specified exposure time is a time between a time when the photographing device starts exposure and a specified time during exposure.
  • the specified moment in the exposure process may be a moment when the photographing device is exposed to a specified exposure position.
  • when the specified exposure time is the time between the moment when the photographing device starts exposure and the moment when the photographing device is exposed to a specified exposure position, the movable platform performing the acquiring of the specified exposure time at which the photographing device exposes the captured image may specifically include: calculating a unit exposure time required for the photographing device to expose a single pixel; calculating the number of target pixels from the initial exposure position in the captured image to the specified exposure position; and calculating, according to the unit exposure time and the number of target pixels, the specified exposure time at which the photographing device exposes the captured image.
  • the photographing device may expose the captured image by using a rolling shutter; that is, the photographing device exposes the pixels in the captured image row by row.
  • the initial exposure position is the position where the first exposed pixel of the captured image is located, and the specified exposure position is the position where the designated pixel in the captured image is located.
  • the number of target pixels from the initial exposure position in the captured image to the specified exposure position may be determined according to the number of row pixels included in one row of pixels in the captured image and the position of the specified pixel in the captured image (i.e., which row and which column the specified pixel is in), where the number of row pixels is determined according to the resolution of the captured image.
  • it should be noted that the number of row pixels and the number of column pixels refer to those of the entire captured image, not merely the numbers of row and column pixels in the reserved area.
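Under this convention, the rolling-shutter specified exposure time can be sketched as follows. This is an illustration only, where the specified pixel is described by the number of complete rows exposed before it plus its column position (matching the FIG. 3 example, in which the central pixel gives 4860*1824 + 2430 target pixels):

```python
# Rolling shutter: pixels are exposed row by row, so the specified exposure
# time is the number of pixels exposed before the specified pixel times the
# unit exposure time (names and units assumed).
def specified_exposure_time(full_rows_before, column, row_pixels,
                            line_exposure_time):
    unit_time = line_exposure_time / row_pixels        # time per pixel
    target_pixels = full_rows_before * row_pixels + column
    return unit_time * target_pixels
```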
  • FIG. 3 is a schematic diagram of area division of a captured image according to an embodiment of the present invention.
  • the resolution of the captured image 30 shown in FIG. 3 is 4000*3000; one row of pixels contains 4860 pixels, and one column of pixels contains 3648 pixels.
  • the captured image 30 includes a reserved area 301 and a crop area 302 surrounding the reserved area 301; the position where the first pixel of the crop area 302 (or of the captured image 30) is located is the initial exposure position 303, and the position of the central pixel of the reserved area 301 (or of the captured image 30) is the central exposure position 304.
  • the central pixel refers to the pixel in the 1824th row and the 2430th column of the captured image 30; in this case, the number of target pixels is (4860*1824+2430).
  • the line exposure time required for the exposure of the photographing device to a row of pixels is determined by the image sensor in the photographing device.
  • the line exposure time required for the exposure of the photographing device to a row of pixels may be pre-stored in the movable platform.
  • the movable platform performing the calculating of the unit exposure time required for the photographing device to expose a single pixel may specifically include: acquiring the line exposure time required for the photographing device to expose one row of pixels and the number of pixels included in one row of the captured image; and calculating, according to the line exposure time and the number of row pixels, the unit exposure time required for the photographing device to expose a single pixel.
  • the specified time during the exposure may be the time at which the exposure process proceeds to a specified extent (eg, halfway through the exposure process).
  • the specified exposure time may be a specified ratio of the total exposure time required for the photographing device to expose the captured image (eg, half of the total exposure time).
  • when the specified exposure time is a specified ratio of the total exposure time required for the photographing device to expose the captured image, the movable platform performing the acquiring of the specified exposure time at which the photographing device exposes the captured image may further include: acquiring the total exposure time required for the photographing device to expose the captured image; and calculating, according to the total exposure time, the specified exposure time at which the photographing device exposes the captured image.
  • the photographing device may expose the captured image by using a rolling shutter, or may expose the captured image by using a global shutter; in the latter case, the photographing device exposes every pixel in the captured image simultaneously.
  • the movable platform performing the calculating, according to the total exposure time, of the specified exposure time at which the photographing device exposes the captured image may specifically include: using the specified ratio of the total exposure time as the specified exposure time.
  • the specified ratio may be pre-stored in the movable platform.
  • the movable platform performing the acquiring of the total exposure time required for the photographing device to expose the captured image may specifically include: calculating a unit exposure time required for the photographing device to expose a single pixel; calculating the total number of pixels included in the captured image; and calculating, according to the unit exposure time and the total number of pixels, the total exposure time required for the photographing device to expose the captured image.
  • the total number of pixels is determined according to the resolution of the captured image.
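Combining the last few bullets, the ratio-based specified exposure time might look like this; a sketch with assumed names and units, where the ratio of one half is only the example given in the text:

```python
# Specified exposure time as a specified ratio of the total exposure time,
# with the total computed from the per-pixel unit time (names assumed).
def specified_from_ratio(line_exposure_time, row_pixels, total_pixels,
                         ratio=0.5):
    unit_time = line_exposure_time / row_pixels   # time to expose one pixel
    total_time = unit_time * total_pixels         # time to expose the image
    return ratio * total_time
```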
  • the specified ratio may be pre-stored in the movable platform when the photographing device exposes the photographed image using a global shutter.
  • the movable platform performing the calculating of the target exposure time at which the photographing device exposes the captured image may specifically include: acquiring an exposure preparation time of the photographing device; acquiring a specified exposure time at which the photographing device exposes the captured image; and calculating, according to the exposure preparation time and the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • when the movable platform determines, according to the exposure preparation time and the specified exposure time, the target exposure time at which the photographing device exposes the captured image, this may specifically include: determining the sum of the sending time of the shooting instruction, the exposure preparation time, and the specified exposure time as the target exposure time at which the photographing device exposes the captured image; alternatively, it may include: determining the sum of the time at which the target interrupt occurs, the exposure preparation time, and the specified exposure time as the target exposure time at which the photographing device exposes the captured image.
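The two variants above anchor the same sum on different reference times (the instruction send time or the target-interrupt time). A minimal sketch with hypothetical variable names and sample values:

```python
def target_exposure_time(anchor_time_s, exposure_prep_s, specified_exposure_s):
    """Target exposure time as described above: the anchor is either the
    time the shooting instruction was sent or the time the target interrupt
    occurred; to it are added the exposure preparation time and the
    specified exposure time."""
    return anchor_time_s + exposure_prep_s + specified_exposure_s

# e.g. interrupt at t = 100.000 s, 2 ms preparation, 30 ms specified exposure
t_target = target_exposure_time(100.000, 0.002, 0.030)  # 100.032 s
```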
  • the manner in which the movable platform performs the association between the captured image identifier and the target exposure time may specifically include: associating the captured image identifier with the target exposure time.
  • the movable platform can query, according to the target exposure time, the captured image identifier of the captured image captured by the photographing device at the target exposure time; since there is a one-to-one correspondence between captured images and captured image identifiers, the movable platform can further query the captured image taken at the target exposure time.
  • S205 Determine target position information at the target exposure time.
  • the determining, by the mobile platform, the target location information at the target exposure time may include: acquiring a first carrier phase observation value; and according to the first carrier The phase observation value determines the target position information at the target exposure time.
  • the first carrier phase observation value is satellite positioning data received by the movable platform.
  • the determining, by the mobile platform, the target location information at the target exposure time may include: determining, according to the first carrier phase observation value, location information at an observation value collection time; The position information at the time of the observation value acquisition is determined, and the target position information at the target exposure time is determined.
  • the observation value acquisition time refers to the acquisition time of the first carrier phase observation value.
  • when the movable platform determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, this may include: receiving the second carrier phase observation value and reference position information sent by the reference base station; and calculating the position information at the observation value acquisition time using a Real-Time Kinematic (RTK) carrier phase difference technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • the mobile platform can acquire the second carrier phase observation value collected by the reference base station and the reference location information of the reference base station in real time.
  • the second carrier phase observation value is satellite positioning data received by the reference base station.
  • the reference position information refers to the position information (such as coordinates) of the reference base station, and the observation value acquisition time refers to the collection time of the first carrier phase observation value or the second carrier phase observation value.
  • the movable platform calculates the position information at the observation value acquisition time according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information acquired at the same moment.
  • alternatively, when the movable platform determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, this may specifically include: sending a location acquisition instruction carrying the first carrier phase observation value to the reference base station, the instruction being used to instruct the reference base station to calculate the position information at the observation value acquisition time using the real-time kinematic carrier phase difference technique; and receiving the position information at the observation value acquisition time fed back by the reference base station.
  • in this case, the calculation of the position information at the observation value acquisition time is performed by the reference base station and is transparent to the movable platform.
  • when the movable platform determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, the method may alternatively include: storing the first carrier phase observation value; acquiring the second carrier phase observation value collected by the reference base station and the reference position information of the reference base station; and calculating the position information at the observation value acquisition time using a Post Processed Kinematic (PPK) technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • the movable platform may obtain the second carrier phase observation value and the reference position information directly from the reference base station, or may acquire them from a third-party device. The first carrier phase observation value (also referred to as PPK RAW data) is acquired periodically once the movable platform begins to operate.
  • when the movable platform associates the captured image with the target position information, this may include: querying the captured image identifier corresponding to the target exposure time; and storing the target position information in the target metadata, the target metadata being used to record attribute information of the captured image corresponding to the captured image identifier.
  • the target metadata is Exif data or XMP data of the captured image.
  • the movable platform stores the target location information at the target exposure time into the target metadata of the captured image, so as to facilitate subsequent generation of the map.
  • in this way, the captured image can be associated with accurate position information while the platform shoots in flight, which improves mapping accuracy and efficiency.
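Exif stores GPS coordinates as degree/minute/second rationals plus a hemisphere reference letter. As a minimal sketch of the conversion a platform would perform before writing the target position into an image's Exif metadata (the helper name and the sample coordinate are ours, not the patent's; float edge cases in the minute rounding are ignored):

```python
from fractions import Fraction

def to_exif_dms(decimal_deg, is_latitude=True):
    """Convert a decimal-degree coordinate to Exif-style
    ((deg, 1), (min, 1), (sec_num, sec_den)) rationals plus the
    hemisphere reference letter ("N"/"S" or "E"/"W")."""
    ref = ("N" if decimal_deg >= 0 else "S") if is_latitude else \
          ("E" if decimal_deg >= 0 else "W")
    value = abs(decimal_deg)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    # Remaining seconds, approximated as a rational with a bounded denominator
    seconds = Fraction((value - degrees) * 3600 - minutes * 60).limit_denominator(10000)
    return ((degrees, 1), (minutes, 1),
            (seconds.numerator, seconds.denominator)), ref

dms, ref = to_exif_dms(22.5431, is_latitude=True)  # roughly 22 deg 32' 35.16" N
```

A real implementation would then write these rationals into the GPS IFD of the image file (e.g. via an Exif library); this sketch covers only the numeric conversion.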
  • FIG. 4 is a schematic block diagram of an apparatus for associating image and location information according to an embodiment of the present invention.
  • the apparatus for associating image and location information may include one or more processors 401, one or more communication interfaces 402, and one or more memories 403.
  • the one or more processors 401 can operate individually or in concert, and the one or more memories 403 can operate individually or in concert.
  • the processor 401, the communication interface 402, and the memory 403 may be connected, for example, via a bus 404, although the connection is not limited thereto.
  • the processor 401 is configured to detect a shooting trigger event
  • the communication interface 402 is configured to send a shooting instruction to the shooting device when the processor 401 detects a shooting trigger event, where the shooting instruction is used to instruct the shooting device to perform shooting to obtain a captured image;
  • the processor 401 is further configured to calculate a target exposure time at which the photographing device exposes the captured image; determine target position information at the target exposure time; and perform the captured image and the target position information Association.
  • when the processor 401 performs the calculating of a target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire an exposure preparation time of the photographing device and calculate, according to the exposure preparation time, the target exposure time at which the photographing device exposes the captured image.
  • alternatively, when the processor 401 performs the calculating of a target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire a specified exposure time for exposing the captured image by the photographing device and calculate, according to the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • alternatively, when the processor 401 performs the calculating of the target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire the exposure preparation time of the photographing device and the specified exposure time at which the photographing device exposes the captured image, and calculate the target exposure time according to the exposure preparation time and the specified exposure time.
  • the exposure preparation time is a time between a time when the photographing device receives the photographing instruction and a moment when the photographing device starts exposing.
  • the specified exposure time is a time between a time when the camera starts to expose and a specified time during exposure.
  • when the processor 401 performs the acquiring of a specified exposure time for exposing the captured image, it is specifically configured to calculate the unit exposure time required for the photographing device to expose a single pixel; calculate the number of target pixels between the initial exposure position and the specified exposure position in the captured image; and calculate, according to the unit exposure time and the number of target pixels, the specified exposure time at which the photographing device exposes the captured image.
  • when the processor 401 performs the calculating of the unit exposure time required for the photographing device to expose a single pixel, it is specifically configured to acquire the line exposure time required for the photographing device to expose a row of pixels and the number of pixels included in a row of the captured image, and calculate, according to the line exposure time and the number of pixels, the unit exposure time required for the photographing device to expose a single pixel.
  • alternatively, when the processor 401 performs the acquiring of the specified exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire the total exposure time required for the photographing device to expose the captured image and calculate, according to the total exposure time, the specified exposure time at which the photographing device exposes the captured image.
  • the communication interface 402 is further configured to collect a first carrier phase observation value
  • when the processor 401 performs the determining of the target position information at the target exposure time, it is specifically configured to determine, according to the first carrier phase observation value, the target position information at the target exposure time.
  • when the processor 401 determines, according to the first carrier phase observation value, the target position information at the target exposure time, it is specifically configured to determine, according to the first carrier phase observation value, the position information at the observation value acquisition time, and determine, according to the position information at the observation value acquisition time, the target position information at the target exposure time.
  • the communication interface 402 is further configured to receive a second carrier phase observation value and reference location information that are sent by the reference base station;
  • when the processor 401 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to calculate the position information at the observation value acquisition time using the real-time kinematic carrier phase difference technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • the communication interface 402 is further configured to send a location acquisition instruction to the reference base station, the location acquisition instruction being used to instruct the reference base station to calculate the position information at the observation value acquisition time using the real-time kinematic carrier phase difference technique, where the location acquisition instruction carries the first carrier phase observation value; and to receive the position information at the observation value acquisition time fed back by the reference base station. In this case, when the processor 401 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to acquire the position information at the observation value acquisition time received by the communication interface.
  • the memory 403 is configured to store the first carrier phase observation value
  • alternatively, when the processor 401 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to acquire the second carrier phase observation value collected by the reference base station and the reference position information of the reference base station, and calculate the position information at the observation value acquisition time using the post processed kinematic technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • when the processor 401 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to determine whether a target observation value acquisition time exists among the recorded observation value acquisition times, the target observation value acquisition time being the same moment as the target exposure time; and if it exists, to determine the position information at the target observation value acquisition time as the target position information at the target exposure time.
  • alternatively, when the processor 401 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to determine a specified observation value acquisition time from the recorded observation value acquisition times, the time interval between the specified observation value acquisition time and the target exposure time being less than a preset interval; and to estimate, according to the position information at the specified observation value acquisition time, the target position information at the target exposure time.
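When no observation was collected at exactly the target exposure time, the position must be estimated from nearby observations. The patent does not specify the estimation method, so the following sketch (names and values are ours) makes an assumption: it returns an exact match when one exists, and otherwise linearly interpolates between the two observations that bracket the target exposure time, subject to a preset maximum interval:

```python
from bisect import bisect_left

def estimate_position(obs_times, obs_positions, target_time, max_gap_s=1.0):
    """obs_times: sorted observation-value acquisition times (seconds);
    obs_positions: matching (lat, lon, alt) tuples.
    Returns the position at an exactly matching acquisition time, else
    linearly interpolates between the bracketing observations, provided
    both lie within max_gap_s of the target exposure time."""
    i = bisect_left(obs_times, target_time)
    if i < len(obs_times) and obs_times[i] == target_time:
        return obs_positions[i]                  # exact-match case
    if i == 0 or i == len(obs_times):
        raise ValueError("target time outside observation window")
    t0, t1 = obs_times[i - 1], obs_times[i]
    if target_time - t0 > max_gap_s or t1 - target_time > max_gap_s:
        raise ValueError("no observation close enough to the target time")
    w = (target_time - t0) / (t1 - t0)           # interpolation weight
    return tuple(p0 + w * (p1 - p0)
                 for p0, p1 in zip(obs_positions[i - 1], obs_positions[i]))

pos = estimate_position([0.0, 0.2],
                        [(22.0, 113.0, 100.0), (22.0002, 113.0002, 101.0)],
                        0.1)
```

Linear interpolation is only one reasonable choice here; a real system might instead use the platform's velocity estimate or a filter-based prediction.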
  • the processor 401 is further configured to generate, before performing the associating of the captured image with the target position information, a captured image identifier corresponding to the captured image, and associate the captured image identifier with the target exposure time.
  • when the processor 401 performs the associating of the captured image with the target position information, it is specifically configured to query the captured image identifier corresponding to the target exposure time and store the target position information in the target metadata, the target metadata being used to record attribute information of the captured image corresponding to the captured image identifier.
  • the memory 403 is further configured to store the target metadata.
  • the processor 401 may be a central processing unit (CPU), and the processor 401 may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), Application Specific Integrated Circuit (ASIC), Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware component, and the like.
  • the general purpose processor may be a microprocessor or the processor 401 or any conventional processor or the like.
  • the memory 403 may include a read-only memory (ROM) and a random access memory (RAM), and provides a computer program and data to the processor 401.
  • a portion of the memory 403 may also include a non-volatile random access memory.
  • for specific implementations of the processor 401, the communication interface 402, and the memory 403 described in the embodiments of the present invention in performing the method for associating image and location information shown in FIG. 1 or FIG. 2 of the present application, reference may be made to the description of the relevant parts of the method embodiments of the present invention, and details are not described herein again.
  • the image and location information associating apparatus can calculate the target exposure time at which the photographing device exposes a captured image and associate the captured image with the target position information at that target exposure time, so that the captured image can be associated with its accurate position information while the platform flies and shoots, which improves the accuracy and efficiency of surveying and mapping.
  • FIG. 5 is a schematic structural diagram of a mobile platform according to an embodiment of the present invention.
  • the mobile platform can include one or more processors 501, one or more communication interfaces 502, one or more camera devices 503, and one or more memories 504.
  • the processor 501, the communication interface 502, the photographing device 503, and the memory 504 may be connected, for example, via a bus 505, although the connection is not limited thereto.
  • the movable platform may further include a fuselage not shown in FIG. 5, a power system disposed on the fuselage, a flight control system, a navigation system, a positioning system, and the like.
  • the power system is used to provide flight power.
  • the movable platform may be an aircraft such as a drone, a car, a boat, or the like, and the photographing device 503 may be, for example, a camera, a camera, or the like.
  • the processor 501 is configured to detect a shooting trigger event
  • the communication interface 502 is configured to send a shooting instruction to the shooting device 503 when the processor 501 detects a shooting triggering event, where the shooting instruction is used to instruct the shooting device 503 to take a picture to obtain a captured image;
  • the photographing device 503 is configured to receive the photographing instruction, and perform photographing according to the photographing instruction to obtain a photographed image;
  • the processor 501 is further configured to calculate a target exposure time at which the photographing device exposes the captured image; determine target position information at the target exposure time; and perform the captured image and the target position information Association.
  • when the processor 501 performs the calculating of a target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire an exposure preparation time of the photographing device and calculate, according to the exposure preparation time, the target exposure time at which the photographing device exposes the captured image.
  • alternatively, when the processor 501 performs the calculating of a target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire a specified exposure time at which the photographing device exposes the captured image, and calculate, according to the specified exposure time, the target exposure time at which the photographing device exposes the captured image.
  • alternatively, when the processor 501 performs the calculating of the target exposure time at which the photographing device exposes the captured image, it is specifically configured to acquire the exposure preparation time of the photographing device and the specified exposure time at which the photographing device exposes the captured image, and calculate the target exposure time according to the exposure preparation time and the specified exposure time.
  • the exposure preparation time is a time between a time when the photographing device receives the photographing instruction and a moment when the photographing device starts exposing.
  • the specified exposure time is a time between a time when the camera starts to expose and a specified time during exposure.
  • when the processor 501 performs the acquiring of a specified exposure time for exposing the captured image, it is specifically configured to calculate the unit exposure time required for the photographing device to expose a single pixel; calculate the number of target pixels between the initial exposure position and the specified exposure position in the captured image; and calculate, according to the unit exposure time and the number of target pixels, the specified exposure time at which the photographing device exposes the captured image.
  • when the processor 501 performs the calculating of the unit exposure time required for the photographing device to expose a single pixel, it is specifically configured to acquire the line exposure time required for the photographing device to expose a row of pixels and the number of pixels included in a row of the captured image, and calculate, according to the line exposure time and the number of pixels, the unit exposure time required for the photographing device to expose a single pixel.
  • alternatively, when the processor 501 performs the acquiring of a specified exposure time for exposing the captured image, it is specifically configured to acquire the total exposure time required for the photographing device to expose the captured image and calculate, according to the total exposure time, the specified exposure time at which the photographing device exposes the captured image.
  • the communication interface 502 is further configured to collect a first carrier phase observation value
  • when the processor 501 performs the determining of the target position information at the target exposure time, it is specifically configured to determine, according to the first carrier phase observation value, the target position information at the target exposure time.
  • when the processor 501 determines, according to the first carrier phase observation value, the target position information at the target exposure time, it is specifically configured to determine, according to the first carrier phase observation value, the position information at the observation value acquisition time, and determine, according to the position information at the observation value acquisition time, the target position information at the target exposure time.
  • the communication interface 502 is further configured to receive a second carrier phase observation value and reference location information sent by the reference base station;
  • when the processor 501 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to calculate the position information at the observation value acquisition time using the real-time kinematic carrier phase difference technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • the communication interface 502 is further configured to send a location acquisition instruction to the reference base station, the location acquisition instruction being used to instruct the reference base station to calculate the position information at the observation value acquisition time using the real-time kinematic carrier phase difference technique, where the location acquisition instruction carries the first carrier phase observation value; and to receive the position information at the observation value acquisition time fed back by the reference base station. In this case, when the processor 501 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to acquire the position information at the observation value acquisition time received by the communication interface.
  • the memory 504 is configured to store the first carrier phase observation value. Optionally, when the processor 501 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to acquire the second carrier phase observation value collected by the reference base station and the reference position information of the reference base station, and calculate the position information at the observation value acquisition time using the post processed kinematic technique according to the first carrier phase observation value, the second carrier phase observation value, and the reference position information.
  • when the processor 501 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to determine whether a target observation value acquisition time exists among the recorded observation value acquisition times, the target observation value acquisition time being the same moment as the target exposure time; and if it exists, to determine the position information at the target observation value acquisition time as the target position information at the target exposure time.
  • alternatively, when the processor 501 determines, according to the first carrier phase observation value, the position information at the observation value acquisition time, it is specifically configured to determine a specified observation value acquisition time from the recorded observation value acquisition times, the time interval between the specified observation value acquisition time and the target exposure time being less than a preset interval; and to estimate, according to the position information at the specified observation value acquisition time, the target position information at the target exposure time.
  • the processor 501 is further configured to generate, before performing the associating of the captured image with the target position information, a captured image identifier corresponding to the captured image, and associate the captured image identifier with the target exposure time.
  • when the processor 501 invokes the program instructions to perform the associating of the captured image with the target position information, it is specifically configured to query the captured image identifier corresponding to the target exposure time and store the target position information in the target metadata, the target metadata being used to record attribute information of the captured image corresponding to the captured image identifier.
  • the memory 504 is further configured to store the target metadata.
  • the processor 501 in the embodiment of the present invention may be the processor described in the foregoing embodiments, and the memory 504 may be the memory described in the foregoing embodiments.
  • the movable platform can record the target exposure time at which the photographing device exposes a captured image and associate the captured image with the target position information at that target exposure time, so that the captured image can be associated with its accurate position information while the platform flies and shoots, which improves the accuracy and efficiency of surveying and mapping.
  • an embodiment of the present invention further provides a computer readable storage medium storing a computer program, the computer program including program instructions that, when invoked by the processor 501, cause the processor 501 to perform the method for associating image and location information shown in FIG. 1 or FIG. 2 of the present application.
  • the computer readable storage medium may be an internal storage unit of the mobile platform described herein, such as a hard disk or memory of the removable platform.
  • the computer readable storage medium may also be an external storage device of the movable platform, such as a plug-in hard disk equipped on the movable platform, a smart memory card (SMC), an SD card, a flash memory card (Flash Card), etc.
  • the computer readable storage medium may also include both an internal storage unit of the removable platform and an external storage device.
  • the computer readable storage medium is for storing the computer program and other programs and data required by the mobile platform.
  • the computer readable storage medium can also be used to temporarily store data that has been output or is about to be output.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present invention provide a method and an apparatus for associating image and location information, and a movable platform. The method includes: when a shooting trigger event is detected, sending a shooting instruction to a photographing device, the shooting instruction being used to instruct the photographing device to shoot to obtain a captured image; calculating a target exposure time at which the photographing device exposes the captured image; determining target position information at the target exposure time; and associating the captured image with the target position information. Embodiments of the present invention can improve the accuracy and efficiency of surveying and mapping.

Description

Method and apparatus for associating image and location information, and movable platform

Technical Field

The present invention relates to the field of surveying and mapping technology, and in particular, to a method and apparatus for associating image and location information, and a movable platform.

Background

Surveying and mapping with movable platforms such as unmanned aerial vehicles and automobiles is convenient, fast, and inexpensive, and has therefore been widely applied.

When a movable platform flies at high speed while performing a surveying and mapping task, the shooting mode generally adopted in order to obtain accurate corresponding position information is hover shooting. With hover shooting, although accurate GPS (Global Positioning System) position information of the scene captured by the movable platform can be obtained, the movable platform must hover each time it captures an image, which makes surveying and mapping inefficient.

Summary of the Invention

Embodiments of the present invention provide a method and apparatus for associating image and location information, and a movable platform, which can improve the accuracy and efficiency of surveying and mapping.

In a first aspect, an embodiment of the present invention provides a method for associating image and location information, including:

when a shooting trigger event is detected, sending a shooting instruction to a photographing device, the shooting instruction being used to instruct the photographing device to shoot to obtain a captured image;

calculating a target exposure time at which the photographing device exposes the captured image;

determining target position information at the target exposure time; and

associating the captured image with the target position information.

In a second aspect, an embodiment of the present invention provides an apparatus for associating image and location information, including:

a processor configured to detect a shooting trigger event;

a communication interface configured to send, when the processor detects a shooting trigger event, a shooting instruction to a photographing device, the shooting instruction being used to instruct the photographing device to shoot to obtain a captured image;

the processor being further configured to calculate a target exposure time at which the photographing device exposes the captured image, determine target position information at the target exposure time, and associate the captured image with the target position information.

In a third aspect, an embodiment of the present invention provides a movable platform, including:

a fuselage;

a power system disposed on the fuselage and configured to provide flight power;

a photographing device configured to receive a shooting instruction and to shoot according to the shooting instruction to obtain a captured image; and

the apparatus for associating image and location information according to the second aspect above.

In a fourth aspect, an embodiment of the present invention provides a computer readable storage medium storing a computer program, the computer program including program instructions that, when executed by a processor, cause the processor to perform the method for associating image and location information according to the first aspect above.

By calculating the target exposure time at which the photographing device exposes a captured image and associating the captured image with the target position information at that target exposure time, embodiments of the present invention can associate a captured image with accurate position information while shooting in flight, improving the accuracy and efficiency of surveying and mapping.
Brief Description of the Drawings

To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required in the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.

FIG. 1 is a schematic flowchart of a method for associating image and location information according to an embodiment of the present invention;

FIG. 2 is a schematic flowchart of another method for associating image and location information according to an embodiment of the present invention;

FIG. 3 is a schematic diagram of region division of a captured image according to an embodiment of the present invention;

FIG. 4 is a schematic structural diagram of an apparatus for associating image and location information according to an embodiment of the present invention;

FIG. 5 is a schematic structural diagram of a movable platform according to an embodiment of the present invention.
Detailed Description

The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some rather than all of the embodiments of the present invention. Where no conflict arises, the features in the following embodiments or implementations may be combined arbitrarily. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

With hover shooting, a movable platform such as an unmanned aerial vehicle can obtain accurate GPS position information of the captured scene, but it must hover each time it captures an image, which makes surveying and mapping inefficient.

To solve the above problem, an embodiment of the present invention provides a method for associating image and location information. The method may be performed by a movable platform.

The movable platform is associated with a photographing device, and the photographing device performs a shooting operation to obtain a captured image after receiving a shooting instruction sent by the movable platform.

The photographing device may shoot at fixed points; in this case, the movable platform sends a shooting instruction to the photographing device after flying into a specified GPS range.

The photographing device may also shoot at fixed distances; in this case, the movable platform sends a shooting instruction to the photographing device each time it has flown a certain distance.

The photographing device may also shoot at fixed time intervals; in this case, the movable platform sends a shooting instruction to the photographing device at regular intervals.
当所述可移动平台发送拍摄指令给所述拍摄装置时,所述可移动平台获取所述拍摄图像的中心曝光位置的曝光时刻(以下称为中心曝光时刻)。
其中，当所述拍摄装置中的图像传感器采用卷帘快门(Rolling Shutter)的方式进行曝光(即逐行曝光)时，所述中心曝光时刻的计算公式可以具体为：中心曝光时刻=所述拍摄图像起始曝光位置的曝光时刻(以下称为起始曝光时刻)+所述拍摄装置对一个像素曝光所需的单位曝光时间*从所述拍摄图像的起始曝光位置到中心曝光位置的目标像素数目。
其中,所述起始曝光时刻的计算公式可以具体为:起始曝光时刻=目标中断发生时刻+曝光准备时间。
需要说明的是,所述图像传感器在开始曝光前会发出一个目标中断,所述可移动平台可以记录所述目标中断发生的时刻。其中,所述目标中断可以为 VIN(输入信号)的VSYNC(场同步信号)中断。所述曝光准备时间为所述图像传感器在所述目标中断发生后到开始曝光前所经历的时间。
还需要说明的是,不同的图像传感器其曝光准备时间不一样。
其中,所述单位曝光时间的计算公式可以具体为:单位曝光时间=所述拍摄装置对一行像素曝光所需的行曝光时间/所述拍摄图像中一行像素包括的行像素数目。
还需要说明的是,从所述拍摄图像的起始曝光位置到中心曝光位置的目标像素数目可以根据所述拍摄图像的行像素数目和所述拍摄图像中一列像素包括的列像素数目确定。具体地,所述目标像素数目的计算公式可以具体为:目标像素数目=行像素数目*列像素数目/2+行像素数目/2。
从而，所述可移动平台获取所述拍摄图像的中心曝光时刻时可以具体包括：获取所述拍摄图像的起始曝光时刻；计算所述拍摄装置对一个像素曝光所需的单位曝光时间；计算从所述拍摄图像的起始曝光位置到中心曝光位置的目标像素数目；将所述单位曝光时间和所述目标像素数目的乘积与所述起始曝光时刻之和作为所述拍摄图像的中心曝光时刻。
具体地，所述可移动平台获取所述拍摄图像的起始曝光时刻时可以具体包括：记录目标中断发生时刻；获取曝光准备时间；将所述目标中断发生时刻与所述曝光准备时间之和作为所述拍摄图像的起始曝光时刻。
具体地,所述可移动平台计算所述拍摄装置对一个像素曝光所需的单位曝光时间时可以具体包括:获取所述拍摄装置对一行像素曝光所需的行曝光时间;获取所述拍摄图像中一行像素包括的行像素数目;将所述行曝光时间与所述行像素数目的商作为所述拍摄装置对一个像素曝光所需的单位曝光时间。
具体地,所述可移动平台计算从所述拍摄图像的起始曝光位置到中心曝光位置的目标像素数目时可以具体包括:获取所述拍摄图像中一行像素包括的行像素数目;获取所述拍摄图像中一列像素包括的列像素数目;将所述列像素数目的一半和所述行像素数目的乘积与所述行像素数目的一半之和作为从所述拍摄图像的起始曝光位置到中心曝光位置的目标像素数目。
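上述中心曝光时刻的计算流程可以用如下示意性的 Python 代码表示（函数名与时间单位（微秒）均为本文示例的假设，并非实际固件实现）：

```python
def center_exposure_time(interrupt_time_us, prep_time_us,
                         row_exposure_us, row_pixels, col_pixels):
    """按正文公式计算卷帘快门方式下的中心曝光时刻（单位：微秒）。"""
    # 起始曝光时刻 = 目标中断发生时刻 + 曝光准备时间
    start_time_us = interrupt_time_us + prep_time_us
    # 单位曝光时间 = 行曝光时间 / 行像素数目
    unit_time_us = row_exposure_us / row_pixels
    # 目标像素数目 = 行像素数目 * 列像素数目 / 2 + 行像素数目 / 2
    target_pixels = row_pixels * col_pixels / 2 + row_pixels / 2
    # 中心曝光时刻 = 起始曝光时刻 + 单位曝光时间 * 目标像素数目
    return start_time_us + unit_time_us * target_pixels
```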
作为一种可选的实施方式,所述可移动平台中包括GPS-RTK(Real-Time Kinematic,实时动态)模块。在所述可移动平台获取所述中心曝光时刻之后,所述可移动平台将所述中心曝光时刻发送给所述GPS-RTK模块,所述 GPS-RTK模块通过插值运算,计算出在所述中心曝光时刻下的精确GPS位置信息,并将所述GPS位置信息反馈给所述可移动平台,所述可移动平台将所述GPS位置信息与所述拍摄图像关联。
作为另一种可选的实施方式,所述可移动平台存储所述中心曝光时刻,并结合所述可移动平台从开机就一直保存的PPK(Post Processed Kinematic,动态后处理)原始(RAW)数据一起进行测后联合处理,从而计算出所述可移动平台的GPS位置信息,并将所述GPS位置信息与所述拍摄图像关联。其中,所述中心曝光时刻可以存储在所述拍摄装置的安全数字(Secure Digital,SD)卡的事件日志记录(Event log)中。
具体地,所述可移动平台将所述GPS位置信息与所述拍摄图像关联的方式可以具体为将所述GPS位置信息存储到所述拍摄图像的Exif数据和XMP数据中,便于后续生成地图使用。
本发明实施例通过计算拍摄装置对拍摄图像曝光的目标曝光时刻,并将拍摄图像与在该目标曝光时刻下的目标位置信息进行关联,可以在一边飞行一边拍摄的方式下将拍摄图像与自身准确的位置信息进行关联,提高了测绘准确率和效率。下面结合图1至图5,对本发明实施例的图像和位置信息的关联方法、装置及可移动平台进行详细的描述。
请参见图1,是本发明实施例提供的一种图像和位置信息的关联方法的流程示意图。如图1所示,所述图像和位置信息的关联方法可以包括:
S101:当检测到拍摄触发事件时,发送拍摄指令给拍摄装置,所述拍摄指令用于指示所述拍摄装置进行拍摄以得到拍摄图像。
其中,所述拍摄装置例如可以是相机、摄像头等。
需要说明的是,本发明实施例提供的图像和位置信息的关联方法的各个步骤可以由可移动平台执行并且应用于测绘领域中。其中,所述可移动平台可以是诸如无人机等飞行器,也可以是汽车、船等等。
在本发明实施例中，所述拍摄触发事件可以包括：可移动平台飞行到指定的GPS范围内、可移动平台飞行了指定的距离或可移动平台飞行了指定的时间、可移动平台接收到了指定的遥控器指令或者语音指令、可移动平台获取到了指定的自身姿态数据、可移动平台检测到了指定的姿态动作等等，本发明实施例对此不作任何限制。
S102:计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
在一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置的曝光准备时间;根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
其中,所述曝光准备时间指的是从所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。作为一种可选的实施方式,所述拍摄装置的曝光准备时间可以预先存储在所述可移动平台中。
作为一种可选的实施方式，所述可移动平台可以将所述拍摄指令的发送时刻作为所述拍摄装置接收到所述拍摄指令的时刻，即所述曝光准备时间为从所述拍摄指令的发送时刻到所述拍摄装置开始曝光的时刻之间的时间。
具体地,所述可移动平台执行所述根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻可以具体包括:记录所述拍摄指令的发送时刻,并将所述拍摄指令的发送时刻与所述曝光准备时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
在另一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻可以具体包括:将所述拍摄指令的发送时刻与所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
其中,所述指定曝光时间可以是所述拍摄装置开始曝光的时刻到所述拍摄装置曝光到指定曝光位置的时刻之间的时间,所述可移动平台执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间可以具体包括:计算所述拍摄装置对单个像素曝光所需的单位曝光时间;计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目;根据所述单位曝光时间和所述目标像素数目,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
其中,所述起始曝光位置为所述拍摄图像曝光的第一个像素所在的位置,所述指定曝光位置为所述拍摄图像中指定像素所在的位置。
在另一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置的曝光准备时间;获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻可以具体包括:将所述拍摄指令的发送时刻与所述曝光准备时间以及所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
需要说明的是,所述可移动平台执行获取所述曝光准备时间和所述指定曝光时间的具体技术细节可以参考前述相关部分的描述,在此不再赘述。
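上述几种实施例中目标曝光时刻的计算方式可以统一表示为一个简单的加法，以下为示意性写法（参数名为假设，时刻与时间的单位需保持一致）：

```python
def target_exposure_time(base_time_us, prep_time_us=0.0, spec_exposure_us=0.0):
    """目标曝光时刻 = 基准时刻 + 曝光准备时间 + 指定曝光时间。

    base_time_us 可以是拍摄指令的发送时刻或目标中断的发生时刻；
    仅传入 prep_time_us 或 spec_exposure_us 之一，即对应正文中
    单独使用曝光准备时间或指定曝光时间的实施例。
    """
    return base_time_us + prep_time_us + spec_exposure_us
```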
S103:确定在所述目标曝光时刻下的目标位置信息。
在本发明实施例中,所述可移动平台执行所述确定在所述目标曝光时刻下的目标位置信息可以具体包括:采集第一载波相位观测值;根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息。
其中,所述第一载波相位观测值为所述可移动平台接收到的卫星定位数据。
具体地,所述可移动平台执行所述确定在所述目标曝光时刻下的目标位置信息可以具体包括:根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息;根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息。
其中,所述观测值采集时刻指的是所述第一载波相位观测值的采集时刻。
在一个具体的实施例中,所述可移动平台执行所述根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息可以具体包括:判断记录的观测值采集时刻中是否存在目标观测值采集时刻,所述目标观测值采集时刻与所述目标曝光时刻处于同一时刻;若存在,则将在所述目标观测值采集时刻下的位置信息确定为在所述目标曝光时刻下的目标位置信息。
在这种情形下，所述目标曝光时刻为所述可移动平台的观测值采集时刻。因此，所述可移动平台可以根据所述第一载波相位观测值直接计算出在所述目标曝光时刻下的目标位置信息。
在另一个具体的实施例中,所述可移动平台执行所述根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息可以具体包括:从记录的观测值采集时刻中确定指定观测值采集时刻,所述指定观测值采集时刻与所述目标曝光时刻之间的时间间隔小于预设间隔;根据在所述指定观测值采集时刻下的位置信息,估算在所述目标曝光时刻下的目标位置信息。
具体地，所述可移动平台可以通过插值运算来估算在所述目标曝光时刻下的目标位置信息。
在本发明实施例中,所述可移动平台周期性地采集所述第一载波相位观测值。在这种情形下,所述预设间隔例如可以是所述可移动平台采集所述第一载波相位观测值的n(n为正整数)倍周期。
举例来说,当所述预设间隔为所述可移动平台采集所述第一载波相位观测值的周期时,所述可移动平台可以根据在小于所述目标曝光时刻的所有观测值采集时刻中最大的观测值采集时刻,以及大于所述目标曝光时刻的所有观测值采集时刻中最小的观测值采集时刻下的位置信息,估算在所述目标曝光时刻下的目标位置信息。换句话说,所述可移动平台可以根据在与所述目标曝光时刻最接近的两个观测值采集时刻下的位置信息,估算在所述目标曝光时刻下的目标位置信息。
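正文所述"根据与目标曝光时刻最接近的两个观测值采集时刻下的位置信息进行估算"可以用线性插值示意。实际插值算法未在文中给出，以下仅为一种假设性的线性插值写法：

```python
def interpolate_position(t_target, t0, pos0, t1, pos1):
    """在 t0、t1 两个观测值采集时刻的位置信息之间线性插值。

    pos0、pos1 为 (纬度, 经度, 高度) 三元组，要求 t0 <= t_target <= t1。
    """
    assert t0 < t1 and t0 <= t_target <= t1
    ratio = (t_target - t0) / (t1 - t0)
    return tuple(a + (b - a) * ratio for a, b in zip(pos0, pos1))
```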
S104:将所述拍摄图像和所述目标位置信息进行关联。
在一个具体的实施例中,所述可移动平台执行所述将所述拍摄图像和所述目标位置信息进行关联可以具体包括:将所述目标位置信息存储到目标元数据中,所述目标元数据用于记录所述拍摄图像的属性信息。
其中,所述目标元数据为所述拍摄图像的Exif数据或XMP数据。所述可移动平台将在所述目标曝光时刻下的目标位置信息存储到所述拍摄图像的目标元数据中,便于后续生成地图使用。
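Exif 规范中 GPS 经纬度以"度、分、秒"三个有理数形式存储，将十进制度的目标位置信息写入 Exif 前通常需要如下换算（实际写入可借助 piexif 等库完成，以下换算函数仅为示意）：

```python
def to_exif_gps_rational(decimal_degrees, precision=10000):
    """将十进制度转换为 Exif GPS 字段使用的 (度, 分, 秒) 有理数表示。

    每个分量表示为 (分子, 分母)；秒的小数部分用 precision 保留精度。
    """
    value = abs(decimal_degrees)
    degrees = int(value)
    minutes = int((value - degrees) * 60)
    seconds = round((value - degrees - minutes / 60) * 3600 * precision)
    return (degrees, 1), (minutes, 1), (seconds, precision)
```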
在本发明实施例中,通过计算拍摄装置对拍摄图像曝光的目标曝光时刻,并将拍摄图像与在该目标曝光时刻下的目标位置信息进行关联,可以在一边飞行一边拍摄的方式下将拍摄图像与自身准确的位置信息进行关联,提高了测绘准确率和效率。
进一步地,请参见图2,是本发明实施例提供的另一种图像和位置信息的关联方法的流程示意图。在图1所示的实施例的基础上,如图2所示,所述图像和位置信息的关联方法可以包括:
S201:当检测到拍摄触发事件时,发送拍摄指令给拍摄装置,所述拍摄指令用于指示所述拍摄装置进行拍摄以得到拍摄图像。
其中,所述拍摄装置例如可以是相机、摄像头等。
需要说明的是,本发明实施例提供的图像和位置信息的关联方法的各个步骤可以由可移动平台执行。其中,所述可移动平台可以是诸如无人机等飞行器,也可以是汽车、船等等。
在本发明实施例中，所述拍摄触发事件可以包括：可移动平台飞行到指定的GPS范围内、可移动平台飞行了指定的距离或可移动平台飞行了指定的时间、可移动平台接收到了指定的遥控器指令或者语音指令、可移动平台获取到了指定的自身姿态数据、可移动平台检测到了指定的姿态动作等等，本发明实施例对此不作任何限制。
S202:生成所述拍摄图像对应的拍摄图像标识。
其中,所述拍摄图像标识(即所述拍摄图像的ID)用于区分所述拍摄装置在不同的目标曝光时刻拍摄得到的拍摄图像,即所述拍摄图像与所述拍摄图像标识之间具有一一对应的关系。
S203:计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
在一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置的曝光准备时间;根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
其中,所述曝光准备时间指的是从所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。需要说明的是,所述曝光准备时间是由所述拍摄装置中的图像传感器决定的。也就是说,具有不同的图像传感器的拍摄装置其曝光准备时间不一样。作为一种可选的实施方式,所述拍摄装置的曝光准备时间可以预先存储在所述可移动平台中。
作为一种可选的实施方式，所述可移动平台可以将所述拍摄指令的发送时刻作为所述拍摄装置接收到所述拍摄指令的时刻，即所述曝光准备时间为从所述拍摄指令的发送时刻到所述拍摄装置开始曝光的时刻之间的时间。
具体地,所述可移动平台执行所述根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻可以具体包括:记录所述拍摄指令的发送时刻,并将所述拍摄指令的发送时刻与所述曝光准备时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
在一些实施例中，当所述拍摄装置接收到所述拍摄指令时，所述拍摄装置会发出一个目标中断。其中，所述目标中断例如可以是VIN的VSYNC中断。作为一种可选的实施方式，所述可移动平台可以将所述目标中断的发生时刻作为所述拍摄装置接收到所述拍摄指令的时刻，即所述曝光准备时间为从所述目标中断的发生时刻到所述拍摄装置开始曝光的时刻之间的时间。在这种情形下，所述拍摄装置可以记录所述目标中断的发生时刻，并将所述目标中断的发生时刻发送给所述可移动平台。所述可移动平台执行所述根据所述曝光准备时间，计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻可以具体包括：接收所述拍摄装置发送的所述目标中断的发生时刻，将所述目标中断的发生时刻与所述曝光准备时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
在另一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻可以具体包括:将所述拍摄指令的发送时刻与所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻还可以具体包括:将所述目标中断的发生时刻与所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
其中,所述指定曝光时间为所述拍摄装置开始曝光的时刻到曝光过程中的指定时刻之间的时间。
作为一种可选的实施方式,所述曝光过程中的指定时刻可以是所述拍摄装置曝光到指定曝光位置的时刻。
当所述指定曝光时间为所述拍摄装置开始曝光的时刻到所述拍摄装置曝光到指定曝光位置的时刻之间的时间时,所述可移动平台执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间可以具体包括:计算所述拍摄装置对单个像素曝光所需的单位曝光时间;计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目;根据所述单位曝光时间和所述目标像素数目,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。需要说明的是,当所述可移动平台采用该方式计算所述指定曝光时间时,所述拍摄装置可以使用卷帘快门的方式对所述拍摄图像进行曝光,即所述拍摄装置对所述拍摄图像中的像素进行逐行曝光。
其中,所述起始曝光位置为所述拍摄图像曝光的第一个像素所在的位置,所述指定曝光位置为所述拍摄图像中指定像素所在的位置。从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目可以由所述拍摄图像中一行像素包括的行像素数目以及所述指定像素在所述拍摄图像中的位置(即所述指定像素为第几行第几列的像素)确定。其中,所述行像素数目是根据所述拍摄图像的分辨率确定的。
需要说明的是,当所述拍摄图像包括保留区域和围绕其保留区域的裁剪区域时,所述行像素数目和所述列像素数目分别指的是整张所述拍摄图像的行像素数目和列像素数目,而不仅仅是保留区域的行像素数目和列像素数目。
请参见图3，图3为本发明实施例提供的一种拍摄图像的区域划分示意图。其中，图3所示的拍摄图像30的分辨率为4000*3000，其一行像素包括的行像素数目为4860，其一列像素包括的列像素数目为3648。如图3所示，所述拍摄图像30包括保留区域301和围绕所述保留区域301的裁剪(crop)区域302，所述裁剪区域302中(或所述拍摄图像30)的第一个像素所在的位置为起始曝光位置303，所述保留区域301(或所述拍摄图像30)中的中心像素所在的位置为中心曝光位置304。需要说明的是，所述中心像素指的是所述拍摄图像30中第1824行第2430列的像素。在这种情形下，所述目标像素数目为(4860*1824+2430)个。
需要说明的是，当所述拍摄装置使用卷帘快门的方式对所述拍摄图像进行曝光时，所述拍摄装置对一行像素曝光所需的行曝光时间是由所述拍摄装置中的图像传感器决定的。作为一种可选的实施方式，所述拍摄装置对一行像素曝光所需的行曝光时间可以预先存储在所述可移动平台中。
具体地,所述可移动平台执行所述计算所述拍摄装置对单个像素曝光所需的单位曝光时间可以具体包括:获取所述拍摄装置对一行像素曝光所需的行曝光时间以及所述拍摄图像中一行像素包括的行像素数目;根据所述行曝光时间以及所述行像素数目,计算所述拍摄装置对单个像素曝光所需的单位曝光时间。
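以图3的示例为基础，目标像素数目与指定曝光时间的计算可以示意如下（像素计数方式沿用正文"行像素数目*行号+列号"的约定，数值与时间单位均为示例假设）：

```python
ROW_PIXELS = 4860  # 图3示例中一行像素包括的行像素数目

def target_pixel_count(row_index, col_index, row_pixels=ROW_PIXELS):
    """从起始曝光位置到指定曝光位置的目标像素数目（沿用正文计数约定）。"""
    return row_pixels * row_index + col_index

def specified_exposure_time_us(row_exposure_us, row_pixels, n_pixels):
    """指定曝光时间 = 单位曝光时间 * 目标像素数目（单位：微秒）。"""
    return row_exposure_us / row_pixels * n_pixels
```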
作为另一种可选的实施方式,所述曝光过程中的指定时刻可以是曝光过程进行到指定程度(如曝光过程进行到一半)的时刻。在这种情形下,所述指定曝光时间可以是所述拍摄装置对所述拍摄图像曝光所需的总曝光时间的指定比例(如所述总曝光时间的一半)。
当所述指定曝光时间为所述拍摄装置对所述拍摄图像曝光所需的总曝光时间的指定比例时,所述可移动平台执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间还可以具体包括:获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间;根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。需要说明的是,当所述可移动平台采用该方式计算所述指定曝光时间时,所述拍摄装置可以使用卷帘快门的方式对所述拍摄图像进行曝光,也可以使用全局快门(Global Shutter)的方式对所述拍摄图像进行曝光,即所述拍摄装置对所述拍摄图像中的各个像素同时进行曝光。
具体地,所述可移动平台执行所述根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间可以具体包括:将所述总曝光时间的指定比例作为指定曝光时间。其中,所述指定比例可以预先存储在所述可移动平台中。
当所述拍摄装置使用卷帘快门的方式对所述拍摄图像进行曝光时,所述可移动平台执行所述获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间可以具体包括:计算所述拍摄装置对单个像素曝光所需的单位曝光时间;计算所述拍摄图像包括的总像素数目;根据所述单位曝光时间和所述总像素数目,计算所述拍摄装置对所述拍摄图像曝光所需的总曝光时间。
其中,所述单位曝光时间的计算方式可以参考前述相关部分的描述,在此不再赘述。所述总像素数目根据所述拍摄图像的分辨率确定。
当所述拍摄装置使用全局快门的方式对所述拍摄图像进行曝光时,所述指定比例可以预先存储在所述可移动平台中。
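按总曝光时间的指定比例确定指定曝光时间的方式可以示意如下。卷帘快门下的总曝光时间按"单位曝光时间*总像素数目"估算，全局快门下则直接给定总曝光时间；参数名与单位均为假设：

```python
def total_exposure_time_us(row_exposure_us, row_pixels, total_pixels):
    """卷帘快门：总曝光时间 = 单位曝光时间 * 总像素数目。"""
    return row_exposure_us / row_pixels * total_pixels

def specified_time_by_ratio(total_exposure_us, ratio=0.5):
    """指定曝光时间 = 总曝光时间 * 指定比例（如曝光进行到一半时取 0.5）。"""
    return total_exposure_us * ratio
```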
在另一个具体的实施例中,所述可移动平台执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时可以具体包括:获取所述拍摄装置的曝光准备时间;获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻可以具体包括:将所述拍摄指令的发送时刻与所述曝光准备时间以及所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
具体地,所述可移动平台执行所述根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻还可以具体包括:将所述目标中断的发生时刻与所述曝光准备时间以及所述指定曝光时间之和确定为所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
需要说明的是,所述可移动平台执行获取所述曝光准备时间和所述指定曝光时间的具体技术细节可以参考前述相关部分的描述,在此不再赘述。
S204:将所述拍摄图像标识和所述目标曝光时刻进行关联。
在一个具体的实施例中,所述可移动平台执行将所述拍摄图像标识和所述目标曝光时刻进行关联的方式可以具体包括:将所述拍摄图像标识和所述目标曝光时刻进行关联存储。
需要说明的是，在所述可移动平台将所述拍摄图像标识和所述目标曝光时刻进行关联之后，所述拍摄图像标识与所述目标曝光时刻之间具有一一对应的关系。从而，所述可移动平台可以根据所述目标曝光时刻查询到所述拍摄装置在所述目标曝光时刻下拍摄得到的拍摄图像的拍摄图像标识，由于所述拍摄图像与所述拍摄图像标识之间具有一一对应的关系，因此所述可移动平台还可以进一步查询到在所述目标曝光时刻下拍摄得到的拍摄图像。
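拍摄图像标识与目标曝光时刻的一一对应关系可以用一个简单的映射结构示意（类名与接口均为本文的假设，并非实际实现）：

```python
class CaptureIndex:
    """以目标曝光时刻为键关联拍摄图像标识，支持按时刻反查。"""

    def __init__(self):
        self._id_by_time = {}

    def associate(self, exposure_time, image_id):
        # 目标曝光时刻与拍摄图像标识一一对应
        self._id_by_time[exposure_time] = image_id

    def lookup(self, exposure_time):
        # 根据目标曝光时刻查询对应的拍摄图像标识，不存在时返回 None
        return self._id_by_time.get(exposure_time)
```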
S205:确定在所述目标曝光时刻下的目标位置信息。
在本发明实施例中，所述可移动平台执行所述确定在所述目标曝光时刻下的目标位置信息可以具体包括：采集第一载波相位观测值；根据所述第一载波相位观测值，确定在所述目标曝光时刻下的目标位置信息。
其中,所述第一载波相位观测值为所述可移动平台接收到的卫星定位数据。
具体地,所述可移动平台执行所述确定在所述目标曝光时刻下的目标位置信息可以具体包括:根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息;根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息。
其中,所述观测值采集时刻指的是所述第一载波相位观测值的采集时刻。
在一个具体的实施例中,所述可移动平台执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息可以具体包括:接收参考基站发送的第二载波相位观测值和参考位置信息;根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用实时动态(Real-Time Kinematic,RTK)载波相位差分技术计算在观测值采集时刻下的位置信息。
在这种情形下,所述可移动平台可以实时获取所述参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息。其中,所述第二载波相位观测值为所述参考基站接收到的卫星定位数据。
其中,所述参考位置信息指的是所述参考基站的位置信息(如坐标),所述观测值采集时刻指的是所述第一载波相位观测值或所述第二载波相位观测值的采集时刻。
作为一种可选的实施方式,所述可移动平台是根据在同一时刻采集的所述第一载波相位观测值、所述第二载波相位观测值以及所述参考位置信息计算在观测值采集时刻下的位置信息。
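RTK 解算需要流动站与参考基站在同一观测值采集时刻的观测值，这一配对预处理步骤可以示意如下（差分解算本身较复杂，不在此示意范围内；数据结构为假设）：

```python
def match_epochs(rover_obs, base_obs):
    """按观测值采集时刻配对第一、第二载波相位观测值。

    rover_obs、base_obs 均为 {采集时刻: 载波相位观测值} 字典，
    返回 [(时刻, 流动站观测值, 基站观测值), ...]，仅保留共同时刻。
    """
    common_epochs = sorted(set(rover_obs) & set(base_obs))
    return [(t, rover_obs[t], base_obs[t]) for t in common_epochs]
```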
在另一个具体的实施例中,所述可移动平台执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息可以具体包括:发送位置获取指令给参考基站,所述位置获取指令用于指示所述参考基站使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息,其中,所述位置获取指令中携带有所述第一载波相位观测值;接收所述参考基站反馈的在所述观测值采集时刻下的位置信息。
在这种情形下,计算在观测值采集时刻下的位置信息由所述参考基站执行,所述在观测值采集时刻下的位置信息的计算对于所述可移动平台是透明的。
在另一个具体的实施例中，所述可移动平台执行所述根据所述第一载波相位观测值，确定在观测值采集时刻下的位置信息可以具体包括：存储所述第一载波相位观测值；获取参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息；根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息，使用动态后处理(Post Processed Kinematic，PPK)技术计算在观测值采集时刻下的位置信息。
在这种情形下,所述可移动平台可以直接从所述参考基站获取所述第二载波相位观测值和所述参考位置信息,也可以从第三方设备获取所述第二载波相位观测值和所述参考位置信息。所述可移动平台一旦开始运行就会周期性地获取所述第一载波相位观测值(也称为PPK RAW数据)。
S206:将所述拍摄图像和所述目标位置信息进行关联。
在一个具体的实施例中,所述可移动平台执行将所述拍摄图像和所述目标位置信息进行关联可以具体包括:查询所述目标曝光时刻对应的拍摄图像标识;将所述目标位置信息存储到目标元数据中,所述目标元数据用于记录所述拍摄图像标识对应的拍摄图像的属性信息。
其中,所述目标元数据为所述拍摄图像的Exif数据或XMP数据。所述可移动平台将在所述目标曝光时刻下的目标位置信息存储到所述拍摄图像的目标元数据中,便于后续生成地图使用。
在本发明实施例中,通过计算拍摄装置对拍摄图像曝光的目标曝光时刻,并将拍摄图像与在该目标曝光时刻下的目标位置信息进行关联,可以在一边飞行一边拍摄的方式下将拍摄图像与自身准确的位置信息进行关联,提高了测绘准确率和效率。
请参见图4，图4是本发明实施例提供的一种图像和位置信息的关联装置的示意性框图。如图4所示，所述图像和位置信息的关联装置可以包括：一个或多个处理器401、一个或多个通讯接口402以及一个或多个存储器403。所述一个或多个处理器401可以单独地或协同地工作，所述一个或多个存储器403可以单独地或协同地工作。所述处理器401、所述通讯接口402和所述存储器403可以通过但不限于通过总线404连接。
所述处理器401,用于检测拍摄触发事件;
所述通讯接口402,用于当所述处理器401检测到拍摄触发事件时,发送拍摄指令给拍摄装置,所述拍摄指令用于指示所述拍摄装置进行拍摄以得到拍摄图像;
所述处理器401,还用于计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻;确定在所述目标曝光时刻下的目标位置信息;将所述拍摄图像和所述目标位置信息进行关联。
可选地，所述处理器401执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时，具体用于获取所述拍摄装置的曝光准备时间；根据所述曝光准备时间，计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
可选地,所述处理器401执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
可选地,所述处理器401执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置的曝光准备时间;获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
可选地,所述曝光准备时间为所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。
可选地,所述指定曝光时间为所述拍摄装置开始曝光的时刻到曝光过程中的指定时刻之间的时间。
可选地,所述处理器401执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时,具体用于计算所述拍摄装置对单个像素曝光所需的单位曝光时间;计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目;根据所述单位曝光时间和所述目标像素数目,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
可选地,所述处理器401执行所述计算所述拍摄装置对单个像素曝光所需的单位曝光时间时,具体用于获取所述拍摄装置对一行像素曝光所需的行曝光时间以及所述拍摄图像中一行像素包括的像素数目;根据所述行曝光时间以及所述像素数目,计算所述拍摄装置对单个像素曝光所需的单位曝光时间。
可选地,所述处理器401执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时,具体用于获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间;根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
可选地,所述通讯接口402,还用于采集第一载波相位观测值;
可选地,所述处理器401执行所述确定在所述目标曝光时刻下的目标位置信息时,具体用于根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息。
可选地,所述处理器401执行所述根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息时,具体用于根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息;根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息。
可选地,所述通讯接口402,还用于接收参考基站发送的第二载波相位观测值和参考位置信息;
可选地,所述处理器401执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息。
可选地,所述通讯接口402,还用于发送位置获取指令给参考基站,所述位置获取指令用于指示所述参考基站使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息,其中,所述位置获取指令中携带有所述第一载波相位观测值;接收所述参考基站反馈的在所述观测值采集时刻下的位置信息;
可选地,所述处理器401执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于获取所述通讯接口接收的在所述观测值采集时刻下的位置信息。
可选地,所述存储器403,用于存储所述第一载波相位观测值;
可选地，所述处理器401执行所述根据所述第一载波相位观测值，确定在观测值采集时刻下的位置信息时，具体用于获取参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息；根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息，使用动态后处理技术计算在观测值采集时刻下的位置信息。
可选地，所述处理器401执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于判断记录的观测值采集时刻中是否存在目标观测值采集时刻，所述目标观测值采集时刻与所述目标曝光时刻处于同一时刻；若存在，则将在所述目标观测值采集时刻下的位置信息确定为在所述目标曝光时刻下的目标位置信息。
可选地，所述处理器401执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于从记录的观测值采集时刻中确定指定观测值采集时刻，所述指定观测值采集时刻与所述目标曝光时刻之间的时间间隔小于预设间隔；根据在所述指定观测值采集时刻下的位置信息，估算在所述目标曝光时刻下的目标位置信息。
可选地,所述处理器401执行所述将所述拍摄图像和所述目标位置信息进行关联之前,还用于生成所述拍摄图像对应的拍摄图像标识;将所述拍摄图像标识和所述目标曝光时刻进行关联。
可选地,所述处理器401执行所述将所述拍摄图像和所述目标位置信息进行关联时,具体用于查询所述目标曝光时刻对应的拍摄图像标识;将所述目标位置信息存储到目标元数据中,所述目标元数据用于记录所述拍摄图像标识对应的拍摄图像的属性信息。
可选地,所述存储器403,还用于存储所述目标元数据。
应当理解,在本发明实施例中,所述处理器401可以是中央处理单元(Central Processing Unit,CPU),所述处理器401还可以是其他通用处理器、数字信号处理器(Digital Signal Processor,DSP)、专用集成电路(Application Specific Integrated Circuit,ASIC)、现场可编程门阵列(Field-Programmable Gate Array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。所述通用处理器可以是微处理器或者所述处理器401也可以是任何常规的处理器等。
所述存储器403可以包括只读存储器(Read-Only Memory,ROM)和随机存取存储器(Random Access Memory,RAM),并向处理器401提供计算机程序和数据。所述存储器403的一部分还可以包括非易失性随机存取存储器。
需要说明的是，本发明实施例中描述的处理器401、通讯接口402和存储器403可执行本申请图1或图2所示的图像和位置信息的关联方法的实现方式，具体技术细节可以参考本发明实施例方法的相关部分的描述，在此不再赘述。
在本发明实施例中,所述图像和位置信息的关联装置通过计算拍摄装置对拍摄图像曝光的目标曝光时刻,并将拍摄图像与在该目标曝光时刻下的目标位置信息进行关联,可以在一边飞行一边拍摄的方式下将拍摄图像与自身准确的位置信息进行关联,提高了测绘准确率和效率。
请参见图5，图5是本发明实施例提供的一种可移动平台的结构示意图。如图5所示，所述可移动平台可以包括：一个或多个处理器501、一个或多个通讯接口502、一个或多个拍摄装置503以及一个或多个存储器504。所述处理器501、通讯接口502、拍摄装置503和存储器504可以通过但不限于通过总线505连接。
需要说明的是,所述可移动平台还可以包括未在图5中示出的机身、设置在所述机身上的动力系统、电源系统、飞控系统、导航系统、定位系统等等,所述动力系统用于提供飞行动力。其中,所述可移动平台可以是诸如无人机等飞行器、汽车、船等等,所述拍摄装置503例如可以是相机、摄像头等等。
所述处理器501,用于检测拍摄触发事件;
所述通讯接口502,用于当所述处理器501检测到拍摄触发事件时,发送拍摄指令给所述拍摄装置503,所述拍摄指令用于指示所述拍摄装置503进行拍摄以得到拍摄图像;
所述拍摄装置503,用于接收所述拍摄指令,并根据所述拍摄指令进行拍摄以得到拍摄图像;
所述处理器501,还用于计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻;确定在所述目标曝光时刻下的目标位置信息;将所述拍摄图像和所述目标位置信息进行关联。
可选地,所述处理器501执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置的曝光准备时间;根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
可选地，所述处理器501执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时，具体用于获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间；根据所述指定曝光时间，计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
可选地,所述处理器501执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置的曝光准备时间;获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
可选地,所述曝光准备时间为所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。
可选地,所述指定曝光时间为所述拍摄装置开始曝光的时刻到曝光过程中的指定时刻之间的时间。
可选地,所述处理器501执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时,具体用于计算所述拍摄装置对单个像素曝光所需的单位曝光时间;计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目;根据所述单位曝光时间和所述目标像素数目,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
可选地,所述处理器501执行所述计算所述拍摄装置对单个像素曝光所需的单位曝光时间时,具体用于获取所述拍摄装置对一行像素曝光所需的行曝光时间以及所述拍摄图像中一行像素包括的像素数目;根据所述行曝光时间以及所述像素数目,计算所述拍摄装置对单个像素曝光所需的单位曝光时间。
可选地,所述处理器501执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时,具体用于获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间;根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
可选地,所述通讯接口502,还用于采集第一载波相位观测值;
可选地,所述处理器501执行所述确定在所述目标曝光时刻下的目标位置信息时,具体用于根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息。
可选地，所述处理器501执行所述根据所述第一载波相位观测值，确定在所述目标曝光时刻下的目标位置信息时，具体用于根据所述第一载波相位观测值，确定在观测值采集时刻下的位置信息；根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息。
可选地,所述通讯接口502,还用于接收参考基站发送的第二载波相位观测值和参考位置信息;
可选地,所述处理器501执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息。
可选地,所述通讯接口502,还用于发送位置获取指令给参考基站,所述位置获取指令用于指示所述参考基站使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息,其中,所述位置获取指令中携带有所述第一载波相位观测值;接收所述参考基站反馈的在所述观测值采集时刻下的位置信息。
可选地,所述处理器501执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于获取所述通讯接口接收的在所述观测值采集时刻下的位置信息。
可选地，所述存储器504，用于存储所述第一载波相位观测值；
可选地，所述处理器501执行所述根据所述第一载波相位观测值，确定在观测值采集时刻下的位置信息时，具体用于获取参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息；根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息，使用动态后处理技术计算在观测值采集时刻下的位置信息。
可选地，所述处理器501执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于判断记录的观测值采集时刻中是否存在目标观测值采集时刻，所述目标观测值采集时刻与所述目标曝光时刻处于同一时刻；若存在，则将在所述目标观测值采集时刻下的位置信息确定为在所述目标曝光时刻下的目标位置信息。
可选地，所述处理器501执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于从记录的观测值采集时刻中确定指定观测值采集时刻，所述指定观测值采集时刻与所述目标曝光时刻之间的时间间隔小于预设间隔；根据在所述指定观测值采集时刻下的位置信息，估算在所述目标曝光时刻下的目标位置信息。
可选地,所述处理器501执行所述将所述拍摄图像和所述目标位置信息进行关联之前,还用于生成所述拍摄图像对应的拍摄图像标识;将所述拍摄图像标识和所述目标曝光时刻进行关联。
可选地，所述处理器501执行所述将所述拍摄图像和所述目标位置信息进行关联时，具体用于查询所述目标曝光时刻对应的拍摄图像标识；将所述目标位置信息存储到目标元数据中，所述目标元数据用于记录所述拍摄图像标识对应的拍摄图像的属性信息。
可选地,所述存储器504,还用于存储所述目标元数据。
需要说明的是,本发明实施例中的处理器501可以是前述实施例所描述的处理器,本发明实施例中的存储器504可以是前述实施例所描述的存储器。
在本发明实施例中,所述可移动平台通过计算拍摄装置对拍摄图像曝光的目标曝光时刻,并将拍摄图像与在该目标曝光时刻下的目标位置信息进行关联,可以在一边飞行一边拍摄的方式下将拍摄图像与自身准确的位置信息进行关联,提高了测绘准确率和效率。
在本发明的实施例中还提供一种计算机可读存储介质,所述计算机可读存储介质存储有计算机程序,所述计算机程序中包括程序指令,所述程序指令当被所述处理器501调用时使所述处理器501执行本申请图1或图2所示的图像和位置信息的关联方法。
所述计算机可读存储介质可以是本申请所述的可移动平台的内部存储单元,例如所述可移动平台的硬盘或内存。所述计算机可读存储介质也可以是所述可移动平台的外部存储设备,例如所述可移动平台上配备的插接式硬盘,智能存储卡(Smart Media Card,SMC),SD卡,闪存卡(Flash Card)等。进一步地,所述计算机可读存储介质还可以既包括所述可移动平台的内部存储单元也包括外部存储设备。所述计算机可读存储介质用于存储所述计算机程序以及所述可移动平台所需的其他程序和数据。所述计算机可读存储介质还可以用于暂时地存储已经输出或者将要输出的数据。
以上所述,仅为本发明的部分实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到各种等效的修改或替换,这些修改或替换都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (38)

  1. 一种图像和位置信息的关联方法,其特征在于,所述方法包括:
    当检测到拍摄触发事件时,发送拍摄指令给拍摄装置,所述拍摄指令用于指示所述拍摄装置进行拍摄以得到拍摄图像;
    计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻;
    确定在所述目标曝光时刻下的目标位置信息;
    将所述拍摄图像和所述目标位置信息进行关联。
  2. 根据权利要求1所述的方法,其特征在于,所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻,包括:
    获取所述拍摄装置的曝光准备时间;
    根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
  3. 根据权利要求1所述的方法,其特征在于,所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻,包括:
    获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;
    根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
  4. 根据权利要求1所述的方法,其特征在于,所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻,包括:
    获取所述拍摄装置的曝光准备时间;
    获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;
    根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
  5. 根据权利要求2或4所述的方法，其特征在于，所述曝光准备时间为所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。
  6. 根据权利要求3或4所述的方法,其特征在于,所述指定曝光时间为所述拍摄装置开始曝光的时刻到曝光过程中的指定时刻之间的时间。
  7. 根据权利要求6所述的方法,其特征在于,所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间,包括:
    计算所述拍摄装置对单个像素曝光所需的单位曝光时间;
    计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目;
    根据所述单位曝光时间和所述目标像素数目,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
  8. 根据权利要求7所述的方法,其特征在于,所述计算所述拍摄装置对单个像素曝光所需的单位曝光时间,包括:
    获取所述拍摄装置对一行像素曝光所需的行曝光时间以及所述拍摄图像中一行像素包括的行像素数目;
    根据所述行曝光时间以及所述行像素数目,计算所述拍摄装置对单个像素曝光所需的单位曝光时间。
  9. 根据权利要求6所述的方法,其特征在于,所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间,包括:
    获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间;
    根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
  10. 根据权利要求1所述的方法,其特征在于,所述确定在所述目标曝光时刻下的目标位置信息,包括:
    采集第一载波相位观测值;
    根据所述第一载波相位观测值，确定在所述目标曝光时刻下的目标位置信息。
  11. 根据权利要求10所述的方法,其特征在于,所述根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息,包括:
    根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息;
    根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息。
  12. 根据权利要求11所述的方法,其特征在于,所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息,包括:
    接收参考基站发送的第二载波相位观测值和参考位置信息;
    根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息。
  13. 根据权利要求11所述的方法,其特征在于,所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息,包括:
    发送位置获取指令给参考基站,所述位置获取指令用于指示所述参考基站使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息,其中,所述位置获取指令中携带有所述第一载波相位观测值;
    接收所述参考基站反馈的在所述观测值采集时刻下的位置信息。
  14. 根据权利要求11所述的方法,其特征在于,所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息,包括:
    存储所述第一载波相位观测值;
    获取参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息;
    根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用动态后处理技术计算在观测值采集时刻下的位置信息。
  15. 根据权利要求11所述的方法，其特征在于，所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息，包括：
    判断记录的观测值采集时刻中是否存在目标观测值采集时刻,所述目标观测值采集时刻与所述目标曝光时刻处于同一时刻;
    若存在,则将在所述目标观测值采集时刻下的位置信息确定为在所述目标曝光时刻下的目标位置信息。
  16. 根据权利要求11所述的方法,其特征在于,所述根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息,包括:
    从记录的观测值采集时刻中确定指定观测值采集时刻,所述指定观测值采集时刻与所述目标曝光时刻之间的时间间隔小于预设间隔;
    根据在所述指定观测值采集时刻下的位置信息,估算在所述目标曝光时刻下的目标位置信息。
  17. 根据权利要求1所述的方法,其特征在于,在所述将所述拍摄图像和所述目标位置信息进行关联之前,所述方法还包括:
    生成所述拍摄图像对应的拍摄图像标识;
    将所述拍摄图像标识和所述目标曝光时刻进行关联。
  18. 根据权利要求17所述的方法,其特征在于,所述将所述拍摄图像和所述目标位置信息进行关联,包括:
    查询所述目标曝光时刻对应的拍摄图像标识;
    将所述目标位置信息存储到目标元数据中,所述目标元数据用于记录所述拍摄图像标识对应的拍摄图像的属性信息。
  19. 一种图像和位置信息的关联装置,其特征在于,所述装置包括:
    处理器，用于检测拍摄触发事件；
    通讯接口,用于当所述处理器检测到拍摄触发事件时,发送拍摄指令给拍摄装置,所述拍摄指令用于指示所述拍摄装置进行拍摄以得到拍摄图像;
    所述处理器，还用于计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻；确定在所述目标曝光时刻下的目标位置信息；将所述拍摄图像和所述目标位置信息进行关联。
  20. 根据权利要求19所述的装置,其特征在于,
    所述处理器执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置的曝光准备时间;根据所述曝光准备时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
  21. 根据权利要求19所述的装置,其特征在于,
    所述处理器执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述指定曝光时间,计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻。
  22. 根据权利要求19所述的装置,其特征在于,
    所述处理器执行所述计算所述拍摄装置对所述拍摄图像曝光的目标曝光时刻时,具体用于获取所述拍摄装置的曝光准备时间;获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间;根据所述曝光准备时间和所述指定曝光时间,计算所述拍摄装置曝光所述拍摄图像的目标曝光时刻。
  23. 根据权利要求20或22所述的装置,其特征在于,所述曝光准备时间为所述拍摄装置接收到所述拍摄指令的时刻到所述拍摄装置开始曝光的时刻之间的时间。
  24. 根据权利要求21或22所述的装置,其特征在于,所述指定曝光时间为所述拍摄装置开始曝光的时刻到曝光过程中的指定时刻之间的时间。
  25. 根据权利要求24所述的装置,其特征在于,
    所述处理器执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时，具体用于计算所述拍摄装置对单个像素曝光所需的单位曝光时间；计算从所述拍摄图像中的起始曝光位置到指定曝光位置之间的目标像素数目；根据所述单位曝光时间和所述目标像素数目，计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
  26. 根据权利要求25所述的装置,其特征在于,
    所述处理器执行所述计算所述拍摄装置对单个像素曝光所需的单位曝光时间时,具体用于获取所述拍摄装置对一行像素曝光所需的行曝光时间以及所述拍摄图像中一行像素包括的行像素数目;根据所述行曝光时间以及所述行像素数目,计算所述拍摄装置对单个像素曝光所需的单位曝光时间。
  27. 根据权利要求24所述的装置,其特征在于,
    所述处理器执行所述获取所述拍摄装置对所述拍摄图像曝光的指定曝光时间时,具体用于获取所述拍摄装置对所述拍摄图像曝光所需的总曝光时间;根据所述总曝光时间,计算所述拍摄装置对所述拍摄图像曝光的指定曝光时间。
  28. 根据权利要求19所述的装置,其特征在于,
    所述通讯接口,还用于采集第一载波相位观测值;
    所述处理器执行所述确定在所述目标曝光时刻下的目标位置信息时,具体用于根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息。
  29. 根据权利要求28所述的装置,其特征在于,
    所述处理器执行所述根据所述第一载波相位观测值,确定在所述目标曝光时刻下的目标位置信息时,具体用于根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息;根据所述在观测值采集时刻下的位置信息,确定在所述目标曝光时刻下的目标位置信息。
  30. 根据权利要求29所述的装置,其特征在于,
    所述通讯接口,还用于接收参考基站发送的第二载波相位观测值和参考位置信息;
    所述处理器执行所述根据所述第一载波相位观测值，确定在观测值采集时刻下的位置信息时，具体用于根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息，使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息。
  31. 根据权利要求29所述的装置,其特征在于,
    所述通讯接口,还用于发送位置获取指令给参考基站,所述位置获取指令用于指示所述参考基站使用实时动态载波相位差分技术计算在观测值采集时刻下的位置信息,其中,所述位置获取指令中携带有所述第一载波相位观测值;接收所述参考基站反馈的在所述观测值采集时刻下的位置信息;
    所述处理器执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于获取所述通讯接口接收的在所述观测值采集时刻下的位置信息。
  32. 根据权利要求29所述的装置,其特征在于,所述装置还包括:
    存储器,用于存储所述第一载波相位观测值;
    所述处理器执行所述根据所述第一载波相位观测值,确定在观测值采集时刻下的位置信息时,具体用于获取参考基站采集的第二载波相位观测值和所述参考基站的参考位置信息;根据所述第一载波相位观测值、所述第二载波相位观测值和所述参考位置信息,使用动态后处理技术计算在观测值采集时刻下的位置信息。
  33. 根据权利要求29所述的装置,其特征在于,
    所述处理器执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于判断记录的观测值采集时刻中是否存在目标观测值采集时刻，所述目标观测值采集时刻与所述目标曝光时刻处于同一时刻；若存在，则将在所述目标观测值采集时刻下的位置信息确定为在所述目标曝光时刻下的目标位置信息。
  34. 根据权利要求29所述的装置,其特征在于,
    所述处理器执行所述根据所述在观测值采集时刻下的位置信息，确定在所述目标曝光时刻下的目标位置信息时，具体用于从记录的观测值采集时刻中确定指定观测值采集时刻，所述指定观测值采集时刻与所述目标曝光时刻之间的时间间隔小于预设间隔；根据在所述指定观测值采集时刻下的位置信息，估算在所述目标曝光时刻下的目标位置信息。
  35. 根据权利要求19所述的装置,其特征在于,
    所述处理器执行所述将所述拍摄图像和所述目标位置信息进行关联之前,还用于生成所述拍摄图像对应的拍摄图像标识;将所述拍摄图像标识和所述目标曝光时刻进行关联。
  36. 根据权利要求35所述的装置,其特征在于,
    所述处理器执行所述将所述拍摄图像和所述目标位置信息进行关联时,具体用于查询所述目标曝光时刻对应的拍摄图像标识;将所述目标位置信息存储到目标元数据中,所述目标元数据用于记录所述拍摄图像标识对应的拍摄图像的属性信息;
    所述存储器,还用于存储所述目标元数据。
  37. 一种可移动平台,其特征在于,所述平台包括:
    机身;
    设置在所述机身上的动力系统,用于提供飞行动力;
    拍摄装置,用于接收拍摄指令并根据所述拍摄指令进行拍摄以得到拍摄图像;以及
    如权利要求19至36中任一项所述的图像和位置信息的关联装置。
  38. 一种计算机可读存储介质，其特征在于，所述计算机可读存储介质存储有计算机程序，所述计算机程序包括程序指令，所述程序指令当被处理器调用时使所述处理器执行如权利要求1至18任一项所述的方法。
PCT/CN2017/117302 2017-12-19 2017-12-19 图像和位置信息的关联方法、装置及可移动平台 WO2019119282A1 (zh)
