WO2021035756A1 - Aircraft-based inspection method, device, and storage medium - Google Patents

Aircraft-based inspection method, device, and storage medium

Info

Publication number
WO2021035756A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
aircraft
flight
video recording
video frame
Prior art date
Application number
PCT/CN2019/103893
Other languages
English (en)
French (fr)
Inventor
杨超锋 (Yang Chaofeng)
何纲 (He Gang)
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority to PCT/CN2019/103893 (WO2021035756A1)
Priority to CN201980040208.7A (CN112313596A)
Publication of WO2021035756A1

Classifications

    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10: Simultaneous control of position or course in three dimensions
    • G05D1/101: Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G01C21/16: Navigation by integrating acceleration or speed, i.e. inertial navigation (dead reckoning executed aboard the object being navigated)
    • G06V10/10: Image acquisition (arrangements for image or video recognition or understanding)
    • G06V10/40: Extraction of image or video features

Definitions

  • This application relates to the technical field of aircraft inspection, and in particular to an aircraft-based inspection method, equipment and storage medium.
  • the present application provides an aircraft-based inspection method, equipment, and storage medium, which can quickly locate the location of the target area.
  • this application provides an aircraft-based inspection method, and the inspection method includes:
  • the flight information is bound to the corresponding video frame of the video recording, so that the position of the target area in the video frame can be obtained from the position information and gimbal attitude information in the flight information.
  • the present application also provides an aircraft, the aircraft including a body, a photographing device, a memory and a processor;
  • the photographing device is connected to the body to photograph images, and the photographing device includes a pan/tilt and a camera installed on the pan/tilt, and the photographing angle of the camera can be adjusted by adjusting the pan/tilt;
  • the memory is used to store a computer program
  • the processor is configured to execute the computer program and, when executing the computer program, implement the following steps:
  • the flight information is bound to the corresponding video frame of the video recording, so that the position of the target area in the video frame can be obtained from the position information and gimbal attitude information in the flight information.
  • the present application also provides a receiving end, the receiving end includes a display device;
  • the receiving end displays the video recording and the flight information bound to the video recording through the display device.
  • the flight information includes position information and pan/tilt attitude information; according to the position information and cloud information in the flight information
  • the station posture information acquires the position of the target area in the video frame.
  • the present application also provides a flight system, the flight system includes an aircraft and a receiving end, and the receiving end includes a display device;
  • the aircraft is used to collect video recordings of the inspection area during inspection
  • the aircraft is used to obtain corresponding flight information when the video recording is collected, and the flight information includes position information and gimbal attitude information;
  • the aircraft is used to bind the flight information to the corresponding video frame of the video recording, and send the video recording with the flight information added to the receiving end;
  • the receiving end is configured to display, through the display device, the video recording and the flight information in the video frames of the recording, and to obtain the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information.
  • the present application also provides a computer-readable storage medium, the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the processor implements the above-mentioned inspection method.
  • the embodiments of the application provide an aircraft-based inspection method, device, and storage medium, which collect a video recording of the inspection area during aircraft inspection and obtain the corresponding flight information while the aircraft collects the recording.
  • the flight information includes position information and gimbal attitude information; the flight information is bound to the corresponding video frames of the recording, so that the position of the target area in a video frame can be obtained from the position information and gimbal attitude information.
  • in this way, rapid and accurate positioning of the target area is realized, and the efficiency of confirming the target area is improved.
  • FIG. 1 is a schematic block diagram of a flight system provided by an embodiment of the present application.
  • FIG. 2 is a schematic flowchart of steps of an aircraft-based inspection method provided by an embodiment of the present application
  • FIG. 3 is a schematic diagram of an application scenario of an aircraft-based inspection method provided by an embodiment of the present application
  • FIG. 4 is a schematic diagram of the effect of determining the position of the target area provided by the embodiment of the present application.
  • FIG. 5 is a schematic flowchart of steps for adding flight information provided by an embodiment of the present application.
  • FIG. 6 is a schematic diagram of the effect of determining the subtitle display area provided by an embodiment of the present application.
  • FIG. 7 is a schematic block diagram of an aircraft provided by an embodiment of the present application.
  • FIG. 8 is a schematic block diagram of the receiving end of the flight system provided by an embodiment of the present application.
  • FIG. 1 is a schematic block diagram of a flight system according to an embodiment of the present application.
  • the flight system 100 may include an aircraft 110 and a receiving terminal 120, and the receiving terminal 120 is in communication connection with the aircraft 110.
  • the aircraft 110 is used for inspection flights; the receiving end 120 can be used as a control device to control the flight and shooting of the aircraft 110, and to receive the video captured during the inspection by the aircraft 110.
  • the aircraft 110 may be a rotary-wing aircraft, such as a single-rotor, dual-rotor, tri-rotor, quad-rotor, hexa-rotor, octo-rotor, ten-rotor, or twelve-rotor aircraft.
  • the aircraft may also be other types of unmanned aerial vehicles or movable devices, and the embodiments of the present application are not limited thereto.
  • the receiving end 120 includes a display device for displaying the video collected by the aircraft 110 for the user to watch.
  • the receiving end 120 may be located on the ground end of the flight system 100, and may communicate with the aircraft 110 in a wireless manner, for receiving the video collected by the aircraft 110, and displaying the received video in real time.
  • the receiving terminal 120 is also used for remote control of the aircraft 110.
  • the receiving terminal 120 may be a remote controller or a terminal installed with an application program for controlling the aircraft 110.
  • the terminal can be a mobile phone, a tablet computer, a notebook computer, a desktop computer, a personal digital assistant, a wearable device, etc.
  • the receiving terminal 120 receives the flight control instruction input by the user, for example through an input device such as a joystick, dial, key, or button on a remote control, or through a user interface (UI) on a terminal.
  • the receiving end 120 may also be a device that only includes a display device, such as a display or a projector, for reading and displaying the video collected by the aircraft 110.
  • the aircraft 110 includes a flight controller 111, a camera 112 and a positioning device 113.
  • the flight controller 111 is used to control the flight of the aircraft 110, that is, the control module of the aircraft 110. It is understandable that the flight controller 111 can control the aircraft 110 according to pre-programmed program instructions, or can control the aircraft 110 by responding to one or more control instructions from the receiving end.
  • the photographing device 112 is mounted under the aircraft 110 and includes a camera 1121 and a gimbal 1122; of course, it can also be an integrated gimbal camera.
  • the photographing device 112 is communicatively connected with the flight controller 111, and takes an image under the control of the flight controller 111.
  • the positioning device 113 is mounted on the aircraft 110 and is used to measure the position information of the aircraft in real time.
  • the positioning device 113 may include, but is not limited to, a GPS positioning device, a Beidou positioning device, or a real-time kinematic (RTK) carrier-phase differential positioning device (RTK positioning device for short).
  • RTK carrier-phase differential technology is a differential method that processes the carrier-phase observations of two measuring stations in real time: the carrier phase collected by the reference station is sent to the user receiver, which differences the observations and solves for the coordinates.
  • RTK can obtain centimeter-level positioning accuracy in the field in real time, without post-processing.
  • an RTK positioning device can therefore be used to accurately determine the positioning information of the aircraft.
  • the positioning device 113 is under the control of the flight controller 111 to collect the current position information of the aircraft in real time.
  • the location information may include longitude information and latitude information.
  • the aircraft 110 also includes an inertial measurement device, which is used to measure the flight speed and flight attitude information of the aircraft.
  • the gimbal 1122 includes an electronic speed controller (ESC) and motors.
  • the flight controller 111 can control the movement of the gimbal through an ESC and a motor.
  • the pan/tilt may also include a controller, through which the ESC and the motor are controlled to control the movement of the pan/tilt.
  • the motors are three-axis motors, namely a pitch-axis motor, a roll-axis motor, and a yaw-axis motor, which are used to change the attitude of the gimbal during shooting.
  • pan/tilt 1122 may be independent of the aircraft or part of the aircraft. It is understandable that the motor can be a DC motor or an AC motor. In addition, the motor may be a brushless motor or a brushed motor, and the embodiment of the present application is not limited thereto.
  • the camera 112 may also be provided at other suitable positions of the aircraft, such as the nose of the aircraft, and the embodiment of the present application is not limited to this.
  • the aircraft 110 further includes a distance measuring device 114.
  • the distance measuring device 114 is mounted on an aircraft and is used to measure the distance between the aircraft and a key target, and measure the flying distance or flying height of the aircraft.
  • the distance measuring device 114 includes at least one of the following: Time of Flight (TOF) ranging detection equipment, radar, ultrasonic detection equipment, laser detection equipment, and the like.
  • TOF Time of Flight
  • the aircraft 110 sends the captured video to the receiving terminal 120, and specifically transmits the captured video to the receiving terminal 120 using wireless image transmission technology.
  • the aircraft 110 is used to collect video, compress and encode the collected video, and send the compressed and encoded video to the receiving terminal 120, and the receiving terminal 120 decodes and displays the encoded video.
  • this improves the smoothness and speed of video transmission.
  • the way of encoding the video can be inter-frame coding or intra-frame coding, of course, other coding methods can also be used, such as a mixed coding method of inter-frame coding and intra-frame coding.
  • a decoding method corresponding to the encoding method is adopted to decode the encoded video.
  • the aircraft 110 may also store the captured video in a memory or a memory card of the aircraft, so as to send or copy it to the receiving terminal 120 for display.
  • the inspection method provided by the embodiment of the present application will be introduced in detail based on the flight system, the aircraft in the flight system, and the receiving end in the flight system.
  • FIG. 2 is a schematic flowchart of steps of an aircraft-based inspection method according to an embodiment of the present application.
  • the aircraft-based inspection method is specifically applied to the flight controller of the aircraft, that is, it is executed by the flight controller 111 in FIG. 1; of course, it can also be implemented by other control devices carried on the aircraft, and the embodiment of the present application is not limited to this. The following description takes the control device being a flight controller as an example.
  • the aircraft-based inspection method includes steps S201 to S203.
  • the video recording includes multiple video frames.
  • the collected video recordings are used to determine whether an abnormal accident occurs in the target area.
  • the abnormal accident is, for example, a disaster such as a fire or an earthquake, or a base station is destroyed, a bridge is broken, or a building collapses.
  • the flight information includes at least position information and gimbal attitude information.
  • the location information includes at least the location information of the aircraft, that is, the real-time location information corresponding to the time the aircraft shoots the video, including longitude information, latitude information, and altitude information.
  • the gimbal attitude information is the gimbal attitude parameter information when the aircraft is shooting video, including pitch angle, roll angle, and heading angle.
  • the position information further includes take-off point position information and/or relative position information between the aircraft and the take-off point.
  • when the aircraft position information is inaccurate, the position of the target area can be determined from the take-off point position information and the relative position information between the aircraft and the take-off point.
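The fallback described above, deriving the aircraft position from the take-off point plus the relative offset, can be sketched as follows. This is an illustrative flat-earth approximation, not taken from the patent; the helper name, the Earth-radius constant, and the north/east decomposition of the horizontal offset are all assumptions:

```python
import math

# Illustrative sketch: recover the aircraft's absolute position from the
# take-off (home) point plus its metric offset, using a local flat-earth
# approximation that is adequate over short inspection distances.
EARTH_RADIUS_M = 6_371_000.0

def position_from_home(home_lat_deg, home_lon_deg, north_m, east_m):
    """Return the aircraft (lat, lon) given its offset from home in meters."""
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(home_lat_deg))))
    return home_lat_deg + dlat, home_lon_deg + dlon
```

For example, 100 m north of a home point near latitude 22.5 shifts the latitude by roughly 0.0009 degrees.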
  • the flight information further includes flight parameters, where the flight parameters include at least one of flight speed and aircraft attitude parameters. So that the user can understand the detailed parameters of the target area according to the flight parameters, and do accurate analysis, positioning and recording.
  • the flight information further includes camera parameters, and the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
  • image processing may be performed on the video frame according to the shooting parameters, so as to more accurately identify the details of the target area.
  • for example, if the target area is the area where a bridge is located and the bridge is broken, the details of the target area can be used to determine the degree of the fracture.
  • the flight information is bound to the corresponding video frame of the video recording, so that the position of the target area in the video frame can be obtained from the position information and gimbal attitude information in the flight information.
  • the aircraft 110 collects the video recording of the inspection area through the photographing device 112 and obtains the corresponding flight information while the recording is collected.
  • the flight information includes position information and gimbal attitude information; the flight information is bound to the corresponding video frames of the recording, and the recording is sent together with the flight information to the receiving end 120.
  • the sending can be real-time transmission, or the recording can be saved and uploaded to the receiving end later; the receiving end 120 may be a mobile phone, a tablet computer, another display device, or a controller with a display device.
  • the receiving end 120 displays the video recording and the flight information in its video frames, so that the position of the target area in a video frame can be obtained from the position information and gimbal attitude information in the flight information.
  • the current frame 130 displayed by the receiving end 120 includes flight information, and the flight information is specifically the information in the dashed box in the current frame 130.
  • F represents the camera aperture
  • SS represents the shutter time
  • ISO represents the exposure intensity
  • EV represents the exposure value
  • GPS/RTK represents the aircraft position information (longitude, latitude, altitude)
  • HOME represents the take-off point position information (longitude, latitude, altitude)
  • D represents the horizontal distance of the aircraft from the take-off point
  • H represents the altitude of the aircraft relative to the take-off point
  • HS represents the horizontal flight speed of the aircraft
  • VS represents the vertical flight speed of the aircraft
  • F.PRY represents the gimbal attitude parameters (pitch, roll, yaw)
  • G.PRY represents the flight attitude parameters of the aircraft (pitch, roll, yaw)
  • the corresponding flight information can be observed in the played-back video recording, and the flight information includes position information and gimbal attitude information. If the user finds a target area while watching the video, the location of the target area can be determined from the position information and the gimbal attitude information.
  • the position of the target area may be determined according to the position information of the aircraft.
  • the position of the aircraft may be used as the position of the target area.
  • the location of the target area can also be determined from the position information together with the gimbal attitude information: the azimuth relationship between the target area and the aircraft is determined from the gimbal attitude information, and the location of the target area is then determined from the aircraft position information and that azimuth relationship.
  • in this way, the embodiment of the present application can obtain more accurate target-area position information, which greatly helps inspection operations; the approach is simple to implement without complicated calculations, the result is intuitive, and no additional detection device needs to be added to the aircraft, which saves cost and improves work efficiency.
  • the position information further includes take-off point position information and/or relative position information of the aircraft and the take-off point.
  • when the position information of the aircraft is inaccurate (for example, the positioning accuracy of the positioning device is low or has a large deviation), the aircraft position can be determined from the take-off point position information and the relative position information between the aircraft and the take-off point, and then combined with the gimbal attitude information to determine the position of the target area.
  • a more accurate position of the target area can be calculated from the position information and the gimbal attitude information.
  • the specific process is as follows: obtain the position-point information corresponding to the target area determined by the user in the displayed video frame, and determine the location of the target area according to the position information, the gimbal attitude information, and the position-point information.
  • the position information includes the aircraft position information and the distance between the aircraft and the target area; it may also include the take-off point position information and/or the relative position information between the aircraft and the take-off point.
  • the gimbal attitude information includes the pitch angle, roll angle, and heading angle of the gimbal; the position of the target area is calculated from the position information, the gimbal attitude information, and the position-point information.
  • the gimbal attitude angle can be used directly as the angle between the aircraft and the target point, or used to estimate that angle as required, depending on the actual situation; with the aircraft position, the angle between the aircraft and the target point, and the distance between them, the position of the target area can be calculated through trigonometric formulas.
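The trigonometric calculation above can be sketched as follows. This is an illustrative reading, not the patent's actual implementation: the gimbal yaw is taken as the bearing from the aircraft to the target, the gimbal pitch as the depression angle below the horizon, and a flat-earth approximation converts the resulting ground offset into coordinates. All names and the Earth-radius constant are assumptions:

```python
import math

EARTH_RADIUS_M = 6_371_000.0

def target_position(lat_deg, lon_deg, yaw_deg, pitch_down_deg, slant_dist_m):
    """Estimate the target's (lat, lon) from the aircraft position, the
    gimbal yaw/pitch used as bearing/depression angle, and the measured
    aircraft-to-target distance (e.g. from the distance measuring device)."""
    # Trigonometric step: horizontal ground distance d = D * cos(pitch).
    ground = slant_dist_m * math.cos(math.radians(pitch_down_deg))
    north_m = ground * math.cos(math.radians(yaw_deg))
    east_m = ground * math.sin(math.radians(yaw_deg))
    # Flat-earth conversion of the metric offset to coordinates.
    dlat = math.degrees(north_m / EARTH_RADIUS_M)
    dlon = math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```

For instance, a target seen due east (yaw 90) at a 60-degree depression and 100 m slant distance lies about 50 m east of the aircraft on the ground.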
  • the calculated position information of the target area may also be displayed, so that the user can determine it more accurately. Thus, when the user identifies a target area while watching the video, its position can be accurately determined from the position information and gimbal attitude information added to the recording, allowing the target area to be dealt with in time and improving inspection efficiency.
  • the corresponding flight information when the aircraft is collecting the video recording may be acquired at a preset frequency.
  • the preset frequency is related to the capture frequency of the video frame.
  • binding the flight information to the corresponding video frame of the video recording is specifically: determining the video frame corresponding to the flight information, and binding the flight information to the corresponding video frame.
  • determining the video frame corresponding to the flight information is specifically determining the number of video frames corresponding to each group of flight information according to the frame rate of the video recording and the preset frequency.
  • the flight information does not change much between adjacent video frames, so there is no need to bind different flight information to every frame; this still lets the user determine the location of the target area easily.
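The grouping above (number of video frames per flight-information record, from the frame rate and the preset sampling frequency) can be sketched as follows; the function name is a hypothetical illustration:

```python
def frames_per_flight_info(frame_rate_fps, info_frequency_hz):
    """Number of consecutive video frames bound to one set of flight
    information, given the recording frame rate and the preset frequency
    at which flight information is sampled."""
    if info_frequency_hz <= 0:
        raise ValueError("sampling frequency must be positive")
    return max(1, round(frame_rate_fps / info_frequency_hz))
```

For example, at 30 fps with flight information sampled at 5 Hz, each group of 6 consecutive frames shares one flight-information record.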
  • hardware computing resources can be saved, the efficiency of adding flight information can be improved, and the power of the aircraft can be saved at the same time.
  • the collection time point may be determined, and the corresponding flight information when the aircraft is collecting the video recording can be obtained at the collection time point.
  • determining the collection time point is specifically acquiring the positioning accuracy level, and determining the collection time point according to the positioning accuracy level, where different positioning accuracy levels correspond to different positioning distances, and corresponding relationships can be preset and stored.
  • the flight information corresponding to the collection time point is bound to each video frame in the video recording corresponding to the collection time point to the previous collection time point.
  • for example, if 20 video frames are captured between collection time point T4 and collection time point T5, the flight information acquired at T5 is bound to those 20 frames.
  • the positioning accuracy level selected by the user may be obtained; or, the default positioning accuracy level in the aircraft may be obtained.
  • in this way, not only can the hardware computing resources of the aircraft be saved and the efficiency of adding flight information be improved, but the user experience can also be improved.
  • the flight speed of the aircraft may also be acquired, and the acquisition time point is determined according to the positioning distance corresponding to the positioning accuracy level and the flight speed.
  • the collection interval can be obtained by dividing the positioning distance by the flight speed.
  • in this way, the flight information is correlated with the flight speed, and more accurate flight information is added to the video recording so that the location of the target area can be accurately determined.
  • subtitle information is used to bind the flight information to the corresponding video frame of the video recording.
  • steps to add flight information include the following:
  • S203a: Convert the flight information into subtitle information.
  • S203b: Bind the subtitle information to the corresponding video frames of the video recording.
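The two steps above can be sketched in code. This is an illustrative conversion into the common SRT subtitle format; the patent does not name a subtitle format, and the field names here are assumptions mirroring the on-screen labels described earlier:

```python
def to_srt_entry(index, start_s, end_s, info):
    """Render one flight-information record as an SRT subtitle entry.
    `info` keys (aperture, shutter, iso, lon, lat, alt) are illustrative."""
    def ts(t):
        # SRT timestamp: HH:MM:SS,mmm
        h, rem = divmod(int(t), 3600)
        m, s = divmod(rem, 60)
        ms = int(round((t - int(t)) * 1000))
        return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"
    line = (f"F/{info['aperture']} SS {info['shutter']} ISO {info['iso']} "
            f"GPS ({info['lon']:.6f}, {info['lat']:.6f}, {info['alt']:.1f}m)")
    return f"{index}\n{ts(start_s)} --> {ts(end_s)}\n{line}\n"
```

A player that supports SRT subtitles could then overlay each entry on the frames it is bound to.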
  • the specific position in the video frame is a position set in advance for the convenience of viewing; for example, it can be set below, above, to the left, or to the right of the video picture. In FIG. 3 it is set at the bottom of the picture.
  • the flight information is displayed in a specific position of the video screen to facilitate the user to view, thereby improving the accuracy and efficiency of determining the location of the target area.
  • alternatively, a display area can first be determined, and the subtitle information is then displayed in that display area.
  • the key target is the inspection object, such as cables, dams, buildings, or iron towers.
  • Image recognition technology can be used to first determine the target area corresponding to the key target in the video image, and then select an area for displaying subtitle information as the display area in an area far from the target area.
  • determining the display area of the subtitle information in the corresponding video frame of the video recording is specifically: determining a plurality of candidate areas for adding the subtitle information in the corresponding video frame; calculating the pixel-energy sum of each candidate area; and determining the display area of the subtitle information from the plurality of candidate areas according to the pixel energy.
  • the corresponding video frame of the video recording can be divided into several areas. As shown in FIG. 6, the video frame is divided into 9 areas in a nine-square grid, namely area 1 through area 9, and these nine areas are taken as the candidate areas. Of course, other division methods can also be used, which are not limited here.
  • the display area of the subtitle information is determined from the plurality of candidate areas according to the pixel energy, for example by selecting the candidate area with the smallest pixel-energy sum as the display area.
  • the video picture may also include other objects.
  • alternatively, the pixel energy of each candidate area can be calculated from the absolute value of the pixel difference between the pixels in the candidate area and the key target, and the candidate area with the largest such pixel energy is selected as the display area.
  • in this way, an area unrelated to the inspection object can be quickly determined as the display area, preventing the subtitle information from blocking the inspection object, so that the user can clearly observe the inspection object and find the target area.
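A minimal sketch of the nine-grid candidate selection described above, using the first variant (smallest pixel energy wins). Taking "pixel energy" as the raw intensity sum of a grayscale frame is one plausible reading of the text and is an assumption here; a gradient- or difference-based energy would slot into the same loop:

```python
def pick_subtitle_region(frame, rows=3, cols=3):
    """Split a grayscale frame (2-D list of pixel intensities) into a
    rows x cols grid of candidate areas, sum each area's pixel energy,
    and return the (row, col) of the lowest-energy area."""
    h, w = len(frame), len(frame[0])
    best, best_energy = None, None
    for r in range(rows):
        for c in range(cols):
            y0, y1 = r * h // rows, (r + 1) * h // rows
            x0, x1 = c * w // cols, (c + 1) * w // cols
            energy = sum(frame[y][x]
                         for y in range(y0, y1) for x in range(x0, x1))
            if best_energy is None or energy < best_energy:
                best, best_energy = (r, c), energy
    return best
```

A dark, featureless corner of the frame would be chosen, keeping the subtitles away from the inspection object.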
  • the subtitle information is specifically displayed in the display area in white.
  • the caption information in the dashed box in FIG. 3 is white.
  • the subtitle information may also be displayed in the display area in an inverse (contrasting) color.
  • specifically, the color value of the display area is calculated from the pixel values of the display area, the subtitle color value is determined from that display-area color value, and the subtitle information is displayed in the display area in the color corresponding to the subtitle color value.
  • the color corresponding to the subtitle color value and the color of the display area are mutually contrasting colors.
  • the contrasting colors can be, for example, colors that are symmetrical to each other on the color circle chart, so that the user can clearly see the subtitle information.
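A minimal sketch of the inverse-color rule, assuming RGB pixels and taking "inverse" to mean the 8-bit complement of the area's average color (both choices are illustrative, not specified by the patent):

```python
def average_color(pixels):
    """Average (R, G, B) of the display area's pixels."""
    n = len(pixels)
    r = sum(p[0] for p in pixels) // n
    g = sum(p[1] for p in pixels) // n
    b = sum(p[2] for p in pixels) // n
    return (r, g, b)

def subtitle_color(area_pixels):
    """Complement of the area's average color, giving a contrasting
    subtitle color for the display area."""
    r, g, b = average_color(area_pixels)
    return (255 - r, 255 - g, 255 - b)
```

A dark display area thus yields near-white subtitles, and vice versa.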
  • the video recording needs to be sent to the receiving end, so that the receiving end can display the video recording.
  • wireless image transmission technology can be used to encode and compress the video recording bound with flight information; the encoded and compressed video recording is sent to the receiving end, and after receiving it the receiving end decodes the video recording and displays it.
  • wireless image transmission technology can prevent stuttering during video transmission and playback.
  • the encoding method may adopt inter-frame encoding or intra-frame encoding, and of course, other encoding methods may also be adopted, for example, a mixed encoding method of inter-frame encoding and intra-frame encoding.
  • a decoding method corresponding to the encoding method is adopted to decode the encoded video.
  • each macroblock includes multiple pixels, for example 16×16 pixels as a macroblock.
  • one frame of video image can be encoded into one or more slices, and each slice contains an integer number of macroblocks.
  • each slice has at least one macroblock and at most contains the macroblocks of the entire image.
  • a slice group is a subset of several macroblocks in a coded image, including one or several slices.
  • the aircraft and the receiving end encode and decode the video images through an encoder and a decoder, respectively.
  • each frame of video image and the corresponding flight information may be unbound, packaged separately, and linked by a corresponding relationship.
  • the corresponding relationship is established by assigning the same label, that is, the data packet of the flight information is identified by the identification number of the video frame; the video recording and the flight information are then sent separately.
  • the decoder determines whether the partial data output for the current frame of the decoded video recording is greater than the preset macroblock data; if so, a data packet of flight information is loaded, a timing signal is generated and sent to the display device, and the display device displays the current frame, the current frame including the flight information.
  • because this decoding and display method does not use a fixed-frequency refresh display, it does not cause the display to freeze or stutter, thereby improving the smoothness of video playback, so that the location of the target area can be determined quickly.
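The separate-transmission scheme above (flight-info packets labeled with frame identification numbers, plus a macroblock-count display trigger) might look like the following sketch; the packet fields `frame_id`/`id` and the 50% threshold are illustrative assumptions, not part of the disclosed method.

```python
def pair_frames_with_flight_info(frames, info_packets):
    """Re-associate separately transmitted streams: each flight-info
    packet carries the identification number of the video frame it was
    unbound from, so pairing is a dictionary lookup by frame id.
    Frames with no matching packet are paired with None."""
    by_id = {pkt["frame_id"]: pkt for pkt in info_packets}
    return [(frame, by_id.get(frame["id"])) for frame in frames]

def ready_to_display(decoded_macroblocks, total_macroblocks,
                     threshold_ratio=0.5):
    """Illustrative check for the 'partial output data greater than the
    preset macroblock data' condition that triggers loading the
    flight-info packet and generating the display timing signal."""
    return decoded_macroblocks > threshold_ratio * total_macroblocks
```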
  • the above embodiment collects video recordings of the inspection area during the aircraft inspection; and obtains the flight information corresponding to the aircraft during the video recording.
  • the flight information includes position information and gimbal attitude information;
  • the flight information is bound to the corresponding video frame of the video recording, so that when watching the video the user can obtain the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information, thereby achieving fast and accurate positioning of the target area.
  • FIG. 7 is a schematic block diagram of an aircraft provided by an embodiment of the present application.
  • the aircraft 400 includes a camera 410, a processor 411, and a memory 412.
  • the processor 411, the memory 412, and the camera 410 are connected by a bus, such as an I2C (Inter-integrated Circuit) bus.
  • the aircraft 400 also includes a body, and a photographing device 410 is connected to the body to capture images.
  • the photographing device includes a pan/tilt and a camera mounted on the pan/tilt.
  • the photographing angle of the camera, that is, the gimbal attitude information, can be adjusted by adjusting the pan/tilt.
  • the processor 411 may be a micro-controller unit (MCU), a central processing unit (Central Processing Unit, CPU), a digital signal processor (Digital Signal Processor, DSP), or the like.
  • the memory 412 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
  • the processor is used to run a computer program stored in a memory, and implement the following steps when executing the computer program:
  • the flight information is bound to the corresponding video frame of the video recording, so as to obtain the position of the target area in the video frame according to the position information in the flight information and the PTZ attitude information.
  • the processor implements the following steps when implementing the acquisition of the flight information corresponding to the aircraft during the acquisition of the video recording:
  • when the processor implements the binding of the flight information with the corresponding video frame of the video recording, it specifically implements:
  • the flight information corresponding to the collection time point is bound to each video frame of the video recording between the collection time point and the previous collection time point.
  • when the processor implements the binding of the flight information with the corresponding video frame of the video recording, it specifically implements:
  • when the processor implements the binding of the subtitle information with the corresponding video frame of the video recording, it specifically implements:
  • the subtitle information is displayed in the display area.
  • before displaying the subtitle information in the display area, the processor further implements:
  • the displaying the subtitle information in the display area includes: displaying the subtitle information in the display area in a color corresponding to the subtitle color value.
  • after the processor implements the binding of the flight information with the corresponding video frame of the video recording, it further implements:
  • the position information includes position information of the aircraft; or, the position information includes position information of the aircraft, position information of a take-off point, and relative position information of the aircraft and the take-off point.
  • when the processor implements the acquisition of the position of the target area in the video frame according to the position information in the flight information and the pan/tilt attitude information, it specifically implements:
  • the position of the target area is determined according to the position information of the aircraft, the attitude information of the pan-tilt and the position point information.
  • the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
  • FIG. 8 is a schematic block diagram of a receiving end according to an embodiment of the present application.
  • the receiving end 500 includes a display device 510, a processor 511, and a memory 512.
  • the processor 511 and the memory 512 are connected by a bus, such as an I2C (Inter-integrated Circuit) bus.
  • the processor 511 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (Digital Signal Processor, DSP), or the like.
  • the memory 512 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, or a removable hard disk.
  • the receiving end 500 may be a display, and the display includes an LED display, an LCD display, or an OLED display, etc.
  • the processor is used to run a computer program stored in a memory, and implement the following steps when executing the computer program:
  • the video frame of the video recording includes flight information, the flight information including position information and pan/tilt attitude information; the position of the target area in the video frame is obtained according to the position information and pan/tilt attitude information in the flight information.
  • the location information includes location information of the aircraft; or, the location information includes location information of the aircraft, position information of a take-off point, and relative position information of the aircraft and the take-off point.
  • when the processor implements the acquisition of the position of the target area in the video frame according to the position information in the flight information and the pan/tilt attitude information, it specifically implements:
  • the position of the target area is determined according to the position information of the aircraft, the attitude information of the pan-tilt and the position point information.
  • the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
  • the video frame of the video recording includes flight information, including: the flight information is displayed in the video frame as subtitle information; the subtitle information is displayed at a specific position of the video frame, and the specific position includes below, above, to the left of, or to the right of the video picture of the video frame.
  • An embodiment of the present application also provides a flight system.
  • the flight system includes an aircraft and a receiving end, and the receiving end includes a display device.
  • the aircraft may be the aircraft illustrated in FIG. 7; the receiving end may be the receiving end illustrated in FIG. 8.
  • the aircraft is used to collect video recordings of the inspection area during inspection
  • the aircraft is used to obtain corresponding flight information when the video recording is collected, and the flight information includes position information and gimbal attitude information;
  • the aircraft is used to bind the flight information to the corresponding video frame of the video recording, and send the video recording with the flight information added to the receiving end;
  • the receiving end is configured to display, through the display device, the video recording and the flight information in the video frames of the video recording, and to obtain the position of the target area in the video frame according to the position information and the pan/tilt attitude information in the flight information.
  • the aircraft is used to obtain corresponding flight information when the video recording is collected, including:
  • the use of the aircraft to bind the flight information to the corresponding video frame of the video recording includes:
  • the flight information corresponding to the collection time point is bound to each video frame of the video recording between the collection time point and the previous collection time point.
  • the use of the aircraft to bind the flight information to the corresponding video frame of the video recording includes:
  • the aircraft being used to bind the subtitle information to the corresponding video frame of the video recording includes:
  • the subtitle information is displayed in the display area.
  • before displaying the subtitle information in the display area, the method further includes:
  • the displaying the subtitle information in the display area includes: displaying the subtitle information in the display area in a color corresponding to the subtitle color value.
  • the position information includes position information of the aircraft; or, the position information includes position information of the aircraft, position information of a take-off point, and relative position information of the aircraft and the take-off point.
  • the acquiring the position of the target area in the video frame according to the position information in the flight information and the PTZ attitude information includes:
  • the position of the target area is determined according to the position information of the aircraft, the attitude information of the pan-tilt and the position point information.
  • the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
  • the embodiments of the present application also provide a computer-readable storage medium that stores a computer program; the computer program includes program instructions, and a processor executes the program instructions to implement the steps of the inspection method provided in the foregoing embodiments.
  • the computer-readable storage medium may be the internal storage unit of the aircraft described in any of the foregoing embodiments, such as the hard disk or memory of the aircraft.
  • the computer-readable storage medium may also be an external storage device of the aircraft, such as a plug-in hard disk equipped on the aircraft, a Smart Media Card (SMC), a Secure Digital (SD) card, a Flash Card, etc.


Abstract

An aircraft-based inspection method, device, and storage medium. The method comprises: during aircraft inspection, capturing a video recording of the inspection area; acquiring flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information; and binding the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.

Description

Aircraft-based inspection method, device, and storage medium
Copyright notice
The disclosure of this patent document contains material that is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the reproduction by anyone of this patent document or this patent disclosure as it appears in the official records and files of the Patent and Trademark Office.
Technical field
This application relates to the technical field of aircraft-based inspection, and in particular to an aircraft-based inspection method, device, and storage medium.
Background
At present, there are many inspection applications on the market. For example, power towers, base stations, bridges, and buildings all require regular inspection to ensure their safety and normal operation, and aircraft are also needed for inspection and rescue after disasters such as fires and earthquakes. Combining aircraft technology with photogrammetry can provide an efficient inspection mode. However, during inspection an aircraft only records video of the inspection target; when the video of an accident point is played back, the accident location and related information can only be judged manually, which results in low efficiency and consumes a great deal of manpower.
Summary
In view of this, this application provides an aircraft-based inspection method, device, and storage medium that can quickly locate the position of a target area.
In a first aspect, this application provides an aircraft-based inspection method, the inspection method comprising:
during aircraft inspection, capturing a video recording of the inspection area;
acquiring flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information;
binding the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
In a second aspect, this application further provides an aircraft, the aircraft including a body, a photographing device, a memory, and a processor;
the photographing device is connected to the body to capture images; the photographing device includes a gimbal and a camera mounted on the gimbal, and the shooting angle of the camera can be adjusted by adjusting the gimbal;
the memory is used to store a computer program;
the processor is used to execute the computer program and, when executing the computer program, to implement the following steps:
during aircraft inspection, capturing a video recording of the inspection area;
acquiring flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information;
binding the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
In a third aspect, this application further provides a receiving end, the receiving end including a display device;
the receiving end displays, through the display device, the video recording and the flight information bound to the video recording, the flight information including position information and gimbal attitude information; and obtains the position of a target area in a video frame according to the position information and gimbal attitude information in the flight information.
In a fourth aspect, this application further provides a flight system, the flight system including an aircraft and a receiving end, the receiving end including a display device;
the aircraft is used to capture a video recording of the inspection area during inspection;
the aircraft is used to acquire flight information corresponding to the capture of the video recording, the flight information including position information and gimbal attitude information;
the aircraft is used to bind the flight information to the corresponding video frames of the video recording, and to send the video recording with the flight information added to the receiving end;
the receiving end is used to display, through the display device, the video recording and the flight information in the video frames of the video recording, and to obtain the position of a target area in a video frame according to the position information and gimbal attitude information in the flight information.
In a fifth aspect, this application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, causes the processor to implement the above inspection method.
Embodiments of this application provide an aircraft-based inspection method, device, and storage medium: during aircraft inspection a video recording of the inspection area is captured; the flight information corresponding to the aircraft while the video recording is captured is acquired, the flight information including position information and gimbal attitude information; and the flight information is bound to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information. Fast and accurate positioning of the target area is thereby achieved, improving the efficiency of target-area confirmation.
It should be understood that the above general description and the following detailed description are merely exemplary and explanatory, and do not limit this application.
Brief description of the drawings
To explain the technical solutions of the embodiments of this application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are some embodiments of this application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
FIG. 1 is a schematic block diagram of a flight system provided by an embodiment of this application;
FIG. 2 is a schematic flowchart of the steps of an aircraft-based inspection method provided by an embodiment of this application;
FIG. 3 is a schematic diagram of an application scenario of the aircraft-based inspection method provided by an embodiment of this application;
FIG. 4 is a schematic diagram of the effect of determining the position of a target area provided by an embodiment of this application;
FIG. 5 is a schematic flowchart of the steps of adding flight information provided by an embodiment of this application;
FIG. 6 is a schematic diagram of the effect of determining a subtitle display area provided by an embodiment of this application;
FIG. 7 is a schematic block diagram of an aircraft provided by an embodiment of this application;
FIG. 8 is a schematic block diagram of a receiving end of a flight system provided by an embodiment of this application.
Detailed description
The technical solutions in the embodiments of this application will be described clearly and completely below with reference to the drawings in the embodiments of this application. Obviously, the described embodiments are some rather than all of the embodiments of this application. Based on the embodiments of this application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of this application.
The flowcharts shown in the drawings are merely illustrative; they need not include all contents and operations/steps, nor be executed in the described order. For example, some operations/steps may be decomposed, combined, or partially merged, so the actual execution order may change according to the actual situation.
Some implementations of this application are described in detail below with reference to the drawings. In the absence of conflict, the following embodiments and the features in the embodiments may be combined with one another.
Referring to FIG. 1, FIG. 1 is a schematic block diagram of a flight system provided by an embodiment of this application. As shown in FIG. 1, the flight system 100 may include an aircraft 110 and a receiving end 120, the receiving end 120 being communicatively connected to the aircraft 110. The aircraft 110 is used for cruise inspection; the receiving end 120 may serve as a control device that controls the flight and shooting of the aircraft 110 and receives the video shot by the aircraft 110 during inspection.
The aircraft 110 may be a rotorcraft, for example a single-rotor, dual-rotor, tri-rotor, quad-rotor, hexa-rotor, octo-rotor, ten-rotor, or twelve-rotor aircraft. Of course, the aircraft may also be another type of unmanned aerial vehicle or movable device, and the embodiments of this application are not limited thereto.
The receiving end 120 includes a display device for displaying the video captured by the aircraft 110 for the user to watch.
Exemplarily, the receiving end 120 may be located at the ground end of the flight system 100 and may communicate wirelessly with the aircraft 110 to receive the video captured by the aircraft 110 and display the received video in real time. Of course, the receiving end 120 is also used to remotely control the aircraft 110.
The receiving end 120 may be a remote controller or a terminal installed with an application program that controls the aircraft 110.
The terminal may be a mobile phone, a tablet computer, a laptop computer, a desktop computer, a personal digital assistant, a wearable device, or the like.
The receiving end 120 receives flight control instructions input by the user, for example controlling the aircraft 110 through input devices such as a joystick, a dial wheel, keys, or buttons on a remote controller, or through a user interface (UI) on a terminal.
Exemplarily, the receiving end 120 may be a display device that includes only a display apparatus, such as a monitor or a projector, for reading and displaying the video captured by the aircraft 110.
The aircraft 110 includes a flight controller 111, a photographing device 112, and a positioning device 113.
The flight controller 111 is used to control the flight of the aircraft 110; it is the control module of the aircraft 110. It can be understood that the flight controller 111 may control the aircraft 110 according to pre-programmed instructions, or may control the aircraft 110 in response to one or more control instructions from the receiving end.
The photographing device 112 is mounted below the aircraft 110 and includes a camera 1121 and a gimbal 1122. Of course, it may also be an integrated gimbal camera. The photographing device 112 is communicatively connected to the flight controller 111 and captures images under the control of the flight controller 111.
The positioning device 113 is mounted on the aircraft 110 and is used to measure the position information of the aircraft in real time.
The positioning device 113 may include, but is not limited to, a GPS positioning device, a BeiDou positioning device, or a real-time kinematic (RTK) carrier-phase differential positioning device (RTK positioning device for short).
RTK carrier-phase differential technology is a differential method that processes the carrier-phase observations of two measuring stations in real time: the carrier phase collected by the base station is sent to the user receiver, and coordinates are solved by computing differences. RTK carrier-phase differential technology adopts a dynamic real-time carrier-phase differential method and can obtain centimeter-level positioning accuracy in the field in real time, without requiring post-processing to achieve centimeter-level accuracy; using an RTK positioning device, the positioning information of the aircraft can be detected precisely.
Specifically, the positioning device 113, under the control of the flight controller 111, collects the current position information of the aircraft in real time. The position information may include longitude information and latitude information.
The aircraft 110 also includes an inertial measurement unit, which is used to measure the flight speed and flight attitude information of the aircraft, among other things.
The gimbal 1122 includes an electronic speed controller (ESC for short) and motors. The flight controller 111 can control the movement of the gimbal through the ESC and motors. Of course, the gimbal may also include a controller that controls the ESC and motors to control the movement of the gimbal.
The motors are three-axis motors, namely a pitch-axis motor, a roll-axis motor, and a yaw-axis motor, used to change the gimbal attitude during shooting.
It can be understood that the gimbal 1122 may be independent of the aircraft or may be a part of the aircraft. It can be understood that the motors may be DC motors or AC motors. In addition, the motors may be brushless motors or brushed motors, and the embodiments of this application are not limited thereto.
It can be understood that the photographing device 112 may also be arranged at another suitable position of the aircraft, for example the nose of the aircraft, and the embodiments of this application are not limited thereto.
In one embodiment, the aircraft 110 further includes a distance measuring device 114. The distance measuring device 114 is mounted on the aircraft and is used to measure the distance between the aircraft and a key target object, or to measure the flight distance or flight altitude of the aircraft.
The distance measuring device 114 includes at least one of the following: a time-of-flight (TOF) ranging device, radar, an ultrasonic detection device, a laser detection device, and the like.
The aircraft 110 sends the captured video to the receiving end 120, specifically using wireless image transmission technology.
The aircraft 110 is used to capture video, compress and encode the captured video, and send the compressed and encoded video to the receiving end 120; the receiving end 120 decodes and displays the encoded video. Encoding the video improves the smoothness and speed of video transmission.
The video may be encoded using inter-frame coding or intra-frame coding; of course, other coding methods may also be used, for example a hybrid of inter-frame and intra-frame coding. Correspondingly, a decoding method corresponding to the encoding method is used to decode the encoded video.
Of course, the aircraft 110 may also store the captured video in its memory or memory card, so that it can be sent or copied to the receiving end 120 for display.
It can be understood that the above naming of the components of the flight system is only for identification purposes and does not thereby limit the embodiments of this application.
The inspection method provided by the embodiments of this application is described in detail below based on the flight system, the aircraft in the flight system, and the receiving end in the flight system.
Referring to FIG. 2, FIG. 2 is a schematic flowchart of the steps of an aircraft-based inspection method provided by an embodiment of this application. The aircraft-based inspection method is specifically applied to the flight controller of the aircraft, i.e. executed by the flight controller 111 of FIG. 1; of course, it may also be implemented by another control device carried on the aircraft, and the embodiments of this application are not limited thereto.
For convenience of detailed explanation of the embodiments of this application, the following description takes the control device being the flight controller as an example.
Specifically, as shown in FIG. 2, the aircraft-based inspection method includes steps S201 to S203.
S201: During aircraft inspection, capture a video recording of the inspection area.
S202: Acquire flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information.
S203: Bind the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
When an aircraft performs flight inspection, for example regular inspection of power towers, base stations, bridges, and buildings, a video recording of each inspection area along the inspection route needs to be captured by the photographing device. The video recording includes multiple video frames.
The captured video recording is used to determine whether an abnormal incident has occurred in the target area, for example a disaster such as a fire or an earthquake, or a destroyed base station, a broken bridge, a collapsed building, and so on.
While the video recording is captured, the flight information corresponding to the aircraft at the time of capture must also be acquired. The flight information includes at least position information and gimbal attitude information.
The position information includes at least the position information of the aircraft, i.e. the real-time position information corresponding to the aircraft when shooting the video, including longitude, latitude, and altitude information.
The gimbal attitude information is the gimbal attitude parameter information of the aircraft when shooting the video, including pitch angle, roll angle, and yaw angle.
In some embodiments, the position information further includes take-off point position information and/or the relative position information between the aircraft and the take-off point; when the position information of the aircraft is inaccurate, the position of the target area can be determined from the take-off point position information and the relative position information between the aircraft and the take-off point.
In some embodiments, the flight information further includes flight parameters, the flight parameters including at least one of flight speed and aircraft attitude parameters, so that the user can learn the detailed parameters of the target area from the flight parameters for precise analysis, positioning, and recording.
In some embodiments, the flight information further includes camera parameters, the camera parameters including at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
Exemplarily, after a target area is determined to exist, the video frame can be image-processed according to the shooting parameters to identify the details of the target area more accurately. For example, if the target area is the area where a bridge is located and the bridge has broken, the degree of breakage of the bridge can be determined from the details of the target area.
The flight information is bound to the corresponding video frames of the video recording so that the position of the target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
Exemplarily, as shown in FIG. 3, during inspection the aircraft 110 captures a video recording of the inspection area through the photographing device 112; acquires the flight information corresponding to the capture of the video recording, the flight information including position information and gimbal attitude information; binds the flight information to the corresponding video frames of the video recording; and sends the video recording and the flight information to the receiving end 120. The sending may be in real time, or the video recording may be saved and later uploaded to the receiving end. The receiving end 120 may be a mobile phone, a tablet computer, another display device, or a controller equipped with a display device. The receiving end 120 displays the video recording and displays the flight information in the video frames of the video recording, so that the position of the target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
For example, the current frame 130 displayed by the receiving end 120 includes flight information, specifically the information in the dashed box in the current frame 130.
In FIG. 3, F denotes the camera aperture size, SS the shutter time, ISO the exposure intensity, EV the exposure value, GPS/RTK the aircraft position information (longitude, latitude, altitude), HOME the aircraft take-off point position information (longitude, latitude, altitude), D the horizontal distance of the aircraft from the take-off point, H the relative height of the aircraft with respect to the take-off point, H.S the horizontal forward flight speed of the aircraft, V.S the vertical ascent speed of the aircraft, F.PRY the gimbal attitude parameters of the aircraft, and G.PRY the flight attitude parameters of the aircraft.
Thus, when the user watches the video recording through the receiving end, the corresponding flight information, including position information and gimbal attitude information, can be observed in the played video. If a target area is found while watching the video, the position of the target area can be determined from the flight information and gimbal attitude information.
In one embodiment, the position of the target area may be determined from the position information of the aircraft; for example, the position of the aircraft may be taken as the position of the target area. In another embodiment, the position of the target area may be determined from the position information and the gimbal attitude information. For example, the position information includes the position information of the aircraft; the bearing relationship between the position of the target area and the position of the aircraft is determined from the gimbal attitude information, and the position of the target area is then judged from the position information of the aircraft and that bearing relationship. Compared with directly taking the position of the aircraft as the position of the target area, this embodiment of the invention can obtain more accurate target-area position information, which is of great help to inspection operations; it is also simple to implement, requires no complex calculation, and its result is intuitive. For the aircraft, no additional detection device needs to be added, which saves cost while improving work efficiency.
In another embodiment, the position information further includes take-off point position information and/or the relative position information between the aircraft and the take-off point. When the position information of the aircraft is inaccurate (for example, when the positioning accuracy of the positioning device deviates significantly), the position information of the aircraft can be determined from the take-off point position information and the relative position information between the aircraft and the take-off point, and the position of the target area is then determined in combination with the gimbal attitude information.
In another embodiment, a more accurate target-area position can be calculated from the position information and the gimbal attitude information. The specific process is as follows: obtain the position point information corresponding to the target area determined by the user in the displayed video frame; determine the position of the target area according to the position information, the gimbal attitude information, and the position point information.
For example, as shown in FIG. 3, while the user watches the video recording that includes flight information, if the user determines that a target area may exist, the user clicks on the current frame to pause the video and selects the target area point, and the position point information corresponding to the target area in the current frame can then be obtained from the user's selection operation.
The position information and gimbal attitude information are extracted from the current frame. The position includes the position information of the aircraft and the distance between the aircraft and the target area; it may also include take-off point position information and/or the relative position information between the aircraft and the take-off point. The gimbal attitude information includes the pitch angle, roll angle, and yaw angle of the gimbal. The position of the target area is calculated from the position information, the gimbal attitude information, and the position point information.
Specifically, as shown in FIG. 4, if the user selects the tower 131 in the image of the current frame 130, the position of the tower 131 is determined to be the position of the target area. The selection may be clicking a button option or drawing a box in the image, which is not limited in this embodiment of the invention. Assuming the heading of the aircraft 110 is along the arrow direction, the gimbal attitude angle may be used as the angle information between the aircraft and the target area point, or the angle information between the aircraft and the target area point may be estimated from the gimbal attitude angle as needed, depending on the actual situation. Using the position of the aircraft, the angle information between the aircraft and the target area point, and the distance between the aircraft and the target area, the position of the target area can be calculated through trigonometric formulas.
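A minimal sketch of the trigonometric computation described above, in a local east/north coordinate frame. The angle conventions (yaw measured clockwise from north, pitch from the horizontal, negative looking down) and the use of a measured slant distance are illustrative assumptions, not the method's fixed definitions.

```python
import math

def target_position(drone_xy, gimbal_yaw_deg, gimbal_pitch_deg, distance):
    """Estimate the ground position of a selected target point from the
    aircraft position, the gimbal attitude, and the slant distance
    between the aircraft and the target.

    drone_xy: (east, north) position of the aircraft in metres.
    gimbal_yaw_deg: gimbal yaw, clockwise from north (assumed).
    gimbal_pitch_deg: gimbal pitch from horizontal, negative = down.
    distance: slant distance aircraft -> target, in metres.
    Returns the (east, north) position of the target.
    """
    yaw = math.radians(gimbal_yaw_deg)
    pitch = math.radians(gimbal_pitch_deg)
    horizontal = distance * math.cos(pitch)   # ground-plane projection
    east = drone_xy[0] + horizontal * math.sin(yaw)
    north = drone_xy[1] + horizontal * math.cos(yaw)
    return east, north
```

In practice the slant distance could come from the distance measuring device 114 (TOF, radar, ultrasonic, or laser) mentioned earlier.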
In some embodiments, after the position of the target area is calculated from the position information and the gimbal attitude information, the calculated position information of the target area may also be displayed, so that the user can determine the position information of the target area more accurately. It can thus be seen that when the user watches the video and identifies a target area, the position of the target area can be determined accurately from the position information and gimbal attitude information in the flight information added to the video recording, so that the target area can be dealt with in time and inspection efficiency is improved.
In some embodiments, to save hardware computing resources and improve the efficiency of displaying flight information in the video recording, the flight information corresponding to the aircraft while the video recording is captured may be acquired at a preset frequency. The preset frequency is related to the capture frequency of the video frames: the preset frequency is lower than the video-frame capture frequency, and the two are in an integer-multiple relationship, for example f = 10F, where F is the preset frequency and f is the video-frame capture frequency, meaning that flight information is collected once for every 10 captured video frames.
Correspondingly, binding the flight information to the corresponding video frames of the video recording specifically means: determining the video frames corresponding to the flight information, and binding the flight information to its corresponding video frames.
Determining the video frames corresponding to the flight information specifically means determining the number of video frames corresponding to each set of flight information according to the frame rate of the video recording and the preset frequency.
For example, 10 video frames are determined as one group corresponding to the same flight information, and that flight information is bound to the 10 video frames in the group.
Because video is captured quickly, the flight information changes little between adjacent video frames, so it is not necessary to add different flight information to every frame, and the user can still conveniently determine the position of the target area. This saves hardware computing resources, improves the efficiency of adding flight information, and also saves the aircraft's battery power.
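The frame-grouping scheme above (one flight-info sample shared by every group of frames, e.g. 10 frames per sample when f = 10F) can be sketched as follows; the list-based representation is an illustrative assumption.

```python
def bind_flight_info(frames, flight_samples, frames_per_sample):
    """Bind one flight-info sample to each consecutive group of
    frames_per_sample video frames (e.g. 10 when the frame capture
    rate is ten times the flight-info sampling rate). If the video
    outruns the samples, the last sample is reused."""
    bound = []
    for i, frame in enumerate(frames):
        sample_idx = min(i // frames_per_sample, len(flight_samples) - 1)
        bound.append((frame, flight_samples[sample_idx]))
    return bound
```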
In other embodiments, to save hardware computing resources and improve the efficiency of displaying flight information in the video recording, collection time points may be determined, and the flight information corresponding to the aircraft while the video recording is captured is acquired at the collection time points.
Determining the collection time points specifically means obtaining a positioning accuracy level and determining the collection time points according to the positioning accuracy level, where different positioning accuracy levels correspond to different positioning distances; the correspondence may be preset and stored.
Correspondingly, the flight information corresponding to a collection time point is bound to every video frame of the video recording between that collection time point and the previous collection time point.
For example, if 20 video frames were shot between collection time point T5 and collection time point T4, the flight information acquired at collection time point T5 is bound to those 20 frames.
Obtaining the positioning accuracy level may mean obtaining a positioning accuracy level selected by the user, or obtaining the default positioning accuracy level in the aircraft. This not only saves the aircraft's hardware computing resources and improves the efficiency of adding flight information, but also improves the user experience.
In some embodiments, determining the collection time points may further involve obtaining the flight speed of the aircraft and determining the collection time points according to the positioning distance corresponding to the positioning accuracy level and the flight speed, i.e. dividing the positioning distance by the flight speed yields the collection interval. The flight information is thereby correlated with the flight speed, so that more accurate flight information is added to the video recording and the position of the target area can be located accurately.
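The collection interval derived from the positioning-accuracy distance and the flight speed can be sketched as follows; the units (metres and metres per second, giving seconds between samples) are an assumption.

```python
def collection_interval(accuracy_distance_m, flight_speed_mps):
    """Derive the flight-info collection interval by dividing the
    positioning distance associated with the chosen accuracy level by
    the current flight speed. Returns seconds between samples."""
    if flight_speed_mps <= 0:
        raise ValueError("flight speed must be positive")
    return accuracy_distance_m / flight_speed_mps
```

For example, a 10 m accuracy distance at 5 m/s yields one flight-info sample every 2 seconds; a faster aircraft samples more often, keeping the per-sample position error bounded.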
In some embodiments, the flight information is bound to the corresponding video frames of the video recording in the form of subtitle information. As shown in FIG. 5, the step of adding flight information specifically includes the following:
S203a: Convert the flight information into subtitle information. S203b: Bind the subtitle information to the corresponding video frames of the video recording.
The specific position in the video frame is a position preset for the user's viewing convenience; for example, the specific position may be set below, above, to the left of, or to the right of the video picture. In FIG. 3, for instance, the specific position is set below the video picture.
Displaying the flight information at a specific position of the video picture in the form of subtitle information makes it convenient for the user to view, thereby improving the accuracy and efficiency of determining the position of the target area.
In some embodiments, so that the user can clearly view the inspection object, a display area is determined and the subtitle information is displayed in that display area.
Specifically, the display area of the subtitle information is determined in the corresponding video frame of the video recording, the display area being away from the key target object in the video frame; and the subtitle information is displayed in the display area.
The key target object is the inspection object, for example a cable, a dam, a building, or a tower. Image recognition technology can be used to first determine the region corresponding to the key target object in the video image, and then an area away from that region is selected as the display area for the subtitle information.
In some embodiments, determining the display area of the subtitle information in the corresponding video frame of the video recording specifically means: determining, in the corresponding video frame of the video recording, multiple candidate areas for adding the subtitle information; calculating the pixel energy sum of each candidate area; and determining the display area of the subtitle information from the multiple candidate areas according to the pixel energy sums.
The corresponding video frame of the video recording may be divided into several areas. As shown in FIG. 6, the video picture is divided into nine areas in a three-by-three grid, namely area 1, area 2, area 3, area 4, area 5, area 6, area 7, area 8, and area 9, and these nine areas are taken as candidate areas. Of course, other division methods may also be used, which are not limited here.
The pixel energy sum of each of the nine candidate areas is calculated, the pixel energy sum being computed from the pixel values within the area. The display area of the subtitle information is determined from the multiple candidate areas according to the pixel energy sums; for example, the candidate area with the smallest pixel energy sum is selected as the display area of the subtitle information.
Since area 1 and area 9 of the nine candidate areas are not covered by the tower, it can be determined that area 1 and area 9 have the smallest pixel energy sums, and one of area 1 and area 9 can be used as the display area. Of course, in practical applications the video picture may also include other objects; when calculating the pixel energy sum of each candidate area, the absolute values of the differences between the pixels in the candidate area and the pixels corresponding to the key target object may be used, and the candidate area with the largest pixel energy sum is then selected as the display area of the subtitle information.
Through area division and pixel-energy-sum calculation, an area unrelated to the inspection object can be quickly determined as the display area, thereby preventing the subtitle information from blocking the inspection object, so that the user can clearly observe the inspection object and find the target area.
In some embodiments, so that the user can clearly see the flight information in the video picture, the subtitle information is displayed in the display area in white; for example, the subtitle information in the dashed box in FIG. 3 is white.
In some embodiments, so that the user can see the flight information in the video picture even more clearly, the subtitle information may be displayed in the display area using a contrasting (inverse) color.
Specifically, the color value of the display area is calculated from the pixel values of the display area, the subtitle color value of the subtitle information is determined from the color value of the display area, and the subtitle information is displayed in the display area in the color corresponding to the subtitle color value.
The color corresponding to the subtitle color value and the color corresponding to the color value of the display area are mutually contrasting colors.
Mutually contrasting colors may, for example, be colors symmetrical to each other on the color wheel, so that the user can clearly see the subtitle information.
In some embodiments, after the flight information is bound to the corresponding video frames of the video recording, the video recording also needs to be sent to the receiving end, so that the receiving end displays the video recording.
Specifically, wireless image transmission technology may be used to encode and compress the video recording bound with the flight information; the encoded and compressed video recording is sent to the receiving end, and after receiving it the receiving end decodes the video recording and displays it. Using wireless image transmission technology prevents stuttering during video transmission and playback.
The encoding method may be inter-frame coding or intra-frame coding; of course, other coding methods may also be used, for example a hybrid of inter-frame and intra-frame coding. Correspondingly, a decoding method corresponding to the encoding method is used to decode the encoded video.
When video data is encoded, the video images in the video data need to be divided into macroblocks, each macroblock including multiple pixels, for example 16×16 pixels per macroblock.
Of course, one frame of video image can be encoded into one or more slices, each slice containing an integer number of macroblocks: each slice contains at least one macroblock and at most the macroblocks of the entire image.
A slice group is a subset of several macroblocks in a coded image and contains one or several slices.
The aircraft and the receiving end encode and decode the video images through an encoder and a decoder, respectively.
In some embodiments, to prevent video stuttering and improve playback smoothness, each video frame and its corresponding flight information may be unbound, packaged separately, and linked by a corresponding relationship, for example by assigning the same label, i.e. the flight-information data packet is identified by the identification number of the video frame; the video recording and the flight information are then sent separately.
Specifically, it is determined whether the partial data output for the current frame of the video recording decoded by the decoder exceeds a preset amount of macroblock data; if so, the flight-information data packet is loaded, a timing signal is generated and sent to the display device, and the display device displays the current frame, the current frame including the flight information.
Since this decoding and display method does not use a fixed-frequency refresh display, it does not cause display stuttering or unsmoothness, thereby improving the smoothness of video playback so that the position of the target area can be determined quickly.
The above embodiments capture a video recording of the inspection area during aircraft inspection; acquire the flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information; and bind the flight information to the corresponding video frames of the video recording, so that when watching the video the user can obtain the position of the target area in a video frame according to the position information and gimbal attitude information in the flight information. Fast and accurate positioning of the target area is thereby achieved.
Referring to FIG. 7, FIG. 7 is a schematic block diagram of an aircraft provided by an embodiment of this application. The aircraft 400 includes a photographing device 410, a processor 411, and a memory 412; the processor 411, the memory 412, and the photographing device 410 are connected by a bus, for example an I2C (Inter-Integrated Circuit) bus.
The aircraft 400 also includes a body; the photographing device 410 is connected to the body to capture images. The photographing device includes a gimbal and a camera mounted on the gimbal; the shooting angle of the camera, i.e. the gimbal attitude information, can be adjusted by adjusting the gimbal.
Specifically, the processor 411 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
Specifically, the memory 412 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
The processor is used to run a computer program stored in the memory and to implement the following steps when executing the computer program:
during aircraft inspection, capturing a video recording of the inspection area;
acquiring flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information;
binding the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
In some embodiments, when acquiring the flight information corresponding to the aircraft while the video recording is captured, the processor implements the following steps:
obtaining a positioning accuracy level, and determining collection time points according to the positioning accuracy level;
acquiring, at the collection time points, the flight information corresponding to the aircraft while the video recording is captured.
In some embodiments, when binding the flight information to the corresponding video frames of the video recording, the processor specifically implements:
binding the flight information corresponding to a collection time point to every video frame of the video recording between that collection time point and the previous collection time point.
In some embodiments, when binding the flight information to the corresponding video frames of the video recording, the processor specifically implements:
converting the flight information into subtitle information; and
binding the subtitle information to the corresponding video frames of the video recording.
In some embodiments, when binding the subtitle information to the corresponding video frames of the video recording, the processor specifically implements:
determining the display area of the subtitle information in the corresponding video frame of the video recording, the display area being away from the key target object in the video frame; and
displaying the subtitle information in the display area.
In some embodiments, before displaying the subtitle information in the display area, the processor further implements:
calculating the color value of the display area from the pixel values of the display area, and determining the subtitle color value of the subtitle information from the color value of the display area, where the color corresponding to the subtitle color value and the color corresponding to the color value of the display area are mutually contrasting colors;
displaying the subtitle information in the display area includes: displaying the subtitle information in the display area in the color corresponding to the subtitle color value.
In some embodiments, after binding the flight information to the corresponding video frames of the video recording, the processor further implements:
sending the video recording to the receiving end, so that the receiving end displays the video recording.
In some embodiments, the position information includes the position information of the aircraft; or the position information includes the position information of the aircraft, take-off point position information, and the relative position information between the aircraft and the take-off point.
In some embodiments, when obtaining the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information, the processor specifically implements:
obtaining the position point information corresponding to the target area determined by the user in the displayed video frame;
determining the position of the target area according to the position information of the aircraft, the gimbal attitude information, and the position point information.
In some embodiments, the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
Referring to FIG. 8, FIG. 8 is a schematic block diagram of a receiving end provided by an embodiment of this application. The receiving end 500 includes a display device 510, a processor 511, and a memory 512; the processor 511 and the memory 512 are connected by a bus, for example an I2C (Inter-Integrated Circuit) bus.
Specifically, the processor 511 may be a micro-controller unit (MCU), a central processing unit (CPU), a digital signal processor (DSP), or the like.
Specifically, the memory 512 may be a Flash chip, a read-only memory (ROM), a magnetic disk, an optical disk, a USB flash drive, a removable hard disk, or the like.
It can be understood that the receiving end 500 may be a display, including an LED display, an LCD display, an OLED display, or the like.
The processor is used to run a computer program stored in the memory and to implement the following steps when executing the computer program:
displaying the video recording, the video frames of the video recording including flight information, the flight information including position information and gimbal attitude information; and obtaining the position of a target area in a video frame according to the position information and gimbal attitude information in the flight information.
In some embodiments, the position information includes the position information of the aircraft; or the position information includes the position information of the aircraft, take-off point position information, and the relative position information between the aircraft and the take-off point.
In some embodiments, when obtaining the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information, the processor specifically implements:
obtaining the position point information corresponding to the target area determined by the user in the displayed video frame;
determining the position of the target area according to the position information of the aircraft, the gimbal attitude information, and the position point information.
In some embodiments, the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
In some embodiments, the video frames of the video recording including flight information means: the flight information is displayed in the video frame as subtitle information; the subtitle information is displayed at a specific position of the video frame, the specific position including below, above, to the left of, or to the right of the video picture of the video frame.
An embodiment of this application further provides a flight system, the flight system including an aircraft and a receiving end, the receiving end including a display device.
It should be noted that the aircraft may be the aircraft illustrated in FIG. 7, and the receiving end may be the receiving end illustrated in FIG. 8.
The aircraft is used to capture a video recording of the inspection area during inspection;
the aircraft is used to acquire flight information corresponding to the capture of the video recording, the flight information including position information and gimbal attitude information;
the aircraft is used to bind the flight information to the corresponding video frames of the video recording, and to send the video recording with the flight information added to the receiving end;
the receiving end is used to display, through the display device, the video recording and the flight information in the video frames of the video recording, and to obtain the position of a target area in a video frame according to the position information and gimbal attitude information in the flight information.
In some embodiments, the aircraft being used to acquire flight information corresponding to the capture of the video recording includes:
obtaining a positioning accuracy level, and determining collection time points according to the positioning accuracy level;
acquiring, at the collection time points, the flight information corresponding to the aircraft while the video recording is captured.
In some embodiments, the aircraft being used to bind the flight information to the corresponding video frames of the video recording includes:
binding the flight information corresponding to a collection time point to every video frame of the video recording between that collection time point and the previous collection time point.
In some embodiments, the aircraft being used to bind the flight information to the corresponding video frames of the video recording includes:
converting the flight information into subtitle information; and
binding the subtitle information to the corresponding video frames of the video recording.
In some embodiments, the aircraft being used to bind the subtitle information to the corresponding video frames of the video recording includes:
determining the display area of the subtitle information in the corresponding video frame of the video recording, the display area being away from the key target object in the video frame; and
displaying the subtitle information in the display area.
In some embodiments, before displaying the subtitle information in the display area, the method further includes:
calculating the color value of the display area from the pixel values of the display area, and determining the subtitle color value of the subtitle information from the color value of the display area, where the color corresponding to the subtitle color value and the color corresponding to the color value of the display area are mutually contrasting colors;
displaying the subtitle information in the display area includes: displaying the subtitle information in the display area in the color corresponding to the subtitle color value.
In some embodiments, the position information includes the position information of the aircraft; or the position information includes the position information of the aircraft, take-off point position information, and the relative position information between the aircraft and the take-off point.
In some embodiments, obtaining the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information includes:
obtaining the position point information corresponding to the target area determined by the user in the displayed video frame;
determining the position of the target area according to the position information of the aircraft, the gimbal attitude information, and the position point information.
In some embodiments, the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
An embodiment of this application further provides a computer-readable storage medium storing a computer program, the computer program including program instructions; a processor executes the program instructions to implement the steps of the inspection method provided in the above embodiments.
The computer-readable storage medium may be an internal storage unit of the aircraft described in any of the foregoing embodiments, for example the hard disk or memory of the aircraft. The computer-readable storage medium may also be an external storage device of the aircraft, for example a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card equipped on the aircraft.
The above are only specific implementations of this application, but the protection scope of this application is not limited thereto. Any person skilled in the art can easily think of various equivalent modifications or replacements within the technical scope disclosed by this application, and these modifications or replacements shall all fall within the protection scope of this application. Therefore, the protection scope of this application shall be subject to the protection scope of the claims.

Claims (35)

  1. An aircraft-based inspection method, characterized by comprising:
    during aircraft inspection, capturing a video recording of the inspection area;
    acquiring flight information corresponding to the aircraft while the video recording is captured, the flight information including position information and gimbal attitude information;
    binding the flight information to the corresponding video frames of the video recording, so that the position of a target area in a video frame can be obtained according to the position information and gimbal attitude information in the flight information.
  2. The inspection method according to claim 1, characterized in that acquiring the flight information corresponding to the aircraft while the video recording is captured comprises:
    obtaining a positioning accuracy level, and determining collection time points according to the positioning accuracy level;
    acquiring, at the collection time points, the flight information corresponding to the aircraft while the video recording is captured.
  3. The inspection method according to claim 2, characterized in that binding the flight information to the corresponding video frames of the video recording comprises:
    binding the flight information corresponding to a collection time point to every video frame of the video recording between that collection time point and the previous collection time point.
  4. The inspection method according to any one of claims 1 to 3, characterized in that binding the flight information to the corresponding video frames of the video recording comprises:
    converting the flight information into subtitle information; and
    binding the subtitle information to the corresponding video frames of the video recording.
  5. The inspection method according to claim 4, characterized in that binding the subtitle information to the corresponding video frames of the video recording comprises:
    determining the display area of the subtitle information in the corresponding video frame of the video recording, the display area being away from the key target object in the video frame; and
    displaying the subtitle information in the display area.
  6. The inspection method according to claim 5, characterized in that before displaying the subtitle information in the display area, the method further comprises:
    calculating the color value of the display area from the pixel values of the display area, and determining the subtitle color value of the subtitle information from the color value of the display area, where the color corresponding to the subtitle color value and the color corresponding to the color value of the display area are mutually contrasting colors;
    displaying the subtitle information in the display area comprises: displaying the subtitle information in the display area in the color corresponding to the subtitle color value.
  7. The inspection method according to any one of claims 1 to 5, characterized in that after binding the flight information to the corresponding video frames of the video recording, the method further comprises:
    sending the video recording to a receiving end, so that the receiving end displays the video recording.
  8. The inspection method according to claim 1, characterized in that the position information includes the position information of the aircraft; or the position information includes the position information of the aircraft, take-off point position information, and the relative position information between the aircraft and the take-off point.
  9. The inspection method according to claim 8, characterized in that obtaining the position of the target area in the video frame according to the position information and gimbal attitude information in the flight information comprises:
    obtaining the position point information corresponding to the target area determined by the user in the displayed video frame;
    determining the position of the target area according to the position information of the aircraft, the gimbal attitude information, and the position point information.
  10. The inspection method according to claim 1, characterized in that the flight information further includes flight parameters and/or camera parameters; the flight parameters include at least one of flight speed and aircraft attitude parameters; the camera parameters include at least one of camera aperture size, shutter time, exposure intensity, and exposure value.
  11. 一种飞行器,其特征在于,包括机体、拍摄装置以及存储器和处理器;
    所述拍摄装置连接于所述机体以拍摄图像,所述拍摄装置包括云台和安装在所述云台上的相机,可通过调整所述云台调整所述相机的拍摄角度;
    所述存储器用于存储计算机程序;
    所述处理器,用于执行所述计算机程序并在执行所述计算机程序时,实现如下步骤:
    在飞行器巡检时,采集巡检区的视频录像;
    获取所述飞行器在采集所述视频录像时对应的飞行信息,所述飞行信息包括位置信息和云台姿态信息;
    将所述飞行信息与所述视频录像的相应视频帧绑定,以便根据所述飞行信息中的位置信息和云台姿态信息获取所述视频帧中的目标区域的位置。
  12. 根据权利要求11所述的飞行器,其特征在于,所述处理器在实现所述获取所述飞行器在采集所述视频录像时对应的飞行信息时,具体实现:
    获取定位精度等级,根据定位精度等级确定采集时间点;
    在所述采集时间点获取所述飞行器在采集所述视频录像时对应的飞行信息。
  13. 根据权利要求12所述的飞行器,其特征在于,所述处理器在实现所述将所述飞行信息与所述视频录像的相应视频帧绑定时,具体实现:
    将所述采集时间点对应的飞行信息,与所述采集时间点至上一个采集时间点对应的视频录像中的每一帧视频帧相绑定。
  14. 根据权利要求11至13任一项所述的飞行器,其特征在于,所述处理器在实现所述将所述飞行信息与所述视频录像的相应视频帧绑定时,具体实现:
    将所述飞行信息转换成字幕信息;以及
    将所述字幕信息与所述视频录像的相应视频帧相绑定。
  15. 根据权利要求14所述的飞行器,其特征在于,所述处理器在实现所述将所述字幕信息与所述视频录像的相应视频帧相绑定时,具体实现:
    在所述视频录像的相应视频帧中确定所述字幕信息的显示区域,所述显示区域远离所述视频帧中的关键目标物;以及
    将所述字幕信息显示在所述显示区域中。
  16. 根据权利要求15所述的飞行器,其特征在于,所述处理器在实现所述将所述字幕信息显示在所述显示区域中之前,还实现:
    根据所述显示区域的像素值计算所述显示区域的颜色值,根据所述显示区域的颜色值确定所述字幕信息的字幕颜色值,其中所述字幕颜色值和所述显示区域的颜色值对应的颜色互为反差色;
    所述将所述字幕信息显示在所述显示区域中,包括:将所述字幕信息采用所述字幕颜色值对应的颜色显示至所述显示区域中。
  17. 根据权利要求11至15任一项所述的飞行器,其特征在于,所述处理器在实现所述将所述飞行信息与所述视频录像的相应视频帧绑定之后,还实现:
    发送所述视频录像至接收端,以使所述接收端显示所述视频录像。
  18. 根据权利要求11所述的飞行器,其特征在于,所述位置信息包括所述飞行器的位置信息;或者,所述位置信息包括所述飞行器的位置信息、起飞点 位置信息以及所述飞行器与所述起飞点的相对位置信息。
  19. 根据权利要求18所述的飞行器,其特征在于,所述处理器在实现所述根据所述飞行信息中的位置信息和云台姿态信息获取所述视频帧中的目标区域的位置时,实现:
    获取用户在显示的视频帧中确定的目标区域对应的位置点信息;
    根据所述飞行器的位置信息、所述云台姿态信息和所述位置点信息确定所述目标区域的位置。
  20. 根据权利要求11所述的飞行器,其特征在于,所述飞行信息还包括飞行参数和/或摄像参数;所述飞行参数包括:飞行速度和飞行器姿态参数中的至少一项;所述摄像参数包括:相机光圈大小、快门时间、曝光强度和曝光值中的至少一项。
  21. 一种接收端,其特征在于,所述接收端包括显示装置;所述接收端通过所述显示装置显示所述视频录像和与所述视频录像绑定的飞行信息,所述飞行信息包括位置信息和云台姿态信息;根据所述飞行信息中的位置信息和云台姿态信息获取所述视频帧中的目标区域的位置。
  22. 根据权利要求21所述的接收端,其特征在于,所述位置信息包括所述飞行器的位置信息;或者,所述位置信息包括所述飞行器的位置信息、起飞点位置信息以及所述飞行器与所述起飞点的相对位置信息。
  23. 根据权利要求22所述的接收端,其特征在于,所述根据所述飞行信息中的位置信息和云台姿态信息获取所述视频帧中的目标区域的位置,包括:
    获取用户在显示的视频帧中确定目标区域对应的位置点信息;
    根据所述飞行器的位置信息、所述云台姿态信息和所述位置点信息确定所述目标区域的位置。
  24. 根据权利要求21所述的接收端,其特征在于,所述飞行信息还包括飞行参数和/或摄像参数;所述飞行参数包括:飞行速度和飞行器姿态参数中的至少一项;所述摄像参数包括:相机光圈大小、快门时间、曝光强度和曝光值中的至少一项。
  25. The receiving end according to any one of claims 21 to 24, wherein the video frames of the video recording include the flight information, which includes: the flight information is displayed in the video frames as subtitle information; and the subtitle information is displayed at a specific position of the video frame, the specific position including a position below, above, to the left of, or to the right of the video picture of the video frame.
  26. A flight system, characterized in that the flight system comprises an aircraft and a receiving end, the receiving end comprising a display device;
    the aircraft is configured to capture, during an inspection, a video recording of an inspection area;
    the aircraft is configured to acquire flight information corresponding to the capture of the video recording, the flight information comprising position information and gimbal attitude information;
    the aircraft is configured to bind the flight information to corresponding video frames of the video recording, and to send the video recording with the flight information added to the receiving end; and
    the receiving end is configured to display the video recording via the display device, to display the flight information in the video frames of the video recording, and to obtain the position of a target area in a video frame according to the position information and the gimbal attitude information in the flight information.
  27. The flight system according to claim 26, wherein the aircraft being configured to acquire the flight information corresponding to the capture of the video recording comprises:
    acquiring a positioning accuracy level and determining collection time points according to the positioning accuracy level; and
    acquiring, at each collection time point, the flight information of the aircraft corresponding to the capture of the video recording.
  28. The flight system according to claim 27, wherein the aircraft being configured to bind the flight information to the corresponding video frames of the video recording comprises:
    binding the flight information corresponding to a collection time point to every video frame of the video recording from the previous collection time point to that collection time point.
  29. The flight system according to any one of claims 26 to 28, wherein the aircraft being configured to bind the flight information to the corresponding video frames of the video recording comprises:
    converting the flight information into subtitle information; and
    binding the subtitle information to the corresponding video frames of the video recording.
  30. The flight system according to claim 29, wherein the aircraft being configured to bind the subtitle information to the corresponding video frames of the video recording comprises:
    determining, in the corresponding video frame of the video recording, a display region for the subtitle information, the display region being kept away from key targets in the video frame; and
    displaying the subtitle information in the display region.
  31. The flight system according to claim 30, wherein, before the subtitle information is displayed in the display region, the following is further performed:
    calculating a color value of the display region from the pixel values of the display region, and determining a subtitle color value for the subtitle information according to the color value of the display region, the color corresponding to the subtitle color value and the color corresponding to the color value of the display region being mutually contrasting colors;
    wherein displaying the subtitle information in the display region comprises: displaying the subtitle information in the display region in the color corresponding to the subtitle color value.
  32. The flight system according to claim 26, wherein the position information comprises position information of the aircraft; or the position information comprises position information of the aircraft, position information of a takeoff point, and relative position information of the aircraft with respect to the takeoff point.
  33. The flight system according to claim 32, wherein obtaining the position of the target area in the video frame according to the position information and the gimbal attitude information in the flight information comprises:
    acquiring position point information corresponding to a target area determined by a user in a displayed video frame; and
    determining the position of the target area according to the position information of the aircraft, the gimbal attitude information, and the position point information.
  34. The flight system according to claim 26, wherein the flight information further comprises flight parameters and/or camera parameters; the flight parameters comprise at least one of a flight speed and aircraft attitude parameters; and the camera parameters comprise at least one of a camera aperture size, a shutter time, an exposure intensity, and an exposure value.
  35. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to implement the inspection method according to any one of claims 1 to 10.
PCT/CN2019/103893 2019-08-31 2019-08-31 Aircraft-based inspection method, device and storage medium WO2021035756A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2019/103893 WO2021035756A1 (zh) 2019-08-31 2019-08-31 Aircraft-based inspection method, device and storage medium
CN201980040208.7A CN112313596A (zh) 2019-08-31 2019-08-31 Aircraft-based inspection method, device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2019/103893 WO2021035756A1 (zh) 2019-08-31 2019-08-31 Aircraft-based inspection method, device and storage medium

Publications (1)

Publication Number Publication Date
WO2021035756A1 true WO2021035756A1 (zh) 2021-03-04

Family

ID=74336507

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/103893 WO2021035756A1 (zh) 2019-08-31 2019-08-31 基于飞行器的巡检方法、设备及存储介质

Country Status (2)

Country Link
CN (1) CN112313596A (zh)
WO (1) WO2021035756A1 (zh)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113379908A (zh) * 2021-04-08 2021-09-10 贵州电网有限责任公司 System for building a three-dimensional GIS-VR real-scene line platform for automatic power equipment inspection

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101650866A (zh) * 2009-09-22 2010-02-17 华南理工大学 Fire detection system applied to unmanned aerial vehicles and fire detection method thereof
CN105790155A (zh) * 2016-04-08 2016-07-20 四川桑莱特智能电气设备股份有限公司 Differential-GPS-based autonomous power transmission line inspection system and method for unmanned aerial vehicles
CN109239725A (zh) * 2018-08-20 2019-01-18 广州极飞科技有限公司 Map surveying method and terminal based on a laser ranging device
CN110033103A (zh) * 2019-04-12 2019-07-19 合肥佳讯科技有限公司 Photovoltaic panel inspection system and inspection method

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102255259A (zh) * 2011-03-29 2011-11-23 山东鲁能智能技术有限公司 Power transmission line inspection device suitable for unmanned aerial vehicles
CN103942273B (zh) * 2014-03-27 2017-03-15 北京空间机电研究所 Aerial rapid-response dynamic monitoring system and dynamic monitoring method thereof
CN105100665B (zh) * 2015-08-21 2019-01-18 广州飞米电子科技有限公司 Method and apparatus for storing multimedia information collected by an aircraft
CN105698762B (zh) * 2016-01-15 2018-02-23 中国人民解放军国防科学技术大学 Rapid target positioning method based on observation points at different times along a single aircraft's track
CN108680143A (zh) * 2018-04-27 2018-10-19 南京拓威航空科技有限公司 Target positioning method and apparatus based on remote ranging, and unmanned aerial vehicle


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113720721A (zh) * 2021-08-16 2021-11-30 中国飞机强度研究所 Calibration fusion method for inner-cabin structure inspection in aircraft fatigue tests
CN113720721B (zh) * 2021-08-16 2024-05-03 中国飞机强度研究所 Calibration fusion method for inner-cabin structure inspection in aircraft fatigue tests
CN114785961A (zh) * 2022-06-21 2022-07-22 山东信通电子股份有限公司 Patrol route generation method, device and medium based on a gimbal camera
CN114785961B (zh) * 2022-06-21 2022-09-20 山东信通电子股份有限公司 Patrol route generation method, device and medium based on a gimbal camera
CN115455275A (zh) * 2022-11-08 2022-12-09 广东卓维网络有限公司 Video processing system integrating inspection equipment
CN116597327A (zh) * 2023-05-15 2023-08-15 岳阳市水利水电规划勘测设计院有限公司 Unmanned-aerial-vehicle-based hidden danger screening system for water conservancy facilities
CN116597327B (zh) * 2023-05-15 2024-04-12 岳阳市水利水电规划勘测设计院有限公司 Unmanned-aerial-vehicle-based hidden danger screening system for water conservancy facilities

Also Published As

Publication number Publication date
CN112313596A (zh) 2021-02-02

Similar Documents

Publication Publication Date Title
WO2021035756A1 (zh) Aircraft-based inspection method, device and storage medium
US11483518B2 (en) Real-time moving platform management system
CN108234927B (zh) Video tracking method and system
US11024052B2 (en) Stereo camera and height acquisition method thereof and height acquisition system
CN111436208B (zh) Planning method and apparatus for surveying and mapping sampling points, control terminal and storage medium
CN113345028B (zh) Method and device for determining target coordinate transformation information
KR101984778B1 (ko) Multi-party collaboration method for diagnosing facility exterior walls and apparatus for performing the same
CN105516604A (zh) Aerial video sharing method and system
US20190199992A1 (en) Information processing apparatus, method for controlling the same, and recording medium
CN113905211B (zh) Video patrol method and apparatus, electronic device and storage medium
CN111527375B (zh) Planning method and apparatus for surveying and mapping sampling points, control terminal and storage medium
WO2019085945A1 (zh) Detection device, detection system and detection method
CN113378605B (zh) Multi-source information fusion method and apparatus, electronic device and storage medium
CN110208742B (zh) BLS-based positioning system and positioning method usable indoors
US20230071355A1 (en) Image processing apparatus, image processing method, and program
CN114567742A (zh) Panoramic video transmission method, apparatus and storage medium
KR101674033B1 (ko) Three-dimensional map-based closed-circuit television image mapping system
CN111581322B (zh) Method, apparatus and device for displaying a region of interest of a video within a map window
CN111950420A (zh) Obstacle avoidance method, apparatus, device and storage medium
US20230237796A1 (en) Geo-spatial context for full-motion video
CN113572946B (zh) Image display method, apparatus, system and storage medium
CN116758157B (zh) Indoor three-dimensional space surveying and mapping method, system and storage medium for unmanned aerial vehicles
US20240087157A1 (en) Image processing method, recording medium, image processing apparatus, and image processing system
CN115439635B (zh) Method and device for presenting marker information of a target object
CN113919737A (zh) Data processing method and system for multifunctional comprehensive support of emergency firefighting and rescue advance operations

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19942775

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19942775

Country of ref document: EP

Kind code of ref document: A1