WO2021212499A1 - Target calibration method, apparatus and system, and remote control terminal of a movable platform


Info

Publication number
WO2021212499A1
WO2021212499A1 (PCT/CN2020/086793)
Authority
WO
WIPO (PCT)
Prior art keywords
target object
image transmission
transmission screen
lens
movable platform
Application number
PCT/CN2020/086793
Other languages
English (en)
Chinese (zh)
Inventor
温亚停
方馨月
陈晨
Original Assignee
深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Application filed by 深圳市大疆创新科技有限公司 (SZ DJI Technology Co., Ltd.)
Priority to PCT/CN2020/086793 (WO2021212499A1)
Priority to CN202080021661.6A (CN113597596A)
Publication of WO2021212499A1


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range

Definitions

  • This application relates to the field of interaction, and in particular to a target calibration method, device and system, and a remote control terminal of a movable platform.
  • This application provides a target calibration method, device and system, and a remote control terminal of a movable platform.
  • an embodiment of the present application provides a target calibration method, which is applicable to a remote control terminal, the remote control terminal communicates with a movable platform, the movable platform includes a first lens and a second lens, and the method includes:
  • determining second position information of the target object in the second image transmission picture acquired by the second lens.
  • an embodiment of the present application provides a target calibration device, the target calibration device is provided in a remote control terminal, the remote control terminal communicates with a movable platform, and the movable platform includes a first lens and a second lens,
  • the device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • determine second position information of the target object in the second image transmission picture acquired by the second lens.
  • an embodiment of the present application provides a remote control terminal, the remote control terminal communicates with a movable platform, the movable platform includes a first lens and a second lens, and the remote control terminal includes:
  • the target calibration device is supported by the main body;
  • the target calibration device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • determine second position information of the target object in the second image transmission picture acquired by the second lens.
  • the embodiments of the present application provide a target calibration system, including:
  • a remote control terminal including a main body and a target calibration device, the target calibration device being supported by the main body;
  • a movable platform communicating with the remote control terminal, the movable platform including a first lens and a second lens;
  • the target calibration device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • determine second position information of the target object in the second image transmission picture acquired by the second lens.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the following steps are implemented:
  • determining second position information of the target object in the second image transmission picture acquired by the second lens.
  • an embodiment of the present application provides a target calibration method, which is applicable to a remote control terminal that communicates with a movable platform on which a camera is mounted, and the method includes:
  • the target object is marked on the image transmission screen.
  • an embodiment of the present application provides a target calibration device, the target calibration device is provided in a remote control terminal, the remote control terminal communicates with a movable platform, the movable platform is equipped with a camera, and the device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • the target object is marked on the image transmission screen.
  • an embodiment of the present application provides a remote control terminal, the remote control terminal communicates with a movable platform, the movable platform is equipped with a photographing device, and the remote control terminal includes:
  • the target calibration device is supported by the main body;
  • the target calibration device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • the target object is marked on the image transmission screen.
  • an embodiment of the present application provides a target calibration system, including:
  • a remote control terminal comprising a main body and a target calibration device, the target calibration device being supported by the main body;
  • a movable platform that communicates with the remote control terminal, and a camera is mounted on the movable platform;
  • the target calibration device includes:
  • Storage device for storing program instructions
  • One or more processors call program instructions stored in the storage device, and when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations:
  • the target object is marked on the image transmission screen.
  • an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the following steps are implemented:
  • the target object is marked on the image transmission screen.
  • According to the technical solution provided by the embodiments of the present application, the second position information of the target object in the second image transmission screen can be obtained from the first position information of the target object in the first image transmission screen, the relative positional relationship between the first lens and the second lens, the shooting parameters of the first lens, and the shooting parameters of the second lens. The target object marked on the first image transmission screen can therefore be shared with the second image transmission screen based on the second position information, realizing the sharing of target calibration results between the image transmission screens acquired by multiple lenses, which is beneficial to reducing communication costs when multiple people collaboratively control the movable platform and improves the efficiency of coordinated operations.
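The summary above does not spell out how the second position information is computed. A minimal sketch under a pinhole-camera assumption — with the relative positional relationship expressed as a rotation and translation, the shooting parameters as intrinsic matrices, and all function and variable names hypothetical — might look like:

```python
import numpy as np

def share_target(pixel1, depth, K1, K2, R_21, t_21):
    """Map a target's pixel in lens 1 to its pixel in lens 2.

    pixel1 : (u, v) of the target in the first image transmission picture
    depth  : distance of the target along lens 1's optical axis, in metres
             (e.g. from the laser ranging device)
    K1, K2 : 3x3 intrinsic matrices (from the lenses' shooting parameters)
    R_21, t_21 : rotation / translation taking lens-1 coordinates into
                 lens-2 coordinates (the relative positional relationship)
    """
    u, v = pixel1
    # Back-project the pixel into a 3-D point in lens 1's camera frame.
    p1 = depth * np.linalg.inv(K1) @ np.array([u, v, 1.0])
    # Express the point in lens 2's camera frame.
    p2 = R_21 @ p1 + t_21
    # Project into lens 2's image to obtain the second position information.
    uvw = K2 @ p2
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

Note the sketch needs a depth estimate; in the application's terms this could come from the third position information (e.g. the laser-measured distance of the target relative to the movable platform).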
  • FIG. 1A is an application scenario diagram of a target calibration method in an embodiment of the present application;
  • FIG. 1B is a structural block diagram of a movable platform in an embodiment of the present application;
  • FIG. 2 is a schematic diagram of the method flow of the target calibration method in an embodiment of the present application;
  • FIG. 3A is a schematic diagram of an interactive interface displaying an image transmission screen in an embodiment of the present application;
  • FIG. 3B is a schematic diagram of an interactive interface displaying an image transmission screen under the intelligent tracking strategy in an embodiment of the present application;
  • FIG. 4A is a schematic diagram of an interactive interface displaying an image transmission screen under the dot positioning strategy in an embodiment of the present application;
  • FIG. 4B is a schematic diagram of an interactive interface displaying a map corresponding to the first image transmission screen in another embodiment of the present application;
  • FIG. 5 is a schematic diagram of an interactive interface displaying an image transmission screen under the laser ranging strategy in another embodiment of the present application;
  • FIG. 6 is a schematic diagram of a flight guidance compass in an embodiment of the present application;
  • FIG. 7 is a structural block diagram of a target calibration device in an embodiment of the present application;
  • FIG. 8 is a schematic diagram of the method flow of the target calibration method in another embodiment of the present application.
  • pilot 1 performs operations by viewing the image transmission screen obtained by lens 1
  • pilot 2 performs operations by viewing the image transmission screen obtained by lens 2.
  • Pilots usually calibrate the targets that need attention in the operation area. For example, if pilot 1 finds a large obstacle in the operation area through the image transmission screen obtained by lens 1, he can calibrate the large obstacle on the image transmission screen obtained by lens 1 to avoid a collision during the operation.
  • the present application can, based on the first position information of the target object in the first image transmission screen, the relative position relationship between the first lens and the second lens, the shooting parameters of the first lens, and the shooting parameters of the second lens, realize the sharing of target calibration results between the acquired image transmission screens, which is beneficial to reducing communication costs when multiple people collaborate to control the operation of the movable platform and improves the efficiency of coordinated operations.
  • the target object is a suspect to be arrested, and if the location of the suspect cannot be calibrated accurately, the arrest process will be greatly hindered.
  • the position information of the target object obtained from the movable platform includes the distance of the target object relative to the movable platform. Based on this distance, the target object can be quickly calibrated and positioned; when multiple people collaborate to control the movable platform, the partners in the coordinated operation can be quickly notified to reach the destination.
  • the target calibration method of the embodiment of the present application is applicable to a remote control terminal that communicates with a movable platform.
  • the remote control terminal of the embodiment of the present application may be a remote control of a movable platform, a smart terminal (such as a mobile phone, a tablet computer, etc.), or other terminal equipment capable of remotely controlling the movable platform.
  • the movable platform in the embodiment of the present application may be a drone, an unmanned vehicle or a ground mobile robot; of course, the movable platform in the embodiment of the present application may also be other types of movable platforms, such as a manned aircraft.
  • the movable platform includes a first lens and a second lens.
  • the movable platform includes a photographing device including multiple lenses, the first lens is one of the multiple lenses included in the photographing device, and the second lens includes the other lenses of the multiple lenses included in the photographing device.
  • the movable platform includes a plurality of photographing devices, the first lens is a lens of one of the photographing devices, and the second lens includes a lens of other photographing devices.
  • the multiple photographing devices include an FPV (First Person View) camera.
  • the camera is mounted on the body of the movable platform via a pan-tilt; in some embodiments, the camera is directly fixed on the body of the movable platform.
  • the photographing device includes dual lenses, such as a wide-angle lens and a zoom lens.
  • the first lens is one of a wide-angle lens and a zoom lens
  • the second lens is the other of a wide-angle lens and a zoom lens.
  • the photographing device includes three lenses, such as a wide-angle lens, a zoom lens, and an infrared lens.
  • the first lens is one of a zoom lens, a wide-angle lens, and an infrared lens
  • the second lens includes the other lenses among the zoom lens, the wide-angle lens, and the infrared lens.
  • the movable platform includes an FPV camera and a first imaging device
  • the first photographing device includes the above-mentioned dual-lens and/or three-lens photographing device.
  • the first lens is one of the lens of the FPV camera and the lens of the first photographing device
  • the second lens includes the other of the lens of the FPV camera and the lenses of the first photographing device.
  • the target calibration method in the embodiment of the present application may include S201 to S205.
  • the image transmission images of the first lens and the second lens are acquired through the remote control terminal, and the first image transmission image acquired by the first lens is displayed on the interactive interface of the remote control terminal.
  • the image transmission screen obtained by the first lens and the second lens will be transmitted back to the remote control terminal.
  • the identifier corresponding to the first lens and the identifier corresponding to the second lens are displayed on the interactive interface.
  • the user can select the first lens or the second lens by operating the identifier corresponding to each lens (such as by clicking, double-clicking, or long-pressing), so that one of the image transmission screens is displayed on the interactive interface.
  • the first lens is a zoom lens
  • the second lens includes an infrared lens, a wide-angle lens, and an FPV lens.
  • the zoom lens corresponds to the first identifier 11
  • the infrared lens corresponds to the second identifier 12
  • the wide-angle lens corresponds to the third identifier 13.
  • the lens of the FPV camera corresponds to the fourth identifier 14.
  • the user can click one of the first identifier 11, the second identifier 12, the third identifier 13, and the fourth identifier 14, so that the image transmission screen of the lens corresponding to the currently clicked identifier is displayed on the interactive interface.
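The identifier-based lens switching described above amounts to a dispatch from the clicked identifier to the corresponding feed. A hypothetical sketch (the identifier numbers follow the figures; the feed names and function are invented for illustration):

```python
# Hypothetical mapping from on-screen identifiers to image transmission feeds.
FEEDS = {
    11: "zoom lens feed",        # first identifier
    12: "infrared lens feed",    # second identifier
    13: "wide-angle lens feed",  # third identifier
    14: "FPV lens feed",         # fourth identifier
}

def on_identifier_clicked(identifier: int, current_feed: str) -> str:
    """Return the image transmission screen to display on the interactive
    interface; an unknown identifier leaves the current feed unchanged."""
    return FEEDS.get(identifier, current_feed)
```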
  • a first user instruction is acquired, where the first user instruction is used to instruct the target object to be calibrated in the first image transmission screen.
  • the first user instruction is also used to indicate a positioning strategy for positioning the target object to be calibrated (hereinafter referred to as the target object), and the remote control terminal obtains the position information of the target object from the movable platform according to the positioning strategy (i.e., the third position information described below) to calibrate the target object.
  • the remote control terminal obtains the third position information obtained by positioning the target object by the movable platform according to the positioning strategy in the first user instruction from the movable platform.
  • the third position information can be used to indicate the position of the target object in the world coordinate system, the position of the target object in the image coordinate system, or the position of the target object in another coordinate system.
  • the third position information includes at least one of the latitude, longitude and height of the target object; of course, the third position information may also include others, such as the distance of the target object relative to the movable platform.
  • the third position information is determined according to the position information of the movable platform and the distance of the target object relative to the movable platform.
  • the third position information includes relative position information and/or absolute position information of the target object. Both the relative position information and the absolute position information are determined according to the position information of the movable platform and the distance of the target object relative to the movable platform.
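How the target's position follows from the platform's own position and the measured distance is not detailed here. One common way to sketch it, assuming the gimbal's heading and pitch are known and using a flat-earth small-offset approximation — the parameter names and the approximation itself are assumptions, not from the application:

```python
import math

def locate_target(lat, lon, alt, heading_deg, pitch_deg, distance_m):
    """Estimate the target's latitude/longitude/height from the movable
    platform's position and the measured distance to the target.

    heading_deg : gimbal yaw, clockwise from true north
    pitch_deg   : gimbal pitch, negative when pointing downward
    """
    horiz = distance_m * math.cos(math.radians(pitch_deg))
    vert = distance_m * math.sin(math.radians(pitch_deg))
    north = horiz * math.cos(math.radians(heading_deg))
    east = horiz * math.sin(math.radians(heading_deg))
    # Metres-per-degree approximation near the platform's latitude.
    dlat = north / 111_320.0
    dlon = east / (111_320.0 * math.cos(math.radians(lat)))
    return lat + dlat, lon + dlon, alt + vert
```

The (north, east, vert) offset is the relative position of the target; the returned latitude/longitude/height is its absolute position — both derived, as the text says, from the platform's position and the distance to the target.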
  • the positioning strategy may include one or more.
  • the positioning strategy includes a smart tracking strategy (also called Smarttrack), a dot positioning strategy (also called PIN Point), and a laser ranging strategy (also called Range);
  • the positioning strategy can also include others. It should be noted that when there are multiple positioning strategies, each positioning strategy is independent of each other.
  • the positioning strategy includes an intelligent tracking strategy, a dot positioning strategy, and a laser ranging strategy.
  • the movable platform can locate the target object through a navigation and/or laser ranging device.
  • Navigation can be GPS or other.
  • the third location information obtained from the movable platform may be different for different positioning strategies.
  • for the smart tracking strategy, the third location information may include the latitude, longitude and/or height of the target object, or the position of the target object in the image coordinate system;
  • for the dot positioning strategy, the third position information may include the latitude, longitude and/or height of the target object and/or the distance of the target object relative to the movable platform; for the laser ranging strategy, the third position information may include the latitude, longitude and/or height of the target object and/or the distance of the target object relative to the movable platform.
  • the interactive interface displays the identification corresponding to each positioning strategy, and the user can operate (such as clicking, double-clicking, or long-pressing, etc.) the identification corresponding to each positioning strategy to select the corresponding positioning strategy.
  • the interactive interface displays a fifth indicator 21 corresponding to the smart tracking strategy, a sixth indicator 22 corresponding to the dot positioning strategy, and a seventh indicator 23 corresponding to the laser ranging strategy.
  • the user can operate one of the fifth identifier 21, the sixth identifier 22, and the seventh identifier 23, so that the positioning strategy corresponding to the currently operated identifier is in the triggered state, and the remote control terminal obtains the third position information from the movable platform based on the positioning strategy currently in the triggered state.
  • the positioning strategy can default to one of an intelligent tracking strategy, a dot positioning strategy, and a laser ranging strategy.
  • when the positioning strategy needs to be switched, the corresponding identifier can be operated.
  • the first user instruction may be generated when the user clicks or box-selects an object in the first image transmission screen, or the first user instruction may also be generated when the user clicks the identifier corresponding to the positioning strategy.
  • the first user instruction is generated when the user clicks or box selects an object in the first image transmission screen, which is suitable for use scenarios where the positioning strategy is the default positioning strategy.
  • the positioning strategy is the default intelligent tracking strategy.
  • the objects in the first image transmission screen (such as people, cars, boats, etc.) can be identified through the control device of the movable platform, and the recognition results of the control device (the first object 31, the second object 32, and the third object 33 in FIG. 3A) will be marked on the interactive interface.
  • the first user instruction is generated when the user clicks the object corresponding to the recognition result.
  • a click includes one of a single click and a double click. Exemplarily, please refer to FIG. 3A again.
  • the target object is the second object 32.
  • the first user instruction is generated when the user clicks the identifier corresponding to the positioning strategy displayed on the first image transmission screen.
  • the first user instruction is generated when the user clicks the identifier corresponding to the dot positioning strategy displayed on the first image transmission screen, or when the user clicks the identifier corresponding to the laser ranging strategy displayed on the first image transmission screen.
  • the first location information is the pixel coordinates of the target object on the first image transmission screen.
  • the first position information of the target object on the first image transmission screen can be obtained by using an existing image recognition algorithm, which is not described in detail in the embodiments of the present application.
  • the positioning strategy is an intelligent tracking strategy.
  • the first indication information is sent to the movable platform.
  • the first indication information is used to instruct the movable platform to adjust the posture of the first lens, so that the target object in the first image transmission screen moves to the first specific position of the first image transmission screen.
  • the first location information is location information of the first specific location.
  • the first specific position includes the center position of the first image transmission screen; of course, the first specific position may also be another position of the first image transmission screen.
  • the movable platform controls the shooting device through the control device to adjust the posture of the shooting device, thereby adjusting the posture of the first lens.
  • the control device may adjust at least one of the yaw attitude, pitch attitude, and roll attitude of the camera to achieve the purpose of adjusting the attitude of the camera.
  • the camera is mounted on a movable platform through a pan-tilt, and the attitude of the camera can be adjusted by adjusting the attitude of the pan-tilt.
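The attitude adjustment that moves the target to the first specific position can be sketched as a pinhole-geometry computation of yaw/pitch increments from the target's pixel offset. The application does not specify a control law, so the formula and all names below are illustrative only:

```python
import math

def centering_command(target_px, frame_size, focal_px):
    """Yaw/pitch increments (degrees) that move the target toward the
    frame centre.

    target_px : (u, v) of the target in the first image transmission screen
    frame_size: (width, height) of the screen in pixels
    focal_px  : lens focal length expressed in pixels
    """
    u, v = target_px
    cx, cy = frame_size[0] / 2.0, frame_size[1] / 2.0
    yaw = math.degrees(math.atan2(u - cx, focal_px))     # + means pan right
    pitch = -math.degrees(math.atan2(v - cy, focal_px))  # + means tilt up
    return yaw, pitch
```

In practice the increments would be applied to the pan-tilt (or the platform's own yaw/pitch/roll, as the text notes) and recomputed each frame until the target sits at the first specific position.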
  • the first indication information is also used to instruct the movable platform to adjust the shooting parameters of the first lens, so that the size of the target object in the first image transmission screen is a preset size, which is convenient for the user to observe the target object.
  • the preset size can be set as required.
  • the first indication information is also used to instruct the movable platform to adjust the zoom factor of the first lens, so that the size of the target object in the first image transmission screen is a preset size.
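A possible way to pick the zoom factor so the target reaches the preset size, assuming the on-screen size of the target scales linearly with optical zoom (an assumption for illustration, not stated in the application):

```python
def zoom_for_preset_size(current_size_px, preset_size_px, current_zoom):
    """Zoom factor that scales the target from its current on-screen size
    to the preset size, under a linear optical-zoom model."""
    return current_zoom * (preset_size_px / current_size_px)
```

For example, a target spanning 50 px at 1x zoom would need roughly 4x zoom to reach a 200 px preset size under this model.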
  • Exemplarily, please refer to FIGS. 3A and 3B in combination.
  • the user selects (by clicking, box selection, or another operation method) the second object 32 in the first image transmission screen shown in FIG. 3A as the target object, and the remote control terminal sends the first indication information to the movable platform.
  • the movable platform adjusts the posture of the first lens through the control device so that the second object 32 moves to the center of the first image transmission screen, and the control device adjusts the zoom factor of the first lens so that the second object 32 is enlarged by a preset factor and the size of the target object in the first image transmission screen is the preset size, as shown in FIG. 3B.
  • the user only needs to perform the target selection operation, and the user can observe the target object clearly, intuitively, and in real time without unnecessary operations.
  • the intelligent tracking strategy can obtain the real-time location information of the target object. Therefore, the intelligent tracking strategy is more suitable for calibrating and tracking a moving target object. If the target object is a suspect, the location of the suspect can be obtained in real time through the intelligent tracking strategy to assist the arrest operation.
  • the third location information corresponding to the smart tracking strategy may include the latitude, longitude and height of the target object, where the latitude and longitude of the target object are determined according to the latitude, longitude and height of the movable platform, and the height of the target object is determined according to the height of the movable platform. That is, the relative position of the target object can be obtained from the third position information corresponding to the smart tracking strategy.
  • the positioning strategy is a dot positioning strategy
  • the target object is an object at a second specific position on the first image transmission screen.
  • the second specific position includes the center position of the first image transmission screen; of course, the second specific position may also be other positions of the first image transmission screen.
  • Before the user locates the target object through the dot positioning strategy, if the object serving as the target object in the first image transmission screen is not currently at the second specific position, the object serving as the target object needs to be moved to the second specific position; if the object serving as the target object is currently at the second specific position, the user can directly operate the identifier corresponding to the dot positioning strategy to select the dot positioning strategy as the positioning strategy used to calibrate the target.
  • the strategy for moving the object that is the target object in the first image transmission screen to the second specific position may include, but is not limited to, the following two strategies:
  • the second indication information is used to instruct the user to adjust the pose of the first lens through the control device of the movable platform, so that the target object in the first image transmission screen is at the second specific position.
  • the control device and the remote control terminal may be the same device or different devices; for example, the control device is a remote controller of the movable platform, and the remote control terminal is a mobile phone.
  • the user can adjust at least one of the poses of the movable platform and the shooting device by operating the control device to adjust the pose of the first lens.
  • the purpose of moving the target object in the first image transmission screen to the second specific position of the first image transmission screen can be achieved by manually controlling the joystick of the remote control to adjust the pose of the aircraft.
  • Before the first user instruction is acquired, a second user instruction is acquired, and third indication information is sent to the movable platform according to the second user instruction.
  • the second user instruction is used to instruct the object in the first image transmission screen to be moved to the second specific position.
  • the second user instruction is generated when the user double-clicks, in the first image transmission screen, the object to be moved to the second specific position.
  • the third indication information is used to instruct the movable platform to automatically adjust the pose of the first lens, so that the object to be moved to the second specific position moves to the second specific position. It should be noted that, in this application, various instruction information can be output in the form of a dialog box, or various instruction information can be output in other ways.
  • the second object 32 needs to be calibrated, that is, the second object 32 is the target object, and the second object 32 is not currently in the center position of the first image transmission screen.
  • after the second object 32 is moved to the center position, the user can click the identifier corresponding to the dot positioning strategy to dot the center position of the first image transmission screen, and the third position information of the second object 32 is recorded.
  • an augmented reality (AR) dynamic or static dot method may be used to dot the center position of the first image transmission screen.
  • the third position information is determined according to the position information of the movable platform and the distance between the target object and the movable platform.
  • the relative position and absolute position of the target object can be determined, that is, the third position information may include at least one of the relative position information and the absolute position information of the target object. It should be noted that the distance between the target object and the movable platform can be detected by the laser ranging device of the movable platform.
  • the positioning strategy is a laser ranging strategy
  • the target object is an object at a fourth specific position on the first image transmission screen.
  • the fourth specific position includes the center position of the first image transmission screen; of course, the fourth specific position may also be other positions of the first image transmission screen.
  • a “calibrating” mark is displayed in the center of the first image transmission screen, indicating that the laser is currently aligned with the object at the center of the first image transmission screen to obtain the distance of the target object relative to the movable platform and the latitude, longitude and height of the target object.
  • the target object is marked on the first image transmission screen according to the first location information.
  • the annotation may include the icon and the third position information of the target object, but it is not limited to this.
• when the target object is marked, the icon is displayed on the target object, that is, the pixels of the icon cover at least part of the pixels of the target object.
  • the icon may include at least one of graphics, numbers, and symbols, and the icon may also include others.
• the target calibration method may further include: when the icon on the target object is in a triggered state, displaying the third position information on the icon. Further, in some embodiments, the target calibration method may further include: hiding the third position information when the icon on the target object is in a non-triggered state. It should be noted that when the icon is operated, the icon is in a triggered state; when the icon is not operated, the icon is in a non-triggered state. The icon being operated may include the icon being clicked, for example single-clicked, double-clicked, or long-pressed.
  • the labeled icons corresponding to different target objects of the same positioning strategy are different to distinguish the target objects; in some embodiments, the labeled icons corresponding to different target objects of the same positioning strategy are the same.
  • the labeled icons corresponding to different positioning strategies are different to distinguish the positioning strategies of the target object; in some embodiments, the labeled icons corresponding to different positioning strategies are the same.
  • the different icons may include at least one of different icon size, icon shape, and icon color.
  • the same icon means that the icon size, icon shape, and icon color are all the same.
• the smart tracking strategy is used to mark the target object 1 and the target object 2 in the first image transmission screen, the marked icon corresponding to the target object 1 is a green circle, and the marked icon corresponding to the target object 2 is a green square frame.
  • the marked icons corresponding to different target objects of the same positioning strategy are the same, and the marked icons corresponding to different positioning strategies are different.
• the marked icon corresponding to the smart tracking strategy is a green "+"
• the marked icon corresponding to the dot positioning strategy is a green " ⁇ "
• the marked icon corresponding to the laser ranging strategy is a red "+".
• when the target object is at the center of the first image transmission screen, the marked icon of the target object is the marked icon corresponding to the intelligent tracking strategy and the laser ranging strategy.
  • that the target object is at the center position of the first video transmission screen means that the center of the target object coincides with the center position of the first video transmission screen. It should be understood that when the deviation between the center of the target object and the center position of the first video transmission screen is less than the preset deviation threshold, it can also be considered that the center of the target object coincides with the center position of the first video transmission screen.
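The coincidence criterion above can be illustrated with a simple pixel-deviation check; the function name and threshold value are hypothetical, not specified by this application:

```python
def at_screen_center(target_cx, target_cy, screen_w, screen_h,
                     threshold_px=20.0):
    """The target counts as 'at the center' when the deviation between
    its center and the screen center is less than the preset threshold."""
    dx = target_cx - screen_w / 2
    dy = target_cy - screen_h / 2
    return (dx * dx + dy * dy) ** 0.5 < threshold_px
```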
• when the intelligent tracking strategy and the laser ranging strategy are used at the same time to locate the third position information of the target object, since the laser ranging strategy can obtain the distance of the target object relative to the movable platform, the real-time position information of the target object obtained by the intelligent tracking strategy can be combined with that distance to determine the real-time absolute position of the target object, so that accurate tracking of the target object can be achieved.
  • the third location information contained in the label corresponding to the smart tracking strategy will change as the position of the target object changes in the real world, and the third location information contained in the label corresponding to the dot positioning strategy will not change.
• when the third position information includes the absolute position information of the target object, the implementation process of marking the target object on the first image transmission screen may include: performing an augmented reality (AR) projection label on the target object on the first image transmission screen according to the first position information and the absolute position information contained in the third position information; the AR projection label is more eye-catching.
  • the dot calibration operation can also be performed on the map corresponding to the first image transmission screen.
  • the interactive interface also displays the eighth mark 41 corresponding to the map corresponding to the first image transmission screen.
• when the user operates the eighth mark 41 so that the eighth mark is in the triggered state, the map corresponding to the first image transmission screen is displayed on the interactive interface, as shown in FIG. 4B.
  • the map corresponding to the first image transmission screen is a map around the current location of the movable platform, and the user can operate the interactive interface to zoom in or zoom out the map.
  • a third user instruction is obtained, where the third user instruction is generated when the user operates the mark corresponding to the dot positioning strategy displayed on the map;
  • the sixth indicator 22 corresponding to the dot positioning strategy will be displayed on the map.
  • the first image transmission screen 1 and the image transmission screen 2 corresponding to the FPV camera are also displayed on the map.
  • the user can watch the image transmission screen while watching the map, which is conducive to the intuitive presentation of information.
  • the image transmission images of different lenses may be obtained by different lenses of the same shooting device, or may be obtained by different shooting devices.
  • the photographing device includes a zoom lens and a wide-angle lens.
  • the first image transmission picture may be a picture transmission picture obtained by a zoom lens or a picture transmission picture obtained by a wide-angle lens;
• the photographing device includes a zoom lens, a wide-angle lens and an infrared lens; the first image transmission picture may be the picture transmission picture obtained by the zoom lens, the picture transmission picture obtained by the wide-angle lens, or the picture transmission picture obtained by the infrared lens;
• the movable platform includes the first camera and the FPV camera
  • the first image transmission picture may be a picture transmission picture obtained by the first camera, or a picture transmission picture obtained by the FPV camera.
• when the icon corresponding to the annotation on the map is triggered, if a fourth user instruction is obtained, the height in the third position information contained in the annotation is modified and/or deleted according to the fourth user instruction.
• the icon corresponding to the label on the map can be triggered by a user operation (such as clicking or box selection) on the icon corresponding to the label, and the fourth user instruction can be generated when the user operates the virtual keyboard and/or virtual mouse on the interactive interface.
  • the target object at the third specific location on the map is marked according to the third user instruction
  • the icon corresponding to the mark on the map is triggered
• the fifth user instruction instructs the user to perform a first operation on the icon first, and then perform a second operation on the icon;
• the first operation includes continuously pressing the icon for longer than a preset duration, and the second operation includes dragging the icon.
• the preset duration can be set as required; for example, the preset duration can be 5 seconds or another value. It should be understood that the fifth user instruction may also include other operations performed by the user on the icon, and the first operation and the second operation may also be of other types.
  • the target calibration method may further include: obtaining the distance of the target object relative to the movable platform obtained by positioning the target object.
  • the implementation process of labeling the target object on the first image transmission screen may include: according to the first location information and the distance of the target object relative to the movable platform, matching on the first image transmission screen The target object is labeled.
  • the third position information corresponding to the annotation includes the distance of the target object relative to the movable platform, and the user can determine the distance of the target object relative to the movable platform through the annotation, so that the absolute position of the target object can be determined.
• when the postures of the first lens and the second lens are approximately the same and the distance between them is less than a preset threshold, it can be considered that the positions of the two coincide.
  • the shooting parameters include the shooting angle of view FOV; of course, the shooting parameters may also include others.
• after determining the second position information of the target object in the second image transmission screen obtained by the second lens, the target object is marked on the second image transmission screen according to the second position information.
  • the purpose of simultaneously calibrating the target object on the first image transmission screen and the second image transmission screen is realized, thereby realizing the sharing of the calibration results when the target object is calibrated on different image transmission screens.
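One way to sketch the cross-lens mapping described above (first position information plus the lenses' relative pose and shooting FOVs yielding the second position information) is a pinhole-camera approximation. The following assumes coincident lens centers related only by a small yaw/pitch offset and square pixels, so a single focal length derived from the horizontal FOV serves both axes; all names are illustrative, not from this application.

```python
import math

def map_pixel_between_lenses(px, py, size1, fov1_deg, size2, fov2_deg,
                             rel_yaw_deg=0.0, rel_pitch_deg=0.0):
    """Map a pixel in the first lens's frame into the second lens's frame."""
    w1, h1 = size1
    w2, h2 = size2
    # focal lengths in pixels, from each lens's horizontal FOV
    f1 = (w1 / 2) / math.tan(math.radians(fov1_deg) / 2)
    f2 = (w2 / 2) / math.tan(math.radians(fov2_deg) / 2)
    # pixel offset -> viewing angles in the first lens, plus relative pose
    yaw = math.atan((px - w1 / 2) / f1) + math.radians(rel_yaw_deg)
    pitch = math.atan((py - h1 / 2) / f1) + math.radians(rel_pitch_deg)
    # viewing angles -> pixel coordinates in the second lens
    return (w2 / 2 + f2 * math.tan(yaw), h2 / 2 + f2 * math.tan(pitch))
```

With a wide-angle first lens and a zoom (narrow-FOV) second lens, the same scene point lands proportionally farther from the center in the second frame, which is why the mapped position may fall outside the second screen entirely.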
  • the pilot 1 views the first image transmission screen through the interactive interface of the remote control terminal 1 to perform operations
  • the pilot 2 views the second image transmission screen through the interactive interface of the remote control terminal 2 to perform operations.
  • the pilot 1 calibrates a target object 1 on the first image transmission screen
  • the pilot 2 may also see the information that the target object 1 is calibrated on the second image transmission screen.
• the label of the target object on the first image transmission screen is the same as the label of the target object on the second image transmission screen; of course, in other embodiments, the label of the target object on the first image transmission screen may be different from the label of the target object on the second image transmission screen.
  • the positioning strategy is a laser ranging strategy
• after determining the second position information of the target object in the second image transmission screen obtained by the second lens, if the second position information is not within the second image transmission screen, the label of the target object is displayed on the edge of the second image transmission screen.
  • the position of the edge relative to the center position of the second video transmission screen is used to indicate the position of the target object relative to the center position of the second video transmission screen. In this way, the user can be informed of the location of the target object.
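A common way to realize the edge placement described above is to clamp the target's projected position to the screen border along the ray from the screen center, so that the border position indicates the target's direction. This sketch and its names are illustrative, not the patent's implementation:

```python
def clamp_label_to_edge(tx, ty, w, h, margin=10.0):
    """Return where to draw the target's label: the target itself when it
    is on screen, otherwise a point on the screen edge along the line from
    the screen center towards the (off-screen) target."""
    cx, cy = w / 2, h / 2
    if 0 <= tx < w and 0 <= ty < h:
        return tx, ty  # on screen: label the target directly
    dx, dy = tx - cx, ty - cy
    # scale the center->target vector so it just touches the edge area
    scale = min((cx - margin) / abs(dx) if dx else float("inf"),
                (cy - margin) / abs(dy) if dy else float("inf"))
    return cx + dx * scale, cy + dy * scale
```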
  • the target calibration method may further include: marking the target object on the map corresponding to the first image transmission screen according to the third location information.
  • the target calibration method may further include: synchronizing the annotations to the map corresponding to the external device and/or the image transmission screen and/or the operation guide page of the movable platform.
  • the external device is a background monitoring device, or may be HSI, other web platforms, etc.
• the external device is a background monitoring device (usually used by the commander), and the calibration result of the pilot calibrating the target through the remote control terminal can be shared with the background monitoring device, which realizes the synchronization of information between the pilot and the commander; the commander can schedule multiple pilots based on this synchronized information to improve operational efficiency.
  • the operation guide page is used to indicate the operation status of the movable platform.
  • the movable platform is a drone
  • the operation guide page is the flight guide page of the drone
  • the flight guide page is described.
  • the flight guide page is displayed on the display device of the remote control terminal or the display device connected to the remote control terminal, so that the user can know the flight status of the drone.
  • the flight guide page includes a flight guide compass, which is used to simultaneously identify the position of the drone that is in communication with the remote control terminal and the position of the drone's gimbal.
  • the flight guide page also displays the drone's position.
• the flight guidance compass includes any one of an attitude ball and a horizontal situation indicator (HSI).
• the display position of the flight guidance compass can be set according to the actual situation, which is not specifically limited in this application; for example, the flight guidance compass may be displayed in the lower middle area of the flight guide page.
  • the flight guidance compass is an azimuth indication mark indicating the azimuth on the flight guidance page, and it can be a square, a circle, an ellipse, or other shapes that can indicate the azimuth.
• when the flight guidance compass is a square, its four corners are set to the four directions of east, west, south and north, and the current position of the drone and the gimbal can be indicated in the square compass.
  • the flight guidance compass is circular as an example for description.
• the flight guidance compass rotates with the rotation of the drone, and the center area of the flight guidance compass displays a drone icon; the drone icon is used to indicate the drone, and the drone icon does not follow the rotation.
  • the edge area of the flight guidance compass shows the corresponding indicator characters for the true east, west, south and true north directions.
• the edge area of the flight guidance compass also displays a gimbal (PTZ) icon; the PTZ icon is used to represent the gimbal of the drone.
• the position of the PTZ icon in the edge area is determined according to the direction of the yaw axis of the gimbal, and the position of the PTZ icon in the edge area changes as the direction of the gimbal changes.
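For a circular compass, placing the gimbal icon in the edge area from the yaw angle reduces to polar-to-screen conversion. A minimal sketch, assuming yaw 0 means true north (up on screen) increasing clockwise, with screen y growing downward; names are illustrative:

```python
import math

def gimbal_icon_position(compass_cx, compass_cy, radius, gimbal_yaw_deg):
    """Place the gimbal (PTZ) icon on the compass edge from the gimbal's
    yaw-axis direction; north maps to the top of the circle."""
    a = math.radians(gimbal_yaw_deg)
    return (compass_cx + radius * math.sin(a),
            compass_cy - radius * math.cos(a))
```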
• the user can know the position of the gimbal and the position of the drone based on the displayed flight guidance compass, which is convenient for the user to control the attitude of the drone's gimbal and the attitude of the drone.
  • the drone icon, pan/tilt icon, and indicator characters can be set according to actual conditions. This application does not specifically limit this.
  • the indicator characters corresponding to the direction of due east, west, south and north are respectively E, W, S, and N.
  • the drone icon is an arrow, circle, triangle, or other shapes
  • the gimbal icon is a triangle, quadrilateral, pentagon, or other shapes.
• the angle value corresponding to the nose of the drone is displayed near the flight guidance compass, and the displayed angle value is used to indicate the orientation of the nose of the drone, where the angle value is the angle of the drone's nose with respect to true north, true south, true west, or true east.
  • the drone icon is an arrow icon
• the arrow icon is fixed to the top of the flight guide page, and the direction of the arrow icon is consistent with the nose of the drone; when the orientation of the drone changes, the flight guidance compass rotates accordingly.
• when the drone includes multiple gimbals, the edge area of the flight guidance compass displays a corresponding gimbal icon for each gimbal, and the color of each gimbal icon is different. The edge area of the flight guidance compass also displays multiple angle scale lines and the angle value corresponding to each angle scale line, where the angle value is the angle deviating from the true north direction. In this way, the user can accurately and clearly know the position of the gimbal and the position of the drone based on the displayed flight guidance compass, which is convenient for the user to control the attitude of the gimbal of the drone and the attitude of the drone.
  • the flight guidance compass also includes a follow icon, which is used to indicate the object that the drone follows.
• the position of the follow icon on the flight guidance compass is determined based on the direction and distance of the object followed by the drone relative to the drone. When the distance of the object followed by the drone relative to the drone is less than the preset distance, the follow icon is located inside the flight guidance compass; when that distance is greater than or equal to the preset distance, the follow icon is located on the inner side of the edge area of the flight guidance compass. The mark point icon and the distance of the marked space point relative to the drone are also displayed near the flight guidance compass.
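The follow-icon placement rule above (inside the compass when the followed object is near, pinned to the edge area when far) can be sketched like this; the distance-proportional radius inside the compass and all names are assumptions for illustration:

```python
import math

def follow_icon_position(cx, cy, edge_radius, bearing_deg,
                         distance_m, preset_distance_m=100.0):
    """Position the follow icon on the compass: radius scaled by distance
    when the followed object is closer than the preset distance, otherwise
    pinned to the inner side of the edge area."""
    if distance_m < preset_distance_m:
        r = edge_radius * (distance_m / preset_distance_m)
    else:
        r = edge_radius  # inner side of the edge area
    a = math.radians(bearing_deg)  # 0 = true north, clockwise
    return cx + r * math.sin(a), cy - r * math.cos(a)
```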
  • the follow icon can be set based on actual conditions, and this application does not specifically limit this.
  • the follow icon is
  • the flight guide page also displays an object follow button.
  • the drone automatically recognizes people, vehicles, and ships and other objects, and adjusts the camera's focus so that objects such as people, vehicles, and ships are in the center of the screen.
  • a follow icon is displayed on the flight guidance compass.
  • the home point icon of the drone is displayed inside the flight guidance compass
  • the home point icon is displayed near the lower right side of the flight guidance compass
  • the distance from the home point of the drone to the drone is 10m
  • the inner side of the edge area of the flight guidance compass is displayed with a marker icon
  • a marker icon is displayed near the lower left side of the flight guidance compass
  • the distance from the marked space point to the drone is 45m
  • the follow icon is displayed inside the flight guidance compass
  • a follow icon is displayed near the upper left side of the flight guidance compass
  • the distance from the following object to the drone is 5m.
• the target calibration method of the embodiment of the present application can track and synchronize information for "dynamic targets" and "static targets", realizes multi-platform closed-loop operation, and is suitable for industries such as security and emergency response.
  • an embodiment of the present application also provides a target calibration device.
  • the target calibration device is provided in a remote control terminal.
• the target calibration device may include a storage device and one or more processors.
  • the storage device is used to store program instructions.
  • One or more processors call the program instructions stored in the storage device.
• the one or more processors are individually or collectively configured to perform the following operations: obtain the image transmission screens of the first lens and the second lens through the remote control terminal, and display the first image transmission screen acquired by the first lens on the interactive interface of the remote control terminal; obtain a first user instruction, wherein the first user instruction is used to indicate the target object to be calibrated in the first image transmission screen; obtain the first position information of the target object on the first image transmission screen; mark the target object on the first image transmission screen according to the first position information; and determine the second position information of the target object in the second image transmission screen obtained by the second lens according to the first position information, the relative positional relationship between the first lens and the second lens, the shooting parameters of the first lens, and the shooting parameters of the second lens.
  • the processor of this embodiment can implement the target calibration method of the embodiment shown in FIG. 2 of the present application, and reference may be made to the description of the corresponding part in the foregoing embodiment.
  • an embodiment of the present application also provides a remote control terminal, the remote control terminal communicates with a movable platform, the movable platform includes a first lens and a second lens, the remote control terminal includes a main body and The target calibration device is supported by the main body.
  • an embodiment of the present application also provides a target calibration system, which includes a remote control terminal and a movable platform.
  • the remote control terminal includes a main body and the target calibration device of the foregoing embodiment, and the target calibration device is supported by the main body.
  • the movable platform communicates with the remote control terminal, and the movable platform includes a first lens and a second lens.
  • an embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the steps of the target calibration method in the second embodiment are realized.
  • FIG. 8 is a method flow diagram of a target calibration method in another embodiment of the present application; the execution subject of the target calibration method in the embodiment of the present application is a remote control terminal.
  • the target calibration method in the embodiment of the present application may include S801 to S804.
  • the image transmission screen acquired by the camera is displayed on the interactive interface of the remote control terminal.
  • the camera is mounted on a movable platform, and the camera can include one or more lenses.
  • the image transmission image may be an image transmission image obtained by any one of the multiple lenses.
  • the image transmission picture is a picture transmission picture obtained by the first lens of the movable platform.
  • the image transmission screen will be sent back to the remote control terminal.
  • a first user instruction is obtained, where the first user instruction is used to instruct the target object to be calibrated in the image transmission screen and a positioning strategy for positioning the target object.
  • the location information in the embodiment of the present application is equivalent to the third location information in the first embodiment.
  • the positioning strategy in the embodiment of the present application includes at least one of a dot positioning strategy and a laser ranging strategy.
  • the position information of the target object is obtained from the movable platform, where the position information includes the distance of the target object relative to the movable platform.
  • the position information can be used to indicate the position of the target object in the world coordinate system, and can also be used to indicate the position of the target object in the image coordinate system, or used to indicate the position of the target object in other coordinate systems.
  • the target object is marked on the image transmission screen according to the location information.
  • the label includes an icon and location information.
  • the location information further includes at least one of the latitude, longitude and height of the target object.
  • the target calibration method further includes: when the icon is in a triggered state, displaying position information on the icon.
  • the target calibration method further includes: hiding the position information when the icon is in a non-triggered state.
  • the positioning strategy includes a dot positioning strategy
  • the first user instruction is generated when the user clicks the mark corresponding to the dot positioning strategy displayed on the image transmission screen.
  • the target object is an object at a first specific position on the image transmission screen.
  • the first specific position includes the center position of the image transmission screen.
• the method further includes: before acquiring the location information, outputting first indication information; the first indication information is used to instruct the user to adjust the posture of the shooting device through the control device of the movable platform, so that the target object in the image transmission screen is at the first specific position.
  • the method further includes: obtaining a second user instruction, where the second user instruction is used to instruct the object to be moved to the first specific position in the image transmission screen; and according to the second user instruction, sending The second instruction information is to the movable platform; wherein, the second instruction information is used to instruct the movable platform to adjust the posture of the photographing device so that the object to be moved to the first specific position moves to the first specific position.
  • the position information is determined according to the position information of the movable platform and the distance of the target object relative to the movable platform.
• the method further includes: displaying a map corresponding to the image transmission screen on the interactive interface; acquiring a third user instruction, where the third user instruction is generated when the user operates the mark corresponding to the dot positioning strategy displayed on the map; according to the dot positioning strategy, obtaining the location information of the target object from the movable platform; and according to the location information, marking the target object at the second specific location on the map.
• after labeling the target object at the second specific location on the map according to the location information, the method further includes: when the icon corresponding to the label on the map is triggered, if a fourth user instruction is obtained, modifying and/or deleting the height in the position information contained in the label according to the fourth user instruction.
• after labeling the target object at the second specific location on the map according to the location information, the method further includes: when the icon corresponding to the label on the map is triggered, if a fifth user instruction is obtained, moving the position of the icon on the map according to the fifth user instruction, and replacing the longitude and latitude in the position information contained in the label with the longitude and latitude corresponding to the position of the moved icon.
  • the fifth user instruction instructs the user to perform the first operation on the icon first, and then perform the second operation on the icon.
  • the first operation includes: the duration of continuously clicking the icon is greater than a preset duration
  • the second operation includes: dragging the icon.
  • the positioning strategy includes a laser ranging strategy
  • the first user instruction is generated when the user clicks the mark corresponding to the laser ranging strategy displayed on the image transmission screen.
  • the target object is an object at a third specific position on the image transmission screen.
  • the third specific position includes the center position of the image transmission screen.
  • the target calibration method further includes: synchronizing the annotations to the map corresponding to the external device and/or the image transmission screen and/or the operation guide page of the movable platform;
  • the operation guide page is used to indicate the operation status of the movable platform.
  • an embodiment of the present application also provides a target calibration device, the target calibration device is set in the remote control terminal, please refer to FIG. 7, the target calibration device may include a storage device and one or more Processors.
  • the storage device is used to store program instructions.
• One or more processors call program instructions stored in the storage device; when the program instructions are executed, the one or more processors are individually or collectively configured to implement the following operations: display the image transmission screen obtained by the camera on the interactive interface of the remote control terminal; obtain a first user instruction, where the first user instruction is used to indicate the target object to be calibrated in the image transmission screen and the positioning strategy of the target object; obtain the location information of the target object from the movable platform, the location information including the distance of the target object relative to the movable platform; and mark the target object on the image transmission screen according to the location information.
  • the processor of this embodiment can implement the target calibration method of the embodiment shown in FIG. 8 of the present application, and reference may be made to the description of the corresponding part in the foregoing embodiment.
  • the storage device of the foregoing embodiment stores the executable instruction computer program of the target calibration method.
  • the storage device may include at least one type of storage medium.
• the storage medium includes flash memory, hard disk, multimedia card, card-type memory (for example, SD or DX memory), random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), magnetic storage, magnetic disks, optical disks, etc.
  • the target calibration device may cooperate with a network storage device that performs the storage function of the memory through a network connection.
  • the memory may be an internal storage unit of the target calibration device, such as the hard disk or memory of the target calibration device.
• the memory can also be an external storage device of the target calibration device, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a flash card (Flash Card) equipped on the target calibration device, etc. Further, the memory may also include both an internal storage unit of the target calibration device and an external storage device. The memory is used to store computer programs and other programs and data required by the device. The memory can also be used to temporarily store data that has been output or will be output.
• the processor of the foregoing embodiment may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc.
  • the general-purpose processor may be a microprocessor or the processor may also be any conventional processor or the like.
  • an embodiment of the present application also provides a remote control terminal that communicates with a movable platform on which a photographing device is mounted; the remote control terminal includes a main body and the target calibration device of the second embodiment above, the target calibration device being supported by the main body.
  • an embodiment of the present application also provides a target calibration system, including a remote control terminal and a movable platform.
  • the remote control terminal includes a main body and the target calibration device of the second embodiment above, and the target calibration device is supported by the main body.
  • the movable platform communicates with the remote control terminal, and a camera is mounted on the movable platform.
  • an embodiment of the present application also provides a computer-readable storage medium on which a computer program is stored, and when the program is executed by a processor, the steps of the target calibration method in the second embodiment are realized.
  • the computer-readable storage medium may be an internal storage unit of the remote control terminal described in any of the foregoing embodiments, such as a hard disk or a memory.
  • the computer-readable storage medium may also be an external storage device of the remote control terminal, such as a plug-in hard disk, a Smart Media Card (SMC), an SD card, or a flash memory card (Flash Card) equipped on the device.
  • the computer-readable storage medium may also include both an internal storage unit of the remote control terminal and an external storage device.
  • the computer-readable storage medium is used to store the computer program and other programs and data required by the remote control terminal, and can also be used to temporarily store data that has been output or will be output.
  • the program can be stored in a computer-readable storage medium; when executed, the program may include the procedures of the above-mentioned method embodiments.
  • the storage medium may be a magnetic disk, an optical disc, a read-only memory (Read-Only Memory, ROM), or a random access memory (Random Access Memory, RAM), etc.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Automation & Control Theory (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Provided are a target calibration method, apparatus and system, and a remote control terminal for a movable platform. The method comprises: by means of a remote control terminal, acquiring image transmission images of a first lens and a second lens, and displaying a first image transmission image acquired by the first lens on an interactive interface of the remote control terminal (S201); acquiring a first user instruction, the first user instruction being used to indicate a target object to be calibrated in the first image transmission image (S202); acquiring first position information of the target object in the first image transmission image (S203); marking the target object on the first image transmission image according to the first position information (S204); and determining second position information of the target object in a second image transmission image acquired by the second lens, according to the first position information, the relative position information of the first lens and the second lens, the photographing parameters of the first lens, and the photographing parameters of the second lens (S205). The method shares target calibration results across the image transmission images acquired by multiple lenses, which reduces communication costs when multiple operators collaborate to control the movable platform and increases the efficiency of collaborative operations.
PCT/CN2020/086793 2020-04-24 2020-04-24 Procédé, appareil et système d'étalonnage de cible, et terminal de commande à distance d'une plateforme mobile WO2021212499A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/086793 WO2021212499A1 (fr) 2020-04-24 2020-04-24 Procédé, appareil et système d'étalonnage de cible, et terminal de commande à distance d'une plateforme mobile
CN202080021661.6A CN113597596A (zh) 2020-04-24 2020-04-24 目标标定方法、装置和系统及可移动平台的遥控终端

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/086793 WO2021212499A1 (fr) 2020-04-24 2020-04-24 Procédé, appareil et système d'étalonnage de cible, et terminal de commande à distance d'une plateforme mobile

Publications (1)

Publication Number Publication Date
WO2021212499A1 2021-10-28

Family

ID=78237890

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/086793 WO2021212499A1 (fr) 2020-04-24 2020-04-24 Procédé, appareil et système d'étalonnage de cible, et terminal de commande à distance d'une plateforme mobile

Country Status (2)

Country Link
CN (1) CN113597596A (fr)
WO (1) WO2021212499A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080099678A1 (en) * 2004-12-03 2008-05-01 Johnson Kirk R Camera with visible light and infrared image blending
CN102148965A (zh) * 2011-05-09 2011-08-10 上海芯启电子科技有限公司 多目标跟踪特写拍摄视频监控系统
CN104038737A (zh) * 2014-05-30 2014-09-10 西安交通大学 主动获取感兴趣目标高分辨率图像的双摄像机系统及方法
CN109154874A (zh) * 2017-10-31 2019-01-04 深圳市大疆创新科技有限公司 图像显示方法、控制方法及相关设备
CN109618131A (zh) * 2018-11-22 2019-04-12 亮风台(上海)信息科技有限公司 一种用于呈现决策辅助信息的方法与设备

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114301571A (zh) * 2022-02-14 2022-04-08 中国人民解放军陆军工程大学 一种多旋翼无人机反制方法及系统
CN114301571B (zh) * 2022-02-14 2024-03-12 中国人民解放军陆军工程大学 一种多旋翼无人机反制方法及系统

Also Published As

Publication number Publication date
CN113597596A (zh) 2021-11-02

Similar Documents

Publication Publication Date Title
US11165959B2 (en) Connecting and using building data acquired from mobile devices
US10354407B2 (en) Camera for locating hidden objects
JP5740884B2 (ja) 繰り返し撮影用arナビゲーション及び差異抽出のシステム、方法及びプログラム
CN108702444B (zh) 一种图像处理方法、无人机及系统
US9560274B2 (en) Image generation apparatus and image generation method
US9729788B2 (en) Image generation apparatus and image generation method
WO2018214078A1 (fr) Procédé et dispositif de commande de photographie
US10284776B2 (en) Image generation apparatus and image generation method
US9894272B2 (en) Image generation apparatus and image generation method
US9736368B2 (en) Camera in a headframe for object tracking
KR20210104684A (ko) 측량 및 매핑 시스템, 측량 및 매핑 방법, 장치 및 기기
US20180262789A1 (en) System for georeferenced, geo-oriented realtime video streams
WO2020103020A1 (fr) Procédé et appareil de planification de point d'échantillon de surveillance, terminal de commande et support d'informations
US11668577B1 (en) Methods and systems for response vehicle deployment
KR101959366B1 (ko) 무인기와 무선단말기 간의 상호 인식 방법
JP2004056664A (ja) 共同撮影システム
WO2020103023A1 (fr) Système, procédé, appareil, dispositif et support d'arpentage et de cartographie
WO2021212499A1 (fr) Procédé, appareil et système d'étalonnage de cible, et terminal de commande à distance d'une plateforme mobile
WO2020103021A1 (fr) Procédé et appareil de planification pour examiner et cartographier des points d'échantillonnage, terminal de commande et support de stockage
CA3069813A1 (fr) Capture, connexion et utilisation de donnees d'interieur de batiment a partir de dispositifs mobiles
CN113906481A (zh) 成像显示方法、遥控终端、装置、系统及存储介质
KR20210106422A (ko) 작업 제어 시스템, 작업 제어 방법, 장치 및 기기
WO2022188151A1 (fr) Procédé de photographie d'image, appareil de commande, plateforme mobile et support de stockage informatique
WO2021212501A1 (fr) Procédé de génération de trajectoire, terminal de commande à distance, plateforme mobile, système et support de stockage lisible par ordinateur
JP7081198B2 (ja) 撮影システム及び撮影制御装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 20932746; Country of ref document: EP; Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: pct application non-entry in european phase
Ref document number: 20932746; Country of ref document: EP; Kind code of ref document: A1