WO2023070667A1 - Movable platform, method and apparatus for processing movable platform data, and terminal device - Google Patents
Movable platform, method and apparatus for processing movable platform data, and terminal device
- Publication number: WO2023070667A1 (PCT/CN2021/127965)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- movable platform
- area
- sensing
- attitude
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01D—MEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
- G01D21/00—Measuring or testing not otherwise provided for
- G01D21/02—Measuring two or more variables by means not covered by a single other subclass
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/40—Correcting position, velocity or attitude
- G01S19/41—Differential correction, e.g. DGPS [differential GPS]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
- G01S19/43—Determining position using carrier phase measurements, e.g. kinematic positioning; using long or short baseline interferometry
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
Definitions
- the present disclosure relates to the technical field of data processing, and in particular to a mobile platform, a method and device for processing data thereon, and a terminal device.
- an embodiment of the present disclosure provides a method for processing data of a movable platform, where the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform. The method includes: obtaining the passed position of the movable platform in the work area; determining, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position; determining a first sensed area in the work area according to the sensing range of the first sensor; and controlling a display device to display first identification information, where the first identification information is used to identify the first sensed area.
- an embodiment of the present disclosure provides a device for processing data of a movable platform, where the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform. The device includes a processor configured to perform the following steps: acquire the passed position of the movable platform in the work area; determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position; determine a first sensed area in the work area according to the sensing range of the first sensor; and control the display device to display first identification information, where the first identification information is used to identify the first sensed area in the work area.
- an embodiment of the present disclosure provides a movable platform, including: a first sensor for sensing first environmental information of the surrounding environment of the movable platform; a positioning device for positioning the movable platform; one or more processors, configured to acquire the passed position of the movable platform in the work area, determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position, and determine a first sensed area in the work area according to the sensing range of the first sensor; and an antenna, configured to send the signal sent by the processor to a display device so as to control the display device to display first identification information, where the first identification information is used to identify the first sensed area in the work area.
- an embodiment of the present disclosure provides a terminal device that communicates with a movable platform, and the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform.
- the terminal includes: a communication unit; one or more processors; and a display.
- the display is configured to receive a control instruction from the processor, and display first identification information according to the control instruction, where the first identification information is used to identify the first sensed area in the working area.
- an embodiment of the present disclosure provides a computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the method described in the first aspect is implemented.
- the first sensed area is determined according to the passed position of the movable platform and the sensing range, at that position, of the first sensor carried on the movable platform, and is marked on the interface of the display device.
- marking the first sensed area enables the user to determine, from the marked information on the display interface of the display device, the area that has already been sensed by the first sensor, thereby reducing repeated sensing of the same area and the occurrence of missed, unsensed areas.
- FIG. 1A is a schematic diagram of a scenario in which overlapping search areas exist, according to some embodiments.
- FIG. 1B is a schematic diagram of a scenario in which missed search areas exist, according to some embodiments.
- Fig. 2 is a flowchart of a method for processing data of a mobile platform according to an embodiment of the present disclosure.
- FIG. 3 is a schematic diagram of a sensing range of a first sensor according to an embodiment of the present disclosure.
- FIG. 4 is a schematic diagram of a marking method for a first sensed area according to an embodiment of the present disclosure.
- FIG. 5 is a schematic illustration of image distortion according to an embodiment of the present disclosure.
- FIG. 6 is a schematic diagram of a core sensing area according to an embodiment of the present disclosure.
- FIG. 7 is a schematic diagram of a method of determining a core sensing area based on a point cloud according to an embodiment of the present disclosure.
- FIG. 8 is a schematic diagram of a zooming method according to an embodiment of the present disclosure.
- FIG. 9 is a schematic diagram of a comparison between a first sensed area and a second sensed area according to an embodiment of the present disclosure.
- FIG. 10 is a schematic diagram of a second sensed area before and after changing the zoom ratio according to an embodiment of the present disclosure.
- FIG. 11A and FIG. 11B are schematic diagrams of the relative positional relationship between a first sensed area and a second sensed area according to embodiments of the present disclosure.
- Fig. 12 is a flowchart of a method for processing data of a mobile platform according to another embodiment of the present disclosure.
- FIG. 13 is a schematic diagram of an apparatus for processing movable platform data according to an embodiment of the present disclosure.
- FIG. 14 is a schematic diagram of a movable platform according to an embodiment of the present disclosure.
- FIG. 15 is a schematic diagram of a terminal device according to an embodiment of the present disclosure.
- first, second, third, etc. may be used in the present disclosure to describe various information, but the information should not be limited to these terms. These terms are only used to distinguish information of the same type from one another. For example, without departing from the scope of the present disclosure, first information may also be called second information, and similarly, second information may also be called first information. Depending on the context, the word "if" as used herein may be interpreted as "when" or "upon" or "in response to determining."
- the movable platform moves from position P1 to position P2 along the moving track 102, and each time the movable platform passes a position, it can search the area near that position. Assuming that the search areas when the movable platform is at position P1 and position P2 are shown as area 101 and area 103 in the figure respectively, it can be seen that area 101 and area 103 overlap, and the movable platform has searched the overlapping area S1 at least twice.
- each gray rectangular area represents the search area of the movable platform at a different position, and the dotted line represents the moving track of the movable platform. It can be seen that there is an area 105 between the search areas; no matter at which position the movable platform searches, area 105 is not covered, that is, area 105 is a missed search area.
- an embodiment of the present disclosure provides a method for processing data of a movable platform, the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform, see FIG. 2.
- the method includes:
- Step 201: Obtain the passed position of the movable platform in the work area, and determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position;
- Step 202: Determine a first sensed area in the work area according to the sensing range of the first sensor;
- Step 203: Control the display device to display first identification information, where the first identification information is used to identify the first sensed area in the work area.
- the method in the embodiments of the present disclosure may be executed by a mobile platform, or may be executed by a terminal device communicating with the mobile platform, or a part of the steps in the method may be executed by the mobile platform, and the other part of the steps may be executed by the terminal device.
- the mobile platform may include, but is not limited to, drones, unmanned vehicles, unmanned ships, mobile robots, etc.
- the terminal device can be a remote control matched with the mobile platform or an intelligent terminal such as a mobile phone or a computer.
- the first sensor can be mounted on the movable platform, and the first sensor can include but not limited to various types of sensors such as visual sensors, infrared sensors, radar sensors, and sound sensors, or a combination of at least two sensors.
- the first sensor has a certain sensing range.
- the first sensor mounted on the movable platform can continuously sense the environment within a certain range around the movable platform, thereby performing various operations.
- Different types of sensors have different sensing ranges and thus perform different jobs. For example, in an application scenario where a drone is equipped with a visual sensor, the sensing range of the first sensor is the image collection range of the visual sensor.
- Images of the surrounding environment of the drone can be collected through the visual sensor to determine whether there is a target object that needs to be searched in the environment, such as a trapped person or animal.
- in an application scenario where the drone is equipped with an infrared sensor, the sensing range of the first sensor is the range in which the infrared sensor can sense temperature.
- Infrared sensors can be used to collect thermal imaging images of the surrounding environment of the drone in order to determine whether there is a heat source in the environment, thereby realizing fire detection and early warning.
- the application scenarios of the embodiments of the present disclosure are not limited to the above scenarios. In different scenarios, different types and numbers of first sensors may be carried on different mobile platforms to perform different operations.
- the first sensed area may be jointly determined based on the position of the movable platform and the sensing range of the first sensor when the movable platform is at that position.
- FIG. 3 shows a schematic diagram of the sensing range of a visual sensor (Sensor) mounted on a drone.
- when the visual sensor directly faces the shooting area (for example, the ground), the geometric relationship among the side length d of the visual sensor, the focal length f, the field of view FOV, and the flying height H of the drone can be used to determine the ground coverage D (i.e., the sensing range).
- more generally, the geometric relationship among the side length d of the visual sensor, the focal length f, the field of view FOV, and the distance L from the center of the visual sensor to the center of the shooting area can be used to determine the ground coverage D' of the visual sensor (i.e., the sensing range); a sketch of this pinhole-geometry calculation is given below.
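As a minimal illustration of the geometry above, the following Python sketch computes the ground coverage from the similar-triangle relation D / H = d / f (equivalently D = 2·H·tan(FOV/2), with FOV = 2·atan(d/2f)). The example numbers are assumptions for illustration, not values from the disclosure.

```python
import math

def ground_coverage(d_mm: float, f_mm: float, distance_m: float) -> float:
    """Ground coverage along one sensor side for a camera facing the scene.

    Pinhole model: D / distance = d / f, i.e. D = 2 * distance * tan(FOV / 2)
    with FOV = 2 * atan(d / (2 * f)). `distance_m` is the flying height H for
    a camera facing straight down, or the distance L to the scene center.
    """
    return d_mm / f_mm * distance_m

# Assumed example: a 6.4 mm sensor side behind an 8 mm lens at 100 m height
# covers D = 6.4 / 8 * 100 = 80 m of ground along that side.
print(ground_coverage(6.4, 8.0, 100.0))

# The FOV-based form gives an identical result:
d, f, H = 6.4, 8.0, 100.0
fov = 2 * math.atan(d / (2 * f))
print(2 * H * math.tan(fov / 2))  # 80.0
```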
- the position of the mobile platform can be obtained through the positioning equipment carried on the mobile platform.
- the centimeter-level high-precision position information of the UAV can be obtained through real-time kinematic (RTK) positioning technology.
- LonUAV, LatUAV, and H_UAV represent the longitude, latitude, and altitude of the drone's location, respectively.
- each position obtained by the positioning device may also be associated with the time information when the position was obtained, so as to determine when the position was obtained.
- the attitude of the first sensor can also be acquired by an attitude sensor.
- the attitude of the first sensor when the movable platform is at the passing position may be collected by an attitude sensor installed on the first sensor.
- alternatively, the attitude of the movable platform at the passed position may be collected by an attitude sensor installed on the body of the movable platform, and the attitude of the first sensor may then be determined based on the attitude conversion relationship between the body of the movable platform and the first sensor.
- the attitude of the first sensor when the movable platform is in the passing position may also be determined jointly by combining the information collected by the above two attitude sensors, for example, the average value of the information collected by the above two attitude sensors is determined as the attitude of the first sensor.
- the IMU attitude data of the drone can be obtained and fused with the above position data (LonUAV, LatUAV, H_UAV); combined with the compensation value (lever arm) from the RTK antenna phase center to the visual sensor, this determines the position of the origin of the visual sensor data (LonPL, LatPL, H_PL), as sketched below.
- LonPL, LatPL and H_PL are the longitude, latitude and height of the origin of the visual sensor data respectively. Due to the large scale of the background map, relative to the scale of the background map, the position difference between the UAV’s RTK antenna phase center and the origin of the visual sensor can be ignored.
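The lever-arm compensation described above can be pictured as rotating the body-frame offset from the RTK antenna phase center to the visual sensor into the local frame using the IMU attitude, then adding it to the antenna position. The sketch below assumes a local ENU frame, a Z-Y-X (yaw-pitch-roll) rotation convention, and that (LonUAV, LatUAV, H_UAV) has already been converted to local coordinates; none of these conventions are fixed by the disclosure.

```python
import numpy as np

def sensor_origin_local(p_antenna, rpy_rad, lever_arm_body):
    """RTK antenna position + rotated lever arm -> visual-sensor data origin.

    p_antenna: antenna phase-center position in a local frame (assumed ENU).
    rpy_rad: IMU roll, pitch, yaw in radians.
    lever_arm_body: body-frame offset from antenna to visual sensor (metres).
    """
    roll, pitch, yaw = rpy_rad
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Z-Y-X (yaw-pitch-roll) rotation from body frame to local frame
    R = np.array([[cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
                  [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
                  [-sp,     cp * sr,                cp * cr]])
    return np.asarray(p_antenna) + R @ np.asarray(lever_arm_body)
```

As the text notes, at map scale this correction is often negligible, but it keeps the origin of the sensor data consistent when the lever arm or the attitude change is large.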
- the passed position of the movable platform may include the current position of the movable platform.
- the current location refers to the real-time location of the movable platform, that is, the location of the movable platform at the current moment.
- the sensing range of the first sensor when the movable platform is at the passed position includes the first sensing range of the first sensor when the movable platform is at the current position.
- the first field of view of the first sensor when the movable platform is at the current position may be acquired, and the first attitude of the first sensor when the movable platform is at the current position may be acquired; the first sensing range is then determined based on the current position, the first field of view, and the first attitude. If a first attitude sensor is installed on the first sensor, the attitude collected by the first attitude sensor when the movable platform is at the current position can be used as the first attitude of the first sensor.
- the first attitude may also be jointly determined based on the attitude information of the main body of the movable platform collected by the second attitude sensor when the movable platform is at the current position, and the attitude conversion relationship between the main body of the movable platform and the visual sensor.
- the locations traveled by the movable platform may include historical locations of the movable platform.
- the historical location refers to the location of the movable platform at a certain historical moment before the current moment, that is, the location that the movable platform has reached.
- the sensing range of the first sensor when the movable platform is in the passing position includes the second sensing range of the first sensor when the movable platform is in a historical position.
- a second field of view of the first sensor when the movable platform is at a historical position may be acquired, and a second attitude of the visual sensor when the movable platform is at the historical position may be acquired; the second sensing range is determined based on the historical position, the second field of view, and the second attitude.
- if a first attitude sensor is installed on the first sensor, the attitude collected by the first attitude sensor when the movable platform is at the historical position can be used as the second attitude of the first sensor.
- the second attitude may also be jointly determined based on the attitude of the main body collected by the second attitude sensor when the movable platform is at a historical position, and the attitude transformation relationship between the movable platform and the first sensor.
- the first attitude and the second attitude may each be associated with the corresponding position of the movable platform. Further, the position collected by the positioning device on the movable platform, the time when the positioning device collects the position, and the attitude collected by the first attitude sensor may all be associated with one another.
- a first sensed area may be determined according to the passed position and the sensing range of the first sensor.
- according to the FOV of the visual sensor (including the heading FOV and the side FOV) and the flying height of the UAV, the position of the edge of the visual sensor's sensing range can be calculated.
- specifically, the position of the ground point corresponding to any field-of-view boundary can be calculated, giving the heading projection distance L_heading between that ground point and the projection point of the drone on the ground, as well as the side projection distance L_side.
- the heading projection distance L_heading and the side projection distance L_side determine the first sensed area when the drone is at position P: the area extending forward and backward by L_heading along the heading projection direction, and left and right outward by L_side along the side projection direction, is the first sensed area when the drone is at position P (see the coordinate sketch below).
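For a camera pointing straight down, the projection distances and the corner coordinates of the sensed rectangle can be computed as in the sketch below; planar local coordinates, a nadir-pointing sensor, and the function name are assumptions made for illustration.

```python
import math

def sensed_rect(x, y, height_m, fov_heading_deg, fov_side_deg, yaw_rad=0.0):
    """Corner coordinates of the first sensed area at position P.

    L_heading = H * tan(FOV_heading / 2), L_side = H * tan(FOV_side / 2);
    the rectangle is centered on the drone's ground projection (x, y) and
    rotated by the heading yaw.
    """
    L_heading = height_m * math.tan(math.radians(fov_heading_deg) / 2)
    L_side = height_m * math.tan(math.radians(fov_side_deg) / 2)
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    corners = []
    for dh, ds in [(L_heading, L_side), (L_heading, -L_side),
                   (-L_heading, -L_side), (-L_heading, L_side)]:
        # rotate the body-frame offset (heading, side) into the ground frame
        corners.append((x + dh * c - ds * s, y + dh * s + ds * c))
    return corners
```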
- the first sensed area may include the first sensed area when the movable platform is at the current position (referred to as the current sensed area) and/or the first sensed area when the movable platform is at a historical position (referred to as the historical sensed area).
- for the historical sensed area, with the UAV track as the center, the area extending left and right outward by L_side is the historical sensed area.
- when the UAV is at the current position, with the current position as the center, the area obtained by extending forward and backward by L_heading along the heading projection direction, and extending left and right by L_side along the side projection direction, is the current sensed area.
- both the movable platform and the first sensor can maintain a fixed posture for a period of time.
- in this case, the sensing range of the first sensor acquired for the i-th time can be used as the sensing range of the first sensor acquired for the (i+1)-th time, and an offset can be superimposed on the first sensed area acquired for the i-th time to obtain the first sensed area acquired for the (i+1)-th time, where i is a positive integer.
- the offset can be determined based on the speed of the movable platform and the time interval between two adjacent updates of the first sensed area, as in the sketch below.
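A minimal sketch of this dead-reckoning update, assuming straight-line motion along a fixed heading in planar coordinates:

```python
import math

def advance_sensed_area(corners, speed_mps, dt_s, heading_rad):
    """Translate the i-th sensed rectangle to obtain the (i+1)-th one.

    While the platform and the first sensor keep a fixed attitude, the new
    sensed area is the old one shifted by offset = speed * dt along the
    direction of motion.
    """
    dx = speed_mps * dt_s * math.cos(heading_rad)
    dy = speed_mps * dt_s * math.sin(heading_rad)
    return [(x + dx, y + dy) for x, y in corners]
```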
- the current sensed area and the historical sensed area can be marked on the interface of the display device simultaneously.
- one of the currently sensed area and the historically sensed area may also be marked on the interface of the display device based on the user's instruction. For example, when the user inputs an instruction to mark the currently sensed area, the currently sensed area may be marked on the interface of the display device. When the user inputs an instruction to switch to the historically sensed region, the historically sensed region may be marked on the interface of the display device.
- This step can be performed continuously during the moving process of the movable platform, can also be performed when triggered by a specified trigger operation, or can be performed periodically at intervals.
- the trigger operation may be that the user inputs a trigger instruction through an interactive component on the display device, or it may be the detection of a repeatedly sensed area or a missed sensing area.
- the display device may display a map on the display interface, and may mark the first sensed area on the map.
- the boundary of the first sensed area may be marked on the map, or the entire first sensed area may be marked on the map, or the corner points of the first sensed area may be marked on the map.
- in response to receiving a switching instruction from the user, the marked content may be switched between the boundary of the first sensed area and the entire first sensed area.
- the marked boundary of the first sensed area and the entire marked first sensed area may be continuously displayed on the display interface, or may also be displayed on the display interface in a blinking manner.
- the information to be marked may include the moving track and moving direction of the movable platform.
- the position information of the movable platform is refreshed at a certain frequency (such as 5 Hz), and the position information POS_i (Lon_i, Lat_i, H_i) refreshed each time is synchronized to the backend, or is first downsampled to a certain frequency (such as 1 Hz) and then synchronized to the backend; the moving track of the movable platform can then be displayed in real time on the backend/ground station/display screen.
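A trivial sketch of the 5 Hz to 1 Hz downsampling mentioned above, assuming the frequency ratio is an integer:

```python
def downsample_track(positions, src_hz=5.0, dst_hz=1.0):
    """Keep every (src_hz / dst_hz)-th position fix before syncing it."""
    step = int(src_hz / dst_hz)
    return positions[::step]
```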
- the moving direction of the movable platform may also be indicated by an arrow.
- the arrow may indicate the orientation of the nose of the unmanned aerial vehicle.
- the sensible area of the movable platform after a period of time can be predicted according to the moving direction of the movable platform, and it can be judged whether the predicted sensible area and the historical sensed area can be seamlessly connected.
- in Fig. 4, the solid line with the arrow represents the movement trajectory of the movable platform; the dotted line in (4-a) represents the boundary of the first sensed area; the gray area in (4-b) represents the entire first sensed area; and the rectangular frame represents the current sensed area. It can be seen that there is a missed area in (4-a) of Fig. 4, and there is no missed area in (4-b) of Fig. 4.
- the first sensed area may be marked on the map by using a preset visual feature.
- the visual features include but are not limited to at least any of the following: color, brightness, transparency, and fill pattern.
- different pieces of information may be marked with different visual features. For example, the boundary of the first sensed area and the area within the boundary can be marked with different visual features; the historical sensed area and the current sensed area can be marked with different visual features; and the moving track of the movable platform can also be marked with a visual feature different from that of the first sensed area.
- alternatively, some of the above information may be marked with the same visual features.
- the boundary of the first sensed area and the areas within the boundary can be marked with the same visual feature.
- one or more pieces of the information to be marked can be highlighted based on the user's choice, for example by marking them with a color that contrasts strongly with the other marked information, or with a thicker line.
- a core sensing area may also be determined within the range of the first sensed area, and the core sensing area may be marked on an interface of a display device.
- the core sensing area may be an area that is more likely to be noticed by the user, an area at a specified location, an area including a specified object, or an area with specified features.
- the core sensing area may be an area where the temperature in the thermal radiation map collected by the infrared sensor is greater than the preset temperature threshold.
- the core sensing area may be an area including buildings in an image collected by the visual sensor.
- Image distortion can be divided into radial distortion and tangential distortion, see Figure 5. At the center of the image, the radial distortion can be considered approximately zero, so the geometric characteristics of the photographed object are restored most faithfully. At the edge of the image, the distortion reaches its maximum and the geometric properties of the object itself change; operators may therefore overlook ground features because the apparent properties of the photographed objects change on the screen.
- the resolving power of the lens itself also decreases with increasing distance from the image center; this directly manifests as reduced contrast and reduced sharpness at the edge of the image.
- the central area in the image collected by the visual sensor can be determined as the core sensing area, and the areas in the image other than the core sensing area are called edge areas. Based on the above characteristics of the core sensing area, any of the following methods can be used to determine the core sensing area:
- Way 1: obtain the core sensing range of the first sensor when the movable platform is at the passed position, where the core sensing range is within the sensing range of the first sensor; and determine the core sensing area according to the passed position and the core sensing range.
- a preset second field of view may be acquired, where the second field of view is within the range of the first field of view of the visual sensor; the attitude of the visual sensor when the movable platform is at the passed position may be acquired; and the core sensing range is determined based on the second field of view and the attitude of the visual sensor when the movable platform is at the passed position.
- the second viewing angle may be a default viewing angle, for example, assuming that the first viewing angle is A1 degree*A2 degree, then the second viewing angle is A1/k1 degree*A2/k2 degree by default, Wherein, both k1 and k2 are constants greater than 1.
- the second viewing angle may also be manually specified by the user. Users can directly set the angle between the heading and lateral core sensing areas.
- the FOV of 60 degrees*45 degrees can be manually set as the second field of view.
- the manner of determining the core sensing range is similar to the manner of determining the sensing range of the first sensor, which will not be repeated here; a brief sketch is given below.
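A brief sketch of Way 1: the second field of view defaults to FOV / k (k > 1), and its ground footprint is computed with the same pinhole geometry as the full sensing range. The value of k and the example numbers are assumptions.

```python
import math

def core_coverage(height_m, fov_deg, k=2.0):
    """Ground coverage of the core sensing range under the reduced FOV."""
    core_fov = math.radians(fov_deg / k)
    return 2 * height_m * math.tan(core_fov / 2)

# Assumed example at 100 m with a 60-degree full FOV:
# full coverage            ~ 2 * 100 * tan(30 deg) ~ 115.5 m
# core coverage (k = 2)    ~ 2 * 100 * tan(15 deg) ~  53.6 m
```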
- Way 2: acquire an image collected when the first sensor is at the passed position; determine the image quality of the image; and determine the core sensing area according to the image quality.
- the image quality includes resolution, and the resolution of the image region corresponding to the core sensing region is higher than that of the other regions in the image except the core sensing region.
- a resolution threshold can be set, and the area in the image whose resolution is greater than or equal to the resolution threshold is determined as the core sensing area, and the area in the image whose resolution is smaller than the resolution threshold is determined as the edge area.
- the image quality includes a degree of distortion, and the degree of distortion of the image region corresponding to the core sensing region is lower than that of the other regions in the image except the core sensing region.
- a threshold of distortion degree (mainly radial distortion) can be set, and the radial distortion/overall distortion value of the device (visual sensor + lens) from the imaging center to the imaging edge can be calculated. The point at which the distortion reaches the threshold is set as a cut-off point; all the obtained cut-off points are connected with lines, the internal area enclosed by the lines is determined as the core sensing area, and the area with distortion greater than the threshold is used as the edge area.
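A sketch of this cut-off search using the common polynomial radial-distortion model r_d = r·(1 + k1·r² + k2·r⁴); the coefficients, the normalized radius scale, and the 2% threshold in the example are assumptions, not values from the disclosure.

```python
def core_radius(k1, k2, threshold):
    """First normalized radius where relative radial distortion hits the threshold.

    Relative distortion under the polynomial model is
    |r_d - r| / r = |k1 * r**2 + k2 * r**4|; points inside the returned
    radius form the core sensing area, the rest is the edge area.
    """
    r = 0.0
    while r < 1.5:  # scan outward toward the image corner
        if abs(k1 * r**2 + k2 * r**4) >= threshold:
            return r
        r += 0.001
    return None  # distortion never reaches the threshold

# Assumed example: k1 = 0.1, k2 = 0.02, 2% threshold -> core radius ~ 0.44
print(core_radius(0.1, 0.02, 0.02))
```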
- the image quality includes contrast, and the contrast of the image region corresponding to the core sensing region is higher than that of the other regions in the image except the core sensing region (lens contrast falls off toward the edge, as noted above).
- Filtering can be done based on the relationship between the modulation transfer function (MTF) curve of the lens and the pixel size of the visual sensor. Assuming that the pixel size of the sensor is A microns, according to the Nyquist criterion and practical operating experience, taking 1/2 Nyquist as the benchmark corresponds to 4 pixel sizes, that is, 4A microns, with a corresponding line-pair frequency B.
- since the resolving power of the lens must match the pixel size of the sensor, and the decrease of lens resolving power at the edge leads to a decrease in contrast, a contrast threshold C is set at the B line-pair frequency. According to the MTF curve of the lens, the area where the contrast falls below the threshold is defined as the edge area, and the area where the contrast is at or above the threshold is defined as the core sensing area.
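The line-pair frequency B follows directly from the pixel size; a one-line sketch (micrometre pixel size and lp/mm output assumed):

```python
def half_nyquist_lp_per_mm(pixel_um: float) -> float:
    """Spatial frequency B: one line pair spans 4 pixel sizes (4A microns),
    i.e. half the Nyquist frequency of a sensor with pixel size A microns."""
    return 1000.0 / (4.0 * pixel_um)  # line pairs per millimetre

# Assumed example: A = 3.45 um pixels -> B ~ 72.5 lp/mm; the core sensing
# area is where the lens MTF at B stays at or above the contrast threshold C.
print(half_nyquist_lp_per_mm(3.45))
```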
- the core sensing area is determined by point cloud density.
- when the first sensor is a radar sensor, especially when a lidar simulates human vision by rotating and scanning, the core sensing area can be determined by setting a point cloud density threshold. Taking the point cloud density P as the threshold, the part where the point cloud density is higher than P is set as the core sensing area, and the outer part where the point cloud density is lower than P is set as the edge area.
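A sketch of density-based splitting on a 2-D grid; the cell size, the threshold, and the gridding approach are assumptions made for illustration.

```python
import numpy as np

def core_mask_by_density(points_xy, cell=1.0, density_threshold=50.0):
    """Split a scan into core / edge cells by point-cloud density.

    Bins 2-D points into square cells of side `cell` (metres, assumed),
    computes points per unit area, and marks cells whose density is at or
    above the threshold P as core sensing area (True), the rest as edge.
    """
    pts = np.asarray(points_xy, dtype=float)
    ij = np.floor(pts / cell).astype(int)
    ij -= ij.min(axis=0)
    counts = np.zeros(ij.max(axis=0) + 1)
    for i, j in ij:
        counts[i, j] += 1
    density = counts / (cell * cell)
    return density >= density_threshold
```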
- the display device is controlled to display second identification information, where the second identification information is used to indicate the core sensing area in the first sensed area, and the first identification information is different from the second identification information.
- the core sensing area and other areas (ie edge areas) other than the core sensing area can be respectively marked on the display interface of the display device with different visual features.
- the core sensing area may be marked with a darker color and the edge areas may be marked with a lighter color.
- the core sensing area and the edge area can be marked on the display interface at the same time.
- only the core sensing area may be marked by default, and the edge area may be further marked upon receiving a user's display instruction.
- machine learning can be used to determine which areas have not been sensed at all (missed areas) and which areas are covered only by edge areas, where details are likely to be overlooked, and a prompt message can be output to the user.
- the movable platform further includes a second sensor for collecting second environmental information, and the sensing range of the second sensor is smaller than the sensing range of the first sensor.
- the sensing range of the second sensor when the movable platform is at the passed position can be acquired; the second sensed area is determined according to the passed position and the sensing range of the second sensor; and the display device is controlled to display third identification information, where the third identification information is used to represent the second sensed area.
- both the first sensor and the second sensor are vision sensors, and the field angle of the first sensor is larger than the field angle of the second sensor.
- the first sensor may be used to locate the approximate area where the target object is located, and then the second sensor may be used to perform a fine search for the target object in the approximate area.
- The implementation logic of optical zoom is shown in (8-a) in Figure 8.
- the logic of digital zoom is shown in (8-b) in Figure 8, which is to "cut" a certain part of the sensor image and enlarge it, but the details of the image do not change. But as far as the field of view of the zoom is concerned, whether it is optical zoom or digital zoom, it is equivalent to reducing the FOV of the observed object.
- FIG. 9 shows a case where the first sensed area and the second sensed area are displayed simultaneously; that is, after switching to zoom, the coverage area of the wide-angle lens before switching is still retained. The four corner coordinates of the second sensed area at this height/distance after zooming can then be calculated from the camera center, the zoomed field of view FOV_tele_heading1 * FOV_tele_side1, and the focal length f_tele1, and displayed within the first sensed area. If the focal length magnification is adjusted, for example from f_tele1 to f_tele2, the field of view and the marked zoom range change accordingly, and the marked result is refreshed.
- the zoom sensing range will be recalculated according to the angle and the new ranging value obtained by laser ranging.
- the range sensed by the second sensor will also be recalculated.
- the second sensed area before the zoom ratio is changed is shown as the gray rectangular box in the figure, and the second sensed area after the zoom ratio is changed is shown as the area enclosed by the dotted-line box in the figure.
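The recomputation after a focal-length change can be sketched as follows: the new FOV_tele and the ranging distance give the new footprint with the same projection geometry as before. A rectangle centered on the camera axis is assumed.

```python
import math

def zoom_footprint(distance_m, fov_heading_deg, fov_side_deg):
    """Half-extents of the second (zoom) sensed area at a given distance."""
    half_h = distance_m * math.tan(math.radians(fov_heading_deg) / 2)
    half_s = distance_m * math.tan(math.radians(fov_side_deg) / 2)
    return half_h, half_s  # the rectangle is 2*half_h by 2*half_s

# Zooming from f_tele1 to f_tele2 narrows the FOV, so the same call with the
# smaller angles yields the smaller dotted-line rectangle of FIG. 10.
```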
- the first sensed area and the second sensed area are marked only for a period of time after switching, and only the second sensed area is marked after said period of time.
- the first sensed area and the second sensed area may be marked with different visual characteristics.
- the sensed area of the second sensor may also include a historical sensed area and a current sensed area.
- the currently sensed area and the historically sensed area of the second sensor may be marked with different visual features. Referring to FIG. 11A , the historical sensed area of the second sensor may be marked with a lighter color, and the current sensed area of the second sensor may be marked with a darker color.
- the sensed area of the first sensor can also be marked. It can thus be determined which parts within the sensed area of the first sensor have already been carefully searched by the second sensor (that is, the historical sensed area of the second sensor), and the current position and angle of the second sensor can be determined and adjusted accordingly.
- an error prompt message may also be output.
- among the boundary of the above-mentioned first sensed area, the area within the boundary, the core sensing area, the edge area, the historical sensed area of the first sensor, the current sensed area of the first sensor, the historical sensed area of the second sensor, the current sensed area of the second sensor, and other boundaries or areas, some or all of the information can be marked with different visual features. The visual features used to mark the above various information may be set according to actual needs, which is not limited by the present disclosure.
- the above various information can also be marked with the same visual features. In order to avoid confusion of various types of information, different tag information may be displayed at different times.
- the current sensed area and the historical sensed area of the sensor with the smaller sensing range can also be displayed in real time within the sensed area of the sensor with the larger sensing range, so that staff have traces to follow when conducting detailed searches.
- an embodiment of the present disclosure also provides another method for processing data of a movable platform, where the movable platform is equipped with a first sensor and a second sensor, respectively used to sense first environmental information and second environmental information of the surrounding environment of the movable platform, and the sensing range of the second sensor is smaller than the sensing range of the first sensor. The method includes:
- Step 1201: When the currently working sensor is switched from the first sensor to the second sensor, obtain the sensed area of the first sensor and the sensed area of the second sensor at the switching time;
- Step 1202: Mark the sensed area of the first sensor and the sensed area of the second sensor on the display interface of the display device, so as to control the display device to display the first identification information and the third identification information, where the first identification information is used to mark the sensed area of the first sensor, and the third identification information is used to mark the sensed area of the second sensor.
- the first sensor and the second sensor may both be visual sensors.
- the solutions of the embodiments of the present disclosure may be used to search for a target area. Since the sensing range of the first sensor is relatively large, the first sensor may be used to roughly search the target area, so as to locate the approximate area where the target object is located. Then, the second sensor performs a fine search on the general area, so as to precisely locate the target object.
- due to the large difference between the FOVs of the two sensors, after switching it often cannot be determined which areas have been sensed and which have not, nor can the relative positional relationship between the sensed area of the second sensor and the sensed area of the first sensor be determined; the view has to be positioned by the experience of the staff after each switch, which is cumbersome to operate and has poor practicability.
- in this embodiment, by respectively marking the sensed areas of the two sensors on the display interface, it is convenient for the user to observe the sensed area of the first sensor and the sensed area of the second sensor through the display interface after switching sensors, and to determine which part of the sensed area of the first sensor the sensed area of the second sensor falls within, without manual positioning, which reduces operational complexity and improves practicability.
- An embodiment of the present disclosure also provides a device for processing data of a movable platform, the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform, the device includes one or more processors for performing the steps of:
- acquire the passed position of the movable platform in the work area; determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position; determine a first sensed area in the work area according to the sensing range of the first sensor; and control the display device to display first identification information, where the first identification information is used to identify the first sensed area in the work area.
- the passed position includes the current position of the movable platform; the sensing range of the first sensor when the movable platform is at the passed position includes the first sensing range of the first sensor when the movable platform is at the current position.
- the first sensor is a visual sensor; when determining, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position, the processor is specifically configured to: acquire the first field of view of the first sensor when the movable platform is at the current position; acquire the first attitude of the first sensor when the movable platform is at the current position; and determine the first sensing range based on the current position, the first field of view, and the first attitude.
- the movable platform includes a first attitude sensor installed on the first sensor for collecting the attitude of the first sensor; the first attitude is collected by the first attitude sensor when the movable platform is at the current position.
- the movable platform includes a second attitude sensor installed on the body of the movable platform for collecting the attitude of the movable platform; the first attitude is jointly determined based on the attitude information of the body of the movable platform collected by the second attitude sensor when the movable platform is at the current position, and the attitude conversion relationship between the body of the movable platform and the visual sensor.
- the passed position includes the historical position of the movable platform; the sensing range of the first sensor when the movable platform is at the passed position includes the second sensing range of the first sensor when the movable platform is at the historical position.
- the first sensor is a visual sensor; when determining, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position, the processor is specifically configured to: acquire the second field of view of the first sensor when the movable platform is at the historical position; acquire the second attitude of the visual sensor when the movable platform is at the historical position; and determine the second sensing range based on the historical position, the second field of view, and the second attitude.
- the movable platform includes a first attitude sensor installed on the first sensor for collecting the attitude of the first sensor; the second attitude is collected by the first attitude sensor when the movable platform is at the historical position.
- the movable platform includes a second attitude sensor installed on the body of the movable platform for collecting the attitude of the body of the movable platform; the second attitude is jointly determined by the attitude of the body collected by the second attitude sensor when the movable platform is at a historical position and the attitude conversion relationship between the movable platform and the first sensor.
- the display device is configured to display a map on a display interface, and when controlling the display device to display the first identification information, the processor is specifically configured to: mark the first sensed area on the map.
- the first identification information includes visual features.
- the visual features include at least any one of the following: color, brightness, transparency, and fill pattern.
- the processor is further configured to: determine a core sensing area, where the core sensing area is within the range of the first sensed area; and control the display device to display second identification information, where the second identification information is used to identify the core sensing area.
- when determining the core sensing area, the processor is specifically configured to: acquire the core sensing range of the first sensor when the movable platform is at the passed position, where the core sensing range is within the sensing range of the first sensor; and determine the core sensing area according to the passed position and the core sensing range.
- the first sensor is a visual sensor; when acquiring the core sensing range of the first sensor when the movable platform is at the passed position, the processor is specifically configured to: acquire a preset second field of view, where the second field of view is within the range of the first field of view of the visual sensor; acquire the attitude of the visual sensor when the movable platform is at the passed position; and determine the core sensing range based on the second field of view and the attitude of the visual sensor when the movable platform is at the passed position.
- when determining the core sensing area, the processor is specifically configured to: acquire an image captured when the first sensor is at the passed position; determine the image quality of the image; and determine the core sensing area according to the image quality.
- the image quality includes resolution, and the resolution corresponding to the core sensing area in the image is higher than the resolution corresponding to other areas in the image except the core sensing area.
- the image quality includes a degree of distortion, and the degree of distortion in the image corresponding to the core sensing area is lower than the degree of distortion in the image corresponding to other areas except the core sensing area.
- the image quality includes contrast, and the contrast of the image region corresponding to the core sensing area is higher than that of the other regions in the image except the core sensing area.
- the processor is further configured to: control the display device to display second identification information, where the second identification information is used to indicate the core sensing area in the first sensed area, and the first identification information is different from the second identification information.
- the movable platform further includes a second sensor for collecting second environmental information, and the sensing range of the second sensor is smaller than the sensing range of the first sensor; when the movable platform is at the passed position and the second sensor is working, the processor is further configured to: determine, according to the passed position, the sensing range of the second sensor when the movable platform is at the passed position; determine a second sensed area in the work area according to the passed position and the sensing range of the second sensor; and control the display of third identification information, where the third identification information is used to represent the second sensed area in the work area.
- the second sensed area is within the range of the first sensed area.
- the visual features of the third identification information are different from those of the first identification information.
- the second sensor is a visual sensor; when marking the second sensed area on the display interface of the display device, the processor is specifically configured to: acquire the third field of view of the second sensor; acquire the attitude of the second sensor when the movable platform is at the passed position; and determine the sensing range of the second sensor based on the passed position, the third field of view, and the attitude of the second sensor when the movable platform is at the passed position.
- An embodiment of the present disclosure also provides an apparatus for processing mobile platform data
- the movable platform is equipped with a first sensor for sensing first environmental information of the surrounding environment of the movable platform, and the device includes one or more processors configured to perform the following steps: acquire the passed position of the movable platform in the work area; determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position; determine a first sensed area in the work area according to the sensing range of the first sensor; and control the display device to display first identification information, where the first identification information is used to identify the first sensed area in the work area.
- Fig. 13 shows a schematic diagram of the hardware structure of a more specific device for processing mobile platform data provided by an embodiment of the present disclosure.
- the device may include: a processor 1301, a memory 1302, an input/output interface 1303, a communication interface 1304, and a bus 1305.
- the processor 1301 , the memory 1302 , the input/output interface 1303 and the communication interface 1304 are connected to each other within the device through the bus 1305 .
- the processor 1301 may be implemented by a general-purpose CPU (Central Processing Unit, central processing unit), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, and is used to execute related programs to realize the technical solutions provided by the embodiments of this specification.
- the memory 1302 can be implemented in the form of ROM (Read Only Memory, read-only memory), RAM (Random Access Memory, random access memory), static storage device, dynamic storage device, etc.
- the memory 1302 can store operating systems and other application programs. When implementing the technical solutions provided by the embodiments of this specification through software or firmware, the relevant program codes are stored in the memory 1302 and invoked by the processor 1301 for execution.
- the input/output interface 1303 is used to connect the input/output module to realize information input and output.
- the input/output module can be configured in the device as a component (not shown in the figure), or can be externally connected to the device to provide corresponding functions.
- the input device may include a keyboard, mouse, touch screen, microphone, various sensors, etc.
- the output device may include a display, a speaker, a vibrator, an indicator light, and the like.
- the communication interface 1304 is used to connect a communication module (not shown in the figure), so as to realize the communication interaction between the device and other devices.
- the communication module can realize communication through wired means (such as USB, network cable, etc.), and can also realize communication through wireless means (such as mobile network, WIFI, Bluetooth, etc.).
- Bus 1305 includes a path for transferring information between the various components of the device (eg, processor 1301, memory 1302, input/output interface 1303, and communication interface 1304).
- although the above device only shows the processor 1301, the memory 1302, the input/output interface 1303, the communication interface 1304, and the bus 1305, in a specific implementation process the device may also include other components.
- the above-mentioned device may only include components necessary to implement the solutions of the embodiments of this specification, and does not necessarily include all the components shown in the figure.
- an embodiment of the present disclosure also provides a mobile platform, which includes:
- the first sensor 1402 is configured to sense first environmental information of the surrounding environment of the movable platform
- one or more processors 1404, configured to acquire the passed position of the movable platform in the work area, determine, according to the passed position, the sensing range of the first sensor when the movable platform is at the passed position, and determine the first sensed area in the work area according to the sensing range of the first sensor;
- Antenna 1405, configured to send the control instruction received from the processor to the display device, so as to control the display device to display first identification information, where the first identification information is used to identify the The first sensed area.
- the movable platform in the embodiments of the present disclosure may also be a handheld device such as a mobile phone, a PDA, a camera, or a gimbal.
- the movable platform can also include a power system 1401, which is used to provide power for the movement of the movable platform; the movable platform may be a drone, an unmanned vehicle, an unmanned ship, a mobile robot, or other movable equipment.
- the first sensor 1402 may include but not limited to various types of sensors such as visual sensors, infrared sensors, radar sensors, and sound sensors.
- the positioning device 1403 can use Global Positioning System (Global Positioning System, GPS), Beidou Satellite Positioning System (BeiDou Navigation Satellite System, BDS), RTK positioning and other methods to position the movable platform.
- the antenna 1405 can be used to realize communication between the UAV and the ground station or terminal equipment.
- the display device can be set on the ground station or the terminal equipment; the terminal equipment can be a remote controller matched with the movable platform, or an intelligent terminal such as a mobile phone or a computer.
- the movable platform is equipped with a first sensor and a second sensor
- the sensing range of the second sensor is smaller than the sensing range of the first sensor
- when the currently working sensor is switched from the first sensor to the second sensor, the processor 1404 can obtain the sensed area of the first sensor and the sensed area of the second sensor at the switching moment, and send both sensed areas to the display device through the antenna 1405, so that the display device marks the sensed area of the first sensor and the sensed area of the second sensor on the display interface.
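To make this switching behaviour concrete, here is a minimal sketch, assuming each sensor object can report its own sensed area and that the antenna exposes a simple send method (all names are hypothetical, not part of the embodiment):

```python
def on_sensor_switch(first_sensor, second_sensor, passed_positions, antenna):
    """At the moment the working sensor changes from the first sensor to
    the second, capture both sensed areas and forward them for display
    (cf. processor 1404 and antenna 1405)."""
    # Each sensed area follows from the positions the platform has passed
    # through and that sensor's sensing range at those positions.
    first_area = first_sensor.sensed_area(passed_positions)
    second_area = second_sensor.sensed_area(passed_positions)
    # The display device marks both areas on its display interface.
    antenna.send({
        "first_sensed_area": first_area,
        "second_sensed_area": second_area,
    })
```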
- an embodiment of the present disclosure further provides a terminal device; the terminal device is communicatively connected to a movable platform, and the movable platform is equipped with a first sensor, which is used to sense first environmental information of the surrounding environment of the movable platform.
- the terminal device includes:
- the communication unit 1501, configured to acquire the positions that the movable platform passes through in the working area, and to acquire the sensing range of the first sensor when the movable platform is at those positions;
- the processor 1502, configured to control the communication unit to acquire, from the movable platform, the positions that the movable platform passes through in the working area;
- the display 1503 is configured to receive a control instruction from the processor, and display first identification information according to the control instruction, where the first identification information is used to identify the first sensed area in the working area.
- the movable platform is equipped with a first sensor and a second sensor
- the sensing range of the second sensor is smaller than the sensing range of the first sensor
- when the currently working sensor is switched from the first sensor to the second sensor, the communication unit 1501 is used to acquire the sensing range of the first sensor, the sensing range of the second sensor, and the positions passed by the movable platform at the switching moment.
- the processor 1502 is configured to determine the sensed area of the first sensor according to the sensing range of the first sensor at the positions passed by the movable platform at the switching moment, and to determine the sensed area of the second sensor according to the sensing range of the second sensor at those positions.
- the display 1503 is used to respectively mark the sensed area of the first sensor and the sensed area of the second sensor on the display interface.
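A sketch of how the terminal side might compute the two sensed areas, approximating each sensing range as a circle of fixed radius around every passed position; this geometric model and the shapely dependency are assumptions for illustration only:

```python
from shapely.geometry import Point
from shapely.ops import unary_union

def sensed_area(passed_positions, sensing_radius):
    """cf. processor 1502: model a sensor's sensed area as the union of
    circular footprints centred on each position the platform passed."""
    return unary_union(
        [Point(x, y).buffer(sensing_radius) for (x, y) in passed_positions]
    )

# At the switching moment, one area per sensor is computed from the same
# passed positions; the second sensor's smaller sensing range yields the
# smaller area. Both are then handed to the display 1503 for marking.
path = [(0.0, 0.0), (10.0, 0.0), (20.0, 0.0)]
first_area = sensed_area(path, sensing_radius=8.0)
second_area = sensed_area(path, sensing_radius=3.0)
```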
- An embodiment of the present disclosure further provides a computer-readable storage medium, on which a computer program is stored, and when the program is executed by a processor, the steps performed by the second processing unit in the method described in any of the preceding embodiments are implemented.
- Computer-readable media include both permanent and non-permanent, removable and non-removable media, and can implement information storage by any method or technology.
- Information may be computer readable instructions, data structures, modules of a program, or other data.
- Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
- computer-readable media exclude transitory computer-readable media, such as modulated data signals and carrier waves.
- a typical implementing device is a computer, which may take the form of a personal computer, a laptop computer, a cellular phone, a camera phone, a smart phone, a personal digital assistant, a media player, a navigation device, an e-mail device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Movable platform, method and apparatus for processing data of a movable platform, and terminal device. A first sensor is installed on the movable platform and is used to sense first environmental information of the environment around the movable platform. The method comprises: acquiring the positions that a movable platform passes through in a work area, and determining, according to those passed positions, a sensing range of the first sensor when the movable platform is at those positions (201); determining a first sensed area in the work area according to the sensing range of the first sensor (202); and controlling a display apparatus to display first identification information, the first identification information being used to identify the first sensed area in the work area (203).
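Read as an algorithm, steps 201-203 can be sketched as follows; this is a minimal illustration assuming the sensing range at each position is a circle sampled as a polygon, and all names are hypothetical:

```python
import math

def footprint(position, radius, n=16):
    # Approximate the sensing footprint at one passed position as an
    # n-sided polygon inscribed in a circle of the given radius.
    cx, cy = position
    return [(cx + radius * math.cos(2 * math.pi * k / n),
             cy + radius * math.sin(2 * math.pi * k / n)) for k in range(n)]

def first_sensed_area(passed_positions, sensing_range):
    # Step 201: the positions the platform passed through, plus the first
    # sensor's sensing range at each of them.
    # Step 202: the first sensed area is the collection of per-position
    # footprints falling inside the work area.
    return [footprint(p, sensing_range(p)) for p in passed_positions]

# Step 203: a display apparatus would then render identification
# information (e.g. shaded polygons) over the work-area map.
area = first_sensed_area([(0.0, 0.0), (5.0, 0.0)], sensing_range=lambda p: 4.0)
```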
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2021/127965 WO2023070667A1 (fr) | 2021-11-01 | 2021-11-01 | Plateforme mobile, procédé et appareil de traitement de données de plateforme mobile, et équipement terminal |
CN202180013840.XA CN115066663A (zh) | 2021-11-01 | 2021-11-01 | 可移动平台及用于处理其数据的方法和装置、终端设备 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2021/127965 WO2023070667A1 (fr) | 2021-11-01 | 2021-11-01 | Plateforme mobile, procédé et appareil de traitement de données de plateforme mobile, et équipement terminal |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023070667A1 (fr) | 2023-05-04 |
Family
ID=83196394
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/127965 WO2023070667A1 (fr) | 2021-11-01 | 2021-11-01 | Plateforme mobile, procédé et appareil de traitement de données de plateforme mobile, et équipement terminal |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN115066663A (fr) |
WO (1) | WO2023070667A1 (fr) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080063299A1 | 2006-08-08 | 2008-03-13 | Kokusai Kogyo Co., Ltd. | Method of producing and displaying an aerial photograph data set |
CN106197377A | 2016-06-30 | 2016-12-07 | 西安电子科技大学 | UAV ground-target surveillance and 2D/3D linked display system |
CN106887028A | 2017-01-19 | 2017-06-23 | 西安忠林世纪电子科技有限公司 | Method and system for displaying the coverage area of aerial photographs in real time |
CN109708636A | 2017-10-26 | 2019-05-03 | 广州极飞科技有限公司 | Navigation map configuration method, obstacle avoidance method and device, terminal, and unmanned aerial vehicle |
CN110069073A | 2018-11-30 | 2019-07-30 | 广州极飞科技有限公司 | Operation control method and device, and plant protection system |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107662707A | 2016-07-28 | 2018-02-06 | 深圳航天旭飞科技有限公司 | Pesticide-saving unmanned aerial vehicle |
CN108717301B | 2018-06-13 | 2022-02-15 | 仲恺农业工程学院 | GIS-based UAV plant protection system and method |
CN110186433B | 2019-03-27 | 2019-11-22 | 成都睿铂科技有限责任公司 | Aerial survey method and device capable of eliminating redundant aerial photographs |
KR102252060B1 | 2019-12-24 | 2021-05-14 | 한국항공우주연구원 | Method and apparatus for displaying spatiotemporal information of drone imagery based on GPS metadata |
- 2021-11-01 WO PCT/CN2021/127965 patent/WO2023070667A1/fr unknown
- 2021-11-01 CN CN202180013840.XA patent/CN115066663A/zh active Pending
Also Published As
Publication number | Publication date |
---|---|
CN115066663A (zh) | 2022-09-16 |
Similar Documents
Publication | Title |
---|---|
US12033388B2 | Positioning method, apparatus, device, and computer-readable storage medium |
CN112567201B | Distance measurement method and device |
EP3591490B1 | Obstacle avoidance method and device, and autonomous aerial vehicle |
EP2879371B1 | System for tracking an object marked by a tag device with a camera |
US20170200273A1 | System and Method for Fusing Outputs of Sensors Having Different Resolutions |
CN105606077A | Geodetic surveying system |
JP2018160228A | Route generation device, route control system, and route generation method |
JP4969053B2 | Portable terminal device and display method |
KR20180131033A | Apparatus and method for calibrating a camera and a radar |
CN109974713B | Navigation method and system based on surface feature groups |
CN107466384A | Target tracking method and device |
US20200217665A1 | Mobile platform, image capture path generation method, program, and recording medium |
CN116817929B | Method and system for simultaneous localization of multiple ground-plane targets by an unmanned aerial vehicle |
WO2023070667A1 | Plateforme mobile, procédé et appareil de traitement de données de plateforme mobile, et équipement terminal |
JP5514062B2 | Electronic device, method for displaying a captured image with information, and program |
Hao et al. | Assessment of an effective range of detecting intruder aerial drone using onboard EO-sensor |
CN111491154A | Detection and ranging based on one or more single-field-of-view frames |
US20230177781A1 | Information processing apparatus, information processing method, and information processing program |
US11415990B2 | Optical object tracking on focal plane with dynamic focal length |
WO2021212499A1 | Target calibration method, apparatus, and system, and remote control terminal of a movable platform |
CN109309709B | Control method for remotely controlling an unmanned device |
CN113646606A | Control method, device, unmanned aerial vehicle, and storage medium |
JP2021103410A | Mobile body and imaging system |
CN109343561B | Operation method for displaying and operating an unmanned device on an electronic device |
TWI849522B | Ship navigation display system and ship navigation display method |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21962017; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |