WO2024027082A1 - Aircraft hangar entry/exit collision early-warning method and apparatus, device and medium - Google Patents

Aircraft hangar entry/exit collision early-warning method and apparatus, device and medium

Info

Publication number
WO2024027082A1
Authority
WO
WIPO (PCT)
Prior art keywords
target
aircraft
physical
target aircraft
model
Prior art date
Application number
PCT/CN2022/141372
Other languages
English (en)
Chinese (zh)
Inventor
陈金
马博闻
张渊佳
王天歌
李响
那瀚文
徐洋
Original Assignee
天翼云科技有限公司
Priority date
Filing date
Publication date
Application filed by 天翼云科技有限公司
Publication of WO2024027082A1

Classifications

    • G PHYSICS
      • G06 COMPUTING; CALCULATING OR COUNTING
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T 7/00 Image analysis
            • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
            • G06T 7/20 Analysis of motion
              • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
            • G06T 7/70 Determining position or orientation of objects or cameras
              • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
            • G06T 7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
          • G06T 2207/00 Indexing scheme for image analysis or image enhancement
            • G06T 2207/10 Image acquisition modality
              • G06T 2207/10028 Range image; Depth image; 3D point clouds
            • G06T 2207/30 Subject of image; Context of image processing
              • G06T 2207/30241 Trajectory
              • G06T 2207/30244 Camera pose
        • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 10/00 Arrangements for image or video recognition or understanding
            • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
              • G06V 10/764 Arrangements for image or video recognition or understanding using pattern recognition or machine learning, using classification, e.g. of video objects

Definitions

  • This application relates to the field of smart civil aviation technology, and in particular to a collision early-warning method, apparatus, device and medium for aircraft entering and exiting a hangar.
  • The safety status of an aircraft needs to be inspected after every landing, and for such inspection the aircraft must be pushed into a hangar.
  • While the aircraft is entering or exiting the hangar, it can easily come into contact with the surrounding environment and be damaged. The clearances between aircraft inside the hangar, between an aircraft and the maintenance workbenches, and between an aircraft and the hangar crane are small, and the tractor driver has large blind spots.
  • In the related art, multiple ground staff coordinate manual observation with intercom communication to provide anti-collision protection and berth guidance: several marshallers guide the aircraft, observe its surroundings and notify the pilot by radio intercom.
  • Manual observation from the ground suffers from parallax and is prone to misjudgment, which leads to aircraft scratch accidents and economic losses, consumes a great deal of manpower and reduces efficiency.
  • Embodiments of the present application provide a collision early-warning method, apparatus, device and medium for aircraft entering and exiting a hangar, to solve the problems of the related art: multi-person coordinated communication and guidance consumes considerable manpower and material resources, is inefficient, and, because of blind spots, is prone to collisions caused by guidance errors.
  • This application provides a collision early-warning method for aircraft entering and exiting a hangar. The method includes:
  • during the process of the target aircraft entering or exiting the hangar, acquiring a first image of the target hangar in the physical scene in real time, and determining the physical position information of the target aircraft and of each target object based on the first image;
  • determining, according to the physical position information of the target aircraft and of each target object, the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in a digital twin scene model;
  • acquiring in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object, and outputting first warning prompt information when the minimum physical distance is less than a preset distance threshold.
  • Optionally, determining the physical position information of the target aircraft and of each target object based on the first image includes: identifying the target aircraft and each target object from the first image through a target recognition algorithm to determine their physical two-dimensional coordinate information, and acquiring three-dimensional point cloud data collected by three-dimensional laser devices arranged in the physical scene to determine their physical depth coordinate information; the physical position information includes the physical two-dimensional coordinate information and the physical depth coordinate information.
  • Optionally, acquiring in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object includes: determining in real time a first minimum circumscribed bounding box of the model corresponding to the target aircraft and a second minimum circumscribed bounding box of the model corresponding to each target object, and acquiring the physical distance between the first minimum circumscribed bounding box and each second minimum circumscribed bounding box.
  • Optionally, acquiring in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object includes: determining in real time, in the digital twin scene model, the simulated distance between the model corresponding to the target aircraft and the model corresponding to each target object, and determining the physical distance corresponding to the simulated distance according to the simulated distance and the mapping relationship between the digital twin scene model and the physical scene; or acquiring in real time the distance between the target aircraft and each target object collected by one-dimensional ranging devices arranged in the physical scene, and taking it as the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object.
  • Optionally, determining the simulated position information of the model corresponding to the target aircraft in the digital twin scene model includes: determining the physical angle-attitude information of the target aircraft according to the three-dimensional point cloud data, and determining the simulated position information and simulated angle-attitude information of the model corresponding to the target aircraft in the digital twin scene model according to the physical position information and the physical angle-attitude information.
  • Optionally, the method further includes: determining the travel trajectory information of the target aircraft entering or exiting the hangar according to the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model, and the simulated angle-attitude information of the model corresponding to the target aircraft.
  • Optionally, the method further includes: when the target aircraft enters the maintenance stand, determining, based on the digital twin scene model, whether the model corresponding to the target aircraft enters the model corresponding to the maintenance stand centrally; if not, determining the deviation direction according to the simulated position information of the model corresponding to the target aircraft and the simulated position information of the model corresponding to the maintenance stand, and outputting second warning prompt information carrying the deviation direction; if so, acquiring a second image of the maintenance stand and, based on the second image and a pre-trained collision recognition model, making a second judgment on whether there is a collision risk, outputting third warning prompt information if there is a collision risk, and instructing the target aircraft to enter the maintenance stand if there is none.
  • This application further provides a collision early-warning device for aircraft entering and exiting a hangar, the device including:
  • a first determination module, configured to acquire a first image of the target hangar in the physical scene in real time during the process of the target aircraft entering or exiting the hangar, and to determine the physical position information of the target aircraft and of each target object based on the first image;
  • a second determination module, configured to determine the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model according to the physical position information of the target aircraft and of each target object;
  • an early-warning module, configured to acquire in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object, and to output first warning prompt information when the minimum physical distance is less than the preset distance threshold.
  • Optionally, the first determination module is specifically configured to identify the target aircraft and each target object from the first image through a target recognition algorithm and determine their physical two-dimensional coordinate information; and to acquire the three-dimensional point cloud data collected by the three-dimensional laser devices arranged in the physical scene and determine the physical depth coordinate information of the target aircraft and of each target object from that data; the physical position information includes the physical two-dimensional coordinate information and the physical depth coordinate information.
  • Optionally, the early-warning module is specifically configured to determine in real time the first minimum circumscribed bounding box of the model corresponding to the target aircraft and the second minimum circumscribed bounding box of the model corresponding to each target object, and to acquire the physical distance between the first minimum circumscribed bounding box and each second minimum circumscribed bounding box.
  • Optionally, the early-warning module is specifically configured to determine in real time, in the digital twin scene model, the simulated distance between the model corresponding to the target aircraft and the model corresponding to each target object, and to determine the physical distance corresponding to the simulated distance according to the simulated distance and the mapping relationship between the digital twin scene model and the physical scene; or to acquire in real time the distance between the target aircraft and each target object collected by the one-dimensional ranging devices arranged in the physical scene, and to take it as the corresponding physical distance.
  • Optionally, the second determination module is specifically configured to determine the physical angle-attitude information of the target aircraft according to the three-dimensional point cloud data, and to determine the simulated position information and simulated angle-attitude information of the model corresponding to the target aircraft in the digital twin scene model according to the physical position information and the physical angle-attitude information.
  • Optionally, the device further includes a third determination module, configured to determine the travel trajectory information of the target aircraft entering or exiting the hangar according to the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model, and the simulated angle-attitude information of the model corresponding to the target aircraft.
  • Optionally, the early-warning module is further configured to: when the target aircraft enters the maintenance stand, determine, based on the digital twin scene model, whether the model corresponding to the target aircraft enters the model corresponding to the maintenance stand centrally; if not, determine the deviation direction according to the simulated position information of the model corresponding to the target aircraft and the simulated position information of the model corresponding to the maintenance stand, and output second warning prompt information carrying the deviation direction; if so, acquire a second image of the maintenance stand and, based on the second image and the pre-trained collision recognition model, make a second judgment on whether there is a collision risk, output third warning prompt information if there is a collision risk, and instruct the target aircraft to enter the maintenance stand if there is none.
  • This application further provides an electronic device, including a processor, a communication interface, a memory and a communication bus, where the processor, the communication interface and the memory communicate with one another through the communication bus; the memory is configured to store a computer program; and the processor is configured to implement any of the above method steps when executing the program stored in the memory.
  • This application further provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements any of the above method steps.
  • This application provides a collision early-warning method, apparatus, device and medium for aircraft entering and exiting a hangar. The method includes: during the process of the target aircraft entering or exiting the hangar, acquiring a first image of the target hangar in the physical scene in real time and determining the physical position information of the target aircraft and of each target object based on the first image; determining the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model according to that physical position information; and acquiring in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object, and outputting first warning prompt information when the minimum physical distance is less than the preset distance threshold.
  • Compared with the scheme of collaborative communication and guidance by ground staff, this scheme of achieving anti-collision for aircraft entering and exiting the hangar based on a digital twin scene model avoids collisions caused by guidance errors due to blind spots, improves the safety and reliability of collision avoidance, does not require many ground staff to work together, reduces the consumption of manpower and material resources, and at the same time improves the efficiency of the target aircraft entering and exiting the hangar.
  • Figure 1 is a schematic diagram of the aircraft hangar entry/exit collision warning process provided by this application.
  • Figure 2 is the architecture diagram of the aircraft hangar entry/exit collision warning provided by this application.
  • Figure 3 is a visual schematic diagram of the sensor arrangement provided by this application.
  • Figure 4 is a flow chart of the aircraft hangar entry/exit collision warning provided by this application.
  • Figure 5 is a schematic diagram of the sensor deployment at the maintenance stand provided by this application.
  • Figure 6 is a schematic structural diagram of the aircraft hangar entry/exit collision warning device provided by this application.
  • Figure 7 is a schematic structural diagram of an electronic device provided by this application.
  • Figure 1 is a schematic diagram of the aircraft hangar entry/exit collision warning process provided by this application. The process includes the following steps.
  • S101: during the process of the target aircraft entering or exiting the hangar, acquire a first image of the target hangar in the physical scene in real time, and determine the physical position information of the target aircraft and of each target object based on the first image.
  • S102: based on the physical position information of the target aircraft and of each target object, determine the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model.
  • S103: acquire in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object, and output first warning prompt information when the minimum physical distance is less than the preset distance threshold.
  • The aircraft hangar entry/exit collision warning method provided by this application is applied to an electronic device; the electronic device can be a PC, a tablet or a similar terminal, or a server.
  • At least one image acquisition device is arranged in the physical scene, and the combined field of view of the images collected by the at least one image acquisition device covers the entire physical scene of aircraft entering and exiting the hangar. The at least one image acquisition device captures images of the entire physical scene in real time; such an image is called a first image, and the aircraft that is entering or exiting the hangar is called the target aircraft.
  • The electronic device is connected to each image acquisition device and acquires in real time the first images they collect. Through intelligent analysis of the first image, the physical position information of the target aircraft and of each target object is determined.
  • Physical position information refers to position information in the physical scene of aircraft entering and exiting the hangar. The target objects are, for example, static objects inside the hangar, various types of aircraft parked in the hangar, movable obstacles, personnel, sensors, aircraft maintenance stands, and the like.
  • Intelligent analysis of the first image involves, for example, using a target recognition algorithm to identify the target aircraft and each target object in the first image: a target recognition model is trained in advance, the first image is input into the target recognition model, and the target aircraft and each target object in the first image are obtained.
  • The target recognition model is trained on sample images collected by the image acquisition devices in the physical scene of the target hangar, annotated with the position information of each sample object and sample aircraft. With it, the position information of the target aircraft and of each target object in the image can be determined.
  • The target recognition model identifies, on the one hand, the position information of each target object and, on the other hand, its category information, for example whether the target object is an aircraft, a sensor or a static object.
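  • As an illustration only (the application does not fix a particular recognition algorithm or output format), the following sketch shows how the output of such a detector might be represented and reduced to 2D image coordinates; the Detection class, the detect_objects placeholder and the category names are assumptions.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple
import numpy as np

@dataclass
class Detection:
    category: str                                 # e.g. "aircraft", "person", "sensor", "static_object"
    bbox_px: Tuple[float, float, float, float]    # (u_min, v_min, u_max, v_max) in image pixels
    score: float

def detect_objects(first_image: np.ndarray) -> List[Detection]:
    """Placeholder for the pre-trained target recognition model.

    In practice this would run the trained detector on the first image and
    return the target aircraft and every other target object together with
    its pixel position and category.
    """
    raise NotImplementedError("plug in the trained target recognition model here")

def image_centers(detections: List[Detection]) -> Dict[Tuple[str, int], Tuple[float, float]]:
    """Reduce each detection to a single 2D pixel coordinate (its bbox centre)."""
    centers = {}
    for i, det in enumerate(detections):
        u_min, v_min, u_max, v_max = det.bbox_px
        centers[(det.category, i)] = ((u_min + u_max) / 2.0, (v_min + v_max) / 2.0)
    return centers
```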
  • In this application, digital twin technology is used in advance to establish a digital twin scene model corresponding to the target hangar, and a mapping relationship exists between the digital twin scene model and the physical scene.
  • Based on the physical position information of the target aircraft and of each target object and this mapping relationship, the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model is determined, and the corresponding models are placed in the digital twin scene model at the simulated positions.
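  • As a minimal sketch of how such a mapping might be applied, assuming, purely for illustration, that the twin scene is related to the physical hangar frame by a uniform scale, a rotation and a translation; the values below are placeholders, not calibration results.

```python
import numpy as np

# Assumed mapping between the physical hangar frame (metres) and the digital
# twin scene frame: a uniform scale, a rotation and a translation.
SCALE = 1.0
R_TWIN_FROM_PHYS = np.eye(3)                    # rotation part of the mapping
T_TWIN_FROM_PHYS = np.array([0.0, 0.0, 0.0])    # translation part

def physical_to_simulated(p_phys: np.ndarray) -> np.ndarray:
    """Map a physical 3D position into the digital twin scene model."""
    return SCALE * (R_TWIN_FROM_PHYS @ p_phys) + T_TWIN_FROM_PHYS

# Example: place the target aircraft model at the aircraft's physical position.
aircraft_phys = np.array([12.3, 4.5, 0.0])      # x, y in metres, z height
aircraft_sim = physical_to_simulated(aircraft_phys)
```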
  • The physical distance between the model corresponding to the target aircraft and the model corresponding to each target object is then acquired in real time, and when the minimum physical distance is less than the preset distance threshold, the first warning prompt information is output. The preset distance threshold can be set as needed, for example 10 cm or 20 cm.
  • The digital twin scene model and the physical distances between the model corresponding to the target aircraft and the models corresponding to the target objects can be displayed in real time on the display screen of the electronic device.
  • The first warning prompt information may be sound information, text information or warning picture information, or any combination of these. Sound information can be emitted by the electronic device itself, or an alarm arranged in the scene can be controlled by the electronic device to emit it; text information or warning picture information can be shown on the display of the electronic device.
  • In summary, the physical position information of the target aircraft and of each target object is determined, a digital twin scene model is established, the simulated position information of the corresponding models in the digital twin scene model is determined, collision detection is carried out on the model corresponding to the target aircraft and the models corresponding to the target objects, and the first warning prompt information is output when a collision risk is detected.
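  • The skeleton below outlines this S101-S103 cycle. It is a sketch only: the five callables are injection points for the concrete steps described above and are not defined by the application itself, and the 10 Hz rate and 20 cm threshold are assumed examples.

```python
import time

DISTANCE_THRESHOLD_M = 0.2   # preset distance threshold, e.g. 20 cm (illustrative)

def warning_loop(get_first_image, locate_objects, to_twin, pairwise_distances, alert):
    """Skeleton of the S101-S103 cycle with pluggable steps."""
    while True:
        image = get_first_image()                     # S101: first image of the hangar
        physical_positions = locate_objects(image)    # S101: physical position info
        simulated = {name: to_twin(pos)               # S102: simulated positions in the twin
                     for name, pos in physical_positions.items()}
        distances = pairwise_distances(simulated)     # S103: aircraft-to-object distances
        if distances and min(distances.values()) < DISTANCE_THRESHOLD_M:
            alert("first warning: minimum distance below threshold", distances)
        time.sleep(0.1)                               # run at ~10 Hz (illustrative)
```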
  • Compared with the scheme of collaborative communication and guidance by ground staff, this scheme of achieving anti-collision based on the digital twin scene model avoids collisions caused by guidance errors due to blind spots, improves the safety and reliability of collision avoidance, does not require many ground staff to work together, reduces the consumption of manpower and material resources, and improves the efficiency of the target aircraft entering and exiting the hangar.
  • In some embodiments, determining the physical position information of the target aircraft and of each target object based on the first image proceeds as follows. The target aircraft and each target object are identified through the target recognition algorithm, which yields their coordinate information in the image; this coordinate information is two-dimensional.
  • Physical coordinate information is then obtained from the image coordinates and the calibration information of the image acquisition device, and is likewise two-dimensional (the physical two-dimensional coordinate information).
  • At least one three-dimensional laser device is arranged in the physical scene, and the point cloud data collected by the three-dimensional laser devices covers the entire physical scene of aircraft entering and exiting the hangar; the physical depth coordinate information of the target aircraft and of each target object is determined from this point cloud data.
  • The physical two-dimensional coordinate information and the physical depth coordinate information together constitute the physical position information, which makes the determined physical position information more accurate and in turn makes the collision warning for aircraft entering and exiting the hangar more accurate.
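  • A minimal sketch of this two-dimensional-plus-depth fusion follows. It assumes the camera has been calibrated against the hangar floor (a 3x3 homography maps pixels to metric floor coordinates) and uses a simple nearest-points heuristic to read a height from the point cloud; neither assumption is prescribed by the application.

```python
import numpy as np

def pixel_to_ground_xy(uv: np.ndarray, H_ground_from_image: np.ndarray) -> np.ndarray:
    """Map a pixel (u, v) to physical ground-plane coordinates (x, y) in metres."""
    u, v = uv
    x, y, w = H_ground_from_image @ np.array([u, v, 1.0])
    return np.array([x / w, y / w])

def depth_from_point_cloud(xy: np.ndarray, cloud: np.ndarray, radius: float = 0.5) -> float:
    """Look up a depth (height) coordinate for a planar position.

    `cloud` is an (N, 3) array of laser points in the same physical frame.
    The height is taken as the highest point within `radius` metres of (x, y),
    which is a simple heuristic rather than the patent's prescribed method.
    """
    d = np.linalg.norm(cloud[:, :2] - xy, axis=1)
    near = cloud[d < radius]
    return float(near[:, 2].max()) if len(near) else 0.0
```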
  • In some embodiments, acquiring in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object proceeds as follows. The minimum circumscribed cube of the model corresponding to the target aircraft is determined in real time and used as the first minimum circumscribed bounding box, and the minimum circumscribed cube of the model corresponding to each target object is determined in real time and used as that object's second minimum circumscribed bounding box.
  • The first minimum circumscribed bounding box and each second minimum circumscribed bounding box are used to calculate the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object; the distance between the first bounding box and each second bounding box is taken as the corresponding physical distance.
  • On the display, either the bounding boxes together with the models, or only the model corresponding to the target aircraft and the models corresponding to the target objects, can be shown. The electronic device can be configured with different display functions and display the corresponding content according to the selected function, to meet different user needs.
  • Basing the collision warning on the physical distance between the first minimum circumscribed bounding box and each second minimum circumscribed bounding box further ensures the accuracy of the warning for aircraft entering and exiting the hangar.
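  • A minimal sketch of the bounding-box distance computation, assuming the models are available as vertex arrays in a common metric frame:

```python
import numpy as np

def aabb(vertices: np.ndarray) -> tuple:
    """Axis-aligned minimum bounding box of a model given as (N, 3) vertices."""
    return vertices.min(axis=0), vertices.max(axis=0)

def aabb_distance(box_a: tuple, box_b: tuple) -> float:
    """Smallest Euclidean distance between two axis-aligned boxes (0 if they overlap)."""
    min_a, max_a = box_a
    min_b, max_b = box_b
    # Per-axis separation; zero where the boxes overlap on that axis.
    gap = np.maximum(0.0, np.maximum(min_a - max_b, min_b - max_a))
    return float(np.linalg.norm(gap))

# Example with two toy boxes 1 m apart along x.
aircraft_box = aabb(np.array([[0, 0, 0], [10, 8, 5]], dtype=float))
crane_box = aabb(np.array([[11, 0, 0], [12, 2, 6]], dtype=float))
print(aabb_distance(aircraft_box, crane_box))   # -> 1.0
```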
  • This application provides the following two ways of acquiring the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object.
  • Method 1: based on the digital twin scene model, the simulated distance between the model corresponding to the target aircraft and the model corresponding to each target object is determined in real time, and the physical distance corresponding to the simulated distance is then determined according to the mapping relationship between the digital twin scene model and the physical scene of the target hangar.
  • Method 2: multiple one-dimensional ranging devices are arranged in the physical scene; the distances between the target aircraft and each target object collected by these devices are used directly as the physical distances and displayed in the digital twin scene model.
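  • A small sketch of the two ways of obtaining the physical distance, under the simplifying assumption that the twin scene is a uniformly scaled copy of the physical hangar (the scale value is a placeholder):

```python
import numpy as np

# Method 1 (illustrative): a distance measured between two model positions in
# the twin converts to a physical distance with a single scale factor.
METRES_PER_TWIN_UNIT = 1.0

def physical_distance_from_twin(p_sim_a: np.ndarray, p_sim_b: np.ndarray) -> float:
    d_sim = float(np.linalg.norm(p_sim_a - p_sim_b))   # simulated distance in the twin
    return d_sim * METRES_PER_TWIN_UNIT                 # mapped back to metres

# Method 2 (illustrative): a one-dimensional rangefinder already reports a
# physical distance, so the reading is used directly and merely attached to
# the corresponding aircraft/object model pair in the twin.
def physical_distance_from_rangefinder(reading_m: float) -> float:
    return reading_m
```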
  • In some embodiments, determining the simulated position information of the model corresponding to the target aircraft in the digital twin scene model according to the physical position information of the target aircraft proceeds as follows. The physical angle-attitude information of the target aircraft is determined from the three-dimensional point cloud data; it includes horizontal angle (heading) information, pitch angle information and roll angle information.
  • According to the physical position information and the physical angle-attitude information, the simulated position information and the simulated angle-attitude information of the model corresponding to the target aircraft in the digital twin scene model are determined. Combining simulated position information with simulated angle-attitude information makes the pose of the model corresponding to the target aircraft more accurate, which in turn makes the collision warning for aircraft entering and exiting the hangar more accurate.
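  • As an illustration of how angle-attitude information could be read from a segmented aircraft point cloud (a principal-axis heuristic, not the method prescribed by the application):

```python
import numpy as np

def attitude_from_points(aircraft_points: np.ndarray) -> dict:
    """Estimate horizontal angle, pitch and roll from an aircraft point cloud.

    Heuristic illustration only: the principal axis of the segmented points is
    taken as the fuselage direction, from which yaw (horizontal angle) and
    pitch are read; roll is derived from the second axis.
    """
    centred = aircraft_points - aircraft_points.mean(axis=0)
    # Principal axes of the point distribution (rows of vt are directions).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    fuselage, lateral = vt[0], vt[1]
    yaw = np.arctan2(fuselage[1], fuselage[0])
    pitch = np.arctan2(fuselage[2], np.linalg.norm(fuselage[:2]))
    roll = np.arctan2(lateral[2], np.linalg.norm(lateral[:2]))
    return {"yaw_rad": float(yaw), "pitch_rad": float(pitch), "roll_rad": float(roll)}
```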
  • In some embodiments, the method further includes determining the travel trajectory information of the target aircraft entering or exiting the hangar according to the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model, and the simulated angle-attitude information of the model corresponding to the target aircraft.
  • In the digital twin scene model, the regions that can fully accommodate the model corresponding to the target aircraft are determined; a trajectory through such a region is one along which the target aircraft can move safely. Multiple trajectories that ensure safe travel can be determined and taken as candidate trajectories, and the target trajectory is then selected from the candidates according to set filtering rules as the trajectory information for the target aircraft entering or exiting the hangar.
  • The set filtering rules are, for example, the shortest path length, the largest surrounding clearance (that is, the highest safety margin), and the most convenient access to the hangar entrance and exit.
  • Specifically, the parking position is used as the end point of the trajectory and the hangar entrance/exit position as the start point; each candidate trajectory is determined between them, and the target trajectory is selected from the candidates according to the set filtering rules.
  • The trajectory information of the target aircraft entering or exiting the hangar can be displayed on the display screen of the electronic device to guide the staff in moving the target aircraft.
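  • A minimal sketch of selecting the target trajectory from the candidate trajectories according to such filtering rules; the candidate attributes and scoring weights are illustrative assumptions, since the application only names the criteria:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CandidateTrajectory:
    length_m: float          # total travel distance from entrance to parking position
    min_clearance_m: float   # smallest clearance to any obstacle along the path
    turns: int               # number of direction changes (proxy for convenience)

def select_trajectory(candidates: List[CandidateTrajectory]) -> CandidateTrajectory:
    """Pick the target trajectory: shorter, better cleared, simpler paths score higher."""
    def score(c: CandidateTrajectory) -> float:
        return -1.0 * c.length_m + 5.0 * c.min_clearance_m - 0.5 * c.turns
    return max(candidates, key=score)
```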
  • In some embodiments, the method further addresses the maintenance stand, where the risk of aircraft collision is greatest.
  • When the target aircraft enters the maintenance stand, the central axis of the model corresponding to the target aircraft and the central axis of the model corresponding to the maintenance stand are determined based on the digital twin scene model, and it is judged whether the two central axes match. Matching means that the central axes coincide completely or that their deviation is within an allowed range; the allowed range in the digital twin scene model is, for example, 1 mm or 2 mm.
  • If the central axis of the model corresponding to the target aircraft coincides with that of the maintenance stand, it is determined that the model corresponding to the target aircraft enters the model corresponding to the maintenance stand centrally; otherwise, it is determined that it does not enter centrally.
  • If the aircraft does not enter centrally, the deviation direction is determined according to the simulated position information of the model corresponding to the target aircraft and the simulated position information of the model corresponding to the maintenance stand, for example from the relative offset of the two central axes, and second warning prompt information carrying the deviation direction is output. The second warning not only reminds the staff that there is a collision risk but also prompts them to adjust the parking position of the target aircraft according to the deviation direction.
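  • A minimal sketch of the centring check and deviation-direction computation, assuming both central axes are available in the twin frame and using an illustrative tolerance:

```python
import numpy as np

AXIS_TOLERANCE = 0.002   # allowable central-axis deviation in twin units (illustrative)

def centring_check(aircraft_axis_point: np.ndarray,
                   stand_axis_point: np.ndarray,
                   stand_axis_dir: np.ndarray) -> dict:
    """Check whether the aircraft model enters the maintenance stand model centrally.

    Each axis is described by a point on it; the stand axis direction is a unit
    vector.  The lateral offset of the aircraft axis from the stand axis decides
    whether the aircraft is centred and, if not, in which direction it deviates.
    """
    offset = aircraft_axis_point - stand_axis_point
    lateral = offset - np.dot(offset, stand_axis_dir) * stand_axis_dir
    deviation = float(np.linalg.norm(lateral))
    if deviation <= AXIS_TOLERANCE:
        return {"centred": True}
    # Sign of the vertical component of the cross product tells left from right.
    side = "left" if np.cross(stand_axis_dir, lateral)[2] > 0 else "right"
    return {"centred": False, "deviation": deviation, "direction": side}
```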
  • If the aircraft does enter centrally, multiple image acquisition devices installed at the maintenance stand acquire images, which are taken as second images. The electronic device stores a pre-trained collision recognition model, trained on sample images collected by these devices together with annotations of whether a collision occurs.
  • Based on the second image and the pre-trained collision recognition model, a second judgment is made as to whether there is a collision risk. If there is a collision risk, third warning prompt information is output; if there is no collision risk, the target aircraft is instructed to enter the maintenance stand.
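  • A minimal sketch of this second judgment; the collision_risk placeholder stands in for the pre-trained collision recognition model, and the 0.5 decision threshold is an assumption:

```python
import numpy as np

def collision_risk(second_image: np.ndarray) -> float:
    """Placeholder for the pre-trained collision recognition model.

    Returns a probability that the configuration at the maintenance stand will
    lead to a collision; the real model would be trained on annotated images
    collected by the cameras installed at the stand.
    """
    raise NotImplementedError("plug in the trained collision recognition model here")

def second_judgment(second_image: np.ndarray, threshold: float = 0.5) -> str:
    """Second judgment at the maintenance stand: warn or clear the aircraft."""
    if collision_risk(second_image) >= threshold:
        return "third warning: collision risk at the maintenance stand"
    return "no collision risk: target aircraft may enter the maintenance stand"
```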
  • The method proposed in this application monitors in real time whether a collision will occur while an aircraft enters or exits the hangar and provides timely warnings when there is a collision risk, so as to reduce accidents, improve efficiency, further improve service levels and the operating experience, and enhance the overall operating efficiency and intelligence level of the maintenance company.
  • This application uses digital twin technology to provide real-time anti-collision warning while the aircraft enters or exits the hangar. Compared with the traditional multi-person collaborative guidance method, it saves a great deal of manpower and improves safety. Because pre-built digital models are used and twin models are generated dynamically from the intelligent analysis results of the vision algorithms, the computational cost is greatly reduced compared with technical routes based purely on three-dimensional vision algorithms.
  • This application can also simulate and rehearse the hangar entry and exit process in advance, recommend feasible entry and exit paths, and display them through twin animations; the actual route of the aircraft can likewise be recorded in the twin system to facilitate tracing safety responsibility afterwards.
  • Figure 2 is the architecture diagram of the aircraft hangar entry/exit collision warning provided by this application. The specific steps are as follows.
  • Digital twin modelling: models of the hangar scene, personnel models, sensor models and the like are used to achieve a one-to-one restoration of the aircraft's hangar entry process, and real-time rendering technologies such as ray tracing are used to achieve a realistic display effect.
  • Multi-dimensional perception is the data basis on which this application operates. The sensor data include, but are not limited to, 3D laser scans and 2D high-definition images of the hangar interior before entry, 3D laser scans of the local scene at the gate and the iron frame of the maintenance stand during entry, one-dimensional ultrasonic distance detection, and close-range three-dimensional scanning and one-dimensional distance detection of the key protected areas of the wings and tail. These data are used for subsequent collision-avoidance identification and decision-making.
  • The data obtained from the various sensors are analysed intelligently by algorithms, for example by analysing the three-dimensional point cloud data of the lidar and the two-dimensional image information of the high-definition cameras to parse the scene and identify the various types of targets in it, such as aircraft, staff and obstacles. More fine-grained target recognition is performed for potential obstacles, and obstacles are classified by constructing global features for each target.
  • Detecting the position and attitude of the aircraft supports the subsequent collision analysis.
  • For each recognised target, a minimum outer bounding box (such as an AABB bounding box) is generated.
  • The corresponding model is selected from the pre-built digital model library, the object model is generated in real time in the digital twin scene, and its state is updated in real time from its dynamic pose information.
  • Pre-collision detection analysis is performed in the digital twin space to automatically calculate the best route for the aircraft to enter the hangar, which is displayed in the form of a route prediction.
  • Real-time collision detection is performed, the risk along the actual path is analysed, collision warnings are issued, and corresponding adjustment strategies are given.
  • The aircraft status information is updated in real time over WebSocket, a full-duplex communication protocol based on TCP, and combined with real-time audio/video communication technology to achieve a 1:1 restoration of the actual entry process with low-latency animation and playback of the surveillance video. The multi-dimensional sensor devices in the hangar (including but not limited to three-dimensional lidar, two-dimensional high-definition cameras and one-dimensional ultrasonic rangefinders) can be reproduced in the digital twin, visually displaying information such as their working range, performance and model data, to help the relevant staff understand what the various sensors do in the actual working scenario.
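  • As an illustration only, the sketch below pushes aircraft status over a WebSocket using the third-party Python websockets package; the package choice, endpoint, message schema and update rate are all assumptions, since the application only names the protocol.

```python
import asyncio
import json
import websockets   # third-party package; assumed here, not named by the application

async def push_status(uri: str, get_aircraft_state):
    """Push the aircraft's pose to the twin front end over a WebSocket.

    `get_aircraft_state` is a placeholder callable returning the latest
    position and attitude; the 10 Hz rate and message schema are illustrative.
    """
    async with websockets.connect(uri) as ws:
        while True:
            state = get_aircraft_state()
            await ws.send(json.dumps(state))
            await asyncio.sleep(0.1)

# Example (placeholder endpoint):
# asyncio.run(push_status("ws://twin-server:8000/aircraft", lambda: {"x": 0.0, "y": 0.0}))
```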
  • Figure 3 is a visual schematic diagram of the sensor arrangement provided by this application. It should be noted that Figure 3 is only an example; this application does not limit the actual arrangement positions or the number of sensors.
  • The collision warning interface displays a global view of the aircraft, the hangar area it is in, and the surrounding object models in the 3D scene. It marks the distance to every potential collision object within X metres of the aircraft (X is settable, for example 0.5). If a distance is less than the safe distance, risk warning information and the real-time minimum distance between the fuselage and the collision object are immediately shown on the screen so that the operator can make timely adjustments.
  • Figure 4 is a flow chart of the aircraft hangar entry/exit collision warning provided by this application. As shown in Figure 4, digital-twin-based collision prevention for aircraft entering and exiting the hangar includes the following steps.
  • Start the vision system: start the high-definition cameras and the three-dimensional lidar.
  • System data initialisation: load initialisation data such as the intrinsic and extrinsic parameters of the cameras.
  • Foreign object detection: use video surveillance to detect whether there are obstacles on the entry path in front of the aircraft and identify the type of obstacle. If there are no obstacles, the system indicates that the aircraft can enter; otherwise an early warning is issued and it is recommended to stop the entry and clear the obstacles manually.
  • Figure 5 is a schematic diagram of the sensor deployment at the maintenance stand provided by this application. It should be noted that Figure 5 is only an example; this application does not limit the actual arrangement positions or the number of sensors. The radar in Figure 5 includes a three-dimensional laser device.
  • Figure 6 is a schematic structural diagram of the aircraft hangar entry/exit collision warning device provided by this application. The device includes:
  • a first determination module 61, configured to acquire a first image of the target hangar in the physical scene in real time during the process of the target aircraft entering or exiting the hangar, and to determine the physical position information of the target aircraft and of each target object based on the first image;
  • a second determination module 62, configured to determine the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model according to the physical position information of the target aircraft and of each target object;
  • an early-warning module 63, configured to acquire in real time the physical distance between the model corresponding to the target aircraft and the model corresponding to each target object, and to output first warning prompt information when the minimum physical distance is less than the preset distance threshold.
  • Optionally, the first determination module 61 is specifically configured to identify the target aircraft and each target object from the first image through a target recognition algorithm and determine their physical two-dimensional coordinate information; and to acquire the three-dimensional point cloud data collected by the three-dimensional laser devices arranged in the physical scene and determine the physical depth coordinate information of the target aircraft and of each target object from that data; the physical position information includes the physical two-dimensional coordinate information and the physical depth coordinate information.
  • Optionally, the early-warning module 63 is specifically configured to determine in real time the first minimum circumscribed bounding box of the model corresponding to the target aircraft and the second minimum circumscribed bounding box of the model corresponding to each target object, and to acquire the physical distance between the first minimum circumscribed bounding box and each second minimum circumscribed bounding box.
  • Optionally, the early-warning module 63 is specifically configured to determine in real time, in the digital twin scene model, the simulated distance between the model corresponding to the target aircraft and the model corresponding to each target object, and to determine the physical distance corresponding to the simulated distance according to the simulated distance and the mapping relationship between the digital twin scene model and the physical scene; or to acquire in real time the distance between the target aircraft and each target object collected by the one-dimensional ranging devices arranged in the physical scene and take it as the corresponding physical distance.
  • Optionally, the second determination module 62 is specifically configured to determine the physical angle-attitude information of the target aircraft from the three-dimensional point cloud data, and to determine the simulated position information and simulated angle-attitude information of the model corresponding to the target aircraft in the digital twin scene model according to the physical position information and the physical angle-attitude information.
  • Optionally, the device further includes a third determination module 64, configured to determine the travel trajectory information of the target aircraft entering or exiting the hangar according to the simulated position information of the model corresponding to the target aircraft and of the model corresponding to each target object in the digital twin scene model, and the simulated angle-attitude information of the model corresponding to the target aircraft.
  • Optionally, the early-warning module 63 is further configured to: when the target aircraft enters the maintenance stand, determine, based on the digital twin scene model, whether the model corresponding to the target aircraft enters the model corresponding to the maintenance stand centrally; if not, determine the deviation direction according to the simulated position information of the model corresponding to the target aircraft and the simulated position information of the model corresponding to the maintenance stand, and output second warning prompt information carrying the deviation direction; if so, acquire the second image at the maintenance stand and, based on the second image and the pre-trained collision recognition model, make a second judgment on whether there is a collision risk, outputting third warning prompt information if there is a collision risk and instructing the target aircraft to enter the maintenance stand if there is none.
  • This application further provides an electronic device, as shown in Figure 7, including a processor 301, a communication interface 302, a memory 303 and a communication bus 304, where the processor 301, the communication interface 302 and the memory 303 communicate with one another through the communication bus 304.
  • The memory 303 stores a computer program, and the processor 301 performs any of the above method steps when executing the program.
  • The communication bus mentioned for the above electronic device can be a Peripheral Component Interconnect (PCI) bus or an Extended Industry Standard Architecture (EISA) bus, among others, and can be divided into an address bus, a data bus, a control bus and so on. For ease of presentation, only one thick line is used in the figure, but this does not mean that there is only one bus or one type of bus.
  • The communication interface 302 is used for communication between the above electronic device and other devices.
  • The memory may include random access memory (RAM) or non-volatile memory (NVM), such as at least one disk memory; optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
  • The above processor can be a general-purpose processor, including a central processing unit or a network processor (NP); it can also be a digital signal processor (DSP), an application-specific integrated circuit, a field-programmable gate array or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
  • This application also provides a computer-readable storage medium. The computer-readable storage medium stores a computer program executable by an electronic device; when the program runs on the electronic device, it causes the electronic device to implement any of the above method steps.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Traffic Control Systems (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

This application discloses a collision early-warning method and apparatus, device and medium for aircraft entering and exiting a hangar. The method comprises: determining, on the basis of a first image of a target hangar in a physical scene, the respective physical position information of a target aircraft and of target objects, and establishing a digital twin scene model; determining, in the digital twin scene model, the simulated position information of the model corresponding to the target aircraft and of the models corresponding to the target objects; and performing collision detection on the model corresponding to the target aircraft and the models corresponding to the target objects, and outputting first warning prompt information when a collision is detected. By means of this scheme of achieving collision prevention for aircraft entering and exiting a hangar on the basis of a digital twin scene model, the problem of collisions caused by guidance errors, which easily occur because of blind spots, is avoided, thereby improving the safety and effectiveness of collision prevention; moreover, few ground staff need to work cooperatively, which reduces the consumption of manpower and material resources and also improves the efficiency of aircraft entering and exiting the hangar.
PCT/CN2022/141372 2022-08-01 2022-12-23 Aircraft hangar entry/exit collision early-warning method and apparatus, device and medium WO2024027082A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210916835.6 2022-08-01
CN202210916835.6A CN115345911A (zh) 2022-08-01 2022-08-01 一种飞机出入库碰撞预警方法、装置、设备及介质

Publications (1)

Publication Number Publication Date
WO2024027082A1 (fr)

Family

ID=83949106

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/141372 WO2024027082A1 (fr) 2022-08-01 2022-12-23 Aircraft hangar entry/exit collision early-warning method and apparatus, device and medium

Country Status (2)

Country Link
CN (1) CN115345911A (fr)
WO (1) WO2024027082A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115345911A (zh) * 2022-08-01 2022-11-15 天翼云科技有限公司 一种飞机出入库碰撞预警方法、装置、设备及介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080021680A1 (en) * 2005-10-04 2008-01-24 Rdv Systems, Ltd. Method and apparatus for evaluating sight distance
CN104715479A (zh) * 2015-03-06 2015-06-17 上海交通大学 基于增强虚拟的场景复现检测方法
CN108959727A (zh) * 2018-06-12 2018-12-07 上海天华建筑设计有限公司 建筑材料的比选方法
CN113954066A (zh) * 2021-10-14 2022-01-21 国电南瑞科技股份有限公司 一种基于数字孪生系统的配网作业机器人控制方法及装置
WO2022040920A1 (fr) * 2020-08-25 2022-03-03 南京翱翔智能制造科技有限公司 Système et procédé interactifs ar basés sur un jumeau numérique
CN114545877A (zh) * 2022-02-08 2022-05-27 燕山大学 一种面向散货的多工程机械数字孪生在线监控系统及方法
CN115345911A (zh) * 2022-08-01 2022-11-15 天翼云科技有限公司 一种飞机出入库碰撞预警方法、装置、设备及介质

Also Published As

Publication number Publication date
CN115345911A (zh) 2022-11-15

Similar Documents

Publication Publication Date Title
US11885910B2 (en) Hybrid-view LIDAR-based object detection
US10377485B2 (en) System and method for automatically inspecting surfaces
CN109116867B (zh) 一种无人机飞行避障方法、装置、电子设备及存储介质
US20210263528A1 (en) Transferring synthetic lidar system data to real world domain for autonomous vehicle training applications
CN110253570A (zh) 基于视觉的工业机械臂人机安全系统
CN111291697B (zh) 用于识别障碍物的方法和装置
CN111427374B (zh) 飞机泊位引导方法、装置及设备
CN114782626B (zh) 基于激光与视觉融合的变电站场景建图及定位优化方法
DK201300589A1 (en) Dynamic alarm zones for bird detection systems
CN114089330B (zh) 一种基于深度图像修复的室内移动机器人玻璃检测与地图更新方法
AU2020270461B2 (en) Situational Awareness Monitoring
WO2024027082A1 (fr) Procédé et appareil d'avertissement précoce de collision d'entrée et de sortie de hangar d'aéronef, et dispositif ainsi que support
CN112330915A (zh) 无人机森林防火预警方法、系统、电子设备和存储介质
JP2022548009A (ja) 物体移動システム
CN113255444A (zh) 图像识别模型的训练方法、图像识别方法和装置
CN110136186B (zh) 一种用于移动机器人目标测距的检测目标匹配方法
US20230401748A1 (en) Apparatus and methods to calibrate a stereo camera pair
EP3885972A1 (fr) Procédé de perception basé sur le contexte et système de gestion de la sécurité de l'environnement dans un environnement informatique
Surgailis et al. Avoiding forward car collision using stereo vision system
CN116630931A (zh) 障碍物检测方法、系统、农业机械、电子设备和存储介质
Dubey et al. Droan-disparity-space representation for obstacle avoidance: Enabling wire mapping & avoidance
CN115236693A (zh) 轨道侵限检测方法、装置、电子设备和存储介质
Mund et al. Introducing lidar point cloud-based object classification for safer apron operations
CN113709672A (zh) 基于ar的巡检系统及其巡检方法
Gruyer et al. Target-to-track collaborative association combining a laser scanner and a camera

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22953870

Country of ref document: EP

Kind code of ref document: A1