WO2023007770A1 - Detection system for detecting a boarding/alighting section of an aircraft - Google Patents
- Publication number: WO2023007770A1 (PCT/JP2022/002158)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- search
- boarding
- image
- camera
- captured
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64F—GROUND OR AIRCRAFT-CARRIER-DECK INSTALLATIONS SPECIALLY ADAPTED FOR USE IN CONNECTION WITH AIRCRAFT; DESIGNING, MANUFACTURING, ASSEMBLING, CLEANING, MAINTAINING OR REPAIRING AIRCRAFT, NOT OTHERWISE PROVIDED FOR; HANDLING, TRANSPORTING, TESTING OR INSPECTING AIRCRAFT COMPONENTS, NOT OTHERWISE PROVIDED FOR
- B64F1/00—Ground or aircraft-carrier-deck installations
- B64F1/30—Ground or aircraft-carrier-deck installations for embarking or disembarking passengers
- B64F1/305—Bridges extending between terminal building and aircraft, e.g. telescopic, vertically adjustable
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
Definitions
- the present disclosure relates to a detection system for detecting a boarding/alighting section of an aircraft.
- a passenger boarding bridge is known as a facility that serves as a pedestrian passageway for passengers between the airport terminal building and the aircraft.
- the passenger boarding bridge includes a rotunda connected to the terminal building and supported so as to be horizontally rotatable, a tunnel section connected to the rotunda at its base end and configured to extend and retract, a cab provided rotatably at the tip of the tunnel section so as to be docked to the aircraft, and a drive column provided as a support leg near the tip of the tunnel section.
- the drive column includes an elevating device that supports and vertically moves the tunnel section, and a traveling device that is provided below the elevating device and has a pair of traveling wheels. It has been proposed to automate the movement of such passenger boarding bridges (see Patent Documents 1 and 2 below, for example).
- in Patent Document 1, a camera is attached to the cab to photograph the boarding/alighting section of an aircraft. When the cab is at a predetermined standby position, horizontal position information of the boarding/alighting section is calculated based on the image of the boarding/alighting section captured by the camera; based on this horizontal position information, a target position to which the cab should be moved for docking with the boarding/alighting section is calculated, and the cab at the standby position is moved toward the target position.
- Patent Document 2 describes a configuration in which first and second cameras are provided in a head portion (cab) that can be connected to the boarding gate of an aircraft. When an input to start driving is made on the operation panel, the traveling drive section starts wheel traveling, and when the head portion reaches a point several meters in front of the aircraft, imaging of a first feature and a second feature of the aircraft by the first and second cameras is started. Using the images captured by the first and second cameras, the position of a target point of the aircraft entrance/exit is calculated, the relative position and relative angle of the head portion with respect to the aircraft entrance/exit are calculated, and a control correction quantity is calculated on the basis of these, whereby various drives are driven to move the head portion toward the target point of the aircraft.
- the present disclosure has been made in order to solve the above-described problems, and an object thereof is to provide a detection system capable of accurately detecting the position of a boarding/alighting section of an aircraft in correspondence with various gate arrangements.
- a detection system according to the present disclosure is a detection system for detecting a boarding/alighting section of an aircraft, comprising a passenger boarding bridge connected to a terminal building, a camera provided on the passenger boarding bridge, and an image processing device for detecting the boarding/alighting section of the aircraft from the image captured by the camera. The image processing device includes a search area image generation unit that delimits a predetermined area including part of the captured image to generate a search area image, and a search execution unit that searches whether or not the boarding/alighting section exists in the search area image.
- the search area image generation unit sequentially generates the search area images by moving the predetermined area from one side to the other side in the axial direction of the aircraft, and the search execution unit repeats the search while changing the search area image, using the plurality of sequentially generated search area images.
- according to this configuration, a plurality of search area images, each delimiting a predetermined area including part of the captured image, are generated from the image captured by the camera provided on the passenger boarding bridge.
- the plurality of search area images are generated by moving the predetermined area from one side to the other side in the aircraft axis direction. Therefore, by repeatedly searching for the boarding/alighting section while changing among the sequentially generated search area images, the entire captured image can be searched while the search area moves in the direction along the axis of the aircraft.
- this makes it possible to search for boarding/alighting sections uniformly, regardless of the type of aircraft or the arrangement of gates. Therefore, the position of the boarding/alighting section of the aircraft can be accurately detected for various gate arrangements.
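the sliding-window behavior described above can be summarized in a short sketch. This is illustrative only, not the implementation of this disclosure: the `detect_door` callable, the stride, and the coordinate convention are assumptions.

```python
# Illustrative sketch of the search above: a search window is moved
# along the aircraft axis and a door detector is run at each position
# until the boarding/alighting section is found.
# `detect_door` and `stride` are assumed names, not from this disclosure.

def search_along_axis(detect_door, axis_start, axis_end, stride):
    """Move the search-window center from axis_start toward axis_end.

    detect_door(center) returns a detected position or None.
    Returns the first detected position, or None if the range is exhausted.
    """
    direction = 1 if axis_end >= axis_start else -1
    center = axis_start
    while (center - axis_end) * direction <= 0:
        hit = detect_door(center)
        if hit is not None:
            return hit
        center += direction * stride  # shift the window along the axis
    return None
```

searching front-to-rear or rear-to-front then reduces to the order in which `axis_start` and `axis_end` are given.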
- the search area image generation unit may generate the search area image by projectively transforming a virtual plane, perpendicular to the horizontal plane and parallel to the axis, in the captured image so that it becomes an image as viewed from a direction perpendicular to the virtual plane.
- according to this configuration, the search area image is generated as an image parallel to the axis. Therefore, even if the image is captured from a direction oblique to the aircraft, distortion of the shape of the boarding/alighting section to be searched for in the search area image can be suppressed. As a result, the search accuracy for the boarding/alighting section in the search area image can be improved.
- the search area image generation unit may projectively transform the captured image within the predetermined area each time a predetermined area including a portion of the captured image is delimited.
- since the captured image is projectively transformed for each predetermined area, degradation of image resolution can be suppressed compared with projectively transforming the entire captured image and then delimiting the predetermined area from the transformed image.
- the detection system may include a photographing direction changing mechanism that changes the photographing direction of the camera by rotating it around a predetermined rotation axis extending in a direction intersecting a horizontal plane, and a controller that controls the operation of the photographing direction changing mechanism.
- the image processing device may include a search range determination unit that determines whether or not the predetermined area after movement is within a predetermined search range and within the captured image. When the predetermined area after movement is within the search range but not within the captured image, the controller may change the photographing direction of the camera via the photographing direction changing mechanism and cause the camera to capture an image again.
- according to this configuration, since the photographing range per captured image can be made relatively narrow, the resolution of the search area image obtained from the captured image can be increased.
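the three-way decision described above — continue searching, re-aim the camera, or stop — can be sketched as follows. The function name and the returned labels are illustrative assumptions, not terms from this disclosure.

```python
# Illustrative control-flow sketch of the camera re-aiming described
# above. Names and return labels are assumptions.

def next_action(region_in_search_range, region_in_image):
    """Decide what to do after shifting the predetermined area."""
    if not region_in_search_range:
        return "end_search"                # whole search range covered
    if region_in_image:
        return "search_region"             # generate a search area image and search it
    return "pan_camera_and_recapture"      # re-aim the camera and shoot again
```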
- the passenger boarding bridge may include a first boarding bridge and a second boarding bridge, and the cameras may include a first camera provided on the first boarding bridge and a second camera provided on the second boarding bridge.
- in this case, the search area image generation unit may sequentially generate the search area images by moving the predetermined area from the front side toward the rear side in the aircraft axis direction on the image captured by the first camera, and from the rear side toward the front side in the aircraft axis direction on the image captured by the second camera.
- according to this configuration, the boarding/alighting section to which the first boarding bridge should be connected is searched for based on the image captured by the first camera provided on the first boarding bridge, and the boarding/alighting section to which the second boarding bridge should be connected is searched for based on the image captured by the second camera provided on the second boarding bridge.
- when searching using the image captured by the first camera, the search proceeds from the front side toward the rear side in the aircraft axis direction, whereas when searching using the image captured by the second camera, the search proceeds from the rear side toward the front side.
- as a result, a boarding/alighting section in the front of the aircraft is preferentially detected based on the image captured by the first camera, and a boarding/alighting section in the rear of the aircraft is preferentially detected based on the image captured by the second camera. Therefore, the boarding/alighting section to which the first boarding bridge should be connected and the boarding/alighting section to which the second boarding bridge should be connected can be searched for accurately and in a short time.
- the search execution unit may perform, in parallel, a first search using the search area image generated from the image captured by the first camera and a second search using the search area image generated from the image captured by the second camera. According to this configuration, since the search for the boarding/alighting section to which the first boarding bridge should connect and the search for the boarding/alighting section to which the second boarding bridge should connect are performed in parallel, the search period can be shortened.
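running the two searches in parallel can be sketched, for example, with a thread pool; this is a minimal sketch assuming the two searches are exposed as plain callables, which is an assumption and not the structure of this disclosure.

```python
from concurrent.futures import ThreadPoolExecutor

# Illustrative sketch: dispatch the first and second searches
# concurrently and collect both results. The callables are assumed
# interfaces, not names from this disclosure.

def run_searches_in_parallel(first_search, second_search):
    """Run both searches concurrently; return (first_result, second_result)."""
    with ThreadPoolExecutor(max_workers=2) as pool:
        f1 = pool.submit(first_search)
        f2 = pool.submit(second_search)
        return f1.result(), f2.result()
```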
- the search execution unit may compare a first position at which the boarding/alighting section is detected in the first search, using the search area image generated from the image captured by the first camera, with a second position at which the boarding/alighting section is detected in the second search, using the search area image generated from the image captured by the second camera.
- when the first position and the second position are included in at least one of the first search range and the second search range, it may be determined that the two searches have detected one and the same boarding/alighting section.
- according to this configuration, only when the first position and the second position are close to each other, that is, only when the determination that the two searches have detected one boarding/alighting section is highly reliable, is a single boarding/alighting section reported as detected.
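a minimal sketch of the same-door idea above: two independently detected reference points are merged into one detection only when they are close. The distance threshold below is an assumed value, not one given in this disclosure.

```python
import math

# Illustrative sketch of the same-door determination: the two searches
# are taken to have found one and the same boarding/alighting section
# only when their reference points are close. The 500 mm threshold is
# an assumption.

def same_door(p1, p2, threshold_mm=500.0):
    """p1, p2: (x, y, z) reference points in mm; True if treated as one door."""
    return math.dist(p1, p2) <= threshold_mm
```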
- FIG. 1 is a schematic plan view showing an example of a passenger boarding bridge to which a detection system according to this embodiment is applied.
- FIG. 2 is a block diagram showing a schematic configuration of the detection system applied to the passenger boarding bridge shown in FIG. 1.
- FIG. 3 is a flow chart showing the flow of search processing in this embodiment.
- FIG. 4 is a diagram showing an example of a photographed image according to this embodiment.
- FIG. 5 is a conceptual diagram for setting a predetermined area in the captured image shown in FIG. 4.
- FIG. 6 is a diagram showing a comparison between an image after projective transformation of a predetermined region and an image before projective transformation in the present embodiment.
- FIG. 7 is a flow chart showing an example flow of automatic mounting control processing according to the present embodiment.
- FIG. 8 is a flow chart showing the same-door detection determination process.
- FIG. 9 is a conceptual diagram of different cases in the same door detection determination process.
- FIG. 1 is a schematic plan view showing an example of a passenger boarding bridge to which the detection system according to this embodiment is applied.
- the passenger boarding bridge 1 includes a first boarding bridge 11 and a second boarding bridge 12.
- Each boarding bridge 11, 12 has a similar configuration.
- each of the boarding bridges 11 and 12 has a rotunda (base circular chamber) 4 connected to an entrance/exit of the terminal building 2 of the airport, a tunnel section 5 whose base end is connected to the rotunda 4 so that it can be raised and lowered and which can extend and contract in the longitudinal direction, and a cab (end circular chamber) 6 connected to the tip of the tunnel section 5.
- the rotunda 4 is configured to be horizontally rotatable around a vertically extending first rotation axis R4.
- the cab 6 is horizontally rotatable about a second rotation axis R6 extending in a direction orthogonal to the bottom surface of the cab 6.
- the tunnel section 5 forms a walking passageway for passengers, and is constructed so that a plurality of tubular tunnels 5a and 5b are telescopically fitted to each other so as to extend and contract in the longitudinal direction.
- although the tunnel section 5 composed of two tunnels 5a and 5b is illustrated here, the tunnel section 5 may be composed of two or more tunnels.
- the base end of the tunnel portion 5 is connected to the rotunda 4 so as to be swingable about the horizontal axis of rotation (swingable up and down), so that it can be raised and lowered.
- a drive column 7 is provided as a support leg at the tip of the tunnel portion 5 (the tunnel 5a on the tip side). Note that the drive column 7 may be attached to the cab 6.
- the drive column 7 is provided with an elevating device 8 for elevating the cab 6 and the tunnel section 5.
- the cab 6 and the tunnel portion 5 can swing vertically with the rotunda 4 as a base point.
- the orientation of the second rotation axis R6, which is the rotation axis of the cab 6, changes according to the inclination of the bottom surface of the cab 6 with respect to the horizontal plane.
- that is, the second rotation axis R6 extends in a direction that intersects the horizontal plane, and its orientation changes according to the vertical swing of the cab 6 and the tunnel portion 5.
- the drive column 7 also includes a travel device 9 for rotating the tunnel portion 5 around the rotunda 4 (around the first rotation axis R4) and for expanding and contracting the tunnel portion 5.
- the traveling device 9 is provided below the elevating device 8.
- the traveling device 9 has, for example, two traveling wheels that can be driven independently forward and backward. By rotating the two traveling wheels in the same direction (forward rotation or reverse rotation), the traveling device can move forward or backward, and the orientation of the traveling wheels can also be changed to change the traveling direction.
- in this embodiment, the rotunda 4 itself rotates around the first rotation axis R4 to rotate the tunnel portion 5 around the first rotation axis R4; alternatively, the rotunda 4 may be fixed to the terminal building 2, and the tunnel portion 5 may be connected to the rotunda 4 so as to be rotatable around the first rotation axis R4.
- the cab 6 has a connecting portion 6a at its tip that is connected to the boarding/alighting portions D1 and D2 of the aircraft 3.
- a closure, a bumper, a distance sensor, and the like are provided on the connecting portion 6a.
- cameras 21 and 22 for photographing the sides of the aircraft 3 are installed in the cab 6.
- cameras 21 and 22 are provided at the tip of the cab 6.
- the first camera 21 provided on the first boarding bridge 11 is installed on the inner upper portion of the connecting portion 6a of the cab 6.
- a second camera 22 provided on the second boarding bridge 12 is also installed in the same manner.
- the cameras 21 and 22 may be provided anywhere at the tip of the cab 6 as long as the side of the aircraft 3 can be photographed. For example, they may be installed in the inner lower part of the connecting portion 6a in the cab 6, or in the outer upper part or outer lower part of the cab 6.
- a plurality of cameras may be installed for each of the boarding bridges 11 and 12.
- cameras may be installed above and below the cab 6, respectively.
- the boarding/alighting section D1 of the aircraft 3 to which the first boarding bridge 11 should be connected is searched for based on the photographed image G1 captured by the first camera 21 provided at the tip of the first boarding bridge 11.
- similarly, the boarding/alighting section D2 of the aircraft 3 to which the second boarding bridge 12 should be connected is searched for based on the photographed image captured by the second camera 22 provided at the tip of the second boarding bridge 12.
- FIG. 2 is a block diagram showing the schematic configuration of the detection system applied to the passenger boarding bridge shown in FIG.
- the detection system 20 in the present embodiment includes the cameras 21 and 22, and an image processing device 23 that detects the boarding/alighting sections D1 and D2 of the aircraft 3 from the images captured by the cameras 21 and 22.
- the detection system 20 includes a storage device 24 for storing data such as photographed images, an image processing program, and the like, and an output device 25 for outputting detection results and the like.
- These structures 21 , 22 , 23 , 24 , 25 can pass data to each other via communication bus 26 .
- a controller 30 that controls the boarding bridges 11 and 12 is also connected to the communication bus 26 .
- the controller 30 is provided, for example, in the cab 6 or the tunnel 5a on the extreme tip side.
- the controller 30 controls the rotation of the cabs 6 of the boarding bridges 11 and 12 and the elevation and travel of the drive columns 7 .
- the controller 30 controls the rotation of the shooting direction changing mechanisms 34 and 35 of the cameras 21 and 22 .
- the cameras 21 and 22 are capable of changing the shooting direction (orientation of the shooting central axes L1 and L2) within the horizontal plane.
- the detection system 20 includes photographing direction changing mechanisms 34 and 35 that change the photographing directions of the cameras 21 and 22 by rotating around a predetermined rotation axis extending in a direction intersecting the horizontal plane.
- the shooting direction changing mechanisms 34 and 35 can change the shooting directions of the first camera 21 and the second camera 22 independently of each other.
- the shooting direction changing mechanisms 34 and 35 may be configured as camera rotation mechanisms that rotate the cameras 21 and 22 relative to the cab 6 .
- alternatively, the shooting direction changing mechanisms 34 and 35 may be realized by the cab 6 itself.
- that is, the cameras 21 and 22 may be fixed to the cab 6, and the shooting directions (orientations of the shooting central axes L1 and L2) may be changed by rotating the cab 6 around the second rotation axis R6 extending in a direction that intersects the horizontal plane.
- the controller 30 and the image processing device 23 are configured by a computer that performs various calculations and processes based on the data stored in the storage device 24.
- the controller 30 and the image processing device 23 have a CPU, a main memory (RAM), a communication interface, and the like.
- the controller 30 and the image processing device 23 may be configured by the same computer, or may be configured by different computers.
- the output device 25 outputs the calculation or processing results of the controller 30 and the image processing device 23 .
- the output device 25 is composed of, for example, a monitor that displays calculation results or the like, or a communication device that transmits data to a server or a communication terminal via a communication network.
- the image processing device 23 and the storage device 24 of the detection system 20 may be configured as a server connected to the cameras 21 and 22 via a communication network. That is, the server may acquire the images captured by the cameras 21 and 22, perform the search processing described later, and display the result on a monitor provided in the operation room of the boarding bridges 11 and 12 or on the display unit of a communication terminal.
- the controller 30 and the image processing device 23 grasp the position coordinates of each part of the passenger boarding bridges 11 and 12 in real time using a predetermined three-dimensional coordinate system such as an XYZ orthogonal coordinate system.
- the position coordinates of each part of the passenger boarding bridge 1 are expressed as absolute coordinates in a three-dimensional coordinate system having mutually orthogonal X, Y, and Z axes, with the intersection of the first rotation axis R4 of the rotunda 4 and the plane of the apron EP taken as the origin (0, 0, 0).
- the X-, Y-, and Z-coordinate values of the position coordinates respectively indicate the distances (for example, in units of [mm]) from the origin (0, 0, 0), which is the position of the first rotation axis R4 of the rotunda 4.
- Controller 30 and image processing device 23 express the position of each part of aircraft 3 and boarding bridges 11 and 12 as position coordinates using such a three-dimensional orthogonal coordinate system.
- the image processing device 23 has a search area image generation unit 31, a search execution unit 32, and a search range determination unit 33 as functional blocks in order to perform search processing, which will be described later. These functional blocks partially or wholly include circuits including integrated circuits. Therefore, these configurations 31, 32, 33 can be regarded as circuits. These functional blocks are hardware that performs the enumerated functions, or hardware that is programmed to perform the enumerated functions. The hardware may be the hardware disclosed herein, or other known hardware programmed or configured to perform the recited functions. These functional blocks are a combination of hardware and software, where the hardware is a processor, which is considered a type of circuit, and the software is used to configure the hardware or the processor. The image processing device 23 reads a program for performing search processing stored in the storage device 24 and executes search processing, which will be described later.
- FIG. 3 is a flow chart showing the flow of search processing in this embodiment. Search processing is performed when the aircraft 3 arrives at a predetermined arrival position. At this time, each of the boarding bridges 11 and 12 is positioned at an initial position (standby position) avoiding the aircraft 3, as shown in FIG. 1.
- the storage device 24 stores search range data preset for each of the cameras 21 and 22.
- the search range is set as an area along the axis AL direction.
- the storage device 24 also stores the initial angles (photographing direction data) of the cameras 21 and 22.
- the initial angles of the cameras 21 and 22 are set so that, with the boarding bridges 11 and 12 at their initial positions, at least one end, in the direction of the axis AL, of the search range predetermined for each of the cameras 21 and 22 is contained within that camera's angle of view.
- the first search using the image captured by the first camera 21 is performed from the front side toward the rear side of the aircraft 3 in the direction of the axis AL. Therefore, the initial angle of the first camera 21 is set so that, at the initial position of the first boarding bridge 11, the front end, in the direction of the axis AL, of the search range set for the first camera 21 is contained within the angle of view of the first camera 21.
- the second search using the image captured by the second camera 22 is performed from the rear side toward the front side of the aircraft 3 in the axis AL direction.
- accordingly, the initial angle of the second camera 22 is set so that, at the initial position of the second boarding bridge 12, the rear end, in the direction of the axis AL, of the search range set for the second camera 22 is contained within the angle of view of the second camera 22.
- the controller 30 controls the shooting direction changing mechanism 34 so that the angle of the first camera 21 becomes the initial angle (step S1).
- the first camera 21 takes an image at the initial angle (step S2).
- the image processing device 23 acquires the captured image G1 taken at the initial angle of the first camera 21.
- the acquired photographed image G1 is stored in the storage device 24.
- the image processing device 23 may perform predetermined image processing such as distortion correction or brightness correction due to the camera lens on the captured image G1 in advance.
- a position in the photographed image G1 is expressed as a position coordinate of a two-dimensional coordinate system in which one vertex (for example, the upper left vertex) of the photographed image G1 is the origin, and U-axis and V-axis are orthogonal to each other.
- FIG. 4 is a diagram showing an example of a photographed image in this embodiment.
- FIG. 5 is a conceptual diagram for setting a predetermined area in the photographed image shown in FIG. 4.
- the search area image generator 31 sets a virtual plane VP perpendicular to the horizontal plane (apron EL) and parallel to the axis AL in the photographed image G1.
- the axis AL is set in advance at a position a predetermined distance above the marshal line 13 on the assumption that the aircraft 3 will park on the marshal line 13 shown on the apron EL.
- the front side and rear side of the axis AL mean the front side and rear side of the aircraft 3 parked on the marshal line 13 .
- the virtual plane VP is set at a position offset by an offset amount W from the axis AL to one side in the machine width direction (the side closer to the boarding bridges 11 and 12). That is, the virtual plane VP is defined as a plane perpendicular to the horizontal plane including the reference axis VL that is offset by the offset amount W from the axis AL to one side in the machine width direction.
- the offset amount W is set in advance in consideration of the body width of the aircraft 3 so that the distance between the virtual plane VP and the outer surfaces of the boarding/alighting sections D1 and D2 is close (less than a predetermined distance).
- the search area image generator 31 sets a virtual rectangular frame Bi on the three-dimensional space on the virtual plane VP.
- the virtual rectangular frame Bi can be configured, for example, as a square frame with a side of 4 m on the virtual plane VP, centered at a reference point Pi on the reference axis VL. Since the shooting direction (shooting central axis L1) of the first camera 21 is in many cases not perpendicular to the virtual plane VP, the virtual rectangular frame Bi appears on the captured image G1 as a distorted quadrangle, like the predetermined area Ei shown in FIG. 4.
- Three-dimensional coordinates Bi (xi, yi, zi) in the real space and two-dimensional coordinates Ei (ui, vi) on the captured image G1 captured by the first camera 21 correspond one-to-one. Transformation between the three-dimensional coordinate system of the real space and the two-dimensional coordinate system on the captured image G1 is performed by perspective projection transformation.
- in the example of FIG. 4, the entire quadrangle Ei on the captured image G1 corresponding to the virtual rectangular frame Bi fits within the captured image G1; however, it suffices that at least a part of the quadrangle is contained in the captured image G1.
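the one-to-one correspondence between the three-dimensional coordinates Bi(xi, yi, zi) and the two-dimensional image coordinates Ei(ui, vi) mentioned above is a perspective projection; a minimal pinhole-camera sketch is shown below. The intrinsic matrix K and the camera pose are illustrative values, not calibration data from this disclosure.

```python
import numpy as np

# Minimal pinhole-camera sketch of the perspective projection mapping a
# 3-D point Bi(x, y, z) to image coordinates Ei(u, v). K and the pose
# (R, t) are illustrative assumptions.

K = np.array([[1000.0,    0.0, 640.0],   # fx, skew, cx
              [   0.0, 1000.0, 360.0],   # fy, cy
              [   0.0,    0.0,   1.0]])

def project(point_xyz, R=np.eye(3), t=np.zeros(3)):
    """World point -> pixel (u, v) for a camera with pose (R, t)."""
    pc = R @ np.asarray(point_xyz, float) + t   # world -> camera frame
    uvw = K @ pc                                # camera frame -> image plane
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```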
- the search area image generation unit 31 projectively transforms the predetermined area Ei (an area including part of the captured image G1) set in this manner into an image of the virtual plane VP as viewed from a direction perpendicular to the virtual plane VP (step S4).
- pixels falling outside the range of the photographed image G1 are subjected to extrapolation processing, such as filling with black or copying pixels from the edge portion. Details of the projective transformation are described later.
- the search area image generation unit 31 generates an image after projective transformation as a search area image Ci.
- the generated search area image Ci is enlarged so that the entire virtual rectangular frame Bi appears as large as possible in the search area image Ci.
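the projective transformation of step S4 can be sketched as estimating the homography that maps the four corners of the distorted quadrangle Ei to the corners of the upright search area image Ci; each output pixel of Ci would then be sampled from G1 through the inverse mapping. The corner coordinates below are illustrative, not values from this disclosure.

```python
import numpy as np

# Sketch of step S4: solve for the homography H (3x3, h33 = 1) that maps
# the distorted quadrangle Ei on the captured image to an upright square
# (the search area image Ci). Corner coordinates are illustrative.

def homography(src, dst):
    """Solve dst ~ H @ src for four point correspondences."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

# distorted Ei corners (u, v) on G1 -> corners of a 400x400 Ci
Ei = [(120, 80), (410, 60), (430, 390), (100, 360)]
Ci = [(0, 0), (400, 0), (400, 400), (0, 400)]
H = homography(Ei, Ci)
```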
- the search execution unit 32 searches for a door that is a candidate for the boarding/alighting unit D1 in the search area image Ci (step S5).
- the door search method is not particularly limited as long as it is an image recognition process capable of detecting a door. For example, AI image recognition using a trained model generated by deep learning can be used.
- the door and its reference point are detected based on the painted part of the contour of the door and the shape of the reinforcing plate provided on the door sill.
- the reference point of the door is set, for example, at the center of the door sill or the center of the reinforcing plate.
- when the search execution unit 32 detects a door (Yes in step S6), it calculates the three-dimensional coordinates of the reference point of the door (step S7).
- the three-dimensional coordinates of the door are obtained, for example, by converting the two-dimensional coordinates on the search area image Ci into two-dimensional coordinates on the original photographed image G1, and then converting those two-dimensional coordinates on the photographed image G1 into three-dimensional coordinates in the real space. To convert the two-dimensional coordinates into three-dimensional coordinates in the real space, for example, two-dimensional coordinates from a plurality of cameras are used, or the reference point of the door is approximated as lying on the virtual plane VP.
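the single-camera approximation mentioned above, treating the reference point as lying on the virtual plane VP, amounts to intersecting the pixel's viewing ray with that plane. A minimal sketch follows; the intrinsic matrix, camera pose, and plane are illustrative assumptions, not values from this disclosure.

```python
import numpy as np

# Sketch of the virtual-plane approximation: back-project a door
# reference point (u, v) through the camera and intersect the viewing
# ray with the plane. K, pose, and plane are assumptions.

K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def pixel_to_plane(u, v, cam_pos, plane_point, plane_normal, R=np.eye(3)):
    """Intersect the viewing ray of pixel (u, v) with a plane."""
    cam = np.asarray(cam_pos, float)
    ray = R.T @ (np.linalg.inv(K) @ np.array([u, v, 1.0]))  # world-frame ray
    n = np.asarray(plane_normal, float)
    s = n @ (np.asarray(plane_point, float) - cam) / (n @ ray)
    return cam + s * ray
```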
- the search execution unit 32 determines whether the detected door is the boarding/alighting unit D1 to be mounted on the first boarding bridge 11 (step S8).
- a search range is stored in advance so as to include all doors of various aircraft to which the first boarding bridge 11 is to be attached.
- if the detected door lies within this search range, the search executing section 32 determines that the boarding/alighting section D1 has been detected.
- the search execution section 32 outputs the search success result (step S9).
- when no door is detected in the search area image Ci (No in step S6), or when the detected door is determined not to be the boarding/alighting section D1 (No in step S8), the search area image generation unit 31 sequentially generates new search area images Ci by moving the predetermined area Ei from one side toward the other side in the direction of the axis AL of the aircraft 3. In the first search for the first boarding bridge 11, the search area image generation unit 31 moves the predetermined area Ei from the front side toward the rear side in the axis AL direction.
- the search area image generation unit 31 shifts (moves) the virtual rectangular frame Bi from one side in the direction of the axis AL of the aircraft 3 to the other side in the three-dimensional space by a predetermined distance.
- the search area image generator 31 calculates a predetermined area Ei on the captured image G1 corresponding to the shifted virtual rectangular frame Bi. In this way, the search area image generator 31 shifts (moves) the predetermined area Ei from one side in the AL direction of the aircraft 3 to the other side in the captured image G1 (step S10).
- a reference point Pi+1 is set on the reference axis VL at a predetermined distance rearward in the axis AL direction from the position of the reference point Pi, and a virtual rectangular frame Bi+1 is set with the reference point Pi+1 as a reference.
- the distance between the reference points Pi and Pi+1 is preferably equal to or less than the length of one side of the virtual rectangular frame Bi extending in the horizontal direction, and more preferably 1/4 or more and 1/2 or less of the length of the side. This can create opportunities to detect the same door multiple times (on the order of two or three times) during the door search.
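The spacing rule above can be sketched as a small generator of reference-point positions along the axis AL, using the preferred step of 1/4 to 1/2 of the frame side length so that consecutive virtual frames overlap and the same door can appear in two or three successive search area images. The function name and numeric values are hypothetical:

```python
def shift_positions(start, end, frame_width, step_fraction=0.5):
    """Reference-point positions along the axis AL, stepping by a fraction
    of the horizontal frame side so consecutive frames overlap."""
    assert 0.25 <= step_fraction <= 0.5  # preferred range from the text
    step = frame_width * step_fraction
    positions = []
    x = start
    while x <= end:
        positions.append(x)
        x += step
    return positions

# Hypothetical numbers: a 3 m wide frame over a 12 m span of the axis.
pts = shift_positions(0.0, 12.0, frame_width=3.0, step_fraction=0.5)
```

With a step of half the frame width, every axis position is covered by two consecutive frames; a quarter-width step would give roughly four-fold coverage at the cost of more search iterations.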
- the search range determination unit 33 determines whether the predetermined area Ei after the shift is within a predetermined search range, that is, whether the entire search range has been searched (step S11). When making this determination, the search range determination unit 33 may instead determine whether the corresponding virtual rectangular frame Bi in the three-dimensional space is within the search range. If the predetermined area Ei after the shift is within the search range (Yes in step S11), the search range determination unit 33 further determines whether the shifted predetermined area Ei is within the range of the photographed image G1 used in the previous search (step S12).
- if the shifted predetermined area Ei is within the photographed image G1 (Yes in step S12), the search area image generation unit 31 sets (updates) the shifted predetermined area Ei as the predetermined area for generating the search area image Ci (step S3).
- the search area image generation unit 31 generates (updates) the search area image Ci in the same manner as described above based on the updated predetermined area Ei.
- the search executing section 32 searches for the boarding/alighting section D1 in the updated search area image Ci. In this way, the search execution unit 32 repeatedly searches for the boarding/alighting section D1 while changing the search area image Ci, using the plurality of search area images Ci generated sequentially by shifting the predetermined area Ei from the front side to the rear side along the direction of the axis AL.
- if the shifted predetermined area Ei is within the search range but outside the photographed image G1 (No in step S12), the search range determination unit 33 sends the controller 30 an instruction signal to change the shooting direction of the first camera 21.
- the controller 30 controls the corresponding shooting direction changing mechanism 34 to change the shooting direction of the first camera 21 based on the instruction signal (step S13).
- the amount of change in the shooting direction is set so that the shot images before and after the shooting direction change partially overlap each other.
- the photographing central axis L1a of the first camera 21 after the change is positioned, in the direction of the axis AL, on the rear side of the photographing central axis L1 before the change.
- the imaging center axis L1a after the change intersects the reference axis VL of the virtual plane VP.
- the method of changing the photographing direction of the cameras 21 and 22 is not limited to this; other methods may be used.
- the first camera 21 shoots again after changing the shooting direction (step S2). Thereafter, similarly, search processing is performed on the updated photographed image (steps S3 to S8).
- if the predetermined area Ei after the shift is outside the search range (No in step S11), the search range determination unit 33 determines that the boarding/alighting section D1 could not be detected, and outputs a search failure result (step S15).
- FIG. 6 is a diagram showing a comparison between an image after projective transformation of a predetermined region and an image before projective transformation in the present embodiment.
- FIG. 6 illustrates a case where a virtual rectangular frame Bi is set near the boarding/alighting section D1. Moreover, in FIG. 6, a part of the outer surface of the aircraft 3 is indicated by a two-dot chain line.
- the predetermined area Ei including part of the captured image G1 often has a distorted rectangular shape.
- the virtual rectangular frame Bi has the same rectangular shape as the shape set in the three-dimensional space.
- the predetermined area Epi corresponding to the virtual rectangular frame Bi after projective transformation is also rectangular.
- a part of the aircraft 3 near the virtual plane VP is corrected so as to approximate the view from the direction perpendicular to the axis AL of the aircraft 3. Since, strictly speaking, only objects on the virtual plane VP are corrected exactly, correction errors grow with distance from the virtual plane VP.
- the shooting direction (shooting center axis L1) of the first camera 21 is inclined with respect to the axis AL. Therefore, the door, which is the boarding/alighting section D1 included in the predetermined area Ei, appears as if the lower end and the upper end are inclined with respect to the horizontal line of the photographed image G1.
- the reference point of the door is set, for example, on the door sill or a reinforcing plate provided on the door sill. In this case, in the image used for searching the boarding/alighting section D1, it is preferable that the lower end portion (door sill or reinforcing plate) of the boarding/alighting section D1 (door) extends horizontally.
- a virtual rectangular frame Bi set on a virtual plane VP parallel to the axis AL and perpendicular to the horizontal plane (apron EP) is displayed in a rectangular shape.
- a predetermined area Ei including part of the captured image G1 is projectively transformed.
- the projective transformation suppresses a change in the size of the boarding/alighting part D1 due to the distance between the first camera 21 and the boarding/alighting part D1. That is, regardless of the distance between the first camera 21 and the boarding/alighting part D1, the size of the boarding/alighting part D1 after the projective transformation can be made approximately the same. Therefore, it is possible to accurately detect the boarding/alighting section D1 in the image after projective transformation.
- a plurality of search area images Ci are generated by dividing each predetermined area Ei including part of the photographed image Gj.
- a plurality of search area images Ci are generated by moving the predetermined area Ei from one side of the aircraft 3 in the direction of the axis AL to the other side. Therefore, by repeatedly searching the boarding/alighting sections D1 and D2 while changing the search area image Ci using a plurality of sequentially generated search area images Ci, the search area is moved in the direction along the axis AL of the aircraft 3. However, it is possible to search for the boarding/alighting sections D1 and D2 in the entire photographed image Gj.
- the boarding/alighting sections D1 and D2 can be searched uniformly. Therefore, the positions of the boarding/alighting sections D1 and D2 of the aircraft 3 can be accurately detected for various gate arrangements. As a result, the boarding/alighting parts D1 and D2 to be mounted can be detected correctly even in gate arrangements that conventional position detection, relying only on three-dimensional positions, could not handle. Moreover, when searching for the boarding/alighting sections D1 and D2, troublesome operations such as inputting information corresponding to the parked aircraft 3 can be eliminated. That is, the search process and the automatic attachment control process, described later, can be executed for various aircraft 3 simply by the operator pressing a button that starts the search process.
- the search area image Ci is generated as an image parallel to the axis AL by projective transformation. Therefore, even if the photographed image Gj is photographed from an oblique direction of the aircraft 3, it is possible to suppress the shape distortion of the boarding/alighting sections D1 and D2 to be searched in the search area image Ci. As a result, it is possible to improve the search accuracy of the boarding/alighting sections D1 and D2 in the search area image Ci.
- the search area image generation unit 31 projectively transforms the captured image Gj within the predetermined area Ei each time it partitions off a predetermined area Ei including part of the captured image Gj. In this way, the photographed image Gj is projectively transformed for each predetermined area Ei (enlargement is performed at the same time), which suppresses the loss of image resolution better than projectively transforming and enlarging the entire photographed image Gj at once.
- the photographing direction changing mechanisms 34 and 35 change the photographing directions of the cameras 21 and 22, and the cameras 21 and 22 then take pictures again.
- since the photographing range per photographed image can be made relatively narrow, the resolution of the search area image Ci obtained from the photographed image Gj can be increased.
- the search process described above is also used to search for the boarding/alighting section D2 for the second boarding bridge 12 using the second camera 22.
- the connecting portion 6a of the cab 6 of the first boarding bridge 11 connects to the boarding/alighting portion D1 on the front side of the aircraft 3, and the connecting portion 6a of the cab 6 of the second boarding bridge 12 connects to the boarding/alighting portion D2 on the rear side of the aircraft 3.
- a predetermined search range for the first boarding bridge 11 is searched from the front side in the direction of the axis AL to the rear side.
- the predetermined area Ei is moved from the front side to the rear side in the axis AL direction on the photographed image G1. Further, if necessary, the photographing direction of the first camera 21 is changed from the front side of the aircraft 3 to the rear side, and re-photographing is performed.
- a search is performed from the rear side in the direction of the axis AL to the front side within a predetermined search range for the second boarding bridge 12. That is, in the process of searching for the boarding/alighting section D2 connected to the second boarding bridge 12, the predetermined area Ei is moved from the rear side in the axis AL direction to the front side on the photographed image G2. Further, if necessary, the photographing direction of the second camera 22 is changed from the rear side of the aircraft 3 to the front side, and re-photographing is performed.
- the boarding/alighting section D1 at the front of the aircraft 3 can be preferentially detected from the captured image G1 of the first camera 21, and the boarding/alighting section D2 at the rear of the aircraft 3 can be preferentially detected from the captured image G2 of the second camera 22. This reduces the possibility that the same boarding/alighting section is detected for both boarding bridges 11 and 12. As a result, the boarding/alighting section D1 to which the first boarding bridge 11 should connect and the boarding/alighting section D2 to which the second boarding bridge 12 should connect can be found accurately and in a short time.
- FIG. 7 is a flow chart showing an example flow of automatic mounting control processing according to the present embodiment.
- the first search for the first boarding bridge 11 using the first camera 21 (step SA1) and the second search for the second boarding bridge 12 using the second camera 22 (step SA2) are performed in parallel.
- the first search and the second search are started simultaneously.
- the second search may be started after a predetermined time has passed since the first search was started (before the end of the first search).
- Predetermined operating devices are provided in the first boarding bridge 11, the second boarding bridge 12, or a separate control room provided in the terminal building 2 or the like. Further, the operation device may be realized by displaying virtual operation buttons on a touch panel of a communication terminal connected for communication with the controller 30 via a server.
- upon completion of the first search, the search execution unit 32 stores the search result (search success in step S9 of FIG. 3 or search failure in step S15) in the storage device 24 (step SA3). Similarly, when the second search is completed, the search execution unit 32 stores that search result in the storage device 24 (step SA4). The search execution unit 32 then determines whether to automatically attach the boarding bridges 11 and 12 to the boarding/alighting units D1 and D2 according to the combination of the results of the first search and the second search (steps SA5 to SA7).
- in step SA5, the search execution unit 32 transmits the three-dimensional position data of the boarding/alighting unit D1 detected in the first search to the controller 30, and execution of automatic attachment by the first boarding bridge 11 is permitted.
- the controller 30 performs mounting processing for mounting the first boarding bridge 11 on the boarding/alighting section D1 detected in the first search (step SA8). Specifically, the controller 30 controls the position of the connecting portion 6a of the cab 6 of the first boarding bridge 11 so that it reaches a preposition (pause position) calculated from the detected three-dimensional position of the boarding/alighting portion D1. It controls the drive column 7 and cab 6 of the first boarding bridge 11 .
- after the connecting portion 6a reaches the pre-position, the controller 30 detects the boarding/alighting portion D1 again using the first camera 21, a sensor provided in the connecting portion 6a, or the like, and controls the connecting portion 6a so that it connects to the boarding/alighting portion D1.
- similarly, in step SA6, the search execution unit 32 transmits the three-dimensional position data of the boarding/alighting unit D2 detected in the second search to the controller 30, and execution of automatic attachment by the second boarding bridge 12 is permitted.
- the controller 30 performs mounting processing for mounting the second boarding bridge 12 on the boarding/alighting section D2 detected in the second search (step SA9).
- the specific mounting process is the same as the mounting process in the first boarding bridge 11 .
- the search execution unit 32 determines whether the distance (inter-detection distance L) between the three-dimensional position (first position) of the boarding/alighting unit D1 detected in the first search and the three-dimensional position (second position) of the boarding/alighting unit D2 detected in the second search is equal to or greater than a predetermined reference value Lo (step SA10).
- if the inter-detection distance L is equal to or greater than the reference value Lo (Yes in step SA10), the search execution unit 32 determines that the boarding/alighting part D1 detected in the first search and the boarding/alighting part D2 detected in the second search are different from each other.
- the controller 30 performs mounting processing for mounting the first boarding bridge 11 on the boarding/alighting section D1 detected in the first search, and mounts the second boarding bridge 12 on the boarding/alighting section D2 detected in the second search. Mounting processing for the purpose is performed (step SA11).
- in step SA7, the search execution unit 32 outputs an error to the display unit of the operation device or the like, and stops the automatic mounting control process (step SA12).
- when the inter-detection distance L is less than the reference value Lo (No in step SA10), the search execution unit 32 executes the same door detection determination process.
- the same door detection determination process is a process for determining whether or not the boarding/alighting part D1 detected by the first search and the boarding/alighting part D2 detected by the second search are the same boarding/alighting part.
- the reference value Lo is set based on the distance between two doors provided on the aircraft 3 . For example, the reference value Lo is set to 1/2 of the minimum door-to-door distance for all aircraft. Therefore, when the inter-detection distance L is less than the reference value Lo, it is highly likely that the entrance/exit part D1 detected by the first search and the entrance/exit part D2 detected by the second search are the same door.
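The reference-value rule can be sketched as follows, assuming Lo is set to half the smallest door-to-door distance over all aircraft types, as the text describes; the function name, positions, and spacing are hypothetical:

```python
import math

def may_be_same_door(first_pos, second_pos, min_door_spacing):
    """Return True when the two detections are close enough that they may
    be the same door: inter-detection distance L < Lo, with Lo set to half
    of the smallest door-to-door distance over all aircraft types."""
    lo = min_door_spacing / 2.0
    l = math.dist(first_pos, second_pos)  # Euclidean distance in 3-D
    return l < lo

# Hypothetical values: smallest door spacing 8 m, so Lo = 4 m.
near = may_be_same_door((10.0, 0.0, 3.0), (12.5, 0.0, 3.0), 8.0)  # L = 2.5 m
far = may_be_same_door((10.0, 0.0, 3.0), (19.0, 0.0, 3.0), 8.0)   # L = 9.0 m
```

Choosing Lo as half the minimum spacing guarantees that two genuinely different doors can never both fall within Lo of each other, so only same-door candidates enter the determination process.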
- FIG. 8 is a flow chart showing the flow of the same door detection determination process.
- FIG. 9 is a conceptual diagram of different cases in the same door detection determination process.
- if at least one of the first search range F1 set for the first camera 21 and the second search range F2 set for the second camera 22 includes both the first position H1 (the three-dimensional position of the boarding/alighting section D1 detected in the first search) and the second position H2 (the three-dimensional position of the boarding/alighting section D2 detected in the second search), the search execution unit 32 determines that one boarding/alighting section has been detected at the first position H1 or the second position H2.
- in FIG. 9, the first search range F1 and the second search range F2 are drawn with their vertical positions shifted from each other to make them easier to distinguish; the actual ranges may be the same or different.
- the search execution unit 32 determines whether the first search range F1 includes both the first position H1 and the second position H2 (step SB1). When both the first position H1 and the second position H2 are included in the first search range F1 (Yes in step SB1), the search execution unit 32 regards the detection of the boarding/alighting part D1 in the first search as highly credible, and determines that the first position H1, the three-dimensional position of the boarding/alighting section D1, is valid. In this case, the search execution unit 32 further determines whether both the first position H1 and the second position H2 are included in the second search range F2 (step SB2).
- the search execution unit 32 regards the detection of the boarding/alighting part D2 in the second search as having high credibility. Then, the second position H2, which is the three-dimensional position of the boarding/alighting section D2, is also determined to be valid. That is, the search execution unit 32 determines that both the first position H1 and the second position H2 are valid. In this case, state Q1 and state Q2 in FIG. 9 correspond.
- the state Q1 is a case where the first position H1 is located on the forward side in the axis AL direction and the second position H2 is located on the rear side in the axis AL direction.
- State Q2 is a case where the first position H1 is located on the rear side in the axis AL direction and the second position H2 is located on the front side in the axis AL direction.
- in states Q1 and Q2, each of the search ranges F1 and F2 includes both the first position H1 and the second position H2.
- in this case, the controller 30 moves a predetermined one of the first boarding bridge 11 and the second boarding bridge 12 to predetermined position coordinates among the first position H1 and the second position H2, and performs the mounting process (step SB3).
- the controller 30 may move the first boarding bridge 11 to the first position H1 and perform the mounting process.
- alternatively, for example, the controller 30 may move the first boarding bridge 11 to whichever of the first position H1 and the second position H2 is on the forward side in the axis AL direction and perform the mounting process. Further, for example, the controller 30 may move the first boarding bridge 11 to the first position H1 in state Q1, and to the second position H2 in state Q2.
- when the search execution unit 32 determines that both the first position H1 and the second position H2 are included in the first search range F1 but not both are included in the second search range F2 (No in step SB2), only the detection of the boarding/alighting part D1 in the first search is regarded as highly credible, and only the first position H1 is determined to be valid. This corresponds to state Q3 in FIG. 9. In this case, the controller 30 moves the first boarding bridge 11 to the first position H1 and performs the mounting process (step SB4).
- when the first search range F1 does not include both the first position H1 and the second position H2 (No in step SB1), the search execution unit 32 determines whether the second search range F2 includes both the first position H1 and the second position H2 (step SB5).
- when the search execution unit 32 determines that the first search range F1 does not include both positions but the second search range F2 includes both the first position H1 and the second position H2 (Yes in step SB5), only the detection of the boarding/alighting part D2 in the second search is regarded as highly credible, and only the second position H2 is determined to be valid.
- state Q4 in FIG. 9 applies.
- the controller 30 moves the second boarding bridge 12 to the second position H2 and performs the mounting process (step SB6).
- when the search execution unit 32 determines that neither the first search range F1 nor the second search range F2 includes both the first position H1 and the second position H2 (No in step SB5), the detections of the boarding/alighting sections D1 and D2 are both regarded as having low credibility, and the first position H1 and the second position H2 are determined to be invalid.
- state Q5 and state Q6 in FIG. 9 correspond.
- State Q5 is a case where there is overlap between the first search range F1 and the second search range F2, but there is no search range that includes both the first position H1 and the second position H2.
- State Q6 is a case where the first search range F1 and the second search range F2 do not overlap in the first place.
- the search executing section 32 outputs an error to the display section of the operating device, etc., and stops the automatic mounting control process (step SB7).
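The branch logic of steps SB1 to SB7 can be summarized in a small decision function. This is a sketch under the simplifying assumption that each search range is a one-dimensional interval along the axis AL; the function name, interval representation, and numeric values are hypothetical:

```python
def same_door_decision(f1, f2, h1, h2):
    """Sketch of the FIG. 8 branch logic. f1/f2 are (lo, hi) intervals
    along the axis AL standing in for search ranges F1/F2; h1/h2 are the
    detected axis positions. Returns which position(s) are judged valid."""
    def contains_both(r):
        lo, hi = r
        return lo <= h1 <= hi and lo <= h2 <= hi

    if contains_both(f1):
        if contains_both(f2):
            return "both_valid"    # states Q1/Q2: mount per a preset rule
        return "first_valid"       # state Q3: mount bridge 11 at H1
    if contains_both(f2):
        return "second_valid"      # state Q4: mount bridge 12 at H2
    return "error"                 # states Q5/Q6: stop with an error

# Hypothetical ranges and positions (axis coordinates in metres).
q1 = same_door_decision((0, 20), (0, 20), 5.0, 6.0)   # both ranges cover both
q3 = same_door_decision((0, 20), (15, 30), 5.0, 6.0)  # only F1 covers both
q5 = same_door_decision((0, 10), (8, 20), 5.0, 12.0)  # neither covers both
```

The actual system works with three-dimensional positions and ranges, but the branch structure is the same: a detection is trusted only when some single search range contains both detected positions.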
- the search for the boarding/alighting section D1 connected to the first boarding bridge 11 and the search for the boarding/alighting section D2 connected to the second boarding bridge 12 are performed in parallel, so the search period can be shortened.
- when both the first position H1 and the second position H2 are included in at least one of the first search range F1 and the second search range F2, it is determined that one boarding/alighting section has been detected by the two searches.
- in the above embodiment, the cameras 21 and 22 that capture the photographed images Gj for generating the search area images Ci are provided above the cab 6, but the search area image Ci may instead be generated from a photographed image Gj captured by a camera provided elsewhere. Further, the search process may be performed using each of a plurality of photographed images Gj captured by a plurality of cameras provided above and below one cab 6.
- the detection system 20 of the present disclosure can also be applied to such other configurations.
- the first search is performed from the front side in the axis AL direction to the rear side within the first search range.
- a second search is performed from the rear side in the axis AL direction to the front side in the second search range.
- however, the second boarding bridge 12 is not limited to searching the second search range F2 from the rear side toward the front side in the direction of the axis AL; as with the first boarding bridge 11, it may search from the front side toward the rear side.
- an aircraft 3 with one boarding/alighting section can be parked even in a parking area where one boarding gate is provided with two boarding bridges 11 and 12 as shown in FIG.
- in such a case, the search process is performed for both boarding bridges 11 and 12 and only one of the boarding bridges is subjected to the attachment process (step SA8, SA9, SB3, SB4, or SB6), or the operator may select the number of boarding bridges to be attached to the aircraft 3 according to the type of the parked aircraft 3.
- in this case, the image processing device 23 performs the search processes for attaching the first boarding bridge 11 to the boarding/alighting section D1 and for attaching the second boarding bridge 12 to the boarding/alighting section D2, in the manner illustrated in the above embodiment.
- when the number of boarding bridges to be attached to the aircraft 3 is one, which of the first boarding bridge 11 and the second boarding bridge 12 to attach may be selected according to the parking position of the aircraft.
- for example, when the first boarding bridge 11 is used, the image processing device 23 executes a search process that searches the first search range F1 from the front side toward the rear side in the direction of the axis AL. When the second boarding bridge 12 is used, the image processing device 23 may execute a search process that searches the second search range F2 from the front side toward the rear side in the direction of the axis AL.
- in this way, both the first search for the first boarding bridge 11 and the second search for the second boarding bridge 12 may be performed in the same direction (from one side toward the other side in the direction of the axis AL).
- the search direction in the second search may be selected automatically according to the switching of the connection mode (a mode in which the second boarding bridge 12 is connected to the boarding/alighting section D2 on the rear side of an aircraft 3 having two boarding/alighting sections D1 and D2, or a mode in which the second boarding bridge 12 is connected to the boarding/alighting section D1 of an aircraft 3 having one boarding/alighting section D1), or the search direction may be selected manually by operating the operation device.
- the search direction in the first search may be switched not only from the front side to the rear side in the axis AL direction but also from the rear side to the front side in the axis AL direction.
- the first search for the first boarding bridge 11 and the second search may be performed in order.
- the second search may be performed after the first search is performed, or the first search may be performed after the second search is performed.
- in the above embodiment, one image processing device 23 performs both the first search and the second search; however, the image processing device that performs the first search and the image processing device that performs the second search may be configured as separate processing devices (computers, etc.).
- the image processing device that performs the first search may be installed on the first boarding bridge 11 and the image processing device that performs the second search may be installed on the second boarding bridge 12 .
- in the above embodiment, one controller 30 is illustrated as controlling the operation of the first boarding bridge 11, the second boarding bridge 12, the photographing direction changing mechanism 34 for the first camera 21, and the photographing direction changing mechanism 35 for the second camera 22; however, different controllers may be provided for some or all of these components 11, 12, 34, and 35.
While embodiments of the present disclosure have been described above, the present disclosure is not limited to the above embodiments, and various improvements, changes, and modifications are possible without departing from its spirit.
2 Terminal building
3 Aircraft
11 First boarding bridge
12 Second boarding bridge
20 Detection system
21 First camera
22 Second camera
23 Image processing device
30 Controller
31 Search area image generation unit
32 Search execution unit
33 Search range determination unit
34, 35 Photographing direction changing mechanism
D1, D2 Boarding/alighting section
Claims (7)
- A detection system for detecting a boarding/alighting section of an aircraft, comprising:
a passenger boarding bridge connected to a terminal building;
a camera provided on the passenger boarding bridge; and
an image processing device that detects the boarding/alighting section of the aircraft from a captured image captured by the camera, wherein
the image processing device includes:
a search area image generation unit that generates a search area image by partitioning off a predetermined area including part of the captured image; and
a search execution unit that searches whether the boarding/alighting section exists in the search area image,
the search area image generation unit sequentially generates the search area images by moving the predetermined area on the captured image from one side toward the other side in the axis direction of the aircraft, and
the search execution unit repeatedly performs the search while changing the search area image, using a plurality of the sequentially generated search area images.
- The detection system according to claim 1, wherein the generation of the search area image includes projectively transforming the captured image so that a virtual plane perpendicular to the horizontal plane and parallel to the aircraft axis becomes an image as seen from a direction perpendicular to the virtual plane.
- The detection system according to claim 2, wherein the search area image generation unit performs the projective transformation of the captured image in the predetermined area each time it partitions off the predetermined area including part of the captured image.
- The detection system according to any one of claims 1 to 3, further comprising:
a photographing direction changing mechanism that changes the photographing direction of the camera by rotating it about a predetermined rotation axis extending in a direction intersecting the horizontal plane; and
a controller that controls operation of the photographing direction changing mechanism, wherein
the image processing device includes a search range determination unit that determines whether the predetermined area after movement is within a predetermined search range and within the captured image, and
when it is determined that the predetermined area after movement is within the search range but not within the captured image, the controller changes the photographing direction of the camera with the photographing direction changing mechanism and causes the camera to photograph again.
- The detection system according to any one of claims 1 to 4, wherein the passenger boarding bridge includes a first boarding bridge and a second boarding bridge, the camera includes a first camera provided on the first boarding bridge and a second camera provided on the second boarding bridge, and the search area image generation unit sequentially generates the search area images by moving the predetermined area from the front side toward the rear side in the axis direction of the aircraft on the captured image captured by the first camera, and sequentially generates the search area images by moving the predetermined area from the rear side toward the front side in the axis direction of the aircraft on the captured image captured by the second camera.
- The detection system according to claim 5, wherein the search execution unit performs, in parallel, a first search using the search area images generated from the captured image captured by the first camera and a second search using the search area images generated from the captured image captured by the second camera.
- The detection system according to claim 5 or 6, wherein, when the distance between a first position at which the boarding/alighting section is detected in the first search using the search area images generated from the captured image captured by the first camera and a second position at which the boarding/alighting section is detected in the second search using the search area images generated from the captured image captured by the second camera is less than a predetermined reference value, and both the first position and the second position are included in at least one of a first search range set for the first camera and a second search range set for the second camera, the search execution unit determines that one boarding/alighting section has been detected at the first position or the second position.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2023538216A JP7449454B2 (ja) | 2021-07-29 | 2022-01-21 | 航空機の乗降部を検出するための検出システム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021124555 | 2021-07-29 | ||
JP2021-124555 | 2021-07-29 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2023007770A1 true WO2023007770A1 (ja) | 2023-02-02 |
Family
ID=85086622
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2022/002158 WO2023007770A1 (ja) | 2021-07-29 | 2022-01-21 | 航空機の乗降部を検出するための検出システム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP7449454B2 (ja) |
WO (1) | WO2023007770A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005518308A (ja) * | 2002-02-27 | 2005-06-23 | インダール テクノロジーズ インコーポレイテッド | 航空機と自動的にドッキングするための乗客用ブリッジなどのための画像化システム |
US20080098538A1 (en) * | 2006-10-31 | 2008-05-01 | Dew Engineering And Development Limited | Vision system for automatically aligning a passenger boarding bridge with a doorway of an aircraft and method therefor |
JP2013050912A (ja) * | 2011-08-31 | 2013-03-14 | Toshiba Corp | オブジェクト探索装置、映像表示装置およびオブジェクト探索方法 |
JP2017085439A (ja) * | 2015-10-30 | 2017-05-18 | キヤノン株式会社 | 追尾装置 |
US20190225351A1 (en) * | 2016-05-17 | 2019-07-25 | Thyssenkrupp Elevator Innovation Center S.A. | Method for positioning a passenger boarding bridge at an airplane |
JP6720414B2 (ja) | 2017-07-13 | 2020-07-08 | 新明和工業株式会社 | 旅客搭乗橋 |
JP2020175727A (ja) | 2019-04-16 | 2020-10-29 | 三菱重工交通機器エンジニアリング株式会社 | ボーディングブリッジ及びその制御装置 |
2022
- 2022-01-21: JP JP2023538216A, granted as patent JP7449454B2 (Active)
- 2022-01-21: WO PCT/JP2022/002158, published as WO2023007770A1 (Application Filing)
Also Published As
Publication number | Publication date |
---|---|
JP7449454B2 (ja) | 2024-03-13 |
JPWO2023007770A1 (ja) | 2023-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10875666B2 (en) | Passenger boarding bridge | |
US7990415B2 (en) | Image input device and calibration method | |
US10365090B2 (en) | Imaging device and imaging method | |
WO2023007770A1 (ja) | Detection system for detecting a boarding/deboarding section of an aircraft |
JP6960022B2 (ja) | Passenger boarding bridge |
JP5763986B2 (ja) | Mobile body and method for controlling mobile body |
JP7222113B2 (ja) | Imaging support device, imaging device, imaging system, imaging support system, imaging support method, and program |
JP7321749B2 (ja) | Boarding bridge and control device therefor |
WO2023188547A1 (ja) | Detection system for detecting a boarding/deboarding section of an aircraft |
JPH06214639A (ja) | Travel control device for mobile body |
CN111942391A (zh) | Articulated construction machine, panoramic surround-view system, and calibration method therefor |
JP7324800B2 (ja) | Travel control method for passenger boarding bridge |
JP5040938B2 (ja) | Parking assistance device and method for displaying parking assistance video |
JP2020066418A (ja) | Boarding bridge and boarding bridge control device |
CN212289808U (zh) | Articulated construction machine and panoramic surround-view system therefor |
JP6502570B1 (ja) | Boarding bridge and boarding bridge control device |
JP7457873B2 (ja) | Passenger boarding bridge |
WO2022208601A1 (ja) | Passenger boarding bridge |
JP2001186513A (ja) | Imaging device |
JP7224905B2 (ja) | Boarding bridge |
JP2989617B2 (ja) | Environment recognition device for mobile vehicle |
JP6845976B1 (ja) | Passenger boarding bridge |
JPH06214640A (ja) | Travel control device for mobile body |
CN117956670A (zh) | Imaging control method and apparatus, imaging system, and readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 22848854; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2023538216; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 18562487; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 2022848854; Country of ref document: EP |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2022848854; Country of ref document: EP; Effective date: 20240229 |