GB2532080A - Security inspection device image processing - Google Patents


Info

Publication number
GB2532080A
GB2532080A (application GB1419954.1A)
Authority
GB
United Kingdom
Prior art keywords
data
data stream
image
operable
ray
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1419954.1A
Other versions
GB201419954D0 (en)
GB2532080B (en)
Inventor
Hlebarov Vejen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Epicuro Ltd
Original Assignee
Epicuro Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Epicuro Ltd filed Critical Epicuro Ltd
Priority to GB1419954.1A priority Critical patent/GB2532080B/en
Publication of GB201419954D0 publication Critical patent/GB201419954D0/en
Publication of GB2532080A publication Critical patent/GB2532080A/en
Application granted granted Critical
Publication of GB2532080B publication Critical patent/GB2532080B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01V5/22
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N23/00Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
    • G01N23/20Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by using diffraction of the radiation by the materials, e.g. for investigating crystal structure; by using scattering of the radiation by the materials, e.g. for investigating non-crystalline materials; by using reflection of the radiation by the materials
    • G01N23/203Measuring back scattering
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01VGEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V11/00Prospecting or detecting by methods combining techniques covered by two or more of main groups G01V1/00 - G01V9/00
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0255Control of position or course in two dimensions specially adapted to land vehicles using acoustic signals, e.g. ultra-sonic signals
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02Control of position or course in two dimensions
    • G05D1/021Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88Sonar systems specially adapted for specific applications
    • G01S15/93Sonar systems specially adapted for specific applications for anti-collision purposes
    • G01S15/931Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G01S2015/937Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles sensor installation details
    • G01S2015/939Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles sensor installation details vertical stacking of sensors, e.g. to enable obstacle height determination

Abstract

A method of processing image data for security inspection for example of vehicles, and the corresponding device 1, comprises a continuous X-ray image generator, a light image generator, and a position sensor. The method comprises receiving a data stream from an X-ray image generator relating to an X-ray image of an object 6, receiving a data stream from a light image generator relating to a light (e.g. optical/visible or infrared) image of the object, receiving a data stream from a position sensor, or a plurality thereof, relating to the position of the security inspection device, and producing an output image data stream by combining the X-ray and light data streams in dependence upon the position data stream. The device 1 comprises a central processing unit (CPU) configured to perform the steps of this method, and may comprise a wheeled mobile robot system 4. The combination of the three data streams may depend on a confidence level of the absolute position of the security device and confidence levels derived from characteristics of the X-ray and optical or IR images.

Description

SECURITY INSPECTION DEVICE IMAGE PROCESSING
The present invention relates to image processing for an inspection device comprising an X-ray generator for backscatter inspection of motorised vehicles, in particular to an inspection device capable of inspecting a geometrically restricted opening provided beneath a target vehicle or object and the adjacent road surface.
Background to the invention
X-rays are typically used for security inspection of motorised vehicles. There are two types of X-ray systems according to how they collect the data: pulse X-ray generators and continuous X-ray generators. Continuous X-ray generators are preferred as they can be used to provide a single, continuous image of the detected object. It has, however, been found that conventional continuous X-ray generators emit a substantially greater heat output than pulse X-ray generators. The conventional continuous X-ray generators therefore require an additional fluid filled cooling jacket to surround the generator in order to satisfy the additional cooling requirement. As a result, the dimensions of the continuous X-ray generator together with cooling jacket are much larger than those of pulse X-ray generators. The opening provided between the underside of a motorised vehicle and an adjacent road surface may be extremely small, for example in the region of 130 mm. Conventional continuous X-ray generators can therefore not be used to inspect the geometrically restricted opening beneath motorised vehicles without using an additional elevator to raise the inspected vehicle.
Pulse X-ray generators may provide a number of individual scans. The distance between each individual scan may however be unknown or uncertain. As a result gaps may occur between sequential scans when a final image is constructed. Sequential images may also be unrelated and difficult, if not impossible, to join together with any degree of accuracy to form a single X-ray image.
There is therefore a need for an image processing technique for an inspection device comprising an X-ray generator for inspection of motorised vehicles, in particular for inspecting a geometrically restricted opening provided beneath motorised vehicles and the adjacent road surface without the disadvantages of the known X-ray generators.
Summary of the Invention
In accordance with a first aspect, the present invention provides a method of processing image data for a security inspection device comprising a continuous X-ray image generator, a light image generator, and a position sensor, the method comprising receiving a first data stream from an X-ray image generator, the first data stream relating to an X-ray image of an object, receiving a second data stream from a light image generator, the second data stream relating to a light image of the object, receiving a third data stream from a position sensor, the third data stream relating to the position of the security inspection device, and producing an output image data stream by combining the first and second data streams in dependence upon the third data stream.
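The first-aspect method can be pictured as a simple streaming pipeline. The sketch below is illustrative only: the function name `produce_output_stream`, the dictionary output format, and the toy data are assumptions and do not appear in the patent.

```python
# Hypothetical sketch of the first-aspect method: the three input streams
# are consumed together, and each output frame pairs the X-ray and light
# images keyed by the reported device position.

def produce_output_stream(xray_stream, light_stream, position_stream):
    """Combine an X-ray stream and a light stream, keyed by position data."""
    for xray_frame, light_frame, position in zip(
            xray_stream, light_stream, position_stream):
        yield {"position": position,
               "xray": xray_frame,
               "light": light_frame}

# Example with toy data: three line scans at positions 0, 1, 2 (mm).
output = list(produce_output_stream(["x0", "x1", "x2"],
                                    ["l0", "l1", "l2"],
                                    [0, 1, 2]))
```

In a real device the streams would arrive asynchronously and would need buffering and interpolation; the `zip` above simply stands in for that synchronisation step.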
Preferably, the security inspection device comprises a continuous back scatter X-ray image generator.
The security device light image generator may comprise one or more visual and/or infra red (IR) image generators.
Further, the security inspection device may comprise a plurality of positional sensors.
In one example, the step of producing an output image data stream comprises the steps of determining at least one image data characteristic from each of the first and second data streams, determining a confidence level for each such image data characteristic, determining an absolute position from the third data stream, determining a confidence level for the absolute position, and combining the first and second data streams in dependence upon the absolute position and the said confidence levels.
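One way to realise this confidence-based combination is sketched below. The choice of image characteristic (contrast), the mapping from characteristic to confidence, and the weighted blend are all assumptions for illustration; the patent does not specify these rules.

```python
# Illustrative sketch of the confidence-based combination step.
# The characteristic (contrast) and the weighting rule are assumed.

def contrast(pixels):
    """A simple image data characteristic: spread of pixel intensities."""
    return max(pixels) - min(pixels)

def confidence(pixels, lo=10, hi=200):
    """Map contrast onto a confidence level in the range 0..1."""
    c = contrast(pixels)
    return max(0.0, min(1.0, (c - lo) / (hi - lo)))

def combine(xray_pixels, light_pixels):
    """Blend the two images, weighting each by its confidence level."""
    wx = confidence(xray_pixels)
    wl = confidence(light_pixels)
    total = (wx + wl) or 1.0  # avoid division by zero if both are flat
    return [(wx * x + wl * l) / total
            for x, l in zip(xray_pixels, light_pixels)]
```

A high-contrast X-ray line thus dominates the blend where the light image is washed out, and vice versa.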
In another example, the step of producing an output image data stream comprises the steps of determining a first series of discrete images from the first data stream, determining respective position data relating to the first series of discrete images from the third data stream, determining a second series of discrete images from the second data stream, determining respective position data relating to the second series of discrete images from the third data stream, determining respective confidence levels for the position data, and combining the first and second series of discrete images in dependence upon the respective confidence levels.
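The discrete-image variant can be sketched as a stitching step in which overlapping scans are resolved by position confidence. The rule used here (keep the higher-confidence scan at each position) is an assumption; the patent leaves the resolution rule open.

```python
# A minimal sketch of stitching discrete line scans into one image using
# per-scan position data and confidence levels. The overlap-resolution
# rule (higher-confidence scan wins) is assumed for illustration.

def stitch(scans):
    """scans: list of (position, confidence, line) tuples."""
    best = {}
    for position, conf, line in scans:
        if position not in best or conf > best[position][0]:
            best[position] = (conf, line)
    # Emit lines in position order to form the final image.
    return [best[p][1] for p in sorted(best)]

image = stitch([(0, 0.9, "row-a"),
                (1, 0.4, "row-b"),
                (1, 0.8, "row-b2"),   # higher-confidence duplicate wins
                (2, 0.7, "row-c")])
```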
In such an example, the position sensor may comprise a plurality of position sensors, and wherein the third data stream includes data items from each of the position sensors, and confidence levels are determined for each of the position sensors.
The light image generator may include a visible light sensor operable to produce data relating to a visible light image of the object.
The light image generator may include an infra-red light sensor operable to produce data relating to an infra-red light image of the object.
The position sensors may include at least one wheel sensor, and an infrared-based positioning system.
The infra-red light sensor may be operable to produce an infra-red light image and produce position data as part of such an infrared-based positioning system.
According to a second aspect of the present invention, there is provided a security inspection device comprising a continuous X-ray image generator, a light image generator, a position sensor, and an image processing system comprising a first interface operable to receive a first data stream from the X-ray image generator, the first data stream relating to an X-ray image of an object, a second interface operable to receive a second data stream from the light image generator, the second data stream relating to a light image of the object, a third interface operable to receive a third data stream from the position sensor, the third data stream relating to the position of such security inspection device, and a central processing unit operable to produce output image data by combining the first and second data streams in dependence upon the third data stream.
In one example, the central processing unit comprises an image processing element operable to determine at least one image data characteristic from each of the first and second data streams, and to determine a confidence level for each such image data characteristic, a position processing element operable to determine an absolute position from the third data stream, and to determine a confidence level for the absolute position, and a decision processing element operable to combine the first and second data streams in dependence upon the absolute position and the said confidence levels.
In another example, the central processing unit is operable to determine a first series of discrete images from the first data stream, and to determine a second series of discrete images from the second data stream, and wherein the central processing unit comprises a position processing element operable to determine respective position data relating to the first series of discrete images from the third data stream, and to determine respective position data relating to the second series of discrete images from the third data stream and to determine respective confidence levels for the position data, the central processing unit being operable to combine the first and second series of discrete images in dependence upon the respective confidence levels.
The position sensor may comprise a plurality of position sensors, in which case, the third data stream includes data items from each of the position sensors, and wherein the central processing unit is operable to determine a confidence level for each of the position sensors.
In one example, the light image generator includes a visible light sensor operable to produce data relating to a visible light image of the object.
In one example, the light image generator includes an infra-red light sensor operable to produce data relating to an infra-red light image of the object.
In one example, the position sensors include at least one wheel sensor, and an infrared-based positioning system. In such an example, the infra-red light sensor is operable to produce infra-red light image data, and to produce position data as part of the infrared-based positioning system.
Brief Description of the Drawings
An embodiment of the invention will now be described, by way of example only, and with reference to the accompanying drawings, in which:
Figure 1 illustrates a side view of a security inspection device;
Figure 2 illustrates a front view of the device of Figure 1;
Figure 3 illustrates a view from above of the device of Figure 1;
Figure 4 illustrates a cross-sectional view of the device of Figure 1;
Figure 5 illustrates views of the wheels of the device of Figure 1;
Figure 6 illustrates a side view of apparatus comprising the device of Figure 1 and an elevator;
Figure 7 illustrates a view from above of the apparatus of Figure 6;
Figure 8 illustrates a front view of the apparatus of Figure 6;
Figure 9 illustrates a scanning path of the device of Figure 1;
Figure 10 illustrates an image processing unit for the device of Figure 1; and
Figure 11 illustrates an image processing method for the unit of Figure 10.
Detailed Description of Embodiments of the Invention
As described above, it is desirable to provide an improved image processing technique for security devices that use a number of imaging and positioning sources to produce a final image. Embodiments of the present invention aim to provide such an improved technique, and will be described with reference to a particular example of security imaging device. It will be readily appreciated that the techniques and concepts embodying the present invention are applicable to any suitable security device, whether similar or not to that described below.
With reference to Figures 1 to 9 of the attached drawings, the vehicle security inspection device 1 comprises an X-ray generator 13 in a continuous scanning mode. The X-ray generator 13 has a substantially flat transverse profile.
The X-ray generator 13 may have any suitable transverse shape depending on the requirements for the inspection device. In the illustrated embodiment, the X-ray generator 13 has a substantially rectangular transverse shape.
The X-ray generator 13 illustrated in the embodiment is rated at 40-80 kV / 2-10 mA. It is however to be understood that any X-ray generator 13 with suitable ratings may be used. For example, the X-ray generator is preferably rated between about 20 and 100 kV, at between 2 and 100 mA, with a power of less than about 1 kW.
The device 1 further comprises a cooling fluid mechanism 60 as shown in Figure 10. The cooling mechanism 60 comprises oil which is substantially free of water. It is however to be understood that the cooling mechanism may comprise any suitable cooling fluid. The cooling mechanism comprises a cooling fluid pathway 62 and a pump 63 arranged in use to pump the oil around the cooling fluid pathway 62. The cooling fluid pathway 62 extends around the periphery of the continuous X-ray generator 13. The X-ray generator 13 has a pair of opposed side portions 64, 64' and a pair of opposed end portions 65, 65'. The cooling fluid pathway 62 is arranged to extend adjacent to and along at least a portion of each of the side portions 64, 64' and end portions 65, 65' as shown in the Figures. It is however to be understood that the cooling fluid pathway 62 may extend across a single side or end portion, or any combination thereof, depending on the requirements for the device 1.
It can be seen from the figures that the cooling fluid pathway 62 does not completely surround the X-ray generator 13. In particular, the cooling fluid pathway 62 does not extend across the upper surface 66 of the X-ray generator 13, for example the surface of the X-ray generator 13 arranged in use to be positioned adjacent the underside of the target vehicle. The cooling fluid pathway 62 extends from the pump 63 to a radiator 67 (composed of copper and/or aluminium) which provides a cavity for effective heat exchange. The radiator 67 is exposed to the surface of the device 1. Two fans 68 provide additional cooling of the surface of the radiator 67. It is to be understood that the device 1 may include any suitable number of fans and/or radiators and/or cooling fluid pathways depending on the requirements of the device 1. The fans 68 are operable only when the temperature of the radiator 67 reaches a predetermined critical value. It is important that the X-ray generator 13 neither overheats nor is excessively cooled by the fans 68, as the X-ray generation depends on the temperature of the target (anode) and the cathode in the X-ray tube.
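The fan-control rule just described can be sketched as a simple thermostat with hysteresis, so the fans neither run continuously nor chatter around the threshold. The temperature values and the lower resume bound are invented for illustration; the patent only specifies a single predetermined critical value.

```python
# A hypothetical sketch of the fan-control rule: the fans run only once
# the radiator reaches a critical temperature, and stop again below a
# lower bound so the X-ray tube is not over-cooled. Both temperatures
# here are assumed values, not taken from the patent.

CRITICAL_C = 60.0   # fans switch on at or above this radiator temperature
RESUME_C = 45.0     # fans switch off again below this, giving hysteresis

def fan_state(temperature_c, fans_on):
    """Return the new on/off state given the radiator temperature."""
    if temperature_c >= CRITICAL_C:
        return True
    if temperature_c < RESUME_C:
        return False
    return fans_on  # within the band, keep the current state
```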
The continuous X-ray generator 13 comprises an amorphous insulator material of ceramic 69. It is however to be understood that the amorphous insulator material may be selected from any suitable plastic and/or ceramic, such as for example quartz; 'TORELINA' PPS polyphenylene sulphide; TFE tetrafluoroethylene; 'TEFLON' PTFE polytetrafluoroethylene; PFA perfluoroalkoxy; 'TECAPEEK'; or any other suitable material.
The device 1 further comprises two spaced apart X-ray detectors 3a, 3b. The X-ray detector(s) 3a, 3b are elongate in shape and extend spaced apart from and substantially parallel to each other to detect the X-ray images. The detectors 3a, 3b may however have any suitable shape and be arranged in any suitable configuration for detecting X-ray images from an adjacent surface 6 of the target vehicle or object which is to be inspected. The detectors 3a, 3b are photomultiplier cells with concentrators having a reflective coating with phosphor to produce light when hit by a photon. The captured X-ray image is made up from a series of line scans made by the synchronous operation of the X-ray generator and the detectors 3a, 3b. Backscatter in the X-ray inspection allows the mobile robotic system 4 to capture data for inaccessible areas that otherwise limit or obstruct its movement and positioning, particularly at the perimeter of the under-vehicle workspace. These areas correspond to such items as the elevators for the inspected vehicle, wheels, tyres, wheel arches, brake drums, linkages and any other motor vehicle element.
The device 1 comprises a mobile robot system 4 arranged in use to move and/or position and/or orientate the X-ray generator 13 within a cavity 5 provided between for example the underside of a target vehicle or object 6 and a road surface 7. The mobile robot system 4 comprises a motorised platform. The mobile robot system 4 shown in the figures is automated. It is however to be understood that the X-ray generator 13 may be in communication with a manipulator which is manually controlled.
The mobile robot system 4 is arranged to carry and locate the continuous X-ray generator and/or to move the X-ray generator 13 into the desired location and/or orientation within the cavity 5 relative to the adjacent surface 6 of the target vehicle or object being inspected with improved accuracy such that the device 1 may be used to inspect parts of trains, planes, missiles, boats, plants, machinery and other suitable structures requiring inspection. The continuous X-ray generator 13 may be operable, with the mobile robot system 4, to traverse contoured surfaces and/or surfaces which extend at an angle to the adjacent road surface, such as for example the front, back, sides, roof or interior of a target vehicle or object.
The X-ray generator 13 is arranged to scan only when the motorised platform of the mobile robot system 4 is moving in a single direction. The X-ray generator 13 is therefore arranged on the motorised platform such that the generator does not scan when the platform is turning. It is however to be understood that the generator may be arranged such that the generator can scan while turning. It is also to be understood that the X-ray generator 13 may be operable independent of the motion of the motorised platform of the mobile robot system 4. The motion of the motorised platform is preferably synchronised with the detection rate of the X-ray images by the X-ray detector(s). The detection rate may for example be in the range of from 15 to 100 lines per second.
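The synchronisation between platform motion and detection rate amounts to simple arithmetic: to avoid gaps or overlap between line scans, the platform should advance one scan-line width per detected line. The 1 mm line width assumed below is illustrative; the patent quotes only the detection-rate range.

```python
# Illustrative arithmetic linking platform speed to the line-scan
# detection rate. The line width of 1 mm is an assumed value.

def platform_speed_mm_s(lines_per_second, line_width_mm=1.0):
    """Speed at which each detected line covers exactly one line width."""
    return lines_per_second * line_width_mm

# At the quoted detection-rate range of 15 to 100 lines per second and a
# 1 mm line width, the platform would move between 15 and 100 mm/s.
low = platform_speed_mm_s(15)
high = platform_speed_mm_s(100)
```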
Due to the unpredictability of the cavity and ground topography, the mobile robot system 4 comprising a motorised platform cannot be driven or commanded to move along an exact path and/or at a constant speed within the cavity 5. Wheel slippage or partial loss of ground contact may occur during movement of the motorised platform of the mobile robot system 4. In order to provide a single continuous X-ray image, there needs to be accurate positioning and/or orientation of the X-ray generator 13 relative to the adjacent surface 6 of the target vehicle or object.
As shown in the Figures, the motorised platform of the mobile robot system 4 comprises four wheels 8. It is however to be understood that the motorised platform of the mobile robot system 4 may comprise any suitable number of wheels. Preferably, the motorised platform comprises at least three wheels.
The motorised platform has independent drives for each wheel 8, each capable of being orientated about a vertical axis for steering purposes. It is however to be understood that the motorised platform may have no steering or be arranged such that the steering is associated with any number of wheels 8. Wheels with no steering preferably have a horizontal axis with a motion drive motor. In the illustrated embodiment each wheel 8 is associated with two motors. A first motor 9 is set on a horizontal axis and is arranged to provide wheel rotation. The second motor 10 is set on a vertical axis and is arranged to provide directional steering of the wheel. The first 9 and second 10 motors have continuous synchronisation with each other to avoid conflicts such as for example two wheels 8 driving in opposite directions.
The drive motors 9, 10 may be stepping motors or any type of motor capable of delivering a rotating action. Power and commanding of the mobile robot system 4 in respect of all its functions including mobility is delivered to the motorised platform by means of a flexible umbilical 11. It is however to be understood that the power and commands may be transmitted to the motorised platform by any suitable means. The umbilical 11 comprises an insulated outer tube. Power and/or signal communications are located within a hollow bore extending within the or each umbilical 11. The umbilical 11 extends between the mobile robot system 4 and the operator command module (not shown). It is to be understood that the device 1 may comprise any suitable number of umbilicals 11 depending on the particular requirements for the device 1.
The device 1 has improved tracking capability compared to conventional inspection devices. The device 1 may for example be arranged so as to track the locality of the X-ray generator 13 to within less than 1 mm of its actual location. It is however to be understood that the accuracy of the tracking capability of the device 1 may vary. Preferably, the device 1 enables the position of the X-ray generator 13 to be tracked to within less than 10 mm, more preferably less than 5 mm, for example less than 2 mm of its actual position.
In order to achieve accurate positioning of the device 1, and in particular the X-ray generator 13, relative to an adjacent surface 6 of the target object or vehicle, the device, for example the mobile robot system 4, comprises a visual image processing system responsive to three independent input data streams to measure location and/or orientation of the X-ray generator in order to minimise location error. The three input data streams are: wheel encoders 12, 14; the NorthStarTM system 15; and computer vision. It is however to be understood that the visual image processing system may comprise any desired number of data streams, such as the ones mentioned above, in any suitable combination, depending on the particular requirements for the device 1. The NorthStarTM positioning system is a visible light/IR light based position detection system for robots and movable devices, which is contained on the device. The system uses projected light to produce a series of reference points which are detected. The location of the device can then be determined. Alternative position determining systems may be used with embodiments of the present invention.
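Combining the three position sources into a single location estimate can be sketched as a confidence-weighted average. This fusion rule, the function name `fuse_position`, and the numeric values are assumptions for illustration; the patent does not prescribe a specific fusion algorithm.

```python
# A hedged sketch of fusing the three position sources (wheel encoders,
# the beacon-style NorthStar system, and computer vision) into one
# estimate. The confidence-weighted average is an assumed rule.

def fuse_position(estimates):
    """estimates: list of (x_mm, y_mm, confidence) from each source."""
    total = sum(conf for _, _, conf in estimates)
    if total == 0:
        raise ValueError("no usable position estimate")
    x = sum(px * conf for px, _, conf in estimates) / total
    y = sum(py * conf for _, py, conf in estimates) / total
    return x, y

# Wheel odometry drifted slightly (e.g. wheel slippage); the beacon
# system and computer vision agree, so they dominate the estimate.
x, y = fuse_position([(102.0, 50.0, 0.2),   # wheel encoders
                      (100.0, 50.0, 0.5),   # beacon system
                      (100.0, 50.0, 0.3)])  # computer vision
```

Down-weighting the encoders when slippage is detected (see the tactile and force sensing described below) is one natural way to set the per-source confidences.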
As shown in the Figures, the motorised platform comprises four ultrasonic sensors 16. The four sensors 16 are arranged within two pairs. Each pair is located towards the front 18 and adjacent opposing sides 20, 22 of the mobile robot system 4. The ultrasonic sensors 16 are located ahead of the X-ray generator 13 with respect to the direction of travel of the mobile robot system 4. It is however to be understood that the number and/or location of the sensors 16 may vary depending on the requirements of the inspection device 1. The ultrasonic sensors 16 may detect signals within any suitable range, preferably in the range of from 50 mm to 100 mm.
The mobile robot system 4, preferably the motorised platform, may include any number of tactile sensing and/or force sensing sensors in any suitable location. Slippage, loss of traction or resistance to motion of one or more wheels 8 may be detected.
In order to further improve the accuracy of the path travelled by the motorised platform, the mobile robot system 4 further includes a webcam 24. It is to be understood that the device 1 may include any suitable number of webcams 24 in any suitable location. In the illustrated embodiment, the webcam 24 is located on the upper surface 26 of the mobile robot system 4 and is located towards the rear side 30 of the device 1, i.e. behind the X-ray generator 13 with respect to the direction of travel of the device 1. The webcam includes a fixed focus wide angle lens with automatic electronic distortion correction by execution of image processing logic. It is to be understood that any suitable webcam may be located on the device 1.
The webcam(s) 24 may be mounted in any suitable way with respect to the device 1. For example, the or each webcam(s) 24 may be mounted on two motor driven axes. A first motor driven axis may be arranged to adjust the pitch of the webcam 24. The second motor driven axis may be arranged to give yawing of the webcam 24 to within, for example +/-180 degrees.
The mobile robot system 4, for example the motorised platform, may further include one or more light sources, such as for example low wattage LED light sources, located in any suitable location for the purposes of illumination during digital image capture. In the illustrated embodiment, the device 1 comprises nine LED light sources 28. Three LED light sources 28 are spaced apart from each other and located adjacent the first side 20 of the mobile robot system 4. A further three LED light sources 28 are spaced apart from each other and located adjacent the second side 22 of the mobile robot system 4. A further three LED light sources 28 are spaced apart from each other and located adjacent the rear side 30 of the mobile robot system 4. It is to be understood that the device 1 may include any suitable number of light sources 28 in any suitable arrangement.
The mobile robot system 4 further includes two infrared detection cameras 32, 33. It is however to be understood that the device 1 may comprise any suitable number of infrared detection cameras in any suitable location and/or arrangement.
A first infrared detection camera 32 is located on the upper surface 26 of the mobile robot system 4 and is located towards the rear side 30 of the device 1, i.e. behind the X-ray generator 13 with respect to the direction of travel of the device 1. The first infrared detection camera 32 is located adjacent the webcam 24. The second infrared detection camera 33 is located on the front panel 18 of the mobile robot system 4. The infrared detection camera(s) 32, 33 may be mounted in any suitable way with respect to the mobile robot system 4. For example, one or more of the infrared detection cameras 32, 33 may be mounted on two motor driven axes. A first motor driven axis is arranged to adjust the pitch of the infrared detection camera 32, 33. The second motor driven axis is arranged to give yawing to the infrared detection camera 32, 33, in the range of for example between +/-180 degrees.
The ultrasound sensors 16, webcam 24, infrared detection cameras 32, 33 and the NorthStarTM system can be operated and used collectively, individually, or in any combination to enable the required motion paths and positioning of the mobile robot system 4 to be achieved. These sensors operate automatically, with optional operator intervention, supported by computer encoded logic that may be artificial neural networking, Bayesian certainty, fuzzy logic, or any other logic or combination thereof. The automatic control may perform self-configuring optimisation.
The mobile robot system 4 further comprises an X-ray collimator 34. The X-ray collimator 34 may be located on the mobile robot system 4 at any suitable position. As shown in the Figures, the X-ray collimator 34 is shown as protruding from the upper surface 26 of the mobile robot system 4. The X-ray collimator 34 is substantially centrally located with respect to the mobile robot system 4. As shown in the Figures, the X-ray collimator 34 is located approximately equidistant from each of the X-ray detectors 3a, 3b. The X-ray collimator 34 is covered with a protective plastic strip (not shown). It is to be understood that the X-ray collimator 34 may be covered with any other suitable protective material which does not affect the X-ray function.
The X-ray collimator 34 produces a beam having an intended angle of incidence as determined by the particular requirements for inspection of an adjacent surface 5 of a target vehicle or object. In the illustrated embodiment, the X-ray collimator 34 has an included angle of 120 degrees for a radius of 150 mm. The X-ray source is located at the centre of the collimator, which has protective covers on both sides of its circular form. The collimator 34, which rotates at high speed, has at least three radial holes for emitting the X-rays. An accelerator tube is included.
In use one or more of the following operations, in any suitable combination, may be carried out in order to inspect a target vehicle or object with the security device:
(i) one or more of the dimensions of the target vehicle or object, such as for example the wheel base and/or track width, together with the minimum ground clearance dimensions between the underside of the target vehicle/object and the adjacent road surface, are determined using available data or one of several measurement methods;
(ii) a vehicle elevator, if required, is provided depending on the outcome of (i);
(iii) the vehicle to be inspected is placed in the inspection area or located on the vehicle elevator within the inspection area;
(iv) the inspection device is located adjacent to the vehicle to be inspected;
(v) motion path(s) of the motorized platform within the cavity are planned;
(vi) the device is operated to enter the cavity provided between the underside of the vehicle/object and the adjacent road surface. The device is operable to be maneuvered along one or more motion path(s) within the cavity to carry out an X-ray survey of all or parts of the vehicle. Electrical power and/or signal communications are transferred between the device 1 and a command centre by means of the one or more umbilicals 11. Using any data output means, such as a visual display or alarm, the attending operator is informed of anomalies in the X-ray based imagery. Causes of such anomalies include chemicals, materials or any item incorporated in any way, located on the surface of, or buried within, the parts of the inspected vehicle.
In use, the X-ray generator 13 is positioned as required on the mobile robot system 4, in this case on the motorised platform of the mobile robot system 4. The motorised platform, together with the X-ray generator 13 in a continuous scanning mode, is directed into the cavity 5 provided between the underside 6 of the vehicle and the adjacent road surface 7. The X-ray generator 13 only operates when the motorised platform is moving in a single direction. Cooling fluid is pumped along the cooling pathways to maintain the X-ray generator 13 at a workable operating temperature. The upper surface 26 of the mobile robot system 4 is preferably positioned at least 100 mm from the adjacent surface 6 of the target object or vehicle to be inspected. If the clearance is less than 100 mm, the target vehicle or object is placed on an elevator and raised to a sufficient height in order to provide at least the minimum clearance for the mobile robot system 4 prior to inspection.
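The clearance rule described above can be sketched as a small helper. The 100 mm minimum is taken from the description; the function name and the optional larger working clearance are illustrative assumptions.

```python
def elevator_lift_needed(ground_clearance_mm, required_clearance_mm=100):
    """Lift (in mm) the elevator must provide so the platform fits.

    Returns 0 when the cavity already offers the required clearance.
    The 100 mm minimum comes from the description; the helper itself
    is only an illustrative sketch.
    """
    return max(0, required_clearance_mm - ground_clearance_mm)
```

For example, an 80 mm gap needs a 20 mm lift to reach the 100 mm minimum, or 270 mm to reach the more comfortable 350 mm clearance mentioned later in the description.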
As shown in the Figures, in order to provide a sufficient clearance, such as for example at least 350 mm, within the cavity 5 for the device 1 of the present invention an elevator 40 may be used. In the illustrated embodiment, the elevator 40 has two elevating portions 42, 42'. The first elevating portion 42 is arranged in use to provide an elevation path for the left hand side of the vehicle 100 to be inspected. The second elevating portion 42' is arranged in use to provide an elevation path for the right hand side of the vehicle 100 to be inspected. The first and second elevating portions 42, 42' are adjustable such that in use the vehicle or object to be inspected is raised to an approximately level standing position.
Many designs for the elevator are possible. The preferred design of the elevator 40 is a lightweight, modular construction. The elevator is preferably composed of one or more of: aluminium, aluminium alloy, mild steel, metal, carbon fibre, reinforced composite, fibre glass composite, laminated timber, or other high strength to weight ratio components, or any combination thereof.
The elevator 40 comprises a number of interlocking elements to stabilise the elevator 40 and to provide a structural tie along the length of each elevator portion 42, 42'. The elevator 40 further comprises ramped elements located at opposing ends 43a, 43b, 44a, 44b of each elevating portion 42, 42', arranged to provide a ramp for receiving the vehicle 100. Each elevator element has a weight which does not exceed 20 kg. The maximum vertical height of the elevator 40 is typically in the region of between 220 mm and 250 mm. It is however to be understood that the maximum vertical height of the elevator 40 can vary or be adjusted depending on the particular requirements for the elevator 40.
It is to be understood that the number of elements within each elevator portion 42, 42' may vary depending on the particular requirements for inspecting a particular vehicle. For example, the number of elements within each elevator portion 42, 42' between each ramped portion 43, 44 may vary depending on the particular requirements for inspecting a particular vehicle. The wheel base, track width and weight statistics of a candidate vehicle for inspection by the invention may be gathered from a database, such as one available on the internet. Alternatively, the vehicle can be measured with respect to its track width and wheel base dimensions. This can be done using various devices including calibrated image processing, a survey wheel, tape measurement or an encoded pull wire, for example. This data is then used to correctly interlock and position the elevator portions 42, 42' to provide an elevator 40 having the correct dimensions and spacing for receiving the vehicle. The two elevator portions 42, 42' can be aligned substantially parallel and spaced apart from each other by a predetermined distance for a particular vehicle by means of distance measuring lasers and targets incorporated into opposite pairs of ramp elements at both ends of each elevator portion 42, 42'.
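The sizing step above — using the measured wheel base to select the number of interlocking elements per elevating portion — might be sketched as follows. The 500 mm element length and the end margins are hypothetical values; the description fixes only the per-element weight limit (20 kg) and the overall height (roughly 220 mm to 250 mm).

```python
import math

ELEMENT_LENGTH_MM = 500  # assumed length of one interlocking flat element

def elements_per_portion(wheel_base_mm, end_margin_mm=600):
    """Flat elements needed in one elevating portion so the standing
    zone covers the wheel base plus a margin at each end (the ramped
    end elements are counted separately)."""
    standing_zone_mm = wheel_base_mm + 2 * end_margin_mm
    return math.ceil(standing_zone_mm / ELEMENT_LENGTH_MM)
```

Under these assumptions, the 4 m wheel base vehicle of Figure 9 would need eleven flat elements per portion.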
The elevator 40 may be a permanent or temporary construction. Forced stop, chocks and/or stop warning devices can also be incorporated into the elevator 40 to ensure the candidate vehicle is correctly located and restrained in the standing zone 44 of the elevator.
The topography of the ground over which the device travels is not required to be in a single plane and/or horizontal. The manipulator can adjust the position of the X-ray generator relative to the ground topography in order to provide the relevant scanned image.
The motorised platform, on the mobile robot system 4, is moved into the cavity 5. The motorised platform follows a planned path looping backwards and forwards relative to the target vehicle or object. The number and/or spacing and/or length and/or orientation of the or each path depends on the particular requirements for each individual vehicle or object, such as for example the wheel base and track width of the vehicle.
As shown in Figure 9, in order to scan a vehicle with a wheel base of 4 m and a track width of 2 m, the motorised platform follows the illustrated paths, as indicated by arrows 50 illustrating the direction of travel of the device 1. The planned path of the motorised platform may however be modified automatically on a real time basis using microprocessor based logic that addresses collision avoidance, detection and collision recovery of the mobile robot system. The planned path may take any form, such as for example linear or curved, or turn in any manner, e.g. at a right angle. As shown in Figure 9, the device 1 passes across the underside of the vehicle three times. It is however to be understood that the device may pass across the underside of the vehicle any suitable number of times, for example 4, 5 or more passes, depending on the requirements for inspection of the vehicle, such as for example the distance of the inspected underside of the vehicle from the device 1. The greater the distance, the fewer passes are required, as the X-ray frustum widens.
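The relationship between standoff distance and the number of passes can be illustrated with simple fan-beam geometry. Treating the collimator's 120-degree included angle (from the earlier description) as the usable imaging fan, and the ceiling-division pass count, are illustrative assumptions.

```python
import math

def passes_required(target_width_mm, standoff_mm, fan_angle_deg=120.0):
    """Estimated parallel passes needed to cover a target of the given
    width: the swath covered per pass widens with standoff distance as
    swath = 2 * d * tan(fan_angle / 2)."""
    half = math.radians(fan_angle_deg) / 2.0
    swath_mm = 2.0 * standoff_mm * math.tan(half)
    return max(1, math.ceil(target_width_mm / swath_mm))
```

For a 2 m track width, a 150 mm standoff gives a roughly 520 mm swath and hence more passes than a 350 mm standoff, consistent with the observation that greater distance requires fewer passes.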
The device 1, for example the X-ray generator 13 on the motorised platform, is freely moveable and/or orientatable within the cavity 5. Movement of the device 1 in relation to the target vehicle or object is not restricted to a fixed path and/or speed, and/or orientation, and/or transit within an unstructured, undefined, unpredictable or dimensionally unknown cavity. The device 1 does not require guide members, such as for example rails, to define the pattern and/or range of movement of the device relative to the vehicle. In some embodiments, the device 1 may however be moveable along guide members relative to the vehicle.
The X-ray detectors 3a, 3b detect the X-rays received from the underside 6 of the target object or vehicle. The speed of the motorised platform is adjusted according to the detection rate by the X-ray detectors 3a, 3b. The device 1 generates accurately joined up X-ray images due to improved accuracy of locating the X-ray generator 13 within the cavity 5 to provide a single, continuous image of the underside 6 of a target vehicle or object of any dimensions, shape, axial and wheel configuration located on a surface of any topography.
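One plausible rule for adjusting the platform speed to the detection rate is a bounded proportional controller. All constants here are hypothetical, since the description states only that speed follows the detection rate.

```python
def adjust_speed(current_mm_s, counts_per_s, target_counts=5000.0,
                 gain=0.2, v_min=10.0, v_max=200.0):
    """Speed up when the detectors see more than the target count rate,
    slow down when they see less; clamped to a safe speed band.
    All thresholds and gains are illustrative assumptions."""
    error = (counts_per_s - target_counts) / target_counts
    return min(v_max, max(v_min, current_mm_s * (1.0 + gain * error)))
```

A low count rate (thick or dense material) then slows the platform, lengthening the exposure per strip; a high count rate allows faster scanning.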
The mobile robot system 4 may be driven in any suitable direction, such as for example along optimal straight lines to minimise image processing, keeping its course by the fusion of three data inputs. In order to assess the location of the X-ray generator 13, three systems are used, alone or in combination: absolute NorthStarTM positioning, relative encoders and image processing. The invention means that it is not necessary to position the target vehicle or object to be inspected in any special way. Unlike previous systems that use two beacons, the device of the present invention may be responsive to four TrueTrackTM infrared beacon sources for NorthStarTM, irradiating on different frequencies. The electronic circuit uses computer implemented logic to locate at least two of the beacon markers in order to satisfy the calculation of the absolute position of the continuous series of X-ray images. The relative position system relies on continuous feedback produced by the wheel encoders (multi-directional rotation and forward/backward encoders). Encoder slippage is compensated for by applying the two other independent data sets; in this case the mobile robot system 4 uses both remaining systems to calculate its location. The visual image processing system (which is always on) indicates if the mobile robot system 4 moves sharply in one direction or another. This may happen, for example, if one or more of the wheels rides over a stone, or if the robot 4 has touched another vehicle wheel. Based on this information, the mobile robot system 4 reduces the positional error to within a range of between 1 mm and 5 mm in the scanning plane, which is important for continuous X-ray scanning.
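The three-way fusion described above — absolute NorthStarTM positioning, relative encoders, and visual image processing, each carrying a trust weight — can be sketched as a weighted average of position estimates. The weighted-mean scheme is an illustrative simplification of the fuzzy-logic processing described later.

```python
def fuse_position(estimates):
    """Fuse (x, y) position estimates from several sources.

    `estimates` maps a source name to ((x, y), trust_weight).
    Sources with higher trust pull the fused estimate toward them.
    """
    total = sum(w for _, w in estimates.values())
    x = sum(p[0] * w for p, w in estimates.values()) / total
    y = sum(p[1] * w for p, w in estimates.values()) / total
    return (x, y)
```

If the visual stream flags a sharp movement, the encoder source's weight would be reduced before fusing, shifting the estimate toward the absolute and visual sources.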
Figure 10 illustrates an image processing unit embodying one aspect of the present invention, and suitable for use in the security device described with reference to Figures 1 to 9.
The image processing unit 100 includes a central processing unit 101, which receives inputs from various parts of the device described above. A webcam interface 103 receives webcam data 102 from the or each webcam 24, an IR camera interface 103 receives IR camera data from the infra-red camera 32, an X-ray interface 104 receives X-ray data from the X-ray receivers 3a and 3b, an encoder interface 105 receives encoder data from the encoders 13 and 14, and a position interface 105 receives position data from the NorthStarTM positioning system. The central processing unit 101 operates, as will be described in more detail below, to produce combined image data 108 which is output from an output device 107. The output device also operates to supply the combined image data to the central processing unit 101, via a feedback loop 109, as prior image data 110. This prior image data 110 is combined with the other input data by the central processing unit 101 to produce new combined image data.
The central processing unit 101 is preferably provided by a neural network processor, for example including fuzzy logic processing. The central processing unit 101 uses received input data to determine positional and orientation data relating to the X-ray unit. This makes it possible to construct a single image from a series of continuous scans. This may be a full scan of the underside of a randomly selected road vehicle, for example. At the same time, the central processing unit 101 calculates errors and produces feedback for the learning part of the neural network, leading to improved performance. In the present example, the fuzzy logic processing of the central processing unit 101 uses 'probability of trust' parameters for each of the input data sources in order to 'offset' data from one or more of the other input data sources, if these sources reveal conflicts in understanding the position of the robot platform that moves the continuous X-ray scanner within the working envelope.
A schematic diagram of the fuzzy logic processing of the central processing unit 101 is presented in Figure 11. The fuzzy logic processing comprises video data processing 120, position data processing 130, a confidence database 140, and a decision processor 150.
The video data processing 120 includes contrast analysis 121 and overlapping image analysis 122. In addition, the video data processing includes confidence processing 123 which assigns a confidence level to each of the video data sources.
The position data processing 130 includes position analysis 131, and a position confidence processing 131 which assigns a confidence level to each position data source.
The confidence database 140 stores previously determined confidence levels.
The decision processor 150 uses the outputs of the video processing, position processing and confidence database in order to update the images and re-assign confidence for the new images.
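The decision processor's combination step might be sketched as a confidence-weighted blend of co-registered image layers (X-ray, visible, IR). The scalar per-source confidences stand in for the stored confidence levels; using a simple weighted mean is an assumption, not the patented method.

```python
import numpy as np

def blend_layers(layers):
    """Blend co-registered image layers by their confidence levels.

    `layers` is a list of (image_array, confidence) pairs; all images
    must share the same shape and pixel grid. Higher-confidence layers
    dominate the blended output.
    """
    num = sum(img.astype(float) * c for img, c in layers)
    den = sum(c for _, c in layers)
    return num / den
```

After blending, the decision processor would write the updated confidence levels back to the confidence database for the next cycle.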
The video input data for the image processing comes from three sources at the same time: the X-ray receivers, the webcam (for unusual shapes, plates and welds, as previously stated) and the infrared (IR) camera, which assists detection of foreign objects within the working envelope. In an embodiment of the present invention, the infrared camera is used not only to locate the position of the robot but also to provide data on thermal anomalies in the target being viewed and assessed. For example, a normally hot pipe may appear to be partially insulated, due to foreign chemical matter detected and classified by the continuous X-ray processing unit. This IR image is recorded and is superimposed on the visual information.
The visual information comes from normal visual light cameras (two cameras).
The following processes are typically implemented in the image processing embodying an aspect of the present invention.
- Data validation (each pixel is evaluated separately)
- Data normalisation (feedback to the generator and the detectors)
- Data stitching based on image edge matching
- Data colour separation (stitching for each colour channel)
- Analysis of the data in comparison to the other visual data streams
- Data location synchronisation (with the visual data)
- Data plotting (presenting on the screen)

The IR cameras are used both for determining position (as part of the NorthStarTM system) and for providing infrared data for combination with the other image data provided by the webcam and X-ray systems. Making use of the infrared camera(s) for multiple purposes visibly improves the images by adding one more wideband (IR) channel to the data presented for security inspection.
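The edge-based stitching step could be sketched, for one greyscale channel, as a search for the overlap length at which the edge profiles of two consecutive strips agree best; in the described system this would be run per colour channel. The gradient-based edge measure and the normalised dot-product score are illustrative choices.

```python
import numpy as np

def stitch_offset(strip_a, strip_b, max_shift=50):
    """Estimate how many row transitions of strip_b overlap the end of
    strip_a.

    Edge strength along the scan direction is summed across each row;
    the overlap length whose edge profiles agree best (highest mean
    product) is returned.
    """
    ea = np.abs(np.diff(strip_a.astype(float), axis=0)).sum(axis=1)
    eb = np.abs(np.diff(strip_b.astype(float), axis=0)).sum(axis=1)
    best, best_score = 0, -np.inf
    for s in range(1, min(max_shift, len(ea), len(eb))):
        score = float(np.dot(ea[-s:], eb[:s])) / s
        if score > best_score:
            best, best_score = s, score
    return best
```

Once the overlap is known, the two strips can be joined without duplicating or dropping rows, which is what preserves the fine detail across strip boundaries.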
At an initial stage, the IR cameras are positioned to provide the IR sensors of the NorthStarTM system. This system enables the exact absolute position of the security robot to be determined and stored. Following this position determining stage, the IR cameras are rotated to a position in which they are able to inspect the underside of the target vehicle. At this moment the image processing system of the IR cameras locks on to the two brightest spots of the image and, knowing its absolute position, assigns absolute coordinates to them. The target vehicle is static during inspection, so absolute coordinates for these points are determined and used at all times as reference points, without further use of the NorthStarTM system. When one of the spots moves out of the field of view of the IR camera, the process is repeated, but this time the position of the new spots is calculated based on the old one (that is, by recalculating the absolute position), again without the need to use the NorthStarTM system.
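The bright-spot locking and hand-off described above can be sketched as follows. The fixed millimetre-per-pixel scale and the absence of camera rotation between frames are simplifying assumptions.

```python
import numpy as np

def two_brightest_spots(ir_frame):
    """(row, col) coordinates of the two brightest pixels, standing in
    for the spot detection on the IR image."""
    order = np.argsort(ir_frame, axis=None)[-2:][::-1]
    return [tuple(int(v) for v in np.unravel_index(i, ir_frame.shape))
            for i in order]

def reanchor(old_abs, old_px, new_px, mm_per_px):
    """Absolute coordinates of a newly acquired spot, computed from the
    previous spot's absolute position and the pixel displacement
    (assumes a fixed scale and no camera rotation)."""
    return (old_abs[0] + (new_px[0] - old_px[0]) * mm_per_px,
            old_abs[1] + (new_px[1] - old_px[1]) * mm_per_px)
```

The chain of `reanchor` calls is what lets the robot keep an absolute reference while a spot leaves the field of view, without re-invoking the NorthStarTM system.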
However, when the confidence parameters show that this positional data is no longer reliable, the IR cameras are used again as part of the NorthStarTM system to recalculate the position of the security robot. During this time the robot can move, and timing and the encoders' relative position data are also used when the position is calculated. If static re-calibration is to be used, the robot should be static while this process is carried out. Static re-calibration is the most accurate absolute positioning technique, but is often not very practical during target vehicle inspection.
Each of the four wheel motors of the inspection robot has its own encoder. Encoder data is collected in a logical block which evaluates each data set for wheel slippage. Only the three closest data sources are used for further calculation. However, if the standard deviation of the data from the three encoders is high, the probability factor (the weight of this input) is reduced during the positional calculation. This is performed in the position processing 130 of the fuzzy logic.
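A sketch of that encoder logic: drop the reading furthest from the median of the four (the assumed slipping wheel), then reduce the encoder input's weight if the remaining three still disagree strongly. The 5% spread threshold and the halving of the weight are hypothetical choices; the text says only that the probability factor is reduced.

```python
import statistics

def fuse_encoders(readings, base_weight=1.0, rel_dev_limit=0.05):
    """Return (fused_travel, weight) from four wheel encoder readings."""
    med = statistics.median(readings)
    # Keep the three readings closest to the median; the outlier is
    # treated as a slipping wheel.
    kept = sorted(readings, key=lambda r: abs(r - med))[:3]
    travel = statistics.mean(kept)
    weight = base_weight
    if travel != 0 and statistics.stdev(kept) / abs(travel) > rel_dev_limit:
        weight *= 0.5  # high spread: reduce this input's probability factor
    return travel, weight
```

The returned weight is then used by the position processing when fusing the encoder estimate with the NorthStarTM and visual estimates.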
The image processing system uses positional image morphing in order to overlay the visual input completely. This is necessary when switching from one image source to another on a single screen, when inspection of a specific target vehicle area is required. When the cameras' viewpoints are far apart (relative to the inspection surface), the images are captured from different angles and do not match completely. In order to produce a full overlay of images, morphing of the IR and X-ray data is performed. This process uses the visual data as a base onto which the two other images are morphed. This image morphing is carried out by the central processing unit 101 in an additional image processing step, and can be switched on and off.
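The positional morphing could be sketched as warping the IR or X-ray layer onto the visual base image's pixel grid under a known 3x3 homography. Estimating that homography, and the nearest-neighbour sampling used here, are outside the described system and are illustrative assumptions.

```python
import numpy as np

def warp_to_base(overlay, H, base_shape):
    """Resample `overlay` onto the base image grid.

    H is a 3x3 homography mapping base (x, y, 1) homogeneous pixel
    coordinates into overlay coordinates; base pixels that map outside
    the overlay are left at zero.
    """
    h, w = base_shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])
    src = H @ pts
    sx = np.rint(src[0] / src[2]).astype(int)
    sy = np.rint(src[1] / src[2]).astype(int)
    out = np.zeros(base_shape, dtype=float)
    ok = (sx >= 0) & (sx < overlay.shape[1]) & (sy >= 0) & (sy < overlay.shape[0])
    out.ravel()[ok] = overlay[sy[ok], sx[ok]]
    return out
```

With the layers resampled onto a common grid, switching image sources on the operator's screen shows the same vehicle area in exact register.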
As a result, the security device can be used to produce reliable, accurate images of the vehicle/object whilst avoiding the loss of important fine detail between the otherwise separate images which may be provided by pulsed X-ray generators. This fine detail may for example include wire(s), hole(s), screw(s), bolt(s), rivet(s), pin(s), fixing(s), catch(es), slot(s), weld(s), weld spatter, cut(s), crack(s), scratch(es), chip(s), dent(s) and/or chemical(s), or any combination thereof.
Although aspects of the invention have been described with reference to the embodiment shown in the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiment shown and that various changes and modifications may be effected without further inventive skill and effort. For example, the device may be used to inspect other suitable objects and is not limited to use within a cavity provided between a road surface and the underside of a target vehicle. Furthermore, the device may be used to inspect other surfaces of a target vehicle.

Claims (16)

1. A method of processing image data for a security inspection device comprising a continuous X-ray image generator, a light image generator, and a position sensor, the method comprising: receiving a first data stream from the X-ray image generator, the first data stream relating to an X-ray image of an object; receiving a second data stream from the light image generator, the second data stream relating to a light image of the object; receiving a third data stream from the position sensor, the third data stream relating to the position of the security inspection device; and producing an output image data stream by combining the first and second data streams in dependence upon the third data stream.

2. A method as claimed in claim 1, wherein the step of producing an output image data stream comprises the steps of: determining at least one image data characteristic from each of the first and second data streams; determining a confidence level for each such image data characteristic; determining an absolute position from the third data stream; determining a confidence level for the absolute position; and combining the first and second data streams in dependence upon the absolute position and the said confidence levels.

3. A method as claimed in claim 1, wherein the step of producing an output image data stream comprises the steps of: determining a first series of discrete images from the first data stream; determining respective position data relating to the first series of discrete images from the third data stream; determining a second series of discrete images from the second data stream; determining respective position data relating to the second series of discrete images from the third data stream; determining respective confidence levels for the position data; and combining the first and second series of discrete images in dependence upon the respective confidence levels.

4. A method as claimed in claim 3, wherein the position sensor comprises a plurality of position sensors, and wherein the third data stream includes data items from each of the position sensors, and confidence levels are determined for each of the position sensors.

5. A method as claimed in any one of the preceding claims, wherein the light image generator includes a visible light sensor operable to produce data relating to a visible light image of the object.

6. A method as claimed in any one of the preceding claims, wherein the light image generator includes an infra-red light sensor operable to produce data relating to an infra-red light image of the object.

7. A method as claimed in any one of the preceding claims, wherein the position sensors include at least one wheel sensor, and an infrared-based positioning system.

8. A method as claimed in claim 7, when dependent upon claim 6, wherein the infra-red light sensor is operable to produce an infra-red light image and to operate as part of the infrared-based positioning system.

9. A security inspection device comprising: a continuous X-ray image generator; a light image generator; a position sensor; and an image processing system comprising: a first interface operable to receive a first data stream from the X-ray image generator, the first data stream relating to an X-ray image of an object; a second interface operable to receive a second data stream from the light image generator, the second data stream relating to a light image of the object; a third interface operable to receive a third data stream from the position sensor, the third data stream relating to the position of the security inspection device; and a central processing unit operable to produce output image data by combining the first and second data streams in dependence upon the third data stream.

10. A device as claimed in claim 9, wherein the central processing unit comprises: an image processing element operable to determine at least one image data characteristic from each of the first and second data streams, and to determine a confidence level for each such image data characteristic; a position processing element operable to determine an absolute position from the third data stream, and to determine a confidence level for the absolute position; and a decision processing element operable to combine the first and second data streams in dependence upon the absolute position and the said confidence levels.

11. A device as claimed in claim 9, wherein the central processing unit is operable to determine a first series of discrete images from the first data stream, and to determine a second series of discrete images from the second data stream, and wherein the central processing unit comprises: a position processing element operable to determine respective position data relating to the first series of discrete images from the third data stream, to determine respective position data relating to the second series of discrete images from the third data stream, and to determine respective confidence levels for the position data; the central processing unit being operable to combine the first and second series of discrete images in dependence upon the respective confidence levels.

12. A device as claimed in claim 11, wherein the position sensor comprises a plurality of position sensors, and wherein the third data stream includes data items from each of the position sensors, and wherein the central processing unit is operable to determine a confidence level for each of the position sensors.

13. A device as claimed in any one of claims 9 to 12, wherein the light image generator includes a visible light sensor operable to produce data relating to a visible light image of the object.

14. A device as claimed in any one of claims 9 to 13, wherein the light image generator includes an infra-red light sensor operable to produce data relating to an infra-red light image of the object.

15. A device as claimed in any one of claims 9 to 14, wherein the position sensors include at least one wheel sensor, and an infrared-based positioning system.

16. A device as claimed in claim 15, when dependent upon claim 14, wherein the infra-red light sensor is operable to produce infra-red light image data, and to produce position data as part of the infrared-based positioning system.
GB1419954.1A 2014-11-10 2014-11-10 Security inspection device image processing Expired - Fee Related GB2532080B (en)


Publications (3)

Publication Number Publication Date
GB201419954D0 GB201419954D0 (en) 2014-12-24
GB2532080A true GB2532080A (en) 2016-05-11
GB2532080B GB2532080B (en) 2017-04-19




Legal Events

Date Code Title Description
PCNP Patent ceased through non-payment of renewal fee

Effective date: 20201110