METHOD AND DEVICE FOR DETERMINING A POSITION OF AN UNDERWATER VEHICLE RELATIVE TO AN UNDERWATER OBJECT.
Field of the Invention
The present invention relates to a positioning system for enabling docking of an underwater vehicle (UV) to e.g. a submarine.
Prior Art
Positioning systems are used in a variety of applications, e.g. on ships (GPS, RADAR, DECCA...) and aircraft (GPS, inertial systems, ILS...). GPS, the Global Positioning System, is a highly versatile, precise positioning system, but it has one important disadvantage for underwater applications, namely that the radio signals from the satellites are severely attenuated by water. This signal attenuation makes it impossible to use GPS for underwater applications, at least without a surface-breaking antenna.
Hydroacoustic transponder systems are suitable for underwater applications. There is, however, a risk of being discovered when using such systems, since such systems are active, i.e. they send out signals that can be detected by hostile warfare forces, and they are thereby of limited use. There are also systems comprising GPS-provided buoys, which transform the GPS information into hydroacoustic information that a submarine can use as information regarding its position.
As implied above, one major problem for submarines is the risk of being discovered by other units, such as
submarines, surface vessels, helicopters, static underwater surveillance systems, and sonar buoys. The best way to remain undiscovered is to minimise the signature, and to have full control over the emitted power, regarding direction and range, during e.g. communication. Some examples of signals that should be avoided, if the location of a submarine is to be kept secret, are radio signals, long-wave sonar, and magnetic fields in general. One signal that could be sent out substantially without risking discovery is light, since light is efficiently absorbed and scattered by water.
(Unmanned) Underwater Vehicles ((U)UV:s) are gaining world-wide acceptance as a way of reducing the risk of personnel losses during war actions. One major problem is, however, to be able to recover, or retrieve, the UV after its mission. Especially for submarines, this constitutes a severe problem if the submarine is required to remain submerged during recovery of the UV. Recovering a UV to a "mothership", in this case a submarine, is usually referred to as "docking". One possible way to recover a UV is to let it swim into an opening, e.g. an open torpedo tube of the submarine. Obviously, this requires a guiding system that is very accurate, has a fast position update rate, and gives information on several degrees of freedom, at least at short range.
US-B-6 362 875 discloses a positioning system for relative positioning between two objects, the system including two light emitting devices, two cameras, a communication link and computing means. The communication link transmits data between the two objects. One severe drawback with the system disclosed in US-B-6 362 875 is that the system relies on the communication link between the two objects. The transmission of data in the communication link will send out signals that can be detected by other units.
Summary of the Invention
The above and other problems and demands are solved by means of the method as claimed in claim 1, and/or the device according to claim 7.
Brief Description of the Drawings
Fig. 1 is a perspective view showing the principles of a positioning system according to the present invention. Fig. 2 is a scheme showing the general steps in the data processing algorithm.
Figs. 3a and 3b are a perspective view and a graph, respectively, showing results on positioning accuracy for a test set-up for the system. Figs. 4a and 4b are photos of a test set-up.
Fig. 5 is a drawing of a submarine provided with a light source pattern for positioning according to the present invention.
Fig. 6 is a scheme showing the principal steps of the processing of image data according to the present invention.
Detailed Description of an Embodiment
Fig. 1 shows a positioning system 100 according to the present invention. The system comprises a camera C having a lens L. The camera C is connected to a computing means 110, which in turn is connected to a display means 115. In front of the camera C, a 3-D light pattern 120 is shown, comprising ten light sources LS1-LS10. Two of the light sources (LS5 and LS6) are placed at the bottom of a pipe 122. The light sources are connected to a driver 125, which is controlled by a controller 130. Power is supplied to the controller 130 and the driver 125 by means of a power box 135. The controller 130 is arranged to control a lighting sequence of the light sources LS1 to LS10. Two small figures 140 and 145 are shown below the perspective view of the light pattern 120 and show exemplary lighting patterns of the light sources LS1-LS10. In figure 140, the light sources LS1-LS6 are lit, and in figure 145, the light sources LS7-LS10 are lit. The lighting of the various
light source groups LS1-LS10 can be performed sequentially, meaning that the camera C can distinguish between the various light sources, depending on which light source group is lit. Fig. 2 is a simplified scheme showing the function of a data algorithm for the positioning system 100 of Fig. 1. In one embodiment, at least two images 205' are captured by the camera by means of an image grabber 205: one first image where some of the light sources are lit, or activated, and one where the other light sources are activated. The images 205' are transmitted to an image preprocessor 210, in which an image subtraction, which will be explained below, is performed. If there is no relative movement between the camera C and the light sources LS1-LS10, the resulting image will only contain information regarding the location (and strength) of the light sources, since that is the only difference between the images. The resulting image is "digitised", which means that the image is divided into zones "with light signal" and "without light signal", the resulting image being shown at 210'. The image 210' is transmitted to computing means 215, which, by means of the image 210', computes the position of the camera C relative to the light pattern 120. The data processing will be more thoroughly described below, with reference to Fig. 6.
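The image subtraction and "digitisation" step described above can be sketched as follows. The function name and the global threshold `level` are illustrative assumptions, not taken from the description:

```python
import numpy as np

def reduce_background(frame_a, frame_b, level=30):
    """Sketch of the pre-processor 210: subtract two successive
    frames so that, absent relative movement, only the light
    sources (lit in one frame but not in the other) remain,
    then "digitise" into zones with / without light signal.
    The threshold `level` is an assumed tuning parameter."""
    diff = frame_a.astype(int) - frame_b.astype(int)
    # Static background is identical in both frames and cancels out
    return (np.abs(diff) > level).astype(np.uint8)
```

A pixel that is equally bright in both frames (static background, ambient light) cancels in the subtraction; only pixels whose brightness differs between the frames survive the threshold.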
Figs. 3a and 3b show test results for an object provided with a camera and a processing system according to the above. The object is moved relative to the light source pattern 120, along a dotted pattern 300. The result is shown in the graph of Fig. 3b.
Figs. 4a and 4b are photographs of a test setup for the light source pattern 120, captured slightly from the side (4a) and face on (4b), respectively. As can be noted, the light sources LS5 and LS6 are not visible from the side
position, since they are placed on the bottom portion of the pipe 122.
Fig. 5 shows the light source pattern 120 of the positioning system according to the present invention arranged on a stem of a submarine 500. Also shown is the pipe 122, that in the case of a submarine could be a torpedo tube.
In the following, the two objects, whose relative position is to be determined, will be referred to as "UV" and "submarine", the UV being the object provided with the camera and the data processing system, the submarine being provided with the light source pattern LS1-LS10. The intention of this referencing is not to be limiting; it is made for simplicity. As appears from the above description of the details of Fig. 1, the camera C captures an image of the light source pattern 120. Depending on the position of the camera C relative to the light source pattern LS1-LS10, the image the camera C captures will change. The longer the distance between the camera C and the light source pattern LS1-LS10, the shorter the distances between the light sources LS7-LS10 will appear compared to the distances between the light sources LS1-LS4.
If the camera C captures the image from an oblique angle, be it vertically or horizontally, the centre point of the light sources LS7-LS10 will be offset relative to the centre point of the light sources LS1 and LS2 and the light sources LS3 and LS4, respectively.
The two light sources LS5 and LS6 will not be visible until the position of the camera C is approximately in line with the longitudinal axis of the pipe 122. Thus, it is possible to know that the position of the UV is almost correct relative to the pipe 122 when the two light sources LS5 and LS6 are visible, if the purpose of the positioning is to dock the UV into the pipe 122.
With the above, it is possible to extract information on the UV position relative to the position of the submarine sidewise, heightwise, and lengthwise.
It is noteworthy that the positioning according to the present invention is independent of the zoom ratio of the lens L on the camera C. That makes it possible to use a zoom ratio that gives an optimum utilisation of the light sensitive chip in the camera C. Hence, the precision increases. It is also possible to determine the angular position of the UV relative to the submarine. This is done by comparing the centre of the camera chip with the centre of the light source pattern of the submarine, and by comparing the vertical positions of the light sources in any pair (e.g. LS1 and LS2) with the horizontal direction on the camera chip.
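As a minimal illustration of how a roll angle could be extracted from a nominally horizontal light source pair, assuming chip coordinates (x, y) with x horizontal (the helper name and convention are assumptions for the sketch):

```python
import math

def roll_from_pair(p_left, p_right):
    """Angle, in degrees, between the line through a nominally
    horizontal light source pair (e.g. LS1 and LS2) as seen on
    the camera chip and the chip's horizontal axis; a non-zero
    value indicates roll of the UV relative to the submarine.
    Points are (x, y) tuples in chip coordinates."""
    dx = p_right[0] - p_left[0]
    dy = p_right[1] - p_left[1]
    return math.degrees(math.atan2(dy, dx))
```

If the pair images level on the chip the angle is zero; any tilt of the pair's image line maps directly to the UV's roll.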
The above is an explanation of how the positioning algorithm works. In reality, a 6-point algorithm is used for the calculation of the position of the UV relative to the submarine. As is well known by persons skilled in the art of computer vision, such an algorithm outputs the position of the camera C relative to a (light source) pattern LS1-LS10 of known (3D) geometry. In addition to the position lengthwise, heightwise, and sidewise, the algorithm outputs information on the attitude angles, i.e. the angular position of the UV provided with the camera C relative to the light pattern on the submarine.
Now, the attention is directed towards Figs. 2 and 6. As mentioned, the light source pattern LS1-LS10 can be pulsed. One preferred way of pulsing the light source pattern is to operate different groups of light sources
(e.g. LS1-LS6) for a certain time, followed by operation of the light sources LS7-LS10. By doing so, it is possible for the data processing system to distinguish between the light sources by grabbing images at certain timings, in case not all light sources are visible. Also, it is possible
to perform a background reduction by means of capturing two images with a (small) time separation. The background reduction will be described in the following.
The frame grabber 205 grabs two images with a small time separation. The time separation is such that the first image captures light from the light sources LS1-LS6, and the second image from the light sources LS7-LS10, since they, as mentioned in the previous paragraph, are operated at different timings. It is of utmost importance that the camera C and the light source LS1-LS10 timings are kept in phase. If not, the camera C might capture images between the operation timings of the various light source LS1-LS10 groups. This can be avoided in many ways, e.g. by having an additional light source and an additional, continuously operated light sensing means, the additional light source giving information on the operation timing of the light sources LS1-LS10. Another option is to synchronise the camera C and the light source groups LS1-LS10 prior to the release of the UV from the submarine 500. As time measuring devices are very precise, the synchronisation will last for several weeks.
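One way the shared timing could work, once camera and driver have been synchronised before release, is a fixed alternating schedule. The protocol below is an assumed illustration; the description leaves the exact scheme open:

```python
def lit_group(t, t0, period):
    """Which light source group is lit at time t, given the
    synchronisation instant t0 and the pulse period (seconds):
    0 -> LS1-LS6, 1 -> LS7-LS10. With a shared t0 and period,
    the camera can time its frame grabs to fall inside, not
    between, these intervals (assumed scheme)."""
    return int(((t - t0) // period) % 2)
```

Both sides evaluate the same schedule from their own clocks; as long as clock drift stays well below `period`, the frames stay phased with the light groups.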
Both images are transferred to the image preprocessor 210, in which the first image is subtracted from the second image (or vice versa). This gives a resulting image, ideally containing only the light sources LS1-LS10 (see 210'). The resulting image is transferred to the computing means 215, in which a thresholding process and the position calculation for the camera on the UV relative to the light source pattern on the submarine are performed.
The processing system is described in Fig. 6. As can be seen in the scheme of Fig. 6, an image is grabbed (by the frame grabber 205), upon which the image is transferred to a memory of the pre-processor 210. If the pre-processor does not have a previous image stored in the memory, the image is stored, and the frame grabber is ordered to grab one more image. After the pre-processor has two successive images in its memory, image processing commences, starting with subtracting one image from the other. After the subtraction, the oldest image is deleted from the memory, to free space for a successive image. The resulting image will contain positive and negative peaks, corresponding to light sources LS1-LS10 captured in the first and second images, respectively. The subtraction of a succeeding image from the preceding one helps in reducing the background, and results in an image ideally containing only information on the light source positions. The resulting image is threshold-processed in order to further distinguish the light sources from noise. The thresholding is performed in two steps, namely on the "positive side" and on the "negative side". All values above a certain level are substituted with the number 2, all values below a certain (negative) level are substituted with the number 1, and all values in between are substituted with 0.
After the thresholding, the image has been transformed into a matrix containing only the values 0, 1, and 2. In the next step, the pixel-group-to-light-source correspondence is established, i.e. the pixel groups representative of each light source are identified. Thereafter, a centre point for each pixel group is determined.
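The two-sided thresholding and the subsequent pixel-group centre determination could be sketched as follows (a pure-Python flood fill is used for clarity; the function names are illustrative):

```python
import numpy as np

def two_sided_threshold(diff, level):
    """Map the subtraction image to a matrix of 0, 1 and 2:
    2 for values above +level (positive peaks, sources lit in
    the first frame), 1 for values below -level (negative
    peaks, sources lit in the second frame), 0 in between."""
    out = np.zeros(diff.shape, dtype=np.uint8)
    out[diff > level] = 2
    out[diff < -level] = 1
    return out

def group_centres(mask, value):
    """Identify 4-connected pixel groups equal to `value` and
    return the centre point (row, col) of each group."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    centres = []
    for y in range(h):
        for x in range(w):
            if mask[y, x] == value and not seen[y, x]:
                # Flood-fill one pixel group, collecting its pixels
                stack, pixels = [(y, x)], []
                seen[y, x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy + 1, cx), (cy - 1, cx),
                                   (cy, cx + 1), (cy, cx - 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny, nx] == value
                                and not seen[ny, nx]):
                            seen[ny, nx] = True
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centres.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centres
```

Each returned centre point stands for one visible light source; counting the centres gives the number of visible light sources used to select the positioning algorithm below.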
After the pixel group correspondence and centre point determination, the number of centre points, hence the number of pixel groups, and hence the number of visible light sources, is determined. If the number of centre points is 6 or more, a standard DLT (Direct Linear Transform) algorithm with SVD (Singular Value Decomposition) is used to calculate the position and attitude angles of the camera on the UV relative to the
light source pattern on the submarine 500. This procedure is well understood by persons skilled in the art.
If 5 centre positions are found, a less precise 5-point algorithm is used for the same purpose, and if 4 points are visible, an even less precise 4-point algorithm is used. The fewer points that are visible, the less precise the results are, and the less information can be obtained from the algorithm.
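A minimal sketch of the DLT-with-SVD step for six or more correspondences, recovering the projection matrix and, from its null space, the camera position relative to the light source pattern. The function name is an assumption, and a production system would add coordinate normalisation and attitude-angle extraction:

```python
import numpy as np

def dlt_camera_position(points_3d, points_2d):
    """Direct Linear Transform: from >= 6 correspondences between
    the known 3D light source pattern and its 2D image points,
    solve A p = 0 by SVD for the 3x4 projection matrix P, then
    recover the camera centre as the null space of P."""
    rows = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    A = np.asarray(rows, dtype=float)
    # The right singular vector of the smallest singular value solves A p = 0
    P = np.linalg.svd(A)[2][-1].reshape(3, 4)
    # The camera centre C satisfies P C = 0 (homogeneous coordinates)
    C = np.linalg.svd(P)[2][-1]
    return P, C[:3] / C[3]
```

Because the light source pattern 120 is three-dimensional (it includes the pipe-mounted sources LS5 and LS6), the 3D points are not coplanar, which is what lets a single DLT solve deliver the full six-degree-of-freedom pose.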
After the position has been calculated, the process starts all over again, with the grabbing of another image. It may be worth noticing that, after the process has been run for the first time, there will always be one image stored in the memory, which means that it is possible to perform one position calculation per grabbed image. With standard video cameras, this gives a position update frequency of 25 Hz.
As is well understood by persons skilled in the art, many measures can be taken to increase the accuracy of the determination of the exact location of the light source centre in the image resulting from the pre-processor. One such method is adaptation of a Gaussian curve to the light source image on the camera chip. Another method is determining the centre of mass of each light source image.
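The centre-of-mass refinement mentioned above amounts to an intensity-weighted average over the patch around a detected source (a sketch; extracting the patch around each source is assumed to be done elsewhere):

```python
import numpy as np

def centre_of_mass(patch):
    """Sub-pixel light source location as the intensity-weighted
    centre of mass (row, col) of an image patch containing one
    light source."""
    patch = patch.astype(float)
    total = patch.sum()
    ys, xs = np.indices(patch.shape)
    return (ys * patch).sum() / total, (xs * patch).sum() / total
```

Unlike the integer-valued pixel-group centre, this estimate uses the grey levels themselves, so the location accuracy is no longer limited to whole pixels.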
As mentioned above, it is of utmost importance that a submarine, or any member of a naval warfare force, does not send out signals that can be detected from long range by other, hostile, warfare forces. For underwater forces, which generally are very weak in terms of weaponry and protection, and whose largest benefit is surprise, this is even more important than for surface ships. Light is a signal type that can be sent out from a submarine without being discovered at long range. The wavelength of the light is not crucial for the functionality of the positioning system, but for optimum performance, some properties of the light and the transmission medium may be considered when the wavelength is chosen:
1. Water (the transmission medium) absorbs light with a wavelength longer than about 700 nm (near infrared).
2. Plankton absorbs light at virtually all wavelengths, but most severely at around 400-450 nm (blue).
3. Organic compounds in the water absorb more light at shorter wavelengths.
4. The scattering of light in salt water increases as the wavelength decreases.
In view of the above, it turns out that a wavelength of about 500-550 nm fulfils the demands for a positioning system according to the present invention, since light of such wavelengths has reasonably low absorption by water, by plankton, and by organic compounds. In some cases, it may be beneficial to use a shorter wavelength, namely when the surface water is filled with plankton and the positioning system is supposed to be operated under the plankton layer; then, scattered light from the positioning system will be absorbed by the plankton present in the surface water, making it more difficult to discover the light from above the surface, e.g. by means of ships, aircraft, or satellites.
One option that might be considered is to have light sources with different wavelengths at different positions on the submarine. For example, it may be beneficial to use shorter wavelengths on the topside of the submarine. This might lead to the submarine being harder to detect from the surface than would be the case if light with optimum transmission characteristics was chosen.
The described system can be used in reasonably clear water at ranges up to about 50 metres. This was field tested in the waters of Malmö harbour.
In very clear water, the system could probably be operated up to several hundreds of meters.
Various devices can be used as light sources LS1-LS10. One choice is LEDs (Light Emitting Diodes). LEDs have a relatively narrow wavelength span, high efficiency, and can be turned on and off within very short time periods. In other embodiments, it could, however, be advantageous to use other light sources, e.g. lasers, light bulbs, quantum wells, etc. As can be seen in Fig. 3, one advantage of the system according to the present invention is the increasing positioning accuracy at short range. At long range, it is not necessary to know the exact relative positioning of the UV and the submarine, respectively, but at short range it certainly is, since the UV should be able to "swim" into a hatch or a torpedo tube of the submarine.
In the above description, the terms "Underwater Vehicle" (UV) and "submarine" have been used in order to make the description clear. The intention of this terminology is not to limit the scope of the invention. For example, the invention could be used to guide a Submarine Rescue Vehicle (SRV) to a sunken submarine, a submarine to an underwater station, a surveillance submarine to underwater parts of an oil rig, etc. The scope of the invention is only limited by the appended claims.