GB2608614A - System for assisting with water rescue

System for assisting with water rescue

Info

Publication number
GB2608614A
Authority
GB
United Kingdom
Prior art keywords
location
portable device
autonomous vehicle
person
self
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2109711.8A
Other versions
GB202109711D0 (en)
Inventor
Geake Vincent
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Application filed by Individual filed Critical Individual
Priority to GB2109711.8A
Publication of GB202109711D0
Publication of GB2608614A
Legal status: Pending

Classifications

    • G01S 17/88 Lidar systems specially adapted for specific applications
    • B63C 9/22 Devices for holding or launching life-buoys, inflatable life-rafts, or other floatable life-saving equipment
    • B63C 9/00 Life-saving in water
    • B63C 9/0005 Life-saving in water by means of alarm devices for persons falling into the water, e.g. by signalling, by controlling the propulsion or manoeuvring means of the boat
    • B63C 9/01 Air-sea rescue devices, i.e. equipment carried by, and capable of being dropped from, an aircraft
    • G01S 15/08 Sonar systems for measuring distance only
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 7/003 Transmission of data between radar, sonar or lidar systems and remote stations
    • B63C 2009/0017 Life-saving in water characterised by making use of satellite radio beacon positioning systems, e.g. the Global Positioning System [GPS]
    • B64U 2101/31 UAVs specially adapted for imaging, photography or videography for surveillance
    • B64U 2101/57 UAVs specially adapted for life-saving or rescue operations, for bringing emergency supplies to persons or animals in danger, e.g. ropes or life vests
    • B64U 2101/67 UAVs specially adapted for transporting goods, the UAVs comprising tethers for lowering the goods
    • B64U 2201/104 UAVs with autonomous flight controls using satellite radio beacon positioning systems, e.g. GPS
    • B64U 2201/20 UAVs with remote flight controls
    • G01S 17/42 Simultaneous measurement of distance and other co-ordinates

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Ocean & Marine Engineering (AREA)
  • Electromagnetism (AREA)
  • Acoustics & Sound (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)

Abstract

A system for assisting with the rescue of a person 204 in a body of water, the system comprising: a portable device 210 for determining a first location of the person in the body of water; and a processing means configured to: determine the first location using the portable device; and based on said determined first location, cause a self-propelled autonomous vehicle 230 to navigate to a second location proximate to the first location. The portable device is preferably a handheld rangefinder comprising a LIDAR system. The autonomous vehicle is preferably a multi-rotor aerial drone or an unmanned water vehicle. The UAV is preferably arranged to deliver a rescue aid 234 to the person in the water, such as a flotation device, location aid, retrieval aid or predator repellent.

Description

System for Assisting With Water Rescue

The present invention relates to a system and a method for assisting with water rescue, for example a system and method for assisting with the rescue of a person in a body of water.
Background
When a person is in distress in the water, it is important to help them stay afloat until appropriate help can be provided. The person may be in distress for a number of reasons, for example falling overboard from a yacht or ship, getting into difficulties when swimming in surf off a beach, or getting swept away by flood water, for example from a monsoon or a tsunami.
In many of these cases, the person will not have any flotation aid with them, for example a life ring or buoyancy aid, and so will have to attempt to support themselves, provided they are capable of doing so (e.g. they know how to swim, are not incapacitated by injury or alcohol, and have sufficient reserves of energy).
In some cases, for example when a person has fallen overboard from a vessel, the crew on the vessel will endeavour to manually release flotation equipment into the water to aid in rescue of the person. However, due to time delay and the motion of the vessel and/or person in need of rescue, the flotation equipment may fall some distance from the person. Furthermore, the person may not be capable of swimming to the flotation equipment or may not be able to see it, for example due to the height of waves.
Summary
According to a first aspect of the invention, there is provided a system for assisting with the rescue of a person in a body of water, the system comprising: a portable device for determining a first location of the person in the body of water; and a processing means configured to: determine the first location using the portable device; and based on said determined first location, cause a self-propelled autonomous vehicle to navigate to a second location proximate to the first location.
The portable device may be a handheld device.
The processing means may be configured to determine the first location using the portable device at least in part by: receiving data indicating the first location from the portable device; and determining the first location based on the received data.
The data may indicate at least one of a range, an azimuth or a declination of the person in the body of water relative to the portable device.
The portable device may be configured to generate the data and transmit the data to the processing means in response to receipt of a user input at the portable device.
The portable device may have a viewfinder, and the portable device may be configured such that: the user may aim the portable device at the person in the body of water by viewing the person in the body of water using the viewfinder; and in response to receipt of the user input while the portable device is aimed at the person in the body of water, the portable device generates the data indicating the first location of the person in the body of water.
The portable device may be configured to generate the data using a laser.
The portable device may be configured to generate the data using LIDAR.
The portable device may be configured to generate the data using infrared.
The autonomous vehicle may be an unmanned aerial vehicle.
The unmanned aerial vehicle may be a fixed-wing drone or a multirotor drone.
The autonomous vehicle may be an unmanned watercraft.
The system may further comprise the self-propelled autonomous vehicle.
The autonomous vehicle may comprise a camera for capturing a video of the person in the body of water.
The autonomous vehicle may further comprise a wireless communication module for transmitting video data representing the video to an external device.
The autonomous vehicle may be configured to navigate to a third location proximate the second location based at least in part on video captured by the camera.
The autonomous vehicle may be configured to identify the person using the video captured by the camera and at least one of machine learning or artificial intelligence, and to navigate to the third location based at least in part on the identification of the person.
The autonomous vehicle may comprise a wireless communication module for receiving control signals from a remote control system, and wherein the autonomous vehicle may be configured to be remotely navigated based on the received control signals.
The autonomous vehicle may comprise a rescue aid for delivery to the person in the body of water.
The autonomous vehicle may be configured to automatically deploy the rescue aid upon arrival at the second location.
The rescue aid may comprise at least one of a flotation aid, a locator aid, a retrieval aid or a predator repellent.
The autonomous vehicle may comprise at least one line coupled to the rescue aid; and upon reaching the second location, the autonomous vehicle is configured to traverse a predetermined path proximate the first location.
The path may substantially encircle the first location.
The autonomous vehicle, as it traverses the path, may be configured either to extend the line from the autonomous vehicle or descend in altitude.
The autonomous vehicle may comprise a plurality of lines coupled to the rescue aid; and upon reaching the second location, the autonomous vehicle may be configured to radially eject the plurality of lines from the autonomous vehicle.
According to a second aspect of the present invention, there is disclosed a method for assisting the rescue of a person in a body of water using any of the systems disclosed herein, the method comprising: determining, by the processing means, the first location of the person in the body of water using the portable device; and causing, by the processing means, and based on said determined first location, said self-propelled autonomous vehicle to navigate to a second location proximate to the first location.
Brief Description of the Drawings
Exemplary embodiments of the present invention are described with reference to the accompanying drawings, in which:
FIG. 1 is a schematic illustration of a system according to embodiments of the present invention;
FIG. 2 is a schematic side-view illustration of a system according to embodiments of the present invention in use;
FIG. 3 is a schematic side-view illustration of the system of FIG. 2, after the self-propelled autonomous vehicle has moved to a location;
FIG. 4 is a schematic side-view illustration of the system of FIG. 3, after the self-propelled autonomous vehicle has deployed a rescue aid;
FIG. 5 is a schematic side-view illustration of a self-propelled autonomous vehicle of a system according to embodiments of the present invention;
FIG. 6 is a schematic side-view illustration of the self-propelled autonomous vehicle of FIG. 5, after the self-propelled autonomous vehicle has moved along a path T;
FIG. 7 is a schematic top-view illustration of the self-propelled autonomous vehicle of FIG. 6;
FIG. 8 is a schematic side-view illustration of a self-propelled autonomous vehicle of a system according to embodiments of the present invention;
FIG. 9 is a schematic illustration of a processing means suitable for use in embodiments of the present invention;
FIG. 10 is a flow diagram of a method according to embodiments of the present invention;
FIG. 11 is a schematic side-view illustration of a system according to embodiments of the present invention in use;
FIG. 12 is a schematic top-view illustration of a system according to embodiments of the present invention;
FIG. 13 is a schematic top-view illustration of a system according to embodiments of the present invention;
FIG. 14A is a flow diagram of a method according to embodiments of the present invention;
FIG. 14B is a continuation of the flow diagram of FIG. 14A; and
FIG. 14C is a continuation of the flow diagram of FIG. 14B.
Reference will now be made in detail to the embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to like elements throughout.
Detailed Description
The present invention relates to a system for assisting with the rescue of a person in a body of water, for example, but not limited to, a sea, lake, reservoir or river.
The person may have entered the body of water involuntarily, for example after falling overboard from a ship or bridge or being swept out to sea by a wave, or they may have entered the body of water voluntarily, for example to go swimming, or by intentionally jumping from a ship or bridge or the like. The person may need to be rescued from the body of water for a number of different reasons. For example, the person may not have sufficient swimming ability to stay afloat or tread water, they may be at risk of being swept away by strong currents, they may be injured or otherwise impaired, they may be at risk of hypothermia, or they may be at risk of attack by a water-based predator such as a shark. It is often imperative to assist the person in staying afloat and to rescue the person as soon as possible in order to minimise the chance of the person coming to harm.
According to the present disclosure, there is provided a system and a method for assisting with the rescue of such a person in a body of water. One or more of the systems disclosed herein may be systems for a marine vessel such as a ship (for example a cruise liner), in that one or more components of the system (for example all components of the system) may be located on said marine vessel. Similarly, one or more of the methods disclosed herein may be methods for use on a marine vessel. The marine vessel may be in the same body of water as the person in need of rescue. In such a scenario, the systems and methods may enable a swift rescue of a person who has fallen overboard from the marine vessel, for example.
Many of the embodiments described herein will be discussed in relation to marine vessels. However, the systems and methods disclosed herein are not limited to being systems and methods for use on a marine vessel and may be suitable for being located or used elsewhere, for example on a coastline, lakeshore, beside a swimming pool or reservoir, on a bridge, or at any other location adjacent a body of water where a person may need rescuing.
By assisting with the rescue of the person in a body of water, the systems and methods described herein may reduce the likelihood of the person coming to harm, for example through drowning, hypothermia, exertion, waterborne illness, predatory attack by an animal, or physical harm from another water vessel or the like. The system may not necessarily bring the person to complete safety, for example by carrying them to land, but it may assist in improving the chances of a successful rescue of the person by a third party or the like, for example by providing to the person a flotation aid, food, drink, medicine, animal repellent, flares, a location beacon or the like.
The systems and methods described herein may reduce the traditional time delay between a person being seen to be in need of rescue in a body of water and said person being rescued, for example by allowing for swift deployment and delivery of a rescue aid to the person in need of rescue. The systems and methods described herein may reduce the complexity of assisting a person in need of rescue, making it easier for a rescue party to locate the person and deploy assistance to the person. For example, the systems and methods described herein may allow for a substantial portion of the rescue or assistance to be automated, with little or no user input beyond the user performing an initial step to locate the person in need of rescue. One or more of the systems and methods described herein may allow the user of the system to locate and provide assistance to the person in need of rescue in situations where the visibility of the person in need of rescue is sub-optimal to the user, for example in poor visibility, or where relative motion between the user and the person in need of rescue due to waves or the like makes it otherwise difficult to locate the person. One or more of the systems and methods described herein may allow the user to locate the person using a single input, without continually having to track the location of the person after providing the input. The possibility of human error may be reduced. Rescue assistance may be provided to the person more quickly and more accurately.
FIG. 1 shows a system 100 according to embodiments of the present disclosure. The system 100 comprises a portable device 110, a processing means 120, and a self-propelled autonomous vehicle 130. The portable device 110 and the self-propelled autonomous vehicle 130 are separate devices in that they are physically separated or distinct from each other. In other words, the portable device 110 is not physically connected to the self-propelled autonomous vehicle 130. The portable device 110 is an electronic device for making one or more measurements described herein, such as range, declination, azimuth, altitude or geospatial location, as described throughout this application. The portable device 110 may be a portable rangefinder device, or another type of portable location-measuring device, for measuring one or more angles or distances to a target. The portable device 110 may comprise suitable electronics for performing the functions disclosed herein, for example a processing means 900 as discussed in relation to FIG. 9. The processing means 900 of the portable device 110 may be configured to perform any of the measurements disclosed herein and communicate with the processing means 120. The processing means 900 of the portable device 110 may be configured to control one or more sensors or other components of the portable device 110 described herein to perform the measurements.
FIG. 1 shows the processing means 120 as physically distinct from the portable device 110 and the self-propelled autonomous vehicle 130, in that it does not form part of either the portable device 110 or the self-propelled autonomous vehicle 130. As an example, the processing means 120 may instead be comprised in an external computing device, for example located in the vicinity of the portable device 110 and the self-propelled autonomous vehicle 130. However, in other embodiments the processing means 120 may be comprised in at least one of the portable device 110 or the self-propelled autonomous vehicle 130, or may be split among multiple devices, for example between at least two of the portable device 110, the self-propelled autonomous vehicle 130, or an external computing device.
At least part of the system 100 may be located on a marine vessel such as a ship. In such an example, at least one of the portable device 110, the processing means 120 and the self-propelled autonomous vehicle 130 are located on the marine vessel. For example, the portable device 110, the processing means 120 and the self-propelled autonomous vehicle 130 may all be located on the marine vessel. In other examples the portable device 110 may be located on the marine vessel while the self-propelled autonomous vehicle 130 is not, but is instead proximate the marine vessel. For example, it may be flying near the marine vessel, or floating near the marine vessel in the body of water.
The portable device 110 may be a handheld device to be held in the hand of a user, for example with the portable device 110 having the form of a gun with a handgrip 111 as shown in FIG. 1. The portable device 110 may be located such that it is easily accessible to a user, for example in a prominent position aboard a marine vessel. The portable device 110 may be located to be easily accessible to a user so that the user may rapidly obtain and use the portable device 110 as soon as they are aware of a person in need of water rescue. As an example, the portable device 110 may comprise a modified version of the ATN BINOX 4K 4-16X Smart Ultra HD Day/Night Vision Binoculars with Laser Rangefinder produced by ATN Corporation of San Francisco, USA.
The portable device 110 is configured to measure the range of a target object in an environment (i.e. a distance between the portable device 110 and the target object). The target object may be the person in a body of water in need of rescue. The portable device 110 may be configured to measure the range using any suitable range-finding means or method, for example using a laser, LIDAR, sonar or infrared. For example, the portable device 110 may be configured to emit at least one of a laser, sound or infrared light towards the target object and detect the laser, sound, or infrared light reflected from the target object. The portable device 110 may use the reflected laser, sound, or infrared light to determine the range of the target object. The portable device 110 may generate data indicating the measured range, and this data may be transmitted to another device in the system 100 such as the processing means 120, as described elsewhere, for further processing.
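By way of illustration only (this sketch is not part of the claimed system), a laser rangefinder of the kind described typically derives range from the round-trip time of a reflected pulse; the one-way range is half the distance the pulse travels. The helper below shows that calculation; the function name and structure are assumptions for the example.

    # Minimal sketch of laser time-of-flight ranging, as commonly used by
    # handheld rangefinders. Illustrative only; not taken from the patent.

    SPEED_OF_LIGHT_M_S = 299_792_458.0  # metres per second

    def range_from_round_trip(round_trip_seconds: float) -> float:
        """Return the one-way distance to the target in metres.

        The emitted pulse travels to the target and back, so the one-way
        range is half the round-trip distance.
        """
        return SPEED_OF_LIGHT_M_S * round_trip_seconds / 2.0

    # Example: a pulse returning after ~1.33 microseconds indicates a target
    # roughly 200 m away.
    print(range_from_round_trip(1.334e-6))  # ~199.96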
The portable device 110 may have a viewfinder 112. A user may be able to use the viewfinder 112 to assist with aiming the portable device 110 at the target object. The viewfinder 112 may use one or more lenses to magnify an image of the target object. The viewfinder 112 may have a visible marking for indicating the target object in the environment that the portable device 110 is currently aimed at. When the portable device 110 is activated to take a range measurement, it will take a measurement of the target object as indicated by the marking in the viewfinder 112.
The portable device 110 comprises a means for receiving a user input. The portable device 110 is configured to take one or more measurements, such as a range measurement as described previously, in response to receiving the user input. The means may comprise a switch, for example a trigger switch 114 as shown in FIG. 1. The switch may be located on the handgrip 111. Other types of switch or other means for receiving a user input may be used. For example, the means may comprise a resistive sensor, a capacitive sensor, an audio sensor, a magnetic sensor, or the like. A user may aim the portable device 110 at a person in need of rescue then provide a user input via the means, for example by pressing the switch. In response to receiving the user input, the portable device 110 will measure the range of the person in need of rescue.
In some scenarios, a user of the portable device 110 may find it difficult to take a range measurement of the person in need of rescue using the portable device 110. As an example, there may be a certain amount of relative movement between the user of the portable device 110 and the person in need of rescue due to rough water, making it difficult to aim the portable device 110 at the target for long enough to take the measurement. In other examples, there may be poor visibility between the user and the target. In other examples, the user may lack the dexterity to take an accurate measurement, or may not be able to provide the user input via the trigger switch 114 or the like quickly enough. To assist with such scenarios, the portable device 110 may be configured to automatically generate a range measurement in response to the portable device 110 identifying a suitable target. For example, the portable device 110 may be configured to receive a user input via a switch or other user input device of the portable device 110. In response to receiving the user input, the portable device 110 may take a range measurement in response to the portable device 110 identifying a suitable target.
In an example where the portable device 110 is configured to measure a range using a laser, identifying a suitable target may comprise the portable device 110 receiving reflected laser light from a target. In response to receiving the reflected laser light, the portable device 110 may make the range measurement to the target, in some examples in addition to other measurements such as the azimuth and/or declination of the target relative to the portable device 110. In this example, the portable device 110 therefore makes a range measurement in response to the first time it receives reflected laser light subsequent to receiving the user input. Other measurements such as azimuth, declination or height above the ground or body of water may also be made.
If the portable device 110 makes an automatic range measurement, it may provide the user with an opportunity to accept the range measurement before the range measurement is used. As an example, subsequent to making the automatic range measurement, the portable device 110 may output an indication of the measured range to the user, or other data indicative of the target used for the range measurement, such as a location of the target or an image of the target captured using a camera of the portable device 110. The data indicative of the target may be output via a display of the portable electronic device 110 or a display connected to the portable electronic device. For example, the portable electronic device 110 may output to the display an image of the target that the portable device 110 has used to acquire the range. The user may review the data, such as the image, and may provide a user input to the portable device 110 using any suitable user input device to indicate acceptance of the acquired range value. In response, the acquired range value may be further processed as described elsewhere.
The portable device 110 may be configured to make one or more additional measurements in relation to the target object, such as the person in need of rescue, wherein the one or more additional measurements can be used together with the measured range to determine a location of the person. For example, the portable device 110 may be configured to determine one or more of an azimuth or a declination of the person relative to the portable device 110. This may be achieved through any suitable azimuth sensor 117 and/or declination/angle sensor 113 comprised in the portable device 110 or arranged elsewhere in the system 100, or other means for determining the azimuth and/or declination of the person. The portable device 110 may generate data indicating the azimuth and/or declination of the person in need of rescue, based on the measurement of azimuth and/or declination, and this data may be transmitted to another device in the system 100 such as the processing means 120, as described elsewhere, for further processing. The one or more additional measurements such as the azimuth and/or declination measurements may be performed in response to the same user input as described previously. For example, in response to the portable device 110 receiving the user input, the portable device 110 may measure one or more of a range, an azimuth or a declination of the target object (i.e. person) relative to the portable device 110, generate data indicating the measured one or more of the range, azimuth or declination, and transmit the generated data to the other device in the system 100 such as the processing means 120, as described elsewhere. In some or all embodiments, in response to the portable device 110 receiving the user input, the portable device 110 measures the range, the azimuth and the declination of the target object (i.e. person) relative to the portable device 110, generates data indicating the range, azimuth and declination, and transmits the generated data to the other device in the system 100 such as the processing means 120, as described elsewhere.
The data generated by the portable device 110, such as the range data and/or azimuth data and/or declination data, may be considered to be data indicating the location of the target object (i.e. the person in need of rescue), in that it can be used to determine the location. The location may be a relative location (i.e. a location relative to a body such as the portable device 110), or an absolute location (i.e. a location based on a fixed point on earth). In some examples, the data generated by the portable device 110 is data indicating a relative location and this data is used to determine an absolute location using geolocation, for example geolocation of the portable device 110 or another part of the system 100. For example, the system 100 may comprise a Global Positioning System (GPS) receiver for geolocating the system 100. The GPS receiver may be comprised in any part of the system 100 such as the portable device 110, processing means 120, or self-propelled autonomous vehicle 130, or a different part of the system 100. The GPS receiver may be used to determine a geospatial location of the system 100, and this geospatial location may be processed with the data indicating the relative location of the target object to determine an absolute location of the target object. While a GPS receiver is used as an example, a different type of satellite-based navigation system may be used instead, such as GLONASS or Galileo. Any known suitable method for determining an absolute location based on data indicating a relative location may be used.
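To make the geometry concrete, the sketch below shows one plausible way of combining the slant range, azimuth and declination measured by the portable device 110 with the device's own GPS fix to estimate the target's latitude and longitude. The flat-earth approximation and all names are assumptions for illustration; the patent does not prescribe a particular conversion method.

    import math

    EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

    def target_lat_lon(device_lat_deg: float, device_lon_deg: float,
                       slant_range_m: float, azimuth_deg: float,
                       declination_deg: float) -> tuple[float, float]:
        """Estimate the target's absolute position from relative measurements.

        azimuth_deg: bearing to the target, clockwise from north.
        declination_deg: angle below horizontal from device to target.
        Uses a local flat-earth approximation, adequate over rangefinder
        distances of a few kilometres.
        """
        # Project the slant range onto the horizontal plane.
        horizontal_m = slant_range_m * math.cos(math.radians(declination_deg))

        # Decompose into north and east offsets.
        north_m = horizontal_m * math.cos(math.radians(azimuth_deg))
        east_m = horizontal_m * math.sin(math.radians(azimuth_deg))

        # Convert metre offsets to degree offsets.
        dlat = math.degrees(north_m / EARTH_RADIUS_M)
        dlon = math.degrees(east_m / (EARTH_RADIUS_M *
                                      math.cos(math.radians(device_lat_deg))))
        return device_lat_deg + dlat, device_lon_deg + dlon

    # Example: a person 150 m away, bearing 090 (due east), 5 degrees below
    # the horizon, seen from 50.0 N, 4.0 W.
    print(target_lat_lon(50.0, -4.0, 150.0, 90.0, 5.0))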
The portable device 110 comprises a communication module 115 for transmitting data (such as the data indicating the location of the target object) to another device in the system, such as the processing means 120. The communication module 115 may comprise at least one of a wired communication module for sending data over a wired communication link or a wireless communication module for sending data over a wireless communication link. Where the processing means 120 is comprised in the portable device 110, the communication module 115 may be a wired communication module. Where the processing means 120 is not comprised in the portable device 110, the communication module 115 may be a wireless communication module. The wireless communication link may use Bluetooth, WiFi, VHF, infrared, or any other suitable radio link or other wireless communication link.
The processing means 120 may be any suitable processing means for carrying out various method steps disclosed herein, and an example of the processing means 120 is discussed later in relation to FIG. 9. The processing means 120 has a communication module 125 for receiving data from the portable device 110 and sending data/instructions to the self-propelled autonomous vehicle 130. The communication module 125 may comprise at least one of a wired communication module for sending data over a wired communication link or a wireless communication module for sending data over a wireless communication link. Where the processing means 120 is comprised in the portable device 110, the communication module 125 may be a wired communication module. Where the processing means 120 is not comprised in the portable device 110, the communication module 125 may be a wireless communication module. The wireless communication link may use Bluetooth, WiFi, VHF, infrared, or any other suitable radio link or other wireless communication link.
The self-propelled autonomous vehicle 130 is used to assist in the rescue of the person in the body of water. In some examples, the self-propelled autonomous vehicle 130 may be an unmanned aerial vehicle such as a fixed-wing drone, a multirotor drone, or another type of drone or unmanned aerial vehicle. In other examples, the self-propelled autonomous vehicle 130 may be an unmanned watercraft, such as a small boat, life raft, hovercraft, or a different type of unmanned watercraft. The use of a self-propelled autonomous vehicle 130 can allow for the deployment of rapid assistance to the person in need of rescue. The self-propelled autonomous vehicle 130 may be able to reach the person faster than another marine vessel could, such as the marine vessel the system 100 may be located on.
The self-propelled autonomous vehicle 130 has a main body 132 and a propulsion system 133 coupled to the main body 132, the propulsion system 133 for causing the autonomous vehicle 130 to move. In the example of FIG. 1, in which the autonomous vehicle 130 is an unmanned aerial vehicle, the propulsion system 133 may comprise a first rotor 138a and a second rotor 138b, each with blades; however, another number of rotors may be used for the propulsion system 133, such as four rotors, or a different type of propulsion system may be used, bladed or non-bladed. In examples where the autonomous vehicle 130 is an unmanned watercraft, the propulsion system 133 may comprise an inboard or outboard motor, a paddle wheel, water jet, fan, or another means of propulsion. The propulsion system 133 may be controlled to alter the speed and direction in which the autonomous vehicle 130 moves. The propulsion system 133 may comprise any suitable means for controlling the direction of motion of the autonomous vehicle 130, for example a rudder or flaps.
The self-propelled autonomous vehicle 130 may comprise a rescue aid 134, otherwise known as a rescue payload, coupled to the main body 132 of the self-propelled autonomous vehicle 130, for example via a coupling unit 136. The coupling unit 136 may be configured such that the rescue aid 134 is detachable from the main body 132 of the self-propelled autonomous vehicle 130. The coupling unit 136 may be configured to detach the rescue aid 134 from the main body 132 in order to deploy the rescue aid to a person in a body of water in need of rescue. The coupling unit 136 may be configured to detach the rescue aid 134 from the main body 132 under the control of the self-propelled autonomous vehicle 130, for example in response to receipt of a signal from the self-propelled autonomous vehicle 130. The coupling unit 136 may comprise at least one of an electromechanical coupling mechanism or a magnetic coupling mechanism, although the coupling unit 136 may comprise one or more different coupling mechanisms known in the art.
The rescue aid 134 is a payload to assist in the rescue of the person in the body of water, for example by assisting the person to float, providing means for locating the person, providing equipment to help the person be moved from their location in the body of water, or deterring predators. As examples, the rescue aid 134 may comprise at least one of a flotation aid, a locator aid, a retrieval aid or a predator repellent, or other types of rescue aid 134. A flotation aid is for assisting the person in staying afloat and may comprise a life jacket, life belt, flotation suit, buoyancy aid, life raft, or the like. A locator aid is for assisting in the location of the person by a third party and may comprise a light, a whistle, a siren, a flare, an emergency radio beacon or another type of beacon, or another type of locator aid. A retrieval aid is for assisting in the movement of the person from their location in the body of water by a third party and may comprise a harness or the like. A predator repellent is for repelling or incapacitating predators in the vicinity of the person, for example a shark repellent. The predator repellent may repel or incapacitate the predators by chemical, electrical, optical or audio means, or by any other suitable means.
The self-propelled autonomous vehicle 130 has a communication module 135 for receiving data/instructions from the processing means 120 and optionally sending data/instructions to the processing means 120 or a different external device. The communication module 135 may comprise at least one of a wired communication module for sending data over a wired communication link or a wireless communication module for sending data over a wireless communication link. Where the processing means 120 is comprised in the self-propelled autonomous vehicle 130, the communication module 135 may be a wired communication module. Where the processing means 120 is not comprised in the self-propelled autonomous vehicle 130, the communication module 135 may be a wireless communication module. The wireless communication link may use Bluetooth, WiFi, VHF, infrared, or any other suitable radio link or other wireless communication link. The wireless communication link may comprise a marine VHF communications link such as ASM or ARDM.
The self-propelled autonomous vehicle 130 is configured to receive one or more signals from the processing arrangement 120 via the communication module 135, which one or more signals cause the self-propelled autonomous vehicle 130 to autonomously navigate to a location. By autonomously navigate, it is meant that the self-propelled autonomous vehicle 130 moves to the location in response to the one or more signals from the processing arrangement 120, without further input from a user. For example, the self-propelled autonomous vehicle 130 moves to the location without any further guidance or remote steering by a user. The one or more signals may indicate the location of the person in need of rescue, as determined by the processing means 120.
For example, the one or more signals may indicate a geospatial coordinate of the location as determined by the processing means 120. The self-propelled autonomous vehicle 130 may comprise any suitable processing and/or navigation means for autonomously navigating to the location based on the one or more signals. As examples, the self-propelled autonomous vehicle 130 may navigate to the location using geolocation and/or dead reckoning. The self-propelled autonomous vehicle 130 may navigate to the location based on one or more geospatial coordinates provided to the self-propelled autonomous vehicle 130 by the processing means 120.
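As a hedged illustration of the dead reckoning mentioned above (one possible technique, not the patent's specified navigation method), a local position estimate can be advanced from heading and speed at each time step:

    import math

    def dead_reckon(x_m: float, y_m: float, heading_deg: float,
                    speed_m_s: float, dt_s: float) -> tuple[float, float]:
        """Advance a local (east, north) position estimate by one time step.

        heading_deg is measured clockwise from north, matching the azimuth
        convention used elsewhere in this description.
        """
        x_m += speed_m_s * dt_s * math.sin(math.radians(heading_deg))
        y_m += speed_m_s * dt_s * math.cos(math.radians(heading_deg))
        return x_m, y_m

    # Example: 10 s at 5 m/s on heading 090 (due east) moves the estimate
    # 50 m east and leaves the north coordinate essentially unchanged.
    print(dead_reckon(0.0, 0.0, 90.0, 5.0, 10.0))  # (50.0, ~0.0)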
The self-propelled autonomous vehicle 130 may begin autonomously navigating to the location in response to receiving the one or more signals from the processing means 120, that is without further user input.
In some examples the processing means 120 or portable device 110 sends a first signal to the self-propelled autonomous vehicle 130 in response to the portable device 110 receiving the user input. This first signal may cause the self-propelled autonomous vehicle 130 to be activated, such that it is in a prepared state ready for navigating to the location. For example, the first signal may cause the self-propelled autonomous vehicle 130 to be released from a docking station, or may cause the propulsion system 133 of the self-propelled autonomous vehicle 130 to be activated. Where the self-propelled autonomous vehicle 130 is an unmanned aerial vehicle, receipt of the first signal may cause the unmanned aerial vehicle to begin flying. The self-propelled autonomous vehicle 130 may then receive a second one or more signals from the processing means 120, the second one or more signals causing the self-propelled autonomous vehicle 130 to autonomously navigate to the location as discussed previously.
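The two-stage signalling described above could be represented as two message types: an activation signal followed by a navigation command carrying the determined coordinate. The encoding below is purely a hypothetical sketch; the patent does not define a message format, and the field names are assumptions.

    from dataclasses import dataclass

    # Hypothetical message types for the two-stage launch sequence: an
    # activation signal to ready the vehicle, then a navigation signal
    # carrying the determined location.

    @dataclass
    class ActivateSignal:
        """Readies the vehicle: release from dock, spin up propulsion."""
        vehicle_id: str

    @dataclass
    class NavigateSignal:
        """Commands autonomous navigation to a geospatial coordinate."""
        vehicle_id: str
        target_lat_deg: float
        target_lon_deg: float

    def dispatch(signal) -> None:
        # Placeholder for the wireless link (e.g. WiFi or marine VHF).
        print(f"sending {signal}")

    # On user input: activate immediately, then send the coordinate once
    # the processing means has determined the first location.
    dispatch(ActivateSignal(vehicle_id="uav-1"))
    dispatch(NavigateSignal(vehicle_id="uav-1",
                            target_lat_deg=50.001, target_lon_deg=-3.998))

Sending the activation signal first lets the vehicle launch while the location is still being computed, which is consistent with the time-saving rationale described above.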
The self-propelled autonomous vehicle 130 may comprise a camera 139. The camera 139 may be for capturing a video of the person in the body of water. Where the self-propelled autonomous vehicle 130 is an unmanned aerial vehicle, the camera 139 may be configured to point substantially downwards, so as to capture video from a downwards perspective above a person in the body of water when the unmanned aerial vehicle is in flight. The self-propelled autonomous vehicle 130 may be configured to transmit video data representing the video to an external device, for example using the communication module 135 such as a wireless communication module. The self-propelled autonomous vehicle 130 may therefore be able to stream live video from the camera 139 to the external device. As an example, the external device may be a device having a screen that is able to display the received video stream. The external device may be in the possession of a person assisting in the rescue of the person in the body of water.
The self-propelled autonomous vehicle 130 may use the camera 139 to navigate to the person in need of rescue. For example, the self-propelled autonomous vehicle 130 may navigate to the location determined by the processing means 120 as discussed previously (or near to the location), before using the camera 139 to identify the person in need of rescue. The self-propelled autonomous vehicle 130 may comprise suitable processing means for processing video data from the camera 139 to identify the person in need of rescue, for example using machine learning and/or artificial intelligence. After identifying the person, the self-propelled autonomous vehicle 130 may navigate to a location based on the identification of the person in the video. For example, the self-propelled autonomous vehicle 130 may navigate to a location nearer the person, or may navigate to a location that is a safe or otherwise suitable distance from the person.
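A final-approach loop of this kind might look like the following sketch, where detect_person stands in for whatever machine-learning detector the vehicle carries; the detector, the camera and vehicle interfaces, and the control gains are all illustrative assumptions rather than details from the patent.

    # Hypothetical final-approach loop using the onboard camera 139. The
    # detector, camera and vehicle interfaces are stand-ins for the
    # vehicle's actual vision and flight stack, which the patent leaves open.

    def detect_person(frame):
        """Return the (x, y) pixel centre of the detected person, or None.

        Stand-in for the machine-learning / AI detector mentioned above.
        """
        raise NotImplementedError  # supplied by the real vision stack

    def refine_position(camera, vehicle, max_steps: int = 100) -> bool:
        """Nudge the vehicle until the person is centred in the camera view."""
        for _ in range(max_steps):
            frame = camera.read()  # assumed to expose .width and .height
            centre = detect_person(frame)
            if centre is None:
                continue  # person not visible in this frame; try the next
            dx = centre[0] - frame.width / 2
            dy = centre[1] - frame.height / 2
            if abs(dx) < 10 and abs(dy) < 10:  # pixel tolerance (assumed)
                return True  # person approximately centred below the vehicle
            # Proportional correction; the 0.01 m-per-pixel gain is assumed.
            vehicle.move_relative(east_m=0.01 * dx, north_m=-0.01 * dy)
        return False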
Where the communication module 135 comprises a wireless communication module, the wireless communication module may be configured to receive wireless control signals from a remote control system. The self-propelled autonomous vehicle 130 may be configured to be remotely navigated based on the received control signals. In other words, a user may be able to remotely control the self-propelled autonomous vehicle 130 in order to direct the self-propelled autonomous vehicle 130 towards the person in need of rescue. The user may be viewing the live video sent from the self-propelled autonomous vehicle 130 to the external device and may be remotely controlling the self-propelled autonomous vehicle 130 based on the video. This may be useful where the self-propelled autonomous vehicle 130 has autonomously navigated to (or near to) the second location relative to the person in need of rescue, but where further adjustment of the location of the self-propelled autonomous vehicle 130 is needed.
FIG. 2 shows an example of how a system 200 according to one or more embodiments of the present disclosure may be used. The system 200 may be similar or identical to the system described previously in relation to FIG. 1.
The system 200 is located on a marine vessel 202, in this example a boat; however, the system 200 may alternatively be located elsewhere, as discussed previously. A person 204 is in a body of water, in this example the sea, and is in need of rescue. The person 204 may have fallen from the boat 202 into the body of water; however, in other examples they may be in the body of water for a different reason, for example falling from a different marine vessel, or swimming out to sea.
A user 206 of the system 200 is located on the marine vessel 202, along with the portable device 210, processing means 220 and self-propelled autonomous vehicle 230 of the system 200, which may be the same as the portable device 110, processing means 120 and self-propelled autonomous vehicle 130 discussed previously.
FIG. 2 shows the self-propelled autonomous vehicle 230 as an unmanned aerial vehicle such as a drone; however, another self-propelled autonomous vehicle 230 may be used instead, such as an unmanned watercraft. In some examples, the self-propelled autonomous vehicle 230 may not be located on the marine vessel 202 but may instead be in flight proximate to the marine vessel 202 (for example where the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle) or floating in the body of water proximate to the marine vessel 202 (for example where the self-propelled autonomous vehicle 230 is an unmanned watercraft). The self-propelled autonomous vehicle 230 may have a rescue aid 234 similar to the rescue aid 134 discussed previously, which may be coupled to a main body of the self-propelled autonomous vehicle 230 as described previously.
The user 206 may become aware of the person 204 in the body of water. This may be because they observe the person 204 in the body of water, or they are otherwise made aware that the person 204 is, or may be, in the body of water. The user 206 picks up the portable device 210 and uses it to determine a first location L1 of the person in the body of water. If the portable device 210 is a handheld device then the user 206 may hold the portable device 210 by hand. The user 206 aims the portable device 210 at the person 204 in the body of water, for example using the viewfinder as discussed in relation to FIG. 1. Once the user 206 has correctly aimed the portable device 210 at the person 204, they provide a user input to the portable device 210, for example by pressing the trigger switch 114 of the portable device 210, or actuating a different type of switch or user input means as discussed previously.
In response to receiving a signal indicating the user input, the portable device 210 generates data indicating the location of the target at which the portable device 210 is aimed, as discussed previously. In this case, the target is the person 204 and so the location is the first location L1 of the person 204 in the body of water.
The data indicating the location may comprise one or more of data indicating a range R (i.e. distance), azimuth or declination of the person 204 in the body of water relative to the portable device 210. As discussed previously, the location L1 may be a relative location or an absolute location (for example a relative location converted to an absolute location based on additional geolocation data).
The portable device 210 transmits the data indicating the first location L1 to the processing means 220 either via a wired communication link or a wireless communication link. For example, where the processing means 220 is comprised in the portable device 210, the link may be a wired communication link. Where the processing means 220 is not comprised in the portable device 210, the link may be a wired or wireless communication link. The wired or wireless communication links may use any suitable communication protocols.
The portable device 210 transmits the data indicating the first location L1 to the processing means 220 automatically. That is, the portable device 210 transmits the data indicating the first location L1 to the processing means 220 in response to the user input, but without further input from the user.
In response to receiving the data indicating the first location L1 from the portable device 210, the processing means 220 determines the first location L1 based at least on the data indicating the first location L1.
The processing means 220, based on the determined first location L1, causes the self-propelled autonomous vehicle 230 to navigate to a second location L2 proximate to the first location L1, as illustrated in FIG. 3. Causing the self-propelled autonomous vehicle 230 to navigate to a second location L2 may comprise the processing means 220 sending one or more signals to the self-propelled autonomous vehicle 230 as discussed previously in relation to FIG. 1. The second location L2 may have been determined by the processing means 220 based on the first location L1.
In some examples the second location L2 is identical to the first location L1 in that it has the same geospatial coordinates. However, in other examples the second location L2 is proximate, but not identical, to the first location L1, in that it has geospatial coordinates that differ from the first location L1, for example in one or more of x-, y-, or z-axis Cartesian coordinates, or in latitude, longitude or elevation. By having a second location L2 that is proximate to, but differs from, the first location L1, the self-propelled autonomous vehicle 230 may avoid a collision with the person 204 in the body of water. The second location L2 may differ from the first location L1 by a predetermined distance.
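One plausible way to derive such a second location L2 (an assumption for illustration, not a method prescribed by the patent) is to offset the first location L1 by the predetermined standoff distance along a chosen bearing, using the same flat-earth conversion as the earlier location sketch:

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def standoff_location(lat_deg: float, lon_deg: float,
                          standoff_m: float, bearing_deg: float):
        """Offset a location by standoff_m along bearing_deg (clockwise from N).

        Illustrative only: the patent leaves the offset direction open, so
        the bearing chosen here is an assumption.
        """
        dlat = math.degrees(standoff_m * math.cos(math.radians(bearing_deg))
                            / EARTH_RADIUS_M)
        dlon = math.degrees(standoff_m * math.sin(math.radians(bearing_deg))
                            / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
        return lat_deg + dlat, lon_deg + dlon

    # Example: hold position 5 m due south of the person's location L1.
    L2 = standoff_location(50.001, -3.998, standoff_m=5.0, bearing_deg=180.0)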
FIG. 3 shows the scenario of FIG. 2, after the processing means 220 has caused the self-propelled autonomous vehicle 230 to navigate towards the second location L2. The dotted line labelled 'P' shows an example path P that the self-propelled autonomous vehicle 230 has taken to the second location L2, proximate the first location L1 of the person 204 in the body of water. In FIG. 3, the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle and so the path P is a path the self-propelled autonomous vehicle 230 has moved through the air. However, in examples where the self-propelled autonomous vehicle 230 is an unmanned watercraft, the path P may be across the surface of the body of water.
FIG. 3 shows the user 206 still using the portable device 210; however, this is not necessary. The self-propelled autonomous vehicle 230 has already been provided with the one or more signals from the processing means 220 to enable it to navigate to the second location L2. No further input may be required from the user 206, the portable device 210, or the processing means 220 for the self-propelled autonomous vehicle 230 to reach the second location L2.
While it is described that the self-propelled autonomous vehicle 230 navigates to the second location L2, it is envisaged that in some examples the self-propelled autonomous vehicle 230 may not actually reach the second location L2 due to factors such as adverse winds or water conditions, in which case the self-propelled autonomous vehicle 230 may simply navigate to a third position proximate the second position L2, for example a predetermined distance from the second location L2.
As described previously in relation to FIG. 1, in some examples the self-propelled autonomous vehicle 230 may navigate to the second location L2, but may then navigate to a further location using a camera 139 of the self-propelled autonomous vehicle 230 or the like. In some examples, as discussed previously, the self-propelled autonomous vehicle 230 may be navigated by remote control to a new location. For example, the self-propelled autonomous vehicle 230 may autonomously navigate to the second location L2 (or proximate the second location), after which the self-propelled autonomous vehicle 230 may be remotely controlled by a user to navigate to another location (or to the second location L2).

Once the self-propelled autonomous vehicle 230 has reached the second location L2 (or a third location proximate the second location L2) it may deploy the rescue aid 234 for the person 204. Deploying the rescue aid 234 may comprise releasing the rescue aid 234 from the coupling unit 136. The self-propelled autonomous vehicle 230 may automatically (i.e. without further human intervention) deploy the rescue aid 234 in response to determining it has reached the second location L2 (or third location). The self-propelled autonomous vehicle 230 may determine it has reached the second location L2 based on performing a comparison between the second location L2 and a current location of the self-propelled autonomous vehicle 230 as determined using a GPS device of the self-propelled autonomous vehicle 230, or the like.
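Such an arrival comparison could be implemented as a great-circle distance check against a small threshold, as in the hypothetical sketch below; the tolerance value is an assumption.

    import math

    EARTH_RADIUS_M = 6_371_000.0

    def haversine_m(lat1, lon1, lat2, lon2) -> float:
        """Great-circle distance in metres between two lat/lon points (degrees)."""
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    def has_arrived(current, target, tolerance_m: float = 2.0) -> bool:
        """Compare the vehicle's GPS fix with L2; the tolerance is assumed."""
        return haversine_m(current[0], current[1],
                           target[0], target[1]) <= tolerance_m

    # Example: deploy the rescue aid once within ~2 m of the second location.
    if has_arrived((50.00101, -3.99799), (50.00100, -3.99800)):
        print("deploy rescue aid")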
FIG. 4 shows the scenario of FIG. 3, after the self-propelled autonomous vehicle 230 has deployed the rescue aid 234. In this example the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle and so the rescue aid 234 has fallen from the self-propelled autonomous vehicle 230 after being released, and landed on the body of water. The person 204 may then access the rescue aid 234. If the rescue aid 234 is not within reaching distance of the person 204, the person 204 may move themselves nearer the rescue aid 234 by swimming or otherwise.
The person 204 may then use the rescue aid 234. For example if the rescue aid 234 comprises a flotation aid, the person 204 may use the flotation aid to keep afloat. If the rescue aid 234 comprises a locator aid, then the person 204 may activate the locator aid. For example, if the locator aid comprises a light or emergency radio beacon then the person 204 may activate the light or emergency radio beacon. If the locator aid comprises a flare then the person 204 may activate the flare. In some examples, the locator aid is already activated, without the person having to activate the locator aid. For example, the locator aid may be activated prior to being deployed by the self-propelled autonomous vehicle 230. The locator aid may be activated by the self-propelled autonomous vehicle 230 in response to the self-propelled autonomous vehicle 230 determining that it has arrived at the second location L2, at a predetermined distance from the second location L2, at a different location, or on some other basis. In some examples the locator aid may be activated by another means, for example in response to contact with the body of water, or in response to descending from the self-propelled autonomous vehicle 230 where the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle.
Where the rescue aid 234 comprises a retrieval aid such as a recovery harness, the person 204 may attach the retrieval aid to their body. Where the rescue aid 234 comprises a predator repellent such as a shark repellent, the person 204 may release the repellent from the rescue aid. In some examples the predator repellent may be released from the rescue aid 234 without intervention from the person 204. For example, the predator repellent may be released during deployment of the rescue aid 234 by the self-propelled autonomous vehicle 230. The predator repellent may be released by the self-propelled autonomous vehicle 230 in response to the self-propelled autonomous vehicle 230 determining that it has arrived at the second location L2, at a predetermined distance from the second location L2, or at a different location. In some examples the predator repellent may be released by another means, for example in response to contact of the rescue aid 234 with the body of water, or in response to the rescue aid 234 descending from the self-propelled autonomous vehicle 230, where the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle.
The rescue aid 234 may assist the person 204 until further help arrives, for example in the form of the user 206 and the marine vessel 202 arriving at the location of the person 204, or a different vessel, vehicle or persons arriving at the location of the person 204.
FIG. 5 shows a similar scenario to that of FIG. 3, with the self-propelled autonomous vehicle 230 having arrived at the second location L2 proximate the first location L1 of the person 204. However in this embodiment, the self-propelled autonomous vehicle 230 further comprises at least one line 240 coupled to the rescue aid 234. The line 240 may comprise any suitable type of line 240 such as a rope, cable, ribbon or the like. The line 240 is configured to assist the person 204 in the body of water in obtaining the rescue aid 234. The line 240 may be fluorescent to assist the user in finding the line 240, for example in the dark. FIG. 5 shows the line 240 extending from below the self-propelled autonomous vehicle 230 under the force of gravity. However in examples where the self-propelled autonomous vehicle 230 is an unmanned watercraft, the line 240 may extend substantially horizontally from the unmanned watercraft, across the surface of the body of water. While FIG. 5 shows one line 240, there may be more than one line 240 coupled to the rescue aid 234, as illustrated in FIG. 8.
In the embodiment of FIG. 5, the self-propelled autonomous vehicle 230 may deploy the rescue aid 234 in a similar manner as previously described in relation to FIG. 4; however in this example the line 240 remains coupled to the rescue aid 234. The line 240 preferably comprises a buoyant material, or has one or more buoyant elements attached along its length, so as to cause the line 240 to float in the body of water once the rescue aid 234 and attached line 240 have been deployed into the body of water. By providing a line 240 coupled to the rescue aid 234, the line 240 effectively provides a greater surface area for the person 204 to hold onto, compared to the rescue aid 234 alone, thereby increasing the likelihood and/or ease of the person 204 finding the rescue aid 234. The person 204 may grab a portion of the line 240 and use it to pull the attached rescue aid 234 towards them. The provision of the line 240 also means that less accuracy is needed when positioning the self-propelled autonomous vehicle 230 relative to the person 204 prior to deployment of the rescue aid 234, since the length of the line 240 increases the likelihood that a portion of the line 240 will land in the vicinity of the person after deployment. The use of a line 240 can also address the problem of the person 204 drifting or otherwise moving from their initial first location L1, for example due to the effect of currents in the body of water.
In some examples, as illustrated in FIG. 5, the self-propelled autonomous vehicle 230 is configured to traverse a predetermined path T proximate the first location L1, prior to deploying the rescue aid 234. FIG. 5 shows the self-propelled autonomous vehicle 230 traversing a substantially circular path T around the first location L1, although a different shape of path T may be traversed. The self-propelled autonomous vehicle 230 may begin traversing the predetermined path T in response to determining it has reached the second location L2, or a different location.
As the self-propelled autonomous vehicle 230 moves along the path T, the line 240 also moves in a path substantially corresponding to the path T of the self-propelled autonomous vehicle 230 (ignoring the effects of strong wind and/or water currents for simplicity). By causing the line 240 to move with the self-propelled autonomous vehicle 230 as the self-propelled autonomous vehicle 230 moves along the path T, the person 204 is provided with greater opportunity to grab the line 240 and therefore hold onto the rescue aid 234. This may be useful where the self-propelled autonomous vehicle 230 does not initially arrive at a location near enough to the person 204.
If the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle then the unmanned aerial vehicle may move itself to an altitude such that the line 240 is within sufficient reaching distance above the surface of the body of water for the person 204, or it may extend the line 240 towards the surface of the body of water, for example using a winch mechanism.
The path T may substantially encircle the first location L1 of the person 204, to increase the likelihood of the person 204 reaching the line 240. That is, the path T may be predetermined such that it allows for the line 240 to surround the first location L1 of the person 204.
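A minimal sketch, assuming latitude/longitude waypoints and a flat-earth approximation over the small radius involved (all names and values are hypothetical, not taken from the disclosure), of how a substantially circular path T encircling the first location L1 might be generated as a series of waypoints:

```python
import math

def circular_path(center_lat: float, center_lon: float,
                  radius_m: float, n_points: int = 36) -> list[tuple[float, float]]:
    """Waypoints approximating a circular path T of the given radius
    around the first location L1 (flat-earth approximation)."""
    r_earth = 6_371_000
    waypoints = []
    for k in range(n_points):
        theta = 2 * math.pi * k / n_points
        d_lat = math.degrees(radius_m * math.cos(theta) / r_earth)
        d_lon = math.degrees(radius_m * math.sin(theta) /
                             (r_earth * math.cos(math.radians(center_lat))))
        waypoints.append((center_lat + d_lat, center_lon + d_lon))
    waypoints.append(waypoints[0])  # close the loop so the line surrounds L1
    return waypoints

# e.g. a 10 m circle around a hypothetical first location L1
print(circular_path(50.3632, -4.1351, radius_m=10.0)[:3])
```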
The self-propelled autonomous vehicle 230 may deploy the rescue aid 234 in response to a sensor in the self-propelled autonomous vehicle 230 or the line 240 detecting that the person 204 is holding the line 240, or after a predetermined period of time, such as a predetermined period of time of the self-propelled autonomous vehicle 230 following the path T. In some examples, the path T is not predetermined but may instead be random.
In some examples, as illustrated in FIG. 6, the self-propelled autonomous vehicle 230 is configured to either extend the line 240 from the self-propelled autonomous vehicle 230 (for example using a winch mechanism) or descend in altitude (if the self-propelled autonomous vehicle 230 is an unmanned aerial vehicle) as the self-propelled autonomous vehicle 230 traverses the predetermined path T. Performing either of these actions allows for the line 240 to effectively move towards the body of water. Once one end of the line 240 makes contact with the body of water, an increasing length of the line 240 will make contact with the body of water as the self-propelled autonomous vehicle 230 continues to extend the line and/or descend, as illustrated in FIG. 6. The line 240 will increasingly lie along the surface of the body of water, forming its own path on the surface of the water as shown, and making it easier for the person 204 to grab the line 240.
FIG. 7 shows the embodiment of FIG. 6, but from a view above the surface of the water, looking down towards the person 204 and the self-propelled autonomous vehicle 230. The self-propelled autonomous vehicle 230 has followed the path T from the second location L2 to the location currently shown, the path T encircling the first location L1 of the person 204. At least part of the line 240 is lying along the surface of the body of water along a path corresponding to the path T taken by the self-propelled autonomous vehicle 230 (assuming negligible surface water currents for simplicity). The line 240 is also encircling the person 204 on the surface of the body of water. This improves the likelihood of the person 204 being able to hold onto a portion of the line 240 and therefore obtain the rescue aid 234 once deployed. The self-propelled autonomous vehicle 230 may deploy the rescue aid 234 in a similar manner as described previously in relation to FIG. 5.
In some embodiments, the self-propelled autonomous vehicle 230 may be configured to deploy the rescue aid 234 as it travels along a radial path extending from the portable device 210 (or from the original location of the self-propelled autonomous vehicle 230 prior to its deployment) towards the determined location of the person 204. This may be advantageous when the determined range R between the person 204 and the portable device 210, or the horizontal distance between the person 204 and the portable device 210, is not particularly accurate, and so the determined location of the person 204 may not be accurate. It may be possible to more accurately determine a bearing or azimuth of the person 204 relative to the portable device 210 than the range R or horizontal distance of the person 204 relative to the portable device 210. Therefore when the self-propelled autonomous vehicle 230 is instructed to navigate towards a determined location of the person 204 based on a determined azimuth/bearing of the person 204 relative to the portable device 210 and a determined range/horizontal distance between the person 204 and the portable device 210, the self-propelled autonomous vehicle 230 may be configured to navigate along a radial path from its initial position towards the determined location at the determined azimuth/bearing, and deploy the rescue aid 234 as it travels along the radial path, at a location proximate to the determined location. For example, the rescue aid 234 may comprise a line 240 and the self-propelled autonomous vehicle 230 may deploy the line 240 along the radial path, starting at a location proximate to, but before, the determined location, and continuing to deploy the line along the radial path and towards the determined location. As such, the likelihood of the line 240 reaching the person 204 is increased. In a similar manner, the rescue aid 234 may comprise a plurality of payloads and the self-propelled autonomous vehicle 230 may be configured to sequentially drop each of the plurality of payloads along the radial path.
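A minimal sketch of such a release schedule (the lead distance, release count and spacing are hypothetical values chosen for illustration, not taken from the disclosure): ranges along the radial path at which line or payloads might be released, starting before the determined range and continuing past it:

```python
def radial_release_schedule(target_range_m: float, lead_m: float = 15.0,
                            n_releases: int = 4, spacing_m: float = 10.0) -> list[float]:
    """Ranges (metres from the vehicle's start point, measured along the
    determined azimuth) at which to release line or payloads. Release
    starts lead_m before the determined range and continues past it, so
    an error in the range estimate still leaves line or payloads near
    the person."""
    return [target_range_m - lead_m + k * spacing_m for k in range(n_releases)]

# a determined range of 120 m gives releases at 105, 115, 125 and 135 m
print(radial_release_schedule(120.0))
```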
FIG. 8 shows an embodiment wherein the self-propelled autonomous vehicle 230 comprises a plurality of lines 240a-f coupled to the rescue aid 234. In this example there are six lines 240a-f shown coupled to the rescue aid 234; however in other examples there may be more or fewer than this. The self-propelled autonomous vehicle 230 may act in a similar manner as described previously in relation to any of the previous Figures. By providing more lines 240a-f coupled to the rescue aid 234, the likelihood of the person 204 being able to grab one of the lines 240a-f and therefore obtain the rescue aid 234 is increased. The self-propelled autonomous vehicle 230 may be configured to radially eject the plurality of lines from the self-propelled autonomous vehicle 230, as illustrated in FIG. 8. That is, the self-propelled autonomous vehicle 230 may comprise a mechanism to propel each of the plurality of lines 240a-f in a radial direction away from the self-propelled autonomous vehicle 230 such that each of the plurality of lines 240a-f travels in a different radial direction from the self-propelled autonomous vehicle 230. As a result, the plurality of lines 240a-f may effectively cover a greater surface area of the body of water, increasing the likelihood of the person being able to grab one of the lines 240a-f. The plurality of lines 240a-f may be ejected by any suitable means of propulsion, for example using an elastic propulsion means or gas propulsion means.
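For illustration only (the function name and line count are assumptions, not from the disclosure), evenly spaced radial ejection directions for a plurality of lines might be computed as:

```python
def ejection_azimuths(n_lines: int) -> list[float]:
    """Evenly spaced radial directions (degrees clockwise from North) in
    which to propel each of the lines coupled to the rescue aid."""
    return [360.0 * k / n_lines for k in range(n_lines)]

# six lines, as in FIG. 8: ejected at 0, 60, 120, 180, 240 and 300 degrees
print(ejection_azimuths(6))
```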
FIG. 9 shows an example processing means 900. The processing means 900 may be the processing means 120 described elsewhere in this application, and may be used to perform various method steps disclosed throughout this application. In some examples, a similar or identical processing means 900 may be comprised in at least one of the portable device 110 or the self-propelled autonomous vehicle 130 and may be used to perform various operations such as measurements associated with the portable device 110 or the self-propelled autonomous vehicle 130. For example, a processing means 900 comprised in the portable device 110 may be configured to receive the user input, generate the data indicating the first location, and transmit said data to the processing means 120. The processing means 900 comprised in the portable device 110 may be configured to cause the portable device 110 to make one or more of the measurements disclosed herein, such as measuring range, declination, azimuth, or the height of the portable device 110 above the ground/body of water. As another example, a processing means 900 comprised in the self-propelled autonomous vehicle 130 may be configured to receive and process instructions from the processing means 120, and in response send signals to the propulsion system 133 of the self-propelled autonomous vehicle 130 to move the self-propelled autonomous vehicle 130.
The processing means 900 may form part of a computer.
The processing means 900 comprises one or more processors 910 coupled to a memory 920. The memory 920 may comprise instructions which, when executed by the one or more processors 910, may cause the one or more processors 910 to perform any of the steps described herein. The memory may comprise at least one of volatile memory 922 and non-volatile memory 924. The non-volatile memory 924 may comprise the instructions, for example. In some examples, the non-volatile memory 924 may comprise an operating system or the like. The volatile memory 922 may be used by the one or more processors 910 to temporarily store data to allow for operation of the one or more processors 910. The one or more processors 910 may comprise any suitable processors known in the art. The memory 920, such as the volatile memory 922 and non-volatile memory 924, may comprise any suitable memory known in the art.
The processing means 900 may be coupled to a power supply, not shown for simplicity. The processing means 900 may be coupled to one or more sensors for receiving inputs, and/or peripheral devices for providing outputs. The processing means 900 may be coupled to a communication module (not shown) for providing a wired and/or wireless communication link with a corresponding communication module of another device. Data, such as any of the data described in this application, may be transmitted using the communication module over the communication link. The communication link may use any suitable wired or wireless communication protocol. Where the communication module is a wireless communication module and the communication link is a wireless communication link, the wireless communication link may use Bluetooth, WiFi, VHF, infrared, or any other suitable radio link or other wireless communication link.
Aspects of the present disclosure also relate to various methods including a method for assisting the rescue of a person in a body of water using any of the previously described systems.
FIG. 10 is a flow diagram illustrating a method according to embodiments of the present disclosure. The method is for assisting the rescue of a person in a body of water and may use any of the previously described systems such as system 100 or system 200.
As illustrated by step 1010, the method comprises determining, by the processing means, the first location of the person in the body of water using the portable device.
Determining the first location using the portable device may comprise at least receiving data indicating the first location from the portable device and determining the first location based on the received data. The portable device may generate the data and transmit the data to the processing means in response to receipt of a user input at the portable device.
As illustrated by step 1020, the method further comprises causing, by the processing means, and based on the determined first location, the self-propelled autonomous vehicle to navigate to a second location proximate to the first location.
One or more of the steps of the methods disclosed herein may be performed by processing means 120, processing means 220, processing means 900, or another processing means. The portable device may be any of the portable devices described herein. The self-propelled autonomous vehicle may be any self-propelled autonomous vehicle described herein.
Aspects of the present disclosure have described the use of the portable device 110 to take a range measurement between the portable device 110 and a target such as a person in need of rescue, for example using a laser to make the range measurement. The range may then be used with an azimuth measurement of the target to determine a location of the target. However, in some scenarios it may be difficult or impossible to make a range measurement using a laser or the like. For example, the target may be too small and/or lack sufficient reflectivity to allow a measurement of range to be made, or else the target may be intermittently visible but not for long enough to aim the portable device 110 and take a measurement. In this case, a range may be acquired in a different manner, as illustrated in FIG. 11. The system 1100 of FIG. 11 may be similar to the system described in relation to FIG. 2. However in this scenario, the user 206 may have difficulty in making a direct measurement of the range R between the portable device 210 and the person 204, for example due to excessive movement of the person 204 in rough water. While the range R may be unknown, the radial distance d (also referred to as horizontal distance) between the person 204 in the water and the horizontal position of the portable device 210 may be calculated based on the height h of the portable device 210 above the water and the declination A of the person 204 relative to the portable device 210. The radial distance d may be calculated as follows: d = h / tan A. The declination A may be determined using any suitable declination sensor 113 in the portable device 210 such as a tilt sensor or accelerometer. The user 206 may aim the portable device 210 at the person 204 as described previously, and the portable device 210 may then make a measurement of the declination of the person 204 using a declination sensor 113 or the like. This may be in response to a user input at the portable device 210 via any suitable user input device.
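A minimal sketch of this calculation (illustrative only; the function name and example values are assumptions, and the declination A is assumed to be measured in degrees below the horizontal):

```python
import math

def radial_distance(height_m: float, declination_deg: float) -> float:
    """Radial (horizontal) distance d from the portable device to the
    person, given the device height h above the water and the declination
    A of the person below the horizontal: d = h / tan(A)."""
    return height_m / math.tan(math.radians(declination_deg))

# a device 4 m above the water sighting a person 2 degrees below the
# horizontal gives d of roughly 115 m
print(radial_distance(height_m=4.0, declination_deg=2.0))
```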
The value of the height h, also known as altitude, may already be known. For example, the value may be predetermined and stored in the portable device 210 or processing means 220. In other examples, a value for the height h may be determined, for example in response to a user input at the portable device 210 via any suitable user input device. The height h may be determined using any suitable sensor, for example an altimeter, comprised in the portable device 210 or another part of the system 200. The height h of the portable device 210 above the body of water or ground level may be approximated as equal to the height of the portable device 210 above the target, such as the person 204.
The processing means 220 may perform any of the aforementioned calculations. For example, the processing means 220 may determine the distance d based on a measurement of the declination A received from the portable device 210 and the height h received from the portable device 210 or elsewhere. The processing means 220 may send instructions to the self-propelled autonomous vehicle 230 to cause it to navigate to a location L2 based on the determined distance d and an azimuth/bearing of the person 204 relative to the portable device 210 as measured by an azimuth sensor 117 of the portable device 210 or the like.
In some examples, a value for the height h may be determined using an auxiliary target 1250, as illustrated in FIG. 12. In this example, the height h may be determined based on a range S between the portable device 210 and the auxiliary target 1250, and a declination B of the auxiliary target 1250 relative to the portable device 210. The range S may be determined in any manner discussed previously, for example by aiming the portable device 210 at the auxiliary target 1250 and making a range S measurement. The declination B may be determined in any manner discussed previously, for example in a similar manner to the measurement of the declination A. The auxiliary target 1250 may be any other target in the water that is not the person 204, but for which a measurement of the range S can be made. In some examples, the auxiliary target 1250 may be thrown into the water by the user 206 or a third party.
Once the range S and declination B are known, the height h may be calculated using the following equation: h = S sin B. A value for the radial distance d shown in FIG. 12 may therefore be calculated as follows: d = (S sin B) / tan A. The radial distance d may be used to determine the location of the person 204.
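The two equations can be combined into one illustrative sketch (the function name and example values are assumptions, not from the disclosure):

```python
import math

def radial_distance_via_auxiliary(range_s_m: float, decl_b_deg: float,
                                  decl_a_deg: float) -> float:
    """Radial distance d to the person when only their declination A is
    measurable: the height is first derived from an auxiliary target at
    range S and declination B, i.e. h = S sin B, then d = h / tan A."""
    height_m = range_s_m * math.sin(math.radians(decl_b_deg))
    return height_m / math.tan(math.radians(decl_a_deg))

# an auxiliary target at S = 20 m, B = 11.5 degrees implies h of about 4 m;
# with A = 2 degrees for the person, d is again roughly 115 m
print(radial_distance_via_auxiliary(20.0, 11.5, 2.0))
```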
As discussed previously, a location of the person 204 may be determined based on a determined range R or radial distance d, and an azimuth of the person 204. As also discussed previously, this location may be a relative location. That is, the determined location may be a location relative to the portable device 210. The processing means 220 may send instructions to the self-propelled autonomous vehicle 230 to cause it to navigate to a location based on the relative location. However in some examples, it may be preferable to convert the relative location to an absolute location, and send instructions to the self-propelled autonomous vehicle 230 to cause it to navigate to a location based on the absolute location.
FIG. 13 schematically illustrates how a relative location may be converted to an absolute location. FIG. 13 shows the portable device 210 on a marine vessel 202; however, as for any of the embodiments described herein, the portable device 210 may not be located on a marine vessel 202 and may instead be located elsewhere, for example on land, on a bridge, or at another location described previously, among other locations. FIG. 13 also shows the target as a person 204; however, in other examples the target may be any other target such as an animal, or another object, living or non-living.
A person 204 in need of rescue may be determined to have a relative location LA, this location being relative to the portable device 210, which has a relative location LB. In some examples, the system 100 comprises a GPS receiver as discussed previously and this can be used to determine an absolute location of the portable device 210, for example by approximating an absolute location of the GPS receiver as equal to an absolute location of the portable device 210, or using the absolute location of the GPS receiver and a known distance between the GPS receiver and the portable device 210 to determine the absolute location of the portable device 210. Once the absolute location of the portable device 210 is determined, this may be used to determine an absolute location of the person 204, based on the absolute location of the portable device 210 and the relative location of the person 204.
In some examples, an object 1350 of known absolute location Lc may be in the vicinity of the portable device 210. The object 1350 could be an object in a fixed location on land or in the body of water, or could have a moving location on land or in the body of water but its location at a given point in time is known. If the portable device 210 is near the object 1350 then the absolute location of the portable device 210 may be determined to be the absolute location Lc of the object 1350, which may in turn be used to determine an absolute location of the person 204, based on the absolute location of the portable device 210 and the relative location of the person 204.
In other examples, the portable device 210 may be used to measure a distance/range v to the object 1350 using a laser or another suitable method, and an azimuth P of the object 1350 (or another suitable angle) using an azimuth sensor 117 or the like. The known absolute location Lc of the object 1350, the distance v and the azimuth P may be used to determine an absolute location of the portable device 210. The absolute location of the portable device 210 may be used with a distance w and azimuth Q of the person 204 to determine an absolute location of the person 204.
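A hedged sketch of this two-step conversion (illustrative only; all coordinates, distances and angles are hypothetical, a flat-earth approximation is assumed, and azimuths are taken as degrees clockwise from North): the device location is found by stepping back from the object 1350 along the reciprocal of azimuth P, and the person's absolute location by stepping from the device along azimuth Q:

```python
import math

EARTH_RADIUS_M = 6_371_000

def offset(lat_deg: float, lon_deg: float,
           distance_m: float, azimuth_deg: float) -> tuple[float, float]:
    """Flat-earth offset of a latitude/longitude point along an azimuth."""
    d_north = distance_m * math.cos(math.radians(azimuth_deg))
    d_east = distance_m * math.sin(math.radians(azimuth_deg))
    d_lat = math.degrees(d_north / EARTH_RADIUS_M)
    d_lon = math.degrees(d_east / (EARTH_RADIUS_M * math.cos(math.radians(lat_deg))))
    return lat_deg + d_lat, lon_deg + d_lon

# Known absolute location Lc of the object 1350 (hypothetical), plus the
# measured range v and azimuth P of the object from the portable device:
# the device lies at distance v from Lc along the back-azimuth P + 180.
Lc, v, P = (50.3650, -4.1300), 80.0, 40.0
device = offset(*Lc, v, (P + 180.0) % 360.0)

# Absolute location of the person 204 from the device location and the
# measured range w and azimuth Q of the person.
w, Q = 120.0, 300.0
person = offset(*device, w, Q)
print(person)
```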
FIG. 14A-C together form a flow diagram of a method according to embodiments of the present invention.
In step 1401, a user aims the portable device 210 at a person 204 in need of rescue, aligning the portable device 210 with the person 204 such that it can be used to make one or more measurements for determining a location of the person 204. The user may align the portable device 210 using a viewfinder 112, for example.
In optional step 1402, the portable device 210 may enhance the image of the aligned target provided at the viewfinder.
In optional step 1403, it is determined whether a range measurement of the person 204 should be taken at that moment in time. For example, where the range measurement is made using a laser of the portable device 210, it is determined whether a range measurement should be made using the laser. This determination may be made based on whether a user input is received at the portable device 210, for example via trigger 114. In response to a positive determination, that the range measurement should be made, the method continues to step 1404 where a measurement of the range of the person 204 is made.
In step 1405, an azimuth of the person 204 is measured, for example using an azimuth sensor 117 of the portable device 210.
In optional step 1406, a declination of the person 204 is measured, for example using a declination sensor 113 of the portable device 210. The declination measurement may be used to improve the accuracy of determining the radial distance to the person 204.
In optional step 1407, information indicating the target measured by the portable device 210 is output by the portable device 210, for example at a display of the portable device 210. The information may comprise an image of the target as captured by the portable device 210, or a visual representation of the measured range, azimuth and/or declination. In some examples, the portable device 210 may be configured to receive an input from the user confirming whether the correct target has been measured. If the user input confirms the correct target has not been measured, the method may return to step 1401. If the user input confirms the correct target has been measured, the method may continue to step 1408.
In step 1408, the measured range and azimuth data are used to determine a relative location of the target.
In optional step 1409, the relative location of the target may also be determined based on the measured declination.
Returning to step 1403, in response to a negative determination, that the range measurement should not be made, the method continues to step 1410 where it is determined whether a range measurement of the person 204 should be taken at a future moment in time, when possible (for example because of rough water making a range measurement of the person 204 difficult at that given moment in time). In response to a user input indicating that a range measurement of the person 204 should be taken at a future moment in time, the method proceeds to step 1411, where the portable device 210 makes a range measurement of the person 204 when possible, before continuing to step 1405.
Returning to step 1410, in response to a negative determination, that the range measurement should not be made, the method continues to step 1412. In step 1412, an azimuth of the person 204 is measured, for example using an azimuth sensor 117 of the portable device 210.
In step 1413, a declination of the person 204 is measured, for example using a declination sensor 113 of the portable device 210.
In step 1414, it is determined whether a height h of the portable device 210 above the body of water is known. If it is, for example because a value for the height is already stored in the system or a value is input by a user, the method proceeds to step 1415, where the determined declination, azimuth and height are used to determine a relative location of the person 204. If in step 1414 the height is not known, the method proceeds to step 1416 shown in FIG. 14B.
In step 1416, the user aims the portable device 210 at an auxiliary target 1250 in the body of water. In step 1417, it is determined whether a range measurement of the auxiliary target 1250 should be made, in a similar manner to step 1403. If yes, the method proceeds to step 1418 and a range measurement of the auxiliary target 1250 is made using the portable device 210, in a similar manner to step 1404, and a declination of the auxiliary target 1250 is measured in step 1419, in a similar manner to step 1406. In optional step 1420, the portable device 210 may allow the user to review and confirm the measurements, in a similar manner as described for step 1407. If they do not confirm the measurements, the method may return to step 1416. If they do confirm the measurements, in step 1421, the range and declination measurements for the auxiliary target 1250 are used to determine the height h of the portable device 210 and the method proceeds to step 1415 as discussed previously.
Returning to step 1417, if it is determined that a range measurement of the auxiliary target 1250 should not be made at that point in time, the method proceeds to step 1422 where it is determined whether a range measurement of the auxiliary target 1250 should be taken at a future moment in time, when possible (for example because of rough water making a range measurement of the auxiliary target 1250 difficult at that given moment in time). In response to a user input indicating that a range measurement of the auxiliary target 1250 should be taken at a future moment in time, the method proceeds to step 1423, where the portable device 210 makes a range measurement of the auxiliary target 1250 when possible, before continuing to step 1419.
Returning to step 1422, in response to a negative determination, that the range measurement should not be made, the method continues to step 1424, where a new auxiliary target may be selected by the user, and the method may continue to step 1416 using the new auxiliary target as the auxiliary target 1250.
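The range, declination and auxiliary-target branches of FIGS. 14A and 14B can be condensed into one illustrative sketch (function and variable names are assumptions; the disclosure's flow diagram, not this sketch, is authoritative):

```python
import math
from typing import Optional

def relative_location(range_r: Optional[float], azimuth_deg: float,
                      decl_a_deg: Optional[float], height_h: Optional[float],
                      aux_range_s: Optional[float],
                      aux_decl_b_deg: Optional[float]) -> tuple[float, float]:
    """Return (radial distance, azimuth) of the person relative to the
    portable device: prefer a direct range measurement (steps 1403-1408),
    else use declination plus height (step 1415), deriving the height
    from an auxiliary target if it is not known (steps 1416-1421)."""
    if range_r is not None:
        return range_r, azimuth_deg
    if decl_a_deg is None:
        raise ValueError("declination of the person is required when no range is available")
    if height_h is None:
        if aux_range_s is None or aux_decl_b_deg is None:
            raise ValueError("auxiliary target range and declination required to derive height")
        height_h = aux_range_s * math.sin(math.radians(aux_decl_b_deg))  # h = S sin B
    return height_h / math.tan(math.radians(decl_a_deg)), azimuth_deg   # d = h / tan A
```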
Steps 1409 and 1415 continue to step 1425 in FIG. 14C, where it is determined whether an absolute location of the portable device 210 is known. This may be because the portable device 210 or another part of the system 100 comprises a GPS receiver, or else because the portable device 210 is sufficiently proximate an object of known absolute location that the absolute location of the portable device 210 can be considered equal to the absolute location of the object. If, in step 1425, it is determined that an absolute location of the portable device 210 is known, the method proceeds to optional step 1426, where the absolute location of the portable device 210 is set as the absolute location of the object having the known absolute location (if appropriate), then step 1427, where an absolute location of the person 204 is determined based on the absolute location of the portable device 210 and the relative location of the person 204. In step 1428, the absolute location is used to instruct the self-propelled autonomous vehicle 230 to navigate to the person 204, for example by the processing means 220 sending instructions comprising the absolute location to the self-propelled autonomous vehicle 230, or sending instructions indicative of the absolute location to the self-propelled autonomous vehicle 230.
Returning to step 1425, in response to a negative determination, the method proceeds to step 1429, where the user of the portable device 210 is in sight of an object 1350 having a known absolute location. In step 1430, the user aims the portable device 210 at the object 1350 in order to take a range measurement of the object 1350. In step 1431, it is determined whether a range measurement of the object 1350 should be made, in a similar manner to step 1403. If yes, the method proceeds to step 1432 and a range measurement of the object 1350 is made using the portable device 210, in a similar manner to step 1404, and an azimuth of the object 1350 is measured in step 1433, in a similar manner to step 1412. In optional step 1434, a declination of the object 1350 is measured using a declination sensor 113 or the like. In optional step 1435, the portable device 210 may allow the user to review and confirm the measurements, in a similar manner as described for step 1407. If they do not confirm the measurements, the method may return to step 1430. If they do confirm the measurements, in step 1436, the range and azimuth measurements for the object 1350 are used to determine the relative location of the object 1350, relative to the portable device 210. In optional step 1437, the measured declination of the object 1350 may be used to provide a more accurate determination of the relative location of the object 1350. In step 1438, an absolute location of the portable device 210 is determined based on the absolute location of the object 1350 and the relative location of the object 1350, and the method then proceeds to step 1427.
Returning to step 1431, if it is determined that a range measurement of the object 1350 should not be made at that point in time, the method proceeds to step 1439 where it is determined whether a range measurement of the object 1350 should be taken at a future moment in time, when possible (for example because of rough water making a range measurement of the object 1350 difficult at that given moment in time). In response to a user input indicating that a range measurement of the object 1350 should be taken at a future moment in time, the method proceeds to step 1440, where the portable device 210 makes a range measurement of the object 1350 when possible, before continuing to step 1433.
Returning to step 1439, in response to a negative determination, that the range measurement should not be made, the method may continue to step 1426, where the absolute location of the portable device 210 may be set as a known absolute location, for example a last known absolute location.
In the embodiments discussed previously, where a particular measurement of range, distance, azimuth/bearing, declination, height or the like is made, in some examples more than one such measurement may be made, and an average of those measurements may be calculated and used in further processing.
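One illustrative caveat when averaging repeated measurements (not from the disclosure; the function name is an assumption): angular quantities such as azimuth should be averaged via unit vectors rather than arithmetically, to handle the 0/360 degree wrap-around:

```python
import math

def mean_azimuth(azimuths_deg: list[float]) -> float:
    """Average repeated azimuth measurements. A plain arithmetic mean
    fails near the 0/360 degree wrap-around, so average unit vectors."""
    x = sum(math.cos(math.radians(a)) for a in azimuths_deg)
    y = sum(math.sin(math.radians(a)) for a in azimuths_deg)
    return math.degrees(math.atan2(y, x)) % 360.0

# readings straddling North average to about 0.5 degrees, not about 180
print(mean_azimuth([359.0, 2.0]))
```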
Where reference is made to an azimuth (also referred to as a bearing), the azimuth (or bearing) may be considered an angle in the horizontal plane (parallel to the ground/surface of the body of water) between a predefined direction (e.g. North) from the sensor making the azimuth measurement (i.e. the azimuth sensor of the portable device 110) and a direction between the sensor making the azimuth measurement (i.e. the azimuth sensor of the portable device 110) and the target (such as the person 204). Where reference is made to azimuth or bearing, this may not necessarily be an angle measured from North as the predefined direction, but may be an angle of a target relative to the portable device 110 as measured from a different starting direction, predetermined or otherwise.
Where reference is made to declination, this refers to the vertical angle formed between a target (such as the person 204) and the horizontal (with respect to the ground/body of water), as viewed from the sensor making the declination measurement (i.e. the declination sensor of the portable device 110). In other words, the declination is the vertical angle between the horizontal and the line of sight between the sensor making the declination measurement (i.e. the declination sensor of the portable device 110) and the target. In some examples, other suitable angles may be measured in place of the declination.
Where reference is made to determining a location of the target, this may be an absolute location, for example the coordinates of the target in a geospatial coordinate system may be determined, or it may be a location represented by a determined azimuth and radial distance of the target.
Although several embodiments have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles of the invention, the scope of which is defined in the claims.
Those of skill in the art will understand that modifications (additions and/or removals) of various components of the apparatuses, methods, systems and embodiments described herein may be made without departing from the full scope and spirit of the present invention, which encompass such modifications and any and all equivalents thereof.

Claims (25)

  1. A system for assisting with the rescue of a person in a body of water, the system comprising: a portable device for determining a first location of the person in the body of water; and a processing means configured to: determine the first location using the portable device; and based on said determined first location, cause a self-propelled autonomous vehicle to navigate to a second location proximate to the first location.
  2. The system of claim 1, wherein the portable device is a handheld device, optionally wherein the portable device is a portable rangefinder device.
  3. The system of claim 1 or 2, wherein the processing means is configured to determine the first location using the portable device at least in part by: receiving data indicating the first location from the portable device; and determining the first location based on the received data.
  4. The system of claim 3, wherein the data indicates at least one of a range, an azimuth or a declination of the person in the body of water relative to the portable device.
  5. The system of claim 4, wherein the portable device comprises a declination sensor, wherein the data indicates a declination of the person as determined using the declination sensor, and wherein the processing means is configured to determine the first location based on the declination and a height of the portable device above the body of water.
  6. The system of claim 5, wherein the portable device is configured to determine a range and a declination of an auxiliary target, wherein the processing means is configured to receive data indicating the range and the declination of the auxiliary target from the portable device, and wherein the processing means is configured to determine the height based on the received data indicating the range and the declination of the auxiliary target.
  7. The system of any of claims 3 to 6, wherein the portable device is configured to generate the data and transmit the data to the processing means in response to receipt of a user input at the portable device.
  8. The system of claim 7, wherein the portable device has a viewfinder, and wherein the portable device is configured such that: the user may aim the portable device at the person in the body of water by viewing the person in the body of water using the viewfinder; and in response to receipt of the user input while the portable device is aimed at the person in the body of water, the portable device generates the data indicating the first location of the person in the body of water.
  9. The system of any preceding claim, wherein the first location is a relative first location, and wherein the processing means is configured to convert the relative first location to an absolute first location based on: receiving data from the portable device indicating a relative location of a target having a known absolute location, wherein the relative location of the target is relative to the portable device; and converting the relative first location to the absolute first location based on the absolute location of the target and the data indicating the relative location of the target.
  10. The system of claim 9, wherein the data indicating the relative location of the target comprises data indicating a range of the target and an azimuth of the target relative to the portable device.
  11. The system of any preceding claim, wherein the portable device is configured to generate the data using a laser.
  12. The system of claim 11, wherein the portable device is configured to generate the data using LIDAR.
  13. The system of any preceding claim, wherein the autonomous vehicle is an unmanned aerial vehicle, optionally wherein the unmanned aerial vehicle is a fixed-wing drone or a multirotor drone.
  14. The system of any of claims 1 to 12, wherein the autonomous vehicle is an unmanned watercraft.
  15. The system of any preceding claim, further comprising the self-propelled autonomous vehicle.
  16. The system of any preceding claim, wherein the autonomous vehicle comprises a camera for capturing a video of the person in the body of water, optionally wherein the autonomous vehicle further comprises a wireless communication module for transmitting video data representing the video to an external device.
  17. The system of claim 16, wherein the autonomous vehicle is configured to navigate to a third location proximate the second location based at least in part on video captured by the camera, optionally wherein the autonomous vehicle is configured to identify the person using the video captured by the camera and at least one of machine learning or artificial intelligence, and to navigate to the third location based at least in part on the identification of the person.
  18. The system of any preceding claim, wherein the autonomous vehicle comprises a wireless communication module for receiving control signals from a remote control system, and wherein the autonomous vehicle is configured to be remotely navigated based on the received control signals.
  19. The system of any preceding claim, wherein the autonomous vehicle comprises a rescue aid for delivery to the person in the body of water.
  20. The system of claim 19, wherein the autonomous vehicle is configured to automatically deploy the rescue aid upon arrival at the second location.
  21. The system of claim 19 or 20, wherein the rescue aid comprises at least one of a flotation aid, a locator aid, a retrieval aid or a predator repellent.
  22. The system of claim 19, 20 or 21, wherein: the autonomous vehicle comprises at least one line coupled to the rescue aid; and upon reaching the second location, the autonomous vehicle is configured to traverse a predetermined path proximate the first location, optionally wherein the path substantially encircles the first location.
  23. The system of claim 22, wherein the autonomous vehicle, as it traverses the path, is configured either to extend the line from the autonomous vehicle or descend in altitude.
  24. The system of any of claims 19 to 23, wherein: the autonomous vehicle comprises a plurality of lines coupled to the rescue aid; and upon reaching the second location, the autonomous vehicle is configured to radially eject the plurality of lines from the autonomous vehicle.
  25. A method for assisting the rescue of a person in a body of water using the system of any of claims 1 to 24, the method comprising: determining, by the processing means, the first location of the person in the body of water using the portable device; and causing, by the processing means, and based on said determined first location, said self-propelled autonomous vehicle to navigate to a second location proximate to the first location.
GB2109711.8A 2021-07-05 2021-07-05 System for assisting with water rescue Pending GB2608614A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2109711.8A GB2608614A (en) 2021-07-05 2021-07-05 System for assisting with water rescue

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2109711.8A GB2608614A (en) 2021-07-05 2021-07-05 System for assisting with water rescue

Publications (2)

Publication Number Publication Date
GB202109711D0 GB202109711D0 (en) 2021-08-18
GB2608614A true GB2608614A (en) 2023-01-11

Family

ID=77274570

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2109711.8A Pending GB2608614A (en) 2021-07-05 2021-07-05 System for assisting with water rescue

Country Status (1)

Country Link
GB (1) GB2608614A (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115188155B (en) * 2022-02-28 2024-03-08 华北水利水电大学 Automatic rescue remote control system and method for monitoring accidental falling water

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140031498A (en) * 2012-09-03 2014-03-13 주식회사 에프나인 Life rescue system using overwater robot and life rescue method using the same
US20180194445A1 (en) * 2015-05-19 2018-07-12 Rujing Tang Unmanned aerial vehicle system and methods for use
CN107351991A (en) * 2016-05-10 2017-11-17 三亚深海鲸电子科技有限公司 Quick unmanned plane rescue system for outdoor fixed swimming area
US10745132B1 (en) * 2016-06-28 2020-08-18 Amazon Technologies, Inc. Tow control handle for unmanned aerial vehicle
US20200031437A1 (en) * 2018-07-25 2020-01-30 Thomas Lawrence Moses Unmanned Aerial Vehicle Search and Rescue System

Also Published As

Publication number Publication date
GB202109711D0 (en) 2021-08-18

Similar Documents

Publication Publication Date Title
US11891158B2 (en) Unmanned aerial vehicle search and rescue system
US10668997B2 (en) Unmanned aerial vehicle search and rescue system
US11275371B2 (en) Unmanned vehicle control and operation in a marine environment
US9223027B1 (en) Rescue method and system for an overboard passenger
KR101812487B1 (en) Offshore lifesaving system using drone
US11430332B2 (en) Unmanned aerial system assisted navigational systems and methods
KR101653125B1 (en) Drone system for rescue and method of rescue
JP6337253B2 (en) Underwater exploration system
US20190127034A1 (en) Autonomous underwater survey apparatus and system
US10059448B1 (en) Rescue device for distressed swimmer
KR102050519B1 (en) Marine rescue boat drone
US7230881B2 (en) Submarine remote surface platform
KR101710613B1 (en) Real-time wave and current measurement using Waterproof Drone equipped with hydrofoil
CN109690250B (en) Unmanned aerial vehicle system assisted navigation system and method
US20230150625A1 (en) Unmanned Aerial Vehicle Search and Rescue System
KR20170040446A (en) Method of real-time operational ocean monitoring system using Drone equipped with an automatic winch and thereof device
JP7181723B2 (en) Maritime search system, unmanned air vehicle, and unmanned flight method
CN111275924B (en) Unmanned aerial vehicle-based child drowning prevention monitoring method and system and unmanned aerial vehicle
GB2608614A (en) System for assisting with water rescue
US20220317233A1 (en) System and Method of Detecting and Notifying of an Occurrence of an Overboard Passenger on a Vessel
GB2607103A (en) System for assisting with water rescue
JP6393157B2 (en) Spacecraft search and recovery system
GB2572842A (en) Unmanned aerial system assisted navigational systems and methods
US20240103154A1 (en) Location Apparatus
JP2022144860A (en) Search system and search method for unmanned underwater vehicle