WO2020210918A1 - Underwater vehicle with an omnidirectional camera, and method of controlling movement of the same - Google Patents

Underwater vehicle with an omnidirectional camera, and method of controlling movement of the same

Info

Publication number
WO2020210918A1
Authority
WO
WIPO (PCT)
Prior art keywords
underwater vehicle
processor
view
navigation
omnidirectional
Prior art date
Application number
PCT/CA2020/050520
Other languages
French (fr)
Inventor
Michael WESTBY
Mathew CLARKE
Michael Hansen
Original Assignee
Poseidon Ocean Systems Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Poseidon Ocean Systems Ltd. filed Critical Poseidon Ocean Systems Ltd.
Publication of WO2020210918A1 publication Critical patent/WO2020210918A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63GOFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
    • B63G8/00Underwater vessels, e.g. submarines; Equipment specially adapted therefor
    • B63G8/14Control of attitude or depth
    • B63G8/16Control of attitude or depth by direct use of propellers or jets
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63GOFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
    • B63G8/00Underwater vessels, e.g. submarines; Equipment specially adapted therefor
    • B63G8/001Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations
    • B63G2008/002Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations unmanned
    • B63G2008/005Underwater vessels adapted for special purposes, e.g. unmanned underwater vessels; Equipment specially adapted therefor, e.g. docking stations unmanned remotely controlled
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B63SHIPS OR OTHER WATERBORNE VESSELS; RELATED EQUIPMENT
    • B63GOFFENSIVE OR DEFENSIVE ARRANGEMENTS ON VESSELS; MINE-LAYING; MINE-SWEEPING; SUBMARINES; AIRCRAFT CARRIERS
    • B63G8/00Underwater vessels, e.g. submarines; Equipment specially adapted therefor
    • B63G8/38Arrangement of visual or electronic watch equipment, e.g. of periscopes, of radar


Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An underwater vehicle is provided with an omnidirectional field of view, a camera, a plurality of thrusters, and a microcontroller that controls vector thrusting of the thrusters to move the underwater vehicle. There is a control station to allow an operator to control movement of the underwater vehicle. The control station includes a processor, a head mounted display device in communication with the processor, and an input device in communication with the processor. The processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to change the omnidirectional field of view of the underwater vehicle, based on an orientation of the head mounted display device worn by the operator. The processor also synchronously signals the microcontroller to move the underwater vehicle in a direction of travel, based on a navigation command inputted by the operator using the input device.

Description

UNDERWATER VEHICLE WITH AN OMNIDIRECTIONAL CAMERA, AND METHOD OF CONTROLLING MOVEMENT OF THE SAME
BACKGROUND OF THE INVENTION
Field of the Invention
[0001] The present invention relates to an underwater vehicle and, in particular, to an underwater vehicle provided with an omnidirectional camera, as well as a method of operating and controlling movement of the same.
Description of the Related Art
[0002] United States Patent Application Publication No. 2016/0301866 to Samsung Electronics Co., Ltd., which was published on October 13, 2016, discloses a method of setting an omnidirectional camera based on a viewpoint of a user and transmitting images in real time to a head mounted display worn by the user.
[0003] United States Patent Application Publication No. 2017/0278262 to Sony Corporation, which was published on September 28, 2017, discloses an omnidirectional camera mounted on a vehicle. The image generated by the camera is based on the head posture of a user and is transmitted to a display device fixed to a head or a face of the user.
SUMMARY OF THE INVENTION
[0004] It is an object of the present invention to provide an underwater vehicle having an omnidirectional field of view and to provide a method of controlling the navigated movement of the underwater vehicle based on operator observed orientations within the omnidirectional field of view.
[0005] There is accordingly provided a system comprising an underwater vehicle and a control station to allow an operator to control movement of the underwater vehicle. The underwater vehicle is provided with an omnidirectional field of view, a camera, a plurality of thrusters, and a microcontroller that controls vector thrusting of the thrusters to move the underwater vehicle. The control station includes a processor, a head mounted display device in communication with the processor, and an input device in communication with the processor. The processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to align the navigated direction of the underwater vehicle based on an operator-observed direction which is derived from the omnidirectional field of view of the underwater vehicle, and the orientation of the head mounted display device worn by the operator. The processor also synchronously signals the microcontroller to move the underwater vehicle in a direction of travel, based on a navigation command, which has been aligned to the operator's observed direction and inputted by the operator using the input device or based on pre-inputted waypoints.
[0006] There is also provided a system comprising an underwater vehicle and a control station to allow an operator to control movement of the underwater vehicle. The underwater vehicle is provided with an omnidirectional field of view, a camera, a plurality of thrusters, and a microcontroller that controls vector thrusting of the thrusters to move the underwater vehicle. The control station includes a processor, a screen display in communication with the processor, and an input device in communication with the processor. The processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to align the navigated direction of the underwater vehicle based on an operator-observed direction which is derived from the omnidirectional field of view of the underwater camera, and further based on a navigation command inputted by the operator using the input device. The processor also synchronously signals the microcontroller to move the underwater vehicle in a direction of travel, based on a navigation command inputted by the operator using the input device.
[0007] There is further provided a method for controlling the movement of an underwater vehicle having an omnidirectional field of view. The underwater vehicle is controlled to roll, pitch, or yaw to change the navigated direction of the underwater vehicle based on an operator-observed direction which is derived from the omnidirectional field of view of the omnidirectional camera as selected by the orientation of the head mounted display device worn by an operator. The underwater vehicle is also synchronously controlled to move in a direction of travel based on a navigation command inputted by the operator.
[0008] There is additionally provided a method of controlling movement of an underwater vehicle provided with an omnidirectional field of view. The method includes providing camera controls which associate a front of the underwater vehicle with a center of a field of view of a head-mounted display device. The camera controls cause the underwater vehicle to roll, pitch or yaw in response to movement of, and to maintain orientation with, the head-mounted display device. The method includes determining via a processor of a control station whether both the camera controls are active and navigation commands have been inputted and, if so, enabling navigation controls to selectively navigate the underwater vehicle in forward, reverse, and lateral directions. The method includes determining via the processor of the control station whether the camera controls are overridden and navigation commands have been inputted and, if so, enabling the navigation controls to selectively navigate the underwater vehicle in forward, reverse, lateral, upward and downward directions.
[0009] There is yet further provided a method for autonomously controlling the movement and navigation of an underwater vehicle having an omnidirectional field of view and an on-board control process which is used to determine objects within the navigational view. The underwater vehicle is controlled to roll, pitch, or yaw in a direction of travel based on a plurality of navigation commands (waypoints) previously input by the operator. The vehicle is subsequently guided by visual and other technological means of localization of objects and waypoints within the omnidirectional navigational field of view. Localization is not limited to strictly visual means and is intended to include location awareness from any suitable sensory technology. Such technology can include, but is not limited to, sensors based on sonar, lidar, radar, stereo vision, structured light, etc. The vehicle navigation control will consume the plurality of localization sensory inputs to calculate a best estimate of the localization of the vehicle relative to the desired navigational path and observed or sensed obstructions in the surroundings.
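By way of illustration only, and not as part of the disclosed embodiments, the following sketch shows one way such a best estimate could be computed, assuming each localization sensor reports a position fix together with a variance; the data structure, function name and weighting scheme are illustrative assumptions.

```python
# Illustrative sketch (not part of the disclosure): inverse-variance fusion of
# several independent position fixes into a single best estimate.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class PositionFix:
    x: float          # metres, in whatever reference frame the sensor reports
    y: float
    z: float
    variance: float   # reported uncertainty of this sensor's fix (m^2)

def fuse_fixes(fixes: List[PositionFix]) -> Tuple[float, float, float]:
    """Weight each fix by the inverse of its variance and return the weighted mean."""
    weights = [1.0 / f.variance for f in fixes]
    total = sum(weights)
    x = sum(w * f.x for w, f in zip(weights, fixes)) / total
    y = sum(w * f.y for w, f in zip(weights, fixes)) / total
    z = sum(w * f.z for w, f in zip(weights, fixes)) / total
    return x, y, z

# Example: a hypothetical sonar fix and a more precise stereo-vision fix.
estimate = fuse_fixes([
    PositionFix(10.2, -3.1, 5.0, variance=0.5),
    PositionFix(10.6, -2.9, 5.2, variance=0.1),
])
print(estimate)
```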
[0010] The operator may change the observed field of view derived from the omnidirectional camera, independent of the navigational field of view, through the orientation of the head mounted display device worn by said operator. Autonomous navigation proceeds until such time as the control system requires further waypoint input or obstruction guidance by the operator.
[0011] There is yet additionally provided a method of autonomously controlling movement and navigation of an underwater vehicle. The underwater vehicle has one or more cameras providing an omnidirectional field of view and includes an on-board control system. The method includes controlling roll, pitch or yaw in a direction of travel based on a plurality of previously-inputted navigation commands. The navigation commands may comprise one or more waypoints. The method includes subsequently guiding movement of the vehicle by object localization. The object localization comprises one or more visual or other technological means for location awareness. The method includes adjusting an observed said field of view, independent of a navigational said field of view, by adjusting orientation of a head-mounted display device worn by an operator. The method includes autonomous navigation proceeding until such time as the control system requires further navigation command input or obstruction guidance by the operator.
BRIEF DESCRIPTIONS OF DRAWINGS
[0012] The invention will be more readily understood from the following description of the embodiments thereof given, by way of example only, with reference to the accompanying drawings, in which: Figure 1 is a rear, top, right side perspective view of an underwater vehicle provided with an omnidirectional camera;
Figure 2 is a front, top, left side perspective view thereof;
Figure 3 is a schematic showing the underwater vehicle of Figure 1 in communication with a control station therefor, the control station including a processor, a head mounted display device, an input device, and a display screen;
Figure 4 is a block diagram showing the underwater vehicle of Figure 1 in communication with the control station of Figure 3;
Figure 5 is a flowchart showing the control logic of the underwater vehicle of Figure 1.
DESCRIPTIONS OF THE PREFERRED EMBODIMENTS
[0013] Referring to the drawings and first to Figure 1, there is shown an underwater vehicle 10. The underwater vehicle has a top 11a, bottom 11b, front 11c and rear 11d. The front and rear of the underwater vehicle 10 extend from the top to the bottom of the underwater vehicle. The underwater vehicle has a left side 11e and a right side 11f each of which extend between the top 11a and bottom 11b of the underwater vehicle and between the front 11c and rear 11d of the underwater vehicle.
[0014] The underwater vehicle 10 includes a chassis 12. The chassis in this example comprises a pair of spaced-apart side members, in this example side planar members 13a and 13b which extend along sides 11e and 11f of the underwater vehicle. The chassis 12 includes a body 13c positioned between the front 11c and rear 11d of the underwater vehicle and which extends between the side planar members. The chassis includes one or more front members, in this example a pair of spaced-apart forward rods 13d and 13e which align with the front 11c of the underwater vehicle 10 and extend between and couple to the side planar members 13a and 13b. The chassis 12 includes one or more rear members, in this example a pair of spaced-apart rearward rods 13f and 13g which align with the rear 11d of the underwater vehicle 10 and extend between and couple to the side planar members.
[0015] The underwater vehicle includes an elongate housing 14 mounted on the chassis 12, in this example coupling to body 13c. The housing is generally cylindrical in shape in this example and has a longitudinal or x-axis 15. Axis 15 is positioned between the sides 11e and 11f of the underwater vehicle, is positioned between the front 11c and rear 11d of the underwater vehicle, extends between the front 11c and rear 11d of the underwater vehicle, and is adjacent to the top 11a of the underwater vehicle in this example.
[0016] The underwater vehicle 10 includes a plurality of thrusters, in this example thrusters 24, 26, 28, 30 and 32 mounted on the chassis 12 thereof. Forward thruster 24 is positioned near the front 11c of the underwater vehicle and generates thrust vectors 25 aligned parallel to a z-axis 27 of the underwater vehicle 10, which is a vertical axis in Figure 1. Axis 27 is positioned between the front 11c and rear 11d of the underwater vehicle and is positioned between the sides 11e and 11f of the underwater vehicle. Axis 27 extends between the top 11a and bottom 11b of the underwater vehicle, and intersects with and extends perpendicular to axis 15 in this example.
[0017] Spaced-apart central thrusters 24 and 26 are positioned between the front 11c and rear 11d of the underwater vehicle 10 and between side planar members 13a and 13b. The central thrusters generate thrust vectors 29 and 31 aligned parallel to x-axis 15. Spaced-apart rearward thrusters 30 and 32 are positioned near the rear 11d of the underwater vehicle adjacent to side planar members 13a and 13b, respectively. The rearward thrusters generate thrust vectors 37 and 39 aligned parallel to axis 27 of the underwater vehicle. As seen in Figure 3, the omnidirectional cameras 16 and 18 are axially spaced outwards from the thrusters 24, 26, 28, 30 and 32.
[0018] The underwater vehicle 10 also has a lateral or y-axis 35. Axis 35 is positioned between the front 11c and rear 11d of the underwater vehicle and is positioned between the top 11a and bottom 11b of the underwater vehicle. Axis 35 extends between the sides 11e and 11f of the underwater vehicle, and intersects with and extends perpendicular to axis 27 in this example. Axis 35 also extends perpendicular to axis 15 in this example.
[0019] The thrusters 24, 26, 28, 30, and 32 are employed to provide the underwater vehicle 10 with up to six degrees of freedom of motion. The thrusters so configured and positioned enable the underwater vehicle to travel along the x-axis 15, y-axis 35, and z-axis 27, i.e. move up, down, left, right, forward, and backward. The thrusters 24, 26, 28, 30 and 32 so configured and positioned also enable the underwater vehicle 10 to rotate about the x-axis, y-axis, and z-axis, i.e. roll 41, pitch 43, and yaw 45.
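Purely as an illustration of how commands along and about these axes could be distributed over the thrusters described above, the sketch below maps normalized surge, heave, roll, pitch and yaw commands onto one forward vertical thruster, two central longitudinal thrusters and two rearward vertical thrusters; the thruster names, signs and gains are assumptions and not the actual control law of the vehicle.

```python
# Illustrative sketch (not the actual control law): a simple static allocation
# from body-axis commands to the five thrusters described above.

def allocate_thrust(surge: float, heave: float,
                    roll: float, pitch: float, yaw: float) -> dict:
    """Each input is a normalized command in [-1, 1]; outputs are clamped."""
    commands = {
        # central thrusters (parallel to x-axis 15): shared surge, differential yaw
        "center_left":  surge - yaw,
        "center_right": surge + yaw,
        # forward vertical thruster (parallel to z-axis 27): heave plus pitch
        "forward":      heave + pitch,
        # rearward vertical thrusters: heave, opposing pitch, differential roll
        "rear_left":    heave - pitch + roll,
        "rear_right":   heave - pitch - roll,
    }
    return {name: max(-1.0, min(1.0, value)) for name, value in commands.items()}

# Example: gentle forward travel while pitching the nose down slightly.
print(allocate_thrust(surge=0.4, heave=0.0, roll=0.0, pitch=-0.1, yaw=0.0))
```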
[0020] Referring to Figure 3, the underwater vehicle 10 includes a microcontroller 34 which controls the movement thereof. The microcontroller is disposed within the housing 14 of the underwater vehicle. The microcontroller 34 controls thrust vectoring of the thrusters 24, 26, 28, 30, and 32 to result in desired travel along the x-axis 15, y-axis 35, and z-axis 27 seen in Figure 1 and/or rotation about the x-axis, y-axis, and z-axis, i.e. roll 41, pitch 43, and yaw 45. As seen in Figure 4, the microcontroller 34 is part of an on-board processor 33.
[0021] Still referring to Figure 1, the underwater vehicle 10 includes a first omnidirectional camera 16 and a second omnidirectional camera 18. The first and second omnidirectional cameras provide the underwater vehicle with omnidirectional vision capability. The first omnidirectional camera 16 and the second omnidirectional camera 18 are each mounted on the housing 14 of the underwater vehicle 10. In this example, the first omnidirectional camera is mounted on a first end 20 of the housing 14 of the underwater vehicle and the second omnidirectional camera is mounted on a second end 22 of the housing 14 of the underwater vehicle 10. However, in other examples, an omnidirectional camera may be otherwise mounted on the housing. The first end 20 of the housing 14 of the underwater vehicle 10 and the second end 22 of the housing 14 of the underwater vehicle 10 are opposite. The omnidirectional cameras 16 and 18 are accordingly capable of generating a 360 degree image about the x-axis 15, y-axis 35, and z-axis 27. The omnidirectional cameras accordingly provide an omnidirectional field of view. In other examples, instead of an omnidirectional camera, a plurality of cameras may be employed to provide an omnidirectional field of view.
[0022] Referring now to Figure 4, the underwater vehicle 10 is remotely controlled by an operator (not shown) from a control station 50. The control station generally includes a processor 52, which in this example is part of a computer terminal 53 of the control station. The processor is in communication with the microcontroller 34 of the underwater vehicle 10 as best shown in Figure 4. The control station 50 further includes a video display, in this example a head-mounted display device 54, which may be a virtual reality headset. The control station also includes an input device 56, which may be a joystick or gamepad controller for example.
[0023] An image generated by the first omnidirectional camera 16 and/or the second omnidirectional camera 18 is transmitted to the processor 52 of the control station 50. The image is displayed in real-time on the head-mounted display device 54 which is worn by the operator and/or the display screen 58 depending on a mode of operation. The field of view of the first omnidirectional camera 16 and the second omnidirectional camera 18 may be coupled to the orientation of the head-mounted display device 54 by camera controls and, in this example, a front of the underwater vehicle 10 is oriented with a center of the field of view. The field of view of the first omnidirectional camera 16 and the second omnidirectional camera 18 may alternatively be coupled to the orientation of the underwater vehicle with a front of the underwater vehicle 10 being oriented with a center of the field of view.
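As a purely illustrative sketch of how a displayed field of view could be derived from an omnidirectional image and the orientation of the head-mounted display device, the code below crops a viewport out of an equirectangular panorama for a given headset yaw and pitch; the equirectangular format, the field-of-view values and the function name are assumptions.

```python
# Illustrative sketch: select the viewport of an equirectangular panorama that
# corresponds to a given headset yaw (degrees, 0 = vehicle front) and pitch.

import numpy as np

def viewport(panorama: np.ndarray, yaw_deg: float, pitch_deg: float,
             h_fov_deg: float = 90.0, v_fov_deg: float = 60.0) -> np.ndarray:
    height, width = panorama.shape[:2]           # panorama spans 360 x 180 degrees
    px_per_deg_x = width / 360.0
    px_per_deg_y = height / 180.0
    cx = int((yaw_deg % 360.0) * px_per_deg_x)
    cy = int((90.0 - pitch_deg) * px_per_deg_y)  # pitch +90 = straight up
    half_w = int(h_fov_deg / 2.0 * px_per_deg_x)
    half_h = int(v_fov_deg / 2.0 * px_per_deg_y)
    cols = [(cx + dx) % width for dx in range(-half_w, half_w)]   # wrap around in yaw
    rows = np.clip(np.arange(cy - half_h, cy + half_h), 0, height - 1)
    return panorama[np.ix_(rows, cols)]

# Example with a dummy 1024 x 2048 frame, operator looking 30 deg right, 10 deg up.
frame = np.zeros((1024, 2048, 3), dtype=np.uint8)
view = viewport(frame, yaw_deg=30.0, pitch_deg=10.0)
print(view.shape)
```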
[0024] The operator is able to pan and tilt by changing their physical orientation, for example, by turning their head “side to side” or looking “up” and “down”. The camera controls cause the underwater vehicle 10 to roll, pitch, and yaw in response to changes in the physical orientation of the operator. The operator is also able to navigate the underwater vehicle 10 using the input device 56 to input navigation commands to navigation controls which cause the underwater vehicle 10 to travel in response to the navigation commands. Alternatively, navigation commands may be determined by one or more previously inputted waypoints. The camera controls and navigation controls are synchronized. Accordingly, in this first mode of operation, the camera controls control the orientation of the underwater vehicle 10 and the navigation controls control the direction of travel of the underwater vehicle 10. It is optionally possible to override the camera controls, and control both orientation and direction of travel of the underwater vehicle 10 using the navigation controls, in a second mode of operation.
[0025] As seen in Figure 4, omnidirectional cameras 16 and 18 thus send image data or video signals to a video processing module 34a of the on-board processor 33 which processes the same. The processor includes a video signal sending transceiver 34b which receives video signals so processed by module 34a and sends this to a video signal receiving module or transceiver 53a of computer terminal 53.
[0026] A video processing module 53b of the computer terminal receives and processes the image data so received by transceiver 53a and sends this data via a video signal sending module or transceiver 53c to an image signal receiving module or transceiver 54a of head mounted display device 54. The display device includes an image display 54c which receives the image data so transmitted after being processed by signal processing module 54b.
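A minimal sketch of the relay role played by the computer terminal follows, under the assumption that each stage of the video path can be represented by a simple callable; the transport objects merely stand in for modules 53a, 53b and 53c and are not taken from the disclosure.

```python
# Illustrative sketch: the terminal receives frames from the vehicle link,
# runs its video processing step, and forwards the result toward the headset.

from typing import Callable, Optional
import queue

def relay_video(receive: Callable[[], Optional[bytes]],
                process: Callable[[bytes], bytes],
                send: Callable[[bytes], None]) -> None:
    """Forward frames until the receive side reports end of stream (None)."""
    while True:
        frame = receive()            # e.g. next encoded frame from transceiver 53a
        if frame is None:
            break
        send(process(frame))         # processed frame on toward transceiver 53c

# Example wiring with an in-memory queue standing in for the vehicle link.
inbound: "queue.Queue[Optional[bytes]]" = queue.Queue()
inbound.put(b"frame-0")
inbound.put(None)
relay_video(inbound.get, lambda f: f, lambda f: print("forwarded", len(f), "bytes"))
```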
[0027] Thus, on-board processor 33 has both a video and control communications link to computer terminal 53. Internals of the computer terminal are comprised of video receiving and processing blocks that allows the omnidirectional view of the underwater vehicle to be viewed by the operator on either a virtual reality headset or video display depicted in the computer terminal.
[0028] Manual control of the vehicle is enabled by the tactile input control block 56a. Operator-initiated commands via these controls are forwarded via signal transmission module or transceiver 56b to the control logic processor 53d within the computer terminal 53. During manual control of the vehicle, the operator can input navigational commands via the input devices within block 56.
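As a purely illustrative sketch of this tactile input path, the mapping below converts joystick or gamepad axes into navigation commands of the kind forwarded to the control logic processor; the axis assignment and dead-band value are assumptions.

```python
# Illustrative sketch: map raw gamepad axes (each in [-1, 1]) to navigation
# commands, with a small dead band so the sticks rest at zero.

def axes_to_commands(left_x: float, left_y: float,
                     right_y: float, dead_band: float = 0.05) -> dict:
    def shaped(value: float) -> float:
        return 0.0 if abs(value) < dead_band else value
    return {
        "surge": shaped(-left_y),   # push stick forward to move forward
        "sway":  shaped(left_x),    # lateral translation
        "heave": shaped(-right_y),  # second stick for up/down
    }

# Example: stick pushed halfway forward.
print(axes_to_commands(left_x=0.0, left_y=-0.5, right_y=0.0))
```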
[0029] During visual control of the vehicle, the operator orients the head-mounted display device 54 to provide yaw, pitch and roll commands which are interpreted by the orientation sensing block 54d. The received orientation commands are forwarded to the control logic processor 53d via signal processing module 54e and signal transmission module or transceiver 54f of the head-mounted display device 54, and orientation receiving module or transceiver 53e of the computer terminal 53.
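For illustration only, the sketch below turns a headset orientation into proportional yaw, pitch and roll rate commands of the kind an orientation sensing block could produce; the quaternion convention, Euler sequence and gain are assumptions rather than the disclosed implementation.

```python
# Illustrative sketch: turn a headset orientation (unit quaternion) into roll,
# pitch and yaw rate commands that steer the vehicle toward that orientation.

import math

def quaternion_to_euler(w: float, x: float, y: float, z: float):
    """Return (roll, pitch, yaw) in radians for a unit quaternion (w, x, y, z)."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    pitch = math.asin(max(-1.0, min(1.0, 2.0 * (w * y - z * x))))
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

def orientation_commands(headset_quat, vehicle_euler, gain: float = 0.8):
    """Proportional rate commands that close the gap between headset and vehicle."""
    target = quaternion_to_euler(*headset_quat)
    def wrap(angle: float) -> float:          # keep errors in (-pi, pi]
        return math.atan2(math.sin(angle), math.cos(angle))
    return tuple(gain * wrap(t - v) for t, v in zip(target, vehicle_euler))

# Example: headset yawed roughly ten degrees right of the vehicle's heading.
print(orientation_commands((0.996, 0.0, 0.0, 0.087), (0.0, 0.0, 0.0)))
```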
[0030] During autonomous control of the vehicle 10, the autonomous object localization block 53f of the computer terminal 53 determines vehicle commands based on the processing of the video signals from video signal transceiver 53a to determine the relative location of the vehicle to desired waypoints 53g and possible path obstructions. Provided the preprogrammed navigation path has a solution, the autonomous control will provide commands to the control logic processor 53d. In the autonomous mode of operation, the operator is strictly an observer and is able to orient the head-mounted display device 54 in any desirable orientation in order to see surrounding events or objects within the omnidirectional camera 16, 18 field of view.
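By way of illustration only, the following sketch steers toward the next waypoint and hands control back to the operator when an obstruction is sensed; the data shapes, thresholds and function name are assumptions, not the disclosed guidance algorithm.

```python
# Illustrative sketch: steer toward the next waypoint unless the localization
# step reports an obstruction, in which case control is handed to the operator.

import math
from typing import List, Optional, Tuple

Waypoint = Tuple[float, float, float]            # x, y, z in metres

def next_command(position: Waypoint, heading: float,
                 waypoints: List[Waypoint],
                 obstruction_ahead: bool,
                 capture_radius: float = 1.0) -> Optional[dict]:
    if obstruction_ahead:
        return None                              # alert operator, await guidance
    while waypoints and math.dist(position, waypoints[0]) < capture_radius:
        waypoints.pop(0)                         # waypoint reached, advance
    if not waypoints:
        return {"surge": 0.0, "yaw": 0.0}        # path complete, hold position
    tx, ty, _ = waypoints[0]
    bearing = math.atan2(ty - position[1], tx - position[0])
    yaw_error = math.atan2(math.sin(bearing - heading), math.cos(bearing - heading))
    return {"surge": 0.5, "yaw": 0.5 * yaw_error}

# Example: one waypoint ahead and slightly to the left, no obstruction sensed.
print(next_command((0.0, 0.0, 5.0), 0.0, [(10.0, 2.0, 5.0)], obstruction_ahead=False))
```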
[0031] During autonomous control, the vehicle 10 will alert the operator of any impediments to the solution of the guided path. The operator is thus able, via either visual or manual control, to resolve the navigational impediment.
[0032] Navigation commands, originating from either the input device 56, the head-mounted display device 54 or the localization block 53f, are mixed and selected for forwarding to the vehicle 10 via the control logic processor 53d. Depending on the mode of navigation, the processed commands are immediately communicated by the control logic processor to the thruster control 34d via control signal input module or transceiver 34c, for selectively actuating and controlling thrusters 24, 26, 28, 30, and 32. This link may be referred to as the command path. The features shown in box 10 of Figure 4 may be referred to as an on-board control system.
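A minimal sketch of how commands from the three sources could be mixed and selected before being forwarded along the command path is given below; the mode names and merge rule are assumptions rather than the disclosed logic.

```python
# Illustrative sketch: choose which command source feeds the thrusters for the
# current mode of navigation.

from enum import Enum, auto
from typing import Dict, Optional

class Mode(Enum):
    MANUAL = auto()       # input device only (camera controls overridden)
    VISUAL = auto()       # headset steers orientation, input device steers travel
    AUTONOMOUS = auto()   # localization block supplies the commands

def select_commands(mode: Mode,
                    manual: Dict[str, float],
                    headset: Dict[str, float],
                    autonomous: Optional[Dict[str, float]]) -> Dict[str, float]:
    if mode is Mode.MANUAL:
        return manual
    if mode is Mode.VISUAL:
        merged = dict(headset)        # roll / pitch / yaw keys from the headset...
        merged.update(manual)         # ...translation keys from the input device
        return merged
    return autonomous or {}           # autonomous; empty dict means "hold"

# Example: visual mode, operator looking right while commanding forward travel.
print(select_commands(Mode.VISUAL, {"surge": 0.4}, {"yaw": 0.2}, None))
```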
[0033] A control logic 100 of the underwater vehicle 10 is shown in Figure 5. The processor 52 determines at decision block 110 if the camera controls have been overridden, as shown by camera override block 56c in Figure 4. Referring to Figure 5, the camera controls will orient the front of the underwater vehicle 10 with a center of the field of view at block 120 if the camera controls have not been overridden. There is thus provided a method of controlling movement of the underwater vehicle including providing camera controls, generally shown within boxes 10 and 54 in Figure 4, which associate the front 11c of the underwater vehicle seen in Figure 1 with a center of a field of view of the head-mounted display device 54 seen in Figure 4. The camera controls cause the underwater vehicle to roll, pitch or yaw in response to movement of and to maintain orientation with the head-mounted display device. This is the first mode of operation.
[0034] Referring back to Figure 5, the processor 52 then determines at decision block 130 if any navigation commands have been inputted. The control logic 100 will continuously loop to orient the front of the underwater vehicle with a center of the field of view if navigation commands have not been inputted. However, if navigation commands have been inputted, the navigation controls will navigate the underwater vehicle at block 140. The method thus includes determining via the processor whether both the camera controls are active and navigation commands have been inputted. If so, navigation controls, as generally shown by the components within boxes 53 and 10 in Figure 4, are enabled to selectively navigate the underwater vehicle in forward, reverse and lateral directions.
[0035] The method includes either using input device 56 seen in Figure 3 to input the navigation commands or using one or more previously-inputted waypoints for determining the navigation commands.
[0036] Referring back to Figure 5, the processor 52 then determines at decision block 150 if there has been a change in the field of view, i.e. if the physical orientation of the operator has changed in the first mode of operation. The control logic 100 will continuously loop to navigate the underwater vehicle 10 if there has been no change in the field of view. The method thus includes monitoring the navigation controls if no movement of the head-mounted display device is detected.
[0037] However, if there has been a change in the field of view, the control logic 100 will loop to determine if the camera controls have been overridden at decision block 110. The method includes determining via the processor 52 seen in Figure 4 whether the camera controls, generally seen within boxes 10 and 53, have been overridden upon detecting movement of the head-mounted display device 54.
[0038] Referring back to Figure 5, the control logic 100 will continue to operate in the first mode of operation until it is determined at decision block 110 that the camera controls have been overridden. The control logic 100 will then operate in the second mode of operation in which only the navigation controls control both orientation and direction of travel of the underwater vehicle 10.
[0039] The method thus includes determining via the processor of the control station whether the camera controls are overridden and navigation commands have been inputted and, if so, enabling the navigation controls to selectively navigate the underwater vehicle in forward, reverse, lateral, upward and downward directions. The method therefore includes operating the underwater vehicle solely via said navigation controls by overriding the camera controls.
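Purely as an illustrative sketch of the loop shown in Figure 5 (decision blocks 110, 130 and 150 and blocks 120 and 140), the code below wires the decisions together; every helper passed in is a hypothetical stand-in rather than an actual function of the disclosed system.

```python
# Illustrative sketch of the Figure 5 loop; all helpers are hypothetical stand-ins.

def control_loop(camera_overridden, orient_front_to_view,
                 navigation_commands_present, navigate, view_changed,
                 keep_running=lambda: True):
    """One pass per outer iteration; runs until keep_running() is False."""
    while keep_running():
        if not camera_overridden():                   # decision block 110
            orient_front_to_view()                    # block 120: first mode
        if not navigation_commands_present():         # decision block 130
            continue                                  # loop back and keep orienting
        navigate()                                    # block 140
        while keep_running() and not view_changed():  # decision block 150
            navigate()                                # no view change: keep navigating
        # a view change returns control to the override check at block 110

# Example: run a few passes with trivial stand-ins.
ticks = iter(range(3))
control_loop(
    camera_overridden=lambda: False,
    orient_front_to_view=lambda: print("orienting front to view"),
    navigation_commands_present=lambda: True,
    navigate=lambda: print("navigating"),
    view_changed=lambda: True,
    keep_running=lambda: next(ticks, None) is not None,
)
```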
[0040] In this case there is provided a method of autonomously controlling movement and navigation of an underwater vehicle, including controlling roll, pitch or yaw in a direction of travel based on a plurality of previously-inputted navigation commands. The navigation commands may comprise one or more waypoints. The method includes subsequently guiding movement of the vehicle by object localization. The object localization may comprise one or more visual or other technological means for location awareness.
[0041] The method includes adjusting an observed field of view, independent of a navigational field of view, by adjusting orientation of the head-mounted display device 54 seen in Figure 4. The method includes autonomous navigation proceeding until such time as the control system, as generally seen by the features within box 10 of Figure 4, requires further navigation command input or obstruction guidance by the operator.
[0042] Three modes of vehicle guidance have thus herein been described: 1) manual input device (e.g. joystick) - whereby 'hand' controls guide the craft; 2) head-mounted display orientation (e.g. virtual reality (VR) headset) - whereby visual fields of view guide the craft; and 3) autonomous localization - whereby autonomous systems guide the craft on a pre-set path, where the pre-set path can be entered via highlighting visual targets in the VR/Display. In autonomous mode the head-mounted display view and the vehicle navigation path are decoupled, and the craft proceeds on a path while the operator is able to observe the surroundings in any orientation of view possible with the cameras.
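Purely as an illustration, the three guidance modes could be represented in software as an enumeration recording which source steers the craft; the Python names below are hypothetical and do not appear in the application.

    from enum import Enum, auto

    class GuidanceMode(Enum):
        MANUAL_INPUT = auto()          # 1) joystick: 'hand' controls guide the craft
        HEAD_MOUNTED_DISPLAY = auto()  # 2) VR headset orientation guides the craft
        AUTONOMOUS = auto()            # 3) pre-set path; view and path are decoupled

    def steering_source(mode):
        # Hypothetical mapping from guidance mode to what steers the craft.
        return {
            GuidanceMode.MANUAL_INPUT: "input device (joystick) commands",
            GuidanceMode.HEAD_MOUNTED_DISPLAY: "head-mounted display orientation",
            GuidanceMode.AUTONOMOUS: "pre-set path entered by highlighting visual targets",
        }[mode]

    print(steering_source(GuidanceMode.AUTONOMOUS))
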
ADDITIONAL DESCRIPTION
[0043] Examples of systems comprising an underwater vehicle and a control station therefor have been described. The following clauses are offered as further description.
(1) A system comprising: an underwater vehicle provided with an omnidirectional field of view, the underwater vehicle having a camera, a plurality of thrusters, and a microcontroller that controls vector thrusting of the thrusters to move the underwater vehicle; and a control station to allow an operator to control movement of the underwater vehicle, the control station including a processor, a head mounted display device in communication with the processor, and an input device in communication with the processor, wherein the processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to change the omnidirectional field of view of the underwater vehicle based on an orientation of the head mounted display device worn by the operator, and the processor synchronously signals the microcontroller to move the underwater vehicle in a direction of travel based on a navigation command inputted by the operator using the input device.
(2) A method of controlling movement of an underwater vehicle provided with an omnidirectional field of view, the method comprising: controlling the underwater vehicle to roll, pitch, or yaw to change the omnidirectional field of view of the underwater vehicle based on an orientation of a head mounted display device worn by an operator; and controlling the underwater vehicle to move in a direction of travel based on a navigation command inputted by the operator, wherein the underwater vehicle is synchronously controlled to move in the direction of travel and roll, pitch, or yaw.
(3) A system comprising: an underwater vehicle provided with an omnidirectional field of view, the underwater vehicle having a camera, a plurality of thrusters, and a microcontroller that controls vector thrusting of the thrusters to move the underwater vehicle; and a control station to allow an operator to control movement of the underwater vehicle, the control station including a processor, a display screen in communication with the processor, and an input device in communication with the processor, wherein the processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to change the omnidirectional field of view of the underwater vehicle based on a navigation command inputted by the operator using the input device, and the processor synchronously signals the microcontroller to move the underwater vehicle in a direction of travel based on a navigation command inputted by the operator using the input device.
[0044] It will also be understood by a person skilled in the art that many of the details provided above are by way of example only, and are not intended to limit the scope of the invention which is to be determined with reference to the following claims.

Claims

WHAT IS CLAIMED IS:
1. A system comprising an underwater vehicle and a control station configured to control movement of the underwater vehicle, the underwater vehicle including one or more thrusters, and a microcontroller that controls vector thrusting of the one or more thrusters to move the underwater vehicle, and the underwater vehicle including one or more cameras configured to provide an omnidirectional field of view of the environment surrounding the vehicle, and the control station including a processor and a head-mounted display device in communication with the processor, the processor signaling the microcontroller to roll, pitch, or yaw the underwater vehicle to change a navigated orientation thereof to align with an observed orientation of the head-mounted display device as derived from the omnidirectional field of view of the underwater vehicle.
2. The system as claimed in claim 1 wherein the control station allows an operator to control navigated said movement of the underwater vehicle and wherein the processor signals the microcontroller to roll, pitch, or yaw or otherwise adjust a navigated direction of the underwater vehicle to align with the observed orientation within the omnidirectional field of view of the underwater vehicle based on an observed direction and the observed orientation of the head-mounted display device as worn by the operator.
3. The system as claimed in claim 1 wherein the underwater vehicle includes at least one central said thruster positioned between a front thereof and a rear thereof, at least one forward said thruster positioned adjacent said front of the underwater vehicle, and at least one rear said thruster positioned adjacent said rear of the underwater vehicle.
4. The system as claimed in claim 1 wherein the underwater vehicle includes a pair of spaced-apart central said thrusters positioned between a front thereof and a rear thereof, one said thruster positioned adjacent a first of said front and said rear of the underwater vehicle, and a pair of said thrusters positioned adjacent a second of said front and said rear of the underwater vehicle.
5. The system as claimed in claim 1 wherein the underwater vehicle has an elongate housing and includes a first omnidirectional said camera mounted on a first end of the housing and a second omnidirectional said camera mounted on a second end of the housing.
6. The system as claimed in claim 5 wherein the housing has a longitudinal axis and wherein the omnidirectional said cameras are axially spaced outwards from the one or more thrusters.
7. The system as claimed in claim 1 wherein the system further includes an input device in communication with the processor, and wherein the processor synchronously signals the microcontroller to move the underwater vehicle in a direction of travel based on a navigation command inputted by an operator using the input device.
8. The system as claimed in claim 1, wherein the underwater vehicle has a first axis which extends between a front thereof and a rear thereof, a second axis which extends between a top thereof and a bottom thereof, and a third axis which extends between spaced-apart sides thereof, wherein the system further includes an input device in communication with the processor, and wherein the microcontroller controls thrust vectoring of the one or more thrusters to result in desired travel at least along one or more of the first axis, the second axis, and the third axis based on a navigation command inputted using the input device.
9. A system comprising an underwater vehicle and a control station configured to control movement of the underwater vehicle, the underwater vehicle including one or more thrusters, one or more cameras configured to provide an omnidirectional field of view, and a microcontroller that controls vector thrusting of the one or more thrusters to move the underwater vehicle, and the control station including a processor and a head-mounted display device in communication with the processor, the processor signaling the microcontroller to roll, pitch, or yaw the underwater vehicle to change the omnidirectional field of view of the underwater vehicle based on an orientation of the head-mounted display device.
10. A system comprising an underwater vehicle and a control station configured to control movement of the underwater vehicle, the underwater vehicle including one or more thrusters, one or more cameras configured to provide an omnidirectional field of view, and a microcontroller that controls vector thrusting of the one or more thrusters to move the underwater vehicle, and the control station including a processor, a display screen in communication with the processor, and an input device in communication with the processor, wherein the processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to align a navigated direction with an operator observed direction derived from the omnidirectional field of view of the underwater vehicle based on a navigation command inputted using the input device, and the processor synchronously signals the microcontroller to move the underwater vehicle in a direction of travel based on a navigation command inputted by an operator using the input device.
11. A system comprising an underwater vehicle and a control station configured to control movement of the underwater vehicle, the underwater vehicle including one or more thrusters, one or more cameras configured to provide an omnidirectional field of view, and a microcontroller that controls vector thrusting of the one or more thrusters to move the underwater vehicle, and the control station including a processor, a display screen in communication with the processor, and an input device in communication with the processor, wherein the processor signals the microcontroller to roll, pitch, or yaw the underwater vehicle to change the omnidirectional field of view of the underwater vehicle based on a navigation command inputted using the input device, and the processor synchronously signals the microcontroller to move the underwater vehicle in a direction of travel based on a navigation command inputted by an operator using the input device.
12. A method of controlling movement of an underwater vehicle provided with an omnidirectional field of view, the method comprising: a) providing camera controls which associate a front of the underwater vehicle with a center of a field of view of a head-mounted display device, whereby said camera controls cause the underwater vehicle to roll, pitch or yaw in response to movement of and to maintain orientation with the head-mounted display device; b) determining via a processor of a control station whether both the camera controls are active and navigation commands have been inputted and if so, enabling navigation controls to selectively navigate the underwater vehicle in forward, reverse and lateral directions; and c) determining via the processor of the control station whether the camera controls are overridden and navigation commands have been inputted and if so, enabling the navigation controls to selectively navigate the underwater vehicle in said forward direction, said reverse direction, said lateral directions, an upward direction and a downward direction.
13. The method as claimed in claim 12 further including within step c: determining via the processor whether the camera controls have been overridden upon detecting movement of the head-mounted display device.
14. The method as claimed in claim 12 further including within step b: monitoring the navigation controls if no movement of the head-mounted display device is detected.
15. The method as claimed in claim 12 further including operating the underwater vehicle solely via said navigation controls by overriding the camera controls.
16. The method as claimed in claim 12 further including: using an input device to input said navigation commands.
17. The method as claimed in claim 12 further including: using one or more previously-inputted waypoints for determining said navigation commands.
18. A method of controlling movement of an underwater vehicle provided with an omnidirectional field of view, the method comprising: controlling the underwater vehicle to roll, pitch, or yaw to align direction of travel with the omnidirectional field of view of the underwater vehicle based on an orientation of a head-mounted display device; and controlling the underwater vehicle to move in a direction of travel based on a navigation command, wherein the underwater vehicle is synchronously controlled to move in the direction of travel and roll, pitch, or yaw.
19. The method as claimed in claim 18 wherein the head-mounted display device is worn by an operator and wherein the navigation command is inputted by the operator via an input device.
20. The method as claimed in claim 18 wherein the navigation command is determined by one or more previously-inputted waypoints.
21. A method of controlling movement of an underwater vehicle provided with an omnidirectional field of view, the method comprising: controlling the underwater vehicle to roll, pitch, or yaw to change the omnidirectional field of view of the underwater vehicle based on an orientation of a head-mounted display device; and controlling the underwater vehicle to move in a direction of travel based on a navigation command, wherein the underwater vehicle is synchronously controlled to move in the direction of travel and roll, pitch, or yaw.
22. A method of autonomously controlling movement and navigation of an underwater vehicle having one or more cameras providing an omnidirectional field of view and including an on-board control system, the method comprising: controlling roll, pitch or yaw in a direction of travel based on a plurality of previously-inputted navigation commands; subsequently guiding movement of the vehicle by object localization; adjusting an observed said field of view, independent of a navigational said field of view, by adjusting orientation of a head-mounted display device worn by an operator; and autonomous navigation proceeding until such time as the control system requires further navigation command input or obstruction guidance by the operator.
23. The method as claimed in claim 22 wherein the navigation commands comprise one or more waypoints, and wherein the object localization comprises one or more visual or other technological means for location awareness.
PCT/CA2020/050520 2019-04-18 2020-04-17 Underwater vehicle with an omnidirectional camera, and method of controlling movement of the same WO2020210918A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962836058P 2019-04-18 2019-04-18
US62/836,058 2019-04-18

Publications (1)

Publication Number Publication Date
WO2020210918A1 true WO2020210918A1 (en) 2020-10-22

Family

ID=72836878

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2020/050520 WO2020210918A1 (en) 2019-04-18 2020-04-17 Underwater vehicle with an omnidirectional camera, and method of controlling movement of the same

Country Status (1)

Country Link
WO (1) WO2020210918A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
RU2760985C1 (en) * 2021-02-04 2021-12-02 Федеральное государственное бюджетное образовательное учреждение высшего образования Иркутский государственный университет путей сообщения (ФГБОУ ВО ИрГУПС) Multifunctional device for deep-sea monitoring of the underwater environment and underwater technical works
CN114180015A (en) * 2022-01-02 2022-03-15 天津瀚海蓝帆海洋科技有限公司 Middle-sized deep sea open-frame type ARV

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010123380A2 (en) * 2009-04-24 2010-10-28 Sperre As Underwater vessel with improved propulsion and handling
US20160301866A1 (en) * 2015-04-10 2016-10-13 Samsung Electronics Co., Ltd. Apparatus and method for setting camera
US20170278262A1 (en) * 2014-07-31 2017-09-28 Sony Corporation Information processing device, method of information processing, and image display system

Similar Documents

Publication Publication Date Title
EP3272586B1 (en) Work vehicle
KR102548491B1 (en) An automatic location placement system
US8447440B2 (en) Autonomous behaviors for a remote vehicle
US20200097021A1 (en) Autonomous Farm Equipment Hitching To A Tractor
US20100292868A1 (en) System and method for navigating a remote control vehicle past obstacles
JP4012749B2 (en) Remote control system
WO2020210918A1 (en) Underwater vehicle with an omnidirectional camera, and method of controlling movement of the same
JP4108629B2 (en) Underwater robot operation method and underwater robot operation system
JP5366711B2 (en) Semi-autonomous vehicle remote control system
US20230097676A1 (en) Tactical advanced robotic engagement system
KR101151787B1 (en) Remotely controlled robot system with enhanced heading determination performance and remote controller in the system
US8466959B2 (en) Vehicle video control system
JP7024997B2 (en) Aircraft Maneuvering System and How to Maneuver an Aircraft Using an Aircraft Maneuvering System
EP3288828B1 (en) Unmanned aerial vehicle system and method for controlling an unmanned aerial vehicle
Jarvis A tele-autonomous heavy duty robotic lawn mower
JP2018164223A (en) Display system
JPH08202445A (en) Controller of autonomous underwater robot
Ross et al. High performance teleoperation for industrial work robots
WO2024024535A1 (en) Information processing method, information processing device, and movable body control system
EP2147386A1 (en) Autonomous behaviors for a remote vehicle
Jarvis Tele-autonomous watercraft navigation
JP2023177286A (en) Information processing device, information processing system, information processing method, and program
JP2023113717A (en) flight control system
Högström et al. A semi autonomous robot with rate gyro supported control and a video camera
Mononen et al. Teleoperation and autonomous guidance systems for off-road vehicles

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20791252; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 20791252; Country of ref document: EP; Kind code of ref document: A1)