US20210141071A1 - Video imaging using multi-ping sonar - Google Patents

Video imaging using multi-ping sonar Download PDF

Info

Publication number
US20210141071A1
US20210141071A1 (application Ser. No. 16/727,198)
Authority
US
United States
Prior art keywords
sonar
series
pings
ping
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/727,198
Inventor
Blair Cunningham
Charlie Pearson
Martyn Sloss
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Coda Octopus Group Inc
Original Assignee
Coda Octopus Group Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Coda Octopus Group Inc filed Critical Coda Octopus Group Inc
Priority to US16/727,198
Assigned to CODA OCTOPUS GROUP INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CUNNINGHAM, BLAIR GRAEME, PEARSON, CHARLIE, SLOSS, MARTYN
Publication of US20210141071A1
Assigned to CODA OCTOPUS GROUP, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: McFadzean, Angus
Priority to US17/493,638
Priority to US17/674,514
Current legal status: Abandoned

Classifications

    • G10K 11/34 Sound-focusing or directing, e.g. scanning, using electrical steering of transducer arrays, e.g. beam steering
    • G01S 13/862 Combination of radar systems with sonar systems
    • G01S 15/107 Systems for measuring distance only using transmission of interrupted, pulse-modulated waves, using frequency agility of carrier wave
    • G01S 15/42 Simultaneous measurement of distance and other co-ordinates
    • G01S 15/89 Sonar systems specially adapted for mapping or imaging
    • G01S 15/8902 Side-looking sonar
    • G01S 15/8925 Short-range pulse-echo imaging systems using a static two-dimensional transducer array (matrix or orthogonal linear arrays)
    • G01S 15/8977 Short-range pulse-echo imaging systems using special techniques for image reconstruction, e.g. FFT, geometrical transformations, spatial deconvolution, time deconvolution
    • G01S 17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S 7/52003 Techniques for enhancing spatial resolution of targets
    • G01S 7/52047 Techniques for image enhancement involving transmitter or receiver, for elimination of side lobes or of grating lobes; for increasing resolving power
    • G01S 7/524 Transmitters
    • G01S 7/526 Receivers
    • G01S 7/53 Means for transforming coordinates or for evaluating data, e.g. using computers
    • G01S 7/6245 Stereoscopic displays; three-dimensional displays; pseudo-three-dimensional displays

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Acoustics & Sound (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

A sonar system comprising a sonar transmitter, a very large two dimensional array sonar receiver, and a beamformer section transmits a series of sonar pings into an insonified volume of fluid at a rate greater than 5 pings per second, receives sonar signals reflected and scattered from objects in the insonified volume, and beamforms the reflected signals to provide a video presentation and/or to store the beamformed data for later use. The parameters controlling the sonar system are changed between pings to provide enhanced video imaging.

Description

    RELATED PATENTS AND APPLICATIONS
  • The following US patents and US patent applications are related to the present application: U.S. Pat. No. 6,438,071 issued to Hansen, et al. on Aug. 20, 2002; U.S. Pat. No. 7,466,628 issued to Hansen on Dec. 16, 2008; U.S. Pat. No. 7,489,592 issued Feb. 10, 2009 to Hansen; U.S. Pat. No. 8,059,486 issued to Sloss on Nov. 15, 2011; U.S. Pat. No. 7,898,902 issued to Sloss on Mar. 1, 2011; U.S. Pat. No. 8,854,920 issued to Sloss on Oct. 7, 2014; and U.S. Pat. No. 9,019,795 issued to Sloss on Apr. 28, 2015; U.S. patent application Ser. Nos. 14/927,748 and 14/927,730 filed on Oct. 30, 2015, Ser. No. 15/978,386 filed on May 14, 2018, Ser. No. 15/908,395 filed on Feb. 28, 2018, Ser. No. 15/953,423 filed on Apr. 14, 2018, Ser. No. 16/693,684 filed Nov. 11, 2019, and 62/931,956 and 62/932,734 filed Nov. 7, 2019, Ser. No. 16/362,255 filed on Mar. 22, 2019, and 62/818,682 filed Mar. 14, 2019 and are also related to the present application. The above identified patents and patent applications are assigned to the assignee of the present invention and are incorporated herein by reference in their entirety including incorporated material.
  • FIELD OF THE INVENTION
  • The field of the invention is the field of generating and receiving sonar pulses and of visualization and/or use of data from sonar signals scattered from objects immersed in a fluid.
  • OBJECTS OF THE INVENTION
  • It is an object of the invention to improve visualization using sonar imaging. It is an object of the invention to measure and record the positions, orientations, and images of submerged objects. It is an object of the invention to improve the resolution of sonar images. It is an object of the invention to present sonar video images at increased video rates. It is an object of the invention to rapidly change the sonar image resolution between at least 2 pings of a series of pings. It is an object of the invention to rapidly change the direction of the field of view of sonar images between at least 2 pings of a series of pings.
  • SUMMARY OF THE INVENTION
  • A series of sonar pings is sent into an insonified volume of water and reflected or scattered from submerged object(s) in the insonified volume of water. One or more large sonar receiver arrays of sonar detectors are used to produce and analyze sonar data and to produce 3 dimensional images of the submerged object(s) for each ping. One or more parameters controlling the sonar imaging system are changed between pings to change the series of images. The resulting changed images are combined together to produce an enhanced video presentation of the submerged objects at an enhanced video frame rate of at least 5 frames per second. More than one of the parameters controlling the sonar imaging system can be used to produce different 3D images from the same ping in a time less than the time between two pings.
  • BRIEF DESCRIPTION OF THE DRAWING
  • FIG. 1 shows a sketch of the layout where the method of the invention may be used.
  • FIGS. 2A, 2B and 2C show side elevation, plan view and end elevation views of the sonar transmitter of the invention.
  • FIG. 3 shows possible configurations of the sonar transmitter of the invention.
  • FIGS. 4A and 4B show the sonar transmitter of the invention sending out pings in a 50 degree included angle and a 25 degree included angle.
  • FIGS. 5A, 5B and 5C show plan view, side elevation, and end elevation views of the sonar receiver of the invention.
  • FIG. 6 shows a flow chart of the method of the invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • It has long been known that data presented in visual form is much better understood by humans than data presented in the form of tables, charts, text, etc. However, even data presented visually as bar graphs, line graphs, maps, or topographic maps requires experience and training to interpret them. Humans can, however, immediately recognize and understand patterns in visual images which would be difficult for even the best and fastest computers to pick out. Much effort has thus been spent in turning data into images.
  • In particular, images which are generated from data which are not related to light are often difficult to produce and often require skill to interpret. One such type of data is sonar data, wherein a sonar signal pulse is sent out from a sonar generator into a volume of sea water or fresh water of a lake or river, and reflected sound energy from objects in the insonified volume is measured by a sonar receiver.
  • The field of underwater sonar imaging is different from the fields of medical ultrasonic imaging and imaging of underground rock formations because there are far fewer sonar reflecting surfaces in the underwater insonified volume. Persons skilled in the medical and geological arts would not normally follow the art of sonar imaging of such sparse targets. FIG. 1 shows a sketch of the system of the invention. A vessel 10 carrying the apparatus 11 of the invention is on the surface 14 of a body of water which we will call a part of a sea. The water rests on a seabed 13. It is understood that any fluid that supports sound waves may be investigated by the methods of the present invention. The apparatus 11 generally comprises a sonar ping transmitter (or generator) and a sonar receiver, but the sonar transmitter and receiver may be separated for special operations. Various sections of the apparatus are each controlled by controllers which determine parameters required for optimum operation of the entire system. In the present specification, a parameter is a specific value to be used which can be changed rapidly between pings. The parameters may be grouped in sets, and a set can be switched, either by hand or automatically according to a criterion. The decision to switch parameters may be made by an operator or made automatically based on information gained from prior pings sent out by the sonar transmitter or from the current ping. FIG. 1 shows the sonar transmitter sending out pulses of sound waves 12 which propagate into the water in an approximately cone shaped beam. The pulses 12 strike objects in the water such as stones 15 on the seabed 13, an underwater vessel 17, a swimming diver 18, and a sea wall 16. The vessel 17 may either be manned or be a remotely operated vessel (ROV). The objects underwater that have a different density than the sea water reflect pulses 19 as generally expanding waves back toward the apparatus 11.
  • The term "insonified volume" is known to one of skill in the art and is defined herein as a volume of fluid through which sound waves are directed. In the present invention, the sonar signal pulse of sound waves is called and defined herein as a ping, which is sent out from one or more sonar ping generators or transmitters, each of which insonifies a roughly conical volume of fluid. A sonar ping generator is controlled by a ping generator controller according to a set of ping generator parameters. Ping generator parameters comprise ping sonar frequency, ping sonar frequency variation during the ping pulse, ping rate, ping pulse length, ping power, ping energy, ping direction with respect to a ping generator axis, and 2 ping angles which determine a field of view of the objects. A ping generator preferably has a fixed surface of material 22 which is part of a sphere, but it may be shaped differently. Preferred ping generators of the invention are sketched in FIGS. 2 through 4. FIG. 2A shows a ping generator cross section 20 with piezo electric elements 21 sandwiched between electrically conducting materials 22 and 23. Material 25 between the piezo electric elements is electrically insulating. The electrically conducting material 22 is preferably a solid sheet of material which is grounded and is in contact with the seawater. Material 22 is thin enough that ultrasonic pressure waves can easily pass through it, but thick enough that water does not leak through it and get into the interior of the ping generator. The other end of the piezoelectric material elements 21 is energized by applying an ultrasonic frequency voltage to electrical elements 24 which are separated electrically from each other and which energize groups of piezo electric elements 21 to vibrate with the same phase and frequency. Wires 25 are sketched to show the electrical connections to the different segments 24. The plan view of the transmitter in FIG. 2B shows the elements 24 segmented into 9 segments. FIG. 3 shows other preferred segmentation schemes useful in the method of the invention. FIG. 4A shows the beam pattern of the outgoing sonar waves if all the elements 21 are energized with the same phase and frequency electrical signal. FIG. 4B shows the beam pattern of the outgoing sonar waves if only the elements 21 in the center section of FIG. 2B are energized with the same phase and frequency electrical signal. For the relative size and curvature of the surfaces 25 of FIG. 4A, the full beam has a divergence of 50 degrees and the restricted beam shown in FIG. 4B has a divergence of 25 degrees. By energizing appropriate combinations of electrodes, the beam may be sent out up, down, left, or right.
  • Ping generators of the prior art could send out a series of pings with a constant ping frequency during the ping. Ping frequencies varying in time during the sent out ping are known in the prior art. Changing the ping frequency pattern, duration, power, direction, and other ping parameters rapidly and/or automatically between pings in a series has not heretofore been proposed. One method of the invention anticipates that the results from a prior ping can be analyzed automatically by the system itself to determine the system parameters needed for the next ping, and that commands can be sent to the various system controllers in time to change the parameters for the next ping. When operating in a wide angle mode at a particular angle and range, for example, a new object anywhere in the field of view can signal the system controllers to send the next outgoing ping in the direction of the object, decrease the field of view around the new object, increase the number of pings per second according to a criterion based on the distance to the object, set the ping power to optimize conditions for the range of the object, etc. Most preferably, the system can be set to automatically change any or all system parameters to optimize the system either for anticipated changes or in reaction to unanticipated changes in the environment.
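  • As a non-limiting illustration of the inter-ping parameter adaptation described above, the following sketch (in Python) shows detections from a prior ping being used to choose the parameters for the next ping; the class, field names, and numeric criteria are hypothetical placeholders, not the disclosed implementation.
      from dataclasses import dataclass, replace

      @dataclass(frozen=True)
      class PingParams:
          frequency_hz: float     # ping sonar frequency
          pulse_length_s: float   # ping pulse length
          power_w: float          # ping power
          ping_rate_hz: float     # pings per second
          steer_az_deg: float     # ping direction relative to the ping generator axis
          steer_el_deg: float
          fov_deg: float          # included angle of the transmitted cone

      def next_ping_params(prev: PingParams, detections: list) -> PingParams:
          """Analyze detections from the prior ping and adapt the next ping:
          steer toward the nearest new object, narrow the field of view around
          it, and raise the ping rate for closer targets (placeholder criteria)."""
          if not detections:
              return prev                              # no new object: keep the same parameter set
          nearest = min(detections, key=lambda d: d["range_m"])
          return replace(
              prev,
              steer_az_deg=nearest["az_deg"],          # send the next ping in the direction of the object
              steer_el_deg=nearest["el_deg"],
              fov_deg=max(10.0, prev.fov_deg / 2),     # decrease the field of view around the object
              ping_rate_hz=min(20.0, 750.0 / nearest["range_m"]),  # criterion based on distance to the object
          )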
  • In a particularly preferred embodiment, the controller system may be set to change the sent out frequency alternately between a higher and a lower frequency. The resulting images alternate between a higher resolution and smaller field of view for the higher frequency, and a lower resolution and a larger field of view for the lower frequency. The alternate images may then be stitched after the receiver stage to provide a video stream at half the frame rate of the system available with unchanged parameters, but with higher central resolution and wider field of view, or at the same frame rate by stitching neighboring images.
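  • A minimal sketch of this alternating-frequency scheme follows; the frequencies are example values and the beamform and stitch routines are assumed placeholders.
      from itertools import cycle

      LOW_HZ, HIGH_HZ = 375_000.0, 750_000.0   # example values only

      def alternating_video(pings, beamform, stitch):
          """pings: iterable of raw ping data; beamform(ping, freq_hz) -> 3D image;
          stitch(wide_img, narrow_img) -> composite image with the wide field of
          view of the low-frequency image and the higher central resolution of
          the high-frequency image.  Stitching neighboring images in this way
          keeps the full frame rate."""
          prev_img, prev_freq = None, None
          for ping, freq in zip(pings, cycle([LOW_HZ, HIGH_HZ])):
              img = beamform(ping, freq)
              if prev_img is not None:
                  wide, narrow = (prev_img, img) if prev_freq == LOW_HZ else (img, prev_img)
                  yield stitch(wide, narrow)   # one composite image per ping after the first
              prev_img, prev_freq = img, freq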
  • Intelligent steering of the high-resolution, focused field of view onto a specific target of interest would mean that this technology would not necessarily be limited only to short range applications. If only one of the four steered pings, for example, needs to be continuously updated to generate real-time images, then the range limit could be significantly extended. The intelligent focusing may be implemented in a mode whereby a low-frequency, low-resolution ping with a large field of view is used to locate the target of interest. The subsequent high-frequency, high-resolution ping may then be directed to look specifically at the region of interest without having to physically steer the sonar head.
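  • The locate-then-focus mode may be sketched as follows; every function name, frequency, and angle is a hypothetical placeholder used only to illustrate the sequence of pings.
      def locate_then_focus(transmit, receive_and_image, find_target):
          """transmit(freq_hz, fov_deg, az_deg, el_deg) sends one ping;
          receive_and_image() returns the 3D image for that ping;
          find_target(image) returns (az_deg, el_deg) of a target of interest, or None."""
          transmit(freq_hz=375e3, fov_deg=50.0, az_deg=0.0, el_deg=0.0)  # low frequency, wide field of view
          overview = receive_and_image()
          target = find_target(overview)
          if target is None:
              return overview, None
          az, el = target
          transmit(freq_hz=750e3, fov_deg=12.0, az_deg=az, el_deg=el)    # high resolution, steered electronically
          return overview, receive_and_image()                           # no physical steering of the sonar head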
  • In this particularly preferred embodiment, additional intelligent and predictive processing and inter-frame alignment may be used to account for and track motion and moving objects. The priority of frame processing may be adapted to allow focus on, and a higher refresh rate for, images that include the primary target, for example with the field of view centered on the primary target, or for moving objects that require the images representing the portion of the field of view containing the moving object to be updated more frequently.
  • The sonar receiver of the invention is a large array of pressure measuring elements. The sonar receiver is controlled by a sonar receiver controller according to a set of sonar receiver parameters. The array is preferably arranged as a planar array as shown in FIG. 5 because it is simpler to construct, but may be shaped in any convenient form such as a concave or convex spherical form for different applications. The array preferably has 24 times 24 sonar detecting elements, or more preferably 48 times 48 elements, or even more preferably 64 times 64 detectors, or most preferably 128 times 128 elements. A square array of elements is preferred, but the array may be a rectangular array or a hexagonal array or any other convenient shape. The detector elements are generally constructed by sandwiching a piezo electric material between two electrically conducting materials as shown for the sonar transmitter, but with an electrical connection to each element in the array. When a reflected sonar ping reaches a sonar detecting element, the element is compressed and decompressed at the sonar ping frequency, and produces a nanovolt analog signal between the electrically conducting materials. The nanovolt signals are amplified and digitally sampled at a sonar receiver sampling rate controlled by the sonar receiver controller, and the resulting digital signal is compared to a signal related to the sent out ping signals to measure the phase and amplitude of the incoming sonar signals for each receiver element. The amplification or gain for the incoming sonar signals is controlled by the sonar receiver controller. If the sonar ping frequency is changed rapidly between pings, the sampling rate may also be changed to reflect the changed ping frequency. The incoming sonar ping is divided into consecutive slices of time, where the slice time is related to the slice length by the speed of sound in the water. A slice time parameter is set by the sonar receiver controller. For example, pings arriving from more distant objects can have wider slices than ping reflections from closer objects. Each slice contains a number of sonar wavelengths as the pulse travels through the water. The sonar receiver preferably has sonar receiver parameters controlled by the sonar receiver controller to provide, for example, programmable phase delays between the detector elements; alternatively, the digital sampling times may be varied to achieve the same result. The sonar receiver may have parameters controlled by the sonar receiver controller which can be set to change the amplification or gain of the nanovolt electrical signals during reception of the incoming reflected sonar ping signals. Prior art time varying gain (TVG) systems have used preplanned amplification ramps to correct for attenuation in the water column. This gain is applied based on range (distance from the transmitter), but the gain profile does not change from ping to ping. Generally, the attenuation of the ultrasonic waves is higher for higher ping frequencies. The prior art changed the amplification factor by a preplanned schedule to even out the signals between the received first slice and the last slice of a ping. Prior TVG did not allow for the increased absorption by soft mud on the seafloor, for example. Since mud absorbs sound waves, the reflected sound waves are less intense as soon as the reflected slice reaches the mud. The TVG is changed on the next ping to boost the signals that are reflected or scattered by the mud.
In the same way, the TVG is changed to boost or reduce the gain for slices that more strongly reflect or are scattered by a hard, highly reflecting object like the sea wall shown in FIG. 1.
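  • A per-slice TVG profile that is adjusted between pings, for example boosting slices at and beyond a muddy region identified in the previous ping, may be sketched as follows; the loss model, gains, and names are illustrative assumptions.
      import numpy as np

      def tvg_profile(n_slices, slice_len_m, alpha_db_per_m, boosts_db=None):
          """Two-way spreading and absorption correction per slice, plus optional
          per-slice boosts (in dB) carried over from analysis of the previous ping."""
          ranges = (np.arange(n_slices) + 1) * slice_len_m
          gain_db = 40.0 * np.log10(ranges) + 2.0 * alpha_db_per_m * ranges
          if boosts_db is not None:
              gain_db = gain_db + boosts_db        # e.g. +6 dB for slices at and beyond the mud
          return 10.0 ** (gain_db / 20.0)          # linear amplitude gain applied to each slice

      # Example: the previous ping showed weak returns beyond slice 120 (soft mud).
      boost = np.zeros(256)
      boost[120:] = 6.0
      gain = tvg_profile(n_slices=256, slice_len_m=0.1, alpha_db_per_m=0.05, boosts_db=boost)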
  • A phase and amplitude of the pressure wave coming into the sonar receiver is preferably assigned to each detector element for each incoming slice, and a phase map may be generated for each incoming slice. A phase map is like a topographical map showing lines of equal phase on the surface of the detector array.
  • FIG. 5C sketches a reflected ping 54 reflected by a first object at a range of 20 detector widths from the detector. The first object is on a line starting from the center of the detector and perpendicular to the detector surface. The scattered ping is shown having a spherical surface to reflect a wave with origin at the surface of the first object. The phase map for this ping will be a series of circular regions centered in the center of the detector, all having the same phase, and moving outward from the center of the detector as the various slices of the ping are analyzed. Reflected ping 55 indicates a second object located further away from the detector than the first object, and at an angle of 5 degrees to the right of the center line. Reflected ping 56 shows a third object located yet further away from the detector, and at an angle of 10 degrees to the left of the center line. Pings 55 and 56 produce similar rings originating to the left and right of the detector, and expanding as slightly elliptical rings outwardly from their centers (which are not located on the detector for the angles shown).
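  • The per-element phase and amplitude for one slice, from which such a phase map may be drawn, can be estimated as in the following sketch (a simple quadrature correlation against the ping frequency; the array shapes and names are assumptions).
      import numpy as np

      def slice_phase_map(slice_samples, f_ping_hz, fs_hz):
          """slice_samples: array of shape (n_rows, n_cols, n_samples) holding the
          digitized signal of one time slice for every element of the 2D array.
          Returns (amplitude, phase) maps of shape (n_rows, n_cols); contours of
          equal phase over the array surface form the phase map."""
          n = slice_samples.shape[-1]
          t = np.arange(n) / fs_hz
          ref = np.exp(-2j * np.pi * f_ping_hz * t)   # complex reference at the ping frequency
          iq = (slice_samples * ref).mean(axis=-1)    # complex correlation per element
          return np.abs(iq), np.angle(iq)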
  • Applying additional gain control can be incorporated with Phase Filtering.
  • Phase map and data cleanup and noise reduction may be done optionally in the sonar receiver or in a beamformer section. The phase map and/or the digital stream of data from the detector are passed to the beamformer section, where the data are analyzed to determine the ranges and characteristics of the objects in the insonified volume.
  • The range of the object is determined by the speed of sound in the water and the time between the outgoing ping and the reflected ping received at the receiver. The data are most preferably investigated by using a spherical coordinate system with origin in the center of the detector array, a range variable, and two angle variables defined with respect to the normal to the detector array surface. The beamformer section is controlled by a beamformer controller using a set of beamformer parameters. The space that the receiver considers is divided into a series of volume elements radiating from the detector array and called beams. The center of each volume element of a beam has the same two angular coordinates, and each volume element may have the same thickness as a slice. The beam volume elements may also preferably have a thickness proportional to their range from the detector, or any other characteristic chosen by the beamformer controller. The range resolution is given by the slice thickness.
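  • The beam volume elements may be laid out as in the following sketch, using a spherical coordinate grid with the range resolution equal to the slice thickness; the names and numeric values are illustrative assumptions only.
      import numpy as np

      def beam_grid(max_range_m, slice_m, fov_deg, n_beams_per_axis):
          """Returns the range bin centers and the (azimuth, elevation) angles of
          each beam, measured from the normal to the detector array surface."""
          ranges = np.arange(slice_m / 2, max_range_m, slice_m)        # one bin per slice thickness
          angles = np.linspace(-fov_deg / 2, fov_deg / 2, n_beams_per_axis)
          az, el = np.meshgrid(angles, angles)                         # one beam per (azimuth, elevation) pair
          return ranges, az.ravel(), el.ravel()

      ranges, az_deg, el_deg = beam_grid(max_range_m=50.0, slice_m=0.1,
                                         fov_deg=50.0, n_beams_per_axis=128)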
  • The beamformer controller controls the volume of space "seen" by the detector array and used to collect data. For example, if the sonar transmitter sends out a narrow or a broad beam, or changes the direction of the sent out beam, the beamformer may also change the system to only look at the insonified volume. Thus, the system of the invention preferably changes two or more of the system parameters between the same two pings to improve the results. Some of the parameters controlled by the beamformer controller are listed below (a sketch of such a parameter set follows the list):
      • Field-of-view
      • Minimum and maximum beamformed ranges
      • Beam detection mode (such as First Above Threshold (FAT), maximum amplitude (MAX), or many other modes as known in the art)
      • Range resolution
      • Minimum signal level included in image
      • Image dynamic range
      • Array weighting function (used to modify the beamforming profile)
      • Applying additional gain post beamforming (this can be incorporated with Thresholding).
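  • A sketch of such a beamformer parameter set is given below; the field names and default values are hypothetical placeholders rather than the parameters actually used by the system.
      from dataclasses import dataclass
      from enum import Enum

      class DetectionMode(Enum):
          FAT = "first_above_threshold"
          MAX = "maximum_amplitude"

      @dataclass
      class BeamformerParams:
          fov_deg: tuple = (50.0, 50.0)         # field of view (azimuth, elevation)
          min_range_m: float = 1.0              # minimum beamformed range
          max_range_m: float = 100.0            # maximum beamformed range
          detection_mode: DetectionMode = DetectionMode.MAX
          range_resolution_m: float = 0.05      # slice thickness
          min_signal_level: float = 0.01        # minimum signal level included in the image
          dynamic_range_db: float = 60.0        # image dynamic range
          array_weighting: str = "hann"         # weighting function across the array
          post_beamform_gain_db: float = 0.0    # additional gain applied after beamforming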
  • The incoming digital data stream from each sonar detector of the receiver array has typically been multiplied by a TVG function. A triangular data function ensures that the edges of the slices have little intensity, to reduce digital noise in the signal. The TVG signal is set to zero to remove data that is collected from too near to and too far away from the detector, and to increase or decrease the signal depending on the situation.
  • In the prior art, the data have been filtered according to a criterion, and just one volume element for each beam was selected to have a value. For example, if the data were treated to accept the first signal in a beam arriving at the detector having an amplitude above a defined threshold (FAT), the three dimensional point cloud used to generate an image for the ping would be much different from a point cloud generated by picking the value given by the maximum signal (MAX). In the FAT case, the image would be, for example, of fish swimming through the insonified volume, and the image in the MAX case would be the image of the sea bottom. In the prior art, only one range in each beam would show at most one value or point, and all the other ranges of a single beam would be assigned a zero.
  • In the present invention, the data stream is analyzed by completing two or more beamformer processing procedures in the time between two pings, either in parallel or in series. In a video presentation, the prior art showed a time series of 3D images to introduce another, fourth dimension, time, into the presentation of data. By introducing values into more than one volume element per ping, we introduce a 5th dimension to the presentation. We can "see" behind objects, for example, and "through" objects and "around" objects to get much more information. We can use various data treatments to improve the video image stream. In the same way, other ways of analyzing the data stream can be used to provide cleaner images, higher resolution images, expanded range images, etc. These different imaging tasks can be applied to a single ping. The different images may be combined into a single image in a video presentation, or presented in more than one video at a frame rate the same as the ping rate.
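  • The following sketch illustrates assigning more than one value per beam from a single ping by running two detection criteria, FAT and MAX, on the same beamformed data; the input format and names are assumptions.
      import numpy as np

      def detect_fat_max(beam_amplitudes, ranges_m, threshold):
          """beam_amplitudes: array of shape (n_beams, n_ranges) for one ping.
          Returns a point cloud as (beam index, range, amplitude, label) tuples
          with up to two points per beam: the FAT point and the MAX point."""
          points = []
          for b, amps in enumerate(beam_amplitudes):
              above = np.nonzero(amps > threshold)[0]
              if above.size:                                   # first return above threshold (e.g. fish)
                  i = above[0]
                  points.append((b, ranges_m[i], amps[i], "FAT"))
              j = int(np.argmax(amps))                         # strongest return (e.g. the sea bottom)
              if amps[j] > 0.0:
                  points.append((b, ranges_m[j], amps[j], "MAX"))
          return points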
  • If we are surveying a seawall, we beamform the data before the wall (sea bottom: oblique to the beams, low backscatter, soft, low intensity signals returned) differently from the harbour wall (orthogonal to the beams, high backscatter, hard, high intensity signals returned). If we know where the seawall is from a chart, the beamformer can use GPS or camera data to work out which ranges are before the wall and which are after it, and change the TVG in the middle of the returned ping.
  • If we know the sea depth we can specify two planes, SeaSurfacePlane and SeaBottomPlane; only data between the planes will be processed and sent from the sonar head to the top end.
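  • A sketch of gating the data between the two planes before sending it to the top end is shown below; the coordinate convention (depth positive downward) and the names are assumptions.
      import numpy as np

      def gate_between_planes(points_xyz, sea_surface_z_m, sea_bottom_z_m):
          """points_xyz: array of shape (n, 3) of beamformed points with z as depth.
          Points outside the SeaSurfacePlane/SeaBottomPlane interval are discarded."""
          z = points_xyz[:, 2]
          keep = (z >= sea_surface_z_m) & (z <= sea_bottom_z_m)
          return points_xyz[keep]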
  • A large amount of data generated per second by prior art sonar systems has traditionally been discarded because of data transmission and/or storage limits. The present invention allows a higher percentage of the original data generated to be stored for later analysis.
  • FIG. 6 shows a flowchart of the method of the invention. The start 60 of the process of sending out a ping is to set all system parameters for all system controllers. Either all parameters are the same as for the last ping, or they have been changed automatically by signals from stages of the previous ping. Step 60 sends a signal to step 61 to send commands to the transmitter 62. The transmitter sends data to the receiver controller 63 to set parameters for the receiver 64 and start the receiver 64. The receiver receives analogue signals, samples the voltages from each element, and transmits data to the beamformer controller, which sends data and instructions to the beamformer section.
  • The beamformer analyses the data and decides whether the next ping should change settings, and if so sends signals to the appropriate controller to change the settings for the next ping. The beamformer analyses the data in step 67 and decides, either on the basis of incoming ping data or on previous instructions, whether to perform single or multiple types of analysis of the incoming ping data. For example, the beamformer could analyze the data using both the FAT and MAX analyses, and present both images either separately or combined, so that some beams will have more than one value per beam. The reduced data is sent from step 67 to step 68, which stores or sends raw data or image data for further processing into a video presentation at a rate greater than 5 frames per second.
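  • The flow of FIG. 6 may be sketched as a control loop as follows; every object and method name is a hypothetical placeholder for the corresponding controller or processing stage.
      def ping_loop(params, transmitter, receiver, beamformer, store):
          """Run pings, beamform each one with one or more analyses, and store or
          forward the results for a video presentation (steps 60-68 of FIG. 6)."""
          while True:
              transmitter.send_ping(params.tx)                           # steps 60-62: set parameters, send ping
              raw = receiver.acquire(params.rx)                          # steps 63-64: set and start receiver
              images = [beamformer.run(raw, p) for p in params.bf_set]   # step 67: single or multiple analyses (e.g. FAT and MAX)
              store(images)                                              # step 68: store or send data / images
              new_params = beamformer.suggest_params(images)             # decide settings for the next ping
              if new_params is not None:
                  params = new_params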
  • Obviously, many modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that, within the scope of the appended claims, the invention may be practiced otherwise than as specifically described.

Claims (19)

1. A method of recording a 3D sonar image, comprising:
a) transmitting a series of sonar pings into a first volume of water, the series of sonar pings transmitted from a sonar ping transmitting device at a rate of at least 5 pings per second, wherein the sonar ping transmitting device is controlled by sonar ping transmitting parameters, and wherein each sonar ping transmitting parameter is chosen from a predetermined list of sonar transmitting parameter settings;
b) receiving sonar signals reflected or scattered from objects in the first volume of water from each of the series of sonar pings, the received sonar signals received by a large two dimensional array sonar receiving device;
c) wherein the sonar receiving device is controlled by sonar receiving parameters, and;
d) beamforming the received sonar signals from each of the series of sonar pings with a sonar beamforming device to form a three dimensional (3D) sonar image of the objects reflecting or scattering the received sonar signals, wherein the sonar beamforming device is controlled by a set of sonar beamforming parameters, and wherein each sonar beamforming device parameter is chosen from a predetermined list of sonar beamforming device parameter settings;
e) wherein at least one of the sonar transmitting parameters, sonar receiving parameters, or sonar beamforming parameters is changed in the time between any two sonar pings of the series of sonar pings.
2. The method of claim 1, wherein the transmitter frequency is changed in the time between any two sonar pings of the series of sonar pings.
3. The method of claim 2, wherein the transmitter frequency is changed from a first frequency to a second frequency and back between each ping of the series of sonar pings.
4. The method of claim 3, wherein a sonar beamforming device parameter is changed between each ping of the series of sonar pings.
5. The method of claim 2, wherein the transmitter frequency and the insonified volume are changed in the time between any two sonar pings of the series of sonar pings.
6. The method of claim 1, wherein at least one sonar beamforming parameter is changed in the time between any two sonar pings of the series of sonar pings.
7. The method of claim 6, wherein at least two different fields of view are imaged in the series of sonar pings.
8. The method of claim 7, wherein at least four different fields of view are imaged in sequence in the series of sonar pings, and wherein images of the four different fields of view are stitched to make one composite image, and wherein a series of composite images is presented as a video presentation with a frame rate of at least 5 frames per second.
9. A method of real time three dimensional (3D) sonar imaging, comprising:
a) insonifying a first volume of fluid with a first series of at least one sonar ping transmitted by a first sonar transmitter; wherein the first sonar transmitter has a first sonar transmitter frequency parameter set to a first sonar frequency, then
b) changing the first sonar transmitter frequency parameter to a second sonar frequency, then
c) transmitting a second series of at least one sonar ping, wherein the time between the last sonar ping of the first series of sonar pings and the first sonar ping of the second series is less than 0.2 seconds, and wherein the first series of sonar pings and the second series of sonar pings are transmitted at a rate greater than 5 pings a second;
d) receiving for each of the series of sonar pings sonar signals reflected from one or more objects in the volume of fluid, wherein the sonar signals are received with a large 2D array of sonar signal detectors;
e) beamforming the reflected sonar signals for each of the series of sonar pings to provide a series of three dimensional (3D) sonar images of the one or more objects.
10. The method of claim 9, further comprising:
f) recording a video presentation of the series of 3D sonar images shown sequentially at a rate greater than 5 images per second.
11. The method of claim 9, further comprising:
f) producing a series of sonar images by alternating the first sonar transmitter frequency parameter between the first and the second sonar frequencies ping to ping; then
g) stitching neighboring pairs of sonar images of the series of 3D images provided in step e) to make a third series of composite sonar images having a wider field of view than images in the first series and a resolution over portions of the field of view higher than images in the second series; then
h) recording a video presentation of the composite 3D sonar images shown sequentially at a rate greater than 5 images per second.
12. The method of claim 5, wherein resulting images alternate between a higher resolution and smaller field of view for the higher frequency and a lower resolution and larger field of view for the lower frequency, and the alternating images are stitched after the receiver stage to provide a video stream with a higher central resolution and wider field of view at half the frame rate of the system available with unchanged parameters.
13. The method of claim 5, where resulting images alternate between a higher resolution and smaller field of view for the higher frequency and a lower resolution and larger field of view for the lower frequency, and the alternating images are stitched as neighboring images to provide a video stream at the same frame rate but a larger field of view.
14. The method of claim 8, wherein only a subset of the at least four images is updated continuously to generate real time images.
15. The method of claim 8, wherein at least one low frequency, low resolution ping with a large field of view is used to locate a target of interest, and subsequent high frequency, high resolution pings are directed at a region of interest without steering the sonar head.
16. The method of claim 8, wherein intelligent processing is used to account for and/or track motion of moving objects.
17. The method of claim 8, wherein predictive processing is used to account for and/or track motion of moving objects.
18. The method of claim 8, wherein interframe alignment is used to account for and/or track motion of moving objects.
19. The method of claim 8, wherein the portion of the field of view containing the motion of moving objects is updated more frequently than the remaining portions of the field of view.
US16/727,198 2019-11-07 2019-12-26 Video imaging using multi-ping sonar Abandoned US20210141071A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/727,198 US20210141071A1 (en) 2019-11-07 2019-12-26 Video imaging using multi-ping sonar
US17/493,638 US20220026570A1 (en) 2019-11-07 2021-10-04 Techniques for sonar data processing
US17/674,514 US20220171056A1 (en) 2019-11-07 2022-02-17 Techniques for sonar data processing

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201962931956P 2019-11-07 2019-11-07
US201962932734P 2019-11-08 2019-11-08
US16/727,198 US20210141071A1 (en) 2019-11-07 2019-12-26 Video imaging using multi-ping sonar

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/729,404 Continuation-In-Part US20210141087A1 (en) 2019-11-07 2019-12-29 Video imaging using multi-ping sonar
US17/493,638 Continuation-In-Part US20220026570A1 (en) 2019-11-07 2021-10-04 Techniques for sonar data processing

Publications (1)

Publication Number Publication Date
US20210141071A1 true US20210141071A1 (en) 2021-05-13

Family

ID=75845442

Family Applications (4)

Application Number Title Priority Date Filing Date
US16/693,684 Active 2042-02-13 US11789146B2 (en) 2019-11-07 2019-11-25 Combined method of location of sonar detection device
US16/727,198 Abandoned US20210141071A1 (en) 2019-11-07 2019-12-26 Video imaging using multi-ping sonar
US16/729,404 Abandoned US20210141087A1 (en) 2019-11-07 2019-12-29 Video imaging using multi-ping sonar
US16/775,060 Abandoned US20210141072A1 (en) 2019-11-07 2020-01-28 Method of recording sonar data

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US16/693,684 Active 2042-02-13 US11789146B2 (en) 2019-11-07 2019-11-25 Combined method of location of sonar detection device

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/729,404 Abandoned US20210141087A1 (en) 2019-11-07 2019-12-29 Video imaging using multi-ping sonar
US16/775,060 Abandoned US20210141072A1 (en) 2019-11-07 2020-01-28 Method of recording sonar data

Country Status (1)

Country Link
US (4) US11789146B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210141086A1 (en) * 2019-11-07 2021-05-13 Coda Octopus Group Inc. Combined method of location of sonar detection device
CN116858098A (en) * 2023-08-15 2023-10-10 中国铁路经济规划研究院有限公司 Automatic acquisition method and system for multi-element information of tunnel in construction period

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11448755B2 (en) * 2020-09-18 2022-09-20 Coda Octopus Group, Inc. System and techniques for split-aperture beamforming
US20220252707A1 (en) * 2021-02-09 2022-08-11 Coda Octopus Group, Inc. System and techniques for configuring a two-dimensional semiregular sparse array of sonar detector elements
US20220373678A1 (en) 2021-05-21 2022-11-24 Navico Holding As Steering assemblies and associated methods
US11796661B2 (en) 2021-05-21 2023-10-24 Navico, Inc. Orientation device for marine sonar systems
US11921200B1 (en) * 2022-08-19 2024-03-05 Navico, Inc. Live down sonar view

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
NO307014B1 (en) * 1998-06-19 2000-01-24 Omnitech As Procedure for generating a 3D image
US20070159922A1 (en) * 2001-06-21 2007-07-12 Zimmerman Matthew J 3-D sonar system
US6868041B2 (en) * 2002-05-01 2005-03-15 Quester Tangent Corporation Compensation of sonar image data primarily for seabed classification
US20050093859A1 (en) * 2003-11-04 2005-05-05 Siemens Medical Solutions Usa, Inc. Viewing direction dependent acquisition or processing for 3D ultrasound imaging
US8902099B2 (en) * 2010-08-16 2014-12-02 Groundprobe Pty Ltd Work area monitor
US8854920B2 (en) * 2012-09-05 2014-10-07 Codaoctopus Group Volume rendering of 3D sonar data
US10024957B2 (en) * 2015-09-17 2018-07-17 Navico Holding As Adaptive beamformer for sonar imaging
US9711851B1 (en) * 2016-02-04 2017-07-18 Proxy Technologies, Inc. Unmanned vehicle, system and method for transmitting signals
WO2017189449A2 (en) * 2016-04-29 2017-11-02 R2Sonic, Llc Multifan survey system & method
US20180011190A1 (en) * 2016-07-05 2018-01-11 Navico Holding As High Ping Rate Sonar
US10984543B1 (en) * 2019-05-09 2021-04-20 Zoox, Inc. Image-based depth data and relative depth data
US20220026570A1 (en) * 2019-11-07 2022-01-27 Coda Octopus Group Inc. Techniques for sonar data processing
US11789146B2 (en) * 2019-11-07 2023-10-17 Coda Octopus Group Inc. Combined method of location of sonar detection device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210141086A1 (en) * 2019-11-07 2021-05-13 Coda Octopus Group Inc. Combined method of location of sonar detection device
US11789146B2 (en) * 2019-11-07 2023-10-17 Coda Octopus Group Inc. Combined method of location of sonar detection device
CN116858098A (en) * 2023-08-15 2023-10-10 中国铁路经济规划研究院有限公司 Automatic acquisition method and system for multi-element information of tunnel in construction period

Also Published As

Publication number Publication date
US20210141087A1 (en) 2021-05-13
US11789146B2 (en) 2023-10-17
US20210141072A1 (en) 2021-05-13
US20210141086A1 (en) 2021-05-13

Similar Documents

Publication Publication Date Title
US20210141071A1 (en) Video imaging using multi-ping sonar
US20220026570A1 (en) Techniques for sonar data processing
US11668820B2 (en) Sonar data compression
US7369461B2 (en) Acoustic transducer and underwater sounding apparatus
US20150276930A1 (en) Sonar transducer assembly
KR20090084877A (en) Ship mounted underwater sonar system
US20130083628A1 (en) Imaging system and method
US6829197B2 (en) Acoustical imaging interferometer for detection of buried underwater objects
JP2016090452A (en) Detection device and underwater detection device
US10718865B2 (en) Method of compressing beamformed sonar data
US20200333787A1 (en) Marine surface drone and method for characterising an underwater environment implemented by such a drone
WO2017219470A1 (en) Apparatus for forming three-dimensional ultrasonic wave image of unmanned vessel monitoring region by using orthogonal array
JP7224959B2 (en) How to compress sonar data
JP5593204B2 (en) Underwater acoustic imaging device
Pinto et al. Real-and synthetic-array signal processing of buried targets
US20200292699A1 (en) Sonar tracking of unknown possible objects
JP2010175429A (en) Synthetic aperture sonar
US20230036543A1 (en) Depointable parametric echosounder, and method for characterizing a portion of the sub-bottom of a subaquatic environment
Ehrhardt et al. Comparison of different short-range sonar systems on real structures and objects
Nolan et al. A low directivity ultrasonic sensor for collision avoidance and station keeping on inspection-class AUVs
KR20190048608A (en) Apparatus and method for signal processing in sonar system, recording medium for performing the method
Dutkiewicz et al. Synthetic aperture sonar for sub-bottom imaging
Suzuki et al. Verification of resolution on self-focusing effect of polarization-inverted transmitter by up-chirp signal for subaperture array
Guyonic et al. Buried mines detection and classification: 2D-SAS processing definition and experimental results
CN117111140A (en) Three-dimensional detection device and method for underwater stratum structure by phased ultrasonic area array three-dimensional focusing

Legal Events

Date Code Title Description
AS Assignment

Owner name: CODA OCTOPUS GROUP INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CUNNINGHAM, BLAIR GRAEME;PEARSON, CHARLIE;SLOSS, MARTYN;SIGNING DATES FROM 20191223 TO 20201201;REEL/FRAME:055345/0993

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: CODA OCTOPUS GROUP, INC., FLORIDA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MCFADZEAN, ANGUS;REEL/FRAME:057410/0533

Effective date: 20210819

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION