CN107544071B - Water detection system - Google Patents


Info

Publication number: CN107544071B
Application number: CN201710473384.2A
Authority: CN (China)
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN107544071A
Prior art keywords: image, transmission wave, detection system, transmission, water detection
Inventors: 奥西哲, 永井早苗, 上山佳宏
Assignee (original and current): Furuno Electric Co Ltd
Publications: CN107544071A (application), CN107544071B (grant)

Classifications

    • G01S 15/96: Sonar systems specially adapted for locating fish
    • G01S 15/89: Sonar systems specially adapted for mapping or imaging
    • G01S 7/6281: Display arrangements; composite displays, e.g. split-screen, multiple images


Abstract

The invention provides a water detection system that suppresses the cost of the whole system and offers high convenience to users. The water detection system (1) includes: a transmission circuit unit (7a) for driving a transceiver (2) that transmits a first transmission wave and a second transmission wave having a wider beam width in the vertical direction than the first transmission wave; a reception circuit unit (11, 12) that generates a first reception signal based on a reflected wave of the first transmission wave and a second reception signal based on a reflected wave of the second transmission wave; a first control unit (7b) that causes the transmission circuit unit (7a) to generate a first drive signal that is the basis of the first transmission wave; a first image generation unit (16) that generates a first image on the basis of the first reception signal; a second control unit (20) that causes the transmission circuit unit (7a) to generate a second drive signal that is the basis of the second transmission wave; and a second image generation unit (23) that generates a second image on the basis of the second reception signal.

Description

Water detection system
Technical Field
The present invention relates to a water detection system for detecting objects.
Background
For example, Patent Document 1 discloses a water detection device (sonar) that transmits an ultrasonic beam into the water to scan a three-dimensional region and displays underwater information (for example, a fish school) in the scanned region as a three-dimensional image based on the received echoes. In the sonar disclosed in Patent Document 1, a transmission beam having no directivity is formed for a predetermined three-dimensional region. On reception, on the other hand, a sharply directional beam with a narrow beam width is formed as the reception beam, and this reception beam is scanned over the three-dimensional region. Similar sonars are also disclosed in Patent Documents 2 and 3, among others. In these sonars, a transmission beam having a wide beam width is formed, and a reception beam having a narrow beam width is formed.
In a general scanning sonar, a transmission wave having a narrow beam width in the vertical direction is transmitted once in all directions around the ship, and then a reception beam having the same narrow vertical beam width is rotated around the ship. This makes it possible to detect a fish school present near a desired tilt-angle direction with respect to the ship on which the sonar is mounted.
In addition, a known scanning sonar displays a horizontal plane image in which an image signal, generated based on the reception signal obtained from the detection area, is viewed from above.
Prior art literature
Patent literature
Patent Document 1: Japanese Patent No. 5089319
Patent Document 2: Japanese Patent Application Laid-Open No. 60-78372
Patent Document 3: Japanese Patent No. 2885989
Disclosure of Invention
However, when an area having three-dimensional expansion is to be detected with the general scanning sonar described above, one approach is to scan the three-dimensional area by gradually changing the tilt angle of a beam having a narrow beam width in the vertical direction. In that case, the computation load becomes extremely large compared with a normal scanning sonar; as a result, the device becomes larger and the manufacturing cost increases. Alternatively, the three-dimensional region could be scanned in this way without increasing the size of the apparatus, but the update period of the image then becomes long and it is difficult to obtain real-time information.
In addition, when fish schools overlap in the vertical direction, a horizontal plane image that merely views the image signal from above, as in the known scanning sonar described above, may cause the lower fish school to be missed.
The present invention has been made to solve the above problems, and an object of the present invention is to provide a water detection system that suppresses the cost of the entire system and is highly convenient for users.
The present invention has also been made to solve the above problems, and another object is to provide an underwater detection system capable of preventing missed detection of a fish school.
In order to solve the above problems, an underwater detection system according to an aspect of the present invention includes: a transceiver having a plurality of transceiver elements; a transmission circuit unit that drives the plurality of transceiver elements, which transmit a first transmission wave and a second transmission wave having a wider beam width in a vertical direction than the first transmission wave; a reception circuit unit that generates a first reception signal based on a reflected wave of the first transmission wave and generates a second reception signal based on a reflected wave of the second transmission wave; a first control unit that causes the transmission circuit unit to generate a first drive signal that is a basis of the first transmission wave; a first image generating unit that generates a first image based on the first reception signal output from the reception circuit unit; a second control unit that causes the transmission circuit unit to generate a second drive signal that is a basis of the second transmission wave; and a second image generating unit that generates a second image based on the second reception signal output from the reception circuit unit.
In order to solve the above problems, an underwater detection system according to another aspect of the present invention includes: a transceiver having a plurality of transceiver elements; a transmission circuit unit that drives the plurality of transceiver elements, which transmit a three-dimensional transmission wave propagating into a three-dimensional area; a reception circuit unit that generates a reception signal based on a reflected wave of the three-dimensional transmission wave; a three-dimensional region image data generation unit configured to generate three-dimensional region image data that is generated based on the reception signal and that has three-dimensional position information and echo intensity information obtained for each position included in the three-dimensional region; and an image generating unit that generates an upper horizontal plane image and a lower horizontal plane image, the upper horizontal plane image being an image in which the three-dimensional region image data is projected onto an upper horizontal plane located above the three-dimensional positions of the three-dimensional region image data, and the lower horizontal plane image being an image in which the three-dimensional region image data is projected onto a lower horizontal plane located below the three-dimensional positions of the three-dimensional region image data.
According to the present invention, it is possible to provide a water detection system that suppresses the cost of the entire system and offers high convenience to the user. Further, according to the present invention, it is possible to provide a water detection system capable of preventing missed detection of a fish school.
Drawings
Fig. 1 is a block diagram showing a configuration of an underwater detection system according to an embodiment of the present invention.
Fig. 2 is a diagram schematically showing a transmission range of a first transmission wave transmitted from a transceiver.
Fig. 3 is a diagram schematically showing a transmission range of the second transmission wave transmitted from the transceiver.
Fig. 4 is a block diagram showing the structure of a receiver.
Fig. 5 is a diagram schematically showing an example of a display screen displayed on the first display device.
Fig. 6 is a block diagram showing the configuration of the processing apparatus.
Fig. 7 is a diagram schematically showing an example of an upper horizontal plane image.
Fig. 8 is a diagram schematically showing a bottom-view horizontal plane image generated in the process of generating a lower horizontal plane image.
Fig. 9 is a diagram schematically showing an example of the lower horizontal plane image.
Fig. 10 is a diagram schematically showing a process of generating the upper side horizontal plane image and the lower side horizontal plane image generated by the second image generating unit.
Fig. 11 is a flowchart for explaining the operation of the water detection system.
Fig. 12 is a view showing an example of a three-dimensional area image displayed on the second display device of the water detection system according to the modification.
Fig. 13 is a block diagram showing a configuration of the water detection system according to the modification example.
Fig. 14 is a diagram showing an example of two three-dimensional area images displayed on the second display device of the water detection system according to the modification, wherein (a) is a diagram showing an upper horizontal plane image and (B) is a diagram showing a vertical plane image.
Fig. 15 is a view showing an example of an upper horizontal plane image displayed on the second display device of the water detection system according to the modification.
Fig. 16 is a vertical plane image displayed corresponding to the upper horizontal plane image shown in fig. 15, in which (a) is a rear view vertical plane image and (B) is a left view vertical plane image.
Fig. 17 is a view showing an example of a rear vertical plane image displayed on the second display device of the water detection system according to the modification.
Fig. 18 is an upper horizontal plane image displayed corresponding to the rear view vertical plane image shown in fig. 17.
Fig. 19 is a lower horizontal plane image displayed corresponding to the rear view vertical plane image shown in fig. 17.
Fig. 20 is a view showing an example of an upper horizontal plane image displayed on the second display device of the water detection system according to the modification.
Fig. 21 is a vertical plane image displayed corresponding to the upper horizontal plane image shown in fig. 20, in which (a) is a rear view vertical plane image and (B) is a left view vertical plane image.
Fig. 22 is a view showing an example of an upper horizontal plane image displayed on the second display device of the water detection system according to the modification.
Fig. 23 is a vertical plane image displayed corresponding to the upper horizontal plane image shown in fig. 22, where (a) is a rear view vertical plane image and (B) is a left view vertical plane image.
Fig. 24 is a view showing an example of a lower horizontal plane image displayed on the second display device of the water detection system according to the modification.
Fig. 25 is a vertical plane image displayed corresponding to the lower horizontal plane image shown in fig. 24, where (a) is a rear view vertical plane image and (B) is a left view vertical plane image.
Fig. 26 is a flowchart showing another example of operation of the water detection system according to the embodiment.
Description of the reference numerals:
1, 1a: Water detection system
2: Transceiver
7a: Transmission circuit unit
7b: First control unit
8, 8a: Receiver (including the reception circuit unit)
16: First image generating unit
20: Second control unit
23: Second image generating unit
Detailed Description
Hereinafter, the water detection system 1 according to the embodiment of the present invention will be described with reference to the drawings.
[ Structure of Water detection System ]
Fig. 1 is a block diagram showing a configuration of a water detection system 1 according to an embodiment of the present invention. The water detection system 1 of the present embodiment is installed in a ship such as a fishing boat, for example. Hereinafter, the ship provided with the underwater detection system 1 will be referred to as a "host ship". Fig. 1 illustrates only some of the components of the receiver 8 and the processing device 5.
As shown in fig. 1, the water detection system 1 according to the present embodiment includes a scanning sonar 10, a processing device 5, and a second display device 6. The water detection system 1 is configured by externally attaching the processing device 5 and the second display device 6 to a generally known scanning sonar 10.
The scan sonar 10 includes a transceiver 2, a transmitter-receiver 3, and a first display 4.
The transceiver 2 has a function of transmitting and receiving ultrasonic waves, and is mounted on the bottom of the ship. For example, the transceiver 2 is substantially cylindrical, and is disposed such that its axial direction is along the vertical direction and its radial direction is along the horizontal direction.
Specifically, the transceiver 2 includes a substantially cylindrical case and a plurality of ultrasonic transducers (not shown) as transceiver elements mounted on the outer peripheral surface of the case. The ultrasonic transducer transmits ultrasonic waves into water, receives echoes, converts the echoes into electrical signals, and outputs the electrical signals to the receiver 8. In the present embodiment, the transceiver 2 is shown with a cylindrical case, but the shape is not particularly limited, and may be another shape such as a sphere, for example.
Fig. 2 is a diagram schematically showing the transmission range of the first transmission wave transmitted from the transceiver 2, and fig. 3 is a diagram schematically showing the transmission range of the second transmission wave transmitted from the transceiver 2. In figs. 2 and 3, the transmission range of the transmission wave transmitted from the transceiver 2 mounted on the host ship S is indicated schematically by the dot-hatched area.
In the present embodiment, two types of transmission waves are transmitted from the transceiver 2: the first transmission wave shown in fig. 2 and the second transmission wave shown in fig. 3. The transceiver 2 transmits each transmission wave in all horizontal directions around the ship.
The first transmission wave is a transmission wave whose beam width θ_1 in the vertical direction is relatively narrow. The beam width θ_1 of the first transmission wave is, for example, about 8 degrees, but is not limited to this and may be any value smaller than 20 degrees. The region into which the first transmission wave is transmitted is hereinafter referred to as a two-dimensional region Z1, for the following reason. The vertical beam width θ_1 of the first transmission wave is the smallest vertical beam width that can be achieved by the transceiver 2, or a value close to it. The region into which the first transmission wave is transmitted therefore has relatively little spatial expansion, and in this specification it is referred to as the two-dimensional region Z1.
The second transmission wave is a transmission wave whose beam width θ_2 in the vertical direction is wider than that of the first transmission wave. The beam width θ_2 of the second transmission wave is, for example, about 30 degrees, but is not limited to this and may be any value of 20 degrees or more. The region into which the second transmission wave is transmitted is hereinafter referred to as a three-dimensional region Z2, for the following reason. Compared with the relatively narrow vertical beam width θ_1 of the first transmission wave, the second transmission wave with a beam width θ_2 of 20 degrees or more can be regarded as having a sufficiently wider beam width. In this specification, the region with relatively large three-dimensional expansion into which the second transmission wave is transmitted with this sufficiently wide beam width is therefore referred to as the three-dimensional region Z2.
In the transceiver 2, for example, after the transmission of the first transmission wave and the reception of its reflected wave have been performed a plurality of times, the transmission of the second transmission wave and the reception of its reflected wave are performed once. That is, in the present embodiment, the second transmission wave is transmitted less frequently than the first transmission wave.
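As a rough illustration of this interleaving, the sketch below alternates several narrow-beam pings with one wide-beam ping. The ratio N_FIRST_PER_SECOND and the function name are assumptions chosen only for the example; the patent does not specify a fixed ratio.

```python
from itertools import cycle

# Assumed ratio of narrow-beam (first) pings to wide-beam (second) pings; illustrative only.
N_FIRST_PER_SECOND = 4

def ping_schedule():
    """Yield an endless sequence of ping types: 'first' (narrow beam) and 'second' (wide beam)."""
    return cycle(["first"] * N_FIRST_PER_SECOND + ["second"])

if __name__ == "__main__":
    schedule = ping_schedule()
    print([next(schedule) for _ in range(10)])
    # ['first', 'first', 'first', 'first', 'second', 'first', 'first', 'first', 'first', 'second']
```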
The transmitting/receiving apparatus 3 includes a transmission/reception switching unit 3a, a transmitter 7, and a receiver 8. The transmitting/receiving apparatus 3 is configured by a hardware processor 9 (e.g., CPU, FPGA, etc.), an analog circuit, a nonvolatile memory, and other devices. The hardware processor 9 functions as a first control unit 7b, a quadrature detector unit 13, a first beam forming unit 14, a filter unit 15, and a first image generating unit 16, which will be described in detail below. For example, the hardware processor 9 functions as the first control unit 7b, the quadrature detector unit 13, the first beam forming unit 14, the filter unit 15, and the first image generating unit 16 by reading out a program from a nonvolatile memory and executing the program.
The transmission/reception switching section 3a is for switching between transmitting and receiving signals to/from the transceiver 2. Specifically, when transmitting a drive signal for driving the transceiver 2 to the transceiver 2, the transmission/reception switching unit 3a outputs the drive signal output from the transmitter 7 to the transceiver 2. On the other hand, when receiving the reception signal from the transceiver 2, the transmission/reception switching unit 3a outputs the reception signal received by the transceiver 2 to the receiver 8.
The transmitter 7 is configured to generate a drive signal that is a basis of the transmission wave transmitted from the transceiver 2. The transmitter 7 includes a transmission circuit section 7a and a first control section 7b.
The transmission circuit unit 7a is controlled by the first control unit 7b and by a second control unit 20 of the processing device 5, described in detail below, and generates drive signals. Specifically, the transmission circuit unit 7a includes transmission circuits (not shown) provided for the respective ultrasonic transducers, and each transmission circuit is appropriately controlled by the first control unit 7b to generate the first drive signal. The first drive signal is the signal on which the first transmission wave (the transmission wave having a vertical beam width of about 8 degrees) transmitted from the transceiver 2 is based. In the transmission circuit unit 7a, each transmission circuit is also controlled by the second control unit 20 to generate the second drive signal. The second drive signal is the signal on which the second transmission wave (the transmission wave having a vertical beam width of about 30 degrees) transmitted from the transceiver 2 is based.
The first control unit 7b appropriately controls each of the plurality of transmission circuits included in the transmission circuit unit 7a, and causes the transmission circuit unit 7a to generate the first drive signal.
Fig. 4 is a block diagram showing the structure of the receiver 8. The receiver 8 has an analog section 11, an analog-to-digital conversion section 12, a quadrature detection section 13, a first beam forming section 14, a filter section 15, and a first image generating section 16. The analog unit 11 and the analog-to-digital conversion unit 12 are provided as a reception circuit unit that generates a reception signal based on a reflected wave of the transmission wave.
The analog unit 11 amplifies the electric signal from the transceiver 2 and removes unnecessary frequency components by limiting the frequency band. The analog unit 11 processes both the electric signal obtained from the reflected wave of the first transmission wave and the electric signal obtained from the reflected wave of the second transmission wave.
The analog-to-digital conversion unit 12 converts the electric signal generated by the analog unit 11 into a reception signal, which is a digital signal. The analog-to-digital conversion unit 12 generates a first reception signal by processing the electric signal obtained from the reflected wave of the first transmission wave, and generates a second reception signal by processing the electric signal obtained from the reflected wave of the second transmission wave.
The quadrature detector 13 applies quadrature detection processing to the first and second reception signals obtained from the respective ultrasonic transducers to generate an I signal and a Q signal. These are handled as a complex signal with the I signal as the real part and the Q signal as the imaginary part. When the reception signal output from the analog-to-digital converter 12 is the first reception signal, the quadrature detector 13 outputs the generated complex signal to the first beam forming unit 14 as the first complex signal. When the reception signal output from the analog-to-digital converter 12 is the second reception signal, the quadrature detector 13 outputs the generated complex signal to the processing device 5 as the second complex signal. Before being output to the processing device 5, the second complex signals may be temporarily stored in a storage unit (not shown) provided in the transmitting/receiving device 3.
Here, the second complex signal is generated in the quadrature detector 13 and then output to the processing device 5, but the present invention is not limited thereto. Specifically, the second received signal generated in the analog-to-digital converter 12 may be output to the processing device 5, and then quadrature detection processing may be performed in the processing device 5.
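As a rough sketch of the quadrature detection step described above, the snippet below mixes a real echo signal with a complex carrier and low-pass filters the result to obtain the I (real) and Q (imaginary) components. The sample rate, carrier frequency, and the simple moving-average filter are assumptions made for illustration, not the device's actual parameters.

```python
import numpy as np

def quadrature_detect(received, fs, fc):
    """Convert a real band-pass echo signal into a complex baseband (I + jQ) signal.

    Mix the received signal with a complex carrier at the transmit frequency fc
    and low-pass filter the result with a simple moving average. The filter
    choice and all parameters here are illustrative assumptions.
    """
    t = np.arange(len(received)) / fs
    mixed = received * np.exp(-2j * np.pi * fc * t)   # I = real part, Q = imaginary part
    taps = max(int(fs / fc), 1)                       # averaging window of about one carrier period
    kernel = np.ones(taps) / taps
    return np.convolve(mixed, kernel, mode="same")

if __name__ == "__main__":
    fs, fc = 192_000.0, 24_000.0                      # assumed sample rate and ultrasonic carrier
    t = np.arange(0, 2e-3, 1 / fs)
    echo = np.cos(2 * np.pi * fc * t + 0.7)           # toy echo with an arbitrary phase
    z = quadrature_detect(echo, fs, fc)
    mid = len(z) // 2
    print(round(abs(z[mid]), 2), round(float(np.angle(z[mid])), 2))  # ~0.5 amplitude, ~0.7 rad
```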
The first beam forming unit 14 performs beam forming processing (specifically, phase-adjusted addition) on the first complex signals obtained from a specific plurality of ultrasonic transducers, and generates a first beam signal that is equivalent to the signal that would be obtained from a single ultrasonic transducer having sharp directivity in a specific direction. The first beam forming unit 14 repeats this beam forming processing while changing the combination of ultrasonic transducers to be processed, thereby generating a plurality of first beam signals having directivity in the respective azimuths. Referring to fig. 2, the first beam forming unit 14 generates first beam signals having a relatively narrow beam width θ_1 in the vertical direction (for example, about 8 degrees).
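The phase-adjusted addition can be pictured as the phase-shift (delay-and-sum) beamforming sketched below for elements on a cylindrical array: each element's complex signal is rotated to compensate the path-length difference toward the steering azimuth, and the results are summed. The array geometry, wavelength, and subset selection are assumptions for illustration, not the patent's exact implementation.

```python
import numpy as np

def form_beam(complex_signals, element_angles_rad, radius_m, steer_azimuth_rad, wavelength_m):
    """Phase-adjust-and-sum beamforming for elements on a circular (cylindrical) array.

    complex_signals: shape (n_elements, n_samples), baseband signal per element.
    element_angles_rad: azimuth of each element on the cylinder circumference.
    Returns one beam signal steered toward steer_azimuth_rad (illustrative geometry).
    """
    k = 2 * np.pi / wavelength_m
    # Phase advance of a plane wave from the steering azimuth at each element, relative to the axis.
    path = radius_m * np.cos(element_angles_rad - steer_azimuth_rad)
    weights = np.exp(-1j * k * path)              # compensate the per-element phase
    return (weights[:, None] * complex_signals).sum(axis=0)

if __name__ == "__main__":
    wavelength = 0.02                              # assumed ultrasonic wavelength in water
    angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
    radius = 0.2
    true_azimuth = np.deg2rad(40)
    k = 2 * np.pi / wavelength
    n_samples = 16
    # Simulated single-frequency echo arriving from true_azimuth (constant over time).
    element_phase = 0.3 + k * radius * np.cos(angles - true_azimuth)
    signals = np.exp(1j * element_phase)[:, None] * np.ones((1, n_samples))
    # Use only the elements facing the echo, as in the subset processing described above.
    facing = np.abs((angles - true_azimuth + np.pi) % (2 * np.pi) - np.pi) < np.pi / 2
    on_target = form_beam(signals[facing], angles[facing], radius, true_azimuth, wavelength)
    off_target = form_beam(signals[facing], angles[facing], radius,
                           true_azimuth + np.pi / 2, wavelength)
    print(abs(on_target[0]) > abs(off_target[0]))  # True: the response peaks toward the echo
```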
The filter unit 15 applies band-limiting or pulse-compression filtering to the first beam signal formed by the first beam forming unit 14, thereby generating a two-dimensional image signal used to generate a first image (two-dimensional area image) described later.
The first image generating unit 16 generates a two-dimensional area image representing the distribution of targets around the ship based on the amplitude of the two-dimensional image signal (specifically, the absolute value of the complex signal) generated by the filter unit 15. Specifically, the first image generating unit 16 generates an image of the distribution on a conical surface whose apex is the position of the transceiver 2 of the ship, viewed from above (hereinafter also referred to as a horizontal mode image H1), or an image of the distribution on a vertical plane containing the transceiver 2 (hereinafter also referred to as a vertical mode image V1). The image generated by the first image generating unit 16 is generated based on the signal obtained from the first transmission wave having a relatively narrow beam width, and is therefore an image obtained from a two-dimensional, planar region with little spatial expansion. The area from which the horizontal mode image H1 is obtained is the dot-hatched area in fig. 2.
Fig. 5 is a diagram schematically showing an example of the display screen 4a displayed on the first display device 4. On the display screen 4a of the first display device 4, the horizontal mode image H1 and the vertical mode image V1 generated by the first image generating unit 16 are displayed. For example, by appropriately operating an operation device (not shown) such as a keyboard provided in the water detection system 1, the user can switch between the horizontal mode image H1 and the vertical mode image V1 on the first display device 4 or display them simultaneously. Fig. 5 shows an example in which a fish school is present in the 2 o'clock direction with respect to the ship S.
In the figure of the first display device 4, the region where a signal with high echo intensity is obtained is represented by high-density dot hatching, the region where a signal with medium echo intensity is obtained by medium-density dot hatching, and the region where a signal with low echo intensity is obtained by low-density dot hatching. Hereinafter, these regions are referred to as the high echo intensity region, the medium echo intensity region, and the low echo intensity region, respectively. On the actual first display device 4, the high echo intensity region is shown in red, the medium echo intensity region in green, and the low echo intensity region in blue.
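The three-level color coding described above can be sketched as a simple threshold lookup. The threshold values and the decibel scale below are arbitrary placeholders chosen for the example; the actual device derives its color scale from its own gain and display settings.

```python
def echo_color(intensity_db, high_db=-20.0, mid_db=-40.0):
    """Map an echo intensity to the three display colors described above.

    Thresholds are illustrative assumptions, not values from the patent.
    """
    if intensity_db >= high_db:
        return "red"      # high echo intensity region
    if intensity_db >= mid_db:
        return "green"    # medium echo intensity region
    return "blue"         # low echo intensity region

if __name__ == "__main__":
    print([echo_color(v) for v in (-10.0, -30.0, -55.0)])  # ['red', 'green', 'blue']
```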
The processing device 5 is a device connected to the transmitting/receiving device 3 of the scanning sonar 10 via a cable or the like, and is constituted by a PC (personal computer), for example. As will be described in detail later, the processing device 5 is configured to process a part of the received signals processed by the transmitting and receiving device 3.
The water detection system 1 according to the present embodiment can generate not only images in which the targets contained in the two-dimensional region Z1 near the ship are projected (specifically, the horizontal mode image H1 and the vertical mode image V1), by means of the scanning sonar 10, but also an image in which the targets contained in the three-dimensional region Z2 (see fig. 3), a region with three-dimensional expansion near the ship, are projected, by means of the processing device 5 described in detail below. In the present water detection system 1, as long as a predetermined instruction is not received from the user via an operation device (not shown), only the images of the targets contained in the two-dimensional region Z1 are generated by the scanning sonar 10. When the predetermined instruction is received from the user, the processing device 5, the scanning sonar 10, and so on perform the operations described below, and an image in which the targets contained in the three-dimensional region Z2 near the ship are projected is generated. The predetermined instruction mentioned here is an instruction to generate an image in which the targets contained in the three-dimensional region Z2, shown by dot hatching in fig. 3, are projected. Hereinafter, this predetermined instruction is referred to as the three-dimensional area image generation instruction.
Fig. 6 is a block diagram showing the configuration of the processing device 5. The processing device 5 has a second control section 20, a second beam forming section 21, a filter section 22, and a second image generating section 23.
The second control unit 20, upon receiving the three-dimensional area image generation instruction from the user, appropriately controls each of the plurality of transmission circuits included in the transmission circuit unit 7a, thereby causing the transmission circuit unit 7a to generate the second drive signal. For example, when the transceiver 2 is cylindrical, the second control unit 20 controls the amplitude and phase of the drive signal so that the shading coefficient in the vertical direction follows a sine function.
The second complex signal from the quadrature detector 13 is input to the second beam forming unit 21. The second beam forming unit 21 performs beam forming processing (specifically, phase-adjusted addition) on the second complex signals obtained from a specific plurality of ultrasonic transducers, and generates a second beam signal that is equivalent to the signal that would be obtained from a single ultrasonic transducer having sharp directivity in a specific direction. The second beam forming unit 21 repeats this processing while changing the combination of ultrasonic transducers to be processed, thereby generating a plurality of second beam signals having directivity in the respective azimuths. The second beam forming unit 21 generates second beam signals with a beam width narrower than the beam width θ_2 of the second transmission wave and gradually changes their tilt angle, thereby scanning the range into which the second transmission wave was transmitted. The position information of each item of three-dimensional region image data (described in detail below) generated based on the second transmission wave is calculated from the distance from the transceiver 2 to the reflecting target, obtained from the time elapsed after the second transmission wave was transmitted, and from the azimuth of the second beam signal.
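The position information described above, derived from the echo delay (range) together with the azimuth and tilt angle of the steered receive beam, amounts to a spherical-to-Cartesian conversion. The sound speed and the coordinate convention (x forward, y to starboard, z downward) in the sketch below are assumptions for illustration; the patent does not specify them.

```python
import math

SOUND_SPEED_MPS = 1500.0   # assumed nominal sound speed in seawater

def echo_position(two_way_delay_s, azimuth_rad, tilt_rad):
    """Return (x, y, z) of a reflection point relative to the transceiver.

    Range follows from half the two-way travel time; azimuth is measured in the
    horizontal plane and tilt downward from it. A sketch only, under the assumed
    coordinate convention stated above.
    """
    rng = SOUND_SPEED_MPS * two_way_delay_s / 2.0
    horizontal = rng * math.cos(tilt_rad)
    x = horizontal * math.cos(azimuth_rad)   # forward
    y = horizontal * math.sin(azimuth_rad)   # starboard
    z = rng * math.sin(tilt_rad)             # depth below the transceiver
    return x, y, z

if __name__ == "__main__":
    # Echo received 0.2 s after transmission, 30 degrees to starboard, 10 degrees down.
    print(echo_position(0.2, math.radians(30), math.radians(10)))
```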
The filter unit 22 applies band-limiting or pulse-compression filtering to the second beam signal formed by the second beam forming unit 21, thereby generating three-dimensional region image data used to generate a second image (three-dimensional area image) described later. The three-dimensional region image data are signals obtained from the respective positions contained in the three-dimensional region Z2, and carry as information the three-dimensional position at which each signal was obtained and its echo intensity. The filter unit 22 thus functions as, for example, a three-dimensional region image data generating unit.
The second image generating unit 23 generates an image representing the distribution of the objects around the ship based on the amplitude (specifically, the absolute value of the complex signal) of the three-dimensional region image data generated by the filter unit 22. Specifically, the second image generating unit 23 generates a three-dimensional region image as the second image based on the signal obtained from the three-dimensional region Z2 (see fig. 3).
Fig. 7 is a diagram schematically showing an example of an upper horizontal plane image H2_U. Fig. 8 is a diagram schematically showing a bottom-view horizontal plane image H2_L' generated in the process of generating a lower horizontal plane image H2_L. Fig. 9 is a diagram schematically showing an example of the lower horizontal plane image H2_L. In the present embodiment, the second image generating unit 23 generates, as three-dimensional area images, the upper horizontal plane image H2_U shown as an example in fig. 7 and the lower horizontal plane image H2_L shown as an example in fig. 9.
Fig. 10 is a diagram schematically showing the process by which the second image generating unit 23 generates the upper horizontal plane image H2_U and the lower horizontal plane image H2_L. Fig. 10 illustrates a cross section obtained by cutting the three-dimensional region image data Sg, drawn in three-dimensional orthogonal coordinates, with a predetermined plane (specifically, a plane extending in the up-down and fore-aft directions). As described above, the three-dimensional region image data Sg shown in fig. 10 consist of a high echo intensity region S_H represented in red (high-density dot hatching in fig. 10), a medium echo intensity region S_M represented in green (medium-density dot hatching in fig. 10), and a low echo intensity region S_L represented in blue (low-density dot hatching in fig. 10).
The upper horizontal plane image H2_U is generated by projecting the three-dimensional region image data Sg onto an upper horizontal plane P_HU located above the three-dimensional region image data Sg. The lower horizontal plane image H2_L, on the other hand, is generated by projecting the three-dimensional region image data Sg onto a lower horizontal plane P_HL located below the three-dimensional region image data Sg.
More specifically, the second image generating unit 23 generates the upper horizontal plane image H2_U as follows. Referring to fig. 10, the second image generating unit 23 projects onto the upper horizontal plane P_HU the color that is visible nearest the viewer (the upper side) when the three-dimensional region image data Sg are viewed from above, thereby generating the upper horizontal plane image H2_U. In the present embodiment, an appropriate transparency is set for the regions S_M and S_L shown in green and blue, so that a green region S_M whose near side is covered by blue is still projected onto the upper horizontal plane P_HU, and a red region S_H whose near side is covered by at least one of blue and green is also projected onto the upper horizontal plane P_HU. A red region S_H whose near side is covered by at least one of blue and green is displayed in a color different from red (for example, a loess color). In this way, the upper horizontal plane image H2_U shown in fig. 7 is generated.
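The projection rule just described, taking the color visible nearest the surface when looking down but still letting a high-intensity (red) echo hidden under blue or green show through in a different color, can be sketched over a voxel grid as below. The grid layout, the integer class codes, and the single "covered-high" code standing in for the loess color are assumptions made only for this example.

```python
import numpy as np

# Echo classes in a voxel grid: 0 = none, 1 = low (blue), 2 = medium (green), 3 = high (red)
EMPTY, LOW, MID, HIGH = 0, 1, 2, 3
COVERED_HIGH = 4   # stands in for the distinct color given to a hidden high-intensity echo

def project_top_view(volume):
    """Project a (nz, ny, nx) voxel grid of echo classes onto an upper horizontal plane.

    Depth index 0 is the shallowest layer. For each column, the color seen first
    from above wins, except that a high-intensity echo hidden under blue/green is
    shown with the COVERED_HIGH code, mimicking the translucency described above.
    """
    nz, ny, nx = volume.shape
    image = np.zeros((ny, nx), dtype=int)
    for iy in range(ny):
        for ix in range(nx):
            column = volume[:, iy, ix]
            present = column[column != EMPTY]
            if present.size == 0:
                continue
            top = present[0]                      # color visible nearest the surface
            if top != HIGH and HIGH in present:   # red echo covered by blue/green
                image[iy, ix] = COVERED_HIGH
            else:
                image[iy, ix] = top
    return image

if __name__ == "__main__":
    vol = np.zeros((3, 1, 2), dtype=int)
    vol[0, 0, 0] = MID    # green layer above...
    vol[2, 0, 0] = HIGH   # ...a deeper red echo -> shown as COVERED_HIGH
    vol[1, 0, 1] = HIGH   # uncovered red echo -> shown as HIGH
    print(project_top_view(vol))   # [[4 3]]
```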
The second image generating unit 23 generates the lower horizontal plane image H2_L as follows. First, the second image generating unit 23 generates a bottom-view horizontal plane image H2_L', on which the lower horizontal plane image H2_L is based.
Specifically, referring to fig. 10, the second image generating unit 23 projects onto the lower horizontal plane P_HL the color that is visible nearest the viewer (the lower side) when the three-dimensional region image data Sg are viewed from below, thereby generating the bottom-view horizontal plane image H2_L'. In the present embodiment, as in the case described above, an appropriate transparency is set for the regions S_M and S_L shown in green and blue, so that a green region S_M whose near side is covered by blue is projected onto the lower horizontal plane P_HL, and a red region S_H whose near side is covered by at least one of blue and green is also projected onto the lower horizontal plane P_HL. A red region S_H whose near side is covered by at least one of blue and green is displayed in a color different from red (for example, a loess color). In this way, the bottom-view horizontal plane image H2_L' shown in fig. 8 is generated.
The second image generating unit 23 then generates the lower horizontal plane image H2_L from the bottom-view horizontal plane image H2_L'. Specifically, referring to fig. 8, the second image generating unit 23 mirrors the bottom-view horizontal plane image H2_L' about the fore-aft axis L, which passes through the position corresponding to the host ship S in the bottom-view horizontal plane image H2_L' and extends in the fore-aft direction. The lower horizontal plane image H2_L is thereby generated.
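For an image array whose rows run fore-aft and whose columns run athwartships with the own-ship position on the center column, the mirroring about the fore-aft axis reduces to a left-right flip. A minimal sketch, with that array layout being an assumption of this example:

```python
import numpy as np

def lower_horizontal_plane_image(bottom_view_image):
    """Mirror a bottom-view horizontal plane image (H2_L') about the fore-aft axis
    to obtain the lower horizontal plane image (H2_L).

    Assumes rows run fore-aft and columns run port-to-starboard, so the mirror
    is a simple horizontal flip (an illustrative assumption, not the patent's code).
    """
    return np.fliplr(bottom_view_image)

if __name__ == "__main__":
    h2l_prime = np.array([[0, 0, 7],   # echo appearing on the right of the bottom view
                          [0, 0, 0]])
    print(lower_horizontal_plane_image(h2l_prime))  # the echo moves to the left column
```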
The three-dimensional area images generated by the second image generating unit 23 are displayed on the second display device 6. In the present embodiment, the second display device 6 displays the upper horizontal plane image H2_U and the lower horizontal plane image H2_L.
[ action of the Water detection System ]
Fig. 11 is a flowchart for explaining the operation of the water detection system 1. Next, with reference to fig. 11, the operation of the scanning sonar 10 and the processing device 5 included in the water detection system 1 will be described.
First, when the water detection system 1 is started, the scanning sonar 10 starts its normal operation (step S1). The normal operation of the scanning sonar 10 is the following series of operations. First, the first transmission wave having the narrow vertical beam width θ_1 is transmitted from the transceiver 2, and its reflected wave is received by the transceiver 2. The reflected wave received by the transceiver 2 is processed by the components of the receiver 8, and the horizontal mode image H1 and the vertical mode image V1 are generated. As shown in fig. 5, the first display device 4 displays the horizontal mode image H1 and the vertical mode image V1. The transmission wave transmitted to generate the vertical mode image V1 is a transmission wave having a relatively wide beam width in the vertical direction.
During the normal operation described above, as long as the three-dimensional area image generation instruction from the user is not received (No in step S3), the scanning sonar 10 continues the normal operation. When the three-dimensional area image generation instruction from the user is received (Yes in step S3), the water detection system 1 operates as in steps S4 to S6 below. Specifically, when the three-dimensional area image generation instruction is given in step S2, the second control unit 20 causes the transmission circuit unit 7a to generate the second drive signal. Then, in step S4, the transceiver 2 transmits the second transmission wave based on the second drive signal. After the reflected wave of the second transmission wave is received by the transceiver 2, it is processed by the analog unit 11, the analog-to-digital conversion unit 12, and the quadrature detection unit 13, and second complex signals are generated (step S5). These second complex signals are collected each time they are generated, or after being temporarily stored in a storage unit (not shown), and are transferred to the second beam forming unit 21 of the processing device 5 (step S6).
In step S6, at least a part of the period during which the second complex signals are transferred to the processing device 5 overlaps with at least a part of the period during which the first image is generated in the normal operation. In the present embodiment, the transfer processing of the second complex signals and the normal operation are performed in parallel. By performing the normal operation of the scanning sonar 10 while the transfer processing of the second complex signals is performed, the data for generating the three-dimensional area image can be transferred without substantially reducing the update rate of the images generated during the normal operation (specifically, the horizontal mode image H1 and the vertical mode image V1).
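The parallel operation can be illustrated with two concurrent tasks: one that keeps updating the two-dimensional images and one that transfers the stored second complex signals. The thread-based structure, timings, and payload below are purely illustrative assumptions and are not how the device's firmware is organized.

```python
import queue
import threading
import time

def normal_operation(stop_event, frame_log):
    """Keep updating the two-dimensional images (H1/V1) at the usual rate."""
    while not stop_event.is_set():
        frame_log.append("H1/V1 update")
        time.sleep(0.01)

def transfer_second_complex_signals(payload, out_queue):
    """Hand the stored second complex signals over to the processing device."""
    out_queue.put(payload)

if __name__ == "__main__":
    stop_event, frame_log, received = threading.Event(), [], queue.Queue()
    updater = threading.Thread(target=normal_operation, args=(stop_event, frame_log))
    updater.start()
    # The transfer runs while the updater keeps refreshing the two-dimensional images.
    transfer_second_complex_signals({"beam_data": "..."}, received)
    time.sleep(0.05)
    stop_event.set()
    updater.join()
    print(len(frame_log) > 0, received.get()["beam_data"])
```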
In step S7, the processing device 5 that has received the second complex signals processes them. Specifically, in step S7 the second beam forming unit 21 generates second beam signals from the second complex signals, and the filter unit 22 generates the three-dimensional region image data Sg by filtering the second beam signals. Based on the three-dimensional region image data Sg, the upper horizontal plane image H2_U and the lower horizontal plane image H2_L are generated. The upper horizontal plane image H2_U and the lower horizontal plane image H2_L thus generated are displayed on the second display device 6 (step S8). Thereafter, as long as an instruction from the user to stop generating the three-dimensional area image is not received (No in step S9), generation of the three-dimensional area image continues; when such an instruction is received (Yes in step S9), generation of the three-dimensional area image is stopped. At least a part of the period during which the second image (the upper horizontal plane image H2_U and the lower horizontal plane image H2_L) is generated overlaps with at least a part of the period during which the first image is generated in the normal operation. The images can thus be generated in parallel, which prevents the update rate of each image from decreasing.
As noted above, a conventional water detection device (for example, a sonar) has been disclosed in which an ultrasonic beam is transmitted into the water to scan a three-dimensional region, and underwater information (for example, a fish school) in the scanned region is displayed as a three-dimensional image based on the received echoes. In the sonar disclosed in Patent Document 1, a nondirectional transmission beam is formed for a predetermined three-dimensional region. On reception, on the other hand, a sharply directional beam with a narrow beam width is formed as the reception beam, and this reception beam is scanned over the three-dimensional region.
In a general scanning sonar, a transmission wave having a narrow beam width in the vertical direction is transmitted once in all directions around the ship, and then a reception beam having the same narrow vertical beam width is rotated around the ship. This makes it possible to detect a fish school present near a desired tilt-angle direction with respect to the ship on which the sonar is mounted.
When the general scanning sonar described above is used to detect an area having three-dimensional expansion, one approach is to scan the three-dimensional area by gradually changing the tilt angle of a beam with a narrow vertical beam width. In this case, however, the computation load becomes extremely large compared with a normal scanning sonar; as a result, the device becomes larger and the manufacturing cost increases. Alternatively, the three-dimensional region could be scanned in this way without increasing the size of the apparatus, but the image update period then becomes long and it is difficult to obtain real-time information.
[ Effect ]
In this respect, in the water detection system 1 according to the present embodiment, the horizontal mode image H1 and the vertical mode image V1 can be generated at the same update rate as a conventional scanning sonar, based on the first reception signal obtained from the reflected wave of the first transmission wave, whose vertical beam width θ_1 is relatively narrow. Furthermore, in the water detection system 1, images H2_U and H2_L of the targets contained in the three-dimensional region Z2 can also be obtained, based on the second reception signal obtained from the reflected wave of the second transmission wave, whose beam width θ_2 is wider than the beam width θ_1 of the first transmission wave. In this way, the horizontal mode image H1 and the vertical mode image V1 can be obtained at the same update rate as before, while the distribution of the targets contained in the three-dimensional region Z2 can also be grasped.
Thus, according to the water detection system 1, it is possible to provide a water detection system which suppresses the cost of the entire system and is highly convenient for the user.
Further, according to the water detection system 1, a two-dimensional area image (in the present embodiment, the horizontal mode image H1 and the vertical mode image V1) is generated based on the first reception signal obtained from the reflected wave of the first transmission wave, while a three-dimensional area image (in the present embodiment, the upper horizontal plane image H2_U and the lower horizontal plane image H2_L) is generated based on the second reception signal obtained from the reflected wave of the second transmission wave, which has a beam width different from that of the first transmission wave.
In an underwater detection system, the area in which targets can be detected depends on the beam width of the transmission wave, so when target detection is performed using a transmission wave with a wide beam width, as in the water detection system 1, targets contained in a space with three-dimensional expansion can be detected, and the targets detected in that three-dimensional space can be projected onto the three-dimensional area image. Therefore, according to the water detection system 1, echo images with different characteristics can be generated based on the reception signals corresponding to the transmission waves with different beam widths. Specifically, with the two-dimensional area image obtained via the first transmission wave having the narrow beam width θ_1, targets can be detected in a two-dimensional area with little spatial expansion (in the present embodiment, the two-dimensional region Z1), so the position of a target can be grasped accurately. With the three-dimensional area image obtained via the second transmission wave having the wide beam width θ_2, targets can be detected in a three-dimensional area with spatial expansion (in the present embodiment, the three-dimensional region Z2), so targets can be detected over a wide space. That is, according to the water detection system 1, the user can, for example, roughly grasp the position of a desired fish school by checking the three-dimensional area image and then grasp its more precise position by checking the two-dimensional area image, so a highly convenient water detection system can be provided.
Further, according to the water detection system 1, the beam width of the first transmission wave is set to be less than 20 degrees (specifically, about 8 degrees), and on the other hand, the beam width of the second transmission wave is set to be 20 degrees or more (specifically, about 30 degrees). Thus, a two-dimensional region detected by the first transmission wave and a three-dimensional region detected by the second transmission wave can be set.
Further, according to the water detection system 1, the processing device 5 constituted by a PC or the like is connected to the conventionally known scanning sonar 10, so that an image (three-dimensional area image) can be easily generated based on a signal obtained from a three-dimensional area. That is, according to the water detection system 1, it is possible to provide a water detection system capable of generating a three-dimensional area image at low cost without involving a large change in equipment.
In the water detection system 1, the scanning sonar 10 performs a normal operation while the scanning sonar 10 transfers the received signal to the processing device 5. This makes it possible to perform a transfer process of data for generating a three-dimensional area image without substantially reducing the update rate of the two-dimensional area image generated during normal operation.
In the water detection system 1, a two-dimensional area image based on the echo signal obtained from the two-dimensional area Z1 is displayed on the first display device 4, and a three-dimensional area image is displayed on the second display device 6. This allows the user to visually recognize the images obtained from the respective regions Z1 and Z2.
In the water detection system 1, the processing device 5 is constituted by a PC (personal computer). This allows the processing device 5 to be relatively inexpensive to construct.
In the water detection system 1, the frequency of the second transmission wave being transmitted is smaller than the frequency of the first transmission wave being transmitted. Thus, the three-dimensional region image can be generated without greatly reducing the update rate of the scanning sonar 10.
In the water detection system 1, the first image is an image obtained based on the echoes of the targets contained in the two-dimensional region Z1 into which the first transmission wave is transmitted, and the second image is an image obtained based on the echoes of the targets contained in the three-dimensional region Z2 into which the second transmission wave is transmitted. Thus, in the water detection system 1, both the first image, based on the echoes obtained from the two-dimensional region Z1 with its small volume, and the second image, based on the echoes obtained from the three-dimensional region Z2, can be obtained, so a highly convenient water detection system can be provided.
In addition, the display device of a conventional water detection system shows a horizontal plane image in which the image signal generated based on the reception signal obtained from the detection area is viewed from above. In this case, however, if fish schools overlap in the vertical direction, the lower fish school may be missed.
In this respect, in the water detection system 1, even when fish schools overlap in the vertical direction, the position of the upper fish school can be grasped from the upper horizontal plane image H2_U and the position of the lower fish school can be grasped from the lower horizontal plane image H2_L. Specifically, referring to figs. 7 and 9, the echo image A in fig. 7 and the echo image C in fig. 9 are both echoes from fish schools. Since the present underwater detection system 1 displays both the upper horizontal plane image H2_U and the lower horizontal plane image H2_L on the display device, both of the vertically overlapping fish schools can be grasped. That is, according to the water detection system 1, missed detection of a fish school can be prevented. The echo image B in fig. 7 is an echo image showing the track of the ship.
In a bottom-view horizontal plane image H2_L' that simply views the three-dimensional region image data Sg from below, a fish school on the starboard side of the ship appears on the left side of the screen, and a fish school on the port side of the ship appears on the right side of the screen. It is therefore difficult for the user to intuitively grasp the position of the fish school from such an image. Specifically, the echo image C in the bottom-view horizontal plane image H2_L' of fig. 8 is actually located on the port side of the ship, but appears on the right side in H2_L'.
In this respect, in the water detection system 1, the bottom-view horizontal plane image H2_L' described above is mirrored to generate the lower horizontal plane image H2_L. As a result, a fish school on the starboard side of the ship is displayed on the right side of the screen and a fish school on the port side of the ship is displayed on the left side of the screen, so the user can easily and intuitively grasp the position of the fish school.
Modification example
The embodiments of the present invention have been described above, but the present invention is not limited thereto, and various modifications may be made without departing from the gist of the present invention.
(1) Fig. 12 is a view showing an example of a three-dimensional area image displayed on the second display device of the water detection system according to a modification. In the above embodiment, both the upper horizontal plane image H2_U and the lower horizontal plane image H2_L are displayed on the second display device 6 as three-dimensional area images, but the present invention is not limited to this; only one of the upper horizontal plane image H2_U and the lower horizontal plane image H2_L may be displayed. Alternatively, an oblique-view image H2_T as shown in fig. 12 may be displayed as the three-dimensional area image. The oblique-view image H2_T is obtained by projecting the three-dimensional region image data Sg onto an inclined plane intersecting both the vertical plane and the horizontal plane; in other words, the oblique-view image H2_T views the three-dimensional region image data Sg from an oblique direction. Displaying the oblique-view image H2_T on the second display device 6 therefore makes it easy to grasp the position of a fish school intuitively. The oblique-view image H2_T may view the three-dimensional region image data Sg either from obliquely above or from obliquely below.
(2) Fig. 13 is a block diagram showing the structure of a water detection system 1a according to a modification. In the above-described embodiment, an example has been described in which the processing device 5, constituted by a PC or the like separate from the conventionally known scanning sonar 10, is connected to the scanning sonar 10 to constitute the water detection system 1, but the present invention is not limited thereto. Specifically, as shown in fig. 13, the components of the processing device 5 described in the above embodiment may be incorporated into the scanning sonar 10a. The transmitting/receiving device 3b shown in fig. 13 has a structure in which the second control unit 20 is incorporated in the transmitter 7c and the second image generation unit 23 is incorporated in the receiver 8a. Although not shown in fig. 13, the second beam forming unit 21 and the filter unit 22 of the above embodiment are also incorporated in the receiver 8a. Further, in the water detection system 1a, the two-dimensional area image and the three-dimensional area image are displayed on the first display device 4.
As described above, according to the water detection system 1a of the present modification, the processing device 5 composed of a PC or the like, which is required in the water detection system 1 of the above embodiment, becomes unnecessary, and the components constituting the processing device 5 can be incorporated in the scanning sonar 10a. The water detection system can thus be miniaturized.
(3) Fig. 14 is a view showing an example of two three-dimensional area images displayed on the second display device of a water detection system according to a modification; fig. 14 (A) shows an upper horizontal plane image H2_U, and fig. 14 (B) shows a vertical plane image V2.
In the second display device of the present modification, the upper horizontal plane image H2 is displayed U Rear view vertical plane image V2 B . Rear view vertical plane image V2 B The three-dimensional region image data Sg is projected onto a vertical plane that is located rearward and extends in the vertical-horizontal direction compared to the three-dimensional region image data Sg.
In the water detection system according to the present modification, referring to fig. 14 (A), when the user uses an operation device such as a mouse to select any one point (for example, point P1 in fig. 14 (A)) in the upper horizontal plane image H2_U displayed on the second display device, a cursor is displayed in each of the two images H2_U and V2_B. Specifically, in the upper horizontal plane image H2_U, an upper-horizontal-plane cross cursor CS1 (first mark) passing through the point P1 selected by the user is displayed, and in the rear-view vertical plane image V2_B, a vertical cursor CS_BH (second mark) extending in the up-down direction of the screen through the position corresponding to the point P1 is displayed. The upper-horizontal-plane cross cursor CS1 is composed of a vertical bar Bv extending in the up-down direction of the screen and a horizontal bar Bh extending in the left-right direction.
As described above, according to the present modification, the upper horizontal plane image H2_U and the vertical plane image V2 are generated based on the three-dimensional region image data Sg, so that by visually recognizing them in association with each other, the position of a desired fish school in the water can be grasped.
Further, according to the present modification, the cross cursor CS1 passing through the point P1 selected by the user is displayed in the upper horizontal plane image H2_U, and the vertical cursor CS_BH passing through the position corresponding to the point P1 is displayed in the rear-view vertical plane image V2_B. The position of the desired fish school can therefore be grasped accurately on the basis of these cursors.
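The cursor linkage described in this modification can be sketched as follows, purely for illustration; the pixel conventions, image extents and the function name cursors_for_selection are assumptions, since the specification does not define how screen coordinates are mapped.

    def cursors_for_selection(px, py, h_extent, v_extent, h_size, v_width):
        """Map a point selected in the upper horizontal plane image H2_U to cursor
        positions: the cross cursor CS1 at the selected pixel, and the column of
        the rear-view vertical plane image V2_B at which the full-height vertical
        cursor CS_BH is drawn.

        px, py   : selected pixel in H2_U (x to the right, y downwards)
        h_extent : (x_min, x_max, y_min, y_max) metres covered by H2_U,
                   x athwartships (starboard positive), y fore-aft
        v_extent : (x_min, x_max) metres covered horizontally by V2_B
        h_size   : (width, height) of H2_U in pixels
        v_width  : width of V2_B in pixels
        """
        hx0, hx1, hy0, hy1 = h_extent
        vx0, vx1 = v_extent
        hw, hh = h_size

        # pixel -> world coordinates of the selected point
        wx = hx0 + (px / hw) * (hx1 - hx0)      # athwartships position
        wy = hy1 - (py / hh) * (hy1 - hy0)      # fore-aft position (not needed by
                                                #  V2_B, but needed for V2_L)

        # V2_B shares the athwartships axis with H2_U, so only wx carries over
        cs_bh_column = int((wx - vx0) / (vx1 - vx0) * v_width)

        cs1 = (px, py)      # centre of the cross cursor (bars Bv and Bh)
        return cs1, cs_bh_column, (wx, wy)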
(4) Fig. 15 is a view showing an example of the upper horizontal plane image H2_U displayed on the second display device of a water detection system according to a modification. Fig. 16 shows the vertical plane images V2 displayed in correspondence with the upper horizontal plane image H2_U shown in fig. 15; fig. 16 (A) shows a rear-view vertical plane image V2_B, and fig. 16 (B) shows a left-view vertical plane image V2_L.
In the second display device of the present modification, the upper horizontal plane image H2_U, the rear-view vertical plane image V2_B and the left-view vertical plane image V2_L are displayed. The left-view vertical plane image V2_L is an image in which the three-dimensional region image data Sg is projected onto a vertical plane that is located to the left (port side) of the three-dimensional region image data Sg and extends in the up-down and fore-aft directions.
In the water detection system according to the present modification, referring to fig. 15, when the user uses an operation device such as a mouse to select any one point (for example, point P2 in fig. 15) in the upper horizontal plane image H2_U displayed on the second display device, a cursor is displayed in each of the three images H2_U, V2_B and V2_L. Specifically, in the upper horizontal plane image H2_U, an upper-horizontal-plane cross cursor CS1 (first mark) passing through the point P2 selected by the user is displayed. In addition, in each of the rear-view vertical plane image V2_B and the left-view vertical plane image V2_L, a vertical cursor CS_BH or CS_LH (second mark) extending in the up-down direction of the screen through the position corresponding to the point P2 is displayed.
As described above, according to the present modification, the upper horizontal plane image H2 is generated based on the three-dimensional region image data Sg U Rear view vertical plane image V2 B Left-view vertical plane image V2 L Therefore, by associating them with each other and visually recognizing them, the position of a desired fish in water can be accurately grasped.
Further, according to the present modification, the cross cursor CS1 passing through the point P2 selected by the user is displayed in the upper horizontal plane image H2_U, and the vertical cursors CS_BH and CS_LH passing through the positions corresponding to the point P2 are displayed in the rear-view vertical plane image V2_B and the left-view vertical plane image V2_L. The position of the desired fish school can therefore be grasped more accurately on the basis of these cursors.
(5) Fig. 17 is a view showing an example of the rear-view vertical plane image V2_B displayed on the second display device of a water detection system according to a modification. Fig. 18 is the upper horizontal plane image H2_U displayed in correspondence with the rear-view vertical plane image V2_B shown in fig. 17. Fig. 19 is the lower horizontal plane image H2_L displayed in correspondence with the rear-view vertical plane image V2_B shown in fig. 17.
In the present modification, when the user inputs a desired depth range in which to detect fish schools, a depth range scale R indicating the depth range is displayed on the rear-view vertical plane image V2_B, and in the upper horizontal plane image H2_U and the lower horizontal plane image H2_L, only the echo images included in the depth range are displayed. In this way, fish schools existing outside the depth range that the user wants to detect, and unnecessary echo images (for example, the echo image B in fig. 7 caused by the wake), can be removed from the display screen, so the echo image of the desired fish school can be reliably displayed while echo images that need not be displayed are removed from the screen.
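A minimal sketch of the depth-range filtering, again assuming the hypothetical (x, y, depth, echo level) sample representation: samples outside the range entered by the user are simply discarded before the horizontal plane images are rebuilt.

    import numpy as np

    def filter_by_depth(samples, depth_min, depth_max):
        """Keep only echo samples whose depth lies inside the range the user
        entered; the horizontal plane images are then rebuilt from the result,
        so schools (or wake echoes) outside the range disappear from them.

        samples : (N, 4) array of x, y, depth z, echo level
        """
        z = samples[:, 2]
        return samples[(z >= depth_min) & (z <= depth_max)]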
(6) Fig. 20 is a view showing an example of the upper horizontal plane image H2_U displayed on the second display device of a water detection system according to a modification. Fig. 21 shows the vertical plane images V2 displayed in correspondence with the upper horizontal plane image H2_U shown in fig. 20; fig. 21 (A) shows a rear-view vertical plane image V2_B, and fig. 21 (B) shows a left-view vertical plane image V2_L.
In the present modification, when the user inputs a desired azimuth range in which to detect fish schools, straight lines L1 and L2 indicating the azimuth range are displayed in the upper horizontal plane image H2_U, and in the rear-view vertical plane image V2_B and the left-view vertical plane image V2_L, only the echo images included in the azimuth range are displayed. In this way, fish schools existing outside the azimuth range that the user wants to detect can be removed from the display screen, so the echo image of the desired fish school can be reliably displayed while echo images that need not be displayed are removed from the screen.
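A corresponding sketch for the azimuth-range filtering, under the same assumed sample representation with the ship at the origin; a distance-range filter of the kind described in the next modification can be written the same way using the horizontal distance np.hypot(x, y).

    import numpy as np

    def filter_by_azimuth(samples, azim_min_deg, azim_max_deg):
        """Keep only echo samples whose horizontal bearing from the ship lies
        inside the azimuth range the user entered (the range marked by the
        straight lines L1 and L2); the vertical plane images are then rebuilt
        from the result.

        samples : (N, 4) array of x (starboard), y (forward), depth z, echo level,
                  with the ship at the origin (an assumed convention)
        """
        azim = np.degrees(np.arctan2(samples[:, 0], samples[:, 1])) % 360.0
        lo, hi = azim_min_deg % 360.0, azim_max_deg % 360.0
        if lo <= hi:
            keep = (azim >= lo) & (azim <= hi)
        else:                                   # range crosses the bow (0 degrees)
            keep = (azim >= lo) | (azim <= hi)
        return samples[keep]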
(7) In the above-described embodiment, an example has been described in which the echo images included in a predetermined distance range referenced to the ship are displayed, but the present invention is not limited thereto. For example, the water detection system may be configured such that, when the user inputs a desired distance range in which to detect fish schools, only the echo images included in that distance range are displayed in each three-dimensional area image.
(8) In the above-described embodiment and modifications, the coloring in each image is displayed in association with the echo level, but the present invention is not limited thereto. Specifically, although not shown, the coloring in each image may be associated with depth. In this way, by displaying, for example, a horizontal plane image or an oblique-view image, the depth of each fish school can be grasped easily.
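One possible way of colouring by depth instead of by echo level is shown below, only as an illustration; the colour table, its resolution and the function name are assumptions.

    import numpy as np

    def color_by_depth(depth, depth_max, colormap):
        """Look each pixel's colour up from the depth of its echo rather than its
        echo level, so the depth of each school can be read directly off a
        horizontal or oblique-view image.

        depth     : array of echo depths in metres
        depth_max : depth mapped to the last colour of the table
        colormap  : (K, 3) array of RGB rows, shallow colours first
        """
        idx = np.clip(depth / depth_max * (len(colormap) - 1),
                      0, len(colormap) - 1).astype(int)
        return colormap[idx]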
(9) In the above-described embodiment, an example has been described in which signal processing is performed on all received signals obtained up to a predetermined distance range referenced to the ship, but the present invention is not limited thereto. Specifically, for example, the distance range, azimuth range, depth range or the like in which the user wants to detect fish schools may be input in advance, and only the echo signals included in these ranges may be subjected to signal processing. Signal processing of echo signals obtained from ranges the user does not need can thus be omitted, reducing the computational load on the transmitting/receiving device.
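The reduction of computational load can be illustrated, under stated assumptions, by gating the received samples before any beamforming or image generation; the sampling and sound-speed parameters below are hypothetical, and the only relation used is the two-way travel time.

    def gate_received_samples(rx, sample_rate, sound_speed, range_min, range_max):
        """Discard receive samples that can only have come from outside the
        distance range the user wants to observe, before any beamforming or
        image generation, so later processing touches only samples that can
        contribute to the displayed images.

        rx            : (channels, samples) array of received signals
        sample_rate   : receive sampling rate in Hz
        sound_speed   : assumed speed of sound in water, m/s
        range_min/max : desired detection range in metres
        """
        # two-way travel: sample index n corresponds to range c * n / (2 * fs)
        n_min = int(2 * range_min * sample_rate / sound_speed)
        n_max = int(2 * range_max * sample_rate / sound_speed)
        return rx[:, n_min:n_max + 1]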
(10) Fig. 22 is a view showing an example of the upper horizontal plane image H2_U displayed on the second display device of a water detection system according to a modification. Fig. 23 shows the vertical plane images V2 displayed in correspondence with the upper horizontal plane image H2_U shown in fig. 22; fig. 23 (A) shows a rear-view vertical plane image V2_B, and fig. 23 (B) shows a left-view vertical plane image V2_L. Fig. 24 is a view showing an example of the lower horizontal plane image H2_L displayed on the second display device of the water detection system according to the present modification. Fig. 25 shows the vertical plane images V2 displayed in correspondence with the lower horizontal plane image H2_L shown in fig. 24; fig. 25 (A) shows a rear-view vertical plane image V2_B, and fig. 25 (B) shows a left-view vertical plane image V2_L.
In this modification, when the user clicks, with a mouse pointer or the like, the echo image of a fish school whose position is to be identified in any one of the three-dimensional area images, the corresponding position is displayed in the other three-dimensional area images. For example, referring to fig. 22 and fig. 23, when the user clicks the point P3 in the upper horizontal plane image H2_U shown in fig. 22, an upper-horizontal-plane cross cursor CS1 (first mark) centered on the point P3 is displayed in the upper horizontal plane image H2_U. The upper-horizontal-plane cross cursor CS1 is composed of a vertical bar Bv extending in the up-down direction of the screen and a horizontal bar Bh extending in the left-right direction. At this time, in the rear-view vertical plane image V2_B shown in fig. 23 (A), a rear-view vertical-plane cross cursor CS2 (second mark) centered on the coordinate position corresponding to the point P3 is displayed, and in the left-view vertical plane image V2_L shown in fig. 23 (B), a left-view vertical-plane cross cursor CS3 (second mark) centered on the coordinate position corresponding to the point P3 is displayed. This makes it easier to grasp the correspondence between the fish schools displayed in the respective three-dimensional area images. As the depth position of the point P3, the uppermost depth position in the echo intensity range that includes the selected point P3 (in the example shown in fig. 22, the echo intensity range indicated by cross-hatching) is selected.
Similarly, referring to fig. 24 and fig. 25, when the user clicks the point P4 in the lower horizontal plane image H2_L shown in fig. 24, a lower-horizontal-plane cross cursor CS4 (first mark) centered on the point P4 is displayed in the lower horizontal plane image H2_L. At this time, in the rear-view vertical plane image V2_B shown in fig. 25 (A), a rear-view vertical-plane cross cursor CS2 (second mark) centered on the coordinate position corresponding to the point P4 is displayed, and in the left-view vertical plane image V2_L shown in fig. 25 (B), a left-view vertical-plane cross cursor CS3 (second mark) centered on the coordinate position corresponding to the point P4 is displayed. Further, as the depth position of the point P4, the lowermost depth position in the echo intensity range that includes the selected point P4 (in the example shown in fig. 24, the echo intensity range indicated by cross-hatching) is selected.
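One plausible reading of the depth-selection rule described above is sketched below: from the vertical column of echo samples under the clicked cell, the shallowest (for the upper image) or deepest (for the lower image) depth whose echo level falls in the same intensity band as the clicked pixel is chosen. The band representation and the function name pick_depth are assumptions.

    import numpy as np

    def pick_depth(profile_z, profile_level, clicked_level_band, from_top=True):
        """Decide which depth a click in a horizontal plane image refers to.

        profile_z, profile_level : depth and echo level of the samples in the
                                   vertical column under the clicked cell,
                                   ordered shallow to deep (NumPy arrays)
        clicked_level_band       : (lo, hi) echo-level band of the clicked pixel
        from_top                 : True for a click in the upper image (use the
                                   shallowest matching depth), False for the
                                   lower image (use the deepest)
        """
        lo, hi = clicked_level_band
        in_band = (profile_level >= lo) & (profile_level <= hi)
        if not in_band.any():
            return None
        matches = profile_z[in_band]
        return matches.min() if from_top else matches.max()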
Here, the example has been described in which, when a position is selected in the upper horizontal plane image H2_U or the lower horizontal plane image H2_L, the cross cursors CS2 and CS3 are displayed at the corresponding positions in the rear-view vertical plane image V2_B and the left-view vertical plane image V2_L, but the present invention is not limited thereto. Specifically, when a position is selected in any one of the plurality of three-dimensional area images, a cursor may be displayed at the position corresponding to the selected position in each of the other three-dimensional area images.
Here, the first mark and the second mark have been described as cross cursors by way of example, but the present invention is not limited thereto. For example, the first mark and the second mark may be a circle (○) mark, an × mark, or the like.
(11) Fig. 26 is a flowchart showing another example of the operation of the water detection system 1 according to the above embodiment. In the above-described embodiment, an example has been described with reference to fig. 11 in which the generation of the three-dimensional area image is continued until an instruction from the user to stop generating the three-dimensional area image is received, but the present invention is not limited thereto. Specifically, as shown in the flowchart of fig. 26, after the three-dimensional area image has been generated once, it may be left un-updated until the next instruction from the user to generate a three-dimensional area image is received. Alternatively, the user may be allowed to specify the update period of the three-dimensional area image.
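The update behaviour shown in fig. 26 can be illustrated by a simple control loop; this is only a sketch, and the callables generate_3d_image and get_user_command stand in for the actual image generation and operation input of the system, which are not defined in this form in the specification.

    import time

    def run_display_loop(generate_3d_image, get_user_command, period=None):
        """Generate the three-dimensional area image once, then regenerate it
        only when the user asks again or, if the user specified an update
        period, whenever that period has elapsed.

        generate_3d_image : callable that builds and shows the 3-D area image
        get_user_command  : callable returning 'update', 'stop' or None
        period            : optional update period in seconds
        """
        generate_3d_image()
        last = time.monotonic()
        while True:
            cmd = get_user_command()
            if cmd == 'stop':
                break
            due = period is not None and time.monotonic() - last >= period
            if cmd == 'update' or due:
                generate_3d_image()
                last = time.monotonic()
            time.sleep(0.1)   # avoid a busy loop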
(12) The water detection system 1 according to the above embodiment may be provided with a sensor capable of detecting the roll angle and pitch angle of the ship, and the processing device 5 may display the three-dimensional area image on coordinates obtained by performing a coordinate conversion corresponding to the roll angle and pitch angle detected by the sensor. The spatial distribution of fish schools can thus be grasped accurately without being affected by the rolling and pitching of the hull.
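A minimal sketch of the coordinate conversion, assuming a hull-fixed frame with x to starboard, y forward and z downwards (a convention not stated in this specification); the rotation order and sign conventions are likewise assumptions.

    import numpy as np

    def correct_attitude(xyz, roll_deg, pitch_deg):
        """Rotate echo sample coordinates by the roll and pitch measured by the
        attitude sensor so that the three-dimensional area image is drawn in a
        level (earth-referenced) frame instead of the rolling hull frame.

        xyz : (N, 3) array of sample positions in the hull frame
        """
        r, p = np.radians(roll_deg), np.radians(pitch_deg)
        rot_roll = np.array([[ np.cos(r), 0, np.sin(r)],   # about the fore-aft (y) axis
                             [ 0,         1, 0        ],
                             [-np.sin(r), 0, np.cos(r)]])
        rot_pitch = np.array([[1, 0,         0         ],  # about the athwartships (x) axis
                              [0, np.cos(p), -np.sin(p)],
                              [0, np.sin(p),  np.cos(p)]])
        return xyz @ (rot_pitch @ rot_roll).T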
(13) In the water detection system 1 according to the above-described embodiment, a transmission wave having a relatively wide beam width in the vertical direction is used as the transmission wave for generating the vertical mode image V1, and this transmission wave is transmitted in order to generate the vertical mode image V1. However, even if this transmission wave is not transmitted, the second transmission wave transmitted to generate the three-dimensional area image may be used instead as the transmission wave for generating the vertical mode image V1. Since the second transmission wave also has a relatively wide beam width in the vertical direction, it can serve as the transmission wave for generating the vertical mode image V1. By diverting the second transmission wave for generating the three-dimensional area image to the generation of the vertical mode image in this way, it becomes unnecessary to transmit a transmission wave solely for generating the vertical mode image V1. This suppresses a decrease in the image update rate during normal operation of the scanning sonar while still allowing the three-dimensional area image to be displayed.
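The reuse of the second transmission wave can be illustrated, purely as a sketch, by beamforming the same receive data twice: once over the full set of beams for the three-dimensional area image and once over a single vertical fan for the vertical mode image V1. The narrowband weight-matrix form below and all names are assumptions, not the beamforming actually implemented in the described units.

    import numpy as np

    def vertical_fan_from_second_wave(rx, steer_volume, steer_fan):
        """Beamform one set of receive data from the second transmission wave
        (wide vertical beam) in two ways: with steering weights covering the
        whole volume (for the three-dimensional area image) and with steering
        weights confined to one vertical plane (reused for the vertical mode
        image), so no extra transmission is needed.

        rx           : (channels, samples) complex receive data
        steer_volume : (n_beams_3d, channels) steering weights for the volume
        steer_fan    : (n_beams_v1, channels) steering weights on one vertical plane
        """
        beams_3d = steer_volume @ rx   # used for the three-dimensional area image
        beams_v1 = steer_fan @ rx      # reused for the vertical mode image
        return beams_3d, beams_v1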
Industrial applicability
The present invention can be widely applied to underwater detection systems for detecting underwater targets.

Claims (18)

1. An underwater detection system, comprising:
a transmission/reception device having a plurality of transmission/reception elements;
a transmission circuit unit that drives the plurality of transmission/reception elements so as to cause the transmission/reception device to transmit, all at once, a transmission wave having a beam width in a vertical direction, the transmission circuit unit driving the plurality of transmission/reception elements so as to transmit a first transmission wave and a second transmission wave, a beam width of the second transmission wave in the vertical direction being wider than that of the first transmission wave;
a reception circuit unit that generates a first reception signal based on a reflected wave of the first transmission wave and generates a second reception signal based on a reflected wave of the second transmission wave;
a first control unit that causes the transmission circuit unit to generate a first drive signal that is a basis of the first transmission wave;
a first image generating unit that generates a first image based on the first reception signal output from the reception circuit unit;
a second control unit that causes the transmission circuit unit to generate a second drive signal that is a basis of the second transmission wave; and
a second image generating unit configured to generate a second image based on the second reception signal output from the reception circuit unit,
the first image is a two-dimensional area image generated based on the first reception signal obtained from an area where the first transmission wave is transmitted,
the second image is a three-dimensional area image generated based on the second reception signal obtained from a three-dimensional area that is an area where the second transmission wave is transmitted,
the three-dimensional region image is at least one of a vertical plane image in which three-dimensional region image data, which is generated based on the second reception signal and has three-dimensional position information and echo intensity information obtained from each position included in the three-dimensional region, is projected onto a vertical plane, and a horizontal plane image in which the three-dimensional region image data is projected onto a horizontal plane,
the three-dimensional region image includes an upper level image and a lower level image,
the upper level image is the level image obtained by projecting the three-dimensional region image data onto an upper level above the three-dimensional position of the three-dimensional region image data,
the lower level image is the level image obtained by projecting the three-dimensional region image data onto a lower level below the three-dimensional position of the three-dimensional region image data.
2. The water detection system of claim 1, wherein,
the beam width in the vertical direction of the first transmission wave is less than 20 degrees,
the beam width of the second transmission wave in the vertical direction is 20 degrees or more.
3. The water detection system of claim 1 or claim 2, further comprising:
a scanning sonar including the transmission circuit unit, the reception circuit unit, the first control unit, and the first image generation unit; and
a processing device having the second control unit and the second image generation unit,
the processing device and the scanning sonar are provided separately from each other.
4. A water detection system as claimed in claim 3, wherein,
at least a portion of the time at which the second reception signal is forwarded to the processing device overlaps with at least a portion of the time at which the first image is generated.
5. A water detection system as claimed in claim 3, wherein,
the scanning sonar also has a first display device that displays the first image,
the water detection system further comprises a second display device for displaying the second image.
6. A water detection system as claimed in claim 3, wherein,
The processing device is a personal computer.
7. An in-water detection system as claimed in claim 1 or claim 2, wherein,
at least a portion of the time at which the first image is generated overlaps with at least a portion of the time at which the second image is generated.
8. An in-water detection system as claimed in claim 1 or claim 2, wherein,
the second transmission wave is transmitted less frequently than the first transmission wave.
9. The water detection system of claim 1, wherein,
the lower horizontal plane image is an image in which the horizontal plane image obtained by projecting the three-dimensional region image data onto the lower horizontal plane is mirrored about a predetermined axis extending in the in-plane direction of the lower horizontal plane.
10. An underwater detection system, comprising:
a transmission/reception device having a plurality of transmission/reception elements;
a transmission circuit unit that drives the plurality of transmission/reception elements so as to cause the transmission/reception device to transmit, all at once, a transmission wave having a beam width in a vertical direction, the transmission circuit unit driving the plurality of transmission/reception elements so as to transmit a first transmission wave and a second transmission wave, a beam width of the second transmission wave in the vertical direction being wider than that of the first transmission wave;
A reception circuit unit that generates a first reception signal based on a reflected wave of the first transmission wave and generates a second reception signal based on a reflected wave of the second transmission wave;
a first control unit that causes the transmission circuit unit to generate a first drive signal that is a basis of the first transmission wave;
a first image generating unit that generates a first image based on the first reception signal output from the reception circuit unit;
a second control unit that causes the transmission circuit unit to generate a second drive signal that is a basis of the second transmission wave; and
a second image generating unit configured to generate a second image based on the second reception signal output from the reception circuit unit,
the first image is a two-dimensional area image generated based on the first reception signal obtained from an area where the first transmission wave is transmitted,
the second image is a three-dimensional area image generated based on the second reception signal obtained from a three-dimensional area that is an area where the second transmission wave is transmitted,
the three-dimensional region image includes at least two of: a vertical plane image in which three-dimensional region image data, which is generated based on the second reception signal and has three-dimensional position information and echo intensity information obtained from each position included in the three-dimensional region, is projected onto a vertical plane; a horizontal plane image in which the three-dimensional region image data is projected onto a horizontal plane; and an oblique-view image in which the three-dimensional region image data is projected onto an oblique plane intersecting both the vertical plane and the horizontal plane.
11. The water detection system of claim 10, wherein,
the beam width in the vertical direction of the first transmission wave is less than 20 degrees,
the beam width of the second transmission wave in the vertical direction is 20 degrees or more.
12. A water detection system as claimed in claim 10 or claim 11, further comprising:
a scanning sonar including the transmission circuit unit, the reception circuit unit, the first control unit, and the first image generation unit; and
a processing device having the second control unit and the second image generation unit,
the processing device and the scanning sonar are provided separately from each other.
13. The water detection system of claim 12, wherein,
at least a portion of the time at which the second reception signal is forwarded to the processing device overlaps with at least a portion of the time at which the first image is generated.
14. The water detection system of claim 12, wherein,
The scanning sonar also has a first display device that displays the first image,
the water detection system further comprises a second display device for displaying the second image.
15. The water detection system of claim 12, wherein,
the processing device is a personal computer.
16. An in-water detection system as claimed in claim 10 or claim 11, wherein,
at least a portion of the time at which the first image is generated overlaps with at least a portion of the time at which the second image is generated.
17. An in-water detection system as claimed in claim 10 or claim 11, wherein,
the second transmission wave is transmitted less frequently than the first transmission wave.
18. The water detection system of claim 10, wherein,
when a predetermined position in any one of the vertical plane image, the horizontal plane image and the oblique-view image is selected by a user, a first mark is displayed at the predetermined position in the selected image, and a second mark is displayed at a position corresponding to the predetermined position in at least one of the vertical plane image, the horizontal plane image and the oblique-view image other than the image in which the first mark is displayed.
CN201710473384.2A 2016-06-23 2017-06-21 Water detection system Active CN107544071B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-124722 2016-06-23
JP2016124722A JP6722521B2 (en) 2016-06-23 2016-06-23 Underwater detection system

Publications (2)

Publication Number Publication Date
CN107544071A CN107544071A (en) 2018-01-05
CN107544071B true CN107544071B (en) 2023-04-28

Family

ID=59358285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710473384.2A Active CN107544071B (en) 2016-06-23 2017-06-21 Water detection system

Country Status (3)

Country Link
JP (1) JP6722521B2 (en)
CN (1) CN107544071B (en)
GB (1) GB2553404B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019167563A1 (en) * 2018-03-02 2019-09-06 古野電気株式会社 Underwater detection device and underwater detection method
JP7051625B2 (en) * 2018-07-12 2022-04-11 古野電気株式会社 Underwater detector and underwater detection method
JP7021025B2 (en) * 2018-07-31 2022-02-16 古野電気株式会社 Echo signal processing device, echo signal processing system, and echo signal processing method
EP3761057A1 (en) * 2019-07-03 2021-01-06 Furuno Electric Co., Ltd. Systems and methods for generating training dataset for discrimination of fish species

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1441386A (en) * 2002-01-29 2003-09-10 夏普公司 Omni-bearing visual system, image processing method, control program and readable recording medium
CN102736081A (en) * 2011-04-14 2012-10-17 古野电气株式会社 A fish finder, an underwater detection device, an underwater detecting method, and a program
CN104181536A (en) * 2013-05-27 2014-12-03 本多电子株式会社 Searchlight type sonar
CN104797196A (en) * 2012-09-26 2015-07-22 日立阿洛卡医疗株式会社 Ultrasound diagnostic device and ultrasound two-dimensional tomographic image generation method
CN105451660A (en) * 2013-07-19 2016-03-30 Axion日本株式会社 Panoramic image capturing device, and image-based diagnostic method used therein
WO2016068715A1 (en) * 2014-10-31 2016-05-06 Fugro N.V. Underwater positioning system

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5267329A (en) * 1990-08-10 1993-11-30 Kaman Aerospace Corporation Process for automatically detecting and locating a target from a plurality of two dimensional images
US5184330A (en) * 1991-06-25 1993-02-02 Techsonic Industries, Inc. Multi-beam sonar fish detection apparatus providing real-time three-dimensional wire-frame display representation
JPH05288855A (en) * 1992-04-10 1993-11-05 Kaijo Corp Scanning sonar and its detection display method
JP2885989B2 (en) * 1992-04-10 1999-04-26 株式会社カイジョー Scanning sonar and its detection and display method
WO1997039367A1 (en) * 1996-04-12 1997-10-23 Amoco Corporation Method and apparatus for seismic signal processing and exploration
JPH11344566A (en) * 1998-03-31 1999-12-14 Japan Radio Co Ltd Fish finder
JP4091454B2 (en) * 2003-02-25 2008-05-28 株式会社カイジョーソニック Scanning sonar
JP4354736B2 (en) * 2003-05-09 2009-10-28 古野電気株式会社 Ultrasonic transceiver
JP4564286B2 (en) * 2004-06-14 2010-10-20 株式会社東芝 3D ultrasonic imaging device
JP2006322773A (en) * 2005-05-18 2006-11-30 Furuno Electric Co Ltd Scanning sonar
CN101493524A (en) * 2009-02-19 2009-07-29 浙江海洋学院 Sonar detecting method for large yellow croaker escaping from deep water net cage and special equipment thereof
US8300499B2 (en) * 2009-07-14 2012-10-30 Navico, Inc. Linear and circular downscan imaging sonar
JP5375489B2 (en) * 2009-09-29 2013-12-25 ソニー株式会社 Image signal processing apparatus, image signal processing method, program, and image signal processing system
US8929176B2 (en) * 2010-10-25 2015-01-06 Lockheed Martin Corporation Building a three-dimensional model of an underwater structure
MY164768A (en) * 2010-10-25 2018-01-30 Lockheed Corp Estimating position and orientation of an underwater vehicle based on correlated sensor data
US8942062B2 (en) * 2010-10-25 2015-01-27 Lockheed Martin Corporation Detecting structural changes to underwater structures
CN102183767A (en) * 2011-03-11 2011-09-14 上海海洋大学 Three-dimensional digital fish finder system based on USB (universal serial bus)
US10502820B2 (en) * 2013-02-13 2019-12-10 Farsounder, Inc. Integrated sonar devices and methods
US9335412B2 (en) * 2013-03-14 2016-05-10 Navico Holding As Sonar transducer assembly
US20160116572A1 (en) * 2013-06-05 2016-04-28 Furuno Electric Co., Ltd. Sensor information output apparatus, sensor image display apparatus, detection apparatus, and sensor information output method
GB2542523B (en) * 2014-05-30 2020-12-30 Flir Systems Transmission signal shaping systems and methods
JP2016090452A (en) * 2014-11-07 2016-05-23 古野電気株式会社 Detection device and underwater detection device
JP2016090453A (en) * 2014-11-07 2016-05-23 古野電気株式会社 Detection device and underwater detection device
CN104361623A (en) * 2014-11-25 2015-02-18 中国电子科技集团公司第三研究所 Portable three-dimensional imaging sonar and imaging method and system thereof
CN104765066B (en) * 2015-04-22 2017-07-21 郑州大学 The scanning focused imaging method of earthquake three-dimensional velocity of wave
CN105259557A (en) * 2015-09-25 2016-01-20 浙江大学 Multi-frequency emission beam formation method and application
CN105572676A (en) * 2015-12-16 2016-05-11 浙江大学 Seine object fish shoal tracking method based on horizontal fishgraph images

Also Published As

Publication number Publication date
GB201709428D0 (en) 2017-07-26
JP6722521B2 (en) 2020-07-15
GB2553404A (en) 2018-03-07
CN107544071A (en) 2018-01-05
JP2017227564A (en) 2017-12-28
GB2553404B (en) 2022-02-23

Similar Documents

Publication Publication Date Title
CN107544071B (en) Water detection system
EP2454606B1 (en) Linear and circular downscan imaging sonar
US8187192B2 (en) Method and apparatus for scan conversion and interpolation of ultrasonic linear array steering imaging
EP2325672B1 (en) Spatial compound imaging in an ultrasound system
US11467280B2 (en) Underwater detection apparatus and underwater detection method
EP0947854B1 (en) Apparatus suitable for searching objects in water
US6709394B2 (en) Biplane ultrasonic imaging
US6755788B2 (en) Image orientation display for a three dimensional ultrasonic imaging system
JP7262298B2 (en) UNDERWATER DETECTION DEVICE AND UNDERWATER DETECTION METHOD
EP3170019B1 (en) Depth display using sonar data
JP2006162480A (en) Underwater detection system
US11047964B2 (en) Sonar transducer having geometric elements
US10151829B2 (en) Systems and associated methods for producing sonar image overlay
CN102798841A (en) Image display device and radar device
US20170139044A1 (en) Transducer Elements at Different Tilt Angles
JP7021025B2 (en) Echo signal processing device, echo signal processing system, and echo signal processing method
US8002702B2 (en) Ultrasonograph and method for controlling movement of display body of ultrasonograph
JP6982138B2 (en) Underwater detector
US20100056920A1 (en) Ultrasound system and method of providing orientation help view
JP7219640B2 (en) Underwater detection device and underwater detection method
US11137495B2 (en) Angular offset transducers in multiple transducer sonar system
US11262440B2 (en) Forming a composite down sonar image
GB2580726A (en) Underwater detection apparatus and underwater detection method
JP2012137446A (en) Radar signal processor and radar image processor
JPH03251780A (en) Front image display method of ultrasonic underwater imaging apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant