GB2609727A - Target detection device and target detection method - Google Patents
- Publication number
- GB2609727A (application GB2208300.0A / GB202208300A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- region
- image
- display
- intensity
- latent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/89—Sonar systems specially adapted for specific applications for mapping or imaging
- G01S15/96—Sonar systems specially adapted for specific applications for locating fish
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/523—Details of pulse systems
- G01S7/526—Receivers
- G01S7/527—Extracting wanted echo signals
- G01S7/56—Display arrangements
- G01S7/58—Display arrangements for providing variable ranges
- G01S7/62—Cathode-ray tube displays
- G01S7/6218—Cathode-ray tube displays providing two-dimensional coordinated display of distance and direction
- G01S7/6272—Cathode-ray tube displays producing cursor lines and indicia by electronic means
Abstract
A target detection device 10 comprises a transducer 2 configured to transmit ultrasonic waves into water and to receive the reflected waves, and a reception processing module 14 which generates echo data corresponding to the intensity of the reflected waves from a signal output by the transducer. The device further includes an image-generating module 11a which generates, from the echo data, a first image for display indicating the intensity distribution of the reflected wave, and a region extraction module 11b which extracts, from the echo data, a latent region of a target not displayed on the first image. A display processing module 11c superimposes the latent region on the first image. A pixel-group region may be extracted from a generated second image as the latent region if the intensity of the reflection is greater than or equal to a threshold, the intensity is continuous between adjacent pixels, and no lump of reflection intensity corresponding to a target exists in the region of the first image corresponding to that region.
Description
TARGET DETECTION DEVICE AND TARGET DETECTION METHOD
[0001] The present invention relates to a target detection device and a target detection method for detecting a target in water.
[0002] A target detection device for detecting a target in water is known. In such a device, ultrasonic waves are transmitted into the water, and the reflected waves are received. Echo data corresponding to the intensity of the received reflected wave is generated. Based on the generated echo data, the intensity distribution of the reflected wave at each water depth is displayed in a color of a corresponding gradation. A fish finder having such a configuration is described in Japanese patent document JP2010281736A.
[0003] As described above, in the target detection device, the intensity range used to set the display gradation is often set relatively high. Thus, the intensity distribution of the reflected wave may be easily seen, and the target in the water may be easily grasped. However, when the intensity of the reflected wave from the target is lower than this intensity range, the intensity distribution based on the target is not displayed on the image. On the other hand, if the intensity range is lowered, noise other than the target is displayed, and the distribution of the target in the water becomes difficult to understand.
[0004] An object of the present invention is to provide a target detection device and a target detection method capable of more accurately grasping a target existing in the water.
[0005] In accordance with an embodiment, a first aspect of the present invention relates to a target detection device. The target detection device according to the present embodiment may be provided with: a transducer configured to transmit ultrasonic waves into water and to receive reflected waves; a reception processing module configured to generate echo data according to the intensity of the reflected waves from a signal output by the transducer; an image generating module configured to generate, from the echo data, a first image for display indicating the intensity distribution of the reflected wave; a region extraction module configured to extract, from the echo data, a latent region of a target not displayed on the first image; and a display processing module configured to superimpose the latent region on the first image and to display the result on the display.
[0006] In the target detection device according to the present embodiment, the latent region of the target which is not displayed in the first image may be superimposed and displayed on the first image. Thus, a user may grasp the target from the intensity distribution of the reflected wave displayed in the first image, and may grasp a target having a low reflected-wave intensity from the latent region superimposed on the first image. The user may therefore more accurately grasp the target existing in the water from the displayed image.
[0007] In accordance with a second aspect of the embodiment, a target detection method is disclosed. The target detection method according to the present embodiment comprises: transmitting ultrasonic waves into water and receiving reflected waves thereof; generating echo data according to the intensity of the reflected waves; generating, from the echo data, an image for display indicating the intensity distribution of the reflected wave; extracting, from the echo data, a latent region of a target not displayed on the image; and superimposing the latent region on the image.
[0008] In the target detection method, according to this embodiment, the same effect as that of the first aspect may be obtained.
[0009] As described above, according to the present invention, it is possible to provide a target detection device and a target detection method capable of more accurately grasping a target existing in the water.
[0010] The effect or significance of the present invention will become more apparent from the description of the following embodiments. However, the following embodiments are only examples, and the present invention is not limited in any way to what is described in them.
[0011] FIG. 1 is a diagram showing a usage of a target detection device, according to an embodiment; FIG. 2 is a block diagram showing a configuration of a target detection device, according to an embodiment; FIG. 3(a) and FIG. 3(b) schematically show the setting states of the gradation setting ranges, according to the embodiment; FIG. 4(a) is a diagram showing a specific example of an echo image when the gradation setting range is set relatively high, according to the embodiment; FIG. 4(b) is a diagram showing a specific example of an echo image when the gradation setting range is set relatively low, according to the embodiment; FIG. 5 is a flowchart showing the processing of the target detection device, according to the embodiment; FIG. 6(a) is a diagram schematically showing an image for extraction, according to an embodiment; FIG. 6(b) is a diagram showing an echo image and a latent region superimposed on the echo image, according to an embodiment; FIG. 7 is a flowchart showing a latent region extraction process, according to an embodiment; FIG. 8(a), FIG. 8(b), and FIG. 8(c) are schematic views for explaining the extraction processing of regions forming a lump, according to an embodiment; FIG. 9(a), FIG. 9(b), and FIG. 9(c) are diagrams schematically showing a region forming a lump in an extraction image, a region forming a lump superimposed on an echo image, and an extracted latent region, respectively, according to an embodiment; FIG. 10(a) is a flowchart showing the processing for accepting the designation of the latent region and for displaying the intensity distribution, according to a modification 1; FIG. 10(b) is a diagram showing an echo image in a state in which a designated latent region is displayed in a predetermined gradation, according to the modification 1; FIG. 11 is a flowchart showing the processing for accepting the latent region and the scale of the gradational display and for displaying the intensity distribution, according to a modification 2; FIG. 12(a) is a diagram showing an echo image in a state in which a slider is displayed, according to the modification 2; FIG. 12(b) is a diagram showing an echo image in a state in which the designated latent region is displayed on the scale of the designated gradation, according to the modification 2; FIG. 13(a) is a diagram showing an echo image in which all latent regions are displayed in a predetermined gradation, according to another modification; and FIG. 13(b) is a diagram showing an echo image switched to a state in which only contours are displayed for all latent regions, according to another modification.
[0012] Embodiments of the present invention will now be described with reference to the drawings. The following embodiment shows an example in which the present invention is applied to a target detection device installed on the hull of a fishing boat or the like. However, the following embodiment is one embodiment of the present invention, and the present invention is not limited in any way to the following embodiments.
[0013] FIG. 1 illustrates a view showing a mode of use of a target detection device. In the present embodiment, a transducer (2) may be installed at the bottom of a vessel (1), and a transmission beam (3) of ultrasonic waves is transmitted from the transducer (2) into the water. The transmission beam (3) may have a conical shape with a small vertical angle and may be transmitted in pulses in the direction directly below the vessel. The transmission beam (3) may be reflected by a seabed (water bottom) (4) and a fish school (5), and the reflected wave (echo) may be received by the transducer (2). Echo data, in which the signal intensity of the received signal is distributed over a detection range in the depth direction, may be generated from the received signal of the reflected wave based on the transmission of one transmission beam (3).
[0014] An echo image showing the distribution of signal intensity in the depth direction may be generated by accumulating the history of the echo data. The echo image includes the intensity distribution of the target. The generated echo image may be displayed on a display installed in a steering room or the like of the vessel (1). Thus, the user may confirm the position of a target (seabed (4), fish school (5), etc.) existing in the water.
[0015] FIG. 2 illustrates a block diagram showing the configuration of a target detection device (10).
[0016] The target detection device (10) of this embodiment is a fish finder or an echo sounder. The target detection device (10) may include a processing module (11), a storage (12), a transmission processing module (13), a reception processing module (14), an input interface module (15), a display (16), and the transducer (2) shown in FIG. 1.
[0017] The processing module (11), the storage (12), the transmission processing module (13), the reception processing module (14), the input interface module (15), and the display (16) may be installed in the steering room or the like of the vessel (1). The configuration excluding the transducer (2) may be unitized in one housing, or some components such as the display (16) may be separated. The transmission processing module (13) and the reception processing module (14) may be communicably connected to the transducer (2) by signal cables.
[0018] The transducer (2) may be provided with a transmitter for transmitting ultrasonic waves and a receiver for receiving ultrasonic waves. For example, the transducer (2) may comprise ultrasonic vibrators. The transmission processing module (13) outputs a transmission signal to the transmitter of the transducer (2) under the control of the processing module (11). The transmitter of the transducer (2) transmits ultrasonic waves into the water based on the transmission signal. The receiver of the transducer (2) receives the reflected waves of the transmitted ultrasonic waves and outputs a reception signal having a magnitude corresponding to the intensity of the reflected wave to the reception processing module (14).
[0019] Based on the received signal from the receiver of the transducer (2), the reception processing module (14) generates data (hereinafter referred to as "echo data") in which the elapsed time from the timing of transmitting the transmission beam (3) is correlated with the intensity of the reflected wave, and outputs the generated echo data to the processing module (11). The elapsed time from the timing of transmitting the transmission beam (3) corresponds to the distance to a target, and the intensity of the reflected wave decreases as the distance to the target increases. Therefore, the reception processing module (14) corrects the intensity of the reflected wave, which attenuates according to the elapsed time, and outputs the echo data with the corrected intensity to the processing module (11).
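The attenuation correction described in this paragraph can be sketched as a simple time-varied gain applied per sample. The patent does not give a formula; the 40·log10(r) spreading-loss term, the absorption coefficient, and all names below are illustrative assumptions:

```python
import math

def correct_attenuation(samples, sound_speed=1500.0, sample_interval=1e-4,
                        absorption_db_per_m=0.05):
    """Apply an assumed time-varied gain to raw echo samples.

    samples: received intensities, one per elapsed-time step since
    transmission. The gain compensates spreading loss (40*log10(r))
    and two-way absorption; both terms are illustrative, not taken
    from the patent.
    """
    corrected = []
    for i, s in enumerate(samples):
        # Two-way travel: range = sound_speed * elapsed_time / 2.
        r = max(sound_speed * (i + 1) * sample_interval / 2.0, 1e-6)
        gain_db = 40.0 * math.log10(r) + 2.0 * absorption_db_per_m * r
        corrected.append(s * 10.0 ** (gain_db / 20.0))
    return corrected
```

With a constant input, the applied gain grows with elapsed time, so later (more distant) samples are boosted relative to earlier ones, mirroring the range-dependent correction the module performs.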
[0020] The processing module (11) comprises an arithmetic processing circuit such as a CPU or an integrated circuit such as an FPGA. The storage (12) includes a read-only memory (ROM), a random access memory (RAM), a hard disk, and the like. Various programs may be stored in the storage (12). The processing module (11) may control each section by a program stored in the storage (12). The processing module (11) executes the functions of an image generating module (11a), a region extraction module (11b), and a display processing module (11c) by the programs stored in the storage (12). The processing module (11) of FIG. 2 shows the functional blocks executed by the processing module (11) based on a program. The processing contents of each functional block will be described with reference to FIG. 5 to FIG. 9(c).
[0021] The input interface module (15) may be constituted by the input means such as a mouse and a keyboard and accepts input from a user. The input interface module (15) may be a touch panel integrated with the display (16). The display (16) may comprise a display such as a CRT monitor or a liquid crystal panel, and displays an image generated by the processing module (11). As described later, an echo image generated based on the echo data is displayed on the display (16).
[0022] The processing module (11) acquires the echo data, in which the elapsed time (distance) is correlated with the intensity of the received signal, for each transmission timing of the transmission beam (3). The processing module (11) generates an echo image on the basis of one frame of continuously acquired echo data and displays it on the display (16). Echo images are sometimes referred to as echo diagrams. The echo image may be generated with water depth and time as its two axes. In the echo image, coloring or shading is applied in a gradation corresponding to the signal intensity of the reflected wave for each pixel.
[0023] In the target detection device (10) having the above structure, an intensity range for setting the display gradation (hereinafter referred to as the "gradation setting range") is often set relatively high. This makes the intensity distribution of the reflected wave easy to see, and makes it easy to grasp the target in the water. However, with this method, when the intensity of the reflected wave from the target is lower than the gradation setting range, the intensity distribution based on the target is not displayed in the echo image. On the other hand, when the gradation setting range is lowered, noise or the like other than the target is displayed on the image, making it difficult to understand the distribution of the target in the water.
[0024] Referring to FIG. 3(a) to FIG. 4(b), an echo image when the gradation setting range is set high and an echo image when the gradation setting range is set low are described.
[0025] FIG. 3(a) and FIG. 3(b) schematically show the setting states of the gradation setting ranges.
[0026] FIG. 3(a) and FIG. 3(b) illustrate the echo data acquired at a predetermined transmission timing of the transmission beam (3). The vertical axis indicates the depth direction, and the horizontal axis indicates the intensity of the echo data. As shown in FIG. 3(a) and FIG. 3(b), when targets T1 and T2 exist in the depth direction, the intensity of the echo data changes according to the sizes of the targets T1 and T2. In this case, since the target T1 is larger than the target T2, the intensity of the echo data based on the target T1 is larger than the intensity of the echo data based on the target T2.
[0027] As shown in FIG. 3(a), when the gradation setting range is set high (when the sensitivity is set low), the lower limit intensity SV11 is higher than the highest intensity of the target T2, and the upper limit intensity SV12 is lower than the highest intensity of the target T1. In this case, echo data whose intensity is SV11 or more and SV12 or less is displayed in gradation. That is, the intensity SV11 is set to the lowest gradation, and the intensity SV12 is set to the highest gradation. Echo data whose intensity is higher than the gradation setting range is uniformly set to the highest gradation, and echo data whose intensity is lower than the gradation setting range is uniformly set to the lowest gradation.
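The mapping just described, clamping intensities below SV11 to the lowest gradation and above SV12 to the highest, with linear scaling in between, can be sketched as follows. The 16-level gradation count and the function name are assumptions for illustration; the patent does not specify a level count:

```python
def to_gradation(intensity, lower, upper, levels=16):
    """Map a reflected-wave intensity to a display gradation.

    Intensities at or below `lower` (SV11) get the lowest gradation (0);
    intensities at or above `upper` (SV12) get the highest (levels - 1);
    values in between are scaled linearly. `levels` is an assumed count.
    """
    if intensity <= lower:
        return 0
    if intensity >= upper:
        return levels - 1
    return int((intensity - lower) / (upper - lower) * (levels - 1))
```

Under this mapping, a target like T2 whose maximum intensity falls below SV11 produces only gradation 0 pixels, which is exactly why it disappears from the echo image in FIG. 3(a).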
[0028] Therefore, in this case, when an echo image is generated based on the echo data acquired at the transmission timing of one frame continuously in time, the region of the target T1 is displayed in a predetermined gradation in the generated echo image, but all the regions of the target T2 are displayed in the lowest gradation, and the region of the target T2 cannot be confirmed on the echo image.
[0029] On the other hand, as shown in FIG. 3(b), when the gradation setting range is set low (when the sensitivity is set high), the lower limit intensity SV21 is lower than the highest intensity of the target T2, and the upper limit intensity SV22 is lower than the highest intensity of the target T1.
[0030] Therefore, in this case, when an echo image is generated based on the echo data acquired at the transmission timings of one temporally continuous frame, the regions of both targets T1 and T2 are displayed in a predetermined gradation in the generated echo image. However, in this case, noise or the like other than the target is displayed on the echo image, making it difficult to understand the distribution of the target in the water.
[0031] FIG. 4(a) is a diagram showing a specific example of an echo image (110) when the gradation setting range is set relatively high as shown in FIG. 3(a), and FIG. 4(b) is a diagram showing a specific example of the echo image (111) when the gradation setting range is set relatively low as shown in FIG. 3(b).
[0032] In FIG. 4(a) and FIG. 4(b), the vertical axis indicates depth, and the horizontal axis indicates time. Along the vertical axis, depth increases downward. Along the horizontal axis, the rightmost side corresponds to the most recent echo data, and the leftmost side corresponds to the oldest echo data. The echo image (110) of FIG. 4(a) and the echo image (111) of FIG. 4(b) are generated based on echo data for the same one frame.
[0033] As shown in FIG. 4(a), the echo image (110) is generated with the gradation setting range set high, so that a target existing in the vicinity of the region R1 may be confirmed in a state where noise is small. However, in the echo image (110), the target which originally existed in the vicinity of the region R2 may not be confirmed. On the other hand, as shown in FIG. 4(b), since the echo image (111) is generated with the gradation setting range set low, not only the target existing near the region R1 but also the target existing near the region R2 may be confirmed. However, in the echo image (111), the range of high gradation is greatly expanded and the distribution of noise other than the target is markedly increased, so that the original distribution of the target in the water is difficult to understand.
[0034] In order to solve this problem, in the present embodiment, an echo image (110) with a high gradation setting range, as shown in FIG. 4(a), is generated based on the echo data so that a target existing in the water may be grasped more accurately. The echo image (110) is displayed on the display (16) in a state where the distribution regions of potential targets that are not displayed on the echo image (110), owing to the gradation setting range, are superimposed on the echo image (110). This process will be described below.
[0035] FIG. 5 illustrates a flowchart showing the processing of the target detection device (10).
[0036] When the processing module (11) accepts the echo data at a predetermined timing from the reception processing module (14) (step S11: YES), the image generating module (11a) generates an echo image (110) for display indicating the intensity distribution of the reflected wave from the echo data, at step S12. The echo data used in this case is echo data for one frame, including the echo data received in step S11 and the echo data received within a predetermined period going back from the present. As described with reference to FIG. 3(a), the echo image (110) is generated by displaying echo data in gradation within a highly set gradation setting range. Thus, the echo image (110) as shown in FIG. 4(a) is generated.
[0037] Subsequently, the processing module (11) performs extraction processing of the latent region by the function of the region extraction module (11b), at step S13. The latent region is a distribution region of a target that is not displayed on the echo image (110), and is extracted based on an extraction image (120) as shown in FIG. 6(a).
[0038] FIG. 6(a) schematically shows the extraction image (120). As shown in FIG. 6(a), the extraction image (120) includes a plurality of echo data E1 arranged in the lateral direction (time direction). The extraction image (120) is not an image based on a gradation setting range as in the echo image (110), but an image based on the echo data with no gradation setting range applied. The pixel values of the pixels arranged vertically and horizontally in the extraction image (120) are values in which the intensity of the reflected wave is expressed in gradation. In step S13, as shown in FIG. 6(a), for example, latent regions R11 to R15 are extracted in the extraction image (120). The latent region extraction process will be described with reference to FIG. 7 to FIG. 9(c).
[0039] Subsequently, the processing module (11) superimposes the contours of the latent regions on the echo image (110) and displays the result on the display (16) by the function of the display processing module (11c), at step S14. As a result, for example, as shown in FIG. 6(b), the echo image (110) on which the contours of the latent regions R11 to R15 are superimposed is displayed on the display (16).
[0040] FIG. 7 illustrates a flowchart showing a latent region extraction process.
[0041] The processing module (11) generates the extraction image (120) for extracting the latent region from the echo data, at step S21. The echo data used in this case is echo data for one frame, including the echo data received in step S11 of FIG. 5 and the echo data received within a predetermined period going back from the present, as in the case of generating the echo image (110). In step S21, if a high-intensity region spreads over a wide range on the extraction image (120), the processing module (11) determines that this region corresponds to the seabed.
[0042] In the above description, "generate(s) the extraction image (120)" does not mean that the extraction image (120) is generated for display purposes, but that it is generated for data processing purposes. That is, in the present embodiment, the extraction image (120) is generated for the extraction processing of the latent region; it is not necessary to generate it as an image file, and it is not necessary to display it on the display (16).
[0043] Subsequently, the processing module (11) performs labeling processing for extracting regions in the extraction image (120) in which the intensity is equal to or greater than a predetermined level and which form a lump, at step S22. In other words, in the extraction image (120), the processing module (11) extracts regions of pixel groups in which the intensity of the reflected wave is equal to or greater than a threshold value and the intensity of the reflected wave is continuous between adjacent pixels, at step S22.
[0044] FIG. 8(a), FIG. 8(b), and FIG. 8(c) illustrate schematic views for explaining the extraction processing of the regions forming a lump in step S22. FIG. 8(a), FIG. 8(b), and FIG. 8(c) show a part of the extraction image (120).
[0045] The extraction image (120) is generated by assigning the intensity at each pixel position of the echo data to the corresponding pixel. In step S22, while shifting the pixel to be determined in the extraction image (120), the processing module (11) determines for each such pixel whether or not its intensity (pixel value) is equal to or greater than a threshold value (first determination), and determines whether or not there is intensity continuity between the pixel to be determined and the four surrounding pixels adjacent to it (second determination). When both the first determination and the second determination are satisfied, the processing module (11) includes the pixel to be determined and the pixels determined to have continuity in the pixel group having continuity.
[0046] In the example shown in FIG. 8(a), the second pixel P1 from the left in the middle row is the determination target. As an example, in the first determination, it is determined whether the intensity of the pixel to be determined is equal to or greater than a threshold value Vs (= 10); in the second determination, it is determined whether the intensity of the adjacent pixel is equal to or greater than a ratio Vr (= 50%) of the intensity of the pixel to be determined, and whether the intensity of the adjacent pixel is equal to or greater than the threshold value Vs (= 10).
[0047] The threshold value Vs used in the first determination is set to be smaller than the maximum intensity of a pixel corresponding to a target which cannot be confirmed in the echo image (110). Specifically, the threshold value Vs used in the first determination is set to a value larger than the maximum intensity level of the noise of the echo data by a predetermined value Va. If noise has been removed from the echo data in advance, the threshold value Vs used in the first determination is set to the predetermined value Va.
[0048] As shown in FIG. 8(a), the intensity of the pixel P1 is not less than the threshold value Vs (= 10). Among the pixels above, below, to the left, and to the right of the pixel P1, only the pixel P2 to the right of the pixel P1 has an intensity equal to or greater than the ratio Vr (= 50%) of the intensity of the pixel P1, and the intensity of the pixel P2 is equal to or greater than the threshold value Vs (= 10). Therefore, as shown in FIG. 8(b), the processing module (11) includes the pixels P1 and P2 in a pixel group G.
[0049] Subsequently, the processing module (11) moves the pixel to be determined from the pixel P1 to the pixel P2, and performs the same determination. As shown in FIG. 8(b), the intensity of the pixel P2 is not less than the threshold value Vs (= 10). Among the pixels above, below, to the left, and to the right of the pixel P2, the pixels having an intensity equal to or greater than the ratio Vr (= 50%) of the intensity of the pixel P2 are the pixel P1 to the left of the pixel P2, a pixel P3 below the pixel P2, and a pixel P4 to the right of the pixel P2, and the intensities of the pixels P1, P3, and P4 are all equal to or greater than the threshold value Vs (= 10). Therefore, as shown in FIG. 8(c), the processing module (11) includes the pixels P1, P2, P3, and P4 in the pixel group G.
[0050] The processing module (11) performs the above-described determination for all pixels while shifting the pixel to be determined in the extraction image (120). If the pixel to be determined is a pixel corresponding to the seabed region, the processing module (11) does not make the above determination for that pixel, but shifts to the next pixel to be determined. Thus, the processing module (11) may quickly perform the region extraction processing. Thus, as illustrated in FIG. 9(a), regions R11 to R20 of pixel groups in which the intensity of the reflected wave is equal to or greater than the threshold value and the intensity of the reflected wave is continuous between adjacent pixels are extracted in the extraction image (120).
[0051] The method of setting the region of the pixel group is not limited to the above-described method. For example, in the second determination, the processing module (11) may determine that continuity exists when the value obtained by subtracting the intensity of the pixel to be determined from the intensity of an adjacent pixel is equal to or greater than a predetermined value. The threshold value Vs of the first determination and the ratio Vr of the second determination are not limited to the values described above.
[0052] Returning to FIG. 7, the processing module (11) sequentially performs the processes of steps S23 to S26 for the regions (for example, regions R11 to R20 in FIG. 9(a)) extracted in step S22.
[0053] The processing module (11) determines the size of one of the regions extracted in step S22 at step S23, and determines whether or not the size of that region is equal to or larger than a predetermined size at step S24. Specifically, the processing module (11) determines whether the region is continuous in the depth direction (vertical axis direction in FIG. 9(a)) by a predetermined value or more, whether the region is continuous in the time direction (horizontal axis direction in FIG. 9(a)) by a predetermined value or more, and whether the area of the region is equal to or more than a predetermined value. When these three conditions are satisfied, the processing module (11) determines that the size of the region is equal to or larger than the predetermined size. In the example shown in FIG. 9(a), the regions R11 to R15 and R17 to R20 are determined to be larger than the predetermined size, and the region R16 is determined to be smaller than the predetermined size.
[0054] The determination of the size in step S24 is not limited to the determination described above. For example, the determination may be based solely on the depth width and the time width, or solely on the area.
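The three conditions of step S24 can be sketched as follows; the pixel group is represented as a set of (row, column) coordinates, and the three threshold values are illustrative assumptions, since the embodiment only calls them "predetermined".

```python
def region_is_large_enough(group, min_depth=3, min_time=3, min_area=6):
    """Size test of step S24 ([0053]): the region must span at least
    min_depth pixels in the depth (vertical) direction and min_time pixels
    in the time (horizontal) direction, and contain at least min_area
    pixels. All three thresholds here are illustrative values."""
    if not group:
        return False
    rows = [r for r, _ in group]
    cols = [c for _, c in group]
    depth_extent = max(rows) - min(rows) + 1   # continuity in depth direction
    time_extent = max(cols) - min(cols) + 1    # continuity in time direction
    area = len(group)                          # number of pixels in the region
    return depth_extent >= min_depth and time_extent >= min_time and area >= min_area
```

A region like R16 that is wide in time but only one pixel deep fails the depth condition and is rejected even if its area is sufficient.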
[0055] When the size of one region is equal to or larger than the predetermined size (step S24: YES), the processing module (11) determines whether or not a target corresponding to the region exists in the echo image (110), at step S25. Specifically, the processing module (11) determines whether or not a lump of the intensity of the reflected wave corresponding to a target exists in the region on the echo image (110) corresponding to the extracted region. For example, when the ratio of pixels having a predetermined gradation or less in the region on the echo image (110) corresponding to the extracted region is less than a predetermined value, it is determined that a target lump exists in the region on the echo image (110); when the ratio of pixels having the predetermined gradation or less is equal to or greater than the predetermined value, it is determined that no target lump exists in the region on the echo image (110).
[0056] In the case of the example shown in FIG. 9(b), it is determined that a lump (the hatched portion in FIG. 9(b)) of the intensity of the reflected wave corresponding to the target exists in the region on the echo image (110) corresponding to the regions R17 to R20. On the other hand, it is determined that no lump of intensity of the reflected wave corresponding to the target exists in the region on the echo image (110) corresponding to the regions R11 to R16.
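The lump test of step S25 reduces to counting low-gradation pixels in the corresponding echo-image region. A sketch under assumed threshold values (the embodiment leaves both the gradation cutoff and the ratio "predetermined"):

```python
def target_lump_exists(echo_pixels, low_gradation=2, max_low_ratio=0.5):
    """Lump test of step S25 ([0055]): in the echo-image region that
    corresponds to an extracted region, count the pixels whose gradation is
    at or below `low_gradation`; a lump of reflected-wave intensity is
    judged to exist when their ratio is below `max_low_ratio`. Both
    parameter values are illustrative assumptions."""
    if not echo_pixels:
        return False
    low = sum(1 for g in echo_pixels if g <= low_gradation)
    return low / len(echo_pixels) < max_low_ratio
```

Regions like R17 to R20, whose echo-image counterparts are mostly bright, pass this test and are therefore not extracted as latent regions; mostly dark counterparts like those of R11 to R16 fail it.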
[0057] When there is no target corresponding to the region in the echo image (110) (step S25: NO), the processing module (11) extracts the region as a latent region, at step S26. If the size of the region is less than the predetermined size (step S24: NO), or if a target corresponding to the region is present in the echo image (110) (step S25: YES), the process of step S26 is skipped.

[0058] The processing module (11) determines whether the processing of steps S23 to S26 has been performed for all the regions extracted in step S22, at step S27. If the processing has not been completed for all the regions (step S27: NO), the processing module (11) changes the region to be processed, returns the processing to step S23, and performs the processing of steps S23 to S26 for the regions that have not yet been processed. On the other hand, when the processing is completed for all the regions (step S27: YES), the latent region extraction processing ends. As a result, regions R11 to R15 are extracted as latent regions, for example, as shown in FIG. 9(c).
[0059] According to the embodiment, the following effects are achieved.
[0060] The latent region (for example, latent regions R11 to R15 in FIG. 6(b)) of a target which is not displayed in the first image (echo image (110)) is superimposed on the first image. Thus, the user may grasp targets from the intensity distribution of the reflected wave displayed in the first image, and may also grasp targets with a low reflected-wave intensity from the latent regions superimposed on the first image. The user may therefore grasp targets existing in the water more accurately from the displayed image.
[0061] The echo image (110) is a low noise image because it is generated from echo data in a high intensity range (for example, the gradation setting range shown in FIG. 3(a)). The latent region, on the other hand, may include noise because it is extracted from the extraction image (120), which is generated from the echo data without setting an intensity range. However, as described above, only the latent region of a target not displayed in the echo image (110) is superimposed on the low noise echo image (110). Thus, the user may simultaneously confirm targets of low intensity while confirming targets in the water in a state where noise is kept suppressed.
[0062] The processing module (11) extracts, by the function of the region extraction module (11b), a region of a pixel group in which the intensity of the reflected wave is equal to or greater than a threshold value and the intensity of the reflected wave is continuous between adjacent pixels in the second image (extraction image (120)), at step S22 in FIG. 7. The processing module (11) extracts the extracted region (for example, regions R11 to R20 in FIG. 9(a)) as a latent region (for example, regions R11 to R15 in FIG. 9(c)) based on the fact that no lump of intensity of the reflected wave corresponding to the target exists in the region on the first image (echo image (110)) corresponding to the extracted region (S25: NO in FIG. 7), at step S26 in FIG. 7. According to this configuration, the latent region may be set smoothly and properly.
[0063] Based on the fact that the region (for example, regions R11 to R20 in FIG. 9(a)) extracted from the second image (extraction image (120)) is larger than a predetermined size (S24: YES in FIG. 7), the processing module (11) extracts the extracted region as a latent region (for example, regions R11 to R15 in FIG. 9(c)) by the function of the region extraction module (11b), at step S26 in FIG. 7. According to this configuration, when the region extracted from the second image is small, it is excluded from the latent regions to be displayed on the assumption that it is caused by factors other than the target to be detected, such as noise and minute floating objects. This allows regions of potential targets that are not displayed in the image to be displayed more accurately.
[0064] The processing module (11) causes the display (16) to display the outline of the latent region (for example, regions R11 to R15 in FIG. 6(b)) overlaid on the first image (echo image (110)) by the function of the display processing module (11c) (step S14 in FIG. 5). According to this configuration, the user may grasp the region of a target not displayed in the display form of the first image by the contour superimposed on the first image.
[0065] Modification 1: In the above embodiment, only the contour of the latent region is superimposed on the echo image (110), but in addition to the contour of the latent region, the intensity distribution of the reflected wave in a designated latent region may be superimposed.
[0066] FIG. 10(a) is a flowchart showing the processing for accepting the designation of a latent region and for displaying the intensity distribution, according to the modification 1.

[0067] The processing module (11) starts accepting the designation of a latent region by the function of the display processing module (11c), and determines whether or not the designation of a latent region has been accepted, at step S101. The user selects a latent region by performing an operation such as clicking or tapping on any latent region on the echo image (110) displayed on the display (16) through the input interface module (15). When the designation of the latent region is accepted (step S101: YES), the processing module (11) uses the function of the display processing module (11c) to display the latent region accepted in step S101 in a predetermined gradation based on the echo data, at step S102. The gradational display in this case is, for example, a gradational display based on an intensity range (for example, the gradation setting range shown in FIG. 3(b)) lower than the intensity range used for generating the echo image (110).
[0068] FIG. 10(b) shows an echo image (110) in which a designated latent region is displayed in a predetermined gradation.
[0069] FIG. 10(b) shows a state in which the latent region R11 is designated among the latent regions R11 to R15. In this case, the intensity distribution in the latent region R11 is displayed in a predetermined gradation.
[0070] According to the modification 1, the processing module (11) accepts the designation of a latent region by the function of the display processing module (11c) and causes the display (16) to display the intensity distribution of the reflected wave in the accepted latent region in a predetermined gradation. According to this configuration, since the intensity distribution of the reflected wave in any designated latent region is displayed in a predetermined gradation, the user may grasp the kind of target or the like in that latent region.
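Displaying a region "in a predetermined gradation based on the echo data", as in step S102, comes down to clipping each intensity to a gradation setting range and scaling it to the available display levels. A minimal sketch; the number of levels and the linear mapping are assumptions, since the embodiment only specifies that the range used for a latent region is lower than the one used for the echo image.

```python
def to_gradation(intensity, range_lo, range_hi, levels=64):
    """Map a reflected-wave intensity to a display gradation within a
    gradation setting range [range_lo, range_hi], as used for the echo
    image (FIG. 3) and for the lower range applied to a designated latent
    region in step S102. `levels` is an illustrative assumption."""
    if range_hi <= range_lo:
        raise ValueError("range_hi must exceed range_lo")
    x = min(max(intensity, range_lo), range_hi)  # clip to the setting range
    return round((x - range_lo) / (range_hi - range_lo) * (levels - 1))
```

Lowering `range_lo` and `range_hi` for a latent region makes weak echoes that are invisible in the echo image's high-intensity range visible as non-zero gradations.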
[0071] Modification 2: In the modification 1, the designated latent region is displayed in a predetermined gradation, but the scale of the gradation may be designated by the user.
[0072] FIG. 11 is a flowchart showing the processing for accepting the latent region and the scale of the gradational display and for displaying the intensity distribution, according to the modification 2. In FIG. 11, steps S111 and S112 are added in place of step S102 as compared with FIG. 10(a).
[0073] When the designation of the latent region is accepted (step S101: YES), the processing module (11) starts to accept the scale of the gradational display by the function of the display processing module (11c) and determines whether or not the scale of the gradational display has been accepted, at step S111. The user inputs the scale of the gradational display by moving the slider (130) (see FIG. 12(a)) displayed on the display (16) via the input interface module (15). When the scale of the gradational display is accepted (step S111: YES), the processing module (11) displays the latent region accepted in step S101 with the scale of the gradational display accepted in step S111 based on the echo data, by the function of the display processing module (11c), at step S112.
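Under the modification 2, the slider keeps the width of the gradation setting range fixed and slides the whole range lower or higher. A minimal sketch; the slider unit and the clamping bounds are illustrative assumptions.

```python
def shift_range(range_lo, range_hi, slider_delta, min_i=0, max_i=255):
    """Slider behaviour of the modification 2: shift the whole gradation
    setting range by slider_delta (negative = slider moved left, toward
    lower intensities) while keeping its predetermined width, clamped so
    the range stays inside the representable intensities [min_i, max_i]."""
    width = range_hi - range_lo
    lo = range_lo + slider_delta
    lo = max(min_i, min(lo, max_i - width))  # clamp without changing width
    return lo, lo + width
```

Moving the slider left then shows weaker echoes in the designated latent region at higher gradations, while moving it right emphasizes only the strongest echoes.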
[0074] FIG. 12(a) shows an echo image (110) with the slider (130) displayed. FIG. 12(b) shows an echo image (110) in a state in which the designated latent region is displayed at the designated scale of the gradational display.
[0075] FIG. 12(a) shows a state in which the latent region R11 is designated among the latent regions R11 to R15. In this case, the slider (130) is displayed for the designated latent region R11. The user sets the scale of the gradational display in the designated latent region R11 by moving the slider (130). For example, when the slider (130) is moved to the left, the gradation setting range of a predetermined width is shifted lower, and when the slider (130) is moved to the right, the gradation setting range is shifted higher.

[0081] Although the latent region is always superimposed on the echo image (110) in the above embodiment and the modifications 1 and 2, the latent region may be superimposed in response to a predetermined operation from the user. In the above modification 1, the latent region designated by the user is displayed in a predetermined gradation, but as shown in FIG. 13(a), all the latent regions may be displayed in a predetermined gradation without designation of a latent region by the user.
[0082] In this case, when an operation such as clicking is performed on any one of the latent regions R11 to R15, as shown in FIG. 13(b), only the contours of all the latent regions R11 to R15 are displayed, as in the above embodiment. Further, in the state of FIG. 13(b), when an operation such as clicking is performed on any one of the latent regions R11 to R15, as shown in FIG. 13(a), all the latent regions R11 to R15 are switched back to the state in which they are displayed in a predetermined gradation.
[0083] In the modified example shown in FIG. 13(a) and FIG. 13(b) as well, the processing module (11) causes the display (16) to display the intensity distribution of the reflected wave in the latent region in a predetermined gradation by the function of the display processing module (11c). According to this configuration, since the intensity distribution of the reflected wave in the latent region is displayed in gradation as in the modifications 1 and 2, the user can grasp the kind of target in the latent region.
[0084] In the above modification 1 and the modified example shown in FIG. 13(a) and FIG. 13(b), the intensity distribution in the latent region is displayed in a predetermined gradation. However, the present invention is not limited to this, and the gradational display in the latent region may be set automatically and appropriately.
[0085] In this case, for example, the processing module (11) may automatically set the scale of the gradational display in the latent region so that the average value of the gradation in the region corresponding to the target in the echo image (110) is equal to the average value of the gradation in the latent region superimposed on the echo image (110). Further, the processing module (11) may set a gradation setting range having a predetermined width for the echo data in the latent region, adjust the lower limit of the gradation setting range to the lowest intensity in the latent region, and display the intensity distribution in the latent region in that gradation. Further, the processing module (11) may set a gradation setting range for the echo data in the latent region, adjust the lower limit of the gradation setting range to the lowest intensity in the latent region and the upper limit of the gradation setting range to the highest intensity in the latent region, and display the intensity distribution in the latent region in that gradation.
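The two range-anchoring alternatives in [0085] can be sketched in one helper: with a predetermined width given, the lower limit is anchored at the region's minimum intensity; without one, the range spans the region's minimum to maximum. The averaging alternative is omitted here, and the function name and signature are illustrative.

```python
def auto_gradation_range(latent_intensities, width=None):
    """Automatic range setting of [0085]: anchor the lower limit of the
    gradation setting range at the lowest intensity in the latent region;
    with a predetermined `width`, keep that range width, otherwise extend
    the upper limit to the highest intensity in the region."""
    lo = min(latent_intensities)
    hi = lo + width if width is not None else max(latent_intensities)
    return lo, hi
```

Either choice guarantees that the weakest echoes in the latent region map to the bottom of the gradation scale, so the region is never displayed as uniformly dark.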
[0086] In the above modification 2, as shown in FIG. 12(a) and FIG. 12(b), the slider (130) is used to shift the gradation setting range of a predetermined width in the high direction and the low direction. However, the present invention is not limited to this; a slider for shifting the lower limit of the gradation setting range in the high and low directions and a slider for shifting the upper limit of the gradation setting range in the high and low directions may be provided.
[0087] In the above embodiment, as shown in FIG. 1, the transmission beam (3) is transmitted in the direction directly below the vessel (1), but it may be transmitted in a direction at an angle with respect to that direction. In this case, in the echo image (110) and the extraction image (120), the vertical direction corresponds not to the depth direction but to the transmission direction of the transmission beam (3).
[0088] In the above embodiment, the target detection device (10) is a fish finder for detecting directly under the vessel (1), but may be a sonar for detecting the circumference of the vessel (1). In this case, the reception processing module (14) performs beamforming processing on the reception signal outputted from the receiver of the transducer (2), generates the reception signal of each direction, and generates echo data of each direction. Then, the echo image and the extraction image are generated based on the echo data of each direction, and a latent region extracted from the corresponding extraction image is superimposed on the echo image of each direction.
[0089] Embodiments of the present invention may be modified in various ways within the scope of the claims.
[List of Reference Numerals]
2 Transducer
10 Target Detection Device
11a Image Generating Module
11b Region Extraction Module
11c Display Processing Module
14 Reception Processing Module
16 Display
110 Echo Image (First Image)
120 Extraction Image (Second Image)
R11 to R20 Region
R11 to R15 Latent Region
Claims (8)
- CLAIMS 1 A target detection device (10) comprising: a transducer (2) configured to transmit ultrasonic waves into water and to receive reflected waves; a reception processing module (14) configured to generate echo data according to the intensity of the reflected waves based on a signal output from the transducer (2); an image generating module (11a) configured to generate a first image for display indicating the intensity distribution of the reflected waves from the echo data; a region extraction module (11b) configured to extract, from the echo data, a latent region of a target not displayed on the first image; and a display processing module (11c) configured to superimpose the latent region on the first image and to display the latent region on the display.
- 2 The target detection device (10) according to claim 1, wherein the region extraction module (11b) is further configured: to generate a second image for extracting the latent region from the echo data; to extract, in the second image, a region of a pixel group in which the intensity of the reflected wave is equal to or greater than a threshold value and the intensity of the reflected wave is continuous between adjacent pixels; and to extract the extracted region as a latent region based on the fact that there is no lump of the intensity of the reflected wave corresponding to the target in the region on the first image corresponding to the extracted region.
- 3 The target detection device (10) according to claim 2, wherein the region extraction module (11b) is further configured to extract the extracted region as the latent region based on the fact that the region extracted from the second image has a width equal to or larger than a predetermined threshold value.
- 4 The target detection device (10) according to any one of claims 1 to 3, wherein the display processing module (11c) is further configured to superimpose the outline of the latent region on the first image and to display it on the display.
- 5 The target detection device (10) according to any one of claims 1 to 4, wherein the display processing module (11c) is further configured to cause the display to display the intensity distribution of the reflected wave in the latent region in a predetermined gradation display.
- 6 The target detection device (10) according to claim 5, wherein the display processing module (11c) is further configured: to accept the designation of the latent region, and to cause the display to display the intensity distribution of the reflected wave in the accepted latent region in a predetermined gradation display.
- 7 The target detection device (10) according to claim 5 or 6, wherein the display processing module (11c) is further configured: to accept setting of the scale of the gradation display with respect to the intensity of the reflected wave in the latent region, and to display the intensity distribution of the reflected wave in the latent region at the accepted scale of the gradation display.
- 8 A target detection method comprising: transmitting ultrasonic waves into water and receiving reflected waves thereof; generating echo data according to the intensity of the reflected waves; generating an image for display indicating the intensity distribution of the reflected waves from the echo data; extracting, from the echo data, a latent region of a target not displayed on the image; and superimposing the latent region on the image.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021116691A JP2023012925A (en) | 2021-07-14 | 2021-07-14 | Target object detection device and target object detection method |
Publications (2)
Publication Number | Publication Date |
---|---|
GB202208300D0 GB202208300D0 (en) | 2022-07-20 |
GB2609727A true GB2609727A (en) | 2023-02-15 |
Family ID=82404672
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB2208300.0A Pending GB2609727A (en) | 2021-07-14 | 2022-06-07 | Target detection device and target detection method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2023012925A (en) |
GB (1) | GB2609727A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2009069164A (en) * | 2002-03-29 | 2009-04-02 | Koden Electronics Co Ltd | Fish finder |
GB2573759A (en) * | 2018-05-14 | 2019-11-20 | Furuno Electric Co | Underwater detection apparatus and underwater detection method |
US20200116858A1 (en) * | 2018-10-11 | 2020-04-16 | Furuno Electric Co., Ltd. | Underwater detection apparatus and underwater detection method |
JP2021117204A (en) * | 2020-01-29 | 2021-08-10 | 古野電気株式会社 | Underwater detector, and display method of underwater detection image |
Also Published As
Publication number | Publication date |
---|---|
GB202208300D0 (en) | 2022-07-20 |
JP2023012925A (en) | 2023-01-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9366758B2 (en) | Detection device | |
US8593335B2 (en) | Method and device for processing echo signal, radar device and echo signal processing program | |
US11844656B2 (en) | Ultrasound diagnostic apparatus, method of controlling ultrasound diagnostic apparatus, and non-transitory computer-readable recording medium storing therein computer-readable program for controlling ultrasound diagnostic apparatus | |
US20150359507A1 (en) | Ultrasound diagnosis apparatus and ultrasound image processing method | |
GB2503351A (en) | Underwater detection device | |
US9268021B2 (en) | Detection device and computer readable media storing detection program | |
US7388809B2 (en) | Fish finder that transmits ultrasonic waves, receives echo signals, and performs interference removal | |
EP2790032B1 (en) | Image processing apparatus, radar apparatus, image processing method, and image processing program | |
EP2527864A1 (en) | Sensor image display device and method | |
CN107544071B (en) | Water detection system | |
CN112882037B (en) | Side-scan sonar sea bottom line detection method and device | |
GB2506995A (en) | Fish type distinguishing device, signal processing apparatus, and underwater detector | |
GB2609727A (en) | Target detection device and target detection method | |
JP7475227B2 (en) | Underwater detection device and method for displaying underwater detection images | |
US10365360B2 (en) | Radar apparatus | |
GB2529063A (en) | Detecting device, detecting method and program | |
US11221409B2 (en) | Underwater detection apparatus and underwater detection method | |
JPWO2017163716A1 (en) | Radar device and wake display method | |
JP2023166224A (en) | Fish detection device, image creation method, and program | |
CN109982646B (en) | Ultrasonic diagnostic apparatus and image processing method | |
US10436892B2 (en) | Radar apparatus | |
JP6811069B2 (en) | Underwater detection signal processing device, underwater detection device, and underwater detection signal processing method | |
WO2024053713A1 (en) | Dual-frequency fish finder, and dual-frequency trnsmission method | |
US20220413136A1 (en) | Ultrasound imaging device and method of generating color doppler image | |
WO2024038653A1 (en) | Device, method, and program for learning fish species |