GB2261510A - Ultrasonic reflection device - Google Patents
- Publication number
- GB2261510A (application GB9124436A, granted publication GB2261510B)
- Authority
- GB
- United Kingdom
- Prior art keywords
- signal
- reference signals
- interest
- detector
- comparison means
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S15/00—Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
- G01S15/88—Sonar systems specially adapted for specific applications
- G01S15/93—Sonar systems specially adapted for specific applications for anti-collision purposes
- G01S15/931—Sonar systems specially adapted for specific applications for anti-collision purposes of land vehicles
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B17/00—Measuring arrangements characterised by the use of infrasonic, sonic or ultrasonic vibrations
- G01B17/08—Measuring arrangements characterised by the use of infrasonic, sonic or ultrasonic vibrations for measuring roughness or irregularity of surfaces
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/52—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
- G01S7/539—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
Abstract
The device comprises a transmitter 9 for transmitting in a scanned fashion a beam of ultrasonic energy over a target region of interest 11, a detector 13 for detecting return ultrasonic energy comprising echoes or reflections of the transmitted beam and comparison means 19 for comparing the signal detected by the detector means with a plurality of reference signals respectively representative of different known target surface textures of interest and for generating an output indicating when the detected signal is found to correspond to one of the reference signals and which one of the reference signals it corresponds to. The device is suitable for use as a sensor on a mobile robot. <IMAGE>
Description
ULTRASONIC REFLECTION DEVICES
The present invention relates to ultrasonic reflection devices especially for mobile robot applications.
According to the present invention there is provided an ultrasonic reflection device suitable for use as a sensor on a mobile robot, the device comprising a transmitter for transmitting in a scanned fashion a beam of ultrasonic energy over a target region of interest, a detector for detecting return ultrasonic energy comprising echoes or reflections of the transmitted beam and comparison means for comparing the signal detected by the detector means with a plurality of reference signals respectively representative of different known target surface textures of interest and for generating an output indicating when the detected signal is found to correspond to one of the said reference signals and which one of the reference signals it corresponds to.
The said comparison means may comprise a neural network which has been trained to recognise each one of the said reference signals and to classify inspected material textures as giving echoes similar to the reference signals representing those textures.
In order to simplify the recognition procedure by the said comparison means, the raw signal obtained as an output from the detector may be pre-processed before application to the comparison means so that a signature of a given target surface texture of interest is more easily recognised. For example, the detected signal and the reference signal may be processed so that only the envelope of the signals is considered by the comparison means. Desirably, the pre-processor also includes means for extracting the absolute value of a signal varying positively and negatively about a mean or zero level in order to assist the envelope comparison procedure.
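A minimal sketch of such an envelope pre-processor, assuming a rectify-then-low-pass approach; the smoothing window length, sample rate and decay constant below are illustrative choices, not values from the specification:

```python
import numpy as np

def envelope(signal, smooth=32):
    """Approximate the envelope: take the absolute value (rectification),
    then apply a simple moving-average low-pass filter.
    `smooth` is an illustrative window length in samples."""
    rectified = np.abs(signal)             # absolute-value stage
    kernel = np.ones(smooth) / smooth      # crude FIR low-pass
    return np.convolve(rectified, kernel, mode="same")

# A decaying burst at roughly the transducer frequency, sampled at 1 MHz
fs = 1_000_000
t = np.arange(0, 0.002, 1 / fs)
echo = np.exp(-3000 * t) * np.sin(2 * np.pi * 49410 * t)
env = envelope(echo)
```

The comparison means would then operate on `env` rather than on the oscillating raw signal.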
We have found that the information concerning the characterisation of a target surface of interest is contained largely in a tail portion of a detected signal representing echoes from the target surface.
The comparison means therefore is desirably arranged to compare only the tail portion with reference signals representing tail portions of signals of echoes from reference targets. Extraction of the tail portions may be carried out in the said pre-processor, eg by extracting a fixed length of signal immediately after the leading edge so that only the relevant tail portion is investigated.
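The tail-extraction step could be sketched as follows; the detection threshold and the tail length are illustrative assumptions, not values taken from the specification:

```python
import numpy as np

def tail_after_leading_edge(echo, threshold, length=150):
    """Return a fixed-length tail starting at the detected leading edge.
    The leading edge is taken as the first sample whose magnitude exceeds
    `threshold` (argmax of a boolean array returns the first True index;
    if nothing exceeds the threshold this falls back to index 0)."""
    onset = int(np.argmax(np.abs(echo) > threshold))
    return echo[onset : onset + length]

# Toy echo: 300 samples of silence followed by a detected return
echo = np.concatenate([np.zeros(300), np.ones(400)])
tail = tail_after_leading_edge(echo, 0.5)
```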
In order to assist the procedure of recognising a reference signal within a given detected signal the frequency of the transmitted ultrasonic beam may be varied or swept in a known way. Alternatively, or in addition, the transmitted energy may be emitted in a series of pulses or packets in which the pulse lengths have a known variation sequence, eg a pseudo-random code. Varying the frequency for instance can enable accentuation of certain parts of the reference signal profile at certain frequency components to be identified. Such accentuation may be caused for example by resonances at certain frequencies characteristic of the texture of the material of the surface being investigated.
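A pseudo-random on/off pulse code of this kind might be generated and then detected by correlating the return against the known code, as in the sketch below; the code length, chip duration, carrier frequency handling and noise level are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.integers(0, 2, 31)                 # 31-chip pseudo-random on/off code
fs, f0, chip_len = 1_000_000, 49410, 100      # sample rate, carrier, samples per chip

# Gate the ultrasonic carrier with the code to form the transmitted burst
t = np.arange(len(code) * chip_len) / fs
tx = np.repeat(code, chip_len) * np.sin(2 * np.pi * f0 * t)

# Simulated return: the burst delayed by 500 samples, plus receiver noise
delay = 500
rx = np.concatenate([np.zeros(delay), tx])
rx = rx + 0.1 * rng.standard_normal(len(rx))

# The receiver correlates against the known transmitted waveform; the
# correlation peak locates the echo even in noise
corr = np.correlate(rx, tx, mode="valid")
detected_delay = int(np.argmax(corr))
```

The sharp autocorrelation of the pseudo-random code is what improves the echo's detectability compared with a plain fixed-length burst.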
The power spectrum of a detected signal, obtained by echo of a frequency swept transmitted signal from known target surfaces at a given angle of incidence, may be employed in the pre-processor to extract the frequency accentuations.
A known algorithm, eg a Fourier transform, may for example be employed in the pre-processor to obtain a set of accentuation frequencies which set is used as the training signal for the neural network.
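The Fourier-transform step for obtaining a set of accentuation frequencies could be sketched as follows, assuming a simple strongest-bins selection rule; the number of peaks returned and the synthetic resonance frequencies are illustrative:

```python
import numpy as np

def accentuation_frequencies(echo, fs, n_peaks=5):
    """Return the n_peaks strongest frequency components of the echo's
    power spectrum, sorted ascending. A sketch of the pre-processor
    stage; n_peaks is an illustrative parameter."""
    spectrum = np.abs(np.fft.rfft(echo)) ** 2       # power spectrum
    freqs = np.fft.rfftfreq(len(echo), 1 / fs)
    strongest = np.argsort(spectrum)[-n_peaks:]     # indices of the largest bins
    return np.sort(freqs[strongest])

fs = 1_000_000
t = np.arange(4096) / fs
# Synthetic echo with assumed "resonances" at 40 kHz and 60 kHz
echo = np.sin(2 * np.pi * 40e3 * t) + 0.5 * np.sin(2 * np.pi * 60e3 * t)
peaks = accentuation_frequencies(echo, fs)
```

The resulting set of frequencies would serve as the training signal for the neural network.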
Desirably, a neural network is trained to recognise each one of a set of given target surface textures by applying to it the detected and pre-processed signals corresponding to a plurality of echoes from each of a plurality of transmitter positions adjacent to the reflecting surface.
The use of neural networks (of which the multi-layer perceptron is a well-known example) to recognise electronic signals is well documented in the prior art.
For example, USP 4,979,124 and USP 5,003,490 and the prior art cited on the front page of each of the published specifications of the said patents give a variety of published references to such use. The application of a neural network to recognise ultrasonic reflection signals from different surface textures of interest in the manner of the present invention is not, of course, envisaged in any of such references. USP 4,979,124 provides the description which is the most relevant to the present invention but in the said patent the system description is not specific to any particular recognition application and the training procedure employs a single reference signal; training of a neural network in the manner and for the specific purpose in relation to the present invention as described herein is not suggested by the said patent.
Use of the present invention beneficially and unexpectedly enables the specific materials present in a given environment to be recognised by reflection of an ultrasonic beam, eg from a device mounted on a mobile robot.
Such recognition may conveniently be carried out after producing a map of the environment of the robot to detect objects in the front of the robot.
The transmitted beam in the device according to the present invention may be scanned over a given arc in a series of discrete steps, eg by controlling the direction of the transmitted beam by the action of a stepper motor to which the transmitter is fixed.
The outputs from the said means for processing, obtained for each step in a series of steps when the transmitter is scanned may be mapped to form a two-dimensional image being a radial plot of reflected signal intensity as measured in a series of grey scale levels as a function of distance from transmitter. This image may be enhanced by application of an image enhancement algorithm.
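The mapping of per-step outputs onto a grey-scale image might be sketched as below; the number of beam directions and range bins are illustrative, with only the 64-level quantisation taken from the text:

```python
import numpy as np

def build_b_scan(a_scans, grey_levels=64):
    """Stack one A-scan per scan step into a two-dimensional B-scan map,
    quantising echo amplitude onto `grey_levels` grey-scale levels.
    Rows correspond to beam angles, columns to range (distance) bins."""
    image = np.vstack(a_scans).astype(float)
    image -= image.min()                      # normalise to [0, 1]
    if image.max() > 0:
        image /= image.max()
    return (image * (grey_levels - 1)).astype(np.uint8)

# 46 beam directions (a 45-degree arc in one-degree steps), 200 range bins each
rng = np.random.default_rng(1)
a_scans = [rng.random(200) for _ in range(46)]
b_scan = build_b_scan(a_scans)
```

An image enhancement algorithm could then be applied to `b_scan` as the text describes.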
In the device according to the present invention the ultrasonic beam transmitted by the transmitter will in general be a signal alternating at a fundamental oscillator frequency. The output may be emitted as a series of pulses. For the purpose of further enhancing signal to noise quality the envelope of each output pulse may itself comprise a pseudo-random code. Such a code is generated by a control device, the code also being applied to the detector. Alternatively the frequency of the pulses in the output may be frequency swept.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings in which:
Figure 1 is a block schematic circuit diagram showing the various functional units included in an ultrasonic ranging device embodying the present invention.
In the circuit of Figure 1 the output of an alternating electrical source 1, typically of frequency 49.41 kHz (although in principle any ultrasonic frequency could be used), is modulated in a multiplier 5 by the output of a 1 ms pulse generator 3 (which could alternatively be a generator providing a series of pseudo-random sequences of different pulse widths) controlled by a control unit 4. The output of the multiplier 5 is fed into a high gain output amplifier 7 capable of delivering 300 V signal bursts. The output of the amplifier 7 is employed to excite the transducer of an ultrasonic transmitter 9 which emits ultrasound signals through air toward a target surface 11 of interest.
The ultrasound signal reflected from the surface 11 is picked up by the transducer of a receiver 13. In practice the transducer of the transmitter 9 and receiver 13 may be a common transducer. The detected signal is amplified by an automatic gain controlled receiving amplifier 14. The items 1, 7, 9, 13 and 14 form part of a commercially available device known as a "Polaroid Instrument Grade" transducer. The dashed line X indicates commercially available circuits.
The receiver 13 and the amplifier 14 referenced to the output of the control unit 4 form the raw signal detector.
The output of the amplifier 14 is processed by a pre-processing unit 17, eg to act as a band pass filter and, if required, to extract only the tail portion of the signal which tail portion will be used to identify the texture under investigation. The output of the unit 17 is then stored in a computer store 21. The stored signal is applied to a trained neural network 19 and the output of the neural network 19 may optionally be applied to an image processor 23. The neural network 19 is trained to recognise a series of "ideal" echo signals from different material textures of interest and to produce different output signals each indicative of the type of material recognised.
Each output from the neural matched filter 19 is an A-scan result, that is a one-dimensional measurement of amplitude versus time of echo signals received from a given target direction. If a series of such results is obtained by scanning the transmitter 9 and receiver 13 around an arc, eg 45°, in finite steps, eg at intervals of one degree, and recording a result at each step, a two-dimensional map or B-scan may be built up of the recorded echo intensity for each direction angle of the transmitted beam. The image processor 23 initially collects outputs from the neural matched filter 19 to build up such a map. Each A-scan echo amplitude is recorded onto one of 64 grey scale levels to build up the B-scan map.
The raw B-scan map built up by the image processor 23 may then be enhanced in a known way to show the topography of a multi-textured surface.
A specific example of the use of a circuit as shown in Figure 1 is as follows. Three distinct target materials were investigated: plasterboard (smooth texture), polystyrene (intermediate roughness texture) and "bubble wrap" plastics, eg polyethylene films enclosing air bubbles typically 0.5 cm² in surface area and 0.5 cm deep (highly undulating texture).
A multi-layer perceptron (MLP) comprising a first matched filter (Filter 1) employing back propagation learning or training was used as the neural matched filter 19 shown in Figure 1. The MLP was implemented with an existing MLP software package running on an IBM compatible 386 Personal Computer housing a Texas Instruments TMS320C30 digital signal processor card.
The package is a windowed, mouse driven package, capable of implementing a fully configurable MLP.
During training, RMS error is displayed, and network weights are saved at user defined intervals. Once training is complete, the weights file is transferred to the TMS320C30 daughter board for high speed recall.
During recall, the software package provides a graphical grey scale representation of all neurons within a given network. This has proven useful for network "pruning" in order to increase speed and efficiency.
In order to overcome problems associated with positional variation, a 512 point power spectrum of "ideal" echo samples for each of the three materials was used for training of the MLP. Fifty points around the fundamental and 10 points at the low frequency end of these spectra were scaled and used to train an MLP with 60 inputs, two hidden layers of 30 units each, and 3 output units (one for each material). Table 1 shows the training data format used for this example.
TABLE 1 - Training data for power spectrum based learning

MATERIAL | NUMBER OF EXAMPLES | SAMPLES PER EXAMPLE | OUTPUT TARGET
---|---|---|---
PLASTERBOARD | 60 | 60 | 1,0,0
POLYSTYRENE | 60 | 60 | 0,1,0
BUBBLE WRAP | 60 | 60 | 0,0,1
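The 60-input training vector of Table 1 (50 spectral points around the fundamental plus 10 low-frequency points) might be assembled as in this sketch; the sample rate, FFT length handling and exact bin-selection rule are assumptions for illustration:

```python
import numpy as np

def spectrum_features(echo, fs=1_000_000, f0=49410):
    """Build a 60-element training vector: from a 512-point power
    spectrum, take 50 bins centred on the fundamental f0 plus the 10
    lowest-frequency bins, scaled to unit maximum. fs, the FFT length
    and the centring rule are illustrative assumptions."""
    spectrum = np.abs(np.fft.rfft(echo, n=1024)[:512]) ** 2  # 512-point power spectrum
    k0 = int(round(f0 * 1024 / fs))                          # bin index of the fundamental
    around_fundamental = spectrum[k0 - 25 : k0 + 25]         # 50 bins around f0
    low_end = spectrum[:10]                                  # 10 low-frequency bins
    features = np.concatenate([around_fundamental, low_end])
    return features / features.max()                         # scale to unit maximum

echo = np.sin(2 * np.pi * 49410 * np.arange(1024) / 1_000_000)
vec = spectrum_features(echo)
```

One such vector per echo example would feed the 60-input MLP described in the text.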
For comparison, a time domain based training approach was also employed. This technique involved acquiring the trailing samples (ie reverberation) of a given echo according to a constant offset from its detected leading edge. Absolute values of reverberation samples for all three materials were subjected to a low pass filter (for envelope acquisition) corresponding to the pre-processing unit 17 and then used as training data for the MLP. It was observed that the nature of a given reverberation changed according to lateral target displacement - this was most pronounced with the bubble wrap target. In order to account for this anomaly, training data was acquired over 5 discrete lateral target displacements.
This approach is summarised in Table 5. Network layer architectures (from input to output layers) involved included - 150:30:30:3, 150:50:12:3, and 150:40:3. Of these, the 150:40:3 network offered the best performance.
TABLE 5 - Training data for time domain material identification

MATERIAL | DISPLACEMENT NUMBER | EXAMPLES PER DISPLACEMENT | TOTAL EXAMPLES | SAMPLES/EXAMPLE | OUTPUT TARGET
---|---|---|---|---|---
PLASTERBOARD | 5 | 60 | 300 | 150 | 1,0,0
POLYSTYRENE | 5 | 60 | 300 | 150 | 0,1,0
BUBBLE WRAP | 5 | 60 | 300 | 150 | 0,0,1
Time domain processing proved the more successful.
After 1500 training iterations, recognition showed an average success rate of 79%.
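A back-propagation training loop for the 150:40:3 architecture might look like the following sketch; the training data is a synthetic stand-in for the envelope-filtered reverberation samples, and the learning rate, initialisation scale and toy data set size are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# 150:40:3 layer architecture - the best performer reported in the text
W1 = 0.1 * rng.standard_normal((150, 40))
W2 = 0.1 * rng.standard_normal((40, 3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x):
    h = sigmoid(x @ W1)          # hidden layer activations
    return h, sigmoid(h @ W2)    # hidden, output

# Toy stand-in for 150-sample reverberation vectors: 3 classes of 10
# examples each, with one-hot output targets as in Table 5
X = rng.random((30, 150))
T = np.eye(3)[np.repeat(np.arange(3), 10)]

lr = 1.0
losses = []
for _ in range(1500):                       # 1500 iterations, as reported
    h, y = forward(X)
    losses.append(((y - T) ** 2).mean())    # squared error, as displayed during training
    d2 = (y - T) * y * (1 - y)              # output-layer delta (sigmoid derivative)
    d1 = (d2 @ W2.T) * h * (1 - h)          # back-propagated hidden-layer delta
    W2 -= lr * h.T @ d2 / len(X)
    W1 -= lr * X.T @ d1 / len(X)

_, y = forward(X)
accuracy = (y.argmax(axis=1) == T.argmax(axis=1)).mean()
```

With real echo data the error curve and recognition rate would of course differ; the sketch only shows the mechanics of the back propagation learning named in the text.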
Claims (6)
1. An ultrasonic reflection device suitable for use as a sensor on a mobile robot, the device comprising a transmitter for transmitting in a scanned fashion a beam of ultrasonic energy over a target region of interest, a detector for detecting return ultrasonic energy comprising echoes or reflections of the transmitted beam and comparison means for comparing the signal detected by the detector means with a plurality of reference signals respectively representative of different known target surface textures of interest and for generating an output indicating when the detected signal is found to correspond to one of the said reference signals and which one of the reference signals it corresponds to.
2. A device as in Claim 1 and wherein the said comparison means comprises a neural network which has been trained to recognise each one of the said reference signals and to classify inspected material textures as giving echoes similar to the reference signals representing those textures.
3. A device as in Claim 1 or Claim 2 and wherein the device includes a pre-processor whereby in operation the raw signal obtained as an output from the detector may be pre-processed before application to the comparison means so that a signature of a given target surface texture of interest is more easily recognised.
4. A device as in Claim 3 and wherein the pre-processor is such that in operation the detected signal and the reference signal may be processed so that only the envelope of the signals is considered by the comparison means.
5. A device as in Claim 4 and wherein the pre-processor also includes means for extracting the absolute value of a signal varying positively and negatively about a mean.
6. A device as in Claim 1 and substantially the same as in any of the examples described herein.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB9124436A GB2261510B (en) | 1991-11-16 | 1991-11-16 | Ultrasonic reflection devices |
Publications (3)
Publication Number | Publication Date |
---|---|
GB9124436D0 GB9124436D0 (en) | 1992-01-08 |
GB2261510A true GB2261510A (en) | 1993-05-19 |
GB2261510B GB2261510B (en) | 1996-05-22 |
Family
ID=10704798
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB9124436A Expired - Fee Related GB2261510B (en) | 1991-11-16 | 1991-11-16 | Ultrasonic reflection devices |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2261510B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4817015A (en) * | 1985-11-18 | 1989-03-28 | The United States Government As Represented By The Secretary Of The Health And Human Services | High speed texture discriminator for ultrasonic imaging |
EP0396171A1 (en) * | 1989-04-26 | 1990-11-07 | Laboratoires D'electronique Philips | Texture analysis process and analyser |
US5003490A (en) * | 1988-10-07 | 1991-03-26 | Hughes Aircraft Company | Neural network signal processor |
- 1991-11-16: GB GB9124436A patent/GB2261510B/en, not_active Expired - Fee Related
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
SG111994A1 (en) * | 2003-07-25 | 2005-06-29 | Sony Corp | Wireless positioning system which operates without using an explicit reference |
US7400291B2 (en) | 2003-12-04 | 2008-07-15 | Sony Corporation | Local positioning system which operates based on reflected wireless signals |
WO2017012982A1 (en) * | 2015-07-17 | 2017-01-26 | Jaguar Land Rover Limited | An ultrasonic sensor system in a vehicle for terrain identification |
US10852405B2 (en) | 2015-07-17 | 2020-12-01 | Jaguar Land Rover Limited | Ultrasonic sensor system in a vehicle for terrain identification |
CN108027427B (en) * | 2015-07-17 | 2022-08-26 | 捷豹路虎有限公司 | Ultrasonic sensor system for terrain recognition in a vehicle |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JPH09509742A (en) | Method for detecting the relative position of an object with respect to the background using ultrasound | |
CN101595396A (en) | The apparatus and method of high-resolution ultrasound displacement measurement | |
NL9201060A (en) | VOLUMETRIC AND TERRAIN IMAGE SONAR. | |
US20190346565A1 (en) | Method of compressing beamformed sonar data | |
CN104897779B (en) | Utilize the method for linear FM signal Measuring Propagation Time of Ultrasonic Wave | |
CN87102652A (en) | The ultrasonic listening Apparatus and method for | |
AU646976B2 (en) | Ultrasonic ranging devices | |
JPS6238360A (en) | Ultrasonic phase reflectoscope | |
JP2003066015A (en) | Signal processing method in acoustic emission method | |
WO2001075474A3 (en) | Method and apparatus for discriminating ultrasonic echoes using wavelet function processing | |
GB2261510A (en) | Ultrasonic reflection device | |
JPH02156391A (en) | Method and apparatus for counting flat product within flake | |
GB2261511A (en) | Ultrasonic ranging device | |
Murino et al. | A confidence-based approach to enhancing underwater acoustic image formation | |
AU2005255537A1 (en) | Method for detecting targets | |
WO2001013129A1 (en) | Correlation speed sensor | |
Leetang et al. | Evaluation of ultrasonic target detection by alternate transmission of different codes in M-sequence pulse compression | |
CN107144637B (en) | A method of identification direction of check | |
Thomas et al. | Neural processing of airborne sonar for mobile robot applications | |
US20030218468A1 (en) | Apparatus and method for measuring characteristics of anisotropic materials | |
FR2492112A1 (en) | METHOD FOR IDENTIFYING OBJECTS IMMERSED IN WATER | |
JPH0781993B2 (en) | Ultrasonic measuring device | |
JPH08503070A (en) | Ultrasonic inspection method for identifying a three-dimensional pattern | |
US11960036B2 (en) | Device and method for processing signals from a set of ultrasonic transducers | |
CN111024818B (en) | Submerged plant identification method of ultrasonic sensor based on longitudinal wave sound velocity |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PCNP | Patent ceased through non-payment of renewal fee |
Effective date: 19960822 |