WO2022231567A1 - Object location detection systems and methods, such as detecting the location of an airplane crash - Google Patents


Info

Publication number
WO2022231567A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
processor
location
seismic
survey
Prior art date
Application number
PCT/US2021/029134
Other languages
English (en)
Inventor
Maurice Nessim
Nicolae Moldoveanu
Original Assignee
Maurice Nessim
Priority date
Filing date
Publication date
Application filed by Maurice Nessim filed Critical Maurice Nessim
Priority to PCT/US2021/029134 priority Critical patent/WO2022231567A1/fr
Priority to US17/737,684 priority patent/US11562655B2/en
Publication of WO2022231567A1 publication Critical patent/WO2022231567A1/fr
Priority to PCT/US2023/021113 priority patent/WO2023215537A1/fr


Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 - Navigation or guidance aids for a single aircraft
    • G08G 5/0056 - Navigation or guidance aids for a single aircraft in an emergency situation, e.g. hijacking
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C - AEROPLANES; HELICOPTERS
    • B64C 39/00 - Aircraft not otherwise provided for
    • B64C 39/02 - Aircraft not otherwise provided for characterised by special use
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 3/00 - Traffic control systems for marine craft
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0004 - Transmission of traffic-related information to or from an aircraft
    • G08G 5/0013 - Transmission of traffic-related information to or from an aircraft with a ground station
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 10/00 - Type of UAV
    • B64U 10/25 - Fixed-wing aircraft
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U 2101/00 - UAVs specially adapted for particular uses or applications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning

Definitions

  • the present disclosure relates to systems and methods for object location detection, and more particularly, for improving the prediction of object location.
  • Embodiments of the present disclosure provide a location system.
  • the location system may include a memory and a processor.
  • the processor may be configured to collect seismic data and geophysical data to determine object location.
  • the processor may be configured to determine one or more seismic attributes associated with a plurality of types of noise based on the seismic data and the geophysical data using one or more machine learning algorithms.
  • the processor may be configured to eliminate unwanted noises from noise classifications based on the one or more seismic attributes.
  • the processor may be configured to predict the object location by comparing time and velocity data of the object with recorded timing and velocity data.
  • the processor may be configured to validate the object location by comparing the determined noise with image data.
  • Embodiments of the present disclosure provide a method of determining object location.
  • the method may include collecting, by a processor, seismic data and geophysical data to determine object location.
  • the method may include determining, by the processor, one or more seismic attributes associated with a plurality of types of noise based on the seismic data and the geophysical data using one or more machine learning algorithms.
  • the method may include eliminating, by the processor, unwanted noises from noise classifications based on the one or more seismic attributes.
  • the method may include predicting, by the processor, the object location by comparing time and velocity data of the object with recorded timing and velocity data.
  • the method may include validating, by the processor, the object location by comparing the determined noise with image data.
  • Embodiments of the present disclosure provide a computer readable non-transitory medium comprising computer executable instructions that, when executed on a processor, perform procedures comprising the steps of: collecting seismic data and geophysical data to determine object location; determining one or more seismic attributes associated with a plurality of types of noise based on the seismic data and the geophysical data using one or more machine learning algorithms; eliminating unwanted noises from noise classifications based on the one or more seismic attributes; predicting the object location by comparing time and velocity data of the object with recorded timing and velocity data; and validating the object location by comparing the determined noise with image data.
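The collect / classify / eliminate / predict / validate steps recited above can be sketched as a simple pipeline. Everything below is illustrative: the function names, the amplitude threshold, the 4 ms sample interval, and the toy traces are assumptions, not details from the disclosure.

```python
# Illustrative sketch of the claimed pipeline: collect, classify noise,
# eliminate unwanted noise, then predict location. All function names, the
# amplitude threshold, and the 4 ms sample interval are assumptions.

def collect_data(seismic_source, geo_source):
    """Gather seismic and geophysical records into one working set."""
    return {"seismic": list(seismic_source), "geo": list(geo_source)}

def classify_noise(data):
    """Stand-in for the ML step: label traces by a simple amplitude rule."""
    return [("spike" if max(abs(s) for s in trace) > 5.0 else "background", trace)
            for trace in data["seismic"]]

def eliminate_unwanted(classified):
    """Keep only the class of interest (the impact-like spikes)."""
    return [trace for label, trace in classified if label == "spike"]

def predict_location(traces, water_velocity=1500.0, dt=0.004):
    """Toy estimate: distance = velocity x first strong-arrival time."""
    return [water_velocity * trace.index(max(trace, key=abs)) * dt
            for trace in traces]

# One synthetic trace with a spike and one background trace
data = collect_data([[0.1, 0.2, 9.0, 0.1], [0.1, 0.1, 0.2, 0.1]], [])
kept = eliminate_unwanted(classify_noise(data))
distances = predict_location(kept)
```

The real classification step would use trained models rather than a fixed threshold, but the data flow is the same.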
  • Figure 1 depicts a location system according to an exemplary embodiment.
  • Figure 2 depicts a location system according to an exemplary embodiment.
  • Figure 3 depicts a location system according to an exemplary embodiment.
  • Figure 4 depicts a location system according to an exemplary embodiment.
  • Figure 5 depicts a method of determining object location according to an exemplary embodiment.
  • Figure 6 depicts a node device according to an exemplary embodiment.
  • Figure 7 depicts an illustration of data recording according to an exemplary embodiment.
  • Figure 8 depicts an illustration of data recording according to an exemplary embodiment.
  • Figure 9 depicts a graph according to an exemplary embodiment.
  • Figure 10 depicts a method of object location according to an exemplary embodiment.
  • the systems and methods are applicable to identifying the location of a plane that has crashed into the sea or ocean using seismic data. Examples of other applications that are within the scope of the invention include identifying the location of any object that has fallen from the sky into the water, exploration, environmental analysis, and the like.
  • the invention is not limited to objects approaching or falling into the water.
  • the systems and methods disclosed herein may be applied to objects approaching or falling into the earth, and not just water.
  • Seismic data may be collected from vessels that operate in the assumed body of water, such as the sea or ocean. This collected data is recorded prior to the assumed time of the crash, during the crash, and after the crash, up to a predetermined time period.
  • the predetermined time period may be any number of seconds, minutes, hours, days, weeks, months, etc.
  • the collected data may be recorded prior to and after the crash up to a minimum of fifty hours.
  • a direct wave and a refracted wave may be recorded by the receivers in the streamer, and with reference to FIG. 8, a direct wave and a refracted wave may be recorded by one or more ocean bottom node (OBN) devices.
  • different formulas may be used to estimate these source-to-receiver distances (SRi) based on direct waves or refracted waves, and this is part of the specific processing for the streamer and OBN data; to obtain an accurate estimation of SRi, an accurate model of water velocity variation with depth is required. In the second situation, consideration must also be given to the possibility that the crash is located at a large distance from the survey area.
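For a direct wave, the distance estimate reduces to velocity times traveltime, with the velocity averaged over a depth-dependent profile as the passage notes. A minimal sketch, in which the two-layer profile and all numeric values are assumptions for illustration:

```python
# Direct-wave distance estimation: SRi is approximately the water velocity
# times the observed traveltime, with velocity averaged over depth. The
# two-layer velocity profile and all numbers below are assumptions.

def average_water_velocity(profile, depth):
    """Depth-weighted mean velocity down to `depth`, given [(thickness_m, velocity_ms)]."""
    remaining, weighted = depth, 0.0
    for thickness, velocity in profile:
        used = min(thickness, remaining)
        weighted += used * velocity
        remaining -= used
        if remaining <= 0:
            break
    return weighted / depth

def direct_wave_distance(traveltime_s, velocity_ms):
    """SRi ~ v * t for a direct arrival through the water column."""
    return velocity_ms * traveltime_s

# Assumed profile: a faster shallow layer over a slower deep layer
profile = [(200.0, 1520.0), (800.0, 1480.0)]  # (thickness m, velocity m/s)
v_avg = average_water_velocity(profile, depth=1000.0)
sri = direct_wave_distance(2.0, v_avg)        # 2 s direct-wave traveltime
```

Refracted arrivals would need a ray-path computation rather than this straight-line approximation.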
  • at large distances, the sound may propagate through a deep sound channel, such as a sound fixing and ranging (SOFAR) channel.
  • Water velocity analysis is required to be performed on the collected seismic data to determine water velocity variation with depth, and to determine if there is a layer of low velocity in the water (SOFAR channel). If the presence of this layer is detected, it must be evaluated how this is coupled to the streamer spread or OBN grid of receivers.
  • An example of water velocity analysis is illustrated in FIG. 9. This is based on measuring the semblance on common midpoint (CMP) gathers that are normal moveout (NMO) corrected with different velocities.
  • Other methods may be used to estimate velocity in water or sediments, including but not limited to full wavefield inversion (FWI) and tomography.
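The velocity-scan idea behind FIG. 9 (NMO-correct a CMP gather with trial velocities and keep the one that best flattens the events) can be illustrated with a toy example. A crude misfit-based score stands in for true semblance here, and the geometry and velocities are invented:

```python
import numpy as np

# Toy velocity scan in the spirit of the analysis of FIG. 9: NMO-correct a
# CMP gather with trial velocities and keep the best. A crude misfit score
# stands in for true semblance; the geometry and velocities are invented.

def nmo_time(t0, offset, velocity):
    """Hyperbolic moveout: arrival time at `offset` for zero-offset time t0."""
    return np.sqrt(t0**2 + (offset / velocity)**2)

def fit_score(t0, offsets, picked_times, velocity):
    """Higher when the trial velocity predicts the picked arrival times."""
    predicted = nmo_time(t0, offsets, velocity)
    return 1.0 / (1.0 + float(np.sum((picked_times - predicted) ** 2)))

# Synthetic gather generated with a known water velocity of 1500 m/s
offsets = np.array([100.0, 500.0, 1000.0, 2000.0])   # receiver offsets (m)
t0, true_velocity = 1.0, 1500.0
picks = nmo_time(t0, offsets, true_velocity)

# Scan trial velocities; the best score should land on 1500 m/s
trial_velocities = np.arange(1300.0, 1701.0, 50.0)
scores = [fit_score(t0, offsets, picks, v) for v in trial_velocities]
best_velocity = float(trial_velocities[int(np.argmax(scores))])
```

Production velocity analysis computes semblance over amplitude windows of many traces, but the scan-and-pick structure is the same.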
  • any reflection coming from it will show a rather large anomaly. Moreover, this reflected signal will likely not indicate sand or mud and will instead exhibit certain characteristics. In some examples, if a seismic survey was recorded over an area where a plane crashed, and the plane did not disintegrate into very small fragments, the recorded signal may appear as seismic diffractions, because the acoustic impedance contrast between the plane (metal) and earth sediments is very high.
  • Object location may be accurately approximated, and thereby improved, by transmitting the signals received from different surveys from vessels and hydrophones, and based on the difference in acoustic impedance, including the velocity and density of the water.
  • the source of energy is the object, or plane, itself when it makes contact with the surface of the water.
  • a corresponding wave plane-let in the seismic data will be picked up through any number of surveys by one or more nodes or streamers, which may then be transmitted for analysis, as further described below.
  • when the vessels are located in the vicinity of the new sound, this data is likely to be picked up by them.
  • one or more hydrophones that are located offshore or onshore may be configured to record this data.
  • standard hydrophones in the streamers may be used.
  • any and all four component (4C) receivers may be used.
  • the 4C receivers may comprise a hydrophone, a vertical geophone, a horizontal geophone oriented in X direction, and a horizontal geophone oriented in Y-direction, as further depicted in FIG. 6, and the node device may be deployed on, without limitation, the ocean or sea floor.
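One possible in-memory representation of a single 4C node sample, purely as an illustration of the four components listed above (the field names and the helper method are assumptions):

```python
from dataclasses import dataclass

# Hypothetical container for one sample from a four-component (4C) node:
# a hydrophone plus three orthogonal geophones, as described above. Field
# names and the helper method are assumptions for illustration only.

@dataclass
class FourComponentSample:
    hydrophone: float   # pressure component
    geophone_z: float   # vertical particle velocity
    geophone_x: float   # horizontal, X direction
    geophone_y: float   # horizontal, Y direction

    def horizontal_magnitude(self) -> float:
        """Combined horizontal motion, e.g. for estimating arrival azimuth."""
        return (self.geophone_x ** 2 + self.geophone_y ** 2) ** 0.5

sample = FourComponentSample(hydrophone=0.8, geophone_z=0.1,
                             geophone_x=0.3, geophone_y=0.4)
```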
  • a spike in the data measurement obtained by the vessels may represent a type of noise, and instruments or vessels that capture this information may be configured to eliminate the spike, for example, by identifying it as an electronic glitch during recording.
  • a processor of a first device may be configured to take the recorded data and remove this type of information as part of improving the signal to noise ratio.
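A median-based despiking pass is one common way to remove the kind of isolated electronic glitch described above; this sketch is a generic filter with assumed window length and threshold, not the patent's specific method:

```python
import statistics

# Simple despiking pass of the kind described above: samples far from the
# local windowed median are treated as glitches and replaced. The window
# length and the threshold k are assumed values, not from the disclosure.

def despike(trace, window=5, k=4.0):
    """Replace any sample deviating from its local median by more than k * MAD."""
    out = list(trace)
    half = window // 2
    for i in range(len(trace)):
        lo, hi = max(0, i - half), min(len(trace), i + half + 1)
        neighborhood = trace[lo:hi]
        med = statistics.median(neighborhood)
        mad = statistics.median(abs(x - med) for x in neighborhood) or 1e-9
        if abs(trace[i] - med) > k * mad:
            out[i] = med  # replace the glitch with the local median
    return out

# A 50.0 glitch embedded in low-amplitude background
clean = despike([0.1, 0.2, 0.1, 50.0, 0.2, 0.1])
```

Note that the genuine impact signal must not be removed by such a filter, which is why classification precedes elimination.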
  • these vessels may each be equipped with global positioning systems. These vessels may be periodically or continuously recording seismic data, possibly for a different purpose, such as exploration. Thus, if these vessels are recording and retain the original data, they necessarily include the captured noise spike as well.
  • the signal may be associated with the falling object if it has a unique shape or wave plane-let that may be identified on several recording channels for a short period of time, and if a similar type of wave plane-let is identified on the seismic records of the other surveys from one or more recording vessels.
  • the wave plane-let may be a spike if the location of the crash was very close to the seismic recording system.
  • the shape of the wave plane-let may be different due to attenuation of the high frequencies during propagation.
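Identifying a similar wave plane-let on the records of several surveys can be done, for example, with normalized cross-correlation against a candidate template. The template and traces below are synthetic; real records would be band-limited and far longer:

```python
import math

# Sketch of matching a candidate wave plane-let across records from two
# surveys using normalized cross-correlation. Template and traces are
# synthetic values invented for this illustration.

def normalized_xcorr(template, trace):
    """Best normalized correlation of `template` over all lags of `trace`."""
    n = len(template)
    t_mean = sum(template) / n
    t_dev = [x - t_mean for x in template]
    t_norm = math.sqrt(sum(d * d for d in t_dev))
    best = -1.0
    for lag in range(len(trace) - n + 1):
        seg = trace[lag:lag + n]
        s_mean = sum(seg) / n
        s_dev = [x - s_mean for x in seg]
        s_norm = math.sqrt(sum(d * d for d in s_dev))
        if s_norm == 0.0 or t_norm == 0.0:
            continue
        score = sum(a * b for a, b in zip(t_dev, s_dev)) / (t_norm * s_norm)
        best = max(best, score)
    return best

planelet = [0.0, 1.0, -0.5, 0.2]                       # candidate wave plane-let
survey_a = [0.0, 0.0, 0.0, 1.0, -0.5, 0.2, 0.0]        # contains the plane-let
survey_b = [0.05, -0.02, 0.04, -0.01, 0.03, 0.0, 0.0]  # background only
match_a = normalized_xcorr(planelet, survey_a)
match_b = normalized_xcorr(planelet, survey_b)
```

Because attenuation reshapes the plane-let with distance, a practical matcher would also allow for frequency-dependent distortion rather than requiring an exact waveform match.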
  • the system may employ machine learning algorithms trained to automate the process described herein to accurately identify the location of the object.
  • the machine learning algorithms employed can include at least one selected from the group of gradient boosting machine, logistic regression, neural networks, and a combination thereof, however, it is understood that other machine learning algorithms can be utilized.
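As a toy illustration of one of the listed algorithms, a plain logistic-regression classifier can separate impact-like events from background marine noise using two assumed features (peak amplitude and duration). The synthetic data, features, and hyperparameters are all invented for this sketch:

```python
import numpy as np

# Toy logistic-regression noise classifier, in the spirit of the algorithms
# listed above. The two features (peak amplitude, duration) and all of the
# synthetic data and hyperparameters are invented for this sketch.

rng = np.random.default_rng(0)

# Class 0: background marine noise (low peak amplitude, long duration)
background = np.column_stack([rng.normal(1.0, 0.2, 50), rng.normal(5.0, 1.0, 50)])
# Class 1: impact-like events (high peak amplitude, short duration)
impacts = np.column_stack([rng.normal(8.0, 0.5, 50), rng.normal(0.5, 0.1, 50)])

X = np.vstack([background, impacts])
y = np.concatenate([np.zeros(50), np.ones(50)])

# Plain batch gradient descent on the logistic loss
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability of class 1
    w -= 0.1 * (X.T @ (p - y)) / len(y)
    b -= 0.1 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = float(np.mean(pred == y))
```

A gradient boosting machine or neural network would replace this linear model without changing the surrounding train/classify workflow.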
  • the calibration through the case of the shallow water example may be used to demonstrate the validity of the invention.
  • that seismic data may also be used by a processor as training data for the machine learning algorithms for future applications.
  • devices associated with agencies such as NASA® may be periodically or continuously monitoring objects or masses falling from the sky that, for example, make contact with the surface of earth or water.
  • the invention may work in tandem with such agencies.
  • the systems and methods disclosed herein may complement this type of work, in which the processor is configured to compare the seismic data with one or more images that are captured and received in order to improve the accuracy of the location of the object.
  • the captured image may comprise a single image or a plurality of images of the falling object approaching the earth or water.
  • data validation may be performed by the processor to improve the detection of the location of the object.
  • Fiber optic cables are deployed on the ocean floor for internet and telephone communications. Over 1,200,000 km of fiber optic cable are already deployed across the oceans.
  • the transoceanic fiber optic cables may be further configured to detect changes in polarization of the light traveling through the cables that are caused by perturbations in the water produced by large waves related to large storms or earthquakes generated under water.
  • the fiber optic cables, such as distributed acoustic sensing (DAS) cables, may be further configured to acquire borehole and surface seismic data for seismic applications.
  • Figure 1 illustrates a location system 100 according to an exemplary embodiment.
  • the system 100 may comprise a first device 105, a second device 110, a network 115, a server 120, and a database 125.
  • system 100 may include any number of components.
  • System 100 may include a first device 105.
  • the first device 105 may include one or more processors 102, and memory 104.
  • the first device 105 may comprise a network- enabled computer, or other device described herein.
  • a network-enabled computer may include, but is not limited to a computer device, or communications device including, e.g., a server, a network appliance, a personal computer, a workstation, a phone, a handheld PC, a personal digital assistant, a thin client, a fat client, an Internet browser, a kiosk, a tablet, a terminal, or other device.
  • First device 105 also may be a mobile device; for example, a mobile device may include an iPhone, iPod, iPad from Apple® or any other mobile device running Apple’s iOS® operating system, any device running Microsoft’s Windows® Mobile operating system, any device running Google’s Android® operating system, and/or any other smartphone, tablet, or like wearable mobile device.
  • the first device 105 may comprise an acquisition device configured to perform object location detection using the systems and methods disclosed herein.
  • the first device 105 may include processing circuitry and may contain additional components, including processors, memories, error and parity/CRC checkers, data encoders, anticollision algorithms, controllers, command decoders, security primitives and tamperproofing hardware, as necessary to perform the functions described herein.
  • the first device 105 may further include a display and input devices.
  • the display may be any type of device for presenting visual information such as a computer monitor, a flat panel display, and a mobile device screen, including liquid crystal displays, light-emitting diode displays, plasma panels, and cathode ray tube displays.
  • the input devices may include any device for entering information into the user’s device that is available and supported by the user’s device, such as a touch-screen, keyboard, mouse, cursor-control device, touch-screen, microphone, digital camera, video recorder or camcorder. These devices may be used to enter information and interact with the software and other devices described herein.
  • First device 105 may include a communication interface 106.
  • the communication interface 106 may comprise communication capabilities with physical interfaces and contactless interfaces.
  • the communication interface 106 may be configured to establish contactless communication via a short-range wireless communication method, such as NFC, Bluetooth, Wi-Fi, RFID, and other forms of contactless communication.
  • the communication interface 106 may be configured to communicate directly with the second device 110, server 120, and/or database 125 via network 115.
  • First device 105 may be in data communication with any number of components of system 100. For example, first device 105 may transmit and/or receive data via network 115 to and/or from second device 110, and/or server 120. First device 105 may transmit data via network 115 to database 125.
  • First device 105 may include any number of transmitters and/or receivers 108 that are configured to communicate with any component of system 100.
  • the transmitter and/or receiver 108 may be configured to transmit and receive any type of data.
  • the transmitter and/or receiver 108 may be configured to transmit and/or receive seismic and geological data from second device 110.
  • System 100 may include a second device 110.
  • the second device 110 may be associated with any type and number of hydrophones or vessels that are configured to record and capture data, including but not limited to seismic data.
  • the second device 110 may include one or more processors 112, and memory 114.
  • Memory 114 may include one or more applications, including but not limited to application 116.
  • Second device 110 may be in data communication with any number of components of system 100.
  • second device 110 may transmit and/or receive data via network 115 to and/or from first device 105 and/or server 120. Second device 110 may transmit data via network 115 to database 125.
  • second device 110 may be a network-enabled computer.
  • a network-enabled computer may include, but is not limited to a computer device, or communications device including, e.g., a server, a network appliance, a personal computer, a workstation, a phone, a handheld PC, a personal digital assistant, a thin client, a fat client, an Internet browser, a kiosk, a tablet, a terminal, or other device.
  • Second device 110 also may be a mobile device; for example, a mobile device may include an iPhone, iPod, iPad from Apple® or any other mobile device running Apple’s iOS® operating system, any device running Microsoft’s Windows® Mobile operating system, any device running Google’s Android® operating system, and/or any other smartphone, tablet, or like wearable mobile device.
  • the second device 110 may include processing circuitry and may contain additional components, including processors, memories, error and parity/CRC checkers, data encoders, anticollision algorithms, controllers, command decoders, security primitives and tamperproofing hardware, as necessary to perform the functions described herein.
  • the second device 110 may further include a display and input devices.
  • the display may be any type of device for presenting visual information such as a computer monitor, a flat panel display, and a mobile device screen, including liquid crystal displays, light-emitting diode displays, plasma panels, and cathode ray tube displays.
  • the input devices may include any device for entering information into the user’s device that is available and supported by the user’s device, such as a touch-screen, keyboard, mouse, cursor-control device, touch-screen, microphone, digital camera, video recorder or camcorder. These devices may be used to enter information and interact with the software and other devices described herein.
  • Second device 110 may include any number of transmitters and/or receivers 118 that are configured to communicate with any component of system 100.
  • the transmitter and/or receiver 118 may be configured to transmit and receive any type of data.
  • the transmitter and/or receiver 118 may be configured to transmit and/or receive seismic and geological data to and/or from first device 105.
  • System 100 may include a network 115.
  • network 115 may be one or more of a wireless network, a wired network or any combination of wireless network and wired network, and may be configured to connect to any one of components of system 100.
  • first device 105 may be configured to connect to server 120 via network 115.
  • network 115 may include one or more of a fiber optics network, a passive optical network, a cable network, an Internet network, a satellite network, a wireless local area network (LAN), a Global System for Mobile Communication, a Personal Communication Service, a Personal Area Network, Wireless Application Protocol, Multimedia Messaging Service, Enhanced Messaging Service, Short Message Service, Time Division Multiplexing based systems, Code Division Multiple Access based systems, D-AMPS, Wi-Fi, Fixed Wireless Data, IEEE 802.11b, 802.15.1, 802.11n and 802.11g, Bluetooth, NFC, Radio Frequency Identification (RFID), and/or the like.
  • network 115 may include, without limitation, telephone lines, fiber optics, IEEE Ethernet 802.3, a wide area network, a wireless personal area network, a LAN, or a global network such as the Internet.
  • network 115 may support an Internet network, a wireless communication network, a cellular network, or the like, or any combination thereof.
  • Network 115 may further include one network, or any number of the exemplary types of networks mentioned above, operating as a stand-alone network or in cooperation with each other.
  • Network 115 may utilize one or more protocols of one or more network elements to which they are communicatively coupled.
  • Network 115 may translate to or from other protocols to one or more protocols of network devices.
  • network 115 may comprise a plurality of interconnected networks, such as, for example, the Internet, a service provider's network, a cable television network, corporate networks, such as credit card association networks, and home networks.
  • System 100 may include one or more servers 120.
  • server 120 may include one or more processors 122 coupled to memory 124.
  • Server 120 may be configured as a central system, server or platform to control and call various data at different times to execute a plurality of workflow actions.
  • Server 120 may be configured to connect to any component of first device 105 and/or second device 110.
  • Server 120 may be in data communication with the processor 102 and/or application 116.
  • a server 120 may be in data communication with any of transmitter and/or receiver 108, 118 via one or more networks 115.
  • First device 105 may be in communication with one or more servers 120 via one or more networks 115 and may operate as a respective front-end to back-end pair with server 120.
  • First device 105 may transmit, for example from processor 102, one or more requests to server 120.
  • the one or more requests may be associated with retrieving data from second device 110 and/or server 120.
  • Server 120 may receive the one or more requests from first device 105. Based on the one or more requests from processor 102, server 120 may be configured to retrieve the requested data.
  • Server 120 may be configured to transmit the received data to processor 102, the received data being responsive to one or more requests.
  • server 120 can be a dedicated server computer, such as bladed servers, or can be personal computers, laptop computers, notebook computers, palm top computers, network computers, mobile devices, wearable devices, or any processor-controlled device capable of supporting the system 100. While FIG. 1 illustrates a single server 120, it is understood that other embodiments can use multiple servers or multiple computer systems as necessary or desired to support the users and can also use back-up or redundant servers to prevent network downtime in the event of a failure of a particular server.
  • Server 120 may include an application comprising instructions for execution thereon.
  • the application may comprise instructions for execution on the server 120.
  • server 120 may be in communication with any components of system 100.
  • server 120 may execute one or more applications that enable, for example, network and/or data communications with one or more components of system 100 and transmit and/or receive data.
  • server 120 may be a network-enabled computer.
  • a network-enabled computer may include, but is not limited to a computer device, or communications device including, e.g., a server, a network appliance, a personal computer, a workstation, a phone, a handheld PC, a personal digital assistant, a thin client, a fat client, an Internet browser, or other device.
  • Server 120 also may be a mobile device; for example, a mobile device may include an iPhone, iPod, iPad from Apple® or any other mobile device running Apple’s iOS® operating system, any device running Microsoft’s Windows® Mobile operating system, any device running Google’s Android® operating system, and/or any other smartphone, tablet, or like wearable mobile device.
  • the server 120 may include processing circuitry and may contain additional components, including processors, memories, error and parity/CRC checkers, data encoders, anticollision algorithms, controllers, command decoders, security primitives and tamperproofing hardware, as necessary to perform the functions described herein.
  • the server 120 may further include a display and input devices.
  • the display may be any type of device for presenting visual information such as a computer monitor, a flat panel display, and a mobile device screen, including liquid crystal displays, light-emitting diode displays, plasma panels, and cathode ray tube displays.
  • the input devices may include any device for entering information into the user’s device that is available and supported by the user’s device, such as a touch-screen, keyboard, mouse, cursor-control device, touch-screen, microphone, digital camera, video recorder or camcorder. These devices may be used to enter information and interact with the software and other devices described herein.
  • System 100 may include one or more databases 125.
  • the database 125 may comprise a relational database, a non-relational database, or other database implementations, and any combination thereof, including a plurality of relational databases and non-relational databases.
  • the database 125 may comprise a desktop database, a mobile database, or an in-memory database.
  • the database 125 may be hosted internally by any component of system 100, such as the first device 105 or server 120, or the database 125 may be hosted externally to any component of the system 100, such as the first device 105 or server 120, by a cloud-based platform, or in any storage device that is in data communication with the first device 105 and server 120.
  • database 125 may be in data communication with any number of components of system 100.
  • server 120 may be configured to retrieve the requested data from the database 125 that is transmitted by processor 102.
  • Server 120 may be configured to transmit the received data from database 125 to processor 102 via network 115, the received data being responsive to the transmitted one or more requests.
  • processor 102 may be configured to transmit one or more requests for the requested data from database 125 via network 115.
  • exemplary procedures in accordance with the present disclosure described herein can be performed by a processing arrangement and/or a computing arrangement (e.g., computer hardware arrangement).
  • a processing/computing arrangement can be, for example entirely or a part of, or include, but not limited to, a computer/processor that can include, for example one or more microprocessors, and use instructions stored on a computer-accessible medium (e.g., RAM, ROM, hard drive, or other storage device).
  • a computer-accessible medium can be part of the memory of the first device 105, second device 110, server 120, and/or database 125, or other computer hardware arrangement.
  • a computer-accessible medium may be, e.g., as described herein above, a storage device such as a hard disk, floppy disk, memory stick, CD-ROM, RAM, ROM, etc., or a collection thereof.
  • the computer-accessible medium can contain executable instructions thereon.
  • a storage arrangement can be provided separately from the computer-accessible medium, which can provide the instructions to the processing arrangement so as to configure the processing arrangement to execute certain exemplary procedures, processes, and methods, as described herein above, for example.
  • the processor 102 may be configured to perform analysis of the seismic data to identify and classify the type of noise affecting the seismic data. The deeper the crash site, the greater the likely influence of salinity. For example, if the salinity of the water increases, the absorption is stronger and higher frequencies will be more attenuated than lower frequencies. Another critical factor related to sound attenuation is the propagation distance of the sound in water.
  • the processor 102 may be configured to perform the analysis on raw seismic data and after attenuation of marine noise.
  • the processor 102 may be configured to determine anomalous noise.
  • the processor 102 may be configured to determine one or more non-geological signals.
  • the noise may be coherent, such as that exhibiting a constant characteristic, or random, such as that not exhibiting a specific characteristic.
  • Other noises may include electrical noises, surface noises, and/or environmental noises.
  • any data related to the surface or the earth is eliminated and the noises are kept, including specific types of noise.
  • the amplitude over a small time interval may be measured, corresponding to the spike previously mentioned. This may be treated as a training period for the machine learning algorithm, which may further be configured to identify anomalous signals or noise and, upon such identification, determine whether the recorded signal corresponds to a type of wave, such as a direct wave or a refracted wave. This technique may be reiterated over large volumes of data using the machine learning algorithm.
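One standard way to flag a short high-amplitude interval of the kind described above is a short-term/long-term average (STA/LTA) ratio; the window lengths and the synthetic trace here are assumptions, not values from the disclosure:

```python
# Short-term / long-term average (STA/LTA) ratio, a standard trigger for
# flagging a short high-amplitude interval like the spike described above.
# The window lengths and the synthetic trace are assumed values.

def sta_lta(trace, short=3, long=10):
    """STA/LTA ratio per sample; zero where the long window is not yet full."""
    ratios = [0.0] * len(trace)
    for i in range(long, len(trace)):
        sta = sum(abs(x) for x in trace[i - short:i]) / short   # recent energy
        lta = sum(abs(x) for x in trace[i - long:i]) / long     # background energy
        ratios[i] = sta / lta if lta > 0 else 0.0
    return ratios

# Quiet background with a brief high-amplitude burst (the "spike")
trace = [0.1] * 20 + [5.0, 4.0, 3.0] + [0.1] * 20
ratios = sta_lta(trace)
onset = max(range(len(ratios)), key=lambda i: ratios[i])  # sample with peak ratio
```

The candidate intervals flagged this way would then feed the wave-type classification step.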
  • the processor 102 may be configured to identify and classify the type of noise using one or more machine learning algorithms.
  • the machine learning algorithm may be trained to improve the accuracy and efficiency of the object location determination.
  • the machine learning algorithm may be trained to recognize that, when seismic data is being captured, a predetermined number of streamers is needed to capture this data, such as a vessel pulling a spread of 10 streamers, each about 8-14 km long, with a spread width of 800 m to 1400 m.
  • the machine learning algorithm may be further trained to recognize that any streamer may be moving and pulling any number and type of hydrophones behind it that are suspended above and below the surface of the water.
  • the machine learning algorithm may be further trained to recognize that each vessel has a given number of sensors. In a single experiment or shot, for example, millions of data recordings may be captured, and the machine learning algorithm may be trained to examine only a portion, such as a fifth, of these data recordings in order to identify the characteristics resembling those of the spike based on the high amplitude over the short time interval, including the identification of the anomalous signals or noise, and also the type of wave. In addition, the machine learning algorithm may be trained to further recognize that any signal arriving from a greater depth than desired is to be eliminated from consideration.
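The spike screening described above — a high amplitude over a short time interval — can be sketched as a simple sliding-window detector. The window length, threshold, and synthetic trace below are illustrative assumptions, not values taken from the disclosure.

```python
import numpy as np

def find_spikes(trace, window=25, n_sigma=6.0):
    """Flag samples whose short-window RMS amplitude is anomalously
    high relative to the rest of the trace (a candidate 'spike')."""
    trace = np.asarray(trace, dtype=float)
    # RMS amplitude measured over a small sliding time interval
    rms = np.sqrt(np.convolve(trace**2, np.ones(window) / window, mode="same"))
    threshold = rms.mean() + n_sigma * rms.std()
    return np.flatnonzero(rms > threshold)

# Synthetic trace: background noise plus one high-amplitude burst
rng = np.random.default_rng(0)
trace = rng.normal(0.0, 1.0, 5000)
trace[2000:2025] += 40.0  # injected spike
hits = find_spikes(trace)
```

In a full workflow the flagged windows would then be fed to the classifier to decide whether they correspond to a direct or refracted wave.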
  • the processor may be further trained to recognize that any signal arriving from a greater depth than desired is to be eliminated from consideration.
  • the processor 102 may be configured to load the seismic data volume into, and thereby communicate with, an interpretation engine.
  • the interpretation engine may include a three-dimensional visualization and interpretation algorithm, such as Petrel®.
  • the interpretation engine may be internal to the first device.
  • the interpretation engine may be external to the first device.
  • the interpretation engine may be configured to examine the seismic data in common shot gathers, common trace gathers, and three-dimensional stacked volumes to generate a plurality of types of seismic attributes.
  • the plurality of types of seismic attributes may be used to identify the anomalous noise or non-geological signals, such as the spike as previously mentioned above.
  • the processor 102, instead of the interpretation engine, may be configured to perform the examination of the seismic data.
  • the processor 102 may be configured to determine if the anomalous noise or non-geological signal is consistent on different measurements from one or more surveys. If the processor 102 determines that the anomalous noise or non-geological signal is consistent with different measurements retrieved from the surveys, then a disposition of “YES” is made with respect to whether the signal is in fact coming from the plane crash. If the processor 102 determines that the anomalous noise or non-geological signal is inconsistent with different measurements retrieved from the surveys, then a disposition of “NO” is made with respect to whether the signal is in fact coming from the plane crash. In this manner, the processor 102 may be configured to perform data validation on the seismic data.
  • the entire seismic data set may be examined by a processor 102 or an interpretation system that may be configured to perform data validation based on 2D and 3D seismic displays, including vertical and horizontal sections (time slices), to confirm the findings obtained from the machine learning algorithm examination, in particular: the similarities in the shape of the signal wavelet from the plane on different streamers; the accuracy of the timing as a function of distance from the recording vessel (the streamer and channel location) to the location of the crash; and the change in the shape of the wavelet as a function of distance from the recording vessel to the found location of the crash.
  • the processor 102 may be configured to determine a location of the plane crash using a method based on triangulation.
  • the processor 102 may be configured to take into account a plurality of receivers for streamer acquisition and a plurality of sensors for surveys, such as ocean bottom system surveys.
  • the triangulation method utilized herein may be used to identify the epicenters of earthquakes using measurements from a plurality of stations, such as three seismic stations as illustrated in FIG. 10. Based on the estimated source-receiver distances SRi, circles with radii SRi corresponding to a minimum of three receivers may be drawn, and their intersection may provide the potential location of the plane crash.
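The circle-intersection step can be sketched as a small least-squares trilateration; the receiver coordinates and the assumed source position below are invented purely for illustration.

```python
import numpy as np

def trilaterate(receivers, distances):
    """Least-squares intersection of circles centered on receivers Ri
    with radii SRi; subtracting the first circle's equation from the
    others linearizes the system in (x, y)."""
    (x0, y0), d0 = receivers[0], distances[0]
    A, b = [], []
    for (xi, yi), di in zip(receivers[1:], distances[1:]):
        A.append([2.0 * (xi - x0), 2.0 * (yi - y0)])
        b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return sol  # estimated (x, y) of the potential crash location

# Three hypothetical receiver positions and exact distances to an
# assumed source at (3, 4); units are arbitrary for illustration.
receivers = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
source = np.array([3.0, 4.0])
dists = [float(np.hypot(*(source - np.array(r)))) for r in receivers]
x, y = trilaterate(receivers, dists)
```

With more than three receivers the same least-squares form simply gains rows, which is why the disclosure's minimum of three receivers is the limiting case.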
  • an analysis, such as an uncertainty analysis, may be performed by the processor 102 for the estimated location of the crash.
  • a plurality of factors that may be considered for performing the uncertainty analysis include: water velocity and sea floor velocities; and different receiver locations along the streamers or different receiver locations from the OBN survey. For each of these factors, different plausible values may be tested to determine the effect on the intersection point (the potential location of the crash). This uncertainty analysis may be implemented with a machine learning algorithm.
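One of the listed factors — water velocity — can be propagated to the source-receiver distance with a small Monte Carlo sketch; the travel time and the roughly 1% velocity uncertainty are assumptions chosen for illustration, not values from the disclosure.

```python
import numpy as np

rng = np.random.default_rng(42)

t_obs = 6.5          # hypothetical direct-wave travel time, s
v_nominal = 1500.0   # nominal water velocity, m/s
v_sigma = 15.0       # assumed ~1% water-velocity uncertainty, m/s

# Sample plausible velocity values and propagate each one to the
# source-receiver distance SR = v * t
v_samples = rng.normal(v_nominal, v_sigma, 10_000)
sr_samples = v_samples * t_obs

sr_mean = sr_samples.mean()  # central estimate of SR, m
sr_std = sr_samples.std()    # spread induced by velocity uncertainty, m
```

Repeating this per receiver and re-intersecting the circles for each sampled velocity yields a cloud of candidate crash locations whose spread expresses the positional uncertainty.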
  • the processor 102 may be further configured to perform analysis on different possible velocities of the water, as well as identify timing.
  • Referring to FIG. 2, a plurality of vessels of system 200 are depicted according to an exemplary embodiment.
  • Figure 2 may reference the same or similar components of system 100.
  • although Figure 2 illustrates single instances of components of system 200, system 200 may include any number of components.
  • a minimum of two towed streamer seismic surveys may be conducted by the vessels 210, 220 during the plane crash.
  • the first vessel 210 may comprise a first dimension 205, which may include a width approximately ranging from 800m to 1400m.
  • the first vessel 210 may comprise a second dimension 207, which may include a length approximately ranging from 8000m to 14000m.
  • the second dimension 207 may exceed the first dimension 205 by, for example, a factor of 10; however, it is not limited to such configurations or dimensions.
  • the second vessel 220 may comprise a first dimension 215, which may include a width approximately ranging from 800m to 1400m.
  • the second vessel 220 may comprise a second dimension 217, which may include a length approximately ranging from 8000m to 14000m.
  • the second dimension 217 may exceed the first dimension 215 by, for example, a factor of 10; however, it is not limited to such configurations or dimensions.
  • vessel 210 may be in transit in a southeast direction during the time of the plane crash at point 230, whereas vessel
  • a processor such as processor 102 of the first device 105, may be configured to receive the surveys from each of the vessels 210, 220 via second device 110.
  • the data received from each survey should include data with navigation information in the headers.
  • the data with navigation information in the headers may be provided with no additional processing.
  • the required processing may be the same for each data set on the designated device. Specific formulas for streamers and OBN may be used in processing to calculate the distances from the assumed source location to that of the specific receivers.
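The disclosure does not reproduce those specific formulas; the flat-layer textbook forms below are an assumption standing in for them, relating arrival time to distance for a direct water wave and for a head wave refracted along the sea floor. The velocities and water depth are hypothetical.

```python
import math

def direct_distance(t, v_water):
    """Direct wave: travel time t = SR / v_water, so SR = v_water * t."""
    return v_water * t

def refracted_offset(t, v_water, v_floor, depth):
    """Head wave refracted along the sea floor (flat-layer model):
    t = x / v_floor + 2 * depth * cos(theta_c) / v_water, where the
    critical angle satisfies sin(theta_c) = v_water / v_floor.
    Solves for the horizontal offset x."""
    cos_tc = math.sqrt(1.0 - (v_water / v_floor) ** 2)
    intercept = 2.0 * depth * cos_tc / v_water
    return (t - intercept) * v_floor

sr = direct_distance(6.0, 1500.0)  # 9000 m
x = refracted_offset(8.0, 1500.0, 2000.0, 3000.0)
```

Each receiver's arrival time thus maps to a distance SRi, which is what the triangulation step consumes.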
  • a survey may be conducted during the plane crash by system 300 according to an exemplary embodiment.
  • Figure 3 may reference the same or similar components of system 100, and system 200 of FIG. 2.
  • although Figure 3 illustrates single instances of components of system 300, system 300 may include any number of components.
  • an ocean bottom node 305 survey may be conducted during the plane crash at point 325.
  • the distance between the plurality of sensors 310 disposed on the seafloor may range from 200m to 1000m.
  • the ocean bottom node 305 may include any number of receivers for capturing data and transmitters for transmitting this data to the processor, such as processor 102 of first device 105.
  • the survey may take into account a region including a first dimension 315 and a second dimension 320.
  • the first dimension 315 may comprise a width of 20-40km.
  • the second dimension 320 may comprise a length of 40-60km.
  • the second dimension 320 may exceed the first dimension 315 by, for example, a factor of 2; however, it is not limited to such configurations or dimensions.
  • additional surveys from the ocean bottom node 305 may be conducted.
  • a processor such as processor 102 of the first device 105, may be configured to receive this survey from the ocean bottom node 305 via second device 110.
  • OBN devices may be deployed to a maximum depth, such as 3000m.
  • FIG. 4 may reference the same or similar components of system 100, system 200 of FIG. 2, and system 300 of FIG. 3.
  • system 400 may include any number of components.
  • an ocean bottom node 405 survey and a towed streamer survey from a first vessel 425 may be conducted during the plane crash at point 430.
  • the survey from the ocean bottom node 405 may take into account a region including a first dimension 415 and a second dimension 420.
  • the first dimension 415 may comprise a width of 20-40km.
  • the second dimension 420 may comprise a length of 40-60km.
  • the second dimension 420 may exceed the first dimension 415 by, for example, a factor of 2; however, it is not limited to such configurations or dimensions. It is understood that additional surveys from the ocean bottom node 405 may be conducted.
  • the distance between the plurality of sensors may range from 200m to 1000m.
  • the first vessel 425 may comprise a first dimension 427, which may include a width approximately ranging from 800m to 1400m.
  • the first vessel 425 may comprise a second dimension 429, which may include a length approximately ranging from 8000m to 14000m.
  • the second dimension 429 may exceed the first dimension 427 by, for example, a factor of 10; however, it is not limited to such configurations or dimensions. It is understood that additional surveys from the first vessel 425 may be conducted. Without limitation, vessel 425 may be in transit in an east direction during the time of the plane crash at point 430.
  • a processor, such as processor 102 of the first device 105, may be configured to receive each of these surveys from one or more second devices 110. The critical results from each survey may include: the receiver locations Ri where the source arrival was detected; calculated distances SRi based on formulas for direct waves and refracted waves; and water velocity profiles that describe the variation of water velocity with depth.
  • Figure 5 depicts a method 500 of determining object location according to an exemplary embodiment.
  • Figure 5 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, and system 400 of FIG. 4.
  • the method 500 may include collecting, by a processor, seismic data and geophysical data to determine object location.
  • the processor may belong to a device, such as first device 105.
  • the method 500 may include determining, by the processor, one or more seismic attributes associated with a plurality of types of noises based on the seismic data and the geophysical data using one or more machine learning algorithms.
  • the method 500 may include eliminating, by the processor, unwanted noises from noise classifications based on the one or more seismic attributes.
  • the method 500 may include predicting, by the processor, the object location by comparing time and velocity data of the object with recorded timing and velocity data.
  • the method 500 may include validating, by the processor, the object location by comparing the determined noise with image data.
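The five stages of method 500 can be outlined as a skeleton pipeline; every helper below is a placeholder standing in for the corresponding processing stage, not an implementation from the disclosure.

```python
steps = []

def collect_data():
    # Stage 1: collect seismic data and geophysical data
    steps.append("collect")
    return {"seismic": [], "geophysical": []}

def compute_attributes(data):
    # Stage 2: seismic attributes per noise type (machine learning)
    steps.append("attributes")
    return {"noise_types": []}

def eliminate_noise(attrs):
    # Stage 3: drop unwanted noises from the noise classifications
    steps.append("eliminate")
    return attrs

def predict_location(clean):
    # Stage 4: compare object time/velocity with recorded timing/velocity
    steps.append("predict")
    return (0.0, 0.0)

def validate_location(loc):
    # Stage 5: compare the determined noise with image data
    steps.append("validate")
    return loc

location = validate_location(
    predict_location(eliminate_noise(compute_attributes(collect_data())))
)
```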
  • Figure 6 depicts a node device 600 according to an exemplary embodiment.
  • Figure 6 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, system 400 of FIG. 4, and method of FIG. 5.
  • the node device 600 may comprise an ocean bottom node device.
  • OBN surveys are 4C (four-component): a hydrophone, a vertical geophone, a horizontal geophone oriented in the X direction, and a horizontal geophone oriented in the Y direction.
  • the deployment of the receivers may be on a grid, with a grid interval that may vary from 400 m x 400 m to 1100 m x 1100 m.
  • the number of nodes 600 that may be deployed with the systems and methods of the present disclosure may range from 2000 to 4000 nodes. For example, if the seismic survey is an ocean bottom node (OBN) survey, any and all four component (4C) receivers may be used.
  • Figure 7 depicts an illustration of data recording 700 according to an exemplary embodiment.
  • Figure 7 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, system 400 of FIG. 4, method of FIG. 5, and device 600 of FIG. 6.
  • waves may be recorded by receivers in the streamers.
  • direct waves and refracted waves may be recorded by the streamer receivers if the location of the object crash is within the seismic survey area.
  • S represents the assumed location of the source.
  • the distances SRi, with i representing the receiver index.
  • FIG. 8 depicts an illustration of data recording 800 according to an exemplary embodiment.
  • Figure 8 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, system 400 of FIG. 4, method of FIG. 5, device 600 of FIG. 6, and data recording 700 of FIG. 7.
  • waves may be recorded by receivers in the OBN devices.
  • direct waves and refracted waves may be recorded by the OBN receivers if the location of the object crash is within the seismic survey area.
  • S represents the assumed location of the source.
  • FIG. 9 depicts a graph 900 according to an exemplary embodiment.
  • Figure 9 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, system 400 of FIG. 4, method of FIG. 5, device 600 of FIG. 6, data recording 700 of FIG. 7, and data recording 800 of FIG. 8.
  • An example of velocity analysis is illustrated in FIG. 9.
  • water velocity analysis may be based on measuring the semblance on common midpoint (CMP) gathers that are normal moveout (NMO) corrected with different velocities.
  • Other methods may be used to estimate velocity in water or sediments, including but not limited to full wavefield inversion (FWI) and tomography.
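The semblance-based water-velocity analysis described above can be sketched as follows, simplified to pick a single time sample per trace rather than smoothing over a window; the gather, geometry, and trial velocities are synthetic assumptions.

```python
import numpy as np

def semblance(gather, offsets, dt, t0, velocities):
    """Semblance of a CMP gather evaluated along hyperbolic NMO
    trajectories t(x) = sqrt(t0^2 + (x / v)^2), one score per
    trial velocity (single sample per trace for brevity)."""
    scores = []
    for v in velocities:
        amps = []
        for trace, x in zip(gather, offsets):
            t = np.sqrt(t0**2 + (x / v) ** 2)
            idx = int(round(t / dt))
            amps.append(trace[idx] if idx < len(trace) else 0.0)
        amps = np.array(amps)
        denom = len(amps) * np.sum(amps**2)
        scores.append(np.sum(amps) ** 2 / denom if denom > 0 else 0.0)
    return np.array(scores)

# Synthetic gather: one reflection whose moveout velocity is 1500 m/s
dt, t0, v_true = 0.004, 2.0, 1500.0
offsets = np.linspace(200.0, 3000.0, 24)
gather = np.zeros((len(offsets), 1500))
for i, x in enumerate(offsets):
    t = np.sqrt(t0**2 + (x / v_true) ** 2)
    gather[i, int(round(t / dt))] = 1.0

velocities = np.arange(1200.0, 1900.0, 25.0)
scores = semblance(gather, offsets, dt, t0, velocities)
best = velocities[int(np.argmax(scores))]
```

The trial velocity that maximizes the semblance score recovers the moveout velocity, which is the quantity the distance formulas depend on.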
  • Figure 10 depicts a method 1000 of object location according to an exemplary embodiment.
  • Figure 10 may reference the same or similar components of system 100, system 200 of FIG. 2, system 300 of FIG. 3, system 400 of FIG. 4, method of FIG. 5, device 600 of FIG. 6, data recording 700 of FIG. 7, data recording 800 of FIG. 8, and graph 900 of FIG. 9.
  • an illustration of a triangulation method 1000 may be used to estimate the location of the crash.
  • the triangulation method 1000 utilized herein may be used to identify the epicenters of earthquakes using measurements from a plurality of stations, such as three seismic stations as illustrated in FIG. 10.
  • based on the estimated source-receiver distances SRi, with i as the receiver index, circles with radii SRi corresponding to a minimum of three receivers may be drawn, and their intersection may provide the potential location of the plane crash.
  • PL refers to the potential location of the crash determined from surface seismic data.
  • V1 is the location of the receiver for a first vessel with radius SR1
  • V2 is the location of the receiver for a second vessel with radius SR2
  • V3 is the location of the receiver for a third vessel with radius SR3.
  • Each of the receivers may be of the same type or a different type as the other receivers.
  • systems and methods described herein may be tangibly embodied in one or more physical media, such as, but not limited to, a compact disc (CD), a digital versatile disc (DVD), read-only memory (ROM), and random access memory (RAM).
  • data storage may include random access memory (RAM) and read-only memory (ROM).
  • Data storage may also include storage media or other suitable type of memory (e.g., such as, for example, RAM, ROM, programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic disks, optical disks, floppy disks, hard disks, removable cartridges, flash drives, any type of tangible and non-transitory storage medium), where the files that comprise an operating system, application programs including, for example, web browser application, email application and/or other applications, and data files may be stored.
  • the data storage of the network-enabled computer systems may include electronic information, files, and documents stored in various ways, including, for example, a flat file, indexed file, hierarchical database, relational database, such as a database created and maintained with software from, for example, Oracle® Corporation, a Microsoft® Excel file, a Microsoft® Access file, a solid-state storage device, which may include a flash array, a hybrid array, or a server-side product, enterprise storage, which may include online or cloud storage, or any other storage mechanism.
  • the figures illustrate various components (e.g., servers, computers, processors, etc.) separately. The functions described as being performed at various components may be performed at other components, and the various components may be combined or separated. Other modifications also may be made.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Ocean & Marine Engineering (AREA)
  • Geophysics And Detection Of Objects (AREA)

Abstract

Systems and methods of object location determination are disclosed, which may include a memory and a processor. The processor may be configured to collect seismic data and geophysical data to determine the location of an object. The processor may be configured to determine one or more seismic attributes associated with a plurality of types of noises based on the seismic data and the geophysical data using one or more machine learning algorithms. The processor may be configured to eliminate unwanted noises from noise classifications based on the one or more seismic attributes. The processor may be configured to predict the object location by comparing time and velocity data of the object with recorded timing and velocity data. The processor may be configured to validate the object location by comparing the determined noise with image data. The systems and methods may be used, for example, to detect missing airplanes such as Malaysian Airlines flight 370.
PCT/US2021/029134 2021-04-26 2021-04-26 Systems and methods for object location detection, such as detecting airplane crash location WO2022231567A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/US2021/029134 WO2022231567A1 (fr) Systems and methods for object location detection, such as detecting airplane crash location
US17/737,684 US11562655B2 (en) 2021-04-26 2022-05-05 Aircraft rescue systems and methods using predictive models
PCT/US2023/021113 WO2023215537A1 (fr) Aircraft rescue systems and methods using predictive models

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2021/029134 WO2022231567A1 (fr) Systems and methods for object location detection, such as detecting airplane crash location

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/737,684 Continuation-In-Part US11562655B2 (en) 2021-04-26 2022-05-05 Aircraft rescue systems and methods using predictive models

Publications (1)

Publication Number Publication Date
WO2022231567A1 true WO2022231567A1 (fr) 2022-11-03

Family

ID=83693420

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2021/029134 WO2022231567A1 (fr) Systems and methods for object location detection, such as detecting airplane crash location
PCT/US2023/021113 WO2023215537A1 (fr) Aircraft rescue systems and methods using predictive models

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/US2023/021113 WO2023215537A1 (fr) Aircraft rescue systems and methods using predictive models

Country Status (2)

Country Link
US (1) US11562655B2 (fr)
WO (2) WO2022231567A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170227638A1 (en) * 2016-01-04 2017-08-10 Raytheon Bbn Technologies Corp. Bobber Field Acoustic Detection System
US20200049848A1 (en) * 2003-05-30 2020-02-13 Magseis Ff Llc Ocean bottom seismometer package
US20200110185A1 (en) * 2018-10-05 2020-04-09 Es Xplore, L.L.C. Passive electroseismic surveying


Also Published As

Publication number Publication date
US20220343774A1 (en) 2022-10-27
WO2023215537A1 (fr) 2023-11-09
US11562655B2 (en) 2023-01-24

Similar Documents

Publication Publication Date Title
Stork et al. Application of machine learning to microseismic event detection in distributed acoustic sensing data
Ross et al. Automatic picking of direct P, S seismic phases and fault zone head waves
Verdon et al. Microseismic monitoring using a fiber-optic distributed acoustic sensor array
Festa et al. Earthquake magnitude estimation from early radiated energy
US20210063570A1 (en) Diffraction Imaging using Pseudo Dip-Angle Gather
US20220091289A1 (en) Networked System and Method for Passive Monitoring, Locating or Characterizing Activities
US10802169B2 (en) Determining node depth and water column transit velocity
Tan et al. Seismicity‐scanning based on navigated automatic phase‐picking
US10948617B2 (en) Generating a velocity model for a subsurface structure using refraction travel time tomography
WO2021126814A1 (fr) Cartographie d'hétérogénéités de surface proche dans une formation souterraine
US10921472B2 (en) Determing first-break points in seismic data
Xuan et al. Probabilistic microearthquake location for reservoir monitoring
US11194068B1 (en) Systems and methods for object location detection such as detecting airplane crash location
Garza‐Girón et al. A specific earthquake processing workflow for studying long‐lived, explosive volcanic eruptions with application to the 2008 Okmok Volcano, Alaska, eruption
US11562655B2 (en) Aircraft rescue systems and methods using predictive models
US11740374B2 (en) System and method for randomness measurement in sesimic image data using vectorized disorder algorithm
US11460595B2 (en) Unified continuous seismic reservoir monitoring
Rentsch et al. Migration-based location of seismicity recorded with an array installed in the main hole of the San Andreas Fault Observatory at Depth (SAFOD)
Wu et al. Magnitude determination using cumulative absolute absement for earthquake early warning
US11768303B2 (en) Automatic data enhancement for full waveform inversion in the midpoint-offset domain
US11898901B2 (en) Method and system for mapping fiber optic distributed acoustic sensing measurements to particle motion
EP3959545B1 (fr) Surveillance de réservoir sismique au moyen de points de transmission communs
Xie et al. Integrating distributed acoustic sensing and computer vision for real-time seismic location of landslides and rockfalls along linear infrastructure
Dittmann Improving Earthquake Monitoring and Early Warning Using GNSS Velocities and Machine Learning
US20180364382A1 (en) Similarity Determination based on a Coherence Function

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21939518

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21939518

Country of ref document: EP

Kind code of ref document: A1