CN110545380B - Video system and method for data communication - Google Patents

Video system and method for data communication

Info

Publication number
CN110545380B
CN110545380B CN201910851198.7A
Authority
CN
China
Prior art keywords
vehicle
camera
image data
data
communicate
Prior art date
Legal status
Active
Application number
CN201910851198.7A
Other languages
Chinese (zh)
Other versions
CN110545380A (en)
Inventor
M·B·克雷林
M·S·米纳
S·J·克劳斯
A·阿扎姆
M·L·布莱尔
N·奈萨尼
D·J·劳
A·宾德
S·D·查基
S·D·纳尔逊
N·U·纳法德
钟永扬
D·M·巴勒斯蒂
G·R·沙菲尔
J·J·基萨克
Current Assignee
Transportation IP Holdings LLC
Original Assignee
GE Global Sourcing LLC
Priority date
Filing date
Publication date
Priority claimed from US14/217,672 external-priority patent/US11124207B2/en
Priority claimed from US14/253,294 external-priority patent/US9875414B2/en
Priority claimed from US14/457,353 external-priority patent/US20150235094A1/en
Priority claimed from US14/479,847 external-priority patent/US20150269722A1/en
Priority claimed from US14/541,370 external-priority patent/US10110795B2/en
Application filed by GE Global Sourcing LLC filed Critical GE Global Sourcing LLC
Publication of CN110545380A publication Critical patent/CN110545380A/en
Application granted granted Critical
Publication of CN110545380B publication Critical patent/CN110545380B/en

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
        • B61 RAILWAYS
            • B61L GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
                • B61L23/00 Control, warning, or like safety means along the route or between vehicles or vehicle trains
                    • B61L23/04 Control, warning, or like safety means along the route or between vehicles or vehicle trains for monitoring the mechanical state of the route
                        • B61L23/041 Obstacle detection
                        • B61L23/042 Track changes detection
                            • B61L23/045 Rail wear
                            • B61L23/047 Track or rail movements
                            • B61L23/048 Road bed changes, e.g. road bed erosion
    • H ELECTRICITY
        • H04 ELECTRIC COMMUNICATION TECHNIQUE
            • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N5/00 Details of television systems
                    • H04N5/76 Television signal recording
                        • H04N5/765 Interface circuits between an apparatus for recording and another apparatus
                            • H04N5/77 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
                                • H04N5/772 Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera the recording apparatus and the television camera being placed in the same enclosure
                • H04N7/00 Television systems
                    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
                        • H04N7/183 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source
                • H04N9/00 Details of colour television systems
                    • H04N9/79 Processing of colour television signals in connection with recording
                        • H04N9/80 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
                            • H04N9/804 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components
                                • H04N9/8042 Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback involving pulse code modulation of the colour picture signal components involving data reduction

Abstract

A camera system and method of capturing image data. The system includes a camera, a data storage device electrically connected to the camera and configured to store the image data, and/or a communication device electrically connected to the camera and configured to communicate the image data to a system receiver located remotely from the camera. The system receiver may be located on the vehicle so that an operator can carry the camera off board the vehicle and transmit image data back to the vehicle when, for example, performing work on the vehicle or inspecting the vehicle or the vehicle's surroundings.

Description

Video system and method for data communication
Description of the patent
The present application is a divisional application of the Chinese patent application with a filing date of January 30, 2015, application number 2015800201304, and entitled "Video system and method for data communication".
Technical Field
Embodiments of the subject matter disclosed herein relate to obtaining and communicating video data; such data may be associated with a vehicle or a transportation network.
Background
A vehicle may sometimes be equipped with a camera unit for capturing and storing video data of the environment surrounding the vehicle. For example, a law enforcement vehicle may be provided with a "dashboard cam" that records the field of view through the front windshield of the vehicle, capturing video data of interactions between law enforcement personnel and, for example, the occupants of another vehicle. As another example, a passenger vehicle may be provided with a rearview camera positioned to capture a video stream of the area directly behind the vehicle, which is displayed on a console display screen to assist the driver in safely backing the vehicle.
In addition to on-board cameras, transportation networks (referring to infrastructure for vehicle movement, such as railroad tracks for rail vehicles, or highway and other road networks for cars, semi-trailers, or other highway vehicles) are sometimes equipped with roadside cameras for capturing video data of the transportation network. For example, cameras may be attached to posts on the side of a highway to capture video data of the highway for traffic tracking and reporting purposes.
For both on-board and roadside camera systems, the camera system is typically fixed in position to capture video data only for a designated field of view, such as the area ahead of or behind a vehicle or a designated road segment. For vehicles, this is because the camera system is designed to capture video data that may be safety critical (e.g., the rear view) or important from a public policy perspective (e.g., law enforcement dashboard cams). For roadside camera systems, this is because the designated field of view must be constantly monitored (e.g., toll gate views) or the data must be kept consistent (e.g., road monitoring over time).
Disclosure of Invention
In one embodiment, a system (e.g., a camera or video system) includes a camera, at least one of a data storage device or a communication device, a camera-supporting object, a positioning apparatus, and a control unit. The camera may be configured to capture at least image data. The data storage device may be electrically coupled to the camera and configured to store the image data. The communication device may be electrically coupled to the camera and configured to communicate the image data to a system receiver. The camera-supporting object may be coupled to the camera. The positioning apparatus may be configured to detect a position of the camera-supporting object. The control unit may be configured to communicate with the system receiver and the positioning apparatus and to control the camera based at least in part on the position of the camera-supporting object.
In another embodiment, a method (e.g., for obtaining and/or communicating image data) includes obtaining image data from a camera configured to capture the image data (where the camera may be supported by a camera-supporting object), determining a position of the camera-supporting object with a positioning apparatus, and controlling the camera based at least in part on the position of the camera-supporting object detected by the positioning apparatus.
Drawings
The subject matter described herein will be better understood from reading the following description of non-limiting embodiments, with reference to the accompanying drawings, in which:
FIG. 1 illustrates a camera system for capturing and communicating transportation data related to a vehicle or otherwise related to a transportation system, according to one embodiment;
FIG. 2 illustrates a camera system according to another embodiment;
FIG. 3 illustrates another embodiment of a camera system;
FIG. 4 illustrates another embodiment of a camera system having a garment and a portable camera unit attached and/or attachable to the garment;
FIG. 5 illustrates another embodiment of a camera system;
FIG. 6 illustrates one embodiment of a vehicle;
FIG. 7 illustrates a control system according to one embodiment;
FIG. 8 illustrates a transport system receiver located on a vehicle according to one embodiment;
FIG. 9 illustrates another embodiment of a camera system;
FIG. 10 illustrates another embodiment of a camera system;
FIG. 11 illustrates a perspective view of a camera system;
FIG. 12 illustrates a side view of the camera system shown in FIG. 11;
FIG. 13 illustrates a top view of the camera system shown in FIG. 11;
FIG. 14 illustrates a schematic representation of an image analysis system according to an embodiment;
FIG. 15 illustrates a flow diagram of one embodiment of a method for obtaining and/or analyzing image data for transport data communication.
Detailed Description
Embodiments described herein relate to a video unit for capturing and communicating video data in a transportation system or network. For example, a camera may be deployed on a rail vehicle or other vehicle and then carried by an operator of the vehicle (e.g., when performing work on the vehicle, inspecting the vehicle or vehicle surroundings, or the like) to capture video data of the vehicle and its surroundings for storage for later use, or for display or other use on the vehicle. Alternatively, the camera may be coupleable to a powered camera-supporting object such that the camera may be mobile. That is, the camera and its supporting object may be able to move independently or separately from the movement of the operator or the associated vehicle. For example, the camera may be connected to or otherwise disposed on an aerial device (e.g., a drone, helicopter, or airplane) to allow the camera unit to fly, or the camera unit may be connected to or otherwise disposed on another ground or water mobility system (e.g., a robot or remote-controlled vehicle) to allow the robot and camera to move relative to the vehicle, or the like. In one embodiment, the camera-supporting object is a first ground vehicle having at least one of remote-control or autonomous movement capability relative to a second ground vehicle along a route of the second vehicle. The first ground vehicle is intended to travel ahead of the second ground vehicle along the route and to transmit image data back to the second ground vehicle. This may provide the operator of the second vehicle with a view of the route well before the second vehicle arrives. For very high speed second vehicles, the stopping distance may exceed the visibility available from the vantage point of the non-aerial vehicle. The view from the first vehicle may then extend or supplement this visibility range. In addition, the camera itself may be repositionable and may have the ability to pan left, right, up, and down, as well as the ability to zoom in and out.
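As a rough, hedged illustration of the visibility point above, the sketch below compares a simple constant-deceleration braking-distance estimate with an assumed on-board sight line and an assumed look-ahead from a lead camera vehicle; all numeric values are illustrative assumptions, not figures from this disclosure.

```python
# Illustrative sketch only: compares a simple braking-distance estimate with the
# forward visibility available from the vehicle and from a lead camera vehicle.
# The deceleration value and look-ahead distances are assumptions, not figures
# from the patent.

def braking_distance_m(speed_m_s: float, deceleration_m_s2: float) -> float:
    """Approximate stopping distance assuming constant deceleration: v^2 / (2a)."""
    return speed_m_s ** 2 / (2.0 * deceleration_m_s2)

speed = 45.0                 # m/s (~160 km/h), assumed high-speed case
decel = 0.7                  # m/s^2, assumed service-brake deceleration for a heavy consist
onboard_visibility = 900.0   # m, assumed sight line from the trailing vehicle's cab
lead_camera_offset = 2000.0  # m, assumed distance the camera vehicle runs ahead

needed = braking_distance_m(speed, decel)
print(f"braking distance ~{needed:.0f} m")
print("onboard view sufficient:", onboard_visibility >= needed)
print("with lead camera view:", onboard_visibility + lead_camera_offset >= needed)
```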
As used herein, a camera is a device for capturing and/or recording visual images. These images may take the form of still images, analog video signals, or digital video signals. The signals, in particular digital video signals, may be subjected to a compression/decompression algorithm, such as MPEG or HEVC, for example. Suitable cameras may capture and record in a defined band of light or energy. For example, in one embodiment the camera may sense wavelengths in the visible spectrum, and in another embodiment the camera may sense wavelengths in the infrared spectrum. Multiple sensors may be combined in a single camera and may be selectively used based on the application. Further, stereoscopic and 3D capture devices are contemplated for at least some embodiments described herein. These cameras can help determine distances, velocities, and vectors to predict (and thereby avoid) collisions and damage. The terms consist and vehicle consist refer to two or more vehicles or items of mobile equipment that are mechanically or logically coupled to each other. With logical coupling, the items of mobile equipment are controlled such that commanding one of the items to move causes a corresponding movement in the other items in the group, for example through wireless commands. A multiple-unit Ethernet (eMU) system may include, for example, a communication system for use in communicating data from one vehicle to another in a consist (e.g., an Ethernet network through which data is communicated between two or more vehicles).
FIG. 1 illustrates a camera system 100 for capturing and communicating transportation data related to a vehicle or otherwise related to a transportation system, according to one embodiment. The system includes a portable camera unit 102 having a camera 104, a data storage device 106 and/or a communication device 108, and a battery or other energy storage device 110. The camera unit may be portable in that it is small and/or light enough to be carried by a single adult. The camera unit is configured to capture and/or generate image data 112 of the camera unit's field of view 101. For example, the field of view may represent the solid angle through which the camera unit is sensitive to light, electromagnetic radiation, or other energy used to form an image, video, or the like. The image data may include still images of one or more objects within the field of view of the camera unit, video (e.g., moving images or a series of images representing moving objects), or the like. In any of the embodiments of any of the camera systems described herein, data other than image data may be captured and communicated, e.g., a portable camera unit may have a microphone for capturing audio data, a vibration sensor for capturing vibration data, and so forth.
A suitable portable camera unit may be an internet protocol camera unit, such as a camera that can transmit video data via the Internet or another network. In one aspect, the camera may be a digital camera capable of obtaining relatively high quality image data (e.g., static or still images and/or video). For example, the camera may be an Internet Protocol (IP) camera that generates packetized image data. The camera may be a high definition (HD) camera capable of obtaining image data at a relatively high resolution. For example, the camera may obtain image data having a resolution of at least 480 horizontal scan lines, at least 576 horizontal scan lines, at least 720 horizontal scan lines, at least 1080 horizontal scan lines, or even greater. Alternatively, the camera may be another type of camera.
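As a minimal sketch of how packetized image data from an IP camera might be pulled by a receiver, the following assumes an RTSP-capable camera and the OpenCV library; the stream address is a placeholder and not part of this disclosure.

```python
# Minimal sketch of pulling frames from an IP camera over RTSP using OpenCV.
# The stream URL is a placeholder; real deployments would also handle
# authentication, reconnection, and time-stamping of frames.
import cv2

STREAM_URL = "rtsp://192.0.2.10:554/stream1"  # placeholder address, not a real endpoint

capture = cv2.VideoCapture(STREAM_URL)
if capture.isOpened():
    ok, frame = capture.read()  # frame is an HxWx3 BGR array when ok is True
    if ok:
        height, width = frame.shape[:2]
        print(f"received {width}x{height} frame")  # e.g. 1920x1080 for an HD unit
else:
    print("could not open camera stream")
capture.release()
```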
The data storage device may be electrically connected to the camera unit and configured to store image data. The data storage device may include one or more computer hard drives, removable drives, magnetic drives, read-only memory, random access memory, flash drives or other solid state storage devices, or the like. Alternatively, the data storage device may be located remotely from the camera unit, for example by being separated from the camera unit by at least a few centimeters, meters, or kilometers, as determined at least in part by the application at hand.
The communication device may be electrically connected to the camera unit and configured to wirelessly communicate (e.g., transmit, propagate, or the like) the image data to a transport system receiver 114 located outside of the camera unit. Alternatively, the image data may be communicated to the receiver via one or more wired connections, over power lines, through other data storage devices, or the like. The communication device and/or receiver may represent hardware circuitry or circuits, such as transceiver circuitry and associated hardware (e.g., an antenna 103), including and/or connected to one or more processors (e.g., microprocessors, controllers, or the like).
The energy storage device may be electrically connected to the camera unit, the data storage device and/or the communication device. The energy storage device may represent one or more devices that store and/or generate electrical current to power the camera unit, the data storage device, and/or the communication device. For example, the energy storage device may include one or more batteries, a pantograph (e.g., that receives current from an off-board source via a catenary or overhead line), a conductive shoe (e.g., that contacts a conductor such as an electrified rail to receive current from an off-board source), a generator, an alternator, or the like.
In one embodiment, the camera unit includes a camera, a data storage device, and an energy storage device, but does not include a communication device. In such embodiments, the camera unit may be used to store the captured image data for later retrieval and use. In another embodiment, the camera unit includes a camera, a communication device, and an energy storage device, but does not include a data storage device. In such embodiments, the portable camera unit may be used to communicate image data to the vehicle or other location for immediate use (e.g., display on a display screen) and/or for storage remote from the portable camera unit (i.e., for storage not within the portable camera unit). In another embodiment, the camera unit includes a camera, a communication device, a data storage device, and an energy storage device. In such embodiments, the portable camera unit may have multiple modes of operation, such as a first mode of operation in which image data is stored within the portable camera unit on the data storage device 106, and a second mode of operation in which image data is transferred away from the portable camera unit for remote storage elsewhere and/or immediate use.
The camera may be a digital video camera, such as a camera having a lens, an electronic sensor for converting light passing through the lens into electronic signals, and a controller for converting the electronic signals output by the electronic sensor into image data, which may be formatted according to a standard such as MP4. The data storage device (if present) may be a hard disk drive, flash memory (an electronic non-volatile non-transitory computer storage medium), or the like. The communication device, if present, may be a wireless local area network (LAN) transmitter (e.g., a Wi-Fi transmitter), a radio frequency (RF) transmitter operating in and transmitting according to one or more commercial cellular frequencies/protocols (e.g., 3G or 4G), and/or an RF transmitter configured to wirelessly communicate at a frequency used for vehicle communications (e.g., at a frequency compatible with a wireless receiver of a distributed power system of a rail vehicle; distributed power refers to coordinated traction control, such as throttle and braking, of a train or other rail vehicle consist having a plurality of locomotives or other powered rail vehicle units). Suitable energy storage devices may be rechargeable lithium-ion batteries, rechargeable Ni-MH batteries, alkaline batteries, or other devices configured for portable energy storage for use in electronic devices. While they generate rather than store energy, other suitable energy sources include piezoelectric vibration harvesters and solar panels, where energy is generated and then provided to the camera system.
The camera unit may comprise a positioning device 105 which generates data for determining the position of the camera unit. Positioning device 105 may represent one or more hardware circuits or circuits that include and/or are coupled to one or more processors (e.g., controllers, microprocessors, or other electronic logic-based devices). In one example, the positioning device 105 represents: a Global Positioning System (GPS) receiver that determines a location of the camera unit; a beacon or other communication device that propagates or transmits a signal received by another component (e.g., a transportation system receiver) to determine how far the camera unit is from the component (e.g., receiver) receiving the signal; a Radio Frequency Identification (RFID) tag or reader that transmits and/or receives electromagnetic radiation to determine how far a camera unit is from another RFID reader or tag (e.g., a receiver); or the like. The receiver may receive a signal from the locating device 105 to determine the location of the locating device 105 relative to the receiver and/or another location (e.g., relative to a vehicle or vehicle system). Additionally or alternatively, the locating device 105 may receive a signal from a receiver (e.g., which may include a transceiver capable of transmitting and/or propagating a signal) to determine a location of the locating device 105 relative to the receiver and/or another location (e.g., relative to a vehicle or vehicle system).
Fig. 2 illustrates a camera system 200 according to another embodiment. The system 200 includes a garment 116 configured to be worn or carried by an operator 118 (e.g., a vehicle operator, a transportation worker, or others). The portable camera unit may be attached to the garment. For example, the garment may be a hat 120 (which includes head-worn garments), an eye device 122 (e.g., a Google Glass™ device or other eyepiece), a band or watch 124, a jacket 126 or other outerwear, a writing tablet, or the like. The camera unit may be detachably attachable to the garment, or in other embodiments the portable camera unit may be integrated into the garment or otherwise permanently attached to the garment. Attaching the portable camera unit to the garment may allow the portable camera unit to be worn by a human operator of the vehicle (or a human operator otherwise associated with the transport system) for capturing image data associated with the human operator performing one or more functions with respect to the vehicle or, more generally, the transport system.
For example, in one embodiment, the portable camera unit includes a communication device that may be configured to wirelessly communicate image data to the transport system receiver. The transport system receiver may be located on the vehicle 128 (shown in fig. 3), at a wayside location 130 along the route of the vehicle 128 (shown in fig. 4), or otherwise remote from the vehicle 128 (as shown in fig. 5). Remote generally refers to not being on the vehicle, and in embodiments, more specifically, not being within the immediate vicinity of the vehicle, such as not being within WiFi and/or cellular range of the vehicle. In one aspect, the camera unit may be secured to clothing worn by an operator of the vehicle 128 and provide image data representative of an area surrounding the operator. For example, the image data may represent the area viewed by the operator. The image data may no longer be generated by the camera unit during periods when the operator is within the vehicle 128 or within a specified distance from the vehicle 128. The camera unit may automatically generate and/or store image data when away from the vehicle 128 or when moving farther than a specified distance (e.g., five meters) from the vehicle 128. As described herein, the image data may be communicated to a display on the vehicle 128 or in another location so that another person on the vehicle 128 may determine the location of the operator carrying the camera unit based on the image data. With respect to rail vehicles, one such example may be an operator leaving the cab of a locomotive. If the operator were to disconnect a car from the rail vehicle (which includes the locomotive), image data obtained by the camera unit on the clothing worn by the operator may be recorded and displayed to an engineer on the locomotive. The engineer may review the image data as a check to ensure that the locomotive is not moved while the conductor is between cars of the rail vehicle. Once it is clear from the image data that the conductor is not in the way, the engineer may control the locomotive to move the rail vehicle.
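A hedged sketch of the distance test described above is shown below; it assumes GPS fixes are available for both the camera unit and the vehicle and uses the five-meter figure from the example. The helper names and coordinates are illustrative, not part of this disclosure.

```python
# Sketch of the distance test described above: the camera unit starts recording
# once it is more than a specified distance (five meters in the example) from
# the vehicle. GPS fixes and the threshold handling are assumptions.
import math

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 fixes, in meters (haversine)."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

THRESHOLD_M = 5.0  # specified distance from the example above

def should_record(camera_fix, vehicle_fix):
    return distance_m(*camera_fix, *vehicle_fix) > THRESHOLD_M

# Example: operator has walked roughly 40 m from the vehicle, so recording is enabled.
print(should_record((41.88000, -87.63000), (41.88035, -87.63010)))
```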
Alternatively, the image data may be examined autonomously by one or more of the image data analysis systems or image analysis systems described herein. For example, one or more of the transport receiver system 114, the vehicle, and/or the camera unit may include an image data analysis system (also referred to as an image analysis system) that examines the image data for one or more purposes described herein.
Fig. 3 illustrates another embodiment of a camera system 300. The system may include a display screen system 132 that is located remotely from the portable camera unit and the vehicle. The display screen system receives the image data live from the transport system receiver and displays the image data (e.g., converted back to moving images) on the display screen 134 of the display screen system. Live may mean that the displayed image data represents the imaged objects at essentially the same time as the image data is captured (barring any communication lag associated with communicating the image data from the portable camera unit to the display screen system). Such embodiments may be used, for example, to communicate image data that is captured by a human operator wearing or otherwise using the portable camera unit, and that is associated with the human operator performing one or more tasks associated with the vehicle (e.g., vehicle inspection) or otherwise associated with the transportation network (e.g., rail track inspection), to a remote human operator viewing the display screen. The remote human operator may be, for example, an expert in a particular task or tasks, and may provide suggestions or instructions to the on-site human operator based on the image data.
Fig. 4 illustrates another embodiment of a camera system 400, the camera system 400 having a garment and a portable camera unit attached and/or attachable to the garment. The system may be similar to other camera systems described herein, and further includes a position detection unit 136 and a control unit 138. The position detection unit detects a position of a transportation worker wearing the garment. The position detection unit may be connected to and part of the garment, the portable camera unit, or a vehicle or wayside device. The position detection unit may be, for example, a Global Positioning System (GPS) unit, or a switch or other sensor that detects when a human operator (wearing the garment) is at a particular location in the vehicle or elsewhere. In one embodiment, the position detection unit may detect the presence of a wireless signal when the camera is within a specified range of the vehicle or the vehicle cab. The position detection unit may determine that the camera unit is no longer in the vehicle or the vehicle cab in response to no longer detecting the wireless signal or in response to the signal strength falling below a specified threshold.
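The wireless-signal variant of the position check could look like the following sketch, where the RSSI threshold and example values are assumptions for illustration only.

```python
# Sketch of the wireless-signal test described above: if the cab access point's
# signal strength drops below a threshold (or disappears), the camera unit is
# treated as outside the cab. RSSI values and the threshold are assumptions.
RSSI_THRESHOLD_DBM = -70  # assumed cut-off for "within the cab"

def operator_in_cab(rssi_dbm):
    """rssi_dbm is the measured signal strength, or None if no signal is seen."""
    return rssi_dbm is not None and rssi_dbm >= RSSI_THRESHOLD_DBM

print(operator_in_cab(-55))   # strong signal -> True (inside cab)
print(operator_in_cab(-82))   # weak signal   -> False (likely outside)
print(operator_in_cab(None))  # no signal     -> False
```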
A control unit (which may be part of the portable camera unit) controls the portable camera unit based at least in part on the position of the transportation worker detected by the position detection unit. The control unit may represent a hardware circuit or circuits that includes and/or is connected to one or more processors (e.g., microprocessors, controllers, or the like).
In one embodiment, the control unit controls the portable camera unit to a first mode of operation when the position of the transportation worker detected by the position detection unit indicates that the transportation worker is at the operator terminal 140 of the vehicle (e.g., in the cab 142 of the vehicle), and controls the portable camera unit to a different, second mode of operation when the position of the transportation worker detected by the position detection unit indicates that the transportation worker is not at the operator terminal of the vehicle. For example, in the first mode of operation, the portable camera unit is disabled from at least one of capturing, storing, and/or communicating image data, and in the second mode of operation, the portable camera unit is enabled to capture, store, and/or communicate image data. Thus, in such embodiments, the portable camera unit may be disabled from capturing image data when the operator is located at the operator terminal and enabled when the operator is away from the operator terminal. The control unit may cause the camera to record image data so that the operator's movements can be tracked when the operator leaves the operator cab or the operator terminal. For example, in the context of a rail vehicle, the movements of the operator may be examined using the image data to determine whether the operator is working on the correct portion of the vehicle, at the time the operator is scheduled to work, or the like. As another example, in the context of a police or other law enforcement officer, the control unit may cause the camera to record images and/or video to generate additional evidence of an event (e.g., an altercation with a suspect, questioning a driver after a car accident, performing a field sobriety test, or the like) in response to the operator leaving the operator's cab of the vehicle (e.g., an officer leaving a police car).
In another embodiment, the control unit is configured to control the portable camera unit to a first operating mode when the position of the transportation worker detected by the position detection unit 136 indicates that the transportation worker is in the operator's cab 142 of the vehicle and to control the portable camera unit to a different second operating mode when the position of the transportation worker detected by the position detection unit indicates that the transportation worker is not in the operator's cab of the vehicle. For example, a portable camera unit may be enabled for capturing image data while the operator is in the operator's cab and disabled from capturing image data while the operator is outside the operator's cab. As should be appreciated, enabling may include powering up, and disabling may include powering down.
In another embodiment, the system has a display screen 144 in the operator cab of the rail vehicle. The communication device of the portable camera unit may wirelessly communicate the image data to a transport system receiver, which may be located on the vehicle and operatively connected to the display screen for display of the image data on the display screen. Such embodiments may be useful for one operator of a vehicle to view image data captured by another operator of the vehicle using the portable camera unit. For example, if the portable camera system is attached to a garment worn by one operator while performing a task outside the vehicle, video data associated with the task may be transmitted back to the other operators remaining in the operator's cab for regulatory or safety purposes.
Fig. 5 illustrates another embodiment of a camera system 500. An on-vehicle control system 146 may be provided for controlling movement of the vehicle. The control system may include or represent a control unit and may include hardware circuitry or circuitry including and/or connected with one or more processors (e.g., microprocessors, controllers, or the like). The control system may control operation of the vehicle, such as by communicating command signals to a propulsion system (e.g., motor, engine, brakes, or the like) of the vehicle for controlling an output of the propulsion system.
The control system may prevent movement of the vehicle in response to a first data content of the image data and allow movement of the vehicle in response to a second, different data content of the image data. For example, a control system on the vehicle may engage a brake and/or prevent the motor from moving the vehicle to prevent the vehicle from moving in response to a first data content of the image data indicating that the portable camera unit (e.g., worn by the operator, or otherwise carried by the operator) is located outside of an operator's cab of the vehicle and to allow the vehicle to move in response to a second data content of the image data indicating that the portable camera unit is located inside of the operator's cab.
The data content of the image data may indicate that the camera unit is outside the operator's cab based on a change in one or more parameters of the image data. One of these parameters may include brightness or light intensity in the image data. For example, during daytime, an increase in brightness or light intensity in the image data may indicate that the operator and the camera unit move from inside the cab to outside the cab. A decrease in brightness or light intensity in the image data may indicate that the operator and the camera unit are moving from outside the cab to inside the cab. Another parameter of the image data may include the presence or absence of one or more objects in the image data. For example, the control system may use one or more image and/or video processing algorithms, such as edge detection, pixel metrics, comparison to a reference image, object detection, gradient determination, or the like, to identify the presence or absence of one or more objects in the image data. If the object is inside the cab or vehicle, the inability of the control system to detect the object in the image data may indicate that the operator is no longer in the cab or vehicle. However, if an object is detected in the image data, the control system may determine that the operator is in the cab or the vehicle.
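One simple way to implement the brightness parameter described above is a mean-intensity comparison between successive frames, as in the hedged sketch below; the change threshold is an illustrative assumption and not a value from this disclosure.

```python
# Sketch of the brightness test described above: a large rise in mean frame
# intensity during daytime suggests the camera left the cab, a large drop
# suggests it re-entered. The threshold is an assumption for illustration.
import numpy as np

BRIGHTNESS_DELTA = 40.0  # assumed change (on a 0-255 scale) treated as significant

def mean_brightness(frame_bgr: np.ndarray) -> float:
    """Mean intensity of a frame; frame_bgr is an HxWx3 uint8 array."""
    return float(frame_bgr.mean())

def infer_transition(prev_frame, curr_frame):
    delta = mean_brightness(curr_frame) - mean_brightness(prev_frame)
    if delta > BRIGHTNESS_DELTA:
        return "likely moved outside the cab"
    if delta < -BRIGHTNESS_DELTA:
        return "likely moved inside the cab"
    return "no transition inferred"

# Synthetic example: a dim cab frame followed by a bright exterior frame.
dim = np.full((480, 640, 3), 60, dtype=np.uint8)
bright = np.full((480, 640, 3), 160, dtype=np.uint8)
print(infer_transition(dim, bright))  # -> likely moved outside the cab
```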
FIG. 6 illustrates one embodiment of a vehicle. The vehicle may include one or more vehicle consists 148 having a plurality of interconnected vehicle units 150, wherein at least one of the plurality of vehicle units is a propulsion-generating, non-aerial vehicle unit 152. The vehicle may represent a rail vehicle system, such as a train, where the vehicle units 150, 152 represent locomotives, rail cars, or other types of rail vehicles. Alternatively, the vehicle may represent another type of vehicle, such as an automobile, a boat, a mining vehicle, or another off-highway vehicle (e.g., a vehicle that is not designed and/or not legally allowed to travel on public roads), or the like. A consist may represent a plurality of vehicle units mechanically connected to travel together along a land or water route 602 (e.g., a rail, road, waterway, or the like). Alternatively, the consist and/or vehicle may include a plurality of vehicle units that communicate with each other to travel together along the route 602 but are not connected to each other. For example, one vehicle unit may send command signals to the other vehicle units to instruct them how to move along the route 602 so as to maintain separation distances between the vehicle units.
The control system on the vehicle may be configured to prevent movement of the vehicle consist in response to the first data content of the image data indicating that the portable camera unit is positioned between adjacent vehicle units of the vehicle consist and to allow movement of the vehicle consist in response to the second data content of the image data indicating that the portable camera unit is not positioned between adjacent vehicle units of the vehicle consist. Adjacent may refer to connected vehicle units next to each other or to vehicle units that are not connected to each other but next to each other. Such embodiments may be used, for example, to prevent consist movement for safety purposes when an operator (wearing or otherwise carrying the portable camera unit) is positioned between adjacent vehicle units (e.g., for detaching or attaching the units to one another).
As described above, the control system may examine parameters of the image data to determine the position of the operator. For example, a decrease in brightness may indicate that the operator and the camera unit are between the vehicle units, while a relatively small increase or decrease in brightness (e.g., no greater than a specified non-zero threshold) may indicate that the operator and the camera unit are not between the vehicle units.
FIG. 7 illustrates a control system according to one embodiment. The control system may be located on the vehicle and may also include an image data analysis system 154. The analysis system may automatically process the image data for identifying the first data content and the second data content in the image data. The control system may be configured to automatically prevent and allow movement of the vehicle in response to the first data and the second data (which are recognized by the image data analysis system), respectively. The image data analysis system may include one or more image analysis processors that autonomously examine image data obtained by the camera unit for one or more purposes, as described herein.
FIG. 8 illustrates a transport system receiver located on a vehicle according to one embodiment. The transport system receiver may be configured to wirelessly communicate network data onto and/or off of the vehicle, and/or automatically switch to a mode for receiving image data from the portable camera unit in response to the portable camera unit being active to communicate image data. For example, in response to the portable camera unit being active to transmit image data, the transport system receiver may be configured to automatically switch from the network wireless client mode of operation 156 (transmitting data originating from equipment on the vehicle, such as a control unit) to a mode for receiving image data from the portable camera unit. The mode for receiving image data from the portable camera unit may include a wireless access point mode of operation 158 (receiving data from the portable camera unit).
In another embodiment, the camera system further comprises a transport system receiver located on the vehicle. The transport system receiver may be configured to wirelessly communicate network data onto and/or off the vehicle and/or to automatically switch from a network wireless client mode of operation to a wireless access point mode of operation for receiving image data from the portable camera unit. The network data may include data other than image data. For example, the network data may include information about upcoming trips of the vehicle (e.g., schedules, route grades, curvature of the route, speed limits, areas under maintenance or repair, etc.), cargo carried by the vehicle, or other information. Alternatively, the network data may include image data. Alternatively, the network data may include graphical data. The receiver may switch its mode of operation and receive image data in response to at least one specified state of the portable camera unit. For example, the specified state may be that the portable camera unit is operating to transfer image data, or that the portable camera unit is in a specified location. As another example, the specified state may be that the camera unit is moving or has recently moved. In response to the receiver and/or the camera unit determining that the camera unit is not moving and/or is not moving into or out of the vehicle, the camera unit may stop generating image data, the camera unit may stop communicating image data to the receiver, and/or the receiver may stop receiving image data from the camera unit. In response to the receiver and/or the camera unit determining that the camera unit is moving and/or moving into or out of the vehicle, the camera unit may begin generating image data, the camera unit may begin communicating image data to the receiver, and/or the receiver may begin receiving image data from the camera unit.
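A possible shape for the receiver's mode switching is sketched below as a small state machine; the mode names and trigger conditions are assumptions about one implementation, not a definitive design from this disclosure.

```python
# Sketch of the receiver mode switch described above: the transport system
# receiver runs as a network wireless client until a portable camera unit
# becomes active and is moving, then switches to an access-point mode to
# receive image data. The state names and trigger are assumptions.
from enum import Enum

class ReceiverMode(Enum):
    NETWORK_CLIENT = "network wireless client"
    ACCESS_POINT = "wireless access point"

class TransportSystemReceiver:
    def __init__(self):
        self.mode = ReceiverMode.NETWORK_CLIENT

    def update(self, camera_unit_active: bool, camera_unit_moving: bool):
        if camera_unit_active and camera_unit_moving:
            self.mode = ReceiverMode.ACCESS_POINT    # receive image data
        else:
            self.mode = ReceiverMode.NETWORK_CLIENT  # resume network data

receiver = TransportSystemReceiver()
receiver.update(camera_unit_active=True, camera_unit_moving=True)
print(receiver.mode)  # ReceiverMode.ACCESS_POINT
```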
In another embodiment of one or more of the camera systems described herein, the system is configured for image data to be stored and/or used locally (e.g., in the vehicle) or transmitted to a remote location (e.g., an off-board location) based on where the vehicle is located. For example, if the vehicle is in a facility (e.g., a yard, maintenance facility, or the like), the image data may be transmitted to a location in the facility. The image data may be stored on the vehicle and not communicated to any location outside the vehicle until the vehicle enters the facility or a designated location in the facility.
Thus, in an embodiment, the system further comprises a control unit responsive to at least one of a position of the portable camera unit or a control input to control at least one of the portable camera unit or the transport system receiver to a first mode of operation in which video data is at least one of stored or displayed on the rail vehicle, and to a second mode of operation in which the video data is communicated off the rail vehicle to be at least one of stored or displayed outside the rail vehicle. For example, the control unit may be configured to automatically control the at least one of the portable camera unit or the transport system receiver from the first mode of operation to the second mode of operation in response to the position of the portable camera unit indicating that the rail vehicle is in a yard.
During operation of the vehicle and/or camera unit outside of a designated area (e.g., a fence or other location extending around the vehicle lot), image data generated by the camera may be stored locally in a data storage device of the camera unit, shown on a display of the vehicle, or the like. In response to the vehicle and/or the camera unit entering a designated area, the camera unit may switch modes to begin wirelessly communicating image data to a receiver, which may be located in the designated area. Changing the place where the image data is communicated based on the location of the vehicle and/or camera unit may allow the image data to be accessible to those viewing the image data for security, analysis, or the like. For example, the image data may be presented to an on-board operator during movement of the vehicle outside of the vehicle yard, and/or the image data may be analyzed by an on-board analysis system of the vehicle to ensure safe operation of the vehicle. In response to the vehicle and/or camera unit entering the vehicle yard, the image data may be communicated to a central office or management facility for remote monitoring of the vehicle and/or performing operations in the vicinity of the vehicle.
As one example, event data transmission (e.g., transmission, propagation, or other communication of image data) may be configured to occur based on various vehicle states, geographic locations, and/or conditions. The image data may be pulled (e.g., requested) or pushed (e.g., transmitted and/or propagated) from the vehicle. For example, the image data may be transmitted from the vehicle to an off-board location based on a selected operating state (e.g., emergency brake application), a geographic location (e.g., near an intersection between two or more routes), a selected and/or inferred operational area of interest (e.g., high wheel slip or vehicle speed beyond an area limit), and/or a time-driven message (e.g., transmitted once a day). The off-board location may also request and retrieve image data from a particular vehicle as needed.
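The location-based and event-based routing rules described above could be combined as in the following sketch; the yard coordinates, radius, and event names are illustrative assumptions rather than values from this disclosure.

```python
# Sketch combining the two routing rules above: image data stays on board until
# the vehicle enters a designated area (approximated here as a radius around a
# yard), and selected events can also force transmission. Coordinates, radius,
# and event names are illustrative assumptions.
import math

YARD_CENTER = (41.8800, -87.6300)   # assumed yard location
YARD_RADIUS_M = 800.0               # assumed extent of the designated area
PUSH_EVENTS = {"emergency_brake", "high_wheel_slip", "daily_report"}

def within_yard(lat, lon):
    dlat = (lat - YARD_CENTER[0]) * 111_320.0
    dlon = (lon - YARD_CENTER[1]) * 111_320.0 * math.cos(math.radians(YARD_CENTER[0]))
    return math.hypot(dlat, dlon) <= YARD_RADIUS_M

def image_data_destination(lat, lon, event=None):
    if event in PUSH_EVENTS or within_yard(lat, lon):
        return "transmit off-board"
    return "store and display on-board"

print(image_data_destination(41.8803, -87.6302))                     # in yard -> off-board
print(image_data_destination(42.1000, -87.9000))                     # en route -> on-board
print(image_data_destination(42.1000, -87.9000, "emergency_brake"))  # event -> off-board
```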
Fig. 9 illustrates another embodiment of a camera system 900. The system includes a portable support 159 having at least one post 160 and a head 162 attached to the at least one post. The head is detachably coupled to the portable camera unit, and the at least one post autonomously (e.g., without human intervention) supports the portable camera unit at a wayside location outside the vehicle. The support may be used to place the camera unit in a position to view at least one of the vehicle and/or the wayside location. The communication device may wirelessly communicate the image data to a transport system receiver located on the vehicle. The image data may be transmitted from outside the vehicle to onboard the vehicle for at least one of storing and/or displaying the image data onboard the vehicle. In one example, the portable support may be a camera tripod. The portable support may be used by an operator to position the portable camera unit outside the vehicle for communicating image data back to the vehicle for viewing in the operator's cab of the vehicle or in another location. The image data may be communicated to the vehicle to allow an operator and/or another passenger of the vehicle to inspect the exterior of the vehicle, inspect wayside equipment and/or locations, inspect the route on which the vehicle is traveling, or the like. In one example, image data may be communicated from an off-board location onto the vehicle to allow an operator and/or passenger to view the image data for entertainment purposes, such as watching a movie, video, or the like.
Fig. 10 illustrates another embodiment of a camera system 1000. The system includes a telescoping mast 164 configured for attachment to the vehicle. The telescoping mast has one or more segments 166 deployable from a first position 168 relative to the vehicle to a second position 170 relative to the vehicle that is higher than the first position relative to the ground. The mast includes a coupler 172 attached to at least one of the mast sections. The coupler allows detachable coupling of the portable camera unit with at least one of the mast sections. When the portable camera unit is coupled to the retractable mast by the coupler and the retractable mast is deployed to the second position, the portable camera unit is positioned above the vehicle for inspection of the roof of the vehicle, other vehicle units in the consist, the surroundings of the vehicle, or the like.
Fig. 11, 12, and 13 illustrate another embodiment of a camera system 1100. Fig. 11 illustrates a perspective view of the camera system, fig. 12 illustrates a side view of the camera system, and fig. 13 illustrates a top view of the camera system 1100. The system includes an aerial device 174 configured for at least one of remote control or autonomous flight over a ground route of the non-aerial vehicle. The aerial device may have one or more camera mounts 176 for housing one or more portable camera units, and may also have a vehicle mount for coupling the aerial device to a vehicle. In the illustrated example, the aerial device includes three cameras, with one camera unit facing in a forward travel direction 1200 of the aerial device, another camera unit facing in a downward direction 1202 towards the ground or route (on which the aerial device is flying), and another camera unit facing in a rearward direction 1204 of the aerial device. Alternatively, a different number of camera units may be used and/or the camera units may be oriented in other directions.
When the aerial device is airborne, the portable camera unit may be positioned to view the route, the vehicle, or another area near the vehicle. The aerial device may be, for example, a scale aircraft, a scale helicopter, or the like (e.g., the aerial device may be smaller than would be needed to transport a person, such as 1/10 scale or smaller). Suitable scale helicopters include quadcopters and the like.
The system may also include an aerial device vehicle dock 178 to attach the aerial device to the vehicle. The aerial device vehicle dock can receive the aerial device for at least one of detachably coupling the aerial device with the vehicle, charging a battery of the aerial device from a power source of the vehicle, or the like. For example, the dock may include one or more connectors 180 that mechanically or magnetically couple with the aerial device to prevent movement of the aerial device relative to the dock, and that conductively couple an onboard power source (e.g., a battery) of the aerial device with a power source (e.g., a generator, an alternator, a battery, a pantograph, or the like) of the vehicle such that the power source of the aerial device may be charged by the power source of the vehicle during movement of the vehicle.
The aerial device may fly off of the vehicle to obtain image data that is communicated from one or more of the cameras on the aerial device to one or more receivers 114 on the vehicle. The aerial device may fly relative to the vehicle while the vehicle is stationary and/or while the vehicle is moving along the route. The image data may be displayed to the operator on a display device on the vehicle and/or may be examined autonomously as described herein. The image data may be reviewed by an operator and/or an image analysis system of the vehicle, for example to inspect the vehicle, to monitor other vehicles traveling relative to the vehicle (e.g., to avoid a collision between the vehicles), to inspect the route being traveled (e.g., to perform a route inspection), to warn of upcoming obstacles or other problems ahead of the vehicle along the route, and the like. When the aerial device is coupled within the vehicle dock, one or more cameras may be positioned to view the route during movement of the vehicle.
FIG. 14 is a schematic illustration of an image analysis system 154 according to an embodiment. As described herein, the image analysis system may be used to examine the data content of image data to automatically identify objects, damage in routes, or the like in the image data. The controller 1400 of the system includes or represents hardware circuitry that includes and/or is coupled to one or more computer processors (e.g., one or more computer microprocessors). The controller may save image data obtained by the camera unit to one or more memory devices 1402 of the imaging system, and may generate an alarm signal or the like in response to identifying one or more problems with the route and/or wayside equipment based on the obtained image data. The memory device 1402 includes one or more computer-readable media for at least temporarily storing the image data. Suitable memory devices may include computer hard drives, flash or solid state drives, optical disks, or the like.
During travel of the vehicle along the route, the camera unit may generate image data representing images and/or video of the camera's field of view. The image data may represent actions occurring within the vehicle (e.g., an operator changing operational settings of the vehicle). For example, one use for the image data may be accident investigation, where the onboard operator's actions are examined to determine whether the operator was controlling the vehicle at the time of the accident, whether the operator was awake and aware in the events leading up to the accident, whether appropriate actions were taken in the events leading up to the accident (e.g., activating a horn or other alarm, engaging brakes, etc.), or the like.
Additionally or alternatively, the image data may be used to check the health of the route, the status of roadside equipment along the route traveled by the vehicle, or the like. The field of view of the camera may include at least some of the route and/or wayside equipment disposed ahead of the vehicle in the direction of travel of the vehicle. During movement of the vehicle along the route, the camera unit may obtain data representative of the route and/or wayside equipment for inspection to determine whether the route and/or wayside equipment is functioning properly or is damaged and/or requires further inspection.
Because the image data represents what the system sees in the field of view of the camera unit, the image data created by the camera unit may be referred to as machine vision. One or more analysis processors 1404 of the system can examine the image data to identify the status of the vehicle, the route, and/or the wayside equipment. Optionally, the analysis processor may examine the terrain at, near, or around the route and/or the wayside equipment to determine whether the terrain has changed such that maintenance of the route, the wayside equipment, and/or the terrain is needed. For example, the analysis processor may examine the image data to determine whether vegetation (e.g., trees, shrubs, and the like) is growing over the route or over wayside equipment (e.g., signals) such that travel over the route may be impeded and/or the wayside equipment may be obscured from the view of the operator of the vehicle. As another example, the analysis processor may examine the image data to determine whether the terrain has eroded away from, onto, or toward the route and/or the wayside equipment such that the eroded terrain interferes with travel over the route, interferes with operation of the wayside equipment, or poses a risk of interfering with operation of the route and/or the wayside equipment. Thus, the terrain "near" the route and/or wayside equipment may include terrain within the field of view of the camera unit when the route and/or wayside equipment is within the field of view of the camera unit, terrain that encroaches onto or is disposed beneath the route and/or wayside equipment, and/or terrain that is within a specified distance (e.g., two meters, five meters, ten meters, or another distance) from the route and/or wayside equipment. The analysis processor may represent hardware circuitry that includes and/or is coupled to one or more processors (e.g., one or more computer microprocessors, controllers, or the like).
Acquiring image data from the camera units may allow the analysis processor 1404 to access sufficient information to examine individual video frames, individual still images, several video frames, or the like, and determine the status of the route, the status of the wayside equipment, and/or the status of the terrain at or near the route and/or wayside equipment. The state of the route may represent the health of the route, such as damage to one or more rails of the track, the presence of foreign objects on the route, overgrowth of vegetation onto the route, and the like. As used herein, the term "damage" may include physical damage to the route (e.g., a break in the route, a depression in the route, or the like), movement of the route from a previous or designated location, growth of vegetation toward and/or over the route, degradation of the support material (e.g., ballast material) beneath the route, or the like. For example, the analysis processor may examine the image data to determine whether one or more rails are bent, twisted, broken, or otherwise damaged. Alternatively, the analysis processor may measure the distance between the rails to determine whether the spacing between the rails differs from a specified distance (e.g., the gauge or another measure of the route). The analysis of the image data by the analysis processor may be performed using one or more image and/or video processing algorithms, such as edge detection, pixel metrics, comparison to a reference image, object detection, gradient determination, or the like.
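As one hedged example of the kind of image processing named above, the sketch below uses edge detection and line detection (via OpenCV) to estimate the pixel spacing between rail edges and compare it against an expected range. Real gauge measurement would additionally require camera calibration to convert pixels into a physical distance, and all thresholds here are illustrative assumptions.

```python
# Simplified sketch of a gauge check: detect strong near-vertical edges in a
# forward-facing frame and compare the pixel spacing between the outermost
# rail-edge candidates against an expected range. Thresholds are assumptions.
import cv2
import numpy as np

EXPECTED_SPACING_PX = (380, 420)  # assumed pixel range for nominal gauge

def rail_spacing_px(frame_bgr: np.ndarray):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 80, 160)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                            minLineLength=100, maxLineGap=20)
    if lines is None:
        return None
    # Use the mean x position of each near-vertical line as a rail-edge candidate.
    xs = [(x1 + x2) / 2.0 for x1, y1, x2, y2 in lines[:, 0]
          if abs(x2 - x1) < abs(y2 - y1)]
    if len(xs) < 2:
        return None
    return max(xs) - min(xs)

def gauge_ok(frame_bgr: np.ndarray) -> bool:
    spacing = rail_spacing_px(frame_bgr)
    return spacing is not None and EXPECTED_SPACING_PX[0] <= spacing <= EXPECTED_SPACING_PX[1]
```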
The communication system 1406 of the system represents hardware circuitry or circuitry that includes and/or is connected to one or more processors (e.g., microprocessors, controllers, or the like) and communication devices (e.g., wireless antennas 1408 and/or wired connections 1410) that operate as transmitters and/or transceivers for communicating signals with one or more locations. For example, the communication system may communicate signals wirelessly via an antenna and/or to a facility and/or another vehicle system via a wired connection (e.g., a cable, bus, or wire, such as a composite cable, train pipe), or the like.
The image analysis system may optionally examine the image data obtained by the camera unit to identify features of interest and/or designated objects in the image data. By way of example, the features of interest may include a gauge distance between two or more portions of the route. With respect to rail vehicles, the features of interest identified from the image data may include the gauge distance between the rails of the route. The designated objects may include wayside assets, such as safety devices, signs, signals, switches, inspection devices, or the like. The image data may be automatically examined by the image analysis system to determine changes in the features of interest, missing designated objects, damaged or failed designated objects, and/or the locations of designated objects. This automatic examination may be performed without operator intervention. Alternatively, the automatic examination may be performed with the aid of the operator and/or at the request of the operator.
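As a hedged illustration of how a designated object could be located in a frame, the sketch below uses normalized template matching; the template image, threshold, and helper name are assumptions introduced here and are not defined by the embodiments above.

```python
# Hedged sketch: one simple way to flag a known wayside asset (e.g. a sign or
# signal housing) in a frame is normalized template matching. The template
# image, threshold and helper name are illustrative assumptions.
import cv2
import numpy as np

def find_designated_object(frame_gray: np.ndarray,
                           template_gray: np.ndarray,
                           score_threshold: float = 0.8):
    """Return (x, y, score) of the best template match, or None when the
    designated object appears to be missing or obscured."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < score_threshold:
        return None
    return (max_loc[0], max_loc[1], max_val)
```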
The image analysis system may use the analysis of the image data to detect route damage. For example, misalignment of the rails on which the vehicle is traveling may be identified. Based on the detected misalignment, an operator of the vehicle may be alerted so that the operator can implement one or more responsive actions, such as slowing and/or stopping the vehicle. Upon identifying a damaged section of the route, one or more other responsive actions may be initiated. For example, alert information may be communicated (e.g., transmitted or broadcast) to one or more other vehicles to warn those vehicles of the damage, alert signals may be communicated to one or more wayside devices disposed at or near the route so that the wayside devices can communicate the alert signals to the one or more other vehicles, alert signals may be communicated to an off-board facility that can arrange repair and/or further inspection of the damaged section of the route, or the like.
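The fan-out of responsive actions listed above could be coordinated, for example, by a small dispatcher such as the Python sketch below; the operator display and communication interfaces named in it are hypothetical stand-ins, not an API defined by the embodiments.

```python
# Hypothetical coordinator illustrating the fan-out of responsive actions
# listed above. The operator display and communication interfaces (alert,
# broadcast_to_vehicles, notify_wayside, send_to_facility) are assumed
# stand-ins, not an API defined by the patent.
from dataclasses import dataclass

@dataclass
class DamageReport:
    route_id: str
    location_km: float
    description: str     # e.g. "rail misalignment"

class ResponseCoordinator:
    def __init__(self, operator_display, comm_system):
        self.operator_display = operator_display   # onboard HMI (assumed)
        self.comm = comm_system                    # wraps wireless/wired links

    def handle(self, report: DamageReport) -> None:
        # 1. Warn the onboard operator so the vehicle can be slowed or stopped.
        self.operator_display.alert(f"Route damage at km {report.location_km}")
        # 2. Propagate the warning to other vehicles.
        self.comm.broadcast_to_vehicles(report)
        # 3. Notify wayside devices near the damaged section so they can relay it.
        self.comm.notify_wayside(report.route_id, report.location_km, report)
        # 4. Request inspection and/or repair from an off-board facility.
        self.comm.send_to_facility(report)
```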
In another embodiment, the image analysis system may examine the image data to identify text, signs, or the like along the route. For example, information printed or displayed on signs, display devices, vehicles, or the like that indicates speed limits, locations, warnings, upcoming obstacles, vehicle identities, or the like may be autonomously read by the image analysis system. The image analysis system may identify this information by detecting the sign and reading the information shown on it. In one aspect, the image analysis processor may detect the information (e.g., text, images, or the like) based on the intensities of pixels in the image data, based on wireframe model data generated from the image data, or the like. The image analysis processor may identify the information and store it in the memory device. The image analysis processor may examine the information, for example, by using optical character recognition to identify letters, numbers, symbols, or the like included in the image data. This information may be used to autonomously and/or remotely control the vehicle, for example by communicating an alert signal to a control unit of the vehicle, which may slow the vehicle in response to reading a sign indicating a speed limit that is lower than the current actual speed of the vehicle. As another example, the information may be used to identify the vehicle and/or the cargo carried by the vehicle by reading information printed or displayed on the vehicle.
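As a minimal, hedged sketch of the sign-reading step, the Python example below assumes the open-source Tesseract engine (via pytesseract) is available; the embodiments above refer to optical character recognition only generically, and the speed-limit parsing shown is an illustrative assumption.

```python
# Minimal sketch of the sign-reading step, assuming the open-source Tesseract
# engine (via pytesseract) is available; the embodiments above refer to optical
# character recognition only generically. The speed-limit parsing is an
# illustrative assumption.
import re
import cv2
import pytesseract

def read_speed_limit(sign_region_bgr) -> int | None:
    """OCR a cropped sign image and return a speed-limit value if one is found."""
    gray = cv2.cvtColor(sign_region_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    text = pytesseract.image_to_string(binary)
    match = re.search(r"\b(\d{2,3})\b", text)      # e.g. "SPEED 40" -> 40
    return int(match.group(1)) if match else None
```

A control unit could then compare the returned value with the current vehicle speed and issue a slow-down command when the limit is lower, consistent with the behavior described above.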
In another example, the image analysis system may examine the image data to ensure that safety devices along the route are functioning as intended or designed. For example, the image analysis processor may analyze image data showing a crossing device. The image analysis processor may examine this data to determine whether the crossing device is operating to warn other vehicles at the crossing (e.g., an intersection between the route and another route, such as a road) that the vehicle is passing through the crossing.
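One way such an activation check could be sketched is to track the brightness of the warning lamp region across successive frames and require it to alternate; extracting the per-frame brightness values is assumed to happen upstream, and the threshold and minimum transition count below are illustrative, not prescribed by the embodiments.

```python
# Hypothetical sketch: verify that a crossing warning lamp is actually flashing
# by tracking the mean brightness of the lamp region across successive frames
# and requiring it to alternate. Per-frame brightness extraction is assumed to
# happen upstream; the threshold and minimum transition count are illustrative.
def lamp_is_flashing(lamp_brightness_per_frame: list[float],
                     on_threshold: float = 150.0,
                     min_transitions: int = 4) -> bool:
    """Return True if the lamp alternates between on and off often enough."""
    states = [brightness > on_threshold for brightness in lamp_brightness_per_frame]
    transitions = sum(a != b for a, b in zip(states, states[1:]))
    return transitions >= min_transitions
```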
In another example, the image analysis system may examine the image data to predict when repair or maintenance of one or more objects shown in the image data will be needed. For example, a history of the image data may be examined to determine whether an object exhibits a pattern of degradation over time. Based on this pattern, a service team (e.g., a group of one or more personnel and/or equipment) can identify which portions of the object are prone to being, or already are, in an undesirable condition, and can then proactively perform repair and/or maintenance on those portions of the object. Image data of the same object acquired by multiple different camera units at different times may be examined to determine changes in the state of the object. Image data of the same object obtained at different times may also be examined to filter external factors or conditions, such as the effect of precipitation (e.g., rain, snow, ice, or the like) on the appearance of the object, out of the examination of the object. This may be performed, for example, by converting the image data into wireframe model data.
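As an illustrative sketch of trend-based maintenance prediction, the example below fits a line to a per-inspection "condition score" derived from the image data and estimates when it will cross a maintenance threshold; the scoring scale (lower means worse) and the threshold are assumptions, not values from the embodiments.

```python
# Illustrative sketch of trend-based maintenance prediction: fit a line to a
# per-inspection "condition score" derived from the image data and estimate
# when it will cross a maintenance threshold. The scoring scale (lower = worse)
# and threshold are assumptions.
import numpy as np

def days_until_maintenance(inspection_days: list[float],
                           condition_scores: list[float],
                           threshold: float) -> float | None:
    """Estimate days from the last inspection until the fitted condition score
    reaches the threshold; return None if no degrading trend is present."""
    slope, intercept = np.polyfit(inspection_days, condition_scores, 1)
    if slope >= 0:                       # condition stable or improving
        return None
    crossing_day = (threshold - intercept) / slope
    return max(0.0, crossing_day - inspection_days[-1])
```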
FIG. 15 illustrates a flow diagram of one embodiment of a method 1500 for obtaining and/or analyzing image data for transport data communication. The method may be practiced by one or more embodiments of the systems described herein. At 1502, image data is obtained using one or more portable camera units. As described above, the portable camera unit may be coupled to clothing worn by an operator on board and/or off board the vehicle, may be coupled to wayside equipment that is separate from and disposed off board the vehicle but that can obtain image data of the vehicle and/or an area surrounding the vehicle, may be coupled to the vehicle, may be coupled with an aerial device that flies around and/or ahead of the vehicle, or the like. In one aspect, the camera unit may be in an operational state or mode in which image data is not generated by the camera unit during periods when the camera unit is inside (or outside) a designated area (e.g., the vehicle). In response to the camera unit moving outside (or inside) the designated area, the camera unit may change to another operational state or mode and begin generating image data.
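A minimal sketch of this state change is shown below, assuming a simple rectangular geofence; the GeoFence class and the camera interface (start_capture/stop_capture) are hypothetical illustrations rather than elements defined by the method.

```python
# Minimal sketch of the state change described at 1502: the camera generates
# image data only while on the recording side of a designated area. The
# rectangular geofence and the camera interface (start_capture/stop_capture)
# are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class GeoFence:
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return (self.lat_min <= lat <= self.lat_max
                and self.lon_min <= lon <= self.lon_max)

def update_camera_mode(camera, fence: GeoFence, lat: float, lon: float,
                       record_inside: bool = False) -> None:
    """Enable capture when the camera is on the recording side of the fence,
    disable it otherwise."""
    if fence.contains(lat, lon) == record_inside:
        camera.start_capture()
    else:
        camera.stop_capture()
```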
At 1504, the image data is communicated to a transport system receiver. For example, the image data may be wirelessly communicated from the portable camera unit to the transport system receiver. Alternatively, the image data may be communicated using one or more wired connections. The image data may be communicated as it is obtained, or may be communicated in response to the vehicle and/or camera unit entering or exiting a designated area (e.g., a geofence).
At 1506, the image data is examined for one or more purposes, such as to control or limit control of the vehicle, to control operation of the camera unit, to identify damage to the vehicle or to a route ahead of the vehicle, and/or to identify obstacles in the path of the vehicle. For example, if the camera unit is worn on the clothing of an operator who is outside the vehicle, the image data may be analyzed to determine whether the operator is between two or more vehicle units of the vehicle and/or otherwise in a location that would be unsafe if the vehicle were to move (e.g., the operator is behind and/or in front of the vehicle). With respect to vehicle consists, the image data may be examined to determine whether an operator is between two or more vehicle units of the consist or otherwise in a location where the operator risks injury or death if the vehicle consist moves. Alternatively, the image data may be examined to determine whether the off-board operator is in a blind spot of the vehicle driver, e.g., behind the vehicle. The image analysis system described above may examine the image data and, if it determines that an off-board operator is between vehicle units, behind the vehicle, and/or otherwise in a location that is unsafe if the vehicle moves, the image analysis system may generate an alert signal that is communicated to the control unit of the vehicle. The alert signal may be received by the control unit, and in response to receiving the alert signal, the control unit may prevent the vehicle from moving. For example, the control unit may override movement commands from the onboard operator, and the control unit may engage the brakes and/or disengage the propulsion system of the vehicle (e.g., turn off or otherwise deactivate an engine, motor, or other propulsion-generating component of the vehicle). In one aspect, the image analysis system may examine the image data to determine whether the route is damaged (e.g., the rails on which the vehicle is traveling are broken, bent, or otherwise damaged), whether an obstacle is on the route in front of the vehicle (e.g., another vehicle or object on the route), or the like.
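The movement interlock described at 1506 could take a form like the Python sketch below; the location categories and the control-unit interface (engage_brakes, disable_propulsion, allow_movement) are illustrative stand-ins, since the embodiments only require that an alert causes the control unit to prevent movement.

```python
# Hypothetical sketch of the movement interlock described at 1506. The
# location categories and the control-unit interface (engage_brakes,
# disable_propulsion, allow_movement) are illustrative stand-ins.
from enum import Enum, auto

class OperatorLocation(Enum):
    IN_CAB = auto()
    BETWEEN_VEHICLE_UNITS = auto()
    BEHIND_OR_AHEAD_OF_VEHICLE = auto()
    CLEAR = auto()

def apply_movement_interlock(control_unit, location: OperatorLocation) -> None:
    """Prevent movement when the image analysis places an operator in an
    unsafe location; otherwise allow movement."""
    unsafe = location in (OperatorLocation.BETWEEN_VEHICLE_UNITS,
                          OperatorLocation.BEHIND_OR_AHEAD_OF_VEHICLE)
    if unsafe:
        control_unit.engage_brakes()
        control_unit.disable_propulsion()   # overrides any movement command
    else:
        control_unit.allow_movement()
```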
In one embodiment, a system (e.g., a camera system) includes a camera, at least one of a data storage device or a communication device, a camera support object, a positioning apparatus, and a control unit. The camera may be configured to capture at least image data. The data storage device may be electrically coupled to the camera and configured to store the image data. The communication device may be electrically coupled to the camera and configured to communicate the image data to a system receiver. The camera support object may be coupled to the camera. The positioning apparatus may be configured to detect a position of the camera support object. The control unit may be configured to communicate with the system receiver and the positioning apparatus and to control the camera based at least in part on the position of the camera support object.
In one aspect, the camera support object may be coupled to a garment configured to be worn by a worker, and the control unit may be configured to control the camera to a first mode of operation in response to the position of the worker indicating that the worker is at the operator terminal and to control the camera to a different second mode of operation in response to the position of the worker indicating that the worker is not at the operator terminal.
In one aspect, in a first mode of operation, the camera may be disabled from performing at least one of capturing, storing, or communicating image data, and in a second mode of operation, the camera may be enabled to perform at least one of capturing, storing, or communicating image data.
In one aspect, the operator terminal may be located in an operator cab of the vehicle.
In one aspect, the system further includes a vehicle control unit configured to control the vehicle based at least in part on the image data, to prevent the vehicle from moving in response to a first data content of the image data indicating that a worker wearing the garment is located outside an operator cab of the vehicle, and to allow the vehicle to move in response to a second data content of the image data indicating that the worker wearing the garment is located inside the operator cab.
In one aspect, the vehicle may be one of a plurality of vehicles logically or mechanically coupled to form a consist having a plurality of interconnected vehicle units, wherein at least one of the plurality of vehicle units is a powered vehicle unit. The vehicle control unit may be configured to prevent movement of the vehicle consist in response to the first data content of the image data indicating that the camera is positioned between adjacent vehicle units of the vehicle consist and to allow movement of the vehicle consist in response to the second data content of the image data indicating that the camera is not positioned between adjacent vehicle units of the vehicle consist.
In one aspect, the vehicle control unit may include an image data analysis system configured to process the image data and thereby identify the first data content and the second data content. The vehicle control unit may be configured to prevent and allow movement of the vehicle in response to the first data and the second data (which are identified by the image data analysis system), respectively.
In one aspect, the system may further include a transport system receiver disposed on the vehicle, wherein the transport system receiver is configured to communicate network data other than the image data to at least one of on-board or off-board the vehicle and switch to a mode for receiving the image data from the camera in response to the camera being active to communicate the image data.
In one aspect, the transport system receiver may be configured to wirelessly communicate the network data outside of the vehicle.
In one aspect, the transport system receiver may be configured to communicate one or both of the network data and the image data onto the vehicle over an ethernet network configured for data communication between the vehicle and one or more other vehicles.
In one aspect, the camera support object may include a retractable mast.
In one aspect, the camera support object may include an aerial device configured for at least one of remote control or autonomous flight with respect to a ground vehicle route of the vehicle.
In one aspect, the aerial device can include a vehicle dock for coupling the aerial device to the vehicle. When the aerial device is in the vehicle dock, the camera may be positioned to view the vehicle route.
In one aspect, the aerial device can include a vehicle dock for coupling the aerial device to the vehicle. The vehicle dock may be configured to charge a battery of the aerial device from a power source of the vehicle when the aerial device is docked in the vehicle dock.
In one aspect, the camera supporting object may include a first ground vehicle configured for at least one of remote control or autonomous movement relative to a second ground vehicle along a route of the second vehicle. The first ground vehicle may be intended to travel ahead of the second ground vehicle along the route and transmit the image data back to the second ground vehicle.
In another embodiment, a method (e.g., for obtaining and/or communicating image data) includes obtaining image data from a camera configured to capture the image data (where the camera may be supported by a camera support object), determining, with a positioning apparatus, a position of the camera support object, and controlling the camera based at least in part on the position of the camera support object detected by the positioning apparatus.
In one aspect, the camera supporting object may include a garment configured to be worn by a worker. The method may further include switching the camera to a first operating mode in response to the location of the worker indicating that the worker is at the operator terminal of the vehicle and switching the camera to a second, different operating mode in response to the location of the worker indicating that the worker is not at the operator terminal of the vehicle.
In one aspect, the method may further include disabling the camera from at least one of capturing, storing, or communicating image data in response to determining that the camera is in the first mode of operation, and enabling the camera for at least one of capturing, storing, or communicating image data in response to determining that the camera is in the second mode of operation.
In one aspect, the camera supporting object may include a garment configured to be worn by a worker. The method may further include preventing movement of the vehicle in response to the first data content of the image data indicating that the worker is located outside of an operator cab of the vehicle and allowing movement of the vehicle in response to the second data content of the image data indicating that the worker is located inside of the operator cab.
In one aspect, the method may further comprise controlling the camera support object to travel relative to a ground vehicle and observing, via the camera, one or more of the vehicle, a wayside asset, or a route traveled by the ground vehicle.
In one aspect, the method may further comprise examining the image data to identify damage to, or the status of, one or more of the vehicle, the wayside asset, or the route traveled by the ground vehicle, and/or to predict impending failure of, or damage to, one or more of the vehicle, the wayside asset, or the route traveled by the ground vehicle.
In one embodiment, a system (e.g., a camera system) includes a portable camera unit and a garment. The portable camera unit includes: a camera configured to capture at least image data; and at least one of a data storage device electrically connected to the camera and configured to store the image data or a communication device electrically connected to the camera and configured to wirelessly communicate the image data to a transport system receiver located outside of the portable camera unit. The garment is configured to be worn by a transport worker. The portable camera unit is attached to the garment.
In one aspect, the garment includes one or more of a hat or eyewear. In one aspect, the system may further comprise: a positioning apparatus configured to detect a position of a transportation worker wearing the garment; and a control unit configured to control the portable camera unit based at least in part on the position of the transportation worker detected by the positioning apparatus. In one aspect, the control unit is configured to control the portable camera unit to a first mode of operation in response to the position of the transportation worker detected by the positioning apparatus indicating that the transportation worker is at the operator terminal of the vehicle and to control the portable camera unit to a different, second mode of operation in response to the position of the transportation worker detected by the positioning apparatus indicating that the transportation worker is not at the operator terminal of the vehicle.
In one aspect, in the first mode of operation, the portable camera unit may be disabled from at least one of capturing, storing, or communicating image data, and in the second mode of operation, the portable camera unit may be enabled for one or more of capturing, storing, or communicating image data. In one aspect, the control unit may be configured to control the portable camera unit to the first mode of operation in response to the position of the transportation worker detected by the positioning device indicating that the transportation worker is in the operator cab of the vehicle and to control the portable camera unit to the different, second mode of operation in response to the position of the transportation worker detected by the positioning device indicating that the transportation worker is not in the operator cab of the vehicle.
In one aspect, the system may further include a vehicle control unit configured to control the vehicle based at least in part on the image data. The vehicle control unit may be configured to prevent movement of the vehicle in response to the first data content of the image data indicating that the portable camera unit is located outside an operator's cab of the vehicle and to allow movement of the vehicle in response to the second data content of the image data indicating that the portable camera unit is located inside the operator's cab.
In one aspect, a vehicle may include a vehicle consist having a plurality of interconnected vehicle units, wherein at least one of the plurality of vehicle units is a powered vehicle unit. The vehicle control unit may be configured to prevent movement of the vehicle consist in response to the first data content of the image data indicating that the portable camera unit is positioned between adjacent vehicle units of the vehicle consist and to allow movement of the vehicle consist in response to the second data content of the image data indicating that the portable camera unit is not positioned between adjacent vehicle units of the vehicle consist.
In one aspect, the vehicle control unit may include an image data analysis system configured to automatically process the image data to identify the first data content and the second data content. The vehicle control unit may be configured to automatically prevent and allow movement of the vehicle in response to the first data content and the second data content (which are identified by the image data analysis system), respectively. In one aspect, the system further includes a transport system receiver configured to be located on the vehicle, wherein the transport system receiver is configured to wirelessly communicate network data other than the image data to at least one of on board or off board the vehicle and to automatically switch to a mode for receiving the image data from the portable camera unit in response to the portable camera unit being active to communicate the image data. In one aspect, the system further comprises a retractable mast configured for attachment to a vehicle. The retractable mast may include one or more mast sections deployable from a first position relative to the vehicle to a second position relative to the vehicle, the second position being higher than the first position. The retractable mast may further comprise a coupler attached to one of the one or more mast sections and configured for detachably coupling the portable camera unit to that mast section. When the portable camera unit is coupled to the retractable mast by the coupler and the retractable mast is deployed to the second position, the portable camera unit is positioned above the vehicle.
In another embodiment, another camera system is provided. The system may include a portable camera unit and an aerial device. The portable camera unit may include: a camera configured to capture at least image data; and at least one of a data storage device electrically connected to the camera and configured to store the image data or a communication device electrically connected to the camera and configured to wirelessly communicate the image data to a transport system receiver located outside of the portable camera unit. The aerial device is configured for at least one of remote control or autonomous flight over a route of a ground vehicle. The aerial device includes a camera mount for receiving the portable camera unit. When the portable camera unit is in the camera mount, the camera is positioned to view the vehicle route.
In one aspect, the system includes an aerial device mount attached to the ground vehicle. The aerial device mount may be configured to receive the aerial device for at least one of detachably coupling the aerial device to the ground vehicle or charging a battery of the aerial device from a power source of the vehicle. In one aspect, the aerial device is a scale airship or a scale helicopter. In another embodiment, a method (e.g., for obtaining and/or analyzing image data for transport data communication) is provided. The method includes: obtaining image data from a portable camera unit having a camera configured to capture the image data, the portable camera unit being attached to a garment worn by a transportation worker; communicating the image data to a transport system receiver located outside the portable camera unit; determining, with a positioning device, a location of the transportation worker wearing the garment; and autonomously controlling the portable camera unit based at least in part on the location of the transportation worker detected by the positioning device.
In one aspect, the method further includes switching the portable camera unit to a first mode of operation in response to the location of the transportation worker detected by the positioning device indicating that the transportation worker is at the operator terminal of the vehicle and switching the portable camera unit to a second, different mode of operation in response to the location of the transportation worker detected by the positioning device indicating that the transportation worker is not at the operator terminal of the vehicle. In one aspect, the method further includes disabling the portable camera unit from at least one of capturing, storing, or communicating image data in response to determining that the portable camera unit is in the first mode of operation, and enabling the portable camera unit for the at least one of capturing, storing, or communicating image data in response to determining that the portable camera unit is in the second mode of operation.
In one aspect, the method further includes switching the portable camera unit to a first mode of operation in response to the location of the transportation worker detected by the positioning device indicating that the transportation worker is in the operator cab of the vehicle and switching the portable camera unit to a second mode of operation in response to the location of the transportation worker detected by the positioning device indicating that the transportation worker is not in the operator cab of the vehicle. In one aspect, the method further includes preventing movement of the vehicle in response to the first data content of the image data indicating that the portable camera unit is located outside of an operator's cab of the vehicle and allowing movement of the vehicle in response to the second data content of the image data indicating that the portable camera unit is located inside of the operator's cab.
In one aspect, the method further includes preventing movement of a vehicle consist having a plurality of interconnected vehicle units, wherein at least one of the vehicle units includes a powered vehicle unit. The vehicle consist may be prevented from moving in response to the first data content of the image data indicating that the portable camera unit is positioned between adjacent vehicle units of the vehicle consist. The method may also allow the vehicle consist to move in response to the second data content of the image data indicating that the portable camera unit is not positioned between adjacent vehicle units of the vehicle consist. In one aspect, the method may further comprise autonomously examining the image data to identify or predict damage to one or more of the vehicle, the wayside asset, or the route traveled by the vehicle.
The foregoing description of certain embodiments of the inventive subject matter will be better understood when read in conjunction with the appended drawings. To the extent that the figures illustrate diagrams of the functional blocks of various embodiments, the functional blocks are not necessarily indicative of the division between hardware circuitry. Thus, for example, one or more of the functional blocks (e.g., processors or memories) may be implemented in a single piece of hardware (e.g., a general purpose signal processor, microcontroller, random access memory, hard disk, and the like). Similarly, the programs may be stand alone programs, may be incorporated as subroutines in an operating system, may be functions in an installed software package, and the like. The various embodiments are not limited to the arrangements and instrumentality shown in the drawings.
The above description is illustrative and not restrictive. For example, the above-described embodiments (and/or aspects thereof) may be used in combination with each other. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the inventive subject matter without departing from its scope. While the dimensions and types of materials described herein are intended to define the parameters of the inventive subject matter, they are by no means limiting and are exemplary embodiments. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. The scope of the inventive subject matter should, therefore, be determined with reference to the appended claims, along with the full scope of equivalents to which such claims are entitled.
In the appended claims, the terms "including" and "in which" are used as the plain-English equivalents of the respective terms "comprising" and "wherein". Furthermore, in the following claims, terms such as "first," "second," "third," and the like are used merely as labels, and are not intended to impose numerical or positional requirements on their objects. Furthermore, the limitations of the following claims are not written in means-plus-function format and are not intended to be interpreted based on 35 U.S.C. § 112, sixth paragraph, unless and until such claim limitations expressly use the phrase "means for" followed by a statement of function devoid of further structure. Also, as used herein, an element or step recited in the singular and preceded by the word "a" or "an" should be understood as not excluding plural said elements or steps, unless such exclusion is explicitly stated. Furthermore, references to "one embodiment" of the inventive subject matter are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising" or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property.
This written description uses examples to disclose several embodiments of the inventive subject matter, and also to enable any person skilled in the art to practice embodiments of the inventive subject matter, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the inventive subject matter may include other examples that occur to those skilled in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal languages of the claims.

Claims (21)

1. A video system for data communication, comprising:
a camera configured to capture at least image data;
a data storage device electrically coupled to the camera and configured to store the image data;
a communication device electrically coupled to the camera and configured to communicate the image data to a system receiver;
a camera support object coupled to the camera;
a positioning apparatus configured to detect a position of the camera support object; and
a control unit configured to communicate with the system receiver and the positioning apparatus and to control an operating mode within the camera based at least in part on the detected position of the camera support object;
wherein the camera support object is a retractable mast.
2. The system of claim 1, wherein,
the retractable mast is attached to the vehicle, and
the retractable mast comprises a plurality of mast sections deployable from a first position relative to the vehicle to a second position relative to the vehicle, the second position being higher than the first position relative to a ground surface on which the vehicle is located.
3. The system of claim 2, wherein,
the retractable mast further comprises a coupler attached to at least one of the mast sections, the coupler configured for detachably coupling the camera with at least one of the mast sections, and
when the retractable mast is deployed to the second position, the camera is positioned above the vehicle for inspection of the vehicle and/or surroundings of the vehicle.
4. The system of claim 1, wherein,
the system includes the communication device electrically coupled to the camera and configured to communicate the image data to a system receiver, the system receiver including a transport system receiver disposed on a vehicle,
the transport system receiver is configured to communicate network data other than the image data to at least one of on-board the vehicle or off-board the vehicle, and to switch to a mode for receiving the image data from the camera in response to the camera being active to communicate the image data.
5. The system of claim 4, wherein the transport system receiver is configured to wirelessly communicate the network data outside of the vehicle.
6. The system of claim 4, wherein the transport system receiver is configured to communicate one or both of the network data and the image data onto the vehicle over an Ethernet network configured for data communication between the vehicle and one or more other vehicles.
7. The system of claim 1, wherein the operational modes within the camera include:
a first mode of operation in which the image data is stored in the camera on the data storage device, and
A second mode of operation in which the image data is communicated from the camera to a remote storage device.
8. A video system for data communication, comprising:
a camera configured to capture at least image data;
a data storage device electrically coupled to the camera and configured to store the image data;
a communication device electrically coupled to the camera and configured to communicate the image data to a system receiver;
a camera support object coupled to the camera;
a positioning apparatus configured to detect a position of the camera support object; and
a control unit configured to communicate with the system receiver and the positioning apparatus and to control an operating mode within the camera based at least in part on the detected position of the camera support object;
wherein the camera support object is an aerial device configured for at least one of remote control or autonomous flight with respect to a ground vehicle route of the vehicle.
9. The system of claim 8, wherein,
the vehicle includes a vehicle dock for coupling the aerial device to the vehicle, and
The camera is positioned to view the ground vehicle route when the aerial device is in the vehicle dock.
10. The system of claim 8, wherein,
the vehicle includes a vehicle dock for coupling the aerial device to the vehicle, and
The vehicle dock is configured to charge a battery of the aerial device from a power source of the vehicle when the aerial device is docked in the vehicle dock.
11. The system of claim 8, wherein,
the vehicle comprises a vehicle dock for coupling the aerial device to the vehicle,
the camera is positioned to view the ground vehicle route when the aerial device is in the vehicle dock, and
the vehicle dock is configured to charge a battery of the aerial device from a power source of the vehicle when the aerial device is docked in the vehicle dock.
12. The system of claim 11, wherein,
the system includes the communication device electrically coupled to the camera and configured to communicate the image data to a system receiver, the system receiver including a transport system receiver disposed on the vehicle,
the transport system receiver is configured to communicate network data other than the image data to at least one of on-board the vehicle or off-board the vehicle, and to switch to a mode for receiving the image data from the camera in response to the camera being active to communicate the image data.
13. The system of claim 12, wherein the transport system receiver is configured to wirelessly communicate the network data outside of the vehicle.
14. The system of claim 12, wherein the transport system receiver is configured to communicate one or both of the network data and the image data onto the vehicle over an ethernet network configured for data communication between the vehicle and one or more other vehicles.
15. The system of claim 8, wherein,
the system includes the communication device electrically coupled to the camera and configured to communicate the image data to a system receiver, the system receiver including a transport system receiver disposed on the vehicle,
the transport system receiver is configured to communicate network data other than the image data to at least one of on-board the vehicle or off-board the vehicle, and to switch to a mode for receiving the image data from the camera in response to the camera being active to communicate the image data.
16. The system of claim 15, wherein the transport system receiver is configured to wirelessly communicate the network data outside of the vehicle.
17. The system of claim 15, wherein the transport system receiver is configured to communicate one or both of the network data and the image data onto the vehicle over an ethernet network configured for data communication between the vehicle and one or more other vehicles.
18. A video system for data communication, comprising:
a camera configured to capture at least image data;
a data storage device electrically coupled to the camera and configured to store the image data;
a communication device electrically coupled to the camera and configured to communicate the image data to a system receiver;
a camera support object coupled to the camera;
a positioning apparatus configured to detect a position of the camera support object; and
a control unit configured to communicate with the system receiver and the positioning apparatus and to control an operating mode within the camera based at least in part on the detected position of the camera support object;
wherein the camera support object is a first ground vehicle configured for at least one of remote control or autonomous movement relative to a second ground vehicle along a route of the second ground vehicle,
wherein the first ground vehicle is configured to travel ahead of the second ground vehicle along the route and to transmit the image data back to the second ground vehicle.
19. The system of claim 18, wherein,
the system includes the communication device electrically coupled to the camera and configured to communicate the image data to the system receiver, the system receiver including a transport system receiver disposed on the first ground vehicle,
the transport system receiver is configured to communicate network data other than the image data to at least one of on-board the first ground vehicle or off-board the first ground vehicle, and to switch to a mode for receiving the image data from the camera in response to the camera being active to communicate the image data.
20. The system of claim 19, wherein the transport system receiver is configured to wirelessly communicate the network data outside of the first ground vehicle.
21. The system of claim 19, wherein the transport system receiver is configured to communicate one or both of the network data and the image data onto the first ground vehicle over an ethernet network or a wireless network configured for data communication between the first ground vehicle and the second ground vehicle.
CN201910851198.7A 2014-02-17 2015-01-30 Video system and method for data communication Active CN110545380B (en)

Applications Claiming Priority (21)

Application Number Priority Date Filing Date Title
US201461940696P 2014-02-17 2014-02-17
US201461940660P 2014-02-17 2014-02-17
US201461940610P 2014-02-17 2014-02-17
US201461940813P 2014-02-17 2014-02-17
US61/940813 2014-02-17
US61/940660 2014-02-17
US61/940610 2014-02-17
US61/940696 2014-02-17
US14/217,672 US11124207B2 (en) 2014-03-18 2014-03-18 Optical route examination system and method
US14/217672 2014-03-18
US14/253294 2014-04-15
US14/253,294 US9875414B2 (en) 2014-04-15 2014-04-15 Route damage prediction system and method
US14/457,353 US20150235094A1 (en) 2014-02-17 2014-08-12 Vehicle imaging system and method
US14/457353 2014-08-12
US14/479847 2014-09-08
US14/479,847 US20150269722A1 (en) 2014-03-18 2014-09-08 Optical route examination system and method
US14/485,398 US10049298B2 (en) 2014-02-17 2014-09-12 Vehicle image data management system and method
US14/485398 2014-09-12
US14/541370 2014-11-14
US14/541,370 US10110795B2 (en) 2002-06-04 2014-11-14 Video system and method for data communication
CN201580020130.4A CN106537900B (en) 2014-02-17 2015-01-30 Video system and method for data communication

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201580020130.4A Division CN106537900B (en) 2014-02-17 2015-01-30 Video system and method for data communication

Publications (2)

Publication Number Publication Date
CN110545380A CN110545380A (en) 2019-12-06
CN110545380B true CN110545380B (en) 2021-08-06

Family

ID=53800530

Family Applications (3)

Application Number Title Priority Date Filing Date
CN201580020130.4A Active CN106537900B (en) 2014-02-17 2015-01-30 Video system and method for data communication
CN201910851198.7A Active CN110545380B (en) 2014-02-17 2015-01-30 Video system and method for data communication
CN201580020285.8A Active CN106458238B (en) 2014-02-17 2015-02-17 Aerial camera system and method for identifying route-related hazards

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201580020130.4A Active CN106537900B (en) 2014-02-17 2015-01-30 Video system and method for data communication

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201580020285.8A Active CN106458238B (en) 2014-02-17 2015-02-17 Aerial camera system and method for identifying route-related hazards

Country Status (3)

Country Link
CN (3) CN106537900B (en)
AU (4) AU2015217536B2 (en)
WO (2) WO2015123035A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578057A (en) * 2016-01-29 2016-05-11 深圳市高巨创新科技开发有限公司 Aerial video processing method and aerial video processing system
JP6633458B2 (en) * 2016-06-02 2020-01-22 株式会社日立製作所 Vehicle control system
CN107089248B (en) * 2017-04-30 2018-10-12 中南大学 UAV Intelligent anti-collision early warning control method and system when a kind of train lost contact
US11014667B2 (en) * 2017-10-18 2021-05-25 Bombardier Transportation Gmbh Rail vehicle and on-board safety drone
JP7052652B2 (en) * 2018-09-06 2022-04-12 トヨタ自動車株式会社 Mobile robots, remote terminals, mobile robot control programs, and remote terminal control programs
JP6627117B1 (en) * 2018-10-29 2020-01-08 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Image processing device, imaging device, moving object, image processing method, and program
WO2020149275A1 (en) * 2019-01-16 2020-07-23 株式会社ナイルワークス Drone system, drone, moving body, demarcation member, drone system control method, and drone system control program
CN110027593A (en) * 2019-04-12 2019-07-19 成都宇俊盛科技有限公司 A kind of full-automatic unmanned driving's rail vehicle safe-guard system
CN110084171B (en) * 2019-04-23 2023-03-31 东北电力大学 Detection device and detection method for foreign matters on top of subway train
CN112061179A (en) * 2020-09-17 2020-12-11 中车株洲电力机车有限公司 Active anti-collision device and method for rail vehicle based on unmanned aerial vehicle
DE102020215245A1 (en) * 2020-12-02 2022-06-02 Bombardier Transportation Gmbh Method for operating a rail vehicle and arrangement with a rail vehicle

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030222981A1 (en) * 2002-06-04 2003-12-04 Kisak Jeffrey James Locomotive wireless video recorder and recording system
JP4235051B2 (en) * 2003-08-29 2009-03-04 トヨタ自動車株式会社 Parking assistance device
DE602004004246T2 (en) * 2004-04-01 2007-11-15 Heuristics Gmbh Method and system for detecting defects and dangerous properties of passing railway vehicles
US7493202B2 (en) * 2004-11-12 2009-02-17 Takata Corporation Vehicle safety control system by image processing
WO2006074298A2 (en) * 2005-01-06 2006-07-13 Alan Shulman Navigation and inspection system
US7483485B2 (en) * 2005-01-24 2009-01-27 Moderator Systems, Inc. Wireless event authentication system
US7510142B2 (en) * 2006-02-24 2009-03-31 Stealth Robotics Aerial robot
US7581702B2 (en) * 2006-06-09 2009-09-01 Insitu, Inc. Wirelessly controlling unmanned aircraft and accessing associated surveillance data
EP2238758A4 (en) * 2008-01-24 2013-12-18 Micropower Technologies Inc Video delivery systems using wireless cameras
CN101439727A (en) * 2008-10-17 2009-05-27 陈立 Train operation ahead dynamic monitoring method and monitoring system
CN201499256U (en) * 2009-09-09 2010-06-02 广东泽安电子有限公司 Ground air moving audio and video frequency real time monitoring device and ground air moving body
CN101898567B (en) * 2010-04-07 2012-03-14 西南交通大学 Railway foreign body limit-intruding monitoring system based on intelligent video
CN102092408A (en) * 2010-12-15 2011-06-15 河北汉光重工有限责任公司 Railway locomotive auxiliary driving device
KR20130005107A (en) * 2011-07-05 2013-01-15 현대자동차주식회사 System for controlling vehicle interval automatically and method thereof
CN102436738B (en) * 2011-09-26 2014-03-05 同济大学 Traffic monitoring device based on unmanned aerial vehicle (UAV)
US20150009331A1 (en) * 2012-02-17 2015-01-08 Balaji Venkatraman Real time railway disaster vulnerability assessment and rescue guidance system using multi-layered video computational analytics
JPWO2013136399A1 (en) * 2012-03-12 2015-07-30 パナソニックIpマネジメント株式会社 Information providing system, information providing apparatus, photographing apparatus, and computer program
US8649917B1 (en) * 2012-09-26 2014-02-11 Michael Franklin Abernathy Apparatus for measurement of vertical obstructions
CN103500322B (en) * 2013-09-10 2016-08-17 北京航空航天大学 Automatic lane line identification method based on low latitude Aerial Images

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20130119633A (en) * 2012-04-24 2013-11-01 유콘시스템 주식회사 Unmanned aerial vehicle system with cable connection equipment
CN102654940A (en) * 2012-05-23 2012-09-05 上海交通大学 Traffic information acquisition system based on unmanned aerial vehicle and processing method of traffic information acquisition system
KR20140017735A (en) * 2012-07-31 2014-02-12 인텔렉추얼디스커버리 주식회사 Wearable electronic device and method for controlling the same

Also Published As

Publication number Publication date
WO2015123035A1 (en) 2015-08-20
AU2021203703A1 (en) 2021-07-01
AU2015217536B2 (en) 2019-05-30
CN106537900A (en) 2017-03-22
CN106458238A (en) 2017-02-22
WO2015123669A1 (en) 2015-08-20
CN110545380A (en) 2019-12-06
CN106458238B (en) 2019-01-22
AU2021203703B2 (en) 2023-03-30
CN106537900A8 (en) 2017-06-30
AU2019205977A1 (en) 2019-08-01
AU2015218266A1 (en) 2016-10-06
CN106458238A8 (en) 2017-06-30
AU2015218266B2 (en) 2019-10-10
AU2015217536A1 (en) 2016-09-22
CN106537900B (en) 2019-10-01
AU2019205977B2 (en) 2021-04-01

Similar Documents

Publication Publication Date Title
US11039055B2 (en) Video system and method for data communication
AU2021203703B2 (en) Video system and method for data communication
US9919723B2 (en) Aerial camera system and method for determining size parameters of vehicle systems
US9873442B2 (en) Aerial camera system and method for identifying route-related hazards
US20170255824A1 (en) Aerial camera system and method for identifying route-related hazards
US10381731B2 (en) Aerial camera system, method for identifying route-related hazards, and microstrip antenna
US20170313332A1 (en) Autonomous vehicle system and method
JP5819555B1 (en) Vehicle driving support system
CN205601869U (en) On -vehicle operating environment safety monitoring system
JP6441902B2 (en) Taxiable aircraft neighborhood visualization system and method
KR101810964B1 (en) Drone for safe and secure law enforcement on the street and Method for controlling the same
CN104332053A (en) Road traffic inspection system and method based on small unmanned aerial vehicle
US20180339719A1 (en) Locomotive decision support architecture and control system interface aggregating multiple disparate datasets
CN205601867U (en) Train contact net detection device
CN112319552A (en) Rail car operation detection early warning system
US11767016B2 (en) Optical route examination system and method
KR101658752B1 (en) Flight safety apparatus for guiding car and docking system thereof
CN107399341A (en) On-board running Environmental safety supervision system and method
CN106101148A (en) A kind of bus safety monitoring system based on video acquisition
CA3126118A1 (en) Vehicle monitoring system
CN111564062A (en) Driving guide system and method
TWI804113B (en) Intelligent railway monitoring system and method thereof
US20230415912A1 (en) Aerial-based event notification
CN116279673A (en) Train running line and driving state on-line monitoring system and device
CN114187761A (en) Road maintenance operation prompting device, system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: Connecticut, USA

Patentee after: IP transmission holding Co.

Address before: Connecticut, USA

Patentee before: General Electric Global Procurement Co.,Ltd.
