US20040143374A1 - Remote control system for locomotive - Google Patents
Remote control system for locomotive
- Publication number
- US20040143374A1 US20040143374A1 US10/754,525 US75452504A US2004143374A1 US 20040143374 A1 US20040143374 A1 US 20040143374A1 US 75452504 A US75452504 A US 75452504A US 2004143374 A1 US2004143374 A1 US 2004143374A1
- Authority
- US
- United States
- Prior art keywords
- locomotive
- control system
- remote control
- controller
- command
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61L—GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
- B61L3/00—Devices along the route for controlling devices on the vehicle or train, e.g. to release brake or to operate a warning signal
- B61L3/02—Devices along the route for controlling devices on the vehicle or train, e.g. to release brake or to operate a warning signal at selected places along the route, e.g. intermittent control simultaneous mechanical and electrical control
- B61L3/08—Devices along the route for controlling devices on the vehicle or train, e.g. to release brake or to operate a warning signal at selected places along the route, e.g. intermittent control simultaneous mechanical and electrical control controlling electrically
- B61L3/12—Devices along the route for controlling devices on the vehicle or train, e.g. to release brake or to operate a warning signal at selected places along the route, e.g. intermittent control simultaneous mechanical and electrical control controlling electrically using magnetic or electrostatic induction; using radio waves
- B61L3/127—Devices along the route for controlling devices on the vehicle or train, e.g. to release brake or to operate a warning signal at selected places along the route, e.g. intermittent control simultaneous mechanical and electrical control controlling electrically using magnetic or electrostatic induction; using radio waves for remote control of locomotives
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61C—LOCOMOTIVES; MOTOR RAILCARS
- B61C17/00—Arrangement or disposition of parts; Details or accessories not otherwise provided for; Use of control gear and control systems
- B61C17/12—Control gear; Arrangements for controlling locomotives from remote points in the train or when operating in multiple units
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61L—GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
- B61L17/00—Switching systems for classification yards
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B61—RAILWAYS
- B61L—GUIDING RAILWAY TRAFFIC; ENSURING THE SAFETY OF RAILWAY TRAFFIC
- B61L2205/00—Communication or navigation systems for railway traffic
- B61L2205/04—Satellite based navigation systems, e.g. global positioning system [GPS]
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/26—Speech to text systems
Definitions
- the invention relates to locomotive remote control technology.
- the invention provides a lead controller for use with a remote control system for a locomotive.
- the lead controller comprises an input for receiving a signal containing speech information.
- a processing unit receives the speech information and performs speech recognition to generate speech recognition results.
- the processing unit uses the speech recognition results to produce a command for execution by the locomotive.
- the lead controller also has a communication link interface and is operative for transmitting the command to be executed by the locomotive over a wireless communication link.
- the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller designed to be mounted in the locomotive.
- the lead controller can wirelessly transmit information to the follower controller.
- the lead controller includes an input for receiving a signal derived from a spoken utterance and containing speech information.
- the remote control system has a processing unit for performing speech recognition on the speech information contained in the signal and uses the speech recognition results to produce a command to be executed by the locomotive.
- the invention provides a lead controller for use with a remote control system for a locomotive.
- the lead controller comprises an input interface for receiving a distance command.
- the lead controller also has a communication link interface for transmitting the distance command to be executed by the locomotive over a wireless communication link.
- the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive.
- the lead controller has an input interface for receiving from a human operator a distance command.
- the lead controller also has a communication link interface for transmitting the distance command over a wireless communication link.
- the follower controller is responsive to the distance command sent over the wireless communication link to cause the locomotive to execute the distance command.
- the invention provides a lead controller for use with a remote control system for a locomotive.
- the lead controller comprises an input interface for receiving a target location command.
- the lead controller also has a communication link interface for transmitting the target location command to be executed by the locomotive over a wireless communication link.
- the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive.
- the lead controller has an input interface for receiving from a human operator a target location command.
- the lead controller also has a communication link interface for transmitting the target location command over a wireless communication link.
- the follower controller is responsive to the target location command sent over the wireless communication link to cause the locomotive to execute the target location command.
- the invention provides a lead controller for use with a remote control system for a locomotive pulling a train.
- the lead controller comprises an input interface for:
- the lead controller has a communication link interface for transmitting the parameter of the train and the command to be executed by the train over a wireless communication link.
- the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive.
- the lead controller includes an input interface for receiving:
- the remote control system has a communication interface for transmitting the command and the parameter of the train over a wireless communication link.
- the follower controller is responsive to the command and to the parameter of the train sent over the wireless communication link to cause the train to execute the command by implementing actions conditioned at least in part on the parameter of the train.
- the invention provides a lead controller for use with a remote control system for a locomotive riding on a track.
- the lead controller comprises an input interface for:
- the lead controller has a communication link interface for transmitting the parameter of the track and the command for execution by the locomotive over a wireless communication link.
- the invention provides a remote control system for a locomotive traveling on a track, that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive.
- the lead controller includes an input interface for receiving:
- the follower controller is responsive to the command and to the parameter of the track transmitted over the wireless communication link to cause the locomotive to execute the command by implementing actions conditioned at least in part on the basis of the parameter of the track.
- FIG. 1 is a block diagram of a remote control system for a locomotive;
- FIG. 2 is a block diagram of the lead controller of the remote control system for a locomotive depicted in FIG. 1;
- FIG. 3 is a block diagram of a communication link interface of the lead controller shown in FIG. 2;
- FIG. 4 is a flow-chart illustrating the operation of the lead controller shown in FIG. 2;
- FIG. 5 is a block diagram of a computing platform that can be used to implement some of the components of the remote control system for a locomotive shown in FIG. 1;
- FIG. 6 is a block diagram of a remote control system for a locomotive according to a variant.
- FIG. 1 illustrates a remote control system for a locomotive.
- the remote control system includes two main components, namely a lead controller 12 and a follower controller 14 that are linked to one another by a wireless communication link 16 .
- an operator dials in commands at the lead controller 12 and those commands are relayed to the follower controller mounted on-board the locomotive.
- the follower controller will process the commands and issue local signals that are applied to the locomotive such as to implement the commands specified by the operator.
- the lead controller 12 includes an input interface (not shown) for receiving commands from the operator as to desired actions to be executed by the locomotive or certain parameters about the train pulled by the locomotive or about the track on which the locomotive or train is moving.
- the input interface refers broadly to the agency on the lead controller 12 on which commands and parameters can be input without any limitation to the specific input devices used.
- the input devices may comprise keys, switches, knobs or levers that must be displaced, depressed, or otherwise mechanically operated by the operator to dial in commands or parameters.
- the input interface may include a pointing device, a touch sensitive screen or a voice input.
- the input interface has a voice input allowing the operator to enter commands or parameters via voice.
- the voice input includes a microphone.
- lead controller 12 includes a processing unit 15 and a communication link interface 32 .
- the processing unit 15 has an input that receives a signal conveying speech information.
- when the voice input device on the input interface is a microphone, the input 17 could be the output from the microphone.
- the signal at the input 17 is of analog nature. That signal is applied to an analog-to-digital converter 18 that digitizes the signal according to a Pulse Code Modulation (PCM) technique or according to any other suitable method.
- the stream of PCM samples released from the analog-to-digital converter 18 is applied to a parameterization unit 20 whose task is to extract, from the audio signal containing the speech information, significant features on which further speech processing can be performed.
- Examples of speech feature elements include feature vectors, spectral parameters, audio signal segments, band energies and cepstral parameters, among others.
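As an illustration, one of the simplest feature types listed above, per-frame log energies, can be sketched as follows. This is a minimal sketch, not the patent's parameterization unit; the function name, frame sizes, and log compression are illustrative assumptions.

```python
import math

def frame_energies(pcm_samples, frame_size=160, hop=80):
    """Split a PCM sample stream into (possibly overlapping) frames and
    compute the log-energy of each frame -- one of the simple feature
    types (alongside cepstral parameters and band energies) that a
    parameterization unit can extract for downstream speech processing."""
    features = []
    for start in range(0, len(pcm_samples) - frame_size + 1, hop):
        frame = pcm_samples[start:start + frame_size]
        energy = sum(s * s for s in frame) / frame_size
        # Log compression mirrors perceived loudness; the small offset
        # avoids taking log(0) on silent frames.
        features.append(math.log(energy + 1e-10))
    return features
```

A real system would typically extract richer features (e.g. cepstral parameters) per frame, but the framing structure is the same.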
- the feature elements extracted by the parameterization unit 20 are passed to a speech recognition engine 22 .
- Any suitable commercially available speech recognition engine can be used without departing from the spirit of the invention.
- the speech recognition engine 22 works in conjunction with a speech recognition dictionary 24 that contains a list of vocabulary items that the speech recognition engine 22 can recognize.
- the speech recognition engine 22 receives the feature elements generated by the parameterization unit 20 , it generates at output 23 data that represents the vocabulary item best matching the spoken utterance characterized by the feature elements.
- the vocabulary items held in the speech recognition dictionary 24 reflect the commands that the lead controller 12 should be able to recognize.
- the speech recognition engine 22 is speaker dependent. In other words, the speech recognition engine 22 should be trained on speech tokens from a specific speaker such that the speech recognition engine better adapts to the characteristics of that speaker. Alternatively, a speaker-independent speech recognition engine can be used without departing from the spirit of the invention.
- the recognition results are released by the speech recognition engine 22 on the output 23 .
- the recognition results are the vocabulary item found as being the best match to the spoken utterance, expressed in orthographic form. Other types of representations of the recognition results can be used without departing from the spirit of the invention.
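The best-match step can be sketched as a toy template matcher: the dictionary maps each vocabulary item to a stored feature template, and the engine returns the item whose template lies closest to the input features. A commercial engine would use far more sophisticated models (e.g. hidden Markov models); the identifiers and the Euclidean metric here are illustrative assumptions.

```python
def recognize(feature_vector, dictionary):
    """Toy speech recognition engine: return the vocabulary item (in
    orthographic form) whose stored template best matches the input
    feature vector. 'dictionary' maps vocabulary items to templates."""
    def distance(a, b):
        # Euclidean distance between two equal-length feature vectors.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Best match = vocabulary item at minimum distance from the utterance.
    return min(dictionary, key=lambda item: distance(dictionary[item], feature_vector))
```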
- the recognition results are input to a text-to-speech converter 26 that synthesizes an audio signal released on the output 19 to audibly play to the user the recognition result.
- This mechanism is provided as a safety feature to allow the operator to abort a command in cases when the recognition results are incorrect.
- the audio signal released from the text-to-speech converter is in analog form.
- the analog signal is then passed to a suitable amplifier (not shown in the drawings) and a suitable speaker (not shown in the drawings) such as to audibly play the synthetic speech to the operator.
- Any suitable text-to-speech converter could be used without departing from the spirit of the invention.
- since text-to-speech converters are generally known in the art, it is not deemed necessary to describe them here in detail.
- the lead controller 12 can visually communicate those results to the operator.
- the lead controller 12 can be provided with a display on which the recognition results are shown.
- the output 23 is also applied to a command verification unit 28 .
- the command verification unit gates the recognition results. If a confirmation has been received from the operator within a specified time period that the recognition result is accurate, the command verification unit 28 will release the recognition result for further processing. If no positive input has been received from the operator within the specified time period or a negative input has been received from the operator, the command verification unit 28 deletes or otherwise negates the recognition results applied at its input.
- the command verification unit 28 will release the recognition results only if the operator has uttered the word “yes” within a certain time frame after reception of the recognition results, say 5 seconds.
- a timer starts.
- the operator receives from the text-to-speech converter 26 synthesized speech conveying the recognition results. If the operator accepts the results, he or she utters “yes”.
- the new spoken utterance is processed as described previously, and assuming a correct recognition the orthographic representation of the word “yes” appears at the output 23 and it is supplied to the command verification unit 28 .
- the prior recognition results (conveying the original command) are released by the command verification unit 28 . If nothing is received by the command verification unit 28 before the timer stops, then the prior recognition results buffered in the command verification unit 28 are deleted. The same operation is performed if any other word than “yes” is received by the command verification unit 28 .
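The buffer-and-timer behaviour described above can be sketched as follows. This is a minimal sketch, not the patent's implementation: the class name and the injected clock (used instead of a hardware timer so the logic can be exercised deterministically) are assumptions; the 5-second window follows the example in the text.

```python
CONFIRMATION_WINDOW = 5.0  # seconds, per the example above

class CommandVerificationUnit:
    """Buffers a recognized command and releases it only if the operator
    confirms it (by uttering "yes") before the timer expires; any other
    word, or no confirmation in time, deletes the buffered command."""

    def __init__(self, clock):
        self.clock = clock        # callable returning current time in seconds
        self.pending = None
        self.deadline = None

    def receive_recognition(self, command):
        # Buffer the command and start the confirmation timer.
        self.pending = command
        self.deadline = self.clock() + CONFIRMATION_WINDOW

    def receive_confirmation(self, word):
        # Gate the buffered command on a timely "yes"; clear it otherwise.
        if self.pending is None:
            return None
        released = None
        if word == "yes" and self.clock() <= self.deadline:
            released = self.pending
        self.pending = None
        self.deadline = None
        return released
```

In a deployed system the clock would be something like `time.monotonic`, and the deletion on timeout would be driven by the timer rather than by the next utterance.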
- the architecture of the system is such that the operator will also hear the recognition results from the confirmation utterance, namely the word “yes” (assuming correct recognition). In some applications, this might be desirable. If this feature is not desirable, the system shown in FIG. 2 can be modified such that a control signal is issued from the command verification unit 28 while the timer is counting. The control signal is applied to the text-to-speech converter 26 such as to prevent the converter from operating. After the timer stops, the control signal is no longer generated and the text-to-speech converter 26 is re-activated.
- a confirmation other than a vocal confirmation can be used.
- the lead controller 12 can be provided with a button that the operator needs to depress in order to confirm the recognition results.
- This possibility can be implemented by modifying the command verification unit 28 to release the recognition results when a logic signal derived from the manual actuation of the button is received before the timer stops.
- This variant is particularly well suited to applications in which the recognition results are communicated to the operator visually instead of audibly.
- the command verification unit 28 will include a speaker verification module allowing the system to verify that the operator entering the voice command is an authorized user. Prior to using the system, each authorized user will be asked to provide a respective access voiceprint associated with a user identification number.
- a voiceprint is a mathematical representation of the acoustic properties of a spoken utterance.
- the access voiceprint will be used to grant access to the control system by performing a similarity measurement between the access voiceprint and an input utterance provided by the operator.
- a speaker verification operation will be performed for each command received from an operator. In this case, command voiceprints for each allowable command will have to be provided by each authorized user prior to using the control system.
- command voiceprints are stored in records in a computer readable medium and are associated with respective authorized users via their identification numbers. Once an operator has been granted access to the control system by his access voiceprint, the corresponding record containing the command voiceprints is extracted and used for subsequent speaker verification operations. Consequently, each spoken utterance indicative of a command received by the control system is verified against the corresponding command voiceprint in the record associated with the given user. Speaker verification units are well known and will not be further described here. If the operator cannot be verified as an authorized user, the system will issue a message indicating that control access is being denied.
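The similarity measurement between an enrolled voiceprint and an input utterance might look like the following sketch. The patent does not specify a verification algorithm, so the cosine metric, the threshold value, and all identifiers here are assumptions.

```python
def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = identical
    direction, 0.0 = orthogonal)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def verify_speaker(input_features, voiceprints, user_id, threshold=0.9):
    """Grant access only if the input utterance is sufficiently similar
    to the access voiceprint enrolled under the claimed identification
    number. 'voiceprints' maps user IDs to enrolled feature vectors."""
    enrolled = voiceprints.get(user_id)
    if enrolled is None:
        return False  # unknown user ID: deny access
    return cosine_similarity(input_features, enrolled) >= threshold
```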
- once the recognition results are released from the command verification unit 28, they are passed to a command translator.
- the purpose of the command translator is to encode the command in a format suitable for processing by the control unit 30 to be described later.
- the command released by the command validation unit is in orthographic form, which is not the form best suited for analysis such as basic sanity checks and other similar operations to be performed by the control unit 30.
- the command translator will convert the command from its orthographic representation to a format normally obtained from typical manually operated controls. This feature allows using a control unit 30 of known design, since the control unit 30 will receive commands in a format that it can already interpret.
- the command translator 29 can be designed around a database that maps the orthographic representation of a command to its encoded form.
- the size of the database will depend upon the number of possible commands the lead controller 12 is designed to vocally accept.
- the control unit 30 receives the encoded command and processes it.
- One type of processing is to perform a high-level validation or sanity check. For example, when the locomotive is travelling forward and a command is received that specifies a reverse movement, that command is rejected.
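The forward/reverse example can be sketched as a simple state-dependent check. The state representation and command names are hypothetical; a real control unit would validate many more conditions.

```python
def sanity_check(command, locomotive_state):
    """High-level validation: reject a command that is inconsistent with
    the locomotive's current state, e.g. a reverse movement while the
    locomotive is travelling forward. Returns True if the command passes."""
    if (command == "reverse"
            and locomotive_state["direction"] == "forward"
            and locomotive_state["speed"] > 0):
        return False  # reject: cannot reverse while moving forward
    return True
```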
- the control unit 30 is of known construction and it does not need to be described in detail here. For more information, the reader is directed to U.S. Pat. Nos. 5,511,749 and 5,685,507 that provide more information on this particular point. The contents of these patents are incorporated herein by reference.
- FIG. 4 is a flowchart that summarizes the operation of the lead controller 12 .
- the process starts.
- the vocal command uttered by the operator is received at the input 17 .
- the spoken utterance is recognized.
- synthetic speech is created and played to the operator to communicate to him the recognition results.
- the recognition results are validated.
- the validated recognition results are passed to the command translator 29 for encoding and then to the control unit 30 .
- the command is sent to the follower controller 14 over the wireless communication link 16 .
- the processing unit 15 can in large part be implemented in software executed on a suitable computing platform of the type illustrated in FIG. 5.
- Such a computing platform includes a Central Processing Unit (CPU) 60 connected to a memory 62 over a data bus 64.
- An Input/Output interface 66 is connected to the data bus 64 and allows the computing platform to exchange signals/data with the external world.
- the memory 62 is used for holding program instructions of program elements that implement the functionality of components of the processing unit 15 . Also, the memory 62 is used to hold data on which the program elements operate.
- FIG. 6 illustrates a variant of the remote control system for a locomotive, designated 100.
- the remote control system 100 has a lead controller 102 that is generally similar to the lead controller 12 described earlier and a follower controller 104 that communicates with the lead controller 102 via a communication link 106 .
- the follower controller 104 causes the locomotive to execute a number of functions, including traveling on the track a predetermined distance.
- the operator can input on the input interface of the lead controller 102 a distance command specifying the distance to travel.
- Such a distance command can be input as a distance value, such as 100 meters for example, or as a parameter that can be resolved by the remote control system 100 to a distance value.
- the expressions “distance command” or “command directing the locomotive to travel a predetermined distance” should be interpreted to cover commands that specify explicitly a distance to be travelled or commands that can be resolved into a distance to be travelled.
- An example of such a command is a command directing the locomotive to move a predetermined number of car lengths. Since the length of a car is known, the remote control system 100 can compute the total distance the locomotive is to travel by multiplying the average length of a car by the number of cars specified by the operator.
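Resolving both forms of distance command to a distance value can be sketched as follows. The average car length, the command phrasing, and the function name are assumptions for illustration; in practice the car length would be a site-specific configuration value.

```python
AVERAGE_CAR_LENGTH_M = 15.0  # assumed average car length, in meters

def resolve_distance_command(command):
    """Resolve a distance command to meters: either an explicit distance
    ("100 meters") or a parameter resolvable to a distance, such as a
    number of car lengths ("5 car lengths")."""
    parts = command.split()
    value = float(parts[0])
    if parts[1].startswith("car"):
        # Total distance = number of cars * average car length.
        return value * AVERAGE_CAR_LENGTH_M
    return value
```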
- the follower controller 104 uses a distance input in order to determine the actual distance travelled by the locomotive such that when the distance to travel has been reached, the movement can be stopped.
- there are a wide variety of ways to obtain the distance input without departing from the spirit of the invention. For example:
- the distance input can be obtained internally by processing the output of the velocity sensor mounted on the locomotive that is used to measure the speed of the locomotive.
- the distance travelled can be determined by integrating the velocity over the time the locomotive is moving.
- methods (a) and (b) may introduce errors when the velocity sensor is mounted on a traction wheel of the locomotive and that wheel is subjected to slip.
- the distance input can be obtained externally, as shown by the arrow 108 , via transponder detection.
- the locomotive is provided with an antenna that senses transponders placed at predetermined locations along the track. Based on the identity of a given transponder and knowing the location of each transponder, the remote control system can derive the distance traveled by the locomotive between two transponders.
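The velocity-integration approach can be sketched with a trapezoidal sum over sampled velocities. The sampling interval and function name are assumptions; as the text notes, wheel slip on a traction wheel would corrupt the velocity samples and hence the estimate.

```python
def distance_travelled(velocity_samples, dt):
    """Estimate distance travelled by integrating sampled velocity over
    time with the trapezoidal rule. 'velocity_samples' are speeds in m/s
    taken every 'dt' seconds."""
    total = 0.0
    for v0, v1 in zip(velocity_samples, velocity_samples[1:]):
        # Area of one trapezoid: average velocity over the interval * dt.
        total += 0.5 * (v0 + v1) * dt
    return total
```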
- the distance input can also be obtained from a Global Positioning System (GPS) receiver that tracks the position of the locomotive.
- in use, when the remote control system receives a distance command, the locomotive, either alone or when pulling a train, will start moving while monitoring the distance travelled. When the locomotive approaches the end of the track span that corresponds to the distance to be travelled, it slows down by reduction of power and/or application of brakes such as to stop without exceeding the distance specified by the operator.
- the remote control system 100 is designed to receive a target location command, which is a command identifying a location on the track where the locomotive is to stop.
- the remote control system 100 tracks the current location of the locomotive and compares it to the target location.
- the remote control system 100 causes the locomotive to stop by power reduction and/or application of brakes.
- the command supplied by the operator provides the information about the target location.
- a target location command can specify the identity of a transponder and the locomotive will stop when this transponder is reached.
- the target location command can specify a location by expressing coordinates explicitly or implicitly.
- the operator may specify the name of a location that can be resolved to a set of coordinates by the remote control system.
- the operator may be provided on the input interface with a touch sensitive screen showing a map of the yard where the locomotive rides. The operator may touch or otherwise specify a location on the displayed map. Underlying logic in the lead controller 102 can then issue, based on this input, the coordinates of the location specified by the operator, expressed in any suitable format.
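Tracking toward a target location reduces to comparing the locomotive's current coordinates to the target and beginning the stop once the remaining distance falls within an estimated stopping distance. The planar coordinates and the stopping-distance parameter in this sketch are illustrative assumptions.

```python
def should_stop(current, target, stopping_distance):
    """Compare the tracked location to the target location and decide
    whether to begin stopping (power reduction and/or brake application).
    'current' and 'target' are (x, y) coordinates in meters."""
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    remaining = (dx * dx + dy * dy) ** 0.5
    # Begin stopping once the remaining distance is within the margin
    # needed to halt without overshooting the target.
    return remaining <= stopping_distance
```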
- the remote control system can receive, in addition to commands directing the locomotive to perform a certain function, parameters that can change the way the command is being implemented.
- the operator may specify parameters about the train being pulled by the locomotive or parameters about the track on which the locomotive is riding.
- An example of a parameter of the train is the approximate weight of the train or the number of cars in the train.
- This information can be used by the follower controller to adapt the control logic that is used to regulate the train movement. For instance, in the case of a heavier train, more power can be applied when the operator desires to bring the train to a certain speed than in the case of a lighter train.
- similarly, for a heavier train the control logic will apply the brakes earlier than in the case of a lighter train, to avoid the train overshooting the target location. Accordingly, the locomotive is caused to execute commands by implementing actions that are conditioned on the parameter specified by the operator.
- the control logic in the follower controller can usually be expressed as a set of transfer functions, each function associated with a certain operation, such as acceleration, coasting, braking, etc.
- Each transfer function has a number of control parameters.
- the parameters of the train input by the operator are used to modify the control parameters of the relevant transfer function. Which control parameters of the transfer function are modified, and the extent to which they are modified, is a matter of design choice that can vary significantly from one application to another.
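One way a train parameter could modify a transfer function's control parameter is sketched below, under the assumption (not stated in the patent) of simple linear scaling: a heavier train gets a larger braking margin so the brakes are applied earlier. The nominal weight and all names are hypothetical.

```python
NOMINAL_TRAIN_WEIGHT_T = 1000.0  # assumed weight (tonnes) the defaults were tuned for

def braking_distance_margin(base_margin_m, train_weight_t):
    """Scale a braking transfer function's control parameter -- the
    distance before the target at which the brakes are applied -- with
    the operator-supplied train weight: a heavier train brakes earlier
    to avoid overshooting the target location."""
    return base_margin_m * (train_weight_t / NOMINAL_TRAIN_WEIGHT_T)
```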
- Another parameter that can be specified by the operator and that can be used to tailor the control logic of the follower controller generally as described earlier, relates to track conditions.
- the operator can specify the grade (incline) of the track, which information can be used to modify the control logic to apply more power when the train is going up the track and less power when the train is moving down, due to the gravity effect. Braking can be tailored in a similar manner.
- Another possible track parameter, among others, is how slippery the track is.
- Parameters about the train and track conditions, distance and target location commands can be entered via voice.
- the speech recognition functionality of the remote control system processes the speech input as described previously to output the relevant command or parameter, which is then processed for implementation.
- the speech recognition dictionary of the system contains vocabulary items that correspond to the distance and target location commands, and to train, track or other parameters that the system should be able to recognize, such that when the operator utters one or more of those commands or parameters the speech recognition system will be able to find proper matches.
Abstract
Description
- This application is a continuation-in-part of pending U.S. patent application Ser. No. 10/328,517 filed Dec. 23, 2002, which is a continuation-in-part of pending U.S. patent application Ser. No. 10/222,560 filed Aug. 16, 2002, which is a continuation of U.S. patent application Ser. No. 09/653,651 filed Sep. 1, 2000 and issued Oct. 15, 2002 as Pat. No. 6,466,847. The contents of the above documents are incorporated herein by reference.
- The invention relates to locomotive remote control technology.
- Under a first broad aspect, the invention provides a lead controller for use with a remote control system for a locomotive. The lead controller comprises an input for receiving a signal containing speech information. A processing unit receives the speech information and performs speech recognition to generate speech recognition results. The processing unit uses the speech recognition results to produce a command for execution by the locomotive. The lead controller also has a communication link interface and is operative for transmitting the command to be executed by the locomotive over a wireless communication link.
- In a second broad aspect, the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller designed to be mounted in the locomotive. The lead controller can wirelessly transmit information to the follower controller. The lead controller includes an input for receiving a signal derived from a spoken utterance and containing speech information. The remote control system has a processing unit for performing speech recognition on the speech information contained in the signal and uses the speech recognition results to produce a command to be executed by the locomotive.
- In a third broad aspect, the invention provides a lead controller for use with a remote control system for a locomotive. The lead controller comprises an input interface for receiving a distance command. The lead controller also has a communication link interface for transmitting the distance command to be executed by the locomotive over a wireless communication link.
- In a fourth broad aspect, the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive. The lead controller has an input interface for receiving from a human operator a distance command. The lead controller also has a communication link interface for transmitting the distance command over a wireless communication link. The follower controller is responsive to the distance command sent over the wireless communication link to cause the locomotive to execute the distance command.
- In a fifth broad aspect, the invention provides a lead controller for use with a remote control system for a locomotive. The lead controller comprises an input interface for receiving a target location command. The lead controller also has a communication link interface for transmitting the target location command to be executed by the locomotive over a wireless communication link.
- In a sixth broad aspect, the invention provides a remote control system for a locomotive that has a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive. The lead controller has an input interface for receiving from a human operator a target location command. The lead controller also has a communication link interface for transmitting the target location command over a wireless communication link. The follower controller is responsive to the target location command sent over the wireless communication link to cause the locomotive to execute the target location command.
- In a seventh broad aspect, the invention provides a lead controller for use with a remote control system for a locomotive pulling a train. The lead controller comprises an input interface for:
- (a) receiving a command for execution by the train; and
- (b) receiving a parameter of the train.
- The lead controller has a communication link interface for transmitting the parameter of the train and the command to be executed by the train over a wireless communication link.
- In an eighth broad aspect, the invention provides a remote control system for a locomotive pulling a train, the system having a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive. The lead controller includes an input interface for receiving:
- (a) a command for execution by the train; and
- (b) a parameter of the train.
- The remote control system has a communication interface for transmitting the command and the parameter of the train over a wireless communication link. The follower controller is responsive to the command and to the parameter of the train sent over the wireless communication link to cause the train to execute the command by implementing actions conditioned at least in part on the parameter of the train.
- In a ninth broad aspect, the invention provides a lead controller for use with a remote control system for a locomotive riding on a track. The lead controller comprises an input interface for:
- (a) receiving a command directing the locomotive to perform a certain action; and
- (b) receiving a parameter of the track.
- The lead controller has a communication link interface for transmitting the parameter of the track and the command for execution by the locomotive over a wireless communication link.
- In a tenth broad aspect, the invention provides a remote control system for a locomotive traveling on a track, the system having a lead controller remote from the locomotive and a follower controller to be mounted in the locomotive. The lead controller includes an input interface for receiving:
- (a) a command for execution by the locomotive; and
- (b) a parameter of the track.
- The remote control system has a communication interface for transmitting the command and the parameter of the track over a wireless communication link. The follower controller is responsive to the command and to the parameter of the track transmitted over the wireless communication link to cause the locomotive to execute the command by implementing actions conditioned at least in part on the parameter of the track.
- A detailed description of examples of implementation of the present invention is provided hereinbelow with reference to the following drawings, in which:
- FIG. 1 is a block diagram of a remote control system for a locomotive;
- FIG. 2 is a block diagram of the lead controller of the remote control system for a locomotive depicted in FIG. 1;
- FIG. 3 is a block diagram of a communication link interface of the lead controller shown in FIG. 2;
- FIG. 4 is a flow-chart illustrating the operation of the lead controller shown in FIG. 2;
- FIG. 5 is a block diagram of a computing platform that can be used to implement some of the components of the remote control system for a locomotive shown in FIG. 1; and
- FIG. 6 is a block diagram of a remote control system for a locomotive according to a variant.
- In the drawings, embodiments of the invention are illustrated by way of example. It is to be expressly understood that the description and drawings are only for purposes of illustration and as an aid to understanding, and are not intended to be a definition of the limits of the invention.
- A non-limiting example of implementation of the present invention is illustrated in FIG. 1 of the drawings. In particular, FIG. 1 illustrates a remote control system for a locomotive. The remote control system includes two main components, namely a lead controller 12 and a follower controller 14 that are linked to one another by a wireless communication link 16.
- In use, an operator dials in commands at the lead controller 12 and those commands are relayed to the follower controller mounted on-board the locomotive. The follower controller processes the commands and issues local signals that are applied to the locomotive such as to implement the commands specified by the operator.
- A detailed block diagram of the lead controller 12 is shown in FIG. 2. The lead controller includes an input interface (not shown) for receiving commands from the operator as to desired actions to be executed by the locomotive, or certain parameters about the train pulled by the locomotive or about the track on which the locomotive or train is moving. The input interface refers broadly to the agency on the lead controller 12 on which commands and parameters can be input, without any limitation to the specific input devices used. The input devices may comprise keys, switches, knobs or levers that must be displaced, depressed, or otherwise mechanically operated by the operator to dial in commands or parameters. Alternatively, the input interface may include a pointing device, a touch-sensitive screen or a voice input. In the specific example described in this application, the input interface has a voice input allowing the operator to enter commands or parameters via voice. In a non-limiting example of implementation, the voice input includes a microphone.
- In addition to the input interface, the lead controller 12 includes a processing unit 15 and a communication link interface 32. The processing unit 15 has an input 17 that receives a signal conveying speech information. In practice, if the voice input device on the input interface is a microphone, the input 17 could be the output from the microphone. The signal at the input 17 is of an analog nature. That signal is applied to an analog-to-digital converter 18 that digitizes it according to a Pulse Code Modulation (PCM) technique or any other suitable method. The stream of PCM samples released from the analog-to-digital converter 18 is applied to a parameterization unit 20 whose task is to extract, from the audio signal containing the speech information, significant features on which further speech processing can be performed.
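The parameterization stage described above can be illustrated with a toy sketch. This is not the patent's implementation: the frame size and the per-frame energy feature are assumptions, chosen because band energies are one of the feature types the specification mentions.

```python
def frame_energies(pcm_samples, frame_size=160):
    """Split a PCM sample stream into fixed-size frames and compute a
    per-frame energy value.

    A toy stand-in for a parameterization unit; real systems extract
    richer features (cepstral coefficients, spectral parameters, etc.).
    """
    energies = []
    for start in range(0, len(pcm_samples) - frame_size + 1, frame_size):
        frame = pcm_samples[start:start + frame_size]
        energies.append(sum(s * s for s in frame) / frame_size)
    return energies

# A silent frame followed by a louder one
samples = [0] * 160 + [100] * 160
print(frame_energies(samples))  # [0.0, 10000.0]
```

Incomplete trailing frames are simply dropped here; a real front-end would typically use overlapping windows.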
- It is not deemed necessary to describe in detail the structure and operation of the
parameterization unit 20 since such component is well known to those skilled in the art. - The feature elements extracted by the
parameterization unit 20 are passed to aspeech recognition engine 22. Any suitable commercially available speech recognition engine can be used without departing from the spirit of the invention. Thespeech recognition engine 22 works in conjunction with aspeech recognition dictionary 24 that contains a list of vocabulary items that thespeech recognition engine 22 can recognize. In use, when thespeech recognition engine 22 receives the feature elements generated by theparameterization unit 20, it generates atoutput 23 data that represents the vocabulary item best matching the spoken utterance characterized by the feature elements. - The vocabulary items held in the
speech recognition dictionary 24 reflect the commands that thelead controller 12 should be able to recognize. - For better recognition performance, the
speech recognition engine 22 is speaker dependent. In other words, thespeech recognition engine 22 should be trained from speech tokens from a specific speaker such that the speech recognition engine better adapts to the characteristics of the speaker. Alternatively, at speaker independent speech recognition engine can be used without departing from the spirit of the invention. - The recognition results are released by the
speech recognition engine 22 on theoutput 23. In one specific example, the recognition results are the vocabulary item found as being the best match to the spoken utterance, expressed in orthographic form. Other types of representations of the recognition results can be used without departing from the spirit of the invention. - The recognition results are input in a text to speech converter25 that synthesizes an audio signal released on the
output 19 to audibly play to the user the recognition result. This mechanism is provided as a safety feature to allow the operator to abort a command in cases when the recognition results are incorrect. In a specific non-limiting example of implementation, the audio signal released from the text-to-speech converter is in analog form. The analog signal is then passed to a suitable amplifier (not shown in the drawings) and a suitable speaker (not shown in the drawings) such as to audibly play the synthetic speech to the operator. - Any suitable text-to-speech converter could be used without departing from the spirit of the invention. In light of the fact that text-to-speech converters are generally known in the art it is not deemed necessary to describe them here in detail.
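The dictionary-matching step performed by a recognition engine, as described above, can be thought of as choosing the vocabulary item whose stored feature template lies closest to the features of the utterance. The sketch below is illustrative only (the templates, the two-command vocabulary and the squared-error distance are assumptions, not the patent's method):

```python
def recognize(utterance_features, dictionary):
    """Return the vocabulary item whose feature template is closest to
    the utterance features.

    A toy nearest-template matcher standing in for a real speech
    recognition engine; distance is a simple sum of squared errors.
    """
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(dictionary, key=lambda item: distance(dictionary[item], utterance_features))

# Hypothetical feature templates for a two-command vocabulary
dictionary = {"forward": [1.0, 0.2], "reverse": [0.1, 0.9]}
print(recognize([0.9, 0.3], dictionary))  # forward
```

A production engine would of course use probabilistic acoustic models rather than a single template per word, but the "best match wins" selection is the same idea.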
- Instead of communicating the recognition results to the operator audibly for verification purposes, the lead controller 12 can communicate those results visually. For example, the lead controller 12 can be provided with a display on which the recognition results are shown.
- The output 23 is also applied to a command verification unit 28. The command verification unit gates the recognition results. If a confirmation that the recognition result is accurate has been received from the operator within a specified time period, the command verification unit 28 will release the recognition result for further processing. If no positive input has been received from the operator within the specified time period, or a negative input has been received, the command verification unit 28 deletes or otherwise negates the recognition results applied at its input.
- In one specific example, the command verification unit 28 will release the recognition results only if the operator has uttered the word “yes” within a certain time frame after reception of the recognition results, say 5 seconds. After the recognition results are input to the command verification unit 28, a timer starts. At the same time, the operator receives from the text-to-speech converter 26 synthesized speech conveying the recognition results. If the operator accepts the results, he or she utters “yes”. The new spoken utterance is processed as described previously and, assuming a correct recognition, the orthographic representation of the word “yes” appears at the output 23 and is supplied to the command verification unit 28. If the word “yes” is received before the expiration of the 5-second interval, the prior recognition results (conveying the original command) are released by the command verification unit 28. If nothing is received by the command verification unit 28 before the timer stops, the prior recognition results buffered in the command verification unit 28 are deleted. The same operation is performed if any word other than “yes” is received by the command verification unit 28.
- In the example of implementation shown in FIG. 2, the architecture of the system is such that the operator will also hear the recognition results from the confirmation utterance, namely the word “yes” (assuming correct recognition). In some applications, this might be desirable. If this feature is not desirable, the system shown in FIG. 2 can be modified such that a control signal is issued from the command verification unit 28 while the timer is counting. The control signal is applied to the text-to-speech converter 26 such as to prevent the converter from operating. After the timer stops, the control signal is no longer generated and the text-to-speech converter 26 is re-activated.
- In a possible variant, a confirmation other than a vocal confirmation can be used. For instance, the lead controller 12 can be provided with a button that the operator needs to depress in order to confirm the recognition results. This possibility can be implemented by modifying the command verification unit 28 to release the recognition results when a logic signal derived from the manual actuation of the button is received before the timer stops. This variant is particularly well suited to applications in which the recognition results are communicated to the operator visually instead of audibly.
- In another possible variant, the command verification unit 28 will include a speaker verification module that verifies that the operator entering the voice command is an authorized user. Prior to using the system, each authorized user will be asked to provide a respective access voiceprint associated with a user identification number. A voiceprint is a mathematical representation of the acoustic properties of a spoken utterance. The access voiceprint will be used to grant access to the control system by performing a similarity measurement between the access voiceprint and an input utterance provided by the operator. For increased security, in addition to the identification number and access voiceprint, a speaker verification operation will be performed for each command received from an operator. In this case, command voiceprints for each allowable command will have to be provided by each authorized user prior to using the control system. These command voiceprints are stored in records in a computer-readable medium and are associated with respective authorized users via their identification numbers. Once an operator has been granted access to the control system by his access voiceprint, the corresponding record containing the command voiceprints is extracted and used for subsequent speaker verification operations. Consequently, each spoken utterance indicative of a command received by the control system is verified against the corresponding command voiceprint in the record associated with the given user. Speaker verification units are well known and will not be further described here. If the operator cannot be verified as an authorized user, the system will issue a message indicating that control access is being denied.
- When the recognition results are released from the command verification unit 28, they are passed to a command translator 29. The purpose of the command translator is to encode the command in a format suitable for processing by the control unit 30, to be described later. Generally stated, the command released by the command verification unit is in orthographic form, which is not the form best suited for analysis such as basic sanity checks and other similar operations to be performed by the control unit 30.
- In a prior art lead controller, when the operator manually acts on the controls, the commands are encoded and supplied to a control unit. In the present example of implementation, the command translator converts the command from its orthographic representation to a format normally obtained from typical manually operated controls. This feature allows using a
control unit 30 of known design, since the control unit 30 will receive commands in a format that it can already interpret.
- The command translator 29 can be designed around a database that maps the orthographic representation of a command to its encoded form. The size of the database will depend upon the number of possible commands the lead controller 12 is designed to vocally accept.
- The control unit 30 receives the encoded command and processes it. One type of processing is to perform a high-level validation or sanity check. For example, when the locomotive is travelling forward and a command is received that specifies a reverse movement, that command is rejected. In general, the control unit 30 is of known construction and does not need to be described in detail here. For more information, the reader is directed to U.S. Pat. Nos. 5,511,749 and 5,685,507, which provide more information on this particular point. The contents of these patents are incorporated herein by reference.
- The output generated by the control unit 30 is passed to the communication link interface 32 such that it can be transmitted to the follower controller 14 over the wireless communication link 16. An example of implementation of the communication link interface is shown in FIG. 3. The communication link interface includes a receiver unit 34 and a transmitter unit 36. Signals issued from the control unit 30 are passed to the transmitter unit 36 for modulation and any other suitable processing such that they can be transported over the wireless communication link 16. Similarly, signals on the wireless communication link 16 directed at the lead controller 12 are passed to the receiver unit 34 for demodulation, and they are then passed to the component of the lead controller 12 designed to process them.
- FIG. 4 is a flowchart that summarizes the operation of the lead controller 12. At block 38, the process starts. At step 40, the vocal command uttered by the operator is received at the input 17. At step 42, the spoken utterance is recognized. At step 44, synthetic speech is created and played to the operator to communicate the recognition results. At step 46, the recognition results are validated. At step 48, the validated recognition results are passed to the command translator 29 for encoding and then to the control unit 30. At step 50, the command is sent to the follower controller 14 over the wireless communication link 16.
- The processing unit 15 can in large part be implemented in software executed on a suitable computing platform of the type illustrated in FIG. 5. Such a computing platform includes a Central Processing Unit (CPU) 60 connected to a memory 62 over a data bus 64. An Input/Output interface 66 is connected to the data bus 64 and allows the computing platform to exchange signals/data with the external world. The memory 62 is used for holding the program instructions of the program elements that implement the functionality of components of the processing unit 15. Also, the memory 62 is used to hold data on which the program elements operate.
- The structure and operation of the follower controller 14 are not described in detail in this specification. For more information the reader is directed to U.S. Pat. Nos. 5,511,749 and 5,685,507.
- FIG. 6 illustrates a variant of the remote control system for a locomotive, designated 100. The remote control system 100 has a lead controller 102 that is generally similar to the lead controller 12 described earlier, and a follower controller 104 that communicates with the lead controller 102 via a communication link 106. The follower controller 104 causes the locomotive to execute a number of functions, including traveling on the track a predetermined distance. The operator can input on the input interface of the lead controller 102 a distance command specifying the distance to travel. Such a distance command can be input as a distance value, such as 100 meters for example, or as a parameter that can be resolved by the remote control system 100 to a distance value. Accordingly, for the purpose of this specification, the expression “distance command” or “command directing the locomotive to travel a predetermined distance” should be interpreted to cover commands that specify explicitly a distance to be travelled or commands that can be resolved into a distance to be travelled. An example of such a command is a command directing the locomotive to move a predetermined number of car lengths. Since the length of a car is known, the remote control system 100 can compute the total distance the locomotive is to travel by multiplying the average length of a car by the number of cars specified by the operator.
- The follower controller 104 uses a distance input in order to determine the actual distance travelled by the locomotive such that when the distance to travel has been reached, the movement can be stopped. There are a wide variety of ways to obtain the distance input without departing from the spirit of the invention. For example:
- (b) If the velocity sensor uses a pulse generator, where each pulse corresponds to a certain distance travelled, counting the pulses represents a way to measure distance.
- Objectively, methods (a) and (b) may introduce errors when the velocity sensor is mounted on a traction wheel of the locomotive and that wheel is subjected to slip.
- (c) The distance input can be obtained externally, as shown by the
arrow 108, via transponder detection. The locomotive is provided with an antenna that senses transponders placed at predetermined locations along the track. Based on the identity of a given transponder and knowing the location of each transponder, the remote control system can derive the distance traveled by the locomotive between two transponders. - (d) Another way to obtain the distance input externally is via a Global Position System (GPS) that can provide travelled distance information. For higher accuracy a differential GPS system can be used.
- In use, when the remote control system receives a distance command, the locomotive either alone or when pulling a train will start moving while monitoring the distance travelled. When the locomotive approaches the end of the track span that corresponds to the distance to be travelled, the locomotive slows down by reduction of power and/or application of brakes such as to stop without exceeding the distance specified by the operator.
- In another possible variant, that is also described in conjunction with FIG. 6, the
remote control system 100 is designed to receive a target location command, which is a command identifying a location on the track where the locomotive is to stop. In this embodiment, theremote control system 100 tracks the current location of the locomotive and compares it to the target location. When the locomotive is at or near the target location theremote control system 100 causes the locomotive to stop by power reduction and/or application of brakes. The command supplied by the operator provides the information about the target location. For example, a target location command can specify the identity of a transponder and the locomotive will stop when this transponder is reached. Alternatively, the target location command can specify a location by expressing coordinates explicitly or implicitly. For instance, the operator may specify the name of a location that can be resolved to a set of coordinates by the remote control system. Alternatively, the operator may be provided on the input interface with a touch sensitive screen showing a map of the yard where the locomotive rides. The operator may touch or otherwise specify a location on the displayed map. Underlying logic in thelead controller 102 can issue based on this input the coordinates, expressed in any suitable format, of the location specified by the operator. - Once the target location is known, the
remote control system 100 can determine when to stop the locomotive by tracking the current position of the locomotive. The current position can be generated from a GPS unit, as depicted by thearrow 108, or by reading a transponder along the track. - In yet another possible variant the remote control system can receive, in addition to commands directing the locomotive to perform a certain function, parameters that can change the way the command is being implemented. For example, the operator may specify parameters about the train being pulled by the locomotive or parameters about the track on which the locomotive is riding. An example of a parameter of the train is the approximate weight of the train or the number of cars in the train. This information can be used by the follower controller to adapt the control logic that is used to regulate the train movement. For instance, in the case of a heavier train, more power can be applied when the operator desired to bring the train to certain speed, than in the case of a lighter train. In a braking scenario, when the operator desires to bring the train to a stop at a certain location, the control logic will apply the brakes earlier than in the case of a lighter train, to avoid the train overshooting the target location. Accordingly, the locomotive is caused to execute commands by implementing actions that are conditioned on the parameter specified by the operator.
- The control logic in the follower controller can usually be expressed as a set of transfer functions, each function associated to a certain operation, such as acceleration, coasting, braking, etc. Each transfer function has a number of control parameters. The parameters of the train input by the operator used to modify the control parameters of the relevant transfer function. Which control parameters of the transfer function, and the extent to which they can be modified, is a matter of design choice that can vary significantly from one application to the other.
- Another parameter that can be specified by the operator and that can be used to tailor the control logic of the follower controller generally as described earlier, relates to track conditions. For example, the operator can specify the grade (incline) of the track which information can be used modify the control logic to apply more power when the train is going up the track and less power when the train is moving down, due to the gravity effect. Braking can be tailored in a similar manner. Another possible track parameter, among others, is how slippery the track is.
- Parameters about the train and track conditions, distance and target location commands can be entered via voice. The speech recognition functionality of the remote control system processes the speech input as required previously to output the relevant command or parameter, which is then processed for implementation. The speech recognition dictionary of the system contains vocabulary items that correspond to the distance and target location commands, train, track or other parameters that the system should be able to recognize, such that when the operator utters one or more of those commands or parameters the speech recognition system will be able find proper matches.
- It should be expressly noted that parameters about the train and track conditions, and distance and target location commands could also be specified by the operator by using input devices other than a speech sensing input.
- Although various embodiments have been illustrated, this was for the purpose of describing, but not limiting, the invention. Various modifications will become apparent to those skilled in the art and are within the scope of this invention, which is defined more particularly by the attached claims.
Claims (112)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/754,525 US7236859B2 (en) | 2000-09-01 | 2004-01-12 | Remote control system for a locomotive |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US09/653,651 US6466847B1 (en) | 2000-09-01 | 2000-09-01 | Remote control system for a locomotive using voice commands |
US10/222,560 US6697716B2 (en) | 2000-09-01 | 2002-08-16 | Remote control system for a locomotive using voice commands |
US10/328,517 US6799098B2 (en) | 2000-09-01 | 2002-12-23 | Remote control system for a locomotive using voice commands |
US10/754,525 US7236859B2 (en) | 2000-09-01 | 2004-01-12 | Remote control system for a locomotive |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/328,517 Continuation-In-Part US6799098B2 (en) | 2000-09-01 | 2002-12-23 | Remote control system for a locomotive using voice commands |
Publications (2)
Publication Number | Publication Date |
---|---|
US20040143374A1 true US20040143374A1 (en) | 2004-07-22 |
US7236859B2 US7236859B2 (en) | 2007-06-26 |
Family
ID=32718978
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/754,525 Expired - Lifetime US7236859B2 (en) | 2000-09-01 | 2004-01-12 | Remote control system for a locomotive |
Country Status (1)
Country | Link |
---|---|
US (1) | US7236859B2 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7484169B2 (en) * | 2006-02-15 | 2009-01-27 | General Electric Company | Implicit message sequence numbering for locomotive remote control system wireless communications |
US9026284B2 (en) * | 2006-09-21 | 2015-05-05 | General Electric Company | Methods and systems for throttle control and coupling control for vehicles |
US10597055B2 (en) | 2015-11-02 | 2020-03-24 | Methode Electronics, Inc. | Locomotive control networks |
US11854309B2 (en) | 2021-10-30 | 2023-12-26 | Cattron North America, Inc. | Systems and methods for remotely controlling locomotives with gestures |
US12109993B2 (en) | 2022-04-19 | 2024-10-08 | Transportation Ip Holdings, Llc | Vehicle control system and method |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4462080A (en) * | 1981-11-27 | 1984-07-24 | Kearney & Trecker Corporation | Voice actuated machine control |
US5039038A (en) * | 1983-09-14 | 1991-08-13 | Harris Corporation | Railroad communication system |
US5774841A (en) * | 1995-09-20 | 1998-06-30 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Real-time reconfigurable adaptive speech recognition command and control apparatus and method |
US6012029A (en) * | 1995-09-29 | 2000-01-04 | Cirino; Sepideh S. | Voice activated system for locating misplaced items |
US6128594A (en) * | 1996-01-26 | 2000-10-03 | Sextant Avionique | Process of voice recognition in a harsh environment, and device for implementation |
US6587824B1 (en) * | 2000-05-04 | 2003-07-01 | Visteon Global Technologies, Inc. | Selective speaker adaptation for an in-vehicle speech recognition system |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4641292A (en) | 1983-06-20 | 1987-02-03 | George Tunnell | Voice controlled welding system |
JPS6059901A (en) | 1983-09-07 | 1985-04-06 | Mitsubishi Heavy Ind Ltd | Voice control system of electric railcar |
US4725956A (en) | 1985-10-15 | 1988-02-16 | Lockheed Corporation | Voice command air vehicle control system |
US4872195A (en) | 1986-11-13 | 1989-10-03 | Gentner Electronics Corporation | Remote control unit for radio/television transmitter station |
US4893240A (en) | 1987-01-29 | 1990-01-09 | Imad Karkouti | Remote control system for operating selected functions of a vehicle |
US5511749A (en) | 1994-04-01 | 1996-04-30 | Canac International, Inc. | Remote control system for a locomotive |
US5832440A (en) | 1996-06-10 | 1998-11-03 | Dace Technology | Trolling motor with remote-control system having both voice-command and manual modes |
EP0998405A4 (en) | 1997-07-22 | 2002-10-23 | Tranz Rail Ltd | Locomotive remote control system |
DE19743306A1 (en) | 1997-09-30 | 1999-04-08 | Siemens Ag | Mobile operation apparatus especially for train |
EP0971330A1 (en) | 1998-07-07 | 2000-01-12 | Otis Elevator Company | Verbal remote control device |
CA2248526A1 (en) | 1998-09-25 | 2000-03-25 | Canac Inc. | Method and apparatus for automatic repetition rate assignment in a remote control system |
US6449536B1 (en) | 2000-07-14 | 2002-09-10 | Canac, Inc. | Remote control system for locomotives |
US6466847B1 (en) | 2000-09-01 | 2002-10-15 | Canac Inc | Remote control system for a locomotive using voice commands |
US6470245B1 (en) | 2002-01-31 | 2002-10-22 | Canac Inc. | Remote control system for a locomotive with solid state tilt sensor |
- 2004-01-12: US application US10/754,525 granted as patent US7236859B2 (status: Expired - Lifetime)
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9950722B2 (en) | 2003-01-06 | 2018-04-24 | General Electric Company | System and method for vehicle control |
US20080195265A1 (en) * | 2004-05-03 | 2008-08-14 | Sti Rail Pty Ltd | Train Integrity Network System |
US20140180445A1 (en) * | 2005-05-09 | 2014-06-26 | Michael Gardiner | Use of natural language in controlling devices |
US9733625B2 (en) | 2006-03-20 | 2017-08-15 | General Electric Company | Trip optimization system and method for a train |
US20130018531A1 (en) * | 2006-03-20 | 2013-01-17 | Ajith Kuttannair Kumar | System, method, and computer software code for controlling speed regulation of a remotely controlled powered system |
US10569792B2 (en) | 2006-03-20 | 2020-02-25 | General Electric Company | Vehicle control system and method |
US8989917B2 (en) * | 2006-03-20 | 2015-03-24 | General Electric Company | System, method, and computer software code for controlling speed regulation of a remotely controlled powered system |
US10308265B2 (en) | 2006-03-20 | 2019-06-04 | Ge Global Sourcing Llc | Vehicle control system and method |
US9828010B2 (en) | 2006-03-20 | 2017-11-28 | General Electric Company | System, method and computer software code for determining a mission plan for a powered system using signal aspect information |
US20130006452A1 (en) * | 2010-04-28 | 2013-01-03 | Mitsubishi Electric Corporation | Train speed control apparatus and train speed control method |
US8670883B2 (en) * | 2010-04-28 | 2014-03-11 | Mitsubishi Electric Corporation | Train speed control apparatus and train speed control method |
CN102887151A (en) * | 2011-07-01 | 2013-01-23 | 通用电气公司 | Control system |
US8983759B2 (en) | 2012-06-29 | 2015-03-17 | General Electric Company | System and method for communicating in a vehicle consist |
US9682716B2 (en) | 2012-11-21 | 2017-06-20 | General Electric Company | Route examining system and method |
US9834237B2 (en) | 2012-11-21 | 2017-12-05 | General Electric Company | Route examining system and method |
US9669851B2 (en) | 2012-11-21 | 2017-06-06 | General Electric Company | Route examination system and method |
US9262693B2 (en) * | 2013-04-26 | 2016-02-16 | Denso Corporation | Object detection apparatus |
US20140321759A1 (en) * | 2013-04-26 | 2014-10-30 | Denso Corporation | Object detection apparatus |
US20200302935A1 (en) * | 2013-10-14 | 2020-09-24 | Samsung Electronics Co., Ltd. | Display apparatus capable of releasing a voice input mode by sensing a speech finish and voice control method thereof |
US11823682B2 (en) * | 2013-10-14 | 2023-11-21 | Samsung Electronics Co., Ltd. | Display apparatus capable of releasing a voice input mode by sensing a speech finish and voice control method thereof |
US10395657B2 (en) * | 2013-10-14 | 2019-08-27 | Samsung Electronics Co., Ltd. | Display apparatus capable of releasing a voice input mode by sensing a speech finish and voice control method thereof |
US20190341051A1 (en) * | 2013-10-14 | 2019-11-07 | Samsung Electronics Co., Ltd. | Display apparatus capable of releasing a voice input mode by sensing a speech finish and voice control method thereof |
US10720162B2 (en) * | 2013-10-14 | 2020-07-21 | Samsung Electronics Co., Ltd. | Display apparatus capable of releasing a voice input mode by sensing a speech finish and voice control method thereof |
US9689681B2 (en) | 2014-08-12 | 2017-06-27 | General Electric Company | System and method for vehicle operation |
US11208125B2 (en) * | 2016-08-08 | 2021-12-28 | Transportation Ip Holdings, Llc | Vehicle control system |
US10304475B1 (en) * | 2017-08-14 | 2019-05-28 | Amazon Technologies, Inc. | Trigger word based beam selection |
US10468017B2 (en) * | 2017-12-14 | 2019-11-05 | GM Global Technology Operations LLC | System and method for understanding standard language and dialects |
US20190189113A1 (en) * | 2017-12-14 | 2019-06-20 | GM Global Technology Operations LLC | System and method for understanding standard language and dialects |
CN112292303A (en) * | 2018-06-21 | 2021-01-29 | 西门子交通有限公司 | Method and device for controlling a rail vehicle by means of voice messages |
Also Published As
Publication number | Publication date |
---|---|
US7236859B2 (en) | 2007-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7236859B2 (en) | Remote control system for a locomotive | |
US6697716B2 (en) | Remote control system for a locomotive using voice commands | |
AU2001275616A1 (en) | Remote control system for a locomotive using voice commands | |
US6799098B2 (en) | Remote control system for a locomotive using voice commands | |
EP0560786B1 (en) | Audio navigation for vehicles | |
US7805240B2 (en) | Driving behavior prediction method and apparatus | |
KR102414456B1 (en) | Dialogue processing apparatus, vehicle having the same and accident information processing method | |
US9644985B2 (en) | Navigation device that evaluates points of interest based on user utterance | |
CN103810995B (en) | Adjusting method and system for voice system | |
CN105637323B (en) | Navigation server, navigation system and navigation method | |
EP0813718A1 (en) | Navigation system utilizing audio cd player for data storage | |
US8374868B2 (en) | Method of recognizing speech | |
CN202743179U (en) | Voice control device for automobile | |
KR20200016055A (en) | Intelligent drowsiness driving prevention device | |
CN109102801A (en) | Speech recognition method and speech recognition device | |
KR20190011458A (en) | Vehicle, mobile device for communicating with the vehicle, and method for controlling the vehicle | |
US9476728B2 (en) | Navigation apparatus, method and program | |
KR20200006738A (en) | Dialogue system, and dialogue processing method | |
CN107426143A (en) | Voiceprint-recognition-based method and device for quick user access to a vehicle | |
KR20200000621A (en) | Dialogue processing apparatus, vehicle having the same and dialogue processing method | |
JPH1063288A (en) | Voice recognition device | |
JP2021024415A (en) | Automobile acceleration control system, method, program | |
KR20190135676A (en) | Dialogue system, vehicle having the same and dialogue processing method | |
JP7178147B1 (en) | Information processing device, information processing method, program | |
CN107264445A (en) | Method for recognizing deceleration behavior of hazardous-chemicals vehicles
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: BELTPACK CORPORATION, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CANAC INC.;REEL/FRAME:015642/0377 Effective date: 20040430 |
|
AS | Assignment |
Owner name: BELTPACK CORPORATION, CANADA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HORST, FOLKERT;GEORGIEV, STEPHAN P.;MATTAR, BRIGIDE;REEL/FRAME:015807/0516;SIGNING DATES FROM 20040902 TO 20040920 |
|
AS | Assignment |
Owner name: ARGOSY INVESTMENT PARTNERS II, L.P., PENNSYLVANIA Free format text: SECURITY INTEREST;ASSIGNOR:CATTRON INTELLECTUAL PROPERTY CORPORATION;REEL/FRAME:016116/0653 Effective date: 20041015 |
|
AS | Assignment |
Owner name: CATTRON INTELLECTUAL PROPERTY CORPORATION, PENNSYLVANIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BELTPACK CORPORATION;REEL/FRAME:015587/0725 Effective date: 20041015 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAT HOLDER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: LTOS); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 8 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 12TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1553); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 12 |
|
AS | Assignment |
Owner name: CATTRON-THEIMEG, INC., PENNSYLVANIA Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:CATTRON INTELLECTUAL PROPERTY CORPORATION;CATTRON INTELLECTUAL PROPERTY CORPORATION;REEL/FRAME:047704/0955 Effective date: 20131231 |
|
AS | Assignment |
Owner name: LAIRD CONTROLS NORTH AMERICA INC., PENNSYLVANIA Free format text: MERGER AND CHANGE OF NAME;ASSIGNORS:CATTRON-THEIMEG, INC.;CATTRON-THEIMEG, INC.;REEL/FRAME:048407/0964 Effective date: 20140825 |
|
AS | Assignment |
Owner name: CATTRON INTELLECTUAL PROPERTY CORPORATION, PENNSYLVANIA Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:ARGOSY INVESTMENT PARTNERS II, L.P.;REEL/FRAME:048029/0474 Effective date: 20190103 |
|
AS | Assignment |
Owner name: CATTRON NORTH AMERICA, INC., OHIO Free format text: CHANGE OF NAME;ASSIGNOR:LAIRD CONTROLS NORTH AMERICA INC.;REEL/FRAME:049677/0840 Effective date: 20190220 |