MX2015001237A - Vehicle with traffic flow reminder. - Google Patents

Vehicle with traffic flow reminder.

Info

Publication number
MX2015001237A
Authority
MX
Mexico
Prior art keywords
vehicle
signal
front vehicle
range
main vehicle
Prior art date
Application number
MX2015001237A
Other languages
Spanish (es)
Other versions
MX350151B (en)
Inventor
Hsin-Hsiang Yang
Kwaku O Prakah-Asante
Original Assignee
Ford Global Tech Llc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US14/165,655 external-priority patent/US9349292B2/en
Application filed by Ford Global Tech Llc filed Critical Ford Global Tech Llc
Publication of MX2015001237A publication Critical patent/MX2015001237A/en
Publication of MX350151B publication Critical patent/MX350151B/en

Links

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems
    • G08G1/166 - Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)
  • Control Of Driving Devices And Active Controlling Of Vehicle (AREA)
  • Combined Controls Of Internal Combustion Engines (AREA)

Abstract

A vehicle system includes at least one sensor configured to output a range signal and a range-rate signal. The range signal represents a distance from a host vehicle to a front vehicle, and the range-rate signal represents range-rate information of the front vehicle relative to the host vehicle. A processing device is configured to output an alarm signal based on the range signal, the range-rate signal, and whether a driver of the host vehicle is determined to be distracted.

Description

VEHICLE WITH TRAFFIC FLOW REMINDER Cross-Reference to Related Application This application is a continuation-in-part of United States patent application No. 14/158,769, filed on January 17, 2014, entitled "AUTOMATIC CONTROL FOR STARTING AND STOPPING AN ENGINE," the content of which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION Turning off the engine of a vehicle when the vehicle stops can improve fuel economy and reduce emissions. Ideally, the engine would restart just before the driver makes a power demand. Some ways to anticipate driver power demand include detecting when the driver presses the accelerator pedal (for vehicles with automatic transmissions) or when the driver operates the clutch and the gearbox (for vehicles with manual transmissions). Turning the engine back on before the power demand is made balances a rapid response to the driver's power demand with better fuel economy and lower emissions relative to vehicles where the engine runs continuously during operation of the vehicle.
BRIEF DESCRIPTION OF THE DRAWINGS FIG. 1 illustrates a main vehicle having a system for predicting the power demands of a driver based on the movement of a front vehicle.
FIG. 2 illustrates an exemplary system for starting a motor of a main vehicle to meet the power demands of a driver.
FIG. 3 is a flowchart of an exemplary process that can be used by one or more components of the system of FIG. 2.
FIG. 4A is a graph showing exemplary range and range-rate data when the front vehicle is moving away from the main vehicle.
FIG. 4B is a graph showing exemplary range and range-rate data when the front vehicle moved and then stopped such that it should not cause the main vehicle engine to start.
DETAILED DESCRIPTION OF THE INVENTION An exemplary vehicle system includes at least one sensor configured to output a range signal and a range-rate signal. The range signal represents a distance from a main vehicle to a front vehicle, and the range-rate signal represents range-rate information of the front vehicle relative to the main vehicle. A processing device is configured to output a control signal based on the range signal and the range-rate signal. The control signal commands an engine of the main vehicle to start. With this system, the engine of the main vehicle can be turned off when the main vehicle stops. The engine can remain off until the front vehicle starts to move away from the main vehicle. As described below, the system does not interpret every movement of the front vehicle as the front vehicle moving away from the main vehicle. Therefore, the system is able to account for "false positives" that can be triggered by, for example, the front vehicle advancing and then stopping again a short distance later, which can occur at a traffic signal or during stop-and-go traffic. In addition, the system can be configured to detect whether the driver is distracted and, if so, alert the driver that the front vehicle has begun to move.
The system illustrated in the figures can take many different forms and include multiple and/or alternative components. Although an exemplary system is illustrated, the illustrated components are not intended to be exclusive. Indeed, additional or alternative components and/or implementations can be used.
In FIG. 1, a main vehicle 100 and a front vehicle 105 are illustrated. The main vehicle 100 and the front vehicle 105 can be any type of passenger or commercial vehicle such as a car, truck, sport utility vehicle, crossover vehicle, trailer, etc. While both vehicles are stopped with the main vehicle 100 directly behind the front vehicle 105, the engine 110 of the main vehicle 100 can be turned off. While stopped, the main vehicle 100 can monitor the movement of the front vehicle 105. This may include using one or more on-board sensors 120 to determine a distance to the front vehicle 105. This distance may be referred to as the "range" of the front vehicle 105 relative to the main vehicle 100. The same sensor or a different sensor 120 may further determine range-rate information of the front vehicle 105. The range-rate information can include the speed with which the front vehicle 105 is moving away from the main vehicle 100. As described in more detail below, the engine 110 of the main vehicle 100 can be started once the range and range-rate values exceed a predetermined envelope, which may define a relationship between the range and range-rate values at which the front vehicle 105 is determined to be moving away from the main vehicle 100. The predetermined envelope can eliminate "false starts" that might otherwise occur when the front vehicle 105 begins to move but then stops again (e.g., the driver of the front vehicle 105 briefly lifts his foot off the brake and then applies the brake again to bring the front vehicle 105 to a complete stop). In this way, the engine 110 of the main vehicle 100 can remain off during false starts, while still providing fuel economy and emissions benefits despite relatively minor movement of the front vehicle 105.
In addition, as described in more detail below with respect to FIG. 2, the main vehicle 100 can further alert the driver that the front vehicle 105 is moving away from the main vehicle 100. In some cases, the main vehicle 100 may only alert the driver if, e.g., the main vehicle 100 determines that the driver is distracted or may not realize that the front vehicle 105 has begun to move.
With reference now to FIG. 2, a system 115 incorporated into the main vehicle 100 for determining whether to start the engine 110 includes at least one sensor 120, an engine controller 125, and a processing device 130. The system 115 can further determine whether the driver is distracted and, if so, alert the driver that the front vehicle 105 has begun to move. Accordingly, the system 115 may further include a user interface device 135, a driver monitoring system 140, and a communication interface device 145.
The sensor 120 may include any number of devices configured to generate signals that allow the main vehicle 100 to "see" one or more other vehicles, such as the front vehicle 105. Examples of sensors 120 may include a radar sensor, a lidar sensor, a camera, or the like. The sensor 120 may be configured to output a range signal based on a distance between the main vehicle 100 and the front vehicle 105. The same sensor or a different sensor 120 may be configured to output a range-rate signal based on the range-rate information of the front vehicle 105.
The engine controller 125 may be configured to control the operation of the engine 110 and possibly other powertrain components, including the transmission. The engine 110 may be an internal combustion engine 110 configured to convert a fuel, such as gasoline, into mechanical motion. The engine 110 may include one or more combustion chambers in which the fuel is oxidized. The fuel can be compressed and ignited in the combustion chamber. The combustion in each chamber can generate a force that drives a piston to rotate a shaft. The engine 110 can include any number of combustion chambers. A cylinder block can define the combustion chambers, as well as house the pistons and the shaft that make up the engine 110. The cylinder block can be cast from, e.g., iron, an aluminum alloy, or any other material that can transfer heat to the engine coolant circulating through the cylinder block. The engine controller 125 can control the combustion timing described above. In addition, the engine controller 125 may be configured to receive inputs from various components and/or subsystems of the main vehicle 100.
The processing device 130 may include any number of devices configured to anticipate a power demand of the main vehicle 100. One way to anticipate the power demand is to evaluate the movement of the front vehicle 105 relative to the main vehicle 100 and determine whether to start the engine 110 based on the range represented by the range signal and the range-rate information represented by the range-rate signal. If the processing device 130 determines that the engine 110 should be started, the processing device 130 can be configured to output a control signal that commands the engine 110 to start. The control signal can be provided to, e.g., the engine controller 125. In an alternative implementation, the processing device 130 may be incorporated into the engine controller 125, meaning that the engine controller 125 may receive and process the range and range-rate signals, as well as determine whether to start the engine 110 based on those signals. In one possible approach, the processing device 130 can be configured to compare the movement of the front vehicle 105, represented by the range and range-rate information, with a predetermined envelope and output the control signal that commands the engine 110 to start if the movement of the front vehicle 105 exceeds the predetermined envelope.
The processing device 130 can be configured to remove an initial stopping distance (range) before evaluating the movement of the front vehicle 105. Equation (1), below, describes one way in which the initial stopping distance can be removed:

range_lowpass(k) = a · range_lowpass(k - 1) + (1 - a) · range(k)
range_mod(k) = range(k) - range_lowpass(k) (1)

The envelope can be compared with the range rate and the modified range from Equation (1). Next, an example envelope is defined as a quadrant of a circle (Equation (2)):

range_mod^2 + range_rate^2 > radius^2, if range_mod > 0 and range_rate > 0 (2)

In addition to anticipating the power demand of the main vehicle 100 and evaluating the movement of the front vehicle 105, the processing device 130 may be further configured to determine whether the main vehicle 100, the front vehicle 105, or both, are stopped. The processing device 130 can determine whether the main vehicle 100 is stopped using a sensor 120, such as a Global Positioning System (GPS) sensor 120 or a speedometer. Alternatively, the processing device 130 may receive a wheel speed measurement or another value representative of the speed of the main vehicle 100. In some cases, the processing device 130 may derive the speed of the main vehicle 100 from one or more sensor 120 values. The processing device 130 can be configured to determine whether the front vehicle 105 is stopped based on the range and/or range-rate signals output by the sensors 120. For example, the processing device 130 can determine that the front vehicle 105 is stopped if the range rate of the front vehicle 105 is zero or if the range between the main vehicle 100 and the front vehicle 105 is not changing. In some cases, the processing device 130 can determine whether both the front vehicle 105 and the main vehicle 100 are stopped before deciding whether to turn the engine 110 of the main vehicle 100 off or on. Moreover, the processing device 130 may be configured to determine whether the engine 110 of the main vehicle 100 is turned off before attempting to anticipate any power demand. In other words, the processing device 130 may evaluate the movement of the front vehicle 105 to determine, e.g., whether the engine 110 of the main vehicle 100 should be started, only if the engine 110 of the main vehicle 100 is turned off.
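By way of illustration only, the following sketch shows how Equations (1) and (2) could be evaluated in software. It is a minimal example rather than the patent's implementation: the class name EnvelopeDetector, the smoothing coefficient a = 0.95, and the radius value are assumptions chosen for the example.

    # Illustrative sketch of Equations (1) and (2); the coefficient and radius are
    # arbitrary example values, not calibrations taken from the patent.
    class EnvelopeDetector:
        def __init__(self, a=0.95, radius=2.0):
            self.a = a                   # low-pass filter coefficient, 0 < a < 1
            self.radius = radius         # envelope radius (calibratable)
            self.range_lowpass = None    # slowly varying estimate of the stopped range

        def update(self, range_k, range_rate_k):
            """Return True when the front vehicle's movement exceeds the envelope."""
            if self.range_lowpass is None:
                # Initialize the filter with the initial stopping distance.
                self.range_lowpass = range_k
            # Equation (1): remove the initial stopping distance.
            self.range_lowpass = self.a * self.range_lowpass + (1 - self.a) * range_k
            range_mod = range_k - self.range_lowpass
            # Equation (2): quarter-circle envelope in the (range_mod, range_rate) plane.
            if range_mod > 0 and range_rate_k > 0:
                return range_mod ** 2 + range_rate_k ** 2 > self.radius ** 2
            return False

Filtering the range this way means that a front vehicle that merely creeps forward and stops again produces a small modified range and range rate, stays inside the envelope, and does not trigger an engine start.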
The user interface device 135 may be configured to present information to a user, such as a driver, during operation of the main vehicle 100. In addition, the user interface device 135 may be configured to receive inputs from the user. Accordingly, the user interface device 135 may be located in the passenger compartment of the main vehicle 100. In some possible approaches, the user interface device 135 may include a touch-sensitive display screen and/or a verbal interface. The user interface device 135 may be configured to output signals indicating that the user is providing inputs to, or interacting with, the user interface device 135. Examples of user interface devices 135 may include an entertainment system, a climate control system, a navigation system, or the like.
The driver monitoring system 140 may be configured to monitor a driver's state. For example, the driver monitoring system 140 may include a camera located in the passenger compartment. The driver monitoring system 140 can use image processing to determine where the driver is looking. If the driver is looking through the front window of the main vehicle 100, the driver monitoring system 140 can determine that the driver is not distracted. If the driver is looking at, e.g., the user interface device 135 or a mobile device, through the side windows, or anywhere other than through the front window, the driver monitoring system 140 may be configured to determine that the driver is distracted. The driver monitoring system 140 may be configured to output one or more signals representing whether the driver is distracted.
The communication interface device 145 may be configured to interface with any number of mobile devices brought into the passenger compartment. The communication interface device 145 may, in one possible approach, use a communication protocol such as Bluetooth® to communicate with the mobile device. The communication interface device 145 may be configured to communicate with the mobile device to determine, e.g., whether the mobile device is currently in use and whether that use requires interaction from, e.g., the driver. For example, if the mobile device is playing music, no driver interaction is required. However, if the mobile device is displaying a web page or running a game or social networking application, the communication interface device 145 may determine that the driver is interacting with the mobile device. The communication interface device 145 may be configured to output a signal indicating whether the driver is likely to be interacting with the mobile device.
The processing device 130 can be configured to determine whether the driver is distracted based on the signals received from, e.g., the user interface device 135, the driver monitoring system 140, and the communication interface device 145. For example, the processing device 130 may receive a signal indicating that the driver is interacting with the user interface device 135, is not looking through the windshield, or is connected to a mobile device. In any of these cases, and possibly in other circumstances, the processing device 130 may output a signal that causes an audible and/or visual alarm to be presented to the driver. The audible and/or visual alarm can be presented by, e.g., the user interface device 135, such as through the speakers of the entertainment system. The audible and/or visual alarm can alert the driver that the front vehicle 105 has started to move away from the main vehicle 100.
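As a rough sketch of the distraction decision just described, the three signal sources could be combined into a single flag, with the alarm gated on both that flag and the envelope result. The function and argument names below are illustrative assumptions; a production system might weigh or debounce the signals rather than simply OR them.

    # Hypothetical combination of the signals described above; the names and the
    # simple OR logic are assumptions for illustration only.
    def driver_is_distracted(interacting_with_hmi, gaze_on_road, phone_needs_interaction):
        """Flags from the user interface device 135, the driver monitoring
        system 140, and the communication interface device 145, respectively."""
        return interacting_with_hmi or (not gaze_on_road) or phone_needs_interaction

    def should_alert(front_vehicle_moving_away, distracted):
        """Present the audible and/or visual alarm only when the front vehicle has
        started to move away and the driver appears distracted."""
        return front_vehicle_moving_away and distracted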
In general, computing systems and/or devices, such as the processing device 130, may employ any of a number of computer operating systems including, but by no means limited to, versions and/or varieties of the Ford Sync® operating system, the Microsoft Windows® operating system, the Unix operating system (e.g., the Solaris® operating system distributed by Oracle Corporation of Redwood Shores, California), the AIX UNIX operating system distributed by International Business Machines of Armonk, New York, the Linux operating system, the Mac OS X and iOS operating systems distributed by Apple Inc. of Cupertino, California, the BlackBerry OS distributed by Research In Motion of Waterloo, Canada, and the Android operating system developed by the Open Handset Alliance. Examples of computing devices include, without limitation, an on-board vehicle computer, a computer workstation, a server, a desktop computer, a notebook, laptop, or handheld computer, or some other computing system and/or device.
Computing devices generally include computer-executable instructions, where the instructions may be executable by one or more computing devices such as those listed above. Computer-executable instructions may be compiled or interpreted from computer programs created using a variety of programming languages and/or technologies including, without limitation and either alone or in combination, Java™, C, C++, Visual Basic, Java Script, Perl, etc. In general, a processor (e.g., a microprocessor) receives instructions, e.g., from a memory, a computer-readable medium, etc., and executes these instructions, thereby performing one or more processes, including one or more of the processes described herein. Such instructions and other data may be stored and transmitted using a variety of computer-readable media.
A computer-readable medium (also referred to as a processor-readable medium) includes any non-transitory (e.g., tangible) medium that participates in providing data (e.g., instructions) that may be read by a computer (e.g., by a processor of a computer). Such a medium may take many forms, including, but not limited to, non-volatile media and volatile media. Non-volatile media may include, for example, optical or magnetic disks and other persistent memory. Volatile media may include, for example, dynamic random access memory (DRAM), which typically constitutes a main memory. Such instructions may be transmitted by one or more transmission media, including coaxial cables, copper wire, and fiber optics, including the wires that comprise a system bus coupled to a processor of a computer. Common forms of computer-readable media include, for example, a floppy disk, a flexible disk, a hard disk, magnetic tape, any other magnetic medium, a compact disc read-only memory (CD-ROM), a digital versatile disc (DVD), any other optical medium, punch cards, paper tape, any other physical medium with patterns of holes, a random access memory (RAM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), a FLASH-EEPROM, any other memory chip or cartridge, or any other medium from which a computer can read.
In some examples, system elements may be implemented as computer-readable instructions (e.g., software) on one or more computing devices (e.g., servers, personal computers, etc.), stored on computer-readable media associated therewith (e.g., disks, memories, etc.). A computer program product may comprise such instructions stored on computer-readable media for carrying out the functions described herein.
FIG. 3 is a flow chart of an exemplary process 300 that can be implemented by one or more components of the system 115 of FIG. 2. For example, the process 300 can be executed by the processing device 130.
In block 305, the processing device 130 can receive the range signal. As described above, the range signal can be generated by one or more sensors 120 and can represent a distance between the main vehicle 100 and the front vehicle 105. In addition, in some cases, the initial range can be zeroed out so that a modified range is considered.
In block 310, the processing device 130 can receive the range-rate signal. The range-rate signal may be generated by one or more sensors 120 and may represent the range-rate information of the front vehicle 105. An example of range-rate information may include how quickly the front vehicle 105 is moving away from the main vehicle 100.
In decision block 315, the processing device 130 can determine whether the front vehicle 105 is stopped. The processing device 130 can determine whether the front vehicle 105 is stopped from the range and/or range-rate signals output by the sensors 120. For example, the processing device 130 can determine that the front vehicle 105 is stopped if the range rate of the front vehicle 105 is zero or if the range between the main vehicle 100 and the front vehicle 105 is not changing. If the front vehicle 105 is stopped, the process 300 may continue at block 320. Otherwise, the process 300 may return to block 305.
In decision block 320, the processing device 130 can determine whether the main vehicle 100 is stopped. For example, the processing device 130 may rely on a sensor 120, such as a Global Positioning System (GPS) sensor 120 or a speedometer, to determine whether the main vehicle 100 is stopped. Alternatively, the processing device 130 may receive a wheel speed measurement or another value representative of the speed of the main vehicle 100.
In some cases, the processing device 130 can obtain the speed of the main vehicle 100 from one or more sensor 120 values. If the processing device 130 determines that the main vehicle 100 is stopped, the process 300 can continue at block 325. If the processing device 130 determines that the main vehicle 100 is still moving, the process 300 may return to block 305.
In block 325, the processing device 130 can evaluate the movement of the front vehicle 105 relative to the main vehicle 100. The evaluation of the movement can be based, at least in part, on additional range and range-rate signals received after determining that the front vehicle 105 and the main vehicle 100 are stopped at blocks 315 and 320. As described above, an increase in range (including an increase in the modified range) may suggest that the front vehicle 105 is moving away from the main vehicle 100. The range-rate information of the front vehicle 105 suggests how quickly the front vehicle 105 is moving away from the main vehicle 100. As such, the range information (including the modified range) and the range rate can be used to evaluate the movement of the front vehicle 105.
In decision block 330, the processing device 130 can compare the movement of the front vehicle 105 with a predetermined threshold. For example, the processing device 130 may apply, e.g., Equation (2) to determine whether the range and range rate suggest that the front vehicle 105 is moving away from the main vehicle 100 in a manner that indicates a high probability that the driver of the main vehicle 100 will soon make a power demand. Therefore, if the movement of the front vehicle 105 exceeds the predetermined threshold, the process 300 may continue at block 335. If the movement of the front vehicle 105 does not exceed the predetermined threshold, the process 300 may return to block 325 so that further movement of the front vehicle 105 may be evaluated.
In decision block 335, the processing device 130 can determine whether the engine 110 of the main vehicle 100 is turned off. The engine 110 may not be turned off every time the main vehicle 100 stops. Before commanding the engine 110 to start at block 340, the processing device 130 can first confirm that the engine 110 is actually turned off. If it is off, the process 300 may continue at block 340. If the engine 110 is still on, the process 300 may return to block 325.
In block 340, the processing device 130 can output the control signal that commands the engine 110 of the main vehicle 100 to start. The control signal can be provided to the engine controller 125 so that the engine controller 125 can start the engine 110 according to the control signal.
In decision block 345, the processing device 130 can receive signals from the user interface device 135, the driver monitoring system 140, and/or the communication interface device 145, and determine whether the driver of the main vehicle 100 is distracted. The processing device 130 can determine whether the driver is distracted based on, e.g., whether the driver is interacting with the user interface device 135, whether the driver is looking anywhere other than through the windshield, or whether the driver is connected to a mobile device. The processing device 130 can make such determinations based on signals output by the user interface device 135, the driver monitoring system 140, and/or the communication interface device 145. If the processing device 130 determines that the driver is distracted, the process 300 may continue at block 350. If not, the process 300 may terminate or return to block 305.
In block 350, the processing device 130 can output a signal that causes, e.g., the user interface device 135 to present an audible and/or visual alarm to the driver. The alarm can indicate to the driver that the front vehicle 105 has started to move away from the main vehicle 100.
Process 300 may end after block 350. Alternatively, process 300 may return to block 305 or possibly another block after block 340.
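For reference, the overall flow of process 300 can be condensed into the following sketch. The sensor, engine, and interface objects are placeholders for the components of system 115 rather than real vehicle APIs, the detector could be the EnvelopeDetector sketched earlier, and the loop is simplified so that every early exit returns to the equivalent of block 305.

    # Condensed, illustrative rendering of process 300 (blocks 305-350); the helper
    # objects and method names are placeholders, not actual vehicle interfaces.
    def process_300(sensors, engine, hmi, detector):
        while True:
            range_k = sensors.read_range()              # block 305
            range_rate_k = sensors.read_range_rate()    # block 310
            if not sensors.front_vehicle_stopped():     # block 315
                continue
            if not sensors.host_vehicle_stopped():      # block 320
                continue
            # Blocks 325-330: evaluate the front vehicle's movement against the envelope.
            if not detector.update(range_k, range_rate_k):
                continue
            if not engine.is_off():                     # block 335
                continue
            engine.command_start()                      # block 340
            if hmi.driver_is_distracted():              # block 345
                hmi.alert_front_vehicle_departed()      # block 350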
FIG. 4A is a graph 400A showing exemplary range and range-rate data when the front vehicle 105 is moving away from the main vehicle 100. FIG. 4B is a graph 400B showing exemplary range and range-rate data when the front vehicle 105 moved and then stopped such that it should not cause the engine 110 of the main vehicle 100 to start. In both graphs 400A and 400B, the range is represented on the Y axis and the range rate is represented on the X axis. An envelope 405 is shown as a dashed line. The movement of the front vehicle 105 can be plotted on the graphs 400A and 400B as a function of both the range and range-rate data. Movement that exceeds the envelope 405, as shown in FIG. 4A, can cause the engine 110 of the main vehicle 100 to start. Movement that stays below the envelope 405, such as the movement illustrated in FIG. 4B, can cause the engine 110 to remain off, since the driver is unlikely to make a power demand on the engine 110 based on the limited movement of the front vehicle 105.
With respect to the processes, systems, methods, heuristics, etc., described herein, it should be understood that, although the steps of such processes, etc., have been described as occurring according to a certain ordered sequence, such processes could be practiced with the described steps performed in an order other than the order described herein. It should further be understood that certain steps could be performed simultaneously, that other steps could be added, or that certain steps described herein could be omitted. In other words, the descriptions of processes herein are provided for the purpose of illustrating certain embodiments and should in no way be construed as limiting the claims.
Accordingly, it is to be understood that the above description is intended to be illustrative and not restrictive. Many embodiments and applications other than the examples provided would be apparent upon reading the above description. The scope should be determined not with reference to the above description but, instead, with reference to the appended claims, together with the full scope of equivalents to which such claims are entitled. It is anticipated and intended that future developments will occur in the technologies discussed herein, and that the disclosed systems and methods will be incorporated into such future embodiments. In sum, it should be understood that the application is capable of modification and variation.
All terms used in the claims are intended to be given their broadest reasonable constructions and their ordinary meanings as understood by those knowledgeable in the technologies described herein, unless an explicit indication to the contrary is made herein. In particular, the use of singular articles such as "a," "the," "said," etc., should be read to recite one or more of the indicated elements unless a claim recites an explicit limitation to the contrary.
The Abstract of the Disclosure is provided to allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in various embodiments for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Therefore, the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as separately claimed subject matter.

Claims (20)

1. A vehicle system characterized in that it comprises: at least one sensor configured to output a range signal representing a distance from a main vehicle to a front vehicle and a range-rate signal representing range-rate information of the front vehicle relative to the main vehicle; and a processing device configured to output an alarm signal based on the range signal, the range-rate signal, and whether a driver of the main vehicle is determined to be distracted.
2. The vehicle system of claim 1, characterized in that the processing device is configured to evaluate the movement of the front vehicle relative to the main vehicle based on the range signal and the range-rate signal.
3. The vehicle system of claim 2, characterized in that the processing device is configured to output the control signal if the movement of the front vehicle exceeds a predetermined envelope.
4. The vehicle system of claim 2, characterized in that the processing device is configured to anticipate the power demand of the main vehicle based on the movement of the front vehicle.
5. The vehicle system of claim 2, characterized in that the processing device is configured to determine whether the main vehicle is stopped and whether the front vehicle is stopped before evaluating the movement of the front vehicle.
6. The vehicle system of claim 1, characterized in that the processing device is configured to determine whether the driver is distracted based on a signal received from at least one of a user interface device, a communication interface device, and a driver monitoring system.
7. The vehicle system of claim 6, characterized in that the processing device is configured to output the alarm signal to the user interface device, and wherein the user interface device is configured to generate at least one of an audible alarm and a visual alarm.
8. The vehicle system of claim 1, characterized in that the processing device is configured to determine whether the front vehicle and the main vehicle are stopped.
9. The vehicle system of claim 1, characterized in that the processing device is configured to determine whether the engine of the main vehicle is turned off.
10. A method characterized in that it comprises: receiving a range signal representing a distance from a main vehicle to a front vehicle; receiving a range-rate signal representing range-rate information of the front vehicle relative to the main vehicle; evaluating the movement of the front vehicle relative to the main vehicle based on the range signal and the range-rate signal; determining whether a driver is distracted; and outputting an alarm signal if the front vehicle is moving away from the main vehicle and the driver is determined to be distracted.
11. The method of claim 10, characterized in that the control signal is output if the movement of the front vehicle exceeds a predetermined envelope.
12. The method of claim 9, characterized in that the determination of whether the driver is distracted is based at least in part on a signal received from at least one of a user interface device, a driver monitoring system, and a communication interface device.
13. The method of claim 9, characterized in that outputting the alarm signal includes outputting the alarm signal to the user interface device to generate at least one of an audible alarm and a visual alarm.
14. The method of claim 10, further comprising determining whether the front vehicle and the main vehicle are stopped.
15. The method of claim 14, characterized in that the determination of whether the main vehicle is stopped and whether the front vehicle is stopped occurs before evaluating the movement of the front vehicle.
16. The method of claim 10, characterized in that it further comprises determining whether the engine of the main vehicle is turned off.
17. The method of claim 16, characterized in that the control signal is output if the engine of the main vehicle is turned off.
18. The method of claim 10, characterized in that the range signal and the range-rate signal are received from at least one sensor located in the main vehicle.
19. A main vehicle characterized in that it comprises: a user interface device configured to generate at least one of an audible alarm and a visual alarm; an engine; at least one sensor configured to detect a front vehicle and to output a range signal representing a distance to the front vehicle and a range-rate signal representing range-rate information of the front vehicle relative to the main vehicle; a processing device configured to evaluate the movement of the front vehicle relative to the main vehicle based on the range signal and the range-rate signal and to output a control signal if the movement of the front vehicle exceeds a predetermined envelope; and an engine controller configured to start the engine in response to receiving the control signal, wherein the processing device is further configured to determine whether a driver of the main vehicle is distracted and, if so, to output an alarm signal to the user interface device, and wherein the user interface device is configured to generate at least one of the audible alarm and the visual alarm in response to receiving the alarm signal.
20. The main vehicle of claim 19, characterized in that the processing device is configured to anticipate the power demand of the main vehicle based on the movement of the front vehicle.
MX2015001237A 2014-01-28 2015-01-27 Vehicle with traffic flow reminder. MX350151B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US14/165,655 US9349292B2 (en) 2014-01-17 2014-01-28 Vehicle with traffic flow reminder

Publications (2)

Publication Number Publication Date
MX2015001237A true MX2015001237A (en) 2015-07-27
MX350151B MX350151B (en) 2017-08-29

Family

ID=53688128

Family Applications (1)

Application Number Title Priority Date Filing Date
MX2015001237A MX350151B (en) 2014-01-28 2015-01-27 Vehicle with traffic flow reminder.

Country Status (3)

Country Link
CN (1) CN104802805B (en)
MX (1) MX350151B (en)
RU (1) RU2658618C2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102017218222A1 (en) * 2017-10-12 2019-04-18 Continental Automotive Gmbh Determining the position of a later breakpoint of a vehicle
JP6801627B2 (en) * 2017-10-25 2020-12-16 トヨタ自動車株式会社 vehicle

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08293100A (en) * 1995-04-21 1996-11-05 Nissan Motor Co Ltd Preceding vehicle start alarm device
US7831369B2 (en) * 2005-11-16 2010-11-09 Gm Global Technology Operations, Inc. Method and apparatus for vehicle and engine operation
US7603228B2 (en) * 2006-05-25 2009-10-13 Ford Global Technologies, Llc Haptic apparatus and coaching method for improving vehicle fuel economy
EP2133851B1 (en) * 2007-04-02 2012-11-28 Panasonic Corporation Safe driving assisting device
EP2302412B1 (en) * 2009-09-29 2012-08-15 Volvo Car Corporation System and method for evaluation of an automotive vehicle forward collision threat
RU2402445C1 (en) * 2009-10-16 2010-10-27 Осман Мирзаевич Мирза Method of preventing tf collision with object moving in tf front lateral zone in direction crossing that of tf
US8509982B2 (en) * 2010-10-05 2013-08-13 Google Inc. Zone driving
US9037341B2 (en) * 2012-06-03 2015-05-19 Jerry Alex James Traction control enable/disable feature

Also Published As

Publication number Publication date
CN104802805A (en) 2015-07-29
RU2015102508A (en) 2016-08-20
CN104802805B (en) 2019-06-11
RU2658618C2 (en) 2018-06-21
MX350151B (en) 2017-08-29

Similar Documents

Publication Publication Date Title
US9349292B2 (en) Vehicle with traffic flow reminder
US9447741B2 (en) Automatic engine start-stop control
US9849865B2 (en) Emergency braking system and method of controlling the same
US20170106862A1 (en) Apparatus and method for controlling speed of cacc system
GB2552404A (en) Extended lane blind spot detection
CN109720348B (en) In-vehicle device, information processing system, and information processing method
US11458979B2 (en) Information processing system, information processing device, information processing method, and non-transitory computer readable storage medium storing program
US10421399B2 (en) Driver alert systems and methods based on the presence of cyclists
JP2017091168A (en) Drive support device
CN107696861B (en) Method, device and system for controlling vehicle speed of reversing and reversing safety controller
MX2015001237A (en) Vehicle with traffic flow reminder.
CN111683850B (en) Parking support device and parking support method
JP2012051498A (en) Vehicle control system
CN111516701A (en) Vehicle and speed limiting method and device thereof
CN116101274A (en) Vehicle control method, device and equipment for rear collision and vehicle
US10326878B2 (en) Anti-distracted driving systems and methods
JP2020041532A (en) Vehicle control device and vehicle control method
KR20200067702A (en) Engine start control for idle stop-and-go vehicle
CN113942504B (en) Self-adaptive cruise control method and device
CN115440086A (en) Vehicle jam early warning method, device, equipment, medium and vehicle
WO2016042706A1 (en) Driving burden estimation device and driving burden estimation method
US11904879B2 (en) Information processing apparatus, recording medium, and information processing method
US11261960B2 (en) Apparatus and method for controlling transmission of vehicle
US11618378B2 (en) Methods and systems for guiding an operator of a vehicle during a vehicle-sharing session
CN113688277B (en) Voice content display method, device and system and vehicle

Legal Events

Date Code Title Description
FG Grant or registration