KR20160136916A - Apparatus for controlling of vehicle and method thereof - Google Patents
- Publication number
- KR20160136916A (application number KR1020150071149A)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- vehicle driver
- state information
- driver
- predicted
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/10—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the vehicle
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60Q—ARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
- B60Q5/00—Arrangement or adaptation of acoustic signal devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/10—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
-
- B60W2040/08—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/143—Alarm means
Landscapes
- Engineering & Computer Science (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- Mathematical Physics (AREA)
- Chemical & Material Sciences (AREA)
- Combustion & Propulsion (AREA)
- Human Computer Interaction (AREA)
- Acoustics & Sound (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
The present invention relates to a vehicle control apparatus and a method thereof. The vehicle control apparatus according to the present invention comprises a sensing unit for sensing at least one of state information of a vehicle driver and state information related to the vehicle, and a control unit for calculating, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.
Description
The present invention relates to a vehicle control apparatus and a method thereof.
Conventional driver state monitoring (DSM) systems, advanced driving assist systems (ADAS), and the like sense vehicle-based information. That is, they utilize information acquired from the position of the vehicle, such as the external traffic situation or the driver's condition as observed from the vehicle. Since this is indirect information from the driver's point of view, there is a limit to how accurately the actual condition of the driver can be judged.
In other words, in order to monitor the driver's condition precisely, it is necessary to measure the first-person external stimuli that the driver actually sees and hears, in addition to observing the driver from the outside.
The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a vehicle control apparatus and method which provide driver-centered information.
A vehicle control apparatus according to an embodiment disclosed herein includes a sensing unit that senses at least one of state information of a vehicle driver and state information related to the vehicle, and a control unit that calculates, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.
In an exemplary embodiment, the sensing unit may include at least one of a driver state monitoring (DSM) system that senses state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses state information associated with the vehicle.
In another embodiment, the control unit may calculate an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the driver's face position, eye position, line of sight, a change in the face position, and a change in the eye position.
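As a minimal illustrative sketch (not part of the original disclosure), the region of an exterior camera image that the driver is predicted to see could be cropped out using the driver's gaze direction. The linear angle-to-pixel mapping and all function and parameter names below are assumptions:

```python
import numpy as np

def predicted_external_view(panorama: np.ndarray,
                            gaze_yaw_deg: float,
                            gaze_pitch_deg: float,
                            fov_deg: float = 60.0,
                            hfov_total_deg: float = 180.0,
                            vfov_total_deg: float = 90.0) -> np.ndarray:
    """Crop the region of a panoramic exterior image that the driver is
    predicted to see, assuming a linear angle-to-pixel mapping.

    Gaze angles are relative to the panorama centre (0, 0 = straight ahead).
    """
    h, w = panorama.shape[:2]
    px_per_deg_x = w / hfov_total_deg
    px_per_deg_y = h / vfov_total_deg
    # Centre of the crop in pixel coordinates.
    cx = w / 2 + gaze_yaw_deg * px_per_deg_x
    cy = h / 2 - gaze_pitch_deg * px_per_deg_y
    half_w = int(fov_deg * px_per_deg_x / 2)
    half_h = int(fov_deg * px_per_deg_y / 2)
    # Clamp to the panorama borders so extreme gaze angles stay valid.
    x0 = int(np.clip(cx - half_w, 0, w))
    x1 = int(np.clip(cx + half_w, 0, w))
    y0 = int(np.clip(cy - half_h, 0, h))
    y1 = int(np.clip(cy + half_h, 0, h))
    return panorama[y0:y1, x0:x1]
```

A real system would instead re-project a 3D reconstruction to the measured eye position, as described later in the specification; this crop is only the simplest stand-in for that step.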
In another embodiment, the control unit may calculate an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.
Accordingly, the control unit can calculate an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.
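One way to sketch this calculation is to model the estimated sound source as a point source with free-field spherical spreading and compute the level and arrival delay at each ear position. The function names and the 1 m reference level are illustrative assumptions, not the patent's method:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, in air at about 20 degrees C

def predicted_ear_sound(source_pos, ear_pos, source_level_db=80.0):
    """Predict the sound level and arrival delay at one ear, assuming a
    point source with inverse-square (spherical) spreading: the level
    falls 6 dB per doubling of distance from the 1 m reference.
    """
    dx = source_pos[0] - ear_pos[0]
    dy = source_pos[1] - ear_pos[1]
    dz = source_pos[2] - ear_pos[2]
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)
    level_db = source_level_db - 20.0 * math.log10(max(distance, 1e-6))
    delay_s = distance / SPEED_OF_SOUND
    return level_db, delay_s

def interaural_time_difference(source_pos, left_ear, right_ear):
    """Interaural time difference (left minus right arrival time);
    its sign indicates which side of the head the source is on."""
    _, t_left = predicted_ear_sound(source_pos, left_ear)
    _, t_right = predicted_ear_sound(source_pos, right_ear)
    return t_left - t_right
```

The interaural time difference is what lets the predicted first-person sound preserve the direction the driver would perceive, which changes as the ear positions move.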
In another embodiment, the control unit may calculate an external impact that is predicted to be provided to the vehicle driver, based on state information related to the driving of the vehicle and state information of the vehicle driver.
At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.
Thus, the control unit can calculate an external impact that is predicted to be provided to the vehicle driver, based on the positional change of the head of the vehicle driver corresponding to the state information related to the running of the vehicle.
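A hedged sketch of this idea: the acceleration actually experienced at the head can be estimated from the tracked head positions by finite differences and compared against the vehicle's own acceleration, flagging samples where the two diverge. The threshold value is an illustrative assumption:

```python
def head_acceleration(head_positions, dt):
    """Estimate the acceleration of the driver's head from a time series
    of tracked head positions (one axis), using the central finite
    difference a_i = (x_{i-1} - 2*x_i + x_{i+1}) / dt**2.
    """
    acc = []
    for i in range(1, len(head_positions) - 1):
        a = (head_positions[i - 1] - 2 * head_positions[i]
             + head_positions[i + 1]) / (dt * dt)
        acc.append(a)
    return acc

def predicted_impact(vehicle_acc, head_acc, threshold=3.0):
    """Flag samples where head acceleration deviates strongly from the
    vehicle acceleration, i.e. where the driver is predicted to feel an
    external impact. The threshold (m/s^2) is an assumed value."""
    return [abs(h - v) > threshold for v, h in zip(vehicle_acc, head_acc)]
```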
A vehicle control method according to an embodiment disclosed herein includes the steps of: (a) sensing, by a sensing unit, at least one of state information of a vehicle driver and state information related to the vehicle; and (b) calculating, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.
In an exemplary embodiment, the sensing unit may include at least one of a driver state monitoring (DSM) system that senses state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses state information associated with the vehicle.
In another embodiment, the step (b) may include calculating an external image predicted to be provided to the vehicle driver, based on the input image input to the image input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the driver's face position, eye position, line of sight, a change in the face position, and a change in the eye position.
In another embodiment, the step (b) may include calculating an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.
Accordingly, the step (b) may include calculating an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.
In another embodiment, the step (b) may include calculating an external impact predicted to be provided to the vehicle driver, based on the state information related to the driving of the vehicle and the state information of the vehicle driver.
At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.
Accordingly, the step (b) may include calculating an external impact that is predicted to be provided to the vehicle driver, based on a change in the position of the vehicle driver's head corresponding to the state information related to the running of the vehicle.
Effects of the vehicle control device and the method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, driver-centered information that the driver directly sees, hears and feels can be calculated.
As a result, information conventionally calculated from the vehicle's point of view can be reinterpreted from the driver's point of view, and the actual state of the driver can be grasped clearly.
Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.
FIG. 1 is a schematic view showing a vehicle for explaining an embodiment of the present invention.
FIG. 2 is a diagram showing a configuration of a vehicle for explaining an embodiment of the present invention.
FIG. 3 is a block diagram showing a configuration of a telematics terminal for explaining an embodiment of the present invention.
FIG. 4 is a block diagram showing the configuration of a vehicle control device related to the present invention.
FIG. 5 is a conceptual diagram showing an embodiment of data flow by the vehicle control apparatus according to the present invention.
FIG. 6 is a flowchart showing a vehicle control method related to the present invention.
FIG. 7 is a conceptual diagram showing an embodiment in which an external image predicted to be provided to a driver is calculated.
FIG. 8 is a conceptual diagram showing an embodiment in which an external sound predicted to be provided to a driver is calculated.
FIG. 9 is a conceptual diagram showing an embodiment in which an external impact predicted to be provided to a driver is calculated.
FIG. 10 is a conceptual diagram showing an embodiment in which prediction information is applied to virtual reality to improve the driver's driving ability.
It is noted that the technical terms used herein are used only to describe specific embodiments and are not intended to limit the invention. Unless otherwise defined herein, the technical terms used herein should be interpreted in the sense generally understood by a person of ordinary skill in the art to which the present invention belongs, and should not be construed in an excessively broad or excessively narrow sense. Further, when a technical term used herein is an erroneous term that does not accurately express the spirit of the present invention, it should be replaced with a technical term that can be correctly understood by a person skilled in the art. In addition, general terms used herein should be interpreted according to their dictionary definitions or the surrounding context, and should not be construed in an excessively narrow sense.
Also, the singular forms used herein include plural referents unless the context clearly dictates otherwise. In the present application, terms such as "comprising" or "including" should not be construed as necessarily including all of the elements or steps described in the specification; some elements or steps may be absent, or additional elements or steps may further be included.
Also, terms including ordinals such as first, second, etc. used in the present specification can be used to describe a plurality of constituent elements, but the constituent elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or similar elements throughout the several views, and redundant description thereof will be omitted.
In the following description, well-known functions or constructions are not described in detail since they would obscure the invention in unnecessary detail. It is to be noted that the accompanying drawings are only for the purpose of facilitating understanding of the present invention, and should not be construed as limiting the scope of the present invention with reference to the accompanying drawings.
FIG. 1 is a schematic view showing a vehicle (for example, an electric vehicle) for explaining an embodiment of the present invention.
Embodiments of the present invention can be applied not only to conventional automobiles (for example, gasoline vehicles, diesel vehicles, gas vehicles, etc.) but also to pure electric vehicles and hybrid electric vehicles. A hybrid electric vehicle (HEV) mounts a battery pack composed of a plurality of battery cells in order to supply the necessary power. The plurality of battery cells included in the battery pack need to have their voltages equalized in order to improve safety and life and to obtain high output.
FIG. 2 is a diagram showing the configuration of a vehicle (for example, a hybrid electric vehicle) for explaining an embodiment of the present invention.
2, a
The M /
The M /
The
The power of the
The
The
The electric vehicle is a hybrid electric vehicle electronic control unit (HEV-ECU) that implements an electric vehicle that communicates with and controls the
The regenerative braking force calculated by the
The electric vehicle further comprises a
The HEV-
In addition, the electric vehicle includes a
Hereinafter, the configuration of a
FIG. 3 is a block diagram showing a configuration of a telematics terminal for explaining an embodiment of the present invention.
3, the
The
The main board is provided with a code division multiple access (CDMA)
Also, the
The
The
The
The audio output unit (amplifier) 226 is connected to the
The
The
A speech recognition device (or speech recognition module) 298 recognizes the speech uttered by the user and performs the corresponding function according to the recognized speech signal.
A
The vehicle control apparatus and method according to the embodiment of the present invention can be applied to the telematics terminal 200 (or a HUD (Head Up Display)) and the vehicle instrument cluster. For example, it may be implemented as a video display device capable of outputting visual information, or may be present in the vehicle as a standalone vehicle control device.
FIG. 4 is a block diagram showing the configuration of a vehicle control device related to the present invention.
Referring to FIG. 4, the
The
Specifically, the state information of the vehicle driver means information related to the physical or psychological state of the driver. For example, it may include at least one of the positions of the driver's face, eyes, ears, and head, changes in those positions, the driver's gaze, pose, heart rate, body temperature, degree of eye blinking, degree of eyelid closure, and the like.
The state information related to the vehicle means state information related to the vehicle's operation or its internal environment, for example, vehicle speed, GPS information, acceleration, temperature, pressure, and the like.
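The two kinds of state information described above can be sketched as simple data containers. The field names and units below are illustrative assumptions, not terms defined by this disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class DriverState:
    """State information of the vehicle driver (field names illustrative)."""
    face_position: Optional[Tuple[float, float, float]] = None
    eye_position: Optional[Tuple[float, float, float]] = None
    ear_position: Optional[Tuple[float, float, float]] = None
    gaze_direction: Optional[Tuple[float, float, float]] = None
    heart_rate_bpm: Optional[float] = None
    body_temperature_c: Optional[float] = None
    eye_blink_rate_hz: Optional[float] = None

@dataclass
class VehicleState:
    """State information related to the vehicle's operation and interior."""
    speed_mps: Optional[float] = None
    gps: Optional[Tuple[float, float]] = None          # (latitude, longitude)
    acceleration_mps2: Optional[Tuple[float, float, float]] = None
    cabin_temperature_c: Optional[float] = None
    cabin_pressure_kpa: Optional[float] = None
```

Fields default to `None` so that a sensing unit can populate only the subset it actually measures, matching the "at least one of" language of the claims.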
In order to detect such information, the
As an example, the
The
Specifically, the prediction information means the driver's state information interpreted from the driver's standpoint. For example, it means information such as sight, hearing, tactile sensation, pressure, and acceleration that the driver is presumed to actually see or feel.
In other words, according to the
The
The
In an exemplary embodiment, the
In another embodiment, the
At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, the face position, and the eye position of the vehicle driver.
In another embodiment, the
At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.
Accordingly, the
In another embodiment, the
At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.
Accordingly, the
FIG. 5 is a conceptual diagram showing an embodiment of the data flow in the vehicle control apparatus according to the present invention.
Referring to FIG. 5, status information of a driver of a vehicle sensed by a driver state monitoring (DSM) sensor and an ADAS (Advanced Driving Assist System) sensor and status information related to the vehicle may be transmitted to the
In addition, such state information can be transmitted to the
Specifically, a CAN (Controller Area Network) data bus is mainly used for data transmission between the ECUs of vehicle safety systems and convenience systems, for information/communication systems, and for control of entertainment systems. An ECU (Electronic Control Unit) is an electronic control unit that controls the state of the engine, the automatic transmission, the ABS, and the like by computer. Here, the ABS (Anti-lock Brake System) is a special brake system developed to prevent the wheels from locking when the vehicle brakes suddenly.
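As a hedged illustration of how a single value such as vehicle speed might be decoded from a CAN payload: the byte layout and the 0.01 km/h scale below are hypothetical, since real layouts are vehicle-specific and typically defined in a DBC file.

```python
import struct

def parse_speed_frame(data: bytes) -> float:
    """Decode vehicle speed from an 8-byte CAN payload, assuming a
    hypothetical layout: a little-endian unsigned 16-bit raw value in
    bytes 0-1 with a scale factor of 0.01 km/h per count.
    """
    (raw,) = struct.unpack_from('<H', data, 0)
    return raw * 0.01  # km/h
```

In a real integration, the frame ID, byte offsets, endianness, and scaling would all come from the vehicle manufacturer's signal database rather than being hard-coded.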
In addition, the status information of the vehicle driver and the status information related to the vehicle sensed by the camera, the array microphone, the acceleration sensor, the gyro sensor, the pressure sensor, the IR temperature sensor, the infrared sensor and the like may be transmitted to the
Accordingly, the
Conventional driver state monitoring (DSM) systems, advanced driving assist systems (ADAS), and the like sense vehicle-based information. That is, they utilize information acquired from the position of the vehicle, such as the external traffic situation or the driver's condition as observed from the vehicle. Since this is indirect information from the driver's point of view, there is a limit to how accurately the actual condition of the driver can be judged.
In other words, in order to monitor the driver's condition precisely, it is necessary to measure the first-person external stimuli that the driver actually sees and hears, in addition to observing the driver from the outside.
The
In addition, the prediction information thus calculated can be utilized in various fields such as analysis of the driver's behavior, physical condition, and psychological state. As a result, accurate driver monitoring and driving-habit analysis become possible.
As an embodiment, a first-person view of the driver can be calculated based on the driver's gaze sensed by a driver state monitoring (DSM) system, a video image of the vehicle interior photographed by a camera, and an image of the vehicle exterior sensed by an Advanced Driving Assist System (ADAS).
As another embodiment, sound sources may be separated from the sound sensed by the array microphone, and the first-person sound heard by the driver may be calculated based on the driver's posture sensed by the DSM (driver state monitoring) system.
In still another embodiment, the
FIG. 6 is a flowchart showing a vehicle control method related to the present invention.
Referring to FIG. 6, at step S610, the
Then, in step S620, prediction information that is predicted to be provided to the vehicle driver is calculated based on the sensed at least one piece of state information.
In an exemplary embodiment, the
In another embodiment, the step S620 may include calculating an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the driver's face position, eye position, line of sight, a change in the face position, and a change in the eye position.
In another embodiment, the step S620 may include calculating an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.
At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.
Accordingly, the step S620 may include calculating an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.
In yet another embodiment, the step S620 may include calculating an external impact that is predicted to be provided to the vehicle driver, based on the state information related to the driving of the vehicle and the state information of the vehicle driver.
At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.
Accordingly, the step S620 may include calculating an external impact that is predicted to be provided to the vehicle driver, based on a change in the position of the vehicle driver's head corresponding to the state information related to the running of the vehicle.
On the other hand, the
At this time, the state information of the vehicle driver may include at least one of the driver's face position, eye position, line of sight, a change in the face position, and a change in the eye position.
FIG. 7 is a conceptual diagram showing an embodiment in which an external image predicted to be provided to a driver is calculated.
Referring to FIG. 7, a captured image of the foreground outside the vehicle can be acquired by a stereoscopic camera, a depth sensor, or an RGB camera (710).
Then, monitoring information of the driver of the vehicle can be obtained by a driver state monitoring (DSM) system (720).
Subsequently, based on the information thus obtained, the image of the vehicle exterior foreground predicted to be viewed by the driver (a driver-based ego-centric view) (730) can be generated (740).
Specifically, the driver-based ego-centric view can be generated based on a 3D reconstruction of the vehicle exterior foreground, three-dimensional tracking of the driver's eye position, and the driver's gaze.
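A minimal sketch of the projection step: place a virtual pinhole camera at the driver's measured eye position and project the reconstructed 3D points into it. Assuming the world axes are aligned with the camera simplifies the example; a real implementation would also apply the gaze direction as a rotation.

```python
import numpy as np

def egocentric_projection(points_world, eye_pos, f=1.0):
    """Project 3D-reconstructed world points into a virtual pinhole
    camera placed at the driver's eye, looking along +Z (world and
    camera axes assumed aligned for simplicity).

    Returns an (N, 2) array of image-plane coordinates; points behind
    the eye are dropped.
    """
    pts = np.asarray(points_world, dtype=float) - np.asarray(eye_pos, dtype=float)
    in_front = pts[:, 2] > 0          # keep only points ahead of the eye
    pts = pts[in_front]
    uv = f * pts[:, :2] / pts[:, 2:3]  # perspective divide
    return uv
```

Re-running this projection whenever the eye position or gaze changes is what makes the rendered view driver-centred rather than vehicle-centred.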
Meanwhile, the
At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.
Accordingly, the
FIG. 8 is a conceptual diagram showing an embodiment in which an external sound predicted to be provided to a driver is calculated.
Referring to FIG. 8, monitoring information of the vehicle driver may be obtained by a driver state monitoring (DSM) system.
Subsequently, based on the information thus obtained (840), the sound predicted to be heard by the driver (driver-based ego-centric hearing) can be generated (850).
Specifically, the Ego-centric Hearing based on the
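For the array-microphone part, a common far-field building block is estimating the arrival angle of a source from the time-difference-of-arrival (TDOA) between two microphones. This sketch is one possible implementation assumed for illustration, not the method claimed here:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def source_angle_from_tdoa(tdoa_s: float, mic_spacing_m: float) -> float:
    """Estimate the far-field arrival angle (radians, 0 = broadside) of
    a sound source from the time-difference-of-arrival between two
    microphones of an array, using sin(theta) = c * tdoa / d.
    """
    s = SPEED_OF_SOUND * tdoa_s / mic_spacing_m
    s = max(-1.0, min(1.0, s))  # clamp numerical overshoot past +/-1
    return math.asin(s)
```

Once the source direction is known, the ego-centric sound can be re-rendered relative to the driver's tracked ear positions instead of the microphone array's position.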
On the other hand, the
At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.
Accordingly, the
FIG. 9 is a conceptual diagram showing an embodiment in which an external impact predicted to be provided to a driver is calculated.
Referring to FIG. 9, monitoring information of the driver of the vehicle can be acquired by a driver state monitoring (DSM) system (910). Also, the vehicle speed and GPS information are acquired through a CAN (Controller Area Network) (920). Additional information may be obtained by further sensors, such as an acceleration sensor.
Subsequently, based on the information thus obtained, the external shock predicted to be felt by the driver (driver-based ego-centric acceleration) can be generated.
Specifically, the driver-based ego-centric acceleration can be generated (950) by combining the three-dimensional position-tracking information of the driver's head with the vehicle's longitudinal, lateral, vertical, and rotational acceleration information (940).
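The combination of head position and vehicle acceleration can be sketched with the rigid-body acceleration formula a_head = a_vehicle + alpha x r + omega x (omega x r). Treating the head as a point fixed at offset r from the vehicle origin is a simplifying assumption made only for this illustration:

```python
import numpy as np

def acceleration_at_head(a_vehicle, omega, alpha, r_head):
    """Acceleration felt at the driver's head, modelled as a point at
    offset r_head from the vehicle origin on a rigid body:

        a_head = a_vehicle + alpha x r + omega x (omega x r)

    omega / alpha are the vehicle's angular velocity (rad/s) and angular
    acceleration (rad/s^2); all vectors are in the vehicle frame.
    """
    a = np.asarray(a_vehicle, dtype=float)
    w = np.asarray(omega, dtype=float)
    al = np.asarray(alpha, dtype=float)
    r = np.asarray(r_head, dtype=float)
    # Euler (tangential) term plus centripetal term.
    return a + np.cross(al, r) + np.cross(w, np.cross(w, r))
```

Because the head offset r enters the rotational terms, the same manoeuvre produces different felt accelerations as the tracked head position changes, which is exactly the driver-centred quantity the figure describes.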
On the other hand, as described above, the
FIG. 10 is a conceptual diagram showing an embodiment in which prediction information is applied to virtual reality to improve the driver's driving ability.
Referring to FIG. 10, a 'driver-centered driving record' by a skilled driver can be calculated by the
Next, the calculated driving record may be applied to a virtual reality system so that other drivers can experience it virtually (1020). In an embodiment, the
As a result, the driver can learn the skilled driver's driving habits, manner of responding to the external environment, and the like (1030).
Effects of the vehicle control device and the method according to the present invention will be described as follows.
According to at least one of the embodiments of the present invention, driver-centered information that the driver directly sees, hears and feels can be calculated.
As a result, information conventionally calculated from the vehicle's point of view can be reinterpreted from the driver's point of view, and the actual state of the driver can be grasped clearly.
The present invention described above can be embodied as computer-readable codes on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include a control unit 180 of the terminal. Accordingly, the above description should not be construed in a limiting sense in all respects and should be considered illustrative. The scope of the present invention should be determined by rational interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.
Claims (20)
A sensing unit that senses at least one of state information of a vehicle driver and state information related to the vehicle; and a control unit that calculates, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.
The sensing unit includes:
At least one of a driver state monitoring (DSM) system for sensing state information of the vehicle driver and an Advanced Driving Assist System (ADAS) for sensing state information associated with the vehicle.
Wherein,
And calculates an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.
The state information of the vehicle driver may include:
At least one of a face position, an eye position, a line of sight, a change in the face position, and a change in the eye position of the vehicle driver.
Wherein,
And calculates an external sound that is predicted to be provided to the vehicle driver based on the input sound input to the sound input device and the state information of the vehicle driver.
The state information of the vehicle driver may include:
A position of the ear, a change of the face position, and a change of the ear position of the vehicle driver.
Wherein,
And calculates an external sound that is predicted to be provided to the vehicle driver based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.
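One simple way to realize "sound predicted to be provided to the driver" from an estimated source position is to re-scale the level measured at the vehicle's microphone to the driver's ear position. The sketch below is an illustrative assumption (free-field spherical spreading, 6 dB per doubling of distance), not the method disclosed in the patent; all names and parameters are hypothetical.

```python
import math

def predict_sound_at_ear(source_pos, mic_pos, ear_pos,
                         mic_level_db, speed_of_sound=343.0):
    """Estimate the level and arrival delay of a sound at the driver's
    ear, given the level measured at the vehicle microphone and an
    estimated source position (free-field approximation)."""
    d_mic = math.dist(source_pos, mic_pos)  # source-to-microphone distance
    d_ear = math.dist(source_pos, ear_pos)  # source-to-ear distance
    # Spherical spreading: level changes by -20*log10 of the distance ratio.
    ear_level_db = mic_level_db - 20.0 * math.log10(d_ear / d_mic)
    # Propagation delay from the source to the ear.
    delay_s = d_ear / speed_of_sound
    return ear_level_db, delay_s
```

For example, a source 1 m from the microphone but 2 m from the ear arrives about 6 dB quieter at the ear than at the microphone.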
Wherein,
And calculates an external impact that is predicted to be provided to the vehicle driver based on the state information related to the driving of the vehicle and the state information of the vehicle driver.
Wherein the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, GPS information, and acceleration of the vehicle,
And the state information of the vehicle driver includes a change in the position of the head of the vehicle driver.
Wherein,
And calculates an external impact predicted to be provided to the vehicle driver based on a change in position of the vehicle driver's head corresponding to state information related to the running of the vehicle.
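The impact "felt" at the driver's head can be approximated by differentiating the tracked head position twice and blending it with the vehicle-frame acceleration. The sketch below is a hypothetical illustration of that idea (finite differences on one axis, a fixed blending weight); none of it is taken from the patent's disclosure.

```python
def predicted_head_impact(head_positions, dt, vehicle_accel, weight=0.5):
    """Estimate the acceleration experienced at the driver's head.

    head_positions: sequence of head positions along one axis (m),
                    sampled at a fixed interval dt (s).
    vehicle_accel:  vehicle-frame acceleration on the same axis (m/s^2).
    weight:         blend factor between head-derived and vehicle values.
    """
    # Second finite difference over the last three samples gives the
    # head's own acceleration estimate.
    p0, p1, p2 = head_positions[-3:]
    head_accel = (p2 - 2 * p1 + p0) / (dt * dt)
    # Blend head-tracked and vehicle-sensed acceleration.
    return weight * head_accel + (1 - weight) * vehicle_accel
```

A head that lags or leads the vehicle's motion (e.g., during hard braking) then produces a predicted impact that differs from the raw vehicle-center value.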
(b) calculating prediction information that is predicted to be provided to the vehicle driver based on the detected at least one piece of state information.
The sensing unit includes:
A driver state monitoring (DSM) system for sensing state information of the vehicle driver, and an Advanced Driver Assistance System (ADAS) for sensing state information associated with the vehicle.
The step (b)
And calculating an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.
The state information of the vehicle driver may include:
A face position of the vehicle driver, an eye position, a line of sight, a change in the face position, and a change in the eye position.
The step (b)
And calculating an external sound that is predicted to be provided to the vehicle driver based on the input sound input to the sound input device and the state information of the vehicle driver.
The state information of the vehicle driver may include:
A position of the ear, a change of the face position, and a change of the ear position of the vehicle driver.
The step (b)
And calculating an external sound that is predicted to be provided to the vehicle driver based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.
The step (b)
And calculating an external impact predicted to be provided to the vehicle driver based on state information related to the driving of the vehicle and state information of the vehicle driver.
Wherein the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, GPS information, and acceleration of the vehicle,
Wherein the state information of the vehicle driver includes a change in position of the head of the vehicle driver.
The step (b)
And calculating an external impact predicted to be provided to the vehicle driver based on a change in position of the head of the vehicle driver corresponding to state information related to the driving of the vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150071149A KR20160136916A (en) | 2015-05-21 | 2015-05-21 | Apparatus for controlling of vehicle and method thereof |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150071149A KR20160136916A (en) | 2015-05-21 | 2015-05-21 | Apparatus for controlling of vehicle and method thereof |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160136916A true KR20160136916A (en) | 2016-11-30 |
Family
ID=57707234
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150071149A KR20160136916A (en) | 2015-05-21 | 2015-05-21 | Apparatus for controlling of vehicle and method thereof |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160136916A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10529865B2 (en) | 2017-07-31 | 2020-01-07 | Samsung Electronics Co., Ltd. | Vertical semiconductor devices |
KR20200036167A (en) * | 2018-09-28 | 2020-04-07 | 현대자동차주식회사 | Vehicle and controlling method of vehicle |
- 2015-05-21: KR application KR1020150071149A (patent/KR20160136916A/en), not active, Application Discontinuation
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101659034B1 (en) | Apparatus for switching driving mode of vehicle and method thereof | |
KR20160076262A (en) | Apparatus for switching driving mode of vehicle and method thereof | |
US10449970B2 (en) | Vehicle control system | |
JP6237685B2 (en) | Vehicle control device | |
JP6409699B2 (en) | Automated driving system | |
CN110214107B (en) | Autonomous vehicle providing driver education | |
JP6613623B2 (en) | On-vehicle device, operation mode control system, and operation mode control method | |
CN110473310A (en) | Running car data record method, system, equipment and storage medium | |
US11008012B2 (en) | Driving consciousness estimation device | |
JP2012113609A (en) | Data recording device and data recording method | |
JP2018073374A (en) | Vehicle control method and system capable of recognizing driver | |
KR102035135B1 (en) | vehicle accident information providing system | |
KR101765229B1 (en) | Apparatus for switching driving mode of vehicle and method thereof | |
JP4421668B2 (en) | Imaging control apparatus, imaging control method, imaging control program, and recording medium | |
US9789815B2 (en) | Navigation device, navigation method, and computer program product | |
KR20180062672A (en) | Car cluster for automatically controlling volume of output sound | |
KR20160136916A (en) | Apparatus for controlling of vehicle and method thereof | |
JP2012018527A (en) | Vehicle state recording device | |
KR20160097661A (en) | Apparatus for sensing occupants in a vehicle and method thereof | |
KR101763389B1 (en) | Driver distraction warning system and method | |
KR101982534B1 (en) | Vehicle control device mounted on vehicle | |
US20230316826A1 (en) | Information processing apparatus, computer-readable storage medium, and information processing method | |
WO2024043053A1 (en) | Information processing device, information processing method, and program | |
US11661070B2 (en) | Driving consciousness estimation device | |
JP2024090768A (en) | Information processor, information processing method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E601 | Decision to refuse application |