KR20160136916A - Apparatus for controlling of vehicle and method thereof

Apparatus for controlling of vehicle and method thereof

Info

Publication number
KR20160136916A
Authority
KR
South Korea
Prior art keywords
vehicle
vehicle driver
state information
driver
predicted
Prior art date
Application number
KR1020150071149A
Other languages
Korean (ko)
Inventor
김상기
윤희주
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사
Priority to KR1020150071149A
Publication of KR20160136916A

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/10Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the vehicle 
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q5/00Arrangement or adaptation of acoustic signal devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/10Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to vehicle motion
    • B60W2040/08
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/143Alarm means

Abstract

The present invention relates to a vehicle control apparatus and a method thereof. The vehicle control apparatus according to the present invention comprises a sensing unit for sensing at least one of state information of a vehicle driver and state information related to the vehicle, and a control unit for calculating, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.

Description

APPARATUS FOR CONTROLLING OF VEHICLE AND METHOD THEREOF

The present invention relates to a vehicle control apparatus and a method thereof.

Conventional driver state monitoring (DSM) systems, advanced driving assist systems (ADAS), and the like sense information from the vehicle's perspective. Accordingly, they utilize information acquired from the position of the vehicle, such as the external traffic situation or the driver's condition as seen from the vehicle. Since this is indirect information from the driver's point of view, there is a limit to how accurately the actual condition of the driver can be judged.

In other words, in order to monitor the driver's condition precisely, it is necessary to measure the first-person external stimuli that the driver actually sees and hears, in addition to observing the driver from the outside.

The present invention is directed to solving the above-mentioned problems and other problems. Another object of the present invention is to provide a vehicle control apparatus and method which provide driver-centered information.

A vehicle control apparatus according to an embodiment disclosed herein includes a sensing unit that senses at least one of state information of a vehicle driver and state information related to the vehicle, and a control unit that calculates, based on the sensed at least one piece of state information, prediction information that is predicted to be provided to the vehicle driver.

In an exemplary embodiment, the sensing unit may include at least one of a driver state monitoring (DSM) system that senses the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses the state information related to the vehicle.

In another embodiment, the control unit may calculate an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

In another embodiment, the control unit may calculate an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.

Accordingly, the control unit can calculate an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.

In another embodiment, the control unit may calculate an external impact that is predicted to be provided to the vehicle driver, based on state information related to the driving of the vehicle and state information of the vehicle driver.

At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.

Thus, the control unit can calculate an external impact that is predicted to be provided to the vehicle driver, based on the positional change of the head of the vehicle driver corresponding to the state information related to the running of the vehicle.

A vehicle control method according to an embodiment disclosed herein includes the steps of: (a) sensing, by a sensing unit, at least one of state information of a vehicle driver and state information related to the vehicle; and (b) calculating prediction information that is predicted to be provided to the vehicle driver based on the sensed at least one piece of state information.

In an exemplary embodiment, the sensing unit may include at least one of a driver state monitoring (DSM) system that senses the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses the state information related to the vehicle.

In another embodiment, the step (b) may include calculating an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

In another embodiment, the step (b) may include calculating an external sound that is predicted to be provided to the vehicle driver based on the input sound input to the sound input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.

Accordingly, the step (b) may include calculating an external sound that is predicted to be provided to the vehicle driver based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.

In another embodiment, the step (b) may include calculating an external impact predicted to be provided to the vehicle driver based on state information related to the driving of the vehicle and the state information of the vehicle driver.

At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.

Accordingly, the step (b) may include calculating an external impact that is predicted to be provided to the vehicle driver based on a change in the position of the vehicle driver's head corresponding to the state information related to the running of the vehicle.

The effects of the vehicle control apparatus and method according to the present invention are as follows.

According to at least one of the embodiments of the present invention, driver-centered information that the driver directly sees, hears and feels can be calculated.

As a result, information that has conventionally been calculated from the vehicle's perspective is reinterpreted from the driver's perspective, so that the actual state of the driver can be clearly grasped.

Further scope of applicability of the present invention will become apparent from the following detailed description. It should be understood, however, that the detailed description and specific examples, such as the preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art.

FIG. 1 is a schematic view showing a vehicle for explaining an embodiment of the present invention.
FIG. 2 is a diagram showing a configuration of a vehicle for explaining an embodiment of the present invention.
FIG. 3 is a block diagram showing a configuration of a telematics terminal for explaining an embodiment of the present invention.
FIG. 4 is a block diagram showing the configuration of a vehicle control apparatus related to the present invention.
FIG. 5 is a conceptual diagram showing an embodiment of the data flow in the vehicle control apparatus according to the present invention.
FIG. 6 is a flowchart showing a vehicle control method related to the present invention.
FIG. 7 is a conceptual diagram showing an embodiment in which an external image predicted to be provided to a driver is calculated.
FIG. 8 is a conceptual diagram showing an embodiment in which an external sound predicted to be provided to a driver is calculated.
FIG. 9 is a conceptual diagram showing an embodiment in which an external impact predicted to be provided to a driver is calculated.
FIG. 10 is a conceptual diagram showing an embodiment in which prediction information is applied to virtual reality to improve a driver's driving ability.

It is noted that the technical terms used herein are used only to describe specific embodiments and are not intended to limit the invention. Unless defined otherwise, the technical terms used herein should be interpreted in the sense generally understood by a person of ordinary skill in the art to which the present invention belongs, and should not be construed in an excessively broad or excessively narrow sense. Further, when a technical term used herein fails to accurately express the spirit of the present invention, it should be replaced with a technical term that can be correctly understood by a person skilled in the art. In addition, general terms used herein should be interpreted according to their dictionary definitions or in context, and should not be construed in an excessively narrow sense.

Also, as used herein, singular forms include plural referents unless the context clearly dictates otherwise. In the present application, terms such as "comprising" or "including" should not be construed as necessarily including all of the elements or steps described in the specification; some of the elements or steps may not be included, or additional elements or steps may be included.

Also, terms including ordinals such as first, second, etc. used in the present specification can be used to describe a plurality of constituent elements, but the constituent elements should not be limited by the terms. The terms are used only for the purpose of distinguishing one component from another. For example, without departing from the scope of the present invention, the first component may be referred to as a second component, and similarly, the second component may also be referred to as a first component.

Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or similar elements throughout the several views, and redundant description thereof will be omitted.

In the following description, well-known functions or constructions are not described in detail, since they would obscure the invention in unnecessary detail. It is to be noted that the accompanying drawings are only for the purpose of facilitating understanding of the present invention, and the technical idea of the present invention should not be construed as being limited by the accompanying drawings.

FIG. 1 is a schematic view showing a vehicle (for example, an electric vehicle) for explaining an embodiment of the present invention.

Embodiments of the present invention can be applied not only to conventional automobiles (for example, gasoline vehicles, diesel vehicles, gas vehicles, etc.) but also to pure electric vehicles and hybrid electric vehicles. A hybrid electric vehicle (HEV) mounts a battery pack composed of a plurality of battery cells in order to obtain the necessary electric power. The voltages of the battery cells in the battery pack need to be equalized in order to improve safety and lifespan and to obtain high output.

FIG. 2 is a diagram showing the configuration of a vehicle (for example, a hybrid electric vehicle) for explaining an embodiment of the present invention.

Referring to FIG. 2, a vehicle 100 for explaining an embodiment of the present invention includes an engine 101 as a power source and a motor/generator unit (hereinafter abbreviated as "M/G unit") 102. The driven wheel is the front wheel in a front-wheel-drive vehicle and the rear wheel in a rear-wheel-drive vehicle. In the following, a front-wheel-drive vehicle is described; the embodiment for a rear-wheel-drive vehicle will be apparent from the description of the front-wheel-drive vehicle.

The M/G unit 102 is a device that selectively functions as a motor or a generator depending on the driving state, as is apparent to those skilled in the art. Therefore, although the M/G unit 102 may be called a motor or a generator in the following description for ease of understanding, both terms refer to the same component. The engine 101 and the M/G unit 102 of the electric vehicle are connected in series to a transmission.

The M/G unit 102 is driven by a signal from an inverter 104 under the control of a motor control unit (MCU) 103.

The inverter 104 drives the M/G unit 102 as a power source by using the electric energy stored in a battery 105 under the control of the MCU 103. When the M/G unit 102 operates as a generator, the inverter 104 charges the battery 105 with the electric energy generated by the M/G unit 102.

The power of the engine 101 and the M/G unit 102 is transmitted to the front wheel 109 via a clutch 106, a transmission (T/M) 107, and a final drive gear (F/R). The rear wheel 110 is a non-driven wheel that is not driven by the engine 101 or the M/G unit 102.

The front wheel 109 and the rear wheel 110 are each provided with a wheel brake apparatus 111 for reducing the rotational speed of the wheel. The vehicle further includes a brake pedal 112 and a hydraulic control system 113 that operates each wheel brake apparatus 111 based on the hydraulic pressure generated by the operation of the brake pedal 112. The electric vehicle also includes a brake control unit (BCU) 114 that controls the hydraulic control system 113 and receives the brake control state from the hydraulic control system 113.

The BCU 114 detects the hydraulic pressure generated in the hydraulic control system 113 when the brake pedal 112 is operated by the driver. Based on this, the BCU 114 calculates the braking force to be applied to the drive wheel (for example, the front wheel 109), dividing it into a hydraulic braking force to be provided by hydraulic pressure and a regenerative braking force to be provided by regenerative braking. Accordingly, the BCU 114 supplies the calculated hydraulic braking force to the wheel brake apparatus 111 of the front wheel 109 through the control of the hydraulic control system 113.
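
The force split described above can be illustrated with a short sketch. This is a minimal, hedged example: preferring regenerative braking up to a limit is a common design choice, and the function names, numbers, and limit policy are illustrative assumptions rather than the patent's stated method.

```python
# Minimal sketch of a BCU-style braking-force split (illustrative assumptions).

def split_braking_force(total_force_n: float, regen_limit_n: float) -> tuple[float, float]:
    """Divide a demanded braking force between regenerative and hydraulic braking.

    Regenerative braking is used up to its current limit (in a real BCU this
    limit would depend on motor speed and battery state of charge); the
    hydraulic brakes supply the remainder.
    """
    regen_force = min(total_force_n, regen_limit_n)
    hydraulic_force = total_force_n - regen_force
    return regen_force, hydraulic_force

# Example: 5000 N demanded while the motor/battery can absorb 3000 N.
regen, hydraulic = split_braking_force(5000.0, 3000.0)
print(f"regenerative: {regen} N, hydraulic: {hydraulic} N")
```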

The electric vehicle further includes a hybrid electric vehicle electronic control unit (HEV-ECU) 115 that communicates with and controls the BCU 114 and the MCU 103 in order to perform, for example, a maximum-speed-limiting method.

The regenerative braking force calculated by the BCU 114 is transmitted to the HEV-ECU 115 so that the HEV-ECU 115 controls the MCU 103 based on the received regenerative braking force. Accordingly, the MCU 103 drives the M / G unit 102 as a generator so that the regenerative braking force designated by the HEV-ECU 115 is realized. At this time, the electric energy generated by the M / G unit 102 is stored in the battery 105.

The electric vehicle further comprises a vehicle speed detector 116 for detecting the vehicle speed.

The HEV-ECU 115 utilizes the vehicle speed detected by the vehicle speed detector 116 as data for controlling the BCU 114 and the MCU 103.

In addition, the electric vehicle includes a battery voltage detector 117 for detecting the voltage of the battery 105. The battery voltage detector 117 detects the current voltage of the battery 105 and provides the resulting data to the HEV-ECU 115, so that the HEV-ECU 115 can limit the maximum speed of the electric vehicle according to the deviation between the detected voltage and a preset reference voltage.
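
As a rough illustration of this voltage-based limit, the following sketch maps the deviation between the measured and reference voltages to an allowed maximum speed. The thresholds, speeds, and linear mapping are assumptions for illustration only; the patent does not specify the mapping.

```python
# Illustrative sketch of limiting maximum speed from battery-voltage deviation.
# All thresholds and speed values are assumed, not taken from the patent.

def limited_max_speed_kph(battery_v: float, reference_v: float,
                          normal_max_kph: float = 180.0,
                          reduced_max_kph: float = 80.0,
                          deviation_threshold_v: float = 20.0) -> float:
    """Return the allowed maximum speed for a measured battery voltage."""
    deviation = reference_v - battery_v
    if deviation <= 0.0:
        return normal_max_kph        # battery at or above reference: no limit
    if deviation >= deviation_threshold_v:
        return reduced_max_kph       # deeply discharged: strongest limit
    # Interpolate linearly between the two limits for moderate deviations.
    fraction = deviation / deviation_threshold_v
    return normal_max_kph - fraction * (normal_max_kph - reduced_max_kph)

print(limited_max_speed_kph(battery_v=350.0, reference_v=360.0))  # 130.0 km/h
```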

Hereinafter, the configuration of a telematics terminal 200 for explaining an embodiment of the present invention will be described with reference to FIG. 3.

FIG. 3 is a block diagram showing the configuration of a telematics terminal 200 for explaining an embodiment of the present invention.

Referring to FIG. 3, the telematics terminal 200 includes a main board on which are mounted a control unit (e.g., a central processing unit (CPU)) 212 for controlling the telematics terminal 200 as a whole, a memory 213 for storing various kinds of information, a key control unit 211 for controlling various key signals, and an LCD control unit 214 for controlling a liquid crystal display (LCD).

The memory 213 stores map information (map data) for displaying the route guidance information on the digital map. In addition, the memory 213 stores traffic information collection control algorithms for allowing the vehicle to input traffic information according to the current road conditions, and information for controlling the algorithms.

The main board is provided with a code division multiple access (CDMA) module 206, which is a mobile communication terminal built into the vehicle and assigned a unique device number; a GPS (Global Positioning System) module 207 for locating the vehicle, receiving traffic information collected by the user, and receiving GPS signals; a CD deck 208 for reproducing signals recorded on a compact disc (CD); a gyro sensor 209; and the like. The CDMA module 206 and the GPS module 207 transmit and receive signals through antennas 204 and 205, respectively.

In addition, a broadcast receiving module 222 is connected to the main board and receives broadcast signals through an antenna 223. The main board is connected, through an interface board 203, to a display unit (LCD) 201 controlled by the LCD control unit 214, to a front board 202 controlled by the key control unit 211, and to a camera 227 for photographing the inside and/or outside of the vehicle. The display unit 201 displays various video signals and character signals, and the front board 202 includes buttons for inputting various key signals and provides the key signal corresponding to a user-selected button to the main board. In addition, the display unit 201 includes a proximity sensor and a touch sensor (touch screen).

The front board 202 may have a menu key for directly inputting traffic information, and the menu key may be controlled by the key controller 211.

An audio board 217 is connected to the main board and processes various audio signals. The audio board 217 includes a microcomputer 219 for controlling the audio board 217, a tuner 218 for receiving radio signals, a power supply unit 216 for supplying power to the microcomputer 219, and a signal processing unit 215 for processing various audio signals.

The audio board 217 includes a radio antenna 220 for receiving a radio signal and a CD deck 221 for reproducing an audio signal of a compact disc (CD). The audio board 217 may further comprise a voice output unit (for example, an amplifier) 226 for outputting a voice signal processed by the audio board 217.

The audio output unit (amplifier) 226 is connected to a vehicle interface 224. That is, the audio board 217 and the main board are connected to the vehicle interface 224. The vehicle interface 224 may also be connected to a hands-free unit 225a for inputting voice signals, an airbag 225b for occupant safety, a speed sensor 225c for detecting the vehicle speed, and the like. The speed sensor 225c calculates the vehicle speed and provides the calculated vehicle-speed information to the central processing unit 212.

The navigation session 299 applied to the telematics terminal 200 generates route guidance information based on the map data and the current vehicle location information, and notifies the user of the generated route guidance information.

The display unit 201 senses a proximity touch in the display window through the proximity sensor. For example, the display unit 201 detects the position of a proximity touch when a pointer (e.g., a finger or a stylus pen) approaches, and outputs position information corresponding to the detected position to the control unit 212.

A speech recognition device (or speech recognition module) 298 recognizes the speech uttered by the user and performs the corresponding function according to the recognized speech signal.

The navigation session 299 applied to the telematics terminal 200 displays the travel route on the map data. When the position of the mobile communication terminal 100 comes within a preset distance of a blind spot included in the travel route, the navigation session 299 automatically forms a wireless network with a terminal mounted on a nearby vehicle (for example, a car navigation device) and/or with a mobile communication terminal carried by a nearby pedestrian through wireless communication (for example, a short-range wireless communication network), thereby receiving the location information of the nearby vehicle from the terminal mounted on that vehicle and the location information of the nearby pedestrian from the mobile communication terminal carried by that pedestrian.

The vehicle control apparatus and method according to the embodiment of the present invention can be applied to the telematics terminal 200 (or a head-up display (HUD)) and to the vehicle instrument cluster. For example, they may be implemented as a video display device capable of outputting visual information, or may be provided in the vehicle as a separate vehicle control device.

FIG. 4 is a block diagram showing the configuration of a vehicle control apparatus related to the present invention.

Referring to FIG. 4, the vehicle control apparatus 400 according to the present invention may include a sensing unit 410, a control unit 420, and an output unit 430.

The sensing unit 410 senses at least one of state information of the vehicle driver and state information related to the vehicle.

Specifically, the state information of the vehicle driver means information related to the physical or psychological condition of the driver. For example, it may include at least one of the position of the driver's face, eyes, ears, or head, the driver's line of sight, changes in these positions, and the driver's movement, pose, heart rate, body temperature, degree of eye blinking, degree of eyelid closure, and the like.

The state information related to the vehicle means state information related to the operation of the vehicle or to the internal environment of the vehicle, for example, the vehicle speed, GPS information, acceleration, temperature, pressure, and the like.

In order to sense such information, the sensing unit 410 may include a driver state monitoring (DSM) system, an Advanced Driving Assist System (ADAS), a camera, an array microphone, an acceleration sensor, a gyro sensor, a pressure sensor, an IR temperature sensor, an infrared sensor, and the like.

As an example, the sensing unit 410 may include some of these sensors and may receive information sensed by an external sensor.

The control unit 420 calculates prediction information that is predicted to be provided to the vehicle driver, based on the sensed at least one piece of state information.

Specifically, the prediction information means state information interpreted from the driver's standpoint, for example, information such as vision, hearing, tactile sensation, pressure, and acceleration that the driver is presumed to actually see or feel.

In other words, the vehicle control apparatus 400 of the present invention can calculate state information interpreted from the driver's position (prediction information), based on the driver's state information and the vehicle-related state information sensed by the sensors.

The output unit 430 can visually output the prediction information to a screen. For example, the output unit 430 can output the image predicted to be viewed by the driver. In addition, the prediction information can be output in the form of vibration or sound.
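
The structure just described (sensing unit 410, control unit 420, output unit 430) can be summarised in a schematic sketch. This is a hedged illustration: the class and field names below are invented for exposition and are not taken from the patent.

```python
# Schematic sketch of the apparatus of FIG. 4; all names are illustrative.

from dataclasses import dataclass

@dataclass
class DriverState:                               # sensed by the sensing unit 410
    eye_position: tuple[float, float, float]     # metres, vehicle frame
    gaze: tuple[float, float, float]             # unit vector
    head_position: tuple[float, float, float]

@dataclass
class VehicleState:                              # sensed by the sensing unit 410
    speed_mps: float
    accel_mps2: tuple[float, float, float]

@dataclass
class PredictionInfo:                            # computed by the control unit 420
    ego_view: object                             # image predicted to be seen
    ego_sound: object                            # sound predicted to be heard
    ego_accel: tuple[float, float, float]        # impact predicted to be felt

def output_prediction(info: PredictionInfo) -> None:
    """Output unit 430: shown here as a print; a real unit could render the
    ego-centric view on a screen or emit vibration/sound."""
    print(info)
```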

The vehicle control apparatus 400 according to the present invention may include the following embodiments.

In an exemplary embodiment, the sensing unit 410 may include at least one of a driver state monitoring (DSM) system that senses the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses the state information related to the vehicle.

In another embodiment, the controller 420 may calculate an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

In another embodiment, the control unit 420 may calculate an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.

Accordingly, the control unit 420 may calculate an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.

In another embodiment, the control unit 420 may calculate an external impact that is predicted to be provided to the vehicle driver, based on state information related to the driving of the vehicle and state information of the vehicle driver.

At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.

Accordingly, the control unit 420 can calculate an external impact that is predicted to be provided to the vehicle driver, based on the positional change of the head of the vehicle driver corresponding to the state information related to the running of the vehicle.

FIG. 5 is a conceptual diagram showing an embodiment of the data flow in the vehicle control apparatus according to the present invention.

Referring to FIG. 5, the state information of the vehicle driver sensed by a driver state monitoring (DSM) sensor and the state information related to the vehicle sensed by an Advanced Driving Assist System (ADAS) sensor may be transmitted to the control unit 420.

In addition, such state information can be transmitted to the controller 420 as CAN (Controller Area Network) data.

Specifically, a CAN (Controller Area Network) data bus is mainly used for data transmission between the ECUs of vehicle safety systems and convenience systems, for information/communication systems, and for the control of entertainment systems. An ECU (Electronic Control Unit) is an electronic control unit that controls the state of the engine, the automatic transmission, the ABS, and the like by computer. Here, the ABS (Anti-lock Brake System) is a special brake system developed to prevent the wheels from locking when the vehicle brakes suddenly.
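
For illustration, the following sketch receives one frame of vehicle state over CAN, assuming the python-can package with a SocketCAN interface. The arbitration ID 0x123 and the speed scaling are hypothetical placeholders; real IDs and scalings are defined by the manufacturer's CAN database, which the patent does not specify.

```python
# Hedged sketch: reading a vehicle-speed frame from a CAN bus with python-can.

import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

msg = bus.recv(timeout=1.0)                           # wait up to 1 s for one frame
if msg is not None and msg.arbitration_id == 0x123:   # hypothetical speed frame
    # Assumed layout: bytes 0-1 hold vehicle speed in units of 0.01 km/h.
    speed_kph = int.from_bytes(msg.data[0:2], "big") * 0.01
    print(f"vehicle speed: {speed_kph:.2f} km/h")
```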

In addition, the status information of the vehicle driver and the status information related to the vehicle sensed by the camera, the array microphone, the acceleration sensor, the gyro sensor, the pressure sensor, the IR temperature sensor, the infrared sensor and the like may be transmitted to the control unit 420.

Accordingly, the control unit 420 calculates the state information interpreted from the driver's position (the prediction information, i.e., what the driver is presumed to see or feel) based on the transmitted state information of the vehicle driver and the vehicle-related state information.

Conventional driver state monitoring (DSM) systems, advanced driving assist systems (ADAS), and the like sense information from the vehicle's perspective. Accordingly, they utilize information acquired from the position of the vehicle, such as the external traffic situation or the driver's condition as seen from the vehicle. Since this is indirect information from the driver's point of view, there is a limit to how accurately the actual condition of the driver can be judged.

In other words, in order to monitor the driver's condition precisely, it is necessary to measure the first-person external stimuli that the driver actually sees and hears, in addition to observing the driver from the outside.

The vehicle control apparatus 400 according to the present invention can calculate information such as vision, hearing, tactile sensation, pressure, and acceleration from the driver's first-person viewpoint (prediction information), based on the driver's line of sight, position, movement, pose, heart rate, body temperature, and the like.

In addition, the prediction information thus calculated can be utilized in various fields such as a driver's behavior analysis, physical condition analysis, and psychological state analysis. As a result, accurate driver monitoring and driving habits analysis become possible.

As an embodiment, a first-person driver view can be calculated based on the driver's viewpoint sensed by the driver state monitoring (DSM) system, a video image of the vehicle interior photographed by a camera, and an image of the vehicle exterior sensed by the Advanced Driving Assist System (ADAS).

As another embodiment, sound sources may be separated from the sound sensed by the array microphone, and the first-person sound heard by the driver may be calculated based on the driver's posture sensed by the driver state monitoring (DSM) system.

In still another embodiment, the sensing unit 410 of the vehicle control apparatus 400 according to the present invention may selectively include various types of sensors. In addition, the control unit 420 can execute a program that utilizes the calculated prediction information.

FIG. 6 is a flowchart showing a vehicle control method related to the present invention.

Referring to FIG. 6, at step S610, the sensing unit 410 senses at least one of state information of the vehicle driver and state information related to the vehicle.

Then, in step S620, prediction information that is predicted to be provided to the vehicle driver is calculated based on the sensed at least one piece of state information.

In an exemplary embodiment, the sensing unit 410 may include at least one of a driver state monitoring (DSM) system that senses the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) that senses the state information related to the vehicle.

In another embodiment, the step S620 may include calculating an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

In another embodiment, the step S620 may include calculating an external sound that is predicted to be provided to the vehicle driver based on the input sound input to the sound input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.

Accordingly, the step S620 may include calculating an external sound that is predicted to be provided to the vehicle driver based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.

In yet another embodiment, the step S620 may include calculating an external impact that is predicted to be provided to the vehicle driver based on the state information related to the driving of the vehicle and the state information of the vehicle driver.

At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.

Accordingly, the step S620 may include calculating an external impact that is predicted to be provided to the vehicle driver based on a change in the position of the vehicle driver's head corresponding to the state information related to the running of the vehicle.

On the other hand, the controller 420 may calculate an external image predicted to be provided to the vehicle driver based on the input image input to the image input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the eye position, the line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

FIG. 7 is a conceptual diagram showing an embodiment in which an external image predicted to be provided to a driver is calculated.

Referring to FIG. 7, a captured image of the foreground outside the vehicle can be acquired by a stereoscopic camera, a depth sensor, and an RGB camera (710).

Then, monitoring information of the driver of the vehicle can be obtained by a driver state monitoring (DSM) system (720).

Subsequently, based on the information thus obtained, the foreground image outside the vehicle predicted to be viewed by the driver (730), that is, a driver-based Ego-centric view, can be generated (740).

Specifically, the driver-based Ego-centric view can be generated from a 3D reconstruction of the foreground outside the vehicle, the three-dimensional position of the driver's eyes, and the driver's gaze.
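
One way to realise such a view, sketched below, is to re-project the reconstructed 3D points into a virtual pinhole camera placed at the driver's eye position and oriented along the sensed gaze. This is an illustrative sketch with made-up numbers, not the patent's algorithm; it also assumes the gaze is never parallel to the vertical axis.

```python
# Minimal numerical sketch of a driver-based Ego-centric view (FIG. 7).

import numpy as np

def ego_centric_projection(points_world: np.ndarray,
                           eye_pos: np.ndarray,
                           gaze_dir: np.ndarray,
                           focal_px: float = 800.0) -> np.ndarray:
    """Project Nx3 world points into the image plane of a virtual camera
    located at eye_pos whose optical axis is gaze_dir."""
    z_axis = gaze_dir / np.linalg.norm(gaze_dir)          # optical axis
    up = np.array([0.0, 0.0, 1.0])                        # assumed world up
    x_axis = np.cross(up, z_axis)
    x_axis /= np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)
    R = np.stack([x_axis, y_axis, z_axis])                # world -> camera rotation
    cam = (points_world - eye_pos) @ R.T
    cam = cam[cam[:, 2] > 0.1]                            # keep points in front of the eye
    return focal_px * cam[:, :2] / cam[:, 2:3]            # pinhole projection

# Example: two reconstructed points 10 m ahead of an eye at (0, 0.4, 1.2) m.
pts = np.array([[10.0, 0.5, 1.2], [10.0, -0.5, 1.2]])
print(ego_centric_projection(pts, np.array([0.0, 0.4, 1.2]),
                             np.array([1.0, 0.0, 0.0])))
```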

Meanwhile, the control unit 420 may calculate an external sound that is predicted to be provided to the vehicle driver, based on the input sound input to the sound input device and the state information of the vehicle driver.

At this time, the state information of the vehicle driver may include at least one of the face position, the ear position, the face position change, and the ear position change of the vehicle driver.

Accordingly, the control unit 420 may calculate an external sound that is predicted to be provided to the vehicle driver, based on at least one of the position of the sound source estimated from the input sound and the state information of the vehicle driver.

FIG. 8 is a conceptual diagram showing an embodiment in which an external sound predicted to be provided to a driver is calculated.

Referring to FIG. 8, monitoring information of the vehicle driver may be obtained by a driver state monitoring (DSM) system (810). Further, acoustic information may be acquired by additional sensors, for example, an array microphone (820).

Subsequently, based on the information thus obtained, the sound predicted to be heard by the driver (840), that is, a driver-based Ego-centric Hearing, can be generated (850).

Specifically, the driver-based Ego-centric Hearing (840) can be generated (850) based on information tracking the position of the driver's ears, sound-source separation, the position information of the 3D sound source, and a predefined car sound model.
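
A highly simplified sketch of this step follows: given a source position estimated from the array microphone and the tracked position of each ear, it estimates the level and arrival delay at the ears. Free-field propagation is assumed and all positions and levels are illustrative; a real system would additionally apply the cabin (car sound) model mentioned above.

```python
# Simplified sketch of driver-based Ego-centric Hearing (FIG. 8).

import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def ear_signal(source_pos, ear_pos, source_level_db: float):
    """Return (level_db, delay_s) of a source as perceived at one ear,
    assuming free-field inverse-square spreading from a 1 m reference."""
    dx, dy, dz = (s - e for s, e in zip(source_pos, ear_pos))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    level_db = source_level_db - 20.0 * math.log10(max(dist, 0.1))
    return level_db, dist / SPEED_OF_SOUND

# Example: a horn 5 m ahead and 2 m to the left; ears ~16 cm apart at the
# head pose tracked by the DSM system (all coordinates illustrative).
source = (5.0, 2.0, 1.2)
left = ear_signal(source, (0.0, 0.08, 1.2), 90.0)
right = ear_signal(source, (0.0, -0.08, 1.2), 90.0)
print(f"left ear: {left}, right ear: {right}")  # level + interaural delay
```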

On the other hand, the control unit 420 may calculate an external impact that is predicted to be provided to the vehicle driver, based on the state information related to the driving of the vehicle and the state information of the vehicle driver.

At this time, the state information related to the driving of the vehicle includes at least one of the speed of the vehicle, the GPS information, and the acceleration, and the state information of the vehicle driver may include a positional change of the vehicle driver's head.

Accordingly, the control unit 420 can calculate an external impact that is predicted to be provided to the vehicle driver, based on the positional change of the head of the vehicle driver corresponding to the state information related to the running of the vehicle.

FIG. 9 is a conceptual diagram showing an embodiment in which an external impact predicted to be provided to a driver is calculated.

Referring to FIG. 9, monitoring information of the driver of the vehicle can be acquired by a driver state monitoring (DSM) system (910). Also, the vehicle speed and GPS information can be acquired as CAN (Controller Area Network) data (920), and additional information may be obtained by additional sensors, such as an acceleration sensor or a gyro sensor (930).

Subsequently, based on the information thus obtained, the external impact predicted to be felt by the driver (940), that is, a driver-based Ego-centric Acceleration, can be generated (950).

Specifically, the driver-based Ego-centric Acceleration (940) can be generated (950) based on the three-dimensional position-tracking information of the driver's head and the longitudinal/lateral/vertical/rotational acceleration information of the vehicle.
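
As an illustration, if the cabin is treated as a rigid body, the acceleration felt at the driver's head is the vehicle's linear acceleration plus the rotational terms evaluated at the head's tracked offset from the vehicle reference point. The formula and the numbers below are a textbook rigid-body sketch under that assumption, not the patent's stated computation.

```python
# Sketch of driver-based Ego-centric Acceleration (FIG. 9) via rigid-body
# kinematics: a_head = a_vehicle + alpha x r + omega x (omega x r).

import numpy as np

def head_acceleration(a_vehicle: np.ndarray,   # linear accel (long/lat/vert), m/s^2
                      omega: np.ndarray,       # angular velocity, rad/s
                      alpha: np.ndarray,       # angular acceleration, rad/s^2
                      r_head: np.ndarray) -> np.ndarray:  # head offset, m
    """Acceleration at the tracked head position of a rigidly moving cabin."""
    return (a_vehicle
            + np.cross(alpha, r_head)
            + np.cross(omega, np.cross(omega, r_head)))

# Example: braking at 4 m/s^2 with a slight yaw; head 1.2 m above and 0.3 m
# to the side of the vehicle reference point (values illustrative).
print(head_acceleration(np.array([-4.0, 0.0, 0.0]),
                        np.array([0.0, 0.0, 0.2]),
                        np.array([0.0, 0.0, 0.5]),
                        np.array([0.0, 0.3, 1.2])))
```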

On the other hand, as described above, the control unit 420 can execute a program that can utilize the calculated prediction information.

FIG. 10 is a conceptual diagram showing an embodiment in which prediction information is applied to virtual reality to improve a driver's driving ability.

Referring to FIG. 10, a 'driver-centered driving record' of a skilled driver can be calculated by the vehicle control apparatus 400 according to the present invention (1010). For example, records of what an experienced driver sees, hears, and feels while driving can be included in the 'driver-centered driving record'.

Next, the calculated driving record may be applied to a virtual reality system so that other drivers can experience it virtually (1020). In an embodiment, the vehicle control apparatus 400 according to the present invention may include the virtual reality system.

As a result, drivers can learn the skilled driver's driving habits, manner of responding to the external environment, and the like (1030).

The effects of the vehicle control apparatus and method according to the present invention are as follows.

According to at least one of the embodiments of the present invention, driver-centered information that the driver directly sees, hears and feels can be calculated.

As a result, information that has conventionally been calculated from the vehicle's perspective is reinterpreted from the driver's perspective, so that the actual state of the driver can be clearly grasped.

The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and it may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the control unit 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

Claims (20)

1. A vehicle control apparatus comprising:
a sensing unit for sensing at least one of state information of a vehicle driver and state information related to the vehicle; and
a control unit for calculating prediction information that is predicted to be provided to the vehicle driver based on the sensed at least one piece of state information.

2. The vehicle control apparatus of claim 1, wherein the sensing unit includes at least one of a driver state monitoring (DSM) system for sensing the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) for sensing the state information related to the vehicle.

3. The vehicle control apparatus of claim 1, wherein the control unit calculates an external image predicted to be provided to the vehicle driver based on an input image input to an image input device and the state information of the vehicle driver.

4. The vehicle control apparatus of claim 3, wherein the state information of the vehicle driver includes at least one of a face position, an eye position, a line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

5. The vehicle control apparatus of claim 1, wherein the control unit calculates an external sound predicted to be provided to the vehicle driver based on an input sound input to a sound input device and the state information of the vehicle driver.

6. The vehicle control apparatus of claim 5, wherein the state information of the vehicle driver includes at least one of a face position, an ear position, a change in the face position, and a change in the ear position of the vehicle driver.

7. The vehicle control apparatus of claim 6, wherein the control unit calculates the external sound predicted to be provided to the vehicle driver based on at least one of a position of a sound source estimated from the input sound and the state information of the vehicle driver.

8. The vehicle control apparatus of claim 1, wherein the control unit calculates an external impact predicted to be provided to the vehicle driver based on state information related to driving of the vehicle and the state information of the vehicle driver.

9. The vehicle control apparatus of claim 8, wherein the state information related to the driving of the vehicle includes at least one of a speed of the vehicle, GPS information, and an acceleration, and the state information of the vehicle driver includes a change in a position of the vehicle driver's head.

10. The vehicle control apparatus of claim 9, wherein the control unit calculates the external impact predicted to be provided to the vehicle driver based on the change in the position of the vehicle driver's head corresponding to the state information related to the driving of the vehicle.

11. A vehicle control method comprising the steps of:
(a) sensing, by a sensing unit, at least one of state information of a vehicle driver and state information related to the vehicle; and
(b) calculating prediction information that is predicted to be provided to the vehicle driver based on the sensed at least one piece of state information.

12. The method of claim 11, wherein the sensing unit includes at least one of a driver state monitoring (DSM) system for sensing the state information of the vehicle driver and an Advanced Driving Assist System (ADAS) for sensing the state information related to the vehicle.

13. The method of claim 11, wherein the step (b) includes calculating an external image predicted to be provided to the vehicle driver based on an input image input to an image input device and the state information of the vehicle driver.

14. The method of claim 13, wherein the state information of the vehicle driver includes at least one of a face position, an eye position, a line of sight, a change in the face position, and a change in the eye position of the vehicle driver.

15. The method of claim 11, wherein the step (b) includes calculating an external sound predicted to be provided to the vehicle driver based on an input sound input to a sound input device and the state information of the vehicle driver.

16. The method of claim 15, wherein the state information of the vehicle driver includes at least one of a face position, an ear position, a change in the face position, and a change in the ear position of the vehicle driver.

17. The method of claim 16, wherein the step (b) includes calculating the external sound predicted to be provided to the vehicle driver based on at least one of a position of a sound source estimated from the input sound and the state information of the vehicle driver.

18. The method of claim 11, wherein the step (b) includes calculating an external impact predicted to be provided to the vehicle driver based on state information related to driving of the vehicle and the state information of the vehicle driver.

19. The method of claim 18, wherein the state information related to the driving of the vehicle includes at least one of a speed of the vehicle, GPS information, and an acceleration, and the state information of the vehicle driver includes a change in a position of the vehicle driver's head.

20. The method of claim 19, wherein the step (b) includes calculating the external impact predicted to be provided to the vehicle driver based on the change in the position of the vehicle driver's head corresponding to the state information related to the driving of the vehicle.











KR1020150071149A 2015-05-21 2015-05-21 Apparatus for controlling of vehicle and method thereof KR20160136916A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150071149A KR20160136916A (en) 2015-05-21 2015-05-21 Apparatus for controlling of vehicle and method thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150071149A KR20160136916A (en) 2015-05-21 2015-05-21 Apparatus for controlling of vehicle and method thereof

Publications (1)

Publication Number Publication Date
KR20160136916A 2016-11-30

Family

ID=57707234

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150071149A KR20160136916A (en) 2015-05-21 2015-05-21 Apparatus for controlling of vehicle and method thereof

Country Status (1)

Country Link
KR (1) KR20160136916A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10529865B2 (en) 2017-07-31 2020-01-07 Samsung Electronics Co., Ltd. Vertical semiconductor devices
KR20200036167A (en) * 2018-09-28 2020-04-07 현대자동차주식회사 Vehicle and controlling method of vehicle



Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E902 Notification of reason for refusal
E90F Notification of reason for final refusal
E601 Decision to refuse application