KR20160120104A - Display device for vehicles and method for controlling the same - Google Patents
Display device for vehicles and method for controlling the same Download PDFInfo
- Publication number
- KR20160120104A (application number KR1020150049197A)
- Authority
- KR
- South Korea
- Prior art keywords
- driver
- information
- vehicle
- information related
- point
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B60K2350/1052—
-
- B60W2050/08—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Abstract
The vehicle display apparatus and the control method thereof according to the present invention can output information related to a point of interest of the driver through analysis of the driver's line of sight. More specifically, a vehicle display device according to an embodiment of the present invention includes a display unit; a camera that senses a driver's gaze; a wireless communication unit that receives position information of the vehicle on which the driver is boarding; and a control unit that controls the display unit to determine the point at which the driver's line of sight is directed, using the line-of-sight reaching distance obtained from the gaze and the position information of the vehicle, and to output information related to the determined point.
Description
The present invention relates to a vehicle display device capable of displaying information related to a specific point and a control method thereof.
A terminal can be divided into a mobile terminal (mobile / portable terminal) and a stationary terminal according to whether the terminal can be moved. The mobile terminal can be divided into a handheld terminal and a vehicle mounted terminal according to whether the user can directly carry the mobile terminal.
Such a terminal has various functions and, for example, takes the form of a multimedia device having multiple functions such as capturing photographs and moving pictures, playing music or video files, gaming, and receiving broadcasts.
Also, the functions of the terminal are applied to various devices (e.g., refrigerators, vehicles, etc.). In particular, a terminal applied to a vehicle can provide various functions in consideration of the state of the vehicle, the driver of the vehicle, and the occupant. For example, it may provide route guidance information using the location of the vehicle, or inform the external device of the status of the driver and the occupant.
In this manner, a function performed in consideration of the state of a vehicle or a driver often requires a specific input for executing the function. In order to compensate for this, research has been conducted on a terminal that detects the state of the vehicle or the state of the driver and executes the corresponding function, but it is difficult to sufficiently reflect the intention and interest of the driver.
An object of the present invention is to provide a vehicle display device and a control method thereof that can display information on a point of interest of a driver through a line of sight analysis of a driver.
A vehicle display apparatus according to an embodiment of the present invention includes a display unit; a camera for sensing a driver's gaze; a wireless communication unit for receiving position information of the driver's vehicle; and a control unit for controlling the display unit to determine the point at which the driver's line of sight is directed, using the line-of-sight reaching distance obtained from the gaze and the position information of the vehicle, and to output information related to the determined point.
In one embodiment, the line-of-sight reaching distance may be calculated by an angle formed by a line of sight corresponding to the left eye of the driver and a line of sight corresponding to the right eye.
In one embodiment, the control unit may determine, in prestored map data, the point at which the direction of the driver's line of sight and the line-of-sight reaching distance meet, starting from the point corresponding to the position information of the vehicle, as the point at which the line of sight is directed.
In one embodiment, the controller may determine a direction in which the driver's line of sight is maintained for a predetermined period of time or more as the direction in which the driver's line of sight is directed.
In one embodiment, the information associated with the determined point includes at least one of advertising information about the determined point, location information of the determined point, information about a location associated with the determined point, and event information associated with the determined point.
In one embodiment, when there is a plurality of pieces of information related to the determined point, the control unit may output, among the plurality of pieces of information, information related to at least one of the date and the time at which the driver's gaze is detected.
In one embodiment, the controller may control the wireless communication unit so that a signal requesting reception of information related to the determined point is transmitted to an external server when a point of sight of the driver is determined.
In one embodiment, the control unit may control the display unit such that information related to the determined point is projected onto the windshield area of the vehicle.
In one embodiment, the control unit may control the display unit so that information related to the determined point is projected to a position, in the windshield area of the vehicle, corresponding to the driver's gaze.
In one embodiment, the apparatus further includes a sensing unit for sensing a gesture of the driver, and when the gesture of the driver is sensed in a state in which information related to the determined point is output, the output state of the information can be controlled.
In one embodiment, the information related to the determined point is event information on the determined point, and when the gesture of the driver is sensed, the controller may control the display unit such that detailed information including specific contents of the event information is further output.
In one embodiment, the information related to the determined point is position information of the determined point, and when the determined point is a preset point and the gesture of the driver is sensed, the display unit may be controlled so that the information is changed to, and output as, information related to a place associated with the determined point.
In one embodiment, the gesture of the driver may include a gesture of the driver's hand and a blinking gesture of the driver's eye.
In one embodiment, the control unit may determine whether to terminate the output of the information related to the determined point based on at least one of a case where the position of the vehicle is out of the region corresponding to the determined point and a case where a predetermined time has elapsed after the information related to the determined point is output.
In one embodiment, the control unit may control the display unit so that the output of the information is maintained when the position of the vehicle is out of the area corresponding to the determined point but the predetermined time has not elapsed after the information related to the determined point was output, or when the predetermined time has elapsed after the information related to the determined point was output but the position of the vehicle has not deviated from the area corresponding to the determined point.
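The maintain-or-terminate decision described in the embodiments above can be sketched as follows. This is purely an illustrative sketch, not part of the disclosed embodiments; the function name, the timeout value, and the Boolean encoding are assumptions.

```python
def keep_showing(point_info_age_s, vehicle_in_point_area, timeout_s=30.0):
    """Decide whether the information related to the determined point
    should stay on the display.

    The output is terminated only when BOTH ending conditions hold: the
    display timeout has elapsed AND the vehicle has left the area of the
    determined point. If only one condition holds, the output is
    maintained, mirroring the embodiment above."""
    timed_out = point_info_age_s >= timeout_s
    left_area = not vehicle_in_point_area
    return not (timed_out and left_area)
```

For example, information that has been shown for 40 seconds is kept while the vehicle remains in the point's area, and information shown for 10 seconds is kept even after the vehicle leaves the area.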
A method of controlling a vehicle display device according to an exemplary embodiment of the present invention includes sensing a driver's gaze; calculating a gaze reaching distance using the sensed gaze of the driver; receiving position information of the vehicle on which the driver is boarding; determining, in pre-stored map data, the point at which the direction of the driver's gaze and the point corresponding to the gaze reaching distance meet, starting from the point corresponding to the position of the vehicle, as the point at which the gaze is directed; and outputting information related to the determined point.
In one embodiment, the line-of-sight reaching distance may be calculated by an angle formed by a line of sight corresponding to the left eye of the driver and a line of sight corresponding to the right eye.
In one embodiment, the direction of the driver's gaze may be a direction in which the driver's gaze is maintained for a predetermined period of time or longer.
In one embodiment, the step of outputting information related to the determined point may include outputting the information related to the determined point to a display unit, and controlling the operation of the display device so that the information related to the determined point is projected onto the windshield area of the vehicle.
In one embodiment, the information related to the determined point may be projected to a position, in the windshield area of the vehicle, corresponding to the driver's gaze.
The vehicle display apparatus and the control method thereof according to an embodiment of the present invention may determine a point at which the driver's line of sight is directed through the sight line analysis of the driver and then output information related to the determined point. Accordingly, the driver can easily receive desired information without any additional input such as directly searching for information related to his / her point of interest.
Further, since the information related to the determined point is projected at a position facing the driver's line of sight in the windshield area of the vehicle, the driver does not have to move his or her eyes to confirm the information related to the determined point during driving, which provides convenience to the driver.
FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
FIG. 2 is a block diagram showing the configuration of a vehicle display device according to an embodiment of the present invention.
FIG. 3A is a flowchart of a method of controlling a display device for a vehicle according to an embodiment of the present invention, and FIGS. 3B and 3C are views for explaining a method of controlling a display device for a vehicle according to an embodiment of the present invention.
FIGS. 4A and 4B are views showing an embodiment in which information output to the display unit is changed according to predetermined conditions.
FIGS. 5A and 5B are diagrams illustrating embodiments related to various output schemes of information associated with determined points.
FIGS. 6A, 6B, and 6C are views showing an embodiment in which the output state of information related to a determined point is changed based on an additional input of the driver.
FIGS. 7A and 7B are views showing an embodiment relating to the termination of the output of information associated with a determined point.
FIGS. 8A and 8B are views showing an embodiment of performing wireless communication with an external device using information related to a determined point.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like or similar elements are denoted by the same or similar reference numerals and duplicate descriptions thereof are omitted. The suffixes "module" and "part" for the components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not by themselves have distinct meanings or roles. In the following description, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings but covers all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, smart glasses, or a head mounted display (HMD).
However, it will be readily apparent to those skilled in the art that the configuration according to the embodiments described herein may also be applied to fixed terminals such as a digital TV, a desktop computer, and a digital signage, except for cases applicable only to mobile terminals.
Referring to FIG. 1, FIG. 1 is a block diagram illustrating a mobile terminal according to the present invention.
The interface unit 160 serves as a path to various types of external devices connected to the
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or control of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the components listed above will be described in more detail with reference to FIG. 1 before explaining various embodiments implemented through the mobile terminal described above.
The broadcast management server may refer to a server for generating and transmitting broadcast signals and / or broadcast related information, or a server for receiving broadcast signals and / or broadcast related information generated by the broadcast management server and transmitting the generated broadcast signals and / or broadcast related information. The broadcast signal may include a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and a broadcast signal in which a data broadcast signal is combined with a TV broadcast signal or a radio broadcast signal.
The broadcast signal may be encoded according to at least one of the technical standards (or broadcast systems, for example, ISO, IEC, DVB, ATSC, etc.) for transmitting and receiving digital broadcast signals, and the digital broadcast signal can be received using a method conforming to the technical standard so defined.
The broadcast-related information may be information related to a broadcast channel, a broadcast program, or a broadcast service provider. The broadcast-related information may also be provided through a mobile communication network; in this case, it may be received by the mobile communication module.
The broadcast-related information may exist in various forms, for example, an Electronic Program Guide (EPG) of Digital Multimedia Broadcasting (DMB) or an Electronic Service Guide (ESG) of Digital Video Broadcast-Handheld (DVB-H). The broadcast signal and/or broadcast-related information received through the broadcast receiving module may be stored in the memory.
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (Worldwide Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced), and the wireless Internet module transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above.
The position information module 115 is a module for obtaining the position (or current position) of the mobile terminal, and representative examples thereof are a Global Positioning System (GPS) module and a Wireless Fidelity (Wi-Fi) module. For example, when the mobile terminal utilizes the GPS module, it can acquire the position of the mobile terminal by using a signal transmitted from a GPS satellite. As another example, when the mobile terminal utilizes the Wi-Fi module, it can acquire the position of the mobile terminal based on information of a wireless access point (AP) that transmits or receives wireless signals with the Wi-Fi module. Optionally, the position information module 115 may perform a function of another module of the wireless communication unit in order to obtain data on the position of the mobile terminal, substitutionally or additionally.
The user input unit 123 is for receiving information from a user, and when information is input through the user input unit 123, the controller can control the operation of the mobile terminal to correspond to the input information.
On the other hand, for convenience of explanation, the act of recognizing that an object is positioned on the touch screen in proximity without touching it is referred to as a "proximity touch," and the act of actually touching the touch screen with an object is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen is the position at which the object corresponds vertically to the touch screen when the object is proximity-touched.
The touch sensor senses a touch (or touch input) applied to the touch screen (or the display unit 151) by using at least one of various touch methods, such as a resistive type, a capacitive type, an infrared type, and an ultrasonic type.
For example, the touch sensor may be configured to convert a change in a pressure applied to a specific portion of the touch screen or a capacitance generated in a specific portion to an electrical input signal. The touch sensor may be configured to detect a position, an area, a pressure at the time of touch, a capacitance at the time of touch, and the like where a touch object touching the touch screen is touched on the touch sensor. Here, the touch object may be a finger, a touch pen, a stylus pen, a pointer, or the like as an object to which a touch is applied to the touch sensor.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller.
On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches, such as a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize position information of an object to be sensed by using ultrasonic waves.
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The interface unit 160 serves as a path for communication with all external devices connected to the mobile terminal.
The identification module is a chip storing various information for authenticating the use right of the mobile terminal.
The interface unit 160 may be a path through which power from a cradle is supplied to the mobile terminal when the mobile terminal is connected to an external cradle.
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
FIG. 2 is a block diagram showing the configuration of a vehicle display device 200 according to an embodiment of the present invention.
The audio output unit (amplifier) 226 is connected to the control unit and outputs an audio signal under its control.
For example, the sensing unit 230 may include a camera 230a capable of sensing the driver's gaze.
As another example, the sensing unit 230 may include a motion sensor 230b capable of sensing a motion gesture of an occupant of the vehicle, and predetermined information may be output, a search performed, or wireless communication with an external electronic device performed, based on the type of the motion gesture sensed by the motion sensor 230b.
Although not shown in the drawings, the sensing unit 230 may further include various sensors capable of sensing the state of the vehicle (for example, whether the vehicle is running), in addition to the sensors described above (speed sensor, motion sensor).
The speech recognition device (or speech recognition module) 301 recognizes the speech uttered by the user and performs the corresponding function according to the recognized speech signal.
A navigation session (300) applied to the vehicle display device (200) displays a traveling route on the map data.
The vehicle display apparatus according to the present invention can output, to the display unit, information on a specific point that the driver watches, based on the driver's line of sight, without any additional input. Therefore, the driver can conveniently receive information on the specific point without directly searching for it while driving.
Further, the vehicle display apparatus can control the information on the specific point output to the display unit to be projected onto the windshield of the vehicle at a position corresponding to the line of sight of the driver. Therefore, the driver can easily grasp the information of interest without moving his or her eyes to check it during driving.
Hereinafter, embodiments related to a control method that can be implemented in a vehicle display apparatus configured as above will be described in more detail with reference to the accompanying drawings. The accompanying drawings show the state in which the screen information output to the display unit of the vehicle display device is projected onto the windshield area of the vehicle, but the present invention is not limited thereto. That is, whether the information is projected onto the windshield area of the vehicle can be determined by the user's selection, and the information related to a specific point can instead be output on a separate display unit mounted inside the vehicle.
The vehicle display device may be a device that performs various functions in addition to outputting information related to the specific point. For example, the vehicle display device may be a navigation device capable of outputting route guidance information in addition to information related to a specific point. As another example, the vehicle display device may be a device corresponding to a mobile terminal capable of performing a call origination function, an Internet search function, and the like. As another example, the vehicle display device may correspond to a projector terminal (a terminal capable of projecting screen information output to the display unit to the windshield area) located inside the vehicle. In other words, it is apparent to those skilled in the art that the present invention can be embodied in other specific forms without departing from the spirit and essential characteristics of the present invention.
FIG. 3A is a flowchart of a method of controlling a display device for a vehicle according to an embodiment of the present invention, and FIGS. 3B and 3C are views for explaining a method of controlling a display device for a vehicle according to an embodiment of the present invention.
Referring to FIG. 3A, a vehicle display apparatus according to an exemplary embodiment of the present invention can sense a driver's gaze (S301).
The driver's gaze can be sensed through the camera included in the sensing unit 230.
Further, the vehicle display apparatus may receive position information of the vehicle on which the driver is boarding (S302).
The location information of the vehicle may be received continuously through the external server regardless of whether the driver's gaze is sensed, or may be received only when the driver's gaze is sensed.
When the driver's gaze information and the vehicle position information are obtained, the controller can determine the point at which the driver's line of sight is directed.
For example, when the angle formed by the lines of sight corresponding to the left and right eyes of the driver is measured as a first angle, a first distance corresponding to the first angle may be calculated as the line-of-sight reaching distance. Alternatively, when the angle is measured as a second angle smaller than the first angle, a second distance corresponding to the second angle may be calculated as the line-of-sight reaching distance. In this case, the second distance may have a value larger than the first distance.
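The relationship between the angle formed by the two lines of sight and the line-of-sight reaching distance can be sketched as a simple triangulation. This is an illustrative sketch only, not the disclosed implementation; the function name and the interpupillary distance value are assumptions.

```python
import math

def gaze_reach_distance(convergence_angle_deg, interpupillary_distance_m=0.063):
    """Estimate the line-of-sight reaching distance from the angle formed
    by the lines of sight of the left and right eyes.

    The two lines of sight and the segment between the eyes form an
    approximately isosceles triangle, so the distance to the fixated point
    is roughly (IPD / 2) / tan(angle / 2). A smaller angle yields a larger
    distance, matching the first-angle/second-angle example in the text."""
    half_angle = math.radians(convergence_angle_deg) / 2.0
    if half_angle <= 0:
        return float("inf")  # parallel lines of sight: effectively infinite
    return (interpupillary_distance_m / 2.0) / math.tan(half_angle)

# A larger first angle gives a shorter distance than a smaller second angle.
d1 = gaze_reach_distance(3.0)  # first angle
d2 = gaze_reach_distance(1.0)  # second angle, smaller than the first
assert d2 > d1
```

In practice such values would more likely come from a precomputed angle-to-distance table stored in memory, as the description suggests below.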
Referring to FIG. 3B, the driver can look at a point located relatively far from the vehicle.
As shown in FIG. 3C, when the driver looks at another point located relatively close to the vehicle, the angle formed by the lines of sight of the left and right eyes becomes larger.
That is, when the driver looks at a closer point, the two lines of sight converge at a larger angle.
Accordingly, the controller may calculate the line-of-sight reaching distance corresponding to the measured angle by using pre-stored data.
The data may be data relating the angle formed by the lines of sight corresponding to the left and right eyes of the driver to the line-of-sight reaching distance corresponding to that angle. In addition, the data may be stored in advance in the vehicle display apparatus.
When the line-of-sight reaching distance is calculated as described above, the controller may determine the direction of the driver's gaze.
In the pre-stored map data, the direction of the driver's gaze may be determined from the point corresponding to the position information of the vehicle and the point corresponding to the line-of-sight reaching distance.
Here, the controller may determine, as the direction of the driver's gaze, a direction in which the driver's line of sight is maintained for a predetermined time or longer.
For example, as shown in the second diagram of FIG. 3B, the controller may determine that the driver's gaze is maintained toward one point for the predetermined time or longer, and determine that direction as the direction of the driver's gaze.
Similarly, as shown in the second diagram of FIG. 3C, the controller may determine the direction of the driver's gaze with respect to the other point.
Although not shown in the drawing, when the driver's line of sight is maintained in a first direction for a predetermined time or longer and is then maintained in a second direction for a predetermined time or longer, the controller may determine the second direction as the direction of the driver's gaze.
When the direction of the driver's gaze is determined as described above, the controller may determine the point at which the driver's gaze is directed by using the position information of the vehicle, the line-of-sight reaching distance, and the direction of the gaze, and may control the display unit so that information related to the determined point is output.
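Determining the gazed-at point from the vehicle position, the reaching distance, and the gaze direction can be sketched as a simple projection on map coordinates. This is a flat-earth approximation with illustrative names; the metres-per-degree constant is a standard geodesy approximation, not a value from the disclosure.

```python
import math

M_PER_DEG_LAT = 111_320.0  # approximate metres per degree of latitude

def gazed_point(vehicle_lat, vehicle_lon, heading_deg, gaze_offset_deg, reach_m):
    """Project the vehicle position along the driver's gaze direction by the
    line-of-sight reaching distance to estimate the gazed-at map point.

    heading_deg is the vehicle heading and gaze_offset_deg the gaze angle
    relative to that heading, both clockwise from north in degrees.
    """
    bearing = math.radians(heading_deg + gaze_offset_deg)
    dlat = reach_m * math.cos(bearing) / M_PER_DEG_LAT
    dlon = (reach_m * math.sin(bearing)
            / (M_PER_DEG_LAT * math.cos(math.radians(vehicle_lat))))
    return vehicle_lat + dlat, vehicle_lon + dlon
```

A real implementation would then snap the resulting coordinate to the nearest point of interest in the pre-stored map data before requesting related information.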
The information related to the determined point may be at least one of advertisement information about the determined point, position information of the determined point, information about a place associated with the determined point, and event information associated with the determined point.
For example, as shown in FIG. 3B, when the determined point is a shopping mall, the information related to the determined point may include event information associated with the shopping mall (discount information, lottery information, etc.), advertisement information of the shopping mall, and the like. As another example, if the determined point is a restaurant, the information related to the determined point may be event information associated with the restaurant (recommended menu information), information about a place associated with the restaurant (a parking lot or convenience facilities), and detailed location information of the restaurant (a building name, a specific address, etc.).
The information related to the determined point may be information stored in advance in the vehicle display apparatus, or information received from an external server.
For example, when the point at which the driver's gaze is directed is determined, the controller may control the wireless communication unit to transmit a signal requesting information related to the determined point to an external server.
In addition, the controller may control the display unit so that the information related to the determined point is projected onto the windshield area of the vehicle.
When the information related to the determined point is output to the display unit, the controller may control the display unit so that the information is projected to a position corresponding to the driver's line of sight in the windshield area of the vehicle.
Referring again to FIGS. 3B and 3C, the information related to the determined point may be projected onto the windshield area at the position facing the driver's line of sight.
Therefore, the driver can easily receive various information related to a specific point through a simple operation such as keeping his or her gaze on a point of interest. In addition, since the information related to the specific point is projected at the position facing the driver's line of sight in the windshield area, the driver can clearly and easily confirm the desired information while driving, which has the effect of reducing the accident risk of the vehicle.
On the other hand, the information related to the determined point may include various kinds of information as described above. In this case, the controller may selectively output, among the various kinds of information, information satisfying a predetermined condition.
The predetermined condition may be a condition set by the driver (for example, a condition limiting the output of advertisement information, or a condition for outputting only detailed position information). The predetermined condition may also be status information such as the time, the date, and the location of the vehicle at which the driver's gaze was sensed. In this case, the controller may output information related to at least one of the date and the time at which the driver's gaze was sensed.
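The conditional selection described above can be sketched as a small filter. The `kind` labels and the optional `(start, end)` hour window are illustrative assumptions, not terms from the disclosure.

```python
def select_info(info_items, suppressed_kinds=(), now_hour=None):
    """Filter point-related information by driver-set conditions and the
    time at which the driver's gaze was sensed.

    suppressed_kinds: kinds the driver limited (e.g. "advertisement").
    now_hour: current hour; items carrying an 'hours' window are kept
    only when now_hour falls inside that window.
    """
    selected = []
    for item in info_items:
        if item["kind"] in suppressed_kinds:   # driver limited this kind
            continue
        window = item.get("hours")             # e.g. a discount event window
        if window is not None and now_hour is not None:
            start, end = window
            if not (start <= now_hour < end):
                continue
        selected.append(item)
    return selected
```

For instance, with advertisements suppressed, a lunchtime gaze would keep an event valid from 11:00 to 14:00, while a late-afternoon gaze would drop it.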
Hereinafter, this will be described with reference to Figs. 4A and 4B.
FIGS. 4A and 4B are views showing an embodiment in which information output to the display unit is changed according to predetermined conditions.
Referring to FIG. 4A, when the driver's gaze toward a specific point (B restaurant) is sensed, the controller may determine the type of information to be output according to a condition set by the driver.
If the type of information related to the B restaurant is determined by the condition preset by the driver, the controller may output only information of the determined type to the display unit.
Accordingly, as shown in FIG. 4A, information (parking lot, convenience information) related to a place associated with the B restaurant, selected by the above conditions, may be output to the display unit.
In this way, as shown in FIG. 4A, the output of information that the driver does not want (for example, advertisement information) may be limited by the condition set by the driver.
Alternatively, the information output to the display unit may be changed according to status information such as the date or the time at which the driver's gaze was sensed.
For example, referring to FIG. 4B, when the driver's gaze toward a specific point (a shopping mall) is sensed, the controller may determine the information to be output based on the date and the time at which the gaze was sensed.
For example, when the driver's line of sight is sensed on a specific date or at a specific time of a discount event occurring in the shopping mall, the controller may output event information corresponding to the discount event.
The information output to the display unit may thus be selected differently according to the state information at the time when the driver's gaze is sensed.
Accordingly, the driver can distinguish the desired information or the unwanted information through the setting. In addition, the information related to the determined point is selectively output according to the current state information, so that it is possible to provide the driver with the most appropriate information in the current state.
Meanwhile, the information output to the display unit may be output in various manners.
Also, the information output to the display unit may be output in a manner that differs for each type of information.
Hereinafter, an embodiment related to this will be described with reference to Figs. 5A and 5B.
Figures 5A and 5B are diagrams illustrating embodiments related to various output schemes of information associated with determined points.
Referring to FIG. 5A, the controller may output a plurality of pieces of information related to the determined point at once.
For example, the controller may output each of the plurality of pieces of information in a different visual manner.
In this case, the controller may apply a different visual effect (for example, a different color or highlighting) to each type of information.
Accordingly, even when a plurality of pieces of information related to the determined point are output at once, each of the plurality of pieces of information is output in a different manner, so that the driver can quickly and easily confirm desired information among other information items.
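A per-type emphasis table of the kind described above might look like the following sketch. The specific effects and colors are illustrative assumptions, not taken from the disclosure.

```python
# Illustrative style table: which visual effect each kind of information gets.
EMPHASIS = {
    "event":         {"effect": "highlight", "color": "yellow"},
    "advertisement": {"effect": "none",      "color": "white"},
    "location":      {"effect": "underline", "color": "green"},
}
DEFAULT_STYLE = {"effect": "none", "color": "white"}

def styled(info_items):
    """Attach a distinct visual style to each kind of point-related
    information, so that several items projected at once onto the
    windshield remain distinguishable from one another."""
    return [dict(item, style=EMPHASIS.get(item["kind"], DEFAULT_STYLE))
            for item in info_items]
```

Kinds outside the table fall back to the neutral default style, so new information types degrade gracefully.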
On the other hand, the controller may change the output state of the information related to the determined point according to state information of the vehicle.
Referring to FIG. 5B, information related to the determined point (B restaurant 20), such as today's recommended menu, price information, and location information, may be output to the display unit and projected onto the windshield area.
In addition, the controller may obtain state information of the vehicle, such as the distance between the vehicle and the determined point, by using the position information of the vehicle.
For example, the controller may determine, from the position information, that the vehicle is approaching the determined point.
When the state information of the vehicle indicates that the vehicle has approached within a predetermined distance of the determined point, the controller may enlarge and output the information related to the determined point.
Further, the magnified and output information can be projected onto the windshield area at the position corresponding to the driver's line of sight.
Although not shown in the drawing, the output information may be changed in various other ways according to the state of the vehicle.
In addition, although not shown in the drawing, if there are a plurality of points determined based on the sight line of the driver, the controller may output information related to each of the plurality of points.
Accordingly, as the driver approaches the point of interest, the driver can reliably grasp information related to the point of interest.
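The enlargement-with-proximity behavior can be sketched as a simple interpolation between a far and a near distance. The distance bounds and scale limits below are assumed values, not from the disclosure.

```python
def scale_for_distance(distance_m, near_m=50.0, far_m=500.0,
                       min_scale=1.0, max_scale=2.0):
    """Return a display scale factor for point-related information:
    normal size beyond far_m, fully enlarged within near_m, and
    linearly interpolated in between, so the projected information
    grows as the vehicle approaches the determined point."""
    if distance_m >= far_m:
        return min_scale
    if distance_m <= near_m:
        return max_scale
    t = (far_m - distance_m) / (far_m - near_m)
    return min_scale + t * (max_scale - min_scale)
```

Halfway between the bounds (275 m with the defaults), the scale is midway between the limits, so the growth is gradual rather than a sudden jump.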
In addition, the controller may change the output state of the information related to the determined point based on an additional input of the driver.
The additional input of the driver may be, for example, a preset gesture of the driver's hand, a preset blinking gesture of the driver's eyes, or a voice command.
In addition, the controller may sense such a gesture of the driver through the sensing unit.
6A, 6B, and 6C are views showing an embodiment in which the output state of information related to a point determined based on an additional input of a driver is changed.
Referring to the first drawing of FIG. 6A, today's discount information is output in relation to the determined point.
The controller may sense the driver's preset blinking gesture of the eyes while this information is output.
As shown in FIG. 6A, when the driver's preset blinking gesture of the eyes is sensed, the controller may change the output information and output more detailed information.
For example, if the information related to the determined point is event information, the event information may be changed to, and output as, detailed information including the specific content of the event, based on the detection of the preset eye-blinking gesture.
The detailed information may likewise be projected onto the windshield area of the vehicle.
Meanwhile, in the first drawing of FIG. 6B, position information related to the determined point is output, and other information may be added to it based on the detection of a hand movement gesture of the driver.
That is, when the preset hand movement gesture of the driver is sensed through the sensing unit 230, the controller may additionally output information related to a place associated with the determined point.
The added information may be output together with the previously output position information.
In addition, the driver can input a control command related to the output of desired information by voice. That is, referring to FIG. 6C, in a state in which information related to the determined point is output, the driver can designate desired information (for example, route guidance to the determined point) by voice.
In this case, the controller may recognize the driver's voice command and output the designated information (for example, route guidance information to the determined point).
In addition, the controller may change the output state of the information in various ways according to the driver's voice command.
Accordingly, the driver is provided with the convenience of additionally confirming other information related to the determined point or more specific information through a simple gesture even after receiving the information related to the point of interest of the driver.
In this manner, after the information related to the determined point is output, the driver may no longer want to receive information related to the determined point. In this regard, the controller may determine whether to terminate the output of the information based on a specific condition.
For example, the controller may determine whether to terminate the output based on whether a predetermined time has elapsed since the information was output, or on whether the position of the vehicle has deviated from the area corresponding to the determined point.
Figures 7A and 7B show an embodiment relating to the output termination of information associated with a determined point.
Referring to FIG. 7A, information related to the determined point is output while the vehicle is traveling.
If it is detected that the predetermined time has elapsed since the information related to the determined point was output, the controller may terminate the output of the information.
Alternatively, when the position of the vehicle deviates from the area corresponding to the determined point, the controller may terminate the output of the information.
Whether the output of the information is terminated may change depending on which of the two conditions, the predetermined time and the relative position of the vehicle with respect to the determined point, is satisfied. That is, even when the position of the vehicle deviates from the area corresponding to the determined point before the predetermined time elapses, the controller may terminate the output of the information.
However, as shown in FIG. 7B, even if the preset time has elapsed after the information related to the determined point is output, the vehicle may still be located in the area 31a corresponding to the determined point.
Therefore, in this case, the controller may maintain the output of the information related to the determined point.
After the information related to the determined point is output, whether to end the output is determined based on a specific condition (the elapsed time or the change in the vehicle position), so that control is performed to match the intention of the user.
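The termination rule described above — the vehicle's position relative to the determined point taking priority over the elapsed time — can be sketched as follows. The timeout value is an assumed figure, not from the disclosure.

```python
def should_end_output(elapsed_s: float, in_area: bool,
                      timeout_s: float = 10.0) -> bool:
    """Decide whether to stop projecting information about the determined
    point.  Per the two cases above: leaving the area corresponding to the
    point ends the output even before timeout_s has elapsed, while staying
    in the area keeps the output even after timeout_s has elapsed."""
    if not in_area:
        # Vehicle left the area of the determined point: terminate,
        # regardless of how little time has passed.
        return True
    # Still within the area: maintain the output, even when
    # elapsed_s >= timeout_s, matching the driver's likely intent.
    return False
```

Keeping the position test first encodes the priority between the two conditions in a single, easily audited branch.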
Meanwhile, the controller may perform wireless communication with an external device by using the information related to the determined point.
In this regard, description will be made with reference to Figs. 8A and 8B.
8A and 8B are views showing an embodiment of performing wireless communication with an external device using information related to a determined point.
Referring to FIG. 8A, when event information is projected in one area of the windshield, the driver may apply a preset operation gesture with respect to the event information.
As shown in the second diagram of FIG. 8A, when the operation gesture of the driver is sensed, the controller may perform wireless communication with an external device by using the event information.
Here, the external device may be a preset device capable of wireless communication with the vehicle display apparatus.
In this case, the controller may control the wireless communication unit to transmit the event information to the external device.
Alternatively, the driver can directly input a control command to control the wireless communication performed in relation to the determined point.
That is, as shown in FIG. 8B, usage information (for example, the number of waiting persons) and position information related to the determined point may be projected onto one area of the windshield.
In this case, the controller may perform wireless communication with a server related to the determined point according to the driver's control command.
Alternatively, the controller may perform a call origination function toward a contact related to the determined point according to the driver's control command.
In this way, the controller can provide the driver with additional functions, such as wireless communication with an external device or an external server, by using the information related to the determined point.
The vehicle display apparatus and the control method thereof according to an embodiment of the present invention may determine a point at which the driver's line of sight is directed through the sight line analysis of the driver and then output information related to the determined point. Accordingly, the driver can easily receive desired information without any additional input such as directly searching for information related to his / her point of interest.
Further, since the information related to the determined point is projected at the position facing the driver's line of sight in the windshield area of the vehicle, the driver does not have to move his or her eyes to confirm the information related to the determined point during driving, which provides convenience to the driver.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). Also, the computer may include the controller of the vehicle display device.
Claims (20)
A camera for detecting the driver's gaze;
A wireless communication unit for receiving location information of the vehicle on which the driver is boarding; And
A controller configured to determine a point at which the driver's line of sight is pointed by using the line-of-sight reaching distance obtained from the driver's line of sight detected through the camera and the position information of the vehicle,
And controlling the display unit so that information related to the determined point is output.
Wherein the line-of-sight reaching distance is calculated by an angle formed by a line of sight corresponding to the left eye of the driver and a line of sight corresponding to the right eye of the driver.
Wherein the controller determines the direction of the driver's gaze, in pre-stored map data, from the point corresponding to the position information of the vehicle and the point corresponding to the line-of-sight reaching distance.
And determines a direction in which the line of sight of the driver is maintained for a predetermined time or longer.
Wherein the information related to the determined point is at least one of advertisement information about the determined point, position information of the determined point, information about a place associated with the determined point, and event information associated with the determined point.
And outputs information related to at least one of a date and time when the driver's line of sight is detected among the plurality of information when the information related to the determined point is a plurality of information.
Wherein the controller controls the wireless communication unit to transmit a signal requesting receipt of information related to the determined point to an external server when a point of sight of the driver is determined.
And controls the display unit such that information related to the determined point is projected onto the windshield area of the vehicle.
And controls the display unit so that information related to the determined point is projected to a position corresponding to the sight line of the driver out of the windshield area of the vehicle.
Further comprising a sensing unit sensing a gesture of the driver,
Wherein,
And controls the output state of information related to the determined point when the gesture of the driver is detected in a state in which information related to the determined point is outputted.
Wherein the information related to the determined point is event information for the determined point,
Wherein,
Wherein the control unit controls the display unit so that detailed information including a specific content of the event information is additionally output when the gesture of the driver is detected.
Wherein the information related to the determined point is position information of the determined point,
Wherein,
Wherein the control unit controls the display unit such that, when the determined point is a predetermined point, the position information is changed to information related to a place associated with the determined point and output when the gesture of the driver is detected.
A gesture of the driver's hand and a blinking gesture of the driver's eye.
Based on whether or not the position of the vehicle falls outside the area corresponding to the determined point and a case where a predetermined time elapses after the information related to the determined point is outputted, And determines whether or not to terminate the output.
The control unit controls the display unit so that the output of the information related to the determined point is terminated when the position of the vehicle is out of the area corresponding to the determined point but the predetermined time has not elapsed since the information related to the determined point was outputted and,
And controlling the display unit to maintain the output of information related to the determined point when a predetermined time elapses after the information related to the determined point is output but the position of the vehicle does not deviate from the area corresponding to the determined point And a display device for a vehicle.
Calculating a line-of-sight reaching distance to a point where the driver's line of sight is pointed by using the sight line of the detected driver and the position information of the vehicle received from the outside;
Determining a direction of the driver's gaze at a point corresponding to the position of the vehicle and a point corresponding to the gaze reaching distance in the pre-stored map data; And
And outputting information related to the determined point.
Wherein the visual line reaching distance is calculated by an angle formed by a line of sight corresponding to the left eye of the driver and a line of sight corresponding to the right eye of the driver.
Wherein the direction of the driver's gaze is a direction in which the driver's gaze is maintained for a predetermined time or more.
Wherein the step of outputting information related to the determined point comprises:
Outputting information related to the determined point to a display unit; And
And controlling the display unit so that information related to the determined point is projected onto a windshield area of the vehicle.
Wherein the information related to the determined point is projected to a position corresponding to the driver's line of sight in the windshield area of the vehicle.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150049197A KR20160120104A (en) | 2015-04-07 | 2015-04-07 | Display device for vehicles and method for controlling the same |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150049197A KR20160120104A (en) | 2015-04-07 | 2015-04-07 | Display device for vehicles and method for controlling the same |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20160120104A true KR20160120104A (en) | 2016-10-17 |
Family
ID=57250298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150049197A KR20160120104A (en) | 2015-04-07 | 2015-04-07 | Display device for vehicles and method for controlling the same |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20160120104A (en) |
2015-04-07: KR KR1020150049197A patent/KR20160120104A/en not_active Application Discontinuation
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2019107730A1 (en) * | 2017-11-29 | 2019-06-06 | 삼성전자주식회사 | Electronic device and text providing method therefor |
KR20190069633A (en) * | 2017-11-29 | 2019-06-20 | 삼성전자주식회사 | Electronic apparatus and method for providing a text thereof |
CN111727132A (en) * | 2017-11-29 | 2020-09-29 | 三星电子株式会社 | Electronic device and text providing method thereof |
EP3666579A4 (en) * | 2017-11-29 | 2020-10-21 | Samsung Electronics Co., Ltd. | Electronic device and text providing method therefor |
US11440408B2 (en) | 2017-11-29 | 2022-09-13 | Samsung Electronics Co., Ltd. | Electronic device and text providing method therefor |
CN111727132B (en) * | 2017-11-29 | 2023-07-25 | 三星电子株式会社 | Electronic device and text providing method thereof |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9909892B2 (en) | Terminal and method for controlling the same | |
KR101595957B1 (en) | Mobile terminal and controlling system | |
KR20160001266A (en) | Mobile terminal and method for controlling the same | |
KR101641424B1 (en) | Terminal and operating method thereof | |
KR20160019760A (en) | Mobile terminal and control method for the mobile terminal | |
KR101631999B1 (en) | Mobile terminal and method for controlling the same | |
KR20160086684A (en) | Mobile terminal and method for controlling the same | |
KR20150095124A (en) | Mobile terminal and control method for the mobile terminal | |
KR20170059760A (en) | Mobile terminal and method for controlling the same | |
KR20160097655A (en) | Mobile terminal and method for controlling the same | |
KR101677644B1 (en) | Mobile terminal and method of controlling the same | |
KR20190089293A (en) | Electronic device and method for controlling the same | |
KR20180017638A (en) | Mobile terminal and method for controlling the same | |
KR20160001229A (en) | Mobile terminal and method for controlling the same | |
KR101746503B1 (en) | Mobile terminal and method for controlling the same | |
KR20160016397A (en) | Mobile terminal and method for controlling the same | |
KR20160019279A (en) | Mobile terminal and method for controlling the same | |
KR20160148876A (en) | Mobile terminal payment authorizatable at the scheduled time and method for controlling the same | |
KR20160024538A (en) | Telematics terminal for providing user with customized recommended destination and method for controlling using the same | |
KR101677392B1 (en) | Mobile terminal with payment | |
KR101651011B1 (en) | Mobile terminal | |
KR20160120104A (en) | Display device for vehicles and method for controlling the same | |
KR20170013062A (en) | Mobile terminal and method for controlling the same | |
KR20160043842A (en) | Mobile terminal | |
KR101698104B1 (en) | Apparatus for guiding traggic lane and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E902 | Notification of reason for refusal | ||
AMND | Amendment | ||
E601 | Decision to refuse application | ||
E801 | Decision on dismissal of amendment |