KR20160120101A - Vehicle terminal and control method thereof - Google Patents
Vehicle terminal and control method thereof
- Publication number
- KR20160120101A
- Authority
- KR
- South Korea
- Prior art keywords
- information
- driver
- display unit
- vehicle
- gesture
- Prior art date
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K28/00—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
- B60K28/02—Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- B60K2350/1052—
- B60W2040/08—
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Abstract
The present invention relates to a vehicle terminal that controls information (data) of the vehicle terminal based on a driver's gaze and a gesture detected by a driver state monitoring (DSM) system, and a control method thereof. The vehicle terminal comprises: a first display unit for displaying route guidance information; a second display unit for displaying a control menu of the vehicle terminal; and a controller that, when a gaze directed at the second display unit and a gesture for controlling information of the second display unit are detected while the route guidance information is being displayed on the first display unit, controls the information of the second display unit according to the gesture.
Description
The present invention relates to a vehicle terminal and a control method thereof.
Generally, the vehicle terminal receives traffic information from a server and provides route guidance information based on the received traffic information.
It is an object of the present invention to provide a vehicle terminal and a control method thereof for controlling information (data) of a vehicle terminal based on a driver's sight line and a gesture detected by a driver state monitoring (DSM) system.
A vehicle terminal according to an embodiment disclosed herein includes: a first display unit for displaying route guidance information; a second display unit for displaying a control menu of the vehicle terminal; and a controller that, when a gaze directed at the second display unit and a gesture for controlling information of the second display unit are detected while the route guidance information is being displayed on the first display unit, controls the information of the second display unit according to the gesture.
According to an embodiment disclosed herein, the controller receives the driver's gaze and the driver's head position from the DSM system whenever an event related to the route guidance information occurs, and, if the driver's gaze is not directed ahead of the vehicle and the driver's head is not facing the front of the vehicle when the event occurs, generates warning information indicating that the driver's driving state is dangerous and outputs the warning information.
According to an embodiment disclosed herein, the vehicle terminal may further include a head-up display (HUD) for displaying information on the windshield of the vehicle, and the controller may display route guidance information matched to the actual road on the windshield through the HUD, receive the driver's head position from the DSM system while the route guidance information is displayed, and change the display position of the route guidance information displayed on the windshield based on the driver's head position.
According to an embodiment disclosed herein, the route guidance information may be a turn-by-turn arrow.
According to an embodiment disclosed herein, the controller may receive the driver's viewing angle from the DSM system and change the display position of the route guidance information displayed on the windshield based on the driver's viewing angle and the driver's head position.
According to an embodiment disclosed herein, the controller may generate and output request information requesting reception of an incoming call, display caller information of the incoming call on any one of the instrument panel of the vehicle, the first display unit, and the second display unit, detect the driver's gaze and gesture through the DSM system, and, if the driver's gaze detected while the request information is output rests on any one of the displayed caller information and the gesture is a preset gesture for receiving the incoming call, receive the incoming call and perform a telephone conversation.
According to an embodiment disclosed herein, the controller may reject the incoming call if the driver's gaze detected while the request information is output rests on any one of the caller information of the incoming call and the gesture is a preset gesture for rejecting the incoming call.
According to an embodiment disclosed herein, the controller may display music information on any one of the instrument panel of the vehicle, the first display unit, and the second display unit during music reproduction, detect the driver's gaze and gesture through the DSM system, and, if the driver's gaze detected during the music reproduction rests on the music information display and the detected gesture is a preset gesture for music volume control or for selecting the next or previous track, execute the music volume control or the next or previous track selection.
According to an embodiment disclosed herein, the controller may enlarge or reduce map data displayed on the first display unit when a gesture for enlarging or reducing the map data is received while the driver's gaze is directed at the first display unit.
According to an embodiment disclosed herein, the controller may receive the driver's hand position and hand gesture from the DSM system and, when the received hand position is adjacent to the first display unit or the second display unit and the received hand gesture is a gesture for controlling information displayed on that display unit, control the information displayed on the first display unit or the second display unit.
According to an embodiment disclosed herein, the controller may control the information displayed on the first display unit or the second display unit when a voice command for controlling that information is received while the driver's gaze is directed at the first display unit or the second display unit.
According to an embodiment disclosed herein, the controller may display a rear-seat image of the vehicle photographed by a first camera on the display unit of the rearview mirror when a voice command or a gesture requesting display of the rear-seat image on the display unit of the rearview mirror is received while the driver's gaze is directed at the display unit of the rearview mirror of the vehicle.
According to an embodiment disclosed herein, the controller may display a rear image of the vehicle photographed by a second camera, whose photographing direction differs from that of the first camera, on the display unit of the rearview mirror when a voice command or a gesture requesting display of the rear image on the display unit of the rearview mirror is received while the driver's gaze is directed at the display unit of the rearview mirror of the vehicle.
According to an embodiment disclosed herein, the controller may detect the driver's biometric information through a watch-type terminal mounted on the driver's wrist, generate warning information if the biometric information detected while the driver's gaze is not directed ahead of the vehicle is below or above a reference value, and output the generated warning information.
According to an embodiment disclosed herein, the controller may display the warning information on at least one of the instrument panel of the vehicle, the watch-type terminal, the first display unit, and the second display unit.
According to an embodiment disclosed herein, the controller may receive the driver's eye-closed time from the DSM system, generate different warning information according to the received eye-closed time, and output the different warning information.
According to an embodiment disclosed herein, the controller may receive from the DSM system the driver's eye-closed time and the non-gazing time during which the driver does not gaze ahead of the vehicle, generate different warning information based on the received eye-closed time and non-gazing time, and output the different warning information.
A control method of a vehicle terminal according to an embodiment disclosed herein is a control method of a vehicle terminal connected to a driver state monitoring (DSM) system that detects a driver's gaze and gesture in a vehicle, the method comprising: displaying route guidance information on a first display unit of the vehicle terminal; displaying a control menu of the vehicle terminal on a second display unit of the vehicle terminal; and, when the detected gaze is directed at the second display unit and a gesture for controlling information of the second display unit is detected while the route guidance information is being displayed on the first display unit, controlling the information of the second display unit according to the gesture.
The vehicle terminal and the control method thereof according to the embodiments of the present invention can easily and conveniently control information (data) of the vehicle terminal based on the driver's gaze and the gesture detected by the DSM (driver state monitoring) system.
FIG. 1 is a schematic view showing a vehicle for explaining an embodiment of the present invention.
FIG. 2 is a diagram showing a configuration of a vehicle for explaining an embodiment of the present invention.
FIG. 3 is a block diagram illustrating a configuration of a telematics terminal for explaining embodiments of the present invention.
FIG. 4 is a block diagram illustrating a mobile terminal according to the present invention.
FIGS. 5A and 5B are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
FIG. 6 is a perspective view showing an example of a watch-type mobile terminal according to an embodiment of the present invention.
FIG. 7 is a flowchart illustrating a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 8 is a view illustrating an example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 9 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 10 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 11 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 12 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIG. 13 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
FIGS. 14A and 14B are views illustrating another method of controlling a vehicle terminal according to an embodiment of the present invention.
FIG. 15 is another exemplary diagram illustrating a control method of a vehicle terminal according to an embodiment of the present invention.
It is noted that the technical terms used herein are used only to describe specific embodiments and are not intended to limit the invention. The technical terms used herein are to be interpreted in the sense generally understood by a person of ordinary skill in the art to which the present invention belongs, unless defined otherwise, and should not be interpreted in an excessively broad or excessively narrow sense. Further, when a technical term used herein is an erroneous term that does not accurately express the spirit of the present invention, it should be understood as replaced by a technical term that can be correctly understood by a person skilled in the art. In addition, general terms used herein should be interpreted according to their dictionary definitions or in context, and should not be interpreted in an excessively narrow sense.
Also, as used herein, singular forms include plural referents unless the context clearly dictates otherwise. In the present application, terms such as "comprising" or "including" should not be construed as necessarily including all of the various elements or steps described in the specification; some elements or steps may be absent, or additional elements or steps may be included.
Also, terms including ordinals such as first and second used herein may be used to describe various constituent elements, but the constituent elements should not be limited by these terms. The terms are used only to distinguish one element from another. For example, without departing from the scope of the present invention, a first element may be referred to as a second element, and similarly, a second element may also be referred to as a first element.
Hereinafter, exemplary embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals refer to like or similar elements throughout the several views, and redundant description thereof will be omitted.
In the following description, well-known functions or constructions are not described in detail where they would obscure the invention in unnecessary detail. The accompanying drawings are provided only to facilitate understanding of the present invention, and the technical idea of the present invention should not be construed as limited by them.
FIG. 1 is a schematic view showing a vehicle (for example, an electric vehicle) for explaining an embodiment of the present invention. Embodiments of the present invention can be applied not only to conventional automobiles (for example, gasoline vehicles, diesel vehicles, gas vehicles, etc.) but also to pure electric vehicles and hybrid electric vehicles. A hybrid electric vehicle (HEV) is equipped with a battery pack composed of a plurality of battery cells in order to receive the necessary power. The plurality of battery cells included in the battery pack need to have uniform voltages in order to improve safety and lifespan and to obtain high output.
FIG. 2 is a diagram showing the configuration of a vehicle (for example, a hybrid electric vehicle) for explaining an embodiment of the present invention.
As shown in FIG. 2, a
The M /
The M /
The
The power of the
The
The
The electric vehicle is a hybrid electric vehicle electronic control unit (HEV-ECU) that implements an electric vehicle that communicates with and controls the
The regenerative braking force calculated by the
The electric vehicle further comprises a
The HEV-
In addition, the electric vehicle includes a
Hereinafter, a configuration of a
FIG. 3 is a block diagram illustrating the configuration of a telematics terminal 200 for explaining embodiments of the present invention.
As shown in FIG. 3, the
The
The main board is provided with a code division multiple access (CDMA)
Also, the
The
The
The
The audio output unit (amplifier) 226 is connected to the
The
The
A speech recognition device (or speech recognition module) 298 recognizes the speech uttered by the user and performs the corresponding function according to the recognized speech signal.
A navigation session 299 applied to the telematics terminal 200 displays a travel route on the map data and, when the position of the mobile terminal is within a predetermined distance from a blind spot included in the travel route, automatically forms a wireless network with a terminal mounted on a nearby vehicle (for example, a car navigation device) and/or a mobile terminal carried by a nearby pedestrian through wireless communication (for example, a short-range wireless communication network), thereby receiving the location information of the nearby vehicle from the terminal mounted on it and the location information of the nearby pedestrian from the mobile terminal carried by the pedestrian.
The mobile terminal described in this specification may include a mobile phone, a smart phone, a laptop computer, a digital broadcasting terminal, a personal digital assistant (PDA), a portable multimedia player (PMP), a navigation device, a slate PC, a tablet PC, an ultrabook, and a wearable device such as a smartwatch, a smart glass, or a head mounted display (HMD).
FIG. 4 is a block diagram for explaining a
The
The
The
The
The
The biometric sensor may include any one or more of a fingerprint recognition sensor, an iris recognition sensor, a PPG (photoplethysmogram) sensor, an ECG (electrocardiogram) sensor, and a heart rate variability (HRV) sensor.
Meanwhile, the mobile terminal disclosed in the present specification can combine and utilize information sensed by at least two of the sensors.
The
The
In addition, the
The
In addition, the
The
Meanwhile, the
At least some of the components may operate in cooperation with one another to implement a method of operation, control, or control of a mobile terminal according to various embodiments described below. In addition, the operation, control, or control method of the mobile terminal may be implemented on the mobile terminal by driving at least one application program stored in the
Hereinafter, the components listed above will be described in more detail with reference to FIG. 4 before explaining various embodiments implemented through the
First, referring to the
The
The wireless signal may include various types of data depending on a voice call signal, a video call signal or a text / multimedia message transmission / reception.
The
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced); the wireless Internet module 113 transmits and receives data according to at least one such wireless Internet technology, including Internet technologies not listed above.
In view of the fact that wireless Internet access by WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like is achieved through a mobile communication network, the wireless Internet module performing wireless Internet access through the mobile communication network may be understood as a kind of mobile communication module.
The short-
Herein, another
The
The
The
The
Meanwhile, the
First, the
Examples of the
On the other hand, for convenience of explanation, the act of bringing an object close to the touch screen without contact, such that the object is recognized as being located on the touch screen, is referred to as a "proximity touch," and the act of actually bringing an object into contact with the touch screen is referred to as a "contact touch." The position at which an object is proximity-touched on the touch screen corresponds to the position at which the object vertically faces the touch screen when the proximity touch is made. The
The touch sensor senses a touch (or a touch input) applied to the touch screen (or the display unit 151) using at least one of various touch methods, such as a resistive film type, a capacitive type, an infrared type, an ultrasonic type, and the like.
For example, the touch sensor may be configured to convert a change in pressure applied to a specific portion of the touch screen, or a change in capacitance occurring at a specific portion, into an electrical input signal. The touch sensor may be configured to detect the position and area at which a touch object touches the touch screen, the pressure at the time of touch, the capacitance at the time of touch, and the like. Here, the touch object is an object that applies a touch to the touch sensor and may be, for example, a finger, a touch pen, a stylus pen, or a pointer.
Thus, when there is a touch input to the touch sensor, the corresponding signal(s) is sent to the touch controller. The touch controller processes the signal(s) and transmits the corresponding data to the controller 180. Thus, the
Meanwhile, the
On the other hand, the touch sensor and the proximity sensor discussed above can be used independently or in combination to sense various types of touches on the touch screen, such as a short (or tap) touch, a long touch, a multi touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, and a hovering touch.
The ultrasonic sensor can recognize the position information of the object to be sensed by using ultrasonic waves. On the other hand, the
The
The
The
In addition, the
In the stereoscopic display unit, a three-dimensional display system such as a stereoscopic system (glasses system), an autostereoscopic system (no-glasses system), and a projection system (holographic system) can be applied.
The
The
In addition to vibration, the
The
The
The signal output from the
The
The identification module is a chip for storing various information for authenticating the use right of the
The
The
The
Meanwhile, as described above, the
The
The
In addition, the
As another example, the
In the following, various embodiments may be embodied in a recording medium readable by a computer or similar device using, for example, software, hardware, or a combination thereof.
Meanwhile, the
Here, the
The tap object to which the tap is applied may be a body of the
Meanwhile, the object to which the tap gesture is applied may include at least one of the body of the
Meanwhile, in the present invention, the tap or tap gesture may be sensed by at least one of the acceleration sensor and the touch sensor included in the
That is, the acceleration sensor senses the motion (or vibration) of the main body of the
As long as the acceleration sensor is capable of detecting movement or vibration in the main body of the
In the mobile terminal according to the present invention, only one of the acceleration sensor and the touch sensor may be used, or the acceleration sensor and the touch sensor may be sequentially used to detect the taps of the
On the other hand, when the tap is sensed through the touch sensor, it is possible to more accurately identify the position at which the tap is sensed.
In the
For example, in the doze mode, only a light emitting element for outputting a screen in the
Therefore, in the doze mode state, that is, when the
Also, the
Accordingly, the
That is, tap gestures may mean that at least two tap gestures are continuously sensed within a reference time. Accordingly, in the following description, the detection of a "tap" may mean that a user's finger or a touch pen is struck against the body of the
Further, the
Further, the
The taps may be a plurality of taps continuously sensed within the reference time. Here, the reference time may be a very short time, for example, 300 ms to 2 s.
If it is detected that the main body of the
There are various methods for allowing the 'valid tap' to be recognized. For example, the
Also, the
It is to be understood that the reference time and the predetermined region may be variously modified according to the embodiment.
Meanwhile, the first tap and the second tap may be sensed as separate taps not only based on the reference time and the predetermined area, but also based on the position at which each tap is detected. That is, the
Also, when the first tap and the second tap each consist of a plurality of touches, that is, a plurality of taps, the plurality of touches constituting the first tap and the second tap may be sensed simultaneously. For example, when a first touch constituting the first tap is sensed, and a first touch constituting the second tap is sensed at a position spaced apart from the position at which the first touch of the first tap was sensed, the
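The tap rules above can be condensed into a short sketch. This is only an illustrative reading of the "valid tap" condition, not the patent's implementation; the constants (a 0.5 s reference time within the stated 300 ms to 2 s range, an 80-pixel region) and all names are assumptions.

```python
# Minimal sketch: a second tap counts as part of the same "knockknock"
# only if it arrives within a reference time of the first tap and lands
# inside a predetermined region around it.
import math

REFERENCE_TIME_S = 0.5   # assumed value within the stated 300 ms .. 2 s range
REGION_RADIUS_PX = 80    # assumed radius of the "predetermined region"

class TapDetector:
    def __init__(self):
        self.first_tap = None  # (time, x, y) of a pending first tap

    def on_tap(self, t: float, x: float, y: float) -> bool:
        """Return True when a valid double tap completes."""
        if self.first_tap is not None:
            t0, x0, y0 = self.first_tap
            in_time = (t - t0) <= REFERENCE_TIME_S
            in_region = math.hypot(x - x0, y - y0) <= REGION_RADIUS_PX
            self.first_tap = None
            if in_time and in_region:
                return True
        # otherwise treat this tap as a new first tap
        self.first_tap = (t, x, y)
        return False

d = TapDetector()
print(d.on_tap(0.00, 100, 100))  # False: first tap only
print(d.on_tap(0.30, 110, 105))  # True: second tap in time and in region
```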
If a tap on the main body of the
For example, the
Here, the functions that can be executed on the
As another example, the functions executable in the
Meanwhile, the
In addition, the function of the
FIGS. 5A and 5B are conceptual diagrams illustrating an example of a mobile terminal according to the present invention, viewed from different directions.
Referring to FIGS. 5A and 5B, the disclosed
Here, the terminal body can be understood as a concept of referring to the
The
A
In some cases, electronic components may be mounted on the
As shown, when the
These
The
Meanwhile, the
The
As shown in FIGS. 5A and 5B, a
However, these configurations are not limited to this arrangement. These configurations may be excluded or replaced as needed, or placed on different planes. For example, the
The
The
In addition, the
The
On the other hand, the touch sensor is formed of a film having a touch pattern and is disposed between the
In this way, the
The first
An acoustic hole may be formed in the
The
The
The first and
In the figure, the
Contents inputted by the first and
Meanwhile, a rear input unit (not shown) may be provided on the rear surface of the terminal body as another example of the
The rear input unit may be disposed so as to overlap the
When a rear input unit is provided on the rear surface of the terminal body, a new type of user interface using the rear input unit can be realized. In the case where the
Meanwhile, the
The
The
And a
The
The
A second
The terminal body may be provided with at least one antenna for wireless communication. The antenna may be embedded in the terminal body or formed in the case. For example, an antenna constituting a part of the
The terminal body is provided with a power supply unit 390 (see FIG. 4) for supplying power to the
The
The
The
Next, a communication system that can be implemented through the
First, the communication system may use different wireless interfaces and/or physical layers. For example, wireless interfaces that can be used by the communication system include Frequency Division Multiple Access (FDMA), Time Division Multiple Access (TDMA), Code Division Multiple Access (CDMA), Universal Mobile Telecommunications Systems (UMTS) (in particular, Long Term Evolution (LTE) and Long Term Evolution-Advanced (LTE-A)), and Global System for Mobile Communications (GSM).
Hereinafter, for the sake of convenience of description, the description will be limited to CDMA. However, it is apparent that the present invention can be applied to all communication systems including an OFDM (Orthogonal Frequency Division Multiplexing) wireless communication system as well as a CDMA wireless communication system.
A CDMA wireless communication system includes at least one
Each of the plurality of BSs may comprise at least one sector, and each sector may comprise an omnidirectional antenna or an antenna pointing in a particular radial direction from the BS. Each sector may also include two or more antennas of various types. Each BS may be configured to support a plurality of frequency assignments, and each of the plurality of frequency assignments may have a specific spectrum (e.g., 1.25 MHz, 5 MHz, etc.).
A broadcast transmission unit (BT) transmits a broadcast signal to
In addition, a Global Positioning System (GPS) may be associated with the CDMA wireless communication system to identify the location of the
The
The
The WiFi Positioning System (WPS) is a technology for tracking the position of a mobile terminal using a WiFi module provided in the mobile terminal and wireless access points (APs) that transmit or receive wireless signals to or from the WiFi module, and refers to a WLAN-based position determination technology using WiFi.
The WiFi location tracking system may include a Wi-Fi location server, a
The
The Wi-Fi location server extracts information of the wireless AP connected to the
The information of the wireless AP to be extracted based on the location information request message of the
As described above, the Wi-Fi position location server can receive the information of the wireless AP connected to the
Thereafter, the Wi-Fi position location server may extract (or analyze) the location information of the
As a method for extracting (or analyzing) the position information of the
The Cell-ID method determines the position of the mobile terminal as that of the wireless AP with the strongest signal strength among the neighboring wireless AP information collected by the mobile terminal. The implementation is simple, it requires no additional cost, and location information can be acquired quickly; however, positioning accuracy degrades when the installation density of wireless APs is low.
The fingerprint method collects signal strength information by selecting reference positions in a service area, and estimates the position by comparing the signal strength information transmitted from the mobile terminal against the collected signal strength information. In order to use the fingerprint method, the propagation characteristics must be stored in a database in advance.
The triangulation method calculates the position of the mobile terminal based on the coordinates of at least three wireless APs and the distances between the mobile terminal and those APs. To measure the distance between the mobile terminal and a wireless AP, the signal strength converted into distance information, the time of arrival (ToA) of a signal, the time difference of arrival (TDoA) of a signal, the angle of arrival (AoA) of a signal, or the like may be used.
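For illustration, the triangulation just described reduces to solving two linear equations once the three AP-to-terminal distances are known (for example, from ToA measurements). This is a generic trilateration sketch under assumed coordinates and names, not the patent's algorithm.

```python
# Subtracting the first circle equation (x-xi)^2 + (y-yi)^2 = ri^2 from
# the other two yields a 2x2 linear system A [x, y]^T = b.
def trilaterate(aps, dists):
    (x1, y1), (x2, y2), (x3, y3) = aps
    r1, r2, r3 = dists
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# APs at known positions; true terminal position is (1, 2)
print(trilaterate([(0, 0), (4, 0), (0, 4)],
                  [(1 + 4) ** 0.5, (9 + 4) ** 0.5, (1 + 4) ** 0.5]))  # (1.0, 2.0)
```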
The landmark method measures the position of a mobile terminal using a landmark transmitter whose location is known.
Various algorithms can be utilized as a method for extracting (or analyzing) the location information of the mobile terminal.
The location information of the extracted
The
As shown in FIG. 4, the mobile terminal according to the present invention may include Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), and Wireless USB (Wireless Universal Serial Bus).
Among them, the NFC module provided in the mobile terminal supports contactless short-range wireless communication between terminals at a distance of about 10 cm. The NFC module may operate in a card mode, a reader mode, or a P2P mode. In order for the NFC module to operate in the card mode, the
When the NFC module is operated in the card mode, the mobile terminal can transmit stored card information to the outside like a conventional IC card. Specifically, if a mobile terminal storing the card information of a payment card such as a credit card or a bus card is brought close to a fare payment machine, mobile payment on the spot can be processed, and if a mobile terminal storing the card information of an access card is brought close to an access gate, the approval process for access may begin. Cards such as credit cards, transportation cards, and access cards are mounted on the security module in the form of an applet, and the security module can store the card information of the mounted card. Here, the card information of a payment card may be at least one of a card number, a balance, and usage details, and the card information of an access card may be at least one of a name, a number, and access history.
When the NFC module is operated in the reader mode, the mobile terminal can read data from an external tag. The data received by the mobile terminal from the tag may be coded in the NFC Data Exchange Format defined by the NFC Forum. The NFC Forum defines four record types (RTDs, Record Type Definitions): Smart Poster, Text, Uniform Resource Identifier (URI), and General Control. If the data received from the tag is of the smart poster type, the controller executes a browser (e.g., an Internet browser); if the data received from the tag is of the text type, the controller can execute a text viewer. If the data received from the tag is of the URI type, the controller executes the browser or makes a telephone call, and if the data received from the tag is of the general control type, the controller can execute an appropriate operation according to the control contents.
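The reader-mode behavior just described maps each NFC Forum record type to an action. A hedged sketch of that mapping follows; the function name and the RTD strings are illustrative assumptions, not NFC Forum identifiers.

```python
# Sketch of the record-type dispatch described above: the RTD of data
# read from a tag selects the action taken by the controller.
def handle_nfc_record(rtd: str, payload: str) -> str:
    if rtd == "smart_poster":
        return f"open browser: {payload}"
    if rtd == "text":
        return f"open text viewer: {payload}"
    if rtd == "uri":
        # a URI may be browsed or, for tel: URIs, dialed
        return f"dial {payload}" if payload.startswith("tel:") else f"open browser: {payload}"
    if rtd == "general_control":
        return f"execute control: {payload}"
    return "unsupported record type"

print(handle_nfc_record("uri", "tel:+821012345678"))  # dial tel:+821012345678
```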
When the NFC module is operated in a peer-to-peer (P2P) mode, the mobile terminal can perform P2P communication with another mobile terminal. At this time, LLCP (Logical Link Control Protocol) may be applied to P2P communication. For P2P communication, a connection can be created between the mobile terminal and another mobile terminal. At this time, the generated connection can be divided into a connectionless mode in which one packet is exchanged and terminated, and a connection-oriented mode in which packets are exchanged consecutively. Through P2P communications, data such as business cards, contact information, digital photos, URLs in electronic form, and setup parameters for Bluetooth and Wi-Fi connectivity can be exchanged. However, since the usable distance of NFC communication is short, the P2P mode can be effectively used to exchange small-sized data.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the controller 180 of the terminal. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.
Hereinafter, embodiments related to a control method that can be implemented in a mobile terminal configured as above will be described with reference to the accompanying drawings. It will be apparent to those skilled in the art that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof.
The
The
Meanwhile, the mobile terminal can be extended to a wearable device that can be worn on the body, beyond a device that the user mainly holds in the hand. Such wearable devices include a smart watch, a smart glass, and a head mounted display (HMD). Hereinafter, examples of a mobile terminal extended to a wearable device will be described.
The wearable device can be made capable of interchanging (or interlocking) data with another
FIG. 6 is a perspective view illustrating an example of a watch-type
As shown in FIG. 6, a watch-type
The
The
A
The
The
On the other hand, the
The
The vehicle can be conveniently and quickly controlled based on the vehicle control information of the wearable device (e.g., a smartwatch, a smart glass, or a head mounted display (HMD)). The vehicle terminal may be the
FIG. 7 is a flowchart illustrating a control method of a vehicle terminal according to an embodiment of the present invention.
First, the
The
The DSM (driver state monitoring) system detects the gaze and gesture of the driver of the vehicle in real time and transmits the detected gaze and gesture to the
The
Hereinafter, a method of automatically controlling the
FIG. 8 is a view illustrating an example of a control method of a vehicle terminal according to an embodiment of the present invention.
The first display unit 201-a and the second display unit 201-b of the
The
The
When a gaze directed at the second display unit 201-b and a gesture for reducing the brightness of the second display unit 201-b are detected while the vehicle is running, the controller reduces the brightness of the second display unit 201-b. When a gaze directed at the second display unit 201-b and a gesture for increasing the brightness of the second display unit 201-b are detected while the vehicle is running, the controller increases the brightness of the second display unit 201-b.
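The brightness behavior just described amounts to gating gestures on the reported gaze target. The following is a minimal sketch of that rule; the class, gesture, and gaze-target names and the 0.1 brightness step are illustrative assumptions, not the patent's identifiers.

```python
# Gestures only act on the second display while the DSM system reports
# the driver's gaze on it; route guidance on the first display is
# therefore never disturbed.
class SecondDisplay:
    def __init__(self, brightness: float = 0.5):
        self.brightness = brightness  # normalized 0.0 .. 1.0

def on_dsm_event(display: SecondDisplay, gaze_target: str, gesture: str) -> None:
    if gaze_target != "second_display":
        return  # ignore gestures unless the driver is looking at this display
    if gesture == "brightness_down":
        display.brightness = max(0.0, display.brightness - 0.1)
    elif gesture == "brightness_up":
        display.brightness = min(1.0, display.brightness + 0.1)

d = SecondDisplay()
on_dsm_event(d, "second_display", "brightness_down")
print(round(d.brightness, 1))  # 0.4
on_dsm_event(d, "road", "brightness_up")
print(round(d.brightness, 1))  # still 0.4: driver was not gazing at it
```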
The
A part of the area 201-b1 of the second display unit 201-b may be an area for displaying the state information (for example, signal intensity information, announcement information, message information, etc.) of the
Hereinafter, a method of outputting notification information (warning information) indicating a dangerous state when route guidance information (for example, turn-by-turn information, speed limit information, etc.) is generated while the driver's gaze is not directed ahead of the vehicle will be described with reference to FIG. 9.
FIG. 9 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 9, when the
The
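A minimal sketch of the warning rule summarized earlier (and recited in claim 2) follows: a warning is generated only when a guidance event occurs while neither the driver's gaze nor the driver's head is directed ahead. The function name and message strings are illustrative assumptions.

```python
from typing import Optional

def check_guidance_event(gaze_forward: bool, head_forward: bool) -> Optional[str]:
    # Warn only when neither the gaze nor the head is directed ahead,
    # mirroring the "and" condition in the summary above.
    if not gaze_forward and not head_forward:
        return "WARNING: driver is not attending to the road during a guidance event"
    return None

print(check_guidance_event(gaze_forward=False, head_forward=False))  # warning text
print(check_guidance_event(gaze_forward=False, head_forward=True))   # None
```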
Hereinafter, a method of keeping the route guidance information (for example, turn-by-turn arrows) displayed on the HUD (head-up display) 700 matched to the actual road, by setting the display position of the route guidance information according to the driver's viewing angle and head position even when the driver's head position changes, will be described with reference to FIG. 10.
FIG. 10 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 10, the
For example, even when route guidance information (for example, a turn-by-turn arrow) matched to the actual road is displayed on the HUD (head-up display) 700, if the driver's head position changes, the driver's viewing angle also changes, so that the route guidance information no longer matches the actual road. That is, if the driver's head height is at the reference height 10-1, the turn-by-turn arrow 10-1a displayed on the HUD (head-up display) 700 is displayed so as to match the actual road; however, if the driver's head height 10-2 is higher than the reference height 10-1, the turn-by-turn arrow 10-2a displayed on the head-up display 700 no longer matches the actual road.
Therefore, if the driver's head height 10-2 is higher than the reference height 10-1, the controller changes the display position of the route guidance information based on the difference between the head height 10-2 and the reference height 10-1, so that the route guidance information matches the actual road again.
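One way to realize the repositioning just described is a similar-triangles correction: if the eye point rises, the overlay point on the windshield must rise so that the line of sight still passes through the same road point. The formula and example distances below are an assumed geometric sketch, not the patent's stated method.

```python
# With the windshield plane at distance d_ws and the guidance road point
# at distance d_road from the eye, raising the eye by dh raises the
# intersection of the sight line with the windshield by dh * (1 - d_ws/d_road).
def hud_vertical_shift(dh_m: float, d_ws_m: float, d_road_m: float) -> float:
    """Vertical shift (meters, on the windshield plane) that keeps the
    overlay anchored to the same road point when the eye rises by dh_m."""
    return dh_m * (1.0 - d_ws_m / d_road_m)

# Eye 5 cm higher, windshield 0.8 m away, guidance point 40 m ahead:
print(round(hud_vertical_shift(0.05, 0.8, 40.0), 4))  # ~0.049 m upward
```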
Hereinafter, a method of receiving a call signal based on the driver's gaze and gesture will be described with reference to FIG. 11.
FIG. 11 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 11, the
The
The
The
The
The
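A minimal sketch of the call handling summarized earlier (claims 5 and 6) follows: a preset gesture answers or rejects the incoming call only while the driver's gaze rests on the displayed caller information. The gesture names and return strings are illustrative assumptions.

```python
# The same gesture is ignored unless the DSM system reports the gaze on
# the caller information, so accidental gestures cannot answer a call.
ACCEPT_GESTURE = "nod"        # assumed preset gesture for receiving the call
REJECT_GESTURE = "hand_wave"  # assumed preset gesture for rejecting the call

def handle_incoming_call(gaze_on_caller_info: bool, gesture: str) -> str:
    if not gaze_on_caller_info:
        return "keep_ringing"  # gestures are ignored without gaze
    if gesture == ACCEPT_GESTURE:
        return "call_connected"
    if gesture == REJECT_GESTURE:
        return "call_rejected"
    return "keep_ringing"

print(handle_incoming_call(True, "nod"))         # call_connected
print(handle_incoming_call(False, "hand_wave"))  # keep_ringing
```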
Hereinafter, a method of controlling the map data based on the driver's gaze and gesture will be described with reference to FIG. 12.
FIG. 12 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 12, while the map data is being displayed on the first display unit 201-a, the
The
The
The
The
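The map control summarized earlier (claim 9) can be sketched as a gaze-gated zoom handler. The pinch gesture names, zoom step, and scale bounds below are illustrative assumptions.

```python
# Pinch gestures rescale the map only while the DSM system reports the
# driver's gaze on the first display unit.
class MapView:
    def __init__(self, scale: float = 1.0):
        self.scale = scale

def on_map_gesture(view: MapView, gaze_target: str, gesture: str) -> None:
    if gaze_target != "first_display":
        return
    if gesture == "pinch_out":
        view.scale = min(8.0, view.scale * 1.25)   # enlarge map data
    elif gesture == "pinch_in":
        view.scale = max(0.25, view.scale / 1.25)  # reduce map data

m = MapView()
on_map_gesture(m, "first_display", "pinch_out")
print(m.scale)  # 1.25
```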
Hereinafter, a method of controlling various data based on the driver's gaze and the driver's voice command will be described with reference to FIG. 13.
FIG. 13 is a diagram illustrating another example of a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 13, when a voice command 13-2 requesting enlargement or reduction of the map data is received while the map data 13-1 is displayed on the first display unit 201-a and the driver's gaze is directed at the first display unit 201-a, the controller enlarges or reduces the map data. The
The
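The voice control described above (claims 10 and 13) amounts to routing a recognized command to whichever display the driver is looking at. A hedged sketch follows; the command strings and handler table are illustrative assumptions.

```python
# A command spoken while the gaze is elsewhere is dropped, so the same
# utterance can safely mean different things on different displays.
def route_voice_command(gaze_target: str, command: str) -> str:
    handlers = {
        "first_display": {"zoom in": "map enlarged", "zoom out": "map reduced"},
        "second_display": {"next track": "music skipped"},
    }
    return handlers.get(gaze_target, {}).get(command, "command ignored")

print(route_voice_command("first_display", "zoom in"))  # map enlarged
print(route_voice_command("road", "zoom in"))           # command ignored
```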
Hereinafter, a method of changing the image displayed on the rearview mirror of the vehicle based on the driver's gaze and the driver's voice command will be described with reference to FIGS. 14A and 14B.
FIGS. 14A and 14B are views illustrating another method of controlling a vehicle terminal according to an embodiment of the present invention.
As shown in Fig. 14A, the rearview mirror of the vehicle includes a
As shown in FIGS. 14A and 14B, the
The
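The rearview-mirror behavior summarized earlier (claims 11 and 12) selects between two camera feeds while the driver gazes at the mirror's display. The following sketch assumes illustrative request strings and feed names.

```python
# A voice command or gesture switches the mirror's display between the
# first camera (rear seats) and the second camera (behind the vehicle),
# but only while the driver is looking at the mirror.
def select_mirror_feed(gaze_on_mirror: bool, request: str, current: str) -> str:
    if not gaze_on_mirror:
        return current  # request ignored without gaze
    if request == "show_rear_seat":
        return "camera1_rear_seat"
    if request == "show_rear_view":
        return "camera2_rear_of_vehicle"
    return current

feed = "mirror_reflection"
feed = select_mirror_feed(True, "show_rear_seat", feed)
print(feed)  # camera1_rear_seat
```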
Hereinafter, a method of notifying of a dangerous state of the driver based on the driver's gaze and the driver's biometric information detected by the watch-
FIG. 15 is another exemplary diagram illustrating a control method of a vehicle terminal according to an embodiment of the present invention.
As shown in FIG. 15, the watch-
The
The
The
The
The
The
The
The
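The warning logic summarized earlier (claims 14 to 17) can be sketched as below: biometric readings from the watch-type terminal are compared against a reference range when the gaze leaves the road, and the eye-closed time selects progressively stronger warnings. All thresholds, ranges, and names are illustrative assumptions.

```python
from typing import Optional

HEART_RATE_RANGE = (50, 110)  # assumed reference range for the biometric check

def driver_warning(gaze_forward: bool, heart_rate: int,
                   eyes_closed_s: float) -> Optional[str]:
    if gaze_forward and eyes_closed_s == 0.0:
        return None  # attentive: no warning
    if not (HEART_RATE_RANGE[0] <= heart_rate <= HEART_RATE_RANGE[1]):
        return "ALERT: abnormal biometric signal while not watching the road"
    # different warning information depending on the eye-closed time
    if eyes_closed_s >= 2.0:
        return "ALERT: prolonged eye closure, audible and haptic warning"
    if eyes_closed_s >= 0.5:
        return "CAUTION: eyes off the road, visual warning"
    return None

print(driver_warning(False, 45, 0.0))  # biometric alert
print(driver_warning(False, 70, 2.5))  # prolonged eye closure alert
```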
As described above, the vehicle terminal and the control method thereof according to the embodiments of the present invention can easily and conveniently control the information (data) of the vehicle terminal based on the driver's gaze and the gesture detected by the DSM (driver state monitoring) system.
Many modifications and variations will be apparent to those skilled in the art to which the invention pertains without departing from its essential characteristics. Therefore, the embodiments disclosed in the present invention are intended to illustrate rather than limit the technical idea of the present invention, and the scope of the technical idea of the present invention is not limited by these embodiments. The scope of protection of the present invention should be construed according to the following claims, and all technical ideas within the equivalent scope should be construed as falling within the scope of the present invention.
Claims (18)
A first display unit for displaying route guidance information;
A second display unit for displaying a control menu of the vehicle terminal;
and a controller configured to, when a gaze directed at the second display unit and a gesture for controlling information of the second display unit are detected while the route guidance information is being displayed on the first display unit, control the information of the second display unit according to the gesture.
The controller receives the driver's gaze and the driver's head position from the DSM system each time an event related to the route guidance information occurs and, if the driver's gaze is not directed ahead of the vehicle and the driver's head is not facing the front of the vehicle when the event occurs, generates warning information indicating that the driver's driving state is dangerous and outputs the warning information.
Further comprising a head-up display (HUD) for displaying information on the windshield of the vehicle, wherein the controller displays route guidance information matched to the actual road on the windshield through the HUD, receives the driver's head position from the DSM system, and changes the display position of the route guidance information displayed on the windshield based on the driver's head position.
Wherein the controller receives the driver's viewing angle from the DSM system and changes the display position of the route guidance information displayed on the windshield based on the driver's viewing angle and the driver's head position.
Generates and outputs request information requesting reception of an incoming call,
displays caller information of the incoming call on any one of the instrument panel of the vehicle, the first display unit, and the second display unit while the request information is being output, and detects the driver's gaze and gesture through the DSM system,
and receives the incoming call and performs a telephone conversation when the driver's gaze detected while the request information is output rests on any one of the displayed caller information of the incoming call and the gesture is a preset gesture for receiving the incoming call.
The controller rejects the incoming call when the driver's gaze detected while the request information is output rests on any one of the caller information of the incoming call and the gesture is a preset gesture for rejecting the incoming call.
The music information is displayed on any one of the instrument panel of the vehicle, the first display unit, and the second display unit during music reproduction, and the driver's gaze and gesture are detected through the DSM system,
and music volume control or next or previous track selection is executed when the driver's gaze detected during the music reproduction rests on the music information display and the detected gesture is a preset gesture for music volume control or for next or previous track selection.
The map data is enlarged or reduced when a gesture for enlarging or reducing the map data is received while the driver's gaze is directed at the first display unit and the map data is displayed on the first display unit.
Wherein the driver's hand position and hand gesture are received from the DSM system, and information displayed on the first display unit or the second display unit is controlled when the received hand position is adjacent to the first display unit or the second display unit and the received hand gesture is a gesture for controlling the information displayed on the first display unit or the second display unit.
When a voice command for controlling the information displayed on the first display unit or the second display unit is received while the driver's gaze is directed at the first display unit or the second display unit, the controller controls the displayed information.
When a voice command or a gesture requesting display of a rear-seat image of the vehicle on the display unit of the rearview mirror is received while the driver's gaze is directed at the display unit of the rearview mirror of the vehicle, the rear-seat image of the vehicle photographed by the first camera is displayed on the display unit of the rearview mirror.
When a voice command or a gesture requesting display of the rear image of the vehicle on the display unit of the rearview mirror is received while the driver's gaze is directed at the display unit of the rearview mirror of the vehicle, a rear image of the vehicle photographed by a second camera, whose photographing direction differs from that of the first camera, is displayed on the display unit of the rearview mirror.
The biometric information of the driver is detected through the watch-type terminal mounted on the driver's wrist and, if the biometric information detected while the driver's gaze is not directed ahead of the vehicle is below or above a reference value, warning information is generated and the generated warning information is output.
Wherein the warning information is displayed on at least one of the instrument panel of the vehicle, the watch-type terminal, the first display unit, and the second display unit.
Wherein the eye-closing time of the driver is received from the DSM system, and different warning information is generated and output according to the received eye-closing time.
Wherein the eye-closing time of the driver and the time during which the driver does not look ahead of the vehicle are received from the DSM system, and different warning information is generated and output based on the received eye-closing time and the non-forward-gaze time.
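Grading the warning by the received durations can be expressed as a simple threshold ladder. The specific durations and warning levels below are invented for illustration; the patent states only that the warning information differs with the received times:

```python
def grade_warning(eyes_closed_s: float, off_road_s: float) -> str:
    """Pick a warning level from eye-closure and off-road-gaze durations."""
    if eyes_closed_s >= 2.0:
        return "alarm"    # sustained eye closure: strongest warning
    if eyes_closed_s >= 1.0 or off_road_s >= 3.0:
        return "chime"    # moderate inattention: audible warning
    if off_road_s >= 1.5:
        return "visual"   # brief inattention: visual-only warning
    return "none"
```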
Displaying route guidance information on a first display unit of the terminal;
Displaying a control menu of the vehicle terminal on a second display unit of the terminal;
And, when the driver's gaze detected while the route guidance information is being displayed on the first display unit stays on the second display unit and a gesture for controlling the information of the second display unit is detected, controlling the information of the second display unit according to the gesture.
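Taken together, the method steps amount to one pass of a display-and-gate loop. A sketch under the same assumed interfaces as the earlier fragments (`dsm.read`, `show`, `show_menu`, and `apply` are illustrative names):

```python
def control_loop(dsm, first_display, second_display, route) -> None:
    """One pass of the claimed method: guidance, menu, gaze-gated gesture."""
    first_display.show(route.guidance())        # step 1: route guidance
    second_display.show_menu()                  # step 2: control menu
    sample = dsm.read()                         # gaze + gesture from the DSM
    if sample.gaze_target == "second_display" and sample.gesture:
        second_display.apply(sample.gesture)    # step 3: control by gesture
```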
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150049189A | 2015-04-07 | 2015-04-07 | Vehicle terminal and control method thereof |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20160120101A (en) | 2016-10-17 |
KR101737800B1 (en) | 2017-05-29 |
Family
ID=57250295
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150049189A | Vehicle terminal and control method thereof | 2015-04-07 | 2015-04-07 |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101737800B1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102179289B1 (en) * | 2019-05-31 | 2020-11-17 | 연세대학교 산학협력단 | Large Display Interaction System and Method of Autonomous Vehicles |
KR102375003B1 (en) * | 2021-10-29 | 2022-03-16 | 주식회사 서연이화 | Motion gesture sensing device and vehicle mounted device operation system applying the same |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3788203B2 (en) * | 1999-08-10 | 2006-06-21 | 日産自動車株式会社 | Hand-free telephone equipment for automobiles |
JP2012084068A (en) * | 2010-10-14 | 2012-04-26 | Denso Corp | Image analyzer |
JP2014174598A (en) * | 2013-03-06 | 2014-09-22 | Denso Corp | Vehicle input device |
- 2015-04-07: KR application KR1020150049189A granted as patent KR101737800B1 (status: active, IP Right Grant)
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20090102460A (en) | 2008-03-26 | 2009-09-30 | 유겐가이샤 테크노 프론티어 | Sensible heat exchange element |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10952055B2 (en) | 2017-07-04 | 2021-03-16 | Hyundai Motor Company | Wireless communication system, vehicle, smart apparatus, and controlling method thereof |
KR101886626B1 (en) * | 2017-11-22 | 2018-08-09 | 오레스트 주식회사 | Massage chair with hands-free function |
WO2020111348A1 (en) * | 2018-11-30 | 2020-06-04 | 엘지전자 주식회사 | Vehicle control device and vehicle control method |
KR20200067121A (en) * | 2018-11-30 | 2020-06-11 | 엘지전자 주식회사 | Vehicle control device and vehicle control method |
WO2020235710A1 (en) * | 2019-05-21 | 2020-11-26 | 엘지전자 주식회사 | Autonomous vehicle control method |
US11562579B2 (en) | 2019-05-21 | 2023-01-24 | Lg Electronics Inc. | Method for controlling autonomous vehicle |
WO2023008647A1 (en) * | 2021-07-29 | 2023-02-02 | (주)그래피카 | 3d data conversion and use method for high speed 3d rendering |
Also Published As
Publication number | Publication date |
---|---|
KR101737800B1 (en) | 2017-05-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101641267B1 (en) | Vehicle controlling system and control method thereof | |
KR101698103B1 (en) | Vehicle controlling system and control method thereof | |
KR101737800B1 (en) | Vehicle terminal and control method thereof | |
US10162347B2 (en) | Mobile terminal and method for controlling the same | |
US10556567B2 (en) | Mobile terminal and control method thereof | |
US10630828B2 (en) | Mobile terminal and method for controlling same | |
KR101811613B1 (en) | Mobile terminal and method for controlling the same | |
KR102353485B1 (en) | Mobile terminal and method for controlling the same | |
KR20150137799A (en) | Mobile terminal and method for controlling the same | |
US20190043038A1 (en) | Mobile device and control method therefor | |
CN106254622A (en) | Method for sharing traffic accident information and mobile terminal thereof | |
KR20170064342A (en) | Watch-type mobile terminal and method for controlling the same | |
CN106034173B (en) | Mobile terminal and control method thereof | |
KR102266712B1 (en) | Mobile terminal and method for controlling the same | |
KR101708313B1 (en) | Vehicle terminal and control method thereof | |
KR101649662B1 (en) | Vehicle controlling system and control method thereof | |
KR102385089B1 (en) | Mobile terminal and control method for the mobile terminal | |
KR20160081700A (en) | Mobile terminal and method for controlling the same | |
KR20160076263A (en) | Mobile terminal and method for controlling the same | |
KR20160004164A (en) | Mobile terminal and method for controlling the same | |
KR20150123644A (en) | Mobile terminal and method for controlling the same | |
KR20160124559A (en) | Vehicle terminal and control method thereof | |
KR20160084188A (en) | Mobile terminal and method for controlling the same | |
KR101697600B1 (en) | Mobile terminal | |
KR20160064801A (en) | Mobile terminal and control method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E902 | Notification of reason for refusal | ||
E701 | Decision to grant or registration of patent right |