KR101748258B1 - Apparatus and method for providing guide information for vehicle - Google Patents

Apparatus and method for providing guide information for vehicle

Info

Publication number
KR101748258B1
Authority
KR
South Korea
Prior art keywords
vehicle
processor
information
guide information
button
Prior art date
Application number
KR1020150131804A
Other languages
Korean (ko)
Other versions
KR20170033700A (en)
Inventor
김운영
류승원
박준식
이강섭
소윤나
윤소운
장성화
조현진
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 filed Critical 엘지전자 주식회사
Priority to KR1020150131804A priority Critical patent/KR101748258B1/en
Publication of KR20170033700A publication Critical patent/KR20170033700A/en
Application granted granted Critical
Publication of KR101748258B1 publication Critical patent/KR101748258B1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K37/00Dashboards
    • B60K37/02Arrangement of instruments
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B62LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62DMOTOR VEHICLES; TRAILERS
    • B62D1/00Steering controls, i.e. means for initiating a change of direction of the vehicle
    • B62D1/02Steering controls, i.e. means for initiating a change of direction of the vehicle vehicle-mounted
    • B62D1/04Hand wheels
    • B62D1/046Adaptations on rotatable parts of the steering wheel for accommodation of switches
    • B60K2350/928
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

The present invention relates to an apparatus and method for providing vehicle guide information. According to an embodiment of the present invention, the apparatus includes an input unit disposed on a steering wheel and a processor that, when a preset event occurs, selects at least one operation related to the event, links the at least one operation to the input unit, generates guide information for guiding the at least one operation linked to the input unit, and controls a display unit of the vehicle to output an image corresponding to the guide information.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a guide information providing apparatus and a control method thereof, and more particularly, to a guide information providing apparatus, provided in a vehicle, that provides guide information on the steering wheel operation required to perform an operation related to a predetermined event, and a control method thereof.

A vehicle is a device that drives wheels to transport a person or cargo from one place to another. For example, two-wheeled vehicles such as motorcycles, four-wheeled vehicles such as sedans, and even trains belong to the category of vehicles.

In order to increase the safety and convenience of users of vehicles, development of technologies for connecting various sensors and electronic devices to the vehicle has accelerated. In particular, systems that provide various functions developed for the driver's convenience (e.g., smart cruise control, lane keeping assistance) are increasingly installed in vehicles.

In particular, various devices for the convenience and safety of the driver, such as an air conditioner/heater, a navigation device, interior lighting, a multimedia device, a display device, a hands-free device, and a seat driving device, are mounted in the vehicle.

Some input means, such as buttons for operating the various in-vehicle devices, are disposed on the steering wheel; however, due to the steering wheel's space constraints, it is common for the remaining input means to be distributed at other locations such as the center fascia and doors.

In this case, in order for the driver to operate input means disposed at a position other than the steering wheel, at least one hand must leave the steering wheel while the driver's gaze is directed away from the road ahead. This can threaten not only the driver's safety but also the safety of others.

An object of the present invention is to provide a guide information providing apparatus and a control method thereof that output guide information, which guides an operation method for the steering wheel (in particular, input means disposed on the steering wheel) required to execute an operation related to a predetermined event, to an area in front of the driver (e.g., the instrument panel or windshield), thereby enabling the driver to perform various operations while keeping his or her eyes forward.

The problems to be solved by the present invention are not limited to the above-mentioned problems, and other problems not mentioned will be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a vehicle guide information providing apparatus including an input unit disposed on a steering wheel, and a processor that, when a preset event occurs, selects at least one operation related to the event, links the at least one operation to the input unit, generates guide information for guiding the at least one operation linked to the input unit, and controls a display unit of the vehicle to output an image corresponding to the guide information.

Further, the display unit may include a plurality of displays disposed at different positions of the vehicle, and the processor may be operatively connected to at least one of the plurality of displays.

In addition, the at least one operation related to the event may include a first operation and a second operation, different from the first operation, among a plurality of operations executable by the vehicle guide information providing apparatus.

Further, when the event is an incoming telephone call, the processor may select a call connection operation as the first operation and a call rejection operation as the second operation.

The processor may select a schedule deletion operation as the first operation and a schedule transmission operation as the second operation when the event is a schedule notification.

In addition, the processor may select the previous music selection operation as the first operation and the next music selection operation as the second operation when the event is music reproduction.
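The event-to-operation selection described in the three paragraphs above amounts to a simple lookup from an event type to a pair of operations. The following is an illustrative Python sketch, not the patent's implementation; the event and operation names are hypothetical.

```python
# Hypothetical sketch of selecting the first/second operations for a
# preset event. All event and operation names are illustrative only.
EVENT_OPERATIONS = {
    "incoming_call":         ("connect_call", "reject_call"),
    "schedule_notification": ("delete_schedule", "send_schedule"),
    "music_playback":        ("previous_track", "next_track"),
}

def select_operations(event):
    """Return (first_operation, second_operation) for a preset event,
    or None if the event is not one of the preset events."""
    return EVENT_OPERATIONS.get(event)
```

A table-driven mapping like this makes it easy to reuse the same two physical inputs for different operations per event, which is the space-saving idea the description emphasizes.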

The input unit may include a first button and a second button disposed at different positions of the steering wheel.

Further, the processor may link the first operation and the second operation to the first button and the second button, respectively, and the guide information may include a first object that guides the first operation linked to the first button and a second object that guides the second operation linked to the second button.
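One way to picture the linkage of operations to buttons and the resulting guide objects is the sketch below. The dictionary structure and field names are assumptions for illustration, not taken from the patent.

```python
def build_guide_info(first_operation, second_operation):
    """Link the first/second operations to the first/second buttons and
    return guide objects describing each linkage (illustrative only)."""
    return [
        {"object": "first",  "button": "first_button",  "operation": first_operation},
        {"object": "second", "button": "second_button", "operation": second_operation},
    ]

# For an incoming-call event, the two guide objects would read roughly:
guide = build_guide_info("connect_call", "reject_call")
```

The display unit would then render each guide object near a depiction of its button, so the driver learns the current button meanings at a glance.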

The input unit may include a plurality of touch pads disposed in different regions of the rim of the steering wheel.

The plurality of touch pads may include a first touch pad and a second touch pad. The processor may link the first operation and the second operation to the first touch pad and the second touch pad, respectively, and the guide information may include a third object that guides the first operation linked to the first touch pad and a fourth object that guides the second operation linked to the second touch pad.

Further, the third object may further guide the operation pattern of the first touch pad required for execution of the first operation, and the fourth object may further guide the operation pattern of the second touch pad required for execution of the second operation.

In addition, the processor may determine a grip state of the driver with respect to the steering wheel based on a sensing signal provided from the plurality of touch pads when the event occurs.

In addition, the processor may generate the guide information further including a fifth object for guiding the grip state of the driver.

The first touch pad and the second touch pad may be touch pads not held by the driver among the plurality of touch pads.
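The grip-aware selection in the preceding paragraphs (linking operations only to touch pads the driver is not currently holding) can be sketched roughly as follows; the pad identifiers and data shape are hypothetical.

```python
def select_free_pads(pad_grip_states, count=2):
    """Given {pad_id: is_gripped} derived from the pads' sensing signals,
    return up to `count` pads the driver is NOT holding, so the first and
    second operations can be linked to them. Illustrative sketch only."""
    free = [pad for pad, gripped in sorted(pad_grip_states.items()) if not gripped]
    return free[:count]
```

Choosing ungripped pads keeps the guided operations out from under the driver's hands, so issuing a command never requires releasing the wheel.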

In addition, based on a user input received through the input unit, the processor may execute an operation corresponding to the user input among the at least one operation linked to the input unit.
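Executing the operation that corresponds to a received user input could then look like this self-contained sketch (all names and structures are hypothetical illustrations, not the patent's design):

```python
def handle_button_press(pressed_button, linkage, actions):
    """linkage: {button_id: operation_name}; actions: {operation_name: callable}.
    Execute the operation linked to the pressed button, if any."""
    operation = linkage.get(pressed_button)
    if operation is None:
        return None  # this button is not linked to an operation for the current event
    return actions[operation]()

# Example: an incoming-call event linked connect/reject to two buttons.
linkage = {"first_button": "connect_call", "second_button": "reject_call"}
actions = {"connect_call": lambda: "connected", "reject_call": lambda: "rejected"}
result = handle_button_press("second_button", linkage, actions)  # "rejected"
```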

The details of other embodiments are included in the detailed description and drawings.

The effects of the apparatus and method for providing vehicle guide information according to the present invention are as follows.

According to at least one of the embodiments of the present invention, guide information that guides an operation method for the steering wheel (in particular, input means disposed on the steering wheel) required to execute an operation associated with a predetermined event is output to an area in front of the driver (e.g., the instrument panel or windshield), thereby helping the driver perform various operations while keeping his or her eyes on the road ahead.

Further, according to at least one of the embodiments of the present invention, different operations are linked to specific input means disposed on the steering wheel depending on the event (e.g., an incoming phone call, a schedule notification, music playback), so that various operations can be performed easily. This reduces the number of input means that must be disposed on the steering wheel, helping to overcome the steering wheel's space limitations, and can also prevent erroneous operations that may otherwise occur when the driver rotates the steering wheel.

Further, there is an advantage that a driver-friendly user interface can be provided by changing the guide information according to the grip state of the driver with respect to the steering wheel.

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1 shows a block diagram of a vehicle according to an embodiment of the present invention.
FIG. 2 is a block diagram of a guide information providing apparatus according to an embodiment of the present invention.
FIG. 3 shows an exemplary interior view of a vehicle associated with the present invention.
FIGS. 4A to 4C show exemplary forms of an input unit of the guide information providing apparatus disposed on a steering wheel, according to an embodiment of the present invention.
FIG. 5 shows a flowchart of an exemplary process executed by the guide information providing apparatus according to an embodiment of the present invention.
FIG. 6 shows a flowchart of a process in which the guide information providing apparatus according to an embodiment of the present invention provides guide information for the steering wheel in response to an incoming call event.
FIGS. 7A and 7B illustrate an instrument panel display functionally connected to the guide information providing apparatus according to an embodiment of the present invention to display guide information.
FIG. 8 is a flowchart illustrating a process in which the guide information providing apparatus according to an embodiment of the present invention provides guide information for the steering wheel in response to a schedule notification event.
FIG. 9 illustrates guide information related to FIG. 8 displayed on the instrument panel display shown in FIG. 7A in response to occurrence of a schedule notification event among the preset events.
FIG. 10 is a flowchart illustrating a process of providing guide information for the steering wheel in response to a music playback event according to an embodiment of the present invention.
FIG. 11 illustrates guide information related to FIG. 10 displayed on the instrument panel display shown in FIG. 7A in response to occurrence of a music playback event among the preset events.
FIG. 12 is a flowchart illustrating a process of providing guide information for a plurality of touch pads disposed on a steering wheel according to an embodiment of the present invention.
FIGS. 13 and 14 illustrate guide information related to FIG. 12 displayed on the windshield in response to occurrence of an incoming call event among the preset events.
FIG. 15 is a flowchart illustrating another process of providing guide information for a plurality of touch pads disposed on a steering wheel according to an embodiment of the present invention.

Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably in consideration of ease of drafting the specification, and do not themselves have distinct meanings or roles. In the following description, a detailed description of related known art will be omitted when it is determined that it may obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended only to facilitate understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings and should be understood to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Terms including ordinals, such as first, second, etc., may be used to describe various elements, but the elements are not limited to these terms. The terms are used only for the purpose of distinguishing one component from another.

It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements. It should also be understood that one component "controlling" another component encompasses not only the one component directly controlling the other, but also controlling it through the mediation of a third component. Likewise, one element "providing" information or signals to another element encompasses not only providing them directly to the other element, but also providing them through the mediation of a third element.

The singular expressions include plural expressions unless the context clearly dictates otherwise.

In the present application, terms such as "comprises" and "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.

The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.

FIG. 1 shows a block diagram of a vehicle 100 according to an embodiment of the present invention.

The vehicle 100 may include a communication unit 110, an input unit 120, a memory 130, an output unit 140, a vehicle driving unit 150, a sensing unit 160, a control unit 170, an interface unit 180, and a power supply unit (not shown).

The communication unit 110 may include one or more modules that enable wireless communication between the vehicle 100 and an external device (e.g., mobile terminal, server, other vehicle). In addition, the communication unit 110 may include one or more modules that connect the vehicle 100 to one or more networks.

The communication unit 110 may include a broadcast receiving module 111, a wireless Internet module 112, a local area communication module 113, a location information module 114, and an optical communication module 115.

The broadcast receiving module 111 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.

The wireless Internet module 112 refers to a module for wireless Internet access, and may be built in or externally mounted on the vehicle 100. The wireless Internet module 112 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.

Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 112 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 112 may exchange data wirelessly with an external server. The wireless Internet module 112 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Experts Group) information) from an external server.

The short-range communication module 113 is for short-range communication, and may support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), Near Field Communication (NFC), Wi-Fi (Wireless-Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus) technologies.

The short-range communication module 113 may form short-range wireless communication networks to perform short-range communication between the vehicle 100 and at least one external device. For example, the short-range communication module 113 can wirelessly exchange data with an occupant's portable terminal. The short-range communication module 113 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Experts Group) information) from a portable terminal or an external server. For example, when a user boards the vehicle 100, the user's portable terminal and the vehicle 100 can perform pairing with each other automatically or upon execution of an application by the user.

The position information module 114 is a module for acquiring the position of the vehicle 100, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.

The optical communication module 115 may include a light emitting portion and a light receiving portion.

The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.

The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The light emitting unit converts the electric signal into an optical signal and transmits it to the outside. For example, the light emitting unit can emit an optical signal to the outside by blinking the light emitting element at a predetermined frequency. According to an embodiment, the light emitting unit may include a plurality of light emitting element arrays. According to an embodiment, the light emitting unit may be integrated with a lamp provided in the vehicle 100. For example, the light emitting unit may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a sidelight. For example, the optical communication module 115 can exchange data with other vehicles through optical communication.

The input unit 120 may include a driving operation unit 121, a microphone 123, and a user input unit 124.

The driving operation means 121 receives a user input for driving the vehicle 100. The driving operation means 121 may include a steering input means, a shift input means, an acceleration input means, and a brake input means.

The steering input means receives a forward direction input of the vehicle 100 from the user. The steering input means may include a steering wheel 121a. According to an embodiment, the steering input means may be formed of a touch screen, a touch pad or a button.

The shift input means receives inputs of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle 100 from the user. The shift input means is preferably formed in a lever shape. According to an embodiment, the shift input means may be formed of a touch screen, a touch pad or a button.

The acceleration input means receives an input for acceleration of the vehicle 100 from the user. The brake input means receives an input for deceleration of the vehicle 100 from the user. The acceleration input means and the brake input means are preferably formed in the form of a pedal. According to the embodiment, the acceleration input means or the brake input means may be formed of a touch screen, a touch pad or a button.

The camera 122 is disposed at one side of the interior of the vehicle 100 to generate an indoor image of the vehicle 100. Unlike the camera 161, the camera 122 may be called an indoor camera. For example, the camera 122 may be disposed at various positions of the vehicle 100, such as the dashboard surface, the roof surface, or the rear view mirror, to photograph a passenger of the vehicle 100. In this case, the camera 122 may generate an indoor image of an area including the driver's seat of the vehicle 100. The camera 122 may also generate an indoor image of an area including the driver's seat and the front passenger seat of the vehicle 100. The indoor image generated by the camera 122 may be a two-dimensional image and/or a three-dimensional image. To generate a three-dimensional image, the camera 122 may include at least one of a stereo camera, a depth camera, and a three-dimensional laser scanner. The camera 122 can provide the generated indoor image to the control unit 170, which is functionally coupled thereto.

The controller 170 analyzes the indoor image provided from the camera 122 and can detect various objects. For example, the control unit 170 can detect the sight line and / or the gesture of the driver from the portion corresponding to the driver's seat area in the indoor image. As another example, the control unit 170 can detect the sight line and / or the gesture of the passenger from the portion corresponding to the indoor area excluding the driver's seat area in the indoor image. Of course, the sight line and / or the gesture of the driver and the passenger may be detected at the same time.

The microphone 123 can process an external acoustic signal into electrical data. The processed data can be utilized variously according to functions performed in the vehicle 100. The microphone 123 can convert the voice command of the user into electrical data. The converted electrical data may be transmitted to the control unit 170.

Meanwhile, the camera 122 or the microphone 123 may be a component included in the sensing unit 160, rather than a component included in the input unit 120.

The user input unit 124 is for receiving information from a user. When information is input through the user input unit 124, the control unit 170 may control the operation of the vehicle 100 to correspond to the input information. The user input unit 124 may include a touch input means or a mechanical input means. According to an embodiment, the user input unit 124 may be disposed in one area of the steering wheel. In this case, the driver can operate the user input unit 124 with his or her fingers while holding the steering wheel.

The input unit 120 may include a plurality of buttons or a touch sensor. It is also possible to perform various input operations through a plurality of buttons or touch sensors.

The sensing unit 160 senses signals related to the traveling of the vehicle 100 and the like. To this end, the sensing unit 160 may include a collision sensor, a steering sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an ultrasonic sensor, an infrared sensor, a radar, a lidar, and the like.

Accordingly, the sensing unit 160 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/reverse information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, and the like. Based on external environment information obtained by at least one of the camera, the ultrasonic sensor, the infrared sensor, and the radar, the control unit 170 can generate control signals for acceleration, deceleration, direction change, and the like of the vehicle 100. Here, the external environment information may be information related to various objects located within a predetermined distance from the vehicle 100 in motion. For example, the external environment information may include information on the number of obstacles located within a distance of 100 m from the vehicle 100, the distance to each obstacle, the size of each obstacle, the type of each obstacle, and the like.
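As a rough illustration of how external environment information like the obstacle list above might feed a control decision, consider the sketch below. The data structure, threshold, and decision rule are purely hypothetical and are not the patent's control logic.

```python
def needs_deceleration(obstacles, warning_distance_m=100):
    """obstacles: list of dicts like {"distance_m": float, "type": str},
    as might be fused from camera/ultrasonic/radar data. Return True if
    any obstacle lies within the warning distance. Illustrative only."""
    return any(o["distance_m"] <= warning_distance_m for o in obstacles)
```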

The sensing unit 160 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.

The sensing unit 160 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires biometric information of a passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor that senses the passenger's biometric information. Here, the camera 122 and the microphone 123 can operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the camera 122.

The sensing unit 160 may include at least one camera 161 for photographing the outside of the vehicle 100. The camera 161 may be referred to as an external camera. For example, the sensing unit 160 may include a plurality of cameras 161 disposed at different positions on the vehicle exterior. The camera 161 may include an image sensor and an image processing module. The camera 161 can process still images or moving images obtained by the image sensor (e.g., CMOS or CCD). The image processing module may process the still images or moving images obtained through the image sensor, extract necessary information, and transmit the extracted information to the control unit 170.

The camera 161 may include an image sensor (e.g., CMOS or CCD) and an image processing module. In addition, the camera 161 can process still images or moving images obtained by the image sensor. The image processing module can process the still image or moving image obtained through the image sensor. In addition, the camera 161 may acquire an image including at least one of a traffic light, a traffic sign, a pedestrian, another vehicle, and a road surface.

The output unit 140 may include a display unit 141, an acoustic output unit 142, and a haptic output unit 143 for outputting information processed by the control unit 170.

The display unit 141 may display information processed by the control unit 170. For example, the display unit 141 can display vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information that guides the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.

The display unit 141 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.

The display unit 141 may form a layered structure with a touch sensor or may be formed integrally with it to realize a touch screen. Such a touch screen may function as the user input unit 124 that provides an input interface between the vehicle 100 and a user, and may also provide an output interface between the vehicle 100 and the user. In this case, the display unit 141 may include a touch sensor that senses a touch on the display unit 141 so as to receive a control command by a touch method. When the display unit 141 is touched, the touch sensor senses the touch, and the control unit 170 generates a control command corresponding to that touch. The content input by the touch method may be a letter or a number, an instruction in various modes, a designatable menu item, and the like.

Meanwhile, the display unit 141 may include a cluster so that the driver can check the vehicle state information or the vehicle driving information while driving. The cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping his or her gaze ahead of the vehicle.

Meanwhile, according to an embodiment, the display unit 141 may include a Head Up Display (HUD). In this case, the HUD has a projection module and can output information through an image projected onto the windshield.

Meanwhile, according to an embodiment, the display unit 141 may include a transparent display. Such a transparent display can be superimposed on the windshield of the vehicle or replace the windshield. Since the transparent display has a certain level of transparency, the user's front view is not obstructed, and the user can advantageously check various information related to the vehicle 100 while looking ahead.

The sound output unit 142 converts an electric signal from the control unit 170 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 142 may include a speaker or the like. The sound output unit 142 may also output a sound corresponding to the operation of the user input unit 124.

The haptic output unit 143 generates a tactile output. For example, the haptic output unit 143 may vibrate the steering wheel, the seat belt, or the seat so that the user can recognize the output.

The vehicle driving unit 150 can control the operation of various devices of the vehicle. The vehicle driving unit 150 may include a power source driving unit 151, a steering driving unit 152, a brake driving unit 153, a lamp driving unit 154, an air conditioning driving unit 155, a window driving unit 156, an airbag driving unit 157, a sunroof driving unit 158, and a wiper driving unit 159.

The power source driving unit 151 may perform electronic control of the power source in the vehicle 100. The power source driving unit 151 may include an accelerator for increasing the speed of the vehicle 100 and a decelerator for decreasing the speed of the vehicle 100.

For example, when a fossil fuel-based engine (not shown) is the power source, the power source driving unit 151 can perform electronic control of the engine. Thus, the output torque of the engine and the like can be controlled. When the power source is an engine, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 170.

In another example, when an electric motor (not shown) is the power source, the power source driving unit 151 can perform control of the motor. Thus, the rotation speed, torque, etc. of the motor can be controlled.
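As a non-authoritative sketch of the speed-limiting behavior described above (all function names, units, and constants here are assumptions for illustration, not part of the disclosure), the control unit could limit vehicle speed by scaling down the commanded output torque as the vehicle approaches a configured limit:

```python
def limit_output_torque(requested_torque_nm: float,
                        vehicle_speed_kph: float,
                        speed_limit_kph: float,
                        max_torque_nm: float = 300.0) -> float:
    """Limit engine/motor output torque so the vehicle does not
    accelerate past a configured speed limit (illustrative sketch)."""
    if vehicle_speed_kph >= speed_limit_kph:
        return 0.0  # no further acceleration above the limit
    capped = min(requested_torque_nm, max_torque_nm)
    margin = speed_limit_kph - vehicle_speed_kph
    if margin < 10.0:
        # taper the torque linearly within 10 km/h of the limit
        return capped * (margin / 10.0)
    return capped
```

The same policy applies whether the power source is an engine or an electric motor; only the actuator receiving the torque command differs.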

The steering driving unit 152 may include a steering apparatus. Accordingly, the steering driving unit 152 can perform electronic control of the steering apparatus in the vehicle 100. For example, the steering driving unit 152 may be provided with a steering torque sensor, a steering angle sensor, and a steering motor, and the steering torque applied by the driver to the steering wheel may be sensed by the steering torque sensor. The steering driving unit 152 can control the steering force and the steering angle by changing the magnitude and direction of the current applied to the steering motor based on the speed of the vehicle 100 and the sensed steering torque. In addition, the steering driving unit 152 can determine whether the running direction of the vehicle 100 is properly adjusted based on the steering angle information obtained by the steering angle sensor. Thereby, the running direction of the vehicle can be changed. Furthermore, the steering driving unit 152 can make the steering wheel feel lighter by increasing the steering force of the steering motor when the vehicle 100 runs at a low speed, and make the steering wheel feel heavier by decreasing the steering force of the steering motor when the vehicle 100 runs at a high speed. When the autonomous running function of the vehicle 100 is executed, the steering driving unit 152 may also control the steering motor to generate an appropriate steering force based on a sensing signal output by the sensing unit 160 or a control signal provided by the control unit 170, even in a situation in which the driver does not operate the steering wheel (e.g., a situation in which the steering torque is not detected).
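The speed-dependent assist behavior above (light wheel at low speed, heavy wheel at high speed) can be sketched as follows; the function name, gain, and attenuation curve are assumptions for illustration only:

```python
def steering_assist_torque(steering_torque_nm: float,
                           vehicle_speed_kph: float,
                           base_gain: float = 2.0) -> float:
    """Speed-dependent power-steering assist (illustrative only).
    At low speed the assist is large, so the wheel feels light;
    at high speed the assist shrinks, so the wheel feels heavier."""
    # attenuation increases with speed; the 40 km/h constant is assumed
    attenuation = 1.0 / (1.0 + vehicle_speed_kph / 40.0)
    return base_gain * steering_torque_nm * attenuation
```

In practice the steering driving unit would convert this assist torque into a motor current command, with the sign taken from the direction of the sensed driver torque.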

The brake driving unit 153 may perform electronic control of a brake apparatus (not shown) in the vehicle 100. For example, it is possible to reduce the speed of the vehicle 100 by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle 100 to the left or right by operating the brakes disposed on the left and right wheels differently.
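The differential-braking idea above can be sketched as a small function; the sign convention (positive `turn` brakes the right wheel harder, steering the vehicle to the right) and the gain are assumptions, not part of the disclosure:

```python
def differential_brake(base_pressure: float,
                       turn: float,
                       gain: float = 1.0) -> tuple:
    """Return (left, right) brake pressures. Braking one side harder
    yaws the vehicle toward that side (illustrative sketch)."""
    left = base_pressure + gain * max(0.0, -turn)   # brake left harder to turn left
    right = base_pressure + gain * max(0.0, turn)   # brake right harder to turn right
    return (left, right)
```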

The lamp driving unit 154 may control the turn-on/turn-off of at least one lamp disposed inside or outside the vehicle. The lamp driving unit 154 may include a lighting device. Further, the lamp driving unit 154 can control the intensity, direction, etc. of the light output from each lamp included in the lighting device. For example, it can perform control of a turn signal lamp, a head lamp, a brake lamp, and the like.

The air conditioning driving unit 155 may perform electronic control of an air conditioner (not shown) in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioner can be operated so that cool air is supplied into the vehicle.

The window driving unit 156 may perform electronic control of a window apparatus in the vehicle 100. For example, it is possible to control the opening or closing of the left and right side windows of the vehicle.

The airbag driving unit 157 may perform electronic control of an airbag apparatus in the vehicle 100. For example, in case of danger, the airbag can be controlled to deploy.

The sunroof driving unit 158 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 100. For example, the opening or closing of the sunroof can be controlled.

The wiper driving unit 159 may control the wipers 14a and 14b provided on the vehicle 100. For example, upon receiving a user input instructing the wipers to be driven through the user input unit 124, the wiper driving unit 159 may perform electronic control of the number of driving operations, the driving speed, etc. of the wipers 14a and 14b in response to the user input. As another example, the wiper driving unit 159 may determine the amount or intensity of rainwater based on the sensing signal of a rain sensor included in the sensing unit 160 and automatically drive the wipers 14a and 14b without user input.
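The automatic mode above maps a rain-sensor reading to a wiper drive mode. A minimal sketch, assuming a hypothetical 0-3 rain-intensity scale and mode names not taken from the disclosure:

```python
def wiper_mode_from_rain(rain_level: int) -> str:
    """Map a rain-sensor reading (0..3, assumed scale) to a wiper
    drive mode, mirroring the automatic operation described above."""
    if rain_level <= 0:
        return "off"
    if rain_level == 1:
        return "intermittent"
    if rain_level == 2:
        return "low"
    return "high"  # heavy rain: fastest wiping speed
```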

Meanwhile, the vehicle driving unit 150 may further include a suspension driving unit (not shown). The suspension driving unit may perform electronic control of a suspension apparatus (not shown) in the vehicle 100. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 100.

The memory 130 is electrically connected to the control unit 170. The memory 130 may store basic data for each unit, control data for controlling the operation of each unit, and data associated with guide information. The memory 130 may be, in hardware, any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, and a hard drive. The memory 130 may store various data for the overall operation of the vehicle 100, such as a program for the processing or control of the control unit 170.

The interface unit 180 may serve as a path to various kinds of external devices connected to the vehicle 100. For example, the interface unit 180 may include a port connectable to the portable terminal, and may be connected to the portable terminal through the port. In this case, the interface unit 180 can exchange data with the portable terminal.

The interface unit 180 may receive turn signal information. Here, the turn signal information may be a turn-on signal of the turn signal lamp for a left turn or a right turn input by the user. When a left or right turn signal turn-on input is received through the user input unit (724 in FIG. 6) of the vehicle, the interface unit 180 can receive the left turn signal information or the right turn signal information.

The interface unit 180 may receive vehicle speed information, rotation angle information of the steering wheel, or gear shift information. The interface unit 180 may receive the vehicle speed information, the steering wheel rotation angle information, or the gear shift information sensed through the sensing unit 160 of the vehicle. Alternatively, the interface unit 180 may receive the vehicle speed information, the steering wheel rotation angle information, or the gear shift information from the control unit 170 of the vehicle. Here, the gear shift information may be information on the state of the shift lever of the vehicle. For example, the gear shift information may be information on whether the shift lever is in the park (P), reverse (R), neutral (N), or drive (D) state.
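The lever states above form a small closed set, so received gear shift information can be represented as an enumeration. This is an illustrative encoding; the class and member names are assumptions:

```python
from enum import Enum

class GearShift(Enum):
    PARK = "P"
    REVERSE = "R"
    NEUTRAL = "N"
    DRIVE = "D"

def parse_gear_shift(raw: str) -> GearShift:
    """Interpret raw gear shift information received via the
    interface unit as one of the shift-lever states above."""
    return GearShift(raw)
```

An unknown code raises `ValueError`, which lets the receiver reject malformed gear shift information early.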

The interface unit 180 may receive a user input received via the user input unit 124 of the vehicle 100. The interface unit 180 may receive the user input from the input unit 120 of the vehicle 100 or may receive it through the control unit 170.

The interface unit 180 can receive information obtained from an external device. For example, when traffic light change information is received from an external server through the communication unit 110 of the vehicle 100, the interface unit 180 can receive the traffic light change information from the control unit 170.

The control unit 170 can control the overall operation of each unit in the vehicle 100. The control unit 170 may be referred to as an ECU (Electronic Control Unit).

The control unit 170 may be implemented, in hardware, using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing other functions.

The power supply unit 190 can supply the power necessary for the operation of each component under the control of the control unit 170. In particular, the power supply unit 190 can receive power from a battery (not shown) or the like inside the vehicle.

The AVN (Audio Video Navigation) device 400 can exchange data with the control unit 170. The control unit 170 may receive navigation information from the AVN device or a separate navigation device (not shown). Here, the navigation information may include set destination information, route information according to the destination, map information related to the driving of the vehicle, or vehicle location information.

The control unit 170 may process the images received from the cameras 161 and 122 shown in FIG. 1 based on computer vision to generate vehicle-related information. The vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information that guides the vehicle driver. Here, the camera 161 may include at least one of a mono camera, a stereo camera, and a depth camera.

The memory 130 may store programs and various data for the processing or control of the controller 170.

The memory 130 may store data for object identification. For example, when a predetermined object is detected in the image obtained through the cameras 161 and 122, the memory 130 may store data for confirming what the object corresponds to according to a predetermined algorithm.

The memory 130 may store data on traffic information. For example, when predetermined traffic information is detected from an external image obtained through the camera 161, the memory 130 may store data for determining, according to a predetermined algorithm, what that traffic information corresponds to.

Meanwhile, the memory 130 may be various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like in hardware.

The control unit 170 may process the vehicle front image or the vehicle surroundings image obtained by the camera 161. In particular, the control unit 170 performs signal processing based on computer vision. Accordingly, the control unit 170 can acquire an image of the front or the surroundings of the vehicle from the camera 161 and perform object detection and object tracking based on the image. In particular, when detecting an object, the control unit 170 can perform lane detection (LD), vehicle detection (VD), pedestrian detection (PD), bright-spot detection (BD), traffic sign recognition (TSR), road surface detection, and the like.

Meanwhile, the traffic signal may mean predetermined information that can be transmitted to the driver of the vehicle 100. Traffic signals can be delivered to the driver through a traffic light, traffic sign, or road surface. For example, the traffic signal may be a Go or Stop signal of a vehicle or pedestrian output from a traffic light. For example, the traffic signal may be various designs or texts displayed on a traffic sign. For example, traffic signals may be various designs or texts displayed on the road surface.

The control unit 170 can detect information in the vehicle surroundings image generated by the camera 161.

The information may be information on the driving situation of the vehicle. For example, the information may include information on the road on which the vehicle travels, traffic regulation information, surrounding vehicle information, vehicle or pedestrian signal information, construction information, traffic situation information, parking lot information, lane information, and the like.

The information may be traffic information. The control unit 170 can detect traffic information from any one of a traffic light, a traffic sign, and the road surface included in the external image acquired by the camera 161. For example, the control unit 170 can detect a Go or Stop signal for a vehicle or a pedestrian from a traffic light included in the image. For example, the control unit 170 can detect various designs or text from a traffic sign included in the image. For example, the control unit 170 can detect various designs or text from the road surface included in the image.

The control unit 170 can compare the detected information with the information stored in the memory 130 to confirm the information.

For example, the control unit 170 detects a design or text indicating a rampway from an object included in the acquired image. Here, the object may be a traffic sign or the road surface. The control unit 170 may compare the detected design or text with the traffic information stored in the memory 130 to confirm the rampway information.

For example, the control unit 170 detects a design or text indicating a vehicle or pedestrian stop from an object included in the acquired image. Here, the object may be a traffic sign or the road surface. The control unit 170 can compare the detected design or text with the traffic information stored in the memory 130 to confirm the stop information. Alternatively, the control unit 170 detects a stop line from the road surface included in the acquired image. The control unit 170 can compare the detected stop line with the traffic information stored in the memory 130 to confirm the stop information.

For example, the control unit 170 can detect the presence or absence of a lane in an object included in the acquired image. Here, the object may be a road surface. The control unit 170 can check the color of the detected lane. The control unit 170 can confirm whether the detected lane is a driving lane or a waiting lane.

For example, the control unit 170 may detect the Go or Stop information of the vehicle from the object included in the acquired image. Here, the object may be a vehicle traffic light. Here, the Go information of the vehicle may be a signal instructing the vehicle to go straight, turn left or right. The stop information of the vehicle may be a signal instructing the vehicle to stop. The Go information of the vehicle may be displayed in green, and the Stop information of the vehicle may be displayed in red.

For example, the control unit 170 can detect the Go or Stop information of the pedestrian from the object included in the acquired image. Here, the object may be a pedestrian signal or the like. Here, the Go information of the pedestrian may be a signal instructing the pedestrian to cross the lane in the pedestrian crossing. The stop information of the pedestrian may be a signal instructing the pedestrian to stop in the pedestrian crossing.
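The comparison-and-confirmation steps in the examples above can be sketched as a simple table lookup, where a hypothetical reference table stands in for the traffic information stored in the memory 130 (all keys and labels are assumptions for illustration):

```python
from typing import Optional

# hypothetical reference table, standing in for data in memory 130
TRAFFIC_PATTERNS = {
    "stop_text": "stop",
    "stop_line": "stop",
    "ramp_arrow": "rampway",
    "green_light": "go",
    "red_light": "stop",
}

def confirm_traffic_info(detected_pattern: str) -> Optional[str]:
    """Compare a pattern detected in the image against the stored
    traffic information and return the confirmed meaning, or None
    when the pattern is not recognized."""
    return TRAFFIC_PATTERNS.get(detected_pattern)
```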

Meanwhile, the controller 170 may control the zoom of the cameras 161 and 122. For example, the control unit 170 can control the zoom of the camera 161 in accordance with the object detection result. For example, if a traffic sign is detected but the contents displayed on the traffic sign are not detected, the controller 170 may control the camera 161 to zoom in.
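The zoom-in behavior above amounts to a small control policy: enlarge the view while a sign is detected but unreadable. A minimal sketch, with the function name, zoom step, and maximum zoom assumed for illustration:

```python
def adjust_zoom(sign_detected: bool,
                content_readable: bool,
                current_zoom: float,
                max_zoom: float = 4.0) -> float:
    """Zoom in when a traffic sign is detected but its contents
    cannot yet be read; otherwise keep the current zoom level."""
    if sign_detected and not content_readable and current_zoom < max_zoom:
        # step the zoom up, clamped at the camera's maximum
        return min(current_zoom * 1.5, max_zoom)
    return current_zoom
```

Called once per processed frame, this converges on either a readable sign or the maximum zoom level.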

Meanwhile, the control unit 170 can receive weather information and road traffic situation information, for example, TPEG (Transport Protocol Experts Group) information, through the communication unit 110.

On the other hand, the control unit 170 can grasp, in real time or periodically, information about the surroundings of the vehicle (e.g., traffic information, accident information, road conditions, and obstacles) based on the external image provided from the camera 161. The control unit 170 can also grasp, in real time or periodically, information on the in-vehicle situation (e.g., the driver's condition, a passenger's gesture) based on the indoor image provided from the camera 122.

Meanwhile, the control unit 170 can receive navigation information and the like from the AVN apparatus or another navigation apparatus through the interface unit 180.

The control unit 170 may receive sensor information from the sensing unit 160 through the interface unit 180. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, lamp information, vehicle interior temperature information, vehicle interior humidity information, and steering wheel rotation information.

Meanwhile, the control unit 170 may receive navigation information from the AVN device or another navigation device through the interface unit 180.

The control unit 170 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.

The display unit 141 can display various kinds of information processed by the control unit 170. The display unit 141 may display an image related to the operation of the vehicle 100. In order to display such an image, the display unit 141 may include a cluster or a Head Up Display (HUD) on the inside of the vehicle interior. Meanwhile, when the display unit 141 is an HUD, it may include a projection module that projects an image onto the windshield of the vehicle 100.

The power supply unit 190 can supply the power necessary for the operation of each component under the control of the control unit 170. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle 100.

The vehicle 100 described above with reference to FIG. 1 may further include a vehicle guide information providing apparatus 200 to be described later with reference to FIG. 2.

FIG. 2 is a block diagram of a vehicle guide information providing apparatus 200 according to an embodiment of the present invention.

Referring to FIG. 2, the vehicle guide information providing apparatus 200 according to the embodiment of the present invention includes a communication unit 210, an input unit 220, a memory 230, an interface unit 240, a processor 250, and a power supply unit 260.

The communication unit 210 may include at least one communication module that enables wired and/or wireless communication with at least one device having a communication function. The communication unit 210 may include one or more communication modules for connecting the guide information providing apparatus 200 to one or more networks. For example, the communication unit 210 can exchange data with the vehicle 100, the mobile terminal 310, the server 320, and the like. Here, the server 320 may be a server of a content provider. The communication unit 210 may transmit data received from at least one device to the processor 250.

The input unit 220 receives a user input. The input unit 220 may be disposed on the steering wheel. In one embodiment, the input unit 220 is detachably coupled to at least one region of the steering wheel, or may be formed integrally with the steering wheel.

The input unit 220 may include a button 221 and a touch pad 222. In this case, one button 221 and one touch pad 222 may be provided, respectively, or a plurality of each may be provided.

In one embodiment, a plurality of buttons 221 may be provided at different positions on the steering wheel. For example, the buttons 221 may be disposed one on the left and one on the right of the central vertical axis of the steering wheel. Each button 221 may be linked by the guide information providing apparatus 200 to at least one operation executable in the vehicle 100 or in the guide information providing apparatus 200.

In addition, the operation associated with the button 221 may be varied according to which of the preset events has occurred. For example, when a telephone call incoming event occurs, the button 221 is linked to a call-reject operation, whereas when a music reproduction event occurs, the same button 221 is linked to a next-track selection operation. Accordingly, various operations can be executed through the same button 221.
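The event-dependent remapping above can be sketched as a lookup from (event, button) to operation; the event names, button identifiers, and operation labels here are assumptions for illustration, not part of the disclosure:

```python
# event -> per-button action tables; all names are hypothetical
BUTTON_ACTIONS = {
    "incoming_call": {"button_1": "reject_call", "button_2": "accept_call"},
    "music_playing": {"button_1": "previous_track", "button_2": "next_track"},
}

def resolve_button_action(event: str, button: str) -> str:
    """Return the operation currently linked to a steering-wheel
    button; the mapping changes with the active event."""
    return BUTTON_ACTIONS.get(event, {}).get(button, "no_op")
```

With this structure, the processor only updates the active event; each button press is resolved against the table at the moment it occurs.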

In one embodiment, the touch pad 222 may be disposed along the curvature of the rim of the steering wheel. That is, the touch pad 222 may be disposed in a structure that overlaps the outer surface of the steering wheel. The touch pad 222 may receive a user's touch based on at least one of an electrostatic touch method, a pressure-sensitive touch method, an optical touch method, an ultrasonic touch method, and an infrared touch method. The touch pad 222 generates a sensing signal by combining at least one of the position touched by the driver, the length of the touch, the touched pattern, the touched area, the touch pressure, and the number of touches, and transmits the sensing signal to the processor 250.
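One way the combined sensing signal above could be interpreted is as a simple gesture classification; the data fields, thresholds, and gesture labels below are assumptions for illustration only:

```python
from dataclasses import dataclass

@dataclass
class TouchSample:
    position: float   # touched position along the rim (assumed units)
    duration_s: float # length of the touch in seconds
    count: int        # number of consecutive touches

def classify_touch(sample: TouchSample) -> str:
    """Combine touch count and duration into a gesture label,
    as the sensing signal described above might encode."""
    if sample.count >= 2:
        return "double_tap"
    if sample.duration_s >= 0.8:  # assumed long-press threshold
        return "long_press"
    return "tap"
```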

When there are a plurality of touch pads 222, the touch pads may be spaced apart from one another. Accordingly, it is possible to reduce erroneous operation caused by a driver's mistake or the like.

The memory 230 stores basic data for each unit of the guide information providing apparatus 200, control data (e.g., command words) for controlling the operation of each unit, data input to and output from the guide information providing apparatus 200, and at least one executable program.

The memory 230 may be various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like in hardware.

The memory 230 may store various data for operation of the guide information providing apparatus 200, such as a program for processing or controlling the processor 250.

The memory 230 may store data generated through the input unit 220. The memory 230 may store information or data received through the communication unit 210.

The interface unit 240 can exchange data with the vehicle 100. Specifically, the interface unit 240 can exchange data with, for example, the output unit 140, the sensing unit 160, and the control unit 170 of the vehicle 100.

The interface unit 240 can receive vehicle-related data or user input, or transmit a signal processed or generated by the processor 250 to the outside, by a wired or wireless communication method. In particular, the interface unit 240 may transmit a control signal generated by the processor 250 to at least one of the output unit 140, the sensing unit 160, the control unit 170, and the vehicle display device 400.

The control signal generated by the processor 250 may also be transmitted to the sensing unit 160, the output unit 140, the vehicle driving unit 150, or the other devices 310 and 320 via the control unit 170.

Meanwhile, the interface unit 240 can receive sensor information from the control unit 170 or the sensing unit 160. Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, vehicle interior temperature information, vehicle interior humidity information, and vehicle exterior illuminance information. Such sensor information may be acquired from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body inclination sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, an illuminance sensor, and the like. Meanwhile, the position module may include a GPS module for receiving GPS information. Among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like related to vehicle running may be referred to as vehicle running information.

The processor 250 can control the overall operation of each unit in the vehicle guide information providing apparatus 200.

The processor 250 may generate a control signal in accordance with a user input received via the input unit 220. The processor 250 may provide the control signal corresponding to the user input directly to the vehicle 100, or may provide it to the vehicle 100 through the communication unit 210 or the interface unit 240.

The processor 250 may determine whether a predetermined event has occurred. Here, the predetermined event may be, for example, a call incoming event for the mobile terminal 310 located in the vehicle 100, a notification event indicating that the notification time of a schedule registered in the memory 230 has arrived, a music reproduction event by a multimedia device, and the like. However, these are merely examples, and the event is not particularly limited as long as it can occur in association with the vehicle 100.

The processor 250 may provide a control signal corresponding to a user input signal to the vehicle driving unit 150. Specifically, the processor 250 can receive on/off inputs for various devices in the vehicle through the input unit 220. The processor 250 generates a control signal corresponding to the on/off input and provides the generated control signal to the lamp driving unit 154, the air conditioning driving unit 155, the window driving unit 156, or the sunroof driving unit 158.

The processor 250 may operate under the control of the control unit 170 or may control the control unit 170. The processor 250 may be implemented, in hardware, using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing other functions.

The power supply unit 260 can supply the power necessary for the operation of each unit of the vehicle guide information providing apparatus 200 under the control of the processor 250. In particular, the power supply unit 260 can be supplied with power from a battery or the like inside the vehicle 100.

FIG. 3 shows an exemplary interior view of the vehicle 100 related to the present invention.

Referring to FIG. 3, the driver 10 can check the outside of the vehicle 100 (for example, the situation ahead) through the windshield 11 of the vehicle 100. For example, the driver 10 can check the state of the road, the surrounding facilities, and the like located in front of the vehicle 100 through the windshield 11.

On the other hand, the display unit 141 may include two or more displays as described above. For example, as shown in FIG. 3, a transparent display 141a coupled to at least one area of the windshield 11 or replacing the windshield 11, a head-up display 141b for projecting a virtual image, a navigation display 141c for outputting route guidance information, multimedia reproduction information, and the like, and a dashboard display 141d may be arranged in the vehicle 100.

On the other hand, the control unit 170 can simultaneously output arbitrary information through two or more displays. For example, any information displayed on the transparent display 141a may be simultaneously displayed on the head-up display 141b and the navigation display 141c.

The camera 122 photographs an indoor image of the vehicle 100. To this end, the camera 122 may be disposed at a position capable of photographing at least the driver's seat. For example, as shown, the camera 122 may be disposed on one side of the upper end of the windshield 11 to generate a two-dimensional and/or three-dimensional image of an area including at least the driver's seat. The control unit 170 can extract a body region (e.g., the eyes and face) of the driver 10 of the vehicle 100 from the images provided by the camera 122 and monitor the state of the driver 10 based on the extracted body region.

The sound output unit 142 can output various sounds based on the electric signal provided from the control unit 170.

Meanwhile, when a plurality of displays 141a, 141b, 141c, and 141d are included in the display unit 141, at least one of the displays 141a, 141b, 141c, and 141d may be functionally connected to the guide information providing apparatus 200. For example, the control unit 170 may provide the driver with a user interface that allows the user to select at least one of the plurality of displays 141a, 141b, 141c, and 141d, and may connect at least one display corresponding to the user input received through the input unit 220 to the guide information providing apparatus 200.

FIGS. 4A to 4C show exemplary forms of the input unit 220 of the guide information providing apparatus 200 disposed on the steering wheel 121a according to an embodiment of the present invention.

Referring to FIG. 4A, the input unit 220 may include a plurality of buttons 221a and 221b. The plurality of buttons 221a and 221b may be disposed at different positions of the steering wheel 121a. As shown in the figure, when the first button 221a and the second button 221b are included in the input unit 220, the first button 221a may be disposed on the left side with respect to the center of the steering wheel 121a, and the second button 221b may be disposed on the right side with respect to the center of the steering wheel 121a. In this case, the first button 221a and the second button 221b may be disposed in an area other than the rim of the steering wheel 121a. Thereby, when the driver rotates the steering wheel 121a, the buttons can be prevented from being inadvertently pressed and an unintended operation from being performed. The processor 250 may link different operations to the first button 221a and the second button 221b, and the operations associated with the first button 221a and the second button 221b may be changed according to the type of the event detected by the processor 250.

Referring to FIG. 4B, the input unit 220 may include a plurality of touch pads 222a, 222b, 222c, and 222d. The plurality of touch pads 222a, 222b, 222c, and 222d may be spaced apart from each other along the curvature of the rim of the steering wheel 121a. For example, when the steering wheel 121a is circular and the input unit 220 includes four touch pads 222a, 222b, 222c, and 222d, they may be disposed between the 12 o'clock and 3 o'clock directions, between the 3 o'clock and 6 o'clock directions, between the 6 o'clock and 9 o'clock directions, and between the 9 o'clock and 12 o'clock directions of the rim. The processor 250 may link a specific operation to at least one of the plurality of touch pads 222a, 222b, 222c, and 222d when a predetermined event occurs. In this case, only one operation may be linked to any one of the touch pads, or two or more operations may be linked. When two or more operations are linked to any one of the touch pads, the processor 250 may associate a different touch pattern with each linked operation. For example, the operation associated with a first touch pattern for a touch pad may be different from the operation associated with a second touch pattern. Thus, a plurality of operations can be selectively executed through user inputs of different patterns on a common touch pad. That is, when a user input of the first touch pattern is received by a touch pad with which two or more operations are linked, the processor 250 executes the operation related to the first touch pattern, and when a user input of the second touch pattern is received, it executes the other operation related to the second touch pattern.
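The pattern-based selection described above can be sketched as a simple lookup keyed by a touch pad and a touch pattern. This is a minimal illustration only; the pad identifiers, pattern names, and operation names below are assumptions, not taken from the specification.

```python
# Hypothetical table: each (touch pad, touch pattern) pair is linked to at
# most one operation, so one pad can carry several operations at once.
TOUCH_PATTERN_MAP = {
    ("pad_12_to_3", "single_tap"): "connect_call",
    ("pad_12_to_3", "double_tap"): "reject_call",
    ("pad_3_to_6", "single_tap"): "next_track",
}

def handle_touch(pad_id, pattern):
    """Return the operation linked to the (pad, pattern) pair, or None."""
    return TOUCH_PATTERN_MAP.get((pad_id, pattern))
```

A pattern with no linked operation simply resolves to nothing, which matches the idea that only linked operations are selectable.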

Referring to FIG. 4C, the input unit 220 may include a plurality of buttons 221a and 221b and a plurality of touch pads 222a, 222b, 222c, and 222d. That is, the forms shown in FIGS. 4A and 4B can be combined. In this case, the processor 250 may link a specific operation associated with a predetermined event to at least one of the plurality of buttons 221a and 221b and the plurality of touch pads 222a, 222b, 222c, and 222d. For example, any one operation may be linked only to a single button 221 or a single touch pad 222. As another example, another operation may be linked only to a plurality of buttons, or only to a plurality of touch pads. As yet another example, another operation may be linked simultaneously to any one of the plurality of buttons 221a and 221b and any one of the plurality of touch pads 222a, 222b, 222c, and 222d. Various other combinations are possible.

FIG. 5 shows a flowchart of an exemplary process (S500) executed by the guide information providing apparatus 200 according to an embodiment of the present invention.

Referring to FIG. 5, in step S510, the processor 250 may determine whether a predetermined event has occurred. Here, the preset event may be stored in the memory 230, may be the default setting from the time of factory shipment, or may be set by the user. Examples of the preset event include a phone call incoming event, a message incoming event, a schedule notification event, a music reproduction event, a route search event, an obstacle detection event, a voice command event, and the like.

For example, the processor 250 may be connected to the mobile terminal 310 in the vehicle 100 through the communication unit 210 and monitor the telephone connection status of the mobile terminal 310 in real time or periodically, thereby determining that a call incoming event has occurred. In another example, schedule information may be registered by the user in the memory 230, and the processor 250 may compare the time of the registered schedule information with the current time to determine whether a schedule notification event has occurred. In another example, the processor 250 may receive music reproduction information from the multimedia device included in the vehicle 100 via the interface unit 240 and determine, based on the received music reproduction information, whether a music reproduction event has occurred.
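The three detection examples above can be condensed into one illustrative check. All parameter names and event labels are hypothetical stand-ins for the states the processor monitors, not identifiers from the specification.

```python
import datetime

def detect_event(phone_ringing, schedule_times, now, music_app_running):
    """Return the first preset event that currently applies, or None.

    phone_ringing     -- telephone connection status monitored via the terminal
    schedule_times    -- registered schedule datetimes compared with `now`
    music_app_running -- music reproduction info received via the interface
    """
    if phone_ringing:
        return "call_incoming"
    for when in schedule_times:
        if when <= now:            # a registered schedule time has been reached
            return "schedule_notification"
    if music_app_running:
        return "music_playback"
    return None
```

In practice such checks would run periodically (step S510 repeats until an event is found), but a single pass suffices to show the decision.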

The processor 250 may perform step S520 if a predetermined event has occurred and repeat step S510 if no event has occurred.

In step S520, the processor 250 determines at least one operation associated with the generated event. In the memory 230, information about a plurality of different events and information about at least one executable operation associated with each event can be stored in correlation with each other. The processor 250 may access the memory 230 and select at least one operation associated with the currently occurring event.

In one embodiment, upon occurrence of a particular event, the processor 250 may select two or more different operations associated with that event. For example, when the generated event is a call incoming event, the processor 250 may select the telephone connection operation and the telephone rejection operation among a plurality of operations related to the incoming call. As another example, if the generated event is a schedule notification, the processor 250 may select the schedule deletion operation and the schedule transmission operation among a plurality of operations related to the schedule notification. As another example, if the generated event is music playback, the processor 250 may select the previous music selection operation and the next music selection operation among a plurality of operations associated with music playback.
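The correlation stored in the memory 230 and the selection in step S520 can be sketched as a lookup table from which the first two operations are taken. The table contents and operation names are illustrative assumptions only.

```python
# Hypothetical memory-230 association: each preset event maps to its list of
# executable operations; step S520 selects (here) the first two of them.
EVENT_OPERATIONS = {
    "call_incoming": ["connect_call", "reject_call", "record_call", "register_spam"],
    "schedule_notification": ["delete_schedule", "send_schedule", "search_schedule"],
    "music_playback": ["previous_track", "next_track", "volume_up"],
}

def select_operations(event, count=2):
    """Select up to `count` operations associated with the occurred event."""
    return EVENT_OPERATIONS.get(event, [])[:count]
```

An event with no stored association yields an empty selection, in which case nothing would be linked to the input unit.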

In step S530, the processor 250 may link the operation selected in step S520 to the input unit 220.

The input unit 220 may include a first button 221a and a second button 221b disposed at different positions of the steering wheel 121a, and the processor 250 may link different operations to the first button 221a and the second button 221b.

For example, when a call incoming event occurs and the telephone connection operation and the telephone rejection operation are selected in step S520, the processor 250 may link the telephone connection operation to the first button 221a and the telephone rejection operation to the second button 221b.

In another example, when a schedule notification event occurs and the schedule deletion operation and the schedule transmission operation are selected in step S520, the processor 250 may link the schedule deletion operation to the first button 221a and the schedule transmission operation to the second button 221b.

As another example, when a music reproduction event occurs and the previous music selection operation and the next music selection operation are selected in step S520, the processor 250 may link the previous music selection operation to the first button 221a and the next music selection operation to the second button 221b.

According to this, at least one of a plurality of operations for each event can be linked to a common button. That is, the operation that the driver can execute through a given button can be changed according to the type of the event.
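The per-event rebinding of the same two physical buttons can be sketched as a nested lookup. The binding table below is a hypothetical illustration assembled from the three examples above.

```python
# Hypothetical step-S530 bindings: the same two steering-wheel buttons are
# rebound to different operations depending on which event occurred.
BUTTON_BINDINGS = {
    "call_incoming": {"button_1": "connect_call", "button_2": "reject_call"},
    "schedule_notification": {"button_1": "delete_schedule", "button_2": "send_schedule"},
    "music_playback": {"button_1": "previous_track", "button_2": "next_track"},
}

def operation_for(event, button):
    """Look up the operation currently linked to a button for the given event."""
    return BUTTON_BINDINGS[event][button]
```

Note that `button_1` resolves to a different operation under each event, which is exactly the "common button, event-dependent operation" behavior described.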

In one embodiment, the input unit 220 may include a plurality of touch pads 222a, 222b, 222c, and 222d disposed in different regions of the rim of the steering wheel 121a, and the processor 250 may link a different operation to each of the plurality of touch pads 222a, 222b, 222c, and 222d. For example, when a call incoming event occurs and the telephone connection operation and the telephone rejection operation are selected in step S520, the processor 250 may link the telephone connection operation to any one of the plurality of touch pads 222a, 222b, 222c, and 222d and the telephone rejection operation to another one. In another example, when a schedule notification event occurs and the schedule deletion operation and the schedule transmission operation are selected in step S520, the processor 250 may link the schedule deletion operation to any one of the plurality of touch pads 222a, 222b, 222c, and 222d and the schedule transmission operation to another one. In another example, when a music reproduction event occurs and the previous music selection operation and the next music selection operation are selected in step S520, the processor 250 may link the previous music selection operation to any one of the plurality of touch pads 222a, 222b, 222c, and 222d and the next music selection operation to another one. According to this, at least one of a plurality of operations for each event can be linked to a common touch pad. That is, the operation that the driver can execute through any one of the touch pads can be changed according to the type of the event.

In step S540, the processor 250 may generate guide information that guides at least one operation linked to the input unit 220. In an embodiment, when a plurality of buttons 221a and 221b are included in the input unit 220, the guide information may include an object that guides the operation linked to each of the plurality of buttons 221a and 221b. Likewise, when a plurality of touch pads 222a, 222b, 222c, and 222d are included in the input unit 220, the guide information may include an object that guides the operation linked to each of the plurality of touch pads 222a, 222b, 222c, and 222d. In this case, the guide information may further guide the operation method (for example, a single tap, a double tap, a long tap, a drag, a touch time, a touch direction, or the number of touches) required for execution of the operation linked to each of the plurality of touch pads 222a, 222b, 222c, and 222d.

In step S550, the processor 250 may control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S540. In this case, the object included in the guide information may be displayed in association with the plurality of buttons 221a and 221b or the plurality of touch pads 222a, 222b, 222c, and 222d included in the input unit 220.
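Steps S520 through S550 can be sketched end to end: given an occurred event, select the linked operations, build one guide object per button, and output them on a display (modelled here as a plain dictionary). All names are illustrative assumptions.

```python
# Hypothetical binding for one event type (see step S530).
BINDINGS = {
    "call_incoming": {"button_1": "connect_call", "button_2": "reject_call"},
}

def provide_guide(event, display):
    """Sketch of steps S520-S550 for an already-detected event."""
    ops = BINDINGS.get(event)                  # S520: operations for the event
    if ops is None:
        return None                            # nothing linked for this event
    # S540: one guide object per button describing its linked operation.
    guide = {btn: f"press {btn}: {op}" for btn, op in ops.items()}
    display.update(guide)                      # S550: output on the display
    return guide

display = {}
provide_guide("call_incoming", display)
```

The returned guide mirrors what would be rendered as icons and labels on the instrument panel display in FIGS. 7A and 7B.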

FIG. 6 shows a flow chart of a process (S600) in which the guide information providing apparatus 200 according to an embodiment of the present invention provides guide information about the steering wheel 121a in response to a call incoming event. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A.

In step S610, the guide information providing apparatus 200 may be connected to the mobile terminal 310 in the vehicle 100. That is, the guide information providing apparatus 200 may form a wired/wireless network for data transmission/reception with the mobile terminal 310 in the vehicle 100. The processor 250 may be directly connected to the mobile terminal 310 via the communication unit 210 or the interface unit 240 to monitor the state of the mobile terminal 310. Alternatively, the processor 250 may be connected, through the communication unit 210 or the interface unit 240, to the vehicle 100 forming the wired and/or wireless network with the mobile terminal 310, and may monitor the connection status of the mobile terminal 310. The wireless connection among the guide information providing apparatus 200, the vehicle 100, and the mobile terminal 310 may be performed by a wireless scheme such as Wi-Fi (wireless fidelity), Wi-Fi Direct, infrared, Zigbee, NFC (Near Field Communication), RFID (Radio-Frequency IDentification), Bluetooth, or UWB (UltraWideBand).

In step S620, the guide information providing apparatus 200 can determine whether a call incoming event has occurred in the mobile terminal 310. For example, when the mobile terminal 310 receives a telephone connection request from an arbitrary party, the mobile terminal 310 provides the corresponding data to the guide information providing apparatus 200 directly or through the vehicle 100, and the processor 250 can determine whether a call incoming event has occurred based on the data provided from the mobile terminal 310.

If a call incoming event of the mobile terminal 310 occurs, the processor 250 performs step S630. On the other hand, if a call incoming event of the mobile terminal 310 has not occurred, the processor 250 may repeat step S620.

In step S630, the guide information providing apparatus 200 can link the telephone connection operation, which is one of the operations related to the call incoming event, to the first button 221a. Further, in step S640, the guide information providing apparatus 200 may link the telephone rejection operation, which is another operation related to the call incoming event, to the second button 221b. However, the telephone connection operation and the telephone rejection operation are only a part of the operations related to the call incoming event, and the processor 250 may link other operations (e.g., recording of the call content, registration of a spam number) to the first button 221a or the second button 221b.

In addition, in one embodiment, the processor 250 may link another operation to the first button 221a in addition to the telephone connection operation, and may link another operation to the second button 221b in addition to the telephone rejection operation. That is, the processor 250 can link a plurality of operations to a single button. When a plurality of operations are linked to a single button, each operation is distinguished by the processor 250 in accordance with an operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) and can be selectively executed.
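Distinguishing several operations on one button by press pattern can be sketched as a classification step followed by a lookup. The 0.8 s long-press threshold, button names, and operation names are assumptions for illustration.

```python
# Hypothetical bindings: one button carries two operations, told apart
# by how long the button was held down.
PRESS_BINDINGS = {
    ("button_1", "short_press"): "connect_call",
    ("button_1", "long_press"): "record_call",
}

def classify_press(duration_s, long_threshold_s=0.8):
    """Classify a press as short or long by its duration (threshold assumed)."""
    return "long_press" if duration_s >= long_threshold_s else "short_press"

def dispatch(button, duration_s):
    """Return the operation selected by this button press, or None."""
    return PRESS_BINDINGS.get((button, classify_press(duration_s)))
```

Repeated or slow multi-press patterns could be handled the same way by classifying press counts and inter-press intervals instead of a single duration.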

In step S650, the guide information providing apparatus 200 may generate guide information for guiding the operations linked to the first button 221a and the second button 221b, respectively. At this time, the guide information can guide the operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) required to perform the operation linked to each button. For example, the processor 250 may generate guide information including a first object for guiding the telephone connection operation linked to the first button 221a and a second object for guiding the telephone rejection operation linked to the second button 221b.

In step S660, the guide information providing apparatus 200 can control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S650.

In step S670, the guide information providing apparatus 200 can execute either the telephone connection operation or the telephone rejection operation according to a user input received through the input unit 220 after the guide information is output. For example, the processor 250 may perform the telephone connection operation upon receiving a user input pressing the first button 221a, and may perform the telephone rejection operation upon receiving a user input pressing the second button 221b. At this time, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may not perform any operation. Alternatively, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may execute another operation (e.g., a spam number registration operation) related to the call incoming event in addition to the telephone connection operation and the telephone rejection operation.
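The simultaneous-press handling just described can be sketched by mapping the set of currently pressed buttons to one operation. Choosing the spam-number registration for the combined press follows the alternative in the text; all identifiers are hypothetical.

```python
def resolve_presses(pressed):
    """Map the set of currently pressed buttons to one operation, or None."""
    if pressed == {"button_1", "button_2"}:
        # Alternative behavior: a third call-related operation; the other
        # alternative in the text is to simply do nothing (return None).
        return "register_spam"
    if pressed == {"button_1"}:
        return "connect_call"
    if pressed == {"button_2"}:
        return "reject_call"
    return None
```

Treating the button state as a set makes the two alternatives (ignore vs. third operation) a one-line policy change.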

FIGS. 7A and 7B illustrate an instrument panel display 141d that is functionally connected to the guide information providing apparatus 200 according to an embodiment of the present invention to display guide information.

Referring to FIG. 7A, the instrument panel display 141d may include a plurality of screens 710, 720, 730, and 740.

When none of the predetermined events occurs, different information related to the state of the vehicle 100 may be displayed on each of the plurality of screens 710, 720, 730, and 740, as shown in the figure. For example, an indicator (e.g., 'P', 'R', 'N', 'D') indicating the position of the transmission may be displayed on the screen 710. On the screen 720, a water temperature indicator indicating the cooling water temperature may be displayed. On the screen 730, a fuel gauge indicator indicating the remaining amount of fuel may be displayed. On the screen 740, warning and indicator lights for notifying the engine oil level, the lighting state, the engine preheating state, the charging state, the tire state, and the like may be displayed.

FIG. 7B illustrates the guide information associated with FIG. 6 displayed via the instrument panel display 141d in response to the occurrence of a call incoming event of the mobile terminal 310 in the vehicle 100. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A, that the telephone connection operation is linked to the first button 221a, and that the telephone rejection operation is linked to the second button 221b.

The processor 250 may receive the call reception information (e.g., the call requestor information) from the mobile terminal 310 and control the instrument panel display 141d to display it on the screen 710. Accordingly, as compared with FIG. 7A, the indicator indicating the position of the transmission before the occurrence of the call incoming event can be replaced with the incoming call information 711.

An icon 721 for guiding the telephone connection operation linked to the first button 221a may be displayed on the screen 720, and an icon 731 for guiding the telephone rejection operation linked to the second button 221b may be displayed on the screen 730.

The screen 740 may display an icon 741 for guiding the steering wheel 121a and the first button 221a and second button 221b disposed on the steering wheel 121a.

If the user presses the first button 221a of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone connection operation to the mobile terminal 310 so that a call with the other party is performed. On the other hand, when the user presses the second button 221b of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone rejection operation to the mobile terminal 310 so that the call request from the other party is blocked.

FIG. 8 shows a flowchart of a process (S800) in which the guide information providing apparatus 200 according to an embodiment of the present invention provides guide information about the steering wheel 121a in response to a schedule notification event. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A.

In step S810, the guide information providing apparatus 200 may determine whether a schedule notification event has occurred. For example, the processor 250 may determine whether a schedule notification event has occurred based on the schedule information of the user stored in the memory 230. In another example, the guide information providing apparatus 200 may be connected to the mobile terminal 310 in the vehicle 100 to collect schedule information of the user registered in the mobile terminal 310 and, based on the collected schedule information, determine whether a schedule notification event has occurred.

If a schedule notification event occurs, the processor 250 performs step S820. On the other hand, if a schedule notification event has not occurred, the processor 250 may repeat step S810.

In step S820, the guide information providing apparatus 200 may link the schedule deletion operation, which is one of the operations related to the schedule notification event, to the first button 221a. Also, in step S830, the guide information providing apparatus 200 may link the schedule transmission operation, which is another operation related to the schedule notification event, to the second button 221b. Here, the schedule deletion operation deletes the schedule stored in the memory 230 or the mobile terminal 310, and the schedule transmission operation transmits the schedule stored in the memory 230 or the mobile terminal 310 to another device. However, the schedule deletion operation and the schedule transmission operation are only a part of the operations related to the schedule notification event, and the processor 250 may link other operations (e.g., a schedule search operation or a schedule change operation) to the first button 221a or the second button 221b.

In addition, in one embodiment, the processor 250 may link another operation to the first button 221a in addition to the schedule deletion operation, and another operation to the second button 221b in addition to the schedule transmission operation. That is, the processor 250 can link a plurality of operations to a single button. When a plurality of operations are linked to a single button, each operation is distinguished by the processor 250 in accordance with an operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) and can be selectively executed.

In step S840, the guide information providing apparatus 200 may generate guide information for guiding the operations linked to the first button 221a and the second button 221b, respectively. At this time, the guide information can guide the operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) required to perform the operation linked to each button. For example, the processor 250 may generate guide information including a first object for guiding the schedule deletion operation linked to the first button 221a and a second object for guiding the schedule transmission operation linked to the second button 221b.

In step S850, the guide information providing device 200 can control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S840.

In step S860, the guide information providing apparatus 200 can execute either the schedule deletion operation or the schedule transmission operation according to a user input received through the input unit 220 after the guide information is output. For example, the processor 250 may perform the schedule deletion operation upon receiving a user input pressing the first button 221a, and the schedule transmission operation upon receiving a user input pressing the second button 221b. At this time, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may not perform any operation. Alternatively, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may execute another operation (e.g., a schedule search operation) related to the schedule notification event in addition to the schedule deletion operation and the schedule transmission operation.

FIG. 9 illustrates guide information related to FIG. 8 displayed through the instrument panel display 141d shown in FIG. 7A according to the occurrence of a schedule notification event among the preset events. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A, that the schedule deletion operation is linked to the first button 221a, and that the schedule transmission operation is linked to the second button 221b.

Specifically, upon occurrence of a schedule notification event, the processor 250 may control the dashboard display 141d to display schedule information corresponding to the schedule notification event on the screen 710. Accordingly, as compared with FIG. 7A, the indicator indicating the position of the transmission can be replaced with the schedule guide information 712 corresponding to the schedule notification event.

An icon 722 for guiding the schedule deletion operation linked to the first button 221a may be displayed on the screen 720, and an icon 732 for guiding the schedule transmission operation linked to the second button 221b may be displayed on the screen 730.

The screen 740 may display an icon 741 for guiding the steering wheel 121a and the first button 221a and second button 221b disposed on the steering wheel 121a.

If the user presses the first button 221a of the steering wheel 121a, the processor 250 may transmit an execution command of the schedule deletion operation to the vehicle 100, and the vehicle 100 may delete the schedule information 712 guided through the screen 710 from the memory 130 or the mobile terminal 310 in response to the command. On the other hand, when the user presses the second button 221b of the steering wheel 121a, the processor 250 may transmit an execution command of the schedule transmission operation to the vehicle 100, and the vehicle 100 may transmit the schedule information 712 guided through the screen 710 to another device (e.g., the mobile terminal 310) in response to the command.

FIG. 10 shows a flowchart of a process (S1000) in which the guide information providing apparatus 200 according to an embodiment of the present invention provides guide information for the steering wheel 121a in response to a music playback event. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A.

In step S1010, the guide information providing apparatus 200 can determine whether a music reproduction event has occurred. For example, the guide information providing apparatus 200 may be connected to the multimedia device of the vehicle 100 or the mobile terminal 310 in the vehicle 100 to determine whether a multimedia application or a music application is being executed, and when such an application is executed, it can determine that a music reproduction event has occurred.

If a music reproduction event occurs, the processor 250 performs step S1020. On the other hand, if no music reproduction event has occurred, the processor 250 may repeat step S1010.

In step S1020, the guide information providing apparatus 200 may link the previous music selection operation, which is one of the operations related to the music reproduction event, to the first button 221a. Also, in step S1030, the guide information providing apparatus 200 may link the next music selection operation, which is another operation related to the music reproduction event, to the second button 221b. However, the previous music selection operation and the next music selection operation are only a part of the operations related to the music reproduction event, and the processor 250 may link other operations (e.g., a music volume increase/decrease operation, a music file download operation, or a music file deletion operation) to the first button 221a or the second button 221b.

Further, in one embodiment, the processor 250 may link another operation to the first button 221a in addition to the previous music selection operation, and may link another operation to the second button 221b in addition to the next music selection operation. That is, the processor 250 can link a plurality of operations to a single button. When a plurality of operations are linked to a single button, each operation is distinguished by the processor 250 in accordance with an operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) and can be selectively executed.

In step S1040, the guide information providing apparatus 200 may generate guide information for guiding the operations linked to the first button 221a and the second button 221b, respectively. At this time, the guide information can guide the operation pattern (e.g., a short press, a long press, several fast presses, or several slow presses) required to perform the operation linked to each button. For example, the processor 250 may generate guide information including a first object for guiding the previous music selection operation linked to the first button 221a and a second object for guiding the next music selection operation linked to the second button 221b.

In step S1050, the guide information providing apparatus 200 can control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S1040.

In step S1060, the guide information providing apparatus 200 may execute either the previous music selection operation or the next music selection operation according to a user input received through the input unit 220 after the guide information is output. For example, the processor 250 may perform the previous music selection operation upon receiving a user input pressing the first button 221a, and may perform the next music selection operation upon receiving a user input pressing the second button 221b. At this time, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may not perform any operation. Alternatively, when the first button 221a and the second button 221b are pressed simultaneously, the processor 250 may perform another operation (e.g., a music file deletion operation) related to the music reproduction event.

FIG. 11 illustrates guide information related to FIG. 10 displayed through the instrument panel display 141d shown in FIG. 7A according to the occurrence of a music playback event among the preset events. For convenience of explanation, it is assumed that the input unit 220 includes a first button 221a and a second button 221b as shown in FIG. 4A, that the previous music selection operation is linked to the first button 221a, and that the next music selection operation is linked to the second button 221b.

Specifically, upon occurrence of a music playback event, the processor 250 may control the dashboard display 141d to display a thumbnail of the currently played music file and the playback order on the screen 710. Accordingly, as compared to FIG. 7A, the indicator indicating the position of the transmission can be replaced with a thumbnail 713 of the currently playing music file (e.g., 'MUSIC 11').

In addition, a thumbnail 723 of the music file (e.g., 'MUSIC 10') corresponding to the previous song selection operation linked to the first button 221a may be displayed on the screen 720, and a thumbnail 733 of the music file (e.g., 'MUSIC 12') corresponding to the next song selection operation linked to the second button 221b may be displayed on the screen 730.

The screen 740 may display an icon 741 guiding the steering wheel 121a and the first button 221a and second button 221b disposed on the steering wheel 121a.

Assuming that the music playback event originates from the multimedia device of the vehicle 100, when the user presses the first button 221a of the steering wheel 121a, the processor 250 may transmit an execution command of the previous song selection operation to the multimedia device of the vehicle 100, and the multimedia device may start playing the music file corresponding to the thumbnail 723 displayed on the screen 720 in response to that command. On the other hand, when the user presses the second button 221b of the steering wheel 121a, the processor 250 may transmit an execution command of the next song selection operation to the multimedia device of the vehicle 100, and the multimedia device may start playing the music file corresponding to the thumbnail 733 displayed on the screen 730 in response to that command.

FIG. 12 is a flowchart illustrating a process (S1200) of providing guide information for a plurality of touch pads 222a, 222b, 222c, and 222d disposed on the steering wheel 121a, according to an embodiment of the present invention. For convenience of explanation, it is assumed that the input unit 220 includes the plurality of touch pads 222a, 222b, 222c, and 222d as shown in FIG. 4B.

In step S1210, the guide information providing device 200 may determine whether a preset event has occurred. As described above, the preset event may include, for example, an incoming call, a schedule notification, music playback, and the like. If a preset event has occurred, the processor 250 performs step S1220. Otherwise, the processor 250 may repeat step S1210.

In step S1220, the guide information providing device 200 may select a first operation and a second operation related to the preset event. In this case, the first operation and the second operation may vary according to the kind of the event. For example, if the event is an incoming call, the processor 250 may select the telephone connection operation as the first operation and the telephone rejection operation as the second operation. As another example, if the event is a schedule notification, the processor 250 may select the schedule deletion operation as the first operation and the schedule transfer operation as the second operation. As another example, if the event is music playback, the processor 250 may select the previous song selection operation as the first operation and the next song selection operation as the second operation.
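The per-event selection of step S1220 amounts to a lookup table over the event types enumerated above. A sketch, with hypothetical operation identifiers:

```python
# Hypothetical mapping for step S1220: each preset event type is paired
# with a (first_operation, second_operation) tuple, following the
# examples given in the text.
EVENT_OPERATIONS = {
    "incoming_call":   ("connect_call", "reject_call"),
    "schedule_notice": ("delete_schedule", "transfer_schedule"),
    "music_playback":  ("previous_song", "next_song"),
}

def select_operations(event):
    """Return the (first, second) operations for a preset event."""
    return EVENT_OPERATIONS[event]
```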

In step S1230, the guide information providing device 200 may determine the driver's grip state with respect to the steering wheel 121a. More specifically, the guide information providing apparatus 200 may determine whether the driver is gripping the steering wheel 121a based on the sensing signals provided from the plurality of touch pads 222a, 222b, 222c, and 222d disposed on the steering wheel 121a. In addition, when it is determined that the driver is gripping the steering wheel 121a, the guide information providing device 200 may determine whether the driver is gripping the steering wheel 121a with one hand or with both hands, and which portion of the steering wheel 121a is gripped. For example, when the driver's touch is detected by two or more of the plurality of touch pads 222a, 222b, 222c, and 222d, the processor 250 may determine that the driver is holding the steering wheel 121a with both hands.
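The grip-state determination of step S1230 can be illustrated as follows, assuming each touch pad simply reports whether it senses contact (a simplification of the sensing signals described above; the classification rule follows the example in the text):

```python
def grip_state(touched_pads):
    """Classify the driver's grip from the set of pads reporting a touch.

    `touched_pads` is the subset of {"222a", "222b", "222c", "222d"}
    whose sensing signal indicates contact (hypothetical representation).
    """
    n = len(touched_pads)
    if n == 0:
        return "not_gripping"
    if n == 1:
        return "one_hand"
    return "both_hands"  # two or more pads touched
```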

In step S1240, the guide information providing device 200 may select, from among the plurality of touch pads 222a, 222b, 222c, and 222d, a first touch pad and a second touch pad that are not gripped by the driver. For example, when the driver's touch is detected by one of the plurality of touch pads 222a, 222b, 222c, and 222d, the processor 250 may select one of the remaining touch pads as the first touch pad and another of the remaining touch pads as the second touch pad.
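The selection of ungripped pads in step S1240 can be sketched as follows. The pad identifiers follow the reference numerals in the text; the order in which the two free pads are chosen is an assumption, as the patent does not fix it:

```python
ALL_PADS = ["222a", "222b", "222c", "222d"]

def select_free_pads(touched_pads):
    """Step S1240: pick the first and second touch pads from those
    not currently gripped by the driver."""
    free = [p for p in ALL_PADS if p not in touched_pads]
    if len(free) < 2:
        return None  # not enough free pads to assign both operations
    return free[0], free[1]  # (first_touch_pad, second_touch_pad)
```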

In step S1250, the guide information providing device 200 may link the first operation to the first touch pad. Further, in step S1260, the guide information providing device 200 may link the second operation to the second touch pad. For example, when the event is an incoming call, the processor 250 may link the telephone connection operation to the first touch pad and the telephone rejection operation to the second touch pad. As another example, when the event is a schedule notification, the processor 250 may link the schedule deletion operation to the first touch pad and the schedule transfer operation to the second touch pad. As another example, when the event is music playback, the processor 250 may link the previous song selection operation to the first touch pad and the next song selection operation to the second touch pad.

In step S1270, the guide information providing apparatus 200 may generate guide information for guiding the first operation linked to the first touch pad and the second operation linked to the second touch pad.

In step S1280, the guide information providing device 200 may control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S1270.

In step S1290, the guide information providing device 200 may execute either the first operation or the second operation according to the user input received through the input unit 220 after the guide information is output. For example, the processor 250 may execute the first operation upon receiving a designated user input on the first touch pad, and the second operation upon receiving a designated user input on the second touch pad. At this time, when user inputs are received simultaneously through the first touch pad and the second touch pad, the processor 250 may perform no operation at all. Alternatively, in that case, the processor 250 may perform an operation other than the first operation and the second operation.
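The guarded execution of step S1290, including the rule that simultaneous inputs perform nothing, can be illustrated as follows (hypothetical names; a real implementation would also account for the alternative behavior of running a third operation):

```python
def dispatch(pad_inputs, bindings):
    """Step S1290: run the operation bound to the touched pad.

    `pad_inputs` is the set of pads that received the designated user
    input; `bindings` maps pad id -> operation name (both hypothetical).
    """
    touched = [p for p in pad_inputs if p in bindings]
    if len(touched) != 1:
        return None  # simultaneous (or no) input: perform nothing
    return bindings[touched[0]]
```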

FIGS. 13 and 14 illustrate guide information related to FIG. 12 displayed on the windshield 11 upon the occurrence of an incoming call event, one of the preset events. For convenience of explanation, it is assumed that the telephone connection operation and the telephone rejection operation are selected as the operations related to the incoming call event.

Referring to FIG. 13, upon occurrence of an incoming call event, the processor 250 determines, based on the sensing signals provided from the plurality of touch pads 222a, 222b, 222c, and 222d, which portion of the steering wheel 121a is currently gripped by the driver, that is, that the touch pads 222a and 222b are being gripped.

At this time, the processor 250 may link the touch pads 222c and 222d, which are not gripped by the driver, to the telephone connection operation and the telephone rejection operation, respectively. That is, the telephone connection operation may be linked to the touch pad 222c, and the telephone rejection operation may be linked to the touch pad 222d.

The processor 250 may generate guide information for guiding the operations linked to the touch pads 222c and 222d, and then provide the guide information to the vehicle 100. At this time, the processor 250 may provide the incoming call information (e.g., caller information) received from the mobile terminal 310 to the vehicle 100 together with the guide information.

The display unit 141 may output the guide information and the incoming call information received from the processor 250 on the windshield 11. For example, the processor 250 may transmit the guide information and the incoming call information, via the communication unit 210 or the interface unit 240, to at least one of the transparent display 141a and the head-up display 141b shown in FIG. 3, to which it is operatively connected.

At least one of the transparent display 141a and the head-up display 141b may output an image corresponding to the guide information and the incoming call information received from the guide information providing device 200 to a predetermined area 1310 of the windshield 11.

The image in the predetermined area 1310 of the windshield 11 may include a plurality of objects for guiding the guide information and the incoming call information. More specifically, an object 1320 corresponding to the incoming call information, an object 1330 corresponding to the steering wheel 121a, objects 1341 and 1342 corresponding to the touch pads 222a and 222b currently gripped by the driver, an object 1351 corresponding to the touch pad 222c, an object 1352 corresponding to the touch pad 222d, an object 1361 guiding the telephone connection operation linked to the touch pad 222c, and an object 1362 guiding the telephone rejection operation linked to the touch pad 222d may be displayed in the predetermined area 1310 of the windshield 11.

If the user applies a touch of a predetermined operation pattern to the touch pad 222c of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone connection operation to the mobile terminal 310 so that the call with the other party can be performed. On the other hand, if the user applies a touch of a predetermined operation pattern to the touch pad 222d of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone rejection operation to the mobile terminal 310 so that the call request from the other party is rejected.

Next, FIG. 14 illustrates a case where, unlike FIG. 13, the two touch pads 222c and 222d disposed at the lower end of the steering wheel 121a are gripped by the driver. When an incoming call event occurs, the processor 250 determines, based on the sensing signals provided from the plurality of touch pads 222a, 222b, 222c, and 222d, that the touch pads 222c and 222d are currently gripped by the driver.

At this time, the processor 250 may link the touch pads 222a and 222b, which are not gripped by the driver, to the telephone connection operation and the telephone rejection operation, respectively. That is, the telephone connection operation may be linked to the touch pad 222a, and the telephone rejection operation may be linked to the touch pad 222b.

The processor 250 may generate guide information for guiding the operations linked to the touch pads 222a and 222b, and then provide the guide information to the vehicle 100. At this time, the processor 250 may provide the incoming call information (e.g., caller information) received from the mobile terminal 310 to the vehicle 100 together with the guide information.

The display unit 141 may output the guide information and the incoming call information received from the processor 250 on the windshield 11. For example, the processor 250 may transmit the guide information and the incoming call information, via the communication unit 210 or the interface unit 240, to at least one of the transparent display 141a and the head-up display 141b shown in FIG. 3, to which it is operatively connected.

At least one of the transparent display 141a and the head-up display 141b may output an image corresponding to the guide information and the incoming call information received from the guide information providing device 200 to a predetermined area 1410 of the windshield 11.

The image in the predetermined area 1410 of the windshield 11 may include a plurality of objects for guiding the guide information and the incoming call information. Specifically, an object 1420 corresponding to the incoming call information, an object 1430 corresponding to the steering wheel 121a, objects 1441 and 1442 corresponding to the touch pads 222c and 222d currently gripped by the driver, an object 1451 corresponding to the touch pad 222a, an object 1452 corresponding to the touch pad 222b, an object 1461 guiding the telephone connection operation linked to the touch pad 222a, and an object 1462 guiding the telephone rejection operation linked to the touch pad 222b may be displayed in the predetermined area 1410 of the windshield 11.

If the user applies a touch of a predetermined operation pattern to the touch pad 222a of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone connection operation to the mobile terminal 310 so that the call with the other party can be performed. On the other hand, if the user applies a touch of a predetermined operation pattern to the touch pad 222b of the steering wheel 121a, the processor 250 may transmit an execution command of the telephone rejection operation to the mobile terminal 310 so that the call request from the other party is rejected.

As described above with reference to FIGS. 13 and 14, the operations related to an event are linked to the touch pads not currently gripped by the driver, and when the driver's grip state changes, the guide information is reconstructed according to the changed grip state, so that an unintentional operation by the user can be prevented.

FIG. 15 is a flowchart illustrating a process (S1500) of providing guide information for a plurality of touch pads 222a, 222b, 222c, and 222d disposed on the steering wheel 121a, according to an embodiment of the present invention. For convenience of explanation, it is assumed that the input unit 220 includes the plurality of touch pads 222a, 222b, 222c, and 222d as shown in FIG. 4B.

In step S1510, the guide information providing device 200 may determine whether a preset event has occurred. As described above, the preset event may include, for example, an incoming call, a schedule notification, music playback, and the like. If a preset event has occurred, the processor 250 performs step S1520. Otherwise, the processor 250 may repeat step S1510.

In step S1520, the guide information providing device 200 may select a first operation and a second operation related to the preset event. In this case, the first operation and the second operation may vary according to the kind of the event. For example, if the event is an incoming call, the processor 250 may select the telephone connection operation as the first operation and the telephone rejection operation as the second operation. As another example, if the event is a schedule notification, the processor 250 may select the schedule deletion operation as the first operation and the schedule transfer operation as the second operation. As another example, if the event is music playback, the processor 250 may select the previous song selection operation as the first operation and the next song selection operation as the second operation.

In step S1530, the guide information providing device 200 may determine whether two of the plurality of touch pads 222a, 222b, 222c, and 222d are gripped by the driver. That is, the guide information providing device 200 may determine whether the driver is holding the steering wheel 121a with both hands. More specifically, the guide information providing apparatus 200 may determine whether the driver is gripping the steering wheel 121a based on the sensing signals provided from the plurality of touch pads 222a, 222b, 222c, and 222d disposed on the steering wheel 121a. In addition, when it is determined that the driver is gripping the steering wheel 121a, the guide information providing device 200 may determine whether the driver is gripping the steering wheel 121a with one hand or with both hands, and which portion of the steering wheel 121a is gripped. For example, when the driver's touch is detected by any two of the plurality of touch pads 222a, 222b, 222c, and 222d, the processor 250 may determine that the driver is holding the steering wheel 121a with both hands.

In step S1540, the guide information providing device 200 may link the first operation to the first touch pad, which is one of the two touch pads gripped by the driver. Also, in step S1550, the guide information providing device 200 may link the second operation to the second touch pad, which is the other of the two touch pads gripped by the driver. For example, when the event is an incoming call, the processor 250 may link the telephone connection operation to the first touch pad and the telephone rejection operation to the second touch pad. As another example, when the event is a schedule notification, the processor 250 may link the schedule deletion operation to the first touch pad and the schedule transfer operation to the second touch pad. As another example, when the event is music playback, the processor 250 may link the previous song selection operation to the first touch pad and the next song selection operation to the second touch pad.
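Steps S1530 through S1550 differ from FIG. 12 in that the operations are bound to the pads the driver is already holding. A sketch under the same simplified sensing model, with hypothetical names; which gripped pad becomes the "first" is an assumption:

```python
def link_to_gripped_pads(sensing, first_op, second_op):
    """Steps S1530-S1550 (FIG. 15): when exactly two pads report a grip,
    bind the first and second operations to those gripped pads.

    `sensing` maps pad id -> touched flag (hypothetical representation).
    Returns None when the driver is not holding the wheel with both hands.
    """
    held = [pad for pad, touched in sorted(sensing.items()) if touched]
    if len(held) != 2:
        return None  # not a two-pad (both-hands) grip
    first_pad, second_pad = held
    return {first_pad: first_op, second_pad: second_op}
```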

In step S1560, the guide information providing device 200 may generate guide information for guiding the first operation linked to the first touch pad, the second operation linked to the second touch pad, the operation pattern of the first touch pad for executing the first operation, and the operation pattern of the second touch pad for executing the second operation. In this case, the guide information may include an object that guides the driver's grip state.

In step S1570, the guide information providing device 200 may control the display unit 141 of the vehicle 100 to output an image corresponding to the guide information generated in step S1560.

In step S1580, the guide information providing device 200 may execute either the first operation or the second operation according to the user input received through the input unit 220 after the guide information is output. For example, the processor 250 may execute the first operation upon receiving a designated user input on the first touch pad, and the second operation upon receiving a designated user input on the second touch pad. At this time, when user inputs are received simultaneously through the first touch pad and the second touch pad, the processor 250 may perform no operation at all. Alternatively, in that case, the processor 250 may perform an operation other than the first operation and the second operation.

FIG. 16 illustrates guide information related to FIG. 15 displayed on the windshield 11 upon the occurrence of an incoming call event, one of the preset events. For convenience of explanation, it is assumed that the telephone connection operation and the telephone rejection operation are selected as the operations related to the incoming call event.

Referring to FIG. 16, upon occurrence of an incoming call event, the processor 250 determines, based on the sensing signals provided from the plurality of touch pads 222a, 222b, 222c, and 222d, which portion of the steering wheel 121a is currently gripped by the driver, that is, that the touch pads 222a and 222b are being gripped.

At this time, the processor 250 may link the touch pads 222a and 222b, which are gripped by the driver, to the telephone connection operation and the telephone rejection operation, respectively. That is, the telephone connection operation may be linked to the touch pad 222a, and the telephone rejection operation may be linked to the touch pad 222b.

The processor 250 may generate guide information for guiding the operations linked to the touch pads 222a and 222b, and then provide the guide information to the vehicle 100. At this time, the processor 250 may provide the incoming call information (e.g., caller information) received from the mobile terminal 310 to the vehicle 100 together with the guide information.

The processor 250 may transmit the guide information and the incoming call information, via the communication unit 210 or the interface unit 240, to at least one of the transparent display 141a and the head-up display 141b shown in FIG. 3, to which it is operatively connected.

At least one of the transparent display 141a and the head-up display 141b may output an image corresponding to the guide information and the incoming call information received from the guide information providing device 200 to a predetermined area 1610 of the windshield 11.

The image in the predetermined area 1610 of the windshield 11 may include a plurality of objects for guiding the guide information and the incoming call information. Specifically, an object 1620 corresponding to the incoming call information, an object 1630 corresponding to the steering wheel 121a, objects 1631 and 1632 guiding the driver's hand positions with respect to the steering wheel 121a, an object 1641 guiding the telephone connection operation linked to the touch pad 222a, an object 1642 guiding the telephone rejection operation linked to the touch pad 222b, an object 1651 guiding the operation pattern of the touch pad 222a for executing the telephone connection operation, and an object 1652 guiding the operation pattern of the touch pad 222b for executing the telephone rejection operation may be displayed in the predetermined area 1610 of the windshield 11. For example, as shown, the object 1651 may guide a clockwise drag input on the touch pad 222a, and the object 1652 may guide a counterclockwise drag input on the touch pad 222b.

If the user applies a clockwise drag input to the touch pad 222a, the processor 250 may transmit an execution command of the telephone connection operation to the mobile terminal 310 so that the call with the other party can be performed. On the other hand, if the user applies a counterclockwise drag input to the touch pad 222b, the processor 250 may transmit an execution command of the telephone rejection operation to the mobile terminal 310 so that the call request from the other party is rejected.
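For illustration, the clockwise/counterclockwise drag distinction shown in FIG. 16 could be implemented with a signed-area (shoelace) test over the traced touch trajectory. This method and the command names are assumptions; the patent only specifies the gesture-to-operation mapping:

```python
def classify_drag(points):
    """Classify a roughly circular drag as clockwise or counterclockwise
    using the signed-area (shoelace) sum of the traced path.

    Coordinates are screen-style (y grows downward), so a negative sum
    means the path runs clockwise as seen by the user. `points` is a
    list of (x, y) touch samples (hypothetical representation).
    """
    signed = sum((x2 - x1) * (y2 + y1)
                 for (x1, y1), (x2, y2) in zip(points, points[1:]))
    return "connect_call" if signed < 0 else "reject_call"
```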

The embodiments of the present invention described above may be implemented not only by the apparatus and method, but also through a program that realizes the functions corresponding to the configurations of the embodiments, or a recording medium on which the program is recorded; such implementations can easily be made by those skilled in the art from the description of the embodiments above.

In addition, the present invention is not limited to the embodiments and drawings described above; all or some of the embodiments may be selectively combined so that various modifications may be made without departing from the spirit and scope of the invention.

100: vehicle
200: guide information providing device

Claims (15)

A vehicle guide information providing device comprising:
an input unit (220) including a plurality of touch pads (222) disposed in a rim of a steering wheel; and
a processor (250) configured to: upon occurrence of a preset event, select at least one operation related to the event,
determine a grip position of the driver with respect to the rim based on sensing signals provided from the plurality of touch pads (222),
select at least one touch pad not gripped by the driver from among the plurality of touch pads (222) and link the selected touch pad to the selected at least one operation, and
generate guide information for guiding the selected at least one operation and control a display unit (141) of the vehicle to output an image corresponding to the guide information.
The vehicle guide information providing device according to claim 1,
wherein the display unit (141)
includes a plurality of displays disposed at different positions of the vehicle, and
wherein the processor (250)
is operatively connected to at least one of the plurality of displays.
The vehicle guide information providing device according to claim 1,
wherein the at least one operation related to the event
includes a first operation and a second operation different from the first operation, from among a plurality of operations executable by the vehicle guide information providing device.
The vehicle guide information providing device according to claim 3,
wherein the processor (250),
when the event is an incoming call, selects a telephone connection operation as the first operation and a telephone rejection operation as the second operation.
The vehicle guide information providing device according to claim 3,
wherein the processor (250),
when the event is a schedule notification, selects a schedule deletion operation as the first operation and a schedule transfer operation as the second operation.
The vehicle guide information providing device according to claim 3,
wherein the processor (250),
when the event is music playback, selects a previous song selection operation as the first operation and a next song selection operation as the second operation.
The vehicle guide information providing device according to claim 3,
wherein the input unit (220)
further includes a first button and a second button disposed at different positions of the steering wheel.
The vehicle guide information providing device according to claim 7,
wherein the processor (250)
links the first operation and the second operation to the first button and the second button, respectively, and
generates the guide information including a first object guiding the first operation linked to the first button and a second object guiding the second operation linked to the second button.
delete
The vehicle guide information providing device according to claim 3,
wherein the plurality of touch pads (222) include a first touch pad and a second touch pad, and
wherein the processor (250)
links the first operation and the second operation to the first touch pad and the second touch pad, respectively, and
generates the guide information including a third object guiding the first operation linked to the first touch pad and a fourth object guiding the second operation linked to the second touch pad.
The vehicle guide information providing device according to claim 10,
wherein the third object further guides an operation pattern of the first touch pad required for execution of the first operation, and
wherein the fourth object further guides an operation pattern of the second touch pad required for execution of the second operation.
delete
The vehicle guide information providing device according to claim 11,
wherein the processor (250)
generates the guide information further including a fifth object for guiding the grip state of the driver.
The vehicle guide information providing device according to claim 1,
wherein the processor (250)
selects a first operation and a second operation related to the event,
when two of the touch pads are gripped by the driver, links one of the two touch pads to the first operation and the other of the two touch pads to the second operation, and
generates guide information for an operation pattern corresponding to the first operation and an operation pattern corresponding to the second operation.
The vehicle guide information providing device according to claim 1,
wherein the processor (250)
executes, based on a user input received through the input unit (220), an operation corresponding to the user input from among the at least one operation linked via the input unit (220).
KR1020150131804A 2015-09-17 2015-09-17 Apparatus and method for providing guide information for vehicle KR101748258B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150131804A KR101748258B1 (en) 2015-09-17 2015-09-17 Apparatus and method for providing guide information for vehicle


Publications (2)

Publication Number Publication Date
KR20170033700A KR20170033700A (en) 2017-03-27
KR101748258B1 true KR101748258B1 (en) 2017-06-16

Family

ID=58496915

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150131804A KR101748258B1 (en) 2015-09-17 2015-09-17 Apparatus and method for providing guide information for vehicle

Country Status (1)

Country Link
KR (1) KR101748258B1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003063326A (en) * 2001-08-28 2003-03-05 Nissan Motor Co Ltd Steering switch
JP2006298241A (en) * 2005-04-22 2006-11-02 Toyota Motor Corp Display device for vehicle
JP2011213343A (en) * 2010-03-31 2011-10-27 Tk Holdings Inc Steering wheel sensors


Also Published As

Publication number Publication date
KR20170033700A (en) 2017-03-27

Similar Documents

Publication Publication Date Title
KR101989523B1 (en) Vehicle control device mounted on vehicle and method for controlling the vehicle
KR101668248B1 (en) Input apparatus for vehicle and Vehicle
KR101708657B1 (en) Vehicle and control method for the same
US9854085B2 (en) Apparatus and method for controlling portable device in vehicle
KR101990547B1 (en) Parking Assistance Apparatus and Vehicle Having The Same
KR101732983B1 (en) Rear combination lamp for vehicle and Vehicle including the same
KR101750159B1 (en) Assistance Apparatus for Driving of a Vehicle, Method thereof, and Vehicle having the same
KR101969805B1 (en) Vehicle control device and vehicle comprising the same
KR20170068780A (en) Steer Input apparatus for vehicle and Vehicle
KR102014263B1 (en) Vehicle control device and vehicle comprising the same
KR101691800B1 (en) Display control apparatus and operating method for the same
KR101917412B1 (en) Apparatus for providing emergency call service using terminal in the vehicle and Vehicle having the same
KR20170054849A (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101732263B1 (en) Driver Assistance Apparatus and Vehicle Having The Same
KR101841501B1 (en) Mobile device for car sharing and method for car sharing system
KR101912005B1 (en) Controller using magnetic levitation principle and vehicle having the same
KR101828400B1 (en) Portable v2x terminal and method for controlling the same
KR20220125148A (en) Video output device and its control method
KR101859043B1 (en) Mobile terminal, vehicle and mobile terminal link system
KR101893815B1 (en) Apparatus for providing around view and Vehicle
KR101807788B1 (en) Display apparatus for vehicle and control method for the same
KR20170041418A (en) Display apparatus for vehicle and control method for the same
KR101705454B1 (en) Driver Assistance Apparatus, Vehicle having the same
KR101748258B1 (en) Apparatus and method for providing guide information for vehicle
KR102023995B1 (en) Vehicle control method

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E701 Decision to grant or registration of patent right
GRNT Written decision to grant