KR20170027163A - Display apparatus for vehicle and Vehicle including the same - Google Patents

Display apparatus for vehicle and Vehicle including the same

Info

Publication number
KR20170027163A
Authority
KR
South Korea
Prior art keywords
display
vehicle
processor
unit
information
Prior art date
Application number
KR1020150123757A
Other languages
Korean (ko)
Other versions
KR101809924B1 (en)
Inventor
엄진우
Original Assignee
LG Electronics Inc. (엘지전자 주식회사)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by LG Electronics Inc.
Priority to KR1020150123757A
Priority to PCT/KR2015/012227 (published as WO2016093502A1)
Publication of KR20170027163A
Application granted
Publication of KR101809924B1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60JWINDOWS, WINDSCREENS, NON-FIXED ROOFS, DOORS, OR SIMILAR DEVICES FOR VEHICLES; REMOVABLE EXTERNAL PROTECTIVE COVERINGS SPECIALLY ADAPTED FOR VEHICLES
    • B60J1/00Windows; Windscreens; Accessories therefor
    • B60J1/08Windows; Windscreens; Accessories therefor arranged at vehicle sides
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K28/00Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions
    • B60K28/02Safety devices for propulsion-unit control, specially adapted for, or arranged in, vehicles, e.g. preventing fuel supply or ignition in the event of potentially dangerous conditions responsive to conditions relating to the driver
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60K2350/922
    • B60K2350/927
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The present invention relates to a display device for a vehicle, comprising: a display unit including a transparent flexible display arranged to be rolled around a predetermined axis; a driving unit that adjusts the length of the region of the transparent flexible display exposed to the interior of the vehicle; and a processor that controls the driving unit and controls a screen displayed on the transparent flexible display.

Description

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a vehicle display device and a vehicle equipped with the same.

A vehicle is a device that transports a boarding user in a desired direction. A typical example is an automobile.

Meanwhile, vehicles are equipped with various sensors and electronic devices for the convenience of their users. In particular, various devices for the user's driving convenience have been developed.

One such convenience device is the sun visor, which protects passengers' eyes from direct sunlight. Light entering through the windshield or a window of the vehicle may dazzle a passenger's eyes directly, or may be reflected by a display device installed in the vehicle and reduce the readability of that display device, so the passenger operates the sun visor to block the incoming light.

Meanwhile, a display device for a vehicle according to the related art is fixedly installed to display predetermined content. Such a display device occupies space even when it is not in use.

To solve the above problems, the present invention provides a display device for a vehicle that includes a transparent flexible display which can be deployed as needed.

The problems of the present invention are not limited to the above-mentioned problems, and other problems not mentioned can be clearly understood by those skilled in the art from the following description.

According to an aspect of the present invention, there is provided a display device for a vehicle, comprising: a display unit including a transparent flexible display arranged to be rolled around a predetermined axis; a driving unit for adjusting the length of a region of the transparent flexible display that is exposed to the interior of the vehicle; and a processor for controlling the driving unit and controlling a screen displayed on the transparent flexible display.

The details of other embodiments are included in the detailed description and drawings.

According to an embodiment of the present invention, there is one or more of the following effects.

First, the use of a transparent flexible display has the effect of preventing glare to a passenger.

Second, since the length of the light blocking area is adjusted in consideration of the illuminance, the position of the sun, or the passenger's gaze, glare to the passenger can be prevented adaptively.

Third, an image assisting driving can be displayed in the image display area when necessary, helping the passenger drive safely.

The effects of the present invention are not limited to the effects mentioned above, and other effects not mentioned can be clearly understood by those skilled in the art from the description of the claims.

FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
FIG. 2 is a block diagram of a vehicle driving assistance device according to an embodiment of the present invention.
FIGS. 3A to 4B are views for explaining a transparent flexible display, a driving unit, and a guide unit according to an embodiment of the present invention.
FIG. 5A is a diagram referred to in explaining the driving unit 230 according to an embodiment of the present invention.
FIG. 5B is a diagram referred to in explaining a guide driving unit according to an embodiment of the present invention.
FIG. 6 is a diagram for describing a driving unit according to another embodiment of the present invention.
FIG. 7 is an example of an internal block diagram of the vehicle of FIG. 1.
FIG. 8 is an example of an internal block diagram of the electronic control unit in the vehicle of FIG. 1.
FIG. 9 is an exemplary diagram referred to in describing a vehicle driving assistance device according to an embodiment of the present invention.
FIG. 10A is a diagram for describing an operation of controlling the length of the transparent flexible display in the driving assistance device according to an embodiment of the present invention.
FIG. 10B is a diagram referred to in explaining an operation of controlling the angle of the transparent flexible display in the driving assistance device according to an embodiment of the present invention.
FIGS. 11A to 11D are views referred to in explaining a screen displayed on the transparent flexible display according to an embodiment of the present invention.
FIG. 12 is a diagram referred to in explaining an operation of controlling the image display area through the mobile terminal 250 according to an embodiment of the present invention.
FIG. 13 is a diagram referred to in explaining an operation of outputting vehicle information by voice according to an embodiment of the present invention.
FIG. 14 is a diagram referred to in explaining a voice command reception operation according to an embodiment of the present invention.
FIG. 15 is a diagram for describing an operation of controlling the image display area by sensing a passenger's gaze according to an embodiment of the present invention.
FIG. 16 is a diagram referred to in explaining an operation of detecting a gesture of a passenger according to an embodiment of the present invention.
FIG. 17 is a diagram referred to in explaining an operation of displaying text message reception information and call reception information according to an embodiment of the present invention.
FIGS. 18A and 18B are diagrams referred to in explaining an operation of displaying the status of the transparent flexible display, the driving unit, or the guide unit according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings, wherein like reference numerals are used to designate identical or similar elements, and redundant description thereof will be omitted. The suffixes "module" and "unit" used for components in the following description are assigned or used interchangeably only for ease of drafting the specification, and do not themselves carry distinct meanings or roles. In describing the embodiments disclosed herein, a detailed description of related known art will be omitted when it is determined that such description may obscure the gist of the embodiments disclosed herein. The accompanying drawings are provided only to aid understanding of the embodiments disclosed herein; the technical idea disclosed herein is not limited by the accompanying drawings and should be understood to include all modifications, equivalents, and alternatives falling within its spirit and scope.
Terms including ordinals, such as first, second, and the like, may be used to describe various elements, but the elements are not limited by these terms. These terms are used only to distinguish one element from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, terms such as "comprises" or "having" are intended to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The vehicle described herein may be a concept including an automobile and a motorcycle. Hereinafter, the vehicle will be described mainly with reference to an automobile.
The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
In the following description, the left side of the vehicle means the left side with respect to the traveling direction of the vehicle, that is, the driver's seat side, and the right side of the vehicle means the right side with respect to the traveling direction of the vehicle, that is, the front passenger's seat side.
FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
Referring to FIG. 1, the vehicle 700 may include wheels 103FR, 103FL, 103RL, ... rotated by a power source, a steering wheel for adjusting the traveling direction of the vehicle 700, and a vehicle driving assistance device 100.
The vehicle driving assistance device 100 according to the embodiment of the present invention may be provided inside the vehicle. It is preferable that the vehicle driving assistance device 100 is attached to the ceiling of the vehicle interior on the driver's seat side.
Meanwhile, the vehicle driving assistance device 100 according to the embodiment of the present invention may be implemented integrally with the vehicle display device 400. According to the embodiment, the vehicle driving assistance device 100 described with reference to FIGS. 2 to 18B may be referred to as a vehicle display device.
FIG. 2 is a block diagram of a vehicle driving assistance device according to an embodiment of the present invention.
Referring to FIG. 2, a vehicle driving assistance device 100 according to an embodiment of the present invention may include a communication unit 110, an input unit 120, a sensing unit 140, an interface unit 160, a memory 170, a processor 180, a power supply unit 190, a display unit 210, a sound output unit 220, a driving unit 230, and a guide unit 240.
The communication unit 110 can exchange data wirelessly with the mobile terminal 250, the server 260, or another vehicle 261. In particular, the communication unit 110 can exchange data wirelessly with the mobile terminal of a vehicle occupant. Various wireless data communication methods, such as Bluetooth, Wi-Fi Direct, Wi-Fi, and APiX, are possible.
The communication unit 110 can receive weather information and road traffic situation information, such as TPEG (Transport Protocol Experts Group) information, from the mobile terminal 250, the server 260, or the other vehicle 261.
On the other hand, when the user is aboard the vehicle, the user's mobile terminal 250 and the vehicle driving assistant device 100 can perform pairing with each other automatically or by execution of the user's application.
Meanwhile, the communication unit 110 can receive the control signal from the mobile terminal of the vehicle occupant. For example, the mobile terminal generates a control signal corresponding to an input signal received from the user, and transmits the generated control signal to the vehicle driving assistant 100. In this case, the communication unit 110 may receive the control signal and transmit the control signal to the processor 180.
The input unit 120 may include a camera 121 or an image input unit for inputting an image signal, a microphone 123 or an audio input unit for inputting an audio signal, and a user input unit 125 (e.g., a touch key, a mechanical key, etc.) for receiving information from a user. Voice data or image data collected by the input unit 120 may be analyzed and processed as a control command of the user.
The vehicle driving assistance device 100 may include a plurality of cameras 121. The camera 121 processes image frames, such as still images or moving images, obtained by an image sensor. The processed image frames may be displayed on the display unit 210 or stored in the memory 170. Meanwhile, the plurality of cameras 121 provided in the vehicle driving assistance device 100 may be arranged in a matrix structure, so that a plurality of pieces of image information having various angles or focuses can be input to the vehicle driving assistance device 100. In addition, the plurality of cameras 121 may be arranged in a stereo structure to acquire a left image and a right image for realizing a stereoscopic image.
Meanwhile, the first camera 121a may be disposed at a position suitable for acquiring an image of the outside of the vehicle. The first camera 121a acquires an image of the surroundings of the vehicle and transmits the image to the processor 180. The first camera 121a may be arranged as two cameras side by side at the front of the vehicle so as to acquire a stereo image.
On the other hand, the second camera 121b can be disposed at a position suitable for capturing an image of the interior of the vehicle. Particularly, the second camera 121b can acquire an image of the passenger boarding the vehicle. For example, the second camera 121b may be disposed at a position where an eye of a passenger can be photographed for tracking the passenger's gaze. For example, the second camera 121b may receive the gesture input of the occupant.
The microphone 123 processes an external acoustic signal into electrical voice data. The processed voice data can be utilized in various ways according to the function (or application program) being executed in the vehicle driving assistance device 100. Meanwhile, various noise reduction algorithms for removing noise generated in the process of receiving an external sound signal may be implemented in the microphone 123.
On the other hand, the microphone 123 can receive the voice input of the occupant. The microphone 123 can convert the received voice input into an electrical signal.
The user input unit 125 receives information from the user. When information is input through the user input unit 125, the processor 180 can control the operation of the vehicle driving assistance device 100 to correspond to the input information. The user input unit 125 may include a mechanical input means (for example, a button, a dome switch, a jog wheel, a jog switch, etc.) and a touch-type input means.
The user input unit 125 can receive an input from the occupant for controlling the driving unit 230 or the guide driving unit 245. The vehicle occupant can control the driving unit 230 through the user input unit 125 to adjust the drawing-in or drawing-out of the transparent flexible display. Further, the vehicle occupant can adjust the angle of the contact portion 241 by controlling the guide driving unit 245 through the user input unit 125.
The sensing unit 140 may include at least one sensor for sensing at least one of information in the vehicle driving assistance device 100, surrounding environment information of the vehicle driving assistance device 100, and user information. For example, the sensing unit 140 may include a sun sensor 141 and a light amount sensor 142.
The sun sensor 141 tracks the position of the sun. For example, the sun sensor 141 tracks the azimuth and the elevation angle of the sun. The sun sensor 141 may include one or more photodiodes to track the position of the sun.
The light amount sensor 142 senses the amount of light entering the interior of the vehicle. Specifically, the light amount sensor 142 detects the amount of sunlight. The light amount sensor 142 may include a photoconductive element such as a CdS photoconductive cell.
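As an illustration of how a photoconductive cell of this kind can be read out, the sketch below converts the output voltage of a hypothetical CdS cell wired in a voltage divider into a rough illuminance estimate. The circuit topology, the supply voltage, and the power-law constants `a` and `gamma` are assumptions for illustration only; none of these values come from this disclosure, and real cells vary widely between types.

```python
def cds_lux(v_out, v_cc=5.0, r_fixed=10_000.0, a=1.25e7, gamma=1.4):
    """
    Rough lux estimate from a CdS photocell in a voltage divider:
    the cell sits on the high side, a fixed resistor goes to ground,
    and an ADC reads v_out across the fixed resistor.

    R_cell = R_fixed * (V_cc / V_out - 1), and the cell is assumed to
    follow an approximate power law R_cell = a * lux^(-gamma).
    The constants a and gamma are illustrative, not datasheet values.
    """
    if not 0 < v_out < v_cc:
        raise ValueError("v_out must lie strictly between 0 and v_cc")
    # Solve the divider for the cell resistance.
    r_cell = r_fixed * (v_cc / v_out - 1.0)
    # Invert the power law to recover illuminance.
    return (a / r_cell) ** (1.0 / gamma)
```

More light lowers the cell resistance, which raises `v_out` and hence the estimated lux; the processor could threshold this estimate when deciding whether to deploy the light blocking area.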
The interface unit 160 can receive map information related to vehicle driving through data communication with the display device 400. For example, the display device 400 may include a navigation function, and the interface unit 160 may receive a map and information about the position of the vehicle on the map from the navigation function and transmit the information to the processor 180.
On the other hand, the interface unit 160 can receive the sensor information from the control unit 770 or the vehicle sensor unit 760.
Here, the sensor information may include at least one of vehicle slip degree information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, and vehicle interior humidity information.
Such sensor information may be acquired from at least one of a wheel speed sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, and a vehicle interior humidity sensor. Meanwhile, the position module may include a GPS module for receiving GPS information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
The memory 170 may store various data for the overall operation of the vehicle driving assistance device 100, such as programs for processing or control by the processor 180. The memory 170 may be any of various storage devices, such as a ROM, a RAM, an EPROM, a flash drive, or a hard drive.
The processor 180 controls the overall operation of each unit in the vehicle driving assistance device 100. In particular, the processor 180 may control the display unit 210, the driving unit 230, and the guide unit 240.
The processor 180 may calculate the length value of the light blocking area or the image display area included in the transparent flexible display 211 based on the position of the sun. The processor 180 may control the driving unit 230 according to the calculated length value to adjust the length of the light blocking area or the image display area. For example, the processor 180 may adjust the length of the light blocking area or the image display area based on the occupant's gaze sensed through the second camera 121b and the position of the sun tracked through the sun sensor 141. Specifically, the processor 180 can control the driving unit 230 so that the light blocking area or the image display area is positioned on the straight line formed between the passenger's eyes and the position of the sun.
The processor 180 can calculate the angle that the light blocking area or the image display area forms with the ground, based on the position of the sun. The processor 180 can control the guide unit 240 according to the calculated angle value. For example, based on the occupant's gaze detected through the second camera 121b and the position of the sun tracked through the sun sensor 141, the processor 180 can calculate the angle that the light blocking area or the image display area forms with the ground. Specifically, the processor 180 calculates the angle formed by the light blocking area or the image display area with the ground so that sunlight can be effectively blocked based on the occupant's gaze and the position of the sun, and controls the guide driving unit according to the calculated angle value to adjust the angle formed by the contact portion with the ground.
As described above, the processor 180 can calculate the length of the light blocking area or the image display area by considering the occupant's gaze detected through the second camera 121b in addition to the position of the sun tracked through the sun sensor 141, and can control the driving unit 230 according to the calculated length value.
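The length control described above reduces to a line-plane intersection: find where the ray from the occupant's eyes toward the sun crosses the plane in which the display unrolls, and expose at least enough of the display to cover that point. The sketch below assumes a simplified vehicle-frame geometry; the coordinate conventions, positions, and the function name are illustrative assumptions, not taken from the disclosure.

```python
def dot(a, b):
    """3-D dot product of two length-3 sequences."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def sub(a, b):
    """Component-wise difference a - b."""
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def required_blind_length(eye_pos, sun_dir, roller_pos, deploy_dir, plane_normal):
    """
    Length (metres) the display must be unrolled so that the ray from
    the occupant's eyes toward the sun is intercepted by the deployed part.

    eye_pos      -- occupant eye position in the vehicle frame
    sun_dir      -- unit vector from the eyes toward the sun
    roller_pos   -- point where the display leaves the ceiling roller
    deploy_dir   -- unit vector along which the display unrolls
    plane_normal -- unit normal of the plane swept by the display
    Returns None if the sun ray never crosses the display plane.
    """
    denom = dot(sun_dir, plane_normal)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the display plane
    t = dot(sub(roller_pos, eye_pos), plane_normal) / denom
    if t <= 0:
        return None  # sun is on the other side of the plane
    # Point where the eye-to-sun ray pierces the display plane.
    hit = tuple(e + t * s for e, s in zip(eye_pos, sun_dir))
    # Offset of that point from the roller, measured along the unroll
    # axis, is the minimum length that must be exposed (never negative).
    return max(dot(sub(hit, roller_pos), deploy_dir), 0.0)
```

For example, with the eyes 1.2 m above the floor, a roller at the ceiling 1 m ahead, and the sun dead ahead at eye level, the display would need to drop about 0.3 m to intercept the ray; a high sun whose ray crosses the plane above the roller requires no deployment at all.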
The processor 180 may receive information on the amount of light entering the interior of the vehicle from the light amount sensor 142. The processor 180 may adjust the transparency of the transparent flexible display 211 based on the amount of light.
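A minimal sketch of this transparency adjustment, assuming a simple linear mapping between the sensed light amount and the panel transparency: dim conditions keep the panel nearly clear, strong sunlight darkens it. The lux thresholds and transparency limits are illustrative assumptions, not values from the disclosure.

```python
def display_transparency(lux, lux_min=2_000, lux_max=60_000,
                         t_max=0.85, t_min=0.15):
    """
    Map a sensed light amount (lux) to a transparency setting in
    [t_min, t_max] for the transparent flexible display.
    Below lux_min the panel stays at its clearest (t_max); above
    lux_max it stays at its darkest (t_min); in between the
    transparency falls linearly with the light amount.
    """
    if lux <= lux_min:
        return t_max
    if lux >= lux_max:
        return t_min
    frac = (lux - lux_min) / (lux_max - lux_min)
    return t_max - frac * (t_max - t_min)
```

A smoother curve (or hysteresis to avoid flicker near a threshold) would be a natural refinement, but the linear ramp shows the basic idea.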
The processor 180 receives the image data photographed by the camera 121. The processor 180 processes the image data based on computer vision to generate vehicle-related information. The processor 180 may display vehicle-related information through one area of the transparent flexible display 211.
Meanwhile, when the first camera 121a is configured as a stereo camera, the processor 180 acquires a stereo image of the area in front of the vehicle from the first camera 121a, performs disparity calculation based on the stereo image, performs object detection on at least one of the stereo images based on the calculated disparity information, and continuously tracks the movement of the object after object detection.
In particular, when detecting an object, the processor 180 may perform lane detection (LD), vehicle detection (VD), pedestrian detection (PD), bright spot detection (BD), traffic sign recognition (TSR), road surface detection, and the like.
Then, the processor 180 can perform a calculation of the distance to a detected nearby vehicle, a calculation of the speed of the detected nearby vehicle, a calculation of the speed difference from the detected nearby vehicle, and the like.
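For a calibrated stereo pair, the distance and relative-speed calculations mentioned above follow from the standard pinhole stereo model, Z = f·B/d. The sketch below assumes illustrative focal-length and baseline values; the disclosure does not specify the camera parameters, and the function names are hypothetical.

```python
def stereo_distance(disparity_px, focal_px=1400.0, baseline_m=0.30):
    """
    Distance to an object from its stereo disparity using the pinhole
    model Z = f * B / d, where f is the focal length in pixels, B the
    camera baseline in metres, and d the disparity in pixels.
    focal_px and baseline_m are illustrative camera parameters.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def relative_speed(dist_prev_m, dist_curr_m, dt_s):
    """
    Closing speed of a tracked object between two frames dt_s apart.
    Positive means the object is approaching.
    """
    return (dist_prev_m - dist_curr_m) / dt_s
```

For instance, a tracked vehicle at 21 px disparity lies about 20 m ahead with these parameters; if the next frame 50 ms later puts it at 19.5 m, the closing speed is 10 m/s. The vehicle's own speed from the sensor information can then be combined with this to obtain the nearby vehicle's absolute speed.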
The processor 180 may generate vehicle-related information. The processor 180 may display the vehicle-related information through one area of the transparent flexible display 211. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for guiding the driving of the vehicle occupant. The vehicle-related information may be information received through the input unit 120, information received from the vehicle sensor unit 760 through the interface unit 160, information received from the control unit 770 through the interface unit 160, or information received from the mobile terminal 250 or the server 260 through the communication unit 110.
The processor 180 may be implemented using at least one of digital signal processors (DSPs), application specific integrated circuits (ASICs), microcontrollers, programmable logic devices (PLDs), field programmable gate arrays (FPGAs), and electrical units for performing other functions, and may be mounted on one surface of a predetermined circuit board.
The power supply unit 190 can supply the power necessary for the operation of each component under the control of the processor 180. In particular, the power supply unit 190 can receive power from a battery or the like inside the vehicle. Further, the power supply unit 190 may contact the transparent flexible display area to supply power.
The display unit 210 displays (outputs) information processed by the processor 180. For example, the display unit 210 may display execution screen information of an application program running on the vehicle driving assistance device 100, or UI (User Interface) and GUI (Graphic User Interface) information corresponding to such execution screen information.
The display unit 210 may include a transparent flexible display 211.
The transparent flexible display 211 may be configured to be deformable by an external force. The deformation may be at least one of warping, bending, folding, twisting, and curling of the transparent flexible display 211.
In a state in which the transparent flexible display 211 is not deformed (for example, a state having an infinite radius of curvature; hereinafter referred to as a first state), the display area of the transparent flexible display 211 is flat. In a state deformed from the first state by an external force (for example, a state having a finite radius of curvature; hereinafter referred to as a second state), the display area may be a curved surface. The information displayed in the second state may be visual information output on the curved surface. Such visual information is realized by independently controlling the light emission of sub-pixels arranged in a matrix form. A unit pixel means a minimum unit for implementing one color.
The transparent flexible display 211 may be placed in a bent state (e.g., bent up or down or left or right) instead of a flat state in the first state. In this case, when an external force is applied to the transparent flexible display 211, the transparent flexible display 211 can be deformed into a flat state (or less curved state) or more curved state.
Meanwhile, the transparent flexible display 211 may be combined with a touch sensor to implement a flexible touch screen. When a touch is made to the flexible touch screen, the processor 180 may perform control corresponding to the touch input. The flexible touch screen may be configured to sense the touch input in the first state as well as the second state.
Meanwhile, the transparent flexible display 211 may have a predetermined transparency. In order to have such transparency, the transparent flexible display 211 may include a transparent thin film transistor (TFT) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), a transmissive transparent display, or the like. The transparency of the transparent flexible display 211 may be adjusted under the control of the processor 180.
Meanwhile, the transparent flexible display 211 may be provided to be adjustable in length within the vehicle. The transparent flexible display 211 can block light entering the interior of the vehicle and display at least one image.
Meanwhile, the transparent flexible display 211 may include a light blocking area or an image display area. Here, the light blocking area adjusts the transparency of the transparent flexible display 211 to prevent sunlight from being directly irradiated onto the occupant. The size or the degree of transparency of the light blocking area can be adjusted based on data sensed by the sun sensor 141 or the light amount sensor 142. The image display area displays information processed by the processor 180. There may be a plurality of image display areas. The processor 180 processes a control command or data transmitted from the communication unit 110, the input unit 120, the interface unit 160, or the memory 170 and displays it on the image display area. For example, the image display area included in the transparent flexible display 211 can display an image obtained by the camera 121.
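As a minimal sketch of the transparency control described above (the sensor range, the linear mapping, and the normalized 0.0–1.0 transparency scale are all illustrative assumptions, not part of the disclosure), the processor's handling of the light amount sensor data might look like:

```python
def adjust_light_blocking(light_amount_lux, max_light_lux=100_000):
    """Map a sensed light amount (lux) to a transparency level in [0.0, 1.0].

    Brighter incident sunlight -> lower transparency (stronger blocking).
    A hypothetical linear mapping over an assumed sensor range.
    """
    ratio = min(max(light_amount_lux / max_light_lux, 0.0), 1.0)
    transparency = 1.0 - ratio  # full sunlight -> fully opaque blocking area
    return round(transparency, 2)
```

Under this sketch, a dark cabin leaves the area fully transparent, while direct sunlight drives it opaque.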
Meanwhile, the image display area included in the transparent flexible display 211 can display the state of the guide unit 240 or the driving unit 230. Here, the state of the guide unit 240 or the driving unit 230 can be displayed numerically or graphically. For example, the image display area can display the degree of pull-in and pull-out of the transparent flexible display 211, the angle formed by the contact portion with the ground, and the like.
On the other hand, the transparent flexible display 211 can move according to the driving force generated by the driving unit 230. As the transparent flexible display 211 moves according to the driving force, the length of the light blocking area or the image display area can be adjusted. For example, the transparent flexible display 211 can be moved in the vehicle traveling direction or in the direction opposite to the vehicle traveling direction in accordance with the driving force generated in the driving unit 230.
On the other hand, in the transparent flexible display 211, the area excluding the light blocking area or the image display area may be wound according to the driving force generated by the driving unit 230. For example, when the driving force generating unit 231 included in the driving unit 230 is a motor, the area of the transparent flexible display 211 excluding the light blocking area or the image display area may be wound or unwound through the rotational force of the motor.
The sound output unit 220 converts an electric signal from the processor 180 into an audio signal and outputs the audio signal. To this end, the sound output unit 220 may include a speaker or the like. The sound output unit 220 can also output sound corresponding to the operation of the user input unit 125, that is, a button.
The driving unit 230 is connected to the transparent flexible display 211 and controls the length adjustment of the light blocking area or the image display area included in the transparent flexible display 211. For example, the driving unit 230 can adjust the length of the light blocking area based on the azimuth angle and the altitude angle of the sun tracked by the sun sensor 141. For example, when an input signal for adjusting the length of the light blocking area or the image display area is received through the user input unit 125, the driving unit 230 may adjust the length of the light blocking area or the image display area accordingly.
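The length adjustment from the sun's altitude angle can be sketched as a simple shading geometry: the shade must extend far enough down to intercept the ray heading for the occupant's eyes. The eye-to-ceiling drop and eye-to-shade offset below are hypothetical geometry parameters, not values from the disclosure:

```python
import math

def shade_extension(sun_altitude_deg, eye_drop_m=0.6, shade_offset_m=0.8):
    """Required downward extension (m) of the light blocking area so it
    intercepts the direct sun ray aimed at the occupant's eyes.

    eye_drop_m:     vertical distance from the ceiling down to the eyes
    shade_offset_m: horizontal distance from the eyes to the shade plane
    Both geometry parameters are illustrative assumptions.
    """
    # Height (above eye level) at which the sun ray crosses the shade plane.
    ray_rise = shade_offset_m * math.tan(math.radians(sun_altitude_deg))
    # The shade must drop from the ceiling to below that crossing point.
    return max(0.0, eye_drop_m - ray_rise)
```

A high sun clears the windshield header and needs no extension; a low sun requires the full drop.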
Meanwhile, the driving unit 230 may include a driving force generating unit 231. The driving force generating unit 231 provides a driving force that allows the transparent flexible display 211 connected to the driving unit 230 to move forward or backward. For example, the driving force generating unit 231 may include a motor or an actuator. For example, when the driving force generating unit 231 is configured as a motor, the length of the light blocking area or the image display area can be adjusted according to the driving force generated by the motor. For example, when the driving force generating unit 231 is configured as an actuator, the length of the light blocking area or the image display area can be adjusted according to the driving force generated by the actuator.
Meanwhile, the driving unit 230 may transmit the driving force to the transparent flexible display 211 so that the transparent flexible display 211 winds the area excluding the light blocking area or the image display area. For example, when the driving force generating unit 231 included in the driving unit 230 is configured as a motor, the driving unit 230 may provide the rotational force of the motor to draw in or draw out the light blocking area or the image display area. In this case, in the transparent flexible display 211, the area excluding the light blocking area or the image display area can be wound or unwound based on the rotational force. The vehicle driving assistance apparatus 100 may further include a stopper. Here, the stopper limits the winding of the transparent flexible display 211.
Meanwhile, the driving unit 230 may further include an elastic portion. Here, the elastic portion can support the transparent flexible display 211 upward. Because the transparent flexible display 211 is supported upward by the elastic portion, its up-and-down movement is restricted. Owing to this function of the elastic portion, the transparent flexible display 211 remains fixed without shaking even when the vehicle moves up and down.
The guide unit 240 is provided inside the vehicle. The guide unit 240 guides the movement of the transparent flexible display 211 when the length of the transparent flexible display 211 is adjusted.
The guide unit 240 may include a contact portion and a guide driving portion. The contact portion may be in contact with the transparent flexible display 211. The contact portion forms a predetermined angle with the ground and guides the movement of the transparent flexible display 211. The guide driving portion adjusts the angle formed by the contact portion with the ground. For example, the guide driving portion may include a driving force generating means, such as a motor or an actuator, to adjust the angle formed by the contact portion with the ground.
FIGS. 3A to 4B are views for explaining a transparent flexible display, a driving unit, and a guide unit according to an embodiment of the present invention. FIGS. 3A to 4B are side views of a driver's seat inside a vehicle provided with the vehicle driving assistance apparatus 100.
Referring to FIGS. 3A and 3B, the driving unit 230 is connected to the transparent flexible display 211 and controls the length adjustment of the light blocking area or the image display area included in the transparent flexible display 211.
The driving unit 230 may include a driving force generating unit 231, a connecting unit 232, and a moving unit 233.
The driving force generating unit 231 provides a driving force that allows the transparent flexible display 211 connected to the driving unit 230 to be moved forward or backward under the control of the processor 180. In the drawing, the driving force generating unit 231 is illustrated as a linear motor, but is not limited thereto. The driving force generating unit 231 generates a driving force in accordance with a control command of the processor 180.
The connection unit 232 connects the driving unit 230 and the transparent flexible display 211. The moving part 233 is attached to the lower end of the connecting part 232. The moving part 233 may be a roller.
The driving force generated by the driving force generating unit 231 is transmitted to the connection unit 232. The driving unit 230 can move forward or backward through the driving force. That is, the driving unit 230 can move linearly toward the windshield 290 or in the opposite direction through the driving force. Accordingly, the transparent flexible display 211 connected through the connection unit 232 can be drawn out of or into the housing 280. FIG. 3A illustrates a state in which the transparent flexible display 211 is drawn into the housing 280, and FIG. 3B illustrates a state in which the transparent flexible display 211 is drawn out of the housing 280.
Meanwhile, the housing 280 accommodating the driving unit 230 may be attached to the ceiling of the driver's seat in the vehicle.
The guide portion 240 may include a contact portion 241 and a guide driving portion 243.
The contact portion 241 may be in contact with the transparent flexible display 211. For example, when the transparent flexible display 211 is pulled out of the housing 280, the contact portion 241 contacts the transparent flexible display 211. When the transparent flexible display 211 is continuously drawn out in contact with the contact portion 241, the contact portion 241 guides the moving direction of the transparent flexible display 211. For example, the contact portion 241 guides the transparent flexible display 211 to move downward at a predetermined angle. FIG. 3B illustrates a state in which the contact portion 241 guides the movement of the transparent flexible display 211 so that it forms a predetermined downward angle.
The guide driving portion 243 adjusts the angle formed by the contact portion 241 with the ground. For example, in order to adjust the angle formed by the transparent flexible display 211 with the ground, the guide driving portion 243 can adjust the angle formed by the contact portion 241 with the ground. Here, the angle that the transparent flexible display 211 forms with the ground can be calculated by the processor 180 based on the position of the sun or the line of sight of the occupant.
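One plausible way the processor could derive the contact angle from the sun position is to aim the hanging display roughly perpendicular to the incoming ray, clamped to the mechanism's travel. The perpendicularity heuristic and the mechanical limits below are assumptions for illustration only:

```python
def contact_angle(sun_altitude_deg, min_deg=10.0, max_deg=80.0):
    """Angle (degrees) the contact portion 241 should make with the ground.

    Heuristic: orient the drawn-out display plane normal to the sun ray,
    then clamp to assumed mechanical limits of the guide driving portion.
    """
    target = 90.0 - sun_altitude_deg  # plane perpendicular to the ray
    return min(max(target, min_deg), max_deg)
```

A gaze-based variant would substitute the occupant's line-of-sight elevation for the sun altitude.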
On the other hand, the second camera 121b may be disposed in one area of the housing 280 so that its lens faces the occupant. The second camera 121b can sense a line of sight or a gesture of the occupant. In addition, the microphone 123 may be disposed in one area of the housing 280 so as to facilitate reception of the occupant's voice.
On the other hand, the first camera 121a may be disposed between the guide unit 240 and the windshield 290 such that the lens faces the front of the vehicle.
FIGS. 4A and 4B differ from FIGS. 3A and 3B in the location of the rollers 235. The difference will be mainly described below. Referring to FIGS. 4A and 4B, the vehicle driving assistance apparatus 100 may include one or more rollers 235. The roller 235 contacts the transparent flexible display 211 when the transparent flexible display 211 is pulled out of or drawn into the housing 280 by the driving force generated by the driving force generating unit 231, thereby reducing the frictional force caused by the movement. For example, when the transparent flexible display 211 is pulled out of the housing 280, the roller 235 rotates in contact with the transparent flexible display 211, and with the rotation of the roller 235, drawing out the transparent flexible display 211 becomes easier. Likewise, when the transparent flexible display 211 is drawn into the housing 280, the roller 235 rotates in contact with the transparent flexible display 211, and with this rotation the transparent flexible display 211 is drawn in more easily.
FIG. 5A is a diagram referred to in explaining the driving unit 230 according to an embodiment of the present invention. FIG. 5A is a partial front view of the driving unit 230 as viewed from the windshield 290 of FIG. 3A. FIG. 5A illustrates a first driving unit 230a disposed on one side of the transparent flexible display 211 in the width direction. A second driving unit 230b may be disposed on the other side of the transparent flexible display 211 in the width direction. The second driving unit 230b may correspond to the first driving unit 230a.
Referring to FIG. 5A, the driving unit 230 may include a driving force generating unit 231, a rotation portion 236, a fixing portion 237, and an elastic portion 238. In addition, the display unit 210 may include a transparent flexible display 211, an extension portion 212, and protruding portions 213.
The driving force generating unit 231 generates a driving force under the control of the processor 180. In the figure, the driving force generating unit 231 is illustrated as a motor. The generated driving force is transmitted to the rotation portion 236. In accordance with the rotation of the rotation portion 236, the transparent flexible display 211 moves. The rotation portion 236 is disposed between the two protruding portions 213.
The fixing portion 237 restrains any movement of the rotation portion 236 other than its rotational movement. The fixing portion 237 restricts the upward, downward, leftward, rightward, forward, and backward movements of the rotation portion 236 so that the rotational force is transmitted to the display unit 210.
The elastic portion 238 is connected to the fixing portion 237. The elastic portion 238 includes an elastic member that provides elasticity. Although a spring is illustrated in this drawing, the elastic member is not limited thereto. The elastic portion 238 allows the display unit 210 to be supported upward, that is, toward the ceiling of the vehicle. Because the transparent flexible display 211 is supported upward by the elastic portion, its up-and-down movement is restricted. Owing to this function of the elastic portion, the transparent flexible display 211 remains fixed without shaking even when the vehicle moves up and down.
The extension portion 212 extends from the transparent flexible display 211 toward one side in the width direction. The protruding portions 213 protrude downward from the extension portion 212. The rotation portion 236 is disposed between the two protruding portions 213. Since the rotation portion 236 is disposed between the two protruding portions 213, the rotational force generated in the rotation portion 236 can be accurately transmitted to the display unit 210.
FIG. 5B is a diagram referred to in explaining a guide driving portion according to an embodiment of the present invention.
Referring to FIG. 5B, in the guide driving portion 243, a first worm gear 252 is connected to the rotation axis of a motor 251 rotating under the control of the processor 180. The first worm gear 252 is engaged with a first worm wheel gear 253, which reduces the rotational speed of the motor 251 and converts the rotational center axis to be orthogonal to the motor shaft. A second worm wheel gear 255 is engaged with a second worm gear 254 formed integrally with the first worm wheel gear 253. The second worm wheel gear 255 serves to convert the rotational center axis direction of the first worm wheel gear 253 and the second worm gear 254 by a right angle.
On the same axis as the second worm wheel gear 255, the first spur gear 256 is located. The first spur gear 256 meshes with the second spur gear 257 formed at the end of the pivot 258 and transmits the rotational force to the contact portion 241.
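The overall speed reduction of this gear train is just the product of the stage ratios. As a sketch (the worm ratios and spur tooth counts below are illustrative assumptions, since the disclosure does not specify them):

```python
def pivot_rpm(motor_rpm, worm1_ratio=30, worm2_ratio=30,
              spur1_teeth=20, spur2_teeth=40):
    """Output speed (rpm) of the pivot 258 for a given motor speed.

    Two worm-gear reduction stages (252/253 and 254/255) are followed by
    the spur-gear pair 256/257 at the end of the pivot 258.
    All ratios and tooth counts are hypothetical.
    """
    after_worms = motor_rpm / (worm1_ratio * worm2_ratio)
    return after_worms * spur1_teeth / spur2_teeth
```

With these assumed values, an 1800 rpm motor turns the pivot at 1 rpm, which is why stacked worm stages suit the slow, self-locking angle adjustment of the contact portion.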
FIG. 6 is a diagram for describing a driving unit according to another embodiment of the present invention.
Referring to FIG. 6, the driving unit 230 may transmit a driving force to the transparent flexible display 211 so that the area 215 excluding the light blocking area or the image display area is wound. Here, the area 215 excluding the light blocking area or the image display area extends from the transparent flexible display 211 and may be formed of a material deformable by an external force; according to an embodiment, it may be formed integrally with the transparent flexible display 211.
Meanwhile, when the driving force generating unit 231 included in the driving unit 230 is configured by a motor, the driving unit 230 may provide the rotational force of the motor to draw in or out the light blocking area or the image display area. Here, the driving unit 230 may be the driving unit described with reference to FIG. 5A. In this case, in the transparent flexible display 211, the area 215 excluding the light blocking area or the image display area can be wound or unwound based on the rotation force. The driving unit 230 may further include a stopper 216. Here, the stopper 216 restricts winding of the transparent flexible display 211.
FIG. 7 is a diagram for describing a user input unit 125 according to an embodiment of the present invention.
Referring to FIG. 7, the user input unit 125 may be attached to the steering wheel 310. The user input unit 125 may be configured so that the occupant can provide input with a thumb while holding the steering wheel 310.
As shown in FIG. 7(a), the user input unit 125 may include a + button 311, a - button 312, a first button 313, a second button 314, and a third button 315. Here, each of the buttons 311, 312, 313, 314, and 315 may be a physical button or a soft key.
When an input is received at the first button 313 and an input is received at the + button, the processor 180 controls the transparent flexible display 211 to move downward. For example, the processor 180 controls the driving unit 230 to allow the transparent flexible display 211 to be drawn out of the housing. At this time, the moving direction of the transparent flexible display 211 is changed to have a predetermined angle downward through the guide portion 240. Through this process, the transparent flexible display 211 can be moved downward.
When an input is received at the first button 313 and an input is received at the - button, the processor 180 controls the transparent flexible display 211 to be moved upward. For example, the processor 180 controls the driver 230 to allow the transparent flexible display 211 to enter the interior of the housing.
When an input is received at the second button 314 and an input is received at the + button, the processor 180 adjusts the transparency of the transparent flexible display 211 to be higher.
When an input is received at the second button 314 and an input is received at the - button, the processor 180 adjusts the transparency of the transparent flexible display 211 to be lowered.
When an input is received at the third button 315 and an input is received at the + button, the processor 180 controls the guide driving portion 243 to increase the angle formed by the contact portion 241 with the ground.
When an input is received at the third button 315 and an input is received at the - button, the processor 180 controls the guide driving portion 243 to decrease the angle formed by the contact portion 241 with the ground.
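The button behavior described above amounts to a lookup from a (mode button, +/- button) pair to a processor action. A minimal dispatch sketch (the action names are hypothetical labels, not identifiers from the disclosure):

```python
# Hypothetical command table for the steering-wheel input of FIG. 7(a).
ACTIONS = {
    ("first", "+"): "move_display_down",
    ("first", "-"): "move_display_up",
    ("second", "+"): "increase_transparency",
    ("second", "-"): "decrease_transparency",
    ("third", "+"): "increase_contact_angle",
    ("third", "-"): "decrease_contact_angle",
}

def dispatch(mode_button, sign_button):
    """Resolve a (mode button, +/- button) pair to a processor action name."""
    return ACTIONS.get((mode_button, sign_button), "no_op")
```

The same table could back the ball-type input of FIG. 7(b) by mapping each rolling direction to one of the pairs.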
As shown in FIG. 7(b), the user input unit 125 may be configured as a ball type 320. The ball-type user input unit can receive up, down, left, and right rolling inputs.
For example, when a rolling input is received in a first direction, the processor 180 controls the transparent flexible display 211 to move downward. For example, the processor 180 controls the driving unit 230 to allow the transparent flexible display 211 to be drawn out of the housing. At this time, the moving direction of the transparent flexible display 211 is changed to have a predetermined angle downward through the guide portion 240. Through this process, the transparent flexible display 211 can be moved downward.
For example, when a rolling input is received in a second direction, the processor 180 controls the transparent flexible display 211 to move upward. For example, the processor 180 controls the driver 230 to allow the transparent flexible display 211 to enter the interior of the housing.
For example, when a rolling input is received in a third direction, the processor 180 adjusts the transparency of the transparent flexible display 211 to be higher.
For example, when a rolling input is received in a fourth direction, the processor 180 adjusts the transparency of the transparent flexible display 211 to be lower.
On the other hand, the control operations of the processor 180 corresponding to the rolling directions described above are only examples. The control operation of the processor 180 corresponding to each rolling direction can be matched in various combinations, and such various embodiments can be included in the scope of the present invention.
FIG. 7 is an example of an internal block diagram of the vehicle of FIG. 1.
The vehicle 700 includes a communication unit 710, an input unit 720, a sensing unit 760, an output unit 740, a vehicle driving unit 750, a memory 730, an interface unit 780, a control unit 770, a power supply unit 790, the vehicle driving assistance apparatus 100, and a display apparatus 400 for a vehicle.
The communication unit 710 may include one or more modules that enable wireless communication between the vehicle 700 and the mobile terminal 600, between the vehicle 700 and the external server 601, or between the vehicle 700 and another vehicle 602. In addition, the communication unit 710 may include one or more modules that connect the vehicle 700 to one or more networks.
The communication unit 710 may include a broadcast receiving module 711, a wireless Internet module 712, a short-range communication module 713, a location information module 714, an optical communication module 715, and a V2X communication module 716.
The broadcast receiving module 711 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.
The wireless Internet module 712 refers to a module for wireless Internet access, and may be built into or external to the vehicle 700. The wireless Internet module 712 is configured to transmit and receive wireless signals in a communication network according to wireless Internet technologies.
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 712 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 712 can exchange data with the external server 601 wirelessly. The wireless Internet module 712 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from the external server 601.
The short-range communication module 713 is for short-range communication and can support short-range communication using at least one of Bluetooth™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless-Fidelity (Wi-Fi), Wi-Fi Direct, and Wireless USB (Universal Serial Bus).
The short-range communication module 713 may form short-range wireless communication networks to perform short-range communication between the vehicle 700 and at least one external device. For example, the short-range communication module 713 can exchange data with the mobile terminal 600 wirelessly. The short distance communication module 713 can receive weather information and traffic situation information of the road (for example, TPEG (Transport Protocol Expert Group)) from the mobile terminal 600. For example, when the user has boarded the vehicle 700, the user's mobile terminal 600 and the vehicle 700 can perform pairing with each other automatically or by execution of the user's application.
The location information module 714 is a module for obtaining the position of the vehicle 700, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from a GPS satellite.
The optical communication module 715 may include a light emitting portion and a light receiving portion.
The light receiving section can convert the light signal into an electric signal and receive the information. The light receiving unit may include a photodiode (PD) for receiving light. Photodiodes can convert light into electrical signals. For example, the light receiving section can receive information of the front vehicle through light emitted from the light source included in the front vehicle.
The light emitting unit may include at least one light emitting element for converting an electric signal into an optical signal. Here, the light emitting element is preferably an LED (Light Emitting Diode). The light emitting unit converts the electric signal into an optical signal and transmits it to the outside. For example, the light emitting unit can emit the optical signal to the outside by blinking the light emitting element at a predetermined frequency. According to an embodiment, the light emitting unit may include a plurality of light emitting element arrays. According to an embodiment, the light emitting unit can be integrated with a lamp provided in the vehicle 700. For example, the light emitting unit may be at least one of a headlight, a tail light, a brake light, a turn signal lamp, and a side lamp. For example, the optical communication module 715 can exchange data with the other vehicle 602 via optical communication.
The V2X communication module 716 is a module for performing wireless communication with the server 601 or the other vehicle 602. The V2X communication module 716 includes a module capable of implementing a vehicle-to-vehicle communication (V2V) or vehicle-to-infrastructure communication (V2I) protocol. The vehicle 700 can perform wireless communication with the external server 601 and the other vehicle 602 via the V2X communication module 716.
The input unit 720 may include a driving operation unit 721, a camera 195, a microphone 723, and a user input unit 724.
The driving operation means 721 receives a user input for driving the vehicle 700. The driving operation means 721 may include a steering input means 721a, a shift input means 721b, an acceleration input means 721c, and a brake input means 721d.
The steering input means 721a receives the input of the traveling direction of the vehicle 700 from the user. The steering input means 721a is preferably formed in a wheel shape so that steering input is possible by rotation. According to an embodiment, the steering input means 721a may be formed of a touch screen, a touch pad, or a button.
The shift input means 721b receives the input of parking (P), forward (D), neutral (N), and reverse (R) of the vehicle 700 from the user. The shift input means 721b is preferably formed in a lever shape. According to the embodiment, the shift input means 721b may be formed of a touch screen, a touch pad, or a button.
The acceleration input means 721c receives an input for acceleration of the vehicle 700 from the user. The brake input means 721d receives an input for decelerating the vehicle 700 from the user. The acceleration input means 721c and the brake input means 721d are preferably formed in a pedal shape. According to the embodiment, the acceleration input means 721c or the brake input means 721d may be formed of a touch screen, a touch pad, or a button.
The camera 195 may include an image sensor and an image processing module. The camera 195 may process still images or moving images obtained by the image sensor (e.g., CMOS or CCD). The image processing module may process the still image or moving image obtained through the image sensor, extract necessary information, and transmit the extracted information to the control unit 770.
Meanwhile, the vehicle 700 may include a front camera 195a for photographing a forward image of the vehicle, an around view camera 195b for photographing a peripheral image of the vehicle, and an internal camera 195c for photographing an internal image of the vehicle. Each camera 195a, 195b, 195c may include a lens, an image sensor, and a processor. The processor may process the photographed image, generate data or information, and transmit the generated data or information to the control unit 770.
The processor included in the camera 195 may be under the control of the control unit 170.
The processor included in the camera 195 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing such functions.
The front camera 195a may include a stereo camera. In this case, the processor of the camera 195a can detect the distance to an object, the relative speed with respect to the object detected in the image, and the distance between a plurality of objects using the disparity detected in the stereo images.
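The disparity-to-distance relation behind this is the standard stereo triangulation formula Z = f·B/d. A minimal sketch, assuming a hypothetical focal length (in pixels) and camera baseline, since the disclosure specifies neither:

```python
def stereo_distance(disparity_px, focal_px=800.0, baseline_m=0.12):
    """Distance (m) to an object from its stereo disparity (pixels),
    via Z = f * B / d. Focal length and baseline are assumptions."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

def relative_speed(z_prev_m, z_now_m, dt_s):
    """Relative speed (m/s) from two distance estimates;
    a negative value means the object is closing in."""
    return (z_now_m - z_prev_m) / dt_s
```

Nearer objects produce larger disparities, so distance falls off as 1/d; differencing two frames yields the relative speed mentioned above.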
The front camera 195a may include a time of flight (TOF) camera. In this case, the camera 195 may include a light source (for example, an infrared ray or a laser) and a receiving unit. The processor of the camera 195a can then detect the distance to an object, the relative speed with respect to the object, and the distance between a plurality of objects based on the time of flight (TOF) until the infrared ray or laser emitted from the light source is reflected by the object and received.
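The TOF distance computation reduces to halving the round trip at the speed of light. A one-line sketch of that conversion:

```python
def tof_distance(round_trip_s, c=299_792_458.0):
    """Distance (m) from the round-trip time (s) of an emitted IR/laser
    pulse: the pulse travels out and back, so distance is c * t / 2."""
    return c * round_trip_s / 2.0
```

For example, a 1 microsecond round trip corresponds to roughly 150 m, which illustrates the timing resolution a TOF front camera needs at vehicle ranges.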
The around view camera 195b may include a plurality of cameras. For example, the plurality of cameras may be disposed on the left, rear, right, and front of the vehicle.
The left camera may be disposed in a case surrounding the left side mirror. Alternatively, the left camera may be disposed outside the case surrounding the left side mirror. Alternatively, the left camera may be disposed in one area outside the left front door, the left rear door, or the left fender.
The right camera can be disposed in a case surrounding the right side mirror. Or the right camera may be disposed outside the case surrounding the right side mirror. Alternatively, the right camera may be disposed in one area outside the right front door, the right rear door, or the right fender.
On the other hand, the rear camera may be disposed in the vicinity of a rear license plate or a trunk or a tailgate switch.
The front camera can be placed near the emblem or near the radiator grill.
Each image photographed by the plurality of cameras is transmitted to the processor of the camera 195b, and the processor can synthesize the images to generate a vehicle surroundings image. At this time, the vehicle surroundings image may be displayed through the display unit as a top view image or a bird's eye view image.
The internal camera 195c can photograph the interior of the vehicle 700. The internal camera 195c can acquire an image of the occupant.
The processor of the internal camera 195c can acquire an image of the occupants in the vehicle 700 and detect how many occupants are aboard and where each occupant is seated. For example, the internal camera 195c can detect the presence or absence of an occupant and the boarding position.
The internal camera 195c can acquire an image for biometrics of the occupant. The processor of the internal camera 195c can confirm the ID of the occupant based on the image of the face of the occupant.
Although FIG. 7 illustrates the camera 195 as being included in the input unit 720, the camera 195 may also be described as being included in the sensing unit 125.
The microphone 723 can process an external sound signal as electrical data. The processed data can be utilized variously according to functions performed in the vehicle 700. The microphone 723 can convert the voice command of the user into electrical data. The converted electrical data can be transmitted to the control unit 770.
The camera 722 or the microphone 723 may be a component included in the sensing unit 760 rather than a component included in the input unit 720.
The user input unit 724 is for receiving information from a user. When information is input through the user input unit 724, the control unit 770 can control the operation of the vehicle 700 to correspond to the input information. The user input unit 724 may include touch input means or mechanical input means. According to an embodiment, the user input unit 724 may be located in one area of the steering wheel. In this case, the driver can operate the user input unit 724 with his or her fingers while holding the steering wheel.
The sensing unit 760 senses signals relating to the running of the vehicle 700 and the like. To this end, the sensing unit 760 may include a collision sensor, a wheel sensor, a speed sensor, a tilt sensor, a weight sensor, a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, a rain sensor, an ultrasonic sensor, a radar, a LiDAR (Light Detection And Ranging) sensor, and the like.
Thereby, the sensing unit 760 can acquire sensing signals for vehicle collision information, vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, information on rain, and the steering wheel rotation angle.
In addition, the sensing unit 760 may include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake air temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
The sensing unit 760 may include a biometric information sensing unit. The biometric information sensing unit senses and acquires the biometric information of the passenger. The biometric information may include fingerprint information, iris-scan information, retina-scan information, hand geometry information, facial recognition information, and voice recognition information. The biometric information sensing unit may include a sensor for sensing the passenger's biometric information. Here, the internal camera 195c and the microphone 723 can operate as such sensors. The biometric information sensing unit can acquire hand geometry information and facial recognition information through the internal camera 195c.
The output unit 740 is for outputting information processed by the control unit 770 and may include a display unit 741, a sound output unit 742, and a haptic output unit 743.
The display unit 741 can display information processed in the control unit 770. For example, the display unit 741 can display the vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.
The display unit 741 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.
The display unit 741 may form a mutual layer structure with a touch sensor or may be integrally formed with a touch sensor to implement a touch screen. Such a touch screen may function as the user input unit 724 that provides an input interface between the vehicle 700 and the user, and may also provide an output interface between the vehicle 700 and the user. In this case, the display unit 741 may include a touch sensor that senses a touch on the display unit 741 so that a control command can be received by a touch method. When the display unit 741 is touched, the touch sensor senses the touch, and the control unit 770 generates a control command corresponding to the touch. The content input by the touch method may be a letter, a number, an instruction in various modes, a designatable menu item, and the like.
Meanwhile, the display unit 741 may include a cluster so that the driver can check vehicle state information or vehicle driving information while driving. The cluster can be located on the dashboard. In this case, the driver can check the information displayed on the cluster while keeping his or her eyes on the road ahead.
Meanwhile, according to the embodiment, the display unit 741 may be implemented as a Head Up Display (HUD). When the display unit 741 is implemented as a HUD, information can be output through a transparent display provided in the windshield. Alternatively, the display unit 741 may include a projection module to output information through an image projected on the windshield.
The sound output unit 742 converts an electric signal from the control unit 770 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 742 may include a speaker or the like. The sound output unit 742 may also output a sound corresponding to the operation of the user input unit 724.
The haptic output unit 743 generates a tactile output. For example, the haptic output section 743 may operate to vibrate the steering wheel, the seat belt, and the seat so that the user can recognize the output.
The vehicle drive unit 750 can control the operation of various devices of the vehicle. The vehicle drive unit 750 may include a power source driving unit 751, a steering driving unit 752, a brake driving unit 753, a lamp driving unit 754, an air conditioning driving unit 755, a window driving unit 756, an airbag driving unit 757, a sunroof driving unit 758, and a suspension driving unit 759.
The power source driving unit 751 can perform electronic control of the power source in the vehicle 700.
For example, when a fossil fuel-based engine (not shown) is the power source, the power source driving unit 751 can perform electronic control of the engine. Thus, the output torque of the engine and the like can be controlled. When the engine is the power source, the speed of the vehicle can be limited by limiting the engine output torque under the control of the control unit 770.
As another example, when an electric motor (not shown) is the power source, the power source driving unit 751 can perform control of the motor. Thus, the rotation speed, torque, and the like of the motor can be controlled.
The power source driving unit 751 can receive an acceleration control signal from the vehicle driving assistance apparatus 100. The power source driving unit 751 can control the power source in accordance with the received acceleration control signal.
The steering driving unit 752 may perform electronic control of the steering apparatus in the vehicle 700. Thus, the traveling direction of the vehicle can be changed. The steering driving unit 752 may receive a steering control signal from the vehicle driving assistance apparatus 100. The steering driving unit 752 can control the steering apparatus to steer in accordance with the received steering control signal.
The brake driving unit 753 can perform electronic control of a brake apparatus (not shown) in the vehicle 700. For example, it is possible to reduce the speed of the vehicle 700 by controlling the operation of the brakes disposed on the wheels. As another example, it is possible to adjust the traveling direction of the vehicle 700 to the left or right by operating the brakes disposed on the left and right wheels differently. The brake driving unit 753 can receive a deceleration control signal from the vehicle driving assistance apparatus 100. The brake driving unit 753 can control the brake apparatus in accordance with the received deceleration control signal.
The lamp driver 754 can control the turn-on / turn-off of the lamps disposed inside and outside the vehicle. Also, the intensity, direction, etc. of the light of the lamp can be controlled. For example, it is possible to perform control on a direction indicating lamp, a brake lamp, and the like.
The air conditioning driving unit 755 can perform electronic control of an air conditioner (not shown) in the vehicle 700. For example, when the temperature inside the vehicle is high, the air conditioner can be operated so that cool air is supplied to the inside of the vehicle.
The window driving unit 756 may perform electronic control of the window apparatus in the vehicle 700. For example, it is possible to control the opening or closing of the left and right windows on the sides of the vehicle.
The airbag driving unit 757 can perform electronic control of the airbag apparatus in the vehicle 700. For example, in case of danger, the airbag can be controlled to deploy.
The sunroof driving unit 758 may perform electronic control of a sunroof apparatus (not shown) in the vehicle 700. For example, the opening or closing of the sunroof can be controlled.
The suspension driving unit 759 can perform electronic control of a suspension apparatus (not shown) in the vehicle 700. For example, when the road surface is uneven, the suspension apparatus can be controlled so as to reduce the vibration of the vehicle 700. The suspension driving unit 759 can receive a suspension control signal from the vehicle driving assistance apparatus 100. The suspension driving unit 759 can control the suspension apparatus in accordance with the received suspension control signal.
The memory 730 is electrically connected to the control unit 770. The memory 730 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. The memory 730 can be, in hardware, any of various storage devices such as a ROM, a RAM, an EPROM, a flash drive, a hard drive, and the like. The memory 730 may store various data for the overall operation of the vehicle 700, such as a program for processing or control by the control unit 770.
The interface unit 780 may serve as a pathway to various kinds of external devices connected to the vehicle 700. For example, the interface unit 780 may include a port that can be connected to the mobile terminal 600, and may be connected to the mobile terminal 600 through the port. In this case, the interface unit 780 can exchange data with the mobile terminal 600.
Meanwhile, the interface unit 780 may serve as a channel for supplying electrical energy to the connected mobile terminal 600. When the mobile terminal 600 is electrically connected to the interface unit 780, the interface unit 780 provides electrical energy supplied from the power supply unit 790 to the mobile terminal 600 under the control of the control unit 770.
The control unit 770 can control the overall operation of each unit in the vehicle 700. The control unit 770 may be referred to as an ECU (Electronic Control Unit).
The control unit 770 may be implemented in hardware using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electronic units for performing functions.
The power supply unit 790 can supply power necessary for the operation of each component under the control of the control unit 770. In particular, the power supply unit 790 can receive power from a battery (not shown) in the vehicle.
The vehicle driving assistance apparatus 100 can exchange data with the control unit 770. A control signal generated by the vehicle driving assistance apparatus 100 may be output to the control unit 770. The vehicle driving assistance apparatus 100 may be the vehicle driving assistance apparatus described above with reference to Figs.
The vehicle display apparatus 400 can exchange data with the control unit 770.
FIG. 9 is an exemplary diagram referred to in describing a vehicle driving assistance device according to an embodiment of the present invention.
Referring to FIG. 9, the transparent flexible display 211 is provided in the vehicle so as to be adjustable in length. Specifically, the transparent flexible display 211 includes a light blocking area or an image display area. The transparency of the transparent flexible display 211 is adjustable.
The processor 180 controls the driving unit 230 to adjust the length of the light blocking area or the image display area. Here, the driving unit 230 is housed in the housing 280.
The processor 180 may receive information on the amount of light entering the interior of the vehicle from the light amount sensor 142. The processor 180 may adjust the transparency of the transparent flexible display 211 based on the amount of light.
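The transparency adjustment based on the sensed light amount can be sketched as a simple mapping; this is an illustrative assumption, and the lux thresholds and function name are hypothetical, not values from the patent.

```python
def transparency_from_light(lux: float,
                            lux_min: float = 1_000.0,
                            lux_max: float = 80_000.0) -> float:
    """Return a transparency value in [0, 1] for the transparent flexible
    display: fully transparent in dim light, increasingly opaque (lower
    transparency) as the amount of light entering the vehicle grows."""
    if lux <= lux_min:
        return 1.0
    if lux >= lux_max:
        return 0.0
    # Linear interpolation between the two thresholds.
    return 1.0 - (lux - lux_min) / (lux_max - lux_min)
```

A real implementation would likely smooth the sensor readings over time to avoid flicker as clouds or shadows pass; the linear ramp here only illustrates the direction of the adjustment.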
FIG. 10A is a diagram for describing an operation of controlling the length of a transparent flexible display in a driving assistance device according to an embodiment of the present invention.
The processor 180 may calculate a length value L of the light blocking area or image display area included in the transparent flexible display 211 based on the position of the sun 1010. The processor 180 may control the driving unit 230 according to the calculated length value L to adjust the length L of the light blocking area or the image display area. For example, the processor 180 may adjust the length L of the light blocking area or image display area based on the occupant's line of sight 1020 sensed by the second camera 121b and the position of the sun 1010 tracked by the sun sensor 141. Specifically, the processor 180 can control the driving unit 230 so that the light blocking area or the image display area is positioned on the straight line 1030 formed by the occupant's line of sight 1020 and the position of the sun 1010.
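The length calculation can be illustrated with a simplified 2-D model: the display hangs down from a housing ahead of the occupant, and must extend far enough to intercept the eye-to-sun line. The coordinates, mounting distances, and function name below are assumptions for illustration, not the patent's method.

```python
import math

def blocking_length(eye_y: float, housing_y: float,
                    dist: float, sun_elev_deg: float) -> float:
    """Length L to draw the display out, downward from a housing mounted
    at height housing_y, a horizontal distance dist ahead of the occupant's
    eyes at height eye_y (all in metres), so that it intercepts the line
    from the eye toward the sun at elevation sun_elev_deg."""
    # Height at which the eye-to-sun line crosses the display's plane.
    crossing_y = eye_y + dist * math.tan(math.radians(sun_elev_deg))
    # The display only needs to reach down to that crossing height.
    return max(0.0, housing_y - crossing_y)
```

For instance, with the eye at 1.2 m, the housing at 1.5 m and 0.8 m ahead, a low sun at 15° elevation requires roughly 9 cm of extension, while at 45° the line already passes above the housing and no extension is needed.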
FIG. 10B is a diagram referred to in explaining an operation of controlling the angle of the transparent flexible display in the driving assistance device according to an embodiment of the present invention.
The processor 180 may calculate an angle value α that the light blocking area or image display area forms with the ground 1040, based on the position of the sun. The processor 180 can control the guide unit 240 according to the calculated angle value α. For example, the processor 180 can calculate the angle value α that the light blocking area or image display area forms with the ground 1040, based on the occupant's line of sight 1020 sensed by the second camera 121b and the position of the sun 1010 tracked by the sun sensor 141. Specifically, the processor 180 calculates, based on the occupant's line of sight 1020 and the position of the sun 1010, the angle that the light blocking area or image display area should form with the ground so that sunlight can be effectively blocked, and adjusts the angle formed by the contact portion 241 with the ground by controlling the guide driving portion 243 in accordance with the calculated angle value α. As the angle formed by the contact portion 241 with the ground is adjusted, the angle formed with the ground 1040 by the transparent flexible display 211 in contact with the contact portion 241 is adjusted.
FIGS. 11A through 11D are views referred to in explaining a screen displayed on a transparent flexible display according to an embodiment of the present invention.
Referring to FIG. 11A, the transparent flexible display 211 may include image display areas 1110, 1120, and 1130. The processor 180 may display an image received from the camera 121 in an image display area.
For example, the first camera 121a may be composed of a plurality of cameras. The first camera 121a may be composed of three cameras, which can capture images of the rear, the left rear, and the right rear of the vehicle. Here, the camera for photographing the rear of the vehicle can be installed at the upper or lower end of the rear license plate. The camera for photographing the left rear of the vehicle may be installed in the left door or the left side mirror module. The camera for photographing the right rear of the vehicle may be installed in the right door or the right side mirror module. The processor 180 may display the obtained vehicle rear image in the first area 1110, the obtained vehicle right rear image in the second area 1120, and the obtained vehicle left rear image in the third area 1130. In this case, the transparent flexible display 211 can perform a side mirror function.
For example, the second camera 121b can photograph an in-vehicle image. The processor 180 may display the in-vehicle image in any one of the first to third areas 1110, 1120, and 1130. In this case, the transparent flexible display 211 can perform a rearview mirror function.
FIG. 11B shows an embodiment in which the vehicle rear image is received and displayed on the display 210.
The camera for acquiring the vehicle rear image may include a first camera 1160a, a second camera 1160b, and a third camera 1160c. The first camera 1160a may be provided at the rear of the vehicle (e.g., above or below the rear license plate) and may acquire the first image 1170a. At this time, the first image 1170a may be a vehicle rear image. The second camera 1160b may be provided on the left side of the vehicle (for example, in the left side mirror module) and may acquire the second image 1170b. At this time, the second image 1170b may be the left rear image of the vehicle. The third camera 1160c may be provided on the right side of the vehicle (for example, in the right side mirror module) and may acquire the third image 1170c. At this time, the third image 1170c may be the right rear image of the vehicle.
The processor 180 may receive the first to third images 1170a, 1170b, and 1170c, and may synthesize them to generate the composite image 1180. The processor 180 may control the composite image 1180 to be displayed on the transparent flexible display 211.
Meanwhile, overlapping areas may occur among the images acquired by the first to third cameras 1160a, 1160b, and 1160c. When creating the composite image 1180, these overlapping areas need to be processed.
The overlapping areas can be processed in the following manner. When generating the composite image 1180, the processor 180 can synthesize the first image 1170a and the second image 1170b on the basis of the extension line 1162a (hereinafter referred to as a first line) of the left body line of the vehicle. Specifically, the processor 180 can synthesize the images so that, with respect to the first line 1162a, the right side is based on the first image 1170a and the left side is based on the second image 1170b. In addition, the processor 180 may synthesize the first image 1170a and the third image 1170c based on the extension line 1162b (hereinafter referred to as a second line) of the right body line of the vehicle. Specifically, the processor 180 can synthesize the images so that, with respect to the second line 1162b, the left side is based on the first image 1170a and the right side is based on the third image 1170c. Thus, by combining the first to third images, it is possible to provide the occupant with a wide rear view of the vehicle.
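The splicing rule above can be sketched as follows. This is an illustrative sketch, not the patent's implementation: images are modelled as lists of equal-length row strings, and `left_line`/`right_line` are assumed composite-frame columns where the first line 1162a and second line 1162b fall.

```python
def compose_rear_view(left_img, rear_img, right_img, left_line, right_line):
    """Splice three rear-view images at the body-line extensions:
    left of the first line comes from the left (second) camera, the middle
    from the rear (first) camera, and right of the second line from the
    right (third) camera."""
    composite = []
    for l_row, c_row, r_row in zip(left_img, rear_img, right_img):
        composite.append(l_row[:left_line]
                         + c_row[left_line:right_line]
                         + r_row[right_line:])
    return composite
```

For a 6-column toy frame with the lines at columns 2 and 4, a row built from `"LLLLLL"`, `"CCCCCC"`, and `"RRRRRR"` becomes `"LLCCRR"`. A real system would blend along the seams rather than cut hard, but the partitioning logic is the same.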
Meanwhile, the first to third cameras 1160a, 1160b, and 1160c are disposed at different positions on the vehicle. Accordingly, a difference in perceived distance may arise among the images captured by the cameras. When generating the composite image 1180, distance correction is required.
The distance correction can be processed in the following manner. When generating a composite image, the processor 180 may adjust the distances of the second and third images with reference to the position of the first camera, and then synthesize the first to third images. For example, the second and third cameras may not be disposed in the same plane as the first camera. The first camera may be provided at the upper or lower end of the rear license plate of the vehicle, the second camera may be provided in the left side mirror module, and the third camera may be provided in the right side mirror module. In this case, the points at which the second and third cameras acquire images differ from the point at which the first camera acquires an image. Accordingly, a difference in distance occurs between the images acquired by the second and third cameras and the image acquired by the first camera. The processor 180 can compensate for this difference through scaling, as if the first to third images were photographed on the same plane.
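The scaling step can be illustrated with a pinhole-camera assumption: an object appears smaller to a camera that is farther from it, so a side image is rescaled by the ratio of distances to look as if taken from the rear camera's plane. The coordinate convention and function name are hypothetical.

```python
def scale_factor(side_cam_x: float, rear_cam_x: float, object_x: float) -> float:
    """Factor to apply to a side-camera image so an object at longitudinal
    position object_x (metres, larger x = farther behind the vehicle)
    appears with the size it would have from the rear camera's plane.
    Assumes an idealized pinhole model where apparent size is inversely
    proportional to distance."""
    return (object_x - side_cam_x) / (object_x - rear_cam_x)
```

For example, if the side mirror camera sits 2 m ahead of the rear camera (`side_cam_x = -2`, `rear_cam_x = 0`) and the object is 10 m behind the vehicle, the side image is magnified by (10 + 2) / 10 = 1.2 before splicing.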
Referring to FIG. 11C, the processor 180 may display the vehicle information in the image display area. Here, the vehicle information may be information received from the sensing unit 760 through the interface unit 160. Alternatively, the vehicle information may be information received from the control unit 770 through the interface unit 160, information received from the mobile terminal 250 or the server 260 via the communication unit 110, or information received through the input unit 120.
For example, the processor 180 can display vehicle speed information (shown as (a) in FIG. 11B), traffic sign information, navigation information, remaining fuel amount information, traffic information, and the like.
Referring to FIG. 11D, the processor 180 may exchange data with the server 260 via the communication unit 110. At this time, the server 260 may be a home server. For example, an image photographed by a camera installed in a house can be transmitted to the home server. The processor 180 can receive the photographed image through the communication unit 110 and display it in the image display area 1150.
FIG. 12 is a diagram referred to in explaining an operation of controlling the image display area through the mobile terminal 250 according to an embodiment of the present invention.
Referring to FIG. 12, when the mobile terminal 250 and the vehicle driving assistance apparatus 100 are paired, the processor 180 can receive a control signal from the mobile terminal 250 and control the display unit 210 accordingly.
For example, when a pinch-in input is received on the display unit of the mobile terminal 250 while the mobile terminal 250 and the vehicle driving assistance apparatus 100 are paired, the processor 180 can reduce the size of the image display area 1210 displayed on the transparent flexible display 211 in response to the pinch-in input.
For example, when a pinch-out input is received on the display unit of the mobile terminal 250 while the mobile terminal 250 and the vehicle driving assistance apparatus 100 are paired, the processor 180 can increase the size of the image display area 1210 displayed on the transparent flexible display 211 in response to the pinch-out input.
For example, when a touch-and-drag input is received on the display unit of the mobile terminal 250 while the mobile terminal 250 and the vehicle driving assistance apparatus 100 are paired, the processor 180 can change the position of the image display area 1210 displayed on the transparent flexible display 211 in accordance with the drag direction.
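The paired-terminal control described above can be sketched as follows. The event schema (`"type"`, `"scale"`, `"dx"`, `"dy"`) is an assumption for illustration, not an API defined by the patent or any real terminal.

```python
class ImageDisplayArea:
    """Minimal model of the image display area 1210: a rectangle whose
    size and position are driven by gesture events relayed from the
    paired mobile terminal."""

    def __init__(self, x: int = 0, y: int = 0, w: int = 400, h: int = 300):
        self.x, self.y, self.w, self.h = x, y, w, h

    def handle(self, event: dict) -> None:
        if event["type"] in ("pinch_in", "pinch_out"):
            # A pinch-in arrives with scale < 1 (shrink the area);
            # a pinch-out arrives with scale > 1 (enlarge it).
            self.w = int(self.w * event["scale"])
            self.h = int(self.h * event["scale"])
        elif event["type"] == "drag":
            # Move the area along the drag direction.
            self.x += event["dx"]
            self.y += event["dy"]
```

A production implementation would also clamp the rectangle to the drawable portion of the transparent flexible display; that bound is omitted here for brevity.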
FIG. 13 is a diagram referred to in explaining an operation of outputting vehicle information by voice according to an embodiment of the present invention.
Referring to FIG. 13, the processor 180 can output the vehicle information by voice through the sound output unit 220.
For example, as shown, the processor 180 may output the state of the remaining fuel amount by voice (1310). If the remaining fuel amount is insufficient, the user can request a search for nearby gas stations by voice (1310). In this case, the processor 180 can receive the voice command through the microphone 123.
Meanwhile, the processor 180 may provide a calendar function. Here, the calendar function may be provided through an application provided in the vehicle 10. Alternatively, the calendar function of the mobile terminal 250 may be provided through pairing. For example, as shown, when a preset schedule arrives, the processor 180 may output the schedule by voice (1320) through the sound output unit 220. In addition, the processor 180 may output navigation information corresponding to the preset schedule by voice (1320).
FIG. 14 is a diagram referred to in explaining a voice command reception operation according to an embodiment of the present invention.
Referring to FIG. 14, the processor 180 may receive user voice commands via the microphone 123. For example, as shown in the figure, the passenger can ask about the reason for the congestion by voice (1410) in a congested road section. In this case, the processor 180 may receive information from the server 260 via the communication unit 110. Here, the server 260 may be a server that provides traffic information. The processor 180 may output the received information by voice (1420) through the sound output unit 220. Alternatively, the processor 180 may display the received information on the transparent flexible display 211.
FIG. 15 is a diagram for describing an operation of controlling an image display area by sensing a passenger's gaze according to an embodiment of the present invention.
Referring to FIG. 15, the second camera 121b can capture the passenger's line of sight. For example, the second camera 121b can photograph the eyes of the passenger. In this case, the processor 180 receives the image from the second camera 121b, extracts the pupil from the received image, and tracks the extracted pupil.
When the pupil is located at or above a predetermined position, the processor 180 may activate the image display area 1510. That is, when the passenger's gaze is directed toward the front of the vehicle, the transparent flexible display 211 performs only the function of blocking externally introduced light. When the passenger's gaze is directed toward the transparent flexible display 211, the processor 180 activates the image display area 1510, and the transparent flexible display 211 can serve as a display means.
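The activation decision can be sketched as a threshold test on the tracked pupil position; the image-row convention and threshold below are assumptions for illustration, not values from the patent.

```python
def display_area_active(pupil_row: int, threshold_row: int) -> bool:
    """With image row 0 at the top of the camera frame, a pupil at or above
    the threshold row means the passenger is looking up toward the display,
    which then switches from pure light blocking to showing the image
    display area."""
    return pupil_row <= threshold_row
```

In practice the decision would be debounced over several frames so the display does not flicker as the gaze passes the threshold; the single-sample test here only shows the core condition.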
Meanwhile, in the image display area 1510, the images described with reference to FIGS. 11A to 11C can be displayed. In particular, the in-vehicle image obtained from the second camera 121b may be displayed in the image display area 1510.
FIG. 16 is a diagram referred to in explaining an operation of detecting a gesture of a passenger according to an embodiment of the present invention.
Referring to FIG. 16, when a call is received in the mobile terminal 250 in a state of being paired with the mobile terminal 250, the processor 180 can output call reception information through the display unit 210 or the sound output unit 220.
In this case, when the occupant makes a preset gesture, the processor 180 receives the gesture input through the second camera 121b. When the gesture input is a gesture corresponding to a call-answering operation, the processor 180 transmits a call-answering command to the mobile terminal 250. Accordingly, the mobile terminal 250 can establish a call connection. At this time, the processor 180 can receive the user's voice through the microphone 123 and output the received voice through the sound output unit 220.
FIG. 17 is a diagram referred to in explaining an operation of displaying text message reception information and call reception information according to an embodiment of the present invention.
Referring to FIG. 17(a), when a text message is received in the mobile terminal 250 in a state of being paired with the mobile terminal 250, the processor 180 receives the text message through the communication unit 110 and displays it on the display unit 210. Specifically, the processor 180 displays the text message in the image display area 1710 included in the transparent flexible display 211.
Referring to FIG. 17(b), when a call signal is received in the mobile terminal 250 in a state of being paired with the mobile terminal 250, the processor 180 receives the call signal through the communication unit 110 and displays a call reception event on the display unit 210. Specifically, the processor 180 displays the call reception event in the image display area 1720 included in the transparent flexible display 211.
FIGS. 18A and 18B are diagrams referred to in explaining an operation of displaying the status of the transparent flexible display, the driving unit, or the guide unit, according to an embodiment of the present invention.
Referring to FIG. 18A, the processor 180 may display the status of the transparent flexible display 211, the driving unit 230, or the guide unit 240. For example, the processor 180 may display on the display unit 210 an image 1810 corresponding to the length of the drawn-out transparent flexible display 211. For example, the processor 180 may display on the display unit 210 an image 1820 corresponding to the angle formed by the contact portion 241 with the ground. For example, the processor 180 may display on the display unit 210 an image 1830 corresponding to the angle formed by the transparent flexible display 211 with the ground. Through each of the displayed images 1810, 1820, and 1830, the occupant can intuitively perceive the state of the transparent flexible display, the driving unit, or the guide unit.
The processor 180 may also display the status of the transparent flexible display 211, the driving unit 230, or the guide unit 240 as text, as shown in FIG. 18B. For example, the processor 180 may display the length of the drawn-out transparent flexible display 211 as characters 1840. For example, the processor 180 may display the angle that the transparent flexible display 211 forms with the ground as characters 1850. For example, although not shown, the processor 180 may display the angle that the contact portion 241 forms with the ground as characters.
FIG. 19 is a block diagram for explaining a display device for a vehicle according to an embodiment of the present invention.
Referring to FIG. 19, the vehicle display device 400 may include a communication unit 410, an input unit 420, a memory 430, a display unit 440, a sound output unit 448, a driving unit 450, a sensing unit 460, a processor 470, an interface unit 480, and a power supply unit 490.
The communication unit 410 may include one or more modules that enable wireless communication between the vehicle 700 and the mobile terminal 600, between the vehicle 700 and the external server 510, or between the vehicle 700 and another vehicle 520. In addition, the communication unit 410 may include one or more modules that connect the vehicle 700 to one or more networks.
The communication unit 410 may include a broadcast receiving module 411, a wireless Internet module 412, a short-range communication module 413, a location information module 414, and a V2X communication module 416.
The broadcast receiving module 411 receives broadcast signals or broadcast-related information from an external broadcast management server through a broadcast channel. Here, the broadcast includes a radio broadcast or a TV broadcast.
The wireless Internet module 412 refers to a module for wireless Internet access, and may be mounted inside or outside the vehicle 700. The wireless Internet module 412 is configured to transmit and receive wireless signals over a communication network according to wireless Internet technologies.
Wireless Internet technologies include, for example, WLAN (Wireless LAN), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, DLNA (Digital Living Network Alliance), WiBro (Wireless Broadband), WiMAX (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), HSUPA (High Speed Uplink Packet Access), LTE (Long Term Evolution), and LTE-A (Long Term Evolution-Advanced). The wireless Internet module 412 transmits and receives data according to at least one wireless Internet technology, including Internet technologies not listed above. For example, the wireless Internet module 412 may exchange data with the external server 510 wirelessly. The wireless Internet module 412 can receive weather information and road traffic situation information (for example, TPEG (Transport Protocol Expert Group) information) from the external server 510.
The short-range communication module 413 is for short-range communication, and may support short-range communication using at least one of Bluetooth™, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra Wideband), ZigBee, NFC (Near Field Communication), Wi-Fi (Wireless Fidelity), Wi-Fi Direct, and Wireless USB (Wireless Universal Serial Bus).
The short-range communication module 413 may form short-range wireless communication networks to perform short-range communication between the vehicle display apparatus 400 and at least one external device. For example, the short-range communication module 413 can exchange data with the mobile terminal 600 wirelessly. The short-range communication module 413 may receive weather information and road traffic situation information (for example, TPEG information) from the mobile terminal 600. For example, when the user boards the vehicle 700, the user's mobile terminal 600 and the vehicle 700 can be paired with each other automatically or by execution of an application by the user.
The position information module 414 is a module for obtaining the position of the vehicle 700, and a representative example thereof is a Global Positioning System (GPS) module. For example, when the vehicle utilizes a GPS module, it can acquire the position of the vehicle using a signal sent from the GPS satellite.
The V2X communication module 416 is a module for performing wireless communication with the server 510 or the other vehicle 520. The V2X communication module 416 includes a module capable of implementing a vehicle-to-vehicle (V2V) or vehicle-to-infrastructure (V2I) communication protocol. The vehicle 700 can perform wireless communication with the external server 510 and the other vehicle 520 via the V2X communication module 416.
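The V2V/V2I distinction described above can be sketched with a simple message structure. This is an illustrative model only; the class and field names are hypothetical and do not correspond to any standardized V2X message format.

```python
from dataclasses import dataclass, field

@dataclass
class V2XMessage:
    """Toy V2X message: kind is 'V2V' (vehicle-to-vehicle) or
    'V2I' (vehicle-to-infrastructure)."""
    sender_id: str
    kind: str
    payload: dict = field(default_factory=dict)

def route(msg: V2XMessage) -> str:
    # Decide which peer class this message belongs to.
    return {"V2V": "other_vehicle", "V2I": "infrastructure"}.get(msg.kind, "unknown")
```

In practice, a real deployment would use a standardized message set rather than an ad-hoc structure like this.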
The input unit 420 may include a user input unit 421 and an acoustic input unit 422.
The user input unit 421 receives information from a user. When information is input through the user input unit 421, the processor 470 can control the operation of the vehicle display apparatus 400 to correspond to the input information. The user input unit 421 may include a touch input means or a mechanical input means.
The sound input unit 422 can process an external sound signal into electrical data. The processed data can be utilized in various ways according to the function being performed in the vehicle display apparatus 400. The sound input unit 422 can convert a voice command of the user into electrical data. The converted electrical data may be transferred to the processor 470.
The memory 430 is electrically connected to the processor 470. The memory 430 may store basic data for each unit, control data for controlling the operation of each unit, and input/output data. In hardware, the memory 430 may be any of various storage devices such as a ROM, RAM, EPROM, flash drive, or hard drive. The memory 430 may store various data for the operation of the vehicle display apparatus 400, such as a program for the processing or control of the processor 470.
The memory 430 may store map data for implementing a navigation function. Here, the map data may be stored by default when the vehicle is shipped. Alternatively, the map data may be received from an external device via the communication unit 410 or the interface unit 480.
The display unit 440 may display information processed by the processor 470. For example, the display unit 440 can display the vehicle-related information. Here, the vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for a driving guide to the vehicle driver. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.
The display unit 440 may include a transparent flexible display 441.
The transparent flexible display 441 can be configured to be deformable by an external force. The deformation may be at least one of warping, bending, folding, twisting, and curling of the transparent flexible display 441.
In a state in which the transparent flexible display 441 is not deformed (for example, a state having an infinite radius of curvature, hereinafter referred to as a first state), the display area of the transparent flexible display 441 is flat. In a state deformed from the first state by an external force (for example, a state having a finite radius of curvature, hereinafter referred to as a second state), the display area may be a curved surface. The information displayed in the second state may be visual information output on the curved surface. Such visual information is realized by independently controlling the emission of unit pixels arranged in a matrix form. A unit pixel means a minimum unit for implementing one color.
The transparent flexible display 441 may be placed in a bent state (for example, a state bent in the up-down or left-right direction) rather than a flat state in the first state. In this case, when an external force is applied to the transparent flexible display 441, the transparent flexible display 441 may be deformed into a flat state (or a less bent state) or a more bent state.
Meanwhile, the transparent flexible display 441 may be combined with a touch sensor to implement a flexible touch screen. When a touch is made to the flexible touch screen, the processor 470 can perform control corresponding to the touch input. The flexible touch screen may be configured to sense the touch input in the first state as well as the second state.
Meanwhile, the transparent flexible display 441 may have a predetermined transparency. In order to have such transparency, the transparent flexible display 441 may be formed of at least one of a transparent thin-film electroluminescent (TFEL) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), a transmissive transparent display, and a transparent light-emitting diode (LED) display. The transparency of the transparent flexible display 441 may be adjusted under the control of the processor 470.
The transparent flexible display 441 may be disposed inside the vehicle without being exposed to the interior, in a state of being curled around a predetermined axis. In this state, according to a predetermined event, one area of the transparent flexible display 441 may be exposed to the interior of the vehicle. Here, the event may be a user input.
For example, the transparent flexible display 441 may be disposed inside the dashboard in a state of being curled around a predetermined axis. The transparent flexible display 441 may be rolled so as to be deployable from the dashboard toward the front windshield. When a predetermined event occurs, under the control of the processor 470, the transparent flexible display 441 may be deployed from the dashboard toward the front windshield.
For example, the transparent flexible display 441 may be disposed inside the A-pillar in a state of being curled around a predetermined axis. The transparent flexible display 441 may be rolled so as to be deployable from the A-pillar toward the front windshield or the side window glass. When a predetermined event occurs, under the control of the processor 470, the transparent flexible display 441 may be deployed from the A-pillar toward the front windshield or the side window glass.
For example, the transparent flexible display 441 may be disposed inside the door in a state of being curled around a predetermined axis. The transparent flexible display 441 may be rolled so as to be deployable from the door toward the side window glass. When a predetermined event occurs, under the control of the processor 470, the transparent flexible display 441 may be deployed from the door toward the side window glass.
For example, the transparent flexible display 441 may be disposed inside the vehicle interior ceiling in a state of being curled around a predetermined axis. The transparent flexible display 441 may be rolled so as to be deployable from the ceiling toward the front windshield, the rear windshield, or the side window glass. When a predetermined event occurs, under the control of the processor 470, the transparent flexible display 441 may be deployed from the ceiling toward the front windshield, the rear windshield, or the side window glass.
For example, the transparent flexible display 441 may be disposed inside a seat in a state of being curled around a predetermined axis. The transparent flexible display 441 may be rolled so as to be deployable upward, downward, or sideways from the seat. When a predetermined event occurs, under the control of the processor 470, the transparent flexible display 441 may be deployed upward, downward, or sideways from the seat.
The length of the area of the transparent flexible display 441 that is exposed to the interior of the vehicle can be adjusted according to user input. For example, the deployed area of the transparent flexible display 441 may be adjusted according to user input.
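Adjusting the exposed length via the roller can be sketched by converting a requested exposure length into a roller rotation. This is a hypothetical model: the roller radius, maximum travel, and function name are assumptions, not values from the patent.

```python
import math

ROLLER_RADIUS_MM = 20.0   # assumed roller radius
MAX_EXPOSED_MM = 300.0    # assumed maximum exposed length

def rotation_for_length(current_mm: float, requested_mm: float) -> float:
    """Roller rotation in degrees needed to reach the requested exposure
    (positive = draw out, negative = draw in). The request is clamped
    to the physical travel of the display."""
    target = max(0.0, min(requested_mm, MAX_EXPOSED_MM))
    circumference = 2.0 * math.pi * ROLLER_RADIUS_MM
    return (target - current_mm) / circumference * 360.0
```

A user input (for example, a touch drag or a voice command) would set `requested_mm`, and the driving force generating unit would execute the resulting rotation.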
The transparent flexible display 441 may be disposed in close proximity to at least one of the front windshield, the rear windshield, the side window glass, and the front seat in the interior space of the vehicle.
The sound output unit 448 converts an electric signal from the processor 470 into an audio signal and outputs the audio signal. For this purpose, the sound output unit 448 may include a speaker or the like. The sound output unit 448 can also output sound corresponding to the operation of the user input unit 421.
The driving unit 450 can adjust the length of the transparent flexible display 441 that is exposed to the interior of the vehicle. For example, the driving unit 450 can adjust the deployed area of the transparent flexible display 441.
The driving unit 450 may include a roller unit 451, a driving force generating unit 452, an elastic supporting unit (2221a to 2221d in FIG. 22), and a tilt adjusting unit (2230 in FIG. 22).
The roller unit 451 contacts the transparent flexible display 441 and can adjust, through rotation, the length of the area exposed to the outside. For example, the roller unit 451 can contact the transparent flexible display 441 and rotate to adjust the deployed area.
The roller unit 451 may include a main roller (2211 in FIG. 22) and sub-rollers (2212a to 2212f in FIG. 22).
The main roller (2211 in FIG. 22) may be connected to the driving force generating unit 452. The main roller (2211 in FIG. 22) can receive the rotational driving force generated by the driving force generating unit 452.
The sub-rollers (2212a to 2212f in FIG. 22) may be disposed between the transparent flexible display 441 and the housing. The sub-rollers (2212a to 2212f in FIG. 22) can reduce the frictional force with the housing (2235 in FIG. 22) when the transparent flexible display 441 is drawn in or drawn out.
The driving force generating unit 452 can provide a rotational driving force to the roller unit 451.
For example, the driving force generating unit 452 may include a motor. The driving force generating unit 452 can provide the rotational driving force generated by the motor to the roller unit 451.
The elastic supporting units (2221a to 2221d in FIG. 22) can elastically support the transparent flexible display 441 against the housing 2235.
The tilt adjusting unit (2230 in FIG. 22) can adjust the tilt of the transparent flexible display 441.
The driving unit 450 will be described later with reference to FIG. 22.
The sensing unit 460 may include a solar sensor 461, a light amount sensor 462, and a length sensing unit 463.
The solar sensor 461 tracks the position of the sun. For example, the solar sensor 461 performs the azimuth and elevation angle tracking of the sun. The solar sensor 461 may include one or more photodiodes to track the position of the sun.
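A crude way to see how photodiodes can track the sun's azimuth is a differential reading between two diodes facing opposite directions. This is only a conceptual sketch; the patent does not specify the tracking algorithm, and the linear mapping below is a hypothetical simplification.

```python
def sun_azimuth_estimate(east_pd: float, west_pd: float) -> float:
    """Differential azimuth estimate from two photodiode currents:
    -90.0 (fully east) .. +90.0 (fully west) degrees, 0.0 when balanced.
    A real tracker would also estimate elevation with a second diode pair."""
    total = east_pd + west_pd
    if total <= 0.0:
        return 0.0  # no light: no usable estimate
    return 90.0 * (west_pd - east_pd) / total
```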
The light amount sensor 462 senses the amount of light entering the interior of the vehicle. Specifically, the light amount sensor 462 detects the amount of sunlight. The light amount sensor 462 may include a photoconductive element, such as a CdS photoconductive cell (CdS cell).
The length sensing unit 463 can sense the length of the transparent flexible display 441 exposed to the interior of the vehicle. For example, the length sensing unit 463 may sense the deployed area of the transparent flexible display 441.
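One plausible realization of such length sensing is a rotary encoder on the roller: counting encoder ticks gives the deployed length. The patent does not specify the sensing mechanism; the encoder resolution and roller radius below are assumptions.

```python
import math

def exposed_length_mm(encoder_counts: int,
                      counts_per_rev: int = 1024,
                      roller_radius_mm: float = 20.0) -> float:
    """Convert encoder counts on the roller into deployed display length.
    One full revolution pays out one roller circumference of display."""
    revolutions = encoder_counts / counts_per_rev
    return revolutions * 2.0 * math.pi * roller_radius_mm
```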
The processor 470 controls the overall operation of each unit in the vehicle display apparatus 400.
The processor 470 may control the display unit 440 or the sound output unit 448 to output information or data received through the communication unit 410, the input unit 420, or the interface unit 480. The processor 470 may control the display unit 440 or the sound output unit 448 so that information or data stored in the memory 430 is output. The processor 470 can output the received information or data directly, or after processing it. The processor 470 can output information or data visually through the display unit 440. The processor 470 may output information or data audibly through the sound output unit 448.
The processor 470 can control the driving unit 450. The processor 470 controls the driving unit 450 to control the drawing-in or drawing-out of the transparent flexible display 441. For example, the processor 470 can control the deployment of the transparent flexible display through control of the driving unit 450.
When the transparent flexible display 441 is drawn out, one area of the transparent flexible display 441 may be exposed to the interior of the vehicle.
The processor 470 can control a screen to be displayed on the transparent flexible display 441. The processor 470 may control the screen to be displayed in the exposed area. For example, the processor 470 can control the screen to be displayed in the deployed area of the transparent flexible display 441.
When there are a plurality of transparent flexible displays 441, the processor 470 may control each of the plurality of transparent flexible displays to display a different screen.
The transparent flexible display 441 may include a first display and a second display. When a passenger other than the user is on board, the processor 470 may control the second display to be exposed to the interior of the vehicle. For example, the processor 470 may control the second display to be deployed.
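The occupancy-based deployment rule above can be sketched as a simple selection function. The display names and the boolean inputs are hypothetical; occupancy could be sensed by, for example, a seat sensor or an internal camera.

```python
def displays_to_deploy(driver_present: bool, passenger_present: bool) -> list:
    """Select which of the two displays to deploy based on occupancy:
    the first display for the driver, the second for a passenger."""
    deployed = ["first_display"] if driver_present else []
    if passenger_present:
        deployed.append("second_display")
    return deployed
```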
The processor 470 can receive the image data obtained by the camera 195 via the interface unit 480. The processor 470 can display a screen based on the image data on the transparent flexible display 441.
If the transparent flexible display 441 is disposed close to the side window glass, the processor 470 can control the transparent flexible display 441 to display the rear image of the vehicle.
For example, when driving in snow or rain, the processor 470 can control the rear image of the vehicle to be displayed on the transparent flexible display 441 disposed close to the side window glass.
For example, when a turn signal is input for a lane change or a change of direction, the processor 470 can control the rear image of the vehicle to be displayed on the transparent flexible display 441 disposed close to the side window glass.
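The two triggers for showing the rear image on the side display can be condensed into one predicate. This is an illustrative sketch; the weather categories and function name are assumptions, not terms from the patent.

```python
def should_show_rear_view(weather: str, turn_signal_on: bool) -> bool:
    """Show the rear image on the side-window display when a turn signal
    is active, or when visibility is reduced by snow or rain."""
    return turn_signal_on or weather in ("snow", "rain")
```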
The processor 470 can darken the transparent flexible display 441 according to the amount of light irradiated into the vehicle. The processor 470 can control the transparent flexible display 441 to display a predetermined color or darkness so that the transparent flexible display 441 is darkened. Meanwhile, the light amount can be sensed through the light amount sensor 462.
The processor 470 can receive collision information with an object via the interface unit 480.
For example, the camera 195 may detect a collision with an object. Alternatively, a radar, a LiDAR, or an ultrasonic sensor included in the sensing unit 760 may detect a collision with an object.
When the collision information with the object is received, the processor 470 can control the transparent flexible display 441, through the driving unit 450, not to be exposed. For example, the processor 470 can control the driving unit 450 so that the transparent flexible display 441 is arranged in a state of being curled around a predetermined axis. Thus, breakage of the transparent flexible display 441 due to an accident can be prevented.
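The collision-triggered retraction can be sketched as a small state holder for the driving unit. This is a toy model; the class and method names are hypothetical, and a real system would command the motor rather than just set a number.

```python
class DisplayDriveController:
    """Toy model of the driving unit: tracks how far the display is drawn out."""

    def __init__(self, exposed_mm: float = 200.0):
        self.exposed_mm = exposed_mm

    def on_collision_info(self) -> float:
        # Retract fully so the display is curled inside its housing,
        # protecting it from breakage in an accident.
        self.exposed_mm = 0.0
        return self.exposed_mm
```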
Meanwhile, the processor 470 can generate new information based on the information or data received via the interface unit 480. The processor 470 may control the display unit 440 to display the generated information or a screen corresponding to the generated information.
The processor 470 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
The interface unit 480 may receive data, or transmit a signal processed or generated by the processor 470 to the outside. To this end, the interface unit 480 can perform data communication with the control unit 770, the vehicle driving assistance device 100, the sensing unit 760, and the like in the vehicle by a wired or wireless communication method.
The interface unit 480 can receive the sensor information from the control unit 770 or the sensing unit 760.
Here, the sensor information may include at least one of vehicle direction information, vehicle position information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward movement information, battery information, fuel information, vehicle interior temperature information, and vehicle interior humidity information.
Such sensor information may be acquired from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/backward movement sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, and a vehicle interior humidity sensor. Meanwhile, the position module may include a GPS module for receiving GPS information.
On the other hand, among the sensor information, the vehicle direction information, the vehicle position information, the vehicle angle information, the vehicle speed information, the vehicle tilt information, and the like relating to the vehicle running can be referred to as vehicle running information.
Meanwhile, the interface unit 480 can receive object detection information acquired through a radar, a LiDAR, or an ultrasonic sensor, distance information to the object, relative speed information with respect to the object, or collision information with the object. Here, the object may be any of various objects located on the road, such as other vehicles, traffic lights, traffic signs, roadside trees, pedestrians, streetlights, and guard rails.
The interface unit 480 can receive the vehicle front image data or the vehicle surrounding image data obtained by the camera 195. For example, the interface unit 480 can receive image data acquired by respective cameras arranged at the front, rear, left, and right of the vehicle.
The interface unit 480 can receive information or data that the processor 170 of the vehicle driving assistance device 100 has processed from the vehicle front image or the vehicle surrounding image. For example, the interface unit 480 can receive object detection information, distance information to an object, relative speed information with respect to an object, or collision information with an object acquired through image processing.
The interface unit 480 can receive the occupant information obtained by the internal camera 195c. For example, the interface unit 480 can receive the user's gaze information acquired by the internal camera 195c.
The interface unit 480 can receive vehicle status information. For example, the interface unit 480 can receive window opening/closing status information from the window driving unit 756.
The power supply unit 490 can supply power necessary for the operation of each component under the control of the processor 470. In particular, the power supply unit 490 can receive power from a battery or the like inside the vehicle.
FIG. 20 is a diagram referred to explain the position of a transparent flexible display according to an embodiment of the present invention.
Referring to FIG. 20, the transparent flexible display 441 may be disposed close to the front windshield 2210.
For example, the transparent flexible display 441 may be located inside the dashboard. In this case, a hole for drawing out the transparent flexible display 441 may be formed on the upper side of the dashboard. The transparent flexible display 441 may be disposed inside the dashboard in a state of being curled around a predetermined axis. Under the control of the processor 470, a portion of the transparent flexible display 441 may be exposed toward the interior of the vehicle.
The transparent flexible display 441 may be disposed close to the rear windshield.
For example, the transparent flexible display 441 may be located inside the C-pillar. In this case, a hole for drawing out the transparent flexible display 441 may be formed on one side of the C-pillar. The transparent flexible display 441 may be disposed inside the C-pillar in a state of being curled around a predetermined axis. Under the control of the processor 470, a portion of the transparent flexible display 441 may be exposed toward the interior of the vehicle.
The transparent flexible display 441 may be disposed close to the side window glass 2020.
For example, the transparent flexible display 441 may be located inside the door of the vehicle. In this case, a hole for drawing out the transparent flexible display 441 may be formed on the upper side of the door. The transparent flexible display 441 may be disposed inside the door in a state of being curled around a predetermined axis. Under the control of the processor 470, a portion of the transparent flexible display 441 may be exposed toward the interior of the vehicle.
For example, the transparent flexible display 441 may be located inside the A-pillar or B-pillar of the vehicle. In this case, a hole for drawing out the transparent flexible display 441 may be formed on one side of the A-pillar or the B-pillar. The transparent flexible display 441 may be disposed inside the A-pillar or the B-pillar in a state of being curled around a predetermined axis. Under the control of the processor 470, a portion of the transparent flexible display 441 may be exposed toward the interior of the vehicle.
The transparent flexible display 441 may be disposed close to the front seat 2020.
For example, the transparent flexible display 441 may be disposed on the back side of the front seat 2020. In this case, a user sitting in the rear seat can view the contents displayed on the transparent flexible display 441.
FIG. 21 is an exemplary view of a case where a transparent flexible display is disposed close to a front windshield, according to an embodiment of the present invention.
Referring to FIG. 21, the transparent flexible display 441 may be located inside the dashboard 2110. In this case, a hole 2115 for drawing out the transparent flexible display 441 may be formed on the upper side of the dashboard 2110.
The transparent flexible display 441 may be disposed inside the dashboard 2110 in a state of being curled around a predetermined axis. Under the control of the processor 470, a portion of the transparent flexible display 441 may be exposed toward the interior of the vehicle.
On the other hand, the front windshield may be formed of an inner glass and an outer glass. In this case, a space may be formed between the inner glass and the outer glass. The exposed area of the transparent flexible display 441 may be disposed in a space between the inner glass and the outer glass.
FIG. 22 is a diagram referred to for explaining a transparent flexible display and a driving unit according to an embodiment of the present invention. FIG. 22 illustrates a side view of the driving unit 450.
Referring to FIG. 22, the transparent flexible display 441 may be curled around the first axis 2201.
Transparent electrodes 2241 and 2242 may be disposed in at least one region of the transparent flexible display 441. The processor 470 may transmit electrical signals or electrical energy to the transparent flexible display 441 through the transparent electrodes 2241 and 2242.
The driving unit 450 can adjust the length of the transparent flexible display 441, which is exposed to the interior of the vehicle.
The driving unit 450 may include a roller unit 451, a driving force generating unit 452, elastic supporting units 2221a to 2221d, and a tilt adjusting unit 2230.
The roller unit 451 contacts the transparent flexible display 441 and can adjust, through rotation, the length of the area exposed to the outside. The roller unit 451 can draw in or draw out the transparent flexible display 441 through the rotational driving force.
The roller unit 451 may include a main roller 2211 and sub-rollers 2212a to 2212f.
The main roller 2211 may be connected to the driving force generating unit 452. The main roller 2211 can receive the rotational driving force generated by the driving force generating unit 452.
The main roller 2211 may be connected to the transparent flexible display 441.
The main roller 2211 can rotate about the first axis 2201. Here, the first axis 2201 is the central axis about which the transparent flexible display 441 is curled. The main roller 2211 can rotate about the first axis 2201 to draw in or draw out the transparent flexible display 441.
The sub-rollers 2212a to 2212f may be disposed between the transparent flexible display 441 and the housing. Although six sub-rollers 2212a to 2212f are illustrated in the figure, the number is not limited thereto. The driving unit 450 may include one or more sub-rollers.
The sub-rollers 2212a to 2212f can reduce the frictional force with the housing 2235 when the transparent flexible display 441 is drawn in or drawn out.
The driving force generating unit 452 can provide a rotational driving force to the roller unit 451.
For example, the driving force generating unit 452 may include a motor. The driving force generating unit 452 can provide the rotational driving force generated by the motor to the roller unit 451.
The elastic supporting units 2221a to 2221d can elastically support the transparent flexible display 441 against the housing 2235. For example, the elastic supporting units 2221a to 2221d may include springs. The elastic supporting units 2221a to 2221d may press the transparent flexible display 441 in a predetermined direction so that the transparent flexible display 441 and the housing 2235 are not spaced apart.
The tilt adjusting unit 2230 can adjust the tilt of the transparent flexible display 441. The tilt adjusting unit 2230 may be disposed in one area of the housing 2235. The tilt adjusting unit 2230 can adjust the tilt of the exposed area of the transparent flexible display 441 according to its degree of protrusion.
FIGS. 23 to 25 are drawings referred to for explaining a support module according to an embodiment of the present invention.
Referring to FIGS. 23 to 25, the vehicle display apparatus 400 may further include a support module 2310, a support module roller 2315, an elastic module 2330, an insertion guide 2335, and an elastic module roller 2340.
The support module 2310 may support a region of the transparent flexible display 441 that is exposed to the interior of the vehicle.
The support module 2310 may be formed to be rollable. The support module 2310 may include a plurality of aprons 2311 connected to each other. The support module 2310 can be wound or unwound by the support module roller 2315.
Each of the plurality of aprons 2311 may include a magnet 2312. The magnet 2312 can attach the apron to the transparent flexible display 441.
As illustrated in FIG. 25, the support module roller 2315 can wind or unwind the support module 2310 through the rotational driving force generated by the motor 2520. The rotational driving force generated by the motor 2520 may be transmitted to the support module roller 2315 through the driving force transmitting unit 2510. Here, the driving force transmitting unit 2510 may include a belt 2511 and a gear 2512.
The elastic module 2330 may include a plurality of elastic pieces. As illustrated in FIG. 24, an elastic piece can be inserted into a slot 2331 formed in the apron 2311. Since the slot 2331 formed in the apron 2311 has an undercut structure and the width of the elastic piece is larger than that of the slot 2331, the elastic piece cannot be inserted as it is. The insertion guide 2335 narrows the width of the elastic piece so that it can be inserted into the slot 2331.
As described above, each of the plurality of elastic pieces is inserted into the slots 2331 formed in the plurality of aprons 2311, so that the vertical posture of the support module 2310 can be maintained by the elastic force of the elastic pieces.
The elastic module roller 2340 can wind or unwind the elastic module 2330 through a rotational driving force generated by a motor (not shown).
The transparent flexible display 441 may be attached to the support module 2310 via the plurality of magnets 2312. The transparent flexible display 441 may be exposed to the interior of the vehicle while attached to the support module 2310. In this case, the transparent flexible display 441 can maintain a vertical posture.
FIGS. 26A and 26B are views referred to for explaining a support module according to an embodiment of the present invention.
Referring to FIGS. 26A and 26B, the support module 2600 can support a region of the transparent flexible display 441 that is exposed to the interior of the vehicle.
The support module 2600 may include a fixing portion 2620, a connecting portion 2625, and a guide portion 2610.
Meanwhile, the support module 2600 may include a first support module and a second support module. In the following description, the first support module will be mainly described. The second support module may be formed to correspond to the first support module.
The fixing portion 2620 can fix the exposed region of the transparent flexible display 441. For example, the fixing portion 2620 can hold a part of the exposed region of the transparent flexible display 441.
Meanwhile, the fixing portion 2620 may be formed of a transparent material. For example, the fixing portion 2620 may be formed of a transparent plastic material.
The guide portion 2610 can be formed so that the fixing portion 2620 can move linearly. The guide portion 2610 may include a groove. A connecting portion 2625 may be seated in the groove. In this case, the fixing portion 2620 can be linearly moved up and down or left and right along the groove in a state where the connecting portion 2625 is seated in the groove.
For example, the transparent flexible display 441 may be formed close to the front windshield. In this case, the fixing portion 2620 can be linearly moved in the vertical direction or the left-right direction of the front windshield.
Meanwhile, the guide portion 2610 may be disposed inside the A-PILLAR.
The guide portion 2610 may include a first guide portion and a second guide portion.
In this case, as illustrated in FIG. 26A, the first guide portion and the second guide portion may be respectively disposed inside the A-PILLAR on both sides of the vehicle. In this case, the transparent flexible display 441 can be spread or rolled up and down.
Alternatively, as illustrated in FIG. 26B, the first guide portion may be disposed on the dashboard, and the second guide portion may be disposed on the ceiling. In this case, the transparent flexible display 441 can be spread or rolled in the left-right direction.
FIG. 27 is a diagram referred to in explaining an operation in which a transparent flexible display is exposed, according to an embodiment of the present invention.
As illustrated at 2701, the transparent flexible display 441 may be disposed within the dashboard with the first axis rolled around.
The processor 470 may receive user input via the input unit 420. For example, the processor 470 may receive the voice input 2710.
When the user input is received, the processor 470 can, as illustrated at 2702, control one region of the transparent flexible display 441 to be exposed to the interior of the vehicle.
The processor 470 can control the driving force generating unit 452 to generate a rotational driving force. The rotational driving force generated by the driving force generating unit 452 may be transmitted to the main roller 2211. The main roller 2211 can rotate. As the main roller 2211 rotates, the transparent flexible display 441 connected to the main roller 2211 can be unwound. As the transparent flexible display 441 is unwound, one area 441a of the transparent flexible display 441 can be exposed to the interior of the vehicle.
Meanwhile, as shown at 2703, the length of the exposed area 441a of the transparent flexible display can be adjusted. The processor 470 can control the driving unit 450 to adjust the length of the area 441a of the transparent flexible display 441 exposed to the interior of the vehicle.
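The length-control flow described above (user input → driving force generating unit 452 → main roller 2211 rotation → exposed length of area 441a) can be illustrated with a minimal sketch. The class name, method name, and the millimetres-per-revolution constant are illustrative assumptions, not part of the disclosed apparatus.

```python
# Illustrative sketch only: names and values are hypothetical.
MM_PER_REVOLUTION = 50  # assumed linear travel of the display per roller turn

class DisplayDriver:
    """Models the driving unit 450: a motor turns the main roller,
    winding or unwinding the transparent flexible display."""

    def __init__(self):
        self.exposed_mm = 0  # length of the area exposed to the interior

    def set_exposed_length(self, target_mm):
        # Positive revolutions unwind (expose); negative revolutions wind up.
        delta = target_mm - self.exposed_mm
        revolutions = delta / MM_PER_REVOLUTION
        self.exposed_mm = target_mm
        return revolutions

driver = DisplayDriver()
rev = driver.set_exposed_length(150)   # expose 150 mm -> 3.0 forward turns
rev2 = driver.set_exposed_length(100)  # retract to 100 mm -> -1.0 turn
```

The sign of the returned revolution count distinguishes unwinding from winding, mirroring the bidirectional roller drive described in the text.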
Meanwhile, the processor 470 may display information in the exposure area 441a. For example, the processor 470 may display navigation information in the exposure area 441a.
FIG. 28 is a diagram referred to in explaining an operation of displaying information in an exposure area, according to an embodiment of the present invention.
Referring to FIG. 28, in a state in which one area 441a of the transparent flexible display 441 is exposed, the processor 470 can display information. Here, the information may include vehicle-related information, driving situation information, and navigation information.
The vehicle-related information may include vehicle control information for direct control of the vehicle, or vehicle driving assistance information for guiding the driver of the vehicle. Further, the vehicle-related information may include vehicle state information indicating the current state of the vehicle or vehicle driving information related to the driving of the vehicle.
The driving situation information may include information about the driving environment or the road being traveled. The driving environment information may include weather information at the time of driving. The road information may include construction information, road congestion information, and the like.
The navigation information may include destination information, an estimated arrival time, a remaining time until arrival, a route, and the like.
FIG. 29 is a diagram referred to in explaining an operation of displaying information in an exposure area of a first display and a second display, according to an embodiment of the present invention.
Referring to FIG. 29, when there are a plurality of transparent flexible displays 441, the processor 470 can control each of the plurality of transparent flexible displays to display a different screen.
The processor 470 can control so that different screens are displayed in the exposure regions of the plurality of transparent flexible displays.
For example, the transparent flexible display 441 may include a first display 2901 and a second display 2902.
The first display 2901 may be a display disposed close to the driver's seat. The first display 2901 may be disposed inside the dashboard in a state of being rolled around the first axis.
The processor 470 may control one area of the first display 2901 to be exposed to the interior of the vehicle according to user input. The processor 470 may display vehicle-related information, driving situation information, or navigation information in the exposed area of the first display 2901.
The second display 2902 may be a display disposed close to the front passenger seat. The second display 2902 may be disposed inside the dashboard in a state of being rolled around the second axis.
The processor 470 may control one area of the second display 2902 to be exposed to the interior of the vehicle according to user input. The processor 470 may display predetermined content in the exposed area of the second display 2902. Here, the content may be content unrelated to the vehicle, such as a video, the Internet, or a game.
Meanwhile, according to the embodiment, the processor 470 can control the second display 2902 to be exposed to the interior of the vehicle when a passenger boards. For example, in a state in which only the first display 2901 is exposed, when a passenger boarding the front passenger seat is detected through the internal camera 195c, the processor 470 can control the second display 2902 to be exposed to the interior of the vehicle.
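The passenger-dependent exposure rule above can be sketched as a small decision function. The function name and the string identifiers are illustrative assumptions.

```python
def displays_to_expose(passenger_detected):
    """Illustrative sketch: the driver-side first display is always exposed;
    the second (passenger-side) display is exposed only when the internal
    camera detects a passenger in the front passenger seat."""
    exposed = ["first_display"]
    if passenger_detected:
        exposed.append("second_display")
    return exposed
```

With no passenger detected only the first display is listed; once the internal camera reports a passenger, the second display is added.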
FIG. 30 is a diagram referred to in explaining an operation of displaying an image obtained through a camera on a transparent flexible display, according to an embodiment of the present invention.
Referring to FIG. 30, the processor 470 can receive the image data obtained by the camera 195 via the interface unit 480. The processor 470 can display a screen based on the image data on the transparent flexible display 441.
If the transparent flexible display 441 is disposed close to the side window glass, the processor 470 can control the transparent flexible display 441 to display the rear image of the vehicle.
As shown at 3001, the transparent flexible display 441 may be disposed inside the door in a state of being rolled around the first axis.
The processor 470 can control one area 441a of the transparent flexible display 441 to be exposed to the interior of the vehicle according to the user input. The processor 470 can display the image photographed by the camera 195 in the exposure area 441a.
As shown at 3002, the transparent flexible display 441 may be disposed inside the A-pillar in a state of being rolled around the first axis.
The processor 470 can control one area 441a of the transparent flexible display 441 to be exposed to the interior of the vehicle according to the user input. The processor 470 can display the image photographed by the camera 195 in the exposure area 441a.
For example, when driving in snow or rain, the processor 470 can control the rear image 3010 of the vehicle to be displayed on the transparent flexible display 441a disposed close to the side window glass.
For example, when a turn signal is input for a lane change or a turn, the processor 470 can control the rear image 3003 of the vehicle to be displayed on the transparent flexible display 441a disposed close to the side window glass.
FIG. 31 is a diagram referred to in explaining a transparent flexible display capable of adjusting its angle and brightness, according to an embodiment of the present invention.
Referring to FIG. 31, the angle of the transparent flexible display 441 can be adjusted. The exposed area 441a of the transparent flexible display 441 may form a predetermined angle with the dashboard in the YZ plane. Here, the Y axis may be the overall width direction (W), and the Z axis may be the overall length direction (L).
The exposed area 441a of the transparent flexible display 441 can be angled according to the user's gaze. The processor 470 can control the degree of projection of the tilt adjusting unit 2230. The processor 470 can adjust the inclination of the exposed area 441a with respect to the dashboard by controlling the degree of projection of the tilt adjusting unit 2230.
The processor 470 can control the inclination of the exposed area 441a so that the user can easily view the screen displayed in the exposed area 441a while looking forward. For example, the processor 470 can control the inclination of the exposed region 441a so that the angle formed in the YZ plane between the direction of the user's gaze and the exposed region 441a is 80° or more and 100° or less.
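The 80°–100° constraint above can be expressed as a simple check and clamp. The function names and the clamping behaviour are illustrative assumptions; the disclosure only states the allowed angular band.

```python
def tilt_ok(gaze_angle_deg):
    """Illustrative sketch: the angle in the YZ plane between the user's
    gaze direction and the exposed region should stay within 80-100 degrees."""
    return 80.0 <= gaze_angle_deg <= 100.0

def corrected_tilt(gaze_angle_deg):
    # Clamp a commanded tilt into the allowed band (hypothetical policy).
    return min(max(gaze_angle_deg, 80.0), 100.0)
```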
Meanwhile, the processor 470 can adjust the brightness of the transparent flexible display 441 according to the amount of light entering the passenger compartment. For example, the processor 470 can darken the transparent flexible display 441 in proportion to the amount of light entering the vehicle interior. By darkening the transparent flexible display 441 as a whole, the visibility of the displayed content can be improved.
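The proportional darkening rule above can be sketched as a clamped linear mapping. The lux scale, the saturation value, and the 0–1 darkness range are illustrative assumptions.

```python
def display_darkness(ambient_lux, max_lux=100_000):
    """Illustrative sketch: darken the transparent flexible display in
    proportion to the amount of light entering the cabin, clamped to
    [0.0, 1.0] (0 = fully clear, 1 = fully darkened)."""
    return min(max(ambient_lux / max_lux, 0.0), 1.0)
```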
FIG. 32 is a diagram referred to in explaining the operation of a transparent flexible display when collision information with an object is received, according to an embodiment of the present invention.
Referring to FIG. 32, the processor 470 can receive collision information with an object via the interface unit 480. The camera 195 may detect a collision with an object. Alternatively, a radar, a LiDAR, or an ultrasonic sensor included in the sensing unit 760 may detect a collision with an object.
When the collision information with the object is received, the processor 470 can control, through the driving unit 450, the transparent flexible display 441 not to be exposed. For example, the processor 470 can control the driving unit 450 so that the transparent flexible display 441 is arranged in a state of being rolled around the predetermined axis. Thus, breakage of the transparent flexible display 441 due to an accident can be prevented.
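The retraction behaviour above can be sketched as a two-state machine: a collision event always forces the rolled state, while a user request can expose the display again. State and event names are illustrative assumptions.

```python
def on_event(state, event):
    """Illustrative sketch: on a collision event the display is wound
    around its axis regardless of current state, so the exposed panel
    cannot be broken in an accident."""
    if event == "collision_detected":
        return "rolled"
    if event == "user_request_expose" and state == "rolled":
        return "exposed"
    return state
```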
FIG. 33 is a diagram referred to in explaining an operation of displaying POI information on a transparent flexible display, according to an embodiment of the present invention.
Referring to FIG. 33, the vehicle display apparatus 400 can communicate with the user's mobile terminal 250 through the communication unit 410. The processor 470 can receive the user's interest information from the mobile terminal 250 via the communication unit 410.
For example, the mobile terminal 250 may accumulate and store information on websites visited by the user, SNS keywords, and text message keywords. The mobile terminal 250 may extract the user's interest information based on the stored information.
The processor 470 may receive the interest information from the mobile terminal 250. The processor 470 may display POIs (Points of Interest) 3311, 3312, 3313, and 3314 corresponding to the interest information on the exposed area 441a of the transparent flexible display.
FIG. 34 is a diagram referred to in explaining an operation of displaying TBT information on a transparent flexible display, according to an embodiment of the present invention.
Referring to FIG. 34, the processor 470 may display the navigation information in the exposed area 441a of the transparent flexible display 441.
The navigation information may include TBT (Turn by Turn) information.
The processor 470 may display the TBT information 3410 to match the actual lane. The processor 470 can display the TBT image 3410 with perspective. Alternatively, the processor 470 may display the TBT image 3410 stereoscopically. The processor 470 may display the TBT image 3410 so that its height matches the perspective of the actual lane. In this case, the width of the TBT image can be displayed smaller as the height increases.
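The perspective rule above (narrower width at greater drawn height) can be sketched as a taper function. The linear taper and the 50%-at-top factor are illustrative assumptions; the disclosure only states that width decreases with height.

```python
def tbt_width_at(base_width, height_px, panel_height_px):
    """Illustrative sketch: the higher (farther) a point of the TBT image
    is drawn, the narrower it is rendered, so the arrow appears to lie
    flat on the actual lane."""
    taper = 1.0 - (height_px / panel_height_px) * 0.5  # 50% width at the top
    return base_width * taper
```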
FIG. 35 is a diagram referred to in explaining an operation in which distance information to an object is displayed on a transparent flexible display, according to an embodiment of the present invention.
Referring to FIG. 35, the processor 470 can receive the distance information to the object through the interface unit 480. Here, the distance information can be detected through the camera 195, a radar, a LiDAR, or an ultrasonic sensor.
The processor 470 may display distance information to at least one object in the exposure area of the transparent flexible display.
FIG. 36 is a view illustrating a case in which a transparent flexible display is disposed in a front seat, according to an embodiment of the present invention.
As illustrated at 3601, the transparent flexible display 441 may be disposed inside the front seat in a state of being rolled around the first axis.
The processor 470 may receive user input via the input unit 420. For example, the processor 470 may receive the voice input 3605.
When the user input is received, the processor 470 can, as illustrated at 3602, control one region of the transparent flexible display 441 to be exposed to the interior of the vehicle. In this case, the exposed region 441a may be exposed along the rear surface of the headrest of the front seat.
The processor 470 can control the driving force generating unit 452 to generate a rotational driving force. The rotational driving force generated by the driving force generating unit 452 may be transmitted to the main roller 2211. The main roller 2211 can rotate. As the main roller 2211 rotates, the transparent flexible display 441 connected to the main roller 2211 can be unwound. As the transparent flexible display 441 is unwound, one area 441a of the transparent flexible display 441 can be exposed to the interior of the vehicle.
Meanwhile, as shown at 3603, the length of the exposed region 441a of the transparent flexible display can be adjusted. The processor 470 can control the driving unit 450 to adjust the length of the area 441a of the transparent flexible display 441 exposed to the interior of the vehicle.
Meanwhile, the processor 470 can display content in the exposure area 441a. Here, the content may be content unrelated to the vehicle, such as a video, the Internet, or a game.
FIG. 37 is a block diagram referred to in explaining a display apparatus for a vehicle according to another embodiment of the present invention.
FIG. 38 is a diagram referred to in explaining a display unit according to an embodiment of the present invention.
Referring to FIGS. 37 and 38, the vehicle display apparatus 400 includes a communication unit 410, an input unit 420, a memory 430, a display unit 440, an acoustic output unit 448, a driving unit 450, a sensing unit 460, a processor 470, an interface unit 480, and a power supply unit 490.
The vehicle display apparatus of FIG. 37 differs from the vehicle display apparatus of FIG. 19 in the configuration or operation of the processor 470 and the display unit 440. Hereinafter, the description focuses on these differences. Except for the processor 470 and the display unit 440, the vehicle display apparatus of FIG. 37 is the same as that of FIG. 19.
The display unit 440 may include an A-pillar display 445 and a rolling display 446.
The A-pillar display 445 may be disposed on the A-pillar 3801 of the vehicle.
The rolling display 446 may be one embodiment of the transparent flexible display 441 described above.
The rolling display 446 can be deployably unwound from the A-pillar 3801 toward the front windshield 3802 or the side window glass 3803.
The rolling display 446 may be configured to be deformable by an external force. The deformation may be at least one of warping, bending, folding, twisting, and curling of the transparent flexible display 441.
In a state in which the rolling display 446 is not deformed (e.g., a state having an infinite radius of curvature, hereinafter referred to as a first state), the display area of the rolling display 446 is flat. In a state deformed from the first state by an external force (e.g., a state having a finite radius of curvature, hereinafter referred to as a second state), the display area may be a curved surface. The information displayed in the second state may be visual information output on the curved surface. Such visual information is realized by independently controlling the emission of sub-pixels arranged in a matrix form. A unit pixel is the minimum unit for implementing one color.
The rolling display 446 may be placed in a bent state (e.g., up, down, or sideways) rather than in a flat state in the first state. In this case, when an external force is applied to the rolling display 446, the rolling display 446 may be deformed into a flat state (or less bent state) or more bent state.
Meanwhile, the rolling display 446 may be combined with a touch sensor to implement a flexible touch screen. When a touch is made to the flexible touch screen, the processor 470 can perform control corresponding to the touch input. The flexible touch screen may be configured to sense the touch input in the first state as well as the second state.
Meanwhile, the rolling display 446 may have a predetermined transparency. In order to have this transparency, the rolling display 446 may include a transparent thin film transistor (TFT) display, a transparent thin-film electroluminescent (TFEL) display, a transparent organic light-emitting diode (OLED) display, a transparent liquid crystal display (LCD), or the like. The transparency of the rolling display 446 may be adjusted according to the control of the processor 470.
The rolling display 446 may include a first rolling display 3810 and a second rolling display 3820. Here, the first rolling display 3810 can be deployably unwound from the A-pillar 3801 toward the front windshield 3802. The second rolling display 3820 can be deployably unwound from the A-pillar 3801 toward the side window glass 3803.
Meanwhile, the second rolling display 3820 may include a third rolling display and a fourth rolling display. Here, the third rolling display can be deployably unwound from the left A-pillar toward the left side window glass. The fourth rolling display can be deployably unwound from the right A-pillar toward the right side window glass.
Meanwhile, according to the embodiment, the A-pillar display 445 may be formed integrally with the rolling display 446.
The driving unit 450 can adjust the length of a region of the rolling display 446 that is exposed to the interior of the vehicle. For example, the driving unit 450 can adjust the deployment area of the rolling display 446.
The driving unit 450 may include a roller unit 451, a driving force generating unit 452, elastic supports (2221a to 2221d in FIG. 22), and a tilt adjusting unit (2230 in FIG. 22).
The roller unit 451 contacts the rolling display 446 and can adjust, through rotation, the length of the exposed region. For example, the roller unit 451 can contact the rolling display 446 and rotate to adjust the deployment area.
The roller unit 451 may include a main roller (2211 in FIG. 22) and sub-rollers (2212a to 2212f in FIG. 22).
The main roller (2211 in FIG. 22) may be connected to the driving force generating unit 452. The main roller (2211 in FIG. 22) can receive the rotational driving force generated by the driving force generating unit 452.
The sub-rollers (2212a to 2212f in FIG. 22) may be disposed between the rolling display 446 and the housing. The sub-rollers (2212a to 2212f in FIG. 22) can reduce the frictional force with the housing (2235 in FIG. 22) when the rolling display 446 is wound or unwound.
The driving force generating unit 452 can provide a rotational driving force to the roller unit 451.
For example, the driving force generating unit 452 may include a motor. The driving force generating unit 452 can provide the rotational driving force generated by the motor to the roller unit 451.
The elastic supports (2221a to 2221d in FIG. 22) can resiliently support the rolling display 446 on the housing 2235.
The tilt adjusting unit (2230 in FIG. 22) can adjust the tilt of the rolling display 446.
The processor 470 may control the deployment of the rolling display 446. Specifically, the processor 470 can control the driving unit 450 to control the winding or unwinding of the rolling display 446. For example, the processor 470 may control the length of the area of the rolling display 446 exposed to the interior of the vehicle.
The processor 470 may control the deployment of the rolling display 446 according to an input signal received via the input unit 420. For example, the processor 470 may control the deployment of the rolling display 446 according to the user's button input or touch input received via the user input unit 421. For example, the processor 470 may control the deployment of the rolling display 446 according to the user's voice input received via the acoustic input unit 422.
The processor 470 can control the screen displayed on the rolling display 446 according to the degree of deployment detected by the length detector 463. For example, the processor 470 can control the size of the screen according to the degree of deployment. For example, the processor 470 can control the area displayed on the screen according to the degree of deployment.
The processor 470 can control the screen displayed on the A-pillar display 445 or the rolling display 446.
For example, in a state in which first content is displayed on the rolling display 446, when the deployed state of the rolling display 446 is switched to widen the display area, the processor 470 can control the rolling display 446 to further display second content while the display of the first content is maintained.
Here, the display area is an area in which content can be displayed. For example, the display area may be the area in which the rolling display 446 is exposed to the vehicle interior.
For example, in a state in which first content is displayed on the rolling display 446, when the deployed state is switched to narrow the display area, the processor 470 can control the display so that the part of the first content corresponding to the reduced display area is not displayed while the rest of the first content remains displayed.
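The widen/narrow behaviour above can be sketched as simple clipping of the content to the currently deployed display area. Treating the content as rows of a list is an illustrative assumption.

```python
def visible_rows(content_rows, display_rows):
    """Illustrative sketch: when the deployed area narrows, the part of
    the content that no longer fits is simply not displayed; the
    remainder stays on screen. Widening restores hidden rows."""
    return content_rows[:display_rows]
```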
Meanwhile, the processor 470 can control the screen displayed on the A-pillar display 445 according to the degree of deployment of the rolling display 446.
Meanwhile, the A-pillar display 445 may include a touch sensor. The processor 470 may control the deployment of the rolling display 446 according to an input signal received via the touch sensor included in the A-pillar display 445. The processor 470 may control the screen displayed on the A-pillar display 445 or the rolling display 446 according to an input signal received via the touch sensor included in the A-pillar display 445. Here, the input signal may include a tap, touch & hold, double tap, drag, panning, flick, and drag and drop. A "tap" represents an operation in which the user touches the screen very quickly using a finger or a stylus. "Touch & hold" represents an operation in which the user touches the screen using a finger or a stylus and then maintains the touch input for longer than a threshold time. A "double tap" represents an operation in which the user quickly touches the screen twice with a finger or a stylus. A "drag" represents an operation in which the user touches the screen with a finger or a touch tool and then moves the finger or touch tool to another position on the screen while maintaining the touch. "Panning" represents a case in which the user performs a drag operation without selecting an object. A "flick" represents an operation in which the user drags very quickly using a finger or a touch tool. "Drag and drop" represents an operation in which the user drags an object to a predetermined position on the screen using a finger or a touch tool and then releases the object.
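The single-touch gestures defined above (tap, touch & hold, double tap, drag) can be sketched as a classifier over a few touch features. The thresholds and the simplified feature set are illustrative assumptions; panning, flick, and drag-and-drop would additionally need velocity and target-object information.

```python
def classify_touch(duration_s, move_px, taps,
                   hold_threshold_s=0.5, move_threshold_px=10):
    """Illustrative sketch of the input-signal taxonomy: movement beyond
    a threshold is a drag; two quick touches are a double tap; a long
    stationary touch is touch & hold; otherwise it is a tap."""
    if move_px > move_threshold_px:
        return "drag"
    if taps == 2:
        return "double tap"
    if duration_s >= hold_threshold_s:
        return "touch & hold"
    return "tap"
```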
The processor 470 can receive object sensing information through the interface unit 480. The sensing unit 125 or the camera 195 can generate the object sensing information by sensing an object located in the vicinity of the vehicle.
The processor 470 can control the display so that an image corresponding to the object is displayed on the A-pillar display 445 or the rolling display 446, corresponding to the object sensing information.
The processor 470 can receive the motion information of the object. The sensing unit 125 or the camera 195 can generate the object motion information by tracking the motion of the object.
The processor 470 can control the display area of the image corresponding to the object to change in accordance with the motion information.
For example, the processor 470 may receive first motion information via the interface unit 480. The first motion information may be information indicating that the object moves, with respect to the driver, from the A-pillar 3801 toward the front windshield 3802 or the side window glass 3803. The processor 470 can control the display to move the image corresponding to the object from the A-pillar display 445 to the rolling display 446 in response to the first motion information.
For example, the processor 470 can receive second motion information via the interface unit 480. The second motion information may be information indicating that the object moves, with respect to the driver, from the front windshield 3802 or the side window glass 3803 toward the A-pillar 3801. The processor 470 may control the display to move the image corresponding to the object from the rolling display 446 to the A-pillar display 445 in response to the second motion information.
Meanwhile, the rolling display 446 may include a first rolling display 3810 that is deployably unwound from the A-pillar 3801 toward the front windshield 3802 and a second rolling display 3820 that is deployably unwound from the A-pillar 3801 toward the side window glass 3803.
The processor 470 can control different contents to be displayed on the A-pillar display 445, the first rolling display 3810, and the second rolling display 3820, respectively.
The processor 470 can receive the user's gaze information from the internal camera 195c via the interface unit 480. The processor 470 may display content on the second rolling display 3820 if the user's gaze is directed toward the side window glass 3803. For example, the processor 470 may display the vehicle rear image obtained from the rear camera on the second rolling display 3820 if the user's gaze is directed toward the side window glass 3803.
Meanwhile, the processor 470 can receive turn signal information through the interface unit 480. The processor 470 may display content on the second rolling display 3820 corresponding to the turn signal direction. For example, when a left turn signal input is received, the processor 470 may display content on the third rolling display corresponding to the left side window glass. For example, when a right turn signal input is received, the processor 470 may display content on the fourth rolling display corresponding to the right side window glass.
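The turn-signal routing rule above can be sketched as a small lookup. The string identifiers are illustrative assumptions.

```python
def display_for_turn_signal(direction):
    """Illustrative sketch: a left turn signal routes the rear-view
    content to the third rolling display (left side window glass), a
    right turn signal to the fourth (right side window glass)."""
    return {"left": "third_rolling_display",
            "right": "fourth_rolling_display"}.get(direction)
```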
Meanwhile, the processor 470 may display content on the second rolling display 3820 according to the user's gaze and the turn signal information.
The processor 470 can receive the open or closed state information of the window through the interface unit 480.
The processor 470 can control the second rolling display 3820 to be wound when the window is opened. This is to prevent damage to the second rolling display when the window is opened.
The processor 470 may control to cause the second rolling display 3820 to deploy when the window is closed again.
The processor 470 may control the first rolling display 3810 to deploy by an amount corresponding to the degree to which the second rolling display 3820 is rolled up when the window is opened.
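The window-linked behaviour above (wind the side display when the window opens, compensate on the windshield-side display, redeploy when the window closes) can be sketched as one decision function. The return format and millimetre units are illustrative assumptions.

```python
def on_window_state(window_open, second_deployed_mm):
    """Illustrative sketch: when the side window opens, the second
    rolling display is fully wound to avoid damage and the first rolling
    display is deployed by a corresponding amount; when the window
    closes, the second display is deployed again."""
    if window_open:
        return {"second": 0, "first_extra": second_deployed_mm}
    return {"second": second_deployed_mm, "first_extra": 0}
```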
The processor 470 can receive, through the interface unit 480, weather information from the sensing unit 125 that senses the weather condition. The illuminance sensor and the humidity sensor included in the sensing unit 125 of the vehicle 100 can sense weather information.
The processor 470 can receive weather information from the external server 260 through the communication unit 410.
The processor 470 can receive image data from the camera 195 through the interface unit 480. For example, the processor 470 may receive the vehicle rear image data from the rear camera through the interface unit 480.
The processor 470 can display an image corresponding to the vehicle rear image data on the second rolling display 3820 when it rains or snows. When rain or snow falls and it is difficult to secure a view, displaying the vehicle rear image on the second rolling display 3820 assists safe driving.
Meanwhile, the processor 470 can display the image corresponding to the vehicle rear image data on the second rolling display 3820 when the user's gaze is directed toward the side window glass 3803 in a rain or snow state.
Meanwhile, when a turn signal input is received in a rain or snow state, the processor 470 can display the image corresponding to the vehicle rear image data on the second rolling display 3820 corresponding to the turn signal direction.
The processor 470 can control message reception information to be displayed on the A-pillar display 445 when a message is received from an external device through the communication unit 410. The processor 470 may control the detailed content of the message to be displayed on the first rolling display 3810 when a user input is received or the vehicle is stopped. Here, the user input may be received through the input unit 420 or via the touch sensor included in the A-pillar display 445.
Meanwhile, the second rolling display 3820 may include a third rolling display that is deployably unwound from the left A-pillar toward the left side window glass and a fourth rolling display that is deployably unwound from the right A-pillar toward the right side window glass.
The processor 470 can display, on the first rolling display 3810, an image corresponding to the image data acquired through the camera disposed at the front of the vehicle. The processor 470 can display, on the third rolling display, an image corresponding to the image data acquired through the camera disposed at the rear or on the left side of the vehicle. The processor 470 can display, on the fourth rolling display, an image corresponding to the image data acquired through the camera disposed at the rear or on the right side of the vehicle.
FIGS. 39A and 39B are views referred to in explaining a display unit according to an embodiment of the present invention. FIGS. 39A and 39B are top views around the A-pillar.
Referring to FIG. 39A, the A-pillar display 445 may be disposed on the A-pillar.
Meanwhile, the driving unit 450 may include a first driving unit 450a and a second driving unit 450b.
The first rolling display 3810 may be connected to the first driving unit 450a. The processor 470 may control the first driving unit 450a to control the deployment of the first rolling display 3810. The first rolling display 3810 is deployable from the A-pillar in the front windshield direction (a).
The second rolling display 3820 may be connected to the second driving unit 450b. The processor 470 may control the second driving unit 450b to control the deployment of the second rolling display 3820. The second rolling display 3820 is deployable from the A-pillar in the side window glass direction (b).
Meanwhile, the first driving unit 450a may be disposed at a portion where the A-pillar and the front windshield are connected. In this case, the first driving unit 450a may be formed of a transparent material.
The second driving unit 450b may be disposed at a portion where the A-pillar and the side window glass are connected. In this case, the second driving unit 450b may be formed of a transparent material.
Referring to FIG. 39B, the driving unit 450 may include a third driving unit 450c and a fourth driving unit 450d.
The first rolling display 3810 may be connected to the third driving unit 450c. The processor 470 may control the third driving unit 450c to control the deployment of the first rolling display 3810.
The A-pillar display 445 may be formed integrally with the first rolling display 3810. In this case, the first rolling display 3810 is deployed in the front windshield direction (c) while covering the A-pillar. At this time, the area of the first rolling display 3810 covering the A-pillar may be referred to as the A-pillar display 445.
The second rolling display 3820 may be connected to the fourth driving unit 450d. The processor 470 can control the fourth driving unit 450d to control the deployment of the second rolling display 3820. The second rolling display 3820 is deployable from the A-pillar in the side window glass direction (d).
Meanwhile, the A-pillar display 445 may be formed integrally with the second rolling display 3820. In this case, the second rolling display 3820 is deployed in the direction of the side window glass while covering the A-pillar. At this time, the area of the second rolling display 3820 covering the A-pillar may be referred to as the A-pillar display 445.
Figs. 40A and 40B are diagrams referred to in explaining a rolling display deployment operation, according to an embodiment of the present invention.
Referring to Fig. 40A, as illustrated at 4001, the display unit 440 may include an A-pillar display 445 and a rolling display 446. Here, the rolling display 446 may include a first rolling display 3810 and a second rolling display 3820.
As illustrated at 4002, the processor 470 may control the deployment of the rolling display 446 when a user input is received.
The processor 470 may also control the screen displayed on the rolling display 446 when a user input is received.
Here, the user input may be an input received through the input unit 420. Alternatively, the user input may be an input received through a touch sensor included in the A-pillar display 445.
Meanwhile, the length sensing unit 463 can sense the degree of deployment of the rolling display. The processor 470 can control the screen displayed on the rolling display 446 according to the degree of deployment.
As illustrated at 4002, when the deployed state of the rolling display is switched so that the display area is widened while the first content 4020 is displayed on the rolling display 446, the processor 470 can additionally display the second content 4030 on the rolling display 446 while maintaining the display of the first content 4020.
Referring to Fig. 40B, as illustrated at 4003, when the deployed state of the rolling display is switched so that the display area is narrowed while the first content 4020 is displayed on the rolling display 446, the processor 470 can control the display of the first content 4020 so that a portion of the first content corresponding to the narrowed area is not displayed.
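The widen/narrow behavior of Figs. 40A and 40B can be sketched as a clipping routine over the sensed deployment length. This is a minimal sketch under assumed units; the function name and fixed per-content width are illustrative, not from the specification.

```python
def visible_contents(contents, content_width, deployed_width):
    """Return the contents that fit in the currently deployed area.

    Each content occupies content_width; when the display is widened,
    additional contents appear; when it is narrowed, the last content
    is clipped by the narrowed amount (Figs. 40A-40B).
    Returns a list of (content, visible_width) pairs.
    """
    visible = []
    remaining = deployed_width
    for item in contents:
        if remaining <= 0:
            break
        shown = min(content_width, remaining)  # clip partially visible content
        visible.append((item, shown))
        remaining -= content_width
    return visible
```

For example, widening from one content-width to two reveals the second content while the first remains fully displayed, and narrowing below one content-width clips only the first content.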
Fig. 41 is a diagram referred to in explaining an operation of displaying an image corresponding to a detected object, according to an embodiment of the present invention.
Referring to Fig. 41, the processor 470 may receive object sensing information, through the interface unit 480, from the sensing unit 125 that senses an object. The processor 470 can control an image 4010 corresponding to the object to be displayed on the A-pillar display 445 or the rolling display 446 in correspondence with the object sensing information.
The processor 470 can receive the motion information of the object. The processor 470 can control the display area of the image 4010 to change in accordance with the motion information.
The processor 470 can receive first motion information through the interface unit 480. The first motion information may be information indicating that the object moves from the A-pillar 3801 in the direction of the front windshield 3802, with reference to the driver (from 4101 to 4102). The processor 470 may control the image 4110 corresponding to the object displayed on the A-pillar display 445 to move to the rolling display 446 in response to the first motion information (from 4101 to 4102). In this case, the processor 470 may output an alarm 4120 if a collision with the object is detected.
The processor 470 can receive second motion information through the interface unit 480. The second motion information may be information indicating that the object moves from the direction of the side window glass 3803 toward the A-pillar 3801, with reference to the driver (from 4103 to 4101). The processor 470 may control the image 4110 corresponding to the object displayed on the rolling display 446 to move to the A-pillar display 445 in response to the second motion information (to 4101). In this case, the processor 470 may output an alarm 4120 if a collision with the object is detected.
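The image handoff between displays in Fig. 41 can be sketched as a small state transition on the object's motion direction. The motion labels and display names below are illustrative assumptions.

```python
def target_display(current_display: str, motion: str) -> str:
    """Move the object image between the A-pillar display and the
    rolling display according to the object's motion (Fig. 41).

    'toward_windshield' corresponds to the first motion information,
    'toward_pillar' to the second; other cases leave the image in place.
    """
    if motion == "toward_windshield" and current_display == "a_pillar_display":
        return "rolling_display"
    if motion == "toward_pillar" and current_display == "rolling_display":
        return "a_pillar_display"
    return current_display
```

The alarm output on a detected collision would be a separate action taken alongside the handoff, not shown here.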
Fig. 42 is a diagram referred to in explaining a screen control operation according to user input, according to an embodiment of the present invention.
Referring to Fig. 42, when user inputs 4220 and 4240 are received through the touch sensor included in the A-pillar display 445, the processor 470 may control the deployment of the rolling display 446, as shown at 4202 and 4203. The processor 470 may also control the screens 4230 and 4250 displayed on the rolling display 446 when the user inputs 4220 and 4240 are received through the touch sensor included in the A-pillar display 445.
For example, when a message is received from an external device 250, 260, or 261, the processor 470 may control message reception information 4210 to be displayed on the A-pillar display 445. In this figure, a case where a text message is received from the counterpart mobile terminal 250 is illustrated. As illustrated at 4201, the processor 470 may display the counterpart's picture 4210 on the A-pillar display 445.
When the first user input 4220 is received through the touch sensor included in the A-pillar display 445, the processor 470 may display the content 4230 of the received message on the rolling display 446. As illustrated at 4202, the processor 470 displays the text message content received from the counterpart mobile terminal 250 on the rolling display 446.
Thereafter, when the second user input 4240 is received through the touch sensor included in the A-pillar display 445, the processor 470 may perform an operation corresponding to the second user input. As illustrated at 4203, the processor 470 scrolls and displays the text message content in response to the second user input.
Figure 43 is a diagram referenced to illustrate the operation of displaying a camera acquired image on a rolling display, in accordance with an embodiment of the present invention.
Referring to Fig. 43, the processor 470 may display an alarm 4310 on the A-pillar display 445 or the rolling display 446, as illustrated at 4301.
For example, if an object (e.g., another vehicle) is detected at the left rear while a turn signal input for a lane change to the left is received, the processor 470 can display an alarm 4310 on the A-pillar display 445 or the rolling display 446.
For example, when an object (e.g., another vehicle or a pedestrian) is detected in front of the vehicle while the user is gazing toward the side window glass, the processor 470 can display an alarm 4310 on the second rolling display 3820.
As illustrated at 4302, the processor 470 may display an image acquired through the camera 195 on the A-pillar display 445 or the rolling display 446.
For example, when the user's gaze is directed toward the side window glass 3803 in a rain or snow condition, the processor 470 can display an image corresponding to the vehicle rear image data on the second rolling display 3820.
For example, when a turn signal input is received in a rain or snow condition, the processor 470 can display an image corresponding to the vehicle rear image data on the second rolling display 3820 corresponding to the turn signal direction.
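The two rain/snow examples above share the same trigger structure: bad weather combined with the driver's attention being on the side (gaze or turn signal). A minimal sketch of that decision, with the weather and gaze labels as illustrative assumptions:

```python
def rear_view_on_side_display(weather: str, gaze: str, turn_signal) -> bool:
    """Decide whether to show the rear camera image on the second
    rolling display (side-window side), per the rain/snow examples.

    weather: e.g. "rain", "snow", "clear" (sensed weather condition)
    gaze: where the driver is looking, e.g. "side_window" or "front"
    turn_signal: the signaled direction, or None if no turn signal
    """
    bad_weather = weather in ("rain", "snow")
    attention_on_side = gaze == "side_window" or turn_signal is not None
    return bad_weather and attention_on_side
```

When this returns True, the processor would route the vehicle rear image data (received through the interface unit) to the second rolling display on the signaled or gazed side.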
Fig. 44 is a diagram referred to in explaining a deployment operation of a rolling display corresponding to window opening, according to an embodiment of the present invention.
Referring to FIG. 44, the processor 470 can receive the open or closed state information of the window through the interface unit 480.
As illustrated at 4401, the processor 470 may control the second rolling display 3820 to be wound when the window is opened. This is to prevent damage to the second rolling display when the window is opened.
At this time, the processor 470 may control the first rolling display 3810 to be deployed in correspondence with the degree to which the second rolling display 3820 is wound.
Meanwhile, as illustrated at 4402, the processor 470 may control the second rolling display 3820 to be deployed again when the window is closed.
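The wind/re-deploy behavior of Fig. 44, including the compensating deployment of the first rolling display, can be sketched as a small controller. The class and attribute names are illustrative; deployment is modeled as a fraction from 0.0 (fully wound) to 1.0 (fully deployed).

```python
class RollingDisplayController:
    """Winds the second rolling display when the window opens (to avoid
    damage) and re-deploys it when the window closes, deploying the
    first rolling display in correspondence with the amount wound."""

    def __init__(self):
        self.first_deployed = 0.0   # fraction deployed, 0.0-1.0
        self.second_deployed = 1.0

    def on_window_state(self, window_open: bool) -> None:
        if window_open:
            wound = self.second_deployed
            self.second_deployed = 0.0
            # deploy the first display by the amount the second was wound
            self.first_deployed = min(1.0, self.first_deployed + wound)
        else:
            self.second_deployed = 1.0  # re-deploy when the window closes
```

The processor would invoke such logic whenever window open/closed state information arrives through the interface unit.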
Fig. 45 is a diagram referred to in explaining an operation of displaying an alarm on an A-pillar display or a rolling display, according to an embodiment of the present invention.
Referring to Fig. 45, the processor 470 may display an alarm on the A-pillar display 445 or the rolling display 446 by displaying a color.
For example, the processor 470 can display a predetermined color on the A-pillar display 445 in correspondence with the predicted time to collision with an object. For example, when the predicted collision time with the object is within a first range, the processor 470 may display a first color on the A-pillar display 445. Further, when the predicted collision time with the object is within a second range, the processor 470 can display a second color on the A-pillar display 445.
The processor 470 may display information in a state in which the predetermined color is displayed on the A-pillar display 445.
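The two-range color alarm of Fig. 45 can be sketched as a mapping from predicted time-to-collision to a display color. The specific range boundaries and colors below are illustrative assumptions; the specification only states that first and second ranges map to first and second colors.

```python
def alarm_color(ttc_seconds, first_range=(0.0, 1.5), second_range=(1.5, 3.0)):
    """Map predicted time-to-collision to an alarm color (Fig. 45).

    Range boundaries and color names are hypothetical examples.
    Returns None when the collision is far enough that no color alarm
    is shown.
    """
    lo1, hi1 = first_range
    lo2, hi2 = second_range
    if lo1 <= ttc_seconds < hi1:
        return "red"      # first color: imminent collision
    if lo2 <= ttc_seconds < hi2:
        return "orange"   # second color: warning
    return None
```

The returned color would be applied as the background of the A-pillar display, with any information drawn on top of it as described above.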
Figures 46A-46D are illustrations that are referenced to illustrate externally visible information on a transparent flexible display or a rolling display, in accordance with an embodiment of the present invention.
Referring to the drawings, the processor 470 may display information on the transparent flexible display 441 or on the rolling display 446. The processor 470 may display the information in a state in which the transparent flexible display 441 or the rolling display 446 is deployed. Here, the information can be displayed so as to be visible from outside the vehicle. The information may be information needed by a person outside the vehicle, rather than by a vehicle occupant.
For example, the processor 470 can display the information oriented to be read from outside the vehicle 100. In this case, when viewed from inside the vehicle, the information may appear horizontally mirrored.
As illustrated in the figure, the processor 470 can display information such as disabled parking permit information 4601, infant and toddler on-board information 4602, parking time information 4603, and predetermined-area access permission information 4604 and 4605 so as to be visible from outside the vehicle.
The present invention described above can be embodied as computer-readable code on a medium on which a program is recorded. The computer-readable medium includes all kinds of recording devices in which data readable by a computer system is stored. Examples of the computer-readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). The computer may also include the processor 180 or 470 or the controller 770. Accordingly, the above description should not be construed as limiting in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the scope of equivalents of the present invention are included in the scope of the present invention.

10: Vehicle
100: vehicle driving assist device

Claims (21)

A display apparatus for a vehicle, comprising:
a display unit including an A-pillar display disposed on an A-pillar and a rolling display that is deployably wound from the A-pillar toward a front windshield or a side window glass; and
a processor configured to control deployment of the rolling display and to control a screen displayed on the A-pillar display or the rolling display.
The display apparatus according to claim 1, further comprising an input unit,
wherein the processor controls the deployment of the rolling display according to an input signal received through the input unit.
The display apparatus according to claim 1, further comprising a length sensing unit configured to sense a degree of deployment of the rolling display,
wherein the processor controls a screen displayed on the rolling display according to the degree of deployment.
The display apparatus according to claim 3, wherein, when a deployed state of the rolling display is switched so that a display area is widened while first content is displayed on the rolling display,
the processor controls second content to be displayed on the rolling display while maintaining the display of the first content.
The display apparatus according to claim 3, wherein, when the deployed state of the rolling display is switched so that the display area is narrowed while first content is displayed on the rolling display,
the processor controls a portion of the first content corresponding to the narrowed area not to be displayed while maintaining the display of the first content.
The display apparatus according to claim 3, wherein the processor controls a screen displayed on the A-pillar display according to the degree of deployment of the rolling display.
The display apparatus according to claim 1, wherein the A-pillar display comprises a touch sensor, and
the processor controls the deployment of the rolling display, or controls a screen displayed on the A-pillar display or the rolling display, according to an input signal received through the touch sensor.
The display apparatus according to claim 1, further comprising an interface unit configured to receive object sensing information from a sensing unit that senses an object positioned in the vicinity of the vehicle,
wherein the processor controls an image corresponding to the object to be displayed on the A-pillar display or the rolling display in correspondence with the object sensing information.
The display apparatus according to claim 8, wherein the interface unit receives motion information of the object, and
the processor controls a display area of the image to be changed according to the motion information.
The display apparatus according to claim 9, wherein, when first motion information indicating that the object moves from the A-pillar toward the front windshield or the side window glass, with reference to the driver, is received,
the processor controls the image displayed on the A-pillar display to move to the rolling display in correspondence with the first motion information.
The display apparatus according to claim 9, wherein, when second motion information indicating that the object moves from the front windshield or the side window glass toward the A-pillar, with reference to the driver, is received,
the processor controls the image displayed on the rolling display to move to the A-pillar display in correspondence with the second motion information.
The display apparatus according to claim 1, wherein the rolling display comprises:
a first rolling display that is deployably wound from the A-pillar toward the front windshield; and
a second rolling display that is deployably wound from the A-pillar toward the side window glass.
The display apparatus according to claim 12, wherein the processor controls different contents to be displayed on the A-pillar display, the first rolling display, and the second rolling display, respectively.
The display apparatus according to claim 13, further comprising an interface unit configured to receive user gaze information from an internal camera,
wherein the processor displays content on the second rolling display when the user's gaze is directed toward the side window glass.
The display apparatus according to claim 13, further comprising an interface unit configured to receive window open/closed state information from a window driving unit,
wherein the processor controls the second rolling display to be wound when the window is opened.
The display apparatus according to claim 15, wherein the processor controls the second rolling display to be deployed again when the window is closed.
The display apparatus according to claim 15, wherein the processor controls the first rolling display to be deployed in correspondence with the degree to which the second rolling display is wound when the window is opened.
The display apparatus according to claim 13, further comprising an interface unit configured to receive weather information from a sensing unit that senses a weather condition and to receive vehicle rear image data from a camera,
wherein the processor displays an image corresponding to the vehicle rear image data on the second rolling display when it is raining or snowing.
The display apparatus according to claim 13, further comprising a communication unit configured to communicate with an external device,
wherein, when a message is received from the external device through the communication unit, the processor controls message reception information to be displayed on the A-pillar display, and
controls detailed content of the message to be displayed on the first rolling display when a user input is received or the vehicle is stopped.
The display apparatus according to claim 13, further comprising an interface unit configured to receive image data acquired through cameras disposed at the front, rear, left side, or right side of the vehicle,
wherein the second rolling display comprises a third rolling display that is deployably wound toward the left side window glass and a fourth rolling display that is deployably wound toward the right side window glass, and
the processor displays an image corresponding to the image data acquired through the camera disposed at the front on the first rolling display,
displays an image corresponding to the image data acquired through the camera disposed at the rear or left side on the third rolling display, and
controls an image corresponding to the image data acquired through the camera disposed at the rear or right side to be displayed on the fourth rolling display.
A vehicle comprising the display apparatus for a vehicle according to any one of claims 1 to 20.

KR1020150123757A 2014-12-10 2015-09-01 Display apparatus for vehicle and Vehicle including the same KR101809924B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
KR1020150123757A KR101809924B1 (en) 2015-09-01 2015-09-01 Display apparatus for vehicle and Vehicle including the same
PCT/KR2015/012227 WO2016093502A1 (en) 2014-12-10 2015-11-13 Vehicle display device and vehicle having same

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150123757A KR101809924B1 (en) 2015-09-01 2015-09-01 Display apparatus for vehicle and Vehicle including the same

Publications (2)

Publication Number Publication Date
KR20170027163A true KR20170027163A (en) 2017-03-09
KR101809924B1 KR101809924B1 (en) 2017-12-20

Family

ID=58402459

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150123757A KR101809924B1 (en) 2014-12-10 2015-09-01 Display apparatus for vehicle and Vehicle including the same

Country Status (1)

Country Link
KR (1) KR101809924B1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019124834A1 (en) * 2017-12-21 2019-06-27 삼성전자 주식회사 Method and device for controlling display on basis of driving context
WO2020032307A1 (en) * 2018-08-10 2020-02-13 엘지전자 주식회사 Vehicular display system
CN113085728A (en) * 2020-01-08 2021-07-09 Lg电子株式会社 Vehicle-mounted display device
KR20210089488A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for vehicle
KR20210089554A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for car
KR20210089553A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for car
CN114274773A (en) * 2021-12-24 2022-04-05 北京梧桐车联科技有限责任公司 Automobile display system and automobile
WO2023096402A1 (en) * 2021-11-25 2023-06-01 삼성전자 주식회사 Electronic device comprising flexible display and control method thereof

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102186703B1 (en) * 2019-04-05 2020-12-07 황광택 A roof imaging device for automobile displaying external environment
KR20230146165A (en) 2022-04-11 2023-10-19 연세대학교 산학협력단 Display apparatus and method for improving visibility in autonomous driving situation
KR20240019423A (en) 2022-08-03 2024-02-14 현대모비스 주식회사 Pillar display device for vehicle and method for controlling the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2003019B1 (en) 2007-06-13 2014-04-23 Aisin AW Co., Ltd. Driving assist apparatus for vehicle
US20100230193A1 (en) 2009-03-12 2010-09-16 Ford Global Technologies, Llc Plug-in vehicle function indication
DE112009005391T5 (en) 2009-11-19 2012-09-13 Lear Corporation Vehicle with flexible display device

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019124834A1 (en) * 2017-12-21 2019-06-27 삼성전자 주식회사 Method and device for controlling display on basis of driving context
EP3712001A4 (en) * 2017-12-21 2021-01-06 Samsung Electronics Co., Ltd. Method and device for controlling display on basis of driving context
US11718175B2 (en) 2017-12-21 2023-08-08 Samsung Electronics Co., Ltd Method and device for controlling display on basis of driving context
WO2020032307A1 (en) * 2018-08-10 2020-02-13 엘지전자 주식회사 Vehicular display system
KR20210089553A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for car
KR20210089554A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for car
KR20210089488A (en) * 2020-01-08 2021-07-16 엘지전자 주식회사 Display apparatus for vehicle
US11166388B2 (en) 2020-01-08 2021-11-02 Lg Electronics Inc. Display apparatus for vehicle
US11617276B2 (en) 2020-01-08 2023-03-28 Lg Electronics Inc. Display device for vehicle
CN113085728A (en) * 2020-01-08 2021-07-09 Lg电子株式会社 Vehicle-mounted display device
US11827096B2 (en) 2020-01-08 2023-11-28 Lg Electronics Inc. Display device for vehicle
CN113085728B (en) * 2020-01-08 2023-12-22 Lg电子株式会社 Vehicle-mounted display device
US11912128B2 (en) 2020-01-08 2024-02-27 Lg Electronics Inc. Display device for vehicle
WO2023096402A1 (en) * 2021-11-25 2023-06-01 삼성전자 주식회사 Electronic device comprising flexible display and control method thereof
CN114274773A (en) * 2021-12-24 2022-04-05 北京梧桐车联科技有限责任公司 Automobile display system and automobile

Also Published As

Publication number Publication date
KR101809924B1 (en) 2017-12-20

Similar Documents

Publication Publication Date Title
US10086762B2 (en) Vehicle display device and vehicle comprising same
KR101809924B1 (en) Display apparatus for vehicle and Vehicle including the same
KR101924059B1 (en) Display apparatus for vehicle and Vehicle including the same
KR101732983B1 (en) Rear combination lamp for vehicle and Vehicle including the same
KR101965881B1 (en) Driver assistance apparatus and Vehicle including the same
US10431086B2 (en) Vehicle, mobile terminal and method for controlling the same
KR101730315B1 (en) Electronic device and method for image sharing
KR101711835B1 (en) Vehicle, Vehicle operating method and wearable device operating method
KR102130800B1 (en) Vehicle control device and vehicle comprising the same
CN106240457B (en) Vehicle parking assistance device and vehicle
US9517776B2 (en) Systems, methods, and apparatus for controlling devices based on a detected gaze
KR101969805B1 (en) Vehicle control device and vehicle comprising the same
CN106945606A (en) Parking execution device and vehicle
CN107021017A (en) Vehicle provides device and vehicle with looking around
CN109484343B (en) Vehicle control apparatus mounted on vehicle and method for controlling vehicle
KR20190031057A (en) Driver assistance apparatus and vehicle
KR20160114486A (en) Mobile terminal and method for controlling the same
KR101736820B1 (en) Mobile terminal and method for controlling the same
KR101859043B1 (en) Mobile terminal, vehicle and mobile terminal link system
KR101807788B1 (en) Display apparatus for vehicle and control method for the same
EP4191204A1 (en) Route guidance device and route guidance method thereof
KR101916425B1 (en) Vehicle interface device, vehicle and mobile terminal link system
KR101781689B1 (en) Vitual image generating apparatus, head mounted display and vehicle
KR101892498B1 (en) Vehicle interface device, vehicle and mobile terminal link system
KR102121990B1 (en) Signage communicable with vehicle and method for controlling the signage

Legal Events

Date Code Title Description
A201 Request for examination
E701 Decision to grant or registration of patent right