KR20190031053A - Vehicle control method - Google Patents
Vehicle control method
- Publication number
- KR20190031053A (Application KR1020170118898A)
- Authority
- KR
- South Korea
- Prior art keywords
- vehicle
- occupant
- unit
- communication
- display
- Prior art date
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60K—ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
- B60K35/00—Arrangement of adaptations of instruments
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R11/00—Arrangements for holding or mounting articles, not otherwise provided for
- B60R11/02—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
- B60R11/0217—Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof for loud-speakers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- B60K2350/1052—
-
- B60K2350/352—
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
- B60W2050/146—Display means
Abstract
Description
The present invention relates to a vehicle control method of a vehicle control device mounted on a vehicle.
A vehicle is a means of transporting people or goods using kinetic energy. Typical examples of vehicles include automobiles and motorcycles.
For safety and convenience of a user who uses the vehicle, various sensors and devices are provided in the vehicle, and the functions of the vehicle are diversified.
The functions of a vehicle can be divided into convenience functions for the convenience of the driver and safety functions for the safety of the driver and/or pedestrians.
Convenience functions are motivated by driver convenience, such as providing an infotainment (information + entertainment) function in the vehicle, supporting a partially autonomous driving function, or assisting the driver's vision with night vision or blind-spot monitoring. Examples include active cruise control (ACC), smart parking assist system (SPAS), night vision (NV), head-up display (HUD), around view monitoring (AVM), and adaptive headlight system (AHS).
Safety functions secure the safety of the driver and/or pedestrians, and include the lane departure warning system (LDWS), the lane keeping assist system (LKAS), and autonomous emergency braking (AEB).
In current vehicles, the space between the front seats and the rear seats is separated, so communication between occupants is difficult because they do not face each other. There are also cases where the other party has difficulty understanding even when an occupant attempts to communicate within the vehicle, such as when the driver is concentrating on the driving situation or a passenger is listening to music. Therefore, there is a need to provide an environment in which communication can proceed smoothly when an occupant intends to communicate with other occupants in the vehicle. To this end, it is necessary to develop a vehicle control method that recognizes an occupant's intention and gesture to talk with another occupant and notifies the other party of the conversation.
The present invention aims to solve the following problems.
An object of the present invention is to provide a vehicle control method capable of facilitating communication in cases where communication is difficult, such as when the front and rear spaces of the vehicle are separated, when the driver is focused on the driving situation, or when the music volume is high.
It is an object of the present invention to provide a vehicle control method in which a driver can block communication and concentrate on driving for safe driving of the vehicle.
It is an object of the present invention to provide a vehicle control method capable of smoothly communicating with a passenger who is focused on other actions and does not recognize direct communication.
The present invention provides a vehicle control method for a vehicle control device mounted on a vehicle equipped with a sensor, the method comprising: receiving sensing information from the sensor that senses the interior of the vehicle; searching, using the sensing information, for at least one of predetermined gestures defined as a first occupant calling a second occupant; and, when at least one of the gestures is searched, controlling the vehicle such that elements affecting communication of the first and second occupants satisfy a reference condition.
In one embodiment, the method may further include controlling the display so that a guidance message corresponding to the searched gesture is output on a display of the vehicle.
In one embodiment, the method may further include controlling the vehicle so that the elements affecting the communication of the first and second occupants satisfy the reference condition when the second occupant starts to speak in response to the guidance message.
In one embodiment, the method may further include, when the second occupant does not speak in response to the guidance message, waiting until an utterance of the second occupant is sensed, and thereafter controlling the vehicle in response to the utterance being sensed.
In one embodiment, a plurality of speakers are provided in the vehicle, and the step of controlling the vehicle so that the elements affecting the communication satisfy the reference condition includes selecting at least one speaker based on the boarding positions of the first and second occupants.
In one embodiment, the method may further include adjusting the volume of the selected speaker while maintaining the volume of the unselected speakers.
In one embodiment, the method may further include lowering the volume of the selected speaker while an utterance of at least one of the first and second occupants is sensed, and restoring the volume to its original state when no utterance is sensed.
In one embodiment, the step of controlling the vehicle so that the elements affecting the communication satisfy the reference condition includes controlling a display provided in the vehicle so that an image photographing at least one of the first and second occupants is displayed.
In one embodiment, when the second occupant is the driver of the vehicle and the driver has previously set the guidance message not to be output, the method may include controlling the display so that the guidance message is not output even if at least one of the predetermined gestures is searched.
In one embodiment, when the second occupant is using a mobile device, the method may include transmitting an output command using the communication device of the vehicle so that the guidance message is output on a display of the mobile device.
The effect of the vehicle control method according to the present invention will be described below.
The vehicle control method can search for gestures in the interior information of the vehicle sensed by a sensor in a situation where communication among occupants is difficult, and use the searched gestures to control the vehicle so that communication proceeds smoothly. Specifically, when a gesture defined as indicating that an occupant intends to communicate with another occupant is sensed, a communication environment can be provided in the vehicle using the speakers and the display of the vehicle.
In addition, the vehicle control method may be configured to prevent a guidance message from being output for safe operation when the other party requested to talk is the driver. Specifically, even if a gesture defined as having an intention to communicate with a driver is sensed, it is possible to provide an environment in which an alarm is not displayed so that the driver can concentrate on driving.
FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
FIG. 2 is a view showing a vehicle according to an embodiment of the present invention viewed from various angles.
FIGS. 3 to 4 are views showing the inside of the vehicle according to an embodiment of the present invention.
FIGS. 5 to 6 are diagrams for explaining an object according to an embodiment of the present invention.
FIG. 7 is a block diagram of a vehicle according to an embodiment of the present invention.
FIG. 8 is a flowchart for explaining a vehicle control method according to an embodiment of the present invention.
FIGS. 9, 10 and 11 are exemplary diagrams for explaining a vehicle control method according to an embodiment of the present invention.
FIGS. 12, 13 and 14 are flowcharts for explaining a vehicle control method according to an embodiment of the present invention.
FIG. 15 is an exemplary diagram for explaining a vehicle control method according to an embodiment of the present invention.
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings. Like reference numerals are used to designate identical or similar elements, and redundant description thereof is omitted. The suffixes "module" and "unit" for components used in the following description are given or used interchangeably only for ease of description, and do not themselves have distinct meanings or roles. In describing the embodiments, a detailed description of related known art is omitted when it is determined that it would obscure the gist of the embodiments disclosed herein. The accompanying drawings are intended to facilitate understanding of the embodiments disclosed herein, and the technical idea disclosed herein is not limited by the accompanying drawings but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the present invention.
Terms including ordinals, such as first and second, may be used to describe various elements, but the elements are not limited by these terms. The terms are used only to distinguish one component from another.
It is to be understood that when an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, it should be understood that there are no intervening elements.
The singular expressions include plural expressions unless the context clearly dictates otherwise.
In the present application, the terms "comprises", "having", and the like are used to specify the presence of stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
The vehicle described herein may be a concept including a car and a motorcycle. Hereinafter, the description will be given mainly with respect to a car.
The vehicle described in the present specification may be a concept including both an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
In the following description, the left side of the vehicle means the left side in the running direction of the vehicle, and the right side of the vehicle means the right side in the running direction of the vehicle.
FIG. 1 is a view showing the appearance of a vehicle according to an embodiment of the present invention.
FIG. 2 is a view of a vehicle according to an embodiment of the present invention viewed from various angles.
FIGS. 3 to 4 are views showing the interior of a vehicle according to an embodiment of the present invention.
FIGS. 5 to 6 are drawings referred to for explaining an object according to an embodiment of the present invention.
FIG. 7 is a block diagram for explaining a vehicle according to an embodiment of the present invention.
Referring to FIGS. 1 to 7, a vehicle according to an embodiment of the present invention will be described.
Here, the autonomous driving is defined as controlling at least one of acceleration, deceleration, and driving direction based on a predetermined algorithm. In other words, this means that the driving operation device is automatically operated even if no user input is inputted to the driving operation device.
The overall length means the length from the front portion to the rear portion of the vehicle.
Referring to FIG. 7, the components of the vehicle will be described below.
The user interface device 200 is a device for communication between the vehicle and a user.
The user interface device 200 may include an input unit, an output unit, and a processor.
According to the embodiment, the user interface device 200 may further include other components than the components described, or may not include some of the components described.
The input unit 210 is for receiving information from a user. The data collected by the input unit 210 may be analyzed by the processor and processed as a control command of the user.
The input unit 210 can be disposed inside the vehicle. For example, the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of the seat, one area of each pillar, one area of the head console, one area of the door, one area of the center console, one area of the head lining, one area of the sun visor, one area of the windshield, one area of the window, or the like.
The input unit 210 may include a voice input unit 211, a gesture input unit 212, a touch input unit 213, and a mechanical input unit 214.
The voice input unit 211 can convert the voice input of the user into an electrical signal. The converted electrical signal may be provided to the processor or the controller.
The voice input unit 211 may include one or more microphones.
The mechanical input unit 214 may include at least one of a button, a dome switch, a jog wheel, and a jog switch. The electrical signal generated by the mechanical input unit 214 may be provided to the processor or the controller.
The mechanical input unit 214 may be disposed on a steering wheel, a center fascia, a center console, a cockpit module, a door, or the like.
The output unit 250 is for generating an output related to a visual, auditory or tactile sense or the like.
The output unit 250 may include at least one of a display unit 251, an audio output unit 252, and a haptic output unit 253.
The display unit 251 may display graphic objects corresponding to various information.
The display unit 251 may be a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a 3D display, and an e-ink display.
The display unit 251 may have a mutual layer structure with the touch input unit 213 or may be integrally formed therewith to implement a touch screen.
The display unit 251 may be implemented as a Head Up Display (HUD). When the display unit 251 is implemented as an HUD, the display unit 251 may include a projection module to output information through an image projected on a windshield or a window.
The display unit 251 may include a transparent display. The transparent display may be attached to the windshield or window.
The transparent display can display a predetermined screen while having a predetermined transparency. The transparent display can be made of a transparent TFEL (thin-film electroluminescent) display, a transparent OLED (organic light-emitting diode) display, a transparent LCD (liquid crystal display), a transmissive transparent display, a transparent LED (light-emitting diode) display, or the like. The transparency of the transparent display can be adjusted.
Meanwhile, the user interface device 200 may include a plurality of display units.
The display unit 251 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of the seat, one area of each pillar, one area of the door, one area of the center console, one area of the head lining, or one area of the sun visor, or may be implemented in one area of the windshield or one area of the window.
The audio output unit 252 converts an electrical signal provided from the processor into an audio signal and outputs the audio signal. To this end, the audio output unit 252 may include one or more speakers.
In accordance with an embodiment, the user interface device 200 may include a plurality of processors or may not include a processor.
If the user interface device 200 does not include a processor, the user interface device 200 may be operated under the control of a processor of another apparatus in the vehicle or under the control of the controller.
On the other hand, the user interface device 200 may be referred to as a vehicle display device.
The user interface device 200 may be operated under the control of the controller.
The object may be various objects related to the operation of the vehicle.
Referring to FIGS. 5 to 6, the object O may include a lane OB10, another vehicle OB11, a pedestrian OB12, a two-wheeled vehicle OB13, traffic signals OB14 and OB15, light, a road, a structure, a speed bump, terrain, an animal, and the like.
The lane OB10 may be a driving lane, a lane next to the driving lane, or a lane on which an oncoming vehicle travels. The lane OB10 may be a concept including the left and right lines forming the lane.
The other vehicle OB11 may be a vehicle traveling in the vicinity of the vehicle.
The pedestrian OB12 may be a person located in the vicinity of the vehicle. For example, the pedestrian may be a person located on a sidewalk or a roadway.
The two-wheeled vehicle OB13 may mean a vehicle located around the vehicle and moving on two wheels. For example, the two-wheeled vehicle may be a motorcycle or a bicycle.
The traffic signal may include a traffic light (OB15), a traffic sign (OB14), a pattern drawn on the road surface, or text.
The light may be light generated from lamps provided in other vehicles. Light can be light generated from a street light. Light can be solar light.
The road may include a slope such as a road surface, a curve, an uphill, a downhill, and the like.
The structure may be an object located around the road and fixed to the ground. For example, the structure may include street lamps, roadside trees, buildings, electric poles, traffic lights, and bridges.
The terrain may include mountains, hills, and the like.
On the other hand, an object can be classified into a moving object and a fixed object. For example, the moving object may be a concept including another vehicle and a pedestrian. For example, the fixed object may be a concept including a traffic signal, a road, and a structure.
According to the embodiment, the light emitting portion may be formed so as to be integrated with a lamp included in the vehicle.
One or more processors and the controller included in the vehicle may be implemented using at least one of application-specific integrated circuits (ASICs), digital signal processors (DSPs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and electrical units for performing other functions.
There is a need for a vehicle control method capable of facilitating communication when communication is difficult, such as when the front and rear spaces of the vehicle are separated, when an occupant is concentrating on the driving situation, or when the music volume is high. For this purpose, it is necessary to recognize an occupant's intention to communicate with another occupant and to convey that intention to the other party.
Hereinafter, the vehicle control method will be described in detail.
FIG. 8 is a flowchart illustrating a vehicle control method according to an embodiment of the present invention. FIGS. 12, 13, and 14 are flowcharts for explaining a vehicle control method according to an embodiment of the present invention.
Referring to FIG. 8, the vehicle control method is performed by a vehicle control device mounted on the vehicle.
The vehicle control device receives sensing information from the sensor that senses the interior of the vehicle (S810).
Using the sensing information, the vehicle control device searches for at least one of predetermined gestures defined as a first occupant calling a second occupant (S820).
The predetermined gesture may take various forms, such as a behavior, a call by name, or a gaze, and gestures may be added or subdivided through learning. Specifically, the first occupant's intention to talk may appear as a gesture of touching the seat of the person to talk to or beckoning toward that person. Also, the purpose of calling the other party can be recognized, and communicated to the other party, according to the number or speed of the touches or beckonings. More specifically, when a child in the back seat touches the driver's seat to call a parent in the driver's seat, touching many times at high speed can be recognized as indicating an urgent situation. Accordingly, the urgency can be communicated to the parent in the driver's seat using the display and the speaker. Such behavior recognition can be predefined, and data can be collected and recognized through learning.
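The touch-count and touch-rate recognition described above can be expressed as a small sketch. This is an illustrative example only, not part of the patent text; the seat names, the time window, and the urgency threshold are assumptions chosen for illustration:

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    seat: str          # seat that was touched, e.g. "driver"
    timestamp: float   # seconds since some reference point

def classify_touch_gesture(events, window=3.0, urgent_count=3):
    """Classify a run of seat touches as a call gesture.

    Returns (target_seat, urgency) or None if no touches were sensed.
    Many touches in a short window are treated as urgent, mirroring
    the child-touching-the-driver's-seat example in the text.
    """
    if not events:
        return None
    seat = events[-1].seat
    # Count only recent touches on the same seat.
    recent = [e for e in events
              if e.seat == seat and events[-1].timestamp - e.timestamp <= window]
    urgency = "urgent" if len(recent) >= urgent_count else "normal"
    return seat, urgency

events = [TouchEvent("driver", t) for t in (0.0, 0.5, 1.0, 1.4)]
print(classify_touch_gesture(events))  # ('driver', 'urgent')
```

A learned recognizer would replace the fixed thresholds with parameters fitted to collected data, as the text suggests.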
Also, the first occupant's intention to communicate can be recognized from a call by name. Specifically, when the first occupant calls another occupant's name or nickname, it can be recognized that the first occupant intends to communicate with that occupant. In this case, the vehicle recognizes occupant information through face recognition of the passengers on board or through another sensing method, such as one based on in-vehicle terminal data, and then, when the first occupant speaks a registered name, determines that there is an intention to talk with the occupant so called. More specifically, when a child in the back seat makes an utterance such as "Dad, I need the toilet urgently", the vehicle can recognize that the child intends to talk to the parent in the driver's seat.
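The name-call recognition above might be sketched as a lookup against an occupant registry. The registry, names, and seat labels below are hypothetical; a real system would build the registry from face recognition or in-vehicle terminal data as the text describes:

```python
# Hypothetical occupant registry: names/nicknames mapped to seats.
OCCUPANTS = {
    "dad": "driver",
    "father": "driver",
    "mom": "passenger",
    "minji": "rear_left",
}

def find_called_occupant(utterance: str):
    """Return the seat of the occupant whose name or nickname appears
    in the utterance, or None if no registered name is called."""
    words = utterance.lower().replace(",", " ").split()
    for word in words:
        if word in OCCUPANTS:
            return OCCUPANTS[word]
    return None

print(find_called_occupant("Dad, I need the toilet urgently"))  # driver
```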
In addition, the intention to communicate can be recognized through eye tracking of the first occupant. Specifically, when the first occupant gazes at the person he or she wants to communicate with, the vehicle can recognize the intention to communicate and identify the other party.
If at least one of the gestures is searched, the vehicle may be controlled so that the factors affecting the communication of the first and second occupants satisfy the reference condition (S830). The factors affecting the communication are elements that make conversation difficult, such as a large speaker volume at the participants' seats, the participants not facing each other, or one party concentrating on something else, such as sleeping or using a mobile device. The vehicle controls these factors so that communication can proceed smoothly.
Specifically, when at least one of the first occupant's gestures is searched, the vehicle may lower the volume of the speaker located at the seat of the person selected as the conversation partner. Since a high speaker volume may hinder communication, the vehicle lowers the speaker volume to create an environment suitable for conversation.
Further, when at least one of the first occupant's gestures is searched, the vehicle may display an image of each participant on a display based on the boarding positions of the people selected for the conversation. This is to create an environment suitable for communication, as if the participants were facing each other.
FIG. 9 is an exemplary diagram for explaining a vehicle control method according to an embodiment of the present invention.
As described above, when the vehicle recognizes the first occupant's intention to talk using the sensor, the vehicle can display information on the display or adjust the volume. For example, a plurality of speakers may be provided in the vehicle, and each speaker can be adjusted for the occupant of each seat: the driver's seat, the passenger seat, and the rear seats can each be adjusted separately. When communication is carried out in the vehicle, only the speakers located at the seats of the communicating occupants are adjusted, and the speakers at the seats of occupants not communicating are maintained. This is so as not to interfere with activities, such as listening to music, of occupants not participating in the communication.
To this end, when recognizing the communication intention of the first occupant, the vehicle may select at least one speaker based on the boarding positions of the people selected as participants in the conversation.
In order to facilitate communication, the vehicle may adjust the volume of the selected speaker while maintaining the volume of the unselected speakers.
The volume of the speaker may be adjusted depending on the environment. Specifically, while an utterance of at least one of the first and second occupants is sensed, the vehicle lowers the volume of the selected speaker, and when no utterance is sensed, the vehicle restores the volume to its original state.
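The per-seat speaker behavior in the preceding paragraphs (lower only the selected speakers, keep the others, restore on silence) can be sketched as follows. The seat names, the 0-10 volume scale, and the API are illustrative assumptions, not an actual vehicle interface:

```python
class CabinAudio:
    """Minimal sketch of per-seat speaker volume control."""

    def __init__(self, seats):
        self.volume = {seat: 8 for seat in seats}  # 0-10 scale
        self.saved = {}

    def enter_talk_mode(self, talking_seats, talk_volume=2):
        # Lower only the speakers at the seats of the people talking;
        # speakers at other seats keep their current volume.
        for seat in talking_seats:
            self.saved[seat] = self.volume[seat]
            self.volume[seat] = min(self.volume[seat], talk_volume)

    def exit_talk_mode(self):
        # Restore the saved volumes once no utterance is sensed.
        for seat, vol in self.saved.items():
            self.volume[seat] = vol
        self.saved.clear()

audio = CabinAudio(["driver", "passenger", "rear_left", "rear_right"])
audio.enter_talk_mode(["driver", "rear_left"])
print(audio.volume)  # driver and rear_left lowered to 2, others still 8
```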
Further, the vehicle may be provided with a plurality of displays. For example, a cluster, a HUD, a room mirror display, and a rear seat display may be provided. When the intention to communicate is recognized, the vehicle may select a display based on the boarding position of each occupant participating in the communication.
In order to facilitate communication, the vehicle may control the selected display so that an image photographing at least one of the first and second occupants is displayed on it.
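The display selection based on boarding positions can be sketched as a simple routing table that shows each occupant the camera image of the other. The display and camera names below are hypothetical, not taken from the patent:

```python
# Hypothetical mapping from boarding position to the nearest display
# and to the cabin camera that captures that seat.
DISPLAY_FOR_SEAT = {"driver": "cluster", "passenger": "center",
                    "rear_left": "rear_left_display",
                    "rear_right": "rear_right_display"}
CAMERA_FOR_SEAT = {"driver": "cabin_cam_front", "passenger": "cabin_cam_front",
                   "rear_left": "cabin_cam_rear", "rear_right": "cabin_cam_rear"}

def route_video(first_seat, second_seat):
    """Show each occupant the camera image of the other party, on the
    display closest to their own boarding position."""
    return {
        DISPLAY_FOR_SEAT[first_seat]: CAMERA_FOR_SEAT[second_seat],
        DISPLAY_FOR_SEAT[second_seat]: CAMERA_FOR_SEAT[first_seat],
    }

print(route_video("rear_left", "driver"))
# {'rear_left_display': 'cabin_cam_front', 'cluster': 'cabin_cam_rear'}
```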
FIG. 10 is an exemplary diagram for explaining a vehicle control method according to an embodiment of the present invention.
The vehicle may control the display so that a guidance message corresponding to the searched gesture is output on the display of the vehicle.
In this case, when the utterance of the second occupant is sensed (S1230) and the second occupant thus starts speaking in response to the guidance message, the vehicle can be controlled so that the elements affecting the communication of the first and second occupants satisfy the reference condition (S1240).
If the second occupant does not speak in response to the guidance message, the vehicle waits until the second occupant's utterance is sensed (S1250), and thereafter controls itself in response to the utterance being sensed. Specifically, if a gesture of an occupant seated behind the driver's seat requesting to talk with the driver is sensed, a guidance message may be output to the driver. The guidance message may be output on the display, as described above, or may be output as a voice through the speaker.
At this time, the driver can speak in response to the guidance message. When the utterance starts, it is determined that the driver intends to communicate, and the vehicle is controlled so that communication proceeds smoothly. On the other hand, when the driver does not speak, it is determined that there is no intention to communicate yet, and the vehicle can wait until there is an utterance. This situation can be notified to the first occupant who made the gesture indicating the intention to communicate; the notification may be output as a message on a display or as a voice through a speaker.
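The wait-for-utterance step (S1250) can be sketched as a simple polling loop. The `utterance_sensed` callback stands in for the vehicle's microphone sensing and is an assumption for illustration:

```python
import time

def await_response(utterance_sensed, timeout=10.0, poll=0.5):
    """Poll the (hypothetical) utterance sensor until the called
    occupant speaks or the timeout expires. Returns True when talk
    mode should be entered, False when the wait was abandoned."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if utterance_sensed():
            return True
        time.sleep(poll)
    return False

# Simulated sensor: the occupant answers on the third poll.
answers = iter([False, False, True])
print(await_response(lambda: next(answers), timeout=5.0, poll=0.01))  # True
```

A production system would be event-driven rather than polling, but the control flow — proceed on utterance, otherwise keep waiting — is the same.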
FIG. 11 is a diagram for explaining a vehicle control method according to an embodiment of the present invention.
When the second occupant is using a mobile device, the vehicle may transmit an output command using the communication device of the vehicle so that the guidance message is output on the display of the mobile device.
FIG. 15 is an exemplary diagram for explaining a vehicle control method according to an embodiment of the present invention.
If the person requested to talk is the driver, the driver may need to focus on driving rather than on communication for safety reasons, so it is necessary to decide whether to convey the occupant's intention according to the driving environment. Specifically, when the second occupant is the driver of the vehicle and the driver has previously set the guidance message not to be output, the display may be controlled so that the guidance message is not output even if at least one of the predetermined gestures is searched.
For example, in driving environments such as sharp curves, accident areas, icy roads, heavy rain, or fog, the driver needs to restrict communication for safe operation of the vehicle. In such cases, even if a gesture indicating the first occupant's intention to communicate is detected, it is necessary to create an environment in which the driver can concentrate on driving without receiving an alarm. Accordingly, when the driver needs to concentrate on driving, as in the danger zones described above, and careless elements must be blocked, the driver can set in advance that the guidance message is not to be output, lowering the priority of conveying the occupant's intention.
At this time, a message indicating that communication is not possible because the driver must concentrate on driving may be displayed to the first occupant who expressed the intention to communicate. The message may be displayed on a display based on the boarding position of the first occupant, or may be output as a voice through a speaker.
If the driver becomes able to communicate, outside a situation requiring attention to driving, a message informing the first occupant that communication is possible may be displayed again. The message may be displayed on a display based on the boarding position of the first occupant, or may be output as a voice through a speaker. For the message to be displayed to the first occupant, the driver may release the setting that prevents the guidance message from being output, or the communication environment may be created automatically when the vehicle determines, through sensing and learning, that the driver is able to communicate.
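The decision of whether to show the guidance message when the called party is the driver can be sketched as a simple gate. The parameter names and the hazard conditions they model (sharp curves, icy roads, heavy rain, fog) are illustrative assumptions:

```python
def should_notify_driver(gesture_detected, target_is_driver,
                         driver_dnd, hazardous_conditions):
    """Decide whether the call guidance message may be shown.

    `driver_dnd` is the driver's stored preference to suppress
    messages; `hazardous_conditions` models situations such as sharp
    curves, icy roads, heavy rain, or fog.
    """
    if not gesture_detected:
        return False
    if target_is_driver and (driver_dnd or hazardous_conditions):
        # Suppress the guidance message so the driver can focus;
        # the caller is instead told the driver cannot talk right now.
        return False
    return True

print(should_notify_driver(True, True, driver_dnd=True, hazardous_conditions=False))   # False
print(should_notify_driver(True, False, driver_dnd=True, hazardous_conditions=False))  # True
```

Note that the setting only gates messages addressed *to* the driver; calls between other occupants are unaffected.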
Through the above-described methods, it is possible to provide a vehicle control method capable of facilitating communication in cases where communication is difficult, such as when the front and rear spaces of the vehicle are separated or when the driver must focus on the driving situation.
The above-described present invention can be implemented as computer-readable code (or an application or software) on a medium on which a program is recorded. The control method of the above-described autonomous vehicle can be realized by code stored in a memory or the like.
The computer readable medium includes all kinds of recording devices in which data that can be read by a computer system is stored. Examples of the computer readable medium include a hard disk drive (HDD), a solid state disk (SSD), a silicon disk drive (SDD), a ROM, a RAM, a CD-ROM, a magnetic tape, a floppy disk, and an optical data storage device, and the medium may also be implemented in the form of a carrier wave (e.g., transmission over the Internet). In addition, the computer may include a processor or a control unit. Accordingly, the above description should not be construed as limiting in any respect and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.
Claims (10)
Receiving sensing information from the sensor that senses the interior of the vehicle;
Searching, using said sensing information, for at least one of predetermined gestures by which a first occupant calls a second occupant; and
Controlling said vehicle such that elements that affect communication between said first and second occupants satisfy a reference condition when at least one of said gestures is detected.
And controlling the display such that a guidance message corresponding to the searched gesture is output on the display of the vehicle.
When the second occupant starts speaking in response to the guidance message,
And controlling the vehicle such that the elements affecting the communication of the first and second occupants meet the reference condition.
When the second occupant does not speak in response to the guidance message,
Waiting until an utterance of the second occupant is sensed, and thereafter controlling the vehicle in response to the sensing of the utterance of the second occupant.
A plurality of speakers are provided in the vehicle,
Wherein the step of controlling the vehicle such that the factors affecting the communication satisfy the reference condition,
And selecting at least one speaker based on the boarding position of the first and second passengers.
Adjusting the volume of the selected speaker, and maintaining the volume of the unselected speaker.
After the volume of the selected speaker is adjusted,
Further comprising the step of controlling the vehicle to restore the volume of the selected speaker to its original state if an utterance of at least one of the first and second occupants is not sensed.
Wherein the step of controlling the vehicle such that the factors affecting the communication satisfy the reference condition,
And controlling the display so that an image of at least one of the first and second passengers is displayed on a display of the vehicle.
When the second occupant is the driver of the vehicle and the driver has previously set the guide message not to be output,
Further comprising controlling the display such that the guidance message is not output even if at least one of the predetermined gestures is detected.
When the second occupant is using the mobile device,
Further comprising transmitting an output command using the communication device of the vehicle such that the guidance message is output on the display of the mobile device.
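The speaker-related steps recited in the claims above (selecting speakers based on the occupants' boarding positions, adjusting their volume while leaving the others unchanged, and restoring the volume when no utterance follows) could be sketched as follows. This is an illustrative sketch with hypothetical data structures, not the claimed implementation.

```python
# Hypothetical sketch of the claimed speaker control: lower only the
# speakers in the zones of the two occupants' boarding positions, then
# restore them if no utterance is sensed afterwards.

def select_speakers(speakers, pos_a, pos_b):
    """Select speakers whose zone matches either occupant's boarding position."""
    return [s for s in speakers if s["zone"] in (pos_a, pos_b)]

def adjust_for_conversation(speakers, pos_a, pos_b, lowered=2):
    """Lower the selected speakers and remember their original volumes."""
    selected = select_speakers(speakers, pos_a, pos_b)
    saved = {s["id"]: s["volume"] for s in selected}
    for s in selected:
        s["volume"] = lowered  # unselected speakers keep their volume
    return saved

def restore(speakers, saved):
    """If no utterance is sensed, restore the selected speakers' volume."""
    for s in speakers:
        if s["id"] in saved:
            s["volume"] = saved[s["id"]]

cabin = [{"id": 1, "zone": "front", "volume": 8},
         {"id": 2, "zone": "rear", "volume": 8}]
saved = adjust_for_conversation(cabin, "front", "rear")
restore(cabin, saved)  # no utterance sensed: volumes go back
print([s["volume"] for s in cabin])  # [8, 8]
```

Keeping the saved volumes keyed by speaker id is what makes the "restore to the original state" step in the claim straightforward.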
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020170118898A KR102023995B1 (en) | 2017-09-15 | 2017-09-15 | Vehicle control method |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20190031053A (en) | 2019-03-25 |
KR102023995B1 (en) | 2019-09-23 |
Family
ID=65907991
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020170118898A KR102023995B1 (en) | 2017-09-15 | 2017-09-15 | Vehicle control method |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR102023995B1 (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004177315A (en) * | 2002-11-28 | 2004-06-24 | Alpine Electronics Inc | Apparatus for detecting direction of line of vision, dialog system using it, and driving support system |
JP2007074081A (en) * | 2005-09-05 | 2007-03-22 | Denso Corp | On-vehicle communication apparatus |
JP2010023639A (en) * | 2008-07-18 | 2010-02-04 | Kenwood Corp | In-cabin conversation assisting device |
JP2015071320A (en) * | 2013-10-01 | 2015-04-16 | アルパイン株式会社 | Conversation support device, conversation support method, and conversation support program |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111152790A (en) * | 2019-12-29 | 2020-05-15 | 的卢技术有限公司 | Multi-device interactive vehicle-mounted head-up display method and system based on use scene |
CN111152790B (en) * | 2019-12-29 | 2022-05-24 | 的卢技术有限公司 | Multi-device interactive vehicle-mounted head-up display method and system based on use scene |
Also Published As
Publication number | Publication date |
---|---|
KR102023995B1 (en) | 2019-09-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E90F | Notification of reason for final refusal | ||
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |