CN113022579A - Driving assisting method and vehicle-mounted device - Google Patents

Driving assisting method and vehicle-mounted device

Info

Publication number
CN113022579A
Authority
CN
China
Prior art keywords
target object
dynamic
judged
received
dynamic target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110400588.XA
Other languages
Chinese (zh)
Inventor
蔡至杰
郑竣中
黄腾莹
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
AU Optronics Corp
Original Assignee
AU Optronics Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by AU Optronics Corp
Publication of CN113022579A

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08 Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • B60W40/09 Driving style or behaviour
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Human Computer Interaction (AREA)
  • Traffic Control Systems (AREA)
  • Steering Control In Accordance With Driving Conditions (AREA)

Abstract

A driving assistance method and a vehicle-mounted device are provided. A dynamic sensor is mounted on the vehicle-mounted device. When it is determined that an operation command is received, the position of a dynamic target object is determined through the dynamic sensor. Whether to execute the operation corresponding to the operation command is then determined based on the position of the dynamic target object.

Description

Driving assisting method and vehicle-mounted device
Technical Field
The present invention relates to a vehicle-mounted device and an assistance method thereof, and more particularly to a driving assistance method and a vehicle-mounted device having a dynamic sensor.
Background
With the development of technology, the vehicle-mounted device has grown from simply providing music playback and basic vehicle information into a multi-function platform. Micro-Electro-Mechanical Systems (MEMS) allow key components to be made ever smaller while computing performance rises sharply, so the vehicle-mounted device can integrate the formerly independent audio-visual system, navigation system, driving recorder, and active/passive driving safety systems, and its application functions continue to diversify. One of the resulting issues is how to use the vehicle-mounted device to improve driving safety.
Disclosure of Invention
The invention provides a driving assistance method and a vehicle-mounted device that can improve driving safety.
The driving assistance method of the invention includes the following steps: determining whether an operation command is received; when it is determined that the operation command is received, determining the position of a dynamic target object through a dynamic sensor; and determining whether to execute the operation corresponding to the operation command based on the position of the dynamic target object.
In an embodiment of the invention, the step of determining whether the operation command is received includes determining whether the operation command is received from a console panel.
In an embodiment of the invention, after the step of determining whether the operation command is received from the console panel, the method further includes: when the operation command is received from the console panel, determining whether the current vehicle speed is greater than a first preset value; when the current vehicle speed is greater than the first preset value, determining the position of the dynamic target object through the dynamic sensor; and when the current vehicle speed is not greater than the first preset value, directly executing the operation corresponding to the operation command.
In an embodiment of the invention, when the current vehicle speed is greater than the first preset value, the method includes: executing the operation corresponding to the operation command when the dynamic target object is determined, based on its position, to be the front-seat passenger; and refusing to execute the operation corresponding to the operation command when the dynamic target object is determined not to be the front-seat passenger.
In an embodiment of the invention, the step of determining whether the operation command is received includes determining whether the operation command is received from a brake.
In an embodiment of the invention, after the step of determining whether the operation command is received from the brake, the method further includes: when the operation command is received from the brake, determining whether the current vehicle speed is less than a second preset value; when the current vehicle speed is less than the second preset value, determining the position of the dynamic target object through the dynamic sensor; and refusing to execute the operation corresponding to the operation command when the current vehicle speed is not less than the second preset value.
In an embodiment of the invention, when the current vehicle speed is less than the second preset value, the method includes: executing the operation corresponding to the operation command when the dynamic target object is determined, based on its position, to be the driver; and refusing to execute the operation corresponding to the operation command when the dynamic target object is determined not to be the driver.
In an embodiment of the invention, the driving assistance method further includes: detecting, through the dynamic sensor, whether the driver's line of sight turns to a designated direction; when it is detected that the line of sight turns to the designated direction, determining through the dynamic sensor whether a designated area is occluded; and issuing a warning signal when the designated area is determined to be occluded.
The vehicle-mounted device of the invention includes a dynamic sensor and a processor coupled to the dynamic sensor. The processor is configured to: determine whether an operation command is received; when it is determined that the operation command is received, determine the position of a dynamic target object through the dynamic sensor; and determine whether to execute the operation corresponding to the operation command based on the position of the dynamic target object.
Another driving assistance method of the invention includes: detecting, through a dynamic sensor, whether the driver's line of sight turns to a designated direction; when it is detected that the line of sight turns to the designated direction, determining through the dynamic sensor whether a designated area is occluded; and issuing a warning signal when the designated area is determined to be occluded.
Based on the above, the dynamic sensor is used to determine the operator's action, so as to decide whether to execute the corresponding operation, thereby improving driving safety.
Drawings
Fig. 1 is a block diagram of a vehicle-mounted device according to an embodiment of the invention.
Fig. 2 is a flowchart of a driving assistance method according to an embodiment of the invention.
Fig. 3 is a flowchart of a driving assistance method according to an embodiment of the invention.
Fig. 4 is a schematic diagram of operating the console panel according to an embodiment of the invention.
Fig. 5 is a flowchart of a driving assistance method according to an embodiment of the invention.
Fig. 6 is a flowchart of a driving assistance method according to an embodiment of the invention.
Description of reference numerals:
100: vehicle-mounted device
110: dynamic sensor
111: image recognition unit
113: judging unit
120: processor with a memory having a plurality of memory cells
130: center console panel
140: control module
410: right side range
U: hand (W.E.)
S205 to S215: each step of the driving assistance method of an embodiment of the present invention
S305 to S325: each step of the driving assistance method of an embodiment of the present invention
S505 to S525: each step of the driving assistance method of an embodiment of the present invention
S605 to S615: each step of the driving assistance method of an embodiment of the present invention
Detailed Description
Fig. 1 is a block diagram of a vehicle-mounted device according to an embodiment of the invention. Referring to Fig. 1, the vehicle-mounted device 100 includes a dynamic sensor 110, a processor 120, a console panel 130, and a control module 140. The processor 120 is coupled to the dynamic sensor 110, the console panel 130, and the control module 140.
In the present embodiment, the dynamic sensor 110 is an image capture device for acquiring images, such as a camera using a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) image sensor.
The dynamic sensor 110 further includes an image recognition unit 111 and a determination unit 113. In practice, the image recognition unit 111 and the determination unit 113 may be independent computing chips, or they may be composed of program code segments executed by the control chip of the dynamic sensor 110.
After an image is acquired, the dynamic sensor 110 analyzes and recognizes it through the image recognition unit 111 to find the dynamic target object in the image. The determination unit 113 then determines the position of the dynamic target object.
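For illustration, this two-stage split can be sketched in Python as follows. The class names, the dict-shaped frame, and the normalized-coordinate thresholds are assumptions made for the sketch; the patent does not specify an implementation.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Detection:
        handedness: str   # "left" or "right"
        center_x: float   # horizontal hand position, normalized to [0, 1]
        center_y: float   # vertical hand position, normalized to [0, 1]

    class ImageRecognitionUnit:
        """Analyzes a captured frame to find the dynamic target object (a hand)."""
        def recognize(self, frame: dict) -> Optional[Detection]:
            # Placeholder: a real unit would run a hand detector on the frame.
            return frame.get("hand")

    class DeterminationUnit:
        """Maps a detection in image space to a coarse position in the cabin."""
        def locate(self, det: Detection) -> str:
            if det.center_y > 0.8:   # hand reaches up from below the console
                return "lower"
            return "left" if det.center_x < 0.5 else "right"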
The processor 120 is, for example, a central processing unit (CPU), a physics processing unit (PPU), a programmable microprocessor, an embedded control chip, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), or the like.
The console panel 130 includes a plurality of buttons through which functions such as the air conditioning, the audio, and the navigation system are controlled.
The control module 140 receives instructions from the processor 120 and, according to those instructions, turns the related operations on or off and issues warning signals. The control module 140 is, for example, composed of one or more program code segments stored in a storage unit (not shown) and executed by the processor 120. The storage unit is, for example, any type of fixed or removable random access memory (RAM), read-only memory (ROM), flash memory, hard disk, other similar devices, or a combination of these devices.
The processor 120 determines whether an operation command is received and, when it is determined that the operation command is received, determines the position of the dynamic target object through the dynamic sensor 110. The processor 120 then determines whether to execute the operation corresponding to the operation command based on the position of the dynamic target object. The processor 120 may further detect, through the dynamic sensor 110, whether the driver's line of sight turns to a designated direction and, when such a turn is detected, determine through the dynamic sensor 110 whether a designated area is occluded. Upon determining that the designated area is occluded, the processor 120 issues a warning signal.
Fig. 2 is a flowchart of a driving assistance method according to an embodiment of the invention. Referring to Figs. 1 and 2, in step S205 the processor 120 determines whether an operation command is received. Here, the dynamic sensor 110 may also make this determination through image analysis: the dynamic sensor 110 acquires an image, and image analysis is used to determine whether the console panel 130 or the brake (not shown) has received an operation command. If it is not determined that an operation command is received, step S205 is repeated.
When it is determined that the operation command is received, in step S210 the position of the dynamic target object is determined through the dynamic sensor 110. Then, in step S215, whether to execute the operation corresponding to the operation command is determined based on the position of the dynamic target object. Here, the dynamic target object is, for example, a hand. The dynamic sensor 110 recognizes the position of the hand in the image and infers its position in actual space, thereby determining whether the operator is the driver, the front-seat passenger, or a rear-seat passenger.
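The following Python sketch shows steps S205 to S215 as a polling loop. The `sensor` and `command_source` objects and their methods are hypothetical, since the patent describes only the steps themselves.

    def driving_assistance_loop(sensor, command_source):
        """Steps S205-S215 of Fig. 2 as a polling loop."""
        while True:
            command = command_source.poll()              # S205: command received?
            if command is None:
                continue                                 # no command yet; repeat S205
            position = sensor.locate_dynamic_target()    # S210: position via the sensor
            if is_operation_allowed(command, position):  # S215: decide from the position
                command.execute()
            # otherwise execution of the command is refused

    def is_operation_allowed(command, position) -> bool:
        # Placeholder policy; Figs. 3 and 5 refine it per command source.
        return position is not None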
Fig. 3 is a flowchart of a driving assistance method according to an embodiment of the invention. This embodiment is an application example of the embodiment shown in Fig. 2. Referring to Figs. 1 and 3, in step S305 the processor 120 determines whether an operation command is received from the console panel 130. For example, when one of the buttons of the console panel 130 is pressed, the processor 120 determines that the console panel 130 has received an operation command. If it is not determined that an operation command is received, step S305 is repeated.
When it is determined that the operation command is received from the console panel 130, in step S310 the processor 120 determines whether the current vehicle speed is greater than a first preset value. The first preset value is, for example, 5 km/h, but is not limited thereto. When the current vehicle speed is not greater than the first preset value, the operation corresponding to the operation command is executed directly, as shown in step S320.
When the current vehicle speed is greater than the first preset value, in step S315 the dynamic target object is identified through the dynamic sensor 110. Specifically, the position of the dynamic target object is determined by the dynamic sensor 110, and based on that position the dynamic target object is identified as the driver, the front-seat passenger, or a rear-seat passenger. Since a user generally operates the console panel 130 by hand, the dynamic sensor 110 can be configured to identify the source direction of the hand and whether it is a left or right hand. Specifically, from the acquired images, the image recognition unit 111 recognizes whether the operating hand is a left or right hand and its source direction (the direction from which the hand moves toward the console), and the determination unit 113 determines, according to the source direction and the handedness, whether the dynamic target object is a first object, a second object, or a third object. Here, the first object, the second object, and the third object may correspond to the driver, the front-seat passenger, and a rear-seat passenger, respectively. An example follows.
Fig. 4 is a schematic diagram of operating the console panel according to an embodiment of the invention. A left-hand-drive vehicle is used as the example. Referring to Fig. 4, in the usual case, when the front-seat passenger wants to operate the console panel 130, the hand U enters from the right side range 410 facing the console panel 130. By analogy, when the driver operates the console panel 130, the hand enters from the left side facing the console panel 130; when a rear-seat passenger operates the console panel 130, the hand enters from the range below the console panel 130. Accordingly, the relationships shown in Table 1 (left-hand drive as the example) can be compiled.
Table 1 relates three parameters: the position of the dynamic target object, the left/right hand, and the identified dynamic target object. When the position of the dynamic target object is on the left side facing the console panel 130, the dynamic target object is determined to be the driver, regardless of whether a left or right hand is detected. When the position is below the console panel 130, the dynamic target object is determined to be a rear-seat passenger. When the position is on the right side facing the console panel 130, the dynamic target object is determined to be the front-seat passenger, regardless of whether a left or right hand is detected.
Using this characteristic, the image recognition unit 111 can be designed to recognize the direction from which the hand in the acquired images enters the console panel 130, for example by comparing a plurality of consecutively acquired images.
TABLE 1
Position of dynamic target | Left/right hand | Dynamic target
Left side | Left hand | Driver
Left side | Right hand | Driver
Lower side | Left hand | Rear-seat passenger
Lower side | Right hand | Rear-seat passenger
Right side | Left hand | Front-seat passenger
Right side | Right hand | Front-seat passenger
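As an illustration only, Table 1 reduces to a simple lookup; the Python representation below and its string labels are assumptions. Note that, per the table, the handedness does not change the outcome once the position of the hand is known.

    # Table 1 as a lookup for a left-hand-drive cabin (assumed labels).
    TABLE_1 = {
        ("left",  "left"):  "driver",
        ("left",  "right"): "driver",
        ("lower", "left"):  "rear-seat passenger",
        ("lower", "right"): "rear-seat passenger",
        ("right", "left"):  "front-seat passenger",
        ("right", "right"): "front-seat passenger",
    }

    def classify_operator(position: str, handedness: str) -> str:
        """Determine the dynamic target object from its position and left/right hand."""
        return TABLE_1[(position, handedness)]

    print(classify_operator("right", "left"))  # -> front-seat passenger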
Returning to Fig. 3, when the dynamic target object is determined to be the front-seat passenger, the operation corresponding to the operation command is executed, as shown in step S320. When the dynamic target object is determined not to be the front-seat passenger, execution of the operation corresponding to the operation command is refused, as shown in step S325.
Through the above embodiment, while the vehicle is traveling, if the console panel 130 is detected to have received an operation command and the current vehicle speed is greater than the first preset value, the dynamic sensor 110 determines whether the dynamic target object (the operator) is the front-seat passenger; if not, the operation is prohibited, so that the driver is not distracted into performing other operations. Driving safety is thus ensured without locking out the functions of the console panel 130.
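The following Python sketch combines steps S310 to S325 of Fig. 3. The 5 km/h value is the example given above; `sensor.locate_hand` is a hypothetical call, and `classify_operator` is the Table 1 lookup sketched earlier.

    FIRST_PRESET_KMH = 5.0  # example first preset value from the description

    def handle_console_command(command, current_speed_kmh, sensor):
        """Gate a console-panel command on speed and on who is operating."""
        if current_speed_kmh <= FIRST_PRESET_KMH:
            return command.execute()                 # S320: execute directly
        position, handedness = sensor.locate_hand()  # S315: identify the operator
        if classify_operator(position, handedness) == "front-seat passenger":
            return command.execute()                 # S320: the passenger may operate
        return None                                  # S325: refuse the operation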
Fig. 5 is a flowchart of a driving assistance method according to an embodiment of the invention. This embodiment is an application example of the embodiment shown in Fig. 2. Referring to Figs. 1 and 5, in step S505 the processor 120 determines whether an operation command is received from the brake (P range). If it is not determined that an operation command is received, step S505 is repeated.
When it is determined that the operation command is received from the brake, in step S510 it is determined whether the current vehicle speed is less than a second preset value. The second preset value is, for example, 3 km/h. When the current vehicle speed is not less than the second preset value, execution of the operation corresponding to the operation command is refused, as shown in step S525.
When the current vehicle speed is less than the second preset value, in step S515 the dynamic target object is identified through the dynamic sensor 110. Specifically, the position of the dynamic target object is determined by the dynamic sensor 110, and based on that position the dynamic target object is identified as the driver, the front-seat passenger, or a rear-seat passenger.
When the dynamic target object is determined to be the driver, the operation corresponding to the operation command is executed, as shown in step S520. When the dynamic target object is determined not to be the driver, execution of the operation corresponding to the operation command is refused, as shown in step S525.
Through the above embodiment, when the vehicle receives a brake operation command below a certain speed (e.g., 3 km/h), the dynamic sensor 110 determines whether the dynamic target object (the operator) is the driver: if so, the operation is allowed; if not, it is prohibited. Accidental actuation of the brake can thus be avoided. In this embodiment, the method of determining whether the dynamic target object (the operator) is the driver is the same as in the embodiment shown in Fig. 3; that is, the dynamic sensor 110 identifies the source direction of the hand and whether the operating hand is a left or right hand in order to identify the dynamic target object.
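This brake-side gate (steps S510 to S525 of Fig. 5) can be sketched the same way in Python; the 3 km/h value is the example from the description, and the same hypothetical `sensor.locate_hand` and `classify_operator` names are reused.

    SECOND_PRESET_KMH = 3.0  # example second preset value from the description

    def handle_brake_command(command, current_speed_kmh, sensor):
        """Allow a brake command only at low speed and only from the driver."""
        if current_speed_kmh >= SECOND_PRESET_KMH:
            return None                              # S525: too fast, refuse
        position, handedness = sensor.locate_hand()  # S515: identify the operator
        if classify_operator(position, handedness) == "driver":
            return command.execute()                 # S520: the driver may operate
        return None                                  # S525: refuse (e.g. an accidental touch)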
Fig. 6 is a flowchart of a driving assistance method according to an embodiment of the invention. Referring to Figs. 1 and 6, in step S605, whether the driver's line of sight turns to a designated direction is detected through the dynamic sensor 110; step S610 is not executed until such a turn is detected. For example, an image is acquired by the dynamic sensor 110 and analyzed to determine whether the driver's head has rotated or whether the driver's line of sight has turned to the designated direction.
In step S610, whether the designated area is occluded is determined through the dynamic sensor 110. For example, the image recognition unit 111 recognizes the designated area in the acquired image, and the determination unit 113 determines whether the designated area is hidden by another object. When the designated area is determined to be occluded, a warning signal is issued in step S615. Taking a left-hand-drive vehicle as an example, the designated direction is, for example, the driver's right, and the designated area is the area through which the driver sees the right rear-view mirror. When the driver's line of sight turns to the designated direction, if a front-seat passenger's movement or another object appears in the designated area, the driver's view is affected; the designated area is therefore determined to be occluded, and a warning signal is issued.
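A Python sketch of steps S605 to S615 follows; the `capture`, `gaze_direction`, `is_area_occluded`, and `issue_warning` methods are assumptions, since the patent states these as capabilities of the dynamic sensor without defining an interface.

    def mirror_check_loop(sensor, designated_direction="right",
                          designated_area="right_mirror"):
        """Steps S605-S615 of Fig. 6 as a polling loop (left-hand-drive example)."""
        while True:
            frame = sensor.capture()
            # S605: wait until the driver's line of sight turns to the designated direction
            if sensor.gaze_direction(frame) != designated_direction:
                continue
            # S610: check whether the designated area (right-mirror sightline) is occluded
            if sensor.is_area_occluded(frame, designated_area):
                sensor.issue_warning()               # S615: send the warning signal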
In summary, the invention uses the dynamic sensor to determine whether the operator is the driver or the front-seat passenger, and further to monitor the driver's line of sight and the front-seat passenger's movements, so as to decide whether to execute the corresponding operation, thereby improving driving safety.

Claims (17)

1. A driving assistance method, comprising:
determining whether an operation command is received;
when it is determined that the operation command is received, determining a position of a dynamic target object through a dynamic sensor; and
determining whether to execute an operation corresponding to the operation command based on the position of the dynamic target object.
2. The driving assistance method according to claim 1, wherein the step of determining whether the operation command is received comprises:
determining whether the operation command is received from a console panel.
3. The driving assistance method according to claim 2, further comprising, after the step of determining whether the operation command is received from the console panel:
when the operation command is received from the console panel, determining whether a current vehicle speed is greater than a first preset value;
when the current vehicle speed is greater than the first preset value, determining the position of the dynamic target object through the dynamic sensor; and
when the current vehicle speed is not greater than the first preset value, directly executing the operation corresponding to the operation command.
4. The driving assistance method according to claim 3, comprising, when the current vehicle speed is greater than the first preset value:
when the dynamic target object is determined to be a front-seat passenger based on the position of the dynamic target object, executing the operation corresponding to the operation command; and
when the dynamic target object is determined not to be the front-seat passenger based on the position of the dynamic target object, refusing to execute the operation corresponding to the operation command.
5. The driving assistance method according to claim 1, wherein the step of determining whether the operation command is received comprises:
determining whether the operation command is received from a brake.
6. The driving assistance method according to claim 5, further comprising, after the step of determining whether the operation command is received from the brake:
when the operation command is received from the brake, determining whether a current vehicle speed is less than a second preset value;
when the current vehicle speed is less than the second preset value, determining the position of the dynamic target object through the dynamic sensor; and
when the current vehicle speed is not less than the second preset value, refusing to execute the operation corresponding to the operation command.
7. The driving assistance method according to claim 6, comprising, when the current vehicle speed is less than the second preset value:
when the dynamic target object is determined to be a driver based on the position of the dynamic target object, executing the operation corresponding to the operation command; and
when the dynamic target object is determined not to be the driver based on the position of the dynamic target object, refusing to execute the operation corresponding to the operation command.
8. The driving assistance method according to claim 1, further comprising:
detecting, through the dynamic sensor, whether a driver's line of sight turns to a designated direction;
when it is detected that the line of sight turns to the designated direction, determining, through the dynamic sensor, whether a designated area is occluded; and
when the designated area is determined to be occluded, issuing a warning signal.
9. A vehicle-mounted device, comprising:
a dynamic sensor; and
a processor coupled to the dynamic sensor and configured to: determine whether an operation command is received;
when it is determined that the operation command is received, determine a position of a dynamic target object through the dynamic sensor; and
determine whether to execute an operation corresponding to the operation command based on the position of the dynamic target object.
10. The vehicle-mounted device according to claim 9, further comprising:
a console panel,
wherein the processor is configured to determine whether the operation command is received from the console panel.
11. The vehicle-mounted device according to claim 10, wherein the processor is configured to:
when the operation command is received from the console panel, determine whether a current vehicle speed is greater than a first preset value;
when the current vehicle speed is greater than the first preset value, determine the position of the dynamic target object through the dynamic sensor; and
when the current vehicle speed is not greater than the first preset value, directly execute the operation corresponding to the operation command.
12. The vehicle-mounted device according to claim 11, wherein the processor is configured to:
when the current vehicle speed is greater than the first preset value,
execute the operation corresponding to the operation command when the dynamic target object is determined to be a front-seat passenger based on the position of the dynamic target object; and
refuse to execute the operation corresponding to the operation command when the dynamic target object is determined not to be the front-seat passenger based on the position of the dynamic target object.
13. The vehicle-mounted device according to claim 9, further comprising:
a brake,
wherein the processor is configured to determine whether the operation command is received from the brake.
14. The vehicle-mounted device according to claim 13, wherein the processor is configured to:
when the operation command is received from the brake, determine whether a current vehicle speed is less than a second preset value;
when the current vehicle speed is less than the second preset value, determine the position of the dynamic target object through the dynamic sensor; and
when the current vehicle speed is not less than the second preset value, refuse to execute the operation corresponding to the operation command.
15. The vehicle-mounted device according to claim 14, wherein the processor is configured to:
when the current vehicle speed is less than the second preset value,
execute the operation corresponding to the operation command when the dynamic target object is determined to be a driver based on the position of the dynamic target object; and
refuse to execute the operation corresponding to the operation command when the dynamic target object is determined not to be the driver based on the position of the dynamic target object.
16. The vehicle-mounted device according to claim 9, wherein the processor is configured to:
detect, through the dynamic sensor, whether a driver's line of sight turns to a designated direction;
when it is detected that the line of sight turns to the designated direction, determine, through the dynamic sensor, whether a designated area is occluded; and
when the designated area is determined to be occluded, issue a warning signal.
17. A driving assistance method, comprising:
detecting, through a dynamic sensor, whether a driver's line of sight turns to a designated direction;
when it is detected that the line of sight turns to the designated direction, determining, through the dynamic sensor, whether a designated area is occluded; and
when the designated area is determined to be occluded, issuing a warning signal.
CN202110400588.XA 2020-10-14 2021-04-14 Driving assisting method and vehicle-mounted device Pending CN113022579A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
TW109135415A TWI815046B (en) 2020-10-14 2020-10-14 Driving assistance method and vehicle mounted apparatus
TW109135415 2020-10-14

Publications (1)

Publication Number Publication Date
CN113022579A (en) 2021-06-25

Family

ID=76457353

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110400588.XA Pending CN113022579A (en) 2020-10-14 2021-04-14 Driving assisting method and vehicle-mounted device

Country Status (2)

Country Link
CN (1) CN113022579A (en)
TW (1) TWI815046B (en)


Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6330842B2 (en) * 2016-03-31 2018-05-30 マツダ株式会社 Driving assistance device
JP6933275B2 (en) * 2016-09-05 2021-09-08 日産自動車株式会社 Driving support method and driving support device
JP6631585B2 (en) * 2017-04-21 2020-01-15 株式会社デンソー Presentation control device, automatic operation control device, presentation control method, and automatic operation control method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07103778A (en) * 1993-10-05 1995-04-18 Mitsubishi Electric Corp Controller for traveling body
JP2007045169A (en) * 2005-08-05 2007-02-22 Aisin Aw Co Ltd Information processor for vehicle
CN101221621A (en) * 2007-01-12 2008-07-16 国际商业机器公司 Method and system for warning a user about adverse behaviors
JP2014205448A (en) * 2013-04-15 2014-10-30 株式会社デンソー On-vehicle display device
CN104943544A (en) * 2014-03-24 2015-09-30 福特全球技术公司 System and method for enabling touchscreen by passenger in moving vehicle
US20180208208A1 (en) * 2017-01-20 2018-07-26 Honda Motor Co., Ltd. System and method for identifying at least one passenger of a vehicle by a pattern of movement
US20190094038A1 (en) * 2017-09-25 2019-03-28 Lg Electronics Inc. Vehicle control device and vehicle comprising the same
US20200012295A1 (en) * 2019-08-16 2020-01-09 Lg Electronics Inc. Method for controlling vehicle in autonomous driving system and apparatus thereof

Also Published As

Publication number Publication date
TWI815046B (en) 2023-09-11
TW202214467A (en) 2022-04-16


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210625