CN117382661A - Information display control method, device, electronic equipment and medium - Google Patents

Information display control method, device, electronic equipment and medium

Info

Publication number
CN117382661A
CN117382661A
Authority
CN
China
Prior art keywords
target
auxiliary driving
driving
information
vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210788133.4A
Other languages
Chinese (zh)
Inventor
王希尧 (Wang Xiyao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210788133.4A priority Critical patent/CN117382661A/en
Publication of CN117382661A publication Critical patent/CN117382661A/en
Pending legal-status Critical Current

Classifications

    • B — PERFORMING OPERATIONS; TRANSPORTING
    • B60 — VEHICLES IN GENERAL
    • B60W — CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 — Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 — Interaction between the driver and the control system
    • B60W50/10 — Interpretation of driver requests or demands
    • B60W50/14 — Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/0062 — Adapting control system settings
    • B60W2050/0063 — Manual parameter input, manual setting means, manual initialising or calibrating means
    • B60W2050/0066 — Manual parameter input using buttons or a keyboard connected to the on-board processor
    • B60W2050/0067 — Confirmation by the driver
    • B60W2050/146 — Display means
    • B60W2540/00 — Input parameters relating to occupants
    • B60W2540/221 — Physiology, e.g. weight, heartbeat, health or special needs

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

The embodiments of the present application provide an information display control method and apparatus, an electronic device, and a medium. The method includes: determining whether a preset target event exists while the driving-assistance system of the vehicle is active; if a target event exists, acquiring target information corresponding to the target event, where the target information includes driving-assistance information corresponding to the target event; performing a target operation corresponding to a predetermined display device when the vehicle finishes driving; and outputting the target information to the display device in response to the target operation being triggered. These embodiments help users learn the driving-assistance functions and improve the user experience of the driving-assistance system.

Description

Information display control method, device, electronic equipment and medium
Technical Field
The present disclosure relates to the field of vehicle technologies, and in particular, to an information display control method, an information display control device, an electronic device, and a medium.
Background
As vehicle models equipped with advanced driver-assistance systems (ADAS) become more popular, more and more users, such as vehicle drivers, come into contact with driving-assistance systems.
However, users are often unfamiliar with the system logic and usage of the driving-assistance system, which may lead to doubt about, or even fear of, the driving-assistance functions and a poor experience with the driving-assistance system.
Disclosure of Invention
The embodiments of the present invention provide an information display control method and apparatus, an electronic device, and a medium, which help users learn the driving-assistance functions and improve the user experience of the driving-assistance system.
In a first aspect, an embodiment of the present invention provides an information display control method, including: determining whether a preset target event exists while the driving-assistance system of the vehicle is active; if the target event exists, acquiring target information corresponding to the target event, where the target information includes driving-assistance information corresponding to the target event; performing a target operation corresponding to a predetermined display device when the vehicle finishes driving; and outputting the target information to the display device in response to the target operation being triggered.
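The four steps of this method can be sketched as a small controller. The `detect_event` and `fetch_info` hooks and the display interface below are hypothetical placeholders, since the patent does not prescribe an implementation:

```python
from dataclasses import dataclass

@dataclass
class TargetInfo:
    event: str
    assist_info: dict          # driving-assistance info for the event

class FakeDisplay:
    """Stand-in for the predetermined display device (on-board screen or phone)."""
    def __init__(self):
        self.popups, self.rendered = [], []
    def show_popup(self, msg):
        self.popups.append(msg)
    def render(self, info):
        self.rendered.append(info)

class InfoDisplayController:
    def __init__(self, display, detect_event, fetch_info):
        self.display = display
        self.detect_event = detect_event   # injected hypothetical hooks
        self.fetch_info = fetch_info
        self.pending = []

    def on_tick(self, assist_active):
        # Step 1: look for target events only while assistance is active.
        if not assist_active:
            return
        event = self.detect_event()
        if event is not None:
            # Step 2: gather the info associated with the event.
            self.pending.append(TargetInfo(event, self.fetch_info(event)))

    def on_drive_end(self):
        # Step 3: when the drive ends, surface a prompt on the display.
        if self.pending:
            self.display.show_popup("Review assistance events?")

    def on_prompt_confirmed(self):
        # Step 4: output the collected info once the user triggers it.
        for info in self.pending:
            self.display.render(info)
        self.pending.clear()
```

In use, the controller would be ticked from the vehicle's event loop and the two callbacks would be wired to real ADAS telemetry; here they can be plain functions for testing.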
Optionally, the determining whether the preset target event exists includes: monitoring whether the driving-assistance system performs a driving-assistance operation; and, when it is detected that the driving-assistance system performs a driving-assistance operation, determining whether the driving-assistance operation is executed successfully; wherein the target event exists when the driving-assistance operation fails to execute.
Optionally, the determining whether the preset target event exists includes: detecting whether request information requesting to learn about a driving-assistance strategy is received; wherein the target event exists when the request information is received.
Optionally, the detecting whether the request information requesting to learn about the driving-assistance strategy is received includes: detecting whether a target key configured on the vehicle is triggered; and the request information is received when the target key is triggered.
Optionally, the detecting whether the request information requesting to learn about the driving-assistance strategy is received includes: detecting whether a voice signal requesting to learn about the driving-assistance strategy is received; and the request information is received when the voice signal is received.
Optionally, the determining whether the preset target event exists includes: determining whether the driving scene of the vehicle is a scene for which target information is to be acquired; wherein the target event exists when the driving scene of the vehicle is such a scene.
Optionally, the determining whether the driving scene of the vehicle is a scene for which target information is to be acquired includes: acquiring heart-rate data of the user; determining, from the heart-rate data, whether the user's heart rate changes according to a preset pattern; and, if the user's heart rate changes according to the preset pattern, determining that the driving scene of the vehicle is a scene for which target information is to be acquired.
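The patent leaves the "preset pattern" of heart-rate change open. One plausible rule, a sustained rise above a resting baseline, can be sketched as follows; the 20 bpm rise and three-sample run length are illustrative assumptions:

```python
def heart_rate_flags_scene(samples, baseline, rise=20, min_run=3):
    """Return True if the heart rate stays at least `rise` bpm above
    `baseline` for `min_run` consecutive samples -- one possible
    'preset pattern' for flagging a stressful driving scene.  The
    patent does not fix the rule; these thresholds are illustrative."""
    run = 0
    for bpm in samples:
        run = run + 1 if bpm >= baseline + rise else 0
        if run >= min_run:
            return True
    return False
```

A real implementation would also need to debounce sensor dropouts and distinguish exertion from driving stress, which this sketch ignores.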
Optionally, the determining whether the preset target event exists includes: determining whether a terminal device is in a working state; and, when the terminal device is in a working state, determining whether the road conditions on which the vehicle is driving are road conditions for which target information is to be acquired; wherein the target event exists when they are.
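This optional claim combines two conditions. A minimal sketch, assuming a fixed set of flagged road conditions (the set itself is not specified by the patent):

```python
def target_event_exists(terminal_working: bool, road_condition: str,
                        flagged=("congested", "construction", "ramp_merge")) -> bool:
    """A target event exists only when the terminal device is working AND
    the current road condition is one for which target information should
    be acquired; the `flagged` set is an illustrative assumption."""
    return terminal_working and road_condition in flagged
```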
Optionally, the driving-assistance information corresponding to the target event includes at least one of: a first time period corresponding to the target event, the driving-assistance strategy of the driving-assistance system during the first time period, the execution result of the driving-assistance system during the first time period, and the reason for the failure when the execution result is a failure.
Optionally, the target information further includes at least one of: driving video recorded by the vehicle's dash camera during a second time period, and a recommended manual driving operation corresponding to the target event; the second time period contains the first time period corresponding to the target event.
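Taken together, the two optional claims above suggest a record shape like the following. The field names, the dataclass itself, and the fixed padding used to derive the second time period are illustrative assumptions, not part of the claims:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AssistEventRecord:
    """One possible shape for the 'target information' of an event;
    all fields beyond the first time period are optional per the claims."""
    period_start: float                       # first time period (epoch seconds)
    period_end: float
    strategy: Optional[str] = None            # assistance strategy in the period
    result: Optional[str] = None              # "success" / "failure"
    failure_reason: Optional[str] = None      # populated only on failure
    video_path: Optional[str] = None          # dash-cam clip for the second period
    recommended_action: Optional[str] = None  # suggested manual operation

    def video_period(self, margin: float = 5.0):
        """Second time period: the first period padded by `margin` seconds
        on each side, so it always contains the first period."""
        return (self.period_start - margin, self.period_end + margin)
```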
Optionally, the performing the target operation corresponding to the predetermined display device includes: controlling the predetermined display device to display predetermined notification information in a pop-up window.
Optionally, the display device includes: the display screen of the vehicle's on-board computer.
Optionally, the display device includes: a display screen of the terminal device.
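Since the display device may be either the on-board screen or a terminal device, a trivial dispatcher can pick between them. The preference order and reachability flags below are assumptions, not something the claims specify:

```python
def pick_display(vehicle_screen_on: bool, phone_connected: bool) -> str:
    """Prefer the on-board screen while it is available, fall back to a
    paired terminal device; the preference order is an assumption."""
    if vehicle_screen_on:
        return "vehicle_screen"
    if phone_connected:
        return "phone_screen"
    return "none"
```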
In a second aspect, an embodiment of the present invention provides an information display control apparatus, including: a determining module, configured to determine whether a preset target event exists while the driving-assistance system of the vehicle is active; an acquisition module, configured to acquire, if the target event exists, target information corresponding to the target event, where the target information includes driving-assistance information corresponding to the target event; a processing module, configured to perform a target operation corresponding to a predetermined display device when the vehicle finishes driving; and an output module, configured to output the target information to the display device in response to the target operation being triggered.
In a third aspect, an embodiment of the present invention provides an electronic chip, including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any of the first aspects.
In a fourth aspect, an embodiment of the present invention provides an electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method according to any of the first aspects.
In a fifth aspect, embodiments of the present invention provide a computer-readable storage medium having a computer program stored therein, which when run on a computer, causes the computer to perform the method according to any of the first aspects.
In a sixth aspect, an embodiment of the present invention provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method according to any one of the first aspects.
In the embodiments of the present invention, while the driving-assistance system of the vehicle is active, it is determined whether a target event exists; if the target event exists, target information corresponding to the target event is acquired, the target information including driving-assistance information corresponding to the target event; when the vehicle finishes driving, a target operation corresponding to the display device is performed; and, in response to the target operation being triggered, the target information is output to the display device so that the user can view it there. These embodiments help users learn the driving-assistance functions and improve the user experience of the driving-assistance system.
Drawings
To illustrate the technical solutions of the embodiments of the present invention more clearly, the drawings needed in the embodiments are briefly described below. The drawings described below show only some embodiments of the present invention; a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 2 is a schematic flow chart of a data-recording implementation process of an automobile with driving-assistance equipment according to an embodiment of the present invention;
FIG. 3 is a flowchart of an information display control method according to an embodiment of the present invention;
FIG. 4 is a flowchart of generating target information according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a multi-disc image viewing interface according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of a multi-disc image storage interface according to an embodiment of the present invention;
FIG. 7 is an interface diagram of a multiple video recording recommendation method according to an embodiment of the present invention;
FIG. 8 is an interface diagram of another exemplary embodiment of a multiple video recording recommendation method according to the present invention;
FIG. 9 is a flowchart of another information display control method according to an embodiment of the present invention;
FIG. 10 is a flowchart of another method for controlling information display according to an embodiment of the present invention;
FIG. 11 is a flowchart of a method for controlling information display according to another embodiment of the present invention;
fig. 12 is a flowchart of another information display control method according to an embodiment of the present invention.
Detailed Description
For a better understanding of the technical solution of the present invention, the following detailed description of the embodiments of the present invention refers to the accompanying drawings.
It should be understood that the described embodiments are merely some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
The terminology used in the embodiments of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in this application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be understood that the term "at least one" as used herein means one or more, and "a plurality" means two or more. The term "and/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may represent: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relationship between the preceding and following objects. "At least one of the following" and similar expressions mean any combination of the listed items, including any combination of single or plural items. For example, at least one of a, b and c may represent: a, b, c, a-b, a-c, b-c, or a-b-c, where each of a, b and c may be singular or plural.
It should be understood that although the terms first, second, etc. may be used in embodiments of the present invention to describe the set threshold values, these set threshold values should not be limited to these terms. These terms are only used to distinguish the set thresholds from each other. For example, a first set threshold may also be referred to as a second set threshold, and similarly, a second set threshold may also be referred to as a first set threshold, without departing from the scope of embodiments of the present invention.
The terminology used in the description section of the present application is for the purpose of describing particular embodiments of the present application only and is not intended to be limiting of the present application.
The information display control method provided in any embodiment of the present application may be applied to the electronic device 100 shown in fig. 1. Fig. 1 shows a schematic configuration of an electronic device 100.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (universal serial bus, USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display 194, and a subscriber identity module (subscriber identification module, SIM) card interface 195, etc. The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in the embodiments of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. Wherein the different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. This memory may hold instructions or data that the processor 110 has just used or uses cyclically. If the processor 110 needs those instructions or data again, it can fetch them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
In some embodiments, the processor 110 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 140 is configured to receive a charge input from a charger. The power management module 141 is used for connecting the battery 142, and the charge management module 140 and the processor 110. The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 150 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 150 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be provided in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 170A, the receiver 170B, etc.), or displays images or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional module, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., as applied to the electronic device 100. The wireless communication module 160 may be one or more devices that integrate at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 150 of electronic device 100 are coupled, and antenna 2 and wireless communication module 160 are coupled, such that electronic device 100 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 100 implements display functions through a GPU, a display screen 194, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 194 is used to display images, videos, and the like. The display 194 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, quantum-dot light-emitting diodes (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 193, a video codec, a GPU, a display screen 194, an application processor, and the like.
The ISP is used to process data fed back by the camera 193. The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element.
The digital signal processor is used to process digital signals; in addition to digital image signals, it can process other digital signals. For example, when the electronic device 100 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency-point energy, and the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example the transfer mode between neurons in the human brain, it can rapidly process input information and can also continuously learn by itself.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the electronic device 100. The external memory card communicates with the processor 110 through an external memory interface 120 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 121 may be used to store computer executable program code including instructions. The internal memory 121 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, phonebook, etc.), and so on. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions through an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, an application processor, and the like. Such as music playing, recording, etc.
The pressure sensor 180A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are various types of pressure sensors 180A, such as resistive, inductive, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material; when a force is applied to the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A, and may also calculate the location of the touch from its detection signal. In some embodiments, touch operations that act on the same touch location but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the SMS application icon, an instruction to view the SMS message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the SMS application icon, an instruction to create a new SMS message is executed.
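The threshold dispatch in the example above can be sketched as follows; the numeric threshold, the pressure scale, and the instruction names are assumptions for illustration, not values from this application:

```python
# Sketch of the pressure-threshold dispatch described above. The threshold
# value, pressure scale, and instruction names are illustrative assumptions.
FIRST_PRESSURE_THRESHOLD = 0.5  # normalized touch pressure, hypothetical


def instruction_for_touch(pressure: float) -> str:
    """Map a touch on the SMS application icon to an instruction
    according to its touch-operation intensity."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_sms"    # intensity below threshold: view the message
    return "create_sms"      # intensity at or above threshold: new message
```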
The gyro sensor 180B may be used to determine a motion gesture of the electronic device 100. The air pressure sensor 180C is used to measure air pressure; in some embodiments, the electronic device 100 calculates altitude from barometric pressure values measured by the air pressure sensor 180C to aid positioning and navigation. The magnetic sensor 180D includes a Hall sensor. The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The distance sensor 180F is used to measure distance; the electronic device 100 may measure distance by infrared or laser, and in some embodiments may range using the distance sensor 180F to achieve quick focusing. The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector such as a photodiode. The ambient light sensor 180L is used to sense the ambient light level. The fingerprint sensor 180H is used to collect fingerprints; the electronic device 100 may use the collected fingerprint features to implement fingerprint unlocking, application-lock access, fingerprint-based photographing, fingerprint-based call answering, and the like. The temperature sensor 180J is used to detect temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194; together they form a touchscreen. The touch sensor 180K is used to detect a touch operation acting on or near it, and may pass the detected touch operation to the application processor to determine the touch event type. Visual output related to the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may also be disposed on the surface of the electronic device 100 at a location different from the display screen 194.
The SIM card interface 195 is used to connect a SIM card. A SIM card may be inserted into or removed from the SIM card interface 195 to make contact with, or separate from, the electronic device 100. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support Nano-SIM cards, Micro-SIM cards, and the like. Multiple cards, of the same or different types, may be inserted into the same SIM card interface 195 simultaneously. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, i.e., an embedded SIM card; the eSIM card can be embedded in the electronic device 100 and cannot be separated from it. The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a microkernel architecture, a microservice architecture, or a cloud architecture.
Prior to introducing the technical solution of the present application, the related art is first described.
One data recording technique for a motor vehicle equipped with a driving assistance device works as follows: while the user is driving the vehicle with the auxiliary driving system enabled, the required driving data is captured for accident evidence collection, based on the on state of the auxiliary driving system and the moment a danger signal occurs. Referring to fig. 2, this prior art may include the following steps 201 and 202.
In step 201, in response to an on signal of the auxiliary driving system, working data and driver data of the auxiliary driving system are synchronously acquired, and the acquired data are stored in a data storage module.
Step 202: in response to hazard state information, the corresponding working data and driver data are extracted from the data storage module based on the driver data, and the extracted data are stored in another data storage module. The extracted data can then be used as accident evidence for the moment the incident occurred.
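The two prior-art steps can be sketched minimally as follows; the record layout and the driver-matching rule are illustrative assumptions:

```python
# Minimal sketch of the prior-art flow (steps 201 and 202). The record
# layout and the driver-matching rule are illustrative assumptions.
def record_working_data(log, work_data, driver_data):
    """Step 201: on the assisted-driving on signal, synchronously store
    working data and driver data in the first data storage module."""
    log.append({"work": work_data, "driver": driver_data})


def extract_for_evidence(log, archive, driver_id):
    """Step 202: on hazard state information, copy this driver's records
    into a second storage module for accident evidence."""
    for entry in log:
        if entry["driver"].get("id") == driver_id:
            archive.append(entry)
```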
However, this prior art records the corresponding driving data only for the moment an accident occurs, for accident evidence collection; it is not intended for, and cannot be used for, teaching the auxiliary driving function. It is therefore inconvenient for a user to learn the auxiliary driving function and cannot improve the user's experience with the auxiliary driving system.
Before describing the technical scheme of the present application, some usage scenarios of the present application will be described first.
Currently, vehicle models equipped with advanced driver assistance systems (ADAS) are becoming more popular, so more and more users, such as vehicle drivers, come into contact with auxiliary driving systems. However, users are often unfamiliar with the system logic and usage of an auxiliary driving system, which may lead to doubt and fear of the auxiliary driving function and a poor experience with the system.
For example, in one possible use scenario (scenario 1), when the auxiliary driving system fails to execute because of external factors (rather than a defect in the system itself), such as failing to turn on, suddenly exiting, or failing to change lanes, the user, not understanding the cause, may conclude that the auxiliary driving system performs poorly.
As another example, in another possible use scenario (scenario 2), when the auxiliary driving system performs a specific operation, the user does not understand why the system behaves that way and may again conclude that it performs poorly.
Taking scenario 1 as an example, when the auxiliary driving system fails to execute (for example, failing to turn on, suddenly exiting, or failing to change lanes), the user may not understand why execution failed, conclude that the system works poorly, and develop doubt and fear of it. Therefore, when the auxiliary driving system fails to execute, corresponding auxiliary driving information (such as the reason for the failure, the current road condition information, and the auxiliary driving strategy) can be made available for the user to view, so that the user can use the auxiliary driving function more proficiently and efficiently.
Thus, for assisted driving scenarios the user does not understand, the user can be given an opportunity to learn the assisted-driving execution logic, which helps the user dispel doubt and fear of the auxiliary driving function and use it more skillfully and efficiently later.
To improve the user's experience with the auxiliary driving system, strengthen the user's trust in the auxiliary driving function, and increase the user's willingness to use the system, the opportunity to learn or understand the execution logic of assisted driving can be provided not only for scenarios the user does not understand, but also for scenarios in which the user is advised to learn or understand the corresponding auxiliary driving function.
For example, in yet another possible use scenario (scenario 3), while the auxiliary driving system is operating and the vehicle approaches a difficult driving scenario, the user may become tense, for example because of uncertainty about whether the system can cope with the scenario, or because the operation performed by the system does not quite match the driving operation the user expects.
For example, while an assisted driving operation is being performed, some operations may make the user tense, such as overtaking a large vehicle or merging onto a highway.
While driving manually, the user may also feel tense and panicked in certain road conditions but have no time to record and study them, so the same tension and fear return each time such a scene occurs. Because the scene passes quickly, the user may not deliberately remember it. If this problem is not resolved, the user may never trust the auxiliary driving system, or even their own driving skill.
For another example, in another possible use scenario (scenario 4), because the driving assistance function is active, the user may not always pay full attention to driving the vehicle (for example, answering a call or checking messages on a mobile phone while the vehicle is moving); when the vehicle drives through a complex, dangerous road condition, the system performs auxiliary driving operations so that the vehicle passes through it safely.
Therefore, both for assisted driving scenarios the user does not understand and for scenarios in which the user is advised to learn or understand the corresponding auxiliary driving function, the user can be given an opportunity to learn and understand that function.
It is possible to generate corresponding target information (the target information includes at least the driving assistance information) in the above-described scenario, and by presenting the target information to the user, the user can learn and understand the corresponding driving assistance function.
For example, by displaying the target information generated in scenario 1, the reason for the execution failure can be explained to the user, so the user learns that the failure was not caused by a defect in the auxiliary driving system. This dispels the user's doubt and fear of the auxiliary driving function and makes it easy for the user to learn and understand it, so that the user can use it more skillfully and efficiently later.
In addition, presenting the target information while the driver is still driving could interfere with normal driving (it is not advisable to explain an event to the user shortly after it occurs while the vehicle is in motion), so the target information may instead be presented after the drive ends, allowing the user to learn the auxiliary driving function by reviewing it.
The user may typically be the driver of the vehicle, or may also be other users who need to learn the auxiliary driving function, for example.
As shown in fig. 3, an embodiment of the present invention provides an information display control method, which may include steps 301 to 304:
In step 301, while the auxiliary driving system of the vehicle is in an on state, it is determined whether a preset target event exists.
The vehicle may be provided with a key for switching the auxiliary driving system on and off, which may be a physical key or a virtual key on the vehicle-mounted computer. By triggering the key, the user can turn the auxiliary driving system of the vehicle on or off as needed.
After the auxiliary driving system is started, whether a preset target event exists or not can be monitored in real time.
The number of preset target events may be one or more. When a plurality of target events are preset, whether each target event exists or not can be monitored in parallel.
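One conceptual way to monitor several preset target events at once is to evaluate every event detector on each monitoring tick, as sketched below; the detector names and the polling style are assumptions for illustration:

```python
# Evaluate a set of preset target-event detectors in one pass. Each detector
# is a zero-argument callable reporting whether its event exists right now.
def check_target_events(detectors):
    """Return the names of all preset target events currently present."""
    return [name for name, present in detectors.items() if present()]
```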
In one embodiment, the preset target events may include event 1, corresponding to scenario 1 above: an auxiliary driving operation performed by the auxiliary driving system fails to execute.

When event 1 exists, corresponding target information (including at least auxiliary driving information) needs to be generated, so that the user can review it to learn and understand the corresponding auxiliary driving function.

In one embodiment, the preset target events may include event 2, corresponding to scenario 2 above: request information requesting to learn the auxiliary driving strategy is received.

For example, when the user does not understand why the auxiliary driving system performed an auxiliary driving operation, the user may request to learn the auxiliary driving strategy by triggering a corresponding key, issuing a corresponding voice command, or the like.

When event 2 exists, corresponding target information (including at least auxiliary driving information) needs to be generated, so that the user can review it to learn and understand the corresponding auxiliary driving function.
In one embodiment, the preset target events may include event 3, corresponding to scenario 3 above: the driving scene of the vehicle is a scene for which target information is to be acquired.

By way of example, a user may become tense, with an abnormal heart rate, out of uncertainty about whether the auxiliary driving system can cope with a difficult driving scene, or because the operation performed by the system does not match the driving operation the user expects; accordingly, when the user's heart rate is abnormal, the current scene can be determined to be a scene for which target information is to be acquired.

When event 3 exists, corresponding target information (including at least auxiliary driving information) needs to be generated, so that the user can review it to learn and understand the corresponding auxiliary driving function.
In one embodiment, the preset target events may include event 4, corresponding to scenario 4 above: the user is using a mobile phone while the driving road condition of the vehicle is a road condition for which target information is to be acquired.
When event 4 exists, corresponding target information (including at least auxiliary driving information) needs to be generated, so that the user can review it to learn the corresponding auxiliary driving function.
In addition, reviewing the corresponding auxiliary driving function can also warn the user about safe driving. For example, the target information may include the road conditions and the auxiliary driving operations at the time event 4 occurred, so that when viewing the target information, the user can recognize the dangerous situation they were in at that moment and be reminded to concentrate on driving.
Step 302, if there is a target event, acquiring target information corresponding to the target event, where the target information includes auxiliary driving information corresponding to the target event.
If the target event does not exist, the current flow may be ended.
In one embodiment, the auxiliary driving information corresponding to the target event may include: a first time period corresponding to the target event, the auxiliary driving strategy of the auxiliary driving system in the first time period, the auxiliary driving execution result of the auxiliary driving system in the first time period, and, when the execution result is a failure, the reason for the failure.
In one embodiment, the target information may further include at least one of: driving video from the vehicle's driving recorder over a second time period, and a recommended manual driving operation corresponding to the target event; the second time period includes the first time period corresponding to the target event.
For example, the target information corresponding to event 1 may include: the first time period corresponding to event 1, the auxiliary driving strategy of the auxiliary driving system in the first time period, the auxiliary driving execution result of the auxiliary driving system in the first time period, the reason when the execution result is a failure, and the driving video from the vehicle's driving recorder over a second time period.
Displaying this target information to the user later gives the user an opportunity to understand the scene at the time and the reason for the auxiliary driving system's decision, so the user can subsequently use assisted driving more proficiently, reducing unfamiliarity with and distrust of the auxiliary driving system.
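The items of target information listed above can be gathered in one container, sketched here as a dataclass; the field names are assumptions for illustration, not terms defined by this application:

```python
from dataclasses import dataclass
from typing import Optional


# Illustrative container for the target information described above.
@dataclass
class TargetInfo:
    first_period: tuple                          # (start, end) of the target event
    strategy: str                                # auxiliary driving strategy in the first period
    result: str                                  # execution result, e.g. "success" or "failure"
    failure_reason: Optional[str] = None         # present only when result is "failure"
    video_path: Optional[str] = None             # dashcam video over the second period
    recommended_operation: Optional[str] = None  # recommended manual driving operation
```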
Based on this, in one embodiment of the present invention, determining whether a target event exists includes: monitoring whether the auxiliary driving system performs an auxiliary driving operation, and, when it does, determining whether the operation executes successfully. The presence of a target event includes the case where the auxiliary driving operation fails to execute.
For example, the target information corresponding to event 2 may include: the auxiliary driving strategy of the auxiliary driving system in the first time period corresponding to event 2, the auxiliary driving execution result of the auxiliary driving system in the first time period, and the driving video from the vehicle's driving recorder over the second time period.
Displaying this target information to the user later helps the user learn the corresponding auxiliary driving function and subsequently use assisted driving more proficiently.
Based on this, in one embodiment of the present invention, determining whether a target event exists includes: detecting whether request information requesting to learn the auxiliary driving strategy is received. The presence of a target event includes the case where the request information is received.
In one embodiment of the present invention, detecting whether request information requesting to learn the auxiliary driving strategy is received includes: detecting whether a target key configured on the vehicle is triggered. Receiving the request information includes the case where the target key is triggered.
For example, a shortcut key may be provided on a steering wheel of a vehicle. When the user does not understand the current auxiliary driving operation, the shortcut key can be triggered to perform active marking so as to actively request to learn the corresponding auxiliary driving function.
In another embodiment of the present invention, detecting whether request information requesting to learn the auxiliary driving strategy is received includes: detecting whether a voice signal requesting to learn the auxiliary driving strategy is received. Receiving the request information includes the case where the voice signal is received.
For example, when the user does not understand the current auxiliary driving operation, the user may wake the voice assistant and issue a corresponding voice command as an active mark, to actively request learning of the corresponding auxiliary driving function. For instance, the user may say "××, what happened with the assisted driving just now?" (where ×× is the wake word used to call the voice assistant).
For example, the target information corresponding to event 3 may include: the first time period corresponding to event 3, the auxiliary driving strategy of the auxiliary driving system in the first time period, the auxiliary driving execution result of the auxiliary driving system in the first time period, the driving video from the vehicle's driving recorder over the second time period, and the recommended manual driving operation corresponding to the target event.
For example, if the user becomes tense when driving into a tunnel, the corresponding recommended manual driving operation may be one representing "driving skills for entering a tunnel".
Displaying this target information makes it easy for the user to learn and understand the auxiliary driving function, improves the user's trust in the auxiliary driving system, and can relieve and lessen the user's tension when facing the same event (such as entering a tunnel) again.
Based on this, in one embodiment of the present invention, determining whether a target event exists includes: determining whether the driving scene of the vehicle is a scene for which target information is to be acquired. The presence of a target event includes the case where the driving scene of the vehicle is such a scene.
In one embodiment of the present invention, determining whether the driving scene of the vehicle is a scene for which target information is to be acquired includes: acquiring heart rate data of the user; determining from the heart rate data whether the user's heart rate changes according to a preset pattern; and, if it does, determining that the driving scene of the vehicle is a scene for which target information is to be acquired.
The preset pattern may be a steep increase in heart rate. The user's heart rate data may be collected by a wearable device worn by the user, such as a smart watch.
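One possible reading of the "steep increase" pattern is that the latest heart-rate sample exceeds the mean of a recent baseline window by a fixed ratio; the window size and ratio below are illustrative assumptions, not values from this application:

```python
# Detect a steep heart-rate increase relative to a short rolling baseline.
# window and ratio are hypothetical tuning parameters.
def heart_rate_spike(samples, window=5, ratio=1.3):
    """Return True if the newest sample is a steep increase over the
    recent baseline; samples are beats per minute, oldest first."""
    if len(samples) <= window:
        return False  # not enough history to form a baseline
    baseline = sum(samples[-window - 1:-1]) / window
    return samples[-1] >= baseline * ratio
```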
In another embodiment of the present invention, voice data may be collected and used to determine whether the user is tense; if so, the driving scene of the vehicle may likewise be determined to be a scene for which target information is to be acquired. For example, the voice data may be captured when the user is tense and screams.
For example, the target information corresponding to the event 4 may include: the first time period corresponding to the event 4, an auxiliary driving strategy of an auxiliary driving system in the first time period, an auxiliary driving execution result of the auxiliary driving system in the first time period and a driving video of a driving recorder of a vehicle in the second time period.
Displaying this target information makes it easy for the user to learn and understand the auxiliary driving function, improves the user's trust in the auxiliary driving system, and can warn the user about safe driving.
Based on this, in one embodiment of the present invention, determining whether a target event exists includes: determining whether the user's terminal device is in a working state; and, when it is, determining whether the driving road condition of the vehicle is a road condition for which target information is to be acquired. The presence of a target event includes the case where the driving road condition of the vehicle is such a road condition.
Referring to fig. 4, in an embodiment, taking the event 2 as an example, the implementation process of generating the target information may include the following steps 401 to 404.
In step 401, a first period (or time point data) corresponding to the target event is recorded.
Step 402, determining a second time period including the first time period, and acquiring a driving video of a driving recorder of the vehicle in the second time period.
For example, the second time period may be obtained by extending the first time period forward and backward by n minutes each (e.g., 1 minute, 3 minutes, etc.).
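Step 402 can be sketched as widening the first time period by n minutes on each side; the default padding of 1 minute below is purely for illustration:

```python
from datetime import datetime, timedelta


# Step 402 sketch: widen the first time period by n minutes on each side
# to obtain the second time period containing it.
def second_period(first_start, first_end, n_minutes=1):
    """Return (start, end) of the second time period."""
    pad = timedelta(minutes=n_minutes)
    return first_start - pad, first_end + pad
```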
Step 403, obtaining an auxiliary driving strategy of the auxiliary driving system in the first time period and an auxiliary driving execution result of the auxiliary driving system in the first time period.
Displaying the auxiliary driving strategy lets the user understand the reason behind an auxiliary driving decision. For example, the decision may be made because the vehicle pressed a lane line in violation of traffic regulations, and the corresponding auxiliary driving behavior is turning the steering wheel to correct the driving direction.
The auxiliary driving execution result shows whether the auxiliary driving behavior executed successfully. For event 1, when the execution result is a failure, step 403 may further acquire the cause of the failure.
For example, when the execution result is that the auxiliary driving system could not be turned on or suddenly exited, the causes of the failure may include the vehicle speed being too low, lane markings not being recognized, the camera being unable to recognize images because of overly strong light in front of the vehicle, an improper operation performed by the user, and the like.
In this embodiment, by aligning on the time axis, the items of target information are consistent and correlated in the time dimension, making it easy for the user to learn and understand the auxiliary driving function.
Step 404: generate the target information from the driving video, the auxiliary driving strategy, and the auxiliary driving execution result over the first and second time periods.
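Step 404 can be sketched as bundling the items collected in steps 401 to 403 into one review record; the dict layout stands in for the video-file packaging and is an illustrative assumption:

```python
# Step 404 sketch: package the collected items into one review record.
def generate_target_info(first_period, second_period, driving_video, strategy, result):
    """Assemble the target information from the outputs of steps 401-403."""
    return {
        "first_period": first_period,
        "second_period": second_period,
        "driving_video": driving_video,
        "strategy": strategy,
        "result": result,
    }
```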
The target information may be, for example, a video file used as a review teaching video. In other embodiments, the target information may likewise be an image file, a text file, or the like.
Taking the target information as a video file used as a review teaching video as an example, please refer to fig. 5, which is a schematic diagram of a review image (i.e., an image in the review teaching video) viewing interface according to an embodiment of the present invention.
As shown in fig. 5, the target information for the user to learn the auxiliary driving function may include the second time period, the driving video within the second time period, the auxiliary driving strategy, and the like. While the driving video plays, the corresponding auxiliary driving strategy can be presented in real time as text in the upper-right area of the image display interface.
Referring to fig. 5, in addition to the time period of the target event, the driving video, the auxiliary driving strategy, and the like, the target information may include other conventional information.
The other conventional information can be, for example, information such as instrument content, in-car sound recordings, central control content, weather, vehicle speed and the like.
All target information generated during the start-up of the auxiliary driving system can be stored in the same storage module so that a user can select any historical target information for viewing as required.
Referring to fig. 6, fig. 6 is a schematic diagram of a review image storage interface according to an embodiment of the invention. As shown in fig. 6, the user can view all of the target information generated historically.
In one embodiment, all of the target information may be found in the application of the vehicle's driving recorder.
In step 303, when the vehicle finishes driving, a target operation corresponding to a predetermined display device is performed.
The driving state of the vehicle may be monitored to determine the right time to recommend that the user review and learn the auxiliary driving function. The target operation may be performed when it is detected that the vehicle has finished driving.
In one implementation, the target operation may be displaying preset notification information in a popup window on the display device, to notify and remind the user to review and learn the auxiliary driving function.
Based on this, in one embodiment of the present invention, performing the target operation corresponding to the display device includes: controlling the display device to display the preset notification information through a popup window.
The display device may be the display screen of the vehicle-mounted computer or the display screen of the user's terminal device. The user's terminal device is logged in to the same user account as the one corresponding to the vehicle.
Where applicable, the display device may also be the display screen of another user's terminal device that is bound in advance. That terminal device uses a user account different from the one corresponding to the vehicle.
Thus, in one embodiment, after the vehicle finishes driving, the user may be prompted to view the target information by sending the notification to both the vehicle-mounted computer and the user's terminal device. The user may then choose to view the target information on either device.
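The dual notification in step 303 can be sketched as pushing the same popup message to every bound display device once driving ends; the device names and the message text are illustrative assumptions:

```python
# Step 303 sketch: push the review prompt to every bound display device
# once the vehicle finishes driving; do nothing while still driving.
REVIEW_PROMPT = "Today's assisted driving has been recorded for review. Tap to view."


def notify_on_drive_end(drive_finished, displays):
    """Return the popup message per display device, or nothing while driving."""
    if not drive_finished:
        return {}
    return {device: REVIEW_PROMPT for device in displays}
```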
Referring to fig. 7, fig. 7 is an interface schematic diagram of a review-video recommendation method. As shown in fig. 7, a window may be popped up on the display screen of the vehicle-mounted computer to prompt the user to view the target information. For example, the content of the notification information in the popup window may be "Today's assisted driving has been recorded for review; click to view."
Alternatively, a pop-up window may be displayed in the notification center of the vehicle-mounted computer.
In addition, as shown in fig. 7, the user may be prompted to view the target information through an information icon rather than a pop-up window. For example, when there is generated target information awaiting the user's viewing, a corresponding mark (such as the dot mark shown in fig. 7) may be displayed in the upper right corner of the information icon.
Referring to fig. 8, fig. 8 is an interface diagram of another review-video recommendation method. As shown in fig. 8, a window may be popped up on the display screen of the terminal device to prompt the user to view the target information. For example, the content of the notification information in the popup window may be "Today's assisted driving has been recorded for review; click to view."
Alternatively, a notification banner may be pushed via the corresponding application (APP) on the vehicle owner's mobile phone.
In another implementation, the target operation may also be outputting a reminder voice that prompts the user to review and learn the auxiliary driving functions on the display device.
Step 304, in response to the trigger of the target operation, outputting target information to the display device.
The user may perform a corresponding trigger operation in response to the target operation, to confirm agreement to review and learn the auxiliary driving functions. At this point, the target information may be output to the display device for the user to view.
In one implementation, the target information may be output to the display device in response to the user triggering the pop-up window, or triggering a control displayed on the pop-up window that indicates agreement to view the target information.
In another implementation, the target information may be output to the display device in response to voice information, uttered by the user in response to the reminder voice, that indicates agreement to view the target information.
The display device may display the target information after receiving it. For example, when the target information is a video file serving as a review teaching video, the video file may be played.
The display interface on which the display screen of the vehicle-mounted computer plays the review teaching video can be as shown in fig. 5.
For users who are unfamiliar with the logic and usage of the auxiliary driving system, who consequently feel doubt or fear toward assisted driving, and who may not even realize they need to learn the auxiliary driving functions, this embodiment provides an opportunity to learn the execution logic, helping them master the auxiliary driving system and use it more proficiently.
Referring to fig. 9, an embodiment of the present invention provides an information display control method corresponding to the above scenario 1, which may include the following steps:
Step 901, monitoring whether the auxiliary driving system performs the auxiliary driving operation when the auxiliary driving system of the vehicle is in a starting state, if yes, executing step 902, otherwise ending the current flow.
Step 902, determining whether the auxiliary driving operation is successfully executed; if not, executing step 903; otherwise, ending the current flow.
Step 903, obtaining target information, where the target information includes: the auxiliary driving strategy of the auxiliary driving system in the current time period, the auxiliary driving execution result of the auxiliary driving system in the current time period, the reason for the failure when the auxiliary driving execution result is a failure, and the driving video of the driving recorder of the vehicle in a second time period; the second time period includes the current time period.
The current time period is a time period corresponding to the auxiliary driving system executing the auxiliary driving operation.
Step 904, when the vehicle ends driving, step 905 and step 907 are executed, respectively.
Step 905, controlling a display screen of an onboard computer of the vehicle to display first notification information through a popup window, and executing step 906.
Step 906, in response to the triggering of the first notification information, outputting the target information to a display screen of an on-board computer of the vehicle.
Step 907, the display screen of the terminal device is controlled to display the second notification information through the popup window, and step 908 is performed.
In one implementation, the information content of the first notification information is the same as the information content of the second notification information.
In another implementation, the information content of the first notification information and the information content of the second notification information are different.
Step 908, in response to the triggering of the second notification information, outputting the target information to a display screen of a terminal device of the vehicle.
In this embodiment, the time point and cause of an abnormal condition of the auxiliary driving system (such as inability to start, sudden exit, or an unexpected lane change) can be automatically recorded, footage from the driving recorder before and after that time point can be captured in the background, and the auxiliary driving decision logic, cause, and so on can then be annotated in the video, so as to provide it to the user for review learning after driving ends.
This embodiment provides the user with an opportunity for after-the-fact review of abnormal execution scenarios of the auxiliary driving system during driving, so that the user understands the scene and the reason for the decision at the time, can subsequently use assisted driving more proficiently, and the user's unfamiliarity with and distrust of the auxiliary driving system can be reduced.
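The gating decision of steps 901-902 can be sketched as a single predicate; the function name and boolean parameters below are illustrative assumptions, not part of this application:

```python
def scenario1_should_collect(assist_active: bool,
                             operation_performed: bool,
                             operation_failed: bool) -> bool:
    """Steps 901-902: target information is collected only when the
    auxiliary driving system is active, an assisted-driving operation was
    attempted, and that operation failed (the abnormal condition)."""
    return assist_active and operation_performed and operation_failed

print(scenario1_should_collect(True, True, True))    # abnormal case → True
print(scenario1_should_collect(True, True, False))   # successful operation → False
```

When the predicate returns `True`, step 903 (obtaining the target information) would run; otherwise the flow ends.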
Referring to fig. 10, referring to the above scenario 2, an embodiment of the present invention provides another information display control method, which may include the following steps:
step 1001, when the driving support system of the vehicle is in a start state, step 1002 and step 1003 are executed, respectively.
Step 1002, detecting whether a target key configured by the vehicle is triggered, if yes, executing step 1004, otherwise, ending the current flow.
Step 1003, detecting whether a voice signal for requesting to learn the driving assistance strategy is received, if yes, executing step 1004, otherwise ending the current flow.
Step 1004, when it is detected that either the target key is triggered or the voice signal is received, obtaining target information, where the target information includes: the auxiliary driving strategy of the auxiliary driving system in the current time period, the auxiliary driving execution result of the auxiliary driving system in the current time period, and the driving video of the driving recorder of the vehicle in a second time period; the second time period includes the current time period.
The current time period is a time period corresponding to the received voice signal.
Step 1005, when the vehicle ends driving, step 1006 and step 1008 are executed, respectively.
Step 1006, the display screen of the on-board computer controlling the vehicle displays the first notification information through the popup window, and step 1007 is performed.
Step 1007, in response to the trigger to the first notification information, outputting the target information to a display screen of an on-board computer of the vehicle.
Step 1008, controlling the display screen of the terminal device to display the second notification information through the popup window, and executing step 1009.
In one implementation, the information content of the first notification information is the same as the information content of the second notification information.
In another implementation, the information content of the first notification information and the information content of the second notification information are different.
In step 1009, in response to the trigger to the second notification information, the target information is output to the display screen of the terminal device of the vehicle.
This embodiment supports the user in marking time points of auxiliary driving execution scenarios that the user does not fully understand; footage from the driving recorder before and after the marked time point is captured in the background, and the auxiliary driving decision logic and so on are annotated in the video, so as to provide it to the user for review learning after the vehicle finishes driving.
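The "second time period" containing the marked time point can be derived, for example, as a padded window around the key press or voice request; the padding durations below are illustrative assumptions, not values from this application:

```python
def clip_window(event_time_s: float,
                before_s: float = 30.0,
                after_s: float = 30.0) -> tuple:
    """Derive the 'second time period' for the driving-recorder clip: a
    window that contains the marked time point (the 'current time period'),
    extended before and after it. Padding values are assumptions."""
    start = max(0.0, event_time_s - before_s)  # clamp to the start of the recording
    end = event_time_s + after_s
    return (start, end)

print(clip_window(100.0))  # → (70.0, 130.0)
print(clip_window(10.0))   # → (0.0, 40.0), clamped at the recording start
```

Because the window spans both sides of the event, the second time period necessarily includes the current time period, as the method requires.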
Referring to fig. 11, referring to the above scenario 3, an embodiment of the present invention provides a further information display control method, which may include the following steps:
Step 1101, obtaining user heart rate data when the auxiliary driving system of the vehicle is in a start state.
Step 1102, determining, according to the user heart rate data, whether the user's heart rate exhibits a preset regular change; if so, executing step 1103; otherwise, ending the current flow.
Step 1103, obtaining target information, where the target information includes: the auxiliary driving strategy of the auxiliary driving system in the current time period, the auxiliary driving execution result of the auxiliary driving system in the current time period, the driving video of the driving recorder of the vehicle in a second time period, and a manual driving recommendation operation corresponding to the target event; the second time period includes the current time period.
The current time period is a time period corresponding to the occurrence of the preset regular change of the heart rate of the user.
Step 1104, when the vehicle ends driving, executes step 1105 and step 1107, respectively.
Step 1105, controlling a display screen of an onboard computer of the vehicle to display first notification information through a popup window, and executing step 1106.
In step 1106, in response to the triggering of the first notification information, the target information is output to a display screen of an on-board computer of the vehicle.
Step 1107, controlling the display screen of the terminal device to display the second notification information through the popup window, and executing step 1108.
In one implementation, the information content of the first notification information is the same as the information content of the second notification information.
In another implementation, the information content of the first notification information and the information content of the second notification information are different.
Step 1108, in response to the triggering of the second notification information, outputting the target information to a display screen of a terminal device of the vehicle.
In this embodiment, the time point of a stressful scene that causes a sharp increase in the user's heart rate can be identified via a smart watch or similar device, footage from the driving recorder before and after that time point can be captured in the background, and the auxiliary driving decision logic and so on can then be annotated in the footage, so as to provide it to the user for review learning after driving ends and to reassure the user.
In addition, the surrounding assisted-driving conditions before and after the time point can be recorded, and driving suggestions can be given in light of the user's stressful scene and annotated in the video as well. For example, if the user's heart rate accelerates with tension every time the vehicle enters a tunnel, the user is afterwards provided with the video of that scene along with driving tips for entering tunnels.
This embodiment actively identifies the user's stressful driving scenes and provides teaching guidance, which can strengthen the user's trust in assisted driving and help the user make up for weaknesses in their own driving skills.
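One minimal way to flag heart-rate samples for review is a threshold rule over the readings; the baseline and jump values below are purely illustrative assumptions and stand in for whatever "preset regular change" the system actually uses:

```python
def heart_rate_spikes(samples, baseline: int = 70, jump: int = 25):
    """Return the indices of samples whose heart rate (bpm) exceeds the
    baseline by at least `jump` — a stand-in for detecting the preset
    regular change that marks a stressful driving scene."""
    return [i for i, bpm in enumerate(samples) if bpm - baseline >= jump]

# One reading per interval from a smart watch (hypothetical data)
readings = [68, 72, 71, 104, 99, 70, 69, 101]
print(heart_rate_spikes(readings))  # → [3, 4, 7]
```

Each flagged index would correspond to a time point around which the driving-recorder footage is captured for review.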
Referring to fig. 12, referring to the above scenario 4, an embodiment of the present invention provides a further information display control method, which may include the following steps:
step 1201, determining whether the terminal device is in a working state when the driving support system of the vehicle is in a starting state, if so, executing step 1202, otherwise, ending the current flow.
Step 1202, determining whether the driving road condition of the vehicle is the road condition of the target information to be acquired, if so, executing step 1203, otherwise, ending the current flow.
Step 1203, obtaining target information, where the target information includes: the driving assistance strategy of the driving assistance system in the current time period, the driving assistance execution result of the driving assistance system in the current time period and the driving video of the driving recorder of the vehicle in the second time period; the second time period includes the current time period.
The current time period is the time period during which the terminal device is in a working state and the driving road condition of the vehicle is a road condition for which target information is to be acquired.
Step 1204, when the vehicle ends driving, step 1205 and step 1207 are executed, respectively.
Step 1205, controlling the display screen of the vehicle-mounted computer of the vehicle to display the first notification information through the popup window, and executing step 1206.
In step 1206, in response to the triggering of the first notification information, the target information is output to a display screen of an on-board computer of the vehicle.
Step 1207, controlling the display screen of the terminal device to display the second notification information through the popup window, and executing step 1208.
In one implementation, the information content of the first notification information is the same as the information content of the second notification information.
In another implementation, the information content of the first notification information and the information content of the second notification information are different.
Step 1208, outputting the target information to a display screen of a terminal device of the vehicle in response to the triggering of the second notification information.
In this embodiment, whether a mobile phone logged in with the same user account is unlocked with its screen on can be identified through the user account, and the assisted-driving road condition information can be determined in combination with the driving recorder, so that dangerous situations of various levels (which, although not causing an accident, may still pose risks) arising while the user uses the mobile phone on a driving road section can be actively recorded.
By identifying the usage state of the mobile phone, this embodiment can help record the user's unsafe behavior during use of the auxiliary driving system. Because the user may be distracted by the mobile phone during assisted driving and overlook potential driving dangers, this embodiment actively educates the user to stay attentive during the assisted-driving stage (letting users see how close they came to danger warns them more effectively).
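The gating in steps 1201-1202 can be sketched as a predicate combining the phone's working state with the flagged road condition; the function name and parameters are illustrative assumptions:

```python
def scenario4_should_record(assist_active: bool,
                            phone_in_use: bool,
                            flagged_road_condition: bool) -> bool:
    """Steps 1201-1202: record a review clip when the driver's phone (same
    user account, unlocked with screen on) is in use while the vehicle is on
    a road condition flagged for target-information capture."""
    return assist_active and phone_in_use and flagged_road_condition

print(scenario4_should_record(True, True, True))   # phone in use on flagged road → True
print(scenario4_should_record(True, False, True))  # phone idle → False
```

Only when the predicate holds does step 1203 (obtaining the target information) run.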
In one possible implementation, any method embodiment of the present application may be implemented in conjunction with the existing 1+8+N device distributed data-sharing capabilities of HarmonyOS (a distributed operating system). For example, when a difficult execution scenario is detected while the user is using the auxiliary driving system, the corresponding target information can be acquired automatically.
According to any method embodiment provided by the application, the user can review and learn the auxiliary driving functions after driving ends, which helps the user become familiar with the system logic and usage of the auxiliary driving system, eliminates or weakens the user's doubt and fear toward the auxiliary driving functions, and improves the user's experience with the auxiliary driving system.
An embodiment of the present invention also provides an information display control apparatus, including: the determining module is used for determining whether a preset target event exists or not under the condition that an auxiliary driving system of the vehicle is in a starting state; the acquisition module is used for acquiring target information corresponding to the target event if the target event exists, wherein the target information comprises auxiliary driving information corresponding to the target event; a processing module for executing a target operation corresponding to a predetermined display device in a case where the vehicle ends driving; and the output module is used for responding to the trigger of the target operation and outputting the target information to the display equipment.
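The four modules of the apparatus (determining, acquisition, processing, and output) could be wired together as in the following hypothetical sketch; all names and the callback style are assumptions for illustration only:

```python
class InfoDisplayController:
    """Hypothetical sketch of the four modules of the apparatus:
    determine -> acquire -> process (notify) -> output."""

    def __init__(self, detect_event, fetch_info, notify, display):
        self.detect_event = detect_event   # determining module
        self.fetch_info = fetch_info       # acquisition module
        self.notify = notify               # processing module (target operation)
        self.display = display             # output module

    def on_drive_end(self, assist_was_active: bool):
        """Run the pipeline when the vehicle finishes driving."""
        if not assist_was_active:
            return None
        event = self.detect_event()        # preset target event, or None
        if event is None:
            return None
        info = self.fetch_info(event)      # target information for the event
        if self.notify(info):              # user triggered the target operation
            self.display(info)             # output target info to display device
            return info
        return None

ctl = InfoDisplayController(
    detect_event=lambda: "lane-change-failure",
    fetch_info=lambda e: {"event": e, "video": "clip.mp4"},
    notify=lambda info: True,              # simulate the user accepting the pop-up
    display=lambda info: None,
)
print(ctl.on_drive_end(True)["event"])  # → lane-change-failure
```

The callbacks mirror the claim's module boundaries, so each module could be replaced independently (e.g. a voice-based `notify`).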
One embodiment of the present invention also provides an electronic chip mounted in an electronic device (UE), the electronic chip including: a processor for executing computer program instructions stored on a memory, wherein the computer program instructions, when executed by the processor, trigger an electronic chip to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further proposes a terminal device, which includes a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the terminal device to execute the method steps provided by any of the method embodiments of the present application.
An embodiment of the present application further proposes a server device comprising a communication module, a memory for storing computer program instructions and a processor for executing the program instructions, wherein the computer program instructions, when executed by the processor, trigger the server device to perform the method steps provided by any of the method embodiments of the present application.
An embodiment of the present invention also provides an electronic device including a plurality of antennas, a memory for storing computer program instructions, a processor for executing the computer program instructions, and a communication means (such as a communication module that may enable 5G communication based on an NR protocol), wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method steps provided by any of the method embodiments of the present application.
In particular, in an embodiment of the present application, one or more computer programs are stored in the memory, the one or more computer programs comprising instructions which, when executed by the apparatus, cause the apparatus to perform the method steps described in the embodiments of the present application.
Specifically, in an embodiment of the present application, the processor of the electronic device may be a System On Chip (SOC), where the processor may include a central processing unit (Central Processing Unit, CPU), and may further include other types of processors. Specifically, in an embodiment of the present application, the processor of the electronic device may be a PWM control chip.
In particular, in an embodiment of the present application, the processor may include, for example, a CPU, a DSP (Digital Signal Processor), or a microcontroller, and may further include a GPU (Graphics Processing Unit), an embedded neural-network processor (Neural-network Processing Unit, NPU), and an image signal processor (Image Signal Processor, ISP); the processor may further include a necessary hardware accelerator or logic-processing hardware circuit, such as an ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application, and so on. Further, the processor may have the function of operating one or more software programs, which may be stored in a storage medium.
In particular, in an embodiment of the present application, the memory of the electronic device may be a read-only memory (ROM), other type of static storage device capable of storing static information and instructions, a random access memory (random access memory, RAM) or other type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any computer readable medium capable of carrying or storing desired program code in the form of instructions or data structures and capable of being accessed by a computer.
In particular, in an embodiment of the present application, the processor and the memory may be combined into a processing device, or, more commonly, be separate components; the processor is configured to execute the program code stored in the memory to implement the method described in the embodiments of the present application. In particular, the memory may also be integrated into the processor, or may be separate from the processor.
Further, the devices, apparatuses, modules illustrated in the embodiments of the present application may be implemented by a computer chip or entity, or by a product having a certain function.
It will be apparent to those skilled in the art that embodiments of the present application may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media having computer-usable program code embodied therein.
In several embodiments provided herein, any of the functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application.
Specifically, in an embodiment of the present application, there is further provided a computer readable storage medium, where a computer program is stored, when the computer program is executed on a computer, to cause the computer to perform the method steps provided in the embodiments of the present application.
An embodiment of the present application also provides a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method steps provided by the embodiments of the present application.
The description of embodiments herein is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments herein. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process, such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In the several embodiments provided by the present invention, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the elements is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices, or units, which may be in electrical, mechanical, or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in hardware plus software functional units.
The integrated units, implemented in the form of software functional units, may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) or a Processor (Processor) to perform part of the steps of the methods according to the embodiments of the present invention. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
In the present embodiments, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
All embodiments in the application are described in a progressive manner, and identical and similar parts of all embodiments are mutually referred, so that each embodiment mainly describes differences from other embodiments. In particular, for the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments in part.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as a combination of electronic hardware, computer software, and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting; any modification, equivalent replacement, improvement, or the like made within the spirit and principles of the invention shall fall within its scope of protection.

Claims (18)

1. An information display control method, characterized by comprising:
determining whether a preset target event exists under the condition that an auxiliary driving system of the vehicle is in a starting state;
if the target event exists, acquiring target information corresponding to the target event, wherein the target information comprises auxiliary driving information corresponding to the target event;
executing a target operation corresponding to a predetermined display device when the vehicle finishes driving;
and in response to triggering of the target operation, outputting the target information to the display device.
2. The method of claim 1, wherein the determining whether a preset target event exists comprises:
monitoring whether the auxiliary driving system performs an auxiliary driving operation;
determining, when the auxiliary driving system is monitored to be performing an auxiliary driving operation, whether the auxiliary driving operation is executed successfully;
wherein the case in which the target event exists comprises: a case in which execution of the auxiliary driving operation fails.
3. The method of claim 1, wherein the determining whether a preset target event exists comprises:
detecting whether request information requesting learning of an auxiliary driving strategy is received;
wherein the case in which the target event exists comprises: a case in which the request information is received.
4. The method of claim 3, wherein the detecting whether request information requesting learning of an auxiliary driving strategy is received comprises:
detecting whether a target key of the vehicle configuration is triggered;
wherein receipt of the request information comprises: the target key being triggered.
5. The method of claim 3, wherein the detecting whether request information requesting learning of an auxiliary driving strategy is received comprises:
detecting whether a voice signal for requesting to learn an auxiliary driving strategy is received;
wherein receipt of the request information comprises: the voice signal being received.
6. The method of claim 1, wherein the determining whether a preset target event exists comprises:
determining whether a driving scene of the vehicle is a scene in which target information is to be acquired;
wherein the case in which the target event exists comprises: a case in which the driving scene of the vehicle is a scene in which target information is to be acquired.
7. The method of claim 6, wherein the determining whether the driving scene of the vehicle is a scene in which target information is to be acquired comprises:
acquiring heart rate data of a user;
determining, according to the heart rate data of the user, whether the heart rate of the user changes in a regular pattern;
and if the heart rate of the user changes in a preset pattern, determining that the driving scene of the vehicle is a scene in which target information is to be acquired.
8. The method of claim 1, wherein the determining whether a preset target event exists comprises:
determining whether a terminal device is in a working state;
determining, when the terminal device is in the working state, whether a road condition on which the vehicle is driving is a road condition for which target information is to be acquired;
wherein the case in which the target event exists comprises: a case in which the road condition on which the vehicle is driving is a road condition for which target information is to be acquired.
9. The method of claim 1, wherein the driving assistance information corresponding to the target event comprises:
at least one of: a first time period corresponding to the target event; an auxiliary driving strategy of the auxiliary driving system in the first time period; an auxiliary driving execution result of the auxiliary driving system in the first time period; and a cause of failure when the auxiliary driving execution result indicates a failed execution.
10. The method of claim 1, wherein the target information further comprises: at least one of driving video recorded by a driving recorder (dashcam) of the vehicle in a second time period and a recommended manual driving operation corresponding to the target event;
wherein the second time period comprises the first time period corresponding to the target event.
11. The method of claim 1, wherein the performing the target operation corresponding to the predetermined display device comprises:
controlling the predetermined display device to display predetermined notification information via a pop-up window.
12. The method of claim 1, wherein the display device comprises: a display screen of an in-vehicle computer of the vehicle.
13. The method of claim 1, wherein the display device comprises: a display screen of a terminal device.
14. An information display control apparatus, comprising:
a determining module configured to determine, when an auxiliary driving system of a vehicle is in an activated state, whether a preset target event exists;
an acquiring module configured to acquire, if the target event exists, target information corresponding to the target event, wherein the target information comprises auxiliary driving information corresponding to the target event;
a processing module configured to execute a target operation corresponding to a predetermined display device when the vehicle finishes driving;
and an output module configured to output, in response to triggering of the target operation, the target information to the display device.
15. An electronic chip, comprising:
a processor for executing computer program instructions stored in a memory, wherein the computer program instructions, when executed by the processor, trigger the electronic chip to perform the method of any one of claims 1-13.
16. An electronic device comprising a memory for storing computer program instructions, a processor for executing the computer program instructions, and communication means, wherein the computer program instructions, when executed by the processor, trigger the electronic device to perform the method of any of claims 1-13.
17. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-13.
18. A computer program product, characterized in that the computer program product comprises a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-13.
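The control flow recited in claims 1, 2, and 11 — record target events while the auxiliary driving system is active, raise a pop-up when the drive ends, and output the collected target information when the pop-up is triggered — can be sketched as a small state machine. The sketch below is illustrative only: every class, method, and field name is an assumption introduced here rather than terminology from the patent, and the pop-up is modeled as a returned string instead of a real UI.

```python
from dataclasses import dataclass
from typing import List, Optional


@dataclass
class TargetInfo:
    """Target information per claims 1 and 9: the event's time period, the
    auxiliary driving strategy used, the execution result, and (on failure)
    the cause. Field names are hypothetical."""
    time_period: str
    strategy: str
    result: str
    failure_cause: Optional[str] = None


class InfoDisplayController:
    """Sketch of the claim-1 flow, with claim 2's failed-operation event
    as the target-event source."""

    def __init__(self) -> None:
        self.assist_active = False          # auxiliary driving system state
        self.pending: List[TargetInfo] = [] # events recorded during the drive
        self.displayed: List[TargetInfo] = []

    def on_assist_operation(self, strategy: str, succeeded: bool,
                            period: str) -> None:
        # Claim 2: a failed auxiliary driving operation is a target event,
        # but only while the auxiliary driving system is activated.
        if self.assist_active and not succeeded:
            self.pending.append(
                TargetInfo(period, strategy, "failed", "unspecified"))

    def on_drive_end(self) -> Optional[str]:
        # Claim 11: when the drive ends, show predetermined notification
        # information via a pop-up window (modeled as a string here).
        if self.pending:
            return "popup: auxiliary driving events were recorded this trip"
        return None

    def on_popup_confirmed(self) -> List[TargetInfo]:
        # Claim 1: in response to the trigger of the target operation,
        # output the target information to the display device.
        self.displayed, self.pending = self.pending, []
        return self.displayed
```

A drive with one failed lane-keep operation would then produce a pop-up at the end of the trip, and confirming the pop-up hands the single recorded `TargetInfo` to the display layer while clearing the pending queue.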
CN202210788133.4A 2022-07-04 2022-07-04 Information display control method, device, electronic equipment and medium Pending CN117382661A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210788133.4A CN117382661A (en) 2022-07-04 2022-07-04 Information display control method, device, electronic equipment and medium


Publications (1)

Publication Number Publication Date
CN117382661A true CN117382661A (en) 2024-01-12

Family

ID=89470824

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210788133.4A Pending CN117382661A (en) 2022-07-04 2022-07-04 Information display control method, device, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN117382661A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination