CN106891724B - Automobile main control screen system and method

Info

Publication number
CN106891724B
Authority
CN
China
Prior art keywords
information
module
main control
control screen
execution
Prior art date
Legal status
Active
Application number
CN201710046134.0A
Other languages
Chinese (zh)
Other versions
CN106891724A (en)
Inventor
郑晓鹏
刘旺
吴国彬
艾惠灵
余蔚
赵亮
蔡路益
陈龙
Current Assignee
Banma Information Technology Co Ltd
Original Assignee
Banma Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Banma Information Technology Co Ltd
Priority to CN201710046134.0A
Publication of CN106891724A
Application granted
Publication of CN106891724B

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60K - ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 - Arrangement of adaptations of instruments
    • B60K35/10
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • B60R16/0237 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems circuits concerning the atmospheric environment
    • B60K2360/145
    • B60K2360/146

Abstract

An automobile main control screen system and method that perform safety identification processing on main control screen inputs so as to reduce misoperation. The method includes receiving control input information; processing the information to identify and discriminate the action it corresponds to; and judging whether that action is to be executed, executing the control action of the information if so and not executing it otherwise. Before the judging step, the method further comprises querying whether the information is to be executed; before the querying step, the method further comprises collecting environmental data to perform predictive analysis on the execution consequence of the information. If the execution consequence of the information is judged to be adverse, the query is required, so that misoperation is corrected and safety while driving is ensured.

Description

Automobile main control screen system and method
Technical Field
The invention relates to an automobile main control screen system and method, and in particular to a system and method applied to an automobile that perform safety identification processing on main control screen inputs so as to reduce misoperation.
Background
Touch screens have been increasingly used in automobiles because their compact interfaces and personalized designs are popular with the market. With a touch screen, a finger touches the surface of the screen, and the internal system identifies the touch position and determines what action is to be executed.
The touch screen installed in the automobile, namely the automobile's main control screen, integrates the control functions of many electronic devices in the vehicle. The most common operations, such as music playback, radio control and traffic navigation, and even window control, have gradually been integrated into the main control screen; intelligent automobiles in particular cannot do without it. When the control terminals of almost all electronic devices are integrated into the main control screen, centralized control becomes convenient, but more risks also arise.
The traditional main control screen is arranged in the middle of the front row of the automobile, usually above the gear lever and beside the front instrument cluster. This position is best suited for placing a large touch screen as the main control screen, but it is not at the same height as the driver's line of sight. That is, the driver's eyes must leave the road in order to look at the main control screen, which introduces an unsafe factor into driving. Moreover, the main control screen uses soft keys, which give no tactile feedback, so the driver will typically look at the main control screen to confirm whether the corresponding action has been performed. This is a difficulty that current main control screens find hard to overcome: the soft keys cause the driver's gaze to linger unconsciously. Improving the main control screen at the hardware level would require a large amount of up-front research and development, and the resulting production cost is difficult to sustain.
Misoperation also occurs easily when operating the main control screen. Because the boundaries of adjacent soft keys coincide, two or more soft keys are sometimes pressed at once, and their priority is difficult to set in advance in software. It is also easy to press a key adjacent to the intended one, because the driver still wants to watch the road while operating the main control screen; drivers familiar with the main control screen in particular tend to press keys out of habit. Such misoperations lead to a vicious circle of further misoperations, which is a safety risk for a moving automobile.
The traditional main control screen can update its software or control elements, but such updates basically only repair bugs or defects in the software. Patch updates are passive, and many problems remain unsolved. Many drivers never update at all, so the opportunity to correct the main control screen is lost.
Many main control screens combine soft keys and hard keys, which improves operating accuracy. However, the main control screen system is continually updated and its functions enriched, so soft-key operations and hard-key operations often conflict with each other, which affects recognition by the control system.
The main control screen can connect to and control almost all electronic devices in the automobile, acting as a control terminal for operating the vehicle. However, the conventional main control screen serves only as an input terminal in the control chain, that is, merely as an integrated switch for the electronic devices, whose main feedback function is only to display what was input. As a control input terminal in a vehicle, especially while driving, the safety of the main control screen should be considered. As the most important link in human-vehicle interaction, the control function of the main control screen can also serve as an intelligent medium for that interaction.
Because the main control screen occupies a key position in automobile control, its research, development and design make it a key device in novel automobiles such as intelligent vehicles, connected vehicles, driverless vehicles and the Internet of Vehicles. A main control screen that is more intelligent, safer and more convenient is also what the market pursues.
Disclosure of Invention
The invention aims to provide an automobile main control screen system which takes the main control screen as the core of a control system and uses a processing module, an auxiliary module and a sensing module to collect information and map it to actions of an execution module, so as to control the automobile safely, effectively and stably.
Another object of the present invention is to provide a vehicle main control screen system, which is used for information collection and input, and ensures integration and simultaneously accurately judges the action to be executed.
Another object of the present invention is to provide a main control screen system for an automobile which can perform control operations effectively and reliably without requiring the driver's gaze, thereby reducing unsafe factors during driving.
Another object of the present invention is to provide a vehicle main control screen system, which utilizes the auxiliary module to feed back both input and execution, thereby avoiding repeated operation and conflicting operation.
Another object of the present invention is to provide a vehicle main control screen system, wherein the processing module identifies and analyzes the information to accurately, safely and reliably identify the information to be executed, and further analyze the information.
Another objective of the present invention is to provide a vehicle main control screen system, wherein the processing module performs predictive analysis on the action to be executed according to the data of the sensing module, and determines whether there is a possibility of misoperation or dangerous influence on driving.
Another objective of the present invention is to provide a vehicle main control screen system, wherein the processing module can distinguish information according to habituation and accuracy, and accurately identify a wrong operation by cooperating with the auxiliary module to find a correct operation.
Another object of the present invention is to provide a vehicle main control screen system, wherein the processing module and the auxiliary module utilize self-learning and adaptive analysis processes, and can operate in accordance with the situation without actively updating the system.
Another object of the present invention is to provide a vehicle main control screen system, in which the processing module can exchange information with the outside in a communication manner, so that vehicle control can have more various choices based on human-vehicle interaction.
Another object of the present invention is to provide a vehicle main control screen system, which provides richer resources for the processing module to analyze information through communication and contact with other intelligent devices.
Another object of the present invention is to provide a vehicle main control screen system, wherein the auxiliary module not only automatically analyzes misoperation and operational danger, but also queries and confirms the information so that instructions given to the system are not ignored.
Another object of the present invention is to provide an automobile main control screen system, which can determine and handle input conflicts on the main control screen and resolve them stably and reliably.
Another object of the present invention is to provide a car main control screen system, which makes full use of the role of the main control screen in car control and provides a safer, intelligent and convenient control system.
Another object of the present invention is to provide a vehicle main control screen system, which reduces the cost of hardware development, provides stable centralized control using the main control screen system, has personalized advantages, and can be adapted to more occasions.
Another object of the present invention is to provide a car main control screen system which reduces the need to look at the main control screen while operating it, thereby reducing the probability of danger arising from the operation of the main control screen.
According to one aspect of the present invention, there is provided an automobile main control screen method, comprising:
receiving control input information;
processing the information to identify and distinguish an action of the information; and
judging whether the action of the information is to be executed, executing the control action of the information if so and not executing it if not, wherein before the judging step the method further comprises querying whether the information is to be executed, and wherein before the querying step the method further comprises collecting environmental data to perform predictive analysis on the execution consequence of the information; if the consequence is judged to be adverse, the query is required, and misoperation is corrected through the processing and judging of the information, so that safety while driving is ensured.
Preferably, processing the information further comprises identifying the information and discriminating the information, wherein identifying the information yields a corresponding action, and wherein discriminating the information analyzes its accuracy and habituation.
It is worth mentioning that the discrimination of the information determines the accuracy of the characteristics of the information to check whether there is a misoperation, and performs habitual analysis on the information to check against the habit data whether the information is a misoperation.
It is worth mentioning that the discrimination of the information matches the information against data in a habit database and identifies whether the information is a misoperation.
It is worth mentioning that, if the result of the processing of the information is a misoperation, the query asking whether the information is to be executed is performed.
Preferably, in the step of collecting the environmental data for predictive analysis of the execution consequence of the information, the environmental data relates to the vehicle's surroundings and the driving environment, so that the environmental data serves as the source of parameter data for identifying, discriminating and predictively analyzing the information.
Preferably, the vehicle main control screen method further comprises collecting an external setting parameter, wherein the external setting parameter is used as a parameter for vehicle control to participate in the prediction analysis of the information.
It is worth mentioning that, in the step of collecting the environmental data to predictively analyze the execution consequence of the information, the consequence of the action that the information would execute is predicted and it is judged whether that consequence is an adverse consequence.
It is worth mentioning that the execution consequence of the information is predicted using the environment data and the setting parameter.
It is worth mentioning that the outcome of the information is predicted by using simulation analysis, and if the result is an adverse outcome, the information needs to be queried.
Preferably, the information inquiring whether to execute further comprises a visual inquiry and a voice inquiry, wherein the visual inquiry displays the condition of the information in the main control screen to wait for reconfirmation, and the voice inquiry provides reconfirmation for the condition of the information through voice reminding.
It is worth mentioning that, according to the result of the processing of the information, if the accuracy discrimination determines that the information is a misoperation, the query asking whether the information is to be executed is performed.
It is worth mentioning that, according to the result of the processing of the information, if the habitual discrimination judges the information to be a misoperation, the query asking whether the information is to be executed is performed.
It is worth mentioning that, if the result of the predictive analysis of the information is determined to be an adverse consequence, the query asking whether the information is to be executed is performed.
According to another aspect of the present invention, there is provided a vehicle main control screen system, comprising: the main control screen, the processing module, the auxiliary module and the execution module are connected with each other in a communication mode, wherein the main control screen collects control input information, the processing module and the auxiliary module process and analyze the information to correspond to an action, and the execution module outputs and executes the action according to results of the processing module and the auxiliary module.
Preferably, the main control screen further comprises an input module and a display module, wherein the input module receives the information, and the display module displays the operation and feedback information of the system.
Preferably, the system further comprises a sensing module, wherein the sensing module is correspondingly arranged on the automobile to collect environment data, wherein the environment data is related to the environment of the automobile and the driving environment.
It is worth mentioning that the environmental data collected by the sensing module is used by the processing module as one of the data sources analyzed by the processing module.
It is worth mentioning that the sensing module further comprises an environment monitoring module and a driving monitoring module, wherein the environment monitoring module and the driving monitoring module respectively collect and detect the surrounding environment and the driving environment of the automobile.
It is worth mentioning that the environment monitoring module obtains the conditions around the automobile through a temperature sensor, a humidity sensor, a rainfall sensor, a light sensor and a wind direction and speed sensor, and the driving monitoring module obtains the driving condition of the automobile through a rotation speed sensor, a speed sensor, an acceleration sensor, an electronically controlled engine sensor, a pressure sensor, a steering angle sensor and a torque and hydraulic pressure sensor.
It should be noted that the input module further includes a fingerprint input unit, a gesture input unit, and a hard key input unit, wherein the fingerprint input unit, the gesture input unit, and the hard key input unit perform classified recognition on the received information.
It should be noted that the processing module further includes an identification module, a discrimination module, a prediction module and an external input interface, wherein the identification module, the discrimination module and the prediction module analyze the information received by the input module, wherein an external setting parameter is input through the external input interface, wherein the identification module performs preliminary processing on the information received by the input module, wherein the discrimination module analyzes the accuracy and habituation of the information, and wherein the prediction module predicts the execution consequence of the information using the environmental data, the setting parameter and the like.
It is worth mentioning that the external input interface is in communication with the network for receiving data in the network.
It should be noted that the discrimination module further includes an accuracy discrimination unit and a habitual discrimination unit, wherein the accuracy discrimination unit determines the accuracy of the characteristics of the information to check whether there is a misoperation, and the habitual discrimination unit performs habitual analysis on the information to check against the habit data whether the information is a misoperation.
It is worth mentioning that the discrimination module determines the misoperation of the information according to the characteristics of the information, especially the characteristics obtained by the input module.
It is worth mentioning that the accuracy discrimination unit performs matching analysis on the information by data in a database.
It is worth mentioning that the habitual discrimination unit compares and matches the relevant data of the information against the habit database.
It should be noted that the prediction module analyzes the execution consequence of the information according to the environmental data obtained by the sensing module and the external setting parameter obtained through the external input interface.
It is worth mentioning that the prediction module judges whether the execution consequence of the information is an adverse consequence, and when the information is judged to have an adverse consequence, the auxiliary module of the system reconfirms the execution of the information.
It is worth mentioning that the auxiliary module further comprises a feedback module and an inquiry module, wherein the feedback module is communicatively interconnected with the input module and the execution module to provide feedback of the input, processing and execution of the information, and wherein the inquiry module is communicatively interconnected with the discrimination module and the prediction module of the processing module to assist in inquiring whether the information is executed so that the process of processing and analyzing the information is perceived.
It should be noted that the feedback module further includes an input feedback unit, a display feedback unit and an execution feedback unit, wherein the input feedback unit is connected to the input module, wherein the display feedback unit provides feedback data for the display module, and wherein the execution feedback unit is connected to the execution module.
It should be noted that the input feedback unit obtains the information from the input module and displays it, wherein the display feedback unit feeds the information and the data of its processing back to the display module, so that the corresponding process of the information is displayed by the display module, and the execution feedback unit communicates with the execution module to feed back and display the state of the information during execution.
It is worth mentioning that the query module further includes a voice query unit and a visual query unit, wherein the voice query unit queries by voice whether the information in the discrimination module and the prediction module is to continue to be executed, and wherein the visual query unit queries visually whether that information is to continue to be executed.
Drawings
Fig. 1 is an operational view of the car main control screen system according to a preferred embodiment of the present invention.
Fig. 2 is a frame diagram of the car main control screen system according to the above preferred embodiment of the present invention.
Fig. 3 is a flowchart of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 4 is a flowchart illustrating operations of modules of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 5 is an input feedback process of the automobile main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 6 is a flowchart illustrating a process of predicting the information by the prediction module of the automobile main control screen system and method according to the preferred embodiment of the present invention.
Fig. 7 is a flowchart illustrating the information analysis by the discrimination module of the automobile main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 8 is an analysis flowchart of the prediction module predicting the execution result of the information according to the above preferred embodiment of the present invention.
Fig. 9 is a flowchart of the automobile main control screen system and method for resolving a conflict between the hard key input unit and the fingerprint input unit and the gesture input unit according to the above preferred embodiment of the present invention.
Fig. 10 is a possible scenario of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 11 is another possible scenario of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 12 is another possible scenario of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 13 is another possible scenario of the car main control screen system and method according to the above preferred embodiment of the present invention.
Fig. 14 is another possible scenario of the car main control screen system and method according to the above preferred embodiment of the present invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.
It is understood that the terms "a" and "an" should be interpreted as "at least one" or "one or more"; that is, in one embodiment the number of an element may be one, while in another embodiment the number may be plural, and the terms "a" and "an" are not to be interpreted as limiting the number.
FIG. 1 is a schematic diagram illustrating the operation of an automobile main control screen system according to a preferred embodiment of the present invention. The automobile main control screen system and method process and analyze the input of the main control screen, in particular soft-key input, so as to control the electronic devices in the automobile accurately and stably. The method comprises receiving control input information, processing the information for identification and discrimination, collecting relevant environmental data to predictively analyze the execution consequence of the information, and executing the control action of the information, wherein before executing the control action the method further comprises querying whether information with an adverse predicted consequence should proceed; that is, if the result of the predictive analysis is an adverse consequence, a query is needed to confirm continued execution. Misoperation is thereby corrected through the processing and analysis of the information, ensuring safety while driving.
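As an illustrative aid only, and not part of the patent text, the overall flow just described can be sketched in Python roughly as follows; the data fields and callable names are assumptions introduced for this sketch and stand in for the modules discussed below.

```python
from dataclasses import dataclass, field

@dataclass
class ControlInput:
    source: str            # "soft_key", "gesture", "fingerprint" or "hard_key"
    target: str            # e.g. "high_beam", "broadcast", "sunroof"
    features: dict = field(default_factory=dict)   # raw features extracted from the input

def handle_input(info: ControlInput, environment: dict,
                 identify, discriminate, predict, query, execute) -> None:
    """Hypothetical pipeline; the callables stand in for the patent's modules."""
    action = identify(info)                    # identification: map the input to an action
    suspicious = discriminate(info, action)    # discrimination: accuracy and habit checks
    adverse = predict(action, environment)     # prediction: simulate the execution consequence
    if suspicious or adverse:
        if not query(info, action):            # voice or visual reconfirmation by the driver
            return                             # dropped: the misoperation is corrected
    execute(action)                            # execution module drives the electronic device
```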
As shown in fig. 2, the system includes a main control screen 10, a processing module 20, an auxiliary module 30, and an execution module 40, wherein the main control screen 10, the processing module 20, the auxiliary module 30, and the execution module 40 are communicably connected to each other, wherein the main control screen 10 collects a control input information, the processing module 20 and the auxiliary module 30 process and analyze the information to correspond to an action, and the execution module 40 outputs and executes the action according to the results of the processing module 20 and the auxiliary module 30. Fig. 1 is a schematic view of a preferred embodiment of the present invention, and the main control screen 10 is installed on a control panel of an automobile. The processing module 20 and the auxiliary module 30 are disposed in a panel of a vehicle and communicably connected to the main control screen 10. The execution module 40 is an electronic device controlled by the main control screen 10 in the automobile. Preferably, the execution module 40 may be an electronic device in a car, such as a car audio control terminal, a car telephone control terminal, a wiper control terminal, a sunroof switch, an air conditioner control terminal, and the like. The execution module 40 is matched to the corresponding information, so as to trigger, convert or drive the electronic device corresponding to the execution module 40 through the information.
The control input information is entered through the main control screen 10. In the preferred embodiment, the information is operation information input to control actions of the vehicle. The main control screen 10 further includes an input module 11 and a display module 12, wherein the input module 11 receives the information, including soft-key information, and the display module 12 displays the operation and feedback information of the system.
The system further comprises a sensing module 50, wherein the sensing module 50 is correspondingly disposed on the vehicle to collect environmental data, wherein the environmental data relates to the vehicle's surroundings and the driving environment. The environmental data collected by the sensing module 50 is used by the processing module 20 as one of its data sources. The sensing module 50 further includes an environment monitoring module 51 and a driving monitoring module 52, wherein the environment monitoring module 51 and the driving monitoring module 52 respectively collect and detect the surrounding environment and the driving state of the vehicle. Preferably, the environment monitoring module 51 obtains the conditions around the vehicle through a temperature sensor, a humidity sensor, a rainfall sensor, an illumination sensor, a wind direction and speed sensor, and the like, and the driving monitoring module 52 obtains the driving condition of the vehicle through a rotation speed sensor, a speed sensor, an acceleration sensor, an electronically controlled engine sensor, a pressure sensor, a steering angle sensor, a torque and hydraulic pressure sensor, and the like. In the preferred embodiment, high accuracy is not required of the sensors used by the environment monitoring module 51 and the driving monitoring module 52, as long as the sensing module 50 can analyze and summarize the vehicle's environment in real time. The environmental data collected by the sensing module 50 is used by the processing module 20 to discriminate and predict the control input information and is the source of environmental parameter data. With the environmental data, the processing module 20 can analyze the information more reliably, so as to identify the information to be executed accurately, safely and dependably, and to analyze it further.
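A minimal sketch, assuming a simple dictionary representation, of how the readings of the environment monitoring module 51 and the driving monitoring module 52 might be merged into the environment data used for discrimination and prediction; the sensor field names are illustrative, not defined by the patent.

```python
def collect_environment_data(env_sensors: dict, drive_sensors: dict) -> dict:
    """Merge the two monitoring modules' readings into one environment-data record."""
    return {
        # environment monitoring module 51
        "temperature": env_sensors.get("temperature"),
        "humidity": env_sensors.get("humidity"),
        "rainfall": env_sensors.get("rainfall"),
        "illumination": env_sensors.get("illumination"),
        "wind": env_sensors.get("wind"),
        # driving monitoring module 52
        "engine_rpm": drive_sensors.get("engine_rpm"),
        "speed": drive_sensors.get("speed"),
        "acceleration": drive_sensors.get("acceleration"),
        "steering_angle": drive_sensors.get("steering_angle"),
    }
```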
The processing module 20 further includes an identification module 21, a discrimination module 22, a prediction module 23 and an external input interface 24, wherein the identification module 21, the discrimination module 22 and the prediction module 23 analyze the information received by the input module 11, wherein an external setting parameter is input through the external input interface 24, wherein the identification module 21 performs preliminary processing on the information received by the input module 11, wherein the discrimination module 22 analyzes the accuracy and habituation of the information, and wherein the prediction module 23 predicts the execution consequence of the information using the environmental data, the setting parameter and the like. Preferably, the external input interface 24 communicates with a network to receive data from the network. For example, the external input interface 24 receives weather data from the network and cooperates with the environment monitoring module 51 to obtain real-time weather data around the vehicle. The external input interface 24 also connects to the Internet of Vehicles and receives its data. For example, the external input interface 24 receives the traffic conditions of other vehicles and combines them with the driving data monitored by the driving monitoring module 52 to provide the vehicle's driving situation for the processing module 20. In this way, the external input interface 24 provides reference data for the processing module 20's analysis of the information. The identification module 21 performs preliminary processing, such as amplification and feature extraction, on the information received by the input module 11. The input module 11 further includes a fingerprint input unit 111, a gesture input unit 112 and a hard key input unit 113, wherein the fingerprint input unit 111, the gesture input unit 112 and the hard key input unit 113 perform classified recognition on the received information. That is to say, the main control screen 10 may not only directly receive touch-screen information, but also receive fingerprint information, gesture information and hard-key information, and treat them as features. The identification module 21 of the processing module 20 performs preliminary recognition according to the different classifications of the input module 11 and extracts the features of the information. After that, the discrimination module 22 of the processing module 20 discriminates the information to determine whether it is a misoperation or has a dangerous impact on driving.
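The classified recognition by the input module 11 and the preliminary feature extraction by the identification module 21 could look roughly like the following sketch; the event fields are assumptions introduced for illustration.

```python
def classify_and_extract(raw_event: dict) -> dict:
    """Route a raw screen event to one of the input classes and keep the features
    that the discrimination module will analyze later."""
    if raw_event.get("fingerprint") is not None:
        kind, payload = "fingerprint", raw_event["fingerprint"]
    elif raw_event.get("gesture") is not None:
        kind, payload = "gesture", raw_event["gesture"]
    elif raw_event.get("hard_key") is not None:
        kind, payload = "hard_key", raw_event["hard_key"]
    else:
        kind, payload = "soft_key", raw_event.get("position")
    return {
        "kind": kind,
        "payload": payload,
        "position": raw_event.get("position"),    # where on the screen the press landed
        "pressure": raw_event.get("pressure"),    # later used to flag ambiguous pressure
        "timestamp": raw_event.get("timestamp"),  # later used by the habitual analysis
    }
```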
The discrimination module 22 further comprises an accuracy discrimination unit 221 and a habitual discrimination unit 222, wherein the accuracy discrimination unit 221 determines the accuracy of the characteristics of the information to check whether it is a misoperation, and the habitual discrimination unit 222 performs habitual analysis on the information to check against the habit data whether the information is a misoperation. It should be noted that the discrimination module 22 determines misoperation of the information according to the characteristics of the information, particularly the characteristics obtained by the input module 11. When the accuracy discrimination unit 221 or the habitual discrimination unit 222 analyzes the information and determines it to be a misoperation, the processing module 20 and the auxiliary module 30 re-query the information. Preferably, the accuracy discrimination unit 221 analyzes the characteristics of the information received by the fingerprint input unit 111, the gesture input unit 112 and the hard key input unit 113 of the input module 11. For example, the accuracy discrimination unit 221 analyzes the position of the information on the main control screen 10, its content and the like, and determines whether the information exactly matches the action to be performed. Preferably, when the information involves a press on a soft-key boundary, critical information, an unknown gesture, ambiguous pressure or the like, the accuracy discrimination unit 221 determines the accuracy of the information. Preferably, the accuracy discrimination unit 221 performs matching analysis on the information against data in a database. In addition, the habitual discrimination unit 222 preferably compares and matches the relevant data of the information against the habit database. For example, the habitual discrimination unit 222 performs matching analysis on the time, position, repetition and order of occurrence of the information, so that a misoperation may be recognized from habit. Further, after the information is discriminated by the discrimination module 22, the prediction module 23 performs predictive analysis on the information; by using self-learning and adaptive analysis processes, the system can operate in accordance with the situation without being actively updated.
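A hypothetical sketch of the two checks in the discrimination module 22, combining an accuracy check on the input's features with a match against a habit database; the thresholds and record fields are assumed for illustration only.

```python
def is_suspected_misoperation(features: dict, habit_db: list[dict]) -> bool:
    """Return True if either the accuracy check or the habit check flags the input."""
    # accuracy discrimination unit 221: boundary presses, unknown gestures and
    # ambiguous pressure are treated as inaccurate inputs
    inaccurate = (
        features.get("on_key_boundary", False)
        or (features.get("kind") == "gesture" and features.get("payload") == "unknown")
        or (features.get("pressure") is not None and features["pressure"] < 0.2)
    )
    # habitual discrimination unit 222: compare the time and target of the input with
    # records in the habit database (an empty database never flags an input here)
    matches_habit = any(
        rec["target"] == features.get("target")
        and abs(rec["hour"] - features.get("hour", rec["hour"])) <= 1
        for rec in habit_db
    )
    return inaccurate or (bool(habit_db) and not matches_habit)
```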
The prediction module 23 analyzes the execution consequence of the information according to the environmental data obtained by the sensing module 50 and the external setting parameter obtained through the external input interface 24. Preferably, the prediction module 23 performs a simulation analysis. Because safety is the primary requirement when the automobile is running, the prediction module 23 analyzes the information with respect to driving safety; of course, it also analyzes comfort and entertainment. The prediction module 23 judges whether the execution consequence of the information is adverse. When the information is determined to have an adverse consequence, the auxiliary module 30 of the system reconfirms the execution of the information; preferably, the auxiliary module 30 re-queries to confirm the execution of the information. When the information is determined to have no adverse consequence, the information is considered executable and will be executed by the execution module 40. That is, before the information is executed, the processing module 20 of the system analyzes and predicts it, determines the possibility of misoperation and prevents the occurrence of a critical safety event, and the auxiliary module 30 of the system may query the information to assist in confirming whether it is to be executed. Adverse consequences in the preferred embodiment are those caused by faulty operation, incomplete accuracy, harm to driving, unintended operation, an unintended result, or control actions that affect other tasks.
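A minimal sketch, under assumed rules and thresholds, of how the prediction module 23 might flag an adverse consequence before execution; the high-beam rule corresponds to the example given later with reference to Fig. 6, and none of the rules are specified by the patent.

```python
def predict_adverse(action: str, environment: dict, external_params: dict) -> bool:
    """Return True if executing the action is predicted to have an adverse consequence."""
    if action == "open_sunroof" and environment.get("rainfall", 0) > 0:
        return True                    # raining: opening the sunroof is undesirable
    if action == "high_beam_on" and external_params.get("high_beam_forbidden", False):
        return True                    # the current road section forbids high beams
    if action.startswith("window_") and environment.get("speed", 0) > 120:
        return True                    # window control at high speed affects safety
    return False
```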
The auxiliary module 30 further comprises a feedback module 31 and a query module 32, wherein the feedback module 31 is communicatively interconnected with the input module 11 and the execution module 40 to provide feedback on the input, processing and execution of the information, and wherein the query module 32 is communicatively interconnected with the discrimination module 22 and the prediction module 23 of the processing module 20 to assist in querying whether the information is to be executed. It should be noted that the feedback module 31 provides the data to be fed back to the display module 12, which displays it. Of course, the feedback device connected to the feedback module 31 may be not only the display module 12 but also other electronic devices in the vehicle, such as electronic indicator lights, interior lighting and speakers, which serve as media for the feedback module 31 to provide feedback to the input module 11. The feedback module 31 reports the input condition of the information and the state of its processing, so that these processes can be perceived and the information can conveniently be entered into the main control screen 10 and displayed. Furthermore, the feedback module 31 further includes an input feedback unit 311, a display feedback unit 312 and an execution feedback unit 313, wherein the input feedback unit 311 is connected to the input module 11, wherein the display feedback unit 312 provides feedback data for the display module 12, and wherein the execution feedback unit 313 is connected to the execution module 40. Specifically, the input feedback unit 311 obtains the information from the input module 11 and displays it. The display feedback unit 312 feeds the information and the data of its processing back to the display module 12, so that the corresponding process of the information is displayed by the display module 12. The execution feedback unit 313 communicates with the execution module 40 and feeds back and displays the state of the information during execution. Preferably, the input feedback unit 311 performs voice feedback to report the input condition of the information, so as to prevent repeated input. Preferably, the input feedback unit 311 provides light feedback, controlling a prompt light of the execution module 40 corresponding to the information so that the prompt light serves as a feedback signal. Preferably, the input feedback unit 311 and the display feedback unit 312 cooperate with each other, so that data of the input feedback unit 311 is displayed on the main control screen 10 through the display feedback unit 312 and the display module 12. It should be noted that, preferably, the input feedback unit 311 displays the characteristics of the information in the display module 12 through the display feedback unit 312; for example, the input feedback unit 311 displays the fingerprint information in the display module 12 and feeds back which finger made the input. In addition, the execution feedback unit 313 obtains the execution condition of the information from the execution module 40 and feeds it back. The execution condition of the execution module 40 may be obtained from the control information of the automotive electronic device or from the operating effect of the electronic device.
In the preferred embodiment, the execution feedback unit 313 obtains the action effect of the electronic device; for example, after the controlled media has started playing, the execution feedback unit 313 provides feedback to prevent repeated operations. In this way, the system can accurately judge the action to be executed while preserving integration.
When the input module 11 receives the information, the input feedback unit 311 feeds the information back. In addition, the input feedback unit 311 feeds back the information through the display feedback unit 312. Preferably, the display feedback unit 312 displays the receiving condition of the information in the form of a progress bar. Preferably, the input feedback unit 311 reports the receiving condition of the information by a voice prompt.
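Purely as an illustration of the fan-out just described, the feedback module 31 could dispatch to its three feedback units along these lines; the display, speaker and indicator-light interfaces are assumptions, not devices named by the patent.

```python
class FeedbackModule:
    """Fans one event out to the input, display and execution feedback channels."""
    def __init__(self, display, speaker, indicator_lights):
        self.display = display          # display module 12
        self.speaker = speaker          # voice feedback channel
        self.lights = indicator_lights  # prompt lights of the controlled devices

    def on_input(self, info: dict) -> None:        # input feedback unit 311
        self.speaker.say(f"received {info['target']}")
        self.display.show(f"input: {info['kind']} -> {info['target']}")

    def on_processing(self, stage: str) -> None:   # display feedback unit 312
        self.display.show(f"processing: {stage}")

    def on_executed(self, action: str, ok: bool) -> None:   # execution feedback unit 313
        self.lights.set(action, on=ok)
        self.display.show(f"executed: {action}" if ok else f"not executed: {action}")
```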
The query module 32 further includes a voice query unit 321 and a visual query unit 322, wherein the voice query unit 321 queries by voice whether the information in the discrimination module 22 and the prediction module 23 is to continue to be executed, and wherein the visual query unit 322 queries visually whether that information is to continue to be executed. Preferably, information for which the discrimination module 22 finds a possible misoperation, or for which the prediction module 23 predicts an adverse consequence, is reconfirmed through the query module 32. In the preferred embodiment, the voice query unit 321 is a voice reminding and voice collecting device that queries and reconfirms the information. In another optional preferred form, the voice query unit 321 issues a prompt tone and the input module 11 receives the input again, so that a possible misoperation of the information can be confirmed and an adverse effect avoided. For example, after the discrimination module 22 discriminates the information and determines that it does not match the habit data and may be a misoperation, the query module 32 queries the information through the voice query unit 321 to determine whether the operation corresponding to the information should continue. In the preferred embodiment, the visual query unit 322 preferably issues its query through the display module 12 of the main control screen 10, or through an indicator light of an electronic device in the automobile. For example, after the prediction module 23 predicts the consequence of the information and the execution consequence is determined to be adverse, the visual query unit 322 queries about the execution of the information. Preferably, the visual query unit 322 uses a visual effect on the main control screen 10 to ask whether the information is to continue to be executed, which also makes input and confirmation through the input module 11 convenient. It is worth mentioning that the query module 32 cooperates with the feedback module 31 for further execution and confirmation of the information. Preferably, the feedback module 31 provides feedback on the information, and the query module 32 provides corresponding feedback data to the feedback module 31.
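A sketch, with assumed voice and visual interfaces, of how the query module 32 might reconfirm a suspected misoperation or a predicted adverse consequence before execution proceeds.

```python
def reconfirm(info: dict, reason: str, voice_unit, visual_unit,
              timeout_s: float = 5.0) -> bool:
    """Return True if the driver confirms execution, False if the input is dropped."""
    visual_unit.show(f"Execute '{info['target']}'? ({reason})")  # visual query unit 322
    voice_unit.ask(f"Do you want to {info['target']}?")          # voice query unit 321
    answer = voice_unit.listen(timeout_s) or visual_unit.poll(timeout_s)
    return answer == "yes"
```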
Fig. 3 is a flowchart of the automobile main control screen system and method. The information is input to the main control screen 10 so that the input module 11 of the main control screen 10 receives the information. The information is received and then processed by the processing module 20. Specifically, the processing module 20 processes and analyzes the information through the identification module 21, the identification module 22 and the prediction module 23, determines the possibility of misoperation and adverse consequences of the information, and sends the information to be confirmed to the auxiliary module 30 for inquiry. The auxiliary module 30 inquires and judges the execution of the information, and if the execution is confirmed to be continued, the information is delivered to the execution module 40; if the information is determined not to be executed, the execution module 40 does not execute the information, and the auxiliary module 30 feeds back the processing and execution conditions of the information. It is worth mentioning that the processing and analysis of the information by the processing module 20 requires the collection of the environmental data around the vehicle by the sensing module 50. Moreover, the processing module 20 also analyzes the probability of misoperation and the possibility of adverse consequences of the information according to the environment data in which the information is located.
The specific process by which the input module 11 of the main control screen 10 receives the information and the processing module 20 analyzes it is shown in Fig. 4. The input module 11 receives the information, and the fingerprint input unit 111, the gesture input unit 112 and the hard key input unit 113 of the input module 11 recognize the received information in a classified manner. That is to say, the main control screen 10 may not only directly receive touch-screen information, but also receive fingerprint information, gesture information and hard-key information, and treat them as features. The identification module 21 of the processing module 20 performs preliminary recognition according to the different classifications of the input module 11 and extracts the features of the information. The discrimination module 22 of the processing module 20 then discriminates the information to determine whether it is a misoperation. If there is a possibility of misoperation, the query module 32 of the auxiliary module 30 queries whether execution of the information should continue. In the preferred embodiment, for example, when the fingerprint input unit 111 of the input module 11 receives middle-finger fingerprint information but the discrimination module 22 of the processing module 20 determines from habit that the operation is wrong, the query module 32 issues a query asking whether to execute the information from the middle finger that the user does not normally use. The information is also processed by the prediction module 23 of the processing module 20 to predict its consequence. The prediction module 23 judges whether the execution consequence of the information is adverse. When the information is determined to have an adverse consequence, the auxiliary module 30 of the system reconfirms the execution of the information; preferably, the auxiliary module 30 re-queries to confirm the execution of the information. When the information is determined to have no adverse consequence, the information is considered executable and will be executed by the execution module 40. That is, before the information is executed, the processing module 20 of the system analyzes and predicts it, determines the possibility of misoperation and prevents the occurrence of a critical safety event. After execution is determined, the execution of the information is fed back by the feedback module 31 of the auxiliary module 30.
It should be noted that the feedback module 31 of the auxiliary module 30 can feed back the input and execution of the information through the display module 12. The input feedback flow of the preferred embodiment is shown in Fig. 5. The input feedback unit 311 obtains the information from the input module 11 and displays it. The display feedback unit 312 feeds the information and the data of its processing back to the display module 12, so that the corresponding process of the information is displayed by the display module 12. Preferably, the input feedback unit 311 performs voice feedback to report the input condition of the information, so as to prevent repeated input. Preferably, the input feedback unit 311 provides light feedback, controlling a prompt light of the execution module 40 corresponding to the information so that the prompt light serves as a feedback signal. Preferably, the input feedback unit 311 and the display feedback unit 312 cooperate with each other, so that data of the input feedback unit 311 is displayed on the main control screen 10 through the display feedback unit 312 and the display module 12. It should be noted that, preferably, the input feedback unit 311 displays the characteristics of the information in the display module 12 through the display feedback unit 312; for example, the input feedback unit 311 displays the fingerprint information in the display module 12 and feeds back which finger made the input, or displays the gesture information in the display module 12 and feeds back which gesture action was made. Of course, the input feedback unit 311 feeds back the information of the input module 11 in real time as it arrives. After the processing of the information entered this time is finished, the input module 11 waits for the next input.
Fig. 6 shows a flowchart of the prediction processing of the information by the prediction module 23 of the preferred embodiment. Based on the action that the identification module 21 associates with the information and the environmental parameters obtained by the sensing module 50, the prediction module 23 predicts the consequence of the action the information would execute. More specifically, the prediction module 23 also analyzes the execution consequence of the information according to the external setting parameter from the external input interface 24. The prediction module 23 judges whether the execution consequence of the information is adverse. When the information is determined to have an adverse consequence, the auxiliary module 30 of the system reconfirms the execution of the information; in particular, if executing the information would affect driving safety, the prediction module 23 passes the information to the query module 32 for reconfirmation. Preferably, the auxiliary module 30 re-queries to confirm the execution of the information: the voice query unit 321 queries by voice whether the information in the discrimination module 22 and the prediction module 23 is to continue to be executed, and the visual query unit 322 queries visually whether that information is to continue to be executed. When the information is determined to have no adverse consequence, the information is considered executable and will be executed by the execution module 40. That is, before the information is executed, the processing module 20 of the system analyzes and predicts it, determines the possibility of misoperation and prevents the occurrence of a critical safety event, and the auxiliary module 30 of the system may query the information to assist in confirming whether it is to be executed. For example, if the information is to turn on the high beams, the prediction module 23 evaluates the execution consequence of the information through the sensing module 50 and the external input interface 24; if the external input interface 24 obtains an instruction that high beams may not be used on the current road section, the prediction module 23 judges the execution consequence of the information to be adverse and sends the information to the query module 32 to ask whether to continue. The information is reconfirmed by re-querying, so that the instruction given to the system is not ignored.
Fig. 7 is a flowchart of the analysis of the information by the discrimination module 22 according to the preferred embodiment. The information is identified by the identification module 21 as a corresponding execution action, and the discrimination module 22 analyzes both the accuracy and the habituation of the information. The accuracy discrimination unit 221 of the discrimination module 22 determines the accuracy of the features of the information to check whether there is a misoperation, and the habitual discrimination unit 222 performs habitual analysis on the information to check against the habit data whether the information is a misoperation. It should be noted that the discrimination module 22 determines misoperation of the information according to the characteristics of the information, particularly the characteristics obtained by the input module 11. When the accuracy discrimination unit 221 or the habitual discrimination unit 222 analyzes the information and determines it to be a misoperation, the processing module 20 and the auxiliary module 30 re-query the information. Preferably, the accuracy discrimination unit 221 analyzes the characteristics of the information received by the fingerprint input unit 111, the gesture input unit 112 and the hard key input unit 113 of the input module 11. For example, the accuracy discrimination unit 221 analyzes the position of the information on the main control screen 10, its content and the like, and determines whether the information exactly matches the action to be performed. Preferably, when the information involves a press on a soft-key boundary, critical information, an unknown gesture, ambiguous pressure or the like, the accuracy discrimination unit 221 determines the accuracy of the information. Preferably, the accuracy discrimination unit 221 performs matching analysis on the information against data in a database. In addition, the habitual discrimination unit 222 preferably compares and matches the relevant data of the information against the habit database; for example, it performs matching analysis on the time, position, repetition and order of occurrence of the information, so that a misoperation may be recognized from habit. Further, after the information is discriminated by the discrimination module 22, the prediction module 23 performs predictive analysis on the information. For example, when the identification module 21 identifies the information as pressing both the broadcast and the music touch-screen keys, the discrimination module 22 analyzes the accuracy and habituation of the information. If the discrimination module 22 recognizes from the accuracy of the press that most of the contact fell on the broadcast key, it determines that the information should control the broadcast, and the visual query unit 322 of the query module 32 prompts for reconfirmation of the information, for example by blinking the broadcast indicator light, while the start of the broadcast or the music is delayed.
For another example, the discrimination module 22 judges the information according to the time-period or sequence records in the habit data. If the habit data show that the broadcast is usually started in the morning, then a press on the two touch-screen keys for broadcast and music is likely a misoperation; by comparison with the habit data the discrimination module 22 recognizes the misoperation and instructs the execution module 40 to preferentially start the broadcast.
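The accuracy and habit checks described above can be illustrated with the following minimal sketch. The 0.7 press-area threshold, the habit-database layout, and all function names are assumptions made for illustration; the broadcast-versus-music figures mirror the example in the text.

```python
# Illustrative sketch of the Fig. 7 discrimination step: accuracy analysis of the
# press position plus habit matching. Thresholds and names are assumptions.
def accuracy_check(press_area):
    """press_area maps key name -> fraction of the press falling on that key."""
    key, area = max(press_area.items(), key=lambda kv: kv[1])
    ambiguous = area < 0.7          # boundary press, fuzzy pressure, etc.
    return key, ambiguous

def habit_expected_action(habit_db, hour):
    """Return the action habitually performed in this time period, if recorded."""
    return habit_db.get(hour)

# Broadcast-vs-music example from the text: most of the press falls on the
# broadcast key, and the morning habit record also points to the broadcast.
key, ambiguous = accuracy_check({"broadcast": 0.8, "music": 0.2})
expected = habit_expected_action({8: "start_broadcast"}, hour=8)
start_broadcast_first = (key == "broadcast") or (expected == "start_broadcast")
```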
More specifically, the prediction analysis of the execution consequence of the information by the prediction module 23 is carried out by a plurality of modules communicating with one another. As shown in Fig. 8, the data sources of the prediction module 23 are preferably the discrimination module 22, the external input interface 24, the environment monitoring module 51, the driving monitoring module 52, and so on, which process the information and the data around the vehicle. The discrimination module 22 of the processing module 20 discriminates the information to judge whether it is a misoperation. In particular, the habitual discrimination unit 222 of the discrimination module 22 preferably compares and matches the information against data in a habit database, for example the time, position, repetition, and order of occurrence of the information, so that a misoperation can be recognized from habit. The prediction module 23 then performs prediction analysis on the information according to the result of the discrimination module 22. The external input interface 24 collects external setting parameters related to the information, and the prediction module 23 performs its prediction analysis with reference to these parameters. For example, the external input interface 24 receives body-temperature data from a smart band, and the prediction module 23 analyzes comfort from the external parameters together with the control input information. If the information is to lower the temperature of the air conditioner in the automobile, but the external input interface 24 reports that the body temperature is low and lowering the air-conditioner temperature is unsuitable, the prediction module 23 judges the execution of the information to be an adverse consequence, and the query module 32 reconfirms its execution. The environmental data collected by the sensing module 50 are used by the processing module 20 to discriminate and predict the control input information and are a source of environmental parameter data; with these data, the prediction analysis by the prediction module 23 is more reliable. When the analysis by the prediction module 23 shows an adverse consequence, the query module 32 reconfirms the information. Communication with other smart devices thus provides richer resources for the processing module 20 to analyze the information.
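A rough sketch of the Fig. 8 data aggregation and of the air-conditioner example follows. The module interfaces, field names, and the 36.0 °C body-temperature threshold are assumptions, not values taken from the embodiment.

```python
# Sketch of the Fig. 8 aggregation: the prediction module draws on the
# discrimination result, the external setting parameters, and the sensing data.
# The monitors are modeled as simple callables returning dictionaries.
def gather_context(discrimination_ok, external_interface, env_monitor, drive_monitor):
    return {
        "discrimination_ok": discrimination_ok,
        "external": external_interface(),     # e.g. {"body_temperature": 35.6}
        "environment": env_monitor(),         # e.g. {"weather": "rain"}
        "driving": drive_monitor(),
    }

def predict_ac_cooldown(context):
    """Mirror of the air-conditioner example: a low body temperature reported by
    the smart band makes lowering the cabin temperature an adverse consequence."""
    body_temp = context["external"].get("body_temperature")
    if body_temp is not None and body_temp < 36.0:
        return "adverse"
    return "ok"
```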
In particular, the fingerprint input unit 111 and the gesture input unit 112 of the input module 11 preferably receive the information in the form of soft keys, while the hard key input unit 113 receives the information in the form of hard keys. In a traditional main control screen, hard-key input and soft-key input are handled separately. In a preferred embodiment of the present invention, the fingerprint input unit 111, the gesture input unit 112, and the hard key input unit 113 may receive control input simultaneously. To resolve conflicts between the hard key input unit 113 on the one hand and the fingerprint input unit 111 and the gesture input unit 112 on the other, the flow of the discrimination module 22 and the query module 32 is shown in Fig. 9. When the received information conflicts with the input from the hard key input unit 113, the discrimination module 22 analyzes the information and the query module 32 queries it. If the discrimination module 22 and the query module 32 determine that the information is not a misoperation, the prediction module 23 predicts the information and decides whether to execute it. If both the discrimination module 22 and the query module 32 determine that the information is a misoperation, the information is not executed. In this way the hardware development cost can be reduced, the main control screen system provides stable centralized control, and the system has the advantage of personalization and can adapt to more occasions. Moreover, the need to watch the main control screen while operating it is reduced, which lowers the probability of danger arising from operating the main control screen.
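The conflict-handling policy of Fig. 9 may be sketched as follows, under the assumption that the discrimination and query callbacks each return True when they judge a misoperation; the function name and signature are illustrative only.

```python
# Sketch of the Fig. 9 handling of a simultaneous soft-key / hard-key conflict.
# soft_info and hard_info are assumed to carry an `action` attribute.
def resolve_conflict(soft_info, hard_info, discriminate, query, predict, execute):
    if soft_info.action == hard_info.action:        # no real conflict
        execute(soft_info)
        return
    # Only when both the discrimination module and the query module judge a
    # misoperation is the soft-key information dropped.
    if discriminate(soft_info) and query(soft_info):
        return
    # Otherwise the prediction step decides whether execution proceeds.
    if predict(soft_info) != "adverse":
        execute(soft_info)
```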
Application scenarios of the preferred embodiment are shown in Figs. 10 to 14. As shown in Fig. 10, in scenario one the input module 11 of the main control screen 10 of the system receives a control input, the identification module 21 identifies the input information as turning on the wiper, and the discrimination module 22 analyzes the accuracy and habituation of the information. The accuracy discrimination unit 221 of the discrimination module 22 analyzes the information: the touch-screen keys adjacent to the input are the music key, the car-light key, and other keys, and the press is slightly biased toward the music key, so its accuracy is somewhat in doubt. The habitual discrimination unit 222 of the discrimination module 22 analyzes the habituation of the information and judges, from the habit data, that it is not a misoperation. The prediction module 23 obtains the weather data around the vehicle from the environment monitoring module 51. If the environment monitoring module 51 reports that the weather around the automobile is currently sunny, the prediction module 23 predicts that turning on the wiper would have an adverse consequence, namely a wasted operation. According to the result of the processing module 20, the query module 32, through the visual query unit 322, feeds query information back to the display module 12 via the display feedback unit 312, that is, a query prompt is displayed as a reminder. The execution module 40 delays execution of the information and waits for a confirmation input.
Fig. 11 shows scenario two. The input module 11 receives control input information for opening the sunroof. The identification module 21 identifies the input information as opening the sunroof, and the discrimination module 22 analyzes its accuracy and habituation. The accuracy discrimination unit 221 of the discrimination module 22 finds that the control key was pressed squarely, so the accuracy is high. The habitual discrimination unit 222 of the discrimination module 22 analyzes the habituation of the information and judges, from habit data showing that the sunroof is opened at no fixed time, that the information is a normal operation. Since the discrimination module 22 does not judge a misoperation, the prediction module 23 performs prediction analysis on the information. The prediction module 23 obtains the weather data around the vehicle from the environment monitoring module 51. If the environment monitoring module 51 reports that it is raining around the vehicle, the prediction module 23 predicts an adverse consequence for opening the sunroof, namely that rain would enter the vehicle. According to the result of the processing module 20, the query module 32, through the visual query unit 322, feeds query information back to the display module 12 via the display feedback unit 312, that is, a delayed-execution progress bar is displayed on the key. The execution module 40 delays execution of the information and waits for the result of the query module 32. If a timely answer is obtained, the system proceeds according to that answer. If no timely answer is obtained, or if opening the sunroof is confirmed, the operation of opening the sunroof continues, as shown in Fig. 12. The sunroof starts to open, and the execution feedback unit 313 of the feedback module 31 feeds back that execution of the information has started; specifically, an indicator light arranged at the sunroof starts to blink to signal that the sunroof is opening. The display feedback unit 312 shows the processing of the information; preferably, the display feedback unit 312 also shows, on the display module 12, the environmental data received from the environment monitoring module 51, namely that it is raining.
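The delayed execution with a query timeout described in scenario two could be approximated as in the sketch below; the 5-second timeout, the polling interval, and the callback names are assumptions.

```python
# Sketch of the scenario-two timing: execution is delayed while the visual query
# (the progress-bar prompt) waits for an answer, and the sunroof is opened if the
# user confirms or no timely answer arrives.
import time

def delayed_open_sunroof(poll_query_answer, open_sunroof, timeout_s=5.0):
    deadline = time.monotonic() + timeout_s
    answer = None                        # None = no answer yet
    while answer is None and time.monotonic() < deadline:
        answer = poll_query_answer()     # returns True, False, or None
        time.sleep(0.1)
    if answer is None or answer:
        open_sunroof()                   # continue the operation (Fig. 12)
```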
As shown in Fig. 13, in scenario three the input module 11 of the main control screen 10 of the system receives a control input, the identification module 21 identifies the input information as dialing a telephone call, and the discrimination module 22 analyzes the accuracy and habituation of the information. The accuracy discrimination unit 221 of the discrimination module 22 analyzes the information: the adjacent touch-screen keys are the broadcast key, the music key, the car-light key, and so on, and the press is slightly biased toward the music and broadcast keys, so its accuracy is somewhat in doubt. The habitual discrimination unit 222 of the discrimination module 22 analyzes the habituation of the information and judges, from the habit data, that the information may be a misoperation, since in the current time period a specific broadcast program is normally listened to. According to the result of the processing module 20, the query module 32 asks through the voice query unit 321 whether a call is to be made, and waits for confirmation.
Fig. 14 shows scenario four. The input module 11 receives control input information for lowering the temperature of the air conditioner. The identification module 21 identifies the input information as lowering the air-conditioner temperature, and the discrimination module 22 analyzes its accuracy and habituation. The accuracy discrimination unit 221 of the discrimination module 22 finds that the control key was pressed squarely, so the accuracy is high. The habitual discrimination unit 222 of the discrimination module 22 analyzes the habituation of the information and judges it a normal operation, since the habit data record that the air-conditioner temperature is lowered in the current time period. Since the discrimination module 22 does not judge a misoperation, the prediction module 23 performs prediction analysis on the information. The prediction module 23 obtains the weather data around the vehicle from the environment monitoring module 51. Based on the current ambient temperature obtained by the environment monitoring module 51, the prediction module 23 finds no adverse consequence for the time being. The external input interface 24 then receives body-temperature data from an external smart band as the external setting data and delivers them to the prediction module 23 for prediction analysis. If the external setting data indicate that the body temperature is rather low and the air-conditioner temperature should not be lowered, the prediction module 23 predicts an adverse consequence for the information. According to the result of the processing module 20, the query module 32 asks through the voice query unit 321 whether to continue execution, while the display feedback unit 312 shows the processing on the display module 12, namely that the smart band indicates the temperature should not be lowered, and asks whether to continue lowering the temperature. The query module 32 waits for a result. If a timely answer is obtained, the system proceeds according to that answer. If no timely answer is obtained, or if lowering the temperature is confirmed, the operation of lowering the air-conditioner temperature continues.
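Scenario four combines two prediction sources, and the way an adverse verdict from the smart band overrides the harmless ambient-temperature check can be sketched as follows; the predictor names and example values are assumptions.

```python
# Illustrative combination of the two prediction sources in scenario four.
# Any adverse verdict dominates, so the smart-band indication overrides the
# harmless ambient-temperature check.
def combine_predictions(*verdicts):
    return "adverse" if "adverse" in verdicts else "ok"

def ambient_check(ambient_temp_c):
    # Scenario four: the ambient temperature alone gives no adverse consequence.
    return "ok"

def smart_band_check(no_cooling_requested):
    # The external setting parameter from the smart band.
    return "adverse" if no_cooling_requested else "ok"

verdict = combine_predictions(ambient_check(26.0), smart_band_check(True))  # -> "adverse"
```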
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and do not limit the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the invention have been shown and described in the embodiments, and the embodiments may be varied or modified in any way that does not depart from these principles.

Claims (17)

1. An automobile main control screen method, characterized in that it is applied to an automobile main control screen system and comprises the following steps:
receiving control input information;
processing the information to identify and distinguish the information;
wherein identifying the information and distinguishing the information comprises identifying the information to obtain a corresponding action, and distinguishing the information by analyzing its accuracy and habituation;
wherein distinguishing the information comprises judging the accuracy of the characteristics of the information to check whether a misoperation exists, and performing habitual analysis on the information to check against habit data whether the information is a misoperation;
and collecting environmental data to perform predictive analysis on the execution consequence of the information when the information is judged not to be a misoperation,
wherein, if the execution consequence of the information is judged to be adverse, the control action of the information is not executed for the moment; if the execution consequence is judged not to be adverse, the control action of the information is executed; and a misoperation is corrected through the processing and judging of the information.
2. The automobile main control screen method according to claim 1, wherein identifying the information comprises matching the information with data in a habit database to identify whether the operation is a misoperation.
3. The automobile main control screen method according to claim 1, wherein the judging step further comprises inquiring whether the information is to be executed.
4. The automobile main control screen method according to claim 3, wherein the inquiry as to whether to execute the information is made if the result of processing the information indicates a misoperation.
5. The automobile main control screen method according to claim 3, wherein the environmental data are collected for predictive analysis of the execution consequence of the information, the environmental data being related to the vehicle environment and the driving environment, so that the environmental data serve as a source of parameter data for the identification, distinguishing, and predictive analysis of the information.
6. The automobile main control screen method according to claim 3, further comprising collecting an external setting parameter, wherein the external setting parameter is included as a parameter for vehicle control in the predictive analysis of the information.
7. The automobile main control screen method according to claim 6, wherein the execution consequence of the information is predicted using the environmental data and the setting parameter.
8. The automobile main control screen method according to claim 6, wherein the consequence of the information is predicted using simulation analysis, and if the result shows that the information has an adverse consequence, the information is to be queried.
9. The automobile main control screen method according to claim 3, wherein inquiring whether the information is to be executed further comprises a visual inquiry and a voice inquiry, wherein the visual inquiry displays the status of the information on the main control screen to await reconfirmation, and the voice inquiry provides reconfirmation of the status of the information through a voice prompt.
10. The automobile main control screen method according to claim 9, wherein, according to the result of processing the information, if the accuracy judgment obtained from distinguishing the information indicates a misoperation, an inquiry is made as to whether to execute the information.
11. The automobile main control screen method according to claim 9, wherein, according to the result of processing the information, if the habituation judgment on the information indicates a misoperation, an inquiry is made as to whether to execute the information.
12. The automobile main control screen method according to claim 9, wherein, according to the result of the predictive analysis of the information, if the predicted consequence of the information is adverse, an inquiry is made as to whether to execute the information.
13. An automobile master control screen system, comprising:
the system comprises a main control screen, a processing module and an auxiliary module, wherein the main control screen, the processing module and the auxiliary module are communicatively connected with each other;
the main control screen is used for collecting control input information;
the processing module and the auxiliary module are used for processing and analyzing the information to obtain a corresponding action, analyzing the accuracy and habituation of the information, and checking whether the information is a misoperation; and
when the information is judged not to be a misoperation, environmental data are collected to perform predictive analysis on the execution consequence of the information; if the execution consequence of the information is judged to be adverse, the control action of the information is not executed for the moment; if the execution consequence is judged not to be adverse, the control action of the information is executed; and a misoperation is corrected through the processing and judging of the information.
14. The system of claim 13, further comprising an execution module that outputs and executes the action based on results of the processing module and the auxiliary module.
15. The system of claim 13, wherein the main control screen further comprises an input module and a display module, wherein the input module receives the information and the display module displays operation and feedback information of the system.
16. The system of claim 13, further comprising a sensing module, wherein the sensing module is correspondingly disposed in the vehicle to collect environmental data, wherein the environmental data is related to the vehicle environment and the driving environment.
17. The system of claim 16, wherein the sensing module further comprises an environmental monitoring module and a driving monitoring module, wherein the environmental monitoring module and the driving monitoring module respectively collect and detect the environment around the vehicle and the driving environment.
CN201710046134.0A 2017-01-22 2017-01-22 Automobile main control screen system and method Active CN106891724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710046134.0A CN106891724B (en) 2017-01-22 2017-01-22 Automobile main control screen system and method

Publications (2)

Publication Number Publication Date
CN106891724A CN106891724A (en) 2017-06-27
CN106891724B (en) 2022-05-03

Family

ID=59198240

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710046134.0A Active CN106891724B (en) 2017-01-22 2017-01-22 Automobile main control screen system and method

Country Status (1)

Country Link
CN (1) CN106891724B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107323379A (en) * 2017-06-30 2017-11-07 百度在线网络技术(北京)有限公司 Control method for vehicle, device, equipment and storage medium
CN108564945B (en) * 2018-03-13 2020-11-10 斑马网络技术有限公司 Vehicle-mounted voice control method and device, electronic equipment and storage medium
CN108973625A (en) * 2018-08-06 2018-12-11 芜湖莫森泰克汽车科技股份有限公司 A kind of sunroof control system with gesture control function

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010061448A1 (en) * 2008-11-27 2010-06-03 パイオニア株式会社 Operation input device, information processor, and selected button identification method
CN105620393A (en) * 2015-12-25 2016-06-01 莆田市云驰新能源汽车研究院有限公司 Self-adaptive vehicle human-computer interaction method and system thereof

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1491827A (en) * 2003-07-18 2004-04-28 陈华元 Device for controlling means of transportation
CN102866803B (en) * 2012-08-30 2016-02-17 浙江大学 A kind ofly support the virtual console gesture control method of the automobile of blind operation and device
JP5954901B2 (en) * 2013-12-18 2016-07-20 富士重工業株式会社 Vehicle control device
JP6432233B2 (en) * 2014-09-15 2018-12-05 株式会社デンソー Vehicle equipment control device and control content search method
CN105620237A (en) * 2014-10-30 2016-06-01 江苏新通达电子科技股份有限公司 Automatic control system for automobile skylight
US20160147322A1 (en) * 2014-11-20 2016-05-26 Hyundai America Technical Center, Inc. System and method for identifying a user of a vehicle head unit
US10627813B2 (en) * 2015-04-21 2020-04-21 Panasonic Intellectual Property Management Co., Ltd. Information processing system, information processing method, and program

Also Published As

Publication number Publication date
CN106891724A (en) 2017-06-27

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant