CN114510193A - Vehicle control method and device, vehicle and storage medium - Google Patents

Vehicle control method and device, vehicle and storage medium

Info

Publication number
CN114510193A
CN114510193A
Authority
CN
China
Prior art keywords
trigger
vehicle
driver
palm
profile
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210172382.0A
Other languages
Chinese (zh)
Inventor
崔芮星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chery Automobile Co Ltd
Lion Automotive Technology Nanjing Co Ltd
Wuhu Lion Automotive Technologies Co Ltd
Original Assignee
Chery Automobile Co Ltd
Lion Automotive Technology Nanjing Co Ltd
Wuhu Lion Automotive Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chery Automobile Co Ltd, Lion Automotive Technology Nanjing Co Ltd, Wuhu Lion Automotive Technologies Co Ltd filed Critical Chery Automobile Co Ltd
Priority to CN202210172382.0A
Publication of CN114510193A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R11/00Arrangements for holding or mounting articles, not otherwise provided for
    • B60R11/02Arrangements for holding or mounting articles, not otherwise provided for for radio sets, television sets, telephones, or the like; Arrangement of controls thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/903Querying
    • G06F16/9035Filtering based on additional data, e.g. user or group profiles
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90Details of database functions independent of the retrieved data types
    • G06F16/903Querying
    • G06F16/9038Presentation of query results

Abstract

The application discloses a vehicle control method and device, a vehicle, and a storage medium, wherein the method comprises: detecting whether a driver triggers a vehicle operation interface; when it is detected that the driver triggers the vehicle operation interface, identifying a trigger profile and/or a trigger trajectory of the driver; extracting at least one trigger feature from the trigger profile and/or the trigger trajectory, querying a preset database with the at least one trigger feature as an index, matching the trigger instruction corresponding to the at least one trigger feature, and controlling the vehicle to execute the corresponding action based on the trigger instruction. This solves the technical problems of the related art, in which multiple operations are defined through multi-finger gestures and single-point and multi-point touches on a small fingertip-sized touch area are monitored and responded to interactively: the operation is complex, a single gesture carries too many functions, blind operation is difficult for the driver under driving conditions, and the driver's attention is distracted, easily causing traffic accidents.

Description

Vehicle control method and device, vehicle and storage medium
Technical Field
The present disclosure relates to the field of vehicle control technologies, and in particular to a method and an apparatus for controlling a vehicle, a vehicle, and a storage medium.
Background
Nowadays, driving has become one of the main modes of travel. As vehicles are updated, their supporting functions are gradually improved, such as temperature and humidity regulation, audio and video playback, and in-vehicle maps; the driver generally issues operating instructions to these devices through the in-car touch screen.
However, when operating the touch screen, the driver needs to look at the positions of the function keys on the screen, so the driver is likely to lose attention while driving because of the need to operate the touch screen, causing traffic accidents.
In the related art, multiple operations are mainly defined through multi-finger gestures, and single-point and multi-point touches on a small fingertip-sized touch area are monitored and responded to interactively. This mainly has the following problems:
First: the operation difficulty is high and the operation details are complex, misoperation is easy, the touch-recognition precision required of the in-vehicle system screen is relatively high, and system operation is complicated.
Second: a single gesture participates in and carries relatively many functions (for example, a finger must participate in single-point clicking, sliding, and long pressing, as well as multi-point operations).
Third: blind operation is difficult in a complex cockpit environment.
Fourth: rapid operation under driving conditions in the cabin environment cannot be accommodated.
In conclusion, the related art is not only complex to operate, which is not conducive to the driver adjusting the various in-vehicle functions while driving, but also limited in operation, unable to meet the driver's needs, and degrades the user experience; improvement is urgently needed.
Summary of the application
The application provides a vehicle control method, a vehicle control device, a vehicle, and a storage medium, which are intended to solve the technical problems of the related art, in which multiple operations are defined through multi-finger gestures and single-point and multi-point touches on a small fingertip-sized touch area are monitored and responded to interactively: the operation is complex, a single gesture carries too many functions, blind operation is difficult for the driver under driving conditions, the driver's attention is distracted, and traffic accidents are easily caused.
An embodiment of a first aspect of the present application provides a control method for a vehicle, including the following steps: detecting whether a driver triggers a vehicle operation interface; identifying a trigger profile and/or trigger trajectory of the driver upon detecting the driver triggering the vehicle operation interface; and extracting at least one trigger feature from the trigger contour and/or the trigger track, inquiring a preset database by taking the at least one trigger feature as an index, matching a trigger instruction corresponding to the at least one trigger feature, and controlling the vehicle to execute a corresponding action based on the trigger instruction.
Optionally, in an embodiment of the present application, the at least one trigger feature includes one or more of the actual distance between any two fingers of the palm, the actual contact area between the palm and the vehicle operation interface, the contact profile between the palm and the vehicle operation interface, the actual contact strength between the palm and the vehicle operation interface, and the number of triggers between the palm and the vehicle operation interface within a preset time period.
Optionally, in an embodiment of the present application, the method further includes: controlling the vehicle to issue a feedback prompt when the matching fails or the corresponding action fails to execute.
Optionally, in an embodiment of the present application, the method further includes: receiving a setting instruction of the driver; and modifying the mapping relation between the trigger characteristics and the trigger instruction in the preset database according to the setting instruction.
Optionally, in an embodiment of the present application, after detecting that the driver triggers the vehicle operation interface, the method further includes: judging whether the vehicle has enabled the trigger function; and if the vehicle has not enabled the trigger function, issuing a recognition-invalid prompt.
An embodiment of a second aspect of the present application provides a control apparatus for a vehicle, including: the detection module is used for detecting whether a driver triggers a vehicle operation interface or not; the identification module is used for identifying a trigger profile and/or a trigger track of the driver when the driver is detected to trigger the vehicle operation interface; and the control module is used for extracting at least one trigger feature from the trigger contour and/or the trigger track, inquiring a preset database by taking the at least one trigger feature as an index, matching a trigger instruction corresponding to the at least one trigger feature, and controlling the vehicle to execute a corresponding action based on the trigger instruction.
Optionally, in an embodiment of the present application, the at least one trigger feature includes one or more of the actual distance between any two fingers of the palm, the actual contact area between the palm and the vehicle operation interface, the contact profile between the palm and the vehicle operation interface, the actual contact strength between the palm and the vehicle operation interface, and the number of triggers between the palm and the vehicle operation interface within a preset time period.
Optionally, in an embodiment of the present application, the apparatus further includes: a reminding module for controlling the vehicle to issue a feedback prompt when the matching fails or the corresponding action fails to execute.
Optionally, in an embodiment of the present application, the apparatus further includes: a receiving module for receiving a setting instruction from the driver; and a correction module for modifying the mapping relation between trigger features and trigger instructions in the preset database according to the setting instruction.
Optionally, in an embodiment of the present application, the detection module further includes: a judging unit for judging whether the vehicle has enabled the trigger function; and a reminding unit for issuing a recognition-invalid prompt when the vehicle has not enabled the trigger function.
An embodiment of a third aspect of the present application provides a vehicle, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the control method of the vehicle as described in the above embodiments.
A fourth aspect of the present application provides a computer-readable storage medium storing computer instructions for causing a computer to execute a control method of a vehicle according to the above embodiment.
The embodiment of the application can extract trigger features from the driver's trigger profile and/or trigger trajectory on the vehicle operation interface and control the vehicle to execute the corresponding action according to the preset instruction matched with the trigger features. The operation mode is simple and easy to recognize, which facilitates blind operation by the driver under driving conditions, improves the driver's driving experience, and ensures the driving safety of the vehicle. This solves the technical problems of the related art, in which multiple operations are defined through multi-finger gestures and single-point and multi-point touches on a small fingertip-sized touch area are monitored and responded to interactively: the operation is complex, a single gesture carries too many functions, blind operation is difficult for the driver under driving conditions, and the driver's attention is distracted, easily causing traffic accidents.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a flowchart of a control method of a vehicle according to an embodiment of the present application;
FIG. 2 is a first schematic diagram of full-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 3 is a second schematic diagram of full-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 4 is a third schematic diagram of full-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 5 is a first schematic diagram of partial-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 6 is a second schematic diagram of partial-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 7 is a third schematic diagram of partial-palm graphical profile input for a control method of a vehicle according to an embodiment of the present application;
FIG. 8 is a schematic diagram of distance input between any two fingers for a control method of a vehicle according to an embodiment of the present application;
FIG. 9 is a first schematic diagram of finger-and-palm joint area input for a control method of a vehicle according to an embodiment of the present application;
FIG. 10 is a second schematic diagram of finger-and-palm joint area input for a control method of a vehicle according to an embodiment of the present application;
FIG. 11 is a third schematic diagram of finger-and-palm joint area input for a control method of a vehicle according to an embodiment of the present application;
FIG. 12 is a first schematic diagram of finger-only/palm-only area input for a control method of a vehicle according to an embodiment of the present application;
FIG. 13 is a second schematic diagram of finger-only/palm-only area input for a control method of a vehicle according to an embodiment of the present application;
FIG. 14 is a schematic diagram of a method of controlling a vehicle according to one embodiment of the present application;
fig. 15 is a schematic structural diagram of a control device of a vehicle according to an embodiment of the present application;
fig. 16 is a schematic structural diagram of a vehicle according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative and intended to explain the present application and should not be construed as limiting the present application.
A vehicle control method and device, a vehicle, and a storage medium according to embodiments of the present application are described below with reference to the drawings. In the method, the driver's trigger profile and/or trigger trajectory on the vehicle operation interface can be identified, trigger features can be extracted from the trigger profile and/or trigger trajectory, and the vehicle can be controlled to execute the corresponding action according to the preset instruction matched with the trigger features. The operation mode is simple and easy to recognize, which facilitates blind operation by the driver under driving conditions, improves the driver's driving experience, and ensures the driving safety of the vehicle. This solves the problems of the related art, in which multiple operations are defined through multi-finger gestures and single-point and multi-point touches on a small fingertip-sized touch area are monitored and responded to interactively: the operation is complex, a single gesture carries too many functions, blind operation is difficult for the driver under driving conditions, and the driver's attention is distracted, easily causing traffic accidents.
Specifically, fig. 1 is a schematic flowchart of a control method of a vehicle according to an embodiment of the present application.
As shown in fig. 1, the control method of the vehicle includes the steps of:
in step S101, it is detected whether the driver triggers the vehicle operation interface.
It will be appreciated that the driver may trigger the vehicle operation interface in a number of ways, such as fingerprint recognition or voice recognition. The embodiment of the application can detect whether the driver has triggered the operation interface of the vehicle by checking, for example, whether the operation interface is lit or whether its power consumption is elevated.
It should be noted that, a person skilled in the art may set the manner of triggering the vehicle operation interface and determining whether the driver triggers the vehicle operation interface according to the actual situation, which is not limited herein.
Optionally, in an embodiment of the present application, after detecting that the driver triggers the vehicle operation interface, the method further includes: judging whether the vehicle has enabled the trigger function; and if the vehicle has not enabled the trigger function, issuing a recognition-invalid prompt.
In actual implementation, while driving the driver may not need the trigger function to be enabled; in that case, a false touch could cause the driver's current trigger profile and/or trigger trajectory to be recognized, so that the response the driver obtains is inconsistent with expectations. In addition, the driver may forget whether the trigger function has been turned on, and therefore fail to obtain the expected response when operating the operation interface.
To avoid these situations, the embodiment of the application can, after detecting that the driver triggers the vehicle operation interface, judge whether the vehicle has enabled the trigger function. If the current vehicle has enabled the trigger function, the method proceeds to the recognition in the subsequent steps; if the current vehicle has not enabled the trigger function, a recognition-invalid prompt is issued so that the driver receives feedback in time.
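The trigger-function gate described above can be sketched as follows (an illustrative Python sketch; the function name and return strings are assumptions for illustration, not part of the application):

```python
def handle_interface_touch(trigger_function_enabled: bool) -> str:
    """Gate recognition on the trigger function being enabled.

    Hypothetical helper: if the trigger function is off, issue the
    recognition-invalid prompt so the driver gets timely feedback;
    otherwise proceed to profile/trajectory recognition (step S102).
    """
    if not trigger_function_enabled:
        return "recognition-invalid prompt"
    return "proceed to recognition"
```

In a real vehicle system the two return values would correspond to an audible/visual prompt and to entering the recognition pipeline, respectively.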
In step S102, when it is detected that the driver triggers the vehicle operation interface, a trigger profile and/or a trigger trajectory of the driver is identified.
Specifically, when it is detected that the driver triggers the vehicle operation interface, the trigger profile and/or the trigger trajectory of the driver can be identified.
For example, the embodiment of the application can identify the contact profile between the driver's hand and the operation interface, such as the profile and area of contact between the driver's finger or palm and the operation interface, or the distance between any two of the driver's fingers derived from the contact profile. The embodiment can also identify the trigger trajectory, i.e., the sliding trajectory, of the driver's finger on the operation interface: a trigger trajectory can be preset, and when the driver is in a trajectory-input state, i.e., performing a sliding operation with a finger, the sliding trajectory of the finger is detected; when the sliding trajectory is consistent with the preset trigger trajectory, the trajectory input can be judged valid.
It should be noted that the preset trigger trajectory may be preset by the driver according to the habit of the driver, and is not limited herein.
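One possible way to compare a detected sliding trajectory with the preset trigger trajectory is a point-by-point tolerance check (a minimal sketch; the sampling scheme, tolerance value, and all names are assumptions, since the application does not specify a matching algorithm):

```python
import math


def trajectory_matches(actual, preset, tolerance=20.0):
    """Judge a trajectory input valid when each sampled point of the
    driver's slide lies within `tolerance` (e.g. pixels) of the
    corresponding point of the preset trigger trajectory."""
    if len(actual) != len(preset):
        return False
    return all(math.dist(a, p) <= tolerance
               for a, p in zip(actual, preset))
```

In practice the two trajectories would first be resampled to the same number of points and possibly normalized for scale and position; the fixed tolerance here is only for illustration.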
In step S103, at least one trigger feature is extracted from the trigger profile and/or the trigger trajectory, the preset database is queried with the at least one trigger feature as an index, the trigger instruction corresponding to the at least one trigger feature is matched, and the vehicle is controlled to execute the corresponding action based on the trigger instruction.
In actual execution, the embodiment of the application can preset a database storing the preset trigger features capable of triggering instructions, together with the vehicle actions matched with each trigger instruction.
Further, the embodiment of the application can extract at least one trigger feature from the trigger profile and/or the trigger trajectory, match the extracted trigger feature(s) against the trigger instructions in the preset database, and then control the vehicle to execute the corresponding action according to the matching result. The operation mode of the embodiment is simple and easy to recognize, which facilitates blind operation by the driver under driving conditions and ensures the driving safety of the vehicle while improving the driver's driving experience.
It is understood that the database may be set by those skilled in the art according to actual conditions, and the vehicle execution action matched with the trigger instruction may be set by the driver according to the driving habits of the driver, which is not limited herein.
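The feature-indexed lookup in step S103 can be sketched with a dictionary standing in for the preset database (the feature names and instructions below are invented for illustration; the application does not define a concrete schema):

```python
# Hypothetical preset database: trigger features (used as the index)
# mapped to trigger instructions.
PRESET_DATABASE = {
    ("full_palm_profile",): "turn_on_air_conditioning",
    ("two_finger_distance_wide",): "volume_up",
    ("partial_palm_profile", "double_trigger"): "answer_call",
}


def match_and_execute(features):
    """Query the preset database with the extracted trigger features
    as an index; on a miss, fall back to the feedback prompt that the
    application prescribes for failed matching."""
    instruction = PRESET_DATABASE.get(tuple(features))
    if instruction is None:
        return "feedback_prompt"
    return instruction
```

The driver-configurable mapping mentioned later (setting instructions) would simply rewrite entries of this table.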
Optionally, in an embodiment of the present application, the at least one trigger feature includes one or more of the actual distance between any two fingers of the palm, the actual contact area between the palm and the vehicle operation interface, the contact profile between the palm and the vehicle operation interface, the actual contact strength between the palm and the vehicle operation interface, and the number of triggers between the palm and the vehicle operation interface within a preset time period.
By way of example, the triggering characteristics may include the following:
firstly, judging according to the contour
1. Full palm, complete figure:
as shown in figs. 2 to 4, when the touch screen detects a palm graphic profile input, the input can be determined valid regardless of the angle and orientation of the palm profile on the touch screen, or of whether the profile information belongs to the left or right hand.
Wherein, fig. 2 shows right-hand full-contour (upright) input on a conventional portrait screen; fig. 3 shows right-hand full-contour (rightward) input on a conventional landscape screen; fig. 4 shows right-hand full-contour (tilted) input on a conventional landscape screen.
2. Partial palm, incomplete figure:
as shown in figs. 5 to 7, when the touch screen detects a palm graphic profile input whose profile information is not a complete palm profile, but the system's verification and comparison show that it meets the matching criteria of the palm-profile information base stored in the system and the matching degree reaches a sufficiently similar level, the input can be determined valid.
For example: after the contour information obtained in fig. 5 is compared with the complete palm contour information in the information base, finger contours and a partial palm contour are found and the similarity reaches 75%, so the input can be determined valid.
Wherein, fig. 5 shows right-hand partial-contour (complete fingers plus partial palm) input on a conventional landscape screen; fig. 6 shows right-hand partial-contour (full palm plus partial fingers) input on a conventional landscape screen; fig. 7 shows right-hand partial-contour (full fingers plus partial palm) input on a conventional landscape screen.
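The 75% similarity in the example above presupposes some similarity metric between the input contour and the stored full-palm contour. The application does not name one; an overlap (Jaccard) score over binary contact masks is a plausible stand-in, sketched here with invented names:

```python
def profile_similarity(input_mask, stored_mask):
    """Jaccard overlap between two equal-length binary contact masks
    (1 = touched cell). An assumed metric, not specified by the
    application."""
    inter = sum(1 for a, b in zip(input_mask, stored_mask) if a and b)
    union = sum(1 for a, b in zip(input_mask, stored_mask) if a or b)
    return inter / union if union else 0.0


def partial_profile_valid(similarity, required=0.75):
    """Valid when the matching degree reaches the required level."""
    return similarity >= required
```

Any other shape-matching score (e.g. contour moments) could replace the Jaccard overlap without changing the validity rule.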
3. Distance between any two fingers:
as shown in fig. 8, when the touch screen detects a palm graphic profile input, the current palm profile is identified and the distance between any two fingers is calculated from the profile. The embodiment of the application can compare the actual calculation result with a preset threshold: when the actual result is greater than the preset threshold, the input can be judged valid, or alternatively, when the actual result is less than or equal to the preset threshold, the input can be judged valid.
Fig. 8 takes the distance between the index finger and the middle finger of the right hand as an example, with input in each direction.
In addition, besides identifying the whole palm graphic profile and calculating the distance between any two fingers from it, the embodiment of the application can also identify and calculate the distance between any two fingers from a partial palm graphic profile, for example when the touch screen detects only finger contours. In this case the embodiment likewise compares the actual calculation result with the preset threshold: when the actual result is greater than the preset threshold, the input can be judged valid, or alternatively, when the actual result is less than or equal to the preset threshold, the input can be judged valid. According to differences in the actually monitored and calculated distance between any two fingers, different trigger features can be extracted, and different functions can then be activated according to the mapping.
It should be noted that palm size and the distance between two fingers differ from person to person, so the actual threshold should be adjusted according to the driver's situation; the specific value can be set by those skilled in the art and is not specifically limited here.
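The two threshold conventions described above (greater-than valid, or less-than-or-equal valid) can be captured in one hypothetical helper; the parameter names and default direction are assumptions for illustration:

```python
def finger_distance_valid(distance, threshold, greater_is_valid=True):
    """Judge a two-finger distance input valid relative to a preset,
    per-driver threshold. Either side of the threshold may be defined
    as the valid side, as the application allows."""
    if greater_is_valid:
        return distance > threshold
    return distance <= threshold
```

The per-driver calibration noted above would set `threshold` from the driver's own palm measurements rather than a fixed constant.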
II. Judging by area
1. Finger-palm joint area:
as shown in figs. 9 to 11, when the touch screen detects the input of a large-area graphic (compared with the area of a fingertip click region), if the system's verification and comparison show that the area graphic information meets the matching criteria of the palm-area information base stored in the system and the matching degree reaches a sufficiently similar level, the input can be determined valid.
Wherein, fig. 9 shows screen input with deep palm pressing and complete palm-area information; fig. 10 shows screen input with moderate palm pressing, where part of the palm-area information is indistinct or the graphics are not continuous; fig. 11 shows screen input with light palm pressing and incomplete palm-area information, but with several key nodes present (e.g., 4-5 finger pads, the thenar muscles, the heel of the palm, etc.).
2. Finger/palm area only
As shown in fig. 12 and 13, when the touch screen monitors that similar multi-segment finger graphs or large-area palm surface graphs are input, if the area graph information is verified and compared by the system, the area graph information meets the matching standard of the related palm area information base stored in the system, and the matching degree can reach a relatively similar degree, the input can be determined as valid.
Fig. 12 shows a case of a multi-segment, multi-block discontinuous screen input in which a palm (only finger portion) is input; fig. 13 shows a case of a screen input in which the palm (only the palm surface portion) is input in area and a multi-stage and multi-block palm dome is provided.
In addition, if the input palm area graphic information is difficult to match and recognize, or multiple sections, multiple blocks are discontinuous or unobvious multiple graphic information is input, the area recognition threshold value can be preset in the embodiment of the application, and when the total area exceeds the preset area recognition threshold value, the palm information input can be judged to be effective. For example, when the user performs the operation by clicking the conventional finger, the contact area is smaller because the user mostly performs the single-point and multi-point touch, and the touch area when the user performs the operation by using the conventional finger motor is taken as a preset threshold, and when the total area exceeds the threshold, the palm information input can be determined to be effective.
It should be noted that the preset threshold derived from the touch area of a conventional finger action may be set by a person skilled in the art according to the actual situation, and is not specifically limited herein.
In addition, the at least one trigger feature of the embodiment of the present application may also be the actual distance between any two fingers of the palm, the actual contact strength between the palm and the vehicle operation interface, the number of triggers between the palm and the vehicle operation interface within a preset duration, and the like.
It should be noted that the specific trigger feature can be set by those skilled in the art according to practical situations, and is not limited in particular here.
Optionally, in an embodiment of the present application, the method further includes: controlling the vehicle to give a feedback prompt when the matching fails or the corresponding action fails to execute.
It can be understood that, when the driver is driving the vehicle, his or her attention is mainly focused on the driving situation, so that when the operation interface is operated, an unexpected condition may prevent the at least one extracted trigger feature from being matched with a trigger instruction in the preset database.
In addition, when the vehicle executes the corresponding action, the action may fail due to factors such as a fault in the corresponding equipment, so that the vehicle cannot meet the driver's needs.
Therefore, when the matching fails or the corresponding action fails to execute, the vehicle can be controlled to give a feedback prompt, which prevents the driver from being distracted because the vehicle cannot perform the expected action and improves the driving experience.
Optionally, in an embodiment of the present application, the method further includes: receiving a setting instruction of the driver; and modifying the mapping relationship between trigger features and trigger instructions in the preset database according to the setting instruction.
In actual practice, a driver may need to modify the mapping relationship between trigger features and trigger instructions in the preset database due to external factors (such as a change of season or of daily road conditions) or internal factors (such as a change in driving habits).
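Such a driver-adjustable mapping can be sketched with a simple dictionary-backed database. All feature and instruction names below are hypothetical placeholders, not names defined by the patent.

```python
# Illustrative sketch: a driver-modifiable feature -> instruction mapping.
# Keys and values are invented for the example.

trigger_map = {
    "full_palm_contour": "pause_media",
    "two_finger_distance_wide": "volume_up",
}

def apply_setting_instruction(mapping, feature, new_instruction):
    """Modify the mapping between a trigger feature and its trigger
    instruction according to the driver's setting instruction."""
    mapping[feature] = new_instruction
    return mapping

# In winter the driver remaps the full-palm gesture to windshield defrost
apply_setting_instruction(trigger_map, "full_palm_contour", "defrost_windshield")
print(trigger_map["full_palm_contour"])  # defrost_windshield
```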
The following describes the control method of the vehicle according to the present application in detail with reference to fig. 2 to 14 as a specific example.
As shown in fig. 14, the embodiment of the present application includes the following steps:
Step S1401: detect whether the vehicle has enabled the operation interface. In actual execution, a driver may forget whether the trigger function is enabled, so that the operation interface does not respond as expected. To avoid this, the embodiment of the present application can, after detecting that the driver has triggered the vehicle operation interface, determine whether the trigger function is enabled: if the current vehicle has enabled it, the recognition of the subsequent steps is performed; if not, an identification-invalid prompt is issued so that the driver receives feedback in time.
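Step S1401 can be sketched as a simple gate. The boolean flag and the placeholder return values are assumptions for illustration.

```python
# Hypothetical gate for step S1401: proceed to recognition only when the
# trigger function is enabled; otherwise issue an identification-invalid
# prompt so the driver receives feedback in time.

def handle_interface_touch(trigger_function_enabled):
    if not trigger_function_enabled:
        return "identification_invalid_prompt"
    return "proceed_to_recognition"

print(handle_interface_touch(True))   # proceed_to_recognition
print(handle_interface_touch(False))  # identification_invalid_prompt
```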
Step S1402: detect whether the driver has triggered the operation interface. It will be appreciated that the driver may trigger the vehicle operation interface in a number of ways, such as fingerprint recognition or voice recognition. The embodiment of the present application can detect whether the driver has triggered the operation interface by, for example, checking whether the interface is lit or whether it is drawing significant power.
It should be noted that, a person skilled in the art may set the manner of triggering the vehicle operation interface and determining whether the driver triggers the vehicle operation interface according to the actual situation, which is not limited herein.
Step S1403: a trigger profile and/or trigger trajectory of the driver is identified. Specifically, when the driver is detected to trigger the vehicle operation interface, the trigger profile and/or the trigger track of the driver can be identified.
For example, the embodiment of the present application can identify the contact contour between the driver's hand and the operation interface, such as the contour and area of the contact between the driver's finger or palm and the interface. It can also identify the trigger trajectory, i.e., the sliding trajectory, of the driver's finger on the operation interface: a trigger trajectory can be preset, and when the driver is in a trajectory-input state, i.e., performing a sliding operation with a finger, the sliding trajectory is detected, and the trajectory input can be determined to be valid when it matches the preset trigger trajectory.
It should be noted that the preset trigger trajectory may be preset by the driver according to the habit of the driver, and is not limited herein.
Step S1404: extract the driver's trigger features and match them against the trigger instructions in the database. In actual execution, the embodiment of the present application can preset a database storing the trigger features capable of triggering instructions and the vehicle actions matched with each trigger instruction.
Further, the embodiment of the present application can extract at least one trigger feature from the trigger profile and/or trigger trajectory, match the extracted feature against the trigger instructions in the preset database, and control the vehicle to execute the corresponding action according to the matching result. This mode of operation is simple and easy to recognize, which makes it convenient for the driver to operate blindly while driving, improving the driving experience while ensuring driving safety.
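The extract-and-match flow described above can be sketched as a dictionary lookup, assuming a feature string serves as the index into the preset database. The feature and action names are invented for illustration; a `None` result stands for a failed match, which the vehicle answers with a feedback prompt.

```python
# Minimal sketch of the match-and-execute flow. The database contents
# are hypothetical placeholders, not instructions defined by the patent.

PRESET_DATABASE = {
    "full_palm_contour": "open_sunroof",
    "three_finger_swipe": "next_track",
}

def match_and_execute(trigger_features):
    """Query the preset database using each extracted trigger feature as
    an index; return the matched trigger instruction, or None to signal
    that the vehicle should give a feedback prompt."""
    for feature in trigger_features:
        instruction = PRESET_DATABASE.get(feature)
        if instruction is not None:
            return instruction
    return None  # match failed -> feedback prompt

print(match_and_execute(["full_palm_contour"]))  # open_sunroof
print(match_and_execute(["unknown_gesture"]))    # None
```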
By way of example, the triggering characteristics may include the following:
I. Judging according to the contour
1. Full palm, complete figure:
As shown in fig. 2 to 4, when the touch screen detects the input of a palm graphic profile, the input can be determined to be valid regardless of the angle and orientation of the palm profile on the touch screen, or of whether the profile belongs to the left or right hand.
Fig. 2 shows right-hand contour (upright) input on a conventional portrait screen; fig. 3 shows right-hand full-contour (rightward) input on a conventional landscape screen; fig. 4 shows right-hand full-contour (tilted) input on a conventional landscape screen.
2. Partial palm, incomplete figure:
As shown in fig. 5 to 7, when the touch screen detects the input of a palm graphic profile that is not a complete palm profile, the input can still be determined to be valid if, after verification and comparison by the system, it meets the matching criteria of the palm profile information base stored in the system and the matching degree is sufficiently similar.
For example, after the contour information obtained in fig. 5 is compared with the complete palm contour information in the information base, finger contours and a partial palm contour are found, and the similarity reaches 75%, so the input can be determined to be valid.
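One way to realize such a similarity comparison — purely an illustrative assumption, since the patent does not specify the matching algorithm — is to count matching cells between a flattened binary mask of the input contour and a stored palm template, and accept the input above a cut-off like the 75% in the example.

```python
# Hedged sketch: contour similarity as the fraction of matching cells
# between an input mask and a stored template, both flattened binary
# grids of equal size. Masks and threshold are invented for the example.

SIMILARITY_THRESHOLD = 0.75  # mirrors the 75% figure in the text

def contour_similarity(input_mask, template_mask):
    """Similarity = number of matching cells / total cells."""
    matches = sum(a == b for a, b in zip(input_mask, template_mask))
    return matches / len(template_mask)

template = [1, 1, 1, 1, 0, 0, 1, 1]  # stored full-palm contour (flattened)
partial  = [1, 1, 1, 0, 0, 0, 1, 1]  # fingers plus partial palm

sim = contour_similarity(partial, template)
print(sim, sim >= SIMILARITY_THRESHOLD)  # 0.875 True -> valid input
```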
Fig. 5 shows right-hand partial-contour (complete fingers plus partial palm) input on a conventional landscape screen; fig. 6 shows right-hand partial-contour (complete palm plus partial fingers) input on a conventional landscape screen; fig. 7 shows right-hand partial-contour (complete fingers plus partial palm) input on a conventional landscape screen.
3. Distance between any two fingers:
As shown in fig. 8, when the touch screen detects the input of a palm graphic profile, the current palm profile is identified and the distance between any two fingers is calculated from the profile. The embodiment of the present application can compare the calculated result with a preset threshold: for example, the input may be determined to be valid when the result is greater than the threshold, or, alternatively, when the result is less than or equal to the threshold.
In addition, when performing profile recognition, besides recognizing the whole palm graphic profile and calculating the distance between any two fingers from it, the embodiment of the present application can also recognize and calculate that distance from a partial palm graphic profile, for example when the touch screen detects only finger contours. In that case, the calculated result is likewise compared with the preset threshold: the input may be determined to be valid when the result is greater than the threshold, or when it is less than or equal to the threshold. Moreover, different trigger features can be extracted according to the actually measured distance between the two fingers, and different functions can then be started according to the mapping.
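The distance-based branching above might be sketched as follows. The threshold value and the feature names are assumptions for illustration; as the text notes, the actual threshold would be calibrated to the individual driver.

```python
# Illustrative sketch: map the measured inter-finger distance to a
# trigger feature depending on which side of a preset threshold it
# falls. The 40 mm default is an invented calibration value.

def classify_finger_distance(distance_mm, threshold_mm=40.0):
    """Return a different trigger feature for distances above or at/below
    the preset threshold, so different functions can be started."""
    if distance_mm > threshold_mm:
        return "fingers_spread"    # e.g. mapped to one function
    return "fingers_together"      # e.g. mapped to another function

print(classify_finger_distance(55.0))  # fingers_spread
print(classify_finger_distance(20.0))  # fingers_together
```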
It should be noted that palm size and inter-finger distance vary from person to person, so the actual threshold should be adjusted to the driver's situation; the specific value may be set by a person skilled in the art and is not specifically limited herein.
II. Judging according to the area
1. Combined finger and palm area:
as shown in fig. 9-11, when a large area graph (compared with the area of a fingertip click region) is monitored to be input by the touch screen, if the area graph information is verified and compared by the system, and meets the matching criteria of the related palm area information base stored in the system, and the matching degree can reach a relatively similar degree, it can be determined as an effective input.
Fig. 9 shows a screen input in which the palm is pressed firmly and the palm area information is complete; fig. 10 shows a screen input in which the palm is pressed moderately, so that part of the palm area information is indistinct or the graphics are not continuous; fig. 11 shows a screen input in which the palm is pressed lightly and the palm area information is incomplete, but several key nodes are present (e.g., four to five finger pads, the thenar muscles, the heel of the palm, etc.).
2. Finger/palm area only
As shown in fig. 12 and 13, when the touch screen detects the input of multi-segment finger graphics or a large-area palm-surface graphic, the input can be determined to be valid if, after verification and comparison by the system, the area graphic information meets the matching criteria of the palm area information base stored in the system and the matching degree is sufficiently similar.
Fig. 12 shows a discontinuous, multi-segment, multi-block screen input in which only the finger portion of the palm is input; fig. 13 shows a screen input in which only the palm-surface portion of the palm is input as an area, together with multi-segment, multi-block palm-mound graphics.
In addition, if the input palm area graphic information is difficult to match and recognize, or multiple discontinuous or indistinct graphic segments and blocks are input, an area recognition threshold can be preset in the embodiment of the present application, and the palm information input can be determined to be valid when the total area exceeds this preset threshold. For example, when a user operates with a conventional finger tap, the contact area is small because such operation is mostly single-point or multi-point touch; the touch area of a conventional finger action can therefore be taken as the preset threshold, and the palm information input can be determined to be valid when the total area exceeds it.
It should be noted that the preset threshold derived from the touch area of a conventional finger action may be set by a person skilled in the art according to the actual situation, and is not specifically limited herein.
In addition, the at least one trigger feature of the embodiment of the present application may also be the actual distance between any two fingers of the palm, the actual contact strength between the palm and the vehicle operation interface, the number of triggers between the palm and the vehicle operation interface within a preset duration, and the like.
It should be noted that the specific trigger feature can be set by those skilled in the art according to practical situations, and is not limited in particular here.
Step S1405: execute the trigger instruction. It should be noted that, when the driver is driving the vehicle, his or her attention is mainly focused on the driving situation, so that when the operation interface is operated, an unexpected condition may prevent the at least one extracted trigger feature from being matched with a trigger instruction in the preset database.
In addition, when the vehicle executes the corresponding action, the action may fail due to factors such as a fault in the corresponding equipment, so that the vehicle cannot meet the driver's needs.
Therefore, when the matching fails or the corresponding action fails to execute, the vehicle can be controlled to give a feedback prompt, which prevents the driver from being distracted because the vehicle cannot perform the expected action and improves the driving experience.
In addition, in actual execution, the mapping relationship between trigger features and trigger instructions in the preset database may need to be modified due to external factors (such as a change of season or of daily road conditions) or internal factors (such as a change in driving habits). The embodiment of the present application can receive the driver's setting instruction and modify the mapping accordingly, ensuring that the driver can adjust the vehicle to the most suitable driving state and improving the driving experience.
According to the vehicle control method proposed in the embodiment of the present application, the driver's trigger profile and/or trigger trajectory on the vehicle operation interface can be identified, trigger features can be extracted from them, and the vehicle can be controlled to execute the corresponding action according to the preset instruction matched with those features. This mode of operation is simple and easy to recognize, making it convenient for the driver to operate blindly while driving, improving the driving experience while ensuring driving safety. It thereby solves the technical problems of the related art, in which multiple operations are defined by multi-action finger gestures and data monitoring and interactive response are carried out through single-point and multi-point touch on a small fingertip-sized touch area: the operation is complex, too many functions are carried, and the driver can hardly operate blindly while driving, which distracts attention and easily causes traffic accidents.
Next, a control device of a vehicle according to an embodiment of the present application is described with reference to the drawings.
Fig. 15 is a block diagram schematically illustrating a control device of a vehicle according to an embodiment of the present application.
As shown in fig. 15, the control device 10 for a vehicle includes: a detection module 100, an identification module 200 and a control module 300.
Specifically, the detection module 100 is configured to detect whether a driver triggers a vehicle operation interface.
The identification module 200 is configured to identify a trigger profile and/or a trigger trajectory of the driver when it is detected that the driver has triggered the vehicle operation interface.
The control module 300 is configured to extract at least one trigger feature from the trigger profile and/or the trigger trajectory, query a preset database by using the at least one trigger feature as an index, match a trigger instruction corresponding to the at least one trigger feature, and control the vehicle to execute a corresponding action based on the trigger instruction.
Optionally, in an embodiment of the present application, the at least one trigger feature includes one or more of the actual distance between any two fingers of the palm, the actual contact area between the palm and the vehicle operation interface, the contact profile between the palm and the vehicle operation interface, the actual contact strength between the palm and the vehicle operation interface, and the number of triggers between the palm and the vehicle operation interface within a preset time period.
Optionally, in an embodiment of the present application, the control device 10 of the vehicle further includes: and a reminding module.
The reminding module is configured to control the vehicle to give a feedback prompt when the matching fails or the corresponding action fails to execute.
Optionally, in an embodiment of the present application, the control device 10 of the vehicle further includes: the device comprises a receiving module and a correcting module.
The receiving module is used for receiving a setting instruction of a driver.
And the correction module is used for modifying the mapping relation between the trigger characteristics and the trigger instruction in the preset database according to the setting instruction.
Optionally, in an embodiment of the present application, the detection module 100 further includes: a judging unit and a reminding unit.
The judging unit is configured to judge whether the vehicle has enabled the trigger function.
The reminding unit is configured to issue an identification-invalid prompt when the trigger function is not enabled.
It should be noted that the foregoing explanation of the embodiment of the control method for the vehicle is also applicable to the control device for the vehicle in this embodiment, and the details are not repeated here.
According to the vehicle control device proposed in the embodiment of the present application, trigger features can be extracted from the driver's trigger profile and/or trigger trajectory on the vehicle operation interface, and the vehicle can be controlled to execute the corresponding action according to the preset instruction matched with those features. This mode of operation is simple and easy to recognize, making it convenient for the driver to operate blindly while driving, improving the driving experience while ensuring driving safety. It thereby solves the technical problems of the related art, in which multiple operations are defined by multi-action finger gestures and data monitoring and interactive response are carried out through single-point and multi-point touch on a small fingertip-sized touch area: the operation is complex, too many functions are carried, and the driver can hardly operate blindly while driving, which distracts attention and easily causes traffic accidents.
Fig. 16 is a schematic structural diagram of a vehicle according to an embodiment of the present application. The vehicle may include:
memory 1601, processor 1602, and a computer program stored on memory 1601 and operable on processor 1602.
The processor 1602 executes the program to implement the control method of the vehicle provided in the above-described embodiment.
Further, the vehicle further includes:
a communication interface 1603 for communication between the memory 1601 and the processor 1602.
A memory 1601 is used to store computer programs that can be run on the processor 1602.
Memory 1601 may comprise high-speed RAM memory, and may also include non-volatile memory (non-volatile memory), such as at least one disk memory.
If the memory 1601, the processor 1602 and the communication interface 1603 are implemented independently, the communication interface 1603, the memory 1601 and the processor 1602 may be connected to each other via a bus and perform communication with each other. The bus may be an Industry Standard Architecture (ISA) bus, a Peripheral Component Interconnect (PCI) bus, an Extended ISA (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, etc. For ease of illustration, only one thick line is shown in FIG. 16, but this is not intended to represent only one bus or type of bus.
Alternatively, in practical implementation, if the memory 1601, the processor 1602 and the communication interface 1603 are implemented by being integrated on one chip, the memory 1601, the processor 1602 and the communication interface 1603 may communicate with each other via an internal interface.
The processor 1602 may be a Central Processing Unit (CPU), an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits configured to implement embodiments of the present Application.
The present embodiment also provides a computer-readable storage medium on which a computer program is stored, which when executed by a processor, implements the control method of the vehicle as above.
In the description of the present specification, reference to the description of "one embodiment," "some embodiments," "an example," "a specific example," or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present application. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or N embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "N" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more N executable instructions for implementing steps of a custom logic function or process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of implementing the embodiments of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or N wires, a portable computer diskette (magnetic device), a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CDROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, via for instance optical scanning of the paper or other medium, then compiled, interpreted or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the N steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or combination of the following techniques, which are known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional units in the embodiments of the present application may be integrated into one processing module, or each unit may exist alone physically, or two or more units are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc. Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (10)

1. A control method of a vehicle, characterized by comprising the steps of:
detecting whether a driver triggers a vehicle operation interface;
identifying a trigger profile and/or trigger trajectory of the driver upon detecting the driver triggering the vehicle operation interface; and
extracting at least one trigger feature from the trigger contour and/or the trigger track, inquiring a preset database by taking the at least one trigger feature as an index, matching a trigger instruction corresponding to the at least one trigger feature, and controlling the vehicle to execute a corresponding action based on the trigger instruction.
2. The method of claim 1, wherein the at least one triggering characteristic comprises one or more of an actual distance between any two fingers of a palm, an actual contact area between the palm and the vehicle operator interface, a contact profile between the palm and the vehicle operator interface, an actual contact strength between the palm and the vehicle operator interface, and a number of triggers within a preset duration between the palm and the vehicle operator interface.
3. The method of claim 1, further comprising:
and controlling the vehicle to perform feedback prompt when the matching fails or the corresponding action is failed to be executed.
4. The method of claim 1, further comprising:
receiving a setting instruction of the driver;
and modifying the mapping relation between the trigger characteristics and the trigger instruction in the preset database according to the setting instruction.
5. The method of claim 1, after detecting that the driver triggers the vehicle operator interface, further comprising:
judging whether the vehicle starts a trigger function or not;
and if the vehicle does not start the trigger function, sending an identification invalid prompt.
6. A control apparatus of a vehicle, characterized by comprising:
the detection module is used for detecting whether a driver triggers a vehicle operation interface;
the identification module is used for identifying a trigger profile and/or a trigger track of the driver when the driver is detected to trigger the vehicle operation interface; and
the control module is used for extracting at least one trigger feature from the trigger contour and/or the trigger track, inquiring a preset database by taking the at least one trigger feature as an index, matching a trigger instruction corresponding to the at least one trigger feature, and controlling the vehicle to execute a corresponding action based on the trigger instruction.
7. The apparatus of claim 6, wherein the at least one trigger feature comprises one or more of an actual distance between any two fingers of a palm, an actual contact area between the palm and the vehicle operator interface, a contact profile between the palm and the vehicle operator interface, an actual contact strength between the palm and the vehicle operator interface, and a number of triggers within a preset time period between the palm and the vehicle operator interface.
8. The apparatus of claim 6, wherein the detection module further comprises:
the judging unit is used for judging whether the vehicle starts a triggering function or not;
and the reminding unit is used for sending an identification invalid reminder when the triggering function is not started by the vehicle.
9. A vehicle, characterized by comprising: memory, a processor and a computer program stored on the memory and executable on the processor, the processor executing the program to implement the control method of the vehicle according to any one of claims 1 to 5.
10. A computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the vehicle control method according to any one of claims 1 to 5.
CN202210172382.0A 2022-02-24 2022-02-24 Vehicle control method and device, vehicle and storage medium Pending CN114510193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210172382.0A CN114510193A (en) 2022-02-24 2022-02-24 Vehicle control method and device, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN114510193A true CN114510193A (en) 2022-05-17

Family

ID=81553303

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210172382.0A Pending CN114510193A (en) 2022-02-24 2022-02-24 Vehicle control method and device, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN114510193A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160291862A1 (en) * 2015-04-02 2016-10-06 Inpris Innovative Products From Israel Ltd System, apparatus and method for vehicle command and control
CN108944737A (en) * 2017-05-18 2018-12-07 德尔福技术有限责任公司 For the sliding contact control unit in the control panel of motor vehicles
CN111176443A (en) * 2019-12-12 2020-05-19 青岛小鸟看看科技有限公司 Vehicle-mounted intelligent system and control method thereof
CN113840766A (en) * 2021-03-31 2021-12-24 华为技术有限公司 Vehicle control method and device

Similar Documents

Publication Publication Date Title
CN105824559B (en) False touch recognition and processing method and electronic equipment
EP0690407B1 (en) Image recognition apparatus and method
JP4899806B2 (en) Information input device
CN103853481B (en) Method and system for simulating touch screen mobile terminal key
US10035539B2 (en) Steering wheel control system
CN107547738B (en) Prompting method and mobile terminal
CN109947348B (en) Method for selecting items based on vehicle-mounted touch screen and vehicle-mounted touch screen
KR101591586B1 (en) Data processing apparatus which detects gesture operation
JP2004355426A (en) Software for enhancing operability of touch panel and terminal
CN110588759B (en) Vehicle steering control method and device
JPWO2013137455A1 (en) Information terminal and execution control method
CN110049892A (en) The operating device of information entertainment for motor vehicles, operator for knowing this operating element place at method
CN114510193A (en) Vehicle control method and device, vehicle and storage medium
EP3046010A1 (en) System and method for guarding emergency and critical touch targets
US11221735B2 (en) Vehicular control unit
CN113799791B (en) Steering wheel key false touch prevention method and device based on camera and vehicle
US20200371661A1 (en) Method of switching operation mode of touch panel
CN115431995B (en) Equipment control method and device based on different-level auxiliary driving
CN111409459B (en) Vehicle-mounted function control method, control device, vehicle and storage medium
CN112698745A (en) Control display method and electronic equipment
JP2014056460A (en) Input device
CN114217716A (en) Menu bar display method and device and electronic equipment
CN112835661A (en) On-board auxiliary system, vehicle comprising same, and corresponding method and medium
US20210019003A1 (en) Input control device and input control method
CN110949284A (en) Vehicle, and control method and control device thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination