CN117675995A - Vehicle control method, mobile terminal, vehicle and storage medium - Google Patents

Vehicle control method, mobile terminal, vehicle and storage medium

Publication number: CN117675995A
Application number: CN202311866750.2A
Authority: CN (China)
Original language: Chinese (zh)
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Inventors: 林燕婷, 陈思云, 王肖
Applicant and current assignee: Guangzhou Xiaopeng Motors Technology Co Ltd
Classification: Y02P 90/02 — Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The application discloses a vehicle control method, a mobile terminal, a vehicle and a storage medium. The vehicle control method comprises the following steps: generating a scene entry instruction in response to an operation on an operation control corresponding to a preset operation scene, wherein the operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition; and sending the scene entry instruction to a target vehicle, so that the target vehicle receives the scene entry instruction through its TBox and, according to the scene entry instruction, executes the custom action through the vehicle-mounted system while skipping the custom trigger condition. With this scheme, the waiting time associated with the custom trigger condition can be avoided, the user's immediate needs can be met, and the user experience is effectively improved.

Description

Vehicle control method, mobile terminal, vehicle and storage medium
Technical Field
The present disclosure relates to the field of vehicle control technologies, and in particular, to a vehicle control method, a mobile terminal, a vehicle, and a storage medium.
Background
With the progress of technology, automobile intelligentization has become an inevitable trend in the development of the industry. A user can customize operation scenes for the automobile according to personal habits and needs: when a preset trigger condition is reached, the vehicle-mounted system automatically executes the corresponding actions, such as setting a navigation route, adjusting the seat position, adjusting the sound system, or adjusting the air-conditioning temperature, thereby providing a convenient driving experience.
However, in the prior art, user-defined operation scenes and trigger conditions are usually based on daily usage needs, and their ability to handle immediate needs is insufficient. If only preset trigger conditions are relied upon, a long time may pass before the corresponding actions are triggered.
In short, the current operation-scene response mode of vehicles cannot meet the user's immediate needs in time.
Disclosure of Invention
The main purpose of the present application is to provide a vehicle control method, a mobile terminal, a vehicle and a storage medium, aiming to solve the problem that the current vehicle operation-scene response mode cannot meet the user's immediate needs in time.
To achieve the above object, the present application provides a vehicle control method applied to a mobile terminal, wherein the mobile terminal and a target vehicle are associated in advance with the same user account, and the target vehicle is provided with a TBox and a vehicle-mounted system. The method includes:
generating a scene entry instruction in response to an operation on an operation control corresponding to a preset operation scene, wherein the operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition; and
sending the scene entry instruction to the target vehicle, so that the target vehicle receives the scene entry instruction through the TBox and, according to the scene entry instruction, executes the custom action through the vehicle-mounted system while skipping the custom trigger condition.
Optionally, the configuration process of the operation scene includes:
responding to a scene creation operation, creating the operation scene, and entering an editing interface of the operation scene;
responding to trigger condition editing operation in the editing interface, and determining the custom trigger condition;
determining the custom action in response to an action editing operation in the editing interface;
and responding to a save operation in the editing interface, and saving the running scene.
Optionally, after the step of generating the scene entry instruction in response to the operation on the operation control corresponding to the preset operation scene, and before the step of sending the scene entry instruction to the target vehicle, the method further includes:
detecting a network connection state between the mobile terminal and the target vehicle;
if the network connection state is abnormal, pushing a network-abnormality prompt message;
if the network connection state is normal, executing the step of sending the scene entry instruction to the target vehicle.
The embodiment of the present application further provides a vehicle control method applied to a target vehicle, wherein the target vehicle and a mobile terminal are associated in advance with the same user account, and the target vehicle is provided with a TBox and a vehicle-mounted system. The method includes:
receiving a scene entry instruction through the TBox, wherein the scene entry instruction is generated by the mobile terminal in response to an operation on an operation control corresponding to a preset operation scene and is sent by the mobile terminal to the target vehicle, and the operation scene comprises a custom trigger condition other than the operation on the operation control and a custom action whose execution is triggered by the custom trigger condition; and
executing, according to the scene entry instruction, the custom action through the vehicle-mounted system while skipping the custom trigger condition.
Optionally, after the step of receiving the scene entry instruction through the TBox, and before the step of executing the custom action through the vehicle-mounted system while skipping the custom trigger condition according to the scene entry instruction, the method further includes:
waking up the vehicle-mounted system according to the scene entry instruction when the vehicle-mounted system is in a dormant state.
Optionally, after the step of receiving the scene entry instruction through the TBox, and before the step of executing the custom action through the vehicle-mounted system while skipping the custom trigger condition according to the scene entry instruction, the method further includes:
detecting a gear of the target vehicle;
if the gear is the parking gear, executing the step of executing the custom action through the vehicle-mounted system while skipping the custom trigger condition according to the scene entry instruction.
Optionally, the custom trigger condition includes a primary custom trigger condition and a secondary custom trigger condition, and the custom action includes a primary custom action and a secondary custom action. The step of executing the custom action through the vehicle-mounted system while skipping the custom trigger condition according to the scene entry instruction includes:
executing, according to the scene entry instruction, the primary custom action through the vehicle-mounted system while skipping the primary custom trigger condition.
After this step, the method further includes:
starting, through the vehicle-mounted system, a monitoring process for the secondary custom trigger condition; and
triggering execution of the secondary custom action when the secondary custom trigger condition is met.
The embodiment of the application also provides a mobile terminal, which comprises a memory, a processor and a vehicle control program stored in the memory and capable of running on the processor, wherein the vehicle control program realizes the steps of the vehicle control method when being executed by the processor.
The embodiment of the application also provides a vehicle, which comprises a memory, a processor and a vehicle control program stored on the memory and capable of running on the processor, wherein the vehicle control program realizes the steps of the vehicle control method when being executed by the processor.
The embodiments of the present application also propose a computer-readable storage medium, on which a vehicle control program is stored, which when executed by a processor implements the steps of the vehicle control method as described above.
According to the vehicle control method, the mobile terminal, the vehicle and the storage medium of the present application, a scene entry instruction is generated in response to an operation on an operation control corresponding to a preset operation scene, wherein the operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition; the scene entry instruction is sent to the target vehicle, so that the target vehicle receives it through the TBox and, according to the instruction, executes the custom action through the vehicle-mounted system while skipping the custom trigger condition. With this scheme, the user can operate the operation control of an operation scene through the mobile terminal, and the mobile terminal responds by generating and sending a scene entry instruction to the target vehicle. After receiving the scene entry instruction through the TBox, the target vehicle skips the check of the custom trigger condition according to the instruction and directly executes the custom action through the vehicle-mounted system. The waiting time associated with the custom trigger condition is thus avoided, the user's immediate needs are met, and the user experience is effectively improved.
Drawings
FIG. 1 is a flow chart of a first exemplary embodiment of a vehicle control method of the present application;
FIG. 2 is a schematic diagram of an operation scene control interface according to the vehicle control method of the present application;
FIG. 3 is another schematic diagram of an operation scene control interface according to the vehicle control method of the present application;
FIG. 4 is a flow chart of a second exemplary embodiment of a vehicle control method of the present application;
FIG. 5 is a schematic illustration of an editing interface involved in the vehicle control method of the present application;
FIG. 6 is a flow chart of a third exemplary embodiment of a vehicle control method of the present application;
FIG. 7 is a flow chart of a fourth exemplary embodiment of a vehicle control method of the present application;
FIG. 8 is a flow chart of a fifth exemplary embodiment of a vehicle control method of the present application;
FIG. 9 is a flowchart of a sixth exemplary embodiment of a vehicle control method of the present application;
fig. 10 is a flowchart illustrating a seventh exemplary embodiment of a vehicle control method according to the present application.
The realization, functional characteristics, and advantages of the present application will be further described with reference to the embodiments and the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The main solutions of the embodiments of the present application are: generating a scene entry instruction in response to an operation on an operation control corresponding to a preset operation scene, wherein the operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition; sending the scene entry instruction to the target vehicle, so that the target vehicle receives it through the TBox; and, according to the scene entry instruction, executing the custom action through the vehicle-mounted system while skipping the custom trigger condition. With this scheme, the user can operate the operation control of an operation scene through the mobile terminal, and the mobile terminal responds by generating and sending a scene entry instruction to the target vehicle. After receiving the scene entry instruction through the TBox, the target vehicle skips the check of the custom trigger condition according to the instruction and directly executes the custom action through the vehicle-mounted system. The waiting time associated with the custom trigger condition is thus avoided, the user's immediate needs are met, and the user experience is effectively improved.
Referring to FIG. 1, a first embodiment of the vehicle control method according to the present application provides a method applied to a mobile terminal, wherein the mobile terminal and a target vehicle are associated in advance with the same user account, and the target vehicle is provided with a TBox and a vehicle-mounted system. The method includes:
Step A10: generating a scene entry instruction in response to an operation on an operation control corresponding to a preset operation scene, wherein the operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition.
in particular, with the advancement of technology, automobile intellectualization has become a necessary trend in industry development. The user can customize the running scene of the automobile according to personal habits and requirements. When the preset triggering condition is reached, the vehicle machine system can automatically execute corresponding actions, such as setting a navigation route, adjusting the seat position, adjusting the sound system, adjusting the temperature of an air conditioner and the like, so that convenient vehicle experience is brought. However, in the prior art, the user-defined operation scenario and the trigger condition are often based on daily use requirements, and the processing capability for the instant requirement is insufficient. If only preset trigger conditions are relied upon, a longer period of time may be required to trigger the execution of the corresponding action. In short, the current running scene response mode of the vehicle cannot meet the instant requirement of the user in time.
Therefore, this embodiment provides a vehicle control method capable of meeting immediate needs: by operating the mobile terminal, the user can make the target vehicle skip the custom trigger condition and directly execute the custom action, avoiding the waiting time associated with the custom trigger condition.
The mobile terminal in this embodiment may be a terminal device such as a mobile phone, tablet computer, notebook computer or smart wearable device; it has networking capability and can communicate with the target vehicle over a network. The target vehicle is a smart vehicle provided with a TBox (Telematics Box) and a vehicle-mounted system. The TBox is generally used for in-vehicle communication and can implement functions such as remote monitoring, information interaction and fault diagnosis for the target vehicle; the vehicle-mounted system is the intelligent control system inside the target vehicle and can implement various intelligent operations and management of the vehicle.
First, the user configures an operation scene through the mobile terminal. The operation scene comprises a custom trigger condition other than the operation on the operation control, and a custom action whose execution is triggered by the custom trigger condition. The name or identifier of the operation scene can be set by the user. The custom trigger condition is a condition evaluated against one or more items of the target vehicle's perception data or system data; when the custom trigger condition is met, the corresponding custom action is triggered and executed. The custom action corresponds to the custom trigger condition and serves the user's needs in the operation scene, for example: adjusting the car configuration (seat position, steering-wheel angle, rear-view mirror position, etc.); changing the driving mode (operation modes of the engine, steering system, air conditioner, etc.); adjusting the on-board equipment (sound system, navigation system, lighting system, air-conditioning system, etc.); or setting up on-board applications (starting or closing an on-board APP (Application), or personalizing functions within an APP). Many other types of custom actions exist and are not described in detail here.
Optionally, the above perception data are data obtained by the target vehicle in real time through its sensing capabilities, such as motion-state data (position, speed, acceleration, etc.), energy-state data (fuel level, battery level, engine state, etc.), external environment data (weather conditions, road conditions, traffic signals, etc.), and intelligent-driving detection data for pedestrians, vehicles and obstacles around the vehicle. The above system data are data the target vehicle can obtain locally, such as the system time and system date. Many other types of perception data and system data exist and are not described in detail here.
It will be appreciated that the custom trigger condition is evaluated against one or more items of the target vehicle's perception data or system data, whereas the operation on the operation control is performed by the user via the mobile terminal; the operation on the operation control is therefore not part of the custom trigger condition.
After the configuration of the operation scene is completed, the user can see the operation control corresponding to the operation scene on the display interface of the mobile terminal. The operation control may be a button control, a slider control, a switch control, or the like; its form is not limited. The user can operate the operation control, for example by clicking a button control, dragging a slider control, or turning on a switch control. Accordingly, the mobile terminal generates a scene entry instruction in response to the user's operation on the operation control. The scene entry instruction is used to make the target vehicle enter the operation scene and execute the corresponding custom actions.
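As a purely illustrative sketch (not part of the patent, and not a real Xpeng API — the field names and JSON schema below are assumptions), the mobile terminal's handling of a tap on the operation control might look like:

```python
import json
import time

def on_operation_control_tapped(scene_id: str, user_account: str) -> str:
    """Build a scene-entry instruction when the user taps the operation
    control of a preset operation scene. The wire format is hypothetical;
    the patent does not specify one."""
    instruction = {
        "type": "SCENE_ENTRY",
        "scene_id": scene_id,
        "account": user_account,
        # Key point of the method: tell the vehicle to skip the custom
        # trigger condition and execute the custom action directly.
        "skip_trigger_conditions": True,
        "timestamp_s": int(time.time()),
    }
    return json.dumps(instruction)
```

The resulting string would then be sent to the target vehicle over the network, to be received by the TBox.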
Step A20: sending the scene entry instruction to the target vehicle, so that the target vehicle receives the scene entry instruction through the TBox and, according to the scene entry instruction, executes the custom action through the vehicle-mounted system while skipping the custom trigger condition.
Specifically, the TBox operates and remains in a standby state regardless of whether the vehicle-mounted system of the target vehicle has been started. After generating the scene entry instruction, the mobile terminal sends it to the target vehicle over the network. Accordingly, the target vehicle receives the scene entry instruction through the TBox. The target vehicle then, according to the scene entry instruction, skips the custom trigger condition through the vehicle-mounted system and directly executes the custom action, thereby avoiding the waiting time associated with the custom trigger condition.
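A minimal vehicle-side sketch of this receive, wake-if-needed, skip-and-execute path follows; all structure and field names here are assumptions for illustration only:

```python
def handle_scene_entry(instruction: dict, head_unit: dict) -> list:
    """TBox-side sketch: wake the vehicle-mounted system if it is dormant,
    look up the scene, and run its custom actions directly, deliberately
    skipping evaluation of the custom trigger conditions."""
    if head_unit["state"] == "dormant":
        head_unit["state"] = "awake"  # wake-up path described in the text
    scene = head_unit["scenes"][instruction["scene_id"]]
    # Note: scene["trigger_conditions"] is never evaluated on this path.
    executed = list(scene["actions"])
    head_unit["last_executed"] = executed
    return executed

# Example: a dormant head unit with one configured scene.
head_unit = {
    "state": "dormant",
    "scenes": {
        "scorching_summer": {
            "trigger_conditions": ["driver_seated", "cabin_temp >= 35"],
            "actions": ["ac_on", "set_temp_18", "seat_vent_off"],
        }
    },
}
done = handle_scene_entry({"scene_id": "scorching_summer"}, head_unit)
```

After the call, the head unit is awake and the three actions have been dispatched without any condition check.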
In one case, the vehicle-mounted system is already awake when the target vehicle receives the scene entry instruction through the TBox, and it can execute the custom action directly. In another case, the vehicle-mounted system is in a dormant state when the instruction is received, and it must first be woken up before it can execute the custom action.
Note that, to implement secure vehicle control, the mobile terminal and the target vehicle must be associated with the same user account in advance. One way to associate them is: a vehicle-control APP is installed on both the mobile terminal and the target vehicle, and both log in to the same user account in the corresponding vehicle-control APP.
Taking FIG. 2 as an example, FIG. 2 is a schematic diagram of an operation-scene control interface involved in the vehicle control method of the present application. The control interface is displayed on the mobile terminal, and an operation scene named "Scorching Summer" is defined on it. The custom trigger conditions of this operation scene are that the driver is seated in the vehicle and the in-vehicle temperature is 35 °C or above; the corresponding custom actions are: turn on the air-conditioner switch, set the driver-side temperature to 18 °C, and turn off the driver-seat ventilation. The "Run now" button control below is the operation control corresponding to the "Scorching Summer" operation scene. That is, when the driver is seated and the in-vehicle temperature is 35 °C or above, the target vehicle turns on the air conditioner, sets the driver-side temperature to 18 °C, and turns off the driver-seat ventilation. Alternatively, when the user clicks the "Run now" button control, the mobile terminal generates the corresponding scene entry instruction and sends it to the target vehicle; the target vehicle receives the instruction, skips the checks on driver presence and in-vehicle temperature, directly turns on the air conditioner, sets the driver-side temperature to 18 °C, and turns off the driver-seat ventilation.
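This kind of scene could be represented as a small data structure, with an ordinary condition evaluator for the normal (non-skipped) path. The encoding below is a hypothetical illustration; the patent prescribes no data model:

```python
# Hypothetical encoding of the "Scorching Summer" scene: each custom
# trigger condition is a (signal name, predicate) pair.
scene = {
    "name": "Scorching Summer",
    "trigger_conditions": [
        ("driver_seated", lambda v: v is True),
        ("cabin_temp_c", lambda v: v >= 35),
    ],
    "actions": ["ac_on", "driver_temp_18c", "driver_seat_vent_off"],
}

def conditions_met(conditions, sensors: dict) -> bool:
    """Normal path: every custom trigger condition must hold against the
    vehicle's perception data before the actions run."""
    return all(check(sensors[signal]) for signal, check in conditions)

hot_cabin = {"driver_seated": True, "cabin_temp_c": 36}
cool_cabin = {"driver_seated": True, "cabin_temp_c": 28}
```

On the normal path the actions run only when `conditions_met` is true; the scene entry instruction bypasses this evaluation entirely.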
Taking FIG. 3 as an example, FIG. 3 is another schematic diagram of an operation-scene control interface involved in the vehicle control method of the present application. The control interface is displayed on the mobile terminal. In one case, multiple user-defined operation scenes exist: the "commute", "camping mode" and "smart day/night mode" entries shown in FIG. 3 are all operation scenes, each of which may have different custom trigger conditions and custom actions.
In this embodiment, the user can operate the operation control corresponding to an operation scene through the mobile terminal, and the mobile terminal generates and sends a scene entry instruction to the target vehicle in response. After receiving the scene entry instruction through the TBox, the target vehicle, according to the instruction, skips the check of the custom trigger condition through the vehicle-mounted system and directly executes the custom action. In this way, the waiting time associated with the custom trigger condition is avoided, the user's immediate needs are met, and the user experience is effectively improved.
Further, referring to FIG. 4, a second embodiment of the vehicle control method of the present application, based on the embodiment shown in FIG. 1, provides the following configuration process for the operation scene:
Step A011, responding to a scene creation operation, creating the operation scene, and entering an editing interface of the operation scene;
step A012, responding to the trigger condition editing operation in the editing interface, and determining the self-defined trigger condition;
step A013, responding to action editing operation in the editing interface, and determining the custom action;
step a014, in response to a save operation in the editing interface, saving the running scene.
Specifically, when the user wants to create a new operation scene, the mobile terminal provides an interface for editing and configuring it.
First, the user selects "create new scene" or a similar option on the mobile terminal's graphical interface. Accordingly, the mobile terminal responds to the user's scene-creation operation, creates the operation scene, and enters the editing interface of the operation scene. It will be appreciated that the operation scene is at this point a new, blank scene.
The user then edits the custom trigger conditions in the editing interface, by typing or by selecting options; these conditions define when the custom actions are to be executed. Accordingly, the mobile terminal determines the custom trigger conditions in response to the user's trigger-condition editing operations. Note that there may be one or more custom trigger conditions.
In addition, the user edits the custom actions in the editing interface in the same way; the custom actions define what is to be executed when the custom trigger conditions are met. Accordingly, the mobile terminal determines the custom actions in response to the user's action editing operations. Note that there may be one or more custom actions.
After finishing the settings, the user saves the operation scene via a save operation, for example by clicking a "Save" or "Done" button or a similar option in the editing interface. Accordingly, the mobile terminal responds to the save operation and stores the operation scene in a database for later use.
Through the steps, the user can easily create, edit and save the operation scene so as to meet personalized requirements and preferences. This flexibility allows for more intelligent and convenient vehicle control.
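The create, edit, and save flow above can be sketched as a tiny editor object. This is a hypothetical illustration only; the patent does not prescribe any API, and all names are assumptions:

```python
class SceneEditor:
    """Mirrors steps A011-A014: create a blank scene, add custom trigger
    conditions and custom actions, then save it to a store."""

    def __init__(self, store: dict):
        self.store = store  # stands in for the database mentioned in the text
        self.draft = None

    def create_scene(self, name: str) -> None:
        # A011: the scene-creation operation yields a new, blank scene.
        self.draft = {"name": name, "trigger_conditions": [], "actions": []}

    def add_trigger_condition(self, condition: str) -> None:
        # A012: at least one custom trigger condition.
        self.draft["trigger_conditions"].append(condition)

    def add_action(self, action: str) -> None:
        # A013: at least one custom action.
        self.draft["actions"].append(action)

    def save(self) -> dict:
        # A014: the save operation persists the scene for later use.
        saved, self.draft = self.draft, None
        self.store[saved["name"]] = saved
        return saved
```

A typical session would call `create_scene`, then any number of `add_trigger_condition` and `add_action` calls, then `save`.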
Taking fig. 5 as an example, fig. 5 is a schematic diagram of an editing interface related to the vehicle control method of the present application. Entering an editing interface through scene creation operation of a user, and editing custom trigger conditions and custom actions in the editing interface.
Optionally, the custom trigger conditions may include a primary custom trigger condition and a secondary custom trigger condition, and the custom actions may include a primary custom action and a secondary custom action. The primary custom trigger condition triggers execution of the primary custom action, and the secondary custom trigger condition triggers execution of the secondary custom action.
In FIG. 5, the primary custom trigger condition is "the driver is seated and the in-vehicle temperature is 35 °C or above"; the primary custom actions are "turn on the air-conditioner switch; set the driver-side temperature to 18 °C; turn off the driver-seat ventilation". The secondary custom trigger condition is "the in-vehicle temperature is between 20 °C and 30 °C"; the secondary custom actions are "set the driver-seat temperature to 25 °C; set the driver-seat ventilation to level 1".
In the general case, the target vehicle first checks whether the primary custom trigger condition is met and, if so, executes the primary custom action; it then checks whether the secondary custom trigger condition is met and, if so, executes the secondary custom action.
When the user operates the operation control of the operation scene (for example, clicks the one-touch run control), the target vehicle receives a scene entry instruction from the mobile terminal. The target vehicle first skips the primary custom trigger condition according to the scene entry instruction and directly executes the primary custom action through the vehicle-mounted system; it then starts the monitoring process to check whether the secondary custom trigger condition is met, and triggers execution of the secondary custom action if it is.
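A one-step sketch of this primary/secondary flow is shown below. In practice the monitoring process would loop on sensor updates; here a single check stands in for it, and all names are assumptions for illustration:

```python
def run_scene_on_instruction(scene: dict, sensors: dict) -> list:
    """Skip the primary custom trigger condition and execute the primary
    custom actions immediately, then check the secondary custom trigger
    condition once and run the secondary custom actions if it holds."""
    executed = list(scene["primary_actions"])  # primary condition skipped
    if scene["secondary_condition"](sensors):  # stand-in for the monitor
        executed += scene["secondary_actions"]
    return executed

# Hypothetical encoding of the FIG. 5 scene.
fig5_scene = {
    "primary_actions": ["ac_on", "driver_temp_18c", "seat_vent_off"],
    "secondary_condition": lambda s: 20 <= s["cabin_temp_c"] <= 30,
    "secondary_actions": ["driver_temp_25c", "seat_vent_level_1"],
}
```

While the cabin is still hot only the primary actions run; once the cabin temperature falls into the 20-30 °C band, the secondary actions follow.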
In this embodiment, editing of custom trigger conditions and custom actions is achieved by responding to different operations. This configuration process not only provides great flexibility but also lets the user set up personalized operation scenes according to their own needs. Meanwhile, by saving the operation scene, the user can invoke and execute its functions at any time, improving the convenience of configuring and using operation scenes.
Further, referring to FIG. 6, a third embodiment of the vehicle control method of the present application, based on the embodiment shown in FIG. 1, further includes, after step A10 (generating a scene entry instruction in response to an operation on the operation control corresponding to a preset operation scene) and before step A20 (sending the scene entry instruction to the target vehicle):
step A021, detecting the network connection state of the mobile terminal and the target vehicle;
step A022, if the network connection state is abnormal, pushing network abnormality prompt information;
step A023, if the network connection state is not abnormal, executing the step of: sending the scene entry instruction to the target vehicle.
Specifically, to ensure effective control of the target vehicle, the mobile terminal needs to detect the network connection state between itself and the target vehicle before sending a scene entry instruction. Detecting the network connection state may include: (1) Data transmission rate detection: by periodically measuring the data transmission rate, it can be determined whether the network connection is normal; a significant drop or strong fluctuation in the rate may indicate a problem with the connection. (2) Packet loss rate detection: if data packets between the mobile terminal and the target vehicle are frequently lost, the network connection may be unstable; losing a small number of packets is normal, but an excessive loss rate indicates a network problem. (3) Network delay detection: by measuring the delay of data transmission, it can be judged whether the connection is normal; an overly long or strongly fluctuating delay may indicate a problem. (4) Network connectivity detection: by periodically sending probe packets, it is checked whether the mobile terminal and the target vehicle can successfully establish a connection; failed probes may indicate a problem with the network connection.
Based on the above detection process, it can be determined whether or not there is an abnormality in the network connection state. Types of anomalies may include: network disconnection, data transmission delay, data packet loss, network fluctuations.
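Taken together, the checks above amount to classifying a set of measured link metrics into the anomaly types just listed. The following sketch illustrates this in Python; the threshold values and field names are illustrative assumptions, as the text does not specify concrete values:

```python
from dataclasses import dataclass

# Hypothetical thresholds chosen for illustration; the patent does not specify values.
MIN_RATE_KBPS = 50.0   # below this, treat throughput as abnormally low
MAX_LOSS_RATE = 0.10   # above 10% probe-packet loss, treat the link as unstable
MAX_DELAY_MS = 500.0   # above this transmission delay, treat the link as abnormal

@dataclass
class LinkMetrics:
    rate_kbps: float   # measured data transmission rate
    loss_rate: float   # fraction of probe packets lost
    delay_ms: float    # measured transmission delay
    reachable: bool    # result of the connectivity probe

def classify_connection(m: LinkMetrics) -> str:
    """Map measured link metrics to the anomaly types named in the text."""
    if not m.reachable:
        return "network disconnection"
    if m.loss_rate > MAX_LOSS_RATE:
        return "data packet loss"
    if m.delay_ms > MAX_DELAY_MS:
        return "data transmission delay"
    if m.rate_kbps < MIN_RATE_KBPS:
        return "network fluctuation"
    return "normal"
```

In practice the order of the checks and the exact thresholds would be tuned to the vehicle's communication stack; the point is only that each periodic measurement feeds a single normal/abnormal decision.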
If the network connection state is abnormal, the mobile terminal pushes network abnormality prompt information. For example, a text prompt similar to "network abnormality" is displayed on the display interface of the mobile terminal through a popup window, and in addition, various prompt pushing modes such as voice prompt, vibration prompt, status bar prompt and the like can be adopted.
If the network connection state is not abnormal, the mobile terminal proceeds to send the scene entry instruction to the target vehicle as normal.
In this embodiment, detection of the network connection state between the mobile terminal and the target vehicle is added after the scene entry instruction is generated. When the network connection state is abnormal, the mobile terminal pushes a prompt in time, so that the instruction is not sent over a faulty connection; the instruction is sent to the target vehicle only when the network connection is normal, which improves the reliability and stability of the vehicle control method.
Referring to fig. 7, which is a flowchart of a fourth embodiment of the vehicle control method of the present application, the method is applied to a target vehicle, the target vehicle and a mobile terminal are pre-associated with the same user account, the target vehicle is provided with a TBox and a vehicle-mounted system, and the method includes:
Step B10, receiving a scene entry instruction through the TBox; the scene entry instruction is generated by the mobile terminal in response to an operation of an operation control corresponding to a preset operation scene and is sent by the mobile terminal to the target vehicle; the operation scene comprises a custom trigger condition other than the operation of the operation control and a custom action triggered and executed by the custom trigger condition;
Step B20, according to the scene entry instruction, skipping the custom trigger condition and executing the custom action through the vehicle-mounted system.
Specifically, the mobile terminal related to the embodiment may be a terminal device such as a mobile phone, a tablet computer, a notebook computer, an intelligent wearable device, and the like, where the mobile terminal has a networking function and is capable of communicating with a target vehicle based on a network; the target vehicle is an intelligent vehicle provided with a TBox (Telematics Box) and a vehicle-mounted system. The TBox is generally used for vehicle-mounted communication, and can realize the functions of remote monitoring, information interaction, fault diagnosis and the like of a target vehicle; the vehicle-mounted system is an intelligent control system in the target vehicle, and can realize various intelligent operations and management of the vehicle.
First, the user needs to configure an operation scene through the mobile terminal; the operation scene comprises a custom trigger condition other than the operation of the operation control, and a custom action triggered and executed by the custom trigger condition. The name or identifier of the operation scene can be set by the user. The custom trigger condition is a condition judged using one or more items of the perception data or system data of the target vehicle; when the custom trigger condition is met, the corresponding custom action is triggered and executed. The custom action corresponds to the custom trigger condition and serves the user's needs in the operation scene, for example: adjusting the vehicle configuration (seat position, steering wheel angle, rear-view mirror position, etc.); changing the driving mode (the operating modes of the engine, steering system, air conditioner, etc.); adjusting the on-board equipment (the sound system, navigation system, lighting system, air conditioning system, etc.); or configuring an on-board application (starting or closing an on-board APP, or personalizing functions within the APP). Many other types of custom actions exist, which are not described in detail here.
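As an illustration of how such custom actions might be dispatched, the sketch below applies action descriptors to a vehicle-state snapshot. The action types and field names are hypothetical; the text does not define a concrete action format:

```python
# Hypothetical action descriptors; the type names and state keys are
# illustrative, not taken from the patent.
def execute_custom_action(action: dict, vehicle_state: dict) -> dict:
    """Apply one custom action to a mutable vehicle-state snapshot."""
    kind = action["type"]
    if kind == "set_ac_power":
        vehicle_state["ac_on"] = action["value"]
    elif kind == "set_driver_temp_c":
        vehicle_state["driver_temp_c"] = action["value"]
    elif kind == "set_seat_ventilation":
        vehicle_state["seat_ventilation"] = action["value"]
    else:
        raise ValueError(f"unknown custom action type: {kind}")
    return vehicle_state

# Example: the air-conditioning actions described in the text, applied in order.
state = {"ac_on": False, "driver_temp_c": 24, "seat_ventilation": 2}
for act in [{"type": "set_ac_power", "value": True},
            {"type": "set_driver_temp_c", "value": 18},
            {"type": "set_seat_ventilation", "value": 0}]:
    state = execute_custom_action(act, state)
```

A real vehicle-mounted system would route each action type to the corresponding subsystem controller instead of mutating a dictionary; the dispatch structure is the same.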
Optionally, the above-mentioned perception data refers to data that the target vehicle obtains in real time through its perception capability, such as motion state data of the vehicle (position, speed, acceleration, etc.); energy state data (fuel level, battery level, engine state, etc.); external environment data (weather conditions, road conditions, traffic signals, etc.); and intelligent driving detection data on pedestrians, vehicles, and obstacles around the vehicle. The above-mentioned system data refers to data that the target vehicle can obtain locally, such as the system time and system date of the vehicle. Many other types of perception data and system data exist, which are not described in detail here.
It will be appreciated that the custom trigger condition is judged using one or more items of the perception data or system data of the target vehicle, whereas the operation of the operation control is performed by the user via the mobile terminal and is not part of the custom trigger condition.
After the configuration of the operation scene is completed, the user can see the operation control corresponding to the operation scene through the display interface of the mobile terminal, and the operation control can be a button control, a sliding block control, a switch control and the like, and the form of the operation control is not limited. The user can operate the running control, such as clicking a button control, dragging a slider control, opening a switch control, and the like, and accordingly, the mobile terminal can respond to the operation of the running control by the user to generate a scene entering instruction. The scene entering instruction is used for enabling the target vehicle to enter the operation scene and executing corresponding self-defining actions.
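A scene entry instruction of this kind might be serialized on the mobile terminal as in the following sketch; the wire format and field names are assumptions for illustration only:

```python
import json
import time

def build_scene_entry_instruction(scene_id: str, user_account: str) -> str:
    """Serialize a scene-entry instruction for transmission to the TBox.

    The field names are illustrative assumptions; the patent does not
    define a concrete wire format.
    """
    payload = {
        "type": "scene_entry",
        "scene_id": scene_id,
        "account": user_account,     # both ends must share this user account
        "skip_trigger_check": True,  # ask the vehicle to bypass the custom trigger condition
        "issued_at": int(time.time()),
    }
    return json.dumps(payload)

# Example: generated when the user taps the operation control of a scene.
message = build_scene_entry_instruction("scene-001", "user-001")
```

The `skip_trigger_check` flag encodes the key behavior of this embodiment: on receipt, the vehicle executes the custom action directly instead of waiting for the custom trigger condition.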
Specifically, the TBox remains operating in a standby state regardless of whether the vehicle-mounted system of the target vehicle has started. After generating the scene entry instruction, the mobile terminal sends it to the target vehicle over the network, and the target vehicle receives it through the TBox.
Then, according to the scene entry instruction, the target vehicle skips the custom trigger condition through the vehicle-mounted system and directly executes the custom action, thereby avoiding the wait for the custom trigger condition to be met.
In one case, when the target vehicle receives the scene entry instruction through the TBox, the vehicle-mounted system is in an awake state and can execute the custom action normally. In another case, the vehicle-mounted system is in a dormant state when the instruction is received, and must first be awakened before it can execute the custom action.
It should be noted that, to implement secure vehicle control, the mobile terminal and the target vehicle need to be associated with the same user account in advance. The association may work as follows: a vehicle control APP is installed on both the mobile terminal and the target vehicle, and both are logged in to the same user account on their respective vehicle control APPs.
Taking fig. 2 as an example, fig. 2 is a schematic diagram of an operation scene control interface related to the vehicle control method of the present application. The operation scene control interface is displayed on the mobile terminal, where an operation scene named "Hot Summer" is defined. The custom trigger conditions of this operation scene are that the driver's seat is occupied and the in-vehicle temperature is 35°C or above; the corresponding custom actions are: turning on the air conditioner switch, setting the driver-side temperature to 18°C, and turning off the driver's seat ventilation. The "Run Now" button control at the bottom is the operation control corresponding to the "Hot Summer" operation scene. That is, when the driver's seat is occupied and the in-vehicle temperature is 35°C or above, the target vehicle turns on the air conditioner switch, sets the driver-side temperature to 18°C, and turns off the driver's seat ventilation. Alternatively, when the user taps the "Run Now" button control, the mobile terminal generates a corresponding scene entry instruction and sends it to the target vehicle; the target vehicle receives the instruction and, according to it, skips the judgment of whether the driver's seat is occupied and the in-vehicle temperature is 35°C or above, directly turns on the air conditioner switch, sets the driver-side temperature to 18°C, and turns off the driver's seat ventilation.
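The fig. 2 scene (driver seated, cabin at 35°C or above, triggering the air-conditioning actions) can be encoded as a configuration record together with a condition evaluator, as in the sketch below; the key names and signal names are illustrative assumptions, and the conditions are combined with a logical AND as the example describes:

```python
# Configuration record for the fig. 2 scene; names are illustrative.
HOT_SUMMER_SCENE = {
    "name": "Hot Summer",
    "trigger_conditions": [
        {"signal": "driver_seat_occupied", "op": "==", "value": True},
        {"signal": "cabin_temp_c", "op": ">=", "value": 35},
    ],
    "actions": [
        {"type": "set_ac_power", "value": True},
        {"type": "set_driver_temp_c", "value": 18},
        {"type": "set_seat_ventilation", "value": 0},
    ],
}

def conditions_met(scene: dict, sensors: dict) -> bool:
    """All custom trigger conditions must hold (logical AND)."""
    ops = {"==": lambda a, b: a == b, ">=": lambda a, b: a >= b}
    return all(ops[c["op"]](sensors[c["signal"]], c["value"])
               for c in scene["trigger_conditions"])
```

In the normal path the vehicle evaluates `conditions_met` against live sensor data; on a scene entry instruction, it executes `scene["actions"]` without evaluating the conditions at all.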
Taking fig. 3 as an example, fig. 3 is another schematic diagram of an operation scene control interface related to the vehicle control method in the present application. The operation scene control interface is displayed on the mobile terminal, in one case, a plurality of user-defined operation scenes are provided, and the "on-duty commute", "camping mode" and "intelligent day and night mode" shown in fig. 3 are all operation scenes, and the user-defined triggering conditions and user-defined actions corresponding to the operation scenes can be different.
In this embodiment, the user can operate the operation control corresponding to an operation scene through the mobile terminal, and the mobile terminal generates and sends a scene entry instruction to the target vehicle in response to that operation. After receiving the scene entry instruction through the TBox, the target vehicle skips the judgment of the custom trigger condition through the vehicle-mounted system according to the instruction and directly executes the custom action. In this way, the wait for the custom trigger condition is avoided, the user's immediate need is met, and the user experience is effectively improved.
Further, referring to fig. 8, which is a flowchart of a fifth embodiment of the vehicle control method of the present application, based on the embodiment shown in fig. 7, after step B10 of receiving a scene entry instruction through the TBox, and before step B20 of skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction, the method further includes:
Step B011, waking up the vehicle-mounted system according to the scene entry instruction when the vehicle-mounted system is in a dormant state.
Specifically, the TBox remains operating in a standby state regardless of whether the vehicle-mounted system of the target vehicle has started. The vehicle-mounted system, however, may be in either a dormant state or an awake state; in the dormant state it cannot execute a custom action.
When the target vehicle receives the scene entering instruction through the TBox, if the vehicle-mounted system is in a dormant state, the target vehicle can wake up the vehicle-mounted system according to the scene entering instruction. That is, the scene entry instruction contains content that wakes up the dormant vehicle system. Therefore, the awakened vehicle-mounted system can skip the custom trigger condition to execute the custom action under the control of the scene entering instruction.
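The wake-then-execute behavior of step B011 can be modeled with a minimal state machine, as sketched below; the class and method names are hypothetical:

```python
class VehicleSystem:
    """Minimal model of the vehicle-mounted system's dormant/awake states."""

    def __init__(self):
        self.state = "dormant"
        self.executed = []

    def wake(self):
        self.state = "awake"

    def run_action(self, action: str):
        # A dormant system cannot execute custom actions (see the text above).
        if self.state != "awake":
            raise RuntimeError("dormant vehicle-mounted system cannot execute actions")
        self.executed.append(action)

def handle_scene_entry(system: VehicleSystem, actions: list) -> None:
    """TBox-side handling of a scene entry instruction: wake the system
    first if it is dormant, then execute the custom actions directly
    (the custom trigger condition is skipped)."""
    if system.state == "dormant":
        system.wake()  # step B011: wake per the scene entry instruction
    for a in actions:
        system.run_action(a)
```

The guard in `run_action` captures why the wake-up must happen first: without it, the instruction would arrive at a system that cannot act on it.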
In this embodiment, before the target vehicle executes the custom action, it checks whether the vehicle-mounted system is in a dormant state. If so, the target vehicle automatically wakes the vehicle-mounted system according to the scene entry instruction, so that the instruction can be received and executed normally. When the vehicle-mounted system is dormant, the user is usually not in the target vehicle; by waking the vehicle-mounted system in advance and entering the operation scene under the control of the mobile terminal, the user's need for remote control is effectively met, subsequent use of the vehicle is facilitated, and the user experience is greatly improved.
Further, referring to fig. 9, which is a flowchart of a sixth embodiment of the vehicle control method of the present application, based on the embodiment shown in fig. 7, after step B10 of receiving a scene entry instruction through the TBox, and before step B20 of skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction, the method further includes:
step B021, detecting the gear of the target vehicle;
step B022, if the gear is a parking gear, executing the step of: skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction.
In particular, executing certain custom actions may affect driving safety. For example, automatically opening the windows or adjusting the steering wheel position while the vehicle is in motion may distract the driver or cause an accident. Therefore, the target vehicle needs to detect its current gear before executing the custom action. If the gear is not the parking gear, the target vehicle does not skip the custom trigger condition and execute the custom action through the vehicle-mounted system; if the gear is the parking gear, the target vehicle proceeds to skip the custom trigger condition and execute the custom action through the vehicle-mounted system according to the scene entry instruction.
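The gear guard of steps B021 and B022 can be sketched as a simple precondition check; the gear labels and function names are illustrative:

```python
def safe_to_execute(gear: str) -> bool:
    """Step B022 gate: remote custom actions run only in the parking gear.
    The gear label "P" is an illustrative assumption."""
    return gear == "P"

def try_execute(gear: str, actions: list, log: list) -> bool:
    """Execute the custom actions only if the gear check passes."""
    if not safe_to_execute(gear):
        return False   # not in park: refuse to skip the trigger condition
    log.extend(actions)  # in park: execute the custom actions directly
    return True
```

The check is deliberately a hard precondition rather than a warning, matching the text: outside the parking gear the instruction is simply not acted upon.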
In this embodiment, by detecting the vehicle's gear and executing the custom action only in the parking gear, it is ensured that the vehicle is in a safe state and that the safety of the driver and passengers is not threatened.
Further, referring to fig. 10, which is a flowchart of a seventh embodiment of the vehicle control method of the present application, based on the embodiment shown in fig. 7, the custom trigger condition includes a primary custom trigger condition and a secondary custom trigger condition, and the custom action includes a primary custom action and a secondary custom action. Step B20 of skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction includes:
step B21, skipping the primary custom trigger condition and executing the primary custom action through the vehicle-mounted system according to the scene entry instruction;
After step B20 of skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction, the method further includes:
step B031, starting a monitoring process of the secondary custom trigger condition through the vehicle-mounted system;
step B032, triggering execution of the secondary custom action when the secondary custom trigger condition is met.
Specifically, the custom trigger condition in this embodiment may include a primary custom trigger condition and a secondary custom trigger condition, and the custom action may include a primary custom action and a secondary custom action. The primary custom trigger condition triggers execution of the primary custom action, and the secondary custom trigger condition triggers execution of the secondary custom action.
After receiving the scene entry instruction through the TBox, the target vehicle first skips the primary custom trigger condition through the vehicle-mounted system according to the instruction and executes the primary custom action.
Then, a monitoring process for the secondary custom trigger condition is started through the vehicle-mounted system; this process continuously monitors whether the secondary custom trigger condition is met, and when it is, execution of the secondary custom action is triggered.
Taking fig. 5 as an example, the primary custom trigger condition in fig. 5 is "the driver's seat is occupied and the in-vehicle temperature is 35°C or above", and the primary custom actions are "turn on the air conditioner switch; set the driver-side temperature to 18°C; turn off the driver's seat ventilation". The secondary custom trigger condition is "the in-vehicle temperature is between 20°C and 30°C", and the secondary custom actions are "set the driver's seat temperature to 25°C; set the driver's seat ventilation to level 1".
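The primary/secondary flow applied to the fig. 5 example can be sketched as follows; the action labels, sensor-snapshot format, and polling model are illustrative assumptions:

```python
def run_scene(primary_actions, secondary_condition, secondary_actions, sensor_readings):
    """Sketch of steps B21/B031/B032: execute the primary custom actions
    immediately (trigger check skipped), then monitor successive sensor
    snapshots and fire the secondary custom actions once the secondary
    custom trigger condition holds."""
    executed = list(primary_actions)       # step B21: skip the primary trigger check
    for snapshot in sensor_readings:       # step B031: monitoring process
        if secondary_condition(snapshot):  # step B032: secondary condition met
            executed.extend(secondary_actions)
            break
    return executed

# Fig. 5 example: the secondary condition is a cabin temperature of 20-30 °C.
in_comfort_band = lambda s: 20 <= s["cabin_temp_c"] <= 30
done = run_scene(
    ["ac_on", "driver_temp_18", "seat_vent_off"],
    in_comfort_band,
    ["seat_temp_25", "seat_vent_level_1"],
    [{"cabin_temp_c": 36}, {"cabin_temp_c": 31}, {"cabin_temp_c": 28}],
)
```

A production monitoring process would subscribe to sensor events rather than iterate over a fixed list, but the control structure (immediate primary execution, deferred conditional secondary execution) is the same.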
In the typical case, the target vehicle first determines whether the primary custom trigger condition is met and, if so, executes the primary custom action; it then determines whether the secondary custom trigger condition is met and, if so, executes the secondary custom action.
When the user operates the operation control of the operation scene (e.g., taps a one-touch run control), the target vehicle receives a scene entry instruction from the mobile terminal. The target vehicle first skips the primary custom trigger condition according to the scene entry instruction and directly executes the primary custom action through the vehicle-mounted system; it then starts the monitoring process to determine whether the secondary custom trigger condition is met, and triggers execution of the secondary custom action once it is.
In this embodiment, the primary and secondary custom trigger conditions are distinguished, as are the primary and secondary custom actions. The target vehicle first skips the primary custom trigger condition and executes the primary custom action according to the scene entry instruction; it then starts the monitoring process for the secondary custom trigger condition and triggers the secondary custom action when that condition is met. This design provides a finer-grained and more flexible control scheme, allowing more precise and personalized vehicle control for different scenarios and needs.
In addition, the embodiment of the application also provides a mobile terminal, which comprises a memory, a processor and a vehicle control program stored in the memory and capable of running on the processor, wherein the vehicle control program realizes the steps of the vehicle control method when being executed by the processor.
Since the vehicle control program is executed by the processor, all the technical solutions of the foregoing embodiments are adopted; it therefore has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
In addition, the embodiment of the application also provides a vehicle, which comprises a memory, a processor and a vehicle control program stored on the memory and capable of running on the processor, wherein the vehicle control program realizes the steps of the vehicle control method when being executed by the processor.
Since the vehicle control program is executed by the processor, all the technical solutions of the foregoing embodiments are adopted; it therefore has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
In addition, the embodiment of the application also proposes a computer-readable storage medium, on which a vehicle control program is stored, which when executed by a processor implements the steps of the vehicle control method as described above.
Since the vehicle control program is executed by the processor, all the technical solutions of the foregoing embodiments are adopted; it therefore has at least all the beneficial effects brought by those technical solutions, which are not repeated here.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such a process, method, article, or system. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or system that comprises the element.
The foregoing embodiment numbers of the present application are for description only and do not represent the superiority or inferiority of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, although in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part of it that contributes over the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, optical disk) that includes several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a controlled terminal, a network device, or the like) to perform the method of each embodiment of the present application.
The foregoing description is only of the preferred embodiments of the present application, and is not intended to limit the scope of the claims, and all equivalent structures or equivalent processes using the descriptions and drawings of the present application, or direct or indirect application in other related technical fields are included in the scope of the claims of the present application.

Claims (10)

1. The vehicle control method is characterized in that the method is applied to a mobile terminal, the mobile terminal and a target vehicle are pre-associated with the same user account, the target vehicle is provided with a TBox and a vehicle-mounted system, and the method comprises the following steps:
generating a scene entry instruction in response to an operation of an operation control corresponding to a preset operation scene; wherein the operation scene comprises a custom trigger condition other than the operation of the operation control, and a custom action triggered and executed by the custom trigger condition; and
sending the scene entry instruction to the target vehicle, so that the target vehicle receives the scene entry instruction through the TBox and, according to the scene entry instruction, skips the custom trigger condition and executes the custom action through the vehicle-mounted system.
2. The vehicle control method according to claim 1, characterized in that the configuration process of the operation scene includes:
Responding to a scene creation operation, creating the operation scene, and entering an editing interface of the operation scene;
responding to trigger condition editing operation in the editing interface, and determining the custom trigger condition;
determining the custom action in response to an action editing operation in the editing interface;
and responding to a save operation in the editing interface, and saving the running scene.
3. The vehicle control method according to claim 1, wherein after the step of generating a scene entry instruction in response to an operation of an operation control corresponding to a preset operation scene, and before the step of sending the scene entry instruction to the target vehicle, the method further comprises:
detecting a network connection state of the mobile terminal and the target vehicle;
if the network connection state is abnormal, pushing network abnormality prompt information;
if the network connection state is not abnormal, executing the step of: sending the scene entry instruction to the target vehicle.
4. A vehicle control method, wherein the method is applied to a target vehicle, the target vehicle and a mobile terminal are pre-associated with the same user account, the target vehicle is provided with a TBox and a vehicle-mounted system, and the method comprises:
receiving a scene entry instruction through the TBox; wherein the scene entry instruction is generated by the mobile terminal in response to an operation of an operation control corresponding to a preset operation scene and is sent by the mobile terminal to the target vehicle; the operation scene comprises a custom trigger condition other than the operation of the operation control and a custom action triggered and executed by the custom trigger condition; and
according to the scene entry instruction, skipping the custom trigger condition and executing the custom action through the vehicle-mounted system.
5. The vehicle control method according to claim 4, wherein after the step of receiving a scene entry instruction by the TBox, before the step of executing the custom action by the vehicle system skipping the custom trigger condition according to the scene entry instruction, further comprising:
and under the condition that the vehicle-mounted system is in a dormant state, waking up the vehicle-mounted system according to the scene entering instruction.
6. The vehicle control method according to claim 4, wherein after the step of receiving a scene entry instruction by the TBox, before the step of executing the custom action by the vehicle system skipping the custom trigger condition according to the scene entry instruction, further comprising:
Detecting a gear of the target vehicle;
if the gear is a parking gear, executing the step of: skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction.
7. The vehicle control method of claim 4, wherein the custom trigger condition includes a primary custom trigger condition and a secondary custom trigger condition, the custom action includes a primary custom action and a secondary custom action, and the step of executing the custom action by the vehicle system skipping the custom trigger condition according to the scene entry instruction includes:
according to the scene entry instruction, skipping the primary custom trigger condition and executing the primary custom action through the vehicle-mounted system;
after the step of skipping the custom trigger condition and executing the custom action through the vehicle-mounted system according to the scene entry instruction, the method further comprises:
starting a monitoring process of the secondary custom trigger condition through the vehicle-mounted system; and
triggering execution of the secondary custom action when the secondary custom trigger condition is met.
8. A mobile terminal comprising a memory, a processor and a vehicle control program stored on the memory and operable on the processor, which when executed by the processor, implements the steps of the vehicle control method of any one of claims 1-3.
9. A vehicle comprising a memory, a processor and a vehicle control program stored on the memory and operable on the processor, which when executed by the processor, implements the steps of the vehicle control method of any one of claims 4-7.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a vehicle control program which, when executed by a processor, implements the steps of the vehicle control method according to any one of claims 1-3 or 4-7.
CN202311866750.2A 2023-12-29 2023-12-29 Vehicle control method, mobile terminal, vehicle and storage medium Pending CN117675995A (en)

Publications (1)

Publication Number CN117675995A, Publication Date 2024-03-08

Similar Documents

Publication Publication Date Title
CN105549454B (en) System and method for providing passenger substituting instructions for an autonomous vehicle
US9188449B2 (en) Controlling in-vehicle computing system based on contextual data
CN103149845B (en) For realizing the system and method for the vehicle service of customization
CN110654389B (en) Vehicle control method and device and vehicle
US20170190331A1 (en) Method and system for adaptive detection and application of horn for an autonomous vehicle
US20180151088A1 (en) Vehicle tutorial system and method for sending vehicle tutorial to tutorial manager device
US20030167112A1 (en) Vehicle agent system acting for driver in controlling in-vehicle devices
CN106564449A (en) Intelligent driving customization method and device
CN110001566B (en) In-vehicle life body protection method and device and computer readable storage medium
US20210122242A1 (en) Motor Vehicle Human-Machine Interaction System And Method
US11285966B2 (en) Method and system for controlling an autonomous vehicle response to a fault condition
US11577688B2 (en) Smart window apparatus, systems, and related methods for use with vehicles
CN112092583A (en) Vehicle-mounted aromatherapy control system and vehicle-mounted aromatherapy
CN112172827B (en) Driving assistance system control method, device, equipment and storage medium
CN112124048A (en) Vehicle-mounted aromatherapy control system and vehicle-mounted aromatherapy
JP4393277B2 (en) Car room temperature monitoring device
US20230182759A1 Methods and systems for improving user alertness in an autonomous vehicle
CN117675995A (en) Vehicle control method, mobile terminal, vehicle and storage medium
JP2013057321A (en) Energy saving evaluation device and energy saving evaluation method
CN112078339A (en) Vehicle-mounted aromatherapy control system and vehicle-mounted aromatherapy
US20220297633A1 (en) Systems And Methods For Random Vehicle Movement For Vehicle Safety
US20230077561A1 (en) System and method for remote interface with vehicle
US11794770B1 (en) Systems and methods for hearing impaired drivers
WO2024036633A1 (en) Control method and apparatus, and vehicle
CN116443040A (en) Vehicle control method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination