CN115214702A - Vehicle control method, system, device, and storage medium - Google Patents

Vehicle control method, system, device, and storage medium

Info

Publication number
CN115214702A
Authority
CN
China
Prior art keywords
target
vehicle
human
control
size
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110407014.5A
Other languages
Chinese (zh)
Inventor
陈玮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Pateo Network Technology Service Co Ltd
Original Assignee
Shanghai Pateo Network Technology Service Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Pateo Network Technology Service Co Ltd filed Critical Shanghai Pateo Network Technology Service Co Ltd
Priority to CN202110407014.5A priority Critical patent/CN115214702A/en
Publication of CN115214702A publication Critical patent/CN115214702A/en
Pending legal-status Critical Current

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application provides a vehicle control method, system, device, and storage medium. The method comprises the following steps: first, acquiring running speed information of a target vehicle; then, acquiring gaze point information of a target user in the target vehicle; and finally, adjusting a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information. During driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the human-computer interaction interface and in turn improves the safety of the driving process. In addition, based on the number of controls of the human-computer interaction interface and the use frequency of each control, the more frequently used controls are determined and displayed, which improves the effectiveness of control display.

Description

Vehicle control method, system, device, and storage medium
Technical Field
The present application relates to the field of data processing technologies, and in particular, to a vehicle control method, system, device, and storage medium.
Background
With the rapid development of automotive technology, the functions of the in-vehicle head unit (the vehicle machine) are becoming increasingly rich. Functionally, the head unit enables information exchange between the occupant and the vehicle and between the vehicle and the outside world (including other vehicles); it serves as the vehicle's human-computer interaction interface, outputting information to the user and capturing the user's operations so that the user can control the vehicle.
At present, the human-computer interaction interface of a vehicle generally displays its content (including controls) at a fixed size, so the user cannot conveniently view and operate the interface while the vehicle is being driven, and the safety of the driving process is therefore low.
Disclosure of Invention
To address these problems, the present application provides a vehicle control method, system, device, and storage medium. During driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the human-computer interaction interface and in turn improves the safety of the driving process.
In a first aspect, an embodiment of the present application provides a vehicle control method, including the following steps: acquiring running speed information of a target vehicle;
acquiring gaze point information of a target user in the target vehicle;
and adjusting a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
In a second aspect, an embodiment of the present application provides a vehicle control system, which includes a first acquisition module, a second acquisition module, and an adjusting module, where the first acquisition module and the second acquisition module are each in data transmission connection with the adjusting module;
the first acquisition module is used for acquiring the running speed information of the target vehicle;
the second acquisition module is used for acquiring the gaze point information of a target user in the target vehicle;
and the adjusting module is used for adjusting the human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for executing the steps in the first aspect of the embodiment of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program enables a computer to perform some or all of the steps described in the first aspect of the embodiment of the present application.
In a fifth aspect, embodiments of the present application provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, where the computer program is operable to cause a computer to perform some or all of the steps as described in the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the application, the running speed information of the target vehicle is first acquired, then the gaze point information of the target user in the target vehicle is acquired, and finally the human-computer interaction interface of the target vehicle is adjusted according to the running speed information and the gaze point information. Thus, during driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the display screen and in turn improves the safety of the driving process. In addition, based on the number of controls of the human-computer interaction interface and the use frequency of each control, the more frequently used controls are determined and displayed, which improves the effectiveness of control display.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings based on these drawings without creative effort.
FIG. 1 is a schematic flow chart of a vehicle control method provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 3 is a block diagram of functional modules of a vehicle control system according to an embodiment of the present disclosure.
Detailed Description
In order to make the technical solutions of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, apparatus, and storage medium, product, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements but may alternatively include other steps or elements not expressly listed or inherent to such process, method, product, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
A vehicle control method in the embodiment of the present application is described below with reference to fig. 1, where fig. 1 is a schematic flow chart of the vehicle control method provided in the embodiment of the present application, and the method specifically includes the following steps:
step 101, obtaining the running speed information of the target vehicle.
Wherein, the running speed information is used for reflecting the running speed of the target vehicle.
The running speed information may be the rotation speed of the engine of the target vehicle together with the outer diameter of its tires; in this case, the current running speed of the target vehicle can be determined from the rotation speed and the tire outer diameter. The running speed information may also be position information acquired by a vehicle-mounted GPS at a set time period; in this case, the current running speed of the target vehicle can be determined from the change in position. In addition, since a mobile terminal such as a mobile phone or a tablet placed in the vehicle is usually stationary relative to the vehicle body, the running speed information may also be position information acquired by the GPS of the in-vehicle mobile terminal at a set time period, from which the current running speed can likewise be determined. Furthermore, to ensure the accuracy of the vehicle speed, the running speed information may be acquired through several different channels, and the current vehicle speed may be obtained by jointly processing the information acquired through these channels.
In addition, the period at which the running speed information of the target vehicle is acquired may be set according to the actual situation, for example, to 1 second or 2 seconds.
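As a concrete illustration of the wheel-based and GPS-based channels described above, the following Python sketch shows one way the running speed could be estimated and the estimates from several channels combined. The function names, the use of wheel rotation speed rather than engine speed (engine speed would additionally require the overall transmission ratio), and the plain averaging used for fusion are illustrative assumptions, not details given by this application.

```python
import math

def speed_from_wheel(wheel_rpm: float, tire_outer_diameter_m: float) -> float:
    """Running speed in km/h from wheel rotation speed and tire outer diameter."""
    metres_per_minute = wheel_rpm * math.pi * tire_outer_diameter_m  # one circumference per revolution
    return metres_per_minute * 60.0 / 1000.0  # m/min -> km/h

def speed_from_gps(p1: tuple, p2: tuple, dt_s: float) -> float:
    """Running speed in km/h from two (latitude, longitude) fixes, in degrees, taken dt_s seconds apart."""
    earth_radius_m = 6_371_000.0
    lat1, lon1, lat2, lon2 = map(math.radians, (*p1, *p2))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    distance_m = 2 * earth_radius_m * math.asin(math.sqrt(a))  # haversine distance
    return distance_m / dt_s * 3.6  # m/s -> km/h

def fuse_speeds(*estimates: float) -> float:
    """Combine the estimates obtained through different channels; a plain average is assumed here."""
    return sum(estimates) / len(estimates)
```

A caller would evaluate each available channel once per acquisition period and pass the results to fuse_speeds to obtain the current vehicle speed.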
Step 102, acquiring the gaze point information of the target user in the target vehicle.
The gaze point information is used for reflecting the gaze point state of the target user.
Wherein, the target user may be a driver.
Wherein, the gaze point information includes: whether the gaze point of the target user falls into the human-computer interaction interface of the target vehicle, and the duration for which the gaze point of the target user stays in the human-computer interaction interface of the target vehicle. One implementation of acquiring the gaze point information of the target user in the target vehicle may be: acquiring a face image of the user, and analyzing the gaze point state of the user according to the face image to obtain the gaze point information.
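The per-frame face-image analysis described above can be turned into the two pieces of gaze point information named here (the on-interface flag and the dwell duration) roughly as in the following Python sketch. The GazeTracker class and the injected estimator callable are hypothetical; the actual estimation of the gaze point from the face image is assumed to be provided elsewhere.

```python
import time
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class GazePointInfo:
    on_interface: bool  # whether the gaze point falls into the human-computer interaction interface
    dwell_s: float      # how long the gaze point has stayed in the interface

class GazeTracker:
    """Accumulates dwell time from per-frame gaze estimates.

    `estimator` is any callable that takes a face image and returns True when the
    estimated gaze point lies on the interface; it stands in for a real gaze model."""

    def __init__(self, estimator: Callable[[object], bool]) -> None:
        self._estimator = estimator
        self._on_since: Optional[float] = None

    def update(self, face_image: object) -> GazePointInfo:
        now = time.monotonic()
        if self._estimator(face_image):
            if self._on_since is None:
                self._on_since = now
            return GazePointInfo(True, now - self._on_since)
        self._on_since = None
        return GazePointInfo(False, 0.0)
```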
Step 103, adjusting a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
The man-machine interaction interface is a communication medium or means between a person and a computer system, and is a platform for performing bidirectional information exchange of various symbols and actions between the person and the computer. Specifically, the human-computer interaction interface includes an interface for performing information interaction with a user, such as a touch screen and a display panel.
Step 101 and step 102 are performed synchronously without interfering with each other.
It can be seen that, in the embodiment of the application, the running speed information of the target vehicle is first acquired, then the gaze point information of the target user in the target vehicle is acquired, and finally the human-computer interaction interface of the target vehicle is adjusted according to the running speed information and the gaze point information. Thus, during driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the display screen and in turn improves the safety of the driving process. Specifically, the adjusting the human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information comprises the following steps A1-A3:
step A1, judging whether the gaze point of the target user meets a first condition according to the gaze point information, wherein the first condition comprises that the gaze point of the target user falls into the human-computer interaction interface.
Preferably, the first condition further includes that a duration of the target user's gaze point staying in the human-computer interaction interface exceeds a preset duration.
The preset duration may be 0.5 s, 1 s, or another value; it may be set according to actual requirements and is not particularly limited here.
As can be seen, in this example, the human-computer interaction interface of the vehicle can be adjusted when the duration that the gaze point of the target user stays in the human-computer interaction interface exceeds the preset duration.
Step A2, if the gaze point of the target user meets the first condition, determining the current running speed of the target vehicle according to the running speed information.
If the gaze point of the target user does not meet the first condition, the foregoing steps 101 and 102 continue to be performed.
Step A3, adjusting the human-computer interaction interface of the target vehicle according to the current running speed.
Wherein, the adjusting the human-computer interaction interface of the target vehicle comprises: selecting a target control from the controls of the human-computer interaction interface; adjusting the size of the target control.
Here, the controls of the human-computer interaction interface include currently displayed controls and temporarily hidden controls; the target control may be a control that the target user uses frequently.
Further, the selecting a target control from the controls of the human-computer interaction interface includes: acquiring the number of controls of the human-computer interaction interface and the use frequency of each control; and determining the target control according to the number of the controls and the use frequency of each control.
The target control may be a control that the target user uses relatively frequently. When information such as the target user's control use frequency has not been acquired, the target control may instead be a control of relatively high importance among the controls, determined on the basis of preset control importance levels.
Further, the determining the target control according to the number of the controls and the use frequency of each control includes the following steps: if the number of controls is less than or equal to a preset value, setting each control of the human-computer interaction interface as a target control; and if the number of controls is greater than the preset value, selecting n controls that meet a second condition as target controls, wherein the second condition is being ranked in the top n positions of a control sequence obtained by sorting all the controls from high use frequency to low use frequency, and n is equal to the preset value.
In a specific implementation, the preset value may be set as needed and is not specifically limited; for example, the preset value may be 5 or 6.
Therefore, in this example, the more frequently used controls can be determined and displayed based on the number of controls of the human-computer interaction interface and the use frequency of each control, which improves the effectiveness of control display.
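A minimal Python sketch of this selection rule, assuming the use frequency is available as a per-control use count; the dictionary layout and the default preset value of 5 are assumptions made for illustration:

```python
def select_target_controls(use_counts: dict, preset_value: int = 5) -> list:
    """use_counts maps a control name to how often the target user has used it.

    If the number of controls does not exceed the preset value, every control is a
    target control; otherwise only the preset_value most frequently used controls are."""
    if len(use_counts) <= preset_value:
        return list(use_counts)
    ranked = sorted(use_counts, key=use_counts.get, reverse=True)  # high frequency first
    return ranked[:preset_value]
```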
In this embodiment, the adjusting the size of the target control includes the following steps: judging a speed interval in which the current running speed is positioned, wherein the speed interval comprises a first speed interval, a second speed interval and a third speed interval which are obtained by dividing the speed of the vehicle from low to high; if the speed interval in which the current running speed is located is the first speed interval, adjusting the size of the target control to be a first size; if the speed interval of the current running speed is the second speed interval, adjusting the size of the target control to be a second size, wherein the second size is larger than the first size; and if the speed interval in which the current running speed is located is the third speed interval, adjusting the size of the target control to be a third size, wherein the third size is larger than the second size.
Before this step, the running speed of the vehicle needs to be divided into a plurality of speed intervals in advance, and the control size corresponding to each speed interval is set based on the size of the human-computer interaction interface. For example, the running speed of the vehicle may be divided, from low to high, into a first speed interval, a second speed interval, and a third speed interval; that is, the vehicle speed of the second speed interval is greater than that of the first speed interval and less than that of the third speed interval. Specifically: the first speed interval is [0, 18) km/h, the second speed interval is [18, 60) km/h, and the third speed interval is above 60 km/h.
It is to be understood that, according to practical situations, a person skilled in the art can also divide the running speed of the vehicle into other speed intervals, and the above-mentioned preferred embodiment is not to be construed as a limitation on the setting manner.
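Using the example intervals above, the mapping from the current running speed to a control size can be sketched as follows; the string return values and the decision to place exactly 60 km/h in the third interval are assumptions:

```python
def size_for_speed(speed_kmh: float) -> str:
    """Map the current running speed to the control size of its speed interval."""
    if speed_kmh < 18:    # first speed interval, [0, 18) km/h
        return "first size"
    if speed_kmh < 60:    # second speed interval, [18, 60) km/h
        return "second size"
    return "third size"   # third speed interval, above 60 km/h
```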
It should be noted that the size of the controls of the human-computer interaction interface is dynamically adjusted based on the running speed of the vehicle and displayed in time once the gaze point of the target user meets the preset first condition. If the display state of the human-computer interaction interface is the same before and after the gaze point of the target user meets the preset first condition, the visual effect does not change even though the interface has been adjusted, that is, the content displayed before and after is the same.
As an example, suppose three speed intervals are preset (a first speed interval of [0, 18) km/h, a second speed interval of [18, 60) km/h, and a third speed interval above 60 km/h), the preset value is 5, and the first condition is that the gaze point of the target user falls into the human-computer interaction interface. In a specific application scenario, the target user is the vehicle owner. During driving, if it is determined that the owner is gazing at the human-computer interaction interface, the current running speed of the vehicle is 12 km/h, and the number of controls of the human-computer interaction interface is 6, then, since the number of controls 6 is greater than the preset value 5, the 5 controls most frequently used by the owner are selected from the 6 controls as target controls and are displayed at the first size.
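This scenario can be reproduced with the sketches from the earlier steps (select_target_controls and size_for_speed); the control names and use counts below are invented purely for illustration:

```python
# Hypothetical per-control use counts for the vehicle owner (6 controls in total).
owner_use_counts = {
    "navigation": 42, "media": 30, "phone": 25,
    "air_conditioning": 18, "seat_heating": 9, "settings": 4,
}

targets = select_target_controls(owner_use_counts, preset_value=5)
size = size_for_speed(12)  # current running speed of 12 km/h

print(targets)  # the 5 most frequently used of the 6 controls
print(size)     # 'first size', since 12 km/h lies in the [0, 18) km/h interval
```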
The above description has introduced the solution of the embodiments of the present application mainly from the perspective of the method-side implementation. It can be understood that, in order to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that the modules, units, and algorithm steps of the examples described in connection with the embodiments provided herein can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints of the solution. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementations should not be considered as going beyond the scope of the present application.
Consistent with the embodiment shown in fig. 1, fig. 2 is a schematic structural diagram of an electronic device 200 provided in an embodiment of the present application. As shown in fig. 2, the electronic device 200 includes an application processor 210, a memory 220, a communication interface 230, and one or more programs 221, where the one or more programs 221 are stored in the memory 220 and configured to be executed by the application processor 210, and the one or more programs 221 include instructions for executing part or all of the steps of any method described in the method embodiments.
It can be seen that, in the embodiment of the application, the running speed information of the target vehicle is first acquired, then the gaze point information of the target user in the target vehicle is acquired, and finally the human-computer interaction interface of the target vehicle is adjusted according to the running speed information and the gaze point information. Thus, during driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the display screen and in turn improves the safety of the driving process. In addition, based on the number of controls of the human-computer interaction interface and the use frequency of each control, the more frequently used controls are determined and displayed, which improves the effectiveness of control display.
In the embodiment of the present application, the electronic device may be divided into functional modules and functional units according to the above method example; for example, each module or unit may correspond to one function, or two or more functions may be integrated into one processing unit or processing module. The integrated modules and units may be implemented in the form of hardware or in the form of software functional modules and units. It should be noted that the division of modules and units in the embodiment of the present application is schematic and is only a division of logical functions; other division manners are possible in actual implementation.
In the case where each functional module and unit is divided according to its function, a vehicle control system in the embodiment of the present application is described in detail below with reference to fig. 3. The vehicle control system 300 includes a first acquisition module 301, a second acquisition module 302, and an adjusting module 303, where the first acquisition module 301 and the second acquisition module 302 are each in data transmission connection with the adjusting module 303;
the first acquisition module 301 is configured to acquire running speed information of a target vehicle;
the second acquisition module 302 is configured to acquire gaze point information of a target user in the target vehicle;
the adjusting module 303 is configured to adjust a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
It can be seen that, in the embodiment of the application, the running speed information of the target vehicle is first acquired, then the gaze point information of the target user in the target vehicle is acquired, and finally the human-computer interaction interface of the target vehicle is adjusted according to the running speed information and the gaze point information. Thus, during driving, the human-computer interaction interface of the vehicle is adjusted in a differentiated and dynamic manner based on the running speed of the vehicle and the gaze point of the target user in the vehicle, which improves the convenience with which the user operates the display screen and in turn improves the safety of the driving process. In addition, based on the number of controls of the human-computer interaction interface and the use frequency of each control, the more frequently used controls are determined and displayed, which improves the effectiveness of control display.
In one possible example, the second acquisition module 302 comprises an image acquisition unit and an image analysis unit;
the image acquisition unit is used for acquiring a face image of a user;
the image analysis unit is used for analyzing the gaze point state of the user according to the face image to obtain the gaze point information.
In one possible example, the adjusting module 303 includes a judging unit, a running speed determining unit, and an adjusting unit;
the judging unit is used for judging whether the target user's gaze point meets a first condition according to the gaze point information, wherein the first condition comprises that the target user's gaze point falls into the human-computer interaction interface;
the running speed determining unit is used for determining the current running speed of the target vehicle according to the running speed information if the gaze point of the target user meets the first condition;
and the adjusting unit is used for adjusting the human-computer interaction interface of the target vehicle according to the current running speed.
In one possible example, the first condition further includes that the duration for which the gaze point of the target user stays in the human-computer interaction interface exceeds a preset duration.
In a possible example, in terms of the adjusting the human machine interface of the target vehicle, the adjusting unit is specifically configured to:
selecting a target control from the controls of the human-computer interaction interface;
and adjusting the size of the target control.
In one possible example, in the aspect of selecting a target control from the controls of the human-computer interaction interface, the adjusting unit is specifically configured to:
acquiring the number of controls of the human-computer interaction interface and the use frequency of each control;
and determining the target control according to the number of the controls and the use frequency of each control.
In one possible example, in the aspect of determining the target control according to the number of controls and the use frequency of each control, the adjusting unit is specifically configured to:
if the number of the controls is less than or equal to a preset value, determining that each control in the human-computer interaction interface is set as a target control;
if the number of the controls is larger than a preset value, selecting n controls meeting a second condition as target controls, wherein the second condition is that the controls are arranged at the top n bits in a control sequence obtained by sequencing all the controls from high use frequency to low use frequency, and n is equal to the preset value.
In one possible example, in terms of the adjusting the size of the target control, the adjusting unit is specifically configured to: judging a speed interval in which the current running speed is positioned, wherein the speed interval comprises a first speed interval, a second speed interval and a third speed interval which are obtained by dividing the speed of the vehicle from low to high;
if the speed interval in which the current running speed is located is the first speed interval, adjusting the size of the target control to be a first size;
if the speed interval of the current running speed is the second speed interval, adjusting the size of the target control to be a second size, wherein the second size is larger than the first size;
and if the speed interval in which the current running speed is located is the third speed interval, adjusting the size of the target control to be a third size, wherein the third size is larger than the second size.
It can be understood that, since the method embodiment and the system embodiment are different presentations of the same technical concept, the content of the method embodiment applies equally to the system embodiment and is not repeated here. Both the electronic device 200 and the vehicle control system 300 can execute all of the vehicle control methods included in the above method embodiments.
Embodiments of the present application further provide a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to perform part or all of the steps of any one of the methods set forth in the above method embodiments; the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any one of the methods as set out in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that for simplicity of description, the above-mentioned embodiments of the method are described as a series of acts, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art will recognize that the embodiments described in this specification are preferred embodiments and that acts or modules referred to are not necessarily required for this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed system may be implemented in other ways. For example, the above-described system embodiments are merely illustrative, and for example, the division of the above-described units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, systems or units, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may also be implemented in the form of a software functional unit.
The integrated unit may be stored in a computer readable memory if it is implemented in the form of a software functional unit and sold or used as a stand-alone product. Based on such understanding, the technical solutions of the present application, which are essential or part of the technical solutions contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, which is stored in a memory and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the above methods of the embodiments of the present application. And the aforementioned memory comprises: a U-disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk, and various media capable of storing program codes.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, where the memory may include: a flash memory disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (17)

1. A vehicle control method characterized by comprising the steps of:
acquiring running speed information of a target vehicle;
acquiring gaze point information of a target user in the target vehicle;
and adjusting a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
2. The method of claim 1, wherein the adjusting the human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information comprises:
judging whether the gaze point of the target user meets a first condition according to the gaze point information, wherein the first condition comprises that the gaze point of the target user falls into the human-computer interaction interface;
if the gaze point of the target user meets the first condition, determining the current running speed of the target vehicle according to the running speed information;
and adjusting the human-computer interaction interface of the target vehicle according to the current running speed.
3. The method of claim 2, wherein the first condition further comprises that the duration for which the gaze point of the target user stays in the human-computer interaction interface exceeds a preset duration.
4. The method according to any one of claims 1-3, wherein the adjusting the human-computer interaction interface of the target vehicle comprises the steps of:
selecting a target control from the controls of the human-computer interaction interface;
and adjusting the size of the target control.
5. The method of claim 4, wherein the selecting of the target control from the controls of the human-computer interaction interface comprises the following steps:
acquiring the number of controls of the human-computer interaction interface and the use frequency of each control;
and determining the target control according to the number of the controls and the use frequency of each control.
6. The method of claim 5, wherein the determining the target control according to the number of controls and the frequency of use of each control comprises the following steps:
if the number of the controls is less than or equal to a preset value, determining that each control in the human-computer interaction interface is set as a target control;
and if the number of the controls is larger than a preset value, selecting n controls meeting a second condition as target controls, wherein the second condition is that the controls are arranged at the top n bits in a control sequence obtained by sequencing all the controls from high use frequency to low use frequency, and n is equal to the preset value.
7. The method of claim 4, wherein the adjusting the size of the target control comprises:
judging a speed interval in which the current running speed is positioned, wherein the speed interval comprises a first speed interval, a second speed interval and a third speed interval which are obtained by dividing the speed of the vehicle from low to high;
if the speed interval in which the current running speed is located is the first speed interval, adjusting the size of the target control to be a first size;
if the speed interval of the current running speed is the second speed interval, adjusting the size of the target control to be a second size, wherein the second size is larger than the first size;
and if the speed interval in which the current running speed is located is the third speed interval, adjusting the size of the target control to be a third size, wherein the third size is larger than the second size.
8. A vehicle control system, characterized by comprising a first acquisition module, a second acquisition module, and an adjusting module, wherein the first acquisition module and the second acquisition module are each in data transmission connection with the adjusting module;
the first acquisition module is used for acquiring running speed information of a target vehicle;
the second acquisition module is used for acquiring gaze point information of a target user in the target vehicle;
and the adjusting module is used for adjusting a human-computer interaction interface of the target vehicle according to the running speed information and the gaze point information.
9. The vehicle control system of claim 8, the second acquisition module comprising an image acquisition unit and an image analysis unit;
the image acquisition unit is used for acquiring a face image of a user;
the image analysis unit is used for analyzing the gaze point state of the user according to the face image to obtain the gaze point information.
10. The vehicle control system according to claim 9, wherein the adjusting module comprises a judging unit, a running speed determining unit, and an adjusting unit;
the judging unit is used for judging whether the target user's gaze point meets a first condition according to the gaze point information, wherein the first condition comprises that the target user's gaze point falls into the human-computer interaction interface;
the running speed determining unit is used for determining the current running speed of the target vehicle according to the running speed information if the gaze point of the target user meets the first condition;
and the adjusting unit is used for adjusting the human-computer interaction interface of the target vehicle according to the current running speed.
11. The vehicle control system according to claim 10, wherein the first condition further comprises that the duration for which the gaze point of the target user stays in the human-computer interaction interface exceeds a preset duration.
12. The vehicle control system according to any one of claims 8-11, wherein, in terms of the adjusting the human-computer interaction interface of the target vehicle, the adjusting unit is specifically configured to:
selecting a target control from the controls of the human-computer interaction interface;
adjusting the size of the target control.
13. The vehicle control system according to claim 12, wherein the adjusting unit is specifically configured to, in selecting a target control from the controls of the human-machine interface:
acquiring the number of controls of the human-computer interaction interface and the use frequency of each control;
and determining the target control according to the number of the controls and the use frequency of each control.
14. The vehicle control system according to claim 13, the adjusting unit being specifically configured to, in said determining the target control according to the number of controls and the frequency of use of each control:
if the number of the controls is less than or equal to a preset value, determining that each control in the human-computer interaction interface is set as a target control;
and if the number of the controls is larger than a preset value, selecting n controls meeting a second condition as target controls, wherein the second condition is that the controls are arranged at the top n bits in a control sequence obtained by sequencing all the controls from high use frequency to low use frequency, and n is equal to the preset value.
15. The vehicle control system of claim 12, in terms of the adjusting the size of the target control, the adjustment unit being specifically configured to:
judging a speed interval in which the current running speed is positioned, wherein the speed interval comprises a first speed interval, a second speed interval and a third speed interval which are obtained by dividing the speed of the vehicle from low to high;
if the speed interval in which the current running speed is located is the first speed interval, adjusting the size of the target control to be a first size;
if the speed interval of the current running speed is the second speed interval, adjusting the size of the target control to be a second size, wherein the second size is larger than the first size;
and if the speed interval in which the current running speed is located is the third speed interval, adjusting the size of the target control to be a third size, wherein the third size is larger than the second size.
16. An electronic device comprising an application processor, a memory, and one or more programs stored in the memory and configured to be executed by the application processor, the programs comprising instructions for performing the steps of the method of any of claims 1-7.
17. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program comprising program instructions that, when executed by a processor, cause the processor to carry out the method according to any one of claims 1 to 7.
CN202110407014.5A 2021-04-15 2021-04-15 Vehicle control method, system, device, and storage medium Pending CN115214702A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110407014.5A CN115214702A (en) 2021-04-15 2021-04-15 Vehicle control method, system, device, and storage medium

Publications (1)

Publication Number Publication Date
CN115214702A true CN115214702A (en) 2022-10-21

Family

ID=83605052

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110407014.5A Pending CN115214702A (en) 2021-04-15 2021-04-15 Vehicle control method, system, device, and storage medium

Country Status (1)

Country Link
CN (1) CN115214702A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination