CN116627602A - Task scheduling method and device, readable medium and electronic equipment


Info

Publication number
CN116627602A
Authority
CN
China
Prior art keywords: task, executed, executing, execution information, execution
Legal status
Pending
Application number
CN202210129576.2A
Other languages
Chinese (zh)
Inventor
周艳
姚鑫
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Application filed by Huawei Technologies Co Ltd
Priority to CN202210129576.2A
Publication of CN116627602A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/48 Program initiating; Program switching, e.g. by interrupt
    • G06F 9/4806 Task transfer initiation or dispatching
    • G06F 9/4843 Task transfer initiation or dispatching by program, e.g. task dispatcher, supervisor, operating system
    • G06F 9/4881 Scheduling strategies for dispatcher, e.g. round robin, multi-level priority queues
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/46 Multiprogramming arrangements
    • G06F 9/50 Allocation of resources, e.g. of the central processing unit [CPU]
    • G06F 9/5083 Techniques for rebalancing the load in a distributed system
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00 Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The present application relates to the field of communications, and discloses a task scheduling method and apparatus, a readable medium, and an electronic device. The task scheduling method includes: obtaining a task to be executed and first execution information of the task to be executed, an executing task and first execution information of the executing task, available system resources, system resources occupied by the executing task, and system resources required by the task to be executed. The system resources required by the task to be executed may be predicted from the task to be executed and its first execution information. When the available system resources do not meet the predicted system resources required by the task to be executed, the first execution information of the executing task is adjusted to corresponding second execution information, so that scheduling and processing of the task to be executed are guaranteed and user experience is improved.

Description

Task scheduling method and device, readable medium and electronic equipment
Technical Field
The present application relates to the field of communications, and in particular, to a task scheduling method and apparatus, a readable medium, and an electronic device.
Background
With the advent of the digital era, general-purpose hardware systems and their accompanying operating systems emphasize fairness among tasks: existing task scheduling algorithms let different tasks take turns occupying the device's system resources as evenly as possible. In practice, however, many applications have real-time requirements, that is, they must complete tasks within a specified time and respond to and process random external events. For such applications, common software and hardware scheduling algorithms that emphasize task fairness can hardly meet the requirement, and a task scheduling algorithm with real-time guarantees must be adopted.
Disclosure of Invention
Embodiments of the present application provide a task scheduling method and apparatus, a readable medium, and an electronic device.
In a first aspect, an embodiment of the present application provides a task scheduling method, applied to a first electronic device, including: acquiring a task to be executed and first execution information of the task to be executed, and an executing task and first execution information of the executing task; determining, according to the first execution information of the task to be executed, a first system resource requirement for completing the task to be executed; and when a first system resource available for executing the task to be executed does not meet the first system resource requirement, adjusting, by the first electronic device, the first execution information of the executing task to second execution information of the executing task, where, when the executing task is executed according to the second execution information of the executing task, a second system resource available for executing the task to be executed meets the first system resource requirement.
For example, the task scheduling method of the present application may be applied to a single electronic device, or to one electronic device in a distributed system. It may be appreciated that the first electronic device may be one of a vehicle-mounted device, a mobile phone, a computer, a wearable device, a smart large screen, a smart speaker, virtual reality glasses, and the like, used on its own or as one device in a distributed system. The first system resource requirement for completing the task to be executed corresponds to the "system resources required by the task to be executed" hereinafter, and the first system resource currently available for executing the task to be executed corresponds to the "available system resources" hereinafter.
For example, the first electronic device obtains, according to the current scene, a task to be executed and first execution information of the task to be executed, an executing task and first execution information of the executing task, available system resources, system resources occupied by the executing task, and system resources required by the task to be executed. The system resources required by the task to be executed may be predicted from the task to be executed and its first execution information. When the available system resources do not meet the predicted system resources required by the task to be executed, the first execution information of the executing task is adjusted to second execution information of the executing task so as to better meet the target. The adjustment includes, but is not limited to, reducing the system resources occupied by the executing task so as to increase the available system resources, marking the task to be executed so that it enters a specific scheduling queue to meet a latency target, or other measures. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
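As an illustration only, the flow of the first aspect can be sketched in Python as follows; every name here (the prediction callable, the availability check, the adjustment helper) is a hypothetical stand-in rather than the application's implementation.

    # Hypothetical sketch of the first-aspect flow; every name here is an illustrative
    # stand-in, not the application's implementation.
    def schedule(task_to_execute, first_info_to_execute,
                 executing_task, first_info_executing,
                 predict_requirement, available_meets, adjust, execute):
        # First system resource requirement, predicted from the task to be executed
        # and its first execution information.
        requirement = predict_requirement(task_to_execute, first_info_to_execute)

        if available_meets(requirement):
            # Sufficient resources: run the task to be executed with its first
            # execution information, leaving the executing task unchanged.
            execute(task_to_execute, first_info_to_execute)
            return first_info_executing

        # Insufficient resources: adjust the executing task's first execution
        # information to second execution information (for example a lower priority
        # or refresh rate), freeing a second system resource that meets the requirement.
        second_info_executing = adjust(first_info_executing, requirement)
        execute(executing_task, second_info_executing)
        execute(task_to_execute, first_info_to_execute)
        return second_info_executing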
In a possible implementation of the first aspect, the first system resource is a system resource of the first electronic device, and after the first electronic device adjusts the first execution information of the executing task to the second execution information of the executing task, the method further includes: the first electronic device executes the executing task according to the second execution information of the executing task, and executes the task to be executed according to the first execution information of the task to be executed.
For example, the first electronic device is a terminal device or an in-vehicle device, and the first system resource is an available system resource of that terminal device or in-vehicle device. The terminal device or in-vehicle device correspondingly adjusts the first execution information of the executing task to the second execution information of the executing task, executes the executing task with the second execution information of the executing task, and executes the task to be executed with the first execution information of the task to be executed.
In a possible implementation of the first aspect, the method further includes: the first system resource comprises a system resource of at least one second electronic device. After the first electronic device adjusts the first execution information of the executing task to the second execution information of the executing task, the method further includes: the first electronic device sends a first execution instruction to at least one second electronic device based on second execution information of the task being executed and first execution information of the task to be executed, so that the at least one second electronic device executes the task being executed with the second execution information of the task being executed based on the first execution instruction, and executes the task to be executed with the first execution information of the task to be executed.
For example, the distributed system includes one first electronic device and at least one second electronic device, and the first system resource includes system resources of a plurality of second electronic devices in the distributed system. The first electronic device sends a first execution instruction to the at least one second electronic device based on the second execution information of the executing task and the first execution information of the task to be executed, so that the at least one second electronic device executes, based on the first execution instruction, the executing task with the second execution information of the executing task and the task to be executed with the first execution information of the task to be executed.
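For the distributed variant, the following is a minimal sketch of how the first electronic device might package the first execution instruction for the second electronic devices; the transport callable and the instruction layout are assumptions, not defined by the application.

    # Hypothetical sketch: the first electronic device dispatches the first execution
    # instruction to the second electronic devices of the distributed system. The
    # "send" callable and the dict layout are illustrative assumptions.
    def dispatch_first_execution_instruction(second_devices, send,
                                             executing_task, second_info_executing,
                                             task_to_execute, first_info_to_execute):
        instruction = {
            "type": "first_execution_instruction",
            "executing_task": {"task": executing_task,
                               "execution_info": second_info_executing},
            "task_to_execute": {"task": task_to_execute,
                                "execution_info": first_info_to_execute},
        }
        for device in second_devices:
            # Each second device executes the executing task with its second execution
            # information and the task to be executed with its first execution information.
            send(device, instruction)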
In a possible implementation of the first aspect, the method further includes: after the first electronic device executes the task to be executed according to the first execution information of the task to be executed, generating a first execution result of the task to be executed; and when the first execution result of the task to be executed does not meet the first execution information of the task to be executed, adjusting the second execution information of the executing task to third execution information of the executing task, where, when the executing task is executed according to the third execution information of the executing task, a third system resource available for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
For example, the first execution result of the task to be executed may be the processing result generated when the first electronic device schedules and processes the task to be executed. When the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, the first electronic device may adjust the first execution information or the second execution information of the executing task to the third execution information of the executing task. Executing the executing task with the third execution information of the executing task corresponds to a third system resource available for executing the task to be executed; the third system resource is larger than the second system resource and meets the first system resource requirement. This further satisfies the system resources required for scheduling and processing the task to be executed, so that the first execution information of the task to be executed is met when it is scheduled and its execution requirement is satisfied. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
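The compensation step described above can be sketched in the same style, assuming that "meeting the first execution information" can be expressed as a boolean check; all helper names are hypothetical.

    # Hypothetical sketch of the compensation step: if the first execution result of
    # the task to be executed misses its first execution information, escalate the
    # executing task's execution information once more (second -> third).
    def compensate(result, first_info_to_execute, second_info_executing,
                   result_meets, escalate, execute,
                   executing_task, task_to_execute):
        if result_meets(result, first_info_to_execute):
            return second_info_executing          # no further adjustment needed
        # The third execution information frees a third system resource that is larger
        # than the second one and meets the first system resource requirement.
        third_info_executing = escalate(second_info_executing)
        execute(executing_task, third_info_executing)
        execute(task_to_execute, first_info_to_execute)
        return third_info_executing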
In a possible implementation of the first aspect, when a first execution result, generated by the at least one second electronic device executing the task to be executed according to the first execution information of the task to be executed, does not meet the first execution information of the task to be executed, the method further includes: the first electronic device adjusts the second execution information of the executing task to third execution information of the executing task and sends a second execution instruction to the at least one second electronic device, so that the at least one second electronic device executes, based on the second execution instruction, the executing task with the third execution information of the executing task and the task to be executed with the first execution information of the task to be executed, where, when the at least one second electronic device executes the executing task with the third execution information of the executing task, a third system resource available for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
For example, the first execution result of the task to be executed may be the processing result generated when the task to be executed is scheduled and processed in the distributed system. When the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, the first electronic device in the distributed system may adjust the first execution information or the second execution information of the executing task to the third execution information of the executing task. The first electronic device then sends a second execution instruction to the at least one second electronic device in the distributed system, so that the at least one second electronic device executes the executing task according to the third execution information of the executing task and the task to be executed according to the first execution information of the task to be executed. When the at least one second electronic device executes the executing task according to the third execution information of the executing task, a third system resource available for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement. This further satisfies the system resources required for scheduling and processing the task to be executed, so that its first execution information is met during scheduling and its execution requirement is satisfied. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
In a possible implementation of the first aspect, the method further includes: when the first system resource capable of being used for executing the task to be executed meets the first system resource requirement, executing the task to be executed with the first execution information of the task to be executed.
For example, in the case where the available system resources (first system resources) satisfy the predicted system resources (first system resource requirements) required for the task to be executed, the task to be executed is scheduled according to the first execution information of the task to be executed, and a processing result of the task to be executed is generated.
In a possible implementation of the first aspect, the method further includes: the first execution information of the executing task, the second execution information of the executing task and the first execution information of the task to be executed comprise at least one of the following: scheduling priority, execution time range, execution frequency range, display resolution, display refresh rate, delay jitter range, reliability, trusted security environment, isolation.
In a possible implementation of the first aspect, the method further includes: adjusting the first execution information of the executing task to the second execution information of the executing task includes at least one of: and adjusting the first scheduling priority included in the first execution information of the executing task to the second scheduling priority. The first execution time range included in the first execution information of the executing task is adjusted to the second execution time range. The first execution frequency range included in the first execution information of the executing task is adjusted to the second execution frequency range. The first display resolution included in the first execution information of the executing task is adjusted to the second display resolution. The first display refresh rate included in the first execution information of the executing task is adjusted to the second display refresh rate. The first delay jitter range included in the first execution information of the executing task is adjusted to the second delay jitter range. The first reliability included in the first execution information of the executing task is adjusted to the second reliability. The first trusted security environment included in the first execution information of the executing task is adjusted to a second trusted security environment. And adjusting the first isolation included in the first execution information of the executing task to the second isolation.
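Purely for illustration, the execution-information items listed above can be grouped into a single record; the Python field names, units, defaults, and the example adjustment below are editorial assumptions, not the application's data format.

    # Purely illustrative grouping of the execution-information items named above;
    # field names, units and defaults are editorial assumptions.
    from dataclasses import dataclass, replace
    from typing import Optional, Tuple

    @dataclass(frozen=True)
    class ExecutionInfo:
        scheduling_priority: Optional[int] = None
        execution_time_range_ms: Optional[Tuple[float, float]] = None
        execution_frequency_range_ms: Optional[Tuple[float, float]] = None
        display_resolution: Optional[str] = None       # e.g. "4K", "16K"
        display_refresh_rate_fps: Optional[int] = None
        delay_jitter_range_ms: Optional[Tuple[float, float]] = None
        reliability: Optional[float] = None
        trusted_security_environment: Optional[str] = None
        isolation: Optional[str] = None

    # Adjusting "first" to "second" execution information is then just producing a
    # modified copy, e.g. a lower numeric priority and a 30 fps refresh rate (the
    # concrete choices below are illustrative).
    def to_second_info(first_info: ExecutionInfo) -> ExecutionInfo:
        return replace(first_info,
                       scheduling_priority=(first_info.scheduling_priority or 0) - 1,
                       display_refresh_rate_fps=30)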
In a possible implementation of the first aspect, the method further includes: the system resources in the first system resource, the first system resource requirement, or the second system resource requirement comprise at least one of: comprehensive computing power and network bandwidth.
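Likewise, a system resource with the two dimensions named here, and the check of whether an available resource meets a requirement, can be sketched as follows; the class, units, and numbers are illustrative assumptions.

    # Hypothetical representation of a system resource with the two dimensions named
    # in this implementation; units and numbers are illustrative.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SystemResource:
        computing_power: float    # comprehensive computing power, arbitrary units
        network_bandwidth: float  # e.g. Mbit/s

        def meets(self, requirement: "SystemResource") -> bool:
            # An available resource meets a requirement only if both dimensions suffice.
            return (self.computing_power >= requirement.computing_power
                    and self.network_bandwidth >= requirement.network_bandwidth)

    # Example: first system resource vs. first system resource requirement.
    available = SystemResource(computing_power=8.0, network_bandwidth=100.0)
    required = SystemResource(computing_power=10.0, network_bandwidth=50.0)
    print(available.meets(required))  # False -> adjust the executing task's execution information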
In a second aspect, the present application provides a task scheduling device, including: a scene perception module, configured to acquire a task to be executed, first execution information of the task to be executed, an executing task, and first execution information of the executing task, and further configured to determine, according to the first execution information of the task to be executed, a first system resource requirement for completing the task to be executed; and a target overall module, configured to adjust the first execution information of the executing task to second execution information of the executing task when a first system resource available for executing the task to be executed does not meet the first system resource requirement, where, when the executing task is executed according to the second execution information of the executing task, a second system resource available for executing the task to be executed meets the first system resource requirement.
For example, the scene perception module obtains, according to the current scene, a task to be executed and first execution information of the task to be executed, an executing task and first execution information of the executing task, available system resources, system resources occupied by the executing task, and system resources required by the task to be executed. The system resources required by the task to be executed may be predicted by the scene perception module from the task to be executed and its first execution information. When the available system resources do not meet the predicted system resources required by the task to be executed, the target overall module adjusts the first execution information of the executing task to the second execution information of the executing task so as to better meet the target. The adjustment includes, but is not limited to, reducing the system resources occupied by the executing task to increase the available system resources, marking the task to be executed so that it enters a specific scheduling queue to meet a latency target, or other measures. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
In a possible implementation of the second aspect, the apparatus further includes: a task scheduling and processing module, configured to execute the task to be executed according to the first execution information of the task to be executed and generate a first execution result of the task to be executed; and a prediction and compensation module, configured to adjust the second execution information of the executing task to third execution information of the executing task when the first execution result of the task to be executed does not meet the first execution information of the task to be executed, where, when the executing task is executed according to the third execution information of the executing task, a third system resource available for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
For example, the target overall module sends the task to be executed and the first execution information of the task to be executed to the task scheduling and processing module. The task scheduling and processing module schedules and processes the task to be executed and generates a processing result of the task to be executed. When the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, the prediction and compensation module may adjust the first execution information or the second execution information of the executing task to the third execution information of the executing task, so as to further meet the target. Ways of adjustment include, but are not limited to, reducing the system resources occupied by the executing task, increasing the available system resources, or marking the task to be executed to enter a particular dispatch queue to meet a latency target, or other measures. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
In a third aspect, an embodiment of the present application provides a readable medium having stored thereon instructions that, when executed on an electronic device, cause the electronic device to perform any one of the above-mentioned first aspect and various possible implementations of the first aspect.
In a fourth aspect, an embodiment of the present application provides a first electronic device, including:
a memory for storing instructions for execution by the one or more processors of the first electronic device, and
a processor, which is one of the processors of the first electronic device, for executing the task scheduling method of the first aspect and any of the various possible implementations of the first aspect.
In a possible implementation manner of the fourth aspect, the first electronic device is one of a vehicle-mounted device, a mobile phone, a computer, a wearable device, a smart large screen, a smart speaker, and virtual reality glasses.
In a fifth aspect, embodiments of the present application provide a computer program product comprising a computer program/instruction which, when executed by a processor, implements the task scheduling method of the first aspect and any of the various possible implementations of the first aspect.
In a sixth aspect, the present application further provides a chip, including a processor, where the processor is coupled to a memory, and is configured to read and execute program instructions stored in the memory, so that the chip implements the task scheduling method according to the first aspect and any one of various possible implementations of the first aspect.
For the technical effects of the third to sixth aspects and of their possible implementations, reference may be made to the technical effects achievable by the first aspect and its possible implementations, and the description is not repeated here.
Drawings
FIG. 1A illustrates a scene graph of a user driving a vehicle for travel, according to some embodiments of the application;
FIG. 1B illustrates a scene graph of a vehicle displaying vehicle travel information on a heads-up display, according to some embodiments of the application;
FIG. 2 illustrates a smart home scene graph, according to some embodiments of the application;
FIG. 3 illustrates a smart phone scene graph, according to some embodiments of the application;
FIG. 4 illustrates an architectural diagram of a task scheduler, according to some embodiments of the present application;
FIG. 5 illustrates an interaction diagram of a task scheduling method, according to some embodiments of the application;
FIG. 6 illustrates an interaction diagram of another task scheduling method, according to some embodiments of the application;
FIG. 7 illustrates an interaction diagram of another task scheduling method, according to some embodiments of the application;
FIG. 8 illustrates a schematic structural diagram of an electronic device or terminal device in a vehicle-mounted device or distributed system, according to some embodiments of the present application;
fig. 9 is a block diagram showing a software configuration of an electronic device or a terminal device in a vehicle-mounted device or a distributed system according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the application include, but are not limited to, task scheduling methods, apparatus, readable media, and electronic devices.
In order to solve the above technical problems, the present application provides a task scheduling device, which includes a scene perception module, a target overall module, a task scheduling and processing module, a prediction and compensation module, a preset scheduling module, and the like.
Specifically, the scene perception module obtains, according to the current scene, a task to be executed and first execution information of the task to be executed, an executing task and first execution information of the executing task, available system resources, system resources occupied by the executing task, and system resources required by the task to be executed. The system resources required by the task to be executed may be predicted by the scene perception module from the task to be executed and its first execution information. When the available system resources do not meet the predicted system resources required by the task to be executed, the target overall module adjusts the first execution information of the executing task to corresponding second execution information so as to better meet the target. The adjustment includes, but is not limited to, reducing the system resources occupied by the executing task so as to increase the available system resources, marking the task to be executed so that it enters a specific scheduling queue to meet a latency target, or other measures. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
And then, the target overall module sends the task to be executed and the first execution information of the task to be executed to the task scheduling and processing module. The task scheduling and processing module performs scheduling processing on the task to be executed to generate a processing result of the task to be executed.
The prediction and compensation module may adjust the first execution information or the second execution information of the executing task to the third execution information of the executing task when the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, so as to further meet the target. Ways of adjustment include, but are not limited to, reducing the system resources occupied by the executing task, increasing the available system resources, or marking the task to be executed to enter a particular dispatch queue to meet a latency target, or other measures. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
Optionally, the prediction and compensation module may further increase available system resources by increasing a processing frequency of the central processing unit when a processing result of the task to be executed does not meet the first execution information of the task to be executed, so as to ensure scheduling processing of the task to be executed.
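A minimal sketch of this optional compensation path, assuming a hypothetical platform hook for raising the CPU frequency level (the real interface depends on the device and is not specified by the application):

    # Hypothetical sketch of the optional compensation path above: if the processing
    # result still misses the target, raise the CPU frequency one level to enlarge the
    # available system resources. "set_cpu_frequency_level" is an assumed platform
    # hook, not a real API.
    def compensate_by_frequency(result_ok: bool, current_level: int, max_level: int,
                                set_cpu_frequency_level) -> int:
        if result_ok or current_level >= max_level:
            return current_level
        new_level = current_level + 1
        set_cpu_frequency_level(new_level)  # frees additional resources for the task to be executed
        return new_level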
It can be understood that, in the task scheduling device provided by the present application, the target overall module can schedule and process the task to be executed purposefully according to the tasks and task-related information acquired by the scene perception module, and, when the available system resources are insufficient, actively adjust the first execution information of the executing task without affecting user experience, so as to better meet the target. The adjustment includes, but is not limited to, reducing the system resources occupied by the executing task so as to increase the available system resources, marking the task to be executed to enter a specific scheduling queue to meet a latency target, or other measures. When the processing result of the task to be executed does not meet the first execution information of the task to be executed, the prediction and compensation module actively adjusts the first execution information or the second execution information of the executing task in the same ways, so as to better meet the target. In this way, scheduling and processing of the task to be executed are guaranteed, and user experience is improved.
It should be noted that the task to be executed in the present application may be one task or a plurality of tasks, and the executing task may likewise be one task or a plurality of tasks. When the first execution information or the second execution information of the executing task is adjusted, the execution information of one task or of a plurality of tasks may be adjusted. The adjustment may be made according to the different scenarios of practical application, and the present application is not specifically limited herein.
It should be noted that, the first execution information of the task to be executed according to the present application may include at least one of the following: the method comprises the steps of first scheduling priority of a task to be executed, first execution time range of the task to be executed, first execution frequency range of the task to be executed, first display resolution of the task to be executed, first display refresh rate of the task to be executed, first delay jitter range of the task to be executed, first reliability of the task to be executed, first trusted security environment of the task to be executed, first isolation degree of the task to be executed and the like. It can be understood that the first execution information of the task to be executed is used for describing the execution requirement when the task scheduling and processing module schedules and processes the task to be executed, the first execution information of the task to be executed depends on different scenes of actual application and different task types of the task to be executed, and the first execution information of the task to be executed is not particularly limited.
It should be noted that, the first execution information of the executing task of the present application may include at least one of the following: a first scheduling priority of a executing task, a first execution time range of the executing task, a first execution frequency range of the executing task, a first display resolution of the executing task, a first display refresh rate of the executing task, a first latency jitter range of the executing task, a first reliability of the executing task, a first trusted security environment of the executing task, a first isolation of the executing task, etc. It can be understood that the first execution information of the executing task of the present application is used to describe the execution requirement of the task scheduling and processing module when the executing task is scheduled and processed, the first execution information of the executing task depends on different scenes of the actual application and different task types of the executing task, and the present application does not specifically limit the first execution information of the executing task.
Similarly, the second execution information corresponding to the first execution information of the executing task is also used for describing the execution requirement of the task scheduling and processing module when the task scheduling and processing module schedules and processes the executing task, and compared with the task scheduling and processing module which schedules and processes the executing task according to the first execution information of the executing task, the task scheduling and processing module is more beneficial to achieving the target time delay of the task to be executed or other execution targets when the task scheduling and processing module schedules and processes the executing task according to the second execution information of the executing task. It will be appreciated that the second execution information of the executing task depends on different scenarios of the actual application and different task types of the executing task, and the present application is not limited in particular to the second execution information of the executing task.
Similarly, the third execution information corresponding to the first execution information or the second execution information of the executing task is also used for describing the execution requirement of the task scheduling and processing module when the executing task is scheduled and processed, and compared with the task scheduling and processing module which schedules and processes the executing task according to the first execution information or the second execution information of the executing task, the task scheduling and processing module is more beneficial to achieving the target time delay or other execution targets of the task to be executed when the executing task is scheduled and processed according to the third execution information of the executing task. It will be appreciated that the third execution information of the executing task depends on different scenarios of the application and different task types of the executing task, and the present application is not limited in particular to the third execution information of the executing task.
The device for operating the task scheduling device of the present application may be a single electronic device or a distributed system formed by a plurality of electronic devices. For example, the electronic device may be an in-vehicle device, a laptop computer, a desktop computer, a tablet computer, a cell phone, a wearable device, a head-mounted display, a mobile email device, a portable game machine, a portable music player, a reader device, a smart watch, a bracelet, jewelry (e.g., a device that forms a decorative article such as an earring, a bracelet), a server, or glasses, or the like. It will be appreciated that the present application is not limited in particular to the apparatus in which the task scheduling device of the present application operates.
In order to make the objects, technical solutions and advantages of the present application more apparent, the technical solutions of the present application will be described in detail with reference to fig. 1 to 9.
FIG. 1A shows a scenario of a user driving a vehicle for travel. The scenario of FIG. 1A includes a driving vehicle 100 on which a vehicle-mounted device 10 is mounted. The vehicle-mounted device 10 may provide vehicle control services for the vehicle 100, including driving data acquisition, driving track recording, vehicle state monitoring, and driving behavior analysis, and may also provide services such as real-time navigation, web browsing, and video viewing.
As shown in fig. 1A, the in-vehicle apparatus 10 includes a Head Up Display (HUD) 11, a microphone 12, a center control screen 13, a drive recording camera 14, an instrument screen 15, a co-driver screen 16, and the like. For example, the meter screen 15 may provide a track related data display for the user. The tachograph camera 14 may be used for tachograph data acquisition. The central control screen 13 may provide a navigational map display for the user. The microphone 12 may collect voice data of the user, and the in-vehicle apparatus 10 may perform recognition processing on the voice of the user based on the voice data collected by the microphone 12. The head up display 11 may be used to display a driving path indication, a vehicle speed, etc.
FIG. 1B shows a scenario in which the vehicle 100 displays vehicle driving information on the head-up display 11. As shown in FIG. 1B, the head-up display 11 displays real-time navigation information, specifically: icon 110 shows the speed limit of the road section (60 MAX), the current driving speed is 16 km/h, and icon 112 shows that the vehicle 100 is 300 m away from the next intersection where a right turn can be made.
Taking, as an example, a task to be executed of the in-vehicle apparatus 10 that displays real-time navigation information on the head-up display 11, the task scheduling process performed by the task scheduling device for this task is briefly described below.
For example, based on the vehicle driving scene of FIG. 1B, the scene perception module obtains, as the task to be executed, the task of displaying navigation information on the head-up display 11, and the first execution information of this task includes: a first execution time range of not more than 10 ms, a first execution frequency range of not more than 50 ms, a first display resolution of 4K, and a first display refresh rate of 60 fps.
Based on the scene of FIG. 1B, the executing tasks acquired by the scene perception module include four tasks: the driving recording task in which the camera 14 collects images, the vehicle state monitoring task, the task in which the central control screen 13 displays the navigation map, and the task in which the meter screen 15 displays the vehicle driving speed and distance. Taking the latter two as examples, the first execution information of the task in which the meter screen 15 displays the vehicle driving speed and distance includes: a first scheduling priority higher than that of the real-time navigation information task displayed by the head-up display 11, and a first display refresh rate of 80 fps. The first execution information of the task in which the central control screen 13 displays the navigation map includes: a first scheduling priority higher than that of the navigation information task displayed by the head-up display 11, and a first display refresh rate of 100 fps.
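Purely to make the scenario concrete, the execution information gathered above can be written down as data; the layout is an editorial assumption, while the numbers are those stated in the text.

    # Illustrative data layout (not the application's format) for the execution
    # information gathered in this vehicle scenario; the numbers are those stated above.
    hud_task_first_info = {                      # task to be executed: HUD navigation display
        "max_execution_time_ms": 10,
        "max_execution_period_ms": 50,
        "display_resolution": "4K",
        "display_refresh_rate_fps": 60,
    }
    executing_tasks_first_info = {
        "meter_screen_speed_and_distance": {
            "scheduling_priority": "higher than the HUD navigation task",
            "display_refresh_rate_fps": 80,
        },
        "central_control_screen_navigation_map": {
            "scheduling_priority": "higher than the HUD navigation task",
            "display_refresh_rate_fps": 100,
        },
    }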
The scene perception module obtains available resources of the vehicle-mounted device 10, and predicts system resources required by the head-up display 11 for displaying the navigation information task based on the navigation information task displayed by the head-up display 11 and first execution information of the navigation information task displayed by the head-up display 11.
When the available resources of the in-vehicle device 10 do not satisfy the system resources required by the head-up display 11 to display the navigation information task, the goal orchestration module may adjust the first execution information of the executing task to the second execution information of the executing task. Scheduling and processing the executing task according to its second execution information takes fewer resources than doing so according to its first execution information. This reduces the resource occupation of the executing task and releases more available system resources for scheduling and processing the navigation information task of the head-up display 11, ensuring that the task is completed and that, even when system resources are insufficient, the head-up display 11 displays 4K images at a refresh rate of 60 fps, thereby improving user experience.
For example, the goal orchestration module may adjust a first scheduling priority of the meter screen 15 for displaying the vehicle travel speed and distance tasks to a second scheduling priority that is lower than a scheduling priority of the heads-up display 11 for displaying the real-time navigation information tasks. The first display refresh rate of the meter screen 15 for displaying the vehicle travel speed and distance tasks may also be adjusted to a second display refresh rate of 30fps. For another example, the goal orchestration module may further adjust the first scheduling priority of the navigation map task displayed by the central control screen 13 to a second scheduling priority, where the adjusted second scheduling priority is lower than the scheduling priority of the real-time navigation information task displayed by the head-up display 11. For another example, the goal orchestration module may also adjust the first display refresh rate of the central control screen 13 to display the navigational map task to a second display refresh rate, which may be 80fps.
After the goal orchestration module adjusts the first display refresh rate (100 fps) of the navigation map task displayed by the central control screen 13 to the second display refresh rate (80 fps), the task scheduling and processing module schedules and processes the navigation information task displayed by the head-up display 11 and generates a processing result of this task. If the processing result still does not satisfy displaying navigation information on the head-up display 11 at a refresh rate of 60 fps, the prediction and compensation module adjusts the second display refresh rate (80 fps) of the navigation map task displayed by the central control screen 13 to a third display refresh rate (60 fps). The system resources occupied by the adjusted navigation map task are reduced, which further releases system resources for scheduling and processing the navigation information task of the head-up display 11, ensures that the head-up display 11 displays navigation information at 60 fps, and further improves user experience.
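For illustration, the refresh-rate adjustment chain just described (100 fps to 80 fps by the goal orchestration module, then 80 fps to 60 fps by the prediction and compensation module) can be traced as follows; the target check is a hypothetical stand-in for the real resource evaluation.

    # Illustrative trace of the refresh-rate adjustments described above for the
    # central control screen task; "hud_target_met" is a hypothetical stand-in for
    # the real resource check.
    def step_down_refresh_rate(current_fps, steps, hud_target_met):
        # steps: successive refresh rates to try, e.g. (80, 60) as in the scenario.
        for fps in steps:
            if hud_target_met(current_fps):
                break
            current_fps = fps        # second, then third display refresh rate
        return current_fps

    # With the values from the text (the check reports the HUD target unmet, so both
    # adjustments are applied): 100 fps -> 80 fps -> 60 fps.
    print(step_down_refresh_rate(100, (80, 60), hud_target_met=lambda fps: False))  # 60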
Optionally, in a case where the processing result of the navigation information task displayed by the head-up display 11 does not satisfy that the head-up display 11 displays navigation information at a display refresh rate of 60fps, the prediction and compensation module may further increase the available system resources of the vehicle-mounted device 10 by increasing the processing frequency of the central processor, the processing frequency of the graphics processor, the operating frequency of the memory, and so on of the vehicle-mounted device 10, so that the available system resources of the vehicle-mounted device 10 may satisfy the scheduling process of the navigation information task displayed by the head-up display 11.
Fig. 2 shows a smart home scenario diagram, the scenario of fig. 2 comprising a distributed system 20, the distributed system 20 comprising: smart large screen 201, smart speaker 202, camera 203, virtual Reality (VR) glasses 204, and mobile phone 206.
Based on the scene of FIG. 2, when the user wearing the VR glasses 204 turns around to watch a VR video, the distributed system 20 detects the user's turning motion, and the task to be executed of the distributed system 20 acquired by the scene perception module is the 360-degree surround video display task of the VR glasses 204. The scene perception module may further obtain first execution information of this task, which may include: a first display resolution of 16K. The scene perception module predicts the system resources required by the 360-degree surround video display task of the VR glasses 204 according to this first display resolution.
When the available system resources do not meet the system resources required by the 360-degree surround video display task of the VR glasses 204, the goal orchestration module may adjust the first execution information of the executing task to the second execution information of the executing task. Compared with scheduling and processing the executing task according to its first execution information, scheduling and processing it according to its second execution information occupies fewer resources. This reduces the system resources occupied by the executing task and releases more system resources for scheduling and processing the 360-degree surround video display task of the VR glasses 204, so that the task achieves a 16K display resolution. Even when system resources are insufficient, the user therefore sees 16K-resolution video when turning around, which improves user experience.
For example, the executing task of the distributed system 20 acquired by the scene perception module may be an educational website video display task of the smart large screen 201, where the corresponding first execution information of the educational website video display task of the smart large screen 201 acquired by the scene perception module may include a first display refresh rate of 100fps. The goal orchestration module may adjust a first display refresh rate (100 fps) of the educational web site video display tasks of the smart large screen 201 to a second display refresh rate (60 fps). It is apparent that compared with the task scheduling and processing module which performs scheduling processing on the educational website video display task of the intelligent large screen 201 based on the first display refresh rate of the educational website video display task of the intelligent large screen 201, the task scheduling and processing module performs scheduling processing on the educational website video display task of the intelligent large screen 201 based on the second display refresh rate of the educational website video display task of the intelligent large screen 201, the occupied system resources are less.
Further, after the goal orchestration module adjusts the first display refresh rate (100 fps) of the educational website video display task of the smart large screen 201 to the second display refresh rate (60 fps), the task scheduling and processing module schedules and processes the 360-degree surround video display task of the VR glasses 204 and generates a processing result with an 8K display resolution. Since 8K still does not meet the required 16K display resolution of the 360-degree surround video display of the VR glasses 204, the prediction and compensation module adjusts the second display refresh rate (60 fps) to a third display refresh rate (50 fps). The system resources occupied by the adjusted executing task are reduced, which further releases system resources for scheduling and processing the 360-degree surround video display task of the VR glasses 204 and guarantees its 16K display resolution, so that, even when system resources are insufficient, the user can watch 16K-resolution video when turning around, improving user experience.
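A comparable illustrative trace for this smart-home scenario steps the smart large screen's refresh rate from the second (60 fps) to the third (50 fps) value until the VR task reaches 16K; the resolution predicate is a hypothetical stand-in for the actual scheduling result.

    # Illustrative trace: step the smart large screen's refresh rate down until the
    # 360-degree surround video display task of the VR glasses reaches 16K.
    def free_resources_for_vr(screen_fps_steps, achieved_resolution, target="16K"):
        # screen_fps_steps: successive refresh rates tried for the smart large screen
        # task, e.g. [60, 50] (the second and third display refresh rates above).
        trace = []
        for screen_fps in screen_fps_steps:
            resolution = achieved_resolution(screen_fps)   # result of rescheduling the VR task
            trace.append((screen_fps, resolution))
            if resolution == target:
                break
        return trace

    # With the values from the text: at 60 fps the VR task only reaches 8K, and at
    # 50 fps it reaches the 16K target. The lambda below is illustrative.
    print(free_resources_for_vr([60, 50],
                                achieved_resolution=lambda fps: "16K" if fps <= 50 else "8K"))
    # [(60, '8K'), (50, '16K')]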
Optionally, when the 8K display resolution of the 360-degree surround video display task of the VR glasses 204 does not meet the required 16K display resolution, the prediction and compensation module may further increase the available system resources of the distributed system 20 by increasing the processing frequency of the central processor, the processing frequency of the graphics processor, the operating frequency of the memory, and so on, so that the available system resources of the distributed system 20 can enable the 360-degree surround video display task of the VR glasses 204 to achieve the 16K display resolution.
It can be understood that the task scheduling device of the present application is not only applicable to the in-vehicle device 10 in fig. 1 or the scenario of the distributed system 20 including a plurality of electronic devices in fig. 2, but also applicable to the application scenario of a single electronic device, and the electronic device is taken as an example for the mobile phone 30 to be described below.
Fig. 3 shows a smart phone scenario diagram, where the scenario of fig. 3 includes a mobile phone 30, and as shown in fig. 3, a display interface of the mobile phone 30 displays a video call of a user through instant messaging software.
As shown in FIG. 3, the task to be executed acquired by the scene perception module is a video call task, and the executing tasks acquired by the scene perception module include a map navigation task, a screen-casting task, and the like. The scene perception module also obtains first execution information of the task to be executed, first execution information of the executing tasks, available system resources of the mobile phone 30, and so on. The scene perception module predicts the system resources required by the video call task of the mobile phone 30 according to the video call task and its first execution information.
When the available system resources do not meet the system resources required by the video call task, the goal orchestration module may adjust the first execution information of the executing task to the second execution information of the executing task. Scheduling and processing the executing task of the mobile phone 30 according to its second execution information takes fewer resources than doing so according to its first execution information. This reduces the resource occupation of the executing task of the mobile phone 30 and releases more system resources for scheduling and processing the video call task, ensuring that the video call task can be scheduled and processed even when system resources are insufficient, thereby improving user experience.
The target overall module adjusts the first execution information of the executing task of the mobile phone 30 to the second execution information of the executing task, and then the task scheduling and processing module performs scheduling processing on the video call task of the mobile phone 30 to generate a processing result of the video call task of the mobile phone 30. In the case that the processing result of the video call task of the mobile phone 30 still does not satisfy the first execution information of the video call task, the prediction and compensation module may adjust the second execution information of the executing task of the mobile phone 30 to the third execution information of the executing task. The system resources occupied by the task being executed of the mobile phone 30 after adjustment are reduced, so that more system resources are further released for scheduling processing of the video call task of the mobile phone 30, and user experience is further improved.
Optionally, in the case that the processing result of the video call task of the mobile phone 30 does not meet the first execution information of the video call task, the prediction and compensation module may further increase the available system resources of the mobile phone 30 by increasing the processing frequency of the central processor of the mobile phone 30, the processing frequency of the graphics processor, the operating frequency of the memory, and so on, so that, when the video call task of the mobile phone 30 is scheduled and processed, the available system resources of the mobile phone 30 are sufficient to implement the first execution information of the video call task.
Based on the scenarios of fig. 1 to 3, in an embodiment of the present application, the task scheduling method of the present application may be implemented by operating the task scheduling device on the in-vehicle device 10 or the distributed system 20 including a plurality of electronic devices or a single electronic device (e.g., the mobile phone 30).
Fig. 4 shows a schematic architecture of a task scheduler 40, and as shown in fig. 4, the task scheduler 40 specifically includes a scene perception module 41, a goal orchestration module 42, a task scheduling and processing module 43, a prediction and compensation module 44, a preset scheduling module 45, a measurement and monitoring module 46, and a clock synchronization and hard cycle module 47.
Scene perception module 41: the scene perception module 41 is configured to obtain the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources occupied by the task being executed and the available system resources, and to predict the system resources required by the task to be executed based on the task to be executed and the first execution information of the task to be executed.
The scene perception module 41 is further configured to send the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources required by the task to be executed and the available system resources, and the system resources occupied by the task being executed to the goal orchestration module 42.
The goal orchestration module 42: for receiving the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources required by the task to be executed and the available system resources, and the system resources occupied by the task being executed, which are sent by the scene perception module 41.
The goal orchestration module 42 is further configured to determine whether the available system resources meet the system resources required by the task to be executed, and, in case the available system resources do not meet the system resources required by the task to be executed, adjust the first execution information of the task being executed according to the task being executed and the first execution information of the task being executed, the system resources required by the task to be executed, the system resources occupied by the task being executed, and the available system resources, and generate the second execution information of the task being executed.
The objective orchestration module 42 is further configured to send the task to be executed and the first execution information of the task to be executed to the task scheduling and processing module 43.
Task scheduling and processing module 43: used for receiving the task to be executed and the first execution information of the task to be executed, performing scheduling processing on the task to be executed according to the task to be executed and the first execution information of the task to be executed, and generating a processing result of the task to be executed.
In some embodiments, the task scheduling and processing module 43 may schedule the task to be executed in a plurality of manners, such as priority queue scheduling, batch processing, or sleep power saving, to generate a processing result of the task to be executed.
Optionally, as demands for latency determinism develop, the task scheduling and processing module 43 may also be configured to preferentially schedule the queue in which a task to be executed with low latency requirements or low jitter requirements is located. For example, for a queue containing a task to be executed with low latency requirements, a hard interrupt driven by an accurate clock can trigger timely, prioritized processing of the pending messages in that queue. For a queue containing a task to be executed with low jitter requirements, output can be buffered to align with the time period so that jitter during task processing is eliminated. It will be appreciated that the low latency requirements and the low jitter requirements of the queues in which tasks to be executed are located may both be considered during scheduling and processing. Furthermore, for queues in which tasks to be executed have reliability requirements, suitable available computing resources may be selected, or the requirements may be met through redundant computation and verification.
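For illustration only, the scheduling behavior just described can be sketched roughly as follows in Python; this is a minimal sketch, not the patented implementation, and the class names, the lower-value-means-more-urgent priority convention, and the latency/jitter handling are all assumptions.

```python
# Illustrative sketch only; not the patented implementation.
import heapq
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class QueuedTask:
    priority: int                          # assumed convention: lower value = more urgent
    name: str = field(compare=False)
    low_latency: bool = field(default=False, compare=False)

class PriorityQueueScheduler:
    def __init__(self):
        self._heap = []

    def push(self, task: QueuedTask):
        # Tasks with low-latency requirements jump ahead of ordinary tasks,
        # e.g. because a precise-clock hard interrupt triggered their handling.
        effective = task.priority - 1000 if task.low_latency else task.priority
        heapq.heappush(self._heap, (effective, task))

    def pop(self) -> QueuedTask:
        return heapq.heappop(self._heap)[1]

def release_on_period(last_release_s: float, period_s: float) -> float:
    """Buffer a low-jitter task's output so results are released on aligned period boundaries."""
    wait = (last_release_s + period_s) - time.monotonic()
    if wait > 0:
        time.sleep(wait)
    return time.monotonic()
```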
Prediction and compensation module 44: used for acquiring the processing result of the task to be executed, determining whether the processing result of the task to be executed meets the first execution information of the task to be executed, and adjusting the first execution information or the second execution information of the task being executed to the third execution information of the task being executed in the case that the processing result of the task to be executed does not meet the first execution information of the task to be executed.
Preset scheduling module 45: used for presetting the first execution information of the task to be executed and the first execution information of the task being executed. The first execution information of the task to be executed may include at least one of: a first scheduling priority of the task to be executed, a first execution time range of the task to be executed, a first execution frequency range of the task to be executed, a first display resolution of the task to be executed, and a first display refresh rate of the task to be executed. The first execution information of the executing task includes at least one of: a first scheduling priority of the executing task, a first execution time range of the executing task, a first execution frequency range of the executing task, a first display resolution of the executing task, and a first display refresh rate of the executing task.
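For illustration only, the execution information fields listed above can be held in a simple data structure such as the one sketched below; the field names and the example preset values are assumptions and are not part of the disclosure.

```python
# Illustrative sketch of "first execution information" as a data structure.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ExecutionInfo:
    scheduling_priority: int                                        # assumed: lower value = higher priority
    execution_time_range_s: Optional[Tuple[float, float]] = None    # seconds
    execution_frequency_range_hz: Optional[Tuple[float, float]] = None
    display_resolution: Optional[str] = None                        # e.g. "8K", "16K"
    display_refresh_rate_fps: Optional[float] = None

# Example of what a preset table in a preset scheduling module might hold (values illustrative only):
PRESET_FIRST_EXECUTION_INFO = {
    "hud_realtime_navigation": ExecutionInfo(scheduling_priority=0, display_refresh_rate_fps=100.0),
    "center_screen_navigation_map": ExecutionInfo(scheduling_priority=1, display_refresh_rate_fps=60.0),
}
```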
Measurement and monitoring module 46: used for collecting statistics on and measuring the processing delay of each task scheduled by the task scheduling and processing module 43 and the subsequent task processing results (for example, whether the task processing is successful, obtained through interface interaction, etc.), so as to obtain measurements of and feedback on task processing and measurements of the effectiveness of system resource usage.
Clock synchronization and hard cycle module 47: used for clock synchronization of each link when a low-delay task is completed. If the task is a high-certainty task, in order to prevent the software from accumulating delay, the high-certainty task can be directly and preferentially scheduled by a hard interrupt, the existing low-priority task can be interrupted and suspended, and the scheduling of the high-certainty task is committed so that its processing delay and period are guaranteed. For example, as shown in the scenario of fig. 1B, the task to be executed is the task of the head-up display 11 displaying real-time navigation information (a high-certainty task), and the first display refresh rate of this task is 100fps, that is, the hard period of the head-up display 11 displaying real-time navigation information is 10ms. In order for the head-up display 11 to display real-time navigation information at a display refresh rate of 100fps, the task can be directly and preferentially scheduled by a hard interrupt, so that the head-up display 11 displays the real-time navigation information at a display refresh rate of 100fps.
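As a quick arithmetic check of the hard period quoted above (illustrative only):

```python
def hard_period_ms(display_refresh_rate_fps: float) -> float:
    # A display refresh rate of 100 fps corresponds to a hard period of 10 ms.
    return 1000.0 / display_refresh_rate_fps

assert hard_period_ms(100.0) == 10.0
```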
Based on the scenario of the user driving the vehicle and traveling in fig. 1, the execution process of the task scheduling method of the present application is described in detail below through the interaction process between the modules in the task scheduling device in fig. 4.
Fig. 5 shows an interaction diagram of a task scheduling method, specifically including:
S501: The scene perception module 41 acquires the task to be executed of the in-vehicle apparatus 10 and the first execution information of the task to be executed based on the current scene.
In some embodiments, the task to be performed of the in-vehicle apparatus 10 may be generated based on a user operation or an action of the user. The first execution information of the task to be executed may be first execution information of the task to be executed preset by the preset scheduling module 45. In some embodiments, as in the smart trip scenario shown in fig. 1A, the current scenario may be a vehicle 100 driving scenario, a vehicle 100 parking scenario, a vehicle 100 turning scenario, or the like. The present application is not particularly limited to the current scene.
In some embodiments, the task to be executed of the in-vehicle device 10 acquired by the scene perception module 41 may include one task or a plurality of tasks; the number of tasks to be executed depends on the different scenarios of the actual application and is not specifically limited by the present application. For example, when the current scene is the scene of the user driving the vehicle in fig. 1B, and based on that scene the vehicle-mounted device 10 detects that the user starts the head-up display 11, the task to be executed of the vehicle-mounted device 10 acquired by the scene perception module 41 may be the navigation information task displayed by the head-up display 11, and the first execution information of the navigation information task displayed by the head-up display 11 acquired by the scene perception module 41 includes: the first scheduling priority of the navigation information task displayed by the head-up display 11, the first execution time range of the navigation information task displayed by the head-up display 11, the first execution frequency range of the navigation information task displayed by the head-up display 11, the first display resolution of the navigation information task displayed by the head-up display 11, and the first display refresh rate of the navigation information task displayed by the head-up display 11. For example, the first display refresh rate of the navigation information task displayed by the head-up display 11 is 100fps.
In other embodiments, the current scene may also be a scene of a navigation application started by the user on the central control screen 13, based on the scene, when the vehicle-mounted device 10 detects that the user starts the navigation application on the central control screen 13, the task to be executed of the vehicle-mounted device 10 acquired by the scene sensing module 41 may be a navigation map task displayed by the central control screen 13, and the first display refresh rate of the navigation map task displayed by the central control screen 13 may be 60fps. It is easy to understand that the task to be executed in the intelligent travel scene depends on different scenes of actual application and different task types of the task to be executed, and the task to be executed in the intelligent travel scene is not particularly limited.
In other embodiments, the first execution information of the task to be executed of the in-vehicle device 10 may further include a first delay jitter range, a first reliability, a first trusted security environment, a first isolation, and the like. It can be understood that the first execution information of the task to be executed in the smart trip scene is used for describing the execution requirement when the task scheduling and processing module schedules and processes the task to be executed, the first execution information of the task to be executed depends on different scenes of actual application and different task types of the task to be executed, and the first execution information of the task to be executed in the smart trip scene is not particularly limited in the embodiment of the application.
S502: the scene perception module 41 acquires the task being executed of the in-vehicle apparatus 10 and the first execution information of the task being executed based on the current scene.
In some embodiments, the executing task and the first execution information of the executing task may be determined by the in-vehicle device 10 based on the current running scenario of the vehicle 100. The first execution information of the executing task of the in-vehicle device 10 may be first execution information of the executing task preset by the preset scheduling module 45.
In other embodiments, the first execution information of the executing task of the in-vehicle device 10 may further include a first delay jitter range, a first reliability, a first trusted security environment, a first isolation, and the like. It can be appreciated that the first execution information of the executing task in the smart trip scenario is used to describe the execution requirement of the executing task, which depends on different scenarios of actual application and different task types of the executing task, and the embodiment of the present application does not specifically limit the first execution information of the executing task in the smart trip scenario.
In some embodiments, the task being executed of the in-vehicle apparatus 10 may include one task or a plurality of tasks; the number of tasks being executed depends on the different scenarios of the actual application and is not specifically limited by the present application. For example, the current scenario is the scenario of the user driving the vehicle in fig. 1B, and based on this scenario the scene perception module 41 detects that the executing tasks include 4 tasks, which are respectively: a task of the instrument screen 15 displaying driving-track-related data, a vehicle data acquisition task of the driving-record camera 14, a navigation map task of the central control screen 13, and a task of acquiring voice data of the microphone 12. For example, the first execution information of the task of the instrument screen 15 displaying driving-track-related data includes at least one of: a first scheduling priority of the task, a first execution time range of the task, a first execution frequency range of the task, a first display resolution of the task, and a first display refresh rate of the task.
S503: the scene perception module 41 acquires the available system resources of the in-vehicle apparatus 10 based on the current scene.
In some embodiments, the scene perception module 41 obtains available system resources of the vehicle-mounted device 10 based on the current scene, where the available system resources of the vehicle-mounted device 10 are system resources that the vehicle-mounted device 10 can use to process the task to be executed. The available system resources of the in-vehicle apparatus 10 specifically include: comprehensive computing power, network bandwidth, etc. available to the system of the in-vehicle device 10. The comprehensive computing power available to the system of the in-vehicle apparatus 10 may include, in particular: memory capacity, central processor processing power, graphics processor processing power, etc.
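For illustration only, the resource categories just listed can be represented by a small data structure with a componentwise comparison, which later steps can use to check whether available resources meet predicted demand; the field names and the comparison rule are assumptions, not the patent's definition.

```python
# Illustrative sketch of available system resources and a simple "meets" test.
from dataclasses import dataclass

@dataclass
class SystemResources:
    memory_mb: float = 0.0
    cpu_capability: float = 0.0           # abstract units of CPU processing power
    gpu_capability: float = 0.0           # abstract units of GPU processing power
    network_bandwidth_mbps: float = 0.0

    def meets(self, required: "SystemResources") -> bool:
        return (self.memory_mb >= required.memory_mb
                and self.cpu_capability >= required.cpu_capability
                and self.gpu_capability >= required.gpu_capability
                and self.network_bandwidth_mbps >= required.network_bandwidth_mbps)
```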
S504: the scene perception module 41 acquires the system resources occupied by the executing task of the in-vehicle apparatus 10 based on the current scene.
In some embodiments, the context awareness module 41 obtains the system resources occupied by the executing task based on the current context. For example, when a task being executed is a single task, the system resources occupied by the single task may be acquired. When the executing task is a plurality of tasks, the system resources occupied by each task in the running task can be respectively acquired.
In some embodiments, the system resources occupied by the executing task of the in-vehicle device 10 may include at least one of: comprehensive computing power and network bandwidth.
S505: the scene perception module 41 predicts system resources required by the task to be executed according to the task to be executed of the vehicle-mounted device 10 and the first execution information of the task to be executed.
In some embodiments, the system resource required by the task to be executed may be a system resource required by the task scheduling and processing module 43 to schedule the task to be executed to implement the first execution information of the task to be executed.
It is to be understood that the system resources required for the task to be executed of the in-vehicle apparatus 10 may be the system resources required for the task scheduling and processing module 43 to schedule and process the task to be executed at the present time, or may be the system resources required for the task scheduling and processing module 43 to schedule and process the task to be executed in a preset time period in the future. It will be appreciated that the present application does not limit the specific time requirements of the system resources required for the task to be executed of the in-vehicle apparatus 10 according to the different task types of the task to be executed of the in-vehicle apparatus 10.
In some embodiments, the scene perception module 41 may predict the system resources required for the task to be performed according to the task to be performed and the first execution information of the task to be performed by a statistical rule algorithm, an expert system algorithm, a machine learning algorithm, and the like.
For example, the scene perception module 41 may predict, through a machine learning algorithm, the system resources required by the task to be executed according to the task to be executed and the first execution information of the task to be executed, and generate the predicted system resources required by the task to be executed. The machine learning algorithm may be obtained by the in-vehicle apparatus 10 through pre-training on a data set collected in advance. The pre-collected data set may be data on the system resources consumed during a preset period of time in the past when the in-vehicle apparatus 10 scheduled tasks to be executed and implemented their first execution information.
It will be appreciated that in embodiments of the present application, machine learning algorithms or the like may be employed to predict the system resources required for a task to be performed.
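As a hedged illustration of the statistical-rule option (not the source's algorithm), the sketch below estimates a task's CPU demand from the historical consumption of tasks of the same type, assuming demand scales roughly with the requested display refresh rate; a trained machine learning model could be substituted for this rule, and all names are illustrative.

```python
# Illustrative sketch of rule-based resource prediction from historical consumption.
from collections import defaultdict

class ResourcePredictor:
    def __init__(self):
        # history[task_type] -> list of (refresh_rate_fps, cpu_capability_used)
        self.history = defaultdict(list)

    def record(self, task_type: str, refresh_rate_fps: float, cpu_used: float):
        self.history[task_type].append((refresh_rate_fps, cpu_used))

    def predict_cpu(self, task_type: str, refresh_rate_fps: float) -> float:
        samples = self.history[task_type]
        per_fps = [cpu / fps for fps, cpu in samples if fps > 0]
        if not per_fps:
            return 0.0
        # Assume CPU demand scales roughly linearly with the requested refresh rate.
        return (sum(per_fps) / len(per_fps)) * refresh_rate_fps
```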
It should be understood that the task scheduling method of the present application is not limited to the execution sequence of the steps S501 to S505, and may also simultaneously execute the steps S501 and S502, for example, simultaneously obtain the task to be executed and the first execution information of the task to be executed, or execute the step S502 first and then execute the step S501, or execute the step S501 first and then execute the step S505, and then execute the steps S502 to S504. The execution order of steps S501 to S505 is not particularly limited in the present application.
S506: the scene perception module 41 sends the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources and the available system resources required by the task to be executed, and the system resources occupied by the task being executed of the in-vehicle device 10 to the target overall module 42.
S507: the objective orchestration module 42 determines whether the available system resources of the in-vehicle device 10 meet the predicted system resources required for the task to be performed by the in-vehicle device 10. In the case where the available system resources of the in-vehicle apparatus 10 satisfy the system resources required for the task to be performed of the in-vehicle apparatus 10, step S508 and step S511 are performed. In the case where the available system resources of the in-vehicle apparatus 10 do not satisfy the system resources required for the task to be performed of the in-vehicle apparatus 10, step S509, step S510, and step S511 are performed.
S508: in the case where the available system resources of the in-vehicle apparatus 10 satisfy the system resources required for the task to be performed of the in-vehicle apparatus 10, the target orchestration module 42 sends the task to be performed of the in-vehicle apparatus 10 and the first execution information of the task to be performed to the task scheduling and processing module 43.
S509: in the case where the available system resources of the in-vehicle apparatus 10 do not satisfy the system resources required for the task to be executed of the in-vehicle apparatus 10, the target orchestration module 42 adjusts the first execution information of the task being executed according to the task being executed and the first execution information of the task being executed, the system resources required for the task to be executed, the system resources occupied by the task being executed, and the available system resources of the in-vehicle apparatus 10, and generates the second execution information of the task being executed.
It will be understood that, in the case where the available system resources of the in-vehicle apparatus 10 do not meet the predicted system resources required by the task to be executed, if the tasks were simply scheduled according to their first execution information, the scheduling and processing of high-priority tasks would be satisfied first; as a result, a task to be executed with a relatively low priority might not be processed for a long period of time, which would affect the user experience. For example, the priority of the task of the head-up display 11 displaying navigation information is low; in the case that the available system resources do not meet the predicted system resources required by this task, the navigation information displayed on the head-up display 11 might not be refreshed for 2 minutes, so that it no longer matches the road conditions actually seen by the user, thereby affecting the user experience.
Thus, in some embodiments, in order not to affect the user experience, the goal orchestration module 42 may adjust the first execution information of the executing task to the second execution information in case the available system resources do not meet the predicted system resources required for the task to be executed of the in-vehicle device 10. The second execution information of the executing task may include at least one of: a second scheduling priority corresponding to the first scheduling priority of the executing task, a second execution time range corresponding to the first execution time range of the executing task, a second execution frequency range corresponding to the first execution frequency range of the executing task, a second display resolution corresponding to the first display resolution of the executing task, and a second display refresh rate corresponding to the first display refresh rate of the executing task.
For example, in the event that the available system resources do not meet the predicted system resources required for the task to be performed of the in-vehicle device 10, the target orchestration module 42 may adjust the first display refresh rate of at least one of the executing tasks to the second display refresh rate.
Specifically, in the scenario shown in fig. 1B, the executing tasks in the driving scenario of the vehicle 100 may include: the central control screen 13 displays a navigation map task, the camera 14 collects images and generates a driving record task, the instrument screen 15 displays the driving speed and distance task of the vehicle in real time, and the like. The first refresh rate of the navigation map task displayed by the central control screen 13 is 100fps. In the event that the available system resources do not meet the predicted system resources required for the task to be performed by the in-vehicle apparatus 10, the target orchestration module 42 may adjust the first refresh rate 100fps at which the center control screen 13 displays the navigation map task to the second display refresh rate 60fps.
It will be appreciated that the goal orchestration module 42 may adjust the first refresh rate of 100fps for the navigation map task displayed by the central control screen 13 to be 60fps for the second display refresh rate, so as to reduce system resources occupied by the navigation map task displayed by the central control screen 13, increase available resources of the system, ensure scheduling processing of the task to be executed, and improve user experience.
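For illustration, lowering an executing task's display refresh rate as in the 100fps to 60fps example above can be written as a one-line adjustment of the ExecutionInfo sketch given earlier; the values and names are assumptions.

```python
from dataclasses import replace

def downgrade_refresh_rate(first_info, second_rate_fps: float):
    # Returns a copy of the ExecutionInfo-like object with only the refresh rate lowered.
    return replace(first_info, display_refresh_rate_fps=second_rate_fps)

# e.g. second_info = downgrade_refresh_rate(first_info, 60.0)  # 100 fps -> 60 fps
```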
Similarly, in the event that the available system resources do not meet the predicted system resources required for the task to be performed by the in-vehicle device 10, the goal orchestration module 42 may also perform one or more of: the first execution time range of the navigation map task displayed by the central control screen 13 is adjusted to be the second execution time range, the first execution frequency range of the navigation map task displayed by the central control screen 13 is adjusted to be the second execution frequency range, or the first display resolution of the navigation map task displayed by the central control screen 13 is adjusted to be the second display resolution. Therefore, system resources occupied by the navigation map task displayed by the central control screen 13 are reduced, available system resources are increased, namely more available system resources are reserved for scheduling the task to be executed, task scheduling processing of the task to be executed is guaranteed, and user experience is improved.
It will be appreciated that the goal orchestration module 42 may also adjust the first execution information of other executing tasks, such as the task of the camera 14 collecting images and generating the driving record, or the task of the instrument screen 15 displaying the running speed and travelled distance of the vehicle in real time, to the corresponding second execution information, so as to reduce the system resources occupied by the executing tasks and increase the available system resources, that is, to reserve more available system resources for scheduling the task to be executed, thereby ensuring the task scheduling processing of the task to be executed and improving the user experience.
For example, in the event that the available system resources do not meet the predicted system resources required for the task to be performed of the in-vehicle device 10, the target orchestration module 42 may adjust the first scheduling priority of at least one of the executing tasks to the second scheduling priority.
Specifically, in the scenario shown in fig. 1B, the executing tasks in the driving scenario of the vehicle 100 may include: the central control screen 13 displays a navigation map task, the camera 14 collects images and generates a driving recording task, the instrument screen 15 displays the driving speed and distance of the vehicle in real time, and the task to be executed is a task of displaying real-time navigation information for the head-up display 11. Wherein, the first scheduling priority of the navigation map task displayed by the central control screen 13 is higher than the scheduling priority of the real-time navigation information task displayed by the head-up display 11. In the case of insufficient available system resources, the first scheduling priority of the navigation map task displayed by the central control screen 13 may be adjusted to the second scheduling priority. The second scheduling priority of the navigation map task displayed by the adjusted central control screen 13 is lower than the scheduling priority of the real-time navigation information task displayed by the head-up display 11.
It will be appreciated that the target overall module 42 adjusts the first scheduling priority of the navigation map task displayed by the central control screen 13 to the second scheduling priority, so that the task scheduling and processing module 43 preferentially schedules and processes the head-up display 11 to display the real-time navigation information task in the case of insufficient available resources of the system, thereby ensuring that the head-up display 11 displays the task scheduling process of the real-time navigation information task (task to be executed), and improving the user experience.
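A similar sketch for the priority adjustment follows: the executing task (central control screen navigation map) is demoted so that the to-be-executed task (head-up display real-time navigation) is scheduled first; the lower-number-means-higher-priority convention and the function name are assumptions.

```python
from dataclasses import replace

def demote_below(executing_info, to_execute_info):
    # Give the executing task a second scheduling priority that is strictly lower
    # (numerically larger) than the to-be-executed task's first scheduling priority.
    second_priority = max(executing_info.scheduling_priority,
                          to_execute_info.scheduling_priority + 1)
    return replace(executing_info, scheduling_priority=second_priority)
```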
In other embodiments of the present application, the goal orchestration module 42 may also shut down the low priority executing task based on the first execution information of the executing task in the event that the available system resources of the in-vehicle device 10 do not meet the predicted system resources required for the task to be executed of the in-vehicle device 10. For example, if the executing task includes a voice recognition task, the target overall module 42 may reduce system resources occupied by the executing task by closing the voice recognition task when the user temporarily does not use the voice recognition function of the vehicle 100 or the priority of the voice recognition task is low, so as to release more available system resources for the scheduling process of the task to be executed, thereby ensuring the scheduling process of the task to be executed and improving the user experience.
It is to be understood that, in step S505, the specific time requirement of the system resource required for the task to be executed of the in-vehicle apparatus 10 may be the system resource required for the task scheduling and processing module 43 to schedule and process the task to be executed at the present time, or may be the system resource required for the task scheduling and processing module 43 to schedule and process the task to be executed in a preset time period in the future. Therefore, the corresponding system available resource may be a system available resource at the current time, or may be a system available resource within a preset time period in the future, and in the case where the system available resource does not satisfy the system resource required by the task to be executed at the corresponding time, the goal orchestration module 42 may adjust the first execution information of the task being executed to the second execution information of the task being executed at the corresponding time or when the corresponding time is about to come. It will be appreciated that the adjustment time depends on different scenarios of the actual application and different task types of the tasks to be performed, and the present application does not limit the specific time for the target orchestration module 42 to adjust the first execution information of the executing task to the second execution information of the executing task.
In other embodiments, the second execution information corresponding to the first execution information of the executing task of the in-vehicle apparatus 10 (the second execution information of the executing task of the in-vehicle apparatus 10) may further include a second delay jitter range, a second reliability, a second trusted security environment, a second isolation degree, and the like. It can be appreciated that the second execution information of the executing task in the smart trip scenario is used to describe the execution requirement of the task scheduling and processing module when the executing task is scheduled and processed. Compared with the task scheduling and processing module which schedules and processes the executing task according to the first executing information of the executing task, the task scheduling and processing module is more beneficial to achieving the target time delay or other executing targets of the task to be executed when scheduling and processing the executing task according to the second executing information of the executing task. The second execution information of the executing task depends on different scenes of the actual application and different task types of the executing task, and the embodiment of the application does not specifically limit the second execution information of the executing task in the smart trip scene.
In some embodiments, in a case where there are a plurality of tasks to be executed, and the available system resources of the in-vehicle apparatus 10 do not meet the predicted system resources required by the tasks to be executed of the in-vehicle apparatus 10, the target overall module 42 may send each task to be executed and the first execution information corresponding to each task to be executed to the task scheduling and processing module 43, and the task scheduling and processing module 43 performs scheduling processing on each task to be executed according to each task to be executed and the first execution information corresponding to each task to be executed.
In other embodiments, in a case where there are a plurality of tasks to be executed, and the available system resources of the vehicle-mounted device 10 do not meet the predicted system resources required by the tasks to be executed of the vehicle-mounted device 10, the target overall module 42 may also send the tasks to be executed with higher priority and the first execution information corresponding to the tasks to be executed with higher priority to the task scheduling and processing module 43 according to the first scheduling priority corresponding to each task to be executed of the vehicle-mounted device 10, where the task scheduling and processing module 43 performs the scheduling processing on the tasks to be executed with higher priority preferentially according to the tasks to be executed with higher priority and the corresponding first execution information thereof. Under the condition that the available system resources of the vehicle-mounted device 10 meet the predicted system resources required by the tasks to be executed of the vehicle-mounted device 10, the tasks to be executed with lower priority and the first execution information corresponding to the tasks to be executed with lower priority are sent to the task scheduling and processing module 43, and the task scheduling and processing module 43 performs scheduling processing on the tasks to be executed with lower priority according to the tasks to be executed with lower priority and the corresponding first execution information.
Specifically, for example, in the scenario shown in fig. 1B, the tasks to be performed in the driving scenario of the vehicle 100 may include: the head-up display 11 displays navigation information tasks and the assistant screen 16 displays image tasks, wherein the first scheduling priority of the head-up display 11 displaying navigation information tasks is higher than the first scheduling priority of the assistant screen 16 displaying image tasks. In the case where the available system resources of the in-vehicle apparatus 10 do not satisfy the predicted system resources required for the head-up display 11 of the in-vehicle apparatus 10 to display the navigation information task and the co-driver screen 16 to display the image task, the target overall module 42 may send the first execution information of the navigation information task displayed by the head-up display 11 and the navigation information task displayed by the head-up display 11 to the task scheduling and processing module 43, and the task scheduling and processing module 43 may schedule the navigation information task displayed by the head-up display 11 preferentially according to the navigation information task displayed by the head-up display 11 and the first execution information of the navigation information task displayed by the head-up display 11. In the case where the available system resources of the in-vehicle apparatus 10 satisfy the predicted system resources required for the head-up display 11 of the in-vehicle apparatus 10 to display the navigation information task and the image task displayed by the co-pilot screen 16, the target orchestration module 42 then sends the first execution information of the image task displayed by the co-pilot screen 16 and the image task displayed by the co-pilot screen 16 to the task scheduling and processing module 43, and the task scheduling and processing module 43 performs scheduling processing on the image task displayed by the co-pilot screen 16 according to the first execution information of the image task displayed by the co-pilot screen 16 and the image task displayed by the co-pilot screen 16.
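The multi-task case just described can be sketched as dispatching in order of first scheduling priority: the most urgent task is handed over immediately, and lower-priority tasks are deferred until the available resources suffice. The function and parameter names are illustrative only, and the sketch builds on the SystemResources helper above.

```python
def dispatch_in_priority_order(tasks_with_info, available, predict_required, send):
    """tasks_with_info: list of (task, first_execution_info); lower priority value = more urgent.
    `send` hands a task over to the task scheduling and processing module."""
    ordered = sorted(tasks_with_info, key=lambda pair: pair[1].scheduling_priority)
    if not ordered:
        return []
    send(*ordered[0])                       # the highest-priority task is dispatched first
    deferred = []
    for task, info in ordered[1:]:
        if available.meets(predict_required(task, info)):
            send(task, info)
        else:
            deferred.append((task, info))   # dispatched later, once resources suffice
    return deferred
```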
S510: the objective orchestration module 42 sends the task to be executed by the in-vehicle apparatus 10 and the first execution information of the task to be executed to the task scheduling and processing module 43.
S511: the task scheduling and processing module 43 performs scheduling processing on the task to be executed according to the first execution information of the task to be executed, and generates a processing result of the task to be executed.
In some embodiments, in a case where the available system resources satisfy the predicted system resources required for the task to be executed of the in-vehicle apparatus 10, the task scheduling and processing module 43 performs scheduling processing on the task to be executed according to the first execution information of the task being executed and the first execution information of the task to be executed, and generates a processing result of the task to be executed. In the case where the available system resources do not satisfy the predicted system resources required by the task to be executed of the in-vehicle apparatus 10, the target orchestration module 42 adjusts the first execution information of the task to be executed according to the first execution information of the task to be executed and the task to be executed of the in-vehicle apparatus 10, the first execution information of the task to be executed, the system resources required by the task to be executed, the system resources occupied by the task to be executed, and the available system resources, and generates the second execution information corresponding to the first execution information of the task to be executed. The task scheduling and processing module 43 performs scheduling processing on the task to be executed according to the second execution information of the task being executed and the first execution information of the task to be executed, and generates a processing result of the task to be executed.
S512: the prediction and compensation module 44 acquires the processing result of the task to be executed from the task scheduling and processing module 43.
S513: the prediction and compensation module 44 adjusts the first execution information of the executing task or the second execution information of the executing task to the third execution information of the executing task in the case where the processing result of the task to be executed does not satisfy the first execution information of the task to be executed.
In some embodiments, prediction and compensation module 44 adjusts the first execution information of the executing task or the second execution information of the executing task to the third execution information of the executing task if the processing result of the executing task does not satisfy the first execution information of the executing task. Wherein the third execution information of the executing task includes at least one of: a third scheduling priority of the executing task, a third execution time range of the executing task, a third execution frequency range of the executing task, a third display resolution of the executing task, a third display refresh rate of the executing task.
To ensure the user experience, in the case where the available system resources satisfy the predicted system resources required by the task to be executed of the in-vehicle apparatus 10 but the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 may adjust the first execution information of the executing task to the third execution information of the executing task.
For example, in the scenario shown in fig. 1B, the task to be executed is the task of the head-up display 11 displaying real-time navigation information, and the executing task is the task of the central control screen 13 displaying a navigation map, wherein the first display refresh rate of the head-up display 11 displaying real-time navigation information is 60fps, and the first display refresh rate of the central control screen 13 displaying the navigation map is 100fps.
Since the available system resources meet the predicted system resources required by the task to be executed of the vehicle-mounted device 10, but the processing result of the task to be executed does not meet the first execution information of the task to be executed (specifically, the processing result of the head-up display 11 displaying the real-time navigation information task is a display refresh rate of 20fps, which does not meet the first display refresh rate of 60fps for this task), the prediction and compensation module 44 may adjust the first display refresh rate (100fps) at which the central control screen 13 displays the navigation map task to a third display refresh rate (60fps).
It will be appreciated that the prediction and compensation module 44 adjusts the first display refresh rate of 100fps of the navigation map task displayed by the central control screen 13 to the third display refresh rate of 60fps, so as to reduce the system resources occupied by the navigation map task displayed by the central control screen 13, increase the available resources of the system, ensure the scheduling processing of the real-time navigation information task displayed by the head-up display 11, and improve the user experience.
In other embodiments, to ensure a user experience, in a case where the available system resources do not satisfy the system resources required for the predicted task to be performed of the in-vehicle apparatus 10 and the processing result of the task to be performed does not satisfy the first execution information of the task to be performed, the prediction and compensation module 44 may adjust the first execution information of the task being performed to the third execution information of the task being performed.
For example, in the scenario shown in fig. 1B, the task to be executed is the task of the head-up display 11 displaying real-time navigation information, and the executing task is the task of the central control screen 13 displaying a navigation map, wherein the first display refresh rate of the head-up display 11 displaying real-time navigation information is 60fps, and the second display refresh rate of the central control screen 13 displaying the navigation map is 80fps. Since the available system resources do not satisfy the predicted system resources required by the task to be executed of the vehicle-mounted device 10, and the processing result of the task to be executed does not satisfy the first execution information of the task to be executed (specifically, the processing result of the head-up display 11 displaying the real-time navigation information task is a display refresh rate of 20fps, which does not satisfy the first display refresh rate of 60fps for this task), the prediction and compensation module 44 may adjust the second display refresh rate (80fps) at which the central control screen 13 displays the navigation map task to a third display refresh rate (30fps).
It will be appreciated that the prediction and compensation module 44 adjusts the second display refresh rate of 80fps of the navigation map task displayed by the central control screen 13 to the third display refresh rate of 30fps, so as to reduce the system resources occupied by the navigation map task displayed by the central control screen 13, increase the available resources of the system, ensure the scheduling processing of the real-time navigation information task displayed by the head-up display 11, and improve the user experience.
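The compensation step in both examples can be sketched as one further downgrade of the executing task's refresh rate whenever the processing result of the to-be-executed task falls short of its first execution information; the numeric thresholds in the comment are taken from the examples above, and the function and parameter names are illustrative only.

```python
from dataclasses import replace

def compensate(executing_info, result_rate_fps: float, target_rate_fps: float,
               third_rate_fps: float):
    if result_rate_fps >= target_rate_fps:
        return executing_info            # the to-be-executed task's first execution information is met
    # e.g. 100 fps (first) -> 60 fps (third), or 80 fps (second) -> 30 fps (third)
    return replace(executing_info, display_refresh_rate_fps=third_rate_fps)
```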
In other embodiments, the third execution information of the executing task of the in-vehicle device 10 may further include a third delay jitter range, a third reliability, a third trusted security environment, a third isolation, and the like. It can be appreciated that the third execution information of the executing task in the smart trip scenario is used to describe the execution requirement of the task scheduling and processing module when the executing task is scheduled and processed. Compared with the task scheduling and processing module which schedules and processes the executing task according to the first executing information or the second executing information of the executing task, the task scheduling and processing module is more beneficial to achieving the target time delay or other executing targets of the task to be executed when scheduling and processing the executing task according to the third executing information of the executing task. The third execution information of the executing task depends on different scenes of the actual application and different task types of the executing task, and the embodiment of the application does not specifically limit the third execution information of the executing task in the smart trip scene.
In other embodiments of the present application, in the case that the processing result of the task to be executed does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 may further increase the operating frequency of the central processing unit to increase its processing capability, increase the operating frequency of the graphics processor to increase its processing capability, and increase the operating frequency of the memory to improve the data reading capability, thereby further increasing the available system resources of the vehicle-mounted device 10, ensuring the scheduling processing of the task to be executed, and improving the user experience.
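As a platform-dependent illustration of raising processor frequency (not the mechanism claimed here), on a Linux-based device one familiar way to increase CPU processing capability is to switch the cpufreq governor to "performance"; the sysfs paths and required privileges vary by platform, and this is only an example of increasing available system resources.

```python
from pathlib import Path

def boost_cpu_frequency():
    # Switch every CPU's cpufreq governor to "performance" (usually requires root privileges).
    for governor in Path("/sys/devices/system/cpu").glob("cpu[0-9]*/cpufreq/scaling_governor"):
        try:
            governor.write_text("performance")
        except (PermissionError, OSError):
            pass
```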
In other embodiments of the present application, the prediction and compensation module 44 may also shut down an executing task of low priority based on the first execution information or the second execution information of the executing task in case the processing result of the task to be executed does not satisfy the first execution information of the task to be executed. Therefore, the system resources occupied by the executing task are reduced, more available system resources are released for the scheduling processing of the task to be executed, the scheduling processing of the task to be executed is ensured, and the user experience is improved.
As is apparent from the above description, the scene perception module 41 obtains, according to the current scene, the task to be executed of the in-vehicle apparatus 10 and the first execution information of the task to be executed, the task being executed of the in-vehicle apparatus 10 and the first execution information of the task being executed, the available system resources of the in-vehicle apparatus 10, and the system resources occupied by the task being executed. The scene perception module 41 further predicts the system resources required by the task to be executed according to the task to be executed and the first execution information of the task to be executed. The goal orchestration module 42 determines whether the available system resources of the in-vehicle apparatus 10 satisfy the predicted system resources required by the task to be executed of the in-vehicle apparatus 10; in the case where they do, the goal orchestration module 42 sends the task to be executed of the in-vehicle apparatus 10 and the first execution information of the task to be executed to the task scheduling and processing module 43. In the case where the available system resources of the in-vehicle apparatus 10 do not satisfy the predicted system resources required by the task to be executed, the goal orchestration module 42 adjusts the first execution information of the task being executed to the corresponding second execution information, so as to more closely meet the objectives, including, but not limited to, reducing the system resources occupied by the task being executed to increase the available system resources of the in-vehicle device 10, or marking the task to be executed so that it enters a particular dispatch queue to meet a latency objective, or other measures. Therefore, the scheduling processing of the task to be executed is ensured, and the user experience is improved.
Then, the goal orchestration module 42 sends the task to be executed of the in-vehicle apparatus 10 and the first execution information of the task to be executed to the task scheduling and processing module 43. The task scheduling and processing module 43 performs scheduling processing on the task to be executed of the in-vehicle apparatus 10 to generate a processing result of the task to be executed. In the case where the processing result of the task to be executed of the in-vehicle apparatus 10 does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 adjusts the first execution information or the second execution information of the task being executed of the in-vehicle apparatus 10 to the third execution information of the task being executed. This further meets the objectives, including but not limited to reducing the system resources occupied by the task being executed to increase the available system resources of the in-vehicle device 10, or marking the task to be executed so that it enters a particular dispatch queue to meet a latency objective, or other measures. The scheduling processing of the tasks to be executed of the vehicle-mounted equipment 10 is thus ensured, and the user experience is improved.
It can be understood that, by running the task scheduling device, the goal orchestration module 42 purposefully schedules and processes the task to be executed of the vehicle-mounted device 10 according to the tasks and the task-related information acquired by the scene perception module 41, and, in the case that the available system resources are insufficient, actively adjusts the first execution information of the task being executed of the vehicle-mounted device 10 on the premise of not affecting the user experience, so that the resource consumption of the task being executed is reduced, the scheduling processing of the task to be executed of the vehicle-mounted device 10 is ensured, and the user experience is improved. In addition, when the processing result of the task to be executed of the vehicle-mounted device 10 does not meet the first execution information of the task to be executed, the prediction and compensation module 44 actively adjusts the first execution information or the second execution information of the task being executed of the vehicle-mounted device 10, so that the resource consumption of the task being executed is further reduced, the scheduling processing of the task to be executed is ensured, and the user experience is improved.
Based on the smart home scenario of fig. 2, the following describes in detail the execution process of another task scheduling method according to the present application through the interaction process between the modules in the task scheduling device of fig. 4.
Fig. 6 shows an interaction diagram of another task scheduling method, and in contrast to the task scheduling method of fig. 5, the task scheduling method of the present application may also be applied to the distributed system 20, and specifically includes:
S601: The scene perception module 41 obtains the task to be executed of the distributed system 20 and the first execution information of the task to be executed based on the current scene.
In some embodiments, the tasks to be performed by the distributed system 20 may be generated based on user operations or actions of the user. The first execution information of the task to be executed may be first execution information of the task to be executed preset by the preset scheduling module 45.
In some embodiments, as in the smart home scenario shown in fig. 2, the smart large screen 201, smart speaker 202, camera 203, VR glasses 204, mobile phone 206, etc. form the distributed system 20. The current scene may be a scene where user 1 is turning on the VR glasses 204, user 2 is talking to the smart speaker 202, and user 3 is watching the smart large screen 201 play an educational website video, etc. It may also be the scene of user 4 browsing a web page using the mobile phone 206. It will be appreciated that the current scene is not particularly limited in the present application.
For example, the current scene is a scene in which user 1 in the smart home scenario of fig. 2 wears the VR glasses 204 and turns around; based on this scene, the distributed system 20 detects the turning-around action of user 1, and the task to be executed of the distributed system 20 acquired by the scene perception module 41 may be the 360-degree surround-view video display task of the VR glasses 204. The first execution information of the 360-degree surround-view video display task of the VR glasses 204 acquired by the scene perception module 41 includes at least one of the following: a first scheduling priority of the task, a first execution time range of the task, a first execution frequency range of the task, a first display resolution of the task, and a first display refresh rate of the task. For example, the first display resolution of the 360-degree surround-view video display task of the VR glasses 204 is 16K resolution.
In other embodiments, the current scene may also be a scene in which user 3 starts the smart large screen 201 to display an educational website video. Based on this scene, the distributed system 20 detects that user 3 has started the smart large screen 201, and the task to be executed of the distributed system 20 acquired by the scene perception module 41 may be the educational website video playing task of the smart large screen 201. It is to be understood that the task to be executed in the smart home scenario depends on the actual application scene and the type of the task, and the present application does not specifically limit the task to be executed in the smart home scenario.
In other embodiments, the first execution information of the task to be executed of the distributed system 20 may further include a first delay jitter range, a first reliability, a first trusted security environment, a first isolation, and the like. It can be appreciated that the first execution information of the task to be executed of the distributed system 20 is used to describe the execution requirement when the task scheduling and processing module schedules and processes the task to be executed, and the first execution information of the task to be executed is not specifically limited in the embodiment of the present application according to different scenarios of actual application and different task types of the task to be executed.
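To make the composition of the execution information more concrete, the following Python sketch models it as a simple data structure. The field names and example values are illustrative assumptions introduced only for this description; the application itself does not define a concrete data format.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ExecutionInfo:
    """Illustrative container for the (first/second/third) execution information of a task.

    Any field may be absent (None), because the execution information
    includes "at least one of" the items listed in the text above.
    """
    scheduling_priority: Optional[int] = None                     # e.g. first scheduling priority
    execution_time_range_ms: Optional[Tuple[int, int]] = None     # execution time range
    execution_frequency_hz: Optional[Tuple[float, float]] = None  # execution frequency range
    display_resolution: Optional[str] = None                      # e.g. "16K"
    display_refresh_rate_fps: Optional[int] = None                # e.g. 100
    delay_jitter_range_ms: Optional[Tuple[int, int]] = None       # delay jitter range
    reliability: Optional[float] = None                           # reliability requirement
    trusted_security_environment: Optional[bool] = None           # trusted security environment
    isolation: Optional[bool] = None                              # isolation requirement

# Example: first execution information of the 360-degree video display task
# of the VR glasses 204 (16K resolution, as quoted in the text).
vr_display_first_info = ExecutionInfo(scheduling_priority=1, display_resolution="16K")
```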
S602: the scene perception module 41 obtains the task being executed and the first execution information of the task being executed of the distributed system 20 based on the current scene.
In some embodiments, the executing task and the first execution information of the executing task of the distributed system 20 may be determined based on the current application scenario of the distributed system 20. The first execution information of the executing task of the distributed system 20 may be first execution information of the executing task preset by the preset scheduling module 45.
In some embodiments, the executing tasks of the distributed system 20 that are acquired may include one task or a plurality of tasks, depending on the actual application scenario; the present application does not specifically limit the number of executing tasks.
For example, the current scene is the smart home scenario of fig. 2. Based on this scenario, the scene perception module 41 may detect that the executing tasks include 4 tasks, namely: the educational website video display task of the smart large screen 201, the voice recognition task of the smart speaker 202, the web page refreshing task of the mobile phone 206, and the monitoring task of the camera 203. For example, the first execution information of the educational website video display task of the smart large screen 201 includes at least one of: a first scheduling priority of the task, a first execution time range of the task, a first execution frequency range of the task, a first display resolution of the task, and a first display refresh rate of the task. For example, the first display refresh rate of the educational website video display task of the smart large screen 201 is 100 fps.
S603: the scene perception module 41 obtains available system resources of the distributed system 20 based on the current scene.
In some embodiments, the scene perception module 41 obtains the available system resources of the distributed system 20 based on the current scene, where the available system resources of the distributed system 20 are the combined system resources that each electronic device in the distributed system 20 can use to process the task to be executed. The available system resources of the distributed system 20 specifically include the comprehensive computing power available to the system of the distributed system 20, network bandwidth, and the like. The comprehensive computing power available to the system of the distributed system 20 may specifically include memory capacity, central processor processing power, graphics processor processing power, and the like.
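A minimal sketch, under stated assumptions, of how the available system resources of the distributed system 20 might be aggregated from the devices it contains. The SystemResources fields and the per-device reporting interface are illustrative assumptions rather than structures defined by the application.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SystemResources:
    """Illustrative resource vector: comprehensive computing power plus network bandwidth."""
    memory_mb: float = 0.0
    cpu_gflops: float = 0.0
    gpu_gflops: float = 0.0
    network_bandwidth_mbps: float = 0.0

def aggregate_available(per_device_free: List[SystemResources]) -> SystemResources:
    """Sum the free resources reported by every device in the distributed system."""
    total = SystemResources()
    for free in per_device_free:
        total.memory_mb += free.memory_mb
        total.cpu_gflops += free.cpu_gflops
        total.gpu_gflops += free.gpu_gflops
        total.network_bandwidth_mbps += free.network_bandwidth_mbps
    return total

# e.g. free resources reported by two devices of the distributed system
available = aggregate_available([
    SystemResources(memory_mb=2048, gpu_gflops=150, network_bandwidth_mbps=300),
    SystemResources(memory_mb=1024, gpu_gflops=80, network_bandwidth_mbps=100),
])
```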
S604: the scene perception module 41 obtains system resources occupied by the executing task of the distributed system 20 based on the current scene.
In some embodiments, the context awareness module 41 obtains the system resources occupied by the executing task of the distributed system 20 based on the current context. For example, the executing task of the distributed system 20 may be a single task, and the system resources occupied by the single task of the distributed system 20 may be obtained. When the executing task of the distributed system 20 is a plurality of tasks, system resources occupied by each task in the executing tasks of the distributed system 20 may be acquired respectively.
In some embodiments, the system resources occupied by the executing tasks of distributed system 20 may include at least one of: comprehensive computing power and network bandwidth.
S605: the scene perception module 41 predicts system resources required by the task to be executed of the distributed system 20 according to the task to be executed of the distributed system 20 and the first execution information of the task to be executed. The specific content refers to step S505, and will not be described herein.
In some embodiments, the system resources required by the task to be executed of the distributed system 20 may be the system resources required by the task scheduling and processing module 43 to schedule the task to be executed to implement the first execution information of the task to be executed.
It should be understood that the system resources required by the task to be executed of the distributed system 20 may be the system resources required for the task scheduling and processing module 43 to schedule and process the task to be executed at the current moment, or the system resources required for the task scheduling and processing module 43 to schedule and process the task to be executed within a preset time period in the future. For example, if the task to be executed is the 360-degree video display task of the VR glasses 204, the system resources required by this task may be the system resources required for the task scheduling and processing module 43 to schedule the task so as to achieve 16K display resolution 10 ms from now. The present application does not limit the specific time requirements of the system resources required by the task to be executed of the distributed system 20.
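The application does not prescribe a particular prediction algorithm. The sketch below is a toy estimator, assuming a per-task baseline scaled by the requested resolution and refresh rate, of how the resources required by the task to be executed could be derived from its first execution information for the current moment or a future time window. Task names and all numeric values are assumptions for illustration.

```python
from typing import Dict, Optional

def predict_required_resources(task_type: str,
                               display_resolution: Optional[str],
                               refresh_rate_fps: Optional[int],
                               horizon_ms: int = 0) -> Dict[str, float]:
    """Estimate the system resources a task to be executed will need.

    horizon_ms = 0 means "at the current moment"; a positive value means
    "within the next horizon_ms milliseconds".  In this toy model the
    estimate itself does not depend on the horizon.
    """
    baselines = {
        "360_video_display": {"gpu_gflops": 200.0, "network_bandwidth_mbps": 400.0},
        "video_call":        {"gpu_gflops": 30.0,  "network_bandwidth_mbps": 8.0},
    }
    baseline = baselines.get(task_type, {"gpu_gflops": 10.0, "network_bandwidth_mbps": 1.0})

    resolution_scale = {"2K": 1.0, "4K": 2.0, "8K": 4.0, "16K": 8.0}.get(display_resolution, 1.0)
    rate_scale = (refresh_rate_fps or 60) / 60.0

    return {name: value * resolution_scale * rate_scale for name, value in baseline.items()}

# e.g. resources the 16K / 90 fps VR display task would need 10 ms from now
demand = predict_required_resources("360_video_display", "16K", 90, horizon_ms=10)
```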
It should be understood that the task scheduling method of the present application is not limited to the execution order of steps S601 to S605. For example, steps S601 and S602 may be executed simultaneously, that is, the task to be executed and the first execution information of the task to be executed may be acquired at the same time as the executing task and the first execution information of the executing task; or step S602 may be executed before step S601; or step S601 may be executed first, then step S605, and then steps S602 to S604. The execution order of steps S601 to S605 is not specifically limited in the present application.
S606: the scenario awareness module 41 sends the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources required and available for the task to be executed, and the system resources occupied by the task being executed of the distributed system 20 to the goal orchestration module 42.
S607: the goal orchestration module 42 determines whether the available system resources of the distributed system 20 meet the predicted system resources required for the tasks to be performed by the distributed system 20. In the case where the available system resources of the distributed system 20 satisfy the system resources required for the task to be performed of the distributed system 20, step S608 and step S611 are performed. In the case where the available system resources of the distributed system 20 do not satisfy the system resources required for the task to be performed of the distributed system 20, step S609, step S610, and step S611 are performed.
S608: in case the available system resources of the distributed system 20 meet the system resources required for the tasks to be performed of the distributed system 20, the target orchestration module 42 sends the tasks to be performed of the distributed system 20 and the first execution information of the tasks to be performed to the task scheduling and processing module 43.
S609: in the case that the available system resources of the distributed system 20 do not satisfy the system resources required by the tasks to be performed of the distributed system 20, the target orchestration module 42 adjusts the first execution information of the tasks to be performed according to the tasks to be performed and the first execution information of the tasks to be performed, the system resources required by the tasks to be performed, the system resources occupied by the tasks to be performed, and the available system resources of the distributed system 20 by the distributed system 20, and generates the second execution information of the tasks to be performed.
In some embodiments, in order not to affect the user experience, the goal orchestration module 42 may adjust the first execution information of the executing tasks of the distributed system 20 to corresponding second execution information in the case that the available system resources do not meet the predicted system resources required by the task to be executed of the distributed system 20. The second execution information of the executing task of the distributed system 20 may include at least one of: a second scheduling priority corresponding to the first scheduling priority of the executing task, a second execution time range corresponding to the first execution time range of the executing task, a second execution frequency range corresponding to the first execution frequency range of the executing task, a second display resolution corresponding to the first display resolution of the executing task, and a second display refresh rate corresponding to the first display refresh rate of the executing task. The specific content refers to S509 of fig. 5, and will not be described herein.
It should be understood that, as noted in step S605, the system resources required by the task to be executed of the distributed system 20 may be those required for the task scheduling and processing module 43 to schedule and process the task at the current moment, or those required within a preset time period in the future. Accordingly, the corresponding available system resources may be the available system resources at the current time, or the available system resources within that future time period. When the corresponding time arrives or is about to arrive, and the available system resources do not satisfy the system resources required by the task to be executed at that time, the goal orchestration module 42 adjusts the first execution information of the executing task to the corresponding second execution information. It will be appreciated that the adjustment time depends on the actual application scene and the type of the task to be executed, and the present application does not limit the specific time at which the goal orchestration module 42 adjusts the first execution information of the executing task to the corresponding second execution information.
In other embodiments, the second execution information corresponding to the first execution information of the executing task of the distributed system 20 (that is, the second execution information of the executing task of the distributed system 20) may further include a second delay jitter range, a second reliability, a second trusted security environment, a second isolation, and the like. It can be appreciated that the second execution information of the executing task in the smart home scenario describes the execution requirements used when the task scheduling and processing module schedules and processes the executing task. Compared with scheduling the executing task according to its first execution information, scheduling it according to its second execution information is more helpful for the task to be executed to achieve its target delay or other execution targets. The second execution information of the executing task depends on the actual application scene and the type of the executing task, and the embodiment of the present application does not specifically limit the second execution information of the executing task in the smart home scenario.
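As a concrete, deliberately simplified illustration of step S609, the sketch below lowers the display refresh rate of the executing tasks until the freed resources cover the predicted demand of the task to be executed. The degradation rule and the assumption that GPU occupancy scales linearly with the refresh rate are illustrative; the application leaves the concrete adjustment policy open.

```python
from typing import Dict, List

def meets(available: Dict[str, float], required: Dict[str, float]) -> bool:
    """True if every required resource dimension is covered by the available resources."""
    return all(available.get(name, 0.0) >= amount for name, amount in required.items())

def derive_second_execution_info(executing_tasks: List[Dict],
                                 available: Dict[str, float],
                                 required: Dict[str, float]) -> List[Dict]:
    """Adjust the first execution information of executing tasks to second execution information.

    Toy policy: halve the refresh rate of each executing task (down to a floor
    of 24 fps) until the available resources satisfy the predicted demand.
    """
    adjusted = []
    for first_info in executing_tasks:
        if meets(available, required):
            adjusted.append(first_info)            # enough resources already freed
            continue
        second_info = dict(first_info)             # second execution information starts as a copy
        old_rate = second_info.get("refresh_rate_fps", 60) or 60
        second_info["refresh_rate_fps"] = max(old_rate // 2, 24)
        freed_fraction = 1.0 - second_info["refresh_rate_fps"] / old_rate
        # assumption: the GPU load a task occupies scales linearly with its refresh rate
        available["gpu_gflops"] = available.get("gpu_gflops", 0.0) + \
            first_info.get("occupied_gpu_gflops", 0.0) * freed_fraction
        adjusted.append(second_info)
    return adjusted
```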
S610: the goal orchestration module 42 sends the tasks to be performed of the distributed system 20 and the first execution information of the tasks to be performed to the task scheduling and processing module 43.
S611: the task scheduling and processing module 43 performs scheduling processing on the task to be executed according to the first execution information of the task to be executed of the distributed system 20, and generates a processing result of the task to be executed of the distributed system 20.
S612: the prediction and compensation module 44 obtains the processing results of the tasks to be performed of the distributed system 20 from the task scheduling and processing module 43.
S613: the prediction and compensation module 44 adjusts the first execution information of the executing task of the distributed system 20 or the second execution information of the executing task of the distributed system 20 to the third execution information of the executing task in the case where the processing result of the executing task of the distributed system 20 does not satisfy the first execution information of the executing task of the distributed system 20. The specific content refers to step S513, and will not be described here in detail.
As can be seen from the above description, the scene perception module 41 obtains, according to the current scene, the task to be executed of the distributed system 20 and the first execution information of the task to be executed, the executing task of the distributed system 20 and the first execution information of the executing task, the available system resources of the distributed system 20, and the system resources occupied by the executing task. The scene perception module 41 further predicts the system resources required by the task to be executed according to the task to be executed and the first execution information of the task to be executed. The goal orchestration module 42 determines whether the available system resources of the distributed system 20 meet the predicted system resources required by the task to be executed of the distributed system 20. If they do, the goal orchestration module 42 sends the task to be executed of the distributed system 20 and the first execution information of the task to be executed to the task scheduling and processing module 43. If they do not, the goal orchestration module 42 adjusts the first execution information of the executing task to the corresponding second execution information. This helps meet the execution goals, including but not limited to reducing the system resources occupied by the executing tasks so as to increase the available system resources of the distributed system 20, or marking the task to be executed to enter a particular scheduling queue so that its latency target is met, among other measures. The scheduling of the task to be executed is thus ensured, and the user experience is improved.
The goal orchestration module 42 then sends the task to be executed of the distributed system 20 and the first execution information of the task to be executed to the task scheduling and processing module 43. The task scheduling and processing module 43 schedules and processes the task to be executed of the distributed system 20 and generates a processing result of the task to be executed. In the case that the processing result of the task to be executed of the distributed system 20 does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 adjusts the first execution information of the executing task, or the second execution information of the executing task, of the distributed system 20 to the third execution information of the executing task. This further reduces the system resources occupied by the executing tasks and increases the available system resources of the distributed system 20, further ensuring the scheduling of the task to be executed and improving the user experience.
It can be understood that, by operating the task scheduling device, the goal orchestration module 42 purposefully schedules and processes the task to be executed according to the tasks and task-related information acquired by the scene perception module 41, and, when the available system resources are insufficient, actively adjusts the first execution information of the executing task of the distributed system 20 on the premise of not affecting the user experience, thereby reducing the resource consumption of the executing task, ensuring the scheduling of the task to be executed of the distributed system 20, and improving the user experience. Moreover, in the case that the processing result of the task to be executed of the distributed system 20 does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 actively adjusts the first execution information or the second execution information of the executing task so as to further meet the execution goals, including but not limited to reducing the system resources occupied by the executing tasks so as to increase the available system resources of the distributed system 20, or marking the task to be executed to enter a specific scheduling queue so that its latency target is met, among other measures, thereby ensuring the scheduling of the task to be executed of the distributed system 20 and improving the user experience.
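Putting the modules of fig. 4 together, one possible control loop corresponding to steps S601 to S613 is sketched below. The module interfaces (perceive, meets, derive_second_info, run, derive_third_info) are hypothetical names introduced only to summarise the flow; the application does not define these APIs.

```python
def scheduling_pass(scene_perception, goal_orchestration, task_scheduler, compensator):
    """One illustrative pass of the S601-S613 flow (all interfaces are assumed)."""
    # S601-S605: gather tasks, execution information, and resource figures
    snapshot = scene_perception.perceive()

    # S607-S609: if the available resources cannot cover the predicted demand,
    # adjust the executing tasks' first execution information to second execution information
    if not goal_orchestration.meets(snapshot.available, snapshot.required):
        snapshot.executing_infos = goal_orchestration.derive_second_info(snapshot)

    # S608/S610-S611: schedule the task to be executed with its first execution information
    result = task_scheduler.run(snapshot.task_to_execute, snapshot.first_info)

    # S612-S613: if the result misses the first execution information, derive third execution information
    if not result.meets(snapshot.first_info):
        snapshot.executing_infos = compensator.derive_third_info(snapshot.executing_infos)
    return result
```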
In conjunction with the smart phone scenario of fig. 3, the following describes in detail the execution process of another task scheduling method of the present application through the interaction process between the modules in the task scheduling device of fig. 4.
Fig. 7 shows an interaction diagram of another task scheduling method, and in contrast to the task scheduling method of fig. 5, the task scheduling method of the present application may also be applied to a single electronic device (for example, a mobile phone 30), and specifically includes:
s701: the scene perception module 41 obtains a task to be executed of the current device and first execution information of the task to be executed based on the current scene.
In some embodiments, the task to be performed by the current device may be generated based on a user operation or action of the user. The first execution information of the task to be executed may be first execution information of the task to be executed preset by the preset scheduling module 45.
In some embodiments, as shown in fig. 3, the current device is the mobile phone 30, and the current scene may be a scene in which the mobile phone 30 performs a video call through instant messaging software while running a map navigation application in the background and casting video from a video application to a television.
For example, the current scene is a scene in the smart phone scenario of fig. 3 in which the user opens the instant messaging software of the mobile phone 30 to perform a video call. Based on this scene, the mobile phone 30 detects that the user has opened the instant messaging software and started a video call; based on this operation, the task to be executed of the mobile phone 30 acquired by the scene perception module 41 may be a video call task, and the first execution information of the video call task acquired by the scene perception module 41 includes at least one of: a first scheduling priority of the video call task, a first execution time range of the video call task, a first execution frequency range of the video call task, a first display resolution of the video call task, and a first display refresh rate of the video call task. For example, the first display resolution of the video call task is 2K resolution and the first display refresh rate of the video call task is 30 fps.
In other embodiments, the current scene may also be a scene in which the user starts the navigation application of the mobile phone 30. Based on this scene, the mobile phone 30 detects the operation of starting the navigation application, and the task to be executed of the mobile phone 30 acquired by the scene perception module 41 may be a navigation map display task of the mobile phone 30. For example, the first display refresh rate of the navigation map display task of the mobile phone 30 is 80 fps. It is to be understood that the first execution information of the task to be executed depends on the actual application scene and the type of the task, and the present application does not specifically limit the task to be executed and its first execution information in the smart phone scenario.
S702: the scene perception module 41 obtains the task being executed and the first execution information of the task being executed of the current device based on the current scene.
In some embodiments, the first execution information of the current device that is executing the task includes at least one of: a first scheduling priority of the executing task, a first execution time range of the executing task, a first execution frequency range of the executing task, a first display resolution of the executing task, a first display refresh rate of the executing task.
In some embodiments, the executing tasks of the current device that are acquired may include one task or a plurality of tasks, depending on the actual application scenario; the present application does not specifically limit the number of executing tasks.
For example, the current scene is the smart phone scenario of fig. 3. Based on this scenario, the scene perception module 41 may detect that the executing tasks include 4 tasks, namely: a navigation task, a video playing task, a web page refreshing task, and a photographing task. For example, the first execution information of the navigation task includes at least one of: a first scheduling priority of the navigation task, a first execution time range of the navigation task, a first execution frequency range of the navigation task, a first display resolution of the navigation task, and a first display refresh rate of the navigation task. For example, the first display refresh rate of the navigation task is 80 fps.
S703: the scene perception module 41 obtains available system resources of the current device based on the current scene.
In some embodiments, the context awareness module 41 obtains available system resources of the current device based on the current context, where the available system resources of the current device are system resources that the current device can use to process the task to be performed. The available system resources of the current device specifically include: the comprehensive computing power available to the system of the current device, network bandwidth, etc. The comprehensive computing power available to the system of the current device may include, in particular: memory capacity, central processor processing power, graphics processor processing power, etc.
S704: the scene perception module 41 obtains the system resources occupied by the executing task of the current device based on the current scene.
In some embodiments, the scene perception module 41 obtains the system resources occupied by the executing task of the current device based on the current scene. For example, when the executing task of the current device is a single task, the system resources occupied by that single task may be obtained. When the executing tasks of the current device are a plurality of tasks, the system resources occupied by each of the executing tasks may be acquired respectively.
In some embodiments, the system resources occupied by the executing task of the current device may include at least one of: comprehensive computing power and network bandwidth.
S705: the scene perception module 41 predicts system resources required by the task to be executed of the current device according to the task to be executed of the current device and the first execution information of the task to be executed. The specific content refers to step S505, and will not be described herein.
It should be understood that the task scheduling method of the present application is not limited to the execution order of steps S701 to S705. For example, steps S701 and S702 may be executed simultaneously, that is, the task to be executed and the first execution information of the task to be executed may be acquired at the same time as the executing task and the first execution information of the executing task; or step S702 may be executed before step S701; or step S701 may be executed first, then step S705, and then steps S702 to S704. The execution order of steps S701 to S705 is not specifically limited in the present application.
S706: the context awareness module 41 sends the task to be executed and the first execution information of the task to be executed, the task being executed and the first execution information of the task being executed, the system resources required by the task to be executed, the available system resources and the system resources occupied by the task being executed of the current device to the target overall module 42.
S707: the goal orchestration module 42 determines whether the available system resources of the current device meet the predicted system resources required for the task to be performed by the current device. In the case where the available system resources of the current device satisfy the system resources required for the task to be performed of the current device, step S708 and step S711 are performed. In the case where the available system resources of the current device do not satisfy the system resources required for the task to be performed of the current device, step S709, step S710, and step S711 are performed.
S708: in case the available system resources of the current device meet the system resources required for the task to be performed of the current device, the target orchestration module 42 sends the task to be performed of the current device and the first execution information of the task to be performed to the task scheduling and processing module 43.
S709: in the case that the available system resources of the current device do not meet the system resources required by the task to be performed of the current device, the target overall module 42 adjusts the first execution information of the task to be performed according to the task to be performed and the first execution information of the task to be performed, the system resources required by the task to be performed, the system resources occupied by the task to be performed and the available system resources of the current device, and generates the second execution information of the task to be performed.
In some embodiments, in order not to affect the user experience, the goal orchestration module 42 may adjust the first execution information of the executing task of the current device to corresponding second execution information in the case that the available system resources do not meet the predicted system resources required by the task to be executed of the current device. The second execution information of the executing task of the current device may include at least one of: a second scheduling priority corresponding to the first scheduling priority of the executing task, a second execution time range corresponding to the first execution time range of the executing task, a second execution frequency range corresponding to the first execution frequency range of the executing task, a second display resolution corresponding to the first display resolution of the executing task, and a second display refresh rate corresponding to the first display refresh rate of the executing task. The specific content refers to S509 of fig. 5, and will not be described herein.
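Applied to the smart phone example, the adjustment can be made concrete with numbers. The frame rates and resolutions below follow the figures quoted in the text (80 fps navigation, 2K/30 fps video call); the GPU-load model and all load values are illustrative assumptions.

```python
# Illustrative single-device adjustment on the mobile phone 30.
available_gpu = 10.0                    # GPU capacity still free (arbitrary units)
video_call_demand_gpu = 18.0            # predicted need of the 2K / 30 fps video call task

navigation_first_info = {"refresh_rate_fps": 80, "occupied_gpu": 20.0}

if available_gpu < video_call_demand_gpu:
    # Second execution information: drop the navigation task from 80 fps to 40 fps,
    # assuming its GPU occupancy scales linearly with the refresh rate.
    navigation_second_info = dict(navigation_first_info)
    navigation_second_info["refresh_rate_fps"] = 40
    freed = navigation_first_info["occupied_gpu"] * (1 - 40 / 80)
    available_gpu += freed              # 10 + 10 = 20, which now covers the demand of 18

print(available_gpu >= video_call_demand_gpu)   # True in this toy example
```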
S710: the goal orchestration module 42 sends the task to be performed and the first execution information of the task to be performed of the current device to the task scheduling and processing module 43.
S711: the task scheduling and processing module 43 performs scheduling processing on the task to be executed according to the first execution information of the task to be executed of the current device, and generates a processing result of the task to be executed of the current device.
S712: the prediction and compensation module 44 acquires the processing result of the task to be executed of the current apparatus from the task scheduling and processing module 43.
S713: the prediction and compensation module 44 adjusts the first execution information of the current device that is executing the task or the second execution information of the current device that is executing the task to the third execution information of the task in the case where the processing result of the task to be executed of the current device does not satisfy the first execution information of the task to be executed of the current device. The specific content refers to step S513, and will not be described here in detail.
As can be seen from the above description, the scene perception module 41 obtains, according to the current scene, the task to be executed of the current device and the first execution information of the task to be executed, the executing task of the current device and the first execution information of the executing task, the available system resources of the current device, and the system resources occupied by the executing task. The scene perception module 41 further predicts the system resources required by the task to be executed according to the task to be executed and the first execution information of the task to be executed. The goal orchestration module 42 determines whether the available system resources of the current device meet the predicted system resources required by the task to be executed of the current device. If they do, the goal orchestration module 42 sends the task to be executed of the current device and the first execution information of the task to be executed to the task scheduling and processing module 43. If they do not, the goal orchestration module 42 adjusts the first execution information of the executing task to the corresponding second execution information. This helps meet the execution goals, including but not limited to reducing the system resources occupied by the executing tasks so as to increase the available system resources, or marking the task to be executed to enter a particular scheduling queue so that its latency target is met, among other measures. The scheduling of the task to be executed is thus ensured, and the user experience is improved.
The goal orchestration module 42 then sends the task to be executed of the current device and the first execution information of the task to be executed to the task scheduling and processing module 43. The task scheduling and processing module 43 schedules and processes the task to be executed of the current device and generates a processing result of the task to be executed. In the case that the processing result of the task to be executed of the current device does not satisfy the first execution information of the task to be executed, the prediction and compensation module 44 adjusts the first execution information of the executing task, or the second execution information of the executing task, of the current device to the third execution information of the executing task. This further helps meet the execution goals, including but not limited to reducing the system resources occupied by the executing tasks so as to increase the available system resources, or marking the task to be executed to enter a particular scheduling queue so that its latency target is met, among other measures, further ensuring the scheduling of the task to be executed and improving the user experience.
It can be understood that, by operating the task scheduling device, the goal orchestration module 42 purposefully schedules and processes the task to be executed according to the tasks and task-related information acquired by the scene perception module 41, and, when the available system resources are insufficient, actively adjusts the first execution information of the executing task of the current device on the premise of not affecting the user experience, so as to reduce the resource consumption of the executing task of the current device, ensure the scheduling of the task to be executed of the current device, and improve the user experience. Moreover, in the case that the processing result of the task to be executed of the current device does not meet the first execution information of the task to be executed, the prediction and compensation module 44 actively adjusts the first execution information or the second execution information of the executing task, so as to further reduce the resource consumption of the executing task of the current device, ensure the scheduling of the task to be executed of the current device, and improve the user experience.
Fig. 8 shows a schematic structural diagram of the in-vehicle device 10, an electronic device in the distributed system 20, or the terminal device 30 that is suitable for use in the embodiments of the present application.
As shown in fig. 8, the in-vehicle apparatus 10 or the distributed system 20 or the terminal apparatus 30 may include an audio module 210, a speaker 210A, a microphone 210B, a screen 220, a processor 230, an internal memory 240, an external memory interface 250, a power management module 260, a sensor 270, a key 280, an antenna 1, an antenna 2, a mobile communication module 294, a wireless communication module 295, and the like.
The in-vehicle device 10, one electronic device in the distributed system 20, or the terminal device 30 may implement audio functions through the audio module 210, the speaker 210A, the microphone 210B, a receiver, an earphone interface, an application processor, and the like. In the embodiment of the present application, the in-vehicle device 10, one electronic device in the distributed system 20, or the terminal device 30 may prompt the user with map information in a map application by voice through the speaker 210A.
the audio module 210 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 210 may also be used to encode and decode audio signals. In some embodiments, the audio module 210 may be disposed in the processor 230, or some functional modules of the audio module 210 may be disposed in the processor 230.
The speaker 210A, also referred to as a "horn," is used to convert audio electrical signals into sound signals.
Microphone 210B, also referred to as a "microphone" or "microphone", is used to convert sound signals into electrical signals.
The screen 220 may be a touch screen composed of a touch sensor and a display screen, also referred to as a "touch screen", for detecting gesture operations acting on or near the screen. The screen 220 may communicate the detected gesture operation to the processor 230 to determine the type of gesture operation. In an embodiment of the present application, the screen 220 is used to display a display interface of the map application.
The processor 230 may include one or more processing units. For example, the processor 230 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The internal memory 240 may be used to store computer executable program code including instructions. The internal memory 240 may include a storage program area and a storage data area. The processor 230 performs various functional applications and data processing of the in-vehicle device 10 or one of the electronic devices or the terminal devices 30 in the distributed system 20 by executing instructions stored in the internal memory 240 and/or instructions stored in a memory provided in the processor. In an embodiment of the present application, the internal memory 240 may be used to store map information of a map application in an embodiment of the present application.
The external memory interface 250 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the vehicle-mounted device 10 or one of the electronic devices or the terminal devices 30 in the distributed system 20. The external memory card communicates with the processor 230 via an external memory interface 250 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
For example, the power management module 260 is used for power input to the in-vehicle apparatus 10 from the power supply apparatus of the vehicle 100.
The sensor 270 may include a position sensor (e.g., a Global Positioning System (GPS) sensor), an inertial measurement unit (IMU), a radar sensor, a light detection and ranging (LIDAR) sensor, an image sensor, a mileage sensor, a temperature/humidity sensor, an infrared sensor, a barometric pressure sensor, a proximity sensor, an illuminance sensor, a magnetic sensor, an acceleration sensor, or a gyro sensor.
The keys 280 include a power-on key, a volume key, and the like. The keys 280 may be mechanical keys or touch keys. The in-vehicle device 10, one electronic device in the distributed system 20, or the terminal device 30 may receive key inputs and generate key signal inputs related to user settings and function control of the in-vehicle device 10, the electronic device in the distributed system 20, or the terminal device 30.
The wireless communication function of the in-vehicle device 10, one electronic device in the distributed system 20, or the terminal device 30 may be implemented by the antenna 1, the antenna 2, the mobile communication module 294, the wireless communication module 295, the modem processor, the baseband processor, and the like. The antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 294 may provide a solution including 2G/3G/4G/5G wireless communication applied to one of the electronic devices or terminal devices 30 in the in-vehicle device 10 or the distributed system 20.
The wireless communication module 295 may provide a solution for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, wi-Fi) network), bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field wireless communication technology (near field communication, NFC), infrared technology (IR), etc., applied to one of the electronic devices or terminal devices 30 in the in-vehicle device 10 or the distributed system 20. The wireless communication module 295 receives electromagnetic waves via the antenna 2, modulates the electromagnetic wave signals, filters the electromagnetic wave signals, and transmits the processed signals to the processor 230.
Fig. 9 is a block diagram of a software architecture of an electronic device or a terminal device 30 in the in-vehicle device 10 or the distributed system 20 disclosed in some embodiments of the present application.
The layered architecture divides the software into several layers, each with a distinct role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, an application layer, an application framework layer, Android Runtime (ART) and system libraries, and a kernel layer.
The application layer may include a series of application packages.
As shown in fig. 9, the application package may include applications for cameras, gallery, calendar, phone calls, maps, navigation, WLAN, bluetooth, music, video, short messages, etc.
As shown in fig. 9, the application framework layer may include a window manager, a content provider, a view system, a phone manager, a resource manager, a notification manager, and the like.
The window manager is used for managing window programs. The window manager can acquire the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make such data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phonebooks, etc.
The view system includes visual controls, such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, a display interface including a text message notification icon may include a view displaying text and a view displaying a picture.
The telephony manager is used to provide communication functions for one of the electronic devices or terminal devices 30 in the in-vehicle device 10 or the distributed system 20. Such as the management of call status (including on, hung-up, etc.).
The resource manager provides various resources for the application program, such as localization strings, icons, pictures, layout files, video files, and the like.
The notification manager allows the application to display notification information in a status bar, can be used to communicate notification type messages, can automatically disappear after a short dwell, and does not require user interaction.
Android Runtime includes a core library and a virtual machine. Android Runtime is responsible for scheduling and management of the Android system.
The core library consists of two parts: one part is the functions that need to be called by the Java language, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object lifecycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface manager (surface manager), media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., openGL ES), 2D graphics engines (e.g., SGL), etc.
The surface manager is used to manage the display subsystem and provides a fusion of 2D and 3D layers for multiple applications.
The media libraries support playback and recording of a variety of commonly used audio and video formats, as well as still image files, and the like. The media libraries may support a variety of audio and video encoding formats, such as MPEG4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
Embodiments of the disclosed mechanisms may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as a computer program or program code that is executed on a programmable system comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For the purposes of this application, a processing system includes any system having a processor such as, for example, a digital signal processor (Digital Signal Processor, DSP), microcontroller, application specific integrated circuit (Application Specific Integrated Circuit, ASIC), or microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. Program code may also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in the present application are not limited in scope by any particular programming language. In either case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed over a network or through other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including but not limited to floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or tangible machine-readable memory used to transmit information over the Internet by means of electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
It will be understood that, although the terms "first," "second," etc. may be used herein to describe various features, these features should not be limited by these terms. These terms are used merely for distinguishing and are not to be construed as indicating or implying relative importance. For example, a first feature may be referred to as a second feature, and similarly a second feature may be referred to as a first feature, without departing from the scope of the example embodiments.
Furthermore, various operations will be described as multiple discrete operations, in a manner that is most helpful for understanding the illustrative embodiments; however, the order of description should not be construed to imply that these operations are necessarily order dependent, and many of the operations may be performed in parallel, concurrently, or together with other operations. When the described operations are completed, the process may be terminated, but there may also be additional operations not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, and the like.
References in the specification to "one embodiment," "an illustrative embodiment," and the like indicate that the embodiment described may include a particular feature, structure, or characteristic, but not every embodiment necessarily includes that particular feature, structure, or characteristic. Moreover, such phrases are not necessarily referring to the same embodiment. Furthermore, when a particular feature is described in connection with a particular embodiment, it is within the knowledge of one skilled in the art to effect such a feature in connection with other embodiments, whether or not such embodiments are explicitly described.
The terms "comprising," "having," and "including" are synonymous, unless the context dictates otherwise. The phrase "A/B" means "A or B". The phrase "a and/or B" means "(a), (B) or (a and B)".
As used herein, the term "module" may refer to, be part of, or include: a memory (shared, dedicated, or group) for running one or more software or firmware programs, an Application Specific Integrated Circuit (ASIC), an electronic circuit and/or processor (shared, dedicated, or group), a combinational logic circuit, and/or other suitable components that provide the described functionality.
In the drawings, some structural or methodological features may be shown in a particular arrangement and/or order. However, it should be understood that such a particular arrangement and/or order is not required. Rather, in some embodiments, these features may be arranged in a different manner and/or order than shown in the illustrative figures. Additionally, the inclusion of a structural or methodological feature in a particular drawing does not imply that all embodiments need to include such a feature; in some embodiments, it may not be included or may be combined with other features.
It should be noted that, in the embodiments of the present application, each unit/module mentioned in each device is a logical unit/module. Physically, one logical unit/module may be one physical unit/module, may be part of one physical unit/module, or may be implemented by a combination of multiple physical units/modules; the physical implementation of the logical unit/module itself is not the most important, and the combination of functions implemented by these logical units/modules is the key to solving the technical problem posed by the present application. Furthermore, in order to highlight the innovative part of the present application, the above device embodiments do not introduce units/modules that are less closely related to solving the technical problem posed by the present application, which does not mean that the above device embodiments do not include other units/modules.
It should be noted that in the examples and descriptions of this patent, relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
While the application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the application.
The embodiments of the present application have been described in detail above with reference to the accompanying drawings, but the use of the technical solution of the present application is not limited to the applications mentioned in the embodiments of the present application, and various structures and modifications can be easily implemented with reference to the technical solution of the present application to achieve the various advantageous effects mentioned herein. Various changes, which may be made by those skilled in the art without departing from the spirit of the application, are deemed to be within the scope of the application as defined by the appended claims.

Claims (16)

1. A task scheduling method for a first electronic device, comprising:
acquiring a task to be executed, first execution information of the task to be executed, an executing task and the first execution information of the executing task;
determining a first system resource requirement required by completing the task to be executed according to the first execution information of the task to be executed;
when a first system resource capable of being used for executing the task to be executed does not meet the first system resource requirement, the first electronic device adjusts the first execution information of the executing task into second execution information of the executing task; and when the executing task is completed according to the second execution information of the executing task, a second system resource capable of being used for executing the task to be executed meets the first system resource requirement.
2. The method of claim 1, wherein the first system resource is a system resource of the first electronic device; after the first electronic device adjusts the first execution information of the executing task to the second execution information of the executing task, the method further includes:
and the first electronic equipment executes the executing task according to the second executing information of the executing task, and executes the task to be executed according to the first executing information of the task to be executed.
3. The method of claim 1, wherein the first system resource comprises system resources of at least one second electronic device; after the first electronic device adjusts the first execution information of the executing task to the second execution information of the executing task, the method further includes:
the first electronic device sends a first execution instruction to the at least one second electronic device based on the second execution information of the executing task and the first execution information of the task to be executed, so that the at least one second electronic device executes the executing task with the second execution information of the executing task based on the first execution instruction, and executes the task to be executed with the first execution information of the task to be executed.
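As a purely illustrative aside (not part of the claims), the distributed case of claim 3 can be pictured as the first electronic device packaging both sets of execution information into one instruction and sending it to a second electronic device. The message layout, the JSON encoding, and the raw-socket transport below are assumptions for the sketch only.

```python
# Hypothetical sketch of claim 3's dispatch step; the instruction format and transport
# are assumptions, not a disclosed message protocol.
import json
import socket

def send_first_execution_instruction(device_addr: tuple[str, int],
                                     running_second_info: dict,
                                     pending_first_info: dict) -> None:
    """Send one instruction carrying both sets of execution information to a second device."""
    instruction = {
        "type": "first_execution_instruction",
        "executing_task_info": running_second_info,  # adjusted (second) execution information
        "pending_task_info": pending_first_info,     # unchanged first execution information
    }
    with socket.create_connection(device_addr, timeout=5) as conn:
        conn.sendall(json.dumps(instruction).encode("utf-8"))
```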
4. The method of claim 2, wherein after the first electronic device executes the task to be executed according to the first execution information of the task to be executed, the method further comprises:
generating a first execution result of the task to be executed;
and when the first execution result of the task to be executed does not meet the first execution information of the task to be executed, adjusting the second execution information of the executing task into third execution information of the executing task, wherein when the executing task is completed according to the third execution information of the executing task, a third system resource capable of being used for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
5. The method of claim 3, wherein the at least one second electronic device generates a first execution result of the task to be executed by executing the task to be executed according to the first execution information of the task to be executed, and when the first execution result of the task to be executed does not satisfy the first execution information of the task to be executed, the method further comprises:
the first electronic device adjusts the second execution information of the executing task into third execution information of the executing task, the first electronic device sends a second execution instruction to the at least one second electronic device, so that the at least one second electronic device executes the executing task with the third execution information of the executing task based on the second execution instruction, and executes the task to be executed with the first execution information of the task to be executed, wherein when the at least one second electronic device completes the task to be executed with the third execution information of the task to be executed, a third system resource capable of being used for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
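To make the compensation step of claims 4, 5 and 11 concrete, the sketch below (illustrative only; the dictionary keys and step sizes are assumptions) escalates the executing task from its second to a third execution information when the pending task's first execution result misses the target set by its first execution information, thereby freeing more resources than the second system resource did.

```python
# Hypothetical sketch of the compensation step; keys and step sizes are assumptions.
def compensate(result_meets_target: bool, running_second_info: dict) -> dict:
    """Escalate the executing task from second to third execution information if needed."""
    if result_meets_target:
        return running_second_info           # the first execution result already satisfies the target
    third_info = dict(running_second_info)   # third execution information frees even more resources
    third_info["display_refresh_rate"] = max(24, running_second_info.get("display_refresh_rate", 60) // 2)
    third_info["compute_share"] = running_second_info.get("compute_share", 1.0) * 0.5
    third_info["bandwidth_share"] = running_second_info.get("bandwidth_share", 1.0) * 0.5
    return third_info
```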
6. The method of claim 1, wherein the first electronic device performs the executing task with the first execution information of the executing task when a first system resource currently available for executing the task to be executed meets the first system resource requirement.
7. The method of claim 1, wherein the first execution information of the executing task, the second execution information of the executing task, and the first execution information of the task to be executed comprise at least one of:
scheduling priority, execution time range, execution frequency range, display resolution, display refresh rate, delay jitter range, reliability, trusted security environment, isolation.
8. The method of claim 7, wherein the first electronic device adjusting the first execution information of the executing task to the second execution information of the executing task comprises at least one of:
the first electronic device adjusts a first scheduling priority included in the first execution information of the executing task to a second scheduling priority;
the first electronic device adjusts a first execution time range included in the first execution information of the executing task to a second execution time range;
the first electronic device adjusts a first execution frequency range included in the first execution information of the executing task to a second execution frequency range;
the first electronic device adjusts a first display resolution included in the first execution information of the executing task to a second display resolution;
the first electronic device adjusts a first display refresh rate included in the first execution information of the executing task to a second display refresh rate;
the first electronic device adjusts a first delay jitter range included in the first execution information of the executing task to a second delay jitter range;
the first electronic device adjusts a first reliability included in the first execution information of the executing task to a second reliability;
the first electronic device adjusts a first trusted security environment included in the first execution information of the executing task to a second trusted security environment;
and the first electronic device adjusts a first isolation included in the first execution information of the executing task to a second isolation.
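The per-parameter adjustments enumerated in claim 8 can be summarized, for illustration only, as a table of adjuster functions that map a field of the first execution information to its "second" value; the concrete step sizes below are assumptions, and the non-numeric fields (trusted security environment, isolation) are omitted from this numeric sketch.

```python
# Hypothetical adjusters for the parameters listed in claim 8; all step sizes are assumptions.
ADJUSTERS = {
    "scheduling_priority": lambda p: p - 1,                       # lower the priority
    "execution_time_range": lambda rng: (rng[0], rng[1] * 2),     # allow a longer completion window
    "execution_frequency_range": lambda rng: (rng[0] // 2, rng[1] // 2),
    "display_resolution": lambda wh: (wh[0] // 2, wh[1] // 2),
    "display_refresh_rate": lambda hz: max(30, hz // 2),
    "delay_jitter_range": lambda rng: (rng[0], rng[1] * 2),       # tolerate more jitter
    "reliability": lambda nines: nines - 1,                       # e.g. four nines -> three nines
}

def adjust(first_info: dict) -> dict:
    """Derive second execution information by applying every applicable adjuster."""
    return {k: (ADJUSTERS[k](v) if k in ADJUSTERS else v) for k, v in first_info.items()}
```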
9. The method of claim 1, wherein each of the first system resource, the first system resource requirement, and the second system resource comprises integrated computing power and/or network bandwidth.
10. A task scheduling device, comprising:
the scene perception module is used for acquiring a task to be executed, first execution information of the task to be executed, an executing task and the first execution information of the executing task;
the scene perception module is also used for determining a first system resource requirement required by completing the task to be executed according to the first execution information of the task to be executed;
the target overall module is used for adjusting the first execution information of the executing task to the second execution information of the executing task when the first system resource capable of being used for executing the task to be executed does not meet the first system resource requirement; and when the executing task is completed according to the second execution information of the executing task, the second system resource capable of being used for executing the task to be executed meets the first system resource requirement.
11. The apparatus of claim 10, further comprising:
the task scheduling and processing module is used for executing the task to be executed according to the first execution information of the task to be executed and generating a first execution result of the task to be executed;
and the prediction and compensation module is used for adjusting the second execution information of the executing task to the third execution information of the executing task when the first execution result of the task to be executed does not meet the first execution information of the task to be executed, wherein when the executing task is completed according to the third execution information of the executing task, the third system resource capable of being used for executing the task to be executed is larger than the second system resource, and the third system resource meets the first system resource requirement.
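For orientation only, the following sketch shows one way the four modules named in claims 10 and 11 could be wired together; the class and method names mirror the claim wording but are assumptions, not a disclosed API.

```python
# Hypothetical wiring of the modules named in claims 10 and 11; names are assumptions.
class TaskSchedulingDevice:
    def __init__(self, scene_perception, target_overall, task_scheduling, prediction_compensation):
        self.scene_perception = scene_perception                 # acquires tasks and execution information
        self.target_overall = target_overall                     # adjusts the executing task's execution information
        self.task_scheduling = task_scheduling                   # executes the pending task, yields its first result
        self.prediction_compensation = prediction_compensation   # escalates to third execution information

    def run_once(self):
        pending, pending_info, running, running_info = self.scene_perception.collect()
        requirement = self.scene_perception.predict_requirement(pending, pending_info)
        running_info = self.target_overall.adjust(running_info, requirement)
        result = self.task_scheduling.execute(pending, pending_info)
        self.prediction_compensation.compensate(result, pending_info, running_info)
```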
12. A first electronic device, comprising:
a memory for storing instructions for execution by the one or more processors of the first electronic device, and
a processor, being one of the one or more processors of the first electronic device, for performing the task scheduling method of any one of claims 1 to 9.
13. The first electronic device of claim 12, wherein the first electronic device is a vehicle-mounted device, a cell phone, a computer, a wearable device, a smart large screen, a smart speaker, or virtual reality glasses.
14. A computer readable storage medium, characterized in that the computer readable storage medium stores computer executable instructions which, when invoked by a computer, cause the computer to perform the task scheduling method according to any one of claims 1 to 9.
15. A computer program product, characterized in that the computer program product, when run on a computer, causes the computer to perform the task scheduling method of any one of claims 1 to 9.
16. A chip, characterized in that the chip is coupled to a memory and is configured to read and execute program instructions stored in the memory, so as to implement the task scheduling method according to any one of claims 1 to 9.
CN202210129576.2A 2022-02-11 2022-02-11 Task scheduling method and device, readable medium and electronic equipment Pending CN116627602A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210129576.2A CN116627602A (en) 2022-02-11 2022-02-11 Task scheduling method and device, readable medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN116627602A true CN116627602A (en) 2023-08-22

Family

ID=87596018

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210129576.2A Pending CN116627602A (en) 2022-02-11 2022-02-11 Task scheduling method and device, readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN116627602A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination