CN115437609A - Development method and device of automatic driving system and storage medium - Google Patents

Development method and device of automatic driving system and storage medium

Info

Publication number
CN115437609A
Authority
CN
China
Prior art keywords
development
detection
information
target
driving
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211000753.3A
Other languages
Chinese (zh)
Inventor
李丰军
周剑光
方芳
付勇
徐聪
高文建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Automotive Innovation Co Ltd
Original Assignee
China Automotive Innovation Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Automotive Innovation Co Ltd filed Critical China Automotive Innovation Co Ltd
Priority to CN202211000753.3A
Publication of CN115437609A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F8/00Arrangements for software engineering
    • G06F8/20Software design

Landscapes

  • Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses a development method and device for an automatic driving system, and a storage medium. The method comprises: acquiring a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes; determining the detection information of the target driving scene under the plurality of preset detection items; and obtaining development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes. The technical scheme can assist the development of the automatic driving system and improves the efficiency of developing vehicle driving functions.

Description

Development method and device of automatic driving system and storage medium
Technical Field
The application relates to the technical field of intelligent networked automobiles, in particular to a development method and a development device of an automatic driving system and a storage medium.
Background
Because an autonomous vehicle encounters a wide variety of driving scenes while travelling on the road, it must be able to cope with all driving scenes that may occur and change in order to realize unmanned driving at safety level L3 or above.
According to investigation, unmanned-driving software and related platforms already exist in the industry, and functions for coping with different driving scenes have been developed. However, the developed functions cover only part of all the scenes that may occur and change, and developers cannot grasp the full picture of the driving scenes a vehicle can cope with or the functional requirements corresponding to those scenes. Without such an overview, it is difficult to judge whether the existing driving scenes meet the requirements of urban unmanned driving; moreover, developers cannot use the available development resources to rapidly expand the set of scenes that can be handled, which results in low development efficiency.
Therefore, a solution for assisting the development of an automatic driving system is needed to solve the above problems in the prior art.
Disclosure of Invention
In order to solve the problems in the prior art, embodiments of the present application provide a method and an apparatus for developing an automatic driving system, and a storage medium, where the technical solution is as follows:
in one aspect, a method for developing an automatic driving system is provided, the method comprising:
the method comprises the steps of obtaining a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes;
determining corresponding detection information of the target driving scene under the plurality of preset detection items;
and obtaining development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
Further, the development information includes development state information of the target driving scene; the obtaining of the development information of the automatic driving system according to the detection information comprises:
and obtaining development state information of the target driving scene according to the detection information.
Further, the obtaining development state information of the target driving scene according to the detection information includes:
if the detection information of the target driving scene under the preset detection items indicates a completion state, marking the development state information of the target driving scene as a developed state.
Further, after the step of obtaining development information of the automatic driving system based on the detection information corresponding to the plurality of driving scenes, the method further includes:
determining the ratio of the number of the target driving scenes with the development state information being the developed state to the number of the plurality of driving scenes to obtain a first ratio result, wherein the first ratio result represents the development progress of the plurality of driving scenes.
Further, the obtaining of the development state information of the target driving scene according to the detection information includes:
if any detection information of the target driving scene under the preset detection items does not indicate a completion state, marking the development state information of the target driving scene as an undeveloped state or an in-development state.
Further, still include:
acquiring the target driving scene whose development state information is the undeveloped state or the in-development state;
and determining priority development information of a target detection item among the plurality of preset detection items to be developed based on a first preset value of the target driving scene, wherein the first preset value represents the preset importance degree of the target driving scene.
Further, the development information includes detection item development information corresponding to a target detection item, where the target detection item is any one of the multiple preset detection items; the obtaining of the development information of the automatic driving system according to the detection information comprises:
and determining the detection item development information corresponding to the target detection item according to the detection development state corresponding to the detection information under the target detection item.
Further, the determining the detection information corresponding to the target driving scene under the plurality of preset detection items includes:
determining the target detection items under the plurality of preset detection items corresponding to the target driving scene;
and determining detection development information corresponding to the target detection item.
In another aspect, there is provided a developing apparatus of an automatic driving system, the apparatus including:
an acquisition module, configured to acquire a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes;
a detection information determination module, configured to determine the detection information of the target driving scene under the plurality of preset detection items;
a development information determination module, configured to obtain development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
Another aspect provides a development device of an automatic driving system, which includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or a set of instructions, and the at least one instruction, the at least one program, the code set, or the set of instructions is loaded and executed by the processor to implement the development method of an automatic driving system as described above.
Another aspect provides a computer-readable storage medium having stored therein at least one instruction, at least one program, a set of codes, or a set of instructions, which is loaded and executed by a processor to implement the method of developing an autopilot system as described above.
The development method, the development device and the storage medium of the automatic driving system have the following technical effects:
According to the present application, a target driving scene and a plurality of preset detection items are obtained, wherein the target driving scene is any one of a plurality of driving scenes, so that the target driving scene is associated with the plurality of preset detection items. The detection information of the target driving scene under the plurality of preset detection items is then determined, and the development information of the automatic driving system is obtained according to the detection information corresponding to the plurality of driving scenes. The technical scheme provided by the application can therefore assist the development of the automatic driving system and improve the efficiency of developing vehicle driving functions.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The drawings in the following description are obviously only some embodiments of the present application, and those skilled in the art can obtain other drawings based on these drawings without creative effort.
Fig. 1 is a schematic flowchart of a method for developing an automatic driving system according to an embodiment of the present disclosure;
Fig. 2 is a schematic flowchart illustrating generation of an analysis basis corresponding to driving scenes according to an embodiment of the present disclosure;
Fig. 3 is a proportion distribution diagram of algorithm coverage corresponding to driving scenes according to an embodiment of the present application;
Fig. 4 is a flowchart illustrating a method for determining priority development information according to an embodiment of the present application;
Fig. 5 is a proportion distribution diagram of algorithm coverage corresponding to each detection item in the preset detection items according to an embodiment of the present application;
Fig. 6 is a schematic structural diagram of a development device of an automatic driving system according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described clearly and completely with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It is noted that the present specification provides method steps as described in the examples or flowcharts, but more or fewer steps may be included based on routine or non-inventive effort. The order of steps recited in the embodiments is merely one of many possible execution orders and does not represent the only order of execution.
Please refer to Fig. 1, which is a flowchart of a method for developing an automatic driving system according to an embodiment of the present application. The technical solution of the present application is described in detail below with reference to Fig. 1.
The embodiment of the application provides a development method of an automatic driving system, which specifically comprises the following steps:
S101: obtaining a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes.
In the embodiment of the application, the target driving scene is a driving scene that the autonomous vehicle needs to cope with, and it is composed of road information and target perception environment information. By acquiring a comprehensive, widely covering set of target driving scenes that the autonomous vehicle may encounter while driving, the driving scenes of the autonomous vehicle are generated more completely, which improves the safety performance of the autonomous vehicle.
Specifically, the road information may be road data of a route in the globally planned path from the starting point to the end point of the autonomous vehicle. For example, the road information may include lane information, lane-change information, and driving state information of the autonomous vehicle, where the lane-change information may be changing lanes to the left, changing lanes to the right, entering a ramp, exiting a ramp, and the like, and the driving state information may be going straight, turning left, turning right, making a U-turn, exiting a road, passing a roundabout, parking, and the like.
While driving, the autonomous vehicle not only faces complicated and changeable road information but also encounters target perception environment information that changes its driving trajectory. Specifically, the target perception environment information includes weather environment factors, object factors, and road facility factors. The weather environment factors may be weather factors that affect accurate recognition by a sensor or that affect the friction coefficient of the road surface; the object factors may be whether objects exist around the driving vehicle, the types of the existing objects, the states of the objects, or the relative relationship between the autonomous vehicle and those objects; and the road facility factors may be whether signal lamps exist, the types of the signal lamps, and whether traffic facilities or markings exist.
In practical application, each target driving scene may be formed by arranging and combining any one item of the road information with one or more items of the target perception environment information. Permuting and combining the road information and the target perception environment information covers, as far as possible, the driving scenes the autonomous vehicle may encounter while driving, and setting these driving scenes in advance effectively improves the driving stability of the autonomous vehicle.
Specifically, Fig. 2 is a schematic flowchart of generating the analysis basis corresponding to driving scenes according to an embodiment of the present application. In Fig. 2, a comprehensive set of driving scenes is obtained by permuting and combining the road information and the target perception environment information, so that the autonomous vehicle can cope with varied and complex driving scenes.
In another embodiment, the preset detection items are detection items set corresponding to a driving scene. Specifically, the preset detection items include a perception item, a prediction item, a decision item, and a trajectory production item. The perception item is used for perceiving the road information and the target perception environment information present in the driving scene; the prediction item is used for making predictions from the information perceived by the perception item; the decision item is used for making a judgment according to the prediction result; and the trajectory production item is used for generating a corresponding trajectory according to the judgment result. In practical application, the detection items required by each driving scene are determined according to the driving scene, corresponding algorithms are developed for the preset detection items, and the autonomous vehicle can then drive safely in the target driving scene according to those algorithms.
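For illustration only, the relationship between driving scenes, preset detection items, and algorithm development states described above could be modelled as in the following Python sketch; every class and field name is an assumption introduced for this example and does not appear in the original text.

```python
# Illustrative data model only; names and structure are assumptions, not part of the patent.
from dataclasses import dataclass, field
from enum import Enum


class DetectionItem(Enum):
    PERCEPTION = "perception item"
    PREDICTION = "prediction item"
    DECISION = "decision item"
    TRAJECTORY = "trajectory production item"


class DevState(Enum):
    UNDEVELOPED = "undeveloped"
    IN_DEVELOPMENT = "in development"
    DEVELOPED = "development complete"


@dataclass
class DrivingScene:
    name: str
    road_info: list[str]                 # e.g. "change lane left", "turn right"
    environment_info: list[str]          # e.g. "speed limit sign", "static object ahead"
    # algorithm development state of each detection item the scene requires
    item_states: dict[DetectionItem, DevState] = field(default_factory=dict)
    importance: int = 0                  # the "first preset value" (preset importance degree)
```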
S102: determining corresponding detection information of a target driving scene under a plurality of preset detection items;
In an alternative embodiment, step S102 may include:
S1021: determining target detection items under a plurality of preset detection items corresponding to the target driving scene.
S1022: and determining detection development information corresponding to the target detection item.
In the embodiment of the application, the detection information is the detection item information required by the target driving scene. According to the target detection items determined for the target driving scene and the detection development states corresponding to those target detection items, the development information of the target detection items corresponding to the target driving scene is obtained, and the development progress of the target driving scene is then determined. An algorithm developer can thus identify, from the algorithm development situation of the target driving scene, the detection items corresponding to target driving scenes that have not yet been developed, which improves the efficiency with which the developer develops the detection items.
S103: and obtaining development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
In an alternative embodiment, the development information includes development status information of the target driving scenario; step S103 may include:
S1031: obtaining development state information of the target driving scene according to the detection information.
In the embodiment of the application, the development state information includes a developed state, an in-development state, and an undeveloped state. The driving scenes that the autonomous vehicle can and cannot cope with are determined according to the development state information of the target driving scenes, and the development state information required by the target driving scenes is processed statistically so that algorithms are developed for the target driving scenes in the undeveloped and in-development states. In this way the autonomous vehicle can cope with more driving scenes, which improves its safety.
In another optional embodiment, step S1031 may include:
and if the detection information of the target driving scene under the plurality of preset detection items indicates the completion state, marking the development state information of the target driving scene as a developed state.
In another optional embodiment, step S1031 may further include:
If any detection information of the target driving scene under the plurality of preset detection items does not indicate a completion state, marking the development state information of the target driving scene as an undeveloped state or an in-development state.
In the embodiment of the application, the target driving scenes that the autonomous vehicle can currently cope with are determined according to the algorithm development state of each piece of detection information of the target driving scene under the plurality of preset detection items, and a development strategy for the detection items is formulated based on those scenes. Algorithms are then developed for the preset detection items that are still in development or not yet developed, so that the unmanned vehicle can cope with more target driving scenes. This enables automatic driving in complicated, changeable target driving scenes with many obstacles and improves the safety of the autonomous vehicle.
Specifically, if the detection information of the target driving scene under the multiple target detection items all indicates a completion state, that is, all the target detection items required by the target driving scene have completed algorithm development, the development state information of the target driving scene is determined to be the developed state. If any detection information of the target driving scene under the multiple target detection items does not indicate a completion state, that is, the detection information of any one of the target detection items required by the target driving scene is in the in-development or undeveloped state, the development state information of the target driving scene is determined to be the undeveloped or in-development state. An algorithm developer can then develop algorithms for the target driving scenes whose development state information is undeveloped or in development, and formulate an algorithm development strategy so that more target driving scenes the autonomous vehicle can cope with are developed in as little time as possible.
In practical applications, the correspondence between the development state information of the target driving scenes and the detection items required by each target driving scene is shown in Table 1 (Table 1 is provided as an image in the original publication).
Next, this embodiment is described by taking driving scene 1 and driving scene 5 as examples. As can be seen from Table 1, driving scene 1 is a driving scene with a speed limit sign, and the preset detection items required by driving scene 1 include a perception item, a decision item, and a trajectory production item. The perception item is used to recognize the speed limit sign; the decision item is used to make an acceleration/deceleration decision according to the current speed of the autonomous vehicle when the speed limit sign is perceived; and the trajectory production item is used to generate an acceleration/deceleration planned trajectory according to the result output by the decision item. Among the detection items required by driving scene 1, the algorithm development states corresponding to the perception item, the decision item, and the trajectory production item are all completion states, so the development state information of driving scene 1 is the developed state. The algorithms corresponding to the detection items required by driving scene 1 need no further development; that is, the autonomous vehicle can cope with driving scene 1 and drive normally in it.
Further, as can be seen from Table 1, driving scene 5 is a driving scene in which a static object exists in the lane in which the vehicle is driving and a moving vehicle exists in the adjacent lane to the left rear. The detection items required by driving scene 5 include a perception item for recognizing the static object and the moving vehicle, a prediction item for predicting the driving intention of the moving vehicle, a decision item for making a detour/deceleration judgment according to the prediction result once the driving intention is predicted, and a trajectory production item for generating a corresponding planned trajectory according to the judgment made by the decision item; the planned trajectory may be, for example, decelerating to a stop and letting the moving vehicle pass before proceeding. Among the preset detection items required by driving scene 5, at least one of the algorithm development states corresponding to the perception item, the prediction item, the decision item, and the trajectory production item is still in development or not yet developed, so the development state information of driving scene 5 is the undeveloped or in-development state. Algorithms therefore still need to be developed for the corresponding detection items so that the autonomous vehicle can cope with driving scene 5, which improves the safety of the autonomous vehicle.
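Continuing the illustrative sketch above, the rule applied to driving scene 1 and driving scene 5 can be expressed as a small function that derives a scene's development state from the states of its required detection items. The concrete item states assigned to scene 5 below are assumptions (the text only states that at least one item is unfinished), and the undeveloped/in-development split is one possible reading of the rule.

```python
def scene_dev_state(scene: DrivingScene) -> DevState:
    """A scene is developed only when every required detection item has completed
    algorithm development. The split between 'undeveloped' and 'in development'
    used here is an assumption; the text only distinguishes 'developed' from
    'undeveloped or in development'."""
    states = list(scene.item_states.values())
    if all(s is DevState.DEVELOPED for s in states):
        return DevState.DEVELOPED
    if all(s is DevState.UNDEVELOPED for s in states):
        return DevState.UNDEVELOPED
    return DevState.IN_DEVELOPMENT


scene1 = DrivingScene(
    name="driving scene 1 (speed limit sign)",
    road_info=["straight"],
    environment_info=["speed limit sign"],
    item_states={
        DetectionItem.PERCEPTION: DevState.DEVELOPED,
        DetectionItem.DECISION: DevState.DEVELOPED,
        DetectionItem.TRAJECTORY: DevState.DEVELOPED,
    },
)
scene5 = DrivingScene(
    name="driving scene 5 (static object ahead, vehicle in left-rear lane)",
    road_info=["straight"],
    environment_info=["static object", "moving vehicle left rear"],
    item_states={
        DetectionItem.PERCEPTION: DevState.DEVELOPED,
        DetectionItem.PREDICTION: DevState.IN_DEVELOPMENT,   # assumed for illustration
        DetectionItem.DECISION: DevState.UNDEVELOPED,        # assumed for illustration
        DetectionItem.TRAJECTORY: DevState.UNDEVELOPED,      # assumed for illustration
    },
)
assert scene_dev_state(scene1) is DevState.DEVELOPED
assert scene_dev_state(scene5) is DevState.IN_DEVELOPMENT
```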
In an optional embodiment, after step S103, the method may further include:
determining the ratio of the number of the target driving scenes with the development state information as the developed state to the number of the plurality of driving scenes to obtain a first ratio result, wherein the first ratio result represents the development progress of the plurality of driving scenes.
In the embodiment of the application, the proportion of target driving scenes that the autonomous vehicle can cope with is determined by calculating the first ratio result, which represents the development progress of the plurality of driving scenes. This assists algorithm developers in developing the automatic driving system, so that the autonomous vehicle can cope with more driving scenes and more driving scenes are covered, and it provides a comprehensive grasp of the algorithm development progress of each driving scene so that development resources can be adjusted in time.
Specifically, the coverage of the target driving scenes is determined from the obtained first ratio result. As shown in Fig. 3, the proportion of target driving scenes whose development state information is the developed state can be seen intuitively, which further assists the development of the automatic driving system and enables function developers to grasp, as a whole, the driving scenes the autonomous vehicle can cope with.
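Using the same illustrative model, the first ratio result described above could be computed as follows; the function name is an assumption introduced for this sketch.

```python
def first_ratio_result(scenes: list[DrivingScene]) -> float:
    """Ratio of scenes whose development state is 'developed' to all scenes;
    represents the overall development progress (scene coverage)."""
    developed = sum(scene_dev_state(s) is DevState.DEVELOPED for s in scenes)
    return developed / len(scenes)


# e.g. with only the two example scenes above, coverage would be 50%
print(f"scene coverage: {first_ratio_result([scene1, scene5]):.0%}")
```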
In an optional implementation manner, as shown in fig. 4, which is a schematic flow chart of a method for determining priority development information provided in an embodiment of the present application, the method may further include:
S401: acquiring the target driving scene whose development state information is the undeveloped state or the in-development state.
S402: determining priority development information of a target detection item among the plurality of preset detection items to be developed based on a first preset value of the target driving scene, wherein the first preset value represents the preset importance degree of the target driving scene.
In the embodiment of the application, based on the first preset value of the target driving scene, that is, the preset importance degree of the target driving scene, the algorithm development state of any one of the multiple target detection items is adjusted. Based on the adjusted algorithm development state, the ratio of the number of target driving scenes in the developed state to the number of the plurality of driving scenes is recalculated to obtain a second ratio result. The change rate of the algorithm coverage of the target driving scenes is determined from the first ratio result and the second ratio result, and the priority development information of the target detection items among the plurality of preset detection items to be developed is then determined. This improves development efficiency and allows more driving scenes to be handled within the same development time.
Specifically, adjusting the algorithm development state of any one of the multiple target detection items means marking a detection item that is undeveloped or in development as having completed algorithm development. The preset importance degree of the target driving scene may be determined according to an order such as driving performance, safety, riding experience, and intelligence. By adjusting the algorithm development state of a detection item and comparing the adjusted second ratio result with the first ratio result before adjustment, the change rate of the algorithm coverage of the target driving scenes, that is, the change of algorithm coverage before and after the adjustment, is obtained, and an algorithm development strategy is then formulated based on this change.
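One possible reading of this prioritisation step, continuing the sketch above, is to hypothetically complete each unfinished detection item, recompute the coverage (the second ratio result), and rank the items by the resulting coverage change; weighting or filtering by the preset importance degree of each scene is omitted here for brevity, so the ranking rule is an assumption rather than the claimed procedure.

```python
from copy import deepcopy


def prioritize_detection_items(scenes: list[DrivingScene]) -> list[tuple[DetectionItem, float]]:
    """For every detection item that is still unfinished somewhere, hypothetically
    mark it complete in all scenes that need it, recompute the coverage (the
    second ratio result), and rank items by the coverage change relative to the
    first ratio result."""
    baseline = first_ratio_result(scenes)
    gains: dict[DetectionItem, float] = {}
    for item in DetectionItem:
        trial = deepcopy(scenes)
        touched = False
        for scene in trial:
            if scene.item_states.get(item) not in (None, DevState.DEVELOPED):
                scene.item_states[item] = DevState.DEVELOPED
                touched = True
        if touched:
            gains[item] = first_ratio_result(trial) - baseline
    # develop first the item whose completion raises scene coverage the most
    return sorted(gains.items(), key=lambda kv: kv[1], reverse=True)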
In an optional embodiment, the development information includes detection item development information corresponding to a target detection item, where the target detection item is any one of a plurality of preset detection items; step S103 may include:
S1032: determining the detection item development information corresponding to the target detection item according to the detection development state corresponding to the detection information under the target detection item.
In the embodiment of the application, the detection item development information is the development information corresponding to a detection item, and it can be determined according to the detection development state of the target detection item. Specifically, the algorithm coverage of each of the multiple target detection items is calculated separately; that is, the ratio of the number of detection items whose algorithm development is complete to the total number of the target detection items is calculated to obtain a third ratio result, and the third ratio result represents the development progress of the target detection item, so that algorithm developers can allocate resources reasonably.
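The third ratio result, read here as the development progress of a single detection item across the scenes that require it, could be computed as follows under the same illustrative model; this is one possible interpretation of the paragraph above.

```python
def third_ratio_result(scenes: list[DrivingScene], item: DetectionItem) -> float:
    """Share of scenes requiring `item` whose algorithm for that item is complete;
    represents the development progress of that detection item."""
    requiring = [s for s in scenes if item in s.item_states]
    if not requiring:
        return 0.0  # no scene requires this detection item (convention chosen here)
    done = sum(s.item_states[item] is DevState.DEVELOPED for s in requiring)
    return done / len(requiring)
```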
In an optional implementation, the sensing function of the perception item is implemented by preset sensors, where a preset sensor may be one or more of a laser radar, a millimeter-wave radar, a camera, and an ultrasonic radar. In practical application, an autonomous vehicle is provided with multiple preset sensors, and because the sensing ranges or sensed objects of different preset sensors may be the same, the development state information corresponding to the sensing information of each preset sensor is obtained separately. If the environmental information sensed by the different preset sensors in the same driving scene is consistent and at least one of the corresponding development states is the development-completed state, the development state information of the perception item required by that driving scene is the algorithm-development-completed state. If the environmental information sensed by the different preset sensors is consistent but the corresponding development states are all still in algorithm development and none is complete, the development state information of the perception item required by the target driving scene is in algorithm development. Repeated development of the perception item can thus be avoided, which improves the development efficiency of the perception item corresponding to the driving scene.
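Folding this per-sensor handling of the perception item into the same illustrative model could look roughly like the sketch below; the sensor names are examples only, and treating a single completed sensor pipeline as completing the perception item is the reading taken here.

```python
def merged_perception_state(sensor_states: dict[str, DevState]) -> DevState:
    """Merge the development states of sensor pipelines (lidar, camera, radar, ...)
    that perceive the same environment information in the same driving scene.
    One completed pipeline is treated as completing the perception item, so the
    same perception work is not developed twice."""
    states = sensor_states.values()
    if any(s is DevState.DEVELOPED for s in states):
        return DevState.DEVELOPED
    if any(s is DevState.IN_DEVELOPMENT for s in states):
        return DevState.IN_DEVELOPMENT
    return DevState.UNDEVELOPED


# example: camera pipeline finished while the lidar pipeline is still in development
scene5.item_states[DetectionItem.PERCEPTION] = merged_perception_state(
    {"camera": DevState.DEVELOPED, "lidar": DevState.IN_DEVELOPMENT}
)
```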
According to the technical scheme of the embodiment of the application, the method has the following technical effects:
According to the present application, a target driving scene and a plurality of preset detection items are obtained, wherein the target driving scene is any one of a plurality of driving scenes, so that the target driving scene is associated with the plurality of preset detection items. The detection information of the target driving scene under the plurality of preset detection items is then determined, and the development information of the automatic driving system is obtained according to the detection information corresponding to the plurality of driving scenes. The technical scheme provided by the application can therefore assist the development of the automatic driving system and improve the efficiency of developing vehicle driving functions.
An embodiment of the present application further provides a development device of an automatic driving system. Fig. 6 is a schematic structural diagram of the development device of an automatic driving system provided in this embodiment, and the device specifically includes the following modules:
the acquisition module 10: the method comprises the steps of acquiring a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes.
The detection information determination module 20: the method is used for determining the corresponding detection information of the target driving scene under a plurality of preset detection items.
Development information determination module 30: the method is used for obtaining development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
Preferably, the development information includes development state information of the target driving scene; the development information determination module 30 may include:
The development state information determination module 301 is configured to obtain the development state information of the target driving scene according to the detection information.
Preferably, the development status information determining module 301 includes:
The first state labeling module 3011 is configured to mark the development state information of the target driving scene as a developed state if the detection information of the target driving scene under the plurality of preset detection items indicates a completion state.
Preferably, the apparatus further comprises:
A first ratio result determination module, configured to determine the ratio of the number of target driving scenes whose development state information is the developed state to the number of the plurality of driving scenes to obtain a first ratio result, where the first ratio result represents the development progress of the plurality of driving scenes.
Preferably, the development state information determining module 301 includes:
The second state labeling module 3012 is configured to mark the development state information of the target driving scene as an undeveloped state or an in-development state if any detection information of the target driving scene under the plurality of preset detection items does not indicate a completion state.
Preferably, the apparatus further comprises:
A target driving scene obtaining module, configured to acquire the target driving scene whose development state information is the undeveloped state or the in-development state.
A priority development information determination module, configured to determine priority development information of a target detection item among the plurality of preset detection items to be developed based on a first preset value of the target driving scene, where the first preset value represents the preset importance degree of the target driving scene.
Preferably, the development information includes detection item development information corresponding to a target detection item, and the target detection item is any one of a plurality of preset detection items; the development information determination module 30 includes:
A detection item development information determination module, configured to determine the detection item development information corresponding to the target detection item according to the detection development state corresponding to the detection information under the target detection item.
Preferably, the detection information determining module 20 includes:
A first determination module, configured to determine the target detection items under the plurality of preset detection items corresponding to the target driving scene.
A second determination module, configured to determine the detection development information corresponding to the target detection item.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
The embodiment of the application provides development equipment of an automatic driving system, which comprises a processor and a memory, wherein at least one instruction, at least one program, a code set or an instruction set is stored in the memory, and the at least one instruction, the at least one program, the code set or the instruction set is loaded and executed by the processor to realize the development method of the automatic driving system provided by the method embodiment.
The memory may be used to store software programs and modules, and the processor executes various functional applications and performs data processing by running the software programs and modules stored in the memory. The memory may mainly include a program storage area and a data storage area, where the program storage area may store an operating system, application programs needed for the functions, and the like, and the data storage area may store data created according to the use of the device, and the like.
The development device of the automatic driving system may be a server. An embodiment of the present application further provides a schematic structural diagram of the server; referring to Fig. 7, the server 700 is configured to implement the development method provided in the foregoing embodiments. The server 700, whose configuration and performance may vary considerably, may include one or more processors 710, a memory 730, and one or more storage media 720 (for example, one or more mass storage devices) storing application programs 723 or data 722. The memory 730 and the storage medium 720 may be transient storage or persistent storage. The program stored in the storage medium 720 may include one or more modules, each of which may include a series of instruction operations for the server. Further, the processor 710 may be configured to communicate with the storage medium 720 to execute, on the server 700, the series of instruction operations stored in the storage medium 720. The server 700 may also include one or more power supplies 770, one or more wired or wireless network interfaces 750, one or more input/output interfaces 740, and/or one or more operating systems 721, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and the like.
Embodiments of the present application further provide a computer-readable storage medium, where the storage medium may be disposed in a server to store at least one instruction, at least one program, a code set, or a set of instructions related to implementing a method for developing an automatic driving system in the method embodiments, where the at least one instruction, the at least one program, the code set, or the set of instructions are loaded and executed by the processor to implement the method for developing an automatic driving system provided in the method embodiments.
It should be noted that the order of the embodiments of the present application is only for description and does not indicate the relative merit of the embodiments. Specific embodiments have been described above; other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in an order different from that in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or a sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system and server embodiments, since they are substantially similar to the method embodiments, the description is simple, and reference may be made to some descriptions of the method embodiments for relevant points.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method of developing an autopilot system, the method comprising:
the method comprises the steps of obtaining a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes;
determining corresponding detection information of the target driving scene under the plurality of preset detection items;
and obtaining development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
2. The method of claim 1, wherein the development information includes development status information of the target driving scenario; the obtaining of development information of the automatic driving system according to the detection information includes:
and obtaining development state information of the target driving scene according to the detection information.
3. The method according to claim 2, wherein the deriving development state information of the target driving scenario according to the detection information includes:
and if the detection information of the target driving scene under the preset detection items indicates a completion state, marking the development state information of the target driving scene as a developed state.
4. The method according to claim 3, further comprising, after the step of obtaining development information of the automatic driving system based on the detection information corresponding to the plurality of driving scenarios:
determining the ratio of the number of the target driving scenes with the development state information being the developed state to the number of the plurality of driving scenes to obtain a first ratio result, wherein the first ratio result represents the development progress of the plurality of driving scenes.
5. The method according to claim 2, wherein the deriving development state information of the target driving scenario according to the detection information includes:
if any detection information of the target driving scene under the preset detection items does not indicate a completion state, marking the development state information of the target driving scene as an undeveloped state or an in-development state.
6. The method of claim 5, further comprising:
acquiring the target driving scene whose development state information is the undeveloped state or the in-development state;
and determining priority development information of a target detection item in a plurality of preset detection items to be developed based on a first preset value of the target driving scene, wherein the first preset value represents a preset importance degree of the target driving scene.
7. The method according to claim 1, wherein the development information includes detection item development information corresponding to a target detection item, and the target detection item is any one of the plurality of preset detection items; the obtaining of the development information of the automatic driving system according to the detection information comprises:
and determining the detection item development information corresponding to the target detection item according to the detection development state corresponding to the detection information under the target detection item.
8. The method according to claim 7, wherein the determining the detection information corresponding to the target driving scene under the plurality of preset detection items comprises:
determining the target detection items under the plurality of preset detection items corresponding to the target driving scene;
and determining detection development information corresponding to the target detection item.
9. An apparatus for developing an autopilot system, the apparatus comprising:
an acquisition module, configured to acquire a target driving scene and a plurality of preset detection items, wherein the target driving scene is any one of a plurality of driving scenes;
a detection information determination module, configured to determine the detection information of the target driving scene under the plurality of preset detection items;
a development information determination module, configured to obtain development information of the automatic driving system according to the detection information corresponding to the plurality of driving scenes.
10. A computer-readable storage medium, wherein at least one instruction or at least one program is stored in the storage medium, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the development method of the automatic driving system according to any one of claims 1 to 8.
CN202211000753.3A 2022-08-16 2022-08-16 Development method and device of automatic driving system and storage medium Pending CN115437609A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211000753.3A CN115437609A (en) 2022-08-16 2022-08-16 Development method and device of automatic driving system and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211000753.3A CN115437609A (en) 2022-08-16 2022-08-16 Development method and device of automatic driving system and storage medium

Publications (1)

Publication Number Publication Date
CN115437609A (en) 2022-12-06

Family

ID=84241987

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211000753.3A Pending CN115437609A (en) 2022-08-16 2022-08-16 Development method and device of automatic driving system and storage medium

Country Status (1)

Country Link
CN (1) CN115437609A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117724693A (en) * 2024-02-07 2024-03-19 深圳海星智驾科技有限公司 Development method, system, computer equipment and storage medium of autopilot software
CN117724693B (en) * 2024-02-07 2024-05-24 深圳海星智驾科技有限公司 Development method, system, computer equipment and storage medium of autopilot software

Similar Documents

Publication Publication Date Title
CN109213134B (en) Method and device for generating automatic driving strategy
CN110809790B (en) Vehicle information storage method, vehicle travel control method, and vehicle information storage device
CN112286206B (en) Automatic driving simulation method, system, equipment, readable storage medium and platform
CN111680362B (en) Automatic driving simulation scene acquisition method, device, equipment and storage medium
CN109064763A (en) Test method, device, test equipment and the storage medium of automatic driving vehicle
CN111832652B (en) Training method and device for decision model
US10755565B2 (en) Prioritized vehicle messaging
CN112579464A (en) Verification method, device and equipment of automatic driving algorithm and storage medium
CN107745711B (en) Method and device for determining route in automatic driving mode
WO2022052856A1 (en) Vehicle-based data processing method and apparatus, computer, and storage medium
CN111874007A (en) Knowledge and data drive-based unmanned vehicle hierarchical decision method, system and device
CN115437609A (en) Development method and device of automatic driving system and storage medium
CN113291320A (en) Vehicle track prediction method, device, equipment and storage medium
CN112671487B (en) Vehicle testing method, server and testing vehicle
CN115165398A (en) Vehicle driving function test method and device, computing equipment and medium
CN113428178B (en) Control method, device and medium for automatically driving vehicle and vehicle
CN112179359B (en) Map matching method and device, electronic equipment and storage medium
CN113593221A (en) Information value evaluation type driving system, internet vehicle system and data transmission method
CN116088538B (en) Vehicle track information generation method, device, equipment and computer readable medium
CN113758492A (en) Map detection method and device
CN114419758B (en) Vehicle following distance calculation method and device, vehicle and storage medium
CN115320636A (en) Automatic driving method, device and storage medium
CN113077631B (en) V2X vehicle identification method, device, equipment and medium
CN115454082A (en) Vehicle obstacle avoidance method and system, computer readable storage medium and electronic device
CN111104611B (en) Data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination