CN115649186B - Unmanned operation method and device based on manned operation, electronic equipment and storage medium - Google Patents

Unmanned operation method and device based on manned operation, electronic equipment and storage medium

Info

Publication number
CN115649186B
CN115649186B (application CN202211670197.0A)
Authority
CN
China
Prior art keywords
manned
area
unmanned
vehicle
unmanned vehicle
Prior art date
Legal status
Active
Application number
CN202211670197.0A
Other languages
Chinese (zh)
Other versions
CN115649186A (en)
Inventor
Zhang Lei (张磊)
Current Assignee
Beijing Yikong Zhijia Technology Co Ltd
Original Assignee
Beijing Yikong Zhijia Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yikong Zhijia Technology Co Ltd
Priority to CN202211670197.0A
Publication of CN115649186A
Application granted
Publication of CN115649186B

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present disclosure relates to an unmanned operation method, apparatus, electronic device, and storage medium based on manned driving. The method includes: forming a consist of an unmanned vehicle and a manned vehicle; the unmanned vehicle acquiring manned data provided by the manned vehicle in the consist; and, when the unmanned vehicle is judged, based on job information in the manned data, to have traveled to a work area, executing the corresponding unmanned job in that area. In this way, the unmanned vehicle can work in the target area using the manned data it obtains, so that human driving ensures that optimal decisions are maintained under different environmental conditions and changing scenes, while the unmanned vehicle exploits its own advantages to adapt to real-time dynamic change. The two complement each other's capabilities, improving adaptability to the environment and the scene.

Description

Unmanned operation method and device based on manned operation, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of unmanned technologies, and in particular, to an unmanned operation method, apparatus, electronic device, and storage medium based on manned operation.
Background
Unmanned driving systems typically identify the scene they are in through various types of sensors and associated algorithms. Improving an unmanned system's ability to perceive and recognize scenes requires continuously improving the processing capability of the sensors and algorithms, for example by increasing the types and number of sensors, improving sensor performance, and collecting enough scene data for deep-learning training.
However, in unmanned driving fields such as mining, the terrain changes in the loading area cannot be exhaustively enumerated as excavation progresses. Simply upgrading the sensors and algorithms to improve the unmanned driving system's scene recognition is costly and inefficient.
Disclosure of Invention
The disclosure provides an unmanned operation method and apparatus based on manned driving, an electronic device, and a storage medium.
According to a first aspect of the present disclosure, there is provided an unmanned operation method based on manned driving, the method comprising:
forming a consist of an unmanned vehicle and a manned vehicle;
the unmanned vehicle acquiring manned data provided by the manned vehicle in the consist, wherein the manned data includes job information manually marked by a user while driving the manned vehicle to work in a target area, the job information being associated with real-time positioning information of the manned vehicle at the time of marking;
judging, based on the job information, whether the unmanned vehicle has traveled to a work area;
if yes, executing the corresponding unmanned job in the work area.
According to a second aspect of the present disclosure, there is provided an unmanned operation apparatus based on manned driving, the apparatus comprising:
a grouping module for forming a consist of an unmanned vehicle and a manned vehicle;
a data acquisition module for the unmanned vehicle to acquire the manned data provided by the manned vehicle in the consist, wherein the manned data includes job information manually marked by a user while driving the manned vehicle to work in a target area, the job information being associated with real-time positioning information of the manned vehicle at the time of marking;
a judging module for judging, based on the job information, whether the unmanned vehicle has traveled to a work area;
and a job module for executing the corresponding unmanned job in the work area when the unmanned vehicle travels to the work area.
According to a third aspect of the present disclosure, an electronic device is provided. The electronic device includes: a memory and a processor, the memory having stored thereon a computer program, the processor implementing the method as described above when executing the program.
According to a fourth aspect of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above-described method of the present disclosure.
According to the unmanned operation method, apparatus, electronic device, and storage medium based on manned driving, a consist of an unmanned vehicle and a manned vehicle is formed, and the unmanned vehicle acquires the manned data provided by the manned vehicle in the consist, the manned data including job information manually marked by a user while driving the manned vehicle to work in a target area, associated with the real-time positioning information of the manned vehicle at the time of marking; when the unmanned vehicle is judged, based on the job information, to have traveled to a work area, the corresponding unmanned job is executed in that area. In this way, the unmanned vehicle can work in the target area using the manned data it obtains, so that human driving ensures optimal decisions under different environmental conditions and changing scenes, while the unmanned vehicle exploits its own advantages to adapt to real-time dynamic change; the two complement each other's capabilities, improving adaptability to the environment and the scene.
Drawings
Further details, features and advantages of the present disclosure are disclosed in the following description of exemplary embodiments, with reference to the following drawings, wherein:
FIG. 1 is a schematic illustration of a scenario provided by an exemplary embodiment of the present disclosure;
FIG. 2 is a flow chart of an unmanned operation method based on manned driving provided in an exemplary embodiment of the present disclosure;
FIG. 3 is a schematic block diagram of functional modules of an unmanned operation apparatus based on manned driving provided in an exemplary embodiment of the present disclosure;
FIG. 4 is a block diagram of an electronic device provided in an exemplary embodiment of the present disclosure;
fig. 5 is a block diagram of a computer system according to an exemplary embodiment of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below. It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will understand them as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
In unmanned driving fields such as mining, terrain changes in the loading area cannot be exhaustively enumerated as excavation progresses; moreover, in scenes such as coal mining, the laser radar may receive no usable beam reflection from the coal. Improving the unmanned system's scene recognition solely by upgrading the processing capability of sensors and algorithms is inefficient.
Therefore, rather than improving the unmanned system's scene recognition solely by upgrading sensors and algorithms, embodiments of the present disclosure form a consist of a manned vehicle and unmanned vehicles: the manned vehicle performs normal driving jobs with a driver, the scene is recognized automatically, and the recognition result is shared with the unmanned vehicles through a cloud platform or by direct transmission. In this embodiment, the manned vehicle shares its manned data with the cloud platform or the unmanned vehicles, and the cloud platform or the unmanned vehicles recognize the scene in the manned data to obtain the travel track and the work areas in the work environment, facilitating operation by the unmanned vehicles.
This combines human experience with the unmanned system, so that the unmanned vehicle can work in the target area based on the manned data. The manned vehicle in the embodiments of the present disclosure may be driven by an on-board driver or remotely controlled by a human operator.
In the embodiments provided by the disclosure, the manned vehicle and the unmanned vehicle can form a consist; the manned vehicle is driven by a driver to perform normal jobs, and after one full job cycle is completed, the travel track and the recognized work scenes from that cycle (the work scenes may be work areas) are obtained in sequence.
In the embodiments provided by the present disclosure, by grouping the manned vehicle with the unmanned vehicle, the two share the same work environment and perform the same work content. The manned vehicle may be identical to the unmanned vehicle, with the same sensor configuration and unmanned-driving capability; alternatively, the manned vehicle may lack unmanned-driving capability, with sensor types and capabilities degraded relative to the unmanned vehicle, as long as it retains the basic data acquisition and processing capability required by the business. While the manned vehicle operates, the basic data it collects can be recorded and stored in real time. The basic data includes high-precision positioning data and the like; in the embodiment, the basic data collected by the manned vehicle is processed and scene recognition is performed to obtain the travel track and the work areas, and the data including the travel track and the work areas serves as the manned data.
In an embodiment, the collected raw data is processed into manned data. For example, the manned vehicle may itself process the data it collects into manned data including the travel track and work areas; it may send the collected data to a cloud platform, which processes it into manned data; it may send the collected data to the unmanned vehicle by V2V (vehicle-to-vehicle) communication, and the unmanned vehicle processes it into manned data; or the manned vehicle may send already-processed manned data to the cloud platform or the unmanned vehicle. As for when to transmit, the manned vehicle may transmit after completing one full job cycle, or after completing several full cycles, so that the scene recognized from the data is more accurate and the incompleteness of data collected in a single cycle is avoided. In the embodiment, processing the manned data may take the form of screening, optimizing, or integrating.
In the scenario where the manned vehicle sends its collected data to the unmanned vehicles or the cloud platform by real-time transmission, one manned vehicle and several unmanned vehicles may form a consist: the manned vehicle works normally in the target area and collects data while working, and the unmanned vehicles in the consist work in the target area based on the resulting manned data. For example, an unmanned vehicle obtains the manned data provided by the manned vehicle in the consist, where the manned data may include job information manually marked by the user while driving the manned vehicle to work in the target area, associated with the real-time positioning information of the manned vehicle at the time of marking. The unmanned vehicle can then judge, based on the job information, whether it has traveled to a work area; if so, it works according to the manually marked job information, that is, it executes the corresponding unmanned job. The job information may include the type of travel track and the content, boundary, or range of a work area; the content of a work area includes a travel area, a queuing area, a position to be loaded, or a loading position.
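By way of illustration only, the manned data described above might be modeled as follows (a minimal sketch; the class and field names are assumptions, not part of the disclosure):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Position:
    """A real-time positioning sample from the manned vehicle."""
    x: float          # easting, metres (assumed local frame)
    y: float          # northing, metres
    timestamp: float  # seconds since epoch

@dataclass
class JobMark:
    """Job information manually marked by the driver, associated with
    the vehicle's positioning at the moment the mark was made."""
    kind: str           # "track_type" or "work_area" (assumed categories)
    value: str          # e.g. "empty_track", "queuing_area", "loading_position"
    position: Position  # positioning of the manned vehicle when marking

@dataclass
class MannedData:
    """What a manned vehicle shares with the unmanned vehicles in its consist."""
    trajectory: List[Position] = field(default_factory=list)
    marks: List[JobMark] = field(default_factory=list)
```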
In an embodiment, the target area may be, for example, a strip-mine area, which can be subdivided into a number of work areas such as a travel area, a queuing area, and positions to be loaded or loading positions. When the unmanned vehicle travels to a given work area within the target area, it performs the corresponding job: on reaching the queuing area it queues with other vehicles; on reaching the loading area it performs the loading job; and so on.
In addition, the real-time positioning information may be obtained by a positioning sensor on the manned vehicle. In an embodiment, the position to be loaded may be the waiting stop point of fig. 1, and the loading position may be the loading stop point of fig. 1.
In an embodiment, the types of travel track may include a heavy-load track and a no-load (empty) track, illustrated with the loading process of an unmanned vehicle in the mine scene shown in fig. 1. After entering the loading area, the manned vehicle proceeds to a loading stop point while still empty, so its travel track from the loading-area entrance to the loading stop point is an empty track; after loading is completed at the stop point, the travel track between the loading stop point and the loading-area exit is a heavy-load track.
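Reusing the Position type from the sketch above, the split between the two track types could look like this (the timestamp-based rule is an assumption; the text only states that completion of loading divides the empty track from the heavy-load track):

```python
from typing import List, Tuple

def split_tracks(trajectory: List[Position],
                 loading_done_time: float) -> Tuple[List[Position], List[Position]]:
    """Split a loading-area trajectory at the moment loading completes:
    samples up to that moment form the empty (no-load) track, samples
    after it form the heavy-load track."""
    empty_track = [p for p in trajectory if p.timestamp <= loading_done_time]
    heavy_track = [p for p in trajectory if p.timestamp > loading_done_time]
    return empty_track, heavy_track
```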
In an exemplary embodiment provided by the present disclosure, consider the loading process of an unmanned vehicle in a mine scenario. In general, when an unmanned vehicle reaches the loading-area entrance, it must determine a waiting position and plan a travel track to it, travel there along the planned track, move to the loading position upon the excavator's request, and drive away from the loading position once loading is complete. Many of these links require the unmanned vehicle to perceive the environment and the scene, such as determining the waiting position, planning the track to the waiting position, determining the loading position, planning the track to the loading position, and planning the track away from it, and each may raise safety and efficiency problems.
Regarding safety, determining the waiting position and planning the travel tracks to the waiting and loading positions depend on base maps and perception capability; in scenes similar to coal mining, where the recognition capability of the laser radar is limited or lost entirely, basic safety cannot be guaranteed.
Regarding efficiency, whether the positions and tracks chosen while determining the waiting position and planning the travel track are reasonable directly affects the vehicle's operating efficiency, and those choices depend strongly on accurate recognition of the environment and the scene; yet as the job progresses, the environment and scene of the loading area keep changing, and the recognition algorithm fails.
To address these safety and efficiency problems, improving sensor capability and the recognition of environment and scene would greatly increase cost, while strictly restricting the operating boundary of the whole system by limiting where the unmanned vehicle may be used would reduce its business adaptability.
Therefore, to solve the above safety and efficiency problems, again taking the loading process in a mine scene as an example, several unmanned vehicles and one manned vehicle can form a work consist. During the job, the manned vehicle enters the work scene, proceeds to the waiting area to queue, and automatically collects manned data while driving, including positioning data and manual marks. Following the excavator's instruction, it proceeds to the loading position and stops there, collecting driving data and the stop position along the way. After the excavator finishes loading, the driver drives away from the loading position, and the vehicle automatically records the corresponding driving data. The manned vehicle can share the collected manned data with the other unmanned vehicles, which then travel and stop based on the shared travel track and stop positions. Of course, the collected data may instead be sent to the cloud platform or the unmanned vehicle for processing, as described above and not repeated here.
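A sketch of the collection loop on the manned vehicle, reusing the types above; the gps, button_events, publish, and job_finished interfaces are all assumptions introduced for illustration:

```python
import time

def record_job_cycle(gps, button_events, publish, job_finished) -> MannedData:
    """Collect manned data over one complete job cycle, then share it.

    Assumed interfaces:
      gps.read()           -> Position
      button_events.poll() -> (kind, value) mark from a display button, or None
      publish(data)        -> send via V2V or to the cloud platform
      job_finished()       -> True once the cycle (e.g. loading) is complete
    """
    data = MannedData()
    while not job_finished():
        pos = gps.read()
        data.trajectory.append(pos)
        mark = button_events.poll()
        if mark is not None:
            kind, value = mark
            # associate the manual mark with the positioning at marking time
            data.marks.append(JobMark(kind, value, pos))
        time.sleep(0.1)  # sampling period (assumed)
    publish(data)        # share only after one complete job cycle
    return data
```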
In an embodiment, the manned vehicle and the unmanned vehicle are, for example, transport vehicles working in a target area, and the target area may be a loading area including a loading-area entrance, an empty-vehicle travel area, a queuing area, a waiting area, a loading area, a heavy-vehicle travel area, and a loading-area exit. The manned vehicle enters through the entrance as an empty vehicle and travels in the empty-vehicle travel area; on reaching the queuing area it queues with other vehicles to wait for loading; after queuing it enters the waiting area, then stops at the specific loading spot for loading according to the excavator's instruction; once loaded it is a heavy vehicle and drives out through the heavy-vehicle travel area and the loading-area exit. The manned vehicle generates manned data throughout this process, and the unmanned vehicle can complete its job in the loading area according to that data, following the same flow, which is not repeated here. Of course, the manned data the unmanned vehicle relies on may be processed data, the processing being as described above.
Specifically, fig. 1 is a schematic view of a scenario provided in an embodiment of the disclosure. In an embodiment, both the unmanned vehicle and the manned vehicle may be transport vehicles; for example, in a mine scenario, ore is transported by such vehicles.
In fig. 1, the manned vehicle enters at the loading-area entrance, passes through the entrance area into the queuing area, then enters the waiting area and stops at the waiting stop point, and finally stops at the loading stop point of the loading area on the excavator's instruction. After the excavator completes the loading job, the driver drives the vehicle away from the loading stop point. From entering the loading-area entrance to driving away from the loading area, the manned vehicle generates manned data, including its positioning data and manual marks for each stop point and each area. From the positioning data, travel tracks can be generated, including the track from the entrance to the loading stop point and the track leaving the loading stop point; from the manned data, the individual work areas such as the entrance area, queuing area, waiting area, and loading area can be identified. Manned data containing the travel tracks and work areas is thus obtained, enabling the unmanned vehicle to work on its basis.
In an embodiment, when identifying the work areas: for example, the loading area may lie between the waiting stop point and the loading stop point; the waiting area may extend from the waiting stop point to a position 30 m before the vehicle's completed reversing position; and the queuing area lies between the loading-area entrance and the waiting area. The individual stop points can be determined by manual calibration or by preset positions on the map.
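The partition above could be expressed as follows, reducing the geometry to distance along the route from the loading-area entrance (a 1-D simplification; the 30 m offset and the ordering of the areas come from the paragraph above, everything else is an assumption):

```python
def classify_point(dist_from_entrance: float,
                   waiting_stop_dist: float,
                   loading_stop_dist: float) -> str:
    """Map a distance along the route to a work area: queuing area up to
    the waiting stop point, waiting area from the waiting stop point to
    30 m further on (standing in for the reversing position), loading
    area from there to the loading stop point."""
    if dist_from_entrance < waiting_stop_dist:
        return "queuing_area"
    if dist_from_entrance < waiting_stop_dist + 30.0:
        return "waiting_area"
    if dist_from_entrance <= loading_stop_dist:
        return "loading_area"
    return "travel_area"
```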
In this way, the unmanned vehicle can execute its job according to the travel track and stop position of the manned vehicle; in an embodiment, the unmanned vehicle can also fine-tune the travel track and stop position using its own perception information to adapt to real-time environmental changes. In addition, the manned vehicle may be driven by remote control, and depending on business requirements one manned vehicle can form a consist with several unmanned vehicles, greatly reducing the cost increase caused by manual intervention.
Given that mine scenes change slowly and continuously, once one manned-driving job cycle has produced a scene recognition result, that result remains reasonably accurate over the span of several subsequent job cycles, so a single manned vehicle can in practice support the automatic operation of several grouped unmanned vehicles. On the whole, therefore, the approach avoids the cost increase of added labor; on the contrary, human driving ensures that optimal decisions are maintained in the slowly changing scene, while the unmanned vehicles exploit their own advantages to adapt to the real-time dynamic environment. Compared with the two solutions mentioned above, this is more cost-effective and improves scene adaptability.
Based on the above, the embodiments of the present disclosure first provide an unmanned operation method based on manned driving. As shown in fig. 2, the method may include the following steps:
in step S210, a consist of an unmanned vehicle and a manned vehicle is formed.
In the embodiment of the disclosure, a manned vehicle whose work content is the same as or similar to the unmanned vehicle's, or whose vehicle type matches the unmanned vehicle's, can be grouped with it, so that the unmanned vehicle can work in the target area directly on the basis of the manned data generated by the manned vehicle.
In step S220, the unmanned vehicle obtains the manned data provided by the manned vehicles within the consist.
In an embodiment, the unmanned vehicle may obtain the manned data through a cloud platform, near-field communication, or short-range communication. When it obtains the data through the cloud platform, the platform can process the manned data, for example by optimizing, screening, or integrating it, and send the processed data to the unmanned vehicle so that it can work in the target area accordingly.
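A sketch of the cloud-side processing step; screening, optimizing, and integrating are named in the text, but the concrete rules below (minimum sample count, time-sorting) are assumptions:

```python
from typing import List

def cloud_process(batches: List[MannedData]) -> MannedData:
    """Screen, integrate, and optimize manned data from one or more job
    cycles before forwarding it to the unmanned vehicles in the consist."""
    merged = MannedData()
    for batch in batches:
        if len(batch.trajectory) < 10:  # screening: drop incomplete runs (assumed threshold)
            continue
        merged.trajectory.extend(batch.trajectory)  # integrating
        merged.marks.extend(batch.marks)
    merged.trajectory.sort(key=lambda p: p.timestamp)  # optimizing: time order
    return merged
```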
The manned data includes job information manually marked by the user while driving the manned vehicle to work in the target area, associated with the real-time positioning information of the manned vehicle at the time of marking. The job information in this embodiment includes the type of the travel track and the content, boundary, or range of a work area. Where the unmanned vehicle is used for transportation, the content of a work area may include at least one of a travel area, a queuing area, a position to be loaded, and a loading position. The real-time positioning information may be obtained by a positioning sensor on the manned vehicle.
In the embodiments provided by the present disclosure, the manned vehicle may be identical to the unmanned vehicle, with the same sensor configuration and unmanned-driving capability; alternatively, it may lack unmanned-driving capability, with sensor types and capabilities degraded relative to the unmanned vehicle, as long as it retains the basic data acquisition and processing capability the business requires. While the manned vehicle operates, the basic data it collects, including high-precision positioning data and the like, can be recorded and stored in real time; in this embodiment, that basic data together with the information manually marked by the user serves as the manned data. For example, the manual marking may be done via a button on a display device.
In an embodiment, the travel track may be obtained from the positioning data in the manned data, and the work areas are obtained by scene recognition. As seen in fig. 1 and the corresponding embodiments, the travel track may run from the loading-area entrance to the loading stop point and away from it, and may be further divided by stop point: for example, given a loading-area entrance, a waiting stop point, and a loading stop point, the track divides into the segment from the entrance to the waiting stop point, the segment from the waiting stop point to the loading stop point, and so on. The recognized job scenes in the target area may include a travel area, a queuing area, a position to be loaded, or a loading position; these areas and positions can be obtained from the user's manual marks made while driving the manned vehicle to work in the target area.
In an embodiment, the area between the loading-area entrance and the waiting stop point may serve as the queuing area; the area between the waiting stop point and the loading stop point as the waiting area; and the area around the loading stop point as the loading area.
In the embodiment provided by the disclosure, the work areas in the target area can be determined from the manually marked information: for example, the area-marking information in the manned data is acquired, and the work areas in the target area are determined based on it.
In the embodiment of the disclosure, the unmanned vehicle can thus complete, in the target area, a job with the same content as the manned vehicle's, based on the obtained travel track and work areas, without re-planning the travel track or re-recognizing the scene.
In step S230, it is determined whether the unmanned vehicle is traveling to the work area based on the work information.
When the unmanned vehicle travels to the work area, in step S240, a corresponding unmanned work is performed in the work area.
The manned data includes job information manually marked by the user while driving the manned vehicle to work in the target area, associated with the real-time positioning information of the manned vehicle at the time of marking. In the embodiment, the target area may be further divided, according to the content of the work areas, into a travel area, a queuing area, a waiting area, a loading area, a position to be loaded, or a loading position, so the vehicle must execute the corresponding work content when entering each different work area.
It is therefore necessary to judge whether the unmanned vehicle has traveled to a work area; when it has, it can execute the corresponding unmanned job there based on the manned data. For example, when the unmanned vehicle travels to the queuing area, it queues with other vehicles; in the waiting area, it waits for loading; in the loading area, it performs the loading job.
In the embodiment, since the unmanned vehicle and the manned vehicle form a consist, when the unmanned vehicle travels to a given work area its work content there may be the same as the manned vehicle's: it can travel along the travel track in the manned data and perform the unmanned job in the work area. For example, when entering the queuing area, if other vehicles are detected there, the unmanned vehicle queues; when entering the waiting area, it waits for the excavator's instruction, and on receiving it drives to the corresponding loading stop point and stops, then drives away from the stop point after completing the loading job.
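The per-area behavior could be dispatched as below; vehicle is an assumed controller interface, and the method names are placeholders for whatever the on-board planner exposes:

```python
def on_enter_work_area(area: str, vehicle) -> None:
    """Execute the unmanned job matching the work area just entered."""
    if area == "queuing_area":
        vehicle.queue_up()            # line up behind detected vehicles
    elif area == "waiting_area":
        vehicle.wait_for_excavator()  # hold until the excavator instructs
    elif area == "loading_area":
        vehicle.stop_and_load()       # stop at the loading point, load, then leave
    else:
        vehicle.follow_track()        # ordinary travel along the manned track
```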
In the embodiment, the manned data is derived from the data collected while the manned vehicle works in the target area together with the manually marked information, so when the unmanned vehicle works in the target area based on that data, the data may be what the manned vehicle obtained after one full job cycle there. One or more unmanned vehicles can then work in the target area according to the manned data, and in doing so they verify its validity; the manned data can be refined or dynamically adjusted so that other vehicles can complete their jobs better according to the refined or adjusted data.
According to the unmanned operation method based on manned driving provided by the embodiments of the present disclosure, a consist of an unmanned vehicle and a manned vehicle is formed, and the unmanned vehicle obtains the manned data provided by the manned vehicle in the consist, the manned data including job information manually marked by the user while driving the manned vehicle to work in the target area, associated with the real-time positioning information of the manned vehicle at the time of marking; when the unmanned vehicle is judged, based on the job information, to have traveled to a work area, the corresponding unmanned job is executed there. In this way, the unmanned vehicle can work in the target area using the manned data it obtains, so that human driving ensures optimal decisions under different environmental conditions and in scenes that change slowly over time, while the unmanned vehicle exploits its own advantages to adapt to real-time dynamic change; the two complement each other's capabilities, improving adaptability to the environment and the scene.
Based on the above embodiment, in still another embodiment provided in the present disclosure, the step S240 may further include the steps of:
in step S241, information to be worked of the unmanned vehicle is acquired.
In step S242, a target travel track and a target work area are acquired, the target travel track and target work area being determined from the travel tracks and work areas, respectively, based on the information to be worked.
In step S243, the unmanned vehicle travels along the target travel track and works when entering the target work area.
In the embodiment provided by the disclosure, the manned data may be data obtained when one manned vehicle works in the target area, or may be data obtained when a plurality of manned vehicles work in the target area.
Where the manned data was obtained by one manned vehicle working in the target area: if that vehicle's work content is the same as the unmanned vehicle's, the unmanned vehicle can work directly according to the travel track and work areas obtained from the data. If the manned vehicle's work in the target area covers more content, while the unmanned vehicle's job covers only one or some items of it, then, since the travel tracks and work areas are all derived from the manned data, the target travel track and target work area matching the unmanned vehicle's job information must be selected from them, so that the unmanned vehicle can travel along the target track and work when entering the target work area.
Where the manned data was obtained by several manned vehicles working in the target area, it may likewise cover more work content than the unmanned vehicle's job; again the target travel track and target work area matching the unmanned vehicle's job information must be selected from the travel tracks and work areas, so that the unmanned vehicle can travel along the target track and work when entering the target work area.
Determining the target travel track and target work area ensures that, for its particular job information, the unmanned vehicle travels along the right track and works when entering the right area.
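Selecting the target travel track and target work area might reduce to a match on job content, as in this sketch (the LabeledTrack record and the matching rule are assumptions, reusing the Position type above):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LabeledTrack:
    job_content: str       # e.g. "loading", "dumping" (assumed labels)
    track: List[Position]  # the travel track for that job
    work_area: str         # the work area the track serves

def select_targets(candidates: List[LabeledTrack],
                   to_be_worked: str) -> LabeledTrack:
    """From the tracks and areas in the manned data, pick the one whose job
    content matches this unmanned vehicle's pending job information."""
    for c in candidates:
        if c.job_content == to_be_worked:
            return c
    raise LookupError(f"no manned track matches job {to_be_worked!r}")
```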
It should be noted that, the method provided by the embodiment of the present disclosure may be applied to an unmanned vehicle, and may also be applied to a device such as a console for controlling the unmanned vehicle, and the embodiment of the present disclosure is not limited thereto.
Where each functional module is divided according to its corresponding function, the embodiment of the disclosure provides an unmanned operation apparatus based on manned driving, which may be a server or a chip applied to a server. Fig. 3 is a schematic block diagram of the functional modules of an unmanned operation apparatus based on manned driving according to an exemplary embodiment of the present disclosure. As shown in fig. 3, the apparatus includes:
a grouping module 10 for forming a consist of an unmanned vehicle and a manned vehicle;
a data acquisition module 20 for the unmanned vehicle to acquire the manned data provided by the manned vehicle in the consist, wherein the manned data includes job information manually marked by a user while driving the manned vehicle to work in a target area, the job information being associated with the real-time positioning information of the manned vehicle at the time of marking;
a judging module 30 for judging, based on the job information, whether the unmanned vehicle has traveled to a work area;
and a job module 40 for executing the corresponding unmanned job in the work area when the unmanned vehicle travels to the work area.
In yet another embodiment provided by the present disclosure, the job information includes the type of the travel track and the content, boundary, or range of a work area.
In yet another embodiment provided by the present disclosure, the unmanned vehicle is used for transportation, and the content of the work area includes a travel area, a queuing area, a position to be loaded, or a loading position.
In yet another embodiment provided by the present disclosure, the manual marking is performed through a button on a display device.
In yet another embodiment provided by the present disclosure, the real-time positioning information is obtained by a positioning sensor on the manned vehicle.
In yet another embodiment provided by the present disclosure, the unmanned vehicle obtains the manned data through a cloud platform, a near field communication technology, or a short range communication technology.
In yet another embodiment provided by the present disclosure, the data acquired by the unmanned vehicle from the cloud platform is processed by the cloud platform.
In yet another embodiment provided by the present disclosure, the processing of the cloud platform includes optimizing, screening, or integrating the manned data.
Since the apparatus embodiments correspond to the above method embodiments, specific reference may be made to the description of the above method embodiments, and details are not repeated here.
According to the unmanned operation apparatus based on manned driving provided by the embodiments of the present disclosure, a consist of an unmanned vehicle and a manned vehicle is formed, and the unmanned vehicle obtains the manned data provided by the manned vehicle in the consist, the manned data including job information manually marked by the user while driving the manned vehicle to work in the target area, associated with the real-time positioning information of the manned vehicle at the time of marking; when the unmanned vehicle is judged, based on the job information, to have traveled to a work area, the corresponding unmanned job is executed there. In this way, the unmanned vehicle can work in the target area using the manned data it obtains, so that human driving ensures optimal decisions under different environmental conditions and changing scenes, while the unmanned vehicle exploits its own advantages to adapt to real-time dynamic change; the two complement each other's capabilities, improving adaptability to the environment and the scene.
The embodiment of the disclosure also provides an electronic device, including: at least one processor; a memory for storing the at least one processor-executable instruction; wherein the at least one processor is configured to execute the instructions to implement the above-described methods disclosed by embodiments of the present disclosure.
Fig. 4 is a schematic structural diagram of an electronic device according to an exemplary embodiment of the present disclosure. As shown in fig. 4, the electronic device 1800 includes at least one processor 1801 and a memory 1802 coupled to the processor 1801; the processor 1801 may perform the corresponding steps of the methods disclosed in the embodiments of the present disclosure described above.
The processor 1801 may also be referred to as a central processing unit (CPU), and may be an integrated circuit chip with signal-processing capability. The steps of the above methods disclosed in the embodiments of the present disclosure may be completed by integrated logic circuits of hardware or by instructions in the form of software in the processor 1801. The processor 1801 may be a general-purpose processor, a digital signal processor (DSP), an ASIC, a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The steps of a method disclosed in connection with the embodiments of the present disclosure may be embodied as being executed directly by a hardware processor, or by a combination of hardware and software modules in the processor. The software modules may reside in a storage medium well known in the art, such as random access memory, flash memory, read-only memory, programmable read-only memory or electrically erasable programmable memory, or registers, here the memory 1802. The processor 1801 reads the information in the memory 1802 and completes the steps of the above methods in combination with its hardware.
In addition, when various operations/processes according to the present disclosure are implemented by software and/or firmware, the programs constituting the software may be installed from a storage medium or a network onto a computer system with a dedicated hardware structure, for example the computer system 1900 shown in fig. 5, which is capable of performing various functions, including the functions described above, when the various programs are installed. Fig. 5 is a block diagram of a computer system according to an exemplary embodiment of the present disclosure.
Computer system 1900 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other suitable computers. It may also represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions are exemplary only and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 5, the computer system 1900 includes a computing unit 1901, and the computing unit 1901 may perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1902 or a computer program loaded from a storage unit 1908 into a Random Access Memory (RAM) 1903. In the RAM 1903, various programs and data required for the operation of the computer system 1900 may also be stored. The computing unit 1901, ROM 1902, and RAM 1903 are connected to each other via a bus 1904. An input/output (I/O) interface 1905 is also connected to bus 1904.
Various components in computer system 1900 are connected to I/O interface 1905, including: an input unit 1906, an output unit 1907, a storage unit 1908, and a communication unit 1909. The input unit 1906 may be any type of device capable of inputting information to the computer system 1900, and the input unit 1906 may receive input numeric or character information and generate key signal inputs related to user settings and/or function controls of the electronic device. The output unit 1907 may be any type of device capable of presenting information and may include, but is not limited to, a display, speakers, video/audio output terminals, vibrators, and/or printers. Storage unit 1908 may include, but is not limited to, magnetic disks, optical disks. The communication unit 1909 allows the computer system 1900 to exchange information/data with other devices over a network, such as the internet, and may include, but is not limited to, modems, network cards, infrared communication devices, wireless communication transceivers and/or chipsets, such as bluetooth (TM) devices, wiFi devices, wiMax devices, cellular communication devices, and/or the like.
The computing unit 1901 may be one of various general and/or special purpose processing components having processing and computing capabilities. Some examples of the computing unit 1901 include, but are not limited to, a central processing unit (CPU), a graphics processing unit (GPU), various dedicated artificial intelligence (AI) computing chips, various computing units running machine learning model algorithms, a digital signal processor (DSP), and any suitable processor, controller, or microcontroller. The computing unit 1901 performs the various methods and processes described above. For example, in some embodiments, the above methods disclosed by the embodiments of the present disclosure may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 1908. In some embodiments, part or all of the computer program may be loaded and/or installed onto the computer system 1900 via the ROM 1902 and/or the communication unit 1909. In some embodiments, the computing unit 1901 may be configured to perform the above methods of the disclosed embodiments by any other suitable means (e.g., by means of firmware).
The disclosed embodiments also provide a computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the above-described method disclosed by the disclosed embodiments.
A computer-readable storage medium in embodiments of the present disclosure may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. It may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specifically, it may include an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The disclosed embodiments also provide a computer program product comprising a computer program, wherein the computer program, when executed by a processor, implements the above-described methods of the disclosed embodiments.
In embodiments of the present disclosure, computer program code for performing the operations of the present disclosure may be written in one or more programming languages, including, but not limited to, object-oriented programming languages such as Java, Smalltalk, and C++, and conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In cases involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules, components or units referred to in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of a module, component or unit does not in some cases constitute a limitation of the module, component or unit itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The above description is merely an illustration of some embodiments of the present disclosure and of the principles of the technology applied. It will be appreciated by persons skilled in the art that the scope of the disclosure is not limited to technical solutions formed by the specific combinations of features described above, and also covers other technical solutions formed by any combination of those features or their equivalents without departing from the spirit of the disclosure, for example, solutions formed by substituting the above features with technical features having similar functions disclosed in (but not limited to) the present disclosure.
Although some specific embodiments of the present disclosure have been described in detail by way of example, it should be understood by those skilled in the art that the above examples are for illustration only and are not intended to limit the scope of the present disclosure. It will be appreciated by those skilled in the art that modifications may be made to the above embodiments without departing from the scope and spirit of the disclosure. The scope of the present disclosure is defined by the appended claims.

Claims (10)

1. A method of unmanned operation based on manned operation, the method comprising:
forming a consist of an unmanned vehicle and a manned vehicle;
the unmanned vehicle acquires the manned data provided by the manned vehicle in the consist; wherein the manned data includes: job information manually marked when a user drives the manned vehicle to work in a target area, the job information being associated with real-time positioning information of the manned vehicle at the time of marking, and the manual marking being performed through a button on a display device; the job information comprises the type of a travel track, the type of a work area, the boundary of the work area, or the range of the work area; the target area comprises an open-pit mining area; the content of the work area comprises a travel area, a queuing area, an area to be loaded, a loading area, a position to be loaded, or a loading position; and the type of the travel track comprises a heavy-load track and a no-load track;
judging whether the unmanned vehicle has traveled to a work area based on the job information;
if yes, executing the corresponding unmanned job in the work area.
2. The method of claim 1, wherein the unmanned vehicle is used for transportation, and the content of the work area includes a travel area, a queuing area, a position to be loaded, or a loading position.
3. The method of claim 2, wherein the manual marking is performed through a button on a display device.
4. The method of claim 1, wherein the real-time positioning information is obtained by a positioning sensor on the manned vehicle.
5. The method of any one of claims 1-4, wherein the unmanned vehicle obtains the manned data through a cloud platform, near field communication technology, or short range communication technology.
6. The method of claim 5, wherein the manned data acquired by the unmanned vehicle from the cloud platform has been processed by the cloud platform.
7. The method of claim 5, wherein the processing by the cloud platform comprises optimizing, screening, or integrating the manned data.
8. An unmanned operation device based on manned operation, the device comprising:
a grouping module, configured to form a group of an unmanned vehicle and a manned vehicle;
a data acquisition module, configured for the unmanned vehicle to acquire manned data provided by the manned vehicle in the group; wherein the manned data comprises operation information manually labeled by a user while driving the manned vehicle to work in a target area, and the operation information is associated with real-time positioning information of the manned vehicle at the time of labeling; wherein the manual labeling refers to labeling through a button on a display device; the operation information comprises a type of a driving track, a type of a work area, a boundary of the work area, or a range of the work area; the target area comprises an open-pit mining area, the content of the work area comprises a driving area, a queuing area, an area to be loaded, a loading area, a position to be loaded, or a loading position, and the type of the driving track comprises a heavy-load track and a no-load track;
a judging module, configured to judge, based on the operation information, whether the unmanned vehicle has traveled to a work area;
and an operation module, configured to execute a corresponding unmanned operation in the work area when the unmanned vehicle travels to the work area.
9. An electronic device, comprising:
at least one processor;
a memory for storing instructions executable by the at least one processor;
wherein the at least one processor is configured to execute the instructions to implement the method of any one of claims 1-7.
10. A computer-readable storage medium, wherein instructions in the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any one of claims 1-7.
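
The judging step recited in claims 1 and 8 reduces, in one conventional reading, to testing the unmanned vehicle's position against the work-area boundary labeled from the manned vehicle. The following sketch assumes the boundary is a polygon given as (x, y) vertices and uses a standard ray-casting test; both choices are illustrative assumptions, not features recited by the claims.

from typing import List, Tuple

Point = Tuple[float, float]

def in_work_area(position: Point, boundary: List[Point]) -> bool:
    # Ray-casting point-in-polygon test: cast a horizontal ray to the right
    # of `position` and count how many polygon edges it crosses; an odd
    # count means the position lies inside the labeled work-area boundary.
    x, y = position
    inside = False
    n = len(boundary)
    for i in range(n):
        x1, y1 = boundary[i]
        x2, y2 = boundary[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge straddles the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A rectangular loading area labeled from the manned vehicle (hypothetical):
loading_area = [(0.0, 0.0), (100.0, 0.0), (100.0, 50.0), (0.0, 50.0)]
print(in_work_area((40.0, 25.0), loading_area))   # True  -> begin the operation
print(in_work_area((150.0, 25.0), loading_area))  # False -> keep driving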
CN202211670197.0A 2022-12-26 2022-12-26 Unmanned operation method and device based on manned operation, electronic equipment and storage medium Active CN115649186B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211670197.0A CN115649186B (en) 2022-12-26 2022-12-26 Unmanned operation method and device based on manned operation, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN115649186A (en) 2023-01-31
CN115649186B (en) 2023-11-07

Family

ID=85023259

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211670197.0A Active CN115649186B (en) 2022-12-26 2022-12-26 Unmanned operation method and device based on manned operation, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115649186B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116151596B (en) * 2023-04-20 2023-08-01 北京路凯智行科技有限公司 Mixed grouping scheduling method for open-pit mining area

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015173073A1 (en) * 2014-05-15 2015-11-19 Cnh Industrial Belgium Nv Harvesting method using unmanned agricultural work vehicles
CA2941194A1 (en) * 2014-09-30 2016-04-07 Hitachi Construction Machinery Co., Ltd. Driving assistance system, vehicle, driving assistance terminal device, and driving assistance program
WO2016129671A1 (en) * 2015-02-13 2016-08-18 ヤンマー株式会社 Control system for autonomously traveling work vehicle
JP2019114138A (en) * 2017-12-25 2019-07-11 井関農機株式会社 Farm work supporting system
CN110519703A (en) * 2019-08-28 2019-11-29 北京易控智驾科技有限公司 A kind of mine car Unmanned Systems
JP2020162617A (en) * 2018-02-28 2020-10-08 株式会社クボタ Work vehicle
CN112455440A (en) * 2020-11-30 2021-03-09 北京易控智驾科技有限公司 Collaborative avoidance method, device, equipment and medium for automatically driving vehicle marshalling
CN112613672A (en) * 2020-12-28 2021-04-06 北京易控智驾科技有限公司 Vehicle dispatching method and device for strip mine, storage medium and electronic equipment
WO2022257767A1 (en) * 2021-06-11 2022-12-15 华能伊敏煤电有限责任公司 Method for automatically controlling path of mining area transport truck


Also Published As

Publication number Publication date
CN115649186A (en) 2023-01-31

Similar Documents

Publication Publication Date Title
US11874671B2 (en) Performing tasks using autonomous machines
JP7281489B2 (en) Autonomous operation method, device, program and storage medium for self-driving logistics vehicle
US20200209874A1 (en) Combined virtual and real environment for autonomous vehicle planning and control testing
CN109767130B (en) Method and device for controlling a vehicle
CN115649186B (en) Unmanned operation method and device based on manned operation, electronic equipment and storage medium
US11398150B2 (en) Navigation analysis for a multi-lane roadway
CN112595334B (en) Map updating method, device and system for unloading area of surface mine
JP7060398B2 (en) Server device
US11874121B2 (en) Method and system for fleet route optimization
CN109624994A (en) A kind of Vehicular automatic driving control method, device, equipment and terminal
CN113535743A (en) Real-time updating method and device for unmanned map, electronic equipment and storage medium
CN115686028B (en) Unmanned operation method and device based on manned operation, electronic equipment and storage medium
CN112712608B (en) System and method for collecting performance data by a vehicle
CN109945880B (en) Path planning method, related equipment and readable storage medium
CN111861008A (en) Unmanned vehicle and path planning method, device and readable storage medium thereof
US11745747B2 (en) System and method of adaptive distribution of autonomous driving computations
CN115675493B (en) Unmanned method and device using manual driving track layer information
CN113793518A (en) Vehicle passing processing method and device, electronic equipment and storage medium
CN115686029B (en) Unmanned operation method and device based on manned operation, electronic equipment and storage medium
CN115657692A (en) Unmanned operation method and device based on manned driving, electronic equipment and storage medium
CN114485670B (en) Path planning method and device for mobile unit, electronic equipment and medium
US20230366683A1 (en) Information processing device and information processing system
CN114379588B (en) Inbound state detection method, apparatus, vehicle, device and storage medium
CN116520854B (en) Control method and device for work vehicle, electronic equipment and storage medium
FI130312B (en) Tracking a truck in a container handling area

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant