CN113192381A - Hybrid scene-based driving simulation method, system, device and storage medium - Google Patents

Hybrid scene-based driving simulation method, system, device and storage medium

Info

Publication number
CN113192381A
CN113192381A (application CN202110511226.8A; granted as CN113192381B)
Authority
CN
China
Prior art keywords
scene
driving
space data
virtual vehicle
scene space
Prior art date
Legal status
Granted
Application number
CN202110511226.8A
Other languages
Chinese (zh)
Other versions
CN113192381B (en)
Inventor
谭黎敏
章嵘
谢怿
陈晓虎
Current Assignee
Shanghai Westwell Information Technology Co Ltd
Original Assignee
Shanghai Westwell Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Westwell Information Technology Co Ltd
Priority to CN202110511226.8A
Publication of CN113192381A
Application granted
Publication of CN113192381B
Legal status: Active

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00: Simulators for teaching or training purposes
    • G09B9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/04: Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of land vehicles
    • G09B9/05: Simulators for teaching control of land vehicles, the view from a vehicle being simulated
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00: Road transport of goods or passengers
    • Y02T10/10: Internal combustion engine [ICE] based vehicles
    • Y02T10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)

Abstract

The invention provides a hybrid scene-based driving simulation method, system, device, and storage medium, the method comprising the following steps: obtaining scene space data of a spatial scene; constructing at least one virtual vehicle in the scene space data and generating a perspective image in real time from the virtual vehicle's viewpoint; providing a driving simulator for each virtual vehicle, the simulator displaying the virtual vehicle's perspective image in real time, receiving the driver's driving operations, and updating the virtual vehicle's position change into the scene space data; and having unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based at least on that data, collect their own position changes in real time, and update those changes into the scene space data. The invention helps a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene, and improves the efficiency and safety of mixed scenes in which drivers and unmanned equipment operate jointly.

Description

Hybrid scene-based driving simulation method, system, device and storage medium
Technical Field
The invention relates to the field of driving simulation, and in particular to a hybrid scene-based driving simulation method, system, device, and storage medium for unmanned terminal (automated dock) scenes.
Background
An unmanned terminal, also called an automated terminal, uses state-of-the-art technology to let terminal machinery operate automatically and efficiently without manual control. It consists mainly of automated yard machinery, automated quayside machinery, automated horizontal-transport machinery, and an automated control system, the last being the core of the whole automated terminal. The automated horizontal-transport machinery of an unmanned terminal chiefly comprises container trucks, intelligent AGVs, and straddle carriers.
However, a terminal usually hosts a large number and variety of machines. During the construction of an unmanned terminal there may be a phase in which manned equipment (trucks or cranes) and unmanned equipment (unmanned trucks or unmanned cranes) are used side by side, and such mixed use may even remain the terminal's mode of operation in the future. Although unmanned equipment can accumulate data from simulated scenes filled with other unmanned devices, its experience of genuinely complex scenes, where large numbers of manned and unmanned devices share the same space, remains thin; and a driver operating his or her own vehicle may feel real psychological unease about working alongside unmanned trucks.
Conventional VR driving-training methods usually rely on a simulated scene generated by an algorithm, and such a scene differs in many ways from a real complex scene in which large numbers of manned and unmanned devices are mixed together. In particular, a scene where several unmanned devices each sense, judge, and act on their own independent view of the environment while several real drivers operate among them produces many emergencies: sudden severe weather or temperature extremes that affect vehicle performance, pedestrians wandering at random, cargo stacked out of specification, lanes seized, congestion, and so on. Such emergencies cannot be actively reproduced by an algorithm, so even a driver who has trained at length in VR alongside simulated unmanned equipment still faces considerable unfamiliarity and operational risk. Because the machinery and cargo at a terminal are of very high value, an accident can inflict heavy commercial loss.
The invention therefore provides a hybrid scene-based driving simulation method, system, device, and storage medium.
Disclosure of Invention
In view of the problems in the prior art, an object of the present invention is to provide a hybrid scene-based driving simulation method, system, device, and storage medium that overcome the difficulties of the prior art: they help a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene such as an unmanned terminal or an unmanned mine; they facilitate collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and they optimize the safety of joint operation between drivers and unmanned trucks.
An embodiment of the invention provides a hybrid scene-based driving simulation method comprising the following steps:
S110, obtaining scene space data of a spatial scene;
S120, constructing at least one virtual vehicle in the scene space data and generating a perspective image in real time from the virtual vehicle's viewpoint;
S130, providing a driving simulator for each virtual vehicle, the simulator displaying the virtual vehicle's perspective image in real time, receiving the driver's driving operations, and updating the virtual vehicle's position change into the scene space data; and
S140, having a plurality of unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based on that data, collect their own position and pose changes in real time, and update those changes into the scene space data.
Preferably, in step S110, the spatial scene is modeled through spatial scanning to obtain the scene space data.
Preferably, in step S120, a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene space data.
Preferably, step S130 comprises:
S131, the driving simulator downloading updated scene space data within a preset range from a server in real time;
S132, generating and displaying a perspective image from the virtual vehicle's current viewpoint and the updated scene space data;
S133, receiving the driver's current driving operations, driving the virtual vehicle accordingly, obtaining the virtual vehicle's post-drive position change in the scene space data, and uploading that change to the scene space data.
Preferably, in step S140, each unmanned device updates the scene space data within a preset range from the server in real time, plans routes or navigates based on that data and the preset driving rules of its intelligent module, and updates its own position change into the scene space data.
Preferably, the scene space data within the preset range is the scene space data within a 200-meter radius centered on a virtual vehicle or an unmanned device.
Preferably, step S140 is followed by:
S150, each unmanned device building a neural-network driving model for encounters with driver-operated vehicles from the driving data recorded when it meets a virtual vehicle.
An embodiment of the present invention further provides a hybrid scene-based driving simulation system for implementing the above hybrid scene-based driving simulation method, the system comprising:
a space establishing module, which obtains scene space data of a spatial scene;
an image feedback module, which constructs at least one virtual vehicle in the scene space data and generates a perspective image in real time from the virtual vehicle's viewpoint;
a first updating module, which provides a driving simulator for each virtual vehicle, the simulator displaying the virtual vehicle's perspective image in real time, receiving the driver's driving operations, and updating the virtual vehicle's position change into the scene space data; and
a second updating module, in which a plurality of unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based on that data, collect their own position and pose changes in real time, and update those changes into the scene space data.
Preferably, the driving simulator comprises a triple screen for displaying the perspective image, a steering wheel, a gear device, and an operation conversion module; the conversion module is connected to the steering wheel and the gear device, receives their operation information, converts it into driving commands for the virtual vehicle, and feeds those commands back to the server.
Preferably, the system further comprises a neural network module: each unmanned device builds a neural-network driving model for vehicle-meeting scenes with drivers from the driving data recorded when it meets a virtual vehicle.
An embodiment of the invention also provides a hybrid scene-based driving simulation device, comprising:
a processor;
a memory having stored therein executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the steps of the hybrid scene-based driving simulation method described above.
An embodiment of the present invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the above hybrid scene-based driving simulation method.
The hybrid scene-based driving simulation method, system, device, and storage medium of the invention help a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene such as an unmanned terminal or an unmanned mine; they facilitate collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and they optimize the safety of joint operation between drivers and unmanned trucks.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
FIG. 1 is a flow chart of a hybrid scene based simulated driving method of the present invention.
FIGS. 2 to 5 are schematic diagrams of the implementation process of the hybrid scene-based driving simulation method of the present invention.
FIG. 6 is a schematic structural diagram of the hybrid scene-based driving simulation system of the present invention.
FIG. 7 is a schematic structural diagram of the hybrid scene-based driving simulation device; and
FIG. 8 is a schematic structural diagram of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
FIG. 1 is a flow chart of the hybrid scene-based driving simulation method of the present invention. As shown in FIG. 1, an embodiment of the present invention provides a hybrid scene-based driving simulation method comprising the following steps:
S110, obtain scene space data of the spatial scene.
S120, construct at least one virtual vehicle in the scene space data and generate a perspective image in real time from the virtual vehicle's viewpoint.
S130, provide a driving simulator for each virtual vehicle; the simulator displays the virtual vehicle's perspective image in real time, receives the driver's driving operations, and updates the virtual vehicle's position change into the scene space data.
S140, a plurality of unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based on that data, collect their own position and pose changes in real time, and update those changes into the scene space data, although the method is not limited to this. Each unmanned device may also cooperate with its automatic-driving module and plan routes or navigate from composite information sources: the scene space data, its own real-time radar scans and image-acquisition results, vehicle-to-vehicle communication, and so on; the details are not repeated here.
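By way of illustration only (the patent discloses no source code), the sketch below shows one plausible way an unmanned device might merge the server's scene space data with its own radar returns before planning; the names Obstacle, fuse_obstacles, and merge_radius are hypothetical, as is the flat x/y obstacle representation.

```python
# Illustrative sketch only: the patent discloses no source code, and all
# names here (Obstacle, fuse_obstacles, merge_radius) are hypothetical.
from dataclasses import dataclass

@dataclass
class Obstacle:
    x: float          # metres, terminal coordinate frame (assumed)
    y: float
    source: str       # "scene_space" (from server) or "radar" (onboard)

def fuse_obstacles(scene_obstacles, radar_obstacles, merge_radius=1.0):
    """Merge server-side scene-space obstacles with onboard radar returns.

    Scene space data contributes objects radar cannot see, notably virtual
    vehicles that exist only in the simulation; radar contributes real
    objects that may not yet appear in the shared scene data.
    """
    fused = list(scene_obstacles)
    for r in radar_obstacles:
        # Keep a radar return only if no scene-space obstacle is already
        # registered within merge_radius metres (crude de-duplication).
        if all((r.x - s.x) ** 2 + (r.y - s.y) ** 2 > merge_radius ** 2
               for s in scene_obstacles):
            fused.append(r)
    return fused
```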
In a preferred embodiment, in step S110, the spatial scene is modeled through spatial scanning to obtain the scene space data, although other approaches are possible.
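As a hedged illustration of what scene space data produced by spatial scanning could look like, the following sketch rasterizes scan points into a 2-D occupancy grid; the grid representation, cell size, and extent are assumptions rather than anything disclosed in the patent.

```python
# Assumed representation: the patent only says the scene is modeled through
# spatial scanning, so a 2-D occupancy grid stands in for "scene space data".
import numpy as np

def build_scene_grid(points, cell_size=0.5, extent=1000.0):
    """Rasterise scan points (x, y in metres) into an occupancy grid."""
    n = int(extent / cell_size)
    grid = np.zeros((n, n), dtype=np.uint8)
    for x, y in points:
        i, j = int(x / cell_size), int(y / cell_size)
        if 0 <= i < n and 0 <= j < n:
            grid[i, j] = 1  # cell contains scanned structure
    return grid
```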
In a preferred embodiment, in step S120, a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene space data, although other approaches are possible.
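The patent does not specify the motion control model; assuming a kinematic bicycle model, a minimal sketch of advancing the virtual vehicle's pose by one simulation tick might read:

```python
# Minimal sketch, assuming a kinematic bicycle model; the wheelbase and
# time step are illustrative values, not taken from the patent.
import math

def step_bicycle(x, y, heading, speed, steering, wheelbase=4.0, dt=0.05):
    """Advance the virtual vehicle's pose (x, y in metres, heading in rad)."""
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += (speed / wheelbase) * math.tan(steering) * dt
    return x, y, heading
```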
In a preferred embodiment, step S130 comprises the following, sketched in code after this list:
S131, the driving simulator downloads updated scene space data within a preset range from the server in real time.
S132, a perspective image is generated from the virtual vehicle's current viewpoint and the updated scene space data, and displayed.
S133, the driver's current driving operations are received, the virtual vehicle is driven accordingly, the virtual vehicle's post-drive position change in the scene space data is obtained, and that change is uploaded to the scene space data, although the step is not limited to this.
In a preferred embodiment, in step S140, each unmanned device updates the scene space data within a preset range from the server in real time, plans routes or navigates based on that data, and updates its own position change into the scene space data, although the step is not limited to this.
In a preferred embodiment, the scene space data within the preset range is the scene space data within a 200-meter radius centered on the virtual vehicle or the unmanned device, although other ranges are possible.
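Selecting the data within the preset range reduces to a radius test; a sketch, assuming entities carry flat x/y coordinates:

```python
# Sketch of the preset-range selection: entities within a given radius of
# a center (virtual vehicle or unmanned device). Entity layout is assumed.
def scene_within_radius(entities, center, radius=200.0):
    cx, cy = center
    return [e for e in entities
            if (e.x - cx) ** 2 + (e.y - cy) ** 2 <= radius ** 2]
```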
In a preferred embodiment, step S140 is followed by S150: each unmanned device builds a neural-network driving model for encounters with driver-operated vehicles from the driving data recorded when it meets a virtual vehicle, although the step is not limited to this.
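The patent does not describe the network's architecture or training procedure. Under the assumption of simple behaviour cloning, where encounter logs pair an observed state with the manoeuvre taken, step S150 could look like the following sketch (state and action dimensions are illustrative):

```python
# Hedged sketch of S150 as behaviour cloning on encounter logs; the
# architecture, dimensions, and loss are assumptions, not patent content.
import torch
import torch.nn as nn

class MeetingPolicy(nn.Module):
    def __init__(self, state_dim=8, action_dim=2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, action_dim),  # e.g. target speed and steering
        )

    def forward(self, state):
        return self.net(state)

def train(policy, states, actions, epochs=10, lr=1e-3):
    """Fit the policy to logged (state, action) pairs from vehicle meetings."""
    opt = torch.optim.Adam(policy.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(policy(states), actions)
        loss.backward()
        opt.step()
```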
The hybrid scene-based driving simulation method is thus entirely different from conventional VR driving equipment: it collects spatial data in real time from a real space. After the virtual space is constructed, a virtual vehicle is implanted into it, and a driver can drive that virtual vehicle through the simulator. Meanwhile, because the unmanned vehicles keep updating the shared spatial data, they perceive, while driving, the position changes of a virtual vehicle that does not exist in the real space, and each drives autonomously based on the scene and its own artificial intelligence. A driver can therefore rehearse, in the safety of a simulator, the scene of driver-operated vehicles and unmanned vehicles working together on site, combining realism with safety. The arrangement also provides a data-collection scene in which multiple unmanned vehicles learn to work with real people, which facilitates optimization iterations of the unmanned-vehicle system.
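The shared scene space implied by this architecture can be pictured as a single server-side world state that both driving simulators and unmanned devices write to and read from; a minimal sketch, with all names hypothetical:

```python
# Minimal sketch of a shared scene-space store: virtual vehicles (driven
# through simulators) and real unmanned vehicles post poses to the same map,
# so each kind of participant sees the other. Names are hypothetical.
class SceneSpaceServer:
    def __init__(self):
        self.poses = {}  # entity_id -> (x, y, heading); virtual and real alike

    def upload_pose(self, entity_id, x, y, heading):
        self.poses[entity_id] = (x, y, heading)

    def fetch_scene(self, center, radius):
        cx, cy = center
        return {eid: (x, y, h) for eid, (x, y, h) in self.poses.items()
                if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2}
```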
The hybrid scene-based driving simulation method helps a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene; it facilitates collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and it optimizes the safety of joint operation between drivers and unmanned trucks.
FIGS. 2 to 5 are schematic diagrams of the implementation process of the hybrid scene-based driving simulation method of the invention. As shown in FIGS. 2 to 5, the implementation proceeds as follows:
as shown in fig. 2, scene space data is obtained by performing scene space modeling on a spatial scene through spatial scanning. Three unmanned vehicles 11, 12, 13 are present in the scene space modeling. At least one virtual vehicle 14 is constructed in the scene space data and a perspective picture is generated in real time according to the perspective of the virtual vehicle. The method comprises the steps of constructing a spatial data model and a motion control model of a virtual vehicle, and implanting the spatial data model and the motion control model into scene spatial data.
As shown in FIGS. 3 and 4, a driving simulator 21 is provided for the virtual vehicle 14. The simulator comprises a triple screen for displaying the perspective image, a steering wheel, a gear device, and an operation conversion module; the conversion module is connected to the steering wheel and the gear device, receives their operation information, converts it into driving commands for the virtual vehicle, and feeds those commands back to the server. The triple screen 21 displays the virtual vehicle's perspective image in real time; because unmanned vehicle 11 is located directly in front of virtual vehicle 14, the displayed image contains the image 111 of unmanned vehicle 11. The driving simulator 21 receives the driving operations of driver 22 and updates the virtual vehicle's position change into the scene space data. The simulator downloads updated scene space data within the preset range from the server in real time, generates and displays the perspective image from the vehicle's current viewpoint and the updated data, receives the driver's current operations, drives the virtual vehicle accordingly, obtains the vehicle's post-drive position change in the scene space data, and uploads it. The preset range is a 200-meter radius centered on the virtual vehicle.
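A sketch of the operation conversion module's job, normalising physical steering-wheel and gear readings into a command for the virtual vehicle, might read as follows; the field names and scaling are assumptions:

```python
# Hypothetical operation conversion: raw simulator hardware readings become
# a driving command fed back to the server. All fields are assumed.
from dataclasses import dataclass

@dataclass
class DriveCommand:
    steering: float  # -1.0 (full left) .. 1.0 (full right)
    gear: str        # e.g. "D", "R", "N"
    throttle: float  # 0.0 .. 1.0
    brake: float     # 0.0 .. 1.0

def convert(wheel_angle_deg, gear_position, pedal_throttle, pedal_brake,
            max_wheel_deg=450.0):
    """Normalise physical inputs into a virtual-vehicle command."""
    return DriveCommand(
        steering=max(-1.0, min(1.0, wheel_angle_deg / max_wheel_deg)),
        gear=gear_position,
        throttle=pedal_throttle,
        brake=pedal_brake,
    )
```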
As shown in FIG. 5, the unmanned vehicle 11 receives at least part of the scene space data in real time, plans routes or navigates based on that data, and collects its own position changes in real time and updates them into the scene space data. Each unmanned vehicle 11 updates the scene space data within the preset range (a 200-meter radius centered on the unmanned vehicle 11) from the server in real time, plans routes or navigates based on that data, and updates its position change into the scene space data. The unmanned vehicle 11 builds a neural-network driving model for vehicle-meeting scenes with drivers from the driving data recorded when it meets the virtual vehicle 14. In this way the driver is helped to adapt, from within the simulator, to working together with unmanned equipment in a closed scene, and the efficiency and safety of mixed joint operation between drivers and unmanned equipment are improved.
FIG. 6 is a schematic structural diagram of the hybrid scene-based driving simulation system of the present invention. As shown in FIG. 6, an embodiment of the present invention further provides a hybrid scene-based driving simulation system 5 for implementing the above method, the system comprising:
the space creation module 51 obtains scene space data of a spatial scene. And carrying out scene space modeling on the space scene through space scanning to obtain scene space data.
the image feedback module 52, which constructs at least one virtual vehicle in the scene space data and generates a perspective image in real time from the virtual vehicle's viewpoint, constructing a spatial data model and a motion control model of the virtual vehicle and implanting them into the scene space data;
the first updating module 53, which provides a driving simulator for each virtual vehicle; the simulator displays the virtual vehicle's perspective image in real time, receives the driver's driving operations, and updates the virtual vehicle's position change into the scene space data. The simulator downloads updated scene space data within the preset range (a 200-meter radius centered on the virtual vehicle) from the server in real time, generates and displays the perspective image from the vehicle's current viewpoint and the updated data, receives the driver's current operations, drives the virtual vehicle accordingly, obtains the vehicle's post-drive position change in the scene space data, and uploads it. The driving simulator (see FIG. 4) comprises a triple screen for displaying the perspective image, a steering wheel, a gear device, and an operation conversion module; the conversion module is connected to the steering wheel and the gear device, receives their operation information, converts it into driving commands for the virtual vehicle, and feeds those commands back to the server;
the second updating module 54, in which a plurality of unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based on that data, collect their own position and pose changes in real time, and update those changes into the scene space data. Each unmanned device updates the scene space data within the preset range (a 200-meter radius centered on the device) from the server in real time, plans routes or navigates based on that data, and updates its position change into the scene space data; and
the neural network module 55, in which each unmanned device builds a neural-network driving model for vehicle-meeting scenes with drivers from the driving data recorded when it meets a virtual vehicle.
The hybrid scene-based driving simulation system helps a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene; it facilitates collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and it optimizes the safety of joint operation between drivers and unmanned trucks.
An embodiment of the invention also provides a hybrid scene-based driving simulation device comprising a processor and a memory storing executable instructions of the processor, the processor being configured to perform, via execution of the executable instructions, the steps of the hybrid scene-based driving simulation method.
As described above, the hybrid scene-based driving simulation device helps the driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene; it facilitates collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and it optimizes the safety of joint operation between drivers and unmanned trucks.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method, or program product. Thus, aspects of the invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.), or an embodiment combining hardware and software, which may all generally be referred to herein as a "circuit," "module," or "platform."
Fig. 7 is a schematic structural diagram of a simulated driving device based on a mixed scene. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 7. The electronic device 600 shown in fig. 7 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present invention.
As shown in fig. 7, the electronic device 600 is embodied in the form of a general purpose computing device. The components of the electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including the memory unit 620 and the processing unit 610), a display unit 640, etc.
The storage unit stores program code executable by the processing unit 610, causing the processing unit 610 to perform the steps of the method described above in this specification according to various exemplary embodiments of the present invention. For example, the processing unit 610 may perform the steps shown in FIG. 1.
The storage unit 620 may include readable media in the form of volatile memory units, such as a random access memory (RAM) unit 6201 and/or a cache memory unit 6202, and may further include a read-only memory (ROM) unit 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each of which, or some combination thereof, may comprise an implementation of a network environment.
Bus 630 may be one or more of several types of bus structures, including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or a local bus using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), with one or more devices that enable a user to interact with the electronic device 600, and/or with any devices (e.g., router, modem, etc.) that enable the electronic device 600 to communicate with one or more other computing devices. Such communication may occur via an input/output (I/O) interface 650. Also, the electronic device 600 may communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network such as the Internet) via the network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 via the bus 630. It should be appreciated that although not shown in the figures, other hardware and/or software modules may be used in conjunction with the electronic device 600, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage platforms, to name a few.
An embodiment of the invention also provides a computer-readable storage medium storing a program which, when executed, implements the steps of the hybrid scene-based driving simulation method. In some possible embodiments, aspects of the present invention may also be implemented as a program product comprising program code which, when run on a terminal device, causes the terminal device to perform the steps of the method described above in this specification according to various exemplary embodiments of the present invention.
As described above, when executed, the program on the computer-readable storage medium of this embodiment helps the driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene; it facilitates collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and it optimizes the safety of joint operation between drivers and unmanned trucks.
FIG. 8 is a schematic structural diagram of the computer-readable storage medium of the present invention. Referring to FIG. 8, a program product 800 for implementing the above method according to an embodiment of the present invention is shown; it may employ a portable compact disc read-only memory (CD-ROM) including program code and may run on a terminal device such as a personal computer. However, the program product of the present invention is not limited in this regard, and in this document a readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A readable signal medium may include a propagated data signal with readable program code embodied therein, for example in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including but not limited to electromagnetic or optical signals, or any suitable combination thereof. A readable signal medium may also be any readable medium, other than a readable storage medium, that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java or C++ and conventional procedural programming languages such as the "C" language or similar. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the latter case, the remote computing device may be connected to the user's computing device through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
To sum up, the hybrid scene-based driving simulation method, system, device, and storage medium of the present invention help a driver adapt, from within a simulator, to working together with unmanned equipment in a closed scene such as an unmanned terminal or an unmanned mine; they facilitate collecting the driving data of encounters between unmanned equipment and driver-operated vehicles so that a corresponding neural-network driving model can be built; and they optimize the safety of joint operation between drivers and unmanned trucks.
The foregoing is a detailed description of the invention in connection with specific preferred embodiments, and the invention is not to be considered limited to these details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all of these shall be considered within the protection scope of the invention.

Claims (12)

1. A hybrid scene-based driving simulation method, characterized by comprising the following steps:
S110, obtaining scene space data of a spatial scene;
S120, constructing at least one virtual vehicle in the scene space data and generating a perspective image in real time from the virtual vehicle's viewpoint;
S130, providing a driving simulator for each virtual vehicle, the simulator displaying the virtual vehicle's perspective image in real time, receiving the driver's driving operations, and updating the virtual vehicle's position change into the scene space data; and
S140, a plurality of unmanned devices receiving at least part of the scene space data in real time, planning routes or navigating based on that data, collecting their own position and pose changes in real time, and updating those changes into the scene space data.
2. The hybrid scene-based driving simulation method according to claim 1, wherein in step S110 the spatial scene is modeled through spatial scanning to obtain the scene space data.
3. The hybrid scene-based driving simulation method according to claim 1, wherein in step S120 a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene space data.
4. The hybrid scene-based driving simulation method according to claim 1, wherein step S130 comprises:
S131, the driving simulator downloading updated scene space data within a preset range from a server in real time;
S132, generating and displaying a perspective image from the virtual vehicle's current viewpoint and the updated scene space data;
S133, receiving the driver's current driving operations, driving the virtual vehicle accordingly, obtaining the virtual vehicle's post-drive position change in the scene space data, and uploading that change to the scene space data.
5. The hybrid scene-based driving simulation method according to claim 1, wherein in step S140 each unmanned device updates the scene space data within a preset range from a server in real time, plans routes or navigates based on that data and the preset driving rules of its intelligent module, and updates its own position change into the scene space data.
6. The hybrid scene-based driving simulation method according to claim 4 or 5, wherein the scene space data within the preset range is the scene space data within a preset radius centered on a virtual vehicle or an unmanned device.
7. The hybrid scene-based driving simulation method according to claim 1, wherein step S140 is followed by:
S150, each unmanned device building a neural-network driving model for encounters with driver-operated vehicles from the driving data recorded when it meets a virtual vehicle.
8. A hybrid scene-based driving simulation system for implementing the hybrid scene-based driving simulation method according to claim 1, comprising:
a space establishing module, which obtains scene space data of a spatial scene;
an image feedback module, which constructs at least one virtual vehicle in the scene space data and generates a perspective image in real time from the virtual vehicle's viewpoint;
a first updating module, which provides a driving simulator for each virtual vehicle, the simulator displaying the virtual vehicle's perspective image in real time, receiving the driver's driving operations, and updating the virtual vehicle's position change into the scene space data; and
a second updating module, in which a plurality of unmanned devices receive at least part of the scene space data in real time, plan routes or navigate based on that data, collect their own position and pose changes in real time, and update those changes into the scene space data.
9. The hybrid scene-based driving simulation system according to claim 8, wherein the driving simulator comprises a triple screen for displaying the perspective image, a steering wheel, a gear device, and an operation conversion module, the conversion module being connected to the steering wheel and the gear device, receiving their operation information, converting it into driving commands for the virtual vehicle, and feeding those commands back to the server.
10. The hybrid scene-based driving simulation system according to claim 8, further comprising:
a neural network module, in which each unmanned device builds a neural-network driving model for vehicle-meeting scenes with drivers from the driving data recorded when it meets a virtual vehicle.
11. A hybrid scene-based driving simulation device, comprising:
a processor; and
a memory storing executable instructions of the processor;
wherein the processor is configured to perform, via execution of the executable instructions, the steps of the hybrid scene-based driving simulation method of any one of claims 1 to 7.
12. A computer-readable storage medium storing a program, wherein the program, when executed, implements the steps of the hybrid scene-based driving simulation method of any one of claims 1 to 7.
CN202110511226.8A (priority and filing date 2021-05-11): Hybrid scene-based driving simulation method, system, equipment and storage medium; status Active; granted as CN113192381B (en)

Priority Applications (1)

Application Number: CN202110511226.8A; Priority Date: 2021-05-11; Filing Date: 2021-05-11; Title: Hybrid scene-based driving simulation method, system, equipment and storage medium


Publications (2)

CN113192381A: published 2021-07-30
CN113192381B: published 2023-07-28

Family

ID=76981062

Family Applications (1)

Application Number: CN202110511226.8A; Title: Hybrid scene-based driving simulation method, system, equipment and storage medium; Priority Date: 2021-05-11; Filing Date: 2021-05-11; Status: Active; Granted as: CN113192381B (en)

Country Status (1)

CN: CN113192381B (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113946259A (en) * 2021-09-18 2022-01-18 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
CN114141092A (en) * 2021-11-10 2022-03-04 武汉未来幻影科技有限公司 Method and system for constructing animation scene of driving test simulator
CN114185330A (en) * 2021-12-12 2022-03-15 蜂联智能(深圳)有限公司 Control method and control device based on multi-scene interaction
WO2023131124A1 (en) * 2022-01-04 2023-07-13 上海三一重机股份有限公司 Virtual interaction method, apparatus and system for work machine and work environment

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170269681A1 (en) * 2016-03-18 2017-09-21 Volvo Car Corporation Method and system for enabling interaction in a test environment
CN106652645A (en) * 2017-03-16 2017-05-10 百度在线网络技术(北京)有限公司 Vehicle driving training device, as well as operation method and device of vehicle driving training device
US20180285631A1 (en) * 2017-03-31 2018-10-04 Honda Motor Co., Ltd. Interaction with physical objects as proxy objects representing virtual objects
US20190318267A1 (en) * 2018-04-12 2019-10-17 Baidu Usa Llc System and method for training a machine learning model deployed on a simulation platform
WO2019210821A1 (en) * 2018-05-03 2019-11-07 Formula Square Holdings Ltd Systems and methods for providing driving guidance
US20190354643A1 (en) * 2018-05-17 2019-11-21 Toyota Jidosha Kabushiki Kaisha Mixed reality simulation system for testing vehicle control system designs
CN109781431A (en) * 2018-12-07 2019-05-21 山东省科学院自动化研究所 Automatic Pilot test method and system based on mixed reality
CN109632339A (en) * 2018-12-28 2019-04-16 同济大学 A kind of automatic driving vehicle traffic coordinating real steering vectors system and method
CN109765060A (en) * 2018-12-29 2019-05-17 同济大学 A kind of automatic driving vehicle traffic coordinating virtual test system and method
CN109887372A (en) * 2019-04-16 2019-06-14 北京中公高远汽车试验有限公司 Driving training analogy method, electronic equipment and storage medium
CN111781855A (en) * 2020-07-15 2020-10-16 北京领骏科技有限公司 Traffic on-loop automatic driving simulation system
CN112198859A (en) * 2020-09-07 2021-01-08 西安交通大学 Method, system and device for testing automatic driving vehicle in vehicle ring under mixed scene
CN112365215A (en) * 2020-12-02 2021-02-12 青岛慧拓智能机器有限公司 Mining area unmanned transportation simulation test system and method based on depth virtual-real mixing


Also Published As

Publication number Publication date
CN113192381B (en) 2023-07-28

Similar Documents

Publication Publication Date Title
CN113192381B (en) Hybrid scene-based driving simulation method, system, equipment and storage medium
US11548516B2 (en) Data acquisition method, apparatus, device and computer-readable storage medium
Nguyen et al. Virtual reality interfaces for visualization and control of remote vehicles
US20220156939A1 (en) Systems and Methods for Video Object Segmentation
US20220153314A1 (en) Systems and methods for generating synthetic motion predictions
US20220153298A1 (en) Generating Motion Scenarios for Self-Driving Vehicles
Guzman et al. Robotnik—Professional service robotics applications with ROS
Teng et al. FusionPlanner: A multi-task motion planner for mining trucks via multi-sensor fusion
US20200166355A1 (en) Generation of route network data for movement
US20220036184A1 (en) Compression of Machine-Learned Models by Vector Quantization
Lin et al. Integrated smart robot with earthquake early warning system for automated inspection and emergency response
Badiru et al. Handbook of emergency response: A human factors and systems engineering approach
Guvenc et al. Simulation Environment for Safety Assessment of CEAV Deployment in Linden
Wang et al. Simplexity testbed: A model-based digital twin testbed
WO2021010612A1 (en) Mobile robot platform system and operation method therefor
CN211577684U (en) Unmanned aerial vehicle rescue simulation platform
CN114103994A (en) Control method, device and equipment based on automatic road surface cleaning of vehicle and vehicle
KR20210038451A (en) Verification method and device for modeling route, unmanned vehicle, and storage medium
Zheng et al. Virtual Prototyping-Based Path Planning of Unmanned Aerial Vehicles for Building Exterior Inspection
CN114461104B (en) Building type splicing method, device, equipment and storage medium
Correal et al. Autonomy for ground-level robotic space exploration: framework, simulation, architecture, algorithms and experiments
Li Constructing the intelligent expressway traffic monitoring system using the internet of things and inspection robot
Erdogmuş et al. Robot Operating System Compatible Mobile Robots for Education and Research
Gotad Application of Neural Networks for Design and Development of Low-cost Autonomous Vehicles
Kantale et al. An Overview of Artificial Intelligence based Autonomous Vehicle Robotics Simulators

Legal Events

Code: Description
PB01: Publication
SE01: Entry into force of request for substantive examination
CB02: Change of applicant information
Address after: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050
Applicant after: Shanghai Xijing Technology Co.,Ltd.
Address before: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050
Applicant before: SHANGHAI WESTWELL INFORMATION AND TECHNOLOGY Co.,Ltd.
GR01: Patent grant