CN113192381B - Hybrid scene-based driving simulation method, system, equipment and storage medium - Google Patents

Hybrid scene-based driving simulation method, system, equipment and storage medium

Info

Publication number
CN113192381B
CN113192381B (application CN202110511226.8A)
Authority
CN
China
Prior art keywords
scene
space data
driving
virtual vehicle
real time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110511226.8A
Other languages
Chinese (zh)
Other versions
CN113192381A (en)
Inventor
谭黎敏
章嵘
谢怿
陈晓虎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Xijing Technology Co ltd
Original Assignee
Shanghai Xijing Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Xijing Technology Co ltd filed Critical Shanghai Xijing Technology Co ltd
Priority to CN202110511226.8A
Publication of CN113192381A
Application granted
Publication of CN113192381B
Legal status: Active (current)
Anticipated expiration


Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00: Simulators for teaching or training purposes
    • G09B 9/02: Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B 9/04: Simulators for teaching or training purposes for teaching control of land vehicles
    • G09B 9/05: Simulators for teaching or training purposes for teaching control of land vehicles, the view from a vehicle being simulated
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00: Road transport of goods or passengers
    • Y02T 10/10: Internal combustion engine [ICE] based vehicles
    • Y02T 10/40: Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Navigation (AREA)
  • Electric Propulsion And Braking For Vehicles (AREA)

Abstract

The invention provides a hybrid scene-based simulated driving method, system, equipment and storage medium, wherein the method comprises the following steps: obtaining scene space data of a spatial scene; constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle; providing a driving simulator for each virtual vehicle, the driving simulator displaying the view angle picture of the virtual vehicle in real time, receiving the driving operation information of a driver, and updating the resulting position change of the virtual vehicle into the scene space data; and receiving, by unmanned equipment, at least part of the scene space data in real time, performing route planning or navigation driving based on at least the scene space data, and collecting the position change of the unmanned equipment in real time and updating it into the scene space data. The invention helps a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene, and improves the operating efficiency and safety of mixed scenes in which drivers and unmanned operation equipment work together.

Description

Hybrid scene-based driving simulation method, system, equipment and storage medium
Technical Field
The invention relates to the field of simulated driving, in particular to a hybrid scene-based simulated driving method, system, equipment and storage medium for an unmanned wharf scene.
Background
An unmanned wharf, also called an automated terminal, adopts state-of-the-art technology so that wharf operation machinery runs automatically and efficiently without manual operation. It mainly comprises automated yard machinery, automated quayside machinery, automated horizontal transport machinery and an automated control system, with the automated control system being the core of the whole automated wharf. The automated horizontal transport machinery of an unmanned wharf mainly includes container trucks, intelligent AGVs and straddle carriers.
However, during the construction stage of an unmanned wharf, and perhaps as a development mode of future wharfs, there may be a stage in which manned equipment (container trucks or cranes driven by people) and unmanned equipment (unmanned container trucks or unmanned cranes) are used together. Although scenes mixing large numbers of unmanned devices can be simulated and data accumulated, experience of using unmanned equipment in the complex scene where many unmanned devices share the site with manned devices is still limited, and drivers have real psychological concerns about working alongside unmanned container trucks.
Conventional VR simulated driving training is generally based on algorithm-generated simulation scenes, which differ in many ways from the complex scene in which a large number of unmanned devices share the site with manned devices. In particular, in a mixed-operation scene where many unmanned devices independently perceive the environment, make judgments and execute operations alongside multiple real drivers, many emergency situations arise (sudden bad weather, high or low temperatures affecting vehicle performance, pedestrians walking unpredictably, irregularly stacked goods, road preemption, congestion and the like); when such situations are only produced by artificial intelligence in an active simulation mode, a driver trained in VR will still face many maladaptations and operating risks even after long cooperative training alongside unmanned equipment. Because wharf machinery and goods are of very high value, accidents can cause large business losses to the wharf.
Therefore, the invention provides a hybrid scene-based simulated driving method, system, equipment and storage medium.
Disclosure of Invention
In view of the problems in the prior art, the invention aims to provide a hybrid scene-based driving simulation method, system, equipment and storage medium that overcome the difficulties of the prior art: they help a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene such as an unmanned wharf or an unmanned mine, facilitate collecting the driving data of unmanned equipment when meeting driver-controlled vehicles in order to establish a corresponding neural network driving model, and optimize the safety of joint operation between drivers and unmanned container trucks.
The embodiment of the invention provides a hybrid scene-based simulated driving method, which comprises the following steps:
s110, obtaining scene space data of a space scene;
s120, constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle;
s130, providing a driving simulator for each virtual vehicle, wherein the driving simulator displays the visual angle picture of the virtual vehicle in real time, receives driving operation information of a driver, updates the position change of the virtual vehicle in scene space data, and updates the position change to the scene space data;
and S140, receiving at least part of scene space data by the unmanned equipment in real time, carrying out route planning or navigation driving at least based on the scene space data, collecting the position and pose change of the unmanned equipment in real time, and updating the position and pose change into the scene space data.
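For illustration only, the following minimal Python sketch shows one way the shared scene space data of steps S110 to S140 could be held on a server, with driver-controlled virtual vehicles and unmanned devices both uploading pose updates. All class, method and vehicle names here are assumptions of this sketch, not the patented implementation.

```python
# Illustrative sketch only: a minimal in-memory "scene space data" store shared
# by driving simulators (S130) and unmanned devices (S140).
import threading
from dataclasses import dataclass


@dataclass
class Pose:
    x: float = 0.0        # meters, scene coordinates
    y: float = 0.0
    heading: float = 0.0  # radians


class SceneSpace:
    """Scene space data: latest pose of every virtual and real vehicle."""

    def __init__(self):
        self._poses = {}
        self._lock = threading.Lock()

    def update_pose(self, vehicle_id: str, pose: Pose) -> None:
        # S130/S140: each participant uploads its position change in real time.
        with self._lock:
            self._poses[vehicle_id] = pose

    def snapshot(self) -> dict:
        # S120/S140: readers take a consistent copy to render or plan against.
        with self._lock:
            return dict(self._poses)


scene = SceneSpace()
scene.update_pose("virtual_truck_14", Pose(10.0, 5.0, 0.0))    # driver-controlled
scene.update_pose("unmanned_truck_11", Pose(40.0, 5.0, 3.14))  # autonomous
print(scene.snapshot())
```

A single synchronized store is only one possible design; the point is that both kinds of participant write to, and read from, the same scene space data.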
Preferably, in step S110, scene space modeling is performed on the spatial scene through spatial scanning to obtain the scene space data.
preferably, in the step S120, a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene spatial data.
Preferably, step S130 includes:
S131, the driving simulator downloads scene space data within a preset range from a server in real time;
S132, generating and displaying a view angle picture according to the current view angle of the virtual vehicle and the updated scene space data;
S133, receiving the driver's current driving operation information, driving the virtual vehicle according to the driving operation information, obtaining the position change of the driven virtual vehicle in the scene space data, and uploading the position change to update the scene space data.
Preferably, in step S140, each unmanned device updates the scene space data within the preset range from the server in real time, performs route planning or navigation driving based on the scene space data and the preset driving rules of its intelligent module, and updates its position change into the scene space data.
Preferably, the scene space data within the preset range is the scene space data within a 200-meter radius centered on a virtual vehicle or an unmanned device.
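The preset range can be illustrated with a small sketch that keeps only the scene space data within the 200-meter radius mentioned above; the dictionary layout and function name are assumptions for illustration.

```python
# Illustrative sketch of the preset range: keep only scene space data within
# `radius` meters of the requesting vehicle.
import math


def scene_data_in_range(poses, center, radius=200.0):
    """Return only vehicles within `radius` meters of `center`.

    poses  -- vehicle_id -> (x, y) in scene coordinates
    center -- (x, y) of the virtual vehicle or unmanned device
    """
    cx, cy = center
    return {vid: (x, y) for vid, (x, y) in poses.items()
            if math.hypot(x - cx, y - cy) <= radius}


poses = {"truck_a": (50.0, 0.0), "truck_b": (500.0, 0.0)}
print(scene_data_in_range(poses, center=(0.0, 0.0)))  # only truck_a remains
```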
Preferably, after step S140 the method further includes:
S150, each unmanned device establishes a neural network driving model for meeting driver-controlled vehicles according to the driving data recorded when meeting the virtual vehicle.
The embodiment of the invention also provides a hybrid scene-based simulated driving system for implementing the above hybrid scene-based simulated driving method, comprising:
the space establishing module is used for obtaining scene space data of a space scene;
the picture feedback module, used for constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle;
the first updating module, used for providing a driving simulator for each virtual vehicle, wherein the driving simulator displays the view angle picture of the virtual vehicle in real time, receives the driving operation information of the driver, and updates the resulting position change of the virtual vehicle into the scene space data; and
the second updating module, used for the unmanned equipment to receive at least part of the scene space data in real time, perform route planning or navigation driving based on the scene space data, and collect the position and pose changes of the unmanned equipment in real time and update them into the scene space data.
Preferably, the driving simulator comprises a triple screen for displaying the view angle picture, a steering wheel, a gear device and an operation conversion module; the operation conversion module is connected to the steering wheel and the gear device respectively, receives their operation information, converts it into a driving operation instruction for the virtual vehicle, and feeds the instruction back to the server.
Preferably, the system further comprises: a neural network module, used for establishing a neural network driving model for meeting driver-controlled vehicles according to the driving data recorded when meeting the virtual vehicle.
The embodiment of the invention also provides a hybrid scene-based simulated driving apparatus, comprising:
a processor;
a memory having stored therein executable instructions of a processor;
wherein the processor is configured to perform the steps of the hybrid scene based simulated driving method described above via execution of the executable instructions.
The embodiment of the invention also provides a computer readable storage medium for storing a program which, when executed, implements the steps of the hybrid scene-based simulated driving method described above.
The hybrid scene-based driving simulation method, system, equipment and storage medium of the invention help a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene such as an unmanned wharf or an unmanned mine, facilitate collecting the driving data of unmanned equipment when meeting driver-controlled vehicles in order to establish a corresponding neural network driving model, and optimize the safety of joint operation between drivers and unmanned container trucks.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the detailed description of non-limiting embodiments, made with reference to the following drawings.
Fig. 1 is a flow chart of a hybrid scene-based simulated driving method of the present invention.
Fig. 2 to 5 are schematic diagrams of the implementation of the hybrid scene-based simulated driving method of the present invention.
Fig. 6 is a schematic diagram of the hybrid scene-based simulated driving system of the present invention.
Fig. 7 is a schematic structural view of the hybrid scene-based simulated driving apparatus of the present invention; and
Fig. 8 is a schematic structural view of a computer-readable storage medium according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the example embodiments may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus a repetitive description thereof will be omitted.
Fig. 1 is a flow chart of the hybrid scene-based simulated driving method of the present invention. As shown in fig. 1, an embodiment of the present invention provides a hybrid scene-based simulated driving method, comprising the following steps:
s110, obtaining scene space data of a space scene.
S120, constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle.
S130, providing a driving simulator for each virtual vehicle; the driving simulator displays the view angle picture of the virtual vehicle in real time, receives the driving operation information of the driver, and updates the resulting position change of the virtual vehicle into the scene space data.
S140, the unmanned equipment receives at least part of the scene space data in real time, performs route planning or navigation driving based on at least the scene space data, and collects its position and pose changes in real time and updates them into the scene space data, but is not limited thereto. The unmanned equipment may also, in cooperation with an automatic driving module, perform route planning or navigation driving based on composite information such as the scene space data, its own real-time radar scanning and image acquisition results, and information interaction between vehicles, which is not described in detail herein.
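A hedged sketch of this composite planning input follows: the unmanned device merges the server-side scene space data (which contains virtual vehicles that its own sensors cannot detect) with local radar returns and vehicle-to-vehicle messages before planning. Every function and field name here is an assumption of this sketch.

```python
# Illustrative sketch of the composite planning input: server-side scene space
# data is merged with on-board sensing before route planning.


def build_obstacle_set(scene_snapshot, radar_hits, v2v_msgs):
    """Union of obstacle positions, rounded to a 0.1 m grid, from all sources."""
    obstacles = set()
    for vid, (x, y) in scene_snapshot.items():    # includes virtual vehicles
        obstacles.add((round(x, 1), round(y, 1)))
    for (x, y) in radar_hits:                     # on-board radar returns
        obstacles.add((round(x, 1), round(y, 1)))
    for msg in v2v_msgs:                          # vehicle-to-vehicle reports
        obstacles.add((round(msg["x"], 1), round(msg["y"], 1)))
    return obstacles


obstacles = build_obstacle_set(
    scene_snapshot={"virtual_truck_14": (12.0, 5.0)},
    radar_hits=[(30.0, 5.0)],
    v2v_msgs=[{"x": 60.0, "y": 5.0}],
)
print(len(obstacles), "obstacles handed to the route planner")
```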
In a preferred embodiment, in step S110, the spatial scene is spatially modeled by spatial scanning to obtain scene spatial data, but not limited thereto.
In a preferred embodiment, in step S120, a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene space data, but not limited thereto.
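The patent does not specify the motion control model; purely for illustration, the sketch below assumes a standard kinematic bicycle model as a stand-in.

```python
# Illustrative sketch: one possible motion control model for the virtual
# vehicle, assumed here to be a kinematic bicycle model.
import math


def step_bicycle(x, y, heading, speed, steer, wheelbase=4.0, dt=0.05):
    """Advance the virtual vehicle by one simulation tick.

    speed -- longitudinal speed in m/s (from throttle/gear input)
    steer -- front wheel angle in radians (from steering-wheel input)
    """
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    heading += speed / wheelbase * math.tan(steer) * dt
    return x, y, heading


x, y, h = 0.0, 0.0, 0.0
for _ in range(20):  # one simulated second at 20 Hz
    x, y, h = step_bicycle(x, y, h, speed=5.0, steer=0.1)
print(round(x, 2), round(y, 2), round(h, 3))
```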
In a preferred embodiment, step S130 includes:
s131, the driving simulator downloads and updates scene space data in a preset range from the server in real time.
And S132, generating and displaying a view angle picture according to the current view angle of the virtual vehicle and the updated scene space data.
S133, receiving current driving operation information of a driver, performing driving operation on the virtual vehicle according to the driving operation information, obtaining position change of the virtual vehicle after driving in the scene space data, and uploading and updating the position change to the scene space data, but not limited to the scene space data.
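One possible shape of the S131 to S133 loop is sketched below; the server interface, rendering stage and input stage are assumed stand-ins for this sketch, not APIs from the patent or any real simulator product.

```python
# Illustrative sketch of the S131-S133 loop of the driving simulator.


class FakeServer:
    """Minimal stand-in for the scene space data server."""

    def __init__(self):
        self.poses = {"unmanned_truck_11": (40.0, 0.0)}

    def download_scene(self, center, radius=200.0):
        return dict(self.poses)  # range filtering omitted in this stand-in

    def upload_pose(self, vehicle_id, state):
        self.poses[vehicle_id] = (state["x"], state["y"])


def render_view(state, nearby):  # placeholder display stage (S132)
    print(f"view from ({state['x']:.1f}, {state['y']:.1f}): {len(nearby)} objects")


def read_driver_input():  # placeholder cockpit input stage
    return 0.0, 5.0  # steer angle (rad), speed (m/s): straight ahead


def simulator_tick(server, vehicle_id, state):
    # S131: download scene space data within the preset range.
    nearby = server.download_scene(center=(state["x"], state["y"]))
    # S132: generate and display the view from the current view angle.
    render_view(state, nearby)
    # S133: apply the current driving operation and upload the position change.
    steer, speed = read_driver_input()
    state["x"] += speed * 0.05  # simplified forward step at 20 Hz
    server.upload_pose(vehicle_id, state)


server = FakeServer()
state = {"x": 0.0, "y": 0.0}
for _ in range(3):
    simulator_tick(server, "virtual_truck_14", state)
```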
In a preferred embodiment, in step S140, each unmanned device updates the scene space data within the preset range from the server in real time, performs route planning or navigation driving based on the scene space data, and updates its position change into the scene space data, but not limited thereto.
In a preferred embodiment, the scene space data within the preset range is the scene space data within a 200-meter radius centered on the virtual vehicle or the unmanned device, but not limited thereto.
In a preferred embodiment, after step S140 the method further comprises: S150, each unmanned device establishes a neural network driving model for scenes of meeting driver-controlled vehicles according to the driving data recorded when meeting the virtual vehicle, but not limited thereto.
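As a toy illustration of S150, the sketch below fits a small neural network to hypothetical meeting records; the features, the labelling rule and the network size are all assumptions of this sketch, not the patent's model.

```python
# Toy illustration of S150: fit a tiny neural network to synthetic records of
# unmanned devices meeting a driver-controlled virtual vehicle.
import numpy as np

rng = np.random.default_rng(0)

# Each row: [relative_x, relative_y, own_speed, other_speed] at a meeting event;
# toy label: should the unmanned device slow down?
X = rng.normal(size=(256, 4))
y = (X[:, 0] < 0.0).astype(float).reshape(-1, 1)  # synthetic labelling rule

W1 = rng.normal(scale=0.1, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.1, size=(8, 1)); b2 = np.zeros(1)

for _ in range(500):  # plain full-batch gradient descent
    h = np.tanh(X @ W1 + b1)                   # hidden layer
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # sigmoid output
    grad_out = (p - y) / len(X)                # cross-entropy gradient
    grad_h = grad_out @ W2.T * (1.0 - h ** 2)  # backprop through tanh
    W2 -= 0.5 * h.T @ grad_out; b2 -= 0.5 * grad_out.sum(0)
    W1 -= 0.5 * X.T @ grad_h;   b1 -= 0.5 * grad_h.sum(0)

print("training accuracy:", ((p > 0.5) == y).mean())
```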
The hybrid scene-based simulated driving method is completely different from traditional VR driving equipment: spatial data are collected in real time from the real space, and after the virtual space is built, virtual vehicles are implanted into it. A driver can drive a virtual vehicle through the simulator, and since the unmanned vehicles update the spatial data in real time, they perceive, while driving, the position changes of a virtual vehicle that does not exist in the real space and drive autonomously based on the scene and artificial intelligence. The driver thus experiences, inside a safe simulator, the scene of a vehicle driven on site operating jointly with unmanned vehicles, combining realism with safety. The method also provides a data collection scene in which many unmanned vehicles learn to cooperate with real people, which facilitates optimization iterations of the unmanned vehicle system.
The hybrid scene-based driving simulation method helps a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene, facilitates collecting the driving data of unmanned operation equipment when meeting driver-controlled vehicles to establish a corresponding neural network driving model, and optimizes the safety of joint operation between drivers and unmanned container trucks.
Fig. 2 to 5 are schematic diagrams of the implementation of the hybrid scene-based simulated driving method of the present invention. As shown in fig. 2 to 5, the implementation procedure of the hybrid scene-based simulated driving method of the present invention is as follows:
as shown in fig. 2, scene space data is obtained by spatially modeling a spatial scene through spatial scanning. Three unmanned vehicles 11, 12 and 13 are arranged in scene space modeling. At least one virtual vehicle 14 is constructed in the scene space data and a perspective view is generated in real time according to the perspective of the virtual vehicle. The method comprises the steps of constructing a spatial data model and a motion control model of the virtual vehicle, and implanting the spatial data model and the motion control model into scene spatial data.
As shown in figs. 3 and 4, a driving simulator 21 is provided for the virtual vehicle 14. The driving simulator comprises a triple screen for displaying the view angle picture, a steering wheel, a gear device and an operation conversion module; the operation conversion module is connected to the steering wheel and the gear device respectively, receives their operation information, converts it into a driving operation instruction for the virtual vehicle, and feeds the instruction back to the server. The triple screen displays the view angle picture of the virtual vehicle in real time; since the unmanned vehicle 11 is directly in front of the virtual vehicle 14, the picture includes an image 111 of the unmanned vehicle 11. The driving simulator 21 receives the driving operation information of the driver 22 and updates the resulting position change of the virtual vehicle into the scene space data. The driving simulator downloads and updates the scene space data within a preset range from the server in real time, generates and displays the view angle picture according to the current view angle of the virtual vehicle and the updated scene space data, receives the driver's current driving operation information, drives the virtual vehicle accordingly, obtains the position change of the driven virtual vehicle in the scene space data, and uploads it to update the scene space data. The scene space data within the preset range is the scene space data within a 200-meter radius centered on the virtual vehicle.
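A minimal sketch of the operation conversion module follows, mapping steering-wheel and gear readings to a driving operation instruction for the server; the JSON message format, steering ratio and gear-to-speed table are assumptions of this sketch.

```python
# Illustrative sketch of the operation conversion module: raw steering-wheel
# and gear readings become a driving operation instruction for the virtual
# vehicle, to be fed back to the server.
import json
import math

GEAR_SPEEDS = {"R": -2.0, "N": 0.0, "D1": 3.0, "D2": 6.0}  # m/s per gear (assumed)


def convert_operation(wheel_angle_deg: float, gear: str) -> str:
    """Map cockpit inputs to a JSON driving instruction for the server."""
    instruction = {
        "steer_rad": math.radians(wheel_angle_deg) / 15.0,  # assumed 15:1 ratio
        "target_speed": GEAR_SPEEDS.get(gear, 0.0),
    }
    return json.dumps(instruction)


print(convert_operation(wheel_angle_deg=45.0, gear="D1"))
```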
As shown in fig. 5, the unmanned vehicle 11 receives at least part of the scene space data in real time, performs route planning or navigation driving based on the scene space data, and collects its position change in real time and updates it into the scene space data. Each unmanned vehicle updates the scene space data within a preset range from the server in real time, performs route planning or navigation driving based on the scene space data, and updates its position change into the scene space data. The scene space data within the preset range is the scene space data within a 200-meter radius centered on the unmanned vehicle 11. The unmanned vehicle 11 establishes a neural network driving model for scenes of meeting driver-controlled vehicles according to the driving data recorded when meeting the virtual vehicle 14, thereby helping the driver adapt, inside the simulator, to joint operation with unmanned operation equipment in a closed scene, and improving the operating efficiency and safety of mixed scenes in which drivers and unmanned operation equipment work together.
Fig. 6 is a schematic structural diagram of the hybrid scene-based simulated driving system of the present invention. As shown in fig. 6, an embodiment of the present invention further provides a hybrid scene-based simulated driving system 5 for implementing the above hybrid scene-based simulated driving method, which includes:
the space creation module 51 obtains scene space data of a space scene. And carrying out scene space modeling on the space scene through space scanning to obtain scene space data.
The picture feedback module 52 constructs at least one virtual vehicle in the scene space data and generates a view picture in real time according to the view of the virtual vehicle. The method comprises the steps of constructing a spatial data model and a motion control model of the virtual vehicle, and implanting the spatial data model and the motion control model into scene spatial data.
The first updating module 53 provides a driving simulator for each virtual vehicle, the driving simulator displays the view angle picture of the virtual vehicle in real time, and receives the driving operation information of the driver to update the position change of the virtual vehicle in the scene space data and update to the scene space data. The driving simulator downloads and updates scene space data in a preset range from the server in real time, generates and displays a view angle picture according to the current view angle of the virtual vehicle and the updated scene space data, receives current driving operation information of a driver, performs driving operation on the virtual vehicle according to the driving operation information, obtains the position change of the virtual vehicle after driving in the scene space data, and uploads and updates the scene space data. The scene space data in the preset range are scene space data in the radius range of 200 meters by taking the virtual vehicle as the center. The driving simulator (see fig. 4) comprises a triple screen for displaying visual angle pictures, a steering wheel, a gear device and an operation conversion module, wherein the operation conversion module is respectively connected with the steering wheel and the gear device, receives operation information, converts the operation information into a driving operation instruction of the virtual vehicle, and feeds back the driving operation instruction to the server.
The second updating module 54 receives at least part of the scene space data in real time, performs route planning or navigation running based on at least the scene space data, and acquires the position and pose changes of the unmanned equipment in real time and updates the position and pose changes into the scene space data. And each unmanned equipment updates scene space data in a preset range from the server in real time, performs route planning or navigation running based on the scene space data, and updates the position change of the unmanned equipment into the scene space data. The scene space data in the preset range are scene space data in the radius range of 200 meters by taking unmanned equipment as a center.
The neural network module 55, each unmanned device builds a driving model of a neural network for a vehicle meeting scene with a driver from the driving data when the virtual vehicle is meeting.
The hybrid scene-based simulated driving system helps a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene, facilitates collecting the driving data of unmanned operation equipment when meeting driver-controlled vehicles to establish a corresponding neural network driving model, and optimizes the safety of joint operation between drivers and unmanned container trucks.
The embodiment of the invention also provides a hybrid scene-based simulated driving apparatus, which comprises a processor and a memory in which executable instructions of the processor are stored, wherein the processor is configured to perform the steps of the hybrid scene-based simulated driving method via execution of the executable instructions.
As described above, the hybrid scene-based simulated driving apparatus helps a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene, facilitates collecting the driving data of unmanned operation equipment when meeting driver-controlled vehicles to establish a corresponding neural network driving model, and optimizes the safety of joint operation between drivers and unmanned container trucks.
Those skilled in the art will appreciate that the various aspects of the invention may be implemented as a system, method, or program product. Accordingly, aspects of the invention may be embodied in the following forms: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," "module," or "platform."
Fig. 7 is a schematic structural view of the hybrid scene-based simulated driving apparatus of the present invention. An electronic device 600 according to this embodiment of the invention is described below with reference to fig. 7. The electronic device 600 shown in fig. 7 is merely an example, and should not be construed as limiting the functionality and scope of use of embodiments of the present invention.
As shown in fig. 7, the electronic device 600 is in the form of a general purpose computing device. Components of electronic device 600 may include, but are not limited to: at least one processing unit 610, at least one memory unit 620, a bus 630 connecting the different platform components (including memory unit 620 and processing unit 610), a display unit 640, etc.
The storage unit stores program code executable by the processing unit 610, such that the processing unit 610 performs the steps according to the various exemplary embodiments of the present invention described in the simulated driving method section above. For example, the processing unit 610 may perform the steps shown in fig. 1.
The storage unit 620 may include readable media in the form of volatile storage units, such as Random Access Memory (RAM) 6201 and/or cache memory unit 6202, and may further include Read Only Memory (ROM) 6203.
The storage unit 620 may also include a program/utility 6204 having a set (at least one) of program modules 6205, such program modules 6205 including, but not limited to: an operating system, one or more application programs, other program modules, and program data, each or some combination of which may include an implementation of a network environment.
Bus 630 may be a local bus representing one or more of several types of bus structures including a memory unit bus or memory unit controller, a peripheral bus, an accelerated graphics port, a processing unit, or using any of a variety of bus architectures.
The electronic device 600 may also communicate with one or more external devices 700 (e.g., keyboard, pointing device, bluetooth device, etc.), one or more devices that enable a user to interact with the electronic device 600, and/or any device (e.g., router, modem, etc.) that enables the electronic device 600 to communicate with one or more other computing devices. Such communication may occur through an input/output (I/O) interface 650. Also, electronic device 600 may communicate with one or more networks such as a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public network, such as the Internet, through network adapter 660. The network adapter 660 may communicate with other modules of the electronic device 600 over the bus 630. It should be appreciated that although not shown, other hardware and/or software modules may be used in connection with electronic device 600, including, but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, data backup storage platforms, and the like.
The embodiment of the invention also provides a computer-readable storage medium for storing a program; the steps of the hybrid scene-based simulated driving method are implemented when the program is executed. In some possible embodiments, aspects of the present invention may also be implemented in the form of a program product comprising program code which, when the program product is run on a terminal device, causes the terminal device to carry out the steps according to the various exemplary embodiments of the invention described in the simulated driving method section of this specification.
As described above, when executed, the program on the computer-readable storage medium of this embodiment helps a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene, facilitates collecting the driving data of unmanned operation equipment when meeting driver-controlled vehicles to establish a corresponding neural network driving model, and optimizes the safety of joint operation between drivers and unmanned container trucks.
Fig. 8 is a schematic structural view of a computer-readable storage medium of the present invention. Referring to fig. 8, a program product 800 for implementing the above-described method according to an embodiment of the present invention is described, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present invention is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The program product may employ any combination of one or more readable media. The readable medium may be a readable signal medium or a readable storage medium. The readable storage medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include the following: an electrical connection having one or more wires, a portable disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The computer readable storage medium may include a data signal propagated in baseband or as part of a carrier wave, with readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A readable storage medium may also be any readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a readable storage medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
In summary, the hybrid scene-based driving simulation method, system, equipment and storage medium help a driver, inside a simulator, adapt to joint operation with unmanned operation equipment in a closed scene such as an unmanned wharf or an unmanned mine, facilitate collecting the driving data of unmanned equipment when meeting driver-controlled vehicles in order to establish a corresponding neural network driving model, and optimize the safety of joint operation between drivers and unmanned container trucks.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several simple deductions or substitutions may be made without departing from the spirit of the invention, and these should be considered to be within the scope of the invention.

Claims (9)

1. A hybrid scene-based simulated driving method, characterized by comprising the following steps:
s110, acquiring space data from a real space in real time, and constructing scene space data of a space scene;
s120, constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle;
s130, providing a driving simulator for each virtual vehicle, downloading and updating scene space data in a preset range from a server in real time by the driving simulator, generating a visual angle picture according to the current visual angle of the virtual vehicle and the updated scene space data, displaying the visual angle picture in real time, receiving current driving operation information of a driver, driving the virtual vehicle according to the driving operation information, obtaining position change of the virtual vehicle after driving in the scene space data, and uploading and updating the scene space data;
s140, a plurality of unmanned equipment in a real space receives at least part of scene space data in real time, perceives the position change of the virtual vehicle, performs route planning or navigation running based on at least the scene space data, acquires the position and position change of the unmanned equipment in real time, updates the position and position change of the unmanned equipment into the scene space data, updates the scene space data in a preset range in real time from a server, performs route planning or navigation running based on the scene space data and an intelligent module preset running rule, and updates the position change of the unmanned equipment into the scene space data;
s150, each unmanned device establishes a driving model of a neural network for meeting a vehicle with a driver according to the driving data of the virtual vehicle during meeting.
2. The hybrid scene-based simulated driving method as claimed in claim 1, wherein in step S110, scene space data is obtained by spatially modeling a spatial scene through spatial scanning.
3. The hybrid scene-based simulated driving method as claimed in claim 1, wherein in step S120, a spatial data model and a motion control model of the virtual vehicle are constructed and implanted into the scene space data.
4. The hybrid scene-based simulated driving method as claimed in claim 1, wherein the scene space data within the preset range is scene space data within a preset radius range centered on a virtual vehicle or an unmanned device.
5. A hybrid scene-based simulated driving system for implementing the hybrid scene-based simulated driving method as claimed in claim 1, comprising:
the space establishing module is used for acquiring space data from a real space in real time and constructing scene space data of a space scene;
the picture feedback module, used for constructing at least one virtual vehicle in the scene space data and generating a view angle picture in real time according to the view angle of the virtual vehicle;
the first updating module, used for providing a driving simulator for each virtual vehicle, wherein the driving simulator displays the view angle picture of the virtual vehicle in real time, receives the driving operation information of the driver, and updates the resulting position change of the virtual vehicle into the scene space data; and
the second updating module, through which a plurality of pieces of unmanned equipment in the real space receive at least part of the scene space data in real time, perceive the position change of the virtual vehicle, perform route planning or navigation driving based on the scene space data, and collect their position and pose changes in real time and update them into the scene space data.
6. The hybrid scene-based simulated driving system of claim 5, wherein the driving simulator comprises a triple screen for displaying the view angle picture, a steering wheel, a gear device and an operation conversion module, wherein the operation conversion module is connected to the steering wheel and the gear device respectively, receives their operation information, converts the operation information into a driving operation instruction for the virtual vehicle, and feeds the instruction back to a server.
7. The hybrid scene based simulated driving system as claimed in claim 5, further comprising:
a neural network module, used for establishing a neural network driving model for meeting driver-controlled vehicles according to the driving data recorded when meeting the virtual vehicle.
8. A hybrid scene-based simulated driving apparatus, comprising:
a processor;
a memory having stored therein executable instructions of a processor;
wherein the processor is configured to perform the steps of the hybrid scene based simulated driving method of any one of claims 1 to 4 via execution of executable instructions.
9. A computer-readable storage medium storing a program, characterized in that the program when executed implements the steps of the hybrid scene-based simulated driving method according to any one of claims 1 to 4.
CN202110511226.8A 2021-05-11 2021-05-11 Hybrid scene-based driving simulation method, system, equipment and storage medium Active CN113192381B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110511226.8A CN113192381B (en) 2021-05-11 2021-05-11 Hybrid scene-based driving simulation method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110511226.8A CN113192381B (en) 2021-05-11 2021-05-11 Hybrid scene-based driving simulation method, system, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113192381A CN113192381A (en) 2021-07-30
CN113192381B (en) 2023-07-28

Family

ID=76981062

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110511226.8A Active CN113192381B (en) 2021-05-11 2021-05-11 Hybrid scene-based driving simulation method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113192381B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113946259B (en) * 2021-09-18 2023-04-07 北京城市网邻信息技术有限公司 Vehicle information processing method and device, electronic equipment and readable medium
CN114141092B (en) * 2021-11-10 2023-01-20 武汉未来幻影科技有限公司 Method and system for constructing animation scene of driving test simulator
CN114185330A (en) * 2021-12-12 2022-03-15 蜂联智能(深圳)有限公司 Control method and control device based on multi-scene interaction
CN114327076A (en) * 2022-01-04 2022-04-12 上海三一重机股份有限公司 Virtual interaction method, device and system for working machine and working environment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019210821A1 (en) * 2018-05-03 2019-11-07 Formula Square Holdings Ltd Systems and methods for providing driving guidance
CN111781855A (en) * 2020-07-15 2020-10-16 北京领骏科技有限公司 Traffic on-loop automatic driving simulation system
CN112198859A (en) * 2020-09-07 2021-01-08 西安交通大学 Method, system and device for testing automatic driving vehicle in vehicle ring under mixed scene
CN112365215A (en) * 2020-12-02 2021-02-12 青岛慧拓智能机器有限公司 Mining area unmanned transportation simulation test system and method based on depth virtual-real mixing

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3220233B1 (en) * 2016-03-18 2020-11-04 Volvo Car Corporation Method and system for enabling interaction in a test environment
CN106652645A (en) * 2017-03-16 2017-05-10 百度在线网络技术(北京)有限公司 Vehicle driving training device, as well as operation method and device of vehicle driving training device
US10319109B2 (en) * 2017-03-31 2019-06-11 Honda Motor Co., Ltd. Interaction with physical objects as proxy objects representing virtual objects
US11328219B2 (en) * 2018-04-12 2022-05-10 Baidu Usa Llc System and method for training a machine learning model deployed on a simulation platform
US10755007B2 (en) * 2018-05-17 2020-08-25 Toyota Jidosha Kabushiki Kaisha Mixed reality simulation system for testing vehicle control system designs
CN109781431B (en) * 2018-12-07 2019-12-10 山东省科学院自动化研究所 automatic driving test method and system based on mixed reality
CN109632339A (en) * 2018-12-28 2019-04-16 同济大学 A kind of automatic driving vehicle traffic coordinating real steering vectors system and method
CN109765060A (en) * 2018-12-29 2019-05-17 同济大学 A kind of automatic driving vehicle traffic coordinating virtual test system and method
CN109887372A (en) * 2019-04-16 2019-06-14 北京中公高远汽车试验有限公司 Driving training analogy method, electronic equipment and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019210821A1 (en) * 2018-05-03 2019-11-07 Formula Square Holdings Ltd Systems and methods for providing driving guidance
CN111781855A (en) * 2020-07-15 2020-10-16 北京领骏科技有限公司 Traffic on-loop automatic driving simulation system
CN112198859A (en) * 2020-09-07 2021-01-08 西安交通大学 Method, system and device for testing automatic driving vehicle in vehicle ring under mixed scene
CN112365215A (en) * 2020-12-02 2021-02-12 青岛慧拓智能机器有限公司 Mining area unmanned transportation simulation test system and method based on depth virtual-real mixing

Also Published As

Publication number Publication date
CN113192381A (en) 2021-07-30


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050

Applicant after: Shanghai Xijing Technology Co.,Ltd.

Address before: Room 503-3, 398 Jiangsu Road, Changning District, Shanghai 200050

Applicant before: SHANGHAI WESTWELL INFORMATION AND TECHNOLOGY Co.,Ltd.

GR01 Patent grant