WO2022166264A1 - Simulation training system, method, device and electronic device for working machine - Google Patents

Simulation training system, method, device and electronic device for working machine

Info

Publication number
WO2022166264A1
WO2022166264A1 · PCT/CN2021/126599
Authority
WO
WIPO (PCT)
Prior art keywords
scene
user
simulation training
cockpit
information
Prior art date
Application number
PCT/CN2021/126599
Other languages
English (en)
French (fr)
Inventor
周文君
熊建华
罗成发
Original Assignee
三一汽车起重机械有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 三一汽车起重机械有限公司 filed Critical 三一汽车起重机械有限公司
Publication of WO2022166264A1 publication Critical patent/WO2022166264A1/zh

Classifications

    • G — PHYSICS
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B — EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 — Simulators for teaching or training purposes

Definitions

  • the present invention relates to the technical field of working machines, in particular to a simulation training system, method, device and electronic equipment for working machines.
  • the invention provides a simulation training system, method, device and electronic equipment for a working machine, which are used to solve the defect of poor immersion in the simulation training of the working machine in the prior art, and realize the simulation training with high immersion.
  • the present invention provides a simulation training system for a working machine, comprising:
  • the simulated cockpit is determined based on the cockpit of the working machine
  • the somatosensory controller is used to collect the body motion information of the user, and output the posture image information corresponding to the body motion information;
  • the VR device is used to generate a VR scene based on the simulated cockpit; the VR device is communicatively connected with the somatosensory controller, and is used to integrate the posture image information into the VR scene and display the updated VR scene.
  • the VR device is configured to generate a VR scene based on the positioning information of at least part of the operation components of the simulated cockpit.
  • the somatosensory controller is a non-wearable device, and includes an infrared fill light and at least two grayscale cameras.
  • the present invention also provides a simulation training method for the working machine, comprising:
  • the gesture image information is integrated into the VR scene, and the updated VR scene is displayed.
  • the VR scene is generated based on the positioning information of at least part of the operation components of the simulated cockpit corresponding to the cockpit of the working machine.
  • the limb movement information is collected by at least two gray-scale cameras.
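The four steps just summarized (display the VR scene, receive the body motion information, output the corresponding posture image information, and fuse it back into the scene) can be sketched as a simple per-frame update loop. The sketch below is purely illustrative: every class and function name is a hypothetical stand-in, not an interface from the disclosure.

```python
# Illustrative sketch of the simulation-training update loop described above.
# All names are hypothetical stand-ins for the real somatosensory controller,
# VR device, and scene-fusion implementations.

def capture_body_motion(raw_frames):
    """Stand-in for the somatosensory controller: turn raw camera frames
    into body-motion information (here, just a hand position)."""
    return {"hand": raw_frames.get("hand", (0.0, 0.0, 0.0))}

def to_posture_image(body_motion):
    """Convert body-motion information into posture image information."""
    return {part: {"position": pos} for part, pos in body_motion.items()}

def fuse_into_scene(vr_scene, posture_image):
    """Fuse the posture image information into the VR scene and return
    the updated scene, which is what gets displayed."""
    updated = dict(vr_scene)
    updated["user_posture"] = posture_image
    return updated

def training_step(vr_scene, raw_frames):
    """One iteration: receive motion, generate posture image, fuse, display."""
    motion = capture_body_motion(raw_frames)
    posture = to_posture_image(motion)
    return fuse_into_scene(vr_scene, posture)

scene = {"cockpit": "crane_cab_model"}      # the displayed VR scene
updated = training_step(scene, {"hand": (0.1, 0.2, 0.3)})
```

Running one step fuses the hand posture into the scene while leaving the cockpit model untouched, which mirrors the claim that the updated VR scene contains both the cockpit scene and the posture image information.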
  • the present invention also provides a simulation training device for a working machine, comprising:
  • the display module is used to display the VR scene corresponding to the manipulation of the working machine
  • the receiving module is used to receive the user's body movement information
  • a generating module configured to output gesture image information corresponding to the body movement information in response to the body movement information
  • the fusion module is used to fuse the gesture image information into the VR scene and display the updated VR scene.
  • the VR scene is generated based on the positioning information of at least part of the operation components of the simulated cockpit corresponding to the cockpit of the working machine.
  • in another aspect, the present invention also provides an electronic device, comprising a memory, a processor, and a computer program stored in the memory and runnable on the processor, wherein the processor, when executing the program, implements the steps of the simulation training method described above.
  • the present invention also provides a non-transitory computer-readable storage medium on which a computer program is stored, and when the computer program is executed by a processor, implements the steps of any of the above-mentioned methods for simulating training of a working machine.
  • according to the simulation training system, method, device and electronic device for a working machine provided by the present invention, an updated VR scene is obtained by fusing the VR scene with the user's posture image information, so that the virtual cockpit scene and the virtual user's body movements can be displayed simultaneously and virtual and reality are fully synchronized.
  • the user can operate the hardware operation components in the simulated cockpit, and the position of the corresponding hardware operation components, the operation status of the operation machine under the operation command, and the user's body movement information can be displayed in the VR scene.
  • the unity of action and vision significantly enhances the user's sense of immersion and experience.
  • FIG. 1 is a schematic structural diagram of a simulation training system for a working machine provided by some embodiments of the present invention
  • FIG. 2 is a schematic flowchart of a simulation training method for a work machine provided by some embodiments of the present invention
  • FIG. 3 is a schematic structural diagram of a simulation training device for a working machine provided by some embodiments of the present invention.
  • FIG. 4 is a schematic structural diagram of an electronic device provided by some embodiments of the present invention.
  • the simulation training system includes: a simulated cockpit 110 , a somatosensory controller 120 and a VR device 130 .
  • a simulated cockpit 110, determined based on the cockpit of a working machine;
  • the working machine in the embodiment of the present invention may be an excavator, a crane, a road roller, a fire truck, or the like.
  • the simulated cockpit 110 is used to provide the user with an actual training site, so that the user can touch the hardware operating components for operation training without entering the cockpit of the real working machine.
  • the hardware operating components of the simulated cockpit 110 can be restored with at least part of the hardware operating components of the real cockpit in a ratio of 1:1.
  • the simulated cockpit 110 in this embodiment has a low construction cost and can meet training requirements in which only some specific operating components are needed.
  • the hardware operating components of the simulated cockpit 110 can be restored in a ratio of 1:1 to all the hardware operating components of the real cockpit, so as to provide a more realistic training scenario for the user by providing a completely identical simulated environment.
  • for example, the simulated cockpit 110 may be provided with buttons for controlling the crane's switches; operating handles for controlling the lifting, lowering, and rotation of actuators such as the boom, the hoisting mechanism, the luffing mechanism, and the slewing device of the crane; and brakes, accelerators, clutches, and the like for controlling the running of the crane.
  • the user can experience an operation experience that is completely consistent with operating a real crane by manipulating hardware operation components such as buttons, handles, and accelerators on the simulated cockpit 110 .
  • the somatosensory controller 120 is used to collect the body motion information of the user, and output the posture image information corresponding to the body motion information;
  • the user's body motion information is used to represent the user's motion instructions when operating the hardware operating components of the work machine.
  • the body motion information may include at least one of the following parts: the user's hand motion information, the user's foot motion information, the user's head motion information, the user's torso motion information, and the like.
  • the gesture image information is used to intuitively represent the user's body motion information in the form of images.
  • the gesture image information corresponds to the body motion information, and may include at least one of the following parts: a hand gesture image, a footstep gesture image, a head gesture image, and a torso gesture image.
  • Posture image information and body motion information keep changing synchronously.
  • the VR device 130 is used to generate a VR scene based on the simulated cockpit 110 , and the VR device 130 is connected in communication with the somatosensory controller 120 to integrate the gesture image information into the VR scene and display the updated VR scene.
  • the VR device 130 further includes:
  • modeling equipment for generating the VR scene based on the simulated cockpit 110, such as a 3D scanner;
  • display equipment for displaying the VR scene generated by the modeling equipment, such as a 3D display system, a large-scale projection system, or a head-mounted stereoscopic display;
  • sound equipment for outputting ambient sound, so as to present a more realistic simulated environment to the user.
  • the VR scene includes a cockpit model generated based on the cockpit of the real working machine, models of the hardware operating components inside the cockpit, actuator models generated based on the actuators of the real working machine, and a working environment model generated based on the working environment of the real working machine.
  • the VR scene generated by the VR device 130 and the real cockpit are constructed in a ratio of 1:1.
  • the positions of the hardware operating components of the simulated cockpit and the corresponding hardware operating components in the VR scene are set to be completely synchronized.
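The complete 1:1 position synchronization described above can be thought of as copying each tracked hardware component's pose directly onto its VR counterpart every frame, with no scaling. A minimal sketch, with illustrative (hypothetical) data structures:

```python
# Hypothetical sketch: mirror simulated-cockpit component poses onto the
# corresponding VR models at a 1:1 ratio, so displacement stays synchronized.

def sync_components(hardware_poses, vr_models):
    """Copy each tracked hardware pose onto the matching VR model.
    hardware_poses: {component_name: (x, y, z)} from position tracking.
    vr_models: {component_name: {"position": (x, y, z)}}.
    """
    for name, pose in hardware_poses.items():
        if name in vr_models:
            vr_models[name]["position"] = pose  # 1:1, no scaling applied
    return vr_models

models = {"handle": {"position": (0.0, 0.0, 0.0)},
          "button": {"position": (1.0, 0.0, 0.0)}}
tracked = {"handle": (0.05, 0.0, 0.2)}     # the user pushed the handle
models = sync_components(tracked, models)
```

Only the components that are actually position-tracked move their VR counterparts, which matches the later embodiments where either some or all hardware operating components are tracked.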
  • when the user operates the hardware operating components of the simulated cockpit, the same operation scene is generated in the VR scene.
  • the VR scene also generates the operating status of the working machine under the operating instruction, so as to present a truly immersive simulation experience to the user.
  • when simulating training on a crane, the VR device 130 generates a VR scene corresponding to the crane, including: a crane cab model; models of the hardware operating components in the cab, such as buttons, handles, accelerators, clutches, and brakes; models of the actuators, such as the boom, the hoisting mechanism, the luffing mechanism, and the slewing device; and a model of the operating environment around the crane.
  • the hardware operating components in the above VR scene are fully synchronized with the hardware operating components in the simulated cockpit.
  • for example, the user presses the corresponding crane switch button in the simulated cockpit and simultaneously pushes the handle for controlling each actuator.
  • the switch button model in the VR scene is then activated correspondingly, and the handle model moves in synchronization with the handle in the simulated cockpit; at the same time, the actuator model controlled by the handle starts to rotate and lift so as to hoist the target model.
  • the VR device 130 is communicatively connected to the motion sensing controller 120 , wherein, there are various ways of communication connection, including: wired electrical connection through a cable and radio connection through a wireless transceiver.
  • in some embodiments, the VR device 130 is electrically connected with the somatosensory controller 120 through a cable.
  • the somatosensory controller 120 and the VR device 130 are respectively provided with an output interface and an input interface; the posture image information generated by the somatosensory controller 120 is output from the output interface of the somatosensory controller 120 and transmitted through a wired communication medium to the input interface of the VR device 130.
  • the wired communication medium may include: coaxial cable, twisted pair, or optical fiber.
  • transmitting the posture image information through wired communication has strong anti-interference ability; the loss of posture image information during transmission is small, and the quality of the posture image information received by the VR device 130 is high.
  • in other embodiments, the VR device 130 and the somatosensory controller 120 are wirelessly connected through a wireless transceiver.
  • the somatosensory controller 120 and the VR device 130 are respectively provided with a signal generating device and a signal receiving device; the posture image information generated by the somatosensory controller 120 is sent from the signal generating device of the somatosensory controller 120 and transmitted to the signal receiving device of the VR device 130 via the wireless communication connection.
  • the wireless communication mode may include: ZigBee transmission, Bluetooth transmission, wireless broadband transmission, 5G transmission, microwave transmission, or wireless bridge transmission.
  • transmitting the posture image information through wireless communication has the advantages of low cost, easy maintenance, convenient operation, and good adaptability and expansibility.
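Whether the link is wired or wireless, the posture image information has to be framed for transmission between the controller's output side and the VR device's input side. A transport-agnostic sketch, using a simple length-prefixed JSON framing that is an assumption for illustration only (the disclosure does not specify an encoding):

```python
# Hypothetical framing: serialize a posture-image frame for transmission from
# the somatosensory controller's output interface to the VR device's input
# interface. The same byte stream works over a cable or a wireless link.
import json
import struct

def encode_frame(posture_image):
    """Length-prefixed JSON payload (big-endian 4-byte length, then bytes)."""
    payload = json.dumps(posture_image).encode("utf-8")
    return struct.pack(">I", len(payload)) + payload

def decode_frame(data):
    """Inverse of encode_frame, as the VR device's input side might do."""
    (length,) = struct.unpack(">I", data[:4])
    return json.loads(data[4:4 + length].decode("utf-8"))

frame = {"hand": {"position": [0.1, 0.2, 0.3]}}
assert decode_frame(encode_frame(frame)) == frame   # lossless round trip
```

The length prefix lets the receiver delimit frames on a byte stream, which matters on the wired path where frames arrive back-to-back.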
  • the inventor found that, in the prior art, the VR scene presented by VR equipment is a virtual scene based on a completely virtual environment; during simulation training of the working machine based on the VR scene, the user can only see the virtual cockpit model generated by the VR equipment, and cannot see his or her own motions when operating the hardware operating components of the working machine, resulting in a sense of separation between motion and vision.
  • the VR device 130 after receiving the gesture image information generated by the somatosensory controller 120, the VR device 130 fuses the gesture image information into the VR scene, and displays the updated VR scene.
  • the updated VR scene includes: the VR scene generated by the simulated cockpit 110 and the gesture image information generated by the somatosensory controller 120 .
  • the updated VR scene can be synchronized in real time with the actual simulated cockpit scene and user actions. For example, when the user activates the power button of the simulated cockpit with a finger, the updated VR scene will also display a virtual finger to activate the power button of the virtual cockpit, realizing the unity of action and vision, and significantly enhancing the sense of immersion.
  • an updated VR scene is obtained by fusing the VR scene with the user's posture image information; the virtual cockpit scene and the virtual user's body movements can be displayed simultaneously, and virtual and reality are fully synchronized.
  • when the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the operating status of the working machine under the operating instruction, and the user's body movements, realizing the unity of movement and vision and significantly enhancing the user's sense of immersion and experience.
  • the VR device 130 is configured to generate a VR scene based on positioning information of at least part of the operating components of the simulated cockpit.
  • At least some hardware operating components in the simulated cockpit are positioned and tracked by the VR device 130 to generate position data, and then a VR scene is generated based on the position data.
  • all the hardware operating components generated in the VR scene are in the same position as the corresponding hardware operating components in the simulated cockpit, and keep synchronous displacement.
  • in one embodiment, only some of the hardware operating components generated in the VR scene, corresponding to part of the hardware operating components of the simulated cockpit, are position-tracked and constructed at a 1:1 ratio; this embodiment has low cost and good effect, especially when only some key operating components need to be trained.
  • in another embodiment, all of the hardware operating components generated in the VR scene, corresponding to all of the hardware operating components of the simulated cockpit, are position-tracked and constructed at a 1:1 ratio, which can provide the user with more realistic and more comprehensive simulation operation training.
  • according to the simulation training system for a working machine provided by the embodiment of the present invention, by generating the VR scene based on the positioning information of at least part of the operating components of the simulated cockpit corresponding to the cockpit of the working machine, a more realistic and comprehensive simulation environment can be provided for the user, with low cost and simple operation.
  • the somatosensory controller 120 is a non-wearable device, and includes an infrared fill light and at least two grayscale cameras.
  • the grayscale camera is used to capture the user's body motion information from at least two angles to obtain a three-dimensional stereo image.
  • for example, the grayscale cameras capture the user's hand motion from two angles, generate posture image information for each angle, and then fuse the two to obtain three-dimensional posture image information of the hand; based on the three-dimensional posture image information, the motion of the user's hand in the three-dimensional space of the real world can be reconstructed.
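Fusing two grayscale views into a three-dimensional hand position relies on standard stereo triangulation. A minimal sketch, assuming a rectified camera pair; the focal length and baseline values below are illustrative, not figures from the disclosure:

```python
# Minimal stereo-triangulation sketch: recover the 3D position of a hand
# feature seen by two grayscale cameras. Assumes a rectified pair with focal
# length f (in pixels) and baseline b (in mm); the values are illustrative.

def triangulate(x_left, x_right, y, f, b):
    """Depth from disparity: Z = f * b / (x_left - x_right).
    Returns the (X, Y, Z) position in the left camera's frame, in mm."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    z = f * b / disparity
    x = x_left * z / f          # back-project pixel column to mm
    y3d = y * z / f             # back-project pixel row to mm
    return (x, y3d, z)

# A feature at pixel column 420 (left) and 400 (right), row 240,
# with f = 400 px and a 40 mm baseline: disparity = 20 px.
point = triangulate(420, 400, 240, f=400, b=40)
```

Closer hands produce larger disparity, so depth resolution is best near the cameras, which is consistent with a short-range detection space.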
  • Infrared fill light is used to fill light for grayscale cameras and improve image quality, especially when the light is dim.
  • the infrared fill light is small in size, low in power consumption, and has good directivity.
  • however, the present invention is not limited to this; for example, a white-light fill light can also be used.
  • in the prior art, wearable data gloves are typically used to capture hand motion; however, such gloves are difficult to fit to the finger sizes of different users, are expensive, and are prone to wear.
  • the body motion information of the user is collected by the somatosensory controller 120, and the user's body motion information can be accurately captured without the user wearing any wearable device such as gloves, which has good universality.
  • the somatosensory controller 120 may be a Leap Motion somatosensory controller.
  • the Leap Motion somatosensory controller is equipped with a first grayscale camera, a second grayscale camera and an infrared LED.
  • in the actual simulation, to obtain the user's hand motion information, the user only needs to fix the Leap Motion somatosensory controller on the head-mounted stereoscopic display; the Leap Motion somatosensory controller then captures hand motion in the surrounding environment, with a detection range between 25 mm and 600 mm above the controller and a detection space that is roughly an inverted pyramid.
  • the Leap Motion controller can track all 10 fingers of the user with an accuracy of up to 1/100th of a millimeter, far more accurate than existing motion control technologies.
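The stated detection region (25 mm to 600 mm above the controller, roughly an inverted pyramid) can be modeled as a simple containment test when deciding whether a tracked point is usable. The half-angle below is an assumed value for illustration; the disclosure only gives the height range and the pyramid shape:

```python
import math

# Hypothetical containment test for the detection region described above:
# an inverted pyramid with its apex at the controller, valid between
# 25 mm and 600 mm of height. The 60-degree half-angle is an assumption.

def in_detection_region(x, y, z, half_angle_deg=60.0):
    """Return True if point (x, y, z) in mm, with z measured upward from
    the controller, lies inside the assumed detection region."""
    if not 25.0 <= z <= 600.0:
        return False
    # At height z, the pyramid's half-width grows as z * tan(half_angle).
    limit = z * math.tan(math.radians(half_angle_deg))
    return abs(x) <= limit and abs(y) <= limit

assert in_detection_region(0, 0, 300)       # directly above, mid-range
assert not in_detection_region(0, 0, 10)    # too close to the controller
assert not in_detection_region(0, 0, 700)   # beyond the 600 mm range
```

A check like this lets the system ignore spurious detections outside the reliable tracking volume before fusing posture data into the scene.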
  • according to the simulation training system for a working machine provided by the embodiment of the present invention, by using the somatosensory controller 120 to photograph the user's body motions, three-dimensional posture image information is generated, and the user's body motion information can be accurately captured without wearing any wearable device; the operation is simple and the universality is good.
  • the execution subject of the simulation training method of the working machine may be a controller on the simulation cockpit, or a control device independent of the simulation cockpit, or a server connected in communication with the simulation cockpit.
  • the simulation training method of the working machine includes: step 210 , step 220 , step 230 and step 240 .
  • the working machine may include an excavator, a crane, a road roller, a fire truck, or the like.
  • Step 210: display a VR scene corresponding to the manipulation of the working machine.
  • the position information of the hardware operating components of the actual working machine is collected through the VR positioning technology, and a VR scene corresponding to each hardware operating component of the actual working machine is generated.
  • the VR scene can be constructed in a 1:1 ratio with some hardware operating components of the actual working machine; or with all the hardware operating components of the actual working machine; or with the entire cockpit of the actual working machine, all hardware operating components in the cockpit, the actuators of the working machine, and the surrounding working environment, so as to present a more realistic, immersive simulation experience to the user.
  • the present invention is not limited to this.
  • the VR scene can also be constructed in other proportions with the actual working machine, which is not limited in the present invention.
  • the hardware operating components in the VR scene and those in the simulated cockpit are set to be completely synchronized, so that when the user actually operates a hardware operating component in the simulated cockpit, the operating scene of the corresponding hardware operating component in the VR scene can be displayed in real time.
  • for example, when the user operates the start switch in the simulated cockpit, the corresponding start switch in the VR scene is turned on, and the simulated working machine starts at the same time.
  • the VR scene may be displayed by a 3D presentation system, a large projection system, a head mounted stereoscopic display, or the like.
  • Step 220: receive the user's body motion information.
  • the user's body motion information is used to represent the motion instructions when the user operates the work machine.
  • the user's body motion information may include at least one of the following parts: the user's hand motion information, the user's foot motion information, the user's head motion information, the user's torso motion information, and the like.
  • the user's body movement information can be collected by sensors, and the number of sensors is 2 or more, which are used to collect the user's body movement information from different angles.
  • Step 230: in response to the body motion information, output the posture image information corresponding to the body motion information.
  • the limb motion information collected in step 220 is converted into corresponding posture image information, which is used to represent the limb motion information corresponding to the actual operation instruction of the user in the virtual scene.
  • the gesture image information may correspondingly include at least one of the following parts: hand gesture image information, foot gesture image information, head gesture image information, and torso gesture image information.
  • the gesture image information may be three-dimensional gesture image information.
  • Step 240: integrate the posture image information into the VR scene, and display the updated VR scene.
  • in the prior art, the VR scene is a virtual scene based on a completely virtual environment; during training of the working machine based on the VR scene, the operator can only see the virtual cockpit model, and cannot see his or her own body movements when operating the machine, resulting in a sense of separation between movement and vision.
  • in a related technology, a working machine simulation training system is based on mixed reality technology, which introduces the real scene into the virtual scene: while seeing the virtual cockpit model, the operator also sees the real machine hardware and his or her real hands, so the user repeatedly switches between virtual and reality, which destroys the sense of immersion and experience.
  • the posture image information is integrated into the VR scene, and the updated VR scene is displayed, and the updated VR scene includes both the cockpit VR scene and the posture image information.
  • users can not only see the virtual cockpit model, but also see their own body movements when operating the machine, which has a better sense of immersion.
  • according to the simulation training method for a working machine provided by the embodiment of the present invention, an updated VR scene is obtained by fusing the VR scene with the user's posture image information; the virtual cockpit scene and the virtual user's body movements can be displayed at the same time, and virtual and reality are fully synchronized.
  • when the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the operating status of the working machine under the operating instruction, and the user's body movements, realizing the unity of movement and vision and significantly enhancing the user's sense of immersion and experience.
  • the VR scene is generated based on the positioning information of at least part of the operating components of the simulated cockpit corresponding to the cockpit of the work machine.
  • position data is generated by tracking at least some hardware operating components of the simulated cockpit, and the VR scene is generated based on this position data, so that the hardware operating components generated in the VR scene are in the same positions as the corresponding hardware operating components in the simulated cockpit and move in synchronization with them.
  • in one embodiment, only some of the hardware operating components generated in the VR scene, corresponding to part of the hardware operating components of the simulated cockpit, are position-tracked and constructed at a 1:1 ratio; this embodiment has low cost and good effect, especially in scenarios where only some key hardware operating components need to be trained.
  • in another embodiment, all of the hardware operating components generated in the VR scene, corresponding to all of the hardware operating components of the simulated cockpit, are position-tracked and constructed at a 1:1 ratio, which can provide the user with more realistic and more comprehensive simulation operation training.
  • for example, some hardware operating components of the simulated cockpit can be restored at a 1:1 ratio from some hardware operating components of the real cockpit; or, all hardware operating components of the simulated cockpit can be restored at a 1:1 ratio from all hardware operating components of the real cockpit.
  • according to the simulation training method for a working machine provided by the embodiment of the present invention, by generating the VR scene based on the positioning information of at least part of the operating components of the simulated cockpit corresponding to the cockpit of the working machine, a more realistic and comprehensive simulation environment can be provided for the user, with low cost and simple operation.
  • the user's limb motion information is collected by at least two gray-scale cameras.
  • in the prior art, wearable data gloves are typically used to capture hand motion; however, such gloves are difficult to fit to the hand sizes of different users, are expensive, and are prone to wear.
  • in this embodiment, the user's body motion information is collected by cameras, so it can be accurately captured without the user wearing any wearable device such as gloves, which has good universality.
  • the user's body motion information can be captured from at least two angles to obtain a three-dimensional image.
  • for example, the grayscale cameras capture the user's hand motion from two angles, generate posture image information for each angle, and then fuse the two to obtain three-dimensional posture image information of the hand; based on this, the motion of the user's hand in the three-dimensional space of the real world can be reconstructed.
  • the user's body motion information can be photographed by at least two grayscale cameras, so that three-dimensional posture image information can be generated and the user's body motion information can be accurately captured without wearing any wearable device; the operation is simple and the universality is good.
  • the following describes the simulation training device for a working machine provided by the present invention, and the simulation training device for a working machine described below and the simulation training method for a working machine described above can refer to each other correspondingly.
  • the simulation training device of the working machine includes: a display module 310 , a receiving module 320 , a generating module 330 and a fusion module 340 .
  • the display module 310 is used to display the VR scene corresponding to the manipulation of the working machine
  • a receiving module 320 configured to receive the user's body movement information
  • the generating module 330 is used for outputting gesture image information corresponding to the body movement information in response to the body movement information;
  • the fusion module 340 is used to fuse the gesture image information into the VR scene and display the updated VR scene.
  • the VR scene is generated based on positioning information of at least a portion of the operating components of the simulated cockpit corresponding to the cockpit of the work machine.
  • the body motion information is collected by at least two grayscale cameras.
  • according to the simulation training device for a working machine provided by the embodiment of the present invention, the VR scene and the user's posture image information are fused by the fusion module 340 to obtain the updated VR scene, which can display the virtual cockpit scene and the virtual user's body movements at the same time, realizing complete synchronization of virtual and reality.
  • when the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the operating status of the working machine under the operating instruction, and the user's body movements, which significantly enhances the user's sense of immersion and experience.
  • FIG. 4 illustrates a schematic diagram of the physical structure of an electronic device.
  • the electronic device may include: a processor (Processor) 410, a communication interface (Communications Interface) 420, a memory (Memory) 430 and a communication bus 440,
  • the processor 410 , the communication interface 420 , and the memory 430 communicate with each other through the communication bus 440 .
  • the processor 410 can invoke the logic instructions in the memory 430 to execute the simulation training method for a working machine, the method comprising: displaying a VR scene corresponding to the manipulation of the working machine; receiving the user's body motion information; in response to the body motion information, outputting the posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
  • the above-mentioned logic instructions in the memory 430 can be implemented in the form of software functional units and can be stored in a computer-readable storage medium when sold or used as an independent product.
  • the technical solution of the present invention can be embodied in the form of a software product in essence, or the part that contributes to the prior art or the part of the technical solution.
  • the computer software product is stored in a storage medium, including Several instructions are used to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute all or part of the steps of the methods of various embodiments of the present invention.
  • the aforementioned storage medium includes: U disk, mobile hard disk, Read-Only Memory (ROM, Read-Only Memory), Random Access Memory (RAM, Random Access Memory), magnetic disk or optical disk and other media that can store program codes .
  • The present invention also provides a computer program product.
  • The computer program product includes a computer program stored on a non-transitory computer-readable storage medium.
  • The computer program includes program instructions.
  • When the program instructions are executed by a computer, the method comprises: displaying a VR scene corresponding to the manipulation of the work machine; receiving the user's body motion information; in response to the body motion information, outputting posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
  • The present invention also provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program carries out the simulation training method for a work machine provided by the above methods, the method comprising: displaying a VR scene corresponding to the manipulation of the work machine; receiving the user's body motion information; in response to the body motion information, outputting posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
  • The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of this embodiment's solution. A person of ordinary skill in the art can understand and implement this without creative effort.
  • Each embodiment can be implemented by software plus a necessary general-purpose hardware platform, and certainly also by hardware.
  • The above technical solutions, in essence or as regards the part contributing to the prior art, may be embodied in the form of software products; the computer software products may be stored in computer-readable storage media, such as ROM/RAM, magnetic disks, or optical discs, and include several instructions for causing a computer device (which may be a personal computer, server, or network device, etc.) to perform the methods of the various embodiments or portions of embodiments.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • Electrically Operated Instructional Devices (AREA)

Abstract

The present invention provides a simulation training system, method, and apparatus for a work machine, and an electronic device. The simulation training system includes: a simulated cockpit, determined based on the cockpit of the work machine; a somatosensory controller, configured to collect the user's body motion information and output posture image information corresponding to that body motion information; and a VR device, configured to generate a VR scene based on the simulated cockpit, communicatively connected to the somatosensory controller, and configured to fuse the posture image information into the VR scene and display the updated VR scene. By fusing the VR scene with the user's posture image information, the simulation training system of the present invention obtains an updated VR scene that can display the virtual cockpit scene and the virtual user's limb movements at the same time, fully synchronizing the virtual and the real. During simulation training, the user's operating actions and vision are unified, significantly enhancing the user's sense of immersion and experience.

Description

Simulation training system, method, and apparatus for a work machine, and electronic device Technical Field
The present invention relates to the technical field of work machines, and in particular to a simulation training system, method, and apparatus for a work machine, and an electronic device.
Background Art
Work machines operate in complex environments, are complex to operate, and are high-value equipment, so the requirements on operators are correspondingly high, and operators must be trained before going on the machine. In the prior art, because work machines are expensive and operating errors can easily cause accidents, training on actual equipment is costly and risky, so hands-on training is generally not carried out; instead, purely theoretical training or simulation training in a virtual scene is used, which divorces theory from practice, gives a poor simulated experience, and yields unsatisfactory training results.
Summary of the Invention
The present invention provides a simulation training system, method, and apparatus for a work machine, and an electronic device, to overcome the poor immersion of work-machine simulation training in the prior art and to achieve highly immersive simulation training.
The present invention provides a simulation training system for a work machine, comprising:
a simulated cockpit, the simulated cockpit being determined based on the cockpit of the work machine;
a somatosensory controller, the somatosensory controller being configured to collect the user's body motion information and output posture image information corresponding to the body motion information;
a VR device, the VR device being configured to generate a VR scene based on the simulated cockpit, the VR device being communicatively connected to the somatosensory controller and configured to fuse the posture image information into the VR scene and display the updated VR scene.
According to a simulation training system for a work machine provided by the present invention, the VR device is configured to generate the VR scene based on positioning information of at least some operating components of the simulated cockpit.
According to a simulation training system for a work machine provided by the present invention, the somatosensory controller is a non-wearable device and includes an infrared fill light and at least two grayscale cameras.
The present invention further provides a simulation training method for a work machine, comprising:
displaying a VR scene corresponding to the manipulation of the work machine;
receiving the user's body motion information;
in response to the body motion information, outputting posture image information corresponding to the body motion information;
fusing the posture image information into the VR scene and displaying the updated VR scene.
According to a simulation training method for a work machine provided by the present invention, the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
According to a simulation training method for a work machine provided by the present invention, the body motion information is collected by at least two grayscale cameras.
The present invention further provides a simulation training apparatus for a work machine, comprising:
a display module, configured to display a VR scene corresponding to the manipulation of the work machine;
a receiving module, configured to receive the user's body motion information;
a generation module, configured to output, in response to the body motion information, posture image information corresponding to the body motion information;
a fusion module, configured to fuse the posture image information into the VR scene and display the updated VR scene.
According to a simulation training apparatus for a work machine provided by the present invention, the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
The present invention further provides an electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the program, implements the steps of any one of the simulation training methods for a work machine described above.
The present invention further provides a non-transitory computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any one of the simulation training methods for a work machine described above.
With the simulation training system, method, and apparatus for a work machine and the electronic device provided by the present invention, the VR scene and the user's posture image information are fused to obtain an updated VR scene that can display the virtual cockpit scene and the virtual user's limb movements at the same time, fully synchronizing the virtual and the real. During simulation training, as the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the working state of the work machine under the given operating command, and the user's body motion information, unifying action and vision and significantly enhancing the user's sense of immersion and experience.
Brief Description of the Drawings
To explain the technical solutions of the present invention or the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic structural diagram of a simulation training system for a work machine provided by some embodiments of the present invention;
FIG. 2 is a schematic flowchart of a simulation training method for a work machine provided by some embodiments of the present invention;
FIG. 3 is a schematic structural diagram of a simulation training apparatus for a work machine provided by some embodiments of the present invention;
FIG. 4 is a schematic structural diagram of an electronic device provided by some embodiments of the present invention.
Detailed Description
To make the objects, technical solutions, and advantages of the present invention clearer, the technical solutions of the present invention are described clearly and completely below with reference to the drawings of the present invention. Obviously, the described embodiments are some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
The simulation training system for a work machine of the present invention is described below with reference to FIG. 1.
As shown in FIG. 1, the simulation training system includes: a simulated cockpit 110, a somatosensory controller 120, and a VR device 130.
The simulated cockpit 110 is determined based on the cockpit of the work machine.
The work machine in embodiments of the present invention may be an excavator, a crane, a road roller, a hoist, a fire truck, or the like.
The simulated cockpit 110 provides the user with a physical training site, so that the user can touch the hardware operating components and practice operations without entering the cockpit of a real work machine.
In some embodiments, the hardware operating components of the simulated cockpit 110 may reproduce at least some of the hardware operating components of the real cockpit at a 1:1 scale. A simulated cockpit 110 of this kind is inexpensive to build and meets the need for targeted training on particular operating components.
In other embodiments, the hardware operating components of the simulated cockpit 110 may reproduce all of the hardware operating components of the real cockpit at a 1:1 scale, providing a completely identical simulated environment and thus a more realistic training scene for the user.
For example, for crane simulation training, the simulated cockpit 110 may be equipped with buttons for switching the crane on and off; operating levers for raising, lowering, and rotating the crane's actuators, such as the boom, hoisting mechanism, luffing mechanism, and slewing gear; and the brake, throttle, clutch, and so on for controlling crane travel.
In actual training, by manipulating the buttons, levers, throttle, and other hardware operating components of the simulated cockpit 110, the user can experience exactly the same operating feel as on a real crane.
The somatosensory controller 120 is configured to collect the user's body motion information and output posture image information corresponding to the body motion information.
The user's body motion information represents the user's action commands when operating the hardware operating components of the work machine.
The body motion information may include at least one of the following: the user's hand motion information, foot motion information, head motion information, and torso motion information.
The posture image information visually represents the user's body motion information in image form.
Corresponding to the body motion information, the posture image information may include at least one of the following: a hand posture image, a foot posture image, a head posture image, and a torso posture image.
The posture image information changes in synchronization with the body motion information.
The VR device 130 is configured to generate a VR scene based on the simulated cockpit 110; the VR device 130 is communicatively connected to the somatosensory controller 120 and is configured to fuse the posture image information into the VR scene and display the updated VR scene.
The VR device 130 further includes:
modeling equipment for generating the VR scene based on the simulated cockpit 110, such as a 3D scanner;
display equipment for displaying the VR scene generated by the modeling equipment, such as a 3D display system, a large projection system, or a head-mounted stereoscopic display;
audio equipment for outputting ambient sound, to present a more realistic simulated environment to the user;
interaction equipment for positioning, motion capture, and other interactive operations.
In some embodiments, the VR scene includes a cockpit model generated from the real work machine's cockpit, models of the hardware operating components inside the cockpit, actuator models generated from the real work machine's actuators, and a working-environment model generated from the real work machine's working environment.
The VR scene generated by the VR device 130 is built at a 1:1 scale with the real cockpit, and VR positioning technology keeps the positions of the simulated cockpit's hardware operating components fully synchronized with the corresponding hardware operating components in the VR scene.
When the user operates a hardware operating component of the simulated cockpit, the same operating scene is generated in the VR scene; at the same time, based on the operating command, the VR scene also generates the corresponding working state of the work machine under that command, presenting a more realistic, immersive simulated experience to the user.
For example, for crane simulation training, the VR device 130 generates a VR scene for the crane, including: a crane cab model; models of the hardware operating components in the cab, such as the buttons, levers, throttle, clutch, and brake; models of the crane's actuators, such as the boom, hoisting mechanism, luffing mechanism, and slewing gear; and a model of the working environment around the crane.
The hardware operating components in the above VR scene are fully synchronized with those in the simulated cockpit. When the user needs to perform a lifting operation, the user presses the crane's on/off button in the simulated cockpit and pushes the levers that control the actuators. The corresponding button model in the VR scene is actuated and the lever models move in synchronization with the levers in the simulated cockpit; at the same time, the actuator models controlled by the levers begin to rotate, raise, and lower so as to lift the target-object model.
The VR device 130 is communicatively connected to the somatosensory controller 120. The connection can take several forms, including a wired electrical connection via cables and a wireless connection via wireless transceivers.
In some embodiments, the VR device 130 and the somatosensory controller 120 are electrically connected by cable.
In this embodiment, the somatosensory controller 120 and the VR device 130 are provided with an output interface and an input interface, respectively; the posture image information generated by the somatosensory controller 120 is output from the output interface of the somatosensory controller 120 and transmitted to the input interface of the VR device 130 over a wired communication medium.
The wired communication medium may include coaxial cable, twisted pair, optical fiber, or the like.
Transmitting the posture image information over a wired connection offers strong immunity to interference, little posture image information is lost in transit, and the posture image information received by the VR device 130 is of high quality.
In other embodiments, the VR device 130 and the somatosensory controller 120 are wirelessly connected via wireless transceivers.
In this embodiment, the somatosensory controller 120 and the VR device 130 are provided with a signal transmitting apparatus and a signal receiving apparatus, respectively; the posture image information generated by the somatosensory controller 120 is emitted from the signal transmitting apparatus of the somatosensory controller 120 and sent to the signal receiving apparatus of the VR device 130 over a wireless communication connection.
Wireless communication may include ZigBee, Bluetooth, wireless broadband, 5G, microwave, or wireless-bridge transmission, among others.
Transmitting the posture image information wirelessly is low-cost, easy to maintain, and convenient to operate, with good adaptability and extensibility.
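Whether the link is a cable or a radio, the controller must serialize each posture frame and the VR device must decode it. The sketch below models that exchange over an in-process loopback; the JSON framing and the field names (`frame`, `joints`) are assumptions for illustration, not a protocol defined by the patent.

```python
import json
import socket

def encode_posture(frame_id: int, joints: dict) -> bytes:
    # One newline-delimited JSON message per captured posture frame.
    return (json.dumps({"frame": frame_id, "joints": joints}) + "\n").encode()

def decode_posture(payload: bytes) -> dict:
    return json.loads(payload.decode())

# Loopback socket pair standing in for the cable or wireless transceiver link.
tx, rx = socket.socketpair()
tx.sendall(encode_posture(1, {"index_tip": [0.1, 0.2, 0.3]}))
msg = decode_posture(rx.makefile("rb").readline())
assert msg["frame"] == 1
assert msg["joints"]["index_tip"] == [0.1, 0.2, 0.3]
tx.close(); rx.close()
```

Newline-delimited framing keeps messages self-delimiting on a stream transport; a real deployment would add sequence checking so stale frames can be dropped.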
During development, the inventors found that in the prior art the VR scene presented by a VR device is a purely virtual scene: during VR-based work-machine simulation training, the user can see only the virtual cockpit model generated by the VR device and cannot see their own action commands while operating the machine's hardware operating components, so action and vision feel disconnected. There is also a work-machine simulation training system based on mixed reality, which introduces the real scene into the virtual one: while seeing the virtual cockpit model, the user also sees the real machine hardware and their real hands. Switching back and forth between the virtual and the real breaks the sense of immersion and experience.
According to some embodiments of the present invention, after receiving the posture image information generated by the somatosensory controller 120, the VR device 130 fuses the posture image information into the VR scene and displays the updated VR scene.
The updated VR scene includes: the VR scene generated from the simulated cockpit 110 and the posture image information generated by the somatosensory controller 120.
In practice, the updated VR scene can be synchronized in real time with the physical simulated-cockpit scene and the user's actions. For example, when the user presses the power button of the simulated cockpit with a finger, the updated VR scene simultaneously shows a virtual finger pressing the power button of the virtual cockpit, unifying action and vision and significantly enhancing immersion.
With the simulation training system for a work machine provided by embodiments of the present invention, fusing the VR scene with the user's posture image information yields an updated VR scene that can display the virtual cockpit scene and the virtual user's limb movements at the same time, fully synchronizing the virtual and the real. During simulation training, as the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the working state of the work machine under the given operating command, and the user's body motion information, unifying action and vision and significantly enhancing the user's sense of immersion and experience.
In some embodiments, the VR device 130 is configured to generate the VR scene based on positioning information of at least some operating components of the simulated cockpit.
In this embodiment, the VR device 130 locates and position-tracks at least some of the hardware operating components in the simulated cockpit to generate position data, and then generates the VR scene based on the position data. All hardware operating components generated in the VR scene have the same positions as their counterparts in the simulated cockpit and move in synchronization with them.
In this embodiment, two or more of the hardware operating components generated in the VR scene are position-tracked against the corresponding subset of the simulated cockpit's hardware operating components and built at a 1:1 scale; this is inexpensive and effective, especially when only certain key operating components need to be trained on.
In other embodiments, all of the hardware operating components generated in the VR scene are position-tracked against all of the simulated cockpit's corresponding hardware operating components and built at a 1:1 scale, providing the user with more realistic and comprehensive simulated operation training.
With the simulation training system for a work machine provided by embodiments of the present invention, generating the VR scene based on positioning information of at least some operating components of the simulated cockpit corresponding to the work machine's cockpit provides the user with a more realistic and comprehensive simulated environment at low cost and with simple operation.
In some embodiments, the somatosensory controller 120 is a non-wearable device and includes an infrared fill light and at least two grayscale cameras.
In this embodiment, the grayscale cameras capture the user's body motion information from at least two angles to obtain stereoscopic 3D images.
For example, when acquiring the user's hand motion information, the grayscale cameras capture the user's hand motion from two angles and generate posture image information for each angle; fusing the two sets of posture image information yields 3D posture image information of the user's hand, from which the motion of the user's hand in real-world 3D space can be reconstructed.
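The two-angle reconstruction described above amounts to stereo triangulation: the same point seen from two horizontally offset cameras appears at different image columns, and that disparity encodes depth. A minimal sketch follows, assuming rectified pinhole cameras; the focal length and baseline values are illustrative and not taken from any specific controller.

```python
def depth_from_disparity(x_left: float, x_right: float,
                         focal_px: float, baseline_mm: float) -> float:
    """Depth Z = f * B / d for a rectified stereo pair.

    x_left, x_right: horizontal pixel coordinates of the same point
    focal_px: focal length in pixels; baseline_mm: camera separation.
    """
    disparity = x_left - x_right   # pixel offset between the two views
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    return focal_px * baseline_mm / disparity

# A fingertip seen at column 320 in the left image and 300 in the right,
# with f = 700 px and a 40 mm baseline, lies 1400 mm from the cameras.
z = depth_from_disparity(320, 300, 700, 40)
assert z == 1400.0
```

Repeating this for every tracked joint, frame by frame, gives the 3D posture image information that is fused into the VR scene.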
The infrared fill light provides supplementary illumination for the grayscale cameras and improves image quality, which is especially useful in low light.
An infrared fill light is small, consumes little power, and is well directed.
However, the present invention is not limited thereto; for example, a white fill light may also be used, and the present invention imposes no limitation on this.
During development, the inventors found that in the prior art the user's body motion information is usually captured by a wearable device. For example, to capture hand motion information, the user must wear data gloves; sensors inside the gloves measure the bending of the user's fingers, or the force applied when grasping, moving, or rotating, and the finger-bend and spatial-positioning test data are combined to build grasping, moving, rotating, and other action models in the virtual scene. In practice, however, such data gloves are hard to fit to every user's finger size, and they are expensive and wear out easily.
This embodiment collects the user's body motion information with the somatosensory controller 120 and can accurately capture it without the user wearing gloves or any other wearable device, so it is broadly applicable.
The somatosensory controller 120 may be a Leap Motion controller. A Leap Motion controller internally contains a first grayscale camera, a second grayscale camera, and infrared LEDs.
According to some embodiments of the present invention, in an actual simulation, to acquire the user's hand motion information the user only needs to fix the Leap Motion controller onto the head-mounted stereoscopic display; the Leap Motion controller can then capture motion in the surrounding environment, with a detection range of 25 mm to 600 mm above the controller and a detection volume roughly shaped like an inverted quadrangular pyramid.
The Leap Motion controller can track all ten of the user's fingers with an accuracy of up to 1/100 mm, far more precise than existing motion-control technology.
With the simulation training system for a work machine provided by embodiments of the present invention, the somatosensory controller 120 films the user's body motion information and generates 3D posture image information, and the user's body motion information can be captured accurately without any wearable device, making the system simple to operate and broadly applicable.
The simulation training method for a work machine provided by the present invention is described below.
The method may be executed by a controller on the simulated cockpit, by a control apparatus independent of the simulated cockpit, or by a server communicatively connected to the simulated cockpit.
As shown in FIG. 2, the simulation training method for a work machine includes: step 210, step 220, step 230, and step 240.
According to some embodiments of the present invention, the work machine may include an excavator, a crane, a road roller, a hoist, a fire truck, or the like.
Step 210: display a VR scene corresponding to the manipulation of the work machine.
In this step, VR positioning technology collects the position information of the actual work machine's hardware operating components and generates a VR scene corresponding to each of those hardware operating components.
The VR scene may be built at a 1:1 scale with some of the actual work machine's hardware operating components; or at a 1:1 scale with all of them; or at a 1:1 scale with the work machine's entire cockpit, all hardware operating components inside the cockpit, each of the machine's actuators, and the working environment around the machine, so as to present a more realistic, immersive simulated experience to the user.
However, the present invention is not limited thereto; for example, the VR scene may also be built at another scale relative to the actual work machine, and the present invention imposes no limitation on this.
In this step, the hardware operating components in the VR scene are set to be fully synchronized with those in the simulated cockpit, so that when the user actually operates a hardware operating component in the simulated cockpit, the VR scene can display the corresponding operating scene in real time.
For example, when the user operates the start switch in the simulated cockpit, the corresponding start switch in the VR scene is turned on and the simulated work machine starts at the same time.
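The start-switch example above is a state-mirroring problem: every physical operation on a cockpit component is replayed onto its virtual twin. A minimal sketch of that mapping is below; the component names and the `VirtualCockpit` class are illustrative assumptions, not part of the patent.

```python
class VirtualCockpit:
    """Holds the state of the VR counterparts of the cockpit's components."""

    def __init__(self, components):
        self.state = {name: "idle" for name in components}

    def apply(self, component: str, action: str) -> None:
        # Mirror the physical operation onto the corresponding virtual
        # component so the VR scene stays synchronized with the cockpit.
        if component not in self.state:
            raise KeyError(f"untracked component: {component}")
        self.state[component] = action

vr = VirtualCockpit(["start_switch", "boom_lever"])
vr.apply("start_switch", "on")       # user flips the physical start switch
assert vr.state["start_switch"] == "on"
assert vr.state["boom_lever"] == "idle"   # untouched components stay put
```

In a real system each `apply` call would be driven by the position-tracking data described in step 210, so the virtual component follows the physical one continuously rather than in discrete steps.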
In some embodiments, the VR scene may be displayed through a 3D display system, a large projection system, a head-mounted stereoscopic display, or the like.
Step 220: receive the user's body motion information.
The user's body motion information represents the user's action commands when operating the work machine.
In this step, the user's body motion information may include at least one of the following: the user's hand motion information, foot motion information, head motion information, and torso motion information.
In some embodiments, the user's body motion information may be collected by sensors; two or more sensors are used to collect the user's body motion information from different angles.
Step 230: in response to the body motion information, output posture image information corresponding to the body motion information.
In this step, the body motion information collected in step 220 is converted into corresponding posture image information, which represents, in the virtual scene, the user's body motions corresponding to the actual operating commands.
The posture image information may correspondingly include at least one of the following: hand posture image information, foot posture image information, head posture image information, and torso posture image information.
The posture image information may be 3D posture image information.
Step 240: fuse the posture image information into the VR scene and display the updated VR scene.
During development, the inventors found that in the prior art the VR scene is a purely virtual scene: during VR-based work-machine training, the operator can see only the virtual cockpit model and cannot see their own body movements while operating the machine, so action and vision feel disconnected. There is also a work-machine simulation training system based on mixed reality, which introduces the real scene into the virtual one: while seeing the virtual cockpit model, the operator also sees the real machine hardware and their real hands. Switching back and forth between the virtual and the real breaks the sense of immersion and experience.
In this step, the posture image information is fused into the VR scene and the updated VR scene is displayed; the updated VR scene contains both the cockpit VR scene and the posture image information. In practice, the user can see both the virtual cockpit model and their own body movements while operating the machine, which gives a stronger sense of immersion.
With the simulation training method for a work machine provided by embodiments of the present invention, fusing the VR scene with the user's posture image information yields an updated VR scene that can display the virtual cockpit scene and the virtual user's limb movements at the same time, fully synchronizing the virtual and the real. During simulation training, as the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the working state of the work machine under the given operating command, and the user's body motion information, unifying action and vision and significantly enhancing the user's sense of immersion and experience.
According to some embodiments of the present invention, in step 210 the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
At least some of the hardware operating components in the simulated cockpit are located and position-tracked to generate position data, and the VR scene is generated from that position data, so that all hardware operating components generated in the VR scene have the same positions as their counterparts in the simulated cockpit and move in synchronization with them.
In some embodiments, two or more of the hardware operating components generated in the VR scene are position-tracked against the corresponding subset of the simulated cockpit's hardware operating components and built at a 1:1 scale, especially for scenarios where only certain key hardware operating components need to be trained on; this embodiment is inexpensive and effective.
In other embodiments, all of the hardware operating components generated in the VR scene are position-tracked against all of the simulated cockpit's corresponding hardware operating components and built at a 1:1 scale, providing the user with more realistic and comprehensive simulated operation training.
Some of the simulated cockpit's hardware operating components may be 1:1 reproductions of some of the real cockpit's hardware operating components; alternatively, all of the simulated cockpit's hardware operating components may be 1:1 reproductions of all of the real cockpit's hardware operating components.
With the simulation training method for a work machine provided by embodiments of the present invention, generating the VR scene based on positioning information of at least some operating components of the simulated cockpit corresponding to the work machine's cockpit provides the user with a more realistic and comprehensive simulated environment at low cost and with simple operation.
According to some embodiments of the present invention, in step 220 the user's body motion information is collected by at least two grayscale cameras.
During development, the inventors found that in the prior art the user's body motion information is usually captured by a wearable device. For example, to capture hand motion information, the user must wear data gloves; sensors inside the gloves measure the bending of the user's fingers, or the force applied when grasping, moving, or rotating, and the finger-bend and spatial-positioning test data are combined to build grasping, moving, rotating, and other action models in the virtual scene. In practice, however, such data gloves are hard to fit to every user's size, and they are expensive and wear out easily.
This embodiment collects the user's body motion information with cameras and can accurately capture it without the user wearing gloves or any other wearable device, so it is broadly applicable.
In addition, filming with at least two grayscale cameras captures the user's body motion information from at least two angles, yielding stereoscopic 3D images.
For example, when acquiring the user's hand motion information, the grayscale cameras capture the user's hand motion from two angles and generate posture image information for each angle; fusing the two sets of posture image information yields 3D posture image information of the user's hand, from which the motion of the user's hand in real-world 3D space can be reconstructed.
With the simulation training method for a work machine provided by embodiments of the present invention, filming the user's body motion information with at least two grayscale cameras can generate 3D posture image information and capture the user's body motion accurately without any wearable device, making the method simple to operate and broadly applicable.
The simulation training apparatus for a work machine provided by the present invention is described below; the apparatus described below and the simulation training method described above may be referred to in correspondence with each other.
As shown in FIG. 3, the simulation training apparatus for a work machine includes: a display module 310, a receiving module 320, a generation module 330, and a fusion module 340.
The display module 310 is configured to display a VR scene corresponding to the manipulation of the work machine;
the receiving module 320 is configured to receive the user's body motion information;
the generation module 330 is configured to output, in response to the body motion information, posture image information corresponding to the body motion information;
the fusion module 340 is configured to fuse the posture image information into the VR scene and display the updated VR scene.
In some embodiments, the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
In some embodiments, the body motion information is collected by at least two grayscale cameras.
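The four modules form a simple pipeline: display the scene, receive motion, generate a posture image, fuse and redisplay. The sketch below wires them together in one class; all class and method names are assumptions made for illustration, not identifiers from the patent.

```python
class SimTrainer:
    """Toy pipeline mirroring the display/receive/generate/fuse modules."""

    def __init__(self):
        self.scene = {"cockpit": "rendered", "posture": None}
        self.shown = []                      # frames sent to the display

    def display(self):                       # display module (step 210)
        self.shown.append(dict(self.scene))

    def receive(self, motion):               # receiving module (step 220)
        return motion

    def generate(self, motion):              # generation module (step 230)
        return {"image_of": motion}

    def fuse(self, posture_image):           # fusion module (step 240)
        self.scene["posture"] = posture_image
        self.display()                       # show the updated VR scene

t = SimTrainer()
t.display()                                  # initial VR scene, no posture yet
motion = t.receive("push_lever")
t.fuse(t.generate(motion))
assert t.shown[-1]["posture"] == {"image_of": "push_lever"}
```

In a real implementation the same loop would run once per captured frame, with `generate` producing the 3D posture image and `fuse` compositing it into the rendered cockpit.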
With the simulation training apparatus for a work machine provided by embodiments of the present invention, the fusion module 340 fuses the VR scene with the user's posture image information to obtain an updated VR scene that can display the virtual cockpit scene and the virtual user's limb movements at the same time, fully synchronizing the virtual and the real. During simulation training, as the user operates the hardware operating components in the simulated cockpit, the VR scene can display the positions of the corresponding hardware operating components, the working state of the work machine under the given operating command, and the user's body motion information, significantly enhancing the user's sense of immersion and experience.
FIG. 4 illustrates the physical structure of an electronic device. As shown in FIG. 4, the electronic device may include: a processor (Processor) 410, a communications interface (Communications Interface) 420, a memory (Memory) 430, and a communication bus 440, where the processor 410, the communications interface 420, and the memory 430 communicate with one another through the communication bus 440. The processor 410 can invoke logic instructions in the memory 430 to execute the simulation training method for a work machine, the method comprising: displaying a VR scene corresponding to the manipulation of the work machine; receiving the user's body motion information; in response to the body motion information, outputting posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
Furthermore, the above logic instructions in the memory 430 may be implemented as software functional units and, when sold or used as an independent product, stored in a computer-readable storage medium. On this understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or part of the technical solution, may be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods of the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or any other medium that can store program code.
In another aspect, the present invention further provides a computer program product; the computer program product includes a computer program stored on a non-transitory computer-readable storage medium, and the computer program includes program instructions which, when executed by a computer, enable the computer to execute the simulation training method for a work machine provided by the above methods, the method comprising: displaying a VR scene corresponding to the manipulation of the work machine; receiving the user's body motion information; in response to the body motion information, outputting posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
In yet another aspect, the present invention further provides a non-transitory computer-readable storage medium storing a computer program which, when executed by a processor, carries out the simulation training method for a work machine provided by the above methods, the method comprising: displaying a VR scene corresponding to the manipulation of the work machine; receiving the user's body motion information; in response to the body motion information, outputting posture image information corresponding to the body motion information; and fusing the posture image information into the VR scene and displaying the updated VR scene.
The apparatus embodiments described above are merely illustrative: the units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of this embodiment's solution. A person of ordinary skill in the art can understand and implement this without creative effort.
From the description of the above implementations, a person skilled in the art can clearly understand that each implementation can be realized by software plus a necessary general-purpose hardware platform, and certainly also by hardware. On this understanding, the above technical solutions, in essence or as regards the part contributing to the prior art, may be embodied in the form of a software product; the computer software product may be stored in a computer-readable storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods of the embodiments or parts thereof.
Finally, it should be noted that the above embodiments only illustrate the technical solutions of the present invention and do not limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, a person of ordinary skill in the art should understand that the technical solutions recorded in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications or replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (10)

  1. A simulation training system for a work machine, characterized in that it comprises:
    a simulated cockpit, the simulated cockpit being determined based on the cockpit of the work machine;
    a somatosensory controller, the somatosensory controller being configured to collect a user's body motion information and output posture image information corresponding to the body motion information;
    a VR device, the VR device being configured to generate a VR scene based on the simulated cockpit, the VR device being communicatively connected to the somatosensory controller and configured to fuse the posture image information into the VR scene and display the updated VR scene.
  2. The simulation training system for a work machine according to claim 1, characterized in that the VR device is configured to generate the VR scene based on positioning information of at least some operating components of the simulated cockpit.
  3. The simulation training system for a work machine according to claim 1, characterized in that the somatosensory controller is a non-wearable device and includes an infrared fill light and at least two grayscale cameras.
  4. A simulation training method for a work machine, characterized in that it comprises:
    displaying a VR scene corresponding to the manipulation of the work machine;
    receiving a user's body motion information;
    in response to the body motion information, outputting posture image information corresponding to the body motion information;
    fusing the posture image information into the VR scene and displaying the updated VR scene.
  5. The simulation training method for a work machine according to claim 4, characterized in that the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
  6. The simulation training method for a work machine according to claim 4, characterized in that the body motion information is collected by at least two grayscale cameras.
  7. A simulation training apparatus for a work machine, characterized in that it comprises:
    a display module, configured to display a VR scene corresponding to the manipulation of the work machine;
    a receiving module, configured to receive a user's body motion information;
    a generation module, configured to output, in response to the body motion information, posture image information corresponding to the body motion information;
    a fusion module, configured to fuse the posture image information into the VR scene and display the updated VR scene.
  8. The simulation training apparatus for a work machine according to claim 7, characterized in that the VR scene is generated based on positioning information of at least some operating components of a simulated cockpit corresponding to the cockpit of the work machine.
  9. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor, when executing the program, implements the steps of the simulation training method for a work machine according to any one of claims 4 to 6.
  10. A non-transitory computer-readable storage medium storing a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the simulation training method for a work machine according to any one of claims 4 to 6.
PCT/CN2021/126599 2021-02-04 2021-10-27 Simulation training system, method and apparatus for work machine, and electronic device WO2022166264A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110166726.2 2021-02-04
CN202110166726.2A CN112908084A (zh) Simulation training system, method and apparatus for work machine, and electronic device

Publications (1)

Publication Number Publication Date
WO2022166264A1 true WO2022166264A1 (zh) 2022-08-11

Family

ID=76123395

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/126599 WO2022166264A1 (zh) Simulation training system, method and apparatus for work machine, and electronic device

Country Status (2)

Country Link
CN (1) CN112908084A (zh)
WO (1) WO2022166264A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115629670A (zh) Method, apparatus, device, and medium for displaying hand posture in a virtual reality environment

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112908084A (zh) Simulation training system, method and apparatus for work machine, and electronic device
CN113470466B (zh) Mixed-reality roadheader operation training system
CN113572769A (zh) VR immersive traditional Chinese medicine culture dissemination system based on 5G real-time transmission
CN114327076A (zh) Virtual interaction method, apparatus and system for a work machine and its working environment
CN114283649B (zh) Train inspection simulation training system and method, training equipment and method of use
CN114495632A (zh) VR simulation method, device, automobile, system and storage medium
US11928307B2 (en) 2022-03-11 2024-03-12 Caterpillar Paving Products Inc. Guided operator VR training
CN115938186A (zh) VR driving simulator spatial-coordinate adjustment method and system

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007017595A2 (fr) Method and devices for visualizing a real cockpit in a synthetic environment
CN109427097A (zh) Hoisting simulation method and system based on virtual reality
CN109426343A (zh) Collaborative training method and system based on virtual reality
CN110363841A (zh) Hand motion tracking method in a virtual driving environment
CN110610547A (zh) Cockpit practical training method and system based on virtual reality, and storage medium
CN112102682A (zh) Aircraft piloting training system and method based on 5G communication
CN112289125A (zh) Vehicle MR simulated-driving practical training method and apparatus
CN112908084A (zh) Simulation training system, method and apparatus for work machine, and electronic device



Also Published As

Publication number Publication date
CN112908084A (zh) 2021-06-04

Similar Documents

Publication Publication Date Title
WO2022166264A1 (zh) Simulation training system, method and apparatus for work machine, and electronic device
US10850949B2 (en) Remote control device for a crane, a construction machine and/or for a pallet truck
CN107221223B (zh) Virtual-reality aircraft cockpit system with force/haptic feedback
JP6063749B2 (ja) System and method for virtual engineering
CN107193371A (zh) Real-time human-computer interaction system and method based on virtual reality
JP6723738B2 (ja) Information processing apparatus, information processing method, and program
US11455905B2 (en) Simulator for crane, construction machine or industrial truck
CN106527177A (zh) Multi-functional one-stop teleoperation control design and simulation system and method
US20120122062A1 (en) Reconfigurable platform management apparatus for virtual reality-based training simulator
EP2568355A2 (en) Combined stereo camera and stereo display interaction
CN103531051A (zh) Virtual-reality training method and simulator for crane operation
Naceri et al. The vicarios virtual reality interface for remote robotic teleoperation: Teleporting for intuitive tele-manipulation
JP7471428B2 (ja) System, method, and computer-readable medium for using virtual/augmented reality to interact with collaborative robots in a manufacturing or industrial environment
CN107257946B (zh) System for virtual commissioning
CN107577159A (zh) Augmented-reality simulation system
CN110977981A (zh) Robot virtual-reality synchronization system and synchronization method
JP2010257081A (ja) Image processing method and image processing apparatus
CN107443374A (zh) Manipulator control system, control method, control apparatus, and storage medium
Krupke et al. Prototyping of immersive HRI scenarios
JP2023507241A (ja) Proxy controller suit with optional dual-range kinematics
CN107838921A (zh) VR-based robot training system
CN211319464U (zh) Simulated aircraft based on virtual reality
WO2024055397A1 (zh) Digital-twin-based trackable observation-viewpoint method, system, and terminal
CN113282173B (zh) Virtual-reality-based remote real-time control system and method for a dual-arm robot
Stone Virtual reality: A tool for telepresence and human factors research

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21924261

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21924261

Country of ref document: EP

Kind code of ref document: A1