WO2023131124A1 - Virtual interaction method, apparatus and system for work machine and work environment - Google Patents

Virtual interaction method, apparatus and system for work machine and work environment

Info

Publication number
WO2023131124A1
Authority
WO
WIPO (PCT)
Prior art keywords
interaction
virtual
machine
information
operating
Application number
PCT/CN2023/070137
Other languages
French (fr)
Chinese (zh)
Inventor
胡立辛
Original Assignee
上海三一重机股份有限公司
Application filed by 上海三一重机股份有限公司
Publication of WO2023131124A1 publication Critical patent/WO2023131124A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/17Mechanical parametric or variational design
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics

Definitions

  • the present application relates to the field of virtual reality technology, in particular to a method, device and system for virtual interaction between an operating machine and an operating environment.
  • considering that the application environment of a working machine is mainly a complex construction scene, in scenarios that involve controlling the working machine to interact with its application environment, the limitations of that environment often make the interaction process time-consuming and labor-intensive, and it is difficult to meet actual needs.
  • the present application provides a method, device and system for virtual interaction between a working machine and a working environment, which are used to solve the defect in the prior art that, in scenarios involving controlling a working machine to interact with its application environment, the limitations of the application environment often make the interaction process time-consuming and labor-intensive and difficult to meet actual needs.
  • the present application provides a virtual interaction method between an operating machine and an operating environment, the method comprising:
  • acquiring an opening proportional signal of a control component, and determining motion state data of the controlled working machine according to the opening proportional signal;
  • according to the motion state data and pre-constructed virtual environment information, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state;
  • acquiring the user's actual viewing angle range, and intercepting and outputting corresponding scene image information from the scene update information according to the user's actual viewing angle range.
  • according to a virtual interaction method between an operating machine and an operating environment provided by the present application, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information, and generating scene update information according to the interaction state, includes:
  • according to the motion state data and the pre-constructed virtual environment information, when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determining the interaction category of the contacted spatial element;
  • determining scene change information according to the interaction category and the motion state data of the controlled working machine;
  • updating the virtual environment information in real time according to the scene change information to generate the scene update information.
  • the interaction categories of the spatial elements include an interactable category and a non-interactable category.
  • determining the scene change information according to the interaction category and the motion state data of the controlled working machine includes:
  • when the interaction category of the contacted spatial element is interactable, determining the deformation state of the spatial element according to the motion state data of the controlled working machine, and using the deformation state together with the post-construction motion state data of the controlled working machine as the scene change information; when the interaction category of the contacted spatial element is non-interactable, using the motion state data of the controlled working machine at the current moment as the scene change information.
  • acquiring the user's actual viewing angle range includes:
  • acquiring the swing center coordinates and swing angle of the controlled working machine; determining the forward head-up direction vector of the cab from them; and determining the user's actual viewing angle range according to the forward head-up direction vector and a preset user viewing angle range.
  • the present application also provides a virtual interaction device between the working machine and the working environment, the device comprising:
  • An acquisition module configured to acquire an opening proportional signal of the manipulation component
  • the first processing module is configured to determine the motion state data of the controlled working machine according to the opening ratio signal
  • the second processing module is configured to respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-built virtual environment information, and generate scene update information;
  • the third processing module is configured to acquire the actual viewing angle range of the user, and intercept and output corresponding scene image information from the scene update information according to the actual viewing angle range of the user.
  • the present application also provides a virtual interaction system between an operating machine and an operating environment, the system comprising: a manipulation component, a data processing device, and at least one imaging device, where the manipulation component and the at least one imaging device are both connected to the data processing device;
  • the control component is used for the user to initiate a control action on the controlled working machine
  • the data processing device is used to obtain the opening proportional signal of the control component; determine the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and pre-constructed virtual environment information, respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state; and obtain the user's actual viewing angle range, and intercept and output corresponding scene image information from the scene update information according to that range;
  • the imaging device is used to display the scene image information.
  • the imaging device is a display screen or a head-mounted imaging device.
  • the control assembly when the controlled operating machine is an excavator, the control assembly includes left and right handles and left and right pedals.
  • the present application also provides a test system for operating machine handling performance, which uses any one of the above-mentioned virtual interaction methods between the working machine and the working environment.
  • the method, device and system for virtual interaction between the working machine and the working environment provided by the present application determine the motion state data of the controlled working machine by obtaining the opening proportional signal of the control component, respectively determine the interaction state between the controlled working machine and each spatial element in pre-constructed virtual environment information according to the motion state data, generate scene update information according to the interaction state, and finally intercept and output the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
  • this realizes the interaction between the working machine and the working environment in a virtual scene; since the process can be carried out entirely in the virtual environment, it overcomes the problem that, in scenarios involving controlling a working machine to interact with its application environment, the limitations of the actual application environment make the interaction process time-consuming and labor-intensive.
  • Fig. 1 is a schematic flow chart of a virtual interaction method between an operating machine and an operating environment provided by the present application
  • Fig. 2 is a schematic structural diagram of a virtual interaction device between an operating machine and an operating environment provided by the present application
  • Fig. 3 is a schematic structural diagram of the virtual interaction system between the operating machine and the operating environment provided by the present application;
  • Fig. 4 shows a schematic structural diagram of the virtual interaction system between the working machine and the working environment when the controlled object is an excavator
  • Fig. 5 shows a schematic diagram of the data processing flow in the computing host
  • FIG. 6 is a schematic structural diagram of an electronic device provided by the present application.
  • Fig. 1 shows the virtual interaction method between the working machine and the working environment provided by an embodiment of the present application; the method includes:
  • Step 110: Obtain an opening proportional signal of the control component.
  • the control components mentioned here are the components that control the travel and construction actions of the controlled working machine.
  • for example, if the controlled working machine is an excavator, the control components can be left and right control handles and left and right foot pedals, which simulate the left and right joysticks and left and right travel pedals of a real excavator, so that the user can trigger the corresponding control commands by operating the control components.
  • specifically, for an excavator, the forward and backward movements of the left handle correspond to the arm-out and arm-in control commands respectively, and the left and right movements of the left handle correspond to the left and right swing commands of the upper structure;
  • the forward and backward movements of the right handle correspond to the boom-lowering and boom-raising commands respectively, and the left and right movements of the right handle correspond to the bucket digging and dumping commands respectively; the forward and backward movements of the left pedal correspond to the forward and reverse commands of the left track, and the forward and backward movements of the right pedal correspond to the forward and reverse commands of the right track.
  • when the control component is operated by the user, the corresponding opening proportional signal can be obtained, so that the user's manipulation of the control component is sensed in the form of an electrical signal.
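  • As an illustration only (not part of the application), the following sketch shows one way such opening proportional signals could be represented and normalised; all names, channel assignments and numeric ranges here are assumptions.

```python
# Minimal sketch (all names hypothetical): normalising raw handle/pedal
# deflections into signed opening-ratio signals in [-1, 1], one channel per
# control axis of the excavator example above.

from dataclasses import dataclass

@dataclass
class OpeningSignals:
    left_x: float       # left handle left/right  -> upper-structure swing
    left_y: float       # left handle fwd/back    -> arm out / arm in
    right_x: float      # right handle left/right -> bucket dig / dump
    right_y: float      # right handle fwd/back   -> boom lower / raise
    left_pedal: float   # left track forward / reverse
    right_pedal: float  # right track forward / reverse

def normalise(raw: float, max_travel: float, dead_band: float = 0.05) -> float:
    """Map a raw deflection (e.g. ADC counts) to an opening ratio in [-1, 1]."""
    ratio = max(-1.0, min(1.0, raw / max_travel))
    return 0.0 if abs(ratio) < dead_band else ratio

def read_opening_signals(raw: dict, max_travel: float = 1000.0) -> OpeningSignals:
    # raw is assumed to hold one entry per field of OpeningSignals
    return OpeningSignals(**{k: normalise(v, max_travel) for k, v in raw.items()})
```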
  • Step 120: Determine the motion state data of the controlled working machine according to the opening proportional signal.
  • the conversion from the opening proportional signal to the motion state data of the controlled working machine can be realized by constructing a system simulation model.
  • specifically, the system simulation model is a one-dimensional physical parameterized model that includes a basic electric control logic model, a hydraulic system model, and a dynamics model of the working machine's upper structure and working device; according to the opening proportional signal, the electric control logic model outputs the signal of each electrically controlled valve to the hydraulic system model, and the hydraulic system model outputs the real-time pressure of each cylinder to the dynamics model of the upper structure and working device.
  • by sensing the travel and swing of the working machine and the motion state of the working device mounted on it, the dynamics model outputs the motion state data of the controlled working machine; the motion state data includes information such as the spatial coordinates, velocity and acceleration of the machine body and of the working devices mounted on it.
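  • For illustration, a minimal sketch of this signal chain is given below; the placeholder models, constants and function names are assumptions and only stand in for the one-dimensional parameterized models described above.

```python
# Highly simplified sketch of the chain: electric control logic -> hydraulic
# system -> upper-structure/working-device dynamics. Not the embodiment's
# models; just placeholders showing how opening signals become motion state.

import numpy as np

def control_logic(opening: np.ndarray) -> np.ndarray:
    """Map opening ratios to electrically controlled valve commands (assumed linear)."""
    return np.clip(opening, -1.0, 1.0)

def hydraulic_model(valve_cmd: np.ndarray, p_max: float = 3.0e7) -> np.ndarray:
    """Return an assumed cylinder pressure proportional to the valve command (Pa)."""
    return valve_cmd * p_max

def dynamics_step(state: dict, pressures: np.ndarray, dt: float) -> dict:
    """Integrate a toy model: pressure -> force -> acceleration -> velocity -> position."""
    area, mass = 0.01, 2.0e4            # assumed piston area (m^2) and equivalent mass (kg)
    acc = pressures * area / mass
    vel = state["vel"] + acc * dt
    pos = state["pos"] + vel * dt
    return {"pos": pos, "vel": vel, "acc": acc}

# One simulation step: opening signals in, motion state data out.
state = {"pos": np.zeros(4), "vel": np.zeros(4), "acc": np.zeros(4)}
opening = np.array([0.3, -0.1, 0.0, 0.5])   # e.g. arm, swing, bucket, boom channels (assumed)
state = dynamics_step(state, hydraulic_model(control_logic(opening)), dt=0.01)
```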
  • it can be understood that the working device mentioned in this embodiment refers to the components that the working machine uses when travelling and performing construction work; for example, when the working machine is an excavator, the working device is the system formed by the boom, arm, bucket and their hydraulic cylinders.
  • Step 130: According to the motion state data and the pre-constructed virtual environment information, respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state.
  • specifically, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information, and generating scene update information according to the interaction state, may include:
  • Step 1: According to the motion state data and the pre-constructed virtual environment information, when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determine the interaction category of the contacted spatial element;
  • Step 2: Determine the scene change information according to the interaction category of the contacted spatial element and the motion state data of the controlled working machine;
  • Step 3: Update the virtual environment information in real time according to the scene change information to generate the scene update information.
  • in this embodiment, the interaction categories of the spatial elements include an interactable category and a non-interactable category; correspondingly, the spatial elements are divided into interactable objects and non-interactable objects.
  • it can be understood that non-interactable objects include spatial elements such as terrain, roads, other machinery, obstacles and people, while interactable objects include the construction objects of the working machine, such as soil and stones.
  • in an exemplary embodiment, whether the controlled working machine is in contact with a spatial element in the virtual environment information can be judged as follows: given the outward normal vector defined at each point of the spatial element's contour, when the distance between the contour points of the working device and some contour points of the spatial element is less than a preset threshold, at each time step the vector from the spatial-element contour point to the nearest working-device contour point is computed and its dot product with the outward normal vector of that spatial-element contour point is taken; if the result is less than or equal to 0, the controlled working machine is judged to be in contact with that spatial element.
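  • A minimal sketch of this contact test is shown below, assuming the contours are available as point arrays with precomputed outward normals; the names and the threshold value are illustrative.

```python
# Sketch of the contact test described above: for each spatial-element contour
# point within the distance threshold of the working-device contour, form the
# vector to the nearest working-device point and take its dot product with the
# element's outward normal; contact is flagged when the result is <= 0.

import numpy as np

def in_contact(device_pts: np.ndarray,       # (N, 3) working-device contour points
               element_pts: np.ndarray,      # (M, 3) spatial-element contour points
               element_normals: np.ndarray,  # (M, 3) outward normals at those points
               threshold: float = 0.05) -> bool:
    for p, n in zip(element_pts, element_normals):
        d = np.linalg.norm(device_pts - p, axis=1)
        i = int(np.argmin(d))
        if d[i] < threshold:
            v = device_pts[i] - p          # element contour point -> nearest device point
            if np.dot(v, n) <= 0.0:        # device point lies on or inside the element surface
                return True
    return False
```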
  • the process of determining the scene change information according to the interaction type of the spatial element in contact and the motion state data of the controlled working machine may include:
  • on the one hand, when the interaction category of the contacted spatial element is interactable, the deformation state of the spatial element is determined according to the motion state data of the controlled working machine, and the deformation state together with the post-construction motion state data of the controlled working machine is used as the scene change information;
  • on the other hand, when the interaction category of the contacted spatial element is non-interactable, the motion state data of the controlled working machine at the current moment is used as the scene change information.
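  • The branch described above can be sketched as follows; the category labels and the dictionary layout are assumptions for illustration.

```python
# Interactable elements contribute a deformation state plus the post-operation
# machine state; non-interactable elements only contribute the machine's
# current motion state.

from typing import Any

def scene_change(element_category: str, machine_state: dict,
                 deformation: Any = None) -> dict:
    if element_category == "interactable":
        # e.g. soil or stone being excavated
        return {"deformation": deformation, "machine_state": machine_state}
    # terrain, roads, other machinery, obstacles, people, ...
    return {"machine_state": machine_state}
```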
  • in order to construct the virtual scene information, a space-and-obstacle environment model can be built in advance.
  • the model is a spatial data model that records the spatial coordinates of the contour or boundary of every element in the space.
  • the motion state data of the working machine is the model's input; by comparing in real time the relative position, velocity and acceleration of each element in the space with respect to the working machine, the model judges whether each element is in contact with the working machine.
  • after determining that a certain element is in contact with the working machine, the model further determines whether the element is an interactable or a non-interactable object; for non-interactable objects, the model outputs information such as the position and attitude of the working machine under the environmental constraints, that is, the motion state data of the controlled working machine in the contact state.
  • in addition, an interaction model of the construction objects can be built in advance; it consists of a spatial data model and a material property model of the construction objects (such as soil and stones).
  • using a simulation algorithm based on the discrete element method or the material point method, the deformation and residual shape of the construction object under the action of the controlled working machine can be simulated, so that the coordinates of each point on the outer contour of the construction object after construction are output.
  • when the controlled working machine is in contact with an interactable object, the model can simulate the construction process: according to information such as the position and speed of the construction component, and based on the material properties of the construction object, it calculates in real time the resistance experienced by the construction component and the deformation state of the construction object under the interaction; the deformation state of the construction object and the post-construction motion state data of the controlled working machine are then fed back to the space-and-obstacle environment model, so that the scene change information during the construction process is obtained.
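  • The following is only a crude illustrative stand-in for this interaction model: it is not a discrete element or material point method implementation, merely a penetration-depth heuristic that shows the intended data flow (bucket state and material properties in, resistance and an updated soil contour out); all names and coefficients are assumed.

```python
# Toy substitute for the construction-object interaction model: estimate a
# resistance from the cut depth and remove the material swept by the bucket.

import numpy as np

def soil_interaction(soil_z: np.ndarray,        # soil surface heights on a grid (m)
                     bucket_tip_z: np.ndarray,  # bucket cutting-edge heights over the same cells
                     bucket_speed: float,       # cutting-edge speed (m/s)
                     k_resist: float = 5.0e4):  # assumed soil resistance coefficient
    cut_depth = np.clip(soil_z - bucket_tip_z, 0.0, None)      # how far the edge is below the surface
    resistance = k_resist * cut_depth.sum() * (1.0 + 0.2 * bucket_speed)
    new_soil_z = np.minimum(soil_z, bucket_tip_z)               # material in the bucket's path is removed
    return resistance, new_soil_z
```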
  • the scene change information is then recombined with the other immutable environment elements in the virtual environment information to generate updated 3D virtual environment information, that is, the scene update information.
  • it can be seen that this embodiment combines the system simulation model, the space-and-obstacle environment model and the construction-object interaction model, going from the handle and pedal signals all the way to a high-fidelity simulation of the soil and attachments during the excavation process, thereby achieving the effect of virtual reality.
  • Step 140: Obtain the user's actual viewing angle range, and intercept and output corresponding scene image information from the scene update information according to that range.
  • the process of obtaining the user's actual viewing angle range may include:
  • the swing center coordinates and swing angle of the controlled working machine are obtained; the forward head-up direction vector of the cab is determined from them; and the user's actual viewing angle range is determined according to the forward head-up direction vector and the preset user viewing angle range.
  • the above-mentioned space and obstacle environment model includes the coordinates of each point on the ground in the movable space of the working machine, and the data is stored in the form of point cloud data with a certain resolution.
  • an excavator is taken as an example to illustrate the acquisition process of the actual viewing angle range, as follows:
  • when the excavator travels over the ground in the driving direction, the z-axis coordinates of the bottom of the excavator's crawler (taken as a plane) must be greater than or equal to the z-axis coordinates of the corresponding ground points onto which it projects, otherwise interpenetration with the ground would occur; at the same time, the z-axis coordinates of at least three points must equal the corresponding ground coordinates, so that the crawler rests on the ground.
  • the normal vector n and the travelling direction vector s of the whole excavator can be determined from the plane formed by the bottom of the crawler, and the excavator's swing center coordinate o and swing angle θ are known parameters; the forward head-up direction vector b of the excavator cab at each time point can then be obtained by the D-H homogeneous matrix transformation method, i.e. by rotating the travelling direction vector s, which passes through the swing center o, about the axis n by the angle θ. Assuming the preset user viewing angle range is an up-down angle α1 and a left-right angle α2 about the head-up direction, the user's viewing angles and range can be obtained, thereby determining the user's actual viewing angle range.
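  • A minimal sketch of this view computation is given below, using a Rodrigues rotation of s about n by θ as one way to realise the homogeneous rotation described above; the vector values and angles in the example are assumptions.

```python
# Rotate the travelling direction s about the crawler-plane normal n by the
# swing angle theta to get the cab's forward head-up vector b, then describe
# the view cone by the preset half-angles alpha1 (up/down) and alpha2 (left/right).

import numpy as np

def rotate_about_axis(v: np.ndarray, axis: np.ndarray, angle: float) -> np.ndarray:
    k = axis / np.linalg.norm(axis)
    return (v * np.cos(angle)
            + np.cross(k, v) * np.sin(angle)
            + k * np.dot(k, v) * (1.0 - np.cos(angle)))

def view_range(s: np.ndarray, n: np.ndarray, theta: float,
               alpha1: float, alpha2: float):
    b = rotate_about_axis(s, n, theta)        # forward head-up direction of the cab
    b /= np.linalg.norm(b)
    return b, (alpha1, alpha2)                # direction plus up/down and left/right half-angles

# Example: crawler travelling along +x on level ground, upper structure swung 30 degrees.
b, (a1, a2) = view_range(np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                         np.radians(30.0), np.radians(25.0), np.radians(60.0))
```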
  • the virtual environment area that the user can see under the current viewing angle can be obtained from the scene update information.
  • specifically, rays are extended from the viewpoint coordinates (which should be taken as the coordinates of the eyes of the person in the cab) along the directions spanning the above viewing angle range, and their intersection with the contours of the virtual environment in the scene update information is the virtual environment area that the user can see.
  • the virtual environment area can be fed back to the user as data such as images and depth information; during feedback it is displayed to the user through the imaging device, and the user can continue or change the handle and pedal inputs according to the visual feedback of the operation, realizing real-time virtual human-computer interaction and thus the virtual interaction between the working machine and the working environment.
  • the imaging device can be not only one or more display screens but also a wearable imaging device that follows the movement of the user's head in real time, such as head-mounted glasses.
  • in the latter case, the real-time movement of the user's head is combined with the actual viewing angle range obtained above.
  • the viewing angle transformation (including translation and rotation) captured by the head-mounted imaging device can be described by the D-H homogeneous matrix transformation method, i.e. the head-up direction vector b is transformed accordingly to obtain a more accurate actual viewing angle range, and the final real-time image is output to the user by the head-mounted glasses.
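  • As a sketch, assuming the head tracker reports its pose as a 4x4 homogeneous matrix, the refinement of the head-up vector b could look as follows; the function name and matrix layout are illustrative.

```python
# Fold the head-tracking pose into the view: apply the rotation part of the
# 4x4 homogeneous transform to the head-up vector b and its translation part
# to the eye point to refine the actual viewing direction.

import numpy as np

def apply_head_pose(b: np.ndarray, eye: np.ndarray, head_T: np.ndarray):
    """head_T is a 4x4 homogeneous matrix from the head tracker (assumed row-major)."""
    R, t = head_T[:3, :3], head_T[:3, 3]
    b_refined = R @ b
    return b_refined / np.linalg.norm(b_refined), eye + t
```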
  • it can be seen that the virtual interaction method between the working machine and the working environment can construct a corresponding virtual scene for a given construction scenario and meet its specific needs.
  • moreover, combining the mechanism models of the actual construction process with virtual reality makes the construction and travel interaction of the controlled working machine closer to the real scene and better satisfies practical application requirements.
  • the virtual interaction device between the working machine and the working environment provided by this application is described below.
  • the virtual interaction device between the working machine and the working environment described below and the virtual interaction method between the working machine and the working environment described above can be referred to in correspondence.
  • Fig. 2 shows the virtual interaction device between the working machine and the working environment provided by an embodiment of the present application; the device includes:
  • An acquisition module 210 configured to acquire an opening ratio signal of the manipulation component
  • the first processing module 220 is configured to determine the motion state data of the controlled working machine according to the opening ratio signal
  • the second processing module 230 is configured to respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information, and generate scene update information according to the interaction state;
  • the third processing module 240 is configured to acquire the user's actual viewing angle range, and intercept and output corresponding scene image information from the scene update information according to the user's actual viewing angle range.
  • the above-mentioned second processing module 230 is specifically configured to: judge whether the controlled working machine is in contact with each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information; when the controlled working machine is in contact with at least one spatial element in the virtual environment information, determine the interaction category of the contacted spatial element; determine the scene change information according to the interaction category of the contacted spatial element and the motion state data of the controlled working machine; and update the virtual environment information in real time according to the scene change information to generate the scene update information.
  • the interaction categories of the spatial elements in this embodiment may include an interactable category and a non-interactable category.
  • the above-mentioned second processing module 230 determines the function of the scene change information according to the interaction type of the contacted spatial element and the motion state data of the controlled operating machine, which can be specifically realized in the following manner:
  • when the interaction category of the contacted spatial element is interactable, the deformation state of the spatial element is determined according to the motion state data of the controlled working machine, and the deformation state together with the post-construction motion state data of the controlled working machine is used as the scene change information;
  • when the interaction category of the contacted spatial element is non-interactable, the motion state data of the controlled working machine at the current moment is used as the scene change information.
  • the above-mentioned function of the third processing module 240 of obtaining the user's actual viewing angle range can be specifically implemented as follows:
  • the swing center coordinates and swing angle of the controlled working machine are obtained; the forward head-up direction vector of the cab is determined from them; and the user's actual viewing angle range is determined according to the forward head-up direction vector and the preset user viewing angle range.
  • FIG. 3 shows a virtual interaction system between an operating machine and an operating environment provided by an embodiment of the present application.
  • the system includes: a manipulation component 310, a data processing device 320 and at least one imaging device 330; the manipulation component 310 and the at least one imaging device 330 are both connected to the data processing device 320;
  • the manipulation component 310 is used for the user to initiate a manipulation action on the controlled working machine
  • the data processing device 320 is used to obtain the opening proportional signal of the control component; determine the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and the pre-constructed virtual environment information, respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state; and obtain the user's actual viewing angle range, and intercept and output the corresponding scene image information from the scene update information according to that range;
  • the imaging device 330 is used to display scene image information.
  • the imaging device 330 in this embodiment may be a display screen or a head-mounted imaging device.
  • the control assembly when the controlled working machine is an excavator, the control assembly includes left and right handles and left and right pedals.
  • Fig. 4 shows the structural framework of the virtual interaction system between the working machine and the working environment when the controlled object is an excavator.
  • the system includes three display screens 410, left and right pedals 420, a left handle 430, a right handle 440 and a computing host 450;
  • the user operates the left and right pedals 420, the left handle 430 and the right handle 440, and the corresponding handle and pedal opening proportional signals are transmitted (by wire or wirelessly) to the computing host 450;
  • the computing host 450 obtains the scene image information through a series of data processing steps, and the scene image information is displayed to the user in real time through the display screens 410.
  • the real-time interaction between the excavator and the virtual operating environment is realized by means of human-computer interaction.
  • the system is also provided with a seat 460 on which the user can manipulate the above-mentioned control components.
  • Fig. 5 shows the data processing flow in the computing host 450, which includes the following steps:
  • Step 510: The user operates the handles and pedals, which output the handle signals and pedal signals;
  • Step 520: The handle and pedal signals are input into the system simulation model, i.e. the physical parameterized model, which outputs the motion state data of the excavator;
  • Step 530: The motion state data of the excavator is further input into the space-and-obstacle environment model, which is obtained through virtual mapping or conversion of real-scene scans; after the model processes the data, and with the head-mounted device following the change of the user's viewing angle, the viewing angle and position changes corresponding to the excavator's swing and travel can be output;
  • Step 540: At the same time, through the interaction between the construction-object interaction model and the space-and-obstacle environment model, the changes to the virtual soil environment caused by excavation are obtained;
  • Step 550: The data obtained in step 530 and step 540 are integrated to obtain the interactive virtual environment and the real-time viewing angle;
  • Step 560: The interactive virtual environment is presented to the user on the imaging device, so that the user can perform the next operation.
  • the present application also provides a control performance test system of an operating machine, which uses the above-mentioned virtual interaction method between the operating machine and the operating environment.
  • the test system can use the virtual environment to test the control performance of operating machinery, which is more convenient and efficient than the existing test methods based on actual scenarios.
  • FIG. 6 illustrates a schematic diagram of the physical structure of an electronic device.
  • the electronic device may include: a processor 610, a communications interface 620, a memory 630 and a communication bus 640, where the processor 610, the communications interface 620 and the memory 630 communicate with each other through the communication bus 640.
  • the processor 610 can call the logic instructions in the memory 630 to execute the virtual interaction method between the working machine and the working environment.
  • the method includes: acquiring the opening proportional signal of the control component; determining the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and the pre-constructed virtual environment information, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and acquiring the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to that range.
  • the logic instructions in the above-mentioned memory 630 may be implemented in the form of software functional units and when sold or used as an independent product, they may be stored in a computer-readable storage medium.
  • the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
  • in another aspect, the present application also provides a computer program product, the computer program product including a computer program stored on a non-transitory computer-readable storage medium, the computer program including program instructions which, when executed by a computer, cause the computer to execute the virtual interaction method between the working machine and the working environment provided by the above methods.
  • the method includes: acquiring the opening proportional signal of the control component; determining the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and the pre-constructed virtual environment information, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and acquiring the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to that range.
  • in yet another aspect, the present application also provides a non-transitory computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the above virtual interaction method between the working machine and the working environment.
  • the method includes: acquiring the opening proportional signal of the control component; determining the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and the pre-constructed virtual environment information, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and acquiring the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to that range.
  • the device embodiments described above are only illustrative; the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed over multiple network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment, which can be understood and implemented by those skilled in the art without creative effort.
  • each implementation can be implemented by means of software plus a necessary general-purpose hardware platform, and of course also by hardware.
  • the essence of the above technical solution or the part that contributes to the prior art can be embodied in the form of software products, and the computer software products can be stored in computer-readable storage media, such as ROM/RAM, magnetic discs, optical discs, etc., including several instructions to make a computer device (which may be a personal computer, server, or network device, etc.) execute the methods described in various embodiments or some parts of the embodiments.

Abstract

Provided in the present application are a virtual interaction method, apparatus and system for a work machine and a work environment. The method comprises: acquiring an open degree proportion signal of a control assembly; determining motion state data of a controlled work machine; respectively determining, according to the motion state data and pre-constructed virtual environment information, the state of interaction between the controlled work machine and each spatial element in the virtual environment information, and generating scenario update information according to the state of interaction; and finally, intercepting corresponding scenario image information from the scenario update information, according to an actual angle-of-view range of a user, and outputting same. Therefore, an interaction process between a work machine and a work environment in a virtual scenario is realized. The process can be realized by means of a virtual environment, thereby overcoming the problem of the interaction process being labour-intensive and time-consuming due to the limitations of an actual application environment in a scenario in which the work machine is operated and controlled to interact with an application environment thereof.

Description

Virtual interaction method, device and system for a working machine and a working environment
Cross-Reference to Related Applications
This application claims priority to Chinese patent application No. 202210002264.5, filed on January 4, 2022 and entitled "Virtual interaction method, device and system for a working machine and a working environment", which is incorporated herein by reference in its entirety.
Technical Field
The present application relates to the field of virtual reality technology, and in particular to a virtual interaction method, device and system for a working machine and a working environment.
Background
Considering that the application environment of a working machine is mainly a complex construction scene, in scenarios involving controlling a working machine to interact with its application environment, the limitations of the application environment often make the interaction process time-consuming and labor-intensive, and it is difficult to meet actual needs.
For example, the handling performance of a working machine generally needs to be tested under a variety of working conditions. Since actual test sites are limited and one site can only reproduce one working condition, testing multiple working conditions requires transferring the working machine to be tested to different test sites, making the test process time-consuming, labor-intensive and inefficient.
Summary of the Invention
The present application provides a virtual interaction method, device and system for a working machine and a working environment, which are used to solve the defect in the prior art that, in scenarios involving controlling a working machine to interact with its application environment, the limitations of the application environment often make the interaction process time-consuming and labor-intensive and difficult to meet actual needs.
In a first aspect, the present application provides a virtual interaction method for a working machine and a working environment, the method comprising:
acquiring an opening proportional signal of a control component;
determining motion state data of the controlled working machine according to the opening proportional signal;
according to the motion state data and pre-constructed virtual environment information, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state;
acquiring the user's actual viewing angle range, and intercepting and outputting corresponding scene image information from the scene update information according to the user's actual viewing angle range.
According to a virtual interaction method for a working machine and a working environment provided by the present application, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information, and generating scene update information according to the interaction state, includes:
according to the motion state data and the pre-constructed virtual environment information, when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determining the interaction category of the contacted spatial element;
determining scene change information according to the interaction category and the motion state data of the controlled working machine;
updating the virtual environment information in real time according to the scene change information to generate the scene update information.
According to a virtual interaction method for a working machine and a working environment provided by the present application, the interaction categories of the spatial elements include an interactable category and a non-interactable category.
According to a virtual interaction method for a working machine and a working environment provided by the present application, determining the scene change information according to the interaction category and the motion state data of the controlled working machine includes:
when the interaction category of the contacted spatial element is interactable, determining the deformation state of the spatial element according to the motion state data of the controlled working machine, and using the deformation state and the post-construction motion state data of the controlled working machine as the scene change information;
when the interaction category of the contacted spatial element is non-interactable, using the motion state data of the controlled working machine at the current moment as the scene change information.
According to a virtual interaction method for a working machine and a working environment provided by the present application, acquiring the user's actual viewing angle range includes:
acquiring the swing center coordinates and swing angle of the controlled working machine;
determining the forward head-up direction vector of the cab of the controlled working machine according to the swing center coordinates and the swing angle;
determining the user's actual viewing angle range according to the forward head-up direction vector and a preset user viewing angle range.
In a second aspect, the present application further provides a virtual interaction device for a working machine and a working environment, the device comprising:
an acquisition module, configured to acquire an opening proportional signal of a control component;
a first processing module, configured to determine motion state data of the controlled working machine according to the opening proportional signal;
a second processing module, configured to respectively determine the interaction state between the controlled working machine and each spatial element in pre-constructed virtual environment information according to the motion state data and the virtual environment information, and to generate scene update information according to the interaction state;
a third processing module, configured to acquire the user's actual viewing angle range, and to intercept and output corresponding scene image information from the scene update information according to the user's actual viewing angle range.
In a third aspect, the present application further provides a virtual interaction system for a working machine and a working environment, the system comprising: a control component, a data processing device and at least one imaging device, where the control component and the at least one imaging device are both connected to the data processing device;
the control component is used for the user to initiate control actions on the controlled working machine;
the data processing device is used to obtain the opening proportional signal of the control component; determine the motion state data of the controlled working machine according to the opening proportional signal; according to the motion state data and the pre-constructed virtual environment information, respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state; and obtain the user's actual viewing angle range, and intercept and output corresponding scene image information from the scene update information according to the user's actual viewing angle range;
the imaging device is used to display the scene image information.
According to a virtual interaction system for a working machine and a working environment provided by the present application, the imaging device is a display screen or a head-mounted imaging device.
According to a virtual interaction system for a working machine and a working environment provided by the present application, when the controlled working machine is an excavator, the control component includes left and right handles and left and right foot pedals.
In a fourth aspect, the present application further provides a test system for the handling performance of a working machine, which uses any one of the above virtual interaction methods for a working machine and a working environment.
The virtual interaction method, device and system for a working machine and a working environment provided by the present application determine the motion state data of the controlled working machine by obtaining the opening proportional signal of the control component, respectively determine the interaction state between the controlled working machine and each spatial element in pre-constructed virtual environment information according to the motion state data and the virtual environment information, generate scene update information according to the interaction state, and finally intercept and output the corresponding scene image information from the scene update information according to the user's actual viewing angle range. The interaction process between the working machine and the working environment is thus realized in a virtual scene; since this process can be realized through the virtual environment, it overcomes the problem that, in scenarios involving controlling a working machine to interact with its application environment, the limitations of the actual application environment make the interaction process time-consuming and labor-intensive.
Brief Description of the Drawings
In order to more clearly illustrate the technical solutions in the present application or in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are some embodiments of the present application, and those of ordinary skill in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of the virtual interaction method for a working machine and a working environment provided by the present application;
Fig. 2 is a schematic structural diagram of the virtual interaction device for a working machine and a working environment provided by the present application;
Fig. 3 is a schematic structural diagram of the virtual interaction system for a working machine and a working environment provided by the present application;
Fig. 4 is a schematic structural diagram of the virtual interaction system for a working machine and a working environment when the controlled object is an excavator;
Fig. 5 is a schematic diagram of the data processing flow in the computing host;
Fig. 6 is a schematic structural diagram of an electronic device provided by the present application.
Detailed Description of the Embodiments
In order to make the purpose, technical solutions and advantages of the present application clearer, the technical solutions in the present application are described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are some, rather than all, of the embodiments of the present application. Based on the embodiments in the present application, all other embodiments obtained by those of ordinary skill in the art without creative effort fall within the protection scope of the present application.
The virtual interaction method, device and system for a working machine and a working environment provided by the embodiments of the present application are described below with reference to Figs. 1 to 5.
Fig. 1 shows the virtual interaction method for a working machine and a working environment provided by an embodiment of the present application; the method includes:
Step 110: Obtain an opening proportional signal of the control component.
The control components mentioned here are the components that control the travel and construction actions of the controlled working machine. For example, if the controlled working machine is an excavator, the control components can be left and right control handles and left and right foot pedals, which simulate the left and right joysticks and left and right travel pedals of a real excavator, so that the user can trigger the corresponding control commands by operating the control components.
Specifically, for an excavator, the forward and backward movements of the left handle correspond to the arm-out and arm-in control commands respectively, and the left and right movements of the left handle correspond to the left and right swing commands of the upper structure; the forward and backward movements of the right handle correspond to the boom-lowering and boom-raising commands respectively, and the left and right movements of the right handle correspond to the bucket digging and dumping commands respectively; the forward and backward movements of the left pedal correspond to the forward and reverse commands of the left track, and the forward and backward movements of the right pedal correspond to the forward and reverse commands of the right track.
When the control component is operated by the user, the corresponding opening proportional signal can be obtained, so that the user's manipulation of the control component is sensed in the form of an electrical signal.
Step 120: Determine the motion state data of the controlled working machine according to the opening proportional signal.
In an exemplary embodiment, the conversion from the opening proportional signal to the motion state data of the controlled working machine can be realized by constructing a system simulation model.
Specifically, the system simulation model is a one-dimensional physical parameterized model that includes a basic electric control logic model, a hydraulic system model, and a dynamics model of the working machine's upper structure and working device. According to the opening proportional signal, the electric control logic model outputs the signal of each electrically controlled valve to the hydraulic system model, and the hydraulic system model outputs the real-time pressure of each cylinder to the dynamics model of the upper structure and working device; by sensing the travel and swing of the working machine and the motion state of the working device mounted on it, the motion state data of the controlled working machine is output. The motion state data includes information such as the spatial coordinates, velocity and acceleration of the machine body and of the working devices mounted on it.
It can be understood that the working device mentioned in this embodiment refers to the components that the working machine uses when travelling and performing construction work; for example, when the working machine is an excavator, the working device is the system formed by the boom, arm, bucket and hydraulic cylinders.
Step 130: According to the motion state data and the pre-constructed virtual environment information, respectively determine the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state.
Specifically, respectively determining the interaction state between the controlled working machine and each spatial element in the virtual environment information according to the motion state data and the pre-constructed virtual environment information, and generating scene update information according to the interaction state, may include:
Step 1: According to the motion state data and the pre-constructed virtual environment information, when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determine the interaction category of the contacted spatial element;
第二步:根据发生接触的空间要素的交互类别和被控作业机械的运动状态数据,确定场景变化信息;Step 2: Determine the scene change information according to the interaction category of the contacted spatial elements and the motion state data of the controlled operating machine;
第三步:根据场景变化信息对虚拟环境信息进行实时更新,生成场景更新信息。Step 3: Update the virtual environment information in real time according to the scene change information to generate scene update information.
在本实施例中,空间要素的交互类别包括可交互类和不可交互类,对应地,空间要素分为可交互对象和不可交互对象。In this embodiment, the interaction categories of the spatial elements include interactive objects and non-interactive objects. Correspondingly, the spatial elements are divided into interactive objects and non-interactive objects.
可以理解的是,不可交互对象包括地形、道路、其他机械、障碍物以及人员等空间要素。可交互对象包括土、石等作业机械对应的施工对象。It can be understood that non-interactive objects include spatial elements such as terrain, roads, other machinery, obstacles, and people. Interactable objects include construction objects corresponding to operating machines such as soil and stone.
在示例性实施例中,判断被控作业机械与虚拟环境信息中的空间要素是否接触,可以在已知空间要素的轮廓信息中各点定义的外法向量的情况下,当被控作业机械的工作装置轮廓对应各点坐标与空间要素轮廓上部分点距离小于预设阈值时,计算每一时间步下空间要素轮廓点到最近工作装置轮廓点连线所形成的向量,与空间要素轮廓点的外法向量作点积运算,若运算结果小于或等于0,则判定被控作业机械与虚拟环境信息中某空间要素发生接触。In an exemplary embodiment, judging whether the controlled operating machine is in contact with the spatial element in the virtual environment information may be based on the known external normal vector defined by each point in the contour information of the spatial element, when the controlled operating machine When the distance between the coordinates of each point corresponding to the contour of the working device and some points on the contour of the spatial element is less than the preset threshold, calculate the vector formed by the line connecting the contour point of the spatial element to the contour point of the nearest working device at each time step, and the distance between the contour point of the spatial element The dot product operation is performed on the outer normal vector, and if the operation result is less than or equal to 0, it is determined that the controlled operation machine is in contact with a certain spatial element in the virtual environment information.
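A minimal sketch of this dot-product contact test follows, assuming the two contours are given as NumPy arrays of 3D points and that the spatial element also carries per-point outward normals; the function name and threshold value are hypothetical.

```python
import numpy as np

def in_contact(device_pts, element_pts, element_normals, threshold=0.05):
    """Rough sketch of the contact test described above: for each element
    contour point that lies within `threshold` of the working-device contour,
    form the vector to the nearest device point and dot it with the point's
    outward normal; contact is declared when any such dot product is <= 0."""
    for p, n in zip(element_pts, element_normals):
        d = np.linalg.norm(device_pts - p, axis=1)   # distances to all device contour points
        i = int(np.argmin(d))
        if d[i] < threshold:
            v = device_pts[i] - p                    # element point -> nearest device point
            if np.dot(v, n) <= 0.0:
                return True
    return False
```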
In an exemplary embodiment, the process of determining the scene change information according to the interaction category of the contacted spatial element and the motion state data of the controlled working machine may include the following two cases (a brief sketch is given after this list):
On the one hand, when the interaction category of the contacted spatial element is the interactive category, the deformation state of the spatial element is determined according to the motion state data of the controlled working machine, and the deformation state together with the motion state data of the controlled working machine after the construction operation is taken as the scene change information;
On the other hand, when the interaction category of the contacted spatial element is the non-interactive category, the motion state data of the controlled working machine at the current moment is taken as the scene change information.
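A minimal sketch of this branching is given below; `simulate_deformation` is an assumed callable standing in for the construction-object interaction model and is not defined by the original disclosure.

```python
def scene_change_info(element, machine_state, interactive, simulate_deformation):
    """Branch on the interaction category of the contacted spatial element.
    `simulate_deformation` is a hypothetical stand-in for the construction-object
    interaction model (e.g. a DEM- or MPM-based solver)."""
    if interactive:
        deformation = simulate_deformation(element, machine_state)
        return {"deformation": deformation, "machine_state": machine_state}
    # non-interactive element: only the machine's current motion state is propagated
    return {"machine_state": machine_state}
```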
In this embodiment, in order to construct the virtual scene information, a space and obstacle environment model can be built in advance. This model is a spatial data model that records all the spatial coordinates corresponding to the contour or boundary of each element in the space; taking the motion state data of the working machine as input, it judges whether each element comes into contact with the working machine by comparing in real time information such as the relative position, velocity and acceleration of each element in the space with respect to the working machine.
After it is determined that a certain element is in contact with the working machine, it is further determined whether the element is an interactive object or a non-interactive object. For a non-interactive object, the model can output information such as the position and attitude of the working machine under the environmental constraint, i.e., the motion state data of the controlled working machine in the contact state.
For interactive objects, a construction-object interaction model can be built in advance. This model combines a spatial data model with a material property model of the construction object (for example, soil or rock); through a simulation algorithm based on the discrete element method or the material point method, it simulates the deformation and residual shape of the construction object under the action of the controlled working machine, and outputs the coordinates of each point of the construction object's outer contour after construction.
Specifically, when the controlled working machine is in contact with an interactive object, this model can simulate the construction action process: according to information such as the position and velocity of the construction component, and based on the material properties of the construction object, it calculates in real time the resistance acting on the construction component and the deformation state of the construction object under the interaction condition; the deformation state of the construction object and the motion state data of the controlled working machine after the construction operation are then fed back to the space and obstacle environment model, so that the scene change information during the construction process is obtained. Finally, the scene change information is recombined with the other immutable environment elements in the virtual environment information to generate the updated three-dimensional virtual environment information, i.e., the scene update information.
Therefore, by combining the system simulation model, the space and obstacle environment model and the construction-object interaction model, this embodiment achieves a virtual-reality effect covering the whole chain from the handle and pedal signals to a high-precision simulation of the soil and attachments during excavation.
Step 140: Obtain the user's actual viewing angle range, and intercept and output the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
In an exemplary embodiment, the process of obtaining the user's actual viewing angle range may include:
first, obtaining the slewing center coordinates and the slewing angle of the controlled working machine;
then, determining the forward head-up direction vector of the cab of the controlled working machine according to the slewing center coordinates and the slewing angle;
finally, determining the user's actual viewing angle range according to the forward head-up direction vector and the preset user viewing angle range.
It can be understood that the above space and obstacle environment model contains the coordinates of each point on the ground within the movable space of the working machine, stored as point cloud data at a certain resolution. Taking an excavator as an example, this embodiment describes the process of obtaining the actual viewing angle range as follows:
When the excavator travels in an arbitrary direction, the ground in the travel direction must be such that the z-axis coordinates of the bottom of the excavator's tracks (taken as a plane) are all greater than or equal to the z-axis coordinates of the corresponding projected ground points, otherwise penetration would occur; at the same time, it must be ensured that the z-axis coordinates of at least three points are equal to the corresponding ground coordinates.
At this time, both the normal vector n of the excavator and its travel direction vector s can be determined by the plane formed by the bottom of the tracks, while the coordinates of the excavator's slewing center o and the slewing angle θ are known parameters. The head-up direction vector b in front of the excavator cab at each time point can be obtained by the D-H homogeneous matrix transformation method; specifically, b is obtained by rotating the travel direction vector s by the angle θ about the axis that passes through the slewing center o and points in the direction n. Assuming that the user's viewing angle range is ±α₁ vertically and ±α₂ horizontally about the head-up direction, the user's viewing angle and its range, i.e., the user's actual viewing angle range, can then be determined.
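As an illustrative sketch only (the publication refers to a D-H homogeneous matrix transformation; the equivalent axis-angle rotation is used here for brevity), the head-up vector b and the assumed half-angles of the view cone could be computed as follows; the symbols mirror the paragraph above, and the numeric values are hypothetical.

```python
import numpy as np

def head_up_vector(s, n, theta):
    """Rotate the travel direction vector s by angle theta about the axis n
    (the machine's normal through the slewing center), using the Rodrigues
    rotation formula as a stand-in for the D-H homogeneous transformation."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    s = np.asarray(s, dtype=float)
    return (s * np.cos(theta)
            + np.cross(n, s) * np.sin(theta)
            + n * np.dot(n, s) * (1.0 - np.cos(theta)))

# Example: head-up vector after a 30-degree swing, with assumed +/-20 deg
# vertical and +/-35 deg horizontal viewing half-angles (alpha_1, alpha_2).
b = head_up_vector(s=[1.0, 0.0, 0.0], n=[0.0, 0.0, 1.0], theta=np.radians(30))
alpha_1, alpha_2 = np.radians(20), np.radians(35)
```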
According to the user's real-time actual viewing angle range, the virtual environment region visible to the user under the current viewing angle can be obtained from the scene update information. Specifically, based on the head-up direction and the vertical and horizontal angle ranges obtained above, rays are extended from the coordinates of the viewer within that viewing angle range (i.e., from the eye position of the person in the cab) along the boundary of the viewing angle range; the intersection of these rays with the virtual environment contour in the scene update information is the virtual environment region visible to the user.
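A greatly simplified sketch of this view-cone test is given below: instead of intersecting boundary rays with the environment contour, it merely checks whether a single scene point falls within the assumed vertical and horizontal half-angles around b. The function name and the assumption that b is horizontal and world z points up are illustrative only.

```python
import numpy as np

def visible(point, eye, b, alpha_1, alpha_2):
    """Keep a scene point if the ray from the eye to the point lies within
    +/-alpha_1 vertically and +/-alpha_2 horizontally of the head-up
    direction b (b assumed horizontal, world z taken as 'up')."""
    d = np.asarray(point, float) - np.asarray(eye, float)
    elev = np.arcsin(d[2] / np.linalg.norm(d))               # vertical angle of the ray
    d_h = d[:2] / np.linalg.norm(d[:2])
    b_h = np.asarray(b, float)[:2] / np.linalg.norm(np.asarray(b, float)[:2])
    azim = np.arccos(np.clip(np.dot(d_h, b_h), -1.0, 1.0))   # horizontal offset from b
    return abs(elev) <= alpha_1 and azim <= alpha_2
```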
This virtual environment region can be fed back to the user through data such as images and depth information. During feedback it can be presented to the user via an imaging device, and the user can in turn continue or adjust the handle and pedal inputs according to the visual feedback of his or her own operations, thereby realizing real-time virtual human-machine interaction and, further, the virtual interaction between the working machine and the working environment.
Of course, in practical applications the imaging device may be not only one or more display screens, but also a wearable imaging device that follows the movement of the user's head in real time, such as head-mounted glasses. In this case, the imaging device can also combine the real-time movement of the user's viewpoint with the actual viewing angle range obtained above; specifically, the viewing angle transformation captured by the head-mounted imaging device (including viewpoint translation and rotation) can be described by the D-H homogeneous matrix transformation method, i.e., by transforming the head-up direction vector b, so as to obtain a more accurate actual viewing angle range, and the final real-time image can be output to the user by the head-mounted glasses.
It can thus be seen that the virtual interaction method between a working machine and a working environment provided by this embodiment can construct a corresponding virtual scene for a construction scenario and meet its specific needs; at the same time, combining the mechanism model of the actual construction process with virtual-reality digital implementation technology brings interactive processes such as construction and traveling of the controlled working machine closer to the real scene and better satisfies practical application requirements.
The virtual interaction apparatus between a working machine and a working environment provided by the present application is described below; the apparatus described below and the virtual interaction method between a working machine and a working environment described above may be referred to in correspondence with each other.
Fig. 2 shows the virtual interaction apparatus between a working machine and a working environment provided by an embodiment of the present application. The apparatus includes:
an acquisition module 210, configured to acquire an opening ratio signal of a control component;
a first processing module 220, configured to determine the motion state data of the controlled working machine according to the opening ratio signal;
a second processing module 230, configured to determine, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and to generate scene update information according to the interaction state;
a third processing module 240, configured to obtain the user's actual viewing angle range, and to intercept and output the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
In an exemplary embodiment, the second processing module 230 is specifically configured to: judge, according to the motion state data and the pre-constructed virtual environment information, whether the controlled working machine is in contact with each spatial element in the virtual environment information; when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determine the interaction category of the contacted spatial element; determine scene change information according to the interaction category of the contacted spatial element and the motion state data of the controlled working machine; and update the virtual environment information in real time according to the scene change information to generate the scene update information.
Specifically, in this embodiment the interaction categories of spatial elements may include an interactive category and a non-interactive category.
The function of the second processing module 230 of determining the scene change information according to the interaction category of the contacted spatial element and the motion state data of the controlled working machine may specifically be implemented as follows:
when the interaction category of the contacted spatial element is the interactive category, determining the deformation state of the spatial element according to the motion state data of the controlled working machine, and taking the deformation state and the motion state data of the controlled working machine after the construction operation as the scene change information;
when the interaction category of the contacted spatial element is the non-interactive category, taking the motion state data of the controlled working machine at the current moment as the scene change information.
The function of the third processing module 240 of obtaining the user's actual viewing angle range may specifically be implemented as follows:
obtaining the slewing center coordinates and the slewing angle of the controlled working machine;
determining the forward head-up direction vector of the cab of the controlled working machine according to the slewing center coordinates and the slewing angle;
determining the user's actual viewing angle range according to the forward head-up direction vector and the preset user viewing angle range.
Fig. 3 shows a virtual interaction system between a working machine and a working environment provided by an embodiment of the present application. The system includes a control component 310, a data processing device 320 and at least one imaging device 330, where the control component 310 and the at least one imaging device 330 are both connected to the data processing device 320;
the control component 310 is used for the user to initiate control actions on the controlled working machine;
the data processing device 320 is used to acquire the opening ratio signal of the control component; determine the motion state data of the controlled working machine according to the opening ratio signal; determine, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state; and obtain the user's actual viewing angle range, and intercept and output the corresponding scene image information from the scene update information according to the user's actual viewing angle range;
the imaging device 330 is used to display the scene image information.
It should be noted that in this embodiment the imaging device 330 may be a display screen or a head-mounted imaging device.
In an exemplary embodiment, when the controlled working machine is an excavator, the control component includes left and right handles and left and right foot pedals.
Fig. 4 shows the architecture of the virtual interaction system between the working machine and the working environment when the controlled object is an excavator. The system includes three display screens 410, left and right foot pedals 420, a left handle 430, a right handle 440 and a computing host 450.
The user operates the left and right foot pedals 420, the left handle 430 and the right handle 440; the corresponding opening ratio signals of the handles and pedals are transmitted to the computing host 450 by some transmission method (either wired or wireless), and the computing host 450 obtains the scene image information through a series of data processing procedures. The scene image information can be displayed to the user in real time on the display screens 410, and based on the displayed information the user can further operate the pedals and handles to perform the next action, so that, in the virtual scene, the real-time interaction between the excavator and the virtual working environment is realized in a human-machine interactive manner.
Referring to Fig. 4, the system is also provided with a seat 460, on which the user can operate the above control components.
The data processing flow inside the computing host 450 is shown in Fig. 5 and proceeds as follows (a simplified sketch of the loop is given after the steps):
Step 510: the user operates the handles and pedals, which output handle signals and pedal signals;
Step 520: the handle signals and pedal signals are input into the system simulation model, i.e., the physically parameterized model, which processes them and outputs the motion state data of the excavator;
Step 530: the motion state data of the excavator is further input into the space and obstacle environment model, which is obtained through virtual mapping or conversion of real-scene scans; after processing by this model, and with the head-mounted device following the changes of the user's viewpoint, the viewing angle and position changes corresponding to the excavator's swing and travel can be output;
Step 540: at the same time, through the interaction between the construction-object interaction model and the space and obstacle environment model, the changes of the soil in the virtual environment caused by excavation can be obtained;
Step 550: the data obtained in step 530 and step 540 are integrated to obtain the post-interaction virtual environment and the real-time viewpoint;
Step 560: the post-interaction virtual environment is presented to the user on the imaging device, so that the user can conveniently perform the next operation.
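As an illustration only, the closed loop of steps 510 to 560 could be organized roughly as follows; every callable here is a hypothetical interface standing in for the corresponding model or device described above, not part of the disclosure.

```python
def interaction_loop(read_controls, system_model, env_model, soil_model, render):
    """One pass per time step through the processing chain of Fig. 5.
    All five callables are assumed interfaces supplied by the caller."""
    while True:
        signals = read_controls()                      # step 510: handle/pedal signals
        machine_state = system_model(signals)          # step 520: system simulation model
        view, pose = env_model(machine_state)          # step 530: space/obstacle model + viewpoint
        soil_update = soil_model(machine_state, pose)  # step 540: construction-object model
        scene = {"view": view, "pose": pose, "soil": soil_update}  # step 550: integrate
        render(scene)                                  # step 560: present to the user
```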
In addition, the present application further provides a working machine control performance test system that uses the above virtual interaction method between a working machine and a working environment. The test system can use the virtual environment to test the control performance of a working machine, which is more convenient and efficient than existing test methods based on real scenes.
Fig. 6 illustrates a schematic diagram of the physical structure of an electronic device. As shown in Fig. 6, the electronic device may include a processor 610, a communications interface 620, a memory 630 and a communication bus 640, where the processor 610, the communications interface 620 and the memory 630 communicate with one another through the communication bus 640. The processor 610 can call logic instructions in the memory 630 to execute the virtual interaction method between a working machine and a working environment, the method including: acquiring an opening ratio signal of a control component; determining the motion state data of the controlled working machine according to the opening ratio signal; determining, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and obtaining the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
In addition, the logic instructions in the memory 630 may be implemented in the form of software functional units and, when sold or used as an independent product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part that contributes to the prior art, or a part of the technical solution, can be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc.
In another aspect, the present application further provides a computer program product. The computer program product includes a computer program stored on a non-transitory computer-readable storage medium, and the computer program includes program instructions; when the program instructions are executed by a computer, the computer can execute the virtual interaction method between a working machine and a working environment provided by the above methods, the method including: acquiring an opening ratio signal of a control component; determining the motion state data of the controlled working machine according to the opening ratio signal; determining, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and obtaining the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
In yet another aspect, the present application further provides a non-transitory computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, it implements the virtual interaction method between a working machine and a working environment provided above, the method including: acquiring an opening ratio signal of a control component; determining the motion state data of the controlled working machine according to the opening ratio signal; determining, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and obtaining the user's actual viewing angle range, and intercepting and outputting the corresponding scene image information from the scene update information according to the user's actual viewing angle range.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, i.e., they may be located in one place or distributed over multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement it without creative effort.
Through the description of the above embodiments, those skilled in the art can clearly understand that each embodiment can be implemented by means of software plus a necessary general-purpose hardware platform, or of course by hardware. Based on this understanding, the above technical solution, in essence, or the part that contributes to the prior art, can be embodied in the form of a software product. The computer software product can be stored in a computer-readable storage medium, such as a ROM/RAM, a magnetic disk or an optical disc, and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods described in the embodiments or in some parts of the embodiments.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions described in the foregoing embodiments, or make equivalent replacements for some of the technical features therein, and that these modifications or replacements do not make the essence of the corresponding technical solutions depart from the spirit and scope of the technical solutions of the embodiments of the present application.

Claims (10)

  1. A virtual interaction method between a working machine and a working environment, comprising:
    acquiring an opening ratio signal of a control component;
    determining motion state data of a controlled working machine according to the opening ratio signal;
    determining, according to the motion state data and pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state; and
    obtaining the user's actual viewing angle range, and intercepting and outputting corresponding scene image information from the scene update information according to the user's actual viewing angle range.
  2. The virtual interaction method between a working machine and a working environment according to claim 1, wherein determining, according to the motion state data and the pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generating scene update information according to the interaction state, comprises:
    according to the motion state data and the pre-constructed virtual environment information, when it is determined that the controlled working machine is in contact with at least one spatial element in the virtual environment information, determining the interaction category of the spatial element with which contact occurs;
    determining scene change information according to the interaction category and the motion state data of the controlled working machine; and
    updating the virtual environment information in real time according to the scene change information to generate the scene update information.
  3. The virtual interaction method between a working machine and a working environment according to claim 2, wherein the interaction categories of the spatial elements comprise an interactive category and a non-interactive category.
  4. The virtual interaction method between a working machine and a working environment according to claim 3, wherein determining scene change information according to the interaction category and the motion state data of the controlled working machine comprises:
    when the interaction category of the spatial element with which contact occurs is the interactive category, determining the deformation state of the spatial element according to the motion state data of the controlled working machine, and taking the deformation state and the motion state data of the controlled working machine after the construction operation as the scene change information; and
    when the interaction category of the spatial element with which contact occurs is the non-interactive category, taking the motion state data of the controlled working machine at the current moment as the scene change information.
  5. The virtual interaction method between a working machine and a working environment according to claim 1, wherein obtaining the user's actual viewing angle range comprises:
    obtaining slewing center coordinates and a slewing angle of the controlled working machine;
    determining a forward head-up direction vector of the cab of the controlled working machine according to the slewing center coordinates and the slewing angle; and
    determining the user's actual viewing angle range according to the forward head-up direction vector and a preset user viewing angle range.
  6. A virtual interaction apparatus between a working machine and a working environment, comprising:
    an acquisition module, configured to acquire an opening ratio signal of a control component;
    a first processing module, configured to determine motion state data of a controlled working machine according to the opening ratio signal;
    a second processing module, configured to determine, according to the motion state data and pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and to generate scene update information according to the interaction state; and
    a third processing module, configured to obtain the user's actual viewing angle range, and to intercept and output corresponding scene image information from the scene update information according to the user's actual viewing angle range.
  7. A virtual interaction system between a working machine and a working environment, comprising a control component, a data processing device and at least one imaging device, wherein the control component and the at least one imaging device are both connected to the data processing device;
    the control component is used for a user to initiate control actions on a controlled working machine;
    the data processing device is used to acquire an opening ratio signal of the control component; determine motion state data of the controlled working machine according to the opening ratio signal; determine, according to the motion state data and pre-constructed virtual environment information, the interaction state between the controlled working machine and each spatial element in the virtual environment information, and generate scene update information according to the interaction state; and obtain the user's actual viewing angle range, and intercept corresponding scene image information from the scene update information according to the user's actual viewing angle range; and
    the imaging device is used to display the scene image information.
  8. The virtual interaction system between a working machine and a working environment according to claim 7, wherein the imaging device is a display screen or a head-mounted imaging device.
  9. The virtual interaction system between a working machine and a working environment according to claim 7, wherein, when the controlled working machine is an excavator, the control component comprises left and right handles and left and right foot pedals.
  10. A working machine control performance test system, using the virtual interaction method between a working machine and a working environment according to any one of claims 1 to 5.
PCT/CN2023/070137 2022-01-04 2023-01-03 Virtual interaction method, apparatus and system for work machine and work environment WO2023131124A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210002264.5A CN114327076A (en) 2022-01-04 2022-01-04 Virtual interaction method, device and system for working machine and working environment
CN202210002264.5 2022-01-04

Publications (1)

Publication Number Publication Date
WO2023131124A1 true WO2023131124A1 (en) 2023-07-13

Family

ID=81022281

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/070137 WO2023131124A1 (en) 2022-01-04 2023-01-03 Virtual interaction method, apparatus and system for work machine and work environment

Country Status (2)

Country Link
CN (1) CN114327076A (en)
WO (1) WO2023131124A1 (en)

Cited By (1)

Publication number Priority date Publication date Assignee Title
CN116934180A (en) * 2023-09-15 2023-10-24 恒实建设管理股份有限公司 Whole process consultation information management method, system, device and storage medium

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN114327076A (en) * 2022-01-04 2022-04-12 上海三一重机股份有限公司 Virtual interaction method, device and system for working machine and working environment

Citations (6)

Publication number Priority date Publication date Assignee Title
US20170069218A1 (en) * 2015-09-07 2017-03-09 Industry-University Cooperation Foundation Korea Aerospace University L-v-c operating system and unmanned aerial vehicle training/testing method using the same
CN108701425A (en) * 2016-01-14 2018-10-23 比伯拉赫利勃海尔零部件有限公司 The simulator of crane, building machinery or industrial truck
CN112908084A (en) * 2021-02-04 2021-06-04 三一汽车起重机械有限公司 Simulation training system, method and device for working machine and electronic equipment
CN112965399A (en) * 2021-03-24 2021-06-15 中国人民解放军63653部队 Semi-physical simulation test method and device for engineering mechanical equipment
CN113192381A (en) * 2021-05-11 2021-07-30 上海西井信息科技有限公司 Hybrid scene-based driving simulation method, system, device and storage medium
CN114327076A (en) * 2022-01-04 2022-04-12 上海三一重机股份有限公司 Virtual interaction method, device and system for working machine and working environment

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
US10019634B2 (en) * 2010-06-04 2018-07-10 Masoud Vaziri Method and apparatus for an eye tracking wearable computer
CN106802712A (en) * 2015-11-26 2017-06-06 英业达科技有限公司 Interactive augmented reality system
CN106582012B (en) * 2016-12-07 2018-12-11 腾讯科技(深圳)有限公司 Climbing operation processing method and device under a kind of VR scene
WO2019019248A1 (en) * 2017-07-28 2019-01-31 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN110046833A (en) * 2019-05-13 2019-07-23 吉林大学 A kind of traffic congestion auxiliary system virtual test system
CN112738010B (en) * 2019-10-28 2023-08-22 阿里巴巴集团控股有限公司 Data interaction method and system, interaction terminal and readable storage medium
CN112249005B (en) * 2020-10-23 2021-10-12 广州小鹏汽车科技有限公司 Interaction method and device for automatic parking of vehicle


Cited By (2)

Publication number Priority date Publication date Assignee Title
CN116934180A (en) * 2023-09-15 2023-10-24 恒实建设管理股份有限公司 Whole process consultation information management method, system, device and storage medium
CN116934180B (en) * 2023-09-15 2023-12-08 恒实建设管理股份有限公司 Whole process consultation information management method, system, device and storage medium

Also Published As

Publication number Publication date
CN114327076A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
WO2023131124A1 (en) Virtual interaction method, apparatus and system for work machine and work environment
US20210139293A1 (en) Crane, Construction Machine Or Industrial Truck Simulator
CN104484522B (en) A kind of construction method of robot simulation's drilling system based on reality scene
US20170199580A1 (en) Grasping virtual objects in augmented reality
US8203529B2 (en) Tactile input/output device and system to represent and manipulate computer-generated surfaces
CN107943286B (en) Method for enhancing roaming immersion
CN113366491B (en) Eyeball tracking method, device and storage medium
US11567579B2 (en) Selection of an edge with an immersive gesture in 3D modeling
JP2010257081A (en) Image procession method and image processing system
US6760030B2 (en) Method of displaying objects in a virtual 3-dimensional space
IL299465A (en) Object recognition neural network for amodal center prediction
Yamada et al. Construction tele-robot system with virtual reality
US5577176A (en) Method and apparatus for displaying a cursor along a two dimensional representation of a computer generated three dimensional surface
Du et al. An intelligent interaction framework for teleoperation based on human-machine cooperation
CN112530022A (en) Method for computer-implemented simulation of LIDAR sensors in a virtual environment
Valentini Natural interface in augmented reality interactive simulations: This paper demonstrates that the use of a depth sensing camera that helps generate a three-dimensional scene and track user's motion could enhance the realism of the interactions between virtual and physical objects
CA2924696C (en) Interactive haptic system for virtual reality environment
CN112486319B (en) VR (virtual reality) interaction method, device, equipment and medium based on touch rendering equipment
CN115239636A (en) Assembly detection method based on augmented reality technology
KR102314578B1 (en) Auxiliary camera position optimization apparatus for 3D object operation in virtual reality and method thereof
Zheng et al. Research on virtual driving system of a forestry logging harvester
Carozza et al. An immersive hybrid reality system for construction training
Ni et al. Teleoperation system with virtual reality based on stereo vision
Huang et al. Teleoperate system of underwater cleaning robot based on HUD
CN117289796B (en) High-interaction mixed reality system and method for complex equipment based on haptic glove

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23736989

Country of ref document: EP

Kind code of ref document: A1