CN213938189U - Non-blind area remote control system based on mixed reality technology and engineering vehicle - Google Patents
- Publication number
- CN213938189U CN213938189U CN202022275198.8U CN202022275198U CN213938189U CN 213938189 U CN213938189 U CN 213938189U CN 202022275198 U CN202022275198 U CN 202022275198U CN 213938189 U CN213938189 U CN 213938189U
- Authority
- CN
- China
- Prior art keywords
- video
- mixed reality
- module
- processor
- transmitting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Abstract
An embodiment of the utility model provides a blind-area-free remote control system based on mixed reality technology, and a vehicle. The remote control system includes: a camera array, comprising a plurality of cameras installed around the controlled equipment, for capturing blind-area-free video of the surroundings of the controlled equipment; a field controller, connected to the video output of the camera array, for receiving the video captured by the camera array, transmitting it to remote mixed reality glasses, receiving an operation signal transmitted by the remote controller, and controlling the operation of the controlled equipment; mixed reality glasses, for remotely receiving the video transmitted by the field controller and stitching it into a panoramic video for display; a remote operation component, for generating the operation signal, which the operator in the remote cockpit manipulates according to the panoramic video to control the controlled equipment; and a remote controller, for collecting the operation signal from the remote operation component and transmitting it to the field controller.
Description
Technical Field
The utility model relates to the technical field of remote control, and in particular to a blind-area-free remote control system based on mixed reality technology and an engineering vehicle that uses such a blind-area-free remote control system.
Background
MR (Mixed Reality) refers to a new visualization environment produced by merging the real and virtual worlds. Mixed reality technology is a further development of virtual reality technology: by introducing real-scene information into the virtual environment, it builds an interactive feedback loop among the virtual world, the real world and the user, thereby enhancing the realism of the user experience.
In many situations with harsh construction environments, such as radioactive or flammable and explosive environments, driving an engineering vehicle to work on site poses a great threat to the driver's life. For such scenarios, remotely controllable engineering vehicle systems and devices for viewing camera images based on virtual reality technology have appeared, but they are rudimentary and do not make full use of the advantages of these technologies.
The patent with application number 201720266098.4 discloses a tower crane controlled on the basis of virtual reality technology, comprising a tower crane and a ground control room. A data acquisition module and an image acquisition module are arranged on the tower crane, while a power supply, a main controller, an equipment controller, a VR (Virtual Reality) head-mounted display and an intelligent seat are arranged in the ground control room. The data acquisition module is arranged at the junction of the tower crane body and the wire rope at the top of the tower crane. The image acquisition module consists of a horizontally rotatable camera and two sub-cameras: the horizontally rotatable camera is arranged at the junction of the tower boom and the tower crane body, one sub-camera is arranged at the bottom of the trolley of the tower boom with its lens facing downward, and the other sub-camera is arranged at the junction of the tower crane body and the wire rope at the top of the tower crane with its lens tilted downward.
That scheme proposes operating from a separate ground control room and using a VR head-mounted display to perceive the site environment. The separate ground control room can effectively ensure the operator's safety, but the VR head-mounted display is fully enclosed and completely blocks the user's view of the real scene, so the user cannot operate the buttons, joysticks and other controls in the ground control room while wearing it. On the other hand, the image acquisition module in that scheme consists only of the horizontally rotatable camera at the junction of the tower boom and the tower crane body, the sub-camera at the bottom of the trolley of the tower boom with its lens facing downward, and the sub-camera at the junction of the tower crane body and the wire rope at the top of the tower crane with its lens tilted downward. For the operator, these cameras can be blocked by the structure of the engineering vehicle, so visual blind areas exist, accidents easily occur, and losses follow.
SUMMARY OF THE UTILITY MODEL
An embodiment of the utility model provides a blind-area-free remote control system based on mixed reality technology, and an engineering vehicle. The remote control system uses a camera array installed around the controlled equipment to capture video of the equipment's surroundings and cover them completely, so the operator can obtain real-time, blind-area-free images of the area around the controlled equipment, and the controlled equipment itself does not need to rotate for the operator to view its sides or rear. The remote operation components are deployed in a safe area near the construction site, and the operator wears mixed reality glasses to view real-time video of the surroundings of the controlled equipment. Because the mixed reality glasses have a transparent lens body, they do not hinder the operator's handling of the remote operation components, interference with operation is small, and remote control of the controlled equipment is achieved.
To achieve the above object, a first aspect of the utility model provides a blind-area-free remote control system based on mixed reality technology, the remote control system comprising:
the camera array comprises a plurality of cameras arranged on the periphery of the controlled equipment and is used for collecting videos on the periphery of the controlled equipment without blind areas;
the field controller is connected with the video output end of the camera array and used for receiving the video collected by the camera array, transmitting the video to the far-end mixed reality glasses and controlling the controlled equipment to work according to an operation signal received from the remote controller;
the mixed reality glasses are used for remotely receiving the videos transmitted by the field controller and splicing the videos into a panoramic video to be displayed;
the remote operation part is used for generating the operation signal under the operation of an operator, and the operator operates the remote operation part according to the panoramic video;
the remote controller is used for collecting the operation signal from the remote operation part and transmitting the operation signal to the field controller.
Optionally, the remote control system further includes a large display screen, and the large display screen is connected to the site controller and is configured to remotely receive and display the video transmitted by the site controller.
Optionally, the remote control system further includes a controlled device information sensor, where the controlled device information sensor is installed on the controlled device, and is used to collect real-time working state information of the controlled device and transmit the real-time working state information to the field controller;
the field controller is also used for receiving the real-time working state information and transmitting the real-time working state information to the far-end mixed reality glasses.
Further, the controlled device information sensor at least comprises a satellite positioning sensor, a direction sensor and an inclination sensor. The sensor collects the real-time working state information of the controlled equipment and provides a judgment basis of the real-time state of the controlled equipment for the remote operator.
Optionally, the remote operation part comprises a joystick, a button, a switch and a meter which are the same as those of the controlled device control room. The remote operation part adopts an operation part consistent with that on the controlled equipment, and an operator operates a control lever, a button, a switch and the like to generate an operation signal, the operation signal is transmitted to a field controller on the controlled equipment through the remote controller, and the field controller controls the controlled equipment to execute, so that the remote control is realized.
Optionally, the mixed reality glasses at least include: the system comprises a processor, a video splicing module and a display module;
the processor is used for receiving the video transmitted by the field controller and transmitting the video to the video splicing module; and receiving the panoramic video from the video stitching module and transmitting to the display module;
the video splicing module is connected with a first path of data input/output end of the processor and is used for receiving the video transmitted by the processor, splicing the received video to generate the panoramic video and transmitting the panoramic video back to the processor;
and the control end of the display module is connected with the display control end of the processor and is used for receiving and displaying the panoramic video transmitted by the processor. The mixed reality glasses splice the videos into panoramic videos to be displayed, and an operator can view real-time video pictures in a holographic display mode.
Further, the video stitching module comprises:
the distortion correction module is used for carrying out distortion correction on the received video according to the camera calibration parameters;
the panoramic stitching module is used for carrying out panoramic stitching on the video subjected to distortion correction;
and the spherical projection transformation module is used for carrying out spherical projection transformation on the images after the panorama splicing to obtain the finally displayed panoramic video. And the received video is displayed through mixed reality glasses after distortion correction, splicing and spherical projection transformation.
Furthermore, the mixed reality glasses further comprise a three-axis acceleration sensor and a video number calculation module;
the data output end of the triaxial acceleration sensor is connected with a first path of data receiving end of the processor and used for acquiring the rotation direction data and the angle data of the mixed reality glasses and transmitting the rotation direction data and the angle data to the processor;
the processor is also used for receiving the rotation direction data and the angle data of the mixed reality glasses and transmitting the data to the video number calculation module; receiving the number of the video from the video number calculation module, and transmitting the video corresponding to the received number to the video splicing module;
the video number calculation module is connected with the second path of data input and output end of the processor and used for calculating the number of the video collected by the camera which can be watched by the mixed reality glasses under the current rotation angle according to the rotation direction data and the angle data of the mixed reality glasses, the visual angle of the mixed reality glasses, the installation direction of the camera and the visual angle of the camera, and transmitting the calculated number of the video back to the processor. The three-axis acceleration sensor captures the displacement and rotation of the wearer's head; the video number calculation module then determines, from the angle the head has turned to, which cameras are in view and thus which video numbers are needed; the processor transmits the videos with the corresponding numbers to the video stitching module, and the stitched result is displayed through the mixed reality glasses. The viewing angle therefore switches in real time as the operator turns his or her head, which matches intuitive perception, approximates a real observation experience, and can reduce dizziness and similar discomfort caused by visual conflict.
A second aspect of the utility model provides an engineering vehicle equipped with the above blind-area-free remote control system based on mixed reality technology. The engineering vehicle can be remotely controlled, and the operator can view real-time video of the surroundings of the controlled engineering equipment by wearing the mixed reality glasses, which effectively protects the driver's life in situations with harsh construction environments.
According to the above technical scheme, the remote control system uses a camera array installed around the controlled equipment to capture video of the equipment's surroundings and cover them completely, so the operator can obtain real-time, blind-area-free images of the area around the controlled equipment, and the controlled equipment does not need to rotate for the operator to view its sides or rear. The remote operation components are deployed in a safe area near the construction site, and the operator wears mixed reality glasses to view real-time video of the surroundings of the controlled equipment; because the mixed reality glasses have a transparent lens body, they do not hinder the operator's handling of the remote operation components, interference with operation is small, and remote control of the controlled equipment is achieved.
Other features and advantages of embodiments of the present invention will be described in detail in the detailed description which follows.
Drawings
The accompanying drawings, which are included to provide a further understanding of the embodiments of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the embodiments of the invention, but do not constitute a limitation of the embodiments of the invention. In the drawings:
fig. 1 is a block diagram of a remote control system according to a first embodiment of the present invention;
fig. 2 is a block diagram of a remote control system according to a second embodiment of the present invention.
Detailed Description
The following describes in detail embodiments of the present invention with reference to the accompanying drawings. It is to be understood that the description herein is only intended to illustrate and explain embodiments of the present invention, and is not intended to limit embodiments of the present invention.
The terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
In the description of the present utility model, it should also be noted that, unless otherwise explicitly specified or limited, the terms "disposed," "mounted," "connected," and "coupled" are to be construed broadly, for example, as a fixed connection, a removable connection, or an integral connection; as a direct connection, an indirect connection through an intermediate medium, or internal communication between two elements. The specific meaning of the above terms in the utility model can be understood by those skilled in the art according to the specific circumstances.
Example one
Referring to fig. 1, the present embodiment provides a blind area-free remote control system based on a mixed reality technology, the remote control system including:
the camera array comprises a plurality of cameras arranged on the periphery of the controlled equipment and is used for collecting videos on the periphery of the controlled equipment without blind areas;
the field controller is connected with the video output end of the camera array and used for receiving the video collected by the camera array, transmitting the video to the far-end mixed reality glasses and controlling the controlled equipment to work according to an operation signal received from the remote controller;
the mixed reality glasses are used for remotely receiving the videos transmitted by the field controller and splicing the videos into a panoramic video to be displayed;
the remote operation part is used for generating the operation signal under the operation of an operator, and the operator operates the remote operation part according to the panoramic video;
the remote controller is used for collecting the operation signal from the remote operation part and transmitting the operation signal to the field controller.
Optionally, the remote control system further includes a first communication module and a second communication module;
the control end of the first communication module is connected with the communication control end of the field controller and is used for sending the video transmitted by the field controller to the second communication module; receiving the operation signal sent by the second communication module and transmitting the operation signal to the field controller;
the control end of the second communication module is connected with the communication control end of the remote controller and is used for receiving the video sent by the first communication module and transmitting the video to the mixed reality glasses; and transmitting the operation signal transmitted by the remote controller to the first communication module. The field terminal and the remote terminal exchange data and control signals through the first communication module and the second communication module, so that the field video can be remotely watched and the remote control can be realized. The first communication module and the second communication module preferably adopt wireless communication modules, wireless communication does not need to be wired, and the controlled equipment can move freely and is not limited by communication lines.
Optionally, the remote control system further includes a controlled device information sensor, where the controlled device information sensor is installed on the controlled device, and is used to collect real-time working state information of the controlled device and transmit the real-time working state information to the field controller;
the field controller is also used for receiving the real-time working state information and transmitting the real-time working state information to the far-end mixed reality glasses.
Further, the controlled device information sensor at least comprises a satellite positioning sensor, a direction sensor and an inclination sensor. The sensor collects the real-time working state information of the controlled equipment and provides a judgment basis of the real-time state of the controlled equipment for the remote operator.
Optionally, the remote operation part comprises a joystick, a button, a switch and a meter which are the same as those of the controlled device control room. The remote operation part adopts an operation part consistent with that on the controlled equipment, and an operator operates a control lever, a button, a switch and the like to generate an operation signal, the operation signal is transmitted to a field controller on the controlled equipment through the remote controller, and the field controller controls the controlled equipment to execute, so that the remote control is realized.
Optionally, the mixed reality glasses at least include: the system comprises a processor, a video splicing module and a display module;
the processor is used for receiving the video transmitted by the second communication module and transmitting the video to the video splicing module; and receiving the panoramic video from the video stitching module and transmitting to the display module;
the video splicing module is connected with a first path of data input/output end of the processor and is used for receiving the video transmitted by the processor, splicing the received video to generate the panoramic video and transmitting the panoramic video back to the processor;
and the control end of the display module is connected with the display control end of the processor and is used for receiving and displaying the panoramic video transmitted by the processor. The mixed reality glasses splice the videos into panoramic videos to be displayed, and an operator can view real-time video pictures in a holographic display mode.
Further, the video stitching module comprises:
the distortion correction module is used for carrying out distortion correction on the received video according to the camera calibration parameters;
the panoramic stitching module is used for carrying out panoramic stitching on the video subjected to distortion correction;
and the spherical projection transformation module is used for carrying out spherical projection transformation on the images after the panorama splicing to obtain the finally displayed panoramic video. And the received video is displayed through mixed reality glasses after distortion correction, splicing and spherical projection transformation.
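For illustration only, the following Python sketch shows one way such a pipeline could be assembled from off-the-shelf OpenCV primitives, assuming pre-measured calibration parameters (a camera matrix and distortion coefficients) per camera; the function names, the use of OpenCV's built-in stitcher and the simplified spherical remapping are assumptions of this sketch, not details taken from the utility model.

```python
import cv2
import numpy as np

def undistort_frame(frame, camera_matrix, dist_coeffs):
    # Distortion correction using the camera's calibration parameters.
    return cv2.undistort(frame, camera_matrix, dist_coeffs)

def stitch_frames(frames):
    # Panoramic stitching of the corrected frames; OpenCV's generic
    # stitcher stands in for the panoramic stitching module.
    stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
    status, panorama = stitcher.stitch(frames)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama

def spherical_projection(panorama, focal_px):
    # Simplified spherical projection transformation: remap the flat
    # panorama onto a sphere with focal length focal_px (in pixels).
    h, w = panorama.shape[:2]
    ys, xs = np.indices((h, w), dtype=np.float32)
    theta = (xs - w / 2) / focal_px   # longitude of each output pixel
    phi = (ys - h / 2) / focal_px     # latitude of each output pixel
    map_x = (focal_px * np.tan(theta) + w / 2).astype(np.float32)
    map_y = (focal_px * np.tan(phi) / np.cos(theta) + h / 2).astype(np.float32)
    return cv2.remap(panorama, map_x, map_y, cv2.INTER_LINEAR)

def build_panorama(frames, calib, focal_px=800.0):
    # calib is a list of (camera_matrix, dist_coeffs) pairs, one per frame.
    corrected = [undistort_frame(f, K, d) for f, (K, d) in zip(frames, calib)]
    return spherical_projection(stitch_frames(corrected), focal_px)
```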
Furthermore, the mixed reality glasses further comprise a three-axis acceleration sensor and a video number calculation module;
the data output end of the triaxial acceleration sensor is connected with a first path of data receiving end of the processor and used for acquiring the rotation direction data and the angle data of the mixed reality glasses and transmitting the rotation direction data and the angle data to the processor;
the processor is also used for receiving the rotation direction data and the angle data of the mixed reality glasses and transmitting the data to the video number calculation module; receiving the number of the video from the video number calculation module, and transmitting the video corresponding to the received number to the video splicing module;
the video number calculation module is connected with the second path of data input and output end of the processor and used for calculating the number of the video collected by the camera which can be watched by the mixed reality glasses under the current rotation angle according to the rotation direction data and the angle data of the mixed reality glasses, the visual angle of the mixed reality glasses, the installation direction of the camera and the visual angle of the camera, and transmitting the calculated number of the video back to the processor. The three-axis acceleration sensor captures the displacement and rotation of the wearer's head; the video number calculation module then determines, from the angle the head has turned to, which cameras are in view and thus which video numbers are needed; the processor transmits the videos with the corresponding numbers to the video stitching module, and the stitched result is displayed through the mixed reality glasses. The viewing angle therefore switches in real time as the operator turns his or her head, which matches intuitive perception, approximates a real observation experience, and can reduce dizziness and similar discomfort caused by visual conflict.
In this embodiment, the field controller is the controlled equipment's own controller, and the remote controller may be the same model as the controller of the controlled equipment or any other existing controller; specific models are not listed here. The mixed reality glasses may be Microsoft HoloLens 2 or other mixed reality glasses. The first communication module and the second communication module are preferably wireless communication modules; in this embodiment, 5G communication modules are selected. The processor, the display module and the three-axis acceleration sensor are those already built into the mixed reality glasses. The video stitching module performs panoramic video stitching; its distortion correction module, panoramic stitching module and spherical projection transformation module are ported from the prior art. The video number calculation module obtains the corresponding camera numbers through a simple calculation based on the viewing-angle range of the mixed reality glasses, the installation directions of the cameras, the viewing-angle range of the cameras, and the current angle data of the mixed reality glasses. Generally, the horizontal field of view of mixed reality glasses is 30 to 43 degrees, and the horizontal field of view of the cameras used in this embodiment is 45 to 70 degrees, so the horizontal field of view of the mixed reality glasses spans at most two cameras at any moment. Since the aspect ratio of both the glasses and the cameras is typically 16:9 or close to it, the difference is small, and similarly at most two camera pictures are included in the vertical direction at any moment.
In one specific example of this embodiment, the field of view of the mixed reality glasses is 30° and the field of view of each camera is 50°. During operation, the camera array captures video around the controlled equipment and transmits it to the field controller; the field controller sends the video to the remote end through the first 5G communication module, and the second 5G communication module at the remote end receives the video signal and passes it to the mixed reality glasses. Meanwhile, the three-axis acceleration sensor of the mixed reality glasses collects the current position data of the wearer's head and sends it to the processor; the processor passes the position data to the video number calculation module, which calculates the video numbers and returns them to the processor; the processor then sends the videos with the corresponding numbers to the video stitching module, where they are stitched into a panoramic video that is displayed in the mixed reality glasses. The operator observes the surroundings of the controlled equipment from the video and operates the remote operation components to generate operation signals, which are transmitted through the remote controller, the second 5G communication module and the first 5G communication module to the field controller, and the field controller controls the controlled equipment to act according to the operation signals.
The video number calculation and stitching process is as follows. Suppose the current position data of the wearer's head is 5 degrees, camera one is the front-facing camera with a viewing-angle range of -25 degrees to 25 degrees, and camera two is the camera at the 45-degree position with a viewing-angle range of 20 degrees to 70 degrees. The mixed reality glasses can currently see the range from -10 degrees to 20 degrees. Because this range lies entirely within the field of view of camera one, only the image from camera one can be seen at this moment, and the video stitching module only needs to apply the spherical projection transformation to the image from camera one and then display the transformed image.
In another case, suppose the current position data of the wearer's head is 10 degrees, so the range the mixed reality glasses can currently see is -5 degrees to 25 degrees. Part of this range falls within the field of view of camera one and the rest within the field of view of camera two, so the images from both camera one and camera two can be seen. The video stitching module then needs to take one frame from each of camera one and camera two, perform distortion correction on the frames according to the camera calibration parameters, stitch the corrected frames, apply the spherical projection transformation to the stitched image, and finally display the transformed image.
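As a minimal sketch only, the following Python snippet reproduces this video number calculation under the assumptions of the example above (glasses with a 30-degree horizontal field of view, cameras with a 50-degree field of view, and cameras numbered by installation direction at 45-degree intervals); the eight-camera layout and the function names are illustrative, not taken from the utility model.

```python
GLASSES_FOV = 30.0   # horizontal field of view of the glasses, degrees
CAMERA_FOV = 50.0    # horizontal field of view of each camera, degrees

# Assumed layout: camera number -> installation direction in degrees.
CAMERA_HEADINGS = {1: 0.0, 2: 45.0, 3: 90.0, 4: 135.0,
                   5: 180.0, 6: 225.0, 7: 270.0, 8: 315.0}

def wrap(angle):
    """Normalize an angle to the range [-180, 180)."""
    return (angle + 180.0) % 360.0 - 180.0

def visible_video_numbers(head_angle):
    """Return the numbers of the videos whose camera field of view
    overlaps the glasses' current field of view (at most two here)."""
    lo = head_angle - GLASSES_FOV / 2
    hi = head_angle + GLASSES_FOV / 2
    numbers = []
    for number, heading in CAMERA_HEADINGS.items():
        cam_lo = heading - CAMERA_FOV / 2
        cam_hi = heading + CAMERA_FOV / 2
        # Overlap test on the wrapped angular intervals.
        if wrap(lo - cam_hi) < 0 < wrap(hi - cam_lo):
            numbers.append(number)
    return numbers

print(visible_video_numbers(5.0))   # [1]    -> only camera one, no stitching
print(visible_video_numbers(10.0))  # [1, 2] -> frames from both are stitched
```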
Example two
Referring to fig. 2, the present embodiment provides a blind area-free remote control system based on a mixed reality technology, the remote control system including:
the camera array comprises a plurality of cameras arranged on the periphery of the controlled equipment and is used for collecting videos on the periphery of the controlled equipment without blind areas;
the field controller is connected with the video output end of the camera array and used for receiving the video collected by the camera array, transmitting the video to the far-end mixed reality glasses and controlling the controlled equipment to work according to an operation signal received from the remote controller;
the mixed reality glasses are used for remotely receiving the videos transmitted by the field controller and splicing the videos into a panoramic video to be displayed;
the remote operation part is used for generating the operation signal under the operation of an operator, and the operator operates the remote operation part according to the panoramic video;
the remote controller is used for collecting the operation signal from the remote operation part and transmitting the operation signal to the field controller.
The remote control system further comprises a large display screen, and the large display screen is connected with the field controller and used for remotely receiving and displaying the video transmitted by the field controller.
The large display screen is provided so that assisting personnel can watch the current surroundings of the controlled equipment; it can synchronously display the videos captured by all cameras in split-screen mode, which makes it easy to observe the overall situation. Because the lenses of the mixed reality glasses are transparent, the operator can also communicate with the assisting personnel without obstruction.
Optionally, the remote control system further includes a third communication module, a fourth communication module, and a fifth communication module;
the control end of the third communication module is connected with the communication control end of the field controller and is used for sending the video transmitted by the field controller to the fourth communication module and the fifth communication module; receiving the operation signal sent by the fourth communication module and transmitting the operation signal to the field controller;
the control end of the fourth communication module is connected with the communication control end of the remote controller and is used for receiving the video sent by the third communication module and transmitting the video to the mixed reality glasses; and transmitting the operation signal transmitted by the remote controller to the third communication module;
and the control end of the fifth communication module is connected with the video receiving end of the large display screen and is used for receiving the video sent by the third communication module and transmitting the video to the large display screen. The controlled equipment end, the remote end and the large display screen exchange data and control signals through the third communication module, the fourth communication module and the fifth communication module, so that the remote watching of the site video and the remote control are realized. The third communication module, the fourth communication module and the fifth communication module preferably adopt wireless communication modules, wireless communication does not need wiring, and controlled equipment can move freely and is not limited by communication lines.
Optionally, the remote control system further includes a controlled device information sensor, where the controlled device information sensor is installed on the controlled device, and is used to collect real-time working state information of the controlled device and transmit the real-time working state information to the field controller;
the field controller is also used for receiving the real-time working state information and transmitting the real-time working state information to the far-end mixed reality glasses.
Further, the controlled device information sensor at least comprises a satellite positioning sensor, a direction sensor and an inclination sensor. The sensor collects the real-time working state information of the controlled equipment and provides a judgment basis of the real-time state of the controlled equipment for the remote operator.
Optionally, the remote operation part comprises a joystick, a button, a switch and a meter which are the same as those of the controlled device control room. The remote operation part adopts an operation part consistent with that on the controlled equipment, and an operator operates a control lever, a button, a switch and the like to generate an operation signal, the operation signal is transmitted to a field controller on the controlled equipment through the remote controller, and the field controller controls the controlled equipment to execute, so that the remote control is realized.
Optionally, the mixed reality glasses at least include: the system comprises a processor, a video splicing module and a display module;
the processor is used for receiving the video transmitted by the fourth communication module and transmitting the video to the video splicing module; and receiving the panoramic video from the video stitching module and transmitting to the display module;
the video splicing module is connected with a first path of data input/output end of the processor and is used for receiving the video transmitted by the processor, splicing the received video to generate the panoramic video and transmitting the panoramic video back to the processor;
and the control end of the display module is connected with the display control end of the processor and is used for receiving and displaying the panoramic video transmitted by the processor. The mixed reality glasses splice videos into panoramic videos to be displayed, and an operator can view real-time video pictures in a holographic display mode.
The video stitching module comprises:
the distortion correction module is used for carrying out distortion correction on the received video according to the camera calibration parameters;
the panoramic stitching module is used for carrying out panoramic stitching on the video subjected to distortion correction;
and the spherical projection transformation module is used for carrying out spherical projection transformation on the images after the panorama splicing to obtain the finally displayed panoramic video. And the received video is displayed through mixed reality glasses after distortion correction, splicing and spherical projection transformation.
Furthermore, the mixed reality glasses further comprise a three-axis acceleration sensor and a video number calculation module;
the data output end of the triaxial acceleration sensor is connected with a first path of data receiving end of the processor and used for acquiring the rotation direction data and the angle data of the mixed reality glasses and transmitting the rotation direction data and the angle data to the processor;
the processor is also used for receiving the rotation direction data and the angle data of the mixed reality glasses and transmitting the data to the video number calculation module; receiving the number of the video from the video number calculation module, and transmitting the video corresponding to the received number to the video splicing module;
the video number calculation module is connected with the second path of data input and output end of the processor and used for calculating the number of the video collected by the camera which can be watched by the mixed reality glasses under the current rotation angle according to the rotation direction data and the angle data of the mixed reality glasses, the visual angle of the mixed reality glasses, the installation direction of the camera and the visual angle of the camera, and transmitting the calculated number of the video back to the processor. The three-axis acceleration sensor captures the displacement and rotation of the wearer's head; the video number calculation module then determines, from the angle the head has turned to, which cameras are in view and thus which video numbers are needed; the processor transmits the videos with the corresponding numbers to the video stitching module, and the stitched result is displayed through the mixed reality glasses. The viewing angle therefore switches in real time as the operator turns his or her head, which matches intuitive perception, approximates a real observation experience, and can reduce dizziness and similar discomfort caused by visual conflict.
Compared with the first embodiment, this embodiment adds a large display screen for large-screen display. The third communication module, the fourth communication module and the fifth communication module all use 5G communication modules; the video sent by the field controller through the third 5G communication module is received by the fifth 5G communication module and then displayed on the large display screen, so that assisting personnel can conveniently watch video of the site environment.
The large display screen can use split-screen technology to display the videos captured by all cameras on the same large screen. Video switching on the large display screen is controlled independently, so assisting personnel can conveniently watch views different from those of the operator and make up for the limitations of the operator's viewing angle.
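As an illustration, the split-screen display described above could be assembled as in the following Python sketch, which tiles one frame per camera into a single grid image for the large screen; the grid size, tile size and function name are assumptions, and the frames are assumed to be OpenCV BGR images.

```python
import cv2
import numpy as np

def split_screen_mosaic(frames, cols=3, tile_size=(640, 360)):
    """Tile one frame per camera into a single grid image for the
    large display screen; unused cells are left black."""
    rows = -(-len(frames) // cols)  # ceiling division
    tile_w, tile_h = tile_size
    canvas = np.zeros((rows * tile_h, cols * tile_w, 3), dtype=np.uint8)
    for idx, frame in enumerate(frames):
        r, c = divmod(idx, cols)
        tile = cv2.resize(frame, (tile_w, tile_h))
        canvas[r * tile_h:(r + 1) * tile_h,
               c * tile_w:(c + 1) * tile_w] = tile
    return canvas
```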
It should be noted that the controlled equipment in the blind-area-free remote control system based on mixed reality technology provided by the utility model may be an engineering vehicle, engineering machinery, or other equipment that requires remote control.
The utility model also provides an engineering vehicle equipped with the above blind-area-free remote control system based on mixed reality technology. The engineering vehicle can be remotely controlled, and the operator can view real-time video of the surroundings of the engineering vehicle by wearing the mixed reality glasses, which effectively protects the driver's life in situations with harsh construction environments.
The above describes in detail optional implementation manners of embodiments of the present invention with reference to the accompanying drawings, however, the embodiments of the present invention are not limited to the details in the above implementation manners, and in the technical concept scope of the embodiments of the present invention, it is possible to perform various simple modifications on the technical solutions of the embodiments of the present invention, and these simple modifications all belong to the protection scope of the embodiments of the present invention.
It should be noted that the various features described in the above embodiments may be combined in any suitable manner without departing from the scope of the invention. In order to avoid unnecessary repetition, the embodiments of the present invention do not separately describe various possible combinations.
Those skilled in the art will understand that all or part of the steps in the methods of the above embodiments may be implemented by a program stored in a storage medium, the program including several instructions that enable a single-chip microcomputer, a chip or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
In addition, various different implementation manners of the embodiments of the present invention can be combined arbitrarily, and as long as it does not violate the idea of the embodiments of the present invention, it should be considered as the disclosure of the embodiments of the present invention.
Claims (9)
1. A non-blind area remote control system based on mixed reality technology, characterized in that, the remote control system includes:
the camera array comprises a plurality of cameras arranged on the periphery of the controlled equipment and is used for collecting videos on the periphery of the controlled equipment without blind areas;
the field controller is connected with the video output end of the camera array and used for receiving the video collected by the camera array, transmitting the video to the far-end mixed reality glasses and controlling the controlled equipment to work according to an operation signal received from the remote controller;
the mixed reality glasses are used for remotely receiving the videos transmitted by the field controller and splicing the videos into a panoramic video to be displayed;
the remote operation part is used for generating the operation signal under the operation of an operator, and the operator operates the remote operation part according to the panoramic video;
the remote controller is used for collecting the operation signal from the remote operation part and transmitting the operation signal to the field controller.
2. The blind-area-free remote control system based on the mixed reality technology as claimed in claim 1, further comprising a large display screen connected to the site controller for remotely receiving and displaying the video transmitted by the site controller.
3. The blind-area-free remote control system based on the mixed reality technology as claimed in claim 1, further comprising a controlled device information sensor, wherein the controlled device information sensor is installed on the controlled device and is used for collecting real-time working state information of the controlled device and transmitting the real-time working state information to the field controller;
the field controller is also used for receiving the real-time working state information and transmitting the real-time working state information to the far-end mixed reality glasses.
4. The mixed reality technology-based blind area-free remote control system according to claim 3, wherein the controlled device information sensor at least comprises a satellite positioning sensor, a direction sensor and a tilt sensor.
5. The mixed reality technology-based blind area-free remote control system according to claim 1, wherein the remote operation components include joysticks, buttons, switches and meters identical to a controlled device control room.
6. The mixed reality technology-based blind area-free remote control system according to claim 2, wherein the mixed reality glasses at least comprise: the system comprises a processor, a video splicing module and a display module;
the processor is used for receiving the video transmitted by the field controller and transmitting the video to the video splicing module; and receiving the panoramic video from the video stitching module and transmitting to the display module;
the video splicing module is connected with a first path of data input/output end of the processor and is used for receiving the video transmitted by the processor, splicing the received video to generate the panoramic video and transmitting the panoramic video back to the processor;
and the control end of the display module is connected with the display control end of the processor and is used for receiving and displaying the panoramic video transmitted by the processor.
7. The mixed reality technology-based blind area-free remote control system according to claim 6, wherein the video stitching module comprises:
the distortion correction module is used for carrying out distortion correction on the received video according to the camera calibration parameters;
the panoramic stitching module is used for carrying out panoramic stitching on the video subjected to distortion correction;
and the spherical projection transformation module is used for carrying out spherical projection transformation on the images after the panorama splicing to obtain the finally displayed panoramic video.
8. The mixed reality technology-based blind area-free remote control system according to claim 7, wherein the mixed reality glasses further comprise a three-axis acceleration sensor and a video number calculation module;
the data output end of the triaxial acceleration sensor is connected with a first path of data receiving end of the processor and used for acquiring the rotation direction data and the angle data of the mixed reality glasses and transmitting the rotation direction data and the angle data to the processor;
the processor is also used for receiving the rotation direction data and the angle data of the mixed reality glasses and transmitting the data to the video number calculation module; receiving the number of the video from the video number calculation module, and transmitting the video corresponding to the received number to the video splicing module;
the video number calculation module is connected with the second path of data input and output end of the processor and used for calculating the number of the video collected by the camera which can be watched by the mixed reality glasses under the current rotation angle according to the rotation direction data and the angle data of the mixed reality glasses, the visual angle of the mixed reality glasses, the installation direction of the camera and the visual angle of the camera, and transmitting the calculated number of the video back to the processor.
9. An engineering vehicle equipped with a blind-area-free remote control system based on mixed reality technology according to any one of claims 1 to 8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202022275198.8U CN213938189U (en) | 2020-10-13 | 2020-10-13 | Non-blind area remote control system based on mixed reality technology and engineering vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202022275198.8U CN213938189U (en) | 2020-10-13 | 2020-10-13 | Non-blind area remote control system based on mixed reality technology and engineering vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN213938189U true CN213938189U (en) | 2021-08-10 |
Family
ID=77162268
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202022275198.8U Active CN213938189U (en) | 2020-10-13 | 2020-10-13 | Non-blind area remote control system based on mixed reality technology and engineering vehicle |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN213938189U (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114415828A (en) * | 2021-12-27 | 2022-04-29 | 北京五八信息技术有限公司 | Method and device for remotely checking vehicle based on augmented reality |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
GR01 | Patent grant | ||