CN115076561A - Tele-immersion type binocular holder follow-up system and method applied to engineering machinery - Google Patents


Info

Publication number
CN115076561A
CN115076561A (application CN202210551231.6A)
Authority
CN
China
Prior art keywords
follow
head
user
module
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210551231.6A
Other languages
Chinese (zh)
Inventor
华长春
魏饶
丁伟利
穆殿瑞
张波
陈智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yanshan University
Original Assignee
Yanshan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yanshan University
Priority to CN202210551231.6A
Publication of CN115076561A
Legal status: Pending

Classifications

    • F: MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F16: ENGINEERING ELEMENTS AND UNITS; GENERAL MEASURES FOR PRODUCING AND MAINTAINING EFFECTIVE FUNCTIONING OF MACHINES OR INSTALLATIONS; THERMAL INSULATION IN GENERAL
    • F16M: FRAMES, CASINGS OR BEDS OF ENGINES, MACHINES OR APPARATUS, NOT SPECIFIC TO ENGINES, MACHINES OR APPARATUS PROVIDED FOR ELSEWHERE; STANDS; SUPPORTS
    • F16M11/00: Stands or trestles as supports for apparatus or articles placed thereon; Stands for scientific apparatus such as gravitational force meters
    • F16M11/02: Heads
    • F16M11/04: Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand
    • F16M11/06: Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand, allowing pivoting
    • F16M11/12: Means for attachment of apparatus; Means allowing adjustment of the apparatus relatively to the stand, allowing pivoting in more than one direction
    • F16M11/18: Heads with mechanism for moving the apparatus relatively to the stand
    • F16M11/42: Stands or trestles as supports for apparatus or articles placed thereon, with arrangement for propelling the support stands on wheels

Abstract

The invention provides a tele-immersive binocular pan-tilt follow-up system and method applied to engineering machinery. The system comprises a remote follow-up device, a user immersive central control operation room, and a signal wireless transmission system. The remote follow-up device comprises a main control module, several variable-focus camera modules, a sound acquisition module, a data compression transmission module, an IMU (Inertial Measurement Unit), a two-degree-of-freedom pan-tilt, and an annular hanging rail. The user immersive central control operation room comprises an annular mute cabin, a follow-up projection module, a user head posture and gaze acquisition module, a sound box, and a processor. The signal wireless transmission system provides network communication and can be a pair of high-power network bridges. The two-degree-of-freedom pan-tilt is mounted on the annular hanging rail, so it can move around occlusions and provide a wider field of view. The user can observe the surrounding environment while watching the video, and the 3D projector mounted on the six-degree-of-freedom pan-tilt can present images from various angles, giving the user a stronger sense of presence on site.

Description

Tele-immersion type binocular holder follow-up system and method applied to engineering machinery
Technical Field
The invention belongs to the fields of follow-up system technology, video monitoring, and virtual reality, and in particular relates to a tele-immersive binocular pan-tilt follow-up system and method suitable for engineering machinery.
Background
With the development of teleoperation technology, more and more engineering machines are operated remotely: large equipment such as excavators and forklifts works in dangerous construction environments under remote teleoperation, and high-altitude equipment such as tower cranes and gantry cranes also carries out construction work by remote teleoperation. The operator must control equipment located in a dangerous environment from a safe working environment, which requires real-time, clear images to be transmitted back to the operator. The pan-tilt is therefore very important in the whole operation process, particularly the clarity and real-time performance of the pictures it acquires; the stereoscopic quality of the picture and the perception of the surrounding environment give the operator a very strong sense of presence on site.
Chinese patent CN113774984A discloses an immersive remote control system and method for an excavator, which introduces a way to operate an excavator immersively: the driver can clearly feel the posture of the excavator through a simulated cockpit mounted on a six-degree-of-freedom platform, which improves the driver's sense of immersion and thereby greatly improves the efficiency and safety of remote operation. However, that method does not consider the benefit that stereoscopic video information would bring to the driver's immersive operation.
Chinese patent CN108093244B discloses a remote follow-up stereoscopic vision system, which introduces a binocular follow-up pan-tilt and adopts a head-mounted display as the video player to provide better stereoscopic pictures for users. However, that system does not consider the dizziness caused to users by wearing the head-mounted display for a long time, nor the blind areas in the field of view caused by occlusion of the pan-tilt.
Disclosure of Invention
The invention aims to provide a tele-immersive binocular pan-tilt follow-up system and method applied to engineering machinery. The annular hanging rail at the bottom of the pan-tilt overcomes the shortcoming that the acquisition area of the pan-tilt is blocked by the equipment, so the cameras can acquire images over a larger range. An IMU is attached to the pan-tilt and feeds back its attitude in real time, providing a basis for perceiving the surrounding environment; the same data are used to remove shake from the pan-tilt images. A camera module built from at least two variable-focus cameras is mounted on the pan-tilt, and a three-dimensional image of the surrounding environment is constructed from the parallax between the cameras together with the IMU data. In the user immersive central control operation room, a 3D projector mounted on a six-degree-of-freedom pan-tilt projects the video information directly in front of the operator, and both the projection area and the image acquisition range of the pan-tilt cameras are controlled by the rotation of the operator's head and eyeballs, so the operator obtains a better immersive sense of presence.
To achieve this purpose, the invention adopts the following technical solution:
a remote immersion type binocular head servo system applied to engineering machinery mainly comprises a remote servo device, a user immersion type central control operation room and a signal wireless transmission system.
The remote follow-up device comprises a main control module, n variable-focus camera modules, a sound acquisition module, a data compression transmission module, an IMU, a two-degree-of-freedom pan-tilt, and an annular hanging rail, where n is at least 2, so that the pan-tilt can avoid occlusions and obtain a wide field of view. The remote high-definition images acquired by the variable-focus camera modules and the sound information acquired by the sound acquisition module are encoded and sent directly by the data compression transmission module to the processor of the user immersive central control operation room. The IMU is integrated with the variable-focus camera modules, and the two-degree-of-freedom pan-tilt is driven by a motor servo system using a high-precision pan-tilt control algorithm, realizing high-precision control of the pan-tilt motion. The main control module, running an embedded operating system, is connected to the two-degree-of-freedom pan-tilt and controls its motion: it changes the pitch and yaw angles of the pan-tilt according to the user's posture, drives the pan-tilt along the annular hanging rail according to the position information, obtains surrounding environment information through the IMU and the cameras, and uploads the environment information to the user operating system to construct the surrounding scene.
the user immersion type central control operation room comprises an annular mute cabin, a follow-up projection module, a user head posture and sight line acquisition module, a sound box and a processor, so that a user can sense a remote operation environment indoors, and the operation process is controlled through the processor in the operation room; the follow-up projection module comprises a 3D projector and a six-degree-of-freedom holder; an operation table is arranged in the annular mute cabin, and a user wears the head posture and sight line acquisition module when working; the processor is used for receiving data acquired by the follow-up projection module and the user head posture and sight acquisition module; the follow-up projection module is hoisted at the top of the annular mute cabin, the white inner wall is convenient for the 3D projector to project the acquired image, in order to ensure that the image is always presented in front of a user, the 3D projector is hoisted on a six-degree-of-freedom cradle head, the position and the angle projected by the 3D projector are changed through the movement of the cradle head, the follow-up of a projection area and the eyeball sight of the user is achieved, and the user acquires a three-dimensional image through the 3D glasses worn by the user; the sight line attitude acquisition module integrates a gyroscope and a camera, determines the head motion attitude and the sight line of a user by using a multi-sensor fusion method, and simultaneously transmits head motion attitude data and sight line data to a main control module and a six-degree-of-freedom pan-tilt of a remote follow-up device through a signal transmission network to control the pan-tilt movement of the remote follow-up device so as to realize the follow-up between the user and the pan-tilt.
3D glasses are further integrated on the user head posture and gaze acquisition module.
The signal wireless transmission system is designed for network communication. The remote follow-up system and the immersive central control operation room are connected to the signal transmission system through network ports, so that all signals in the system are transmitted over the network. These signals comprise the video and sound information transmitted from the remote follow-up device to the immersive central control operation room, and the head posture and gaze information transmitted from the immersive central control operation room to the remote follow-up device.
A tele-immersion binocular head follow-up method applied to engineering machinery comprises the following steps:
Step 1: The variable-focus cameras mounted on the two-degree-of-freedom pan-tilt acquire video information with parallax, and the sound acquisition module acquires sound information. The acquired video and sound are encoded and sent directly by the data compression transmission module, while the IMU acquires the pan-tilt attitude information; all of this information is sent to the processor of the immersive central control operation room through the signal wireless transmission system. After receiving the video signal, that processor decodes the video information; once decoding succeeds, the image information is fused with the IMU attitude information and the images undergo shake-elimination processing, yielding a stable stereoscopic image that is played by the 3D projection equipment, while the sound information is played by the sound box. The user obtains the stereoscopic image through the 3D glasses worn on the head.
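As a concrete illustration of the shake-elimination step, the sketch below shows one common way to remove rotational jitter from a decoded frame using the pan-tilt attitude reported by the IMU: a pure-rotation homography warps the frame back toward a reference orientation. The function name, the example intrinsic matrix, and the assumption that the IMU axes are already aligned with the camera are illustrative only and are not taken from the patent.

import cv2
import numpy as np

def stabilize_frame(frame, rvec_imu, K):
    """Warp `frame` so that the rotational jitter measured by the IMU is removed.

    frame    : BGR image from one of the variable-focus cameras
    rvec_imu : axis-angle rotation (rad) of the camera relative to the reference
               pose, assumed already expressed in the camera frame
    K        : 3x3 camera intrinsic matrix (assumed known from calibration)
    """
    rvec = np.asarray(rvec_imu, dtype=np.float64).reshape(3, 1)
    R, _ = cv2.Rodrigues(rvec)
    # Pure-rotation homography mapping the current view back to the reference view.
    H = K @ R.T @ np.linalg.inv(K)
    h, w = frame.shape[:2]
    return cv2.warpPerspective(frame, H, (w, h))

if __name__ == "__main__":
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])            # example intrinsics, not from the patent
    frame = np.zeros((720, 1280, 3), np.uint8)  # stand-in for a decoded camera frame
    jitter = [0.01, -0.02, 0.0]                 # small roll/pitch jitter from the IMU
    stable = stabilize_frame(frame, jitter, K)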
Step 2: After the processor of the immersive central control operation room has initialized successfully, the image information acquired by the remote follow-up device is presented to the user in real time. The user posture information acquired by the head posture and gaze acquisition module is resolved into head attitude information in the processor by a multi-sensor fusion algorithm and is sent to the six-degree-of-freedom pan-tilt and to the remote follow-up device.
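A minimal sketch of the kind of multi-sensor fusion named in this step is given below: a complementary filter blends fast but drifting gyroscope rates with slower absolute yaw/pitch estimates from the head-tracking camera. The filter structure, gain, and names are assumptions for illustration; the patent only states that a multi-sensor fusion algorithm is used.

import numpy as np

class HeadPoseFuser:
    def __init__(self, alpha=0.98):
        self.alpha = alpha                  # weight of the integrated-gyro path (assumed value)
        self.angles = np.zeros(2)           # [yaw, pitch] in radians

    def update(self, gyro_rates, dt, camera_angles=None):
        """gyro_rates: [yaw_rate, pitch_rate] in rad/s; camera_angles: absolute
        [yaw, pitch] from the head-tracking camera, or None if no new frame."""
        predicted = self.angles + np.asarray(gyro_rates) * dt
        if camera_angles is None:
            self.angles = predicted         # dead-reckon on the gyro alone
        else:
            # Blend: the gyro keeps the estimate responsive, the camera removes drift.
            self.angles = (self.alpha * predicted
                           + (1.0 - self.alpha) * np.asarray(camera_angles))
        return self.angles

fuser = HeadPoseFuser()
yaw, pitch = fuser.update([0.1, 0.0], dt=0.01)                           # gyro-only step
yaw, pitch = fuser.update([0.1, 0.0], dt=0.01, camera_angles=[0.05, 0.0])  # camera correction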
Regarding the pan-tilt motor as a first-order inertia link with time lag (gain K, time constant T, time lag \tau), the transfer function is:

G(s) = \frac{K}{Ts + 1} e^{-\tau s}

which is described in the time domain as:

T \dot{y}(t) + y(t) = K u(t - \tau)

The state-variable realization of the above object is:

\dot{x}(t) = -\frac{1}{T} x(t) + \frac{K}{T} u(t - \tau), \qquad y(t) = x(t)

Treating the time lag as negligible, the transfer relationship becomes:

G(s) = \frac{K}{Ts + 1}

The ADRC-based control algorithm is adopted as follows:

fh = \mathrm{fhan}(v_1 - v(t),\ v_2,\ r_0,\ h_0)
v_1 = v_1 + h v_2
v_2 = v_2 + h \, fh
e = z_1 - y
z_1 = z_1 + h (z_2 + b_0 u) - \beta_{01} e
z_2 = z_2 - \beta_{02} e
e_1 = v_1 - z_1
u = (\beta_1 e_1 - z_2) / b_0
where h is the sampling step, T is the time constant, h_0 is an integer multiple of h, and r_0, \beta_{01}, \beta_{02}, b_0 are all adjustable parameters that must be tuned on the actual system; v_1 is the arranged transition process of the tracking signal, v_2 is the derivative of v_1, z_1 is the observed value of the output y, and z_2 is the observed value of the total disturbance.
wherein fhan is the fastest-tracking synthesis function; for arguments \mathrm{fhan}(x_1, x_2, r_0, h_0) it is given by:

d = r_0 h_0, \quad d_0 = h_0 d, \quad y = x_1 + h_0 x_2, \quad a_0 = \sqrt{d^2 + 8 r_0 |y|}

a = \begin{cases} x_2 + \frac{a_0 - d}{2} \operatorname{sign}(y), & |y| > d_0 \\ x_2 + \frac{y}{h_0}, & |y| \le d_0 \end{cases}

\mathrm{fhan} = \begin{cases} -r_0 \operatorname{sign}(a), & |a| > d \\ -r_0 \frac{a}{d}, & |a| \le d \end{cases}
the nonlinear time lag control algorithm is designed through the model to control the six-degree-of-freedom holder motor, so that the effect of following the user posture information and the projection position is achieved;
Step 3: According to the posture information it receives, the remote follow-up device uses the nonlinear time-lag control algorithm to compute the position information and angle information of the pan-tilt. The angle information is sent to the motor servo module to change the pitch and yaw angles of the two-degree-of-freedom pan-tilt, while the position information is used to drive the pan-tilt along the annular hanging rail, so that the pan-tilt can move around obstructions to obtain a larger field of view; this solves the problem that the line of sight is blocked when the pan-tilt rotates 180 degrees to observe the area behind it, and achieves remote follow-up between the camera picture and the user. By rotating the head, the user achieves remote follow-up between the head posture and the pan-tilt; by rotating the eyeballs, the user achieves follow-up between the gaze and the projection area. With the remotely acquired images projected in real time onto the annular projection wall by the 3D projector, the user operates the mechanical equipment remotely in an immersive way.
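The split described in step 3 between the pan-tilt angles and the position on the annular hanging rail can be pictured with the simplified sketch below: head yaw beyond the pan-tilt's own range is handed to the rail carriage, so the camera physically travels around the machine instead of being blocked. The angle limits, rail radius, and function names are assumed values for illustration and are not the patent's control law.

import math

PAN_LIMIT = math.radians(90)     # yaw the 2-DoF pan-tilt can cover on its own (assumed)
TILT_LIMIT = math.radians(45)    # assumed tilt range
RAIL_RADIUS = 1.5                # metres, assumed radius of the annular rail

def head_pose_to_commands(head_yaw, head_pitch):
    """Return (pan, tilt, rail_angle): pan/tilt setpoints for the pan-tilt and the
    azimuth on the annular rail the carriage should move to."""
    rail_angle = 0.0
    pan = head_yaw
    if abs(head_yaw) > PAN_LIMIT:
        # Hand the excess yaw to the rail so the camera drives around the occlusion.
        rail_angle = head_yaw - math.copysign(PAN_LIMIT, head_yaw)
        pan = math.copysign(PAN_LIMIT, head_yaw)
    tilt = max(-TILT_LIMIT, min(TILT_LIMIT, head_pitch))
    return pan, tilt, rail_angle

def rail_angle_to_arc_position(rail_angle):
    """Convert the rail azimuth into an arc length for the rail drive."""
    return RAIL_RADIUS * rail_angle

# Looking 180 degrees behind the machine: the pan-tilt saturates at 90 degrees
# and the carriage travels the remaining 90 degrees around the rail.
pan, tilt, rail = head_pose_to_commands(math.radians(180), math.radians(-10))
arc = rail_angle_to_arc_position(rail)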
By adopting the above technical solution, the invention has the following beneficial effects:
the 3D projector is adopted as a projection device in the cockpit, a user can view a three-dimensional picture only by using portable 3D glasses, the vision of the user is well protected due to the fact that the user is far away from the screen, and the user can see a scene in the cockpit in the process of viewing a video; the 3D projector is arranged on the six-freedom-degree holder, and images with different angles and positions are presented on a projection wall according to different postures of a user, so that the on-site experience of the user is improved; the cloud platform is installed on annular hanger rail, can provide great field of vision scope, when having the shelter in the picture, can drive the cloud platform and move on annular hanger rail, bypasses the barrier and shoots the picture, and user operating system is simple, the practicality is strong.
Drawings
FIG. 1 is a block diagram of the overall architecture provided by the implementation of the present invention;
FIG. 2 is a schematic diagram of a remote follow-up system implemented according to the present invention;
FIG. 3 is a schematic diagram of a user operating system architecture provided by the present invention;
the system comprises an annular hanging rail 1, a two-degree-of-freedom cradle head 2, a mechanical device 3, a projection wall 4, a six-degree-of-freedom cradle head 5, a 3D projector 6 and an operator 7.
Detailed Description
The present invention will be further described with reference to the following embodiments.
As shown in the overall structure block diagram of FIG. 1, the overall design comprises a remote follow-up device, a user immersive central control operation room, and a signal wireless transmission system; the modular design facilitates maintenance and later product upgrades. The remote follow-up device comprises a main control module, n variable-focus camera modules, a sound acquisition module, a data compression transmission module, an IMU, a two-degree-of-freedom pan-tilt 2, and an annular hanging rail 1, where n is at least 2. The annular hanging rail is arranged above the mechanical equipment 3, the two-degree-of-freedom pan-tilt 2 is mounted on the annular hanging rail 1, and the variable-focus camera modules are carried on the two-degree-of-freedom pan-tilt 2. As shown in the structural schematic diagram of the remote follow-up system in FIG. 2, the two-degree-of-freedom pan-tilt 2 can travel around the annular hanging rail 1, which solves the problem in the prior art that the pan-tilt cannot acquire images behind itself because of occlusion by the mechanical equipment 3. This installation lets the pan-tilt move around an obstruction to obtain a wider field of view; if the picture behind needs to be observed, the two-degree-of-freedom pan-tilt 2 simply moves to the rear of the annular hanging rail 1.
The user immersive central control operation room includes an annular mute cabin, a follow-up projection module, a user head posture and gaze acquisition module, a sound box, and a processor. The follow-up projection module, which contains the six-degree-of-freedom pan-tilt 5 and the 3D projector 6, is hoisted at the top of the annular mute cabin, with the 3D projector 6 hung on the six-degree-of-freedom pan-tilt 5.
The signal wireless transmission system is designed for network communication: the remote follow-up system and the immersive central control operation room are connected to the signal transmission system through network ports, and all signals in the system are transmitted over the network.
The user operating system is shown in the structural schematic diagram of FIG. 3. The operator 7 wears the user head posture and gaze acquisition module and operates the entire system from the position shown in the figure. A gyroscope, a camera, and 3D glasses are integrated on the user head posture and gaze acquisition module. The picture acquired by the remote pan-tilt is projected onto the projection wall 4 inside the cabin by the 3D projector 6 mounted on the six-degree-of-freedom pan-tilt 5. Throughout the process, the gyroscope and camera track the user's head posture and eyeball gaze, controlling the motion of the remote pan-tilt and the change of the projection position, achieving the effect of immersive remote operation. An MPU6050 can be selected as the gyroscope.
The whole system is initialized once before it starts working: the remote two-degree-of-freedom pan-tilt 2 acquires the picture directly in front of it, the picture information is transmitted through the whole system and finally displayed in front of the user, and the user can begin operating after initialization succeeds. The gyroscope MPU6050 worn on the user's head and the camera in front of the user acquire the user's head posture at the same time, and both kinds of information are transmitted to the Windows processor. The head posture is extracted from the camera images by an algorithm and, together with the head information acquired by the gyroscope, is resolved into the user's head attitude by multi-sensor fusion. The Windows processor stores this information and sends the eyeball position information to the six-degree-of-freedom pan-tilt 5; this motion platform changes the pitch, yaw, and roll angles of the 3D projector 6 according to the eyeball position information, so that images of different positions and angles are finally displayed on the projection wall 4. The head posture information stored by the Windows processor is transmitted to the network through the Ethernet port and finally reaches the Linux controller of the remote follow-up system through the signal transmission system.
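One possible geometric reading of how the stored eyeball gaze information could be turned into yaw and pitch commands for the six-degree-of-freedom pan-tilt carrying the 3D projector is sketched below, so the projected picture stays centred on the point of the annular projection wall the user is looking at. The cabin radius, mounting heights, and the assumption that the user sits at the cabin centre are illustrative and are not specified by the patent.

import math

CABIN_RADIUS = 2.0      # metres, radius of the annular projection wall (assumed)
EYE_HEIGHT = 1.2        # metres, seated eye height (assumed)
PROJECTOR_HEIGHT = 2.4  # metres, projector hung at the cabin top (assumed)

def gaze_to_projector_angles(gaze_yaw, gaze_pitch):
    """Return (proj_yaw, proj_pitch) aiming the projector at the wall point hit by
    the user's gaze ray; the user is assumed to sit at the cabin centre."""
    # Point on the cylindrical wall intersected by the gaze ray.
    wall_x = CABIN_RADIUS * math.cos(gaze_yaw)
    wall_y = CABIN_RADIUS * math.sin(gaze_yaw)
    wall_z = EYE_HEIGHT + CABIN_RADIUS * math.tan(gaze_pitch)
    # Aim the ceiling-mounted projector at that point.
    proj_yaw = math.atan2(wall_y, wall_x)
    proj_pitch = math.atan2(wall_z - PROJECTOR_HEIGHT, CABIN_RADIUS)
    return proj_yaw, proj_pitch

yaw_cmd, pitch_cmd = gaze_to_projector_angles(math.radians(30), math.radians(-5))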
In the Linux controller, angle information and position information are resolved from the head posture information by the nonlinear time-lag control algorithm. The angle information is sent to the motor servo module, whose main control chip is a 32-bit ARM processor, the STM32F407VET6; this processor controls two brushless DC motors according to the angle information using a high-precision FOC control algorithm, and such a high-precision motor servo system achieves good remote follow-up between the pan-tilt and the user. The position signal in the Linux controller is used to drive the two-degree-of-freedom pan-tilt 2 along the annular hanging rail 1, so that, given the angle and position information, the two-degree-of-freedom pan-tilt 2 can acquire 360-degree image information of the remote site. The IMU carried on the two-degree-of-freedom pan-tilt 2 sends the acquired attitude to the Linux controller, the site environment is constructed in the controller, and the constructed environment information is finally sent to the Windows processor of the user operating system through the signal transmission system.
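The FOC (field-oriented control) algorithm mentioned above can be illustrated by the minimal single-step sketch below: measured phase currents are transformed into the rotor dq frame, two PI regulators drive the flux-producing current to zero and the torque-producing current to the command from the outer angle loop, and the result is transformed back to the stationary frame for PWM. The gains and the plain PI structure are assumptions for illustration; the STM32 firmware details are not given in the patent.

import math

class PI:
    def __init__(self, kp, ki, dt):
        self.kp, self.ki, self.dt, self.integral = kp, ki, dt, 0.0
    def __call__(self, error):
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

def clarke(ia, ib, ic):
    """Three phase currents -> stationary alpha-beta frame (amplitude invariant)."""
    i_alpha = ia
    i_beta = (ia + 2.0 * ib) / math.sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Stationary frame -> rotor dq frame using the electrical angle theta."""
    i_d = i_alpha * math.cos(theta) + i_beta * math.sin(theta)
    i_q = -i_alpha * math.sin(theta) + i_beta * math.cos(theta)
    return i_d, i_q

def inverse_park(v_d, v_q, theta):
    v_alpha = v_d * math.cos(theta) - v_q * math.sin(theta)
    v_beta = v_d * math.sin(theta) + v_q * math.cos(theta)
    return v_alpha, v_beta

pi_d = PI(kp=0.5, ki=100.0, dt=1e-4)   # placeholder gains
pi_q = PI(kp=0.5, ki=100.0, dt=1e-4)

def foc_step(ia, ib, ic, theta, iq_ref):
    """One current-loop iteration; iq_ref comes from the outer angle controller."""
    i_d, i_q = park(*clarke(ia, ib, ic), theta)
    v_d = pi_d(0.0 - i_d)          # keep the flux-producing current at zero
    v_q = pi_q(iq_ref - i_q)       # track the torque-producing current command
    return inverse_park(v_d, v_q, theta)

v_alpha, v_beta = foc_step(ia=0.2, ib=-0.1, ic=-0.1, theta=0.3, iq_ref=0.5)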
In addition, at least two variable-focus camera modules and a sound acquisition module are mounted on the two-degree-of-freedom pan-tilt 2, and the acquired image and sound information are encoded and then transmitted directly to the processor of the user immersive central control operation room through the data compression transmission module.
The signal transmission system between the remote follow-up system and the user immersive central control operation room may be a pair of high-power wireless bridges that connect both systems to the network.
The Windows processor receives the environment, sound, and image information. The sound information is played directly through the sound box, while the Windows processor fuses the parallax image information acquired by the cameras with the pan-tilt attitude information acquired by the IMU to obtain a stable stereoscopic image, which is projected onto the projection wall by the 3D projector. The user perceives the on-site environment through the 3D glasses worn on the head and, by moving the head and eyes at any time, controls in real time both the images acquired by the remote pan-tilt and the projection position, so that the user achieves the effect of remote immersive operation.
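As one ingredient of the stereoscopic picture described here, the parallax between the two variable-focus cameras can be turned into a disparity map; the sketch below does this with OpenCV's semi-global block matcher on a rectified pair. The matcher settings are placeholders, and the patent does not state that this particular matcher is used.

import cv2
import numpy as np

def disparity_from_pair(left_gray, right_gray):
    """Return a float32 disparity map (in pixels) for a rectified grayscale pair."""
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=128,          # must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,                # smoothness penalties suggested by the OpenCV docs
        P2=32 * 5 * 5,
        uniquenessRatio=10,
    )
    # compute() returns fixed-point disparities scaled by 16.
    return matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0

left = np.zeros((480, 640), np.uint8)    # stand-ins for the rectified camera frames
right = np.zeros((480, 640), np.uint8)
disparity = disparity_from_pair(left, right)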
The above-mentioned embodiments are merely illustrative of the preferred embodiments of the present invention, and do not limit the scope of the present invention, and various modifications and improvements of the technical solution of the present invention by those skilled in the art should fall within the protection scope defined by the claims of the present invention without departing from the spirit of the present invention.

Claims (9)

1. A tele-immersive binocular pan-tilt follow-up system applied to engineering machinery, characterized in that: the system comprises a remote follow-up device, a user immersive central control operation room, and a signal wireless transmission system;
the remote follow-up device comprises a main control module, n variable-focus camera modules, a sound acquisition module, a data compression transmission module, an IMU, a two-degree-of-freedom tripod head (2) and an annular hanger rail (1), wherein n is more than or equal to 2, the annular hanger rail is arranged above the mechanical equipment (3), the two-degree-of-freedom tripod head (2) is arranged on the annular hanger rail (1), and the variable-focus camera modules are carried on the two-degree-of-freedom tripod head (2);
the user immersion type central control operation room comprises an annular silent bin, a follow-up projection module, a user head posture and sight line acquisition module, a sound device and a processor, wherein the follow-up projection module is hoisted at the top of the annular silent bin;
the signal wireless transmission system is designed for network communication, the remote follow-up system and the immersive central control operation room are connected into the signal transmission system through network ports, and signals in the whole system are transmitted through a network.
2. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery, characterized in that: the signals in the whole system comprise the video information and sound information transmitted from the remote follow-up device to the immersive central control operation room, and the head posture information and gaze information transmitted from the immersive central control operation room to the remote follow-up device.
3. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery, characterized in that: the variable-focus camera modules in the remote follow-up device acquire remote high-definition images, and these images, together with the sound information acquired by the sound acquisition module, are encoded and sent directly by the data compression transmission module to the processor of the user immersive central control operation room; the IMU is integrated with the variable-focus camera modules, and the two-degree-of-freedom pan-tilt (2) is driven by a motor servo system using a high-precision pan-tilt control algorithm; the main control module, running an embedded operating system, is connected to the two-degree-of-freedom pan-tilt and controls its motion.
4. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery, characterized in that: an operation table is arranged in the annular mute cabin, and the processor is used to receive the data acquired by the follow-up projection module and by the user head posture and gaze acquisition module; the follow-up projection module contains a six-degree-of-freedom pan-tilt (5) and a 3D projector (6), the 3D projector (6) is hoisted on the six-degree-of-freedom pan-tilt (5), and a gyroscope and a camera are integrated on the user head posture and gaze acquisition module.
5. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery according to claim 4, wherein: 3D glasses are integrated on the user head posture and gaze acquisition module.
6. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery according to claim 4, wherein: a projection wall (4) is arranged in the annular mute cabin, the projection wall (4) is white, and the position and angle of the image projected by the 3D projector (6) change with the movement of the six-degree-of-freedom pan-tilt (5).
7. The tele-immersive binocular pan-tilt follow-up system applied to engineering machinery according to claim 4, wherein: the user head posture and gaze acquisition module determines the motion posture and gaze of the user's head using a multi-sensor fusion method, and the head motion posture data and gaze data are simultaneously sent through the signal transmission network to the main control module of the remote follow-up device and to the six-degree-of-freedom pan-tilt, to control the motion of the remote pan-tilt and realize follow-up between the user and the pan-tilt.
8. A tele-immersive binocular pan-tilt follow-up method applied to engineering machinery, characterized in that: using the system of any one of claims 1 to 7, the workflow is as follows:
step 1: when working, the user wears the head posture and gaze acquisition module; the variable-focus camera modules acquire video information with parallax, and the sound acquisition module acquires sound information; the acquired video and sound information are encoded and sent directly by the data compression transmission module, and this information is transmitted through the signal wireless transmission system to the processor of the immersive central control operation room; after receiving the video signal, the processor of the immersive central control operation room decodes the video information, and once decoding succeeds the image information is played in follow-up mode by the 3D projector while the sound information is played by the sound box; the user obtains a stereoscopic image through the 3D glasses;
step 2: after the processor of the immersive central control operation room has initialized successfully, the image information acquired by the remote follow-up device is presented to the user in real time; the user posture information acquired by the head posture and gaze acquisition module is resolved into head attitude information in the processor by a multi-sensor fusion algorithm and is sent to the six-degree-of-freedom pan-tilt (5) and to the remote follow-up device;
step 3: according to the posture information it receives, the remote follow-up device computes the position information and angle information of the pan-tilt with a nonlinear time-lag control algorithm; the angle information is sent to the motor servo module to change the pitch and yaw angles of the two-degree-of-freedom pan-tilt (2), and the position information is used to drive the two-degree-of-freedom pan-tilt (2) along the annular hanging rail (1); the follow-up projection module of the immersive central control operation room projects the remotely acquired images onto the annular projection wall in real time according to the user's head pose and gaze direction.
9. The tele-immersive binocular pan-tilt follow-up method applied to engineering machinery according to claim 8, wherein the follow-up between the user posture information and the projection position in step 2 is implemented as follows:

regarding the pan-tilt motor as a first-order inertia link with time lag (gain K, time constant T, time lag \tau), the transfer function is:

G(s) = \frac{K}{Ts + 1} e^{-\tau s}

which is described in the time domain as:

T \dot{y}(t) + y(t) = K u(t - \tau)

the state-variable realization of the above object is:

\dot{x}(t) = -\frac{1}{T} x(t) + \frac{K}{T} u(t - \tau), \qquad y(t) = x(t)

treating the time lag as negligible, the transfer relationship becomes:

G(s) = \frac{K}{Ts + 1}

the ADRC-based control algorithm is adopted as follows:

fh = \mathrm{fhan}(v_1 - v(t),\ v_2,\ r_0,\ h_0)
v_1 = v_1 + h v_2
v_2 = v_2 + h \, fh
e = z_1 - y
z_1 = z_1 + h (z_2 + b_0 u) - \beta_{01} e
z_2 = z_2 - \beta_{02} e
e_1 = v_1 - z_1
u = (\beta_1 e_1 - z_2) / b_0
where h is the sampling step, T is the time constant, h_0 is an integer multiple of h, and r_0, \beta_{01}, \beta_{02}, b_0 are all adjustable parameters that must be tuned on the actual system; v_1 is the arranged transition process of the tracking signal, v_2 is the derivative of v_1, z_1 is the observed value of the output y, and z_2 is the observed value of the total disturbance;

wherein fhan is the fastest-tracking synthesis function; for arguments \mathrm{fhan}(x_1, x_2, r_0, h_0) it is given by:

d = r_0 h_0, \quad d_0 = h_0 d, \quad y = x_1 + h_0 x_2, \quad a_0 = \sqrt{d^2 + 8 r_0 |y|}

a = \begin{cases} x_2 + \frac{a_0 - d}{2} \operatorname{sign}(y), & |y| > d_0 \\ x_2 + \frac{y}{h_0}, & |y| \le d_0 \end{cases}

\mathrm{fhan} = \begin{cases} -r_0 \operatorname{sign}(a), & |a| > d \\ -r_0 \frac{a}{d}, & |a| \le d \end{cases}

the nonlinear time-lag control algorithm designed from this model controls the six-degree-of-freedom pan-tilt motor and achieves follow-up between the user posture information and the projection position.
CN202210551231.6A 2022-05-18 2022-05-18 Tele-immersion type binocular holder follow-up system and method applied to engineering machinery Pending CN115076561A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210551231.6A CN115076561A (en) 2022-05-18 2022-05-18 Tele-immersion type binocular holder follow-up system and method applied to engineering machinery

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210551231.6A CN115076561A (en) 2022-05-18 2022-05-18 Tele-immersion type binocular holder follow-up system and method applied to engineering machinery

Publications (1)

Publication Number Publication Date
CN115076561A true CN115076561A (en) 2022-09-20

Family

ID=83248520

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210551231.6A Pending CN115076561A (en) 2022-05-18 2022-05-18 Tele-immersion type binocular holder follow-up system and method applied to engineering machinery

Country Status (1)

Country Link
CN (1) CN115076561A (en)


Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4221419A (en) * 1977-02-19 1980-09-09 Keith Riley Gripping devices
GB8307678D0 (en) * 1983-03-19 1983-04-27 Riley K Clamping devices
US4799639A (en) * 1987-03-18 1989-01-24 Keith Riley Clamps
GB9025419D0 (en) * 1990-11-22 1991-01-09 Riley Lifting Equip Clamps
CN101890719A (en) * 2010-07-09 2010-11-24 中国科学院深圳先进技术研究院 Robot remote control device and robot system
CN102348068A (en) * 2011-08-03 2012-02-08 东北大学 Head gesture control-based following remote visual system
CN203759941U (en) * 2014-03-31 2014-08-06 东北石油大学 All-dimensional demonstration device of mathematic stereoscopic model
CN203896436U (en) * 2014-05-18 2014-10-22 王傲立 Virtual reality projector
CN205080320U (en) * 2015-09-28 2016-03-09 山东魔幻教育科技股份有限公司 Triaxial trailing type virtual reality projector
CN107024825A (en) * 2017-06-16 2017-08-08 广景视睿科技(深圳)有限公司 Follow projection arrangement and its projecting method
US20190121522A1 (en) * 2017-10-21 2019-04-25 EyeCam Inc. Adaptive graphic user interfacing system
CN108093244A (en) * 2017-12-01 2018-05-29 电子科技大学 A kind of remotely servo-actuated stereo visual system
CN108366208A (en) * 2018-03-29 2018-08-03 燕山大学 A kind of unmanned plane stereoscopic vision servomechanism applied to disaster area search
CN112037697A (en) * 2020-09-07 2020-12-04 深圳优色专显科技有限公司 Following projection type advertising device
CN112287880A (en) * 2020-11-18 2021-01-29 苏州臻迪智能科技有限公司 Cloud deck attitude adjusting method, device and system and electronic equipment
CN112611380A (en) * 2020-12-03 2021-04-06 燕山大学 Attitude detection method based on multi-IMU fusion and attitude detection device thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
石嘉, 裴忠才, 唐志勇, 胡达达: "Design and Implementation of an Improved Active Disturbance Rejection Control System for a Quadrotor UAV", Journal of Beijing University of Aeronautics and Astronautics, vol. 47, no. 9, pages 1824-1826 *
胡小强 et al.: "Virtual Reality Technology and Applications", Beijing University of Posts and Telecommunications Press, pages 231-232 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115190287A (en) * 2022-06-22 2022-10-14 秦皇岛希睿智能科技有限公司 Stereoscopic vision follow-up system applied to remote teaching

Similar Documents

Publication Publication Date Title
CN109129523B (en) Mobile robot real-time remote control system based on human-computer interaction
CN104536579B (en) Interactive three-dimensional outdoor scene and digital picture high speed fusion processing system and processing method
CN111438673B (en) High-altitude operation teleoperation method and system based on stereoscopic vision and gesture control
CN109311639A (en) Remote control device for crane, construction machinery and/or tray truck
CN108093244B (en) Remote follow-up stereoscopic vision system
GB2128842A (en) Method of presenting visual information
CN204741528U (en) Intelligent control ware is felt to three -dimensional immersive body
JP2016180866A (en) Aerial shoot device
CN106707810A (en) Auxiliary system and method for ship remote fault diagnosis and maintenance based on mixed reality glasses
KR20170044451A (en) System and Method for Controlling Remote Camera using Head mount display
CN111716365B (en) Immersive remote interaction system and method based on natural walking
CA2950822C (en) System and method for remote monitoring at least one observation area
CN106327583A (en) Virtual reality equipment for realizing panoramic image photographing and realization method thereof
CN107471216A (en) VR body man-controlled mobile robots under hazardous environment
CN110977981A (en) Robot virtual reality synchronization system and synchronization method
US10904427B2 (en) Coordinated cinematic drone
CN115076561A (en) Tele-immersion type binocular holder follow-up system and method applied to engineering machinery
CN103945122B (en) The method that virtual window is realized using monopod video camera and projector
WO2020209167A1 (en) Information processing device, information processing method, and program
CN207448453U (en) A kind of robot and robot system
CN107322596A (en) For the VR body man-controlled mobile robots under hazardous environment
Fernando et al. Effectiveness of Spatial Coherent Remote Drive Experience with a Telexistence Backhoe for Construction Sites.
CN115514885A (en) Monocular and binocular fusion-based remote augmented reality follow-up perception system and method
CN213938189U (en) Non-blind area remote control system based on mixed reality technology and engineering vehicle
JP2021180496A (en) Remote control system, remote work device, video processing device and program

Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination