CN113534835B - Tourism virtual remote experience system and method - Google Patents


Info

Publication number
CN113534835B
CN113534835B (application CN202110754987.6A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
flight
sight
module
Prior art date
Legal status
Active
Application number
CN202110754987.6A
Other languages
Chinese (zh)
Other versions
CN113534835A (en)
Inventor
周震宇
叶琴
Current Assignee
Xiangnan University
Original Assignee
Xiangnan University
Priority date
Filing date
Publication date
Application filed by Xiangnan University
Priority to CN202110754987.6A
Publication of CN113534835A
Application granted
Publication of CN113534835B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a tourism virtual remote experience system and method, comprising: an unmanned aerial vehicle equipped with a binocular camera for acquiring panoramic video frames in real time; a sight tracking module for acquiring the user's sight direction using a trained deep neural network model; a flight control module that controls the flight direction of the unmanned aerial vehicle using the tracked sight direction, realizing remote flight control of the unmanned aerial vehicle; a server processing module for processing the panoramic video frames collected by the binocular camera and sending them to a naked eye 3D display screen; and the naked eye 3D display screen, which displays the information and provides the user with an immersive flight experience. The invention enables control of the unmanned aerial vehicle by sight and, together with a 5G high-speed network and a naked eye 3D display screen, delivers an excellent immersive experience.

Description

Tourism virtual remote experience system and method
Technical Field
The invention relates to tourism experience technology, and in particular to a tourism virtual remote experience system and method, involving the technical fields of unmanned aerial vehicle control, sight tracking, and deep learning.
Background
With the improvement of people's living standards, demand for tourism has grown rapidly. The global epidemic, however, has done great harm to the tourism industry. To avoid crowds gathering, many tourist attractions have been closed; for example, the famous Canton Tower (Guangzhou Tower) was temporarily closed to visitors during May and June 2021 because of a local outbreak.
Virtual tourism has therefore become a real need. Compared with traditional tourism, it requires no travel and no crowding, is low-carbon and environmentally friendly, and is fast and efficient with a low time cost. Some virtual tourism methods exist in the prior art, but their control experience is poor: interaction relies on manual control, immersive experience is lacking, and the scenic spots that can be visited are limited. For example, a scenic spot must first be modeled, only tours of that fixed spot can be taken, and the modeled scene is not the real scene, so the experience is not immersive.
In the tourism virtual remote experience system and method of the invention, an unmanned aerial vehicle acquires information about the tourist scene, and the user's sight is tracked to control the vehicle's flight direction and speed. This achieves a human-machine-in-one state: the user feels like the aircraft itself rather than merely its operator. In the prior art, unmanned aerial vehicle information is received passively, and without sight tracking the human-machine-in-one state cannot be achieved; combined with a naked eye 3D display screen, the invention obtains a truly immersive experience.
The main innovations of the invention are as follows:
1) The tourism virtual remote experience system and method are suitable for remote virtual tourism. Introducing an unmanned aerial vehicle into virtual tourism avoids the single-scene limitation of traditional virtual tourism, and because the vehicle flies flexibly, it can perform close-range and/or long-range observation at any time according to the user's viewpoint.
2) The invention uses deep learning to realize sight tracking and thereby control the flight state of the unmanned aerial vehicle, performing both speed control and direction control. The deep learning method tracks the user's sight and converts the result into flight control instructions, so that participants obtain a human-machine-in-one immersive virtual tourism experience.
Disclosure of Invention
The invention provides a tourism virtual remote experience system and method, the system comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module, which controls the flight direction of the unmanned aerial vehicle using the tracked sight direction, realizing remote flight control of the unmanned aerial vehicle;
the server processing module is used for processing the panoramic video frames collected by the binocular camera and sending the panoramic video frames to a naked eye 3D display screen;
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for users.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the experience system further includes a 3D speaker module for playing audio information captured during the flight of the unmanned aerial vehicle.
Optionally, the control module controls the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
Optionally, the system further comprises a console module for remotely controlling the unmanned aerial vehicle through its operating buttons; and/or a voice control module for remotely controlling the unmanned aerial vehicle through voice commands; and/or a gesture control module for remotely controlling the unmanned aerial vehicle through recognized gestures.
Correspondingly, the invention also provides a tourism virtual remote experience method, characterized by:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
using the flight control module to realize remote flight control of the unmanned aerial vehicle, the flight control module controlling the flight direction of the unmanned aerial vehicle with the tracked sight direction;
processing the panoramic video frames collected by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
and information is displayed by utilizing a naked eye 3D display screen, and immersive flight experience is provided for a user.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send the panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the method further includes: playing, through the 3D speaker module, audio information captured during the flight of the unmanned aerial vehicle.
Optionally, the method further includes: the control module controlling the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
Optionally, the method further includes: using a console module to remotely control the unmanned aerial vehicle through its operating buttons; and/or using a voice control module to remotely control the unmanned aerial vehicle through voice commands; and/or using a gesture control module to remotely control the unmanned aerial vehicle through recognized gestures.
Beneficial effects:
1) The tourism virtual remote experience system and method are suitable for remote virtual tourism. Introducing an unmanned aerial vehicle avoids the single-scene limitation of traditional virtual tourism; for example, flights over the Canton Tower, Baiyun Mountain, or the Pearl River by day or night all become possible, and because the vehicle flies flexibly, it can perform close-range and/or long-range observation at any time according to the user's viewpoint.
2) The invention uses deep learning to realize sight tracking and thereby control the flight state of the unmanned aerial vehicle, performing both speed control and direction control. The deep learning method tracks the user's sight and converts the result into flight control instructions, so that participants obtain a human-machine-in-one immersive virtual tourism experience.
Drawings
FIG. 1 is a functional schematic diagram of a travel virtual remote experience system.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and embodiments.
As shown in FIG. 1, the present invention provides a tourism virtual remote experience system, comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module, which controls the flight direction of the unmanned aerial vehicle using the tracked sight direction, realizing remote flight control of the unmanned aerial vehicle;
the server processing module, which processes the panoramic video frames collected by the binocular camera and sends them to the naked eye 3D display screen (see the sketch after this module list);
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for users.
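The patent does not specify how the server prepares the binocular frames for the screen. The following minimal sketch (Python is used for all example code in this description) assumes side-by-side stereo frames and a column-interleaved naked eye 3D panel; both assumptions are illustrative only.

    import numpy as np

    def split_stereo(frame: np.ndarray) -> tuple:
        """Split a side-by-side binocular frame of shape (H, 2W, 3) into left/right views."""
        half = frame.shape[1] // 2
        return frame[:, :half], frame[:, half:]

    def interleave_columns(left: np.ndarray, right: np.ndarray) -> np.ndarray:
        """Column-interleave the two views, as a lenticular naked eye 3D panel might expect."""
        out = np.empty_like(left)
        out[:, 0::2] = left[:, 0::2]    # even columns from the left view
        out[:, 1::2] = right[:, 1::2]   # odd columns from the right view
        return out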
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
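As a sketch of this drone-to-server link, assuming the 5G communication module exposes an ordinary IP network to the application; the length-prefixed TCP framing below is an assumption, not the patent's protocol.

    import socket
    import struct

    def send_frame(sock: socket.socket, encoded_frame: bytes) -> None:
        # Length-prefix each encoded panoramic frame so the receiver can split the stream.
        sock.sendall(struct.pack("!I", len(encoded_frame)) + encoded_frame)

    def recv_exact(sock: socket.socket, size: int) -> bytes:
        buf = b""
        while len(buf) < size:
            chunk = sock.recv(size - len(buf))
            if not chunk:
                raise ConnectionError("stream closed mid-frame")
            buf += chunk
        return buf

    def recv_frame(sock: socket.socket) -> bytes:
        # Server side: read one length-prefixed panoramic frame.
        (size,) = struct.unpack("!I", recv_exact(sock, 4))
        return recv_exact(sock, size)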
Optionally, the experience system further includes a 3D speaker module for playing audio information captured during the flight of the unmanned aerial vehicle.
Optionally, the control module controls the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
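A minimal sketch of this rule, assuming the gaze concentration has already been quantized into discrete levels (as described further below); the maximum speed and the linear level-to-speed mapping are assumptions.

    def speed_for_level(level: int, num_levels: int = 7, v_max_mps: float = 10.0) -> float:
        """Map a concentration level (1 = most concentrated) to a forward speed in m/s."""
        level = max(1, min(level, num_levels))  # clamp to the valid range of levels
        return v_max_mps * (num_levels - level + 1) / num_levels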
Optionally, the system further comprises a console module for remotely controlling the unmanned aerial vehicle through its operating buttons; and/or a voice control module for remotely controlling the unmanned aerial vehicle through voice commands; and/or a gesture control module for remotely controlling the unmanned aerial vehicle through recognized gestures.
Optionally, when the control distance of the unmanned aerial vehicle can be met, the user may use a self-provided unmanned aerial vehicle; when conditions do not allow this, the user may rent an unmanned aerial vehicle from the scenic spot after payment, with the scenic spot's integrated control center responsible for managing the vehicles. For example, when several unmanned aerial vehicles may conflict in flight position during flight, the center raises an alarm in advance and issues flight positions and speeds that meet the requirements; in an emergency, the lessor directly takes over flight authority so that different unmanned aerial vehicles do not collide. Once the risk is resolved, control authority is returned to the user. A sketch of such a conflict check follows.
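An illustrative sketch of the control center's conflict check; the Drone record, the 50 m separation threshold, and the brute-force pairwise check are all assumptions made for illustration.

    from dataclasses import dataclass
    from itertools import combinations

    @dataclass
    class Drone:
        drone_id: str
        x: float
        y: float
        z: float  # position in metres

    MIN_SEPARATION_M = 50.0  # assumed minimum safe separation

    def conflicting_pairs(drones: list) -> list:
        """Return pairs of drones flying closer than the minimum separation."""
        pairs = []
        for a, b in combinations(drones, 2):
            dist = ((a.x - b.x) ** 2 + (a.y - b.y) ** 2 + (a.z - b.z) ** 2) ** 0.5
            if dist < MIN_SEPARATION_M:
                pairs.append((a.drone_id, b.drone_id))  # alarm; center may take over authority
        return pairs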
Optionally, the first deep neural network is used to track the sight line, judge its change of state, and convert that change into a flight control command, for example: left, right, up, or down.
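A minimal sketch of converting a tracked sight direction into the discrete flight commands just named; the yaw/pitch representation of the sight direction and the 5 degree dead zone are assumptions.

    def gaze_to_command(yaw_deg: float, pitch_deg: float, dead_zone_deg: float = 5.0) -> str:
        """Map gaze yaw/pitch (relative to the screen centre) to left/right/up/down/hold."""
        if abs(yaw_deg) <= dead_zone_deg and abs(pitch_deg) <= dead_zone_deg:
            return "hold"  # looking straight ahead: keep the current heading
        if abs(yaw_deg) >= abs(pitch_deg):
            return "right" if yaw_deg > 0 else "left"
        return "up" if pitch_deg > 0 else "down"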
Optionally, the first deep neural network is a DRCNN network comprising one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 kernels; the excitation function adopted by the DRCNN is the sigmoid function.
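For concreteness, a minimal PyTorch sketch of a DRCNN of this shape (3×3 convolutions, pooling, fully connected layers, sigmoid excitation) follows; the channel counts, the 64×64 eye-crop input, and the two-angle (yaw, pitch) output are assumptions.

    import torch
    import torch.nn as nn

    class DRCNN(nn.Module):
        def __init__(self, in_channels: int = 3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.Sigmoid(),
                nn.MaxPool2d(2),  # 64x64 -> 32x32
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.Sigmoid(),
                nn.MaxPool2d(2),  # 32x32 -> 16x16
            )
            self.head = nn.Sequential(
                nn.Flatten(),
                nn.Linear(64 * 16 * 16, 128), nn.Sigmoid(),
                nn.Linear(128, 2),  # sight direction as (yaw, pitch)
            )

        def forward(self, eye_image: torch.Tensor) -> torch.Tensor:
            return self.head(self.features(eye_image))

    # Usage: direction = DRCNN()(torch.randn(1, 3, 64, 64))  # one 64x64 RGB eye crop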
optionally, the DRCNN utilizes a Sine-Index-Softmax (Sine-Index-Softmax) to improve the accuracy of the gaze tracking; the sine exponential loss function is:
LOSS_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_{j}\|\sin(\theta_{j})+b_{j}}}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i.
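A sketch of this loss in PyTorch, following the reconstruction above. Because the patent gives the formula only as an image, the exact form (in particular the use of sin(θ) in both numerator and denominator) is an assumption based on the symbol definitions.

    import torch
    import torch.nn.functional as F

    def sine_index_softmax_loss(features: torch.Tensor, weights: torch.Tensor,
                                bias: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        """features: (N, D); weights: (C, D) class weight vectors; bias: (C,); labels: (N,)."""
        # Angle theta_ij between each sample and each class weight vector.
        cos = F.normalize(features, dim=1) @ F.normalize(weights, dim=1).t()
        theta = torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))
        # Logits ||w_j|| * sin(theta_ij) + b_j, per the reconstructed formula.
        logits = weights.norm(dim=1) * torch.sin(theta) + bias
        # Cross-entropy yields the -(1/N) * sum of log-softmax terms.
        return F.cross_entropy(logits, labels)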
Optionally, the pooling method of the pooling layer is:
S = f(e^{\log w} + LOSS_{SIS});
where S represents the output of the current layer, f(·) represents the activation function, and w represents the weight of the current layer.
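A literal sketch of this pooling rule; note that e^{log w} algebraically reduces to w (for positive weights), and taking f to be the sigmoid follows the DRCNN description above.

    import torch

    def sis_pooling(w: torch.Tensor, loss_sis: torch.Tensor) -> torch.Tensor:
        """S = f(e^{log w} + LOSS_SIS), with f taken to be the sigmoid excitation."""
        # torch.log requires positive weights; e^{log w} then equals w exactly.
        return torch.sigmoid(torch.exp(torch.log(w)) + loss_sis)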
Optionally, the gaze concentration is obtained by a second deep neural network implemented with an attention mechanism; the second deep neural network is specifically an attention neural network. Optionally, it may share convolutional features with the first network, or be trained independently to obtain convolutional features suited to its own model. The attention neural network divides the user's attention into a number of speed levels, for example 1, 2, 3, 4, 5, 6, 7, ..., N, where each number denotes a speed grade: the smaller the number, the faster the flight speed; conversely, the larger the number, the slower the flight speed.
The excitation function adopted by the attention neural network is a cosine exponential excitation function, denoted g(·), where
g = \frac{1}{N}\sum_{i=1}^{N}e^{\|w_{y_i}\|\cos(\theta_{y_i})}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i.
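A sketch of the second (attention) network mapping gaze features to discrete speed levels. The feature dimension, head count, and seven levels are assumptions, and a per-element cosine exponential e^{cos(x)} stands in for g, since the batch-averaged form reconstructed above is not directly usable as a layer excitation.

    import torch
    import torch.nn as nn

    class CosExp(nn.Module):
        """Cosine exponential excitation g(x) = e^{cos(x)} (assumed per-element form)."""
        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return torch.exp(torch.cos(x))

    class SpeedLevelNet(nn.Module):
        def __init__(self, feat_dim: int = 128, num_levels: int = 7):
            super().__init__()
            self.attn = nn.MultiheadAttention(embed_dim=feat_dim, num_heads=4,
                                              batch_first=True)
            self.head = nn.Sequential(nn.Linear(feat_dim, 64), CosExp(),
                                      nn.Linear(64, num_levels))

        def forward(self, gaze_feats: torch.Tensor) -> torch.Tensor:
            # gaze_feats: (batch, seq, feat_dim), a short history of per-frame gaze features.
            attended, _ = self.attn(gaze_feats, gaze_feats, gaze_feats)
            return self.head(attended.mean(dim=1))  # logits over speed levels 1..N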
Correspondingly, the invention also provides a tourism virtual remote experience method, comprising the following steps:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
using the flight control module to realize remote flight control of the unmanned aerial vehicle, the flight control module controlling the flight direction of the unmanned aerial vehicle with the tracked sight direction;
processing the panoramic video frames acquired by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
and information is displayed by utilizing a naked eye 3D display screen, and immersive flight experience is provided for a user.
Optionally, the unmanned aerial vehicle and the server processing module perform high-speed communication through the 5G communication module, and send panoramic video frame data acquired by the binocular camera to the server processing module.
Optionally, the method further includes: playing, through the 3D speaker module, audio information captured during the flight of the unmanned aerial vehicle.
Optionally, the method further includes: the control module controlling the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
Optionally, the method further includes: using a console module to remotely control the unmanned aerial vehicle through its operating buttons; and/or using a voice control module to remotely control the unmanned aerial vehicle through voice commands; and/or using a gesture control module to remotely control the unmanned aerial vehicle through recognized gestures.
Optionally, when the control distance of the unmanned aerial vehicle can be met, the user may use a self-provided unmanned aerial vehicle; when conditions do not allow this, the user may rent an unmanned aerial vehicle from the scenic spot after payment, with the scenic spot's integrated control center responsible for managing the vehicles. For example, when several unmanned aerial vehicles may conflict in flight position during flight, the center raises an alarm in advance and issues flight positions and speeds that meet the requirements; in an emergency, the lessor directly takes over flight authority so that different unmanned aerial vehicles do not collide. Once the risk is resolved, control authority is returned to the user.
Optionally, the first deep neural network is used to track the sight line, judge its change of state, and convert that change into a flight control command, for example: left, right, up, or down.
Optionally, the first deep neural network is a DRCNN network comprising one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 kernels; the excitation function adopted by the DRCNN is the sigmoid function.
Optionally, the DRCNN utilizes a sine exponential softmax loss (Sine-Index-Softmax, SIS) to improve the accuracy of sight tracking; the sine exponential loss function is:
LOSS_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_{j}\|\sin(\theta_{j})+b_{j}}}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i.
Optionally, the pooling method of the pooling layer is:
S = f(e^{\log w} + LOSS_{SIS});
where S represents the output of the current layer, f(·) represents the activation function, and w represents the weight of the current layer.
Optionally, the gaze concentration is obtained by a second deep neural network implemented with an attention mechanism; the second deep neural network is specifically an attention neural network. Optionally, it may share convolutional features with the first network, or be trained independently to obtain convolutional features suited to its own model. The attention neural network divides the user's attention into a number of speed levels, for example 1, 2, 3, 4, 5, 6, 7, ..., N, where each number denotes a speed grade: the smaller the number, the faster the flight speed; conversely, the larger the number, the slower the flight speed.
The excitation function adopted by the attention neural network is a cosine exponential excitation function, denoted g(·), where
g = \frac{1}{N}\sum_{i=1}^{N}e^{\|w_{y_i}\|\cos(\theta_{y_i})}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i.
The present application also provides a computer-readable medium storing computer program instructions which, when executed, carry out any of the methods provided by the invention.
In the description herein, references to the description of "one embodiment," "an example," "a specific example" or the like are intended to mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device.

Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, or C++, and conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).

An integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. Such a unit, stored in a storage medium, includes several instructions enabling a computer device (which may be a personal computer, a server, or a network device) or a processor to execute some of the steps of the methods according to the embodiments of the present invention. The aforementioned storage media include various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, and an optical disk.
The above description covers only preferred embodiments of the present invention and is not intended to limit its scope; all equivalent structural or process modifications made using this specification and the accompanying drawings, and all direct or indirect applications in other related fields, fall within the scope of the invention. The preferred embodiments disclosed above are illustrative only; they are not intended to be exhaustive or to limit the invention to the precise forms disclosed, and many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to make the best use of it. The invention is limited only by the claims, their full scope, and equivalents.

Claims (10)

1. A travel virtual remote experience system, the system comprising:
the unmanned aerial vehicle is provided with a binocular camera and is used for acquiring panoramic video frames in real time;
the sight tracking module is used for acquiring the sight direction of the user by utilizing the trained first deep neural network model;
the flight control module, which controls the flight direction of the unmanned aerial vehicle using the tracked sight direction, thereby realizing remote flight control of the unmanned aerial vehicle;
the server processing module is used for processing the panoramic video frames collected by the binocular camera and sending the panoramic video frames to a naked eye 3D display screen;
the naked eye 3D display screen is used for displaying information and providing immersive flight experience for a user;
the first deep neural network is used to track the sight line, judge its change of state, and convert that change into a flight control command, the flight control command including but not limited to: left, right, up, down;
the first deep neural network is a DRCNN network, the DRCNN comprising: one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 kernels; the excitation function adopted by the DRCNN is the sigmoid function;
the DRCNN utilizes a sine exponential loss function to improve the accuracy of sight tracking; the sinusoidal exponential loss function is:
LOSS_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_{j}\|\sin(\theta_{j})+b_{j}}}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i;
the pooling layer pooling method comprises the following steps:
S=f(elogw+LOSSSIS);
where s represents the output of the current layer, f () represents the activation function, and w represents the weight of the current layer.
2. The system of claim 1, wherein the drone communicates with the server processing module via the 5G communication module to send the panoramic video frame data captured by the binocular camera to the server processing module.
3. The system of claim 1, the experience system further comprising a 3D speaker module for playing audio information captured during the flight of the unmanned aerial vehicle.
4. The system of claim 1, wherein the control module controls the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration; the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
5. The system of claim 1, further comprising a console module for remotely controlling the unmanned aerial vehicle through its operating buttons; and/or a voice control module for remotely controlling the unmanned aerial vehicle through voice commands; and/or a gesture control module for remotely controlling the unmanned aerial vehicle through recognized gestures.
6. A tourism virtual remote experience method, characterized in that:
acquiring a panoramic video frame in real time by using an unmanned aerial vehicle, wherein the unmanned aerial vehicle is provided with a binocular camera;
obtaining a sight direction of a user by using a sight tracking module, wherein the sight direction is obtained by using a trained first deep neural network model;
using the flight control module to realize remote flight control of the unmanned aerial vehicle, the flight control module controlling the flight direction of the unmanned aerial vehicle with the tracked sight direction;
processing the panoramic video frames collected by the binocular camera by using a server processing module, and sending the panoramic video frames to a naked eye 3D display screen;
the information is displayed by using a naked eye 3D display screen, and immersive flight experience is provided for a user;
the first deep neural network is used to track the sight line, judge its change of state, and convert that change into a flight control command, the flight control command including but not limited to: left, right, up, down;
the first deep neural network is a DRCNN network, the DRCNN comprising: one or more convolutional layers, one or more pooling layers, and fully connected layers; the convolutional layers use 3×3 kernels; the excitation function adopted by the DRCNN is the sigmoid function;
the DRCNN utilizes a sine exponential loss function to improve the accuracy of sight tracking; the sinusoidal exponential loss function is:
LOSS_{SIS} = -\frac{1}{N}\sum_{i=1}^{N}\log\frac{e^{\|w_{y_i}\|\sin(\theta_{y_i})+b_{y_i}}}{\sum_{j}e^{\|w_{j}\|\sin(\theta_{j})+b_{j}}}
where \theta_{y_i} denotes the angle between sample i and the weight vector of its corresponding label y_i; b_{y_i} denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_{y_i} denotes the weight of sample i at its label y_i;
the pooling method of the pooling layer comprises the following steps:
S=f(elogw+LOSSSIS);
where s represents the output of the current layer, f () represents the activation function, and w represents the weight of the current layer.
7. The method of claim 6, wherein the UAV communicates with the server processing module via the 5G communication module to send the panoramic video frame data acquired by the binocular camera to the server processing module.
8. The method of claim 6, further comprising: playing, through the 3D speaker module, audio information captured during the flight of the unmanned aerial vehicle.
9. The method of claim 6, further comprising: the control module controlling the flight direction and speed of the unmanned aerial vehicle through the sight direction and the gaze concentration; the higher the gaze concentration, the faster the forward speed; the flight direction follows the sight direction.
10. The method of claim 6, further comprising: using a console module to remotely control the unmanned aerial vehicle through its operating buttons; and/or using a voice control module to remotely control the unmanned aerial vehicle through voice commands; and/or using a gesture control module to remotely control the unmanned aerial vehicle through recognized gestures.
CN202110754987.6A 2021-07-01 2021-07-01 Tourism virtual remote experience system and method Active CN113534835B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110754987.6A CN113534835B (en) 2021-07-01 2021-07-01 Tourism virtual remote experience system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110754987.6A CN113534835B (en) 2021-07-01 2021-07-01 Tourism virtual remote experience system and method

Publications (2)

Publication Number Publication Date
CN113534835A (en) 2021-10-22
CN113534835B (en) 2022-05-31

Family

ID=78126648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110754987.6A Active CN113534835B (en) 2021-07-01 2021-07-01 Tourism virtual remote experience system and method

Country Status (1)

Country Link
CN (1) CN113534835B (en)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GR20130100620A (en) * 2013-10-25 2015-05-18 Ιωαννης Γεωργιου Μικρος System and method for the electronic guidance of drones (take-off/ landing on the ground or on a ship)
KR102353231B1 (en) * 2015-04-24 2022-01-20 삼성디스플레이 주식회사 Flying Display

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107065905A (en) * 2017-03-23 2017-08-18 东南大学 A kind of immersion unmanned aerial vehicle control system and its control method
CN109032183A (en) * 2018-08-23 2018-12-18 广州创链科技有限公司 A kind of unmanned plane control device and method based on Pupil diameter
CN110412996A (en) * 2019-06-18 2019-11-05 中国人民解放军军事科学院国防科技创新研究院 It is a kind of based on gesture and the unmanned plane control method of eye movement, device and system
CN111277756A (en) * 2020-02-13 2020-06-12 西安交通大学 Small multi-rotor unmanned aerial vehicle camera control method based on eyeball identification tracking technology
CN112738498A (en) * 2020-12-24 2021-04-30 京东方科技集团股份有限公司 Virtual tour system and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
凝视控制系统——人眼控制无人机 (Gaze control system: controlling an unmanned aerial vehicle with the human eye); 无双 (Wushuang); WeChat official account (《微信公众号》); 2019-01-28; pp. 1-4 *

Also Published As

Publication number Publication date
CN113534835A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
AU2021258005B2 (en) System and method for augmented and virtual reality
WO2017045251A1 (en) Systems and methods for uav interactive instructions and control
US10885106B1 (en) Optical devices and apparatuses for capturing, structuring, and using interlinked multi-directional still pictures and/or multi-directional motion pictures
WO2016059580A1 (en) Street-level guidance via route path
CN101968833A (en) Virtual three-dimensional tourism real-time online intelligent navigation interactive traveling system
Hildebrand Aerial play: Drone medium, mobility, communication, and culture
CN107436608A (en) Control device for unmanned plane and the system for guide
CN113534835B (en) Tourism virtual remote experience system and method
Zheng et al. Uavs in multimedia: Capturing the world from a new perspective
Beesley Head in the Clouds: documenting the rise of personal drone cultures
Hildebrand Consumer Drones as Mobile Media: A Technographic Study of Seeing, Moving, and Being (with) Drones
Dowling Place-based journalism, aesthetics, and branding
Hildebrand Aerial Play
CN113467616A (en) Augmented reality processing method and related device, vehicle and storage medium
CN118334936A (en) Intelligent campus teaching device and teaching method for virtual simulation
CN117745861A (en) Image generation method, device, vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant