CN113534835B - Tourism virtual remote experience system and method - Google Patents
Tourism virtual remote experience system and method
- Publication number
- CN113534835B CN113534835B CN202110754987.6A CN202110754987A CN113534835B CN 113534835 B CN113534835 B CN 113534835B CN 202110754987 A CN202110754987 A CN 202110754987A CN 113534835 B CN113534835 B CN 113534835B
- Authority
- CN
- China
- Prior art keywords: flight, sight, UAV, module, line
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
Abstract
The present invention provides a tourism virtual remote experience system and method, comprising: a UAV equipped with a binocular camera that captures panoramic video frames in real time; a gaze-tracking module that obtains the user's gaze direction with a trained deep neural network model; a flight control module that uses the tracked gaze direction to control the UAV's flight direction, realising remote flight control of the UAV; a server processing module that processes the panoramic video frames captured by the binocular camera and sends them to a naked-eye 3D display; and the naked-eye 3D display, which presents the information and gives the user an immersive flight experience. The invention controls the UAV by gaze and, together with a high-speed 5G network and the naked-eye 3D display, delivers an excellent immersive experience.
Description
Technical Field
The present invention relates to tourism experience technology, and in particular to a tourism virtual remote experience system and method. The technologies involved include UAV control, gaze tracking, and deep learning.
Background Art
As living standards rise, demand for tourism grows rapidly. The global epidemic, however, has badly hurt the tourism industry: to avoid crowds, many attractions have been closed. The famous Canton Tower (Guangzhou Tower), for example, was closed to visitors for a time in May-June 2021 because of a local outbreak.
Virtual tourism has therefore become a genuine public need, and compared with conventional travel it has many advantages: no travelling, no crowds, low carbon footprint, low time cost, speed and efficiency. The prior art includes some virtual tourism methods, but their control experience is poor: interaction is mostly by manual control and lacks immersion, and the attractions that can be visited are limited. A scenic spot typically must be modelled in advance, so only tours of that fixed attraction are possible, and the modelled scene is not the real scene, again lacking immersion.
The tourism virtual remote experience system and method proposed by the present invention collects scene information with a UAV, tracks the user's gaze, and controls the UAV's flight direction and speed from that gaze, reaching a human-machine-in-one state: the user feels that he or she is the aircraft rather than merely controlling it. The prior art only passively receives UAV information and cannot reach this state through gaze tracking. Combined with a naked-eye 3D display, a genuinely immersive experience is obtained.
The innovations of the present invention are mainly the following:
1) The proposed tourism virtual remote experience system and method is suited to remote virtual tourism. Introducing a UAV into the virtual tour avoids the single-scene limitation of traditional virtual tourism, and because the UAV flies flexibly, close-up and/or distant views can be taken at any time according to the user's viewpoint.
2) The present invention uses deep learning to realise gaze tracking and thereby control the UAV's flight state, with both speed control and direction control. The proposed deep learning method solves the user gaze-tracking problem and converts the result into flight control commands, giving participants an immersive, human-machine-in-one virtual tourism experience.
Summary of the Invention
The present invention provides a tourism virtual remote experience system and method. The system comprises:
a UAV equipped with a binocular camera for capturing panoramic video frames in real time;
a gaze-tracking module, which obtains the user's gaze direction with a trained first deep neural network model;
a flight control module, which uses the tracked gaze direction to control the UAV's flight direction, realising remote flight control of the UAV;
a server processing module, which processes the panoramic video frames captured by the binocular camera and sends them to a naked-eye 3D display; and
a naked-eye 3D display, which presents the information and gives the user an immersive flight experience.
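The five components above form a capture, track, control, and display loop. A minimal sketch of how they fit together; all class names, method names, and the dummy gaze value are illustrative, not taken from the patent:

```python
class BinocularUAV:
    """Stands in for the UAV's binocular camera feed (illustrative)."""
    def capture_frame(self):
        # Placeholder stereo frame; in the patent this is a panoramic video frame.
        return {"left": b"...", "right": b"..."}

class GazeTracker:
    """Stands in for the trained first deep-network gaze tracker."""
    def gaze_direction(self):
        return (10.0, -5.0)  # (yaw, pitch) in degrees, dummy value

class FlightController:
    def command_from_gaze(self, yaw, pitch):
        # Direction follows the tracked gaze.
        return {"yaw_deg": yaw, "pitch_deg": pitch}

def experience_tick(uav, tracker, controller):
    """One loop iteration: track gaze, steer the UAV, forward the frame."""
    frame = uav.capture_frame()            # UAV -> server (over 5G in the patent)
    yaw, pitch = tracker.gaze_direction()  # deep-network gaze estimate
    cmd = controller.command_from_gaze(yaw, pitch)
    return frame, cmd                      # frame goes on to the naked-eye 3D display
```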
Optionally, the UAV and the server processing module communicate at high speed through a 5G communication module, which sends the panoramic video frame data captured by the binocular camera to the server processing module.
Optionally, the experience system further comprises a 3D speaker module for playing audio captured during the UAV's flight.
Optionally, the control module controls the UAV's flight direction and speed from the gaze direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed, and the flight direction follows adjustments in the gaze direction.
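The optional control rule above (direction follows gaze, higher concentration means higher speed) can be sketched as follows; the linear speed law, the 10 m/s cap, and a concentration score in [0, 1] are assumptions for illustration:

```python
def flight_command(gaze_yaw_deg, gaze_pitch_deg, concentration):
    """Map tracked gaze to a UAV command: direction follows the gaze,
    and higher gaze concentration gives a faster forward speed.
    The linear law and the 10 m/s cap are illustrative assumptions."""
    max_speed = 10.0  # m/s, assumed cap
    speed = max_speed * max(0.0, min(1.0, concentration))  # clamp to [0, 1]
    return {"yaw_deg": gaze_yaw_deg, "pitch_deg": gaze_pitch_deg, "speed": speed}
```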
Optionally, the system also has a console module that allows remote control of the UAV through operating buttons; and/or a voice control module that allows remote control of the UAV through voice commands; and/or a gesture control module that allows remote control of the UAV through recognised gestures.
Correspondingly, the present invention also proposes a virtual tourism remote experience method, characterised by:
capturing panoramic video frames in real time with a UAV equipped with a binocular camera;
obtaining the user's gaze direction with a gaze-tracking module, the gaze direction being obtained with a trained first deep neural network model;
realising remote flight control of the UAV with a flight control module, which uses the tracked gaze direction to control the UAV's flight direction;
processing the panoramic video frames captured by the binocular camera with a server processing module and sending them to a naked-eye 3D display; and
displaying the information on the naked-eye 3D display, giving the user an immersive flight experience.
Optionally, the UAV and the server processing module communicate at high speed through a 5G communication module, which sends the panoramic video frame data captured by the binocular camera to the server processing module.
Optionally, the method further comprises playing, through a 3D speaker module, audio captured during the UAV's flight.
Optionally, the method further comprises the control module controlling the UAV's flight direction and speed from the gaze direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed, and the flight direction follows adjustments in the gaze direction.
Optionally, the method further comprises remote control of the UAV through a console module with operating buttons; and/or through a voice control module with voice commands; and/or through a gesture control module with recognised gestures.
Beneficial Effects:
1) The proposed tourism virtual remote experience system and method is suited to remote virtual tourism. Introducing a UAV into the virtual tour avoids the single-scene limitation of traditional virtual tourism (for example, one can fly over the Canton Tower, Baiyun Mountain, or a Pearl River day or night tour), and because the UAV flies flexibly, close-up and/or distant views can be taken at any time according to the user's viewpoint.
2) The present invention uses deep learning to realise gaze tracking and thereby control the UAV's flight state, with both speed control and direction control. The proposed deep learning method solves the user gaze-tracking problem and converts the result into flight control commands, giving participants an immersive, human-machine-in-one virtual tourism experience.
Brief Description of the Drawings
Figure 1 is a functional schematic diagram of a tourism virtual remote experience system.
Detailed Description
To make the objectives, technical solutions and advantages of the present invention clearer, the present invention is described in further detail below with reference to the accompanying drawing and embodiments.
As shown in Figure 1, the present invention proposes a virtual tourism remote experience system, the system comprising:
a UAV equipped with a binocular camera for capturing panoramic video frames in real time;
a gaze-tracking module, which obtains the user's gaze direction with a trained first deep neural network model;
a flight control module, which uses the tracked gaze direction to control the UAV's flight direction, realising remote flight control of the UAV;
a server processing module, which processes the panoramic video frames captured by the binocular camera and sends them to a naked-eye 3D display; and
a naked-eye 3D display, which presents the information and gives the user an immersive flight experience.
Optionally, the UAV and the server processing module communicate at high speed through a 5G communication module, which sends the panoramic video frame data captured by the binocular camera to the server processing module.
Optionally, the experience system further comprises a 3D speaker module for playing audio captured during the UAV's flight.
Optionally, the control module controls the UAV's flight direction and speed from the gaze direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed, and the flight direction follows adjustments in the gaze direction.
Optionally, the system also has a console module that allows remote control of the UAV through operating buttons; and/or a voice control module that allows remote control of the UAV through voice commands; and/or a gesture control module that allows remote control of the UAV through recognised gestures.
Optionally, where the UAV control distance permits, users may provide their own UAV; where conditions do not allow this, a UAV can be rented from the scenic spot after payment. The spot's integrated control centre manages the rented UAVs: for example, if several airborne UAVs risk conflicting flight positions, it issues an alert in advance and assigns compliant flight positions and speeds, and in an emergency the lessor takes over flight authority directly to prevent collisions between UAVs. Once the risk has passed, authority is handed back to the user.
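The control centre's conflict alerting described above amounts to a pairwise separation check. A sketch under assumed flat metre coordinates and an assumed 20 m safety threshold (both illustrative, not specified by the patent):

```python
import math

def conflict_alerts(drones, min_separation_m=20.0):
    """Warn when any two rented UAVs are closer than a safety distance.
    `drones` maps an id to an (x, y, z) position in metres; the threshold
    and the flat-coordinate model are illustrative assumptions."""
    alerts = []
    ids = sorted(drones)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if math.dist(drones[a], drones[b]) < min_separation_m:
                alerts.append((a, b))  # this pair needs a reassigned position/speed
    return alerts
```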
Optionally, the first deep neural network tracks the gaze, judges how it changes, and converts the change into a flight control command, for example: left, right, up, down, and so on.
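The conversion from a tracked gaze change to one of the discrete flight commands the patent lists (left, right, up, down) might look as follows; the dead zone that filters small eye jitter and the axis-dominance rule are illustrative assumptions:

```python
def gaze_to_command(dyaw_deg, dpitch_deg, dead_zone_deg=2.0):
    """Convert a change in gaze direction into a discrete flight command.
    The 2-degree dead zone and the rule that the larger axis wins are
    assumptions for illustration, not taken from the patent."""
    if abs(dyaw_deg) <= dead_zone_deg and abs(dpitch_deg) <= dead_zone_deg:
        return "hold"  # gaze essentially unchanged: keep the current heading
    if abs(dyaw_deg) >= abs(dpitch_deg):
        return "right" if dyaw_deg > 0 else "left"
    return "up" if dpitch_deg > 0 else "down"
```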
Optionally, the first deep neural network is a DRCNN comprising one or more convolutional layers, one or more pooling layers, and a fully connected layer; the convolutional layers use 3×3 kernels, and the activation function used by the DRCNN is the sigmoid function.
Optionally, the DRCNN uses a sine exponential loss function (Sine-Index-Softmax) to improve gaze-tracking accuracy. In the sine exponential loss function, θ_yi denotes the angle between the vector of sample i and that of its corresponding label y_i; b_yi denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_yi denotes the weight of sample i at its label y_i.
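The printed formula itself does not survive in this text; only the symbol glossary above remains. Purely as a hypothetical reconstruction consistent with that glossary (per-class logits built from weights w, sin(θ), and biases b, averaged over N samples), a sine-based margin softmax could look like:

```python
import math

def sine_index_softmax_loss(samples):
    """Hypothetical sketch of a Sine-Index-Softmax loss: the logit for
    class j is w[j] * sin(theta[j]) + b[j], and the loss is the mean
    negative log-probability of each sample's true label. The exact
    published formula is not recoverable from the text; this is only
    one form consistent with the symbol glossary.
    Each sample is (label_index, thetas, weights, biases)."""
    total = 0.0
    for y, thetas, w, b in samples:
        logits = [w[j] * math.sin(t) + b[j] for j, t in enumerate(thetas)]
        z = max(logits)  # subtract the max to stabilise the softmax
        log_sum = z + math.log(sum(math.exp(l - z) for l in logits))
        total += -(logits[y] - log_sum)  # cross-entropy for this sample
    return total / len(samples)
```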
Optionally, the pooling method of the pooling layer is:

S = f(e^(log w) + LOSS_SIS);

where S is the output of the current layer, f() is the activation function, and w is the weight of the current layer.
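With the sigmoid activation named above, the pooling rule can be evaluated directly; note that for w > 0 the term e^(log w) simplifies to w, so the layer effectively adds the current loss value to its weight before the activation. The scalar form below is an illustrative simplification:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def pooled_output(w, loss_sis):
    """S = f(e^(log w) + LOSS_SIS) with f = sigmoid; written literally,
    although e^(log w) is just w for positive weights. Scalar w and a
    scalar loss value are an illustrative simplification."""
    return sigmoid(math.exp(math.log(w)) + loss_sis)
```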
Optionally, the gaze concentration is obtained with a second deep neural network built on an attention mechanism, i.e. an attention neural network. Optionally, it may share convolutional features with the first neural network, or be trained independently to learn convolutional features suited to its own model. The attention neural network divides the user's attention into multiple speed levels, optionally 1, 2, 3, 4, 5, 6, 7 ... N, where the number denotes the speed level: the smaller the number, the faster the flight; conversely, the larger the number, the slower the flight.
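The attention-level-to-speed rule above (smaller level number, faster flight) can be sketched as follows; the linear interpolation and the 1-10 m/s speed range are assumptions, since the patent specifies only the ordering:

```python
def speed_from_attention(level, n_levels=7, v_max=10.0, v_min=1.0):
    """Map an attention level (1 = most focused = fastest) to a flight
    speed. The linear interpolation and the speed range are illustrative
    assumptions; the patent only fixes the ordering of the levels."""
    if not 1 <= level <= n_levels:
        raise ValueError("level out of range")
    frac = (n_levels - level) / (n_levels - 1)  # 1.0 at level 1, 0.0 at level N
    return v_min + (v_max - v_min) * frac
```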
The activation function used by the attention neural network is a cosine exponential activation function, denoted g(). In g(), θ_yi denotes the angle between the vector of sample i and that of its corresponding label y_i; N denotes the number of training samples; and w_yi denotes the weight of sample i at its label y_i.
Correspondingly, the present invention also proposes a virtual tourism remote experience method, characterised by:
capturing panoramic video frames in real time with a UAV equipped with a binocular camera;
obtaining the user's gaze direction with a gaze-tracking module, the gaze direction being obtained with a trained first deep neural network model;
realising remote flight control of the UAV with a flight control module, which uses the tracked gaze direction to control the UAV's flight direction;
processing the panoramic video frames captured by the binocular camera with a server processing module and sending them to a naked-eye 3D display; and
displaying the information on the naked-eye 3D display, giving the user an immersive flight experience.
Optionally, the UAV and the server processing module communicate at high speed through a 5G communication module, which sends the panoramic video frame data captured by the binocular camera to the server processing module.
Optionally, the method further comprises playing, through a 3D speaker module, audio captured during the UAV's flight.
Optionally, the method further comprises the control module controlling the UAV's flight direction and speed from the gaze direction and the gaze concentration: the higher the gaze concentration, the faster the forward speed, and the flight direction follows adjustments in the gaze direction.
Optionally, the method further comprises remote control of the UAV through a console module with operating buttons; and/or through a voice control module with voice commands; and/or through a gesture control module with recognised gestures.
Optionally, where the UAV control distance permits, users may provide their own UAV; where conditions do not allow this, a UAV can be rented from the scenic spot after payment. The spot's integrated control centre manages the rented UAVs: for example, if several airborne UAVs risk conflicting flight positions, it issues an alert in advance and assigns compliant flight positions and speeds, and in an emergency the lessor takes over flight authority directly to prevent collisions between UAVs. Once the risk has passed, authority is handed back to the user.
Optionally, the first deep neural network tracks the gaze, judges how it changes, and converts the change into a flight control command, for example: left, right, up, down, and so on.
Optionally, the first deep neural network is a DRCNN comprising one or more convolutional layers, one or more pooling layers, and a fully connected layer; the convolutional layers use 3×3 kernels, and the activation function used by the DRCNN is the sigmoid function.
Optionally, the DRCNN uses a sine exponential loss function (Sine-Index-Softmax) to improve gaze-tracking accuracy. In the sine exponential loss function, θ_yi denotes the angle between the vector of sample i and that of its corresponding label y_i; b_yi denotes the bias of sample i at its label y_i; b_j denotes the bias at output node j; N denotes the number of training samples; and w_yi denotes the weight of sample i at its label y_i.
Optionally, the pooling method of the pooling layer is:

S = f(e^(log w) + LOSS_SIS);

where S is the output of the current layer, f() is the activation function, and w is the weight of the current layer.
Optionally, the gaze concentration is obtained with a second deep neural network built on an attention mechanism, i.e. an attention neural network. Optionally, it may share convolutional features with the first neural network, or be trained independently to learn convolutional features suited to its own model. The attention neural network divides the user's attention into multiple speed levels, optionally 1, 2, 3, 4, 5, 6, 7 ... N, where the number denotes the speed level: the smaller the number, the faster the flight; conversely, the larger the number, the slower the flight.
The activation function used by the attention neural network is a cosine exponential activation function, denoted g(). In g(), θ_yi denotes the angle between the vector of sample i and that of its corresponding label y_i; N denotes the number of training samples; and w_yi denotes the weight of sample i at its label y_i.
The present application also proposes a computer-readable medium storing computer program instructions, the program instructions being able to execute any one of the methods proposed by the present invention.
In the description of this specification, reference to the terms "one embodiment", "example", "specific example" and the like means that a particular feature, structure, material or characteristic described in connection with that embodiment or example is included in at least one embodiment or example of the present invention. In this specification, schematic uses of these terms do not necessarily refer to the same embodiment or example.
计算机可读介质上包含的程序代码可以用任何适当的介质传输,包括但不限于无线、电线、光缆、 RF等等,或者上述的任意合适的组合。计算机可读介质可以是计算机可读信号介质或者计算机可读存储 介质。计算机可读存储介质例如可以是电、磁、光、电磁、红外线、或半导体的系统、装置或器件,或 者任意以上的组合。计算机可读存储介质的更具体的例子(非穷举的列表)包括:具有一个或多个导线的 电连接、便携式计算机磁盘、硬盘、随机存取存储器(RAM)、只读存储器(ROM)、可擦式可编程只读存储 器(EPROM或闪存)、光纤、便携式紧凑磁盘只读存储器(CD-ROM)、光存储器件、磁存储器件、或者上述 的任意合适的组合。在本文件中,计算机可读存储介质可以是任何包含或存储程序的有形介质,该程序可以被指令执行系统、装置或者器件使用或者与其结合使用。可以以一种或多种程序设计语言或其组合 来编写用于执行本发明操作的计算机程序代码,所述程序设计语言包括面向对象的程序设计语言—诸如 Java、Smalltalk、C++,还包括常规的过程式程序设计语言—诸如“C”语言或类似的程序设计语言。 程序代码可以完全地在用户计算机上执行、部分地在用户计算机上执行、作为一个独立的软件包执行、 部分在用户计算机上部分在远程计算机上执行、或者完全在远程计算机或服务器上执行。在涉及远程计 算机的情形中,远程计算机可以通过任意种类的网络——包括局域网(LAN)或广域网(WAN)连接到用户计 算机,或者,可以连接到外部计算机(例如利用因特网服务提供商来通过因特网连接)。上述以软件功能 单元的形式实现的集成的单元,可以存储在一个计算机可读取存储介质中。上述软件功能单元存储在一 个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等) 或处理器(processor)执行本发明各个实施例所述方法的部分步骤。而前述的存储介质包括:U盘、移动 硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或 者光盘等各种可以存储程序代码的介质。Program code embodied on a computer readable medium may be transmitted using any suitable medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. The computer-readable storage medium can be, for example, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device, or a combination of any of the above. More specific examples (a non-exhaustive list) of computer readable storage media include: electrical connections having one or more wires, portable computer disks, hard disks, random access memory (RAM), read only memory (ROM), Erasable programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage devices, magnetic storage devices, or any suitable combination of the above. 
In this document, a computer-readable storage medium can be any tangible medium that contains or stores a program that can be used by or in conjunction with an instruction execution system, apparatus, or device. Computer program code for carrying out operations of the present invention may be written in one or more programming languages, including object-oriented programming languages—such as Java, Smalltalk, C++, but also conventional Procedural programming language - such as the "C" language or similar programming language. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (eg, using an Internet service provider through the Internet connect). The above-mentioned integrated units implemented in the form of software functional units can be stored in a computer-readable storage medium. The above-mentioned software function unit is stored in a storage medium, and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) or a processor (processor) to execute the methods described in the various embodiments of the present invention. some steps. The aforementioned storage medium includes: U disk, removable hard disk, read-only memory (Read-Only Memory, ROM), random access memory (Random Access Memory, RAM), magnetic disk or optical disk and other media that can store program codes .
The above are only preferred embodiments of the present invention and are not intended to limit its patent scope; any equivalent structure or equivalent process transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present invention. The preferred embodiments disclosed above serve only to help illustrate the present invention; they do not exhaustively describe every detail, nor do they limit the invention to the specific embodiments described. Obviously, many modifications and variations are possible in light of this specification. These embodiments were selected and described in detail in order to better explain the principles and practical applications of the invention, so that those skilled in the art can understand and make good use of it. The invention is limited only by the claims and their full scope and equivalents.
Claims (10)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110754987.6A CN113534835B (en) | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110754987.6A CN113534835B (en) | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113534835A CN113534835A (en) | 2021-10-22 |
CN113534835B true CN113534835B (en) | 2022-05-31 |
Family
ID=78126648
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110754987.6A Active CN113534835B (en) | 2021-07-01 | 2021-07-01 | A kind of tourism virtual remote experience system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113534835B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN119648478A (en) * | 2024-11-13 | 2025-03-18 | 贵州迦太利华信息科技有限公司 | Immersive digital tourism experience configuration method and system based on eye tracking technology |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107065905A (en) * | 2017-03-23 | 2017-08-18 | 东南大学 | A kind of immersion unmanned aerial vehicle control system and its control method |
CN109032183A (en) * | 2018-08-23 | 2018-12-18 | 广州创链科技有限公司 | A kind of unmanned plane control device and method based on Pupil diameter |
CN110412996A (en) * | 2019-06-18 | 2019-11-05 | 中国人民解放军军事科学院国防科技创新研究院 | It is a kind of based on gesture and the unmanned plane control method of eye movement, device and system |
CN111277756A (en) * | 2020-02-13 | 2020-06-12 | 西安交通大学 | Camera control method of small multi-rotor UAV based on eye recognition and tracking technology |
CN112738498A (en) * | 2020-12-24 | 2021-04-30 | 京东方科技集团股份有限公司 | A virtual tour system and method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GR20130100620A (en) * | 2013-10-25 | 2015-05-18 | Ιωαννης Γεωργιου Μικρος | System and method for the electronic guidance of drones (take-off/ landing on the ground or on a ship) |
KR102353231B1 (en) * | 2015-04-24 | 2022-01-20 | 삼성디스플레이 주식회사 | Flying Display |
Non-Patent Citations (1)
Title |
---|
凝视控制系统 (Gaze Control System: Human-Eye Control of a UAV); 无双 (Wushuang); WeChat Official Account (《微信公众号》); 2019-01-28; body text, pp. 1-4 * |
Also Published As
Publication number | Publication date |
---|---|
CN113534835A (en) | 2021-10-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2024203150B2 (en) | System and method for augmented and virtual reality | |
US10678266B2 (en) | Method and system for continued navigation of unmanned aerial vehicles beyond restricted airspace boundaries | |
US20200309557A1 (en) | Systems and methods for providing virtual navigation guidance | |
KR101917630B1 (en) | System and method for augmented and virtual reality | |
US20190156558A1 (en) | Virtual reality system | |
CN112669464B (en) | A method and device for sharing data | |
CN108351649A (en) | System and method for UAV interactive instructions and control | |
CN106648045A (en) | Real-time tourism experience system based on virtual reality technology | |
AU2015332046A1 (en) | Street-level guidance via route path | |
CN106951561A (en) | Electronic map system based on virtual reality technology and GIS data | |
US11107506B2 (en) | Method and system for combining and editing UAV operation data and video data | |
CN108650494A (en) | The live broadcast system that can obtain high definition photo immediately based on voice control | |
TW202324042A (en) | Gaze-based camera auto-capture | |
CN113534835B (en) | A kind of tourism virtual remote experience system and method | |
KR20240005727A (en) | Panoptic segmentation prediction for augmented reality | |
WO2022188151A1 (en) | Image photographing method, control apparatus, movable platform, and computer storage medium | |
CN119110924A (en) | Navigation corrections for extreme winds | |
Beesley | Head in the Clouds: documenting the rise of personal drone cultures | |
Kusuma et al. | Filming with drones: is AI the new filmmaker? | |
CN119603558A (en) | Shooting control method, device, system and computer readable storage medium | |
CN113467616A (en) | Augmented reality processing method and related device, vehicle and storage medium | |
CN119277134A (en) | Display method, system, device and equipment based on head-mounted display device | |
CN118505944A (en) | Online browsing method, device, equipment and storage medium | |
CN119031112A (en) | HUD screen display method, device, equipment, medium and vehicle | |
Kang | Interactive and Intelligent Camera View Composing |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||