CN115857702B - Scene roaming and visual angle conversion method under space scene - Google Patents


Info

Publication number
CN115857702B
CN115857702B (application CN202310176406.4A)
Authority
CN
China
Prior art keywords
observation
star
observation point
scene
space scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310176406.4A
Other languages
Chinese (zh)
Other versions
CN115857702A (en)
Inventor
张俊杰
刘伟
刘瑞林
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Guoxing Chuangtu Technology Co ltd
Original Assignee
Beijing Guoxing Chuangtu Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Guoxing Chuangtu Technology Co ltd filed Critical Beijing Guoxing Chuangtu Technology Co ltd
Priority to CN202310176406.4A priority Critical patent/CN115857702B/en
Publication of CN115857702A publication Critical patent/CN115857702A/en
Application granted granted Critical
Publication of CN115857702B publication Critical patent/CN115857702B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention relates to the technical field of space scene roaming, and in particular to a scene roaming and viewing-angle conversion method in a space scene. The method comprises the following steps: acquiring space scene data and constructing a virtual space scene; establishing star coordinates for the stars in the space scene, establishing an observation point according to the current viewing angle, determining the star coordinate nearest to the viewing-angle range of the observation point, converting that star coordinate into the position of a new observation point, and observing and roaming the space scene from the observation point at that position; generating a viewing-angle distance from the distance between the observation point and the star coordinate, letting the current observation point determine an auxiliary star coordinate through the viewing-angle distance, then observing and roaming the space scene from an observation point at the position of the auxiliary star coordinate, and so on; the plurality of observation points thus formed constitutes a comprehensive observation range. Because the viewing angle of the observation point changes from star to star in an orderly manner, the user's field of view and range are wider when observing the space scene.

Description

Scene roaming and visual angle conversion method under space scene
Technical Field
The invention relates to the technical field of space scene roaming, and in particular to a scene roaming and viewing-angle conversion method in a space scene.
Background
Space scene roaming is realized by constructing a virtual space scene. When a user roams the current virtual space scene, observation is usually carried out over a random range of the scene. For beginners who have only an initial knowledge of the space environment, such random roaming is unfavorable for understanding it: the space environment is complex, and so the virtual space scene is complex as well, and random roaming greatly reduces a beginner's efficiency in learning about the space environment and degrades the subsequent experience within the space scene. How to roam and observe a space scene in an orderly manner is therefore a problem that currently needs to be solved.
When roaming in space, the user's observation viewing angle changes, and how to provide the user with a viewing-angle conversion adapted to orderly roaming, so as to help the user understand the space scene, is also a problem to be solved.
Disclosure of Invention
The invention aims to provide a scene roaming and viewing-angle conversion method in a space scene, so as to solve the problems set forth in the background art.
In order to achieve the above object, the present invention provides a method for scene roaming and viewing-angle conversion in a space scene, comprising the following steps:
s1, acquiring space scene data and building a virtual space scene;
s2, establishing star coordinates of a star in the space scene, establishing an observation point according to the current view angle, determining the star coordinates closest to the view angle range of the observation point, converting the star coordinates into the position of a new observation point, and observing the space scene and roaming according to the observation point of the position;
s3, generating visual angle distance according to the distance between the observation point and the star coordinates, enabling the current observation point to determine auxiliary star coordinates through the visual angle distance, then observing space scenes and roaming through the observation point according to the positions of the auxiliary star coordinates, and the like;
s4, forming a comprehensive observation range by the plurality of observation points, determining the central position of the comprehensive observation range, establishing the position as an extremely observation point, and then carrying out panoramic view observation and scene roaming on the space by the extremely observation point.
As a further improvement of the technical scheme, the method for acquiring the space scene data comprises the following steps:
the information and pictures of the current space scene are retrieved from a network database, and the virtual space scene is constructed from this information and these pictures.
As a further improvement of the technical scheme, the constructed space scene comprises the following material conditions:
the celestial body system, specifically: various stars, black holes, and the like;
the star cloud system, specifically: star clouds, cloud masses, and the like;
the matter and energy attached to the celestial system, specifically: minerals, plants, and animals, where the states of matter comprise gases, liquids, and solids, and the energy specifically comprises light, wind, and the like.
As a further improvement of the technical scheme, the positions of the observation points are specifically as follows:
initial position: the initial position at which the user enters the space scene is established as an observation point;
viewing-angle conversion position: the star nearest to the viewing-angle range of the observation point is determined, a star coordinate is established at the position of that star, and the observation point is changed to the position of the star coordinate, so that observation and roaming are carried out from the viewing angle of the star coordinate;
viewing-angle spacing and position of the new observation point: a viewing-angle distance is generated according to the distance between the observation point and the star coordinate, and the change of position of each subsequent observation point is determined by the viewing-angle distance;
coupling position generated by the plurality of observation points: a plurality of observation points exist, the distances between adjacent observation points are the same, the plurality of observation points form a comprehensive observation range, and a polar observation point is established at the central position of the comprehensive observation range.
As a further improvement of the technical scheme, after the comprehensive observation range is formed, when the space scene is observed from the polar observation point within the comprehensive observation range, the virtual stars are set into a rotating state and are comprehensively observed from the polar observation point.
As a further improvement of the technical scheme, the influence of the viewing-angle distance on the position of the observation point falls into the following three cases:
when the viewing-angle distance equals the initial viewing-angle distance, the user's observation and roaming of the space scene are in the initial state;
when the viewing-angle distance is larger than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is enlarged, and the user obtains a wider field of view for space roaming observation;
when the viewing-angle distance is smaller than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is reduced, and the user's field of view in space roaming observation becomes more precise.
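The three cases above can be summarized in a small sketch. The patent only states that a larger viewing-angle distance widens the roaming range and a smaller one narrows it; the linear scaling law and the names below are assumptions chosen for illustration, not the claimed formula.

```python
def roaming_range(initial_range, viewing_angle_distance, initial_distance):
    """Observation roaming range as a function of the viewing-angle distance.

    A simple linear model (an assumption): the range grows in proportion to
    the ratio of the current viewing-angle distance to the initial one.
    """
    return initial_range * (viewing_angle_distance / initial_distance)

# Case 1: distance equals the initial distance -> initial state.
initial_state = roaming_range(100.0, 2.0, 2.0)
# Case 2: larger distance -> wider field of view for roaming observation.
wider = roaming_range(100.0, 3.0, 2.0)
# Case 3: smaller distance -> narrower, more precise field of view.
narrower = roaming_range(100.0, 1.0, 2.0)
```

Any monotonically increasing function of the distance ratio would satisfy the three cases equally well; the linear form is simply the most direct choice.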
As a further improvement of the technical scheme, a difference coordinate is derived from the positions of the observation point and the star coordinate, and the difference coordinate determines a difference star coordinate at the viewing-angle distance, so that the position of the difference star coordinate becomes an observation point.
As a further improvement of the technical scheme, the difference star coordinate is affected by the above viewing-angle spacing.
Compared with the prior art, the invention has the beneficial effects that:
1. In the scene roaming and viewing-angle conversion method under a space scene, the distance between adjacent stars is determined so that the viewing angle of the observation point changes from star to star, and the observation point is then moved equidistantly according to the initial distance between adjacent stars. The field of view therefore changes in an orderly manner, the user roams the space scene in an orderly manner, the user's field of view and range when observing the space scene are wider, and the user comes to know the current space environment faster.
2. In the scene roaming and viewing-angle conversion method under a space scene, observation points are formed at equidistant stars, so that a plurality of observation points is obtained; an optimal observation point is then established at the centre of the plurality of observation points, and from it the user observes the space scene more comprehensively and gains a more complete understanding.
Drawings
Fig. 1 is an overall flow chart of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1: referring to fig. 1, the present embodiment provides a method for scene roaming and view angle conversion in a space scene, which includes the following steps:
s1, acquiring space scene data and building a virtual space scene;
s2, establishing star coordinates of a star in the space scene, establishing an observation point according to the current view angle, determining the star coordinates closest to the view angle range of the observation point, converting the star coordinates into the position of a new observation point, and observing the space scene and roaming according to the observation point of the position;
s3, generating visual angle distance according to the distance between the observation point and the star coordinates, enabling the current observation point to determine auxiliary star coordinates through the visual angle distance, then observing space scenes and roaming through the observation point according to the positions of the auxiliary star coordinates, and the like;
s4, forming a comprehensive observation range by the plurality of observation points, determining the central position of the comprehensive observation range, establishing the position as an extremely observation point, and then carrying out panoramic view observation and scene roaming on the space by the extremely observation point; in the scheme, firstly, data of a space scene are acquired, then, a virtual space scene is correspondingly built according to the acquired data, a star system is covered in the built space scene, the star system consists of tens of thousands of stars, when a user is in the space scene, the current observation view angle of the user is taken as an observation point, the star and the scene in the space are observed, then, the user can determine the star closest to the current observation point from the view angle of the current observation point, establish star coordinates at the current star point, re-determine the star coordinates as the observation point, the user performs space observation and scene roaming at the observation point, the distance between the observation point and the star coordinates is generated, the distance between the view angles is also the distance which can be changed, then, the change of each observation point is influenced by the distance, then, the user can determine the position of the next star and record as auxiliary star coordinates in a manner of the view angle distance, observe the position of the auxiliary star coordinates and realize the roaming of the scene, the position of the subsequent space points is gradually similar to the observation point, and the situation of the star is also realized, and the distance between the observation points and the positions of the user is different in the observation point is realized, and the distance between the observation points is also realized, and the distance of the user is more rapid;
then, for the plurality of observation points formed as described above, a comprehensive observation range in the space scene is established from the plurality of observation points. The central position of the comprehensive observation range is determined and established as the polar observation point, and the space is then observed in panoramic view and the scene roamed from the polar observation point. At this moment the polar observation point is at the same distance from each of the stars, so that the user observes the stars in space in an equidistant manner; and because the distances between the polar observation point and the stars are the same, and the distances between the stars themselves are the same, the user comes to know the environment and material state of space through this orderly and targeted formation.
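A minimal sketch of step S4 follows, reading the "central position of the comprehensive observation range" as the centroid of the observation points. This is one reasonable interpretation only; the patent gives no formula, and the function name is a hypothetical label.

```python
def polar_observation_point(observation_points):
    """Centre of the comprehensive observation range, taken as the centroid
    of the observation points (an assumed reading of 'central position')."""
    n = len(observation_points)
    return tuple(sum(axis) / n for axis in zip(*observation_points))

# Four equidistant observation points at the corners of a square: the polar
# observation point falls at the centre, the same distance from each of them,
# matching the equidistance property the description attributes to it.
points = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (2.0, 2.0, 0.0), (0.0, 2.0, 0.0)]
center = polar_observation_point(points)
```

Note that the centroid is equidistant from all observation points only for symmetric arrangements such as this one; for irregular layouts a circumcentre-style construction would be needed to preserve strict equidistance.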
The method for acquiring the space scene data in this scheme comprises the following steps:
the information and pictures of the current space scene are retrieved from a network database, and the virtual space scene is built from this information and these pictures, so that the environmental state of the current space is rapidly determined and the space scene is constructed.
The built space scene includes the following material conditions:
the celestial body system, specifically: various stars, black holes, and the like;
the star cloud system, specifically: star clouds, cloud masses, and the like;
the matter and energy attached to the celestial system, specifically: minerals, plants, and animals, where the states of matter comprise gases, liquids, and solids, and the energy specifically comprises light, wind, and the like. These conditions are then built correspondingly into the space scene, realizing a virtual space scene that closely matches the current real space environment. This improves the user's subsequent experience in the space scene, attracts the user to experience and come to know the space environment, and improves users' awareness of the space environment in the current world. Moreover, when the space scene closely matches the real space environment, it also provides convenience for subsequent scientific research: for example, researchers can rehearse in the virtual space scene operations to be carried out in the real space environment, which further promotes the development of science and technology.
Further explanation of the above observation points: the positions of the observation points are specifically as follows:
initial position: the initial position at which the user enters the space scene is established as an observation point;
viewing-angle conversion position: the star nearest to the viewing-angle range of the observation point is determined, a star coordinate is established at the position of that star, and the observation point is changed to the position of the star coordinate, so that observation and roaming are carried out from the viewing angle of the star coordinate;
viewing-angle spacing and position of the new observation point: a viewing-angle distance is generated according to the distance between the observation point and the star coordinate, and the change of position of each subsequent observation point is determined by the viewing-angle distance;
coupling position generated by the plurality of observation points: a plurality of observation points exist, the distances between adjacent observation points are the same, the plurality of observation points form a comprehensive observation range, and a polar observation point is established at the central position of the comprehensive observation range. The positions of the observation points are thus divided into the initial position, the viewing-angle conversion position, the viewing-angle spacing and position of the new observation point, and the coupling position generated by the plurality of observation points; since the position of each new observation point is influenced by the viewing-angle distance, the positions of the observation points can be extended and are not limited to one, two, or three of the above.
Secondly, the influence of the viewing-angle distance on the position of the observation point falls into the following three cases:
when the viewing-angle distance equals the initial viewing-angle distance, the user's observation and roaming of the space scene are in the initial state;
when the viewing-angle distance is larger than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is enlarged, and the user obtains a wider field of view for space roaming observation;
when the viewing-angle distance is smaller than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is reduced, and the user's field of view in space roaming observation becomes more precise. Furthermore, each time the user enters the space scene, a different field of view for space roaming observation can be obtained by setting a different viewing-angle distance, realizing rapid observation of the space scene; the enlargement and reduction of the field of view also improve the user's experience when observing the space scene.
Example 2: after the comprehensive observation range is formed, when the space scene is observed from the polar observation point within the comprehensive observation range, the virtual stars are set into a rotating state and are comprehensively observed from the polar observation point. With the virtual stars rotating, the user can observe from the polar observation point the conditions of a large number of stars in the space environment, and so come to know the space world quickly and comprehensively.
Example 3: a difference coordinate is derived from the positions of the observation point and the star coordinate, and the difference coordinate determines a difference star coordinate at the viewing-angle distance, so that the position of the difference star coordinate becomes an observation point. The observation point is determined by the star coordinate, and the user observes from the positions of the stars, realizing space scene observation at different positions. After the difference star coordinate is determined, the observation point is converted to the position of the difference star coordinate for space observation; the viewing angle from which the space scene is observed is thereby changed and the observation range of the space scene is enlarged, making up for the shortcomings of the previous observation mode, so that the user can roam and observe the space scene in an orderly and comprehensive manner.
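Example 3 can be given a geometric sketch under stated assumptions: the difference coordinate is read as the displacement vector from the observation point to the star coordinate, and the difference star coordinate is placed at the viewing-angle distance along that direction. The patent provides no explicit formulas, so this construction and the names below are illustrative only.

```python
import math

def difference_star_coordinate(observation_point, star_coordinate,
                               viewing_angle_distance):
    """Place the difference star coordinate at the viewing-angle distance
    along the direction from the observation point toward the star coordinate
    (an assumed geometric reading of Example 3)."""
    # Difference coordinate: displacement from observation point to star.
    diff = tuple(s - o for o, s in zip(observation_point, star_coordinate))
    norm = math.sqrt(sum(d * d for d in diff))
    # Scale the unit direction by the viewing-angle distance.
    return tuple(o + viewing_angle_distance * d / norm
                 for o, d in zip(observation_point, diff))

# Increasing the viewing-angle distance extends the position of the
# difference star coordinate along the same direction, widening the view.
new_point = difference_star_coordinate((0.0, 0.0, 0.0), (0.0, 3.0, 4.0), 10.0)
```

Under this reading, enlarging the viewing-angle distance pushes the difference star coordinate further out along the fixed direction, which matches the effect the following paragraph describes.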
The difference star coordinate is affected by the above viewing-angle spacing, specifically:
when the viewing-angle distance is increased, the position of the difference star coordinate is extended, that is, the position of the observation point at the difference star coordinate changes, and when the user observes the space scene from the changed observation point, the field of view is enlarged;
conversely, when the viewing-angle distance is reduced, the states of the adjacent stars are observed more precisely when the user observes the space scene. The whole scheme is thereby made more specific and complete, facilitating implementation by those skilled in the art.
The foregoing has shown and described the basic principles, principal features and advantages of the invention. It will be understood by those skilled in the art that the present invention is not limited to the above-described embodiments, and that the above-described embodiments and descriptions are only preferred embodiments of the present invention, and are not intended to limit the invention, and that various changes and modifications may be made therein without departing from the spirit and scope of the invention as claimed. The scope of the invention is defined by the appended claims and equivalents thereof.

Claims (1)

1. A scene roaming and viewing-angle conversion method in a space scene, characterized in that the method comprises the following steps:
s1, acquiring space scene data and building a virtual space scene;
s2, establishing star coordinates of a star in a space scene, establishing an initial observation point according to a current view angle, observing the star and the scene of the space, determining a star closest to the current observation point according to the view angle of the current observation point, converting the star coordinates of the closest star into positions of new observation points, and observing the space scene and roaming according to the observation point of the positions;
s3, generating visual angle distance according to the distance between the initial observation point and the star coordinates of the nearest star, enabling the current observation point to determine auxiliary star coordinates through the visual angle distance, then observing space scenes and roaming through the positions of the auxiliary star coordinates by the observation point, and generating a plurality of observation points by pushing the auxiliary star coordinates;
s4, forming a comprehensive observation range by the plurality of observation points, determining the central position of the comprehensive observation range, establishing the position as an extremely observation point, and then carrying out panoramic view observation and scene roaming on the space by the extremely observation point;
the positions of the observation points are specifically as follows:
initial position: the initial position at which the user enters the space scene is established as an observation point;
viewing-angle conversion position: the star nearest to the viewing-angle range of the observation point is determined, a star coordinate is established at the position of that star, and the observation point is changed to the position of the star coordinate, so that observation and roaming are carried out from the viewing angle of the star coordinate;
viewing-angle spacing and position of the new observation point: a viewing-angle distance is generated according to the distance between the initial observation point and the star coordinate of the closest star, the current observation point determines the auxiliary star coordinate through the viewing-angle distance, and the change of the auxiliary star coordinate of each subsequent new observation point is determined by the viewing-angle distance;
coupling position generated by the plurality of observation points: a plurality of observation points exist, the distances between adjacent observation points are the same, the plurality of observation points form a comprehensive observation range, and a polar observation point is established at the central position of the comprehensive observation range;
after the comprehensive observation range is formed, when the space scene is observed from the polar observation point within the comprehensive observation range, the virtual stars are set into a rotating state and are comprehensively observed from the polar observation point;
the influence of the viewing-angle distance on the position of the observation point falls into three cases:
when the viewing-angle distance equals the initial viewing-angle distance, the user's observation and roaming of the space scene are in the initial state;
when the viewing-angle distance is larger than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is enlarged;
when the viewing-angle distance is smaller than the initial viewing-angle distance, the range of the user's observation and roaming of the space scene is reduced;
the method for acquiring the space scene data comprises the following steps: retrieving information and pictures of the current space scene from a network database, and constructing the virtual space scene from this information and these pictures;
the constructed space scene comprises the following material elements: the celestial body system, specifically: various stars and black holes; the star cloud system, specifically: star clouds and cloud masses; the matter and energy attached to the celestial system, specifically: minerals, plants, and animals, where the states of matter comprise gases, liquids, and solids, and the energy specifically comprises light and wind;
the method comprises the steps of deriving a difference coordinate according to the position between an observation point and a star coordinate, determining the difference star coordinate at a viewing angle distance by the difference coordinate, and enabling the position of the difference star coordinate to be the observation point, wherein the observation point is determined by the star coordinate, the positions of the stars are used for observation, so that space scene observation at different positions is realized, after the difference star coordinate is determined, the observation point is converted to the position of the difference star coordinate for space observation, the viewing angle for observing the space scene is changed, and the space scene observation range is enlarged;
the star coordinate is influenced by the visual angle distance;
when the viewing angle distance is increased, the position of the difference star coordinate is prolonged, namely the position of the observation point on the difference star coordinate is changed, and when the changed observation point observes the space scene, the observation field of view is increased;
on the contrary, when the visual angle is reduced at a certain distance, the state condition of the adjacent star is observed more accurately when the space scene is observed.
CN202310176406.4A 2023-02-28 2023-02-28 Scene roaming and visual angle conversion method under space scene Active CN115857702B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310176406.4A CN115857702B (en) 2023-02-28 2023-02-28 Scene roaming and visual angle conversion method under space scene

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310176406.4A CN115857702B (en) 2023-02-28 2023-02-28 Scene roaming and visual angle conversion method under space scene

Publications (2)

Publication Number Publication Date
CN115857702A CN115857702A (en) 2023-03-28
CN115857702B true CN115857702B (en) 2024-02-02

Family

ID=85659333

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310176406.4A Active CN115857702B (en) 2023-02-28 2023-02-28 Scene roaming and visual angle conversion method under space scene

Country Status (1)

Country Link
CN (1) CN115857702B (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1414496A (en) * 2002-11-13 2003-04-30 北京航空航天大学 Universal virtual environment roaming engine computer system
CN102636172A (en) * 2012-05-04 2012-08-15 深圳市凯立德科技股份有限公司 Electronic map dynamic view angle adjusting method and terminal
CN103679799A (en) * 2012-09-25 2014-03-26 中国航天科工集团第二研究院二〇七所 A method for fast generating an optical starry sky background
CN106445106A (en) * 2016-08-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Space virtual scene experience method and system
WO2018078444A1 (en) * 2016-10-26 2018-05-03 Zhonglian Shengshi Culture (Beijing) Co.Ltd. Image display method, client terminal and system, and image sending method and server
CN108257219A (en) * 2018-01-31 2018-07-06 广东三维家信息科技有限公司 A kind of method for realizing the roaming of panorama multiple spot
CN108594996A (en) * 2018-04-16 2018-09-28 微幻科技(北京)有限公司 The method and device of automatic visual angle adjustment in a kind of virtual roaming
CN111968246A (en) * 2020-07-07 2020-11-20 北京城市网邻信息技术有限公司 Scene switching method and device, electronic equipment and storage medium
CN112308948A (en) * 2020-09-30 2021-02-02 深圳市聚点数码科技有限公司郑州分公司 Construction method and application of light field roaming model for house property marketing
CN114387400A (en) * 2022-01-18 2022-04-22 北京有竹居网络技术有限公司 Three-dimensional scene display method, display device, electronic equipment and server
CN115097975A (en) * 2022-07-08 2022-09-23 北京有竹居网络技术有限公司 Method, apparatus, device and storage medium for controlling view angle conversion
CN115423935A (en) * 2022-08-12 2022-12-02 广州市城市规划设计有限公司 Visual domain analysis method and system for historical culture block

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013074997A1 (en) * 2011-11-18 2013-05-23 Infinite Z, Inc. Indirect 3d scene positioning control

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1414496A (en) * 2002-11-13 2003-04-30 北京航空航天大学 Universal virtual environment roaming engine computer system
CN102636172A (en) * 2012-05-04 2012-08-15 深圳市凯立德科技股份有限公司 Electronic map dynamic view angle adjusting method and terminal
CN103679799A (en) * 2012-09-25 2014-03-26 中国航天科工集团第二研究院二〇七所 A method for fast generating an optical starry sky background
CN106445106A (en) * 2016-08-29 2017-02-22 宇龙计算机通信科技(深圳)有限公司 Space virtual scene experience method and system
WO2018078444A1 (en) * 2016-10-26 2018-05-03 Zhonglian Shengshi Culture (Beijing) Co.Ltd. Image display method, client terminal and system, and image sending method and server
CN107995477A (en) * 2016-10-26 2018-05-04 中联盛世文化(北京)有限公司 Image presentation method, client and system, image sending method and server
CN108257219A (en) * 2018-01-31 2018-07-06 广东三维家信息科技有限公司 Method for realizing panoramic multi-point roaming
CN108594996A (en) * 2018-04-16 2018-09-28 微幻科技(北京)有限公司 Method and device for automatic viewing angle adjustment in virtual roaming
CN111968246A (en) * 2020-07-07 2020-11-20 北京城市网邻信息技术有限公司 Scene switching method and device, electronic equipment and storage medium
CN112308948A (en) * 2020-09-30 2021-02-02 深圳市聚点数码科技有限公司郑州分公司 Construction method and application of light field roaming model for house property marketing
CN114387400A (en) * 2022-01-18 2022-04-22 北京有竹居网络技术有限公司 Three-dimensional scene display method, display device, electronic equipment and server
CN115097975A (en) * 2022-07-08 2022-09-23 北京有竹居网络技术有限公司 Method, apparatus, device and storage medium for controlling view angle conversion
CN115423935A (en) * 2022-08-12 2022-12-02 广州市城市规划设计有限公司 Viewshed analysis method and system for historical and cultural blocks

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhai Yujia. Construction and Application of a Teaching Module on Landscape Spatial Scale Perception Based on Immersive Virtual Environment Technology. Chinese Landscape Architecture, 2018, No. 04. *
Zhang Xuan. Discussion on Interaction Issues in Virtual Scenes. Journal of Lanzhou Polytechnic College, 2008, No. 03. *

Also Published As

Publication number Publication date
CN115857702A (en) 2023-03-28

Similar Documents

Publication Publication Date Title
KR102125959B1 (en) Method and apparatus for determining a matching relationship between point cloud data
CN106296812B (en) Simultaneous localization and mapping method
US5764239A (en) Automatic graph layout apparatus and methods determining and using a stable state of a physical model
CN112305559A (en) Power transmission line distance measuring method, device and system based on ground fixed-point laser radar scanning and electronic equipment
CN105101093A (en) Network topology visualization method with respect to geographical location information
CN106441298A (en) Method for human-machine interaction with map data using robot view images
CN107169260B (en) Heterogeneous multi-source data resonance system and method based on space-time trajectory
CN105955067A (en) Multi-satellite intelligent cluster control simulation system based on quadrotor unmanned aerial vehicles and simulation method using the same
WO2023273415A1 (en) Positioning method and apparatus based on unmanned aerial vehicle, storage medium, electronic device, and product
CN105096376A (en) Information processing method and electronic device
CN115857702B (en) Scene roaming and visual angle conversion method under space scene
WO2019151569A1 (en) Method for providing three-dimensional geographic information system web service
CN107067454B (en) Radar force diagram three-dimensional display method based on hypergraph component development platform
KR20210040294A (en) Method, device, electronic equipment and storage medium for querying historical street view
CN111199584A (en) Target object positioning virtual-real fusion method and device
CN110933682A (en) Automatic site selection method for unmanned aerial vehicle base station
Lovtsov et al. Automated special purpose optical electronic system’s functional diagnosis
CN105957005A (en) Method for bridge image splicing based on feature points and structure lines
CN109827503A (en) Field crop precise positioning method and apparatus
CN106558074A (en) Coarse-fine combined matching algorithm based on rotation transformation matrix for satellite assembly
CN113034679A (en) Real-time dynamic three-dimensional visual display method for power transmission line
CN112417226A (en) Big data processing method based on DAG transformation
CN107688431A (en) Man-machine interaction method based on radar fix
Wang et al. Recognition and locating of damaged poles in distribution network through images shot by unmanned aerial vehicle (UAV)
CN105957074A (en) Line segment matching method and system based on V-shape intersection description and local homography matrix

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant