CN110968705A - Navigation method, navigation device, navigation equipment, navigation system and storage medium

Info

Publication number
CN110968705A
CN110968705A
Authority
CN
China
Prior art keywords
navigation
current
unit
viewing angle
file
Prior art date
Legal status
Granted
Application number
CN201911230100.2A
Other languages
Chinese (zh)
Other versions
CN110968705B (en)
Inventor
巩一璞
王芳芳
杜鹃
王小伟
王顺仁
Current Assignee
DUNHUANG ACADEMY
Original Assignee
DUNHUANG ACADEMY
Priority date
Filing date
Publication date
Application filed by DUNHUANG ACADEMY
Priority to CN201911230100.2A
Publication of CN110968705A
Application granted
Publication of CN110968705B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/40: Information retrieval of multimedia data, e.g. slideshows comprising image and additional audio data
    • G06F 16/43: Querying
    • G06F 16/438: Presentation of query results
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 50/00: Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q 50/10: Services
    • G06Q 50/14: Travel agencies
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to a navigation method, a navigation device, navigation equipment, a navigation system and a storage medium. The method comprises: determining the identifier of the space to be navigated where the user is currently located; querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquiring the current orientation information of the navigation device itself; determining, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and displaying the current preset viewing angle and the unit to be explained. The content of the explanation unit displayed by the navigation device and the panoramic image displayed at the current preset viewing angle both change as the user's orientation changes.

Description

Navigation method, navigation device, navigation equipment, navigation system and storage medium
Technical Field
The present application belongs to the field of communications, and in particular relates to a navigation method, a navigation apparatus, navigation equipment, a navigation system, and a storage medium.
Background
When visiting cultural landscapes, a tourist can scan two-dimensional codes or enter specific numbers posted around the scenic spot with a mobile terminal such as a mobile phone to determine his or her position, after which the mobile terminal obtains the tour commentary for the landscape corresponding to the current position.
However, for some architectural landscapes, the different orientations within the same building space (e.g., four walls and a ceiling) must first be identified with numerous directional terms before they can be described. For tourists unfamiliar with the content of the building space, it is difficult to quickly find the part of the landscape corresponding to the commentary, so the navigation effect is poor and the interactive experience is unsatisfactory.
Disclosure of Invention
In view of the above, an object of the present invention is to provide a navigation method, a navigation apparatus, navigation equipment, a navigation system, and a storage medium, which can display to the user navigation data corresponding to the landscape the user is viewing, avoid inconsistency between the navigation data and the content the user actually sees, and improve the user experience.
The embodiment of the application is realized as follows:
In a first aspect, an embodiment of the present application provides a navigation method applied to a navigation device, the method comprising: determining the identifier of the space to be navigated where the user is currently located; querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquiring the current orientation information of the navigation device itself; determining, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and displaying the current preset viewing angle and the unit to be explained. The image at the current preset viewing angle of the panoramic view displayed by the navigation device corresponds to the content of the wall surface the user currently faces, and the content of the current unit to be explained is likewise related to that wall surface; that is, the displayed panoramic image at the current preset viewing angle and the unit to be explained change as the user's orientation changes. The user therefore no longer needs to locate the narrated content from directional terms in the navigation data, never experiences a mismatch between the navigation data and the content actually seen, and the user experience is improved.
With reference to the first aspect, in a possible implementation, a signal transmitter is disposed in each space to be navigated, and the signal transmitted by the signal transmitter includes a tag of the space to be navigated in which the transmitter is located; determining the identifier of the space to be navigated where the user is currently located comprises: acquiring signals transmitted by the signal transmitters; and determining the tag included in the signal with the strongest signal strength among the acquired signals as the identifier.
With reference to the first aspect, in a possible implementation, the number of preset viewing angles included in the panoramic view file equals the number of wall surfaces of the space to be navigated corresponding to the panoramic view file; the panoramic image corresponding to each preset viewing angle corresponds to a plurality of explanation units, each explanation unit is annotated with a polygon and an anchor link is created, and clicking the anchor link pops up the detailed content view included in the corresponding explanation unit.
With reference to the first aspect, in a possible implementation, each explanation unit includes an image-text file, an audio file and a video file, and corresponds to an area of a certain wall surface, the area being the whole wall surface or a local area of the current wall surface; displaying the current preset viewing angle and the unit to be explained comprises: when an instruction triggered by the user for indicating a navigation mode is acquired, forming a first layer and a second layer, the first layer lying below the second layer and the second layer being hidden by default; displaying the panoramic image corresponding to the current preset viewing angle on the first layer, and displaying the image-text file, the audio file and the video file included in the unit to be explained on the second layer; annotating, in polygon form on the first layer, the unit to be explained corresponding to the current preset viewing angle, and creating an anchor link; and popping up the second layer in response to an anchor link click instruction.
With reference to the first aspect, in a possible implementation, displaying the current preset viewing angle and the unit to be explained comprises:
when the instruction triggered by the user for indicating the navigation mode indicates the active navigation mode, actively playing, in sequence, the audio files contained in the unit to be explained corresponding to the current preset viewing angle, and highlighting on the first layer the polygon annotation of the explanation unit corresponding to the audio file currently being played; and when the instruction triggered by the user for indicating the navigation mode indicates the passive navigation mode, displaying, in response to an anchor link click instruction triggered by the user, the image-text file, the audio file and the video file contained in the unit to be explained corresponding to the anchor link click instruction.
In a second aspect, an embodiment of the present application provides a navigation apparatus applied to a navigation device, the apparatus comprising: a determining module, configured to determine the identifier of the space to be navigated where the user is currently located; an obtaining module, configured to query a pre-established database according to the identifier and obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; the obtaining module being further configured to acquire the current orientation information of the navigation device itself; the determining module being further configured to determine, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and a display module, configured to display the current preset viewing angle and the unit to be explained.
With reference to the second aspect, in a possible implementation, a signal transmitter is disposed in each space to be navigated, and the signal transmitted by the signal transmitter includes a tag of the space to be navigated in which the transmitter is located; the determining module is configured to acquire signals transmitted by the signal transmitters, and to determine the tag included in the signal with the strongest signal strength among the acquired signals as the identifier.
With reference to the second aspect, in a possible implementation, the number of preset viewing angles included in the panoramic view file equals the number of wall surfaces of the space to be navigated corresponding to the panoramic view file; the panoramic image corresponding to each preset viewing angle corresponds to a plurality of explanation units, each explanation unit is annotated with a polygon and an anchor link is created, and clicking the anchor link pops up the detailed content view included in the corresponding explanation unit.
With reference to the second aspect, in a possible implementation, each explanation unit includes an image-text file, an audio file and a video file, and corresponds to an area of a certain wall surface, the area being the whole wall surface or a local area of the current wall surface; displaying the current preset viewing angle and the unit to be explained comprises: when an instruction triggered by the user for indicating a navigation mode is acquired, forming a first layer and a second layer, the first layer lying below the second layer and the second layer being hidden by default; displaying the panoramic image corresponding to the current preset viewing angle on the first layer, and displaying the image-text file, the audio file and the video file included in the unit to be explained on the second layer; annotating, in polygon form on the first layer, the unit to be explained corresponding to the current preset viewing angle, and creating an anchor link; and popping up the second layer in response to an anchor link click instruction.
With reference to the second aspect, in a possible implementation, the navigation mode includes an active navigation mode and a passive navigation mode, and displaying the current preset viewing angle and the unit to be explained comprises: when the active navigation mode is acquired, actively playing, in sequence, the audio files contained in the unit to be explained corresponding to the current preset viewing angle, and highlighting on the first layer the polygon annotation of the explanation unit corresponding to the audio file currently being played; and when the passive navigation mode is acquired, displaying, in response to an anchor link click instruction triggered by the user, the image-text file, the audio file and the video file contained in the unit to be explained corresponding to the anchor link click instruction.
In a third aspect, an embodiment of the present application further provides navigation equipment, comprising a memory, a processor, a positioning component and a transceiver connected to one another; the memory is configured to store a program; and the processor calls the program stored in the memory to perform the method of the first aspect and/or any possible implementation of the first aspect.
In a fourth aspect, the present application further provides a non-transitory computer-readable storage medium (hereinafter, storage medium) on which a computer program is stored, the computer program, when executed by a computer, performing the method of the first aspect and/or any possible implementation of the first aspect.
In a fifth aspect, an embodiment of the present application provides a navigation system comprising a navigation device and a signal transmitter. The signal transmitter is configured to transmit a signal. The navigation device is configured to: determine the identifier of the space to be navigated where the user is currently located; query a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquire its own current orientation information; determine, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and display the current preset viewing angle and the unit to be explained.
With reference to the fifth aspect, in a possible implementation, a plurality of signal transmitters are disposed in each space to be navigated, and the signal transmitters disposed in the same space to be navigated are located at different physical positions on the same horizontal plane.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the embodiments of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
To illustrate the technical solutions of the embodiments of the present application more clearly, the drawings needed for the embodiments are briefly described below. The following drawings show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort. The foregoing and other objects, features and advantages of the application will be apparent from the accompanying drawings. Like reference numerals designate like parts throughout the drawings. The drawings are not necessarily drawn to scale; emphasis is instead placed on illustrating the subject matter of the present application.
Fig. 1 shows a schematic structural diagram of a navigation system provided in an embodiment of the present application.
Fig. 2 shows a schematic structural diagram of a navigation device provided in an embodiment of the present application.
Fig. 3 is a flowchart illustrating a navigation method provided in an embodiment of the present application.
Fig. 4 shows a block diagram of a navigation device according to an embodiment of the present application.
Reference numbers: 10-a navigation system; 100-a navigation device; 110-a processor; 120-a memory; 130-display screen; 140-a positioning member; 150-a transceiver; 160-a loudspeaker; 200-a signal transmitter; 400-a navigation device; 410-a determination module; 420-an acquisition module; 430-display module.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that like reference numerals and letters designate like items in the following figures; therefore, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. Meanwhile, relational terms such as "first" and "second" are used herein only to distinguish one entity or action from another and do not necessarily require or imply any actual relationship or order between such entities or actions. The terms "comprises", "comprising" and any variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus comprising a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of additional identical elements in the process, method, article or apparatus comprising that element.
Further, the term "and/or" in the present application merely describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A alone, both A and B, or B alone.
First, an application scenario implemented by the present application is introduced.
A building-type tourist attraction often includes a plurality of independent building spaces. For example, the Imperial Palace includes a plurality of independent halls, each hall including a plurality of independent rooms, and the Mogao Grottoes include a plurality of independent caves. Each room or cave in the above examples is an independent building space, and each building space includes a plurality of wall surfaces. For ease of explanation, the embodiments of the present application assume that each building space includes six wall surfaces: an east wall, a south wall, a west wall, a north wall, a ceiling and a floor. Of course, the above examples are merely for reference; it can be understood that an irregularly shaped building space may have more or fewer wall surfaces.
The landscape corresponding to each wall surface of a building space may differ. For example, in each cave of the Mogao Grottoes, different murals are painted on each wall; in the individual rooms of the Imperial Palace, the decorative carving on each wall differs.
The contents displayed on these wall surfaces carry rich historical, artistic and humanistic value and reflect specific architectural themes or ideologies, so professional guides or equipment are generally required when visitors tour the landscape. However, since each building space includes a plurality of wall surfaces, navigation requires numerous directional terms to delimit the building space before any description can be given; for tourists unfamiliar with the contents of the walls, it is difficult to quickly find the corresponding content, the interpretation precision is low, and the practical effect is poor.
To solve the foregoing problems, embodiments of the present application provide a navigation method, a navigation apparatus, a navigation system, and a storage medium; the navigation technique can be implemented with corresponding software, hardware, or a combination of the two. The embodiments of the present application are described in detail below, taking the navigation of grotto murals as an example. A grotto site generally consists of a plurality of relatively independent grottoes, and each indivisible grotto is an independent building space as mentioned in the embodiments of the present application.
Referring to fig. 1, the present embodiment provides a navigation system 10, which includes a navigation device 100 and a signal transmitter 200.
The signal transmitter 200 needs to meet requirements such as high precision, low power consumption, convenient deployment and no impact on cultural relics. The signal transmitted by the signal transmitter 200 is broadcast data and includes at least the device identifier of the signal transmitter 200 that transmits it.
A signal transmitter 200 is disposed in each building space, so that one signal transmitter 200 exists in each building space, and the identifier of the building space is associated with the device identifier of its signal transmitter 200.
As an alternative implementation, the device identifier may include a unique device number, a block code and a device code. The unique device number is the factory number of the signal transmitter 200, while the block code and the device code can be configured by the user. In one embodiment of the present application, the block code of the signal transmitter 200 may be used as the identifier of the building space to which it belongs; this identifier of the building space is the tag.
The navigation device 100 is used to receive the signal transmitted by the signal transmitter 200.
Since the signal transmitted by each signal transmitter 200 is broadcast data, it can be understood that, in theory, the navigation device 100 can acquire signals transmitted by a plurality of signal transmitters 200 within the signal reception range.
Furthermore, since the closer the distance, the less the signal attenuation, the signal strength of the signal received by the navigation device 100 can be used as a measure of the distance between the navigation device 100 and the signal transmitter 200 that transmitted the signal. Therefore, when the navigation device 100 receives a plurality of signals, the building space corresponding to the signal with the strongest signal strength may be determined as the building space closest to itself (i.e., the building space in which itself is currently located). Since the signal includes the device identifier of the signal transmitter 200 and the device identifier has a corresponding relationship with the independent space, the navigation device 100 can determine the identifier of the building space where the navigation device is currently located according to the device identifier included in the signal with the strongest signal strength, that is, determine the identifier of the current space to be navigated.
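To make the selection rule concrete, the following is a minimal Python sketch of the strongest-signal logic described above; it is illustrative only and not part of the patent text, and the BeaconSignal structure, field names and example values are assumptions.

```python
from dataclasses import dataclass

@dataclass
class BeaconSignal:
    device_id: str    # factory device number of the signal transmitter 200
    block_code: str   # user-configured block code, used as the building-space identifier
    rssi_dbm: float   # received signal strength; higher (less negative) means closer

def identify_space(received: list[BeaconSignal]) -> str | None:
    """Return the identifier of the space to be navigated, taken from the
    strongest of the received broadcast signals."""
    if not received:
        return None
    strongest = max(received, key=lambda s: s.rssi_dbm)
    return strongest.block_code

# Example: two beacons are heard; the nearer (stronger) one wins.
signals = [BeaconSignal("tx-017", "cave-220", -52.0),
           BeaconSignal("tx-018", "cave-221", -74.0)]
print(identify_space(signals))  # -> "cave-220"
```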
Furthermore, if two building spaces are too close to each other, or if, owing to the spatial structure, the signals broadcast by the signal transmitters 200 of different building spaces reach the user with similar signal strengths, the navigation device 100 may be unable to accurately determine the identifier of the space to be navigated in which the visitor is currently located.
To alleviate this problem, as an alternative embodiment, at least two signal transmitters 200 may be deployed at different physical positions on the same horizontal plane of the affected building spaces (for example, if one signal transmitter 200 is deployed on the floor at the left side, the other should also be deployed on the floor, for example at the right side, rather than at an elevated position). The signal transmitters 200 belonging to the same building space are configured with the same block code and different device codes, so that the signals broadcast by all the signal transmitters 200 of one building space cover the whole building space as far as possible. When the navigation device 100 subsequently receives a plurality of signals, the identifier of the building space corresponding to the signal with the strongest signal strength is still used as the identifier of the space to be navigated.
Of course, as an alternative embodiment, the tourist may also manually input the identifier of the building space where he or she is currently located into the navigation device 100, so that the navigation device 100 acquires the identifier of the space to be navigated.
After acquiring the identifier of the space to be navigated, the navigation device 100 queries a pre-established database, reads from it the navigation data corresponding to the identifier, and starts navigating for the user; for the specific navigation process, see the method embodiments below.
In addition, referring to fig. 2, an embodiment of the present application provides a navigation device 100 that can provide navigation services for tourists.
Alternatively, the navigation device 100 may be, but is not limited to, a smart phone, a tablet computer, a Mobile Internet Device (MID), a personal digital assistant, and the like.
The navigation device 100 may include: a processor 110, a memory 120, a display screen 130, a positioning component 140, a transceiver 150 and a speaker 160.
It should be noted that the components and structure of the navigation device 100 shown in fig. 2 are exemplary only and not limiting, and that the navigation device 100 may have other components and structures as desired.
The processor 110, memory 120, display 130, positioning means 140, transceiver 150, and speaker 160, as well as other components that may be present in the navigation device 100, are electrically connected to each other, directly or indirectly, to enable the transmission or interaction of data. For example, the processor 110, the memory 120, the display 130, the positioning component 140, the transceiver 150, the speaker 160, and other components that may be present may be electrically connected to each other via one or more communication buses or signal lines.
The memory 120 is used for storing a program, for example, a program corresponding to a navigation method appearing later or a navigation apparatus appearing later. Optionally, when the memory 120 stores therein the navigation device, the navigation device includes at least one software function module which can be stored in the memory 120 in the form of software or firmware (firmware).
Alternatively, the software function module included in the navigation apparatus may be solidified in an Operating System (OS) of the navigation device 100.
The processor 110 is configured to execute executable modules stored in the memory 120, such as the software function modules or computer programs included in the navigation apparatus. When the processor 110 receives an execution instruction, it may execute the computer program, for example to perform: determining the identifier of the space to be navigated where the user is currently located; querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquiring the current orientation information of the navigation device itself; determining, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and displaying the current preset viewing angle and the unit to be explained.
The display screen 130 is used to display to the visitor the panoramic view information and the image-text information related to the mural contents when the navigation data includes a panoramic view file and image-text files.
The positioning component 140 is used to determine current orientation information of the navigation device 100. The positioning component 140 may include, but is not limited to, a geomagnetic sensor and a gyroscope, among others.
The transceiver 150 is used for transmitting and receiving signals and commands.
The speaker 160 is used to present a voice introduction associated with the mural to the visitor when the navigation data includes voice information.
Of course, the method disclosed in any of the embodiments of the present application can be applied to the processor 110, or implemented by the processor 110.
The following description will be made for the navigation method provided in the present application.
Referring to fig. 3, the present embodiment provides a navigation method applied to the navigation device 100, comprising the following steps.
Step S110: determining the identifier of the space to be navigated where the user is currently located.
As an alternative embodiment, the identifier of the space to be navigated determined by the navigation device 100 may be provided directly by the user, for example by direct input, or by the user scanning a two-dimensional code posted in the space to be navigated where the user is currently located. In practice, however, this approach requires the user to scan two-dimensional codes or submit position numbers frequently, and the interactive experience is not good enough.
Thus, as another alternative, the signal transmitter 200 may be provided in each space to be navigated as described above, and the signal transmitted by the signal transmitter 200 includes the tag of the space to be navigated in which it is located.
After acquiring signals transmitted by a plurality of different signal transmitters 200, the navigation device 100 determines the tag included in the signal with the strongest signal strength among the acquired signals as the identifier of the space to be navigated. The space to be navigated is the building space where the user is currently located and whose landscape the navigation device 100 is required to narrate.
Step S120: querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file.
The database is established in advance by staff.
The database includes navigation data corresponding to the identifier of each building space. Each item of navigation data includes the panoramic view file of the corresponding building space, a plurality of explanation units corresponding to the wall surfaces of that building space, the first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and the second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file.
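As an illustrative sketch of how such navigation data might be organized (all class and field names here are assumptions, not the patent's terminology), consider:

```python
from dataclasses import dataclass, field

@dataclass
class ExplanationUnit:
    polygon: list[tuple[float, float]]  # annotation vertices on the panoramic image
    anchor_id: str                      # anchor link that pops up the detail view
    media: dict[str, str] = field(default_factory=dict)  # image-text/audio/video paths

@dataclass
class NavigationData:
    panorama_file: str                                 # panoramic view file of the space
    # first correspondence: real-orientation range (degrees) -> preset viewing angle id
    view_by_orientation: dict[tuple[float, float], int]
    # second correspondence: preset viewing angle id -> its explanation units
    units_by_view: dict[int, list[ExplanationUnit]]

# The database maps each building-space identifier (tag) to its navigation data.
database: dict[str, NavigationData] = {}
```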
The process of creating the database will be described below by taking one of the building spaces as an example.
For a certain building space a, the worker first determines the identifier of building space a from the block code of the signal transmitter 200 deployed in it, and then determines the number of wall surfaces of building space a, in this example six: the east, south, west, north, top and bottom surfaces.
The worker then captures images of the respective preset viewing angles in building space a to compose a panoramic view file. The panoramic view file includes a plurality of preset viewing angles, the number of which equals the number of wall surfaces of building space a; for example, since building space a has six wall surfaces, its panoramic view file includes six preset viewing angles, and the panoramic image displayed at each preset viewing angle corresponds to the content of one wall surface.
When the panoramic view file is established, the starting position of the captured panorama is first defined as 0° of the panoramic view file's coordinates. An initial preset viewing angle is then determined in the panorama; this is the preset viewing angle first shown when the user opens the panoramic view file. The horizontal rotation angle α of the initial preset viewing angle relative to the starting position in the panoramic view file is calculated. Next, the worker, holding the positioning component 140 and facing the wall surface corresponding to the initial preset viewing angle, collects the angular deviation β of that wall surface relative to the geomagnetic north pole. The calibration offset is γ = β - α, and the first correspondence between each preset viewing angle included in the panoramic view file and the real orientation is ψ = φ - γ, where φ is the angular deviation of the user's current orientation relative to the geomagnetic north pole and ψ is the horizontal rotation angle of the user's viewing position in the panoramic view file.
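A short worked example may help; the following Python sketch applies the two formulas above with made-up angles (the modulo-360 normalization is an added assumption to keep angles in range):

```python
def calibration_offset(beta_deg: float, alpha_deg: float) -> float:
    """gamma = beta - alpha: offset between real bearings and panorama coordinates."""
    return (beta_deg - alpha_deg) % 360.0

def panorama_angle(phi_deg: float, gamma_deg: float) -> float:
    """psi = phi - gamma: horizontal rotation angle in the panoramic view file for a
    user whose device reads bearing phi relative to the geomagnetic north pole."""
    return (phi_deg - gamma_deg) % 360.0

# Example: the initial preset viewing angle sits at alpha = 30 deg in the panorama,
# and its wall bears beta = 190 deg from magnetic north, so gamma = 160 deg.
gamma = calibration_offset(190.0, 30.0)
# A user facing phi = 250 deg is shown the panorama view at psi = 90 deg.
print(panorama_angle(250.0, gamma))  # -> 90.0
```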
At this point, for building space a, the panoramic view file corresponding to its identifier and the first correspondence between each preset viewing angle included in the panoramic view file and the real orientation have been formed.
Subsequently, taking each wall surface as a unit, the worker establishes the explanation units corresponding to each wall surface included in building space a. Because each wall surface corresponds to a preset viewing angle, each explanation unit also corresponds to a preset viewing angle, which forms the second correspondence.
A preset viewing angle may correspond to a single explanation unit, in which case the content displayed by that explanation unit corresponds to the whole area of the wall surface corresponding to the preset viewing angle. A preset viewing angle may also correspond to a plurality of explanation units, in which case the content displayed by each explanation unit corresponds to a local area of that wall surface, and the contents displayed by all the explanation units of the preset viewing angle together cover the whole area of the wall surface.
For each explanation unit, the displayed content includes rich media such as image files, audio files, video files and text files, or links to such rich media, which introduce the mural content displayed on the wall surface corresponding to the preset viewing angle.
In addition, each explanation unit is annotated with a polygon and an anchor link is created; clicking the anchor link pops up the detailed content view included in the corresponding explanation unit.
After the database is established, the database may be stored in the navigation device 100 or in the cloud, so that the navigation device 100 may query the corresponding navigation data according to the identifier of the space to be navigated.
Of course, it can be understood that if the database is stored in the cloud, the navigation device 100 needs to perform data interaction with the cloud server to obtain the relevant content in the database.
Step S130: acquiring the navigation device's own current orientation information.
As mentioned above, the positioning component 140 is provided within the navigation device 100 and can detect the current orientation information of the navigation device 100 in real time.
As an alternative embodiment, the positioning component 140 may be a geomagnetic sensor and a gyroscope. Here, the navigation device 100 acquires the angular deviation between itself and the geomagnetic north pole through the geomagnetic sensor and takes this angular deviation as its current orientation information.
Step S140: determining, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file.
In this embodiment of the application, the building space has six wall surfaces, and the panoramic view file accordingly includes six preset viewing angles.
After determining its orientation information, the navigation device 100 queries the acquired navigation data and matches its current orientation information against the first correspondence, thereby determining the current preset viewing angle corresponding to the current orientation information and, through the second correspondence, the unit to be explained corresponding to the current preset viewing angle.
As an optional implementation, in the first correspondence and the second correspondence, when the current orientation information falls within a certain preset range, it corresponds to the preset viewing angle associated with that preset range and to the explanation units associated with that preset viewing angle. That is, as long as the navigation device 100 detects that the user's current orientation stays within one preset range, the same preset viewing angle and the same units to be explained apply. Suppose the current orientation detected by the navigation device 100 is A and at the next moment it changes to B; if both A and B belong to the preset range of preset viewing angle a, the current preset viewing angle determined by the navigation device 100 and the unit to be explained corresponding to it do not change between the two moments.
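The range matching could look like the following Python sketch; the 90° sectors for the four vertical walls are an assumed layout (top and bottom views would additionally need the gyroscope's pitch, which is omitted here):

```python
def match_view(phi_deg: float, ranges: dict[tuple[float, float], int]) -> int | None:
    """Return the preset viewing angle whose preset range contains the current
    orientation phi (degrees from the geomagnetic north pole)."""
    phi = phi_deg % 360.0
    for (lo, hi), view_id in ranges.items():
        in_plain = lo <= phi < hi                      # ordinary range
        in_wrap = lo > hi and (phi >= lo or phi < hi)  # range wrapping through 0
        if in_plain or in_wrap:
            return view_id
    return None

sectors = {(315.0, 45.0): 0,   # north wall (wraps through 0 degrees)
           (45.0, 135.0): 1,   # east wall
           (135.0, 225.0): 2,  # south wall
           (225.0, 315.0): 3}  # west wall

# Orientations A = 250 and B = 260 fall in the same preset range, so the
# displayed preset viewing angle and units to be explained do not change.
print(match_view(250.0, sectors), match_view(260.0, sectors))  # -> 3 3
```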
Step S150: displaying the current preset viewing angle and the unit to be explained.
Subsequently, the navigation device 100 displays the determined current preset viewing angle and the unit to be explained corresponding to that viewing angle.
When displaying, the navigation device 100 forms a first layer and a second layer. The first layer lies below the second layer, and the second layer is hidden by default. The navigation device 100 displays the panoramic image corresponding to the determined current preset viewing angle on the first layer, places the image-text files, audio files and video files included in the determined unit to be explained on the second layer, annotates the unit to be explained on the first layer in polygon form, and creates an anchor link. When the user clicks the anchor link, the navigation device 100 pops up the second layer in response to the anchor link click instruction.
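A minimal sketch of this two-layer logic follows, with hypothetical names; a real implementation would go through the device's UI toolkit rather than plain objects:

```python
from dataclasses import dataclass, field

@dataclass
class Layer:
    hidden: bool = True
    content: dict = field(default_factory=dict)

first_layer = Layer(hidden=False)   # panorama of the current preset viewing angle,
                                    # with polygon annotations and anchor links
second_layer = Layer(hidden=True)   # media of a unit to be explained, hidden by default

def on_anchor_click(unit_media: dict) -> None:
    """Respond to an anchor link click instruction by popping up the second layer."""
    second_layer.content = unit_media
    second_layer.hidden = False

on_anchor_click({"text": "north-wall mural intro", "audio": "north.mp3"})
print(second_layer.hidden)  # -> False
```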
In addition, in the embodiment of the present application, the user may select the navigation mode, wherein the navigation mode includes an active navigation mode and a passive navigation mode.
When the navigation device 100 detects that the user has triggered the active navigation mode, it actively plays, in sequence, the audio files contained in the unit to be explained corresponding to the current preset viewing angle. In addition, the navigation device 100 highlights on the first layer the polygon annotation of the explanation unit corresponding to the audio file currently being played, for example by thickening the polygon, changing the color of its border, or making the border flash, until the navigation device 100 determines that the current preset viewing angle has switched to another preset viewing angle.
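The active mode's sequential playback might be sketched as follows; the get_view callback standing in for orientation polling and the print calls standing in for the real highlight and audio APIs are assumptions:

```python
def active_navigation(units: list[dict], current_view: int, get_view) -> None:
    """Play each unit's audio in turn, highlighting its polygon annotation on the
    first layer meanwhile, until the preset viewing angle switches."""
    for unit in units:
        if get_view() != current_view:               # user turned toward another wall
            break
        print("highlight polygon", unit["polygon"])  # e.g. thicken, recolor or flash
        print("play audio", unit["audio"])           # blocks until playback finishes

units = [{"polygon": "p1", "audio": "a1.mp3"},
         {"polygon": "p2", "audio": "a2.mp3"}]
active_navigation(units, current_view=2, get_view=lambda: 2)
```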
When the navigation device 100 detects that the user has triggered the passive navigation mode, it displays the second layer in response to an anchor link click instruction triggered by the user, showing on the second layer the image-text file, the audio file and the video file contained in the unit to be explained that corresponds to the clicked anchor link.
It should be noted that in both the active mode and the passive mode, the user can independently click the anchor link of any explanation unit at the current preset viewing angle, causing the navigation device 100 to pop up the second layer so that the user can conveniently view the image-text files, audio files, video files and other content included in that explanation unit.
The current preset viewing angle of the panoramic view file displayed by the navigation device 100 and the content of the corresponding unit to be explained are both tied to the wall surface the user currently faces; that is, they change as the user's orientation changes. The content of the explanation units therefore no longer needs directional terms, the user no longer has to work out what the narration refers to from such terms, no mismatch between the navigation data and the content actually seen is experienced, and the user experience is improved.
As shown in fig. 4, an embodiment of the present application further provides a navigation apparatus 400, where the navigation apparatus 400 may include:
a determining module 410, configured to determine the identifier of the space to be navigated where the user is currently located;
an obtaining module 420, configured to query a pre-established database according to the identifier and obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file;
the obtaining module 420 being further configured to acquire the current orientation information of the navigation device itself;
the determining module 410 being further configured to determine, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and
a display module 430, configured to display the current preset viewing angle and the unit to be explained.
In a possible implementation, a signal transmitter is disposed in each space to be navigated, and the signal transmitted by the signal transmitter includes a tag of the space to be navigated in which the transmitter is located; the determining module 410 is configured to acquire signals transmitted by the signal transmitters and to determine the tag included in the signal with the strongest signal strength among the acquired signals as the identifier.
In a possible implementation, the number of preset viewing angles included in the panoramic view file equals the number of wall surfaces of the space to be navigated corresponding to the panoramic view file; the panoramic image corresponding to each preset viewing angle corresponds to a plurality of explanation units, each explanation unit is annotated with a polygon and an anchor link is created, and clicking the anchor link pops up the detailed content view included in the corresponding explanation unit.
In a possible implementation, each explanation unit includes an image-text file, an audio file and a video file, and corresponds to an area of a certain wall surface, the area being the whole wall surface or a local area of the current wall surface; the display module 430 is configured to: form a first layer and a second layer when an instruction triggered by the user for indicating a navigation mode is acquired, the first layer lying below the second layer and the second layer being hidden by default; display the panoramic image corresponding to the current preset viewing angle on the first layer, and display the image-text file, the audio file and the video file included in the unit to be explained on the second layer; annotate, in polygon form on the first layer, the unit to be explained corresponding to the current preset viewing angle, and create an anchor link; and pop up the second layer in response to an anchor link click instruction.
In a possible implementation, the navigation mode includes an active navigation mode and a passive navigation mode, and the display module 430 is configured to: when the active navigation mode is acquired, actively play in sequence the audio files contained in the unit to be explained corresponding to the current preset viewing angle, and highlight on the first layer the polygon annotation of the explanation unit corresponding to the audio file currently being played; and when the passive navigation mode is acquired, display, in response to an anchor link click instruction triggered by the user, the image-text file, the audio file and the video file contained in the unit to be explained corresponding to the anchor link click instruction.
The navigation apparatus 400 provided in this embodiment of the present application has the same implementation principle and technical effects as the foregoing method embodiments; for brevity, where this apparatus embodiment is silent, reference may be made to the corresponding content of the method embodiments.
In addition, the embodiment of the present application further provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a computer, the steps included in the navigation method described above are executed.
To sum up, the navigation method, navigation apparatus, navigation equipment, navigation system and storage medium provided in the embodiments of the present invention comprise: determining the identifier of the space to be navigated where the user is currently located; querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, the navigation data comprising a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and the real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquiring the current orientation information of the navigation device itself; determining, according to the current orientation information, the first correspondence and the second correspondence, the current preset viewing angle corresponding to the current orientation information and the unit to be explained corresponding to the current preset viewing angle in the panoramic view file; and displaying the current preset viewing angle and the unit to be explained. The content of the explanation unit displayed by the navigation device and the panoramic image displayed at the current preset viewing angle both correspond to the wall surface the user currently faces and change as the user's orientation changes; the user therefore no longer needs to locate the narrated content from directional terms in the navigation data, never experiences a mismatch between the navigation data and the content actually seen, and the user experience is improved.
It should be noted that, in the present specification, the embodiments are all described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments may be referred to each other.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions may be stored in a storage medium if they are implemented in the form of software function modules and sold or used as separate products. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a notebook computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The above description is merely of specific embodiments of the present application, but the protection scope of the present application is not limited thereto; any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall be covered by the protection scope of the present application.

Claims (10)

1. A navigation method, applied to a navigation device, the method comprising:
determining the identifier of the space to be navigated where the user is currently located;
querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, wherein the navigation data comprises a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and a real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file;
acquiring current orientation information of the navigation device;
determining, in the panoramic view file and according to the current orientation information, the first correspondence and the second correspondence, a current preset viewing angle corresponding to the current orientation information and a unit to be explained corresponding to the current preset viewing angle;
and displaying the current preset viewing angle and the unit to be explained.
2. The method according to claim 1, wherein a signal transmitter is arranged in each space to be navigated, and a signal transmitted by the signal transmitter carries a tag of the space to be navigated in which the signal transmitter is located; and the determining the identifier of the space to be navigated where the user is currently located comprises:
acquiring signals transmitted by the signal transmitters;
and determining, as the identifier, the tag carried by the signal with the strongest signal strength among the acquired signals.
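A minimal sketch of this selection, assuming beacon readings arrive as (tag, signal strength in dBm) pairs; the function name and reading format are hypothetical.

```python
from typing import List, Tuple

def identify_space(readings: List[Tuple[str, float]]) -> str:
    """Return the tag carried by the strongest acquired signal."""
    # In dBm, the strongest signal has the largest (least negative) value.
    tag, _strength = max(readings, key=lambda r: r[1])
    return tag

# For example, identify_space([("cave_220", -71.0), ("cave_217", -55.5)])
# yields "cave_217", the tag of the nearest signal transmitter.
```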
3. The method according to claim 1, wherein the number of preset viewing angles included in the panoramic view file is the same as the number of walls of the space to be navigated corresponding to the panoramic view file; the panoramic image corresponding to each preset viewing angle corresponds to a plurality of explanation units, each explanation unit is labeled by a polygon for which an anchor link is created, and clicking the anchor link pops up a detailed content view included in the corresponding explanation unit; and in the first correspondence and the second correspondence, when the current orientation information falls within a given preset range, the current orientation information corresponds to the preset viewing angle corresponding to that preset range and to the plurality of explanation units corresponding to that preset viewing angle.
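The two correspondences and the polygon labeling could be represented, for example, by the following hypothetical data structures, with one preset viewing angle per wall, each holding its orientation range and explanation units.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ExplanationUnit:
    title: str
    polygon: List[Tuple[float, float]]  # hotspot vertices on the panoramic image
    anchor: str                         # anchor link that pops up the detail view

@dataclass
class PresetView:
    wall: str                           # one preset viewing angle per wall
    heading_range: Tuple[float, float]  # real-orientation range, in degrees
    units: List[ExplanationUnit]        # the units labeled on this view

def view_for_heading(views: List[PresetView], heading: float) -> PresetView:
    """Pick the preset viewing angle whose orientation range covers the heading."""
    heading %= 360.0
    for view in views:
        lo, hi = view.heading_range
        if lo <= heading < hi:
            return view
    raise LookupError("no preset range covers this heading")
```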
4. The method according to claim 3, wherein each explanation unit comprises an image-text file, an audio file and a video file, and corresponds to a region of a wall, the region being an entire wall or a local region on the current wall; and the displaying the current preset viewing angle and the unit to be explained comprises:
when an instruction triggered by the user and indicating a navigation mode is acquired, forming a first layer and a second layer, wherein the first layer is located below the second layer, and the second layer is hidden by default;
displaying the panoramic image corresponding to the current preset viewing angle on the first layer, and displaying the image-text file, the audio file and the video file included in the unit to be explained on the second layer;
labeling, on the first layer, the unit to be explained corresponding to the current preset viewing angle with a polygon, and creating an anchor link;
and popping up the second layer in response to an anchor link click instruction.
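As a sketch of the two-layer display (a hypothetical API; a real implementation would sit on the device's UI toolkit):

```python
class GuideScreen:
    """Two layers: panorama below, detail pop-up above (hidden by default)."""

    def __init__(self, panorama: str, units: dict):
        # First layer: the panoramic image at the current preset viewing angle,
        # with each unit to be explained outlined by a polygon and an anchor link.
        self.first_layer = panorama
        # Anchor link -> {"image_text": ..., "audio": ..., "video": ...}
        self.units = units
        self.second_layer_visible = False  # the second layer starts hidden

    def on_anchor_click(self, anchor: str) -> dict:
        """Pop up the second layer with the clicked unit's media files."""
        self.second_layer_visible = True
        return self.units[anchor]
```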
5. The method according to claim 4, wherein the navigation mode comprises an active navigation mode and a passive navigation mode, and the displaying the current preset viewing angle and the unit to be explained comprises:
when the active navigation mode is acquired, actively playing, in sequence, the audio files included in the unit to be explained corresponding to the current preset viewing angle, and highlighting, on the first layer, the polygon label of the explanation unit corresponding to the currently played audio file;
and when the passive navigation mode is acquired, displaying, in response to an anchor link click instruction triggered by the user, the image-text file, the audio file and the video file included in the unit to be explained corresponding to the anchor link click instruction.
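The two modes could be dispatched as below; play_audio and highlight stand in for device capabilities and are assumptions, not part of the claims.

```python
from typing import Callable, Dict, List

def run_navigation(mode: str,
                   units_in_order: List[Dict[str, str]],
                   play_audio: Callable[[str], None],
                   highlight: Callable[[str], None]) -> None:
    """Dispatch between the active and passive navigation modes."""
    if mode == "active":
        # Actively play each unit's audio in sequence, highlighting its
        # polygon label on the first layer while it plays.
        for unit in units_in_order:
            highlight(unit["anchor"])
            play_audio(unit["audio"])
    elif mode == "passive":
        # Passive mode shows media only in response to anchor-link clicks,
        # handled elsewhere (e.g. GuideScreen.on_anchor_click above).
        pass
    else:
        raise ValueError(f"unknown navigation mode: {mode}")
```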
6. A navigation apparatus, applied to a navigation device, the apparatus comprising:
a determining module, used for determining the identifier of the space to be navigated where the user is currently located;
an acquisition module, used for querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, wherein the navigation data comprises a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and a real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file;
the acquisition module being further used for acquiring current orientation information of the navigation device;
the determining module being further used for determining, in the panoramic view file and according to the current orientation information, the first correspondence and the second correspondence, a current preset viewing angle corresponding to the current orientation information and a unit to be explained corresponding to the current preset viewing angle;
and a display module, used for displaying the current preset viewing angle and the unit to be explained.
7. A navigation device, comprising: the device comprises a memory, a processor, a positioning component and a transceiver which are connected with each other;
the memory is used for storing programs;
the processor calls a program stored in the memory to perform the method according to any one of claims 1-5.
8. A storage medium having stored thereon a computer program which, when executed by a computer, performs the method according to any one of claims 1-5.
9. A navigation system, comprising a navigation device and a signal transmitter;
the signal transmitter is used for transmitting a signal;
the navigation device is used for determining, according to the signal transmitted by the signal transmitter, the identifier of the space to be navigated where the user is currently located;
querying a pre-established database according to the identifier to obtain navigation data corresponding to the space to be navigated, wherein the navigation data comprises a panoramic view file, a plurality of explanation units, a first correspondence between each preset viewing angle included in the panoramic view file and a real orientation, and a second correspondence between each explanation unit and each preset viewing angle included in the panoramic view file; acquiring current orientation information of the navigation device; determining, in the panoramic view file and according to the current orientation information, the first correspondence and the second correspondence, a current preset viewing angle corresponding to the current orientation information and a unit to be explained corresponding to the current preset viewing angle; and displaying the current preset viewing angle and the unit to be explained.
10. The system according to claim 9, wherein a plurality of signal transmitters are deployed in each space to be navigated, and the plurality of signal transmitters deployed in the same space to be navigated are located at different physical positions on the same horizontal plane.
CN201911230100.2A 2019-12-04 2019-12-04 Navigation method, navigation device, navigation apparatus, navigation system, and storage medium Active CN110968705B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911230100.2A CN110968705B (en) 2019-12-04 2019-12-04 Navigation method, navigation device, navigation apparatus, navigation system, and storage medium

Publications (2)

Publication Number Publication Date
CN110968705A (en) 2020-04-07
CN110968705B (en) 2023-07-18

Family

ID=70033176

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911230100.2A Active CN110968705B (en) 2019-12-04 2019-12-04 Navigation method, navigation device, navigation apparatus, navigation system, and storage medium

Country Status (1)

Country Link
CN (1) CN110968705B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100039506A1 (en) * 2008-08-15 2010-02-18 Amir Sarvestani System for and method of visualizing an interior of body
JP2010185761A (en) * 2009-02-12 2010-08-26 Toyota Motor Corp Navigation system, road map display method
CN103632627A (en) * 2013-12-12 2014-03-12 北京清城睿现数字科技研究院有限公司 Information display method and apparatus and mobile navigation electronic equipment
CN105279750A (en) * 2014-07-09 2016-01-27 雷震 Equipment display guiding system based on IR-UWB and image moment
CN104392633A (en) * 2014-11-12 2015-03-04 国家电网公司 Interpretation control method oriented to power system simulating training
CN105989573A (en) * 2015-02-16 2016-10-05 上海钧铭商务咨询有限公司 Method and system for providing exhibition hall guide information based on 360-degree digital panoramic technology
CN105592148A (en) * 2015-12-17 2016-05-18 成都新橙北斗智联有限公司 Voice navigation-based interactive service system and method thereof
CN106302427A (en) * 2016-08-09 2017-01-04 深圳市豆娱科技有限公司 Sharing method in reality environment and device
CN107742491A (en) * 2017-11-01 2018-02-27 江苏鸿信系统集成有限公司 A kind of self-service guide method
CN109668568A (en) * 2019-01-25 2019-04-23 天津煋鸟科技有限公司 A kind of method carrying out location navigation using panoramic imagery is looked around

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FANG CHENG et al., "The evaluating simulation system of ship anchoring operation", 2015 34th Chinese Control Conference
LIU Yuxuan, "Research on Video Applications of Wireless Sensor Networks"
ZHANG Kai, "Design and Implementation of a Virtual Campus Roaming Navigation System Based on JAVA and VRML", China Masters' Theses Full-text Database

Also Published As

Publication number Publication date
CN110968705B (en) 2023-07-18

Similar Documents

Publication Publication Date Title
US11721073B2 (en) Synchronized, interactive augmented reality displays for multifunction devices
US11405558B2 (en) Automated control of image acquisition via use of hardware sensors and camera content
US10535116B2 (en) Shared virtual reality
US7634354B2 (en) Location signposting and orientation
CN108337664B (en) Tourist attraction augmented reality interactive navigation system and method based on geographical position
CN104040546A (en) Method and system for displaying panoramic imagery
US11609345B2 (en) System and method to determine positioning in a virtual coordinate system
JP6093841B2 (en) System, information processing method and program
CN111242704B (en) Method and electronic equipment for superposing live character images in real scene
JPWO2019069575A1 (en) Information processing equipment, information processing methods and programs
JP2008070705A (en) Picture providing system and picture providing server device
CN110968705B (en) Navigation method, navigation device, navigation apparatus, navigation system, and storage medium
Davies et al. Mobile cross reality for cultural heritage
US10970930B1 (en) Alignment and concurrent presentation of guide device video and enhancements
JP6187037B2 (en) Image processing server, image processing system, and program
Fabritz et al. Open Specification for Indoor-Navigation
US9402162B1 Mobile device, positioning method and non-transitory computer-readable recording medium
EP4250744A1 (en) Display terminal, communication system, method for displaying, method for communicating, and carrier means
JP2000311256A (en) Shared virtual space display system, avatar display method and recording medium
JP2022535793A (en) Interaction method and electronic device based on optical communication device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant